Testing ASP.NET Core With Entity Framework In-Memory Providers


With the final releases of ASP.NET Core and Entity Framework Core came a number of new and welcome improvements. One of them is the ability to run Entity Framework without having to bootstrap a physical database: Entity Framework Core can use an in-memory database, which is useful for things like unit testing. Granted, there are a number of other ways to fake out a database for unit tests, but I haven't found one that is as simple to implement as with Entity Framework Core. Therefore, I thought it would be neat to show how to test an ASP.NET Core controller, leveraging IoC, by using an actual database at runtime for the real code execution and switching to a testable database when running tests. By the way, I will make the source code for this sample available at the end of this blog post, so if you want to run it, I suggest you install Visual Studio 2017 along with the web programming bits. With that said, let's start by going over the sample project structure.

Project Layout

One of the things I wanted to do for this sample was to separate the data access and data model declarations into their own project. Even though I could have kept everything contained inside a Web API project, I wanted to keep things a bit more organized, even if it's just sample code. So I structured the solution in three simple pieces:

  • The SampleApi project, which is an ASP.NET Core project that hosts a very simple Web API
  • The SampleApi.Data project, which is where the data context and model declarations are placed
  • Finally, the SampleApi.Tests project, which is an xUnit-style project that references the Web API project and runs assertions against an ASP.NET Core controller

Here is a snapshot of the physical setup of the Visual Studio 2017 solution:

Sample API Project Structure

Data Model Project

The model for the sample is made up of a Client and a list of Appointments for each Client, so there's really not much to it. However, when I was setting up EF Core to run the first code migration, I quickly realized that I had to set up a factory in order for the EF tooling to work. It turned out to be simple to do, but I wanted to point it out in case anyone else runs into the same issue. Here is the snippet for the sample factory:
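It boils down to something like the sketch below, which assumes the IDesignTimeDbContextFactory interface that recent EF Core tooling looks for (older tooling versions used a slightly different interface); the SampleDataContext class name and the connection string are placeholders rather than the exact values from the repository:

```csharp
using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Design;

namespace SampleApi.Data
{
    // Design-time factory so the EF Core tools can create the data context
    // when running migrations from the command line.
    public class SampleDataContextFactory : IDesignTimeDbContextFactory<SampleDataContext>
    {
        public SampleDataContext CreateDbContext(string[] args)
        {
            var optionsBuilder = new DbContextOptionsBuilder<SampleDataContext>();

            // Hard-coded development connection string, as mentioned below.
            optionsBuilder.UseSqlServer(
                @"Server=(localdb)\mssqllocaldb;Database=SampleApiDb;Trusted_Connection=True;");

            return new SampleDataContext(optionsBuilder.Options);
        }
    }
}
```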

This factory is used by the EF Core tools in order to create the data migration. The downside of this approach is that I hard-coded the development connection string to keep things a bit clearer; nevertheless, this works OK as a development-only sample. Improving this piece (which is very possible) is left as an exercise for the reader. :)
One more thing worth mentioning about the SampleApi.Data project is that I used a .NET Core console project instead of a .NET Core class library to host the data access code. There are a couple of reasons for this, and they are all related to the requirements of the EF Core tooling. As of the time of this writing, the EF Core tooling requires an executable project in order to run EF Core migrations, which leaves library projects out of the equation, at least for now. The good news is that when you compare the .NET Core class library project template with the .NET Core console project template, the main difference I see is that the console project has an extra <OutputType> element set to Exe in the .csproj declaration:

Console Project EXE Definition

The other thing you will need in order to let the EF Core tools do their work is a 'Program.cs' file with a 'Main()' method in it:

Program.cs with Main Entry Method
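The entry point doesn't have to do anything; an empty Main along these lines is enough to keep the tooling happy:

```csharp
namespace SampleApi.Data
{
    // The EF Core tools only require that the project produce an executable;
    // this entry point is never used by the data access code itself.
    public class Program
    {
        public static void Main(string[] args)
        {
        }
    }
}
```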

Thus, using a console project makes sense here until the EF Core tooling evolves and we no longer have to concoct this kind of work-around just to make things work. In any case, it is not that bad of a hack. :)

Web API Project

I went ahead and scaffolded the Clients controller using the EF data model; however, the only method I am really using for the testing demo is the one that returns all the clients. This is the only endpoint I will be unit testing in this sample, since it's simple enough to get the point across:
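The relevant action looks more or less like this (a sketch that assumes a SampleDataContext with a Clients DbSet injected through the constructor; the scaffolded controller in the repository also contains the other CRUD actions):

```csharp
using System.Collections.Generic;
using Microsoft.AspNetCore.Mvc;
using SampleApi.Data;

namespace SampleApi.Controllers
{
    [Route("api/[controller]")]
    public class ClientsController : Controller
    {
        private readonly SampleDataContext _context;

        // The data context is supplied by ASP.NET Core's built-in IoC container.
        public ClientsController(SampleDataContext context)
        {
            _context = context;
        }

        // GET api/clients -- the only action exercised by the unit tests.
        [HttpGet]
        public IEnumerable<Client> GetClients()
        {
            return _context.Clients;
        }
    }
}
```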

Tests Project

In order to keep things consistent with the frameworks targeted by the rest of the projects, I set up a unit test project using the xUnit template for .NET Core. Then I added a class to host the tests for the Clients controller and added references to the SampleApi and SampleApi.Data projects. I then created two tests, both of which are similar in the sense that they test the same scenario; however, the setup for each is different. The first one tests the controller by instantiating a data context that uses the In-Memory provider for Entity Framework. This makes it simpler to cover scenarios where you need to interact with a database but do not want or need to spin up a full database, you know, as in a unit test. :)
This is what the code for that test looks like:
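Roughly like this, at least in sketch form (the Client properties and test names are illustrative, and depending on the provider version UseInMemoryDatabase may or may not take a database name):

```csharp
using System.Linq;
using Microsoft.EntityFrameworkCore;
using SampleApi.Controllers;
using SampleApi.Data;
using Xunit;

public class ClientsControllerTests
{
    [Fact]
    public void GetClients_ReturnsAllClients_WithEfInMemoryProvider()
    {
        // Options that point the data context at the EF Core In-Memory provider.
        var inMemoryDataContextOptions = new DbContextOptionsBuilder<SampleDataContext>()
            .UseInMemoryDatabase("ClientsTestDb")
            .Options;

        // Seed a couple of test clients into the in-memory store.
        using (var context = new SampleDataContext(inMemoryDataContextOptions))
        {
            context.Clients.Add(new Client { Name = "First Client" });
            context.Clients.Add(new Client { Name = "Second Client" });
            context.SaveChanges();
        }

        // Exercise the controller against a fresh context bound to the same store.
        using (var context = new SampleDataContext(inMemoryDataContextOptions))
        {
            var controller = new ClientsController(context);

            var clients = controller.GetClients();

            Assert.Equal(2, clients.Count());
        }
    }
}
```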

Notice the call to the UseInMemoryDatabase extension method; it is this call that produces the inMemoryDataContextOptions object, which is then used to create a test client that subsequently aids in verifying the controller's results.

The other test covers the same scenario as the previous one, except that this time the setup uses an in-memory SQLite connection to handle any database calls needed during the test:
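Something along these lines (again a sketch; the SqliteConnection comes from the Microsoft.Data.Sqlite package, and unlike the EF In-Memory provider, SQLite needs the schema created explicitly before the test can use it):

```csharp
using System.Linq;
using Microsoft.Data.Sqlite;
using Microsoft.EntityFrameworkCore;
using SampleApi.Controllers;
using SampleApi.Data;
using Xunit;

public class ClientsControllerSqliteTests
{
    [Fact]
    public void GetClients_ReturnsAllClients_WithSqliteInMemory()
    {
        // A SQLite in-memory database only lives while this connection stays open.
        using (var connection = new SqliteConnection("Data Source=:memory:"))
        {
            connection.Open();

            var options = new DbContextOptionsBuilder<SampleDataContext>()
                .UseSqlite(connection)
                .Options;

            // Create the schema and seed a test client.
            using (var context = new SampleDataContext(options))
            {
                context.Database.EnsureCreated();
                context.Clients.Add(new Client { Name = "First Client" });
                context.SaveChanges();
            }

            // Run the controller against the same open connection.
            using (var context = new SampleDataContext(options))
            {
                var controller = new ClientsController(context);

                Assert.Single(controller.GetClients());
            }
        }
    }
}
```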

And that is how you would test an ASP.NET Core controller that uses an Entity Framework data access component. If you would like to know more about testing with EF Core in-memory providers, see the official docs at: https://docs.microsoft.com/en-us/ef/core/miscellaneous/testing/

Running The Tests

Now that we have the tests ready, we can simply run them from Test Explorer:

Running Test Explorer

Notice that the EF In-Memory test is faster than the SQLite in-memory test, but in reality they are both much faster than using a real database. And when it comes to unit testing, speed matters, because you want quick feedback on whether your code is working or not. This is especially true if you practice TDD, where you write a test and then the code to make the test pass. In any case, using the in-memory database providers should decrease the running time of your unit tests, whether you use a TDD approach or write your tests after the implementation.

With all that said, I recommend reserving the in-memory approach for actual unit testing, as opposed to integration testing, where your tests would benefit from interacting with a real database. So if you are testing the logic in a component, and that component happens to use a SQL database to read or update some data as part of its work, then using an in-memory database will reduce your testing friction and speed up your tests.

A Few Observations

The sample code should be ready to run the two included tests without further configuration. However, I noticed that when I first restored the NuGet packages for the solution from the command line (dotnet restore), I received an error in the output window saying that some of the library projects were not compatible with the version of .NET Core App targeted by the solution:

Failed NuGet Restore Message

However, after restoring the packages I compiled the solution from the command line (dotnet build) and everything built just fine:

Build Success

So just be aware that .NET Core is still new and you may encounter a few bumps in the road here and there, so perhaps don't put this out in your production environment just yet. However, this should not stop you from researching and playing with this technology to see what is possible.

Finally, here is a link to the source code, have fun with it:
Sample ASP.NET Core Entity Framework Test with In-Memory Data Providers

Update: I updated the project file in order to fix the compiling error, the updated code is published on GitHub.