Tuesday, June 9, 2020

How to test an Entity Framework Core database as InMemory with insert functionality for a large number of rows

In one of our projects, we have a database with lots of tables and thousands of rows. For our integration tests, we want our tests to run against a fixed database state, e.g., 5000 rows, so that the tests are deterministic and always return the same result.

We've worked with small Entity Framework Core in-memory databases before, where we added around 20 rows like this:

private DbContextOptions<OurDataContext> GetInMemoryDbContextOptions()
{
    var options = new DbContextOptionsBuilder<OurDataContext>()
        .UseInMemoryDatabase(databaseName: "foo")
        .Options;

    // Seed the in-memory store named "foo"; any context created with
    // these options afterwards sees the same data.
    using (var context = new OurDataContext(options))
    {
        context.OurTable.Add(new OurTable(){...});
        // ...
        context.SaveChanges();
    }
    return options;
}
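A test can then attach a fresh context to the same named store. A minimal sketch of such a test, assuming xUnit and that the helper above is reachable from the test class (the test name and assertion are illustrative):

```csharp
using System.Linq;
using Xunit;

public class OurDataContextTests
{
    [Fact]
    public void SeededStore_ContainsRows()
    {
        var options = GetInMemoryDbContextOptions();

        // A second context built from the same options attaches to the
        // in-memory store named "foo", so the seeded rows are visible here.
        using (var context = new OurDataContext(options))
        {
            Assert.True(context.OurTable.Any());
        }
    }
}
```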

However, in our new case, that approach isn't feasible for thousands of rows that should be derived from the production database. We need a good way to sync real data into our in-memory database. How can this be done?
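One direction we're considering, sketched here under our own assumptions: export an anonymized snapshot of the relevant tables once into a JSON file tracked in Git, and deserialize it during test setup. The file path and the `SeedFromSnapshot` name are placeholders of ours, not an existing API:

```csharp
using System.Collections.Generic;
using System.IO;
using System.Text.Json;

// Sketch: seed the in-memory store from a JSON snapshot committed to Git.
// The path "testdata/ourtable-snapshot.json" is a placeholder.
private void SeedFromSnapshot(DbContextOptions<OurDataContext> options)
{
    var json = File.ReadAllText("testdata/ourtable-snapshot.json");
    var rows = JsonSerializer.Deserialize<List<OurTable>>(json);

    using (var context = new OurDataContext(options))
    {
        context.OurTable.AddRange(rows);
        context.SaveChanges();
    }
}
```

The snapshot would be regenerated (and anonymized) only on demand, never as part of a normal test run.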

Ideally, we would export relevant parts of our production database into an SQL script with SQL Server Management Studio and track this SQL export in Git as part of our test code. As far as we can see, though, there is no way to import data from an SQL script, because the InMemory provider can't execute SQL scripts at all.

https://docs.microsoft.com/en-us/ef/core/miscellaneous/testing/ sums up the available options pretty well, but I still don't know what a solution would look like in our case, since we need

  • The ability to anonymize data from a production database into a state that we can store in Git. This sync should only happen when we explicitly trigger it, not on every test run. We can't manually write inserts for thousands of rows.
  • A way to reset the database to that stored state before each test run.
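For the reset requirement, one mechanism the InMemory provider does offer is `EnsureDeleted()`, which drops the named store; re-running the seeding afterwards restores the fixed state. A minimal sketch:

```csharp
// Sketch: drop the named in-memory store before a test run,
// then reseed it to get back to the committed state.
using (var context = new OurDataContext(options))
{
    context.Database.EnsureDeleted();
}
// ...reseed from the stored snapshot here...
```

Alternatively, passing a unique name to UseInMemoryDatabase (e.g. Guid.NewGuid().ToString()) gives every test an empty store to seed from scratch.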

Which in-memory database approach should we choose?
