I’ve just started a new open source project you might be interested in if you use both TDD and Entity Framework 6.1.0.
It’s called Blink, and you can get hold of the pre-release binaries on NuGet and the source on GitHub.
Here’s the problem it tries to solve. (If you have this problem, let me know!)
When performing automated testing, it can be very expensive to initialize a fresh, real database. So expensive that you avoid testing against the real database at all costs. For example, the project that inspired me to start this library takes about a minute to build its database; that’s fine in a deployment scenario, but intolerable if you want to write tens or hundreds of integration tests. Blink re-initialises the DB in ~3s. That’s fast enough for TDD, if you’re careful about which tests you run.
It’s a very young project – so young, in fact, that it’s not really designed to be used on other people’s machines quite yet. There are some hard-coded strings that need replacing before it’ll work on anything other than a default instance of SQL Server 2012 x64. That’ll come soon, though.
This blog post is more of an announcement than anything else. If you’re interested, get in touch via the comments and let me know if the project looks useful to you. We’ll see if we can’t make something good.
Here’s roughly what the code looks like:
// Create a new BlinkDBFactory, maybe inside [TestInitialize] or [SetUp]
var factory = Blink.BlinkDB.CreateDbFactory<TestDbContext, TestDbConfiguration>(
    BlinkDBCreationMode.UseDBIfItAlreadyExists,
    () => new TestDbContext());

// Execute code inside a transaction
factory.ExecuteDbCode(context =>
{
    // use the context here
});

// db edits are rolled back automatically
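To make that a bit more concrete, here’s a sketch of what a complete test might look like with MSTest. The Customer entity and the test names are made up purely for illustration; TestDbContext and TestDbConfiguration are the same types as in the snippet above.

using System.Linq;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class CustomerTests
{
    [TestMethod]
    public void Adding_a_customer_is_visible_inside_the_test_transaction()
    {
        var factory = Blink.BlinkDB.CreateDbFactory<TestDbContext, TestDbConfiguration>(
            BlinkDBCreationMode.UseDBIfItAlreadyExists,
            () => new TestDbContext());

        factory.ExecuteDbCode(context =>
        {
            // Act against the real database...
            context.Customers.Add(new Customer { Name = "Alice" });
            context.SaveChanges();

            // ...and assert inside the same transaction.
            Assert.AreEqual(1, context.Customers.Count(c => c.Name == "Alice"));
        });

        // By the time ExecuteDbCode returns, the insert has been rolled back,
        // so the next test starts from the same clean state.
    }
}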
Really like your idea of running everything within a transaction. I’m currently using Telerik JustMock for mocking some static instances in a CMS I’m using, and it has the idea that it’ll mock for the context of your test (be it a method, a class initialize call, etc.).
It’d be interesting to do something similar: automatically wrap method calls in a transaction block, or wrap class init calls in a transaction for the entire test class.
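Something like this, perhaps – a rough sketch (nothing to do with Blink’s internals) of wrapping every test method in an ambient TransactionScope via a base class:

using System.Transactions;
using Microsoft.VisualStudio.TestTools.UnitTesting;

public abstract class RollbackTestBase
{
    private TransactionScope scope;

    [TestInitialize]
    public void OpenTransaction()
    {
        // Every test method in a derived class runs inside an ambient transaction.
        scope = new TransactionScope();
    }

    [TestCleanup]
    public void RollbackTransaction()
    {
        // Dispose without calling Complete(), so anything the test wrote
        // to the database is rolled back.
        scope.Dispose();
    }
}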
The transaction approach can be really fast, actually. I’ve just been developing with it, and tests that previously took 60s now take 2s. I’m so much more confident about the code now!
I wonder if you could write a custom test executor; here’s an MSDN blog post showing how to do custom invocations of test methods – http://blogs.msdn.com/b/vstsqualitytools/archive/2009/09/04/extending-the-visual-studio-unit-test-type-part-1.aspx – I bet you could use it to pass a custom parameter to the test method.
I might look into that for Blink!
An alternative is to point your SQL Server at a RAM disk for its data and log (MDF/LDF) files, and use simple recovery plus TRUNCATE TABLE statements at the end of the test (if possible).
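For the second half of that (simple recovery plus truncating at the end of a test), a rough sketch using EF6’s Database.ExecuteSqlCommand might look like the following – the database and table names are placeholders, and TRUNCATE TABLE won’t work on tables referenced by foreign keys (you’d have to fall back to DELETE there):

using System.Data.Entity;

public static class TestDbCleanup
{
    // One-off setup: put the test database into SIMPLE recovery so the
    // transaction log stays small across test runs.
    public static void UseSimpleRecovery(DbContext context)
    {
        context.Database.ExecuteSqlCommand(
            TransactionalBehavior.DoNotEnsureTransaction,
            "ALTER DATABASE [BlinkTestDb] SET RECOVERY SIMPLE");
    }

    // End-of-test cleanup: empty the tables the test touched.
    public static void TruncateTables(DbContext context)
    {
        context.Database.ExecuteSqlCommand("TRUNCATE TABLE dbo.Customers");
        context.Database.ExecuteSqlCommand("TRUNCATE TABLE dbo.Orders");
    }
}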
Q: what happens if your test uses transactions itself? Do they stack? I didn’t think they did.
Yeah, transactions stack, so no worries there.
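Here’s a small sketch of that stacking behaviour with TransactionScope – the Customer entity is a placeholder again. An inner scope created with the default Required option just joins the ambient transaction, so when the outer scope is rolled back the inner work goes with it:

using System.Transactions;

public static class NestedTransactionDemo
{
    public static void Run(TestDbContext context)
    {
        using (var outer = new TransactionScope())
        {
            // Default option is Required, so this joins the outer transaction
            // rather than starting an independent one.
            using (var inner = new TransactionScope())
            {
                context.Customers.Add(new Customer { Name = "Bob" });
                context.SaveChanges();
                inner.Complete(); // only votes to commit; nothing is durable yet
            }

            // The outer scope is disposed without Complete(), so the inner
            // work is rolled back along with everything else.
        }
    }
}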
I’ve looked at RAM drives recently for building from source, and found that it’s hard to strike a balance: using, say, 2GB of an 8GB machine for your source gets a little pokey (the drive runs out of space fairly easily) and you’re much more likely to swap if you run too many apps. But SQL and databases might be a much better fit, actually.
I need to look into recovery models a bit more; I suspect they may be a good bet for something like re-initializing test data. Definitely something to investigate.
Sorry – I was still thinking 16GB of RAM was the norm for a dev machine!
I’m with you on the idea, though – I think mocking a database is a fruitless endeavour, and I just want small tests to run against it FAST, without shared state.