LINQ-to-SQL: the multi-tier story

If your web application is even slightly bigger than the typical MSDN example, you’re almost certainly using some sort of multi-tier architecture. Unfortunately, as Dinesh Kulkarni (Microsoft LINQ Program Manager) explains on his blog, LINQ-to-SQL version 1 has no out-of-the-box multi-tier story.

So, if you’re going to use it, you’re going to have to get imaginative and write your own story.

What we already do

The typical multi-tier web application handles requests like this:

[Diagram: a request passes from the UI layer down through the service layer and data access layer to the database]

It’s a simplified view but covers the basic point, which is that to fulfil a request, the top layer makes multiple calls into the lower layers.

Using LINQ-to-SQL

Simple example: the user wants to rename a “Customer” entity. You want to do something like this:

void RenameButton_Click(object sender, EventArgs e)
{
   Customer myCustomer = ServiceLayer.CustomerEngine.GetCustomer(_customerID);
   myCustomer.Name = NameTextbox.Text;
   ServiceLayer.CustomerEngine.SaveCustomer(myCustomer);
}

Easy – you load the record, update it, then send it back to be saved. We made two calls to the service layer.

And that brings us to… the problem

Where do DataContexts come in? With LINQ to SQL, you’re always querying or saving changes on a DataContext, which is responsible for two things:

  • Managing connections to the database
  • Tracking changes on entities it created, so it knows which ones to write back to the database

Who creates the DataContext, and what is its lifespan?

You certainly can’t create and destroy a new DataContext within each service layer call, even though most trivial examples you’ll see do exactly that. You can’t do that because our Customer object above needs to remain attached to the same DataContext during its entire lifetime. If we created a new DataContext during the SaveCustomer() call, the new context wouldn’t know what changes had been made to the customer, so wouldn’t write anything to the database.
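To make the failure concrete, here is a sketch of that naive per-call pattern (hypothetical method bodies; CustomerID is an assumed column name):

```csharp
// Anti-pattern sketch: a fresh DataContext per service call.
public static Customer GetCustomer(int customerID)
{
    using (var dc = new MyDomainDataContext(CONNECTION_STRING))
    {
        // The entity becomes detached as soon as dc is disposed
        return dc.Customers.Single(c => c.CustomerID == customerID);
    }
}

public static void SaveCustomer(Customer customer)
{
    using (var dc = new MyDomainDataContext(CONNECTION_STRING))
    {
        // This new context never saw 'customer', so it has tracked no
        // changes for it - SubmitChanges() writes nothing to the database
        dc.SubmitChanges();
    }
}
```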

Somehow, we have to manage the lifespan of a DataContext across multiple service layer calls. This is the “multi-tier story” we’ve been missing. Here are the three most obvious mechanisms I can think of.

Story one: DataContext creation in UI layer

You can, if you want, create and destroy the DataContext objects in your ASPX-code-behind event handlers, passing the context into each service layer call. This allows the DataContext to span multiple calls, as follows:

void RenameButton_Click(object sender, EventArgs e)
{
   MyDomainDataContext dataContext = new MyDomainDataContext(CONNECTION_STRING);
   Customer myCustomer = ServiceLayer.CustomerEngine.GetCustomer(dataContext, _customerID);
   myCustomer.Name = NameTextbox.Text;
   ServiceLayer.CustomerEngine.SaveCustomer(dataContext, myCustomer);
}

Advantages of this approach:

  • Fine-grained control over DataContext lifespan and reuse

Disadvantages of this approach:

  • DataContext is exposed to the UI layer (unless you wrap it in some sort of handle object), which encourages lazy coding that blurs the separation of concerns
  • Code smell throughout your application – any chain of calls through the service layers involves a “context” parameter passed down through each layer
  • Tedious work of creating DataContexts all the time
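To illustrate the “context parameter” smell, the service methods from the earlier example might look something like this (a sketch; CustomerID is an assumed column name):

```csharp
public static class CustomerEngine
{
    // Every service method takes the caller's DataContext as a parameter
    public static Customer GetCustomer(MyDomainDataContext dataContext, int customerID)
    {
        return dataContext.Customers.Single(c => c.CustomerID == customerID);
    }

    public static void SaveCustomer(MyDomainDataContext dataContext, Customer customer)
    {
        // The same context tracked the changes, so it knows what to write
        dataContext.SubmitChanges();
    }
}
```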

Story two: DataContext per application

The opposite extreme is having a single, static DataContext shared across your entire application. You may instantiate it either on application start, or lazily, and your service classes may simply access it whenever they need database access.

Do not do this, because:

  • DataContext isn’t thread-safe, as far as I know
  • You lose isolation and cannot control when SubmitChanges() is called – concurrent requests will interfere with one another
  • Memory leaks are pretty likely

Story three: DataContext per unit-of-work (i.e. per request)

The classic solution to the problem is slightly trickier than the others, but achieves – almost – the best of all worlds. This is the approach typically advocated when using other object-relational mappers, like NHibernate.

If each HTTP request has access to its own private DataContext object which lives for the duration of the request, you can expose it to the whole data access layer (and not to the UI layer), knowing that related calls will use the same DataContext object and thus keep track of object changes properly. Also, you don’t get unwanted interaction between concurrent requests.

But where would you store such an object? The natural place is HttpContext.Current.Items, an IDictionary whose lifespan equals the lifespan of the request and which is private to that request. You could, therefore, set up a static helper property available to your whole service layer:

internal static class MyDomainDataContextHelper
{
    public static MyDomainDataContext CurrentContext
    {
        get
        {
            if (HttpContext.Current.Items["MyDomainDataContext"] == null)
            {
                HttpContext.Current.Items["MyDomainDataContext"]
                    = new MyDomainDataContext(ConfigurationSettings.AppSettings["connectionString"]);
            }
            return (MyDomainDataContext)HttpContext.Current.Items["MyDomainDataContext"];
        }
    }
}

Now, querying the database is as simple as:

var query = from c in MyDomainDataContextHelper.CurrentContext.Customers where // ...etc

Since the DataContext is available statically, there’s no need to construct new instances or pass them around from the UI layer downwards. The UI layer doesn’t even need to know that such a thing exists. How nice!
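For instance, a complete version of such a query might look like this (the where clause and field names are illustrative):

```csharp
// Illustrative query through the static helper - no DataContext plumbing
// is needed in the calling code. Field names are assumptions.
var customers = from c in MyDomainDataContextHelper.CurrentContext.Customers
                where c.Name.StartsWith("A")
                orderby c.Name
                select c;
```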

Stop! That’s still not good enough

The simple implementation of story three leads to a serious limitation. The service layer is now coupled to HttpContext.Current, which is not a good idea – firstly because you ought to be striving to make your service layers independent of the UI platform (in this case, the web), and secondly because it’s going to break in some cases: HttpContext.Current won’t be available to your unit test runner, for instance.

Fortunately we can fix this with a trivial implementation of Inversion of Control. Let’s define an abstract notion of a “unit of work datastore” in our service layer.

namespace ServiceLayer
{
    public interface IUnitOfWorkDataStore
    {
        object this[string key] { get; set; }
    }

    public static class UnitOfWorkHelper
    {
        public static IUnitOfWorkDataStore CurrentDataStore;
    }
}

Now, when our application starts, we can map this datastore to HttpContext.Current.Items, by adding code to Global.asax:

private class HttpContextDataStore : IUnitOfWorkDataStore
{
    public object this[string key]
    {
        get { return HttpContext.Current.Items[key]; }
        set { HttpContext.Current.Items[key] = value; }
    }
}

protected void Application_Start(object sender, EventArgs e)
{
    ServiceLayer.UnitOfWorkHelper.CurrentDataStore = new HttpContextDataStore();
}

… and then the references to System.Web.HttpContext.Current.Items in the service layer can be replaced with UnitOfWorkHelper.CurrentDataStore:

internal static class MyDomainDataContextHelper
{
    public static MyDomainDataContext CurrentContext
    {
        get
        {
            if (UnitOfWorkHelper.CurrentDataStore["MyDomainDataContext"] == null)
            {
                UnitOfWorkHelper.CurrentDataStore["MyDomainDataContext"]
                    = new MyDomainDataContext(ConfigurationSettings.AppSettings["connectionString"]);
            }
            return (MyDomainDataContext)UnitOfWorkHelper.CurrentDataStore["MyDomainDataContext"];
        }
    }
}

And we’re done! The service layer no longer has any dependency on or awareness of System.Web, and our unit tests can supply any IUnitOfWorkDataStore of their own creation by assigning it to ServiceLayer.UnitOfWorkHelper.CurrentDataStore.

If you’re using a full-fledged Inversion of Control container, like Castle Windsor or Spring, you will no doubt register your IUnitOfWorkDataStore with it and access it that way. For the rest of us, this simple implementation works nicely.
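For example, a unit test might supply a trivial in-memory implementation (a sketch, assuming only the IUnitOfWorkDataStore interface defined above):

```csharp
using System.Collections.Generic;

// A minimal in-memory IUnitOfWorkDataStore for unit tests
public class InMemoryDataStore : ServiceLayer.IUnitOfWorkDataStore
{
    private readonly Dictionary<string, object> _items = new Dictionary<string, object>();

    public object this[string key]
    {
        get { object value; _items.TryGetValue(key, out value); return value; }
        set { _items[key] = value; }
    }
}

// In test setup:
// ServiceLayer.UnitOfWorkHelper.CurrentDataStore = new InMemoryDataStore();
```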

So that’s all perfect then

Yes, I thought so for a little while. In the next post, I’ll explain some issues you’re likely to run into when using LINQ to SQL in this way. Problems that don’t happen with NHibernate…

23 Responses to LINQ-to-SQL: the multi-tier story

  1. Pingback: LINQ to SQL: Lazy and Eager Loading Hiccups « codeville

  2. One could accomplish the unit testing more simply than adding IUnitOfWorkDataStore by conditionally compiling the DataContext lookup to use a dictionary rather than HttpContext.Current (which can be reserved for your mainline Debug/Release configurations). You just define a conditional compilation symbol (project properties, Build) for a UNIT_TESTS configuration and reference it like:
    … dictionary code goes here
    … HttpContext code here
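    A sketch of what this might look like (UNIT_TESTS is an assumed compilation symbol):

```csharp
#if UNIT_TESTS
    // Unit test configuration: keep items in a plain static dictionary
    private static readonly System.Collections.Generic.Dictionary<string, object> _items
        = new System.Collections.Generic.Dictionary<string, object>();
    private static System.Collections.IDictionary ContextItems
    {
        get { return _items; }
    }
#else
    // Mainline configuration: use per-request storage
    private static System.Collections.IDictionary ContextItems
    {
        get { return System.Web.HttpContext.Current.Items; }
    }
#endif
```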

  3. Steve

    Hi Jeffrey. That’s an interesting approach – thanks for the idea.

    I do wonder, though, through which component you’d manage the instantiation/lifetime of your dictionary. It doesn’t feel clean to expose it outside the Data layer, but then the data layer would ideally be agnostic to unit testing too. What do you have in mind?

    In principle, I would support your approach to avoid inversion of control if possible, as long as clean separation isn’t compromised.

  4. This is a good approach; unfortunately it doesn’t work with DataLoadOptions, which is important. Do you have any idea about that?

  5. Steve

    James, unfortunately you’re right about DataLoadOptions. I wrote more about it here:

    I don’t yet have a good recommendation about working around this, besides trying to avoid changing fetching strategies on a per-query basis!

    Does anyone reading have more knowledge or good ideas?

  6. Adam Tibi

    Thank you for this nice pattern – as soon as you start working with LINQ you feel that this pattern is missing.
    You have mentioned:
    […]DataContext isn’t thread-safe, as far as I know[…]
    However, you are treating it as thread-safe by exposing it statically. Please correct me if I’m wrong.

  7. Steve

    Hi Adam, that’s a good question. Although I’m exposing a static wrapper around the DataContext concept, it’s not the same instance that gets returned to every thread. The instance is different for each HTTP request, and in ASP.NET, each request is handled by a single thread so by default there are no thread safety dangers here.

    There *would* be a danger if somebody used it with some sort of multithreaded IHttpHandler, or otherwise spun off multiple threads during request processing. For server performance reasons, it’s extremely rare for anyone to do that, and if they did, the whole model ceases to be applicable anyway. No amount of blocking would create the desired thread-safety, since our whole objective here is to reuse the same DataContext object and the threads would by definition interfere with one another.

  8. Steve

    Redlock, thanks for letting me know about that. The code you show is pretty neat.

    I’m a bit unsure about how well that would work with deferred execution. Let’s say I set up two queries before running either of them:

    1: var query1 = MakeQueryWithEagerLoading();
    2: var query2 = MakeQueryWithLazyLoading();
    3: List<Customer> results1 = query1.ToList();
    4: List<Customer> results2 = query2.ToList();

    Isn’t line 2 going to override the DataLoadOptions set up in line 1? This could lead to unexpected dependencies and odd behaviour.

  9. Mikee

    Hi Steve,
    yes, it will overwrite settings. I’m thinking about solution using IDisposable object. something like:

    using (LoadOptions lo = new LoadOptions(dataContext)) {

    // Create and !EXECUTE! your queries here
    return dataContext.Orders.ToList();
    } // LoadOptions will restore original DataLoadOptions

    then you would have to use it in the same manner as TransactionScope – that is, execute your queries inside it, and not pass an IQueryable outside it (to a web page, for example).
    I hope you’ve got my point; it could of course be written to use the current data context implicitly, etc.
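    A sketch of the wrapper Mikee describes (hypothetical; it relies on saving and restoring the DataContext.LoadOptions property):

```csharp
using System;
using System.Data.Linq;

// Hypothetical sketch: restores the original DataLoadOptions on Dispose,
// so eager-loading settings apply only inside the using block
public class LoadOptions : IDisposable
{
    private readonly DataContext _context;
    private readonly DataLoadOptions _originalOptions;

    public LoadOptions(DataContext context)
    {
        _context = context;
        _originalOptions = context.LoadOptions;
        _context.LoadOptions = new DataLoadOptions(); // caller configures as needed
    }

    public void Dispose()
    {
        _context.LoadOptions = _originalOptions;
    }
}
```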

  10. Pingback: LINQ to SQL first impressions - Service Endpoint

  11. I’ve come up with something very similar to your solution, which works like TransactionScope and allows the use of DataLoadOptions. I blogged about it a few months ago. We’re making some improvements at the moment so it works well with WinForms/WPF, but we have several applications using this at the moment. If you’re interested in seeing the full code and some samples let me know. I should get around to updating my post with more complete code.

  12. Hi Steve, thanks for this article, it helped me a lot getting started with this subject. Looking forward to your upcoming book.
    I had a similar issue but came up with a slightly different system. This allows registration of a context and a caching strategy with a Context Provider.

    Thanks again!

  13. Ashraf

    Very good presentation.

  14. You can reattach a linq to sql entity to a new context. See this post on how to do this:

  15. Pingback: My db4o Wishlist | emphess .NET

  16. İsmail SEZEN

    You’re exactly a life saver. Thanks.

  21. R

    I am new to Linq2Sql. Is it possible to get the source code of this tutorial? I get errors when I manually type in all the code.


  22. priyanksster

    Hi Steve,

    Thanks for this wonderful article. Just wondering, did you ever get a chance to write your follow-up article on this topic, i.e. the problems which don’t happen with NHibernate?