With the introduction of SQL Express LocalDB (part of SQL Server 2012, a.k.a. "Denali"), the Microsoft SQL database options now available are:

  • Regular SQL Server SKUs
  • SQL Server Express
  • SQL Server Compact Edition (CE)
  • SQL Express LocalDB

LocalDB is a lightweight version of SQL Express with zero administration and a small installation footprint. Strictly speaking, LocalDB is not a new SQL Server edition in itself; rather, it is a database engine activation mode in which the SQL Server process (sqlservr.exe) runs as a child of the calling process without exposing any connectivity surface. The engine runs as a private named instance for as long as the parent process runs. The instance name can be anything, chosen at creation time. Once started, other applications cannot connect to the instance unless its owner explicitly shares it. MSDN notes that the user instance feature of the Express editions is deprecated and that developers should switch to LocalDB going forward.

LocalDB also has an automatic instance, which is created and started automatically and is public: any application can connect to it and perform database operations. Technically, this automatic instance is also a named instance, with a special name that reflects the engine version; for SQL Server 2012, it is v11.0. Hence, the automatic LocalDB instance is identified as (localdb)\v11.0. Any number of private instances can be spun up from a single set of installed LocalDB bits.

From a programming perspective, the API surface remains the same, be it managed or native. In order to connect to a LocalDB instance, the Data Source keyword of the connection string should start with (localdb) followed by the instance name. Example:

SqlConnection con = new SqlConnection (@"Data Source=(localdb)\v11.0;Integrated Security=true;Initial Catalog=master");

Here is a simple piece of sample code:

LocalDB Sample C# Code
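The sample boils down to something like the following minimal sketch; the query against master is my own illustration, not the exact code from the screenshot:

```csharp
using System;
using System.Data.SqlClient;

class LocalDbSample
{
    static void Main()
    {
        // Connect to the automatic LocalDB instance (localdb)\v11.0.
        using (var con = new SqlConnection(
            @"Data Source=(localdb)\v11.0;Integrated Security=true;Initial Catalog=master"))
        {
            con.Open();

            // Any query works once connected; @@VERSION confirms the engine build.
            using (var cmd = new SqlCommand("SELECT @@VERSION", con))
                Console.WriteLine(cmd.ExecuteScalar());
        }
    }
}
```

Note that the first connection may take noticeably longer, since the engine is loaded on demand by the calling process.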

As with user instances, the AttachDbFileName keyword is also supported for connecting directly to a database file. The output from a SQL Server 2012 RC0 instance for the above code is shown below:

LocalDB Sample Output
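A direct-attach connection can be sketched like this; the file path and database file name are hypothetical:

```csharp
using System.Data.SqlClient;

// Attach a database file directly to the LocalDB instance.
// The .mdf path below is illustrative; substitute your own file.
var con = new SqlConnection(
    @"Data Source=(localdb)\v11.0;Integrated Security=true;" +
    @"AttachDbFileName=C:\Data\MyApp.mdf");
```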

Note the following about LocalDB:

  • SQL Express edition is upgradable to higher SKUs, but LocalDB is not. LocalDB is intended for developers who want a database engine without the overhead of a regular SQL Server installation.
  • All the database size limitations of Express apply to LocalDB as well.
  • LocalDB instances do not use Win32 services; the engine is loaded on demand by the parent process.
  • sqllocaldb.exe is the simple command-line admin companion for LocalDB. It supports creating, starting, stopping, and deleting LocalDB instances (you cannot delete an automatic instance).
  • LocalDB instances are named, user specific and private (except the auto instance).
  • LocalDB stores system database files in the user’s AppData folder (see below). This means that when LocalDB is used from an application running under a non-interactive user account (including IIS), that account should be configured to load its user profile; otherwise, instance startup will fail.

LocalDB System DB Folder

  • There may be a delay when a LocalDB instance is created or started for the first time, which can cause the parent application to time out.
  • Multiple users on a computer can create and start database instances simultaneously using the same installed LocalDB bits. Unlike other SQL Server editions, LocalDB does not require multiple installations for multiple database engine instances.
  • Current (pre-SQL Server 2012) versions of SQL Server Management Studio cannot connect to LocalDB instances using the server name prefix "(localdb)". Instead, they should use the raw named pipe path of the LocalDB instance (this can be retrieved using the sqllocaldb utility):

LocalDB Connectivity from SSMS
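The sqllocaldb commands involved look roughly like this (MyInstance is an illustrative name; `info` prints the instance details, including the instance pipe name while the instance is running):

```
sqllocaldb create MyInstance
sqllocaldb start MyInstance
sqllocaldb info MyInstance
sqllocaldb stop MyInstance
sqllocaldb delete MyInstance
```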

Remember that the part of the named pipe path after # changes with every instance start, so it cannot be hardcoded. You can also use this named pipe path in a connection string from code. The following is a perfectly valid connection string and connects to the specified LocalDB instance:

SqlConnection con = new SqlConnection (@"Data Source=np:\\.\pipe\LOCALDB#B7BB9C55\tsql\query;Integrated Security=true;Initial Catalog=NorthWind");

While working on a SQL Server change notification library (for cache invalidation) recently, my SqlDependency change notification handler kept receiving a notification with SqlNotificationEventArgs.Info = Options, SqlNotificationEventArgs.Source = Statement, and SqlNotificationEventArgs.Type = Subscribe, thus failing the subscription process. Using SqlNotificationEventArgs.Info = Options as the lead, further diagnosis revealed that one of the SET options required for SQL Server query notifications was incorrect. The offending option was ARITHABORT, which was set to OFF (the connection default) but should be ON. The other SET options, however, were correct by default.

The obvious solution is to explicitly turn ARITHABORT on for the connection that will be used for the notification subscription.

Subtle SqlDependency Error

Please note that ARITHABORT must be set before running the SQL query to be monitored; otherwise the subscription will still fail as above.
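The fix can be sketched as follows. This is a minimal illustration, assuming a database with Service Broker enabled; the connection string, table, and query are hypothetical:

```csharp
using System;
using System.Data.SqlClient;

class DependencySketch
{
    static void Main()
    {
        // Hypothetical connection string; adjust for your environment.
        var connStr = @"Data Source=(localdb)\v11.0;Integrated Security=true;Initial Catalog=NorthWind";
        SqlDependency.Start(connStr);

        using (var con = new SqlConnection(connStr))
        {
            con.Open();

            // Turn ARITHABORT on *before* executing the monitored query;
            // the connection default (OFF) makes the subscription fail.
            using (var set = new SqlCommand("SET ARITHABORT ON", con))
                set.ExecuteNonQuery();

            var cmd = new SqlCommand("SELECT OrderID FROM dbo.Orders", con);
            var dep = new SqlDependency(cmd);
            dep.OnChange += (s, e) =>
                Console.WriteLine("Info={0}, Source={1}, Type={2}", e.Info, e.Source, e.Type);

            cmd.ExecuteReader().Close(); // Executing the command registers the subscription.
        }
    }
}
```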

There are many ways to export data to SQL Azure. Plain T-SQL INSERT statements, the bulk copy utility (bcp.exe) of SQL Server 2008 R2, the SQL Azure Migration Wizard, and SQL Server Integration Services (SSIS) are some of them. This weekend I wanted to upload the latest Stack Overflow data dump to a SQL Azure database for testing purposes. As you know, the Stack Overflow data dump is available only in XML format, and exporting XML into SQL Azure (or even SQL Server 2005/2008/R2) is a bit tricky because none of the out-of-the-box utilities supports this directly. Remember, I am not talking about XML data type columns, which store the complete XML as a BLOB. Rather, the attributes of XML elements and child elements contain the column values. For example, here is a simple XML fragment that represents an order and its product line items:
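The fragment was along these lines (an illustrative reconstruction; the element and attribute names follow the description below, and the values are made up):

```xml
<Order OrderID="10248" CustomerID="VINET">
  <Product ProductID="11" Quantity="12" />
  <Product ProductID="42" Quantity="10" />
</Order>
```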

When importing this XML data into SQL Azure, the OrderID and CustomerID attribute values go into their respective columns of the Order table, and ProductID goes into its own column in the OrderItems table. Most importantly, the relationship between the Order and Product elements (entities) should be maintained when importing this XML.

After a bit of investigation I found XML Bulk Load, a COM library from Microsoft that does exactly what I wanted, but I was not sure how it would work with SQL Azure and didn’t spend further time on it (for the moment). I thought I would do the XML data import using the ADO.NET class System.Data.SqlClient.SqlBulkCopy. This approach turned out to be very simple and did the job quite well. With a blank database and tables ready to receive the Stack Overflow data, the few lines of code below import the XML data into the SQL Azure database:

var sbc = new SqlBulkCopy(@"Server=tcp:server_name.database.windows.net;Database=StackOverflow;User ID=user_id@server_name;Password=azure_password;Trusted_Connection=False;Encrypt=True;")
{
    BatchSize = 100,
    DestinationTableName = "Users",
    NotifyAfter = 1000
};

var ds = new DataSet();
sbc.SqlRowsCopied += new SqlRowsCopiedEventHandler((sender, e) => Console.WriteLine(e.RowsCopied + " rows copied."));

try
{
    ds.ReadXml(@"E:\Downloads\Stack Overflow Data Dump – Jun 10\Content\Users.xml");

    // Map each column the DataSet inferred from the XML to the
    // identically named column of the destination table.
    foreach (DataColumn dc in ds.Tables[0].Columns)
        sbc.ColumnMappings.Add(dc.ColumnName, dc.ColumnName);

    sbc.WriteToServer(ds.Tables[0]);
}
catch (Exception xcp)
{
    Console.WriteLine(xcp.Message);
}

As you can see, it’s not a lot of code but it did the job.

One big challenge in the whole process was that some of the XML data dump files were too large (above 2 GB), and the ADO.NET DataSet declined to read such files to infer the XML schema. The result was an ‘out of memory’ exception – obvious! My work-around was to split the large XML files into sizes the DataSet could read and proceed with the same code above (I used the Windows ports of the famous UNIX shell commands head and tail).