Looking at Data-Tier Applications (DAC)? Look at version 1.1

About a week or so ago, I read a request for information about a SQL Server database management feature known as Data-Tier Applications, abbreviated as DAC (apparently someone realized that the DTA abbreviation was already "taken" in SQL Server, by the Database Tuning Advisor, but not that DAC was too, by the Dedicated Admin Connection). Data-Tier Applications was introduced as part of Visual Studio 2010 and is supported in SQL Server Management Studio for SQL Server 2008 R2. It is a fairly (understatement here) controversial feature that produces a file (.dacpac) in an XML format encapsulating descriptions of database objects, suitable for deployment.

The .dacpac could be thought of as the ".msi file" of database deployment. The basic premise is that you can produce a .dacpac from an existing database or create one with Visual Studio. Then you can point at the instance node in SSMS's Object Explorer pane and choose "Deploy Data-Tier Application". The underlying infrastructure produces and runs the DDL to set up the database and its database objects. You can also automate all the use cases with PowerShell, as the feature is based on SMO.
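As a rough illustration of the PowerShell angle, here's a minimal sketch of a scripted deployment using the DAC managed classes that ship with SQL Server 2008 R2 (Microsoft.SqlServer.Management.Dac). The server name, file path, and database name are placeholders, and the exact constructor and method shapes may differ slightly between DAC versions, so treat this as a sketch rather than a drop-in script:

```powershell
# Load the DAC client library (ships with SSMS 2008 R2; assembly version may vary by build)
Add-Type -AssemblyName "Microsoft.SqlServer.Management.Dac, Version=10.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91"

# Connect to the target instance ("localhost" is a placeholder)
$conn     = New-Object Microsoft.SqlServer.Management.Common.ServerConnection("localhost")
$dacStore = New-Object Microsoft.SqlServer.Management.Dac.DacStore($conn)

# Load a .dacpac produced from an existing database or built in Visual Studio
$stream  = [System.IO.File]::OpenRead("C:\dacpacs\MyApp.dacpac")
$dacType = [Microsoft.SqlServer.Management.Dac.DacType]::Load($stream)
$stream.Close()

# Deploy: this generates and runs the DDL, just like the SSMS wizard does
$deployProps = New-Object Microsoft.SqlServer.Management.Dac.DatabaseDeploymentProperties($conn, "MyAppDB")
$dacStore.Install($dacType, $deployProps)
```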

DAC-deployed databases keep deployment history information in msdb (in master if you're using SQL Azure, as Azure has no msdb), and they also integrate with SQL Server 2008 R2's Utility Control Point feature.
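If you want to look at that registration and history information yourself, it surfaces through a catalog view in msdb. A quick peek from PowerShell might look like the sketch below; I'm quoting the view and column names (msdb.dbo.sysdac_instances, and so on) from memory, so verify them against your build:

```powershell
# Invoke-Sqlcmd ships with the sqlps module/snap-in in SQL Server 2008 R2
Invoke-Sqlcmd -ServerInstance "localhost" -Query @"
SELECT instance_name, type_name, type_version, date_created
FROM   msdb.dbo.sysdac_instances;
"@
```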

Most articles (well, all of the ones I've seen) cover version 1.0 of the DAC feature, which has the following limitations:
1. It didn't support all database objects, not even all of the objects that SQL Azure Database supports.
2. It didn't address server-level objects (except for logins), like SQL Agent jobs, at all.
3. The "upgrade a database" action didn't upgrade in place. Instead, it made a copy of the database, set up the new DDL, and copied the data in the existing tables "the best it could", then renamed the current database to old and the new database to current. The "best it could" part could result in data loss, and reading the docs (and the runtime warnings) told you where this could occur. You would (naturally) need to back up your database after a DAC "upgrade" (see the PowerShell sketch after this list), as your transaction log recovery chain (and anything else in the old database) would be renamed away. The old database is kept around, though, for those instances where you need to (and are able to) scavenge from it.
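On that backup point, here's a minimal sketch of taking a fresh full backup right after a DAC upgrade, using the SMO Backup class (the instance name, database name, and file path are placeholders):

```powershell
# Load SMO (ships with the SQL Server 2008 R2 client tools)
Add-Type -AssemblyName "Microsoft.SqlServer.Smo, Version=10.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91"

$server = New-Object Microsoft.SqlServer.Management.Smo.Server("localhost")

# Full database backup of the newly upgraded database, starting a new recovery chain
$backup = New-Object Microsoft.SqlServer.Management.Smo.Backup
$backup.Action   = [Microsoft.SqlServer.Management.Smo.BackupActionType]::Database
$backup.Database = "MyAppDB"
$backup.Devices.AddDevice("C:\backups\MyAppDB_postupgrade.bak",
    [Microsoft.SqlServer.Management.Smo.DeviceType]::File)
$backup.SqlBackup($server)
```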

Folks correctly peg this as a SQL Azure Database-centric (but not SQL Azure-only) feature. You can support other objects with manual pre- and post-deployment scripts. Moving just a database and logins directly relates to the SQL Azure product structure. But there is an upcoming Denali feature, Contained Databases, that also follows the concept that everything should be contained inside the database, to simplify moving a database from one SQL Server instance to another. This would be another place where DAC will likely come into play.

Back to my original premise… since the original DAC announcement and the general upset in the SQL Server DBA community, the folks on the DAC team have listened to your "suggestions" (not the "off with their heads" ones, just the constructive ones). DAC has already released a 1.1 version, and a 2.0 beta version is also available. Because this posting is getting a bit long, I'll cover them in detail in the next entry. But for now, realize that if you're reading an article about DAC, it's most likely about DAC 1.0. And things have changed.
You can get version 1.1 here. It fixes the "upgrade" scenario, point number 3 in the list above.

@bobbeauch
