DACFx 3.0: The new file formats

In the last post, I talked about the differences between DACFx 3.0 and previous versions from an API point of view. This time, I'll look at the files in a DACPAC and see how those differ.

A DACPAC really stores a database model, in an XML format, that can be used to recreate the database and server (3.0 only) objects in the model. Note also that I've seen it written that a DACPAC is a simple zip file and that you can unzip it to produce the same files as the "Unpack" Windows Explorer shell extension (i.e., right-click on a DACPAC on a machine where DAC is installed and you'll see an "Unpack" context menu entry). This is not exactly true, as unzipping a V1.1 DACPAC on my system produces the following files:
 [Content_Types].xml

And using the Unpack extension produces:

Note that the "unpacker" uses the object definitions in the XML files to generate a SQL script for the model. To stay on the V1.1 DACPAC for a minute, if the DACPAC has been produced using the VS2010 DAC Project, additional files may be present. These include:

Pre- and post-deployment scripts are intuitively named. TargetSelection refers to the file the VS2010 DAC Project produces if you use a SQL target selection policy. With this project type, you can specify which servers this DAC should be installed on (editable using the ServerSelection.sqlpolicy node in the Solution Explorer for the project). Choices include Collation, IsCaseSensitive, OSVersion, and more. Note: TargetSelection seems to be missing from the SSDT projects.
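Since a DACPAC of any version is a zip container underneath, a quick way to see what's actually stored in one (as opposed to what the "unpacker" generates) is to list its entries directly. A minimal sketch; the `.dacpac` file name in the usage comment is hypothetical:

```python
import zipfile

def list_dacpac_entries(path):
    """Return the names of the files stored inside a DACPAC zip container."""
    with zipfile.ZipFile(path) as z:
        return z.namelist()

# Hypothetical usage -- point this at any .dacpac on disk:
# for name in list_dacpac_entries("MyDatabase.dacpac"):
#     print(name)
```

Running this against a V1.1 versus a 3.0 package is an easy way to compare the raw file lists discussed in this post.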

With a DACFx 3.0 DACPAC, different files appear. The "unpacker" will produce:

Note that, just as with pre-3.0 versions, no .sql file appears in the unzipped version, so the .sql script is actually produced by the "unpacker". The model is strictly XML-based until unpacked. An interesting aside is that in DACFx 3.0, unpacking is supported by the API, using the Unpack method on the DacPackage or BacPackage classes. These methods support unpacking both 3.0 and pre-3.0 DACPAC/BACPAC formats. The equivalent code in the pre-3.0 unpacker couldn't possibly know about future versions.
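To make the unzip-versus-unpack distinction concrete: the extraction half is plain zip handling, and only the generated .sql script requires the DacFx API (DacPackage.Unpack). Here's a sketch of the extraction half alone, with hypothetical paths:

```python
import zipfile
from pathlib import Path

def extract_dacpac(dacpac_path, out_dir):
    """Extract the raw parts of a DACPAC -- the equivalent of unzipping it.

    Note: unlike DacPackage.Unpack, this produces no .sql script; that
    script is generated by the unpacker from the XML model, not stored
    in the package itself.
    """
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    with zipfile.ZipFile(dacpac_path) as z:
        z.extractall(out)
    return sorted(p.name for p in out.iterdir())
```

The returned list will contain only the XML parts, which is exactly the observation above: the model is strictly XML-based until unpacked.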

So what do these files contain?
 DacMetadata.xml – The name and version of the "application" that this DACPAC represents. Note that in 3.0, an XML namespace is added.
 Origin.xml (3.0 only) – Some metadata about where this DACPAC came from (which version, which tool, etc).
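As an illustration, the name and version in DacMetadata.xml can be read without binding to a specific XML namespace (which, as noted, changed in 3.0) by matching on local element names. This is a sketch under the assumption that the file carries child elements whose local names are Name and Version; the exact schema may differ on your system:

```python
import xml.etree.ElementTree as ET

def read_dac_metadata(xml_text):
    """Extract name/version from DacMetadata.xml, ignoring XML namespaces.

    Assumes elements with local names 'Name' and 'Version' exist somewhere
    in the document; element names are an assumption, not a documented schema.
    """
    root = ET.fromstring(xml_text)
    result = {}
    for elem in root.iter():
        local = elem.tag.rsplit('}', 1)[-1]  # strip any {namespace} prefix
        if local in ("Name", "Version"):
            result[local] = (elem.text or "").strip()
    return result
```

Because the namespace is stripped before matching, the same function works on pre-3.0 files (no namespace) and 3.0 files (namespaced).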

So the difference really amounts to the distinction between LogicalObjectStream.xml and PhysicalObjectStream.xml (pre-3.0) and model.xml (3.0). These files contain an XML model of a database. It appears that, in DACFx 3.0, the two XML files were consolidated into one, and all the additional database and server objects that DACFx 3.0 represents were added.
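One way to get a feel for how much broader the 3.0 model is would be to tally the object types it contains. The sketch below assumes the model stores each object as an Element node carrying a Type attribute, which matches the model.xml files I've inspected; treat the node and attribute names as assumptions:

```python
import xml.etree.ElementTree as ET
from collections import Counter

def count_model_elements(xml_text):
    """Tally object types in a DACFx model file, ignoring XML namespaces.

    Assumes each modeled object is an <Element Type="..."> node; this
    reflects inspected files and is not a documented schema guarantee.
    """
    root = ET.fromstring(xml_text)
    counts = Counter()
    for elem in root.iter():
        if elem.tag.rsplit('}', 1)[-1] == "Element" and "Type" in elem.attrib:
            counts[elem.attrib["Type"]] += 1
    return counts
```

Running this over a pre-3.0 object-stream file and a 3.0 model.xml for the same database would show the consolidation, and the new server-level object types, at a glance.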

Finally, let's take a look at the model files. Pre-3.0 DACPACs refer to two XML schemas, in both the "Logical" and "Physical" model files. These are:

I was not able to find a current version of the XSD files at the location implied by the namespace name (that's not a requirement of the XML schema spec, but sometimes it works), but I was able to find an older version of the ManagementModel schema on the schemas.microsoft.com website.

In DACFx 3.0, there is reference to a single XML schema in the model.xml and DacMetadata.xml files:

This schema appears at the location that matches its namespace name. Two additional XML schemas related to DACFx 3.0 also exist. They are:

So, we've just taken a little spin through the items in pre-3.0 and 3.0 DACPACs. The XML files and the information they contain have evolved greatly in DACFx 3.0. This is likely due to the integration of SQL Server Database Projects from VS2010 and the DAC Project from VS2010 SP1 into a single project type and format, that of SQL Server Data Tools, which subsumes both. As for the concept of using DAC to manage "deployment target" systems (i.e., the TargetSelection.xml file), maybe it will return one day, depending on its utility and acceptance for database/application management.

