DACFx 3.0 – Import and Export with BACPACs

The last piece of the DAC puzzle (at least for now) is import and export. Export makes a non-transactionally-consistent copy of database content in the BACPAC format (schema and data), and Import creates a new database from the BACPAC's schema and data. Currently, it's the most-used method of backing up and restoring a SQL Azure database. To make a transactionally consistent backup of a SQL Azure database without using third-party tools, you must use "CREATE DATABASE … AS COPY OF" to create a transactionally consistent copy of the database, then export the copy to a BACPAC.

Import and export can be accomplished by using the API directly. In DACFx 3.0, the API contains three overloads of the ExportBacpac method in the DacServices class, as well as an ImportBacpac method. If you don't want to program it yourself, Import and Export functionality is provided in SSMS 2012. There is also an Import/Export Web Service, but this currently differs from the SSMS 2012 Import/Export. The SQL Server Data Tools (SSDT) and SqlPackage.exe don't deal with BACPACs; they emit and consume only schema information, via the DACPAC format.
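As a rough sketch of what calling the API directly looks like (the connection string, database names, and file paths below are placeholders, and this assumes a reference to the DACFx 3.0 client assembly), an export followed by an import might be coded like this:

```csharp
using Microsoft.SqlServer.Dac;   // DACFx 3.0 client library

class BacpacDemo
{
    static void Main()
    {
        // Placeholder connection string -- adjust for your environment.
        var services = new DacServices(
            @"Data Source=(local);Integrated Security=true");

        // Export: write schema and data from an existing database to a BACPAC file.
        services.ExportBacpac(@"C:\temp\AdventureWorks.bacpac", "AdventureWorks");

        // Import: create a new database from the BACPAC.
        using (var package = BacPackage.Load(@"C:\temp\AdventureWorks.bacpac"))
        {
            services.ImportBacpac(package, "AdventureWorksCopy");
        }
    }
}
```

Note that ImportBacpac takes a BacPackage object rather than a file name, so you load the BACPAC first; Export writes directly to a file path.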

During the development of SQL Server 2012, SSMS supported DAC 2.0. This version used a compressed JSON format for the BACPAC's data, along with the schema files from the 2.0 DACPAC, an extension of the 1.0/1.1 DACPAC. Just as with the DACPAC, DACFx 3.0 can consume 2.0 BACPACs, but DAC 2.0 cannot consume 3.0 BACPACs. No production version of SSMS or of the VS2010 database projects supports DAC 2.0.

The Import/Export Web Service is a REST-based web service that runs at every Windows Azure data center and uses the BACPAC format. It currently (as of 5/22/2012) uses the BACPAC 2.0 format, not the DACFx 3.0 BACPAC format. The web service requires that your BACPAC input/output be network-addressable; therefore it uses Windows Azure storage. You can use the web service directly from the Windows Azure portal; it's invocable using the Import and Export controls at the top of the page when a SQL Azure database is selected on the portal. There is also a web service client in the SQL DAC Examples project available on CodePlex.

The SQL DAC Examples client provides one additional feature that's not available on the portal: you can create a BACPAC file that contains only a subset of tables. This is called "selective export." This functionality is also available when using SSMS 2012 export and the DACFx 3.0 API.
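Through the DACFx 3.0 API, selective export corresponds to the ExportBacpac overload that accepts a list of (schema, table) pairs naming the tables whose data to include. A minimal sketch (the connection string and table names are placeholders):

```csharp
using System;
using System.Collections.Generic;
using Microsoft.SqlServer.Dac;

class SelectiveExportDemo
{
    static void Main()
    {
        var services = new DacServices(
            @"Data Source=(local);Integrated Security=true");

        // Only the data for these tables goes into the BACPAC;
        // schema and table names here are placeholders.
        var tables = new List<Tuple<string, string>>
        {
            Tuple.Create("dbo", "Customers"),
            Tuple.Create("dbo", "Orders")
        };

        services.ExportBacpac(@"C:\temp\subset.bacpac", "MyDatabase", tables);
    }
}
```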

Because the web service uses a pre-DACFx 3.0 format, you'll get the following error if you try to import a DACFx 3.0 BACPAC using the web service: "Unsupported BACPAC version. The BACPAC was created with a different version of the DAC Framework which will be supported in an upcoming service update". This happens whether you use the Windows Azure portal or the SQL DAC Examples project. You can, however, use the DACFx 3.0 BACPAC to populate a SQL Azure database by using SSMS 2012 Object Explorer, connecting to the SQL Azure database, and using the "Import Data-Tier Application" menu entry after right-clicking on the "Databases" folder.

When you use the web service, you're actually submitting a request to import or export using a BACPAC, and the web service runs the request asynchronously. Submitting a request returns a request ID GUID. You can use this GUID with the "Status" function to check on the progress and results of your request.

The BACPAC format does have a few limitations, even with DACFx 3.0. It doesn't currently support tables with sql_variant datatype columns, extended properties, or SQLCLR assemblies/SQLCLR database objects. Attempting to export these to a BACPAC produces an error. With sql_variant columns and extended properties, that's where "selective export" comes in handy. Because SQL Azure doesn't currently support these items, you'll only see this error when doing a BACPAC export from SSMS 2012. There are some DacExtractOptions and DacDeployOptions that look like they'd be helpful when using the API, but there's currently no way to pass these in to the relevant Export and Import methods.

Chances are, you'll see some upgrades to the BACPAC functionality and tools in the upcoming months. I'll be updating as the story progresses. For now, the interesting thing about the BACPAC format is that, unlike SQL Server's native backup and restore format, it is backward as well as forward compatible for different versions of SQL Server, including SQL Azure, modulo feature support in different releases. Importing a BACPAC does take longer than restoring a backup, however. I found that out the easy way, when I received a "database backup" from SQL Azure as a BACPAC and was able to restore it to an on-premise SQL Server 2012 with a few clicks.


3 thoughts on “DACFx 3.0 – Import and Export with BACPACs”

  1. Bob,

    great articles on DACFx 3.0, which we’re also using successfully in our development team. We’ve established a quite robust strategy and infrastructure for encapsulating each iteration of our software / database schema in DACPAC snapshots, and use the DAC API to deploy available DACPACs through a custom installer.

    I have to admit that, compared to previous available Microsoft tools, this is definitely a big step forward.

    You mention that sql_variant datatypes aren’t supported by DACPACs. This is not entirely true, as we’re using them with SSDT; what you probably mean is that SSDT is unable to infer a database schema from an existing database that uses sql_variant. However, authoring new objects by hand through SSDT is definitely possible today.

    But it’s important to realize that SSDT and the DACFx 3.0 API don’t fully support objects of type sql_variant. For instance, when deploying DACPACs, the generated SQL script always wants to recreate tables containing computed columns that reference sql_variants (there’s a workaround for this – I’ve posted it on MSDN’s SSDT forum as well).

    Looking forward to v.next of SSDT. We’re still missing some features from SQL Server DB projects.

  2. Hi Bob.

    First of all thanks for your blog, it is really helpful.

    Secondly, I would like to ask if a BACPAC file can help me with a specific scenario: I have a *huge* table and I need to rebuild its clustered index in order to change the partition scheme. I tried just deleting and recreating the clustered index, but I ran out of disk space for the log. I also tried to export and import using BCP, but that took forever. Do you think I would be able to resolve my issue through a BACPAC export/import, since the table structure is a little different?


  3. Glad that you liked the blog series. About BACPAC for your specific scenario: if I understand you correctly, I wouldn’t (and really, you can’t) use BACPAC for this. BACPAC can do a selective Export (only certain tables), but not a selective Import (it always Imports to an empty database). Also, BCP should be a bit faster than BACPAC if you use BCP’s native binary format.


