Last week in Azure SQL Database – Part 5 – Wrapup

This post contains miscellaneous information about the current and future state of Azure SQL Database. You know I couldn’t write just one more blog post when I said I would in Part 3, didn’t ya? This post reads a bit like a rant in places, but I’m genuinely interested. I try not to judge technologies, just […]

Last week in Azure SQL Database – Part 3 – HADR preview service for premium

This post is about a new Azure SQL Database “Business Continuity Feature” called “Disaster recovery/geo-replication”. The feature was announced last week as a preview. For the Premium tiers, this is a lovely feature that includes “Active geo-replication” (their term) and cmdlets (and portal support) for controlling it. For Basic and Standard tiers, you […]
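
Since the post mentions cmdlets for controlling active geo-replication, here is a hedged sketch of what that might look like from the service-management-era Azure PowerShell module. The cmdlet and parameter names (Start-AzureSqlDatabaseCopy, Get-AzureSqlDatabaseCopy, -PartnerServer, -ContinuousCopy) are my assumptions about that module rather than something quoted from the post, and the server and database names are placeholders.

```powershell
# Assumption: these cmdlets/parameters come from the classic Azure PowerShell
# module for SQL Database; verify against the post before relying on them.

# Start a continuous copy (active geo-replication) of a Premium database
# to a partner server in another region.
Start-AzureSqlDatabaseCopy -ServerName "myserver" -DatabaseName "MyDb" `
    -PartnerServer "mysecondaryserver" -ContinuousCopy

# Inspect the state of the copy relationship.
Get-AzureSqlDatabaseCopy -ServerName "myserver" -DatabaseName "MyDb"
```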

Last week in Azure SQL Database – Part 2 – New preview services

Note: Well, that was quick. I’ve updated this blog entry (same day) to reflect clarifications provided by a member of the Azure SQL Database team. Thanks for these excellent clarifications. For now (I may go back and change this later), changes from the original blog post are indicated in italics. The last post in this […]

Using PowerShell with DAC 3.0

At my DAC talk at TechEd last week, the thing that seemed to generate the most interest was the PowerShell scripts for working with DACPACs/BACPACs. I thought I'd post those here, along with a question about cmdlets in general. Using PowerShell against the library allows admins to use all of the functionality that's in SSMS, with the […]
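
As a minimal sketch of the idea (PowerShell 3.0 or later): load the DACFx 3.0 client library into the shell and drive it directly. The install path, connection string, and database/file names below are assumptions for illustration, not the scripts from the talk.

```powershell
# Load the DACFx 3.0 client library (typical SQL Server 2012 install path;
# adjust for your machine -- this path is an assumption).
Add-Type -Path "C:\Program Files (x86)\Microsoft SQL Server\110\DAC\bin\Microsoft.SqlServer.Dac.dll"

# DacServices is the entry point; it takes a connection string to the server.
$services = New-Object Microsoft.SqlServer.Dac.DacServices "Server=.;Integrated Security=true"

# Surface the same progress messages you'd see in SSMS.
Register-ObjectEvent -InputObject $services -EventName Message `
    -Action { Write-Host $EventArgs.Message } | Out-Null

# Extract the schema of an existing database into a DACPAC.
$services.Extract("C:\temp\MyDb.dacpac", "MyDb", "MyDb", [Version]"1.0.0.0")
```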

I’ve got the last talk at TechEd and it’s almost LAST CALL

I've been at TechEd North America in Orlando this week and have seen some amazing new things: upcoming operating systems, new hardware, and a detailed technical follow-up on the Azure (cloud) announcements from last week, including some upcoming Windows Azure SQL Database (formerly SQL Azure) enhancements. I've spent quite a bit of time at the […]

DACFx 3.0 – Import and Export with BACPACs

The last piece of the DAC puzzle (at least for now) is import and export. Export makes a non-transactionally consistent copy of database content in the BACPAC format (schema and data), and Import creates a new database from the BACPAC's schema and data. Currently, it's the most-used method of backing up and restoring a […]
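
As a rough sketch of what export and import look like through the DACFx 3.0 API from PowerShell (paths, database names, and connection strings below are illustrative assumptions, not values from the post):

```powershell
Add-Type -Path "C:\Program Files (x86)\Microsoft SQL Server\110\DAC\bin\Microsoft.SqlServer.Dac.dll"

# Export: write a (non-transactionally consistent) copy of schema and data
# to a BACPAC file.
$source = New-Object Microsoft.SqlServer.Dac.DacServices "Server=.;Integrated Security=true"
$source.ExportBacpac("C:\temp\MyDb.bacpac", "MyDb")

# Import: create a brand-new database from the BACPAC's schema and data,
# here against an Azure SQL Database server (placeholder credentials).
$target = New-Object Microsoft.SqlServer.Dac.DacServices "Server=tcp:myserver.database.windows.net;User ID=myadmin;Password=placeholder"
$bacpac = [Microsoft.SqlServer.Dac.BacPackage]::Load("C:\temp\MyDb.bacpac")
$target.ImportBacpac($bacpac, "MyDbCopy")
$bacpac.Dispose()
```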

DACFx 3.0: The new file formats

In the last post, I talked about the differences between DACFx 3.0 and previous versions from an API point of view. This time, I'll look at the files in a DACPAC and see how those differ. A DACPAC really stores a database model that can be used to recreate the database and server-level (3.0 only) objects in the […]
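
One quick way to follow along: a DACPAC is an ordinary zip-format package, so you can crack one open and look at the parts. The file name below is an example, and the entry names in the comment are what I'd expect to see rather than a definitive list.

```powershell
# Requires .NET 4.5 for System.IO.Compression.FileSystem.
Add-Type -AssemblyName System.IO.Compression.FileSystem

# List the parts of the package -- typically model.xml, Origin.xml,
# DacMetadata.xml, and [Content_Types].xml.
$zip = [System.IO.Compression.ZipFile]::OpenRead("C:\temp\MyDb.dacpac")
$zip.Entries | Select-Object FullName, Length
$zip.Dispose()
```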

DACFx 3.0: The new programming API

Looking at the API and at the serialized form (i.e. the DACPAC), it turns out that DACFx 3.0 is not just "DAC V-next". It's a complete break from all other versions, including DAC 2.0, the version that introduced the BACPAC (serialized schema and data). This has some interesting repercussions for compatibility. Let's start with the […]
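
For a concrete point of reference while reading: here's roughly what a deploy looks like through the new Microsoft.SqlServer.Dac namespace (the pre-3.0 DAC API lived in Microsoft.SqlServer.Management.Dac). The install path, names, and option shown are illustrative assumptions.

```powershell
Add-Type -Path "C:\Program Files (x86)\Microsoft SQL Server\110\DAC\bin\Microsoft.SqlServer.Dac.dll"

$services = New-Object Microsoft.SqlServer.Dac.DacServices "Server=.;Integrated Security=true"
$package  = [Microsoft.SqlServer.Dac.DacPackage]::Load("C:\temp\MyDb.dacpac")

# Deployment behavior is controlled through DacDeployOptions.
$options = New-Object Microsoft.SqlServer.Dac.DacDeployOptions
$options.BlockOnPossibleDataLoss = $true

# Deploy(package, targetDatabaseName, upgradeExisting, options)
$services.Deploy($package, "MyDb", $true, $options)
$package.Dispose()
```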

Scripting SQL Server databases sans SMO Scripter

I can script out a database schema (and optionally data) from SQL Server databases. Any version from 2005 onward, as well as SQL Azure Database. And almost all instance-level objects. And I'm not the SMO Scripter object. Or a DBA. Who am I? I would be DACFx 3.0. And I can do some things (unlike […]
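
To make the riddle concrete: one plausible way to get a script out of DACFx 3.0 from PowerShell is to generate a deployment script from a DACPAC. GenerateDeployScript and the names below are my assumption about the API surface, not necessarily the exact approach the post goes on to describe.

```powershell
Add-Type -Path "C:\Program Files (x86)\Microsoft SQL Server\110\DAC\bin\Microsoft.SqlServer.Dac.dll"

$services = New-Object Microsoft.SqlServer.Dac.DacServices "Server=.;Integrated Security=true"
$package  = [Microsoft.SqlServer.Dac.DacPackage]::Load("C:\temp\MyDb.dacpac")

# Produce the T-SQL a deployment would run, without touching the target.
$script = $services.GenerateDeployScript($package, "MyDb")
Set-Content -Path "C:\temp\MyDb_deploy.sql" -Value $script
$package.Dispose()
```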

The Rest of the Story: Co-existing VS2010 and SSDT database projects

After my last adventure, I thought it would be interesting to try the "upgrade path" with SSDT and the older "Database Projects" (i.e. Visual Studio for Database Professionals, or whatever the last name was before we went to the nice SSDT acronym; I always called them "Data Dude" projects after Gert Drapers, the original Data Dude). […]

A First Look at Data-Tier Applications 2.0

The next version of Data-Tier Applications is version 2.0. You can get a CTP of it today, and where the CTP lives really gives away the game: it's at SQL Azure Labs, listed under SQL Azure "Import/Export". The writeup on that page identifies a few interesting things: 1. It's a preview of the Denali […]

Data-Tier Applications – Version 1.1 – Where?

So DAC 1.1 has been out for a while now (since March 2, 2011). The big change is that the DAC "upgrade" process is now an in-place upgrade rather than a side-by-side upgrade. As far as I can see, you couldn't do a side-by-side upgrade in 1.1 if you wanted to. AFAIK, in-place upgrade is […]

Looking at Data-Tier Applications (DAC)? Look at version 1.1

About a week or so ago, I read a request for information about a SQL Server database management feature known as Data-Tier Applications, abbreviated as DAC (apparently someone realized that the DTA abbreviation was already "taken" in SQL Server (Database Tuning Advisor) but not that DAC was too (Dedicated Admin Connection)). Data-Tier Applications is a […]

DAC supports SQL Azure and vice-versa. It’s live.

Last week I did a talk at SQLConnections on SQL Azure Database and Data-Tier Applications (DAC). At the time (it was the day of the Visual Studio 2010 launch), I explained that conference abstracts had to be submitted six months in advance. At the time, because of some coincidental feature correspondence (e.g. the DAC whitepaper suggests only using […]

DACPAC and pre-SQL Server 2008 R2 versions

I thought it was curious that in a DACPAC you can specify the required version and edition of SQL Server as a deployment option. But DAC (Data-Tier Applications) is a new feature of SQL Server 2008 R2 and the VS2010 data tools. So what versions and editions does it support (or will it support)? The somewhat surprising answer […]