Search Engine Q&A #9: How to update constraints?

A quickie today to get back into the swing of things.

In Kimberly's whitepaper on partitioning she discusses the 'sliding window' scenario (where you switch partitions of data in and out of an existing table – see this previous post for a few more details). She recommends that the constraints are extended rather than dropped and recreated – which I totally agree with. I had a question about why this is a best practice, and whether it's more efficient than dropping and recreating the constraints.

Let's create a little example to illustrate all these points: a simple table called Sales with a couple of indexes and 100,000 rows of data.

CREATE TABLE Sales (SalesID INT IDENTITY, SalesDate DATETIME);
GO

CREATE CLUSTERED INDEX Sales_CL ON Sales (SalesID);
CREATE NONCLUSTERED INDEX Sales_NCL ON Sales (SalesDate);
GO

SET NOCOUNT ON;
GO

-- Insert 100,000 rows of sales data, one row at a time.
DECLARE @a INT;
SELECT @a = 1;

WHILE (@a <= 100000)
BEGIN
    INSERT INTO Sales VALUES (GETDATE ());
    SELECT @a = @a + 1;
END;
GO

Now I want to constrain the lower and upper bounds of the sales date. I could do this using a single constraint with both conditions, or two constraints with a single condition each. For simplicity I'll use one constraint, but first I want to see how expensive the operation is, so I'm going to turn on STATISTICS IO – a very cool feature that reports the IO cost of a query after it completes.

SET STATISTICS IO ON;
GO

ALTER TABLE Sales WITH CHECK ADD CONSTRAINT [CK_Sales_SalesDate] CHECK (YEAR (SalesDate) >= 2005 AND YEAR (SalesDate) < 2008);
GO

Table 'Sales'. Scan count 1, logical reads 399, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.

The 399 logical reads are for the table scan that's done to ensure that the constraint is valid for the data currently in the table.
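
As a side note, you can confirm that SQL Server now 'trusts' the constraint (i.e. it has validated every existing row against it) by querying the sys.check_constraints catalog view (available in SQL Server 2005 and later):

-- is_not_trusted = 0 means the constraint was validated against existing data.
SELECT name, is_not_trusted
FROM sys.check_constraints
WHERE parent_object_id = OBJECT_ID (N'Sales');
GO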

So – the first question is: why not drop and recreate the constraint to update it? Well, what if invalid data is entered into the table between the drop and the recreate?

ALTER TABLE Sales DROP CONSTRAINT [CK_Sales_SalesDate];
GO
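
Suppose exactly that happens: while the constraint is gone, a row outside the new range slips in (the date here is just a hypothetical bad value for illustration):

INSERT INTO Sales VALUES ('January 1, 2004');
GO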

ALTER TABLE Sales WITH CHECK ADD CONSTRAINT [CK_Sales_SalesDate] CHECK (YEAR (SalesDate) >= 2005 AND YEAR (SalesDate) < 2009);
GO

Msg 547, Level 16, State 0, Line 1
The ALTER TABLE statement conflicted with the CHECK constraint "CK_Sales_SalesDate". The conflict occurred in database "ConstraintTest", table "dbo.Sales", column 'SalesDate'.

The constraint can't be recreated until you find the invalid data and get rid of it – which may not be as easy as it sounds, depending on your schema and business logic. To get the demo back to where it started, let's delete the bad row and put the original constraint back (a minimal cleanup, assuming the hypothetical row above is the only invalid one):
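
DELETE FROM Sales WHERE YEAR (SalesDate) < 2005;
GO

ALTER TABLE Sales WITH CHECK ADD CONSTRAINT [CK_Sales_SalesDate] CHECK (YEAR (SalesDate) >= 2005 AND YEAR (SalesDate) < 2008);
GO

So the best practice is to 'update' the constraint. Strictly speaking you can't alter an existing constraint, but you can create a new constraint with the new conditions and then drop the old one: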

ALTER TABLE Sales WITH CHECK ADD CONSTRAINT [CK_Sales_SalesDate2] CHECK (YEAR (SalesDate) >= 2005 AND YEAR (SalesDate) < 2009);
ALTER TABLE Sales DROP CONSTRAINT [CK_Sales_SalesDate];
GO

Table 'Sales'. Scan count 1, logical reads 399, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.

The second question is: which way is more efficient? The answer is neither. Dropping and recreating the constraint will obviously trigger a table scan again, but so does adding the new constraint – even though there's a trusted constraint already in place that guarantees the new, wider constraint can't possibly fail! Hopefully the smarts will be built into the SQL Engine in the future to recognize this and skip the unnecessary table scan (the relevant concept is known as interval subsumption – according to my geeky wife :-))
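
As an aside, you can avoid the scan entirely by adding the new constraint WITH NOCHECK, but it doesn't really help. Here's a sketch (CK_Sales_SalesDate3 is just a hypothetical name for illustration):

-- No validation scan occurs, but the constraint is created as 'not trusted'
-- (is_not_trusted = 1 in sys.check_constraints), so the optimizer can't rely
-- on it – and partition switching requires a trusted constraint anyway.
ALTER TABLE Sales WITH NOCHECK ADD CONSTRAINT [CK_Sales_SalesDate3] CHECK (YEAR (SalesDate) >= 2005 AND YEAR (SalesDate) < 2009);
GO

So, for the sliding-window scenario, the scan is the price you pay for keeping the constraint trusted.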
