Using the OPTION (RECOMPILE) option for a statement

I started this series with the post titled: Little Bobby Tables, SQL Injection and EXECUTE AS. I then moved to discussing some of the differences with the post titled: EXEC and sp_executesql – how are they different?

Today, I want to address a few of the comments as well as continue with a few tips and tricks using these commands.

First off – could we have helped the performance of the sp_executesql statement?


If we know that a statement returns a varying amount of data (based on the parameters supplied), then we can use OPTION (RECOMPILE) to tell SQL Server that the statement being executed should have its own plan created and that prior plans (if they exist) should not be reused for this statement. It also tells SQL Server that this particular plan is a "single-use plan" that should not be reused for subsequent executions. To see the combination of all of these things, I'll use some of the DMVs that track the plan cache and plan utilization.


SELECT st.text, qs.EXECUTION_COUNT --, qs.*, cp.*
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
CROSS APPLY sys.dm_exec_query_plan(qs.plan_handle) AS cp
WHERE st.text LIKE '%FROM dbo.member%'

Right now, this query returns 0 rows.

I’ll execute the following and then recheck the plan cache:

DECLARE @ExecStr nvarchar(4000)
SELECT @ExecStr = N'SELECT * FROM dbo.member WHERE lastname LIKE @lastname'
EXEC sp_executesql @ExecStr, N'@lastname varchar(15)', 'Tripp'

Now, we have a row for our parameterized query plan:

text                                                                                       EXECUTION_COUNT
(@lastname varchar(15))SELECT * FROM dbo.member WHERE lastname LIKE @lastname              1

So, what is this showing us? It's showing us that there's a plan in cache for this statement. And, if we're interested in seeing the plan itself, we can remove the comment in the query above so that the cp.query_plan column is returned. Click on an XML showplan and SSMS will go DIRECTLY into a graphical query plan window (2008 onward):
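
For clarity, here's the same cache query with the comment removed so that the plan column comes back as well:

```sql
SELECT st.text, qs.EXECUTION_COUNT, cp.query_plan
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
CROSS APPLY sys.dm_exec_query_plan(qs.plan_handle) AS cp
WHERE st.text LIKE '%FROM dbo.member%'
```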

And, once again, we see the optimal plan (to use the index and do a bookmark lookup) because this query is highly selective (only 1 row).

We'll execute the exact same statement again – using the value of Anderson – just to get set up to the point where we were last week:

DECLARE @ExecStr nvarchar(4000)
SELECT @ExecStr = N'SELECT * FROM dbo.member WHERE lastname LIKE @lastname'
EXEC sp_executesql @ExecStr, N'@lastname varchar(15)', 'Anderson'

And, we see that it uses the EXACT same plan (looking at showplan). In fact, we can see that from checking our plan cache as well:

text                                                                                       EXECUTION_COUNT
(@lastname varchar(15))SELECT * FROM dbo.member WHERE lastname LIKE @lastname              2

And, while we know that this plan (to use the index) is good for the highly selective value of 'Tripp', it is not good for the value of 'Anderson' because there are many rows that match. If we suspected this and/or knew this when we were executing (from the client), then we could use OPTION (RECOMPILE) to force SQL Server to get a new plan:

DECLARE @ExecStr nvarchar(4000)
SELECT @ExecStr = N'SELECT * FROM dbo.member WHERE lastname LIKE @lastname OPTION (RECOMPILE)'
EXEC sp_executesql @ExecStr, N'@lastname varchar(15)', 'Anderson'

The query plan as a result of this execution is:

And, this is the more optimal plan given this parameter. At this point, the questions (IMO) are:

  1. Would someone really be able to programmatically guesstimate that a parameter submitted from the client is "atypical" and/or that it warrants recompilation? And, I guess I can answer yes – for some parameters – like those with a wildcard. But, if we're just talking about two different values against a single column, this would require guessing at the statistics of the data.
  2. Should this particular statement always be executed to recompile and never save a plan?
  3. What did SQL Server do with the plan itself?
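
For question #1, one heuristic a developer could apply from the client side (this is just a sketch, not a recommendation – the only thing it detects is a wildcard in the parameter value) is to append OPTION (RECOMPILE) only when the supplied value contains a wildcard character:

```sql
DECLARE @lastname varchar(15)
SELECT @lastname = 'And%'  -- hypothetical input from the client

DECLARE @ExecStr nvarchar(4000)
SELECT @ExecStr = N'SELECT * FROM dbo.member WHERE lastname LIKE @lastname'

-- If the parameter contains a wildcard (% or _), the row count is far less
-- predictable, so ask for a fresh plan in that case only.
IF @lastname LIKE '%[%_]%'
    SELECT @ExecStr = @ExecStr + N' OPTION (RECOMPILE)'

EXEC sp_executesql @ExecStr, N'@lastname varchar(15)', @lastname
```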

I’ll go with answering #3 first as that one is easy to answer. Using the same statement, I’ll again query the plan cache:

text                                                                                       EXECUTION_COUNT
(@lastname varchar(15))SELECT * FROM dbo.member WHERE lastname LIKE @lastname              2

Even though this is the third time we have executed this statement, this final execution was NOT put in cache. It was used solely for the execution with OPTION (RECOMPILE). And, it will NOT affect future executions. If I go back and execute without the OPTION (RECOMPILE) then I will get the prior plan (to use the index).

Now, the other two questions – these are a lot more interesting, and this is where I think stored procedures should be used. Personally, I think that developers who know the data and know the application will be a lot better at creating the RIGHT code, especially when they understand all of the options available to them.

Here’s the way I think about stored procedures and recompilation:

  • If I know that a particular statement always returns the same number of rows and uses the same plan (and, I’d know this from testing), then I’ll create the stored procedure normally and let the plan get cached.
  • If I know that a particular statement wildly varies from execution to execution and the optimal plan varies (again, I should know this from testing multiple sample executions), then I’ll create the stored procedure normally and I’ll use OPTION (RECOMPILE) to make sure that the statement’s plan is not cached or saved with the stored procedure. On each execution that stored procedure will get different parameters and the particularly nasty statement will get a new plan on each execution.
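
To make the second bullet concrete, here's a minimal sketch (the procedure, the logging table, and its columns are purely illustrative) where only the hard-to-plan statement gets OPTION (RECOMPILE), while everything else in the procedure compiles and caches normally:

```sql
CREATE PROCEDURE dbo.MemberSearch
    @lastname varchar(15)
AS
BEGIN
    -- This statement's optimal plan is stable; let it be cached with the proc.
    UPDATE dbo.search_log
        SET search_count = search_count + 1
    WHERE search_term = @lastname;

    -- This statement's optimal plan varies wildly with @lastname;
    -- recompile it on every execution and never cache its plan.
    SELECT *
    FROM dbo.member
    WHERE lastname LIKE @lastname
    OPTION (RECOMPILE);
END;
```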

However, this is also where things get more challenging. I've often seen stored procedures where people try to use conditional logic to break up the different types of parameters, and this doesn't usually work out as well as expected (I'll blog this next). And, this is also where some decide that they want to dynamically build the statement that gets executed – now, we need to decide whether we should use sp_executesql or EXEC. And, there are really a couple of options at this point. Ultimately, in one or two more posts, I'll finally show you where EXEC is a clear winner over sp_executesql, because even OPTION (RECOMPILE) doesn't always help ALL kinds of plans (and especially one of the more common types of plans I see).
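
As one concrete illustration (the extra column names here are just illustrative additions to the dbo.member table used above), a common place this decision comes up is the optional-filter search: a single statement with "catch-all" predicates, where OPTION (RECOMPILE) lets the optimizer simplify away the NULL filters and build a plan for just the parameters actually supplied on each execution:

```sql
CREATE PROCEDURE dbo.MemberSearchFlexible
    @lastname  varchar(15) = NULL,
    @firstname varchar(15) = NULL,
    @region_no int         = NULL
AS
SELECT m.*
FROM dbo.member AS m
WHERE (@lastname  IS NULL OR m.lastname  LIKE @lastname)
  AND (@firstname IS NULL OR m.firstname LIKE @firstname)
  AND (@region_no IS NULL OR m.region_no = @region_no)
OPTION (RECOMPILE);
```

Without the recompile, one cached plan has to cover every combination of filters; with it, each execution is optimized for the filters that were actually passed.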

So, I’m getting closer… at least two more to go here. Thanks for the great questions/comments!


11 thoughts on “Using the OPTION (RECOMPILE) option for a statement”

  1. Great series! I am really enjoying how it feels like a conversation.

    A big consideration in my environment is volume and that is where I think option recompile really shines. By not caching gobs and gobs of dynamic SQL I feel like I get better performance. If I did go the purely dynamic route I feel like my server might get bogged down searching the cache for a dynamic SQL string before eventually caching it only to run once.

    I guess what I am getting at is: Is there a tipping point in terms of volume where option recompile will beat dynamic SQL?

  2. A word of advice, or maybe someone can offer me some advice. The one time I put this into an SP that was heavily used (several times a second), I ran into lock contention on the query plan. Apparently SQL Server still only creates one plan for the stored procedure, so if the procedure is still executing when another user attempts to execute it, the second user is forced to wait for the first to finish before SQL will kill the previous execution plan and build a new one. So while it fixes the high-utilization and performance problems with queries that return vastly different result sets, it still makes your users wait for their data. If at all possible, you are probably much better off rewriting the SP so that it doesn't have this problem in the first place.

  3. @Dave – One of the things that can help is SQL Server 2008, where query_hash is used instead. This simplifies the lookup, and if the server option 'optimize for ad hoc workloads' is set, then it can also reduce the overall amount of cache that's given to "single-plan usage." OK, I think this series is going to go beyond "only two more" posts. I'll add some of that into the next couple for sure. But, you're definitely right – on 2005 (or earlier), the plan cache could get filled with dynamic statements (or LINQ, etc.). The best balance is to use OPTION (RECOMPILE) only when truly necessary. Then, if/when that doesn't work (and I'm getting to that case soon :)… consider sp_executesql and/or EXEC. But, those should definitely be used sparingly. It's all about moderation… well, except for truffles. There's no limit on those. ;-) ;-)

    @Nathan – What version of SQL Server are/were you using? The biggest problems with recompilation were in SQL Server 2000 (where compile locks were procedure-level). Here's a KB for that problem – Q263889: Description of SQL Server blocking caused by compile locks. Things are significantly better in SQL Server 2005/2008, where recompilation is at the statement level. I really haven't seen as much with compile locks in a while, but there are still cases where it can be problematic. I guess that goes back to moderation/limiting it to "only when really necessary." In a lot of cases, I don't make it my default by any means, but it's definitely one of my first choices in troubleshooting when really bad plans happen to good people ;-).

    But, I guess I have to agree that many of these recommendations are relative to the cost of recompilation vs. the problems created by bad plans. If a statement is executed many times a second and does not take a long time to execute, then it probably wasn't a problem to begin with – the "wildly" varying plans really can't wildly vary (or they're all relatively good). The real problem occurs when the bad plan is REALLY bad. And, I've seen this even more frequently – where a server is essentially pegged so badly because of poor plans that I know of more than one customer failing over their cluster to get out of these problems.

    So, for smaller stored procs where the plans aren't already causing a lot of problems, this could add unnecessary overhead. However, rewriting to avoid this might not be possible either. In fact, I'll tackle the types of unsuccessful rewrites that I've seen! However, I'm always open to other options! Let me know if you've seen some clever strategies to rewrite code when it's executing against skewed data. That's really challenging and also very hard to programmatically determine (except potentially with rowcounts, etc., but that's not overly pretty either).

    Anyway, I’m working on more on this and the good news is that when it’s necessary – OPTION (RECOMPILE) is fantastic. I’ve seen this work wonders on frequently executed procedures (yes, those that are executing many times a second!) where executing the bad plan was REALLY, REALLY bad.

    Thanks David/Nathan!

  4. Your blog is not that easy for me to understand (i’m french), but it’s really interesting and i learn a lot of stuff. Keep up the good work !

  5. Kimberly,

    Thanks for that info! This was back on a 2000 box, and now that I've thought about it, I haven't seen those blocking issues with the recompile option on that machine since it was upgraded to 2008.

    Ahh, the enhancements they don’t tell you about. :)

  6. @Nathan – Excellent. Thanks for the reply back! I actually tried really hard to reproduce problems on 2008 (after your comment – as I thought maybe there was still an issue that I hadn’t seen – that’s always a possibility – this product certainly keeps me on my toes :-)). Anyway, I created a sproc with OPTION (RECOMPILE) and then created multiple connections (using SQLCMD) all with different plans of execution and then ran them all in a tight loop. I just couldn’t get any COMPILE locks on 2008 (note: I didn’t try 2005 but I haven’t heard of this problem there either).

    I *definitely* know what you mean on 2000 though as it was a nightmare!

    So, this is good news. The compile lock problem was resolved in the way they switched from procedure-level recompiles to statement-level recompiles.

    Cool! Thanks again for responding back!


  7. As usual, a superb article.

    I was writing a stored procedure to provide users the ability to search data using a variety of optional filters across various joined tables, many of which may be left as null.

    Using OPTION (RECOMPILE) has proved to be a simple and effective solution.

    Also of significant usefulness in this topic is the following article:
