Compiled T-SQL? Don’t throw away all your SQLCLR code just yet

The speed increase from memory-optimized tables in SQL Server 2014 is nice, but I thought that, to get the most bang for your buck, you really need to use compiled stored procedures in addition. After looking at some of the C code these produce, it almost looked like the procedure was doing direct vector branching into the table's specific C-based access routines. But in the meantime, I'd worked with a friend of mine who was trying to convert some custom SQLCLR code to compiled T-SQL. He was using table variables in place of CLR objects and, although the code ran way faster than the non-compiled T-SQL code, it was still 90% slower than his SQLCLR equivalent.

In general, you'll get the biggest speed increase from a compiled sproc if you do lots of logic in code, or use nested loops in plans that return lots of rows. I thought I had just the use case.

Once upon a time, someone had given me code that did a lot of calculations in SQLCLR and in T-SQL. The SQLCLR code used 2- and 3-dimensional arrays as well as some data access. The T-SQL code was an unoptimized, line-by-line port of the SQLCLR code that used multiple temporary tables and a cursor as the outer iterator. You could remove the outer cursor, and SQLCLR still performed 100 iterations (100 products in this case) 7-8 times faster than T-SQL performed 1 iteration (1 product). In other words, SQLCLR was about 700-800X faster than T-SQL. Folks used to tell me this was “cheating” because of the large number of SQL queries the T-SQL code issued (hundreds of queries, many using INSERT INTO a temp table). Nevertheless, it was worth a try.
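To make the shape of that test concrete, here's a rough sketch of the T-SQL port's outer loop. The names are hypothetical, but the structure is the point: a cursor over products, with hundreds of small queries and temp-table inserts inside each iteration.

```sql
-- Hypothetical sketch of the T-SQL port's outer iterator.
DECLARE @ProductID int;

DECLARE product_cursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT ProductID FROM dbo.Products;

OPEN product_cursor;
FETCH NEXT FROM product_cursor INTO @ProductID;

WHILE @@FETCH_STATUS = 0
BEGIN
    -- Per-product computation: in the real code, hundreds of small
    -- SELECTs and INSERT INTO #temp statements, each touching a few rows.
    EXEC dbo.ComputeOneProduct @ProductID;

    FETCH NEXT FROM product_cursor INTO @ProductID;
END;

CLOSE product_cursor;
DEALLOCATE product_cursor;
```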

Converting the code was pretty straightforward (there's a sketch of the resulting pattern after this list):
Replace temp tables with strongly-typed in-memory table variables
Make sure each table variable has a key and index (they’re required in in-memory table variables)
Change each subquery into a separate SELECT into a table variable plus a simplified main query (compiled sprocs don't support subqueries yet)
Make sure everything follows the rules for schemabinding
Leave out the outer cursor, so the compiled T-SQL code either performed a single iteration or was called from within non-compiled T-SQL cursor code
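Here's a minimal sketch of what those steps produce, using SQL Server 2014 CTP2 syntax. All of the object names (dbo.ProductWorkSet, dbo.OrderTotals, dbo.usp_ComputeOneProductCompiled) are hypothetical stand-ins for the real code:

```sql
-- Steps 1-2: a strongly-typed, memory-optimized table type. In-memory
-- table variables require at least one index; a hash index needs a
-- declared bucket count.
CREATE TYPE dbo.ProductWorkSet AS TABLE
(
    ProductID int           NOT NULL,
    CalcDate  datetime2     NOT NULL,
    Amount    decimal(18,4) NOT NULL,
    INDEX ix_ProductID HASH (ProductID) WITH (BUCKET_COUNT = 1024)
)
WITH (MEMORY_OPTIMIZED = ON);
GO

-- Steps 3-5: a natively compiled proc that performs a single iteration.
-- NATIVE_COMPILATION, SCHEMABINDING, EXECUTE AS, and the ATOMIC block
-- are all required.
CREATE PROCEDURE dbo.usp_ComputeOneProductCompiled
    @ProductID int
WITH NATIVE_COMPILATION, SCHEMABINDING, EXECUTE AS OWNER
AS
BEGIN ATOMIC WITH (TRANSACTION ISOLATION LEVEL = SNAPSHOT, LANGUAGE = N'us_english')
    DECLARE @work dbo.ProductWorkSet;

    -- The subquery rewrite: materialize the inner result into the table
    -- variable first. dbo.OrderTotals stands in for a memory-optimized
    -- table; two-part names are one of the schemabinding rules. DATEADD
    -- is the date function that needed CTP2 (see below).
    INSERT INTO @work (ProductID, CalcDate, Amount)
    SELECT ProductID, DATEADD(day, 30, OrderDate), Amount
    FROM dbo.OrderTotals
    WHERE ProductID = @ProductID;

    -- ...then run the simplified main query against it.
    SELECT SUM(Amount) AS TotalAmount
    FROM @work;
END;
GO
```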

I also had to wait until CTP2, because the procedure used DATEADD. That wasn't supported until CTP2 (good choice of what to support sooner, SQL team; thanks for adding the date functions).

The results for switching to in-memory tables were encouraging:
SQLCLR – standard tables – 5 sec
TSQL – standard tables – 37 sec (x100 = 3700 sec) – 740X slower
TSQL – in-memory tables – 10 sec (x100 = 1000 sec) – 200X slower
SQLCLR doesn't support in-memory tables through the context connection. I didn't try it with an external connection back into the same instance.

Switching to a compiled sproc (and pre-compiling the sprocs to factor out the first-time compilation overhead for each):
TSQL – in-memory tables – 10 sec (x100 = 1000 sec)
Compiled TSQL (which only supports in-memory tables) – 10 sec (x100 = 1000 sec)

So for this particular use case (just a straight port; I didn't try to optimize the code), the difference was minimal. BTW, I'm not pretending this is a benchmark or anywhere near one, just a quick test and some round numbers for comparison.

Then I took a harder look at my code. The SQL code was executing a large *number* of SQL statements, each of which used and returned a small number of rows. The computations were done not in program logic but with iterations over SQL statements (not a good way to do custom computations in ANY case). Horrible use case, I guess. Lessons learned:
- In-memory tables are not multidimensional arrays or custom “objects” as far as speed is concerned.
- The SQL statement still has overhead over in-memory operations, even with simple SQL in a compiled sproc.
- Iterating with SQL statements doesn't substitute for aggregations and computations done in a single SQL statement (I knew that one already; there's a sketch of the contrast after this list).
- Remember where the sweet spot is for compiled stored procedures.
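To illustrate that third lesson with a deliberately simple (and hypothetical) contrast: the cursor loop sketched earlier computes one product per query, while the set-based equivalent gets all 100 products' totals in a single statement and lets the optimizer do the work.

```sql
-- Set-based equivalent of the per-product loop: one statement instead of
-- 100 iterations. dbo.OrderTotals and dbo.ProductTotals are hypothetical.
INSERT INTO dbo.ProductTotals (ProductID, TotalAmount)
SELECT ProductID, SUM(Amount)
FROM dbo.OrderTotals
GROUP BY ProductID;
```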

So folks, don't throw away your custom SQLCLR sprocs that do custom computations just yet. But do give compiled T-SQL a test. And push as much of the computation as possible into the SQL statements, and optimize those statements.

Cheers, Bob
@bobbeauch
