Reconciling set-based operations with row-by-row iterative processing

Yesterday in class we had a discussion about the conceptual problem of reconciling the fact that SQL Server performs set-based operations, but does so with query plans that pass single rows between operators. In other words, it uses iterative processing to implement set-based operations.

The crux of the discussion is: if SQL Server is passing single rows around, how is that a set-based operation?

I explained it in two different ways…

SQL Server Example

This explanation compares two ways of doing the following logical operation using SQL Server: update all the rows in the Products table where ProductType = 1 and set the Price field to be 10% higher.

The cursor-based way (row-by-agonizing-row, or RBAR) would be something like the following:

DECLARE @ProductID   INT;
DECLARE @Price       FLOAT;

DECLARE [MyUpdate] CURSOR FAST_FORWARD FOR
SELECT [ProductID], [Price]
FROM [Products]
WHERE [ProductType] = 1;

OPEN [MyUpdate];

FETCH NEXT FROM [MyUpdate] INTO @ProductID, @Price;

WHILE @@FETCH_STATUS = 0
BEGIN
    UPDATE [Products]
    SET [Price] = @Price * 1.1
    WHERE [ProductID] = @ProductID;

    FETCH NEXT FROM [MyUpdate] INTO @ProductID, @Price;
END

CLOSE [MyUpdate];
DEALLOCATE [MyUpdate];

This method has to set up a scan over the Products table based on the ProductType, and then runs a separate UPDATE transaction for each row returned from the scan, incurring all the overhead of setting up the UPDATE query, starting the transaction, seeking to the correct row based on the ProductID, updating it, and tearing down the transaction and query framework again each time.

The set-based way of doing it would be:

UPDATE [Products]
SET [Price] = [Price] * 1.1
WHERE [ProductType] = 1;

This will have one scan based on the ProductType, which will update rows matching the ProductType, but the query, transaction, and scan are only set up once, and then all the rows are processed, one-at-a-time inside SQL Server.

The difference is that in the set-based way, all the iteration is done inside SQL Server, in the most efficient way it can, rather than manually iterating outside of SQL Server using the cursor.
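One way to see the difference for yourself is to capture I/O and timing statistics for each version (a quick sketch – I'm assuming a Products table like the one above, and your numbers will vary with your data):

```sql
-- Turn on per-batch statistics, then run the cursor version and the
-- set-based version separately and compare the Messages output
SET STATISTICS IO ON;
SET STATISTICS TIME ON;

-- The set-based version: one query, one transaction, one scan
UPDATE [Products]
SET [Price] = [Price] * 1.1
WHERE [ProductType] = 1;

SET STATISTICS IO OFF;
SET STATISTICS TIME OFF;
```

The logical reads and CPU/elapsed times make the per-row setup and teardown overhead of the cursor version obvious.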

Non-Technical Example

This explanation involves a similar problem but not involving SQL Server. Imagine you need to acquire twelve 4′ x 8′ plywood sheets from your local home improvement store.

You could drive to and from the store twelve times, and each time you need to go into the store, purchase the sheet, and wait for a staff member to become available to load the sheet into your pickup truck, then drive home and unload the sheet.

Or you could drive to the store once and purchase all twelve sheets in one go, with maybe four staff members making three trips each out to your pickup, carrying one sheet each time. Or even just one staff member making twelve trips out to your pickup.

Which method is more efficient? Multiple trips to the store or one trip to the store, no matter how many staff members are available to carry the sheets out?


No-one in their right mind is going to make twelve trips to the home improvement store when one will suffice. Just like no developer should be writing cursor/RBAR code to perform an operation that SQL Server can do in a set-based manner (when possible).

Set-based operations don’t mean that SQL Server processes the whole set at once – that’s clearly not possible as most sets have more rows than your server has processors (so all the rows in the set simply *can’t* be processed at the same time, even if all processors were running the same code at the same time) – but that it can process the set very, very efficiently by only constructing the processing framework (i.e. query plan with operators, scans, etc.) for the operation once and then iterating over the set of rows inside this framework.

PS Check out the technical comment from Conor Cunningham below (Architect on the SQL Server team, and my counterpart on the Query Optimizer when I was a Dev Lead in the Storage Engine for SQL Server 2005)

Updated sys.dm_os_waiting_tasks script to add query DOP

[Edit 2016: Check out my new resource – a comprehensive library of all wait types and latch classes – see here.]

A question came up in class today about easily seeing the degree of parallelism for parallel query plans, so I’ve updated my waiting tasks script to pull in the dop field from sys.dm_exec_query_memory_grants. I’ve also added in a URL field that points into the new waits library, and shortened some of the column names.

Here it is for your use.


(Note that ‘text’ on one line does not have delimiters because that messes up the code formatting plugin):

  File:     WaitingTasks.sql

  Summary:  Snapshot of waiting tasks

  SQL Server Versions: 2005 onwards
  Written by Paul S. Randal,

  (c) 2016, All rights reserved.

  For more scripts and sample code, check out

  You may alter this code for your own *non-commercial* purposes. You may
  republish altered code as long as you include this copyright and give due
  credit, but you must obtain prior permission before blogging this code.
SELECT
    [owt].[session_id] AS [SPID],
    [owt].[exec_context_id] AS [Thread],
    [ot].[scheduler_id] AS [Scheduler],
    [owt].[wait_duration_ms] AS [wait_ms],
    [owt].[wait_type],
    [owt].[blocking_session_id] AS [Blocking SPID],
    [owt].[resource_description],
    CASE [owt].[wait_type]
        WHEN N'CXPACKET' THEN
            RIGHT ([owt].[resource_description],
                CHARINDEX (N'=', REVERSE ([owt].[resource_description])) - 1)
        ELSE NULL
    END AS [Node ID],
    [eqmg].[dop] AS [DOP],
    [er].[database_id] AS [DBID],
    CAST ('' + [owt].[wait_type] as XML) AS [Help/Info URL],
    [est].text AS [Query],
    [eqp].[query_plan] AS [Query Plan]
FROM sys.dm_os_waiting_tasks [owt]
INNER JOIN sys.dm_os_tasks [ot] ON
    [owt].[waiting_task_address] = [ot].[task_address]
INNER JOIN sys.dm_exec_sessions [es] ON
    [owt].[session_id] = [es].[session_id]
INNER JOIN sys.dm_exec_requests [er] ON
    [es].[session_id] = [er].[session_id]
FULL JOIN sys.dm_exec_query_memory_grants [eqmg] ON
    [owt].[session_id] = [eqmg].[session_id]
OUTER APPLY sys.dm_exec_sql_text ([er].[sql_handle]) [est]
OUTER APPLY sys.dm_exec_query_plan ([er].[plan_handle]) [eqp]
WHERE
    [es].[is_user_process] = 1;
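To see the new DOP column in action you need a parallel query in flight when the script runs. A hypothetical way to try it out (the Sales table name is just an example – use any large table you have):

```sql
-- Session 1: run something big enough to get a parallel plan
SELECT COUNT (*) FROM [Sales] [s1] CROSS JOIN [Sales] [s2];

-- Session 2: run the waiting tasks script above. The parallel worker
-- threads will typically show CXPACKET waits, the [Node ID] column
-- shows which exchange operator each thread is waiting on, and [DOP]
-- comes from sys.dm_exec_query_memory_grants for that session.
```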

Code to analyze the transaction hierarchy in the log

Over the weekend there was a discussion on the MVP distribution list about the sys.dm_tran_database_transactions DMV and how one cannot use it to accurately determine how much log an operation has generated because it doesn’t provide a roll-up of the sub-transaction metrics to the outer transaction. This makes the output somewhat non-intuitive.

The discussion prompted me to write some code I’ve been meaning to do since 2012, when SQL Server 2012 introduced a field in LOP_BEGIN_XACT log records that tracks the transaction ID of the parent transaction, allowing the hierarchy of transactions to be investigated.
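You can see the fields involved for yourself with fn_dblog (a quick sketch – run it in a database of interest, and bear in mind that fn_dblog is undocumented):

```sql
-- Each LOP_BEGIN_XACT log record names its transaction and, from
-- SQL Server 2012 onwards, its parent transaction ID (which is NULL
-- for top-level transactions)
SELECT [Transaction ID], [Transaction Name], [Parent Transaction ID]
FROM fn_dblog (NULL, NULL)
WHERE [Operation] = N'LOP_BEGIN_XACT';
```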

The actual code is at the bottom of the article, and is available in a zip file here.

It provides two stored procs, sp_SQLskillsAnalyzeLog and sp_SQLskillsAnalyzeLogInner, with the former making use of the latter, and the latter calling itself recursively.

The sp_SQLskillsAnalyzeLog proc will dump the hierarchy of transactions in the transaction log. By default it will only show the top-level transactions (with no parent transaction), and it has the following parameters:

  • @DBName (with a default of master)
  • @Detailed (default 0, when 1 it will show the transaction begin time and Windows login, for top-level transactions only)
  • @Deep (default 0, when 1 it will show the sub-transaction hierarchy)
  • @PrintOption (default 0 for a resultset, 1 for textual output)

I’ve created the procs in master and marked them as system objects using sp_MS_marksystemobject. You can change them to be stored wherever you want.

The pseudo-code is as follows:

  • Get the info from the log into temp table 1
  • Create temp table 2 with a clustered index on an identity column
  • For each top-level transaction
    • If @Detailed, add the user name and start time
    • Get the last transaction added to temp table 2
    • If it’s the same as the one we’re about to add, increment the counter for the last one added, else add the new one
    • if @Deep, then, with recursion depth = 1,
      • **RP** for each sub-transaction of current next-level up transaction
        • Prefix ‘…’ x the recursion depth to the transaction name
        • Get the last transaction added to temp table 2
        • If it’s the same as the one we’re about to add, increment the counter for the last one added, else add the new one
        • Recurse to **RP**, increasing recursion depth
    • (doing it this way vastly reduces the amount of data to be stored in temp table 2)
  • select the result set or print it, depending on @PrintOption

Let’s look at an example, using the SalesDB database that you can restore from a zip file on our resources page:

-- Restore the database
USE [master];
GO

RESTORE DATABASE [SalesDB]
	FROM DISK = N'D:\SQLskills\DemoBackups\SalesDB2014.bak';
GO

-- Create a smaller copy of the Sales table
USE [SalesDB];
GO

SELECT *
INTO [SalesCopy]
FROM [Sales]
WHERE [SalesID] < 100000;
GO

CREATE CLUSTERED INDEX [SalesCopy_CL] ON [SalesCopy] ([SalesID]);
GO

-- Empty the log

-- Online rebuild the clustered index
ALTER INDEX [SalesCopy_CL] ON [SalesCopy] REBUILD WITH (ONLINE = ON);
GO

-- Analyze the log
EXEC sp_SQLskillsAnalyzeLog salesdb, @Detailed = 1, @Deep = 1, @PrintOption = 1;
ALTER INDEX by APPLECROSS\Paul @ 2016/05/01 11:26:48:113
OnlineIndexInsertTxn by APPLECROSS\Paul @ 2016/05/01 11:26:48:113
...BTree Split/Shrink
...BTree Split/Shrink
...SplitPage 85 times
Allocate Root by APPLECROSS\Paul @ 2016/05/01 11:26:48:113
Allocate Root by APPLECROSS\Paul @ 2016/05/01 11:26:48:113
OnlineIndexInsertTxn by APPLECROSS\Paul @ 2016/05/01 11:26:48:150
...SplitPage 86 times
...SplitPage 89 times
...SplitPage 57 times
...SplitPage 31 times
...SplitPage 88 times
...SplitPage 52 times
SetFileSize @ 2016/05/01 11:26:48:303

Pretty cool, eh? You can see that the online rebuild uses a bunch of top-level transactions, which makes it difficult to determine exactly how much transaction log it generated as there isn’t one transaction that then drives everything else. But using this script, now you can see what an operation does.

There are other uses of this too:

  • Searching through the log to see who’s doing what
  • Analysis of your stored proc transactions and what they cause to happen under the covers on the system (e.g. page splits)

I hope you find this useful! Let me know if there are any other features you’d like to see and I’ll figure out if they’re possible and feasible. I can think of at least:

  • Making it work on log backups
  • Providing a roll-up of log space used for transactions and their sub-transactions (would be pretty slow, but do-able)
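As a rough sketch of the second idea, fn_dblog exposes a [Log Record Length] field that can be summed per transaction – although a real roll-up would also need to walk the parent/child hierarchy and add sub-transaction totals to their parents:

```sql
-- Approximate log generated per transaction, largest first
-- (does NOT roll sub-transactions up to their parents)
SELECT [Transaction ID], SUM ([Log Record Length]) AS [Log Bytes]
FROM fn_dblog (NULL, NULL)
GROUP BY [Transaction ID]
ORDER BY [Log Bytes] DESC;
```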


Here’s the code, and it’s in the zip file here. I’m sure there are ways to make this code more efficient – I’m not an expert T-SQL programmer :-)

  File:     sp_SQLskillsAnalyzeLog.sql

  Summary:  This script cracks the transaction log and prints a hierarchy of
            transactions

  SQL Server Versions: 2012 onwards
  Written by Paul S. Randal,

  (c) 2016, All rights reserved.

  For more scripts and sample code, check out

  You may alter this code for your own *non-commercial* purposes. You may
  republish altered code as long as you include this copyright and give due
  credit, but you must obtain prior permission before blogging this code.

USE [master];

IF OBJECT_ID (N'sp_SQLskillsAnalyzeLog') IS NOT NULL
	DROP PROCEDURE [sp_SQLskillsAnalyzeLog];

IF OBJECT_ID (N'sp_SQLskillsAnalyzeLogInner') IS NOT NULL
	DROP PROCEDURE [sp_SQLskillsAnalyzeLogInner];

GO

CREATE PROCEDURE sp_SQLskillsAnalyzeLogInner (
	@XactID AS CHAR (13),
	@Depth AS INT)
AS
BEGIN
	SET NOCOUNT ON;

	DECLARE @String VARCHAR (8000);
	DECLARE @InsertString VARCHAR (8000);
	DECLARE @Name VARCHAR (256);
	DECLARE @ID INT;

	DECLARE @SubXactID CHAR (13);
	DECLARE @SubDepth INT = @Depth + 3;

	-- The cursor must be LOCAL as this proc calls itself recursively
	DECLARE [LogAnalysisX] CURSOR LOCAL FOR
	SELECT [Transaction ID], [Transaction Name]
	FROM ##SQLskills_Log_Analysis
	WHERE [Parent Transaction ID] = @XactID;

	OPEN [LogAnalysisX];

	FETCH NEXT FROM [LogAnalysisX] INTO @SubXactID, @Name;

	WHILE @@FETCH_STATUS = 0
	BEGIN
		SELECT @InsertString = REPLICATE ('.', @Depth) + @Name;

		-- Select the last transaction name inserted into the table
		SELECT TOP 1
			@ID = [ID],
			@String = [XactName]
		FROM ##SQLskills_Log_Analysis2
		ORDER BY [ID] DESC;

		IF @String = @InsertString
			UPDATE ##SQLskills_Log_Analysis2 SET
				[Times] = [Times] + 1
			WHERE [ID] = @ID;
		ELSE
			INSERT INTO ##SQLskills_Log_Analysis2
			VALUES (@InsertString, 1);

		-- Recurse...
		EXEC sp_SQLskillsAnalyzeLogInner @SubXactID, @SubDepth;

		FETCH NEXT FROM [LogAnalysisX] INTO @SubXactID, @Name;
	END;

	CLOSE [LogAnalysisX];
	DEALLOCATE [LogAnalysisX];
END;
GO

CREATE PROCEDURE sp_SQLskillsAnalyzeLog (
	-- The name of a database, default of master
	@DBName AS sysname = N'master',

	-- Detailed = 0 means just the transaction name
	-- Detailed = 1 means time and user
	@Detailed AS INT = 0,

	-- Deep = 0 means only the top-level transactions
	-- Deep = 1 means sub-transaction hierarchy (slow!)
	@Deep AS INT = 0,

	-- PrintOption = 0 means SELECT as a resultset
	-- PrintOption = 1 means PRINT as text
	@PrintOption AS INT = 0)
AS
BEGIN
	SET NOCOUNT ON;

	IF EXISTS (SELECT * FROM [tempdb].[sys].[objects]
		WHERE [name] = N'##SQLskills_Log_Analysis')
		DROP TABLE [##SQLskills_Log_Analysis];

	IF EXISTS (SELECT * FROM [tempdb].[sys].[objects]
		WHERE [name] = N'##SQLskills_Log_Analysis2')
		DROP TABLE [##SQLskills_Log_Analysis2];

	-- Only get the detailed info if we need it
	IF @Detailed = 1
		EXEC ('USE ' + @DBName + ';' +
			'SELECT [Transaction ID], [Transaction Name], [Parent Transaction ID],' +
			'[Begin Time], SUSER_SNAME ([Transaction SID]) AS [Who] ' +
			'INTO ##SQLskills_Log_Analysis FROM fn_dblog (null,null) ' +
			'WHERE [Operation] = ''LOP_BEGIN_XACT'';');
	ELSE
		EXEC ('USE ' + @DBName + ';' +
			'SELECT [Transaction ID], [Transaction Name], [Parent Transaction ID],' +
			'NULL AS [Begin Time], NULL AS [Who] ' +
			'INTO ##SQLskills_Log_Analysis FROM fn_dblog (null,null) ' +
			'WHERE [Operation] = ''LOP_BEGIN_XACT'';');

	CREATE TABLE ##SQLskills_Log_Analysis2 (
		[ID]		INT IDENTITY (1, 1),
		[XactName]	VARCHAR (8000),
		[Times]		INT);

	CREATE CLUSTERED INDEX [ID_CL]
	ON ##SQLskills_Log_Analysis2 ([ID]);

	-- Insert a dummy row to make the loop logic simpler
	INSERT INTO ##SQLskills_Log_Analysis2
	VALUES ('PSRDummy', 1);

	-- Calculate the transaction hierarchy
	DECLARE @XactID		CHAR (13);
	DECLARE @Name		VARCHAR (256);
	DECLARE @Begin		VARCHAR (100);
	DECLARE @Who		VARCHAR (256);
	DECLARE @String		VARCHAR (8000);
	DECLARE @ID			INT;
	DECLARE @Counter	INT;

	DECLARE [LogAnalysis] CURSOR FOR
	SELECT
		[Transaction ID], [Transaction Name], [Begin Time], [Who]
	FROM ##SQLskills_Log_Analysis
	WHERE
		[Parent Transaction ID] IS NULL;

	OPEN [LogAnalysis];

	FETCH NEXT FROM [LogAnalysis] INTO @XactID, @Name, @Begin, @Who;

	WHILE @@FETCH_STATUS = 0
	BEGIN
		-- Select the last transaction name inserted into the table
		SELECT TOP 1
			@ID = [ID],
			@String = [XactName]
		FROM ##SQLskills_Log_Analysis2
		ORDER BY [ID] DESC;

		-- If it's the same as we're about to insert, update the counter,
		-- otherwise insert the new transaction name
		IF @String = @Name
			UPDATE ##SQLskills_Log_Analysis2 SET
				[Times] = [Times] + 1
			WHERE [ID] = @ID;
		ELSE
		BEGIN
			SELECT @String = @Name;

			-- Add detail if necessary
			IF @Detailed = 1
			BEGIN
				-- Do this separately in case CONCAT_NULL_YIELDS_NULL is set
				IF @Who IS NOT NULL
					SELECT @String = @String + ' by ' + @Who;

				SELECT @String = @String + ' @ ' + @Begin;
			END;

			INSERT INTO ##SQLskills_Log_Analysis2 VALUES (@String, 1);
		END;

		-- Look for subtransactions of this one
		IF @Deep = 1
			EXEC sp_SQLskillsAnalyzeLogInner @XactID, 3;

		FETCH NEXT FROM [LogAnalysis] INTO @XactID, @Name, @Begin, @Who;
	END;

	CLOSE [LogAnalysis];
	DEALLOCATE [LogAnalysis];

	-- Discard the dummy row
	DELETE FROM ##SQLskills_Log_Analysis2
	WHERE [ID] = 1;

	-- Print the hierarchy
	DECLARE [LogAnalysis2] CURSOR FOR
	SELECT [ID], [XactName], [Times]
	FROM ##SQLskills_Log_Analysis2
	ORDER BY [ID];

	OPEN [LogAnalysis2];

	-- Fetch the first transaction name, if any
	FETCH NEXT FROM [LogAnalysis2] INTO @ID, @String, @Counter;

	WHILE @@FETCH_STATUS = 0
	BEGIN
		IF @Counter > 1
			SELECT @String = @String + ' ' +
				CONVERT (VARCHAR, @Counter) + ' times';

		-- If we're going to SELECT the output, update the row
		IF @PrintOption = 0
			UPDATE ##SQLskills_Log_Analysis2 SET
				[XactName] = @String
			WHERE [ID] = @ID;
		ELSE
			PRINT @String;

		FETCH NEXT FROM [LogAnalysis2] INTO @ID, @String, @Counter;
	END;

	CLOSE [LogAnalysis2];
	DEALLOCATE [LogAnalysis2];

	IF @PrintOption = 0
		SELECT [XactName], [Times]
		FROM ##SQLskills_Log_Analysis2
		ORDER BY [ID];

	DROP TABLE ##SQLskills_Log_Analysis;
	DROP TABLE ##SQLskills_Log_Analysis2;
END;
GO

EXEC sys.sp_MS_marksystemobject [sp_SQLskillsAnalyzeLog];
EXEC sys.sp_MS_marksystemobject [sp_SQLskillsAnalyzeLogInner];

-- EXEC sp_SQLskillsAnalyzeLog salesdb, 1, 1, 1;

SQL Server health check by SQLskills: Spring discount offer

Through the end of July, we’re offering a complete single-instance health-check for a flat fee of US$2,500… 

One of the most popular services we provide to new clients is a SQL Server Health Check, whether the new client is a Fortune-25 company or a small company with only a handful of employees. There’s a lot of healthy competition in the SQL Server world to provide this service, but we at SQLskills believe we provide the best value for money, because…

  • We automate the data collection process using a minimal-impact diagnostic tool that’s already installed with SQL Server, saving the client time and allowing data collection to be scheduled and handled by the client based on their schedule, rather than requiring many hours of data collection through an interactive online session.
  • We have custom data processing tools that we’ve developed that help us to analyze the data, saving the client money.
  • We document all our findings, and the report includes advice and justification (and links to deeper explanations) on remediation of each problem found, allowing the client to make the necessary changes on their own or investigate the problem further and plan/implement an appropriate fix. We also have a summary call with the client, allowing them to ask whatever questions they have on the information in the report. And of course, sometimes the client may choose to have us assist with or perform all the remediation work/further investigations – whatever works for them.
  • We only charge for the time we use (typically 12 hours or less per instance) rather than locking the client in to a costly, fixed-price engagement, saving the client money.
  • Our health check process has been developed and refined over the last 7 years based on the accumulated knowledge and experience of our small team (all Data Platform MVPs with a combined 80+ years of deep SQL Server experience), past client engagements and problems, and is constantly updated based on current trends and issues we see with SQL Server installations. Each of our consultants utilizes an internally-developed checklist of over 130 items that we look for during our health audits to ensure consistency and accuracy across our team, and we routinely review our processes to ensure that the team is current with new checks and findings with each release of SQL Server. Our process is constantly evolving to be more efficient and thorough based on our customer interactions.
  • For large environments, we encourage the client to pick representative instances to check, and then extrapolate the results to instances with common configurations, saving the client money.

We do all of this to take the least amount of our client’s time, and provide the best return on their investment, whether for small environments or large corporate data centers. Also, as our audit is engineered to be as efficient as possible, it allows small companies with small I.T. budgets to make use of our services.

Some of the clients we’ve been working with for many years started with a single-instance health check and come back to us a few times per year for help when they need it (and we don’t charge any retainer fee). It’s really fun to get to know our clients, watch how their environments grow and improve, and meet them in our classes and at conferences like SQLintersection.

Back to the point of my blog post… It’s Spring, so it’s time for some Spring Cleaning! Through the end of July, we’re offering new or existing clients a complete single-instance health-check for a flat fee of US$2,500 – that’s more than 1/3 off the usual price for a 12-hour health check. The discount price covers us performing the health check, documenting the results, and a wrap-up conference call or Webex to go over the results.

So no matter what your I.T. budget, you CAN afford to have SQLskills on your team. And if you’re from a large corporation, for US$2,500, you really CAN’T afford to pass up this opportunity!

If you’re interested in working with us, send us an email and we’ll get in touch with you right away.

We look forward to getting to know you and your data team – we promise you won’t be disappointed!

New course: Scaling SQL Server 2012 and 2014: Part 2

Glenn’s new course is called SQL Server: Scaling SQL Server 2012 and 2014: Part 2 and is just under 3 hours long. It covers a plethora of configuration and hardware issues that can prevent a workload from scaling, plus methods for scaling up and out and new features in 2014 that can help. Part 1 of the course (here) covers application and code design issues that can prevent workload scaling.

The modules are:

  • Introduction
  • Database Configuration Settings
  • Instance Configuration Settings
  • Storage Subsystem Issues
  • Hardware Issues
  • Scaling Up SQL Server
  • Scaling Out SQL Server
  • New Scalability Features in SQL Server 2014

Check it out here.

We now have more than 135 hours of SQLskills online training available (see all our courses here), all for as little as $29/month through Pluralsight (including more than 4,500 other developer and IT training courses). That’s unbeatable value that you can’t afford to ignore.


New course: Understanding and Using DBCC Commands

Erin’s new course is called SQL Server: Understanding and Using DBCC Commands and is just over 2.25 hours long. It covers all the documented DBCC commands and a few of the undocumented ones, plus Erin goes into details about how to use DMVs, when available, to provide more detailed information.

The modules are:

  • Introduction
  • Basic Commands
  • Informational Commands
  • Maintenance Commands
  • Validation Commands
  • Undocumented Commands

Check it out here.

We now have more than 135 hours of SQLskills online training available (see all our courses here), all for as little as $29/month through Pluralsight (including more than 4,500 other developer and IT training courses). That’s unbeatable value that you can’t afford to ignore.


Developer Edition and Pluralsight subscription for free, courtesy of Microsoft

No, this isn’t an April fools joke. Microsoft announced today at the Build conference that they’ve made SQL Server 2014 (and 2016 when it ships) Developer Edition completely free to download for anyone who’s a member of the free Visual Studio Dev Essentials community. The idea is to make it easier for developers to work with SQL Server for their application.

Now that in itself is cool, as it saves $59.95 per Developer Edition license (today’s price on the Microsoft Store), but they’re also throwing in a six-month, completely unlimited subscription to Pluralsight (where we have 140 hours of SQL Server training). That’s worth 6 x $29 = $174 at today’s prices. When they run out of six-month subscriptions, they’ll be giving out three-month subscriptions.

With more than 4,500 courses online to learn from, how can you beat free?

Check out the Visual Studio page to sign up – first come, first served on the six-month subscriptions!

New course: Building Simple Asynchronous Applications

My first post of the year is about our first Pluralsight course of the year!

Jonathan’s new course is called SQL Server: Building Simple Asynchronous Applications and is just over 1.5 hours long. It’s the first in a series of courses that Jonathan’s doing this year about using Service Broker, based on the extensive work he’s done with some of our clients building asynchronous processes. I’m really excited about this course being published as I think Service Broker is hugely underutilized in the SQL Server world.

The modules are:

  • Introduction
  • “Hello World” with Service Broker
  • Basic Architecture and Components
  • Building a Full Application
  • Basic Troubleshooting

Check it out here.

We now have 135 hours of SQLskills online training available (see all our courses here), all for as little as $29/month through Pluralsight (including more than four thousand other developer and IT training courses). That’s unbeatable value that you can’t afford to ignore.


2015 review: the year by the numbers

The last post of the year! It’s been a really excellent year all round and time for my traditional post counting down some of the numbers that have been my life this year.

  • 109318: the number of miles I flew on United
  • 33313: my current tweet total (up 1345 from 2014)
  • 12941: the number of subscribers to our Insider mailing list (up 1320 from 2014)
  • 11823: the number of emails I sent (down 444 from 2014)
  • 10843: the number of people who follow my Twitter ramblings (up 1448 from 2014)
  • 1603: the number of books (real ones) that I own (up 129 from 2014)
  • 868: the number of books I own but haven’t read yet (up 56 from 2014)
  • 148: the number of nights away from home (nearly all with Kimberly, so not *too* bad)
  • 131: the total number of hours of online training we have available on Pluralsight
  • 126: the number of dives I did this year in the Bahamas, Yap, Palau, and the Philippines, taking my total to 526
  • 115: the number of feet down on my deepest dive this year (going through swim-throughs with Jonathan in the Bahamas in January)
  • 91: the number of minutes of my longest dive this year
  • 88: the number of books I read (see this post)
  • 70: the number of days in Immersion Events and conferences
  • 42: the number of flights this year
  • 42: the number of Pluralsight courses we have available
  • 42: the answer to the question of life, the universe, and everything!
  • 40.55: the percentage of time we were away from home (which is why we call it our vacation home!)
  • 39: the number of SQLskills blog posts, including this one
  • 19: the number of different places we slept apart from our house and on planes
  • 18: the number of airports I flew through this year
  • 15: the number of new bird species I saw, taking my total to 499
  • 12: the number of monthly magazines I subscribe to
  • 8: the number of years I’ve been married to Kimberly
  • 8: the number of countries we visited this year
  • 7: the number of SQLskills full-time employees, all of whom are fabulous and indispensable
  • 7: the number of new airports I flew through, taking my total to 89
  • 4: the number of new countries I visited (Bahamas, Federated States of Micronesia, Palau, Philippines), taking my total to 36
  • 2: the number of new airlines I flew on, taking my total to 34
  • 2: the number of awesome daughters we have
  • 1: number of new U.S. states I visited, taking my total to 23, and my first new one since 2011
  • 1: the number of new SQLskills team members, and accomplished breeder of tilapias: Tim Radney
  • 1: the person who is the best as snapping her fingers (especially when making fun of me – snap snap snap!): Erin Stellato
  • 1: the biggest hardware geek and ex-tank commander I know: Glenn Berry
  • 1: the number of Jonathan Kehayias in the world – thankfully :-)
  • 1: the number of indispensable assistants, without whom our lives would be a distressing quagmire – Libby we love you!
  • Finally, the one and only best person in my life: Kimberly, without whom I would be lost…

Thank you to everyone who reads our blogs, follows us on Twitter, sends us questions, watches our videos, comes to our classes, and generally makes being deeply involved in the SQL community a joy.

I sincerely wish you all a happy, healthy, and prosperous New Year!


(At Kanangra Walls in February a few hundred kilometers from Sydney, with Erin and Jon before teaching IEPTO2)



(On board the Palau Aggressor liveaboard dive boat in July, our eldest behind me)


2015: the year in books

Back in 2009 I started posting a summary at the end of the year of what I read during the year (see my posts from 2009, 2010, 2011, 2012, 2013, 2014) and people have been enjoying it, so here I present the 2015 end-of-year post. I set a moderate goal of 50 books this year and I managed 88! I thought about pushing for 100 like I did in 2009 but I didn’t read enough in October and November to be able to do it. Just like last year, I wanted to get through some of my larger non-fiction books but ended up not doing as many as I thought (reading more, shorter books). Next year I’m setting myself a goal of reading 50 books again.

For the record, I read ‘real’ books – i.e. not in electronic form – I don’t like reading off a screen. Yes, I’ve seen electronic readers – we both have iPads – and I’m not interested in ever reading electronically. I also don’t ‘speed read’ – I read quickly and make lots of time for reading.

Why do I track metrics? Because I like doing it, and being able to compare against previous years. Some people don’t understand the logic in that – each to their own :-)

I vacillated for the last few days about which book to crown as my favorite, and I just couldn’t come to a decision, so just like in 2012, I give you my favorite 3 books: Seveneves by Neal Stephenson, All The Light We Cannot See by Anthony Doerr, and The Bone Clocks by David Mitchell. All three are just superb books and I strongly recommend you give them a try. You can read my review of them in the top-10 (well, 14) list below.

Now the details. I enjoy putting this together as it will also serve as a record for me many years from now. I hope you get inspired to try some of these books – push yourself with new authors and very often you’ll be surprisingly pleased. Don’t forget to check out the previous year’s blog posts for more inspiration too.

Once again I leave you with a quote that describes a big part of my psychological make-up:

In omnibus requiem quaesivi, et nusquam inveni nisi in angulo cum libro! (“Everywhere I have sought rest and found it nowhere, except in a corner with a book.”)

Analysis of What I Read

I read 37353 pages, or 102.34 pages a day, and a book every 4.1 days or so. The chart below shows the number of pages (y-axis) in each book I read (x-axis).



The average book length was 423 pages, more than 100 pages shorter than last year. That’s because I read a lot of series books where each isn’t hugely long.

The Top-10 (Well, 14)

I read a lot of truly *superb* books this year, and I just couldn’t whittle it down to a top-10, so here’s my top-14 (well, really more as some of them are the start of series). If you don’t read much, at least consider looking at some of these in 2016. It’s impossible to put them into a priority order so I’ve listed them in the order I read them, along with the short Facebook review I wrote at the time.

1 #2; All The Light We Cannot See; Anthony Doerr; 531pp; Historical Fiction; January 10; (Fabulous book about a blind French girl and an orphaned German boy who both experience WWII in their teenage years in vastly different ways, and come together briefly at the end of it. Wonderfully told, with richly evocative writing – I could visualize everything that was happening. Describes some of the horrors faced by those living through and perpetrating the occupation of France. Heading to Amazon to investigate his earlier works. Very strongly recommended.)

2 #10; Mr. Midshipman Hornblower (and the rest of the series); C.S. Forester; 320pp; Historical Fiction; February 7; (I’m rereading the Hornblower Saga this year after having last (and first) read them in 2000. An excellent start to the series, this book introduces the young, inexperienced Hornblower and sees him transform into an honorable, competent Lieutenant. This book was also the inspiration for the first 4 episodes of the terribly good A&E television series starring Ioan Gruffudd. Looking forward to getting into the second one, and maybe I’ll shoot for 100 read books again this year?)

3 #12; Ready Player One; Ernest Cline; 384pp; Science Fiction; February 12; (Really good novel about players competing to ‘win’ a world-encompassing immersive, VR game after the founder dies and leaves a giant fortune to the winner. Quite similar in scope to Snow Crash, but obviously a different story. Quite a page turner, recommended.)

4 #13; The Storied Life of A.J. Fikry; Gabrielle Zevin; 288pp; Contemporary Fiction; February 13; (Started reading this yesterday morning and it became a page turner for me. It’s a great chick flick basically (which I love, but not usually in book form), about a book store and its owner and his life. Lots of little twists in the gentle story and a nice read. Now I’m taking the girls to Elliot Bay Bookstore in Seattle to buy more books. Chain book stores just don’t cut it unfortunately. Recommended!)

5 #40; The Girl Who Played With Fire; Stieg Larsson; 630pp; Contemporary Fiction; May 4; (I read the first book (The Girl With The Dragon Tattoo) back in 2011 and loved the movie last year (the new one, not the older Swedish one). This book’s even better than the first one I think – it turned into a real page turner for me over the last couple of hundred pages. Again it’s hard to talk about the plot without giving things away, but it’s a great thriller and strongly recommended.)

6 #41; Gone Girl; Gillian Flynn; 432pp; Contemporary Fiction; May 8; (Excellent page turner with some great twists. Highly recommended and I can’t wait to see the movie!)

7 #44; Seveneves; Neal Stephenson; 869pp; Science Fiction; June 14; (Really excellent, and long, novel about the destruction of the surface of the Earth (from the break-up of the moon and subsequent bombardment with trillions of meteorites) and the human race’s survival in space (over a period of 5,000 years until the Earth’s surface cools down again) and re-colonization of the Earth. Very believable with no sci-fi that requires suspension of disbelief. Hugely recommended and I hope there’s a sequel.)

8 #48; Nexus (and the rest of the series); Ramez Naam; 528pp; Science Fiction; June 30; (Excellent book! Start of a trilogy (I have the other two with me) about a mind-altering drug that expands consciousness and allows minds to talk to each other. The protagonists have extended the concept to run a Linux-like OS in their heads, with all kinds of interesting apps. And of course the US govt. is against it, so all kinds of clandestine ops result, with lots of mayhem. A page-turner – highly recommended!)

9 #50; Master and Commander (and the rest of the series); Patrick O’Brian; 403pp; Historical Fiction; July 8; (First of the fantastic Aubrey-Maturin novels by Patrick O’Brian. I listened to all 20 of them in 2000-2002 while driving back-and-forth to work at Microsoft. This book introduces the principals, and deals with Jack Aubrey’s eventful captaincy of the sloop Sophie in the Mediterranean. Highly recommended, the entire series.)

10 #56; Avogadro Corp (and the rest of the series); William Hertling; 240pp; Science Fiction; July 23; (Cool start to the Singularity Series about runaway A.I. technology. In this book Avogadro gives its email program the capability to rewrite and/or send emails for maximum chance of success, based on who the email is being sent to. And then someone adds another directive to maximize the chances of the survival of the project, and the story takes off from there. Clever concept and a quick read. Looking forward to the rest of them. Recommended.)

11 #67; The Bone Clocks; David Mitchell; 624pp; Contemporary Fiction; August 25; (What an excellent book! A very clever story, woven through long chapters/novellas, each set in a different time, introducing and cleverly drawing together the principal characters. The character development is brilliant and I couldn’t put the book down – enormously entertaining and so far the best book I’ve read this year. Highly recommended!)

12 #70; Outlander; Diana Gabaldon; 640pp; Historical Fiction; September 6; (Several people have recommended this to me over the last year, given my Scottish roots, and I finally took the plunge and bought the first four books in the series. I’m glad I did! It’s a really good story about a woman who is transported back 200 years to just before the 1745 rebellion under Bonnie Prince Charlie and has to suddenly find her way in that time. It has plenty of colorful characters and action and I’m really looking forward to continuing with the next books. And of course there’s the T.V. series (which I haven’t watched yet but I’ve heard is really good). Highly recommended!)

13 #72; In Xanadu: A Quest; William Dalrymple; 320pp; Travel; September 17; (Excellent travelogue following Marco Polo’s journey along the Silk Road to Xanadu. They travel through Israel, Syria, Turkey, Iran, Pakistan, and China in the late ’80s, with all kinds of interesting encounters along the way. Highly recommended – love Dalrymple’s writing style!)

14 #76; The Golem and the Jinni; Helene Wecker; 512pp; Historical Fiction; September 30; (Excellent debut novel set in early 1900s New York, following the story of a golem (a creature made from clay and brought to life with Kabbalistic magic) and a jinni (a natural, elemental creature made of fire) that was trapped in a copper flask by a wizard a thousand years ago. It covers their problems integrating into the populace of New York, their eventual meeting, and problems when their true nature starts to be discovered. Very well written and highly engaging – highly recommended!)

The Complete List

And the complete list, with links to Amazon so you can explore further. One thing to bear in mind: the dates are when I finished reading each book, so they don’t mean that I started, for instance, book #2 after finishing book #1. I usually have anywhere from 10-15 books on the go at any one time so I can dip into whatever my mood is for that day. Some books I read start to finish without picking up another one, and some books take me over a year. Lots of long airplane flights help too!

  1. Mission Mongolia; David Treanor; 351pp; Travel; January 5
  2. All The Light We Cannot See; Anthony Doerr; 531pp; Historical Fiction; January 10
  3. The Pagan Lord; Bernard Cornwell; 300pp; Historical Fiction; January 14
  4. A Man on the Moon: The Voyages of the Apollo Astronauts; Andrew Chaikin; 720pp; History; January 17
  5. Design for Survival; General Thomas Power; 255pp; History; January 19
  6. Turing’s Cathedral: The Origins of the Digital Universe; George Dyson; 464pp; History; January 25
  7. The Soul of a New Machine; Tracy Kidder; 295pp; History; February 1
  8. The Book of Air and Shadows; Michael Gruber; 280pp; Contemporary Fiction; February 3
  9. State of the Art; Stan Augarten; 108pp; Nonfiction; February 6
  10. Mr. Midshipman Hornblower; C.S. Forester; 320pp; Historical Fiction; February 7
  11. African Air; George Steinmetz; 216pp; Photography; February 11
  12. Ready Player One; Ernest Cline; 384pp; Science Fiction; February 12
  13. The Storied Life of A.J. Fikry; Gabrielle Zevin; 288pp; Contemporary Fiction; February 13
  14. Half Way Home; Hugh Howey; 359pp; Science Fiction; February 14
  15. Lieutenant Hornblower; C.S. Forester; 320pp; Historical Fiction; February 16
  16. The Tipping Point: How Little Things Can Make a Big Difference; Malcolm Gladwell; 304pp; Nonfiction; February 17
  17. Daemon; Daniel Suarez; 640pp; Science Fiction; February 18
  18. See No Evil: The True Story of a Ground Soldier in the CIA’s War on Terrorism; Robert Baer; 320pp; Nonfiction; February 28
  19. Inferno; Dan Brown; 620pp; Contemporary Fiction; March 6
  20. Freedom; Daniel Suarez; 496pp; Science Fiction; March 8
  21. The Annotated Turing: A Guided Tour Through Alan Turing’s Historic Paper on Computability and the Turing Machine; Charles Petzold; 384pp; Nonfiction; March 14
  22. Influx; Daniel Suarez; 528pp; Science Fiction; March 15
  23. Diamond Dogs Turquoise Days; Alastair Reynolds; 304pp; Science Fiction; March 19
  24. Inferno: The Longfellow Translation; Dante; 200pp; Contemporary Fiction; March 19
  25. Wool; Hugh Howey; 528pp; Science Fiction; March 20
  26. Prador Moon; Neal Asher; 256pp; Science Fiction; March 21
  27. Halting State; Charles Stross; 336pp; Science Fiction; March 29
  28. Rule 34; Charles Stross; 352pp; Science Fiction; April 3
  29. Historical Atlas of the Pacific Northwest; Derek Hayes; 208pp; History; April 4
  30. Hornblower and the Hotspur; C.S. Forester; 400pp; Historical Fiction; April 9
  31. Hornblower During the Crisis; C.S. Forester; 176pp; Historical Fiction; April 11
  32. Hornblower and the Atropos; C.S. Forester; 342pp; Historical Fiction; April 16
  33. Maps of North America; Ashley & Miles Baynton-Williams; 189pp; History; April 18
  34. Beat To Quarters; C.S. Forester; 273pp; Historical Fiction; April 19
  35. Ship of the Line; C.S. Forester; 304pp; Historical Fiction; April 24
  36. The New Health Rules; Frank Lipman & Danielle Claro; 224pp; Nonfiction; April 24
  37. Flying Colours; C.S. Forester; 256pp; Historical Fiction; April 25
  38. Commodore Hornblower; C.S. Forester; 343pp; Historical Fiction; April 26
  39. Lord Hornblower; C.S. Forester; 336pp; Historical Fiction; May 2
  40. The Girl Who Played With Fire; Stieg Larsson; 630pp; Contemporary Fiction; May 4
  41. Gone Girl; Gillian Flynn; 432pp; Contemporary Fiction; May 8
  42. A Place Beyond Courage; Elizabeth Chadwick; 504pp; Historical Fiction; May 14
  43. Admiral Hornblower in the West Indies; C.S. Forester; 336pp; Historical Fiction; May 16
  44. Seveneves; Neal Stephenson; 869pp; Science Fiction; June 14
  45. Kill Decision; Daniel Suarez; 513pp; Science Fiction; June 25
  46. Cibola Burn; James S. A. Corey; 610pp; Science Fiction; June 27
  47. Infinite Worlds: The People and Places of Space Exploration; Michael Soluri; 352pp; Photography; June 28
  48. Nexus; Ramez Naam; 528pp; Science Fiction; June 30
  49. Into The Black: Odyssey One; Evan Currie; 580pp; Science Fiction; July 5
  50. Master and Commander; Patrick O’Brian; 403pp; Historical Fiction; July 8
  51. Crux; Ramez Naam; 577pp; Science Fiction; July 11
  52. The Heart of Matter: Odyssey One; Evan Currie; 627pp; Science Fiction; July 14
  53. Homeworld: Odyssey One; Evan Currie; 500pp; Science Fiction; July 16
  54. A Constellation of Vital Phenomena; Anthony Marra; 383pp; Contemporary Fiction; July 18
  55. Apex; Ramez Naam; 602pp; Science Fiction; July 21
  56. Avogadro Corp; William Hertling; 240pp; Science Fiction; July 23
  57. A.I. Apocalypse; William Hertling; 239pp; Science Fiction; July 28
  58. The Last Firewall; William Hertling; 305pp; Science Fiction; July 30
  59. The Turing Exception; William Hertling; 290pp; Science Fiction; July 31
  60. The Kill Artist; Daniel Silva; 490pp; Contemporary Fiction; August 3
  61. Henry I; C. Warren Hollister; 588pp; History; August 9
  62. For The King’s Favor; Elizabeth Chadwick; 530pp; Historical Fiction; August 13
  63. Mapping the World; Michael Swift; 256pp; History; August 15
  64. Out of the Black; Evan Currie; 440pp; Science Fiction; August 16
  65. To Defy a King; Elizabeth Chadwick; 523pp; Historical Fiction; August 21
  66. @War: The Rise of the Military-Internet Complex; Shane Harris; 288pp; Nonfiction; August 22
  67. The Bone Clocks; David Mitchell; 624pp; Contemporary Fiction; August 25
  68. The Lions of Lucerne; Brad Thor; 624pp; Contemporary Fiction; August 28
  69. In An Antique Land: History in the Guise of a Traveller’s Tale; Amitav Ghosh; 400pp; Nonfiction; August 31
  70. Outlander; Diana Gabaldon; 640pp; Historical Fiction; September 6
  71. The Abyss Beyond Dreams; Peter F. Hamilton; 608pp; Science Fiction; September 12
  72. In Xanadu: A Quest; William Dalrymple; 320pp; Travel; September 17
  73. Post Captain; Patrick O’Brian; 467pp; Historical Fiction; September 22
  74. On The Steel Breeze; Alastair Reynolds; 532pp; Science Fiction; September 24
  75. The Age of Kali: Indian Travels and Encounters; William Dalrymple; 356pp; Travel; September 29
  76. The Golem and the Jinni; Helene Wecker; 512pp; Historical Fiction; September 30
  77. Veritas; Monaldi and Sorti; 693pp; Historical Fiction; October 7
  78. The Years of Rice and Salt; Kim Stanley Robinson; 784pp; Science Fiction; October 17
  79. The Moon is a Harsh Mistress; Robert Heinlein; 382pp; Science Fiction; October 18
  80. The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution; Walter Isaacson; 542pp; History; November 1
  81. Hunter Killer: Inside America’s Unmanned Air War; T. Mark McCurley; 368pp; Nonfiction; November 14
  82. Nemesis Games; James S. A. Corey; 544pp; Science Fiction; November 17
  83. The Girl Who Kicked the Hornet’s Nest; Stieg Larsson; 672pp; Contemporary Fiction; November 30
  84. The English Assassin; Daniel Silva; 416pp; Contemporary Fiction; December 10
  85. Afghanistan: A Military History from Alexander the Great to the Taliban Insurgency; Stephen Tanner; 392pp; History; December 23
  86. H.M.S. Surprise; Patrick O’Brian; 416pp; Historical Fiction; December 24
  87. The Confessor; Daniel Silva; 480pp; Contemporary Fiction; December 26
  88. The Mauritius Command; Patrick O’Brian; 348pp; Historical Fiction; December 27