Who is overriding MAXDOP 1 on the instance?

In Monday’s newsletter I discussed an email question I’d been sent and promised to blog about it.

The question was “how can latch waits for ACCESS_METHODS_DATASET_PARENT occur when the instance MAXDOP is set to 1?” This shouldn’t happen, as that latch is only used for coordination between threads performing a parallel table/index scan. However, just because the instance MAXDOP is set to 1, that doesn’t mean no parallelism can occur.

Anyone can override the instance MAXDOP setting using a query hint.

There are two ways to prevent this:

  • Set the instance cost threshold for parallelism setting to be extremely high, so no serial plan is costed as expensive enough for a parallel plan to be considered
  • Make everyone run within a Resource Governor (Enterprise Edition only) workload group that has the MAX_DOP set to 1

Or you can figure out where the query hint is being used and remove it.
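As a quick sketch, the two prevention options look like this (the workload group name below is made up for illustration, and you'd also need a classifier function to route sessions into the group):

```sql
-- Option 1: raise the cost threshold so parallel plans are never considered
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'cost threshold for parallelism', 32767;
RECONFIGURE;

-- Option 2 (Enterprise Edition): cap DOP through Resource Governor
-- [MaxDopOne] is a hypothetical group name
CREATE WORKLOAD GROUP [MaxDopOne] WITH (MAX_DOP = 1);
ALTER RESOURCE GOVERNOR RECONFIGURE;
```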

In this post I’m going to show you a simple Extended Events session that will capture information about what’s causing CXPACKET waits (you can’t have ACCESS_METHODS_DATASET_PARENT latch waits without CXPACKET waits happening too), and then refine it to use a less expensive event.

First off I’ll set my MAXDOP to 1:

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

EXEC sp_configure 'max degree of parallelism', 1;
RECONFIGURE;

SELECT [name], [value_in_use] FROM sys.configurations WHERE [name] LIKE '%degree%';
Configuration option 'show advanced options' changed from 0 to 1. Run the RECONFIGURE statement to install.
Configuration option 'max degree of parallelism' changed from 0 to 1. Run the RECONFIGURE statement to install.
name                       value_in_use
-------------------------- -------------
max degree of parallelism  1

Then I’ll check for CXPACKET waits (using my waits query) after running the following query, which scans a 6.7 million row table (you can get the SalesDB database from here):

SELECT * FROM [SalesDB].[dbo].[Sales] WHERE [Quantity] > 10000;

No CXPACKET waits.

But if I add a MAXDOP query hint and then check for waits:

SELECT * FROM [SalesDB].[dbo].[Sales] WHERE [Quantity] > 10000 OPTION (MAXDOP 8);
WaitType            Wait_S  Resource_S Signal_S WaitCount Percentage AvgWait_S AvgRes_S AvgSig_S Help/Info URL
------------------- ------- ---------- -------- --------- ---------- --------- -------- -------- ---------------------------------------------
LATCH_EX            0.20    0.16       0.05     93        79.61      0.0022    0.0017   0.0005   https://www.sqlskills.com/help/waits/LATCH_EX
CXPACKET            0.05    0.05       0.00     16        20.00      0.0032    0.0032   0.0000   https://www.sqlskills.com/help/waits/CXPACKET

And the instance MAXDOP was successfully overridden.

Now I’ll set up a simple Extended Events session to track down the offending code (based on the query from here). It’s very important that you query the sys.dm_xe_map_values DMV to find the correct number to use in the query for the CXPACKET wait, as these numbers often change from release to release, and even in Service Packs. For instance, CXPACKET was 191 in SQL Server 2014 RTM, but is 190 in the 2014 build I’m using.
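The lookup looks like this:

```sql
SELECT [map_key]
FROM sys.dm_xe_map_values
WHERE [name] = N'wait_types'
    AND [map_value] = N'CXPACKET';
```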

Be very careful about running this in production, as the event will fire for *every* wait that occurs and so will likely affect your workload throughput, even though it’ll short-circuit if the wait isn’t CXPACKET. I’ll show you a better event to use lower down.

IF EXISTS (
    SELECT * FROM sys.server_event_sessions
        WHERE [name] = N'InvestigateWaits')
    DROP EVENT SESSION [InvestigateWaits] ON SERVER;
GO

CREATE EVENT SESSION [InvestigateWaits] ON SERVER
ADD EVENT [sqlos].[wait_info] (
    ACTION ([sqlserver].[client_hostname], [sqlserver].[nt_username], [sqlserver].[sql_text])
    WHERE [wait_type] = 190 -- CXPACKET only
    AND [opcode] = 1 -- Just the end wait events
)
ADD TARGET [package0].[ring_buffer]
WITH (MAX_MEMORY = 50 MB, MAX_DISPATCH_LATENCY = 5 SECONDS);
GO

ALTER EVENT SESSION [InvestigateWaits] ON SERVER STATE = START;


And then when I run the select statement again I can look in the ring buffer and see the events. I put in a DISTINCT to minimize the number of lines of output. The code is:

SELECT
    DISTINCT ([data1].[value] ('(./@timestamp)[1]', 'datetime')) AS [Time],
    [data1].[value] ('(./action[@name="client_hostname"]/value)[1]', 'VARCHAR(MAX)') AS [Host],
    [data1].[value] ('(./action[@name="nt_username"]/value)[1]', 'VARCHAR(MAX)') AS [User],
    [data1].[value] ('(./action[@name="sql_text"]/value)[1]', 'VARCHAR(MAX)') AS [Statement]
FROM (
    SELECT CONVERT (XML, [target_data]) AS data
    FROM sys.dm_xe_session_targets [xst]
    INNER JOIN sys.dm_xe_sessions [xs]
        ON [xst].[event_session_address] = [xs].[address]
    WHERE [xs].[name] = N'InvestigateWaits') AS t
CROSS APPLY data.nodes('//event') n (data1);
Time                    Host       User            Statement
----------------------- ---------- --------------- ---------------------------------------------------------------------------------
2017-02-16 00:20:16.937 APPLECROSS APPLECROSS\Paul SELECT * FROM [SalesDB].[dbo].[Sales] WHERE [Quantity] > 10000 OPTION (MAXDOP 8);
2017-02-16 00:20:16.987 APPLECROSS APPLECROSS\Paul SELECT * FROM [SalesDB].[dbo].[Sales] WHERE [Quantity] > 10000 OPTION (MAXDOP 8);
2017-02-16 00:20:16.937 APPLECROSS                 SELECT * FROM [SalesDB].[dbo].[Sales] WHERE [Quantity] > 10000 OPTION (MAXDOP 8);

Cool – so that works, but the session will likely affect workload performance, as I mentioned above. A better event to use is degree_of_parallelism, which was introduced in SQL Server 2012 and only fires once per batch execution, rather than once for every wait that occurs.

The updated event session is:

CREATE EVENT SESSION [InvestigateWaits] ON SERVER
ADD EVENT [sqlserver].[degree_of_parallelism] (
    ACTION ([sqlserver].[client_hostname], [sqlserver].[nt_username], [sqlserver].[sql_text])
    WHERE [dop] > 1 -- parallel plans only
)
ADD TARGET [package0].[ring_buffer]
WITH (MAX_MEMORY = 50 MB, MAX_DISPATCH_LATENCY = 5 SECONDS);
GO
ALTER EVENT SESSION [InvestigateWaits] ON SERVER STATE = START;

And the code to parse the XML, and sample output from my query is:

SELECT
    [data1].[value] ('(./@timestamp)[1]', 'datetime') AS [Time],
    [data1].[value] ('(./data[@name="dop"]/value)[1]', 'INT') AS [DOP],
    [data1].[value] ('(./action[@name="client_hostname"]/value)[1]', 'VARCHAR(MAX)') AS [Host],
    [data1].[value] ('(./action[@name="nt_username"]/value)[1]', 'VARCHAR(MAX)') AS [User],
    [data1].[value] ('(./action[@name="sql_text"]/value)[1]', 'VARCHAR(MAX)') AS [Statement]
FROM (
    SELECT CONVERT (XML, [target_data]) AS data
    FROM sys.dm_xe_session_targets [xst]
    INNER JOIN sys.dm_xe_sessions [xs]
        ON [xst].[event_session_address] = [xs].[address]
    WHERE [xs].[name] = N'InvestigateWaits') AS t
CROSS APPLY data.nodes('//event') n (data1);
Time                    DOP Host       User            Statement
----------------------- --- ---------- --------------- ---------------------------------------------------------------------------------
2017-02-16 00:36:37.347 8   APPLECROSS APPLECROSS\Paul SELECT * FROM [SalesDB].[dbo].[Sales] WHERE [Quantity] > 10000 OPTION (MAXDOP 8);

Pretty cool, eh?

You can extend this with more complex code that does things like grab the tsql_stack action and parse it out, or capture more information to identify the culprit, like the application name – however complex you want to get – but now you know the base event session to use to capture when the query hint is being used.
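For example, pulling in the application name and the T-SQL call stack is just a matter of adding actions to the event (a sketch only – the tsql_stack output needs its own XML parsing, and I’ve used a different session name to avoid a clash):

```sql
CREATE EVENT SESSION [InvestigateWaitsWithStack] ON SERVER
ADD EVENT [sqlserver].[degree_of_parallelism] (
    ACTION (
        [sqlserver].[client_app_name], -- application name
        [sqlserver].[tsql_stack],      -- T-SQL call stack
        [sqlserver].[client_hostname],
        [sqlserver].[nt_username],
        [sqlserver].[sql_text])
    WHERE [dop] > 1)
ADD TARGET [package0].[ring_buffer]
WITH (MAX_MEMORY = 50 MB);
```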


Presenting at SQLSaturday Dublin in June


Although we won’t be presenting any Immersion Events in Europe in 2017, Kimberly and I will both be presenting workshops and sessions at SQLSaturday Dublin in June – our first SQLSaturday ever!


My workshop details:

  • Performance Troubleshooting Using Waits and Latches
  • One of the first things you should check when investigating performance issues is wait and latch statistics, as these can often point you in the right direction for further analysis. Unfortunately, many people misinterpret what SQL Server is telling them and jump to conclusions about how to solve the problem – what is often called ‘knee-jerk performance tuning’.
    In this full-day workshop, you’ll learn how to investigate and interpret wait and latch statistics – practical techniques you can take home and start using immediately. You’ll also learn what a myriad of wait and latch types actually mean and how you can investigate lightly-documented types to gain insight into what causes them. Don’t waste time when it comes to performance troubleshooting; wait statistics are the fastest route to understanding your problems and this workshop will help you get there faster.
  • Thursday, June 15th
  • Eventbrite registration link

Kimberly’s workshop details:

  • Queries Gone Wrong: Statistics, Cardinality, Solutions
  • Have you ever wondered why SQL Server did what it did when processing your query? Have you wondered if it could have done better? Query estimates/statistics are often the key to understanding; this session will explain the what, why, and how about estimates!
    Most of the time SQL Server returns data quickly – except when it doesn’t. Ultimately, what you see in the plan just doesn’t seem to make sense. Why? Where did it come up with this plan? From one side, Transact-SQL is a declarative language that details what data you need but without information about how SQL Server should get it. Join order, predicate analysis – this is what SQL Server has to decide based on your query’s input. But, what should be processed first? Which table should we use to “drive” the join? Usually, it’s the table with the smallest number of rows that match your query’s predicates. But, how does SQL Server know which has the smallest set before it processes the data? Statistics!
    There are numerous reasons why query performance can suffer and in this full-day workshop, Kimberly will cover a number of critical areas and for each – show you the behaviors, execution plans, troubleshooting techniques, and most importantly, possible solutions. This full-day workshop is about solving your query performance problems. Each problem has a different way of approaching it and you’ll walk away with a plethora of strategies to troubleshoot and tackle even gnarly query problems. Stop with the “sledgehammer” approaches (updating statistics, rebuilding indexes, recompiling plans, clearing cache, restarting SQL Server) and solve the problem. In this full-day workshop, you’ll learn much more finessed ways to solve query plan quality problems.
    Topics covered include understanding / maintaining statistics, handling VLTs / skewed data, distribution problems, troubleshooting common and advanced scenarios, and how to best utilize the cardinality estimation models (and trace flags) available in SQL Server versions 2008-2016.
  • Friday, June 16th
  • Eventbrite registration link

My Saturday session details:

  • Advanced Data Recovery Techniques
  • Disasters happen – plain and simple. When disaster strikes a database you’re responsible for, and backups and repair fail, how can you salvage data, and possibly your company and your job? This is where advanced data recovery techniques come in. Using undocumented tools and deep knowledge of database structures, you can manually patch up the database enough to extract critical data. This demo-heavy session will show you never-seen-before methods I’ve used extensively in the last year to salvage data for real-life clients after catastrophic corruption. You won’t believe what it’s possible to do!

Kimberly’s Saturday session details:

  • Plan Cache Pollution: Dealing with Ad Hoc Madness
  • How you execute your data requests can have a profound effect on performance and plan reuse. Did you ever wonder where that estimate (in the showplan) comes from? Is it correct or, is it way off? Why? You’ve probably heard that many of your problems are related to statistics. Potentially, you’ve even rebuilt statistics only to find that it fixes the problem. However, what you might be seeing is a false positive. And, it doesn’t always work. Come to this session to find out how you should be executing your statements for the best performance, caching, and reuse! We’ll look at ad hoc statements, dynamically constructed statements, and sp_executesql (e.g. forced statement caching) and I’ll debunk the mysteries around estimates so that you can solve performance problems the RIGHT way! If you want to demystify SQL Server’s decisions used for statement execution, query plans, and plan caching – this is the place to be!

The link to the main SQLSaturday Dublin web page is here.

There are a ton of excellent speakers attending from around the world, so it’s going to be a really great weekend – we hope to see you there!

New class: Immersion Event on PowerShell for SQL Server DBAs

We have a really cool new class debuting this May in Chicago: Immersion Event on PowerShell for SQL Server DBAs.

It’s a 3-day class, taught by MVP, MCM, and industry expert Ben Miller, and no prior PowerShell experience is necessary. PowerShell is getting more and more popular, and by the end of the class you’ll have learned the following:

  • Installation and configuration of PowerShell
  • Use of the ISE that comes with PowerShell
  • Shared Management Objects (SMO) for SQL Server
  • Programming concepts in PowerShell
  • Modules and scripts to manage SQL Server
  • Gathering data for analysis using PowerShell both with SMO and T-SQL
  • Repositories that contain scripts that you can leverage in your daily work

The modules are as follows:

  • PowerShell Environment
  • Commands, Cmdlets, and Modules
  • Environment Configuration
  • Assemblies in PowerShell
  • PowerShell Fundamentals
  • SQL Server PowerShell
  • PowerShell for the DBA Starter
  • Gathering Information
  • Power Tools
  • Scheduling

You can read a more detailed curriculum here and all the class registration and logistical details are here.

We hope to see you there!

New course: Introduction to Query Store

Our first new course of the year, and our 50th online training course with Pluralsight!!

Erin’s latest Pluralsight course has been published – SQL Server: Introduction to Query Store – and is just over three hours long. It’s based on her very popular user group and conference session, but much expanded with lots of cool demos.

The modules are:

  • Introduction
  • Defining Query Store
  • Implementing Query Store
  • Installing SQL Server 2016
  • Forcing Plans with Query Store
  • Practical Uses of Query Store

Check it out here.

We now have 150 hours of SQLskills online training available (see all our 50 courses here), all for as little as $29/month through Pluralsight (including more than 5,000 other developer and IT training courses). That’s unbeatable value that you can’t afford to ignore.



The unusual LOGMGR_RESERVE_APPEND wait

Last week I was sent an email question about the cause of LOGMGR_RESERVE_APPEND waits, and in Monday’s Insider newsletter I wrote a short explanation. It’s a very unusual wait to see as the highest wait on a server, and in fact it’s very unusual to see it at all.

It happens when a thread is generating a log record and needs to write it into a log block, but there’s no space in the log to do so. The thread first tries to grow the log, and if that fails, and the database is in the simple recovery mode, it waits for 1 second to see if log clearing/truncation can happen in the meantime and free up some space. (Note that when I say ‘simple recovery mode’, this also includes a database in the full or bulk_logged recovery mode where a full backup has not been taken – i.e. the database is operating in what’s called pseudo-simple.)

It’s the fact that the database needs to be in the simple recovery mode and have no space available that makes this wait type very unusual to see.
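If you do see it, a quick check of which databases report the simple recovery model, and what’s currently preventing log clearing, is (note that a database in pseudo-simple will still report FULL or BULK_LOGGED here):

```sql
SELECT
    [name],
    [recovery_model_desc],
    [log_reuse_wait_desc] -- what's preventing log clearing right now
FROM sys.databases;
```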

Here’s an example call stack from SQL Server 2014 (captured using this mechanism):


Here’s a scenario that shows it happening. First I’ll create the database with a fixed size log, set it to simple recovery mode, and create a table that will generate large log records for inserts:

CREATE DATABASE [Company] ON PRIMARY (
    NAME = N'Company_data',
    FILENAME = N'D:\SQLskills\Company_data.mdf')
LOG ON (
    NAME = N'Company_log',
    FILENAME = N'C:\SQLskills\Company_log.ldf',
    SIZE = 2MB,
    FILEGROWTH = 0MB);
GO
ALTER DATABASE [Company] SET RECOVERY SIMPLE;
GO
USE [Company];
CREATE TABLE [BigRows] ([c1] INT IDENTITY, [c2] CHAR (8000) DEFAULT 'a');

And then in two other windows, run the following code:

USE [Company];
GO
SET NOCOUNT ON;
GO
WHILE (1 = 1)
    INSERT INTO [BigRows] DEFAULT VALUES;

And within a few seconds, you’ll see LOGMGR_RESERVE_APPEND waits happening (using my waits script). Here’s an example (with a few columns removed for brevity):

WaitType                       WaitCount Percentage AvgWait_S AvgRes_S AvgSig_S Help/Info URL
------------------------------ --------- ---------- --------- -------- -------- -------------------------------------------------------------------
LOGMGR_RESERVE_APPEND          26        50.78      0.9847    0.9847   0.0000   https://www.sqlskills.com/help/waits/LOGMGR_RESERVE_APPEND
PREEMPTIVE_OS_FLUSHFILEBUFFERS 954       22.14      0.0116    0.0116   0.0000   https://www.sqlskills.com/help/waits/PREEMPTIVE_OS_FLUSHFILEBUFFERS
WRITELOG                       131625    21.63      0.0001    0.0001   0.0000   https://www.sqlskills.com/help/waits/WRITELOG
PAGEIOLATCH_UP                 5841      3.37       0.0003    0.0003   0.0000   https://www.sqlskills.com/help/waits/PAGEIOLATCH_UP

So if you ever see these waits, look for databases using the simple recovery mode where the log is set to have zero or very tiny autogrowth.
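A simple way to find such log files (a sketch – growth is zero when autogrow is disabled, and you may also want to flag tiny fixed increments):

```sql
SELECT
    DB_NAME ([database_id]) AS [Database],
    [name] AS [Log File],
    [size] * 8 / 1024 AS [Size MB] -- size is in 8KB pages
FROM sys.master_files
WHERE [type_desc] = N'LOG'
    AND [growth] = 0; -- autogrow disabled
```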


Code to show rolled back transactions after a crash

In Monday’s Insider newsletter I discussed an email question I’d been sent about how to identify the transactions that had rolled back because of a crash, and I said I’d blog some code to do it.

First of all you need to know the time of the crash. We can’t get this exactly (from SQL Server) unless SQL Server decides to shut itself down for some reason (like tempdb corruption) but we can easily get the time that SQL Server restarted, which is good enough, as we just need to know a time that’s after the transactions started before the crash, and before those transactions finished rolling back after a crash. We can get the startup time from the sqlserver_start_time column in the output from sys.dm_os_sys_info.
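That lookup is a one-liner:

```sql
SELECT [sqlserver_start_time] FROM sys.dm_os_sys_info;
```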

Then we can search in the transaction log, using the fn_dblog function, for LOP_BEGIN_XACT log records from before the crash point that have a matching LOP_ABORT_XACT log record after the crash point, and with the same transaction ID. This is easy because for LOP_BEGIN_XACT log records, there’s a Begin Time column, and for LOP_ABORT_XACT log records (and, incidentally, for LOP_COMMIT_XACT log records), there’s an End Time column in the TVF output.

And there’s a trick you need to use: to get the fn_dblog function to read log records from before the log clears (by the checkpoints that crash recovery does, in the simple recovery model, or by log backups, in other recovery models), you need to enable trace flag 2537. Now, if you do all this too long after crash recovery runs, the log may have overwritten itself and so you won’t be able to get the info you need, but if you’re taking log backups, you could restore a copy of the database to the point just after crash recovery has finished, and then do the investigation.
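A minimal sketch of the log search (the proc at the end of the post automates the before/after matching):

```sql
DBCC TRACEON (2537); -- let fn_dblog see log records from before the log cleared

SELECT [Transaction ID], [Transaction Name], [Begin Time], [End Time], [Operation]
FROM fn_dblog (NULL, NULL)
WHERE [Operation] IN (N'LOP_BEGIN_XACT', N'LOP_ABORT_XACT');

DBCC TRACEOFF (2537);
```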

After that, the tricky part is matching what those transactions were doing back to business operations that your applications were performing. If you don’t name your transactions, that’s going to be pretty hard, as all you’ve got are the generic names that SQL Server gives transactions (like INSERT, DELETE, DROPOBJ). Whatever the reason you might want this information, your applications should be written so they gracefully handle transaction failures and leave the database in a consistent state (as far as your business rules are concerned – of course SQL Server leaves the database in a transactionally-consistent state after a crash).

I’ve written some code and encapsulated it in a proc, sp_SQLskillsAbortedTransactions, which is shown in full at the end of the post. To use it, you go into the context of the database you’re interested in, and just run the proc. It takes care of enabling and disabling the trace flag.

Here’s an example of a crash situation and using the proc.

First I’ll create a table and start a transaction:

USE [master];

IF DATABASEPROPERTYEX (N'Company', N'Version') > 0
    DROP DATABASE [Company];
GO

CREATE DATABASE [Company];
GO

USE [Company];

CREATE TABLE [test] ([c1] INT, [c2] INT, [c3] INT);
INSERT INTO [test] VALUES (0, 0, 0);

BEGIN TRAN FirstTransaction;
INSERT INTO [Test] VALUES (1, 1, 1);

Now in a second window, I’ll start another transaction, and force the log to flush to disk (as I haven’t generated enough log to have the current log block automatically flush to disk):

USE [Company];

BEGIN TRAN SecondTransaction;
INSERT INTO [Test] VALUES (2, 2, 2);

EXEC sp_flush_log;

And in a third window, I’ll force a crash:

SHUTDOWN WITH NOWAIT;

After restarting the instance, I can use this code to run my proc:

USE [Company];

EXEC sp_SQLskillsAbortedTransactions;
Begin Time               Transaction Name   Started By       Transaction ID
------------------------ ------------------ ---------------- --------------
2017/01/18 17:09:36:190  FirstTransaction   APPLECROSS\Paul  0000:00000374
2017/01/18 17:09:40:600  SecondTransaction  APPLECROSS\Paul  0000:00000375

Cool eh?

Here’s the code – enjoy!

/*============================================================================
  File:     sp_SQLskillsAbortedTransactions.sql

  Summary:  This script cracks the transaction log and shows which
            transactions were rolled back after a crash

  SQL Server Versions: 2012 onwards

  Written by Paul S. Randal, SQLskills.com
  (c) 2017, SQLskills.com. All rights reserved.

  For more scripts and sample code, check out
    https://www.SQLskills.com

  You may alter this code for your own *non-commercial* purposes. You may
  republish altered code as long as you include this copyright and give due
  credit, but you must obtain prior permission before blogging this code.
============================================================================*/

USE [master];
GO

IF OBJECT_ID (N'sp_SQLskillsAbortedTransactions') IS NOT NULL
    DROP PROCEDURE [sp_SQLskillsAbortedTransactions];
GO

CREATE PROCEDURE sp_SQLskillsAbortedTransactions
AS
BEGIN
    SET NOCOUNT ON;

    DBCC TRACEON (2537);

    DECLARE @BootTime   DATETIME;
    DECLARE @XactID     CHAR (13);

    SELECT @BootTime = [sqlserver_start_time] FROM sys.dm_os_sys_info;

    IF EXISTS (SELECT * FROM [tempdb].[sys].[objects]
        WHERE [name] = N'##SQLskills_Log_Analysis')
        DROP TABLE [##SQLskills_Log_Analysis];

    -- Get the list of started and rolled back transactions from the log
    SELECT
        [Begin Time],
        [Transaction Name],
        SUSER_SNAME ([Transaction SID]) AS [Started By],
        [Transaction ID],
        [End Time],
        0 AS [RolledBackAfterCrash],
        [Operation]
    INTO ##SQLskills_Log_Analysis
    FROM fn_dblog (NULL, NULL)
    WHERE ([Operation] = 'LOP_BEGIN_XACT' AND [Begin Time] < @BootTime)
    OR ([Operation] = 'LOP_ABORT_XACT' AND [End Time] > @BootTime);

    -- Find the transactions that started before the crash and
    -- rolled back after it
    DECLARE [LogAnalysis] CURSOR FAST_FORWARD FOR
    SELECT
        [Transaction ID]
    FROM ##SQLskills_Log_Analysis
    WHERE [Operation] = 'LOP_BEGIN_XACT';

    OPEN [LogAnalysis];

    FETCH NEXT FROM [LogAnalysis] INTO @XactID;

    WHILE @@FETCH_STATUS != -1
    BEGIN
        IF EXISTS (
            SELECT [End Time] FROM ##SQLskills_Log_Analysis
            WHERE [Operation] = 'LOP_ABORT_XACT' AND [Transaction ID] = @XactID)
        UPDATE ##SQLskills_Log_Analysis SET [RolledBackAfterCrash] = 1
            WHERE [Transaction ID] = @XactID
            AND [Operation] = 'LOP_BEGIN_XACT';

        FETCH NEXT FROM [LogAnalysis] INTO @XactID;
    END;

    CLOSE [LogAnalysis];
    DEALLOCATE [LogAnalysis];

    SELECT
        [Begin Time],
        [Transaction Name],
        [Started By],
        [Transaction ID]
    FROM ##SQLskills_Log_Analysis
    WHERE [RolledBackAfterCrash] = 1;

    DBCC TRACEOFF (2537);

    DROP TABLE ##SQLskills_Log_Analysis;
END
GO

EXEC sys.sp_MS_marksystemobject [sp_SQLskillsAbortedTransactions];
GO

-- Example usage:
-- USE [Company]; EXEC sp_SQLskillsAbortedTransactions;

Summer 2017 classes in Bellevue open for registration

I’ve just released our second set of classes for 2017 for registration!

Our classes in July/August will be in Bellevue, WA:

  • IEPTO1: Immersion Event on Performance Tuning and Optimization – Part 1
    • July 31-August 4
  • IEPTO2: Immersion Event on Performance Tuning and Optimization – Part 2
    • August 7-11

Note that there is a full set of classes in Chicago in April/May, and that there will be NO classes in Europe in 2017.

You can get all the logistical, registration, and curriculum details by drilling down from our main schedule page.

We hope to see you there!

2016 review: the year by the numbers

The last post of the year! It’s been a really excellent year all round and time for my traditional post counting down some of the numbers that have been my life this year.

  • 125,402: the number of miles I flew on United
  • 34,085: my current tweet total (up 772 from 2015)
  • 13,434: the number of subscribers to our Insider mailing list (up 493 from 2015)
  • 12,168: the number of people who follow my Twitter ramblings (up 1,325 from 2015)
  • 8,798: the number of emails I sent (down 3,025 from 2015 – this is a good thing!)
  • 1,779: the number of books (real ones) that I own (up 176 from 2015)
  • 998: the number of books I own but haven’t read yet (up 130 from 2015)
  • 286: the number of SQL Server wait types I have documented in my Wait Types Library
  • 184: the number of nights away from home (up 36 from 2015, and all with Kimberly, so not *too* bad)
  • 149: the total number of hours of online training we have available on Pluralsight
  • 127: the number of dives I will have completed by 12/31, all in Indonesia, taking my total to 653
  • 91: the number of minutes of my longest dive this year
  • 60: the number of days in Immersion Events and conferences
  • 50.41: the percentage of time we were away from home (which is why we call it our vacation home!)
  • 49: the number of Pluralsight courses we have available
  • 45: the number of books I read (see this post)
  • 42: the number of flights this year (same as last year)
  • 42: the answer to the question of life, the universe, and everything!
  • 30: the number of different places we slept apart from our house and on planes
  • 25: the number of SQLskills blog posts, including this one
  • 22: the number of airports I flew through this year
  • 12: the number of new bird species I saw, taking my total to 511
  • 12: the number of monthly magazines I subscribe to
  • 9: the number of years I’ve been married to Kimberly
  • 8: the number of different octopus species we saw in Indonesia (coconut, algae, reef, blue-ring, starry-night, mimic, wonderpus, long-arm)
  • 7: the number of countries we visited this year
  • 7: the number of full-time SQLskills employees, all of whom are fabulous and indispensable
  • 2: the number of new airports I flew through (Manado and Labuan Bajo, both in Indonesia), taking my total to 91
  • 2: the number of new countries I visited (Sweden and Finland), taking my total to 38
  • 2: the number of awesome daughters we have
  • 1: the number of new airlines I flew on (Silk Air, based in Singapore), taking my total to 35
  • 1: the person who seems to cram the most into non-work time (farming, scouts, PokemonGo, building, …): Tim Radney
  • 1: the person who is the best at snapping her fingers and ran a marathon this year: Erin Stellato
  • 1: the biggest hardware geek, master beer brewer, and ex-tank commander I know: Glenn Berry
  • 1: the number of Jonathan Kehayias in the world – thankfully :-)
  • 1: the number of indispensable assistants, without whom our lives would be a distressing quagmire – Libby we love you!
  • Finally, the one and only best person in my life: Kimberly, without whom I would be lost…

Thank you to everyone who reads our blogs, follows us on Twitter, sends us questions, watches our videos, comes to our classes, and generally makes being deeply involved in the SQL community a joy.

I sincerely wish you all a happy, healthy, and prosperous New Year!


(The whole team with all their kids after watching/helping Jonathan propose to Kristi at The Bean in Chicago in May)


(With our wonderful girls in Longyearbyen on Svalbard in August)


(At about 80 degrees north, next to the 7th largest ice cap in the world (Austfonna on the island of Nordaustlandet in the Svalbard archipelago) a couple of weeks later, at about 10pm. The ambient temperature is just above freezing, with a gentle katabatic wind coming off the ice, and I’m in shorts and a t-shirt – I don’t get cold!)


2016: the year in books

Back in 2009 I started posting a summary at the end of the year of what I read during the year (see my posts from 2009, 2010, 2011, 2012, 2013, 2014, 2015) and people have been enjoying it, so here I present the 2016 end-of-year post. I set a moderate goal of 50 books this year but I only managed 45 – the first year since 2009 that I’ve missed my goal – as I spent a bunch of time catching up with my magazine backlog. Next year I’m setting myself a goal of reading 50 books again.

For the record, I read ‘real’ books – i.e. not in electronic form – I don’t like reading off a screen. Yes, I’ve seen electronic readers – we both have iPads – and I’m not interested in ever reading electronically. I also don’t ‘speed read’ – I read quickly and make lots of time for reading.

Why do I track metrics? Because I like doing it, and being able to compare against previous years. Some people don’t understand the logic in that – each to their own :-)

I went back-and-forth over the last few days about which book to nominate as my favorite, and I just couldn’t come to a decision, so just like in most years, I give you my favorite 3 books: Code: The Hidden Language of Computer Hardware and Software by Charles Petzold, Gardens of the Moon: A Tale of the Malazan Book of the Fallen by Steven Erikson, and The Departure: The Owner: Book One by Neal Asher. All three are superb books (with the last two being the start of series) and I strongly recommend you give them a try. You can read my reviews of them in the top-10 list below.

Now the details. I enjoy putting this together as it will also serve as a record for me many years from now. I hope you get inspired to try some of these books – push yourself with new authors and very often you’ll be surprisingly pleased. Don’t forget to check out the previous year’s blog posts for more inspiration too.

As usual I leave you with a quote that describes a big part of my psychological make-up:

In omnibus requiem quaesivi, et nusquam inveni nisi in angulo cum libro!

Analysis of What I Read

I read 19,344 pages, or 53.00 pages a day, and a book every 8.1 days or so. The chart below shows the number of pages (y-axis) in each book I read (x-axis).



The average book length was 429 pages, slightly longer than last year but shorter than previous years. That’s because I again read a lot of series books where each isn’t hugely long.

The Top 10

I read a lot of excellent books this year but because I only read 45, I was able to whittle them down to a top-10, unlike previous years. If you don’t read much, at least consider looking at some of these in 2017. It’s impossible to put them into a priority order so I’ve listed them in the order I read them, along with the short Facebook review I wrote at the time.

1 #2; Code: The Hidden Language of Computer Hardware and Software; Charles Petzold; 396pp; Nonfiction; January 14; (This book is really excellent! It’s a very cleverly written introduction and exploration of how computers work at the logic level, and takes the reader from the basics of electricity and binary to the intricacies of memory buses, CPUs, and assembly language. I quickly skimmed the first hundred or so pages until I got to the part about building counters from relays and it started to refresh my memory with things I’d learned back in 1990 when I did my B. Eng (Hons) degree in computer science and electronics in Edinburgh. I read this book as a way to kick start getting back into computer design as I want to build a CPU and computer system out of TTL logic (one of my many, many ‘spare time’ goals). First though, when I get home I’m going to build some logic circuits out of relays – just for the fun of hearing all the little clicks as the relays change state :-)! I highly recommend this book for anyone who wants to know a bit more about how computers work.)

2 #12; The Information: A History, A Theory, A Flood; James Gleick; 544pp; Nonfiction; February 12; (This is an excellent history of the methods of dissemination of information (think printing press, visual telegraph, morse code), and on the creation and development of the various facets of information theory, including quantum computation and genetics. Dense, but very interesting, and highly recommended.)

3 #13; Gardens of the Moon: A Tale of the Malazan Book of the Fallen; Steven Erikson; 752pp; Fantasy Fiction; March 2; (This is the start of a 10-volume fantasy epic. I picked up the first few to try out and have been enjoying the series since I started the first one on the flights home from diving earlier this month. The story is very involved, with magic, immortals, empires, and lots of intrigue, and the book throws you right in from page one. All the books are 700-1000+ pages, so a real treat to read. Highly recommended!)

4 #20; SPQR: A History of Ancient Rome; Mary Beard; 606pp; History; May 14; (This is an excellent book for history buffs and anyone interested in ancient Rome. She adroitly covers how Rome developed over its first 1,000 years, clashes between prominent Romans, the rise of the Emperors, the development of political democracy and on into a dictatorial empire, how they treated slaves and women, and much more. I found it hugely readable and not dry in the least, and it’s inspired me to read a bunch of books on Rome and the Romans that I’ve had for a while. Highly recommended!)

5 #22; World Order; Henry Kissinger; 432pp; Nonfiction; June 3; (Kissinger’s excellent disquisition is part history, part survey, and part explanation and I found it immensely interesting. He examines in great detail the Westphalian order (named after the 1648 Treaty of Westphalia that ended the devastating Thirty Years War), where there’s a balance of power between states, and the problems that arise when the balance is disrupted (e.g. both World Wars), along with sections on Europe, Islamism and the Middle East, and Asia and China. Almost 1/3 of the book is dedicated to the history of major U.S. foreign policy since WWI, and all the wars we’ve fought since then to try to maintain order and/or defend a people being wrongly subjugated. He ends with an analysis of how technologies are affecting world order, and a call to action for future policy changes to maintain it. This is the first book of Kissinger’s that I’ve read, and his powerful intellect and clarity of thought are obvious throughout; I have many of his other works and I’m looking forward to reading them too. Hugely recommended!)

6 #23; The End of Alchemy: Money, Banking, and the Future of the Global Economy; Mervyn King; 447pp; Nonfiction; June 7; (King was the Governor of the Bank of England from 2003 to 2013, so had a ringside seat for the financial crisis in 2008-9. Rather than being self-aggrandizing or trying to deflect blame, he dispassionately analyzes what he believes led to the crisis: an unwillingness to look beyond liquidity problems to solvency problems, coupled with the alchemy of creating supposed liquid assets (e.g. derivatives based on slices of mortgage loans) from illiquid assets (e.g. sub-prime mortgages). He also explains why the world economy is still in disequilibrium rather than rebounding and presents some interesting ideas for how to change; basically a major reform of the banking sector, including how central banks respond to monetary crises. Unfortunately, it’ll require a multi-country effort to fix the financial problems the world faces – which at present seems unlikely to happen. Along the way King explains a lot about macro- and micro-economic theory and the history of finance over the last 100 years, which in itself makes for a fascinating read. Highly recommended!)

7 #28; Desert Air; George Steinmetz; 380pp; Photography; July 11; (Steinmetz is an aerial photographer who specializes in deserts, and this book is a visually stunning collection of photography of major deserts and features in them from across the world. Other books of his I’ve read are African Air and Empty Quarter: A Photographic Journey to the Heart of the Arabian Desert, and I highly recommend all of them.)

8 #31; The Silk Roads: A New History of the World; Peter Frankopan; 645pp; History; August 11; (This is a very interesting book tracing the varied history of the countries along the Silk Road, including empires, explorers, and religions that affected the various routes. A lot of what’s in the book I already knew, but having it all presented in one volume in a chronological sequence was excellent. The last 100 pages or so detailed the quite despicable British and American machinations around the countries in the Middle East for their own (mostly oil-related) gains, to the huge detriment of the native populations, which I felt quite ashamed to read about, being of both nationalities. Highly recommended for history fans!)

9 #36; The Departure: The Owner: Book One; Neal Asher; 412pp; Science Fiction; September 29; (Asher is one of my favorite sci-fi authors and most of his novels are set in his Polity universe. This is the first in a trilogy, set on Earth a hundred years or so in the future, where there’s a single brutal government, and on the Mars colony that’s just been abandoned by Earth. The protagonist wakes up inside a sealed box on a conveyor belt leading to an incinerator and has to figure out his previous life and then start working on revenge. Lots of action, cool machines and robots, futuristic technology, and all very fast paced. I can’t wait to read the next two – highly recommended!)

10 #40; Trainspotting; Irvine Welsh; 348pp; Contemporary Fiction; November 21; (This is one of my all-time favorite movies and Kimberly bought me a leather-bound, signed edition of the book for my birthday in July, so I decided to read it again. It jumps straight into the lives of a handful of degenerate heroin addicts living in Edinburgh (see the prequel book Skagboys for the back-story). It’s a fantastic book, but not for the faint-hearted at all – it’s written in colloquial Scots, littered with four-letter words, and will likely be hard going for most people reading it. However, if you can stomach it, it’s well worth reading for insight into the Edinburgh drug culture of the 1980s and 1990s.)

The Complete List

And here’s the complete list, with links to Amazon so you can explore further. One thing to bear in mind: the dates I finished reading each book don’t mean that I started, for instance, book #2 after finishing book #1. I usually have anywhere from 10-15 books on the go at any one time, so I can dip into whatever my mood is for that day. Some books I read start to finish without picking up another one, and some books take me over a year. Lots of long airplane flights and boat trips help too!

  1. Path of the Assassin; Brad Thor; 503pp; Contemporary Fiction; January 2
  2. Code: The Hidden Language of Computer Hardware and Software; Charles Petzold; 396pp; Nonfiction; January 14
  3. Sarum: The Novel of England; Edward Rutherfurd; 1344pp; Historical Fiction; January 16
  4. The Orphan Master’s Son; Adam Johnson; 480pp; Contemporary Fiction; January 20
  5. Desolation Island; Patrick O’Brian; 325pp; Historical Fiction; January 21
  6. The Abominable; Dan Simmons; 663pp; Historical Fiction; January 24
  7. Out of the Flames; Lawrence & Nancy Goldstone; 368pp; History; January 27
  8. The Fortune of War; Patrick O’Brian; 329pp; Historical Fiction; January 28
  9. A Short Guide to a Long Life; David Agus; 190pp; Nonfiction; January 29
  10. The Surgeon’s Mate; Patrick O’Brian; 382pp; Historical Fiction; January 30
  11. The Ionian Mission; Patrick O’Brian; 400pp; Historical Fiction; February 1
  12. The Information: A History, A Theory, A Flood; James Gleick; 544pp; Nonfiction; February 12
  13. Gardens of the Moon: A Tale of the Malazan Book of the Fallen; Steven Erikson; 752pp; Fantasy Fiction; March 2
  14. Soviet Ghosts: The Soviet Union Abandoned: A Communist Empire in Decay; Rebecca Litchfield; 192pp; Photography; March 26
  15. A Death in Vienna; Daniel Silva; 400pp; Contemporary Fiction; March 28
  16. Major Pettigrew’s Last Stand; Helen Simonson; 384pp; Contemporary Fiction; April 12
  17. The Swerve: How the World Became Modern; Stephen Greenblatt; 368pp; History; April 13
  18. The Geography of Genius: A Search for the World’s Most Creative Places from Ancient Athens to Silicon Valley; Eric Weiner; 368pp; Nonfiction; April 26
  19. The Empty Throne; Bernard Cornwell; 296pp; Historical Fiction; May 11
  20. SPQR: A History of Ancient Rome; Mary Beard; 606pp; History; May 14
  21. Monsoon: The Indian Ocean and the Future of American Power; Robert D. Kaplan; 400pp; Nonfiction; May 21
  22. World Order; Henry Kissinger; 432pp; Nonfiction; June 3
  23. The End of Alchemy: Money, Banking, and the Future of the Global Economy; Mervyn King; 447pp; Nonfiction; June 7
  24. The Old Ways: A Journey on Foot; Robert Macfarlane; 448pp; Nonfiction; June 24
  25. Rubicon: The Last Years of the Roman Republic; Tom Holland; 464pp; History; June 27
  26. The Vikings: A History; Robert Ferguson; 464pp; History; July 10
  27. Journey Without Maps; Graham Greene; 272pp; Travel; July 11
  28. Desert Air; George Steinmetz; 380pp; Photography; July 11
  29. Travels with Charley: In Search of America; John Steinbeck; 288pp; Travel; July 13
  30. Treason’s Harbour; Patrick O’Brian; 314pp; Historical Fiction; August 6
  31. The Silk Roads: A New History of the World; Peter Frankopan; 645pp; History; August 11
  32. The Far Side of the World; Patrick O’Brian; 355pp; Historical Fiction; August 12
  33. The Reverse of the Medal; Patrick O’Brian; 269pp; Historical Fiction; August 13
  34. The Letter of Marque; Patrick O’Brian; 287pp; Historical Fiction; August 15
  35. Deadhouse Gates: A Tale of The Malazan Book of the Fallen; Steven Erikson; 959pp; Fantasy Fiction; September 27
  36. The Departure: The Owner: Book One; Neal Asher; 412pp; Science Fiction; September 29
  37. Zero Point: The Owner: Book Two; Neal Asher; 406pp; Science Fiction; October 9
  38. A Burglar’s Guide to the City; Geoff Manaugh; 304pp; Nonfiction; October 13
  39. Jupiter War: The Owner: Book Three; Neal Asher; 356pp; Science Fiction; November 3
  40. Trainspotting; Irvine Welsh; 348pp; Contemporary Fiction; November 21
  41. The Atrocity Archives; Charles Stross; 368pp; Science Fiction; December 5
  42. The Jennifer Morgue; Charles Stross; 416pp; Science Fiction; December 9
  43. The Thirteen-Gun Salute; Patrick O’Brian; 324pp; Historical Fiction; December 17
  44. The Nutmeg of Consolation; Patrick O’Brian; 384pp; Historical Fiction; December 20
  45. The Truelove; Patrick O’Brian; 267pp; Historical Fiction; December 2

New course: Installing and Configuring SQL Server 2016

Glenn’s latest Pluralsight course has been published – SQL Server: Installing and Configuring SQL Server 2016 – and is just over two hours long.

The modules are:

  • Introduction
  • Pre-installation Tasks for the Operating System
  • Pre-installation Tasks for SQL Server 2016
  • Installing SQL Server 2016
  • Post-installation Tasks for SQL Server 2016
  • Automating Common Maintenance Tasks

Check it out here.

We now have more than 146 hours of SQLskills online training available (see all our 49 courses here), all for as little as $29/month through Pluralsight (including more than 5,000 other developer and IT training courses). That’s unbeatable value that you can’t afford to ignore.