2014: the year in books

Back in 2009 I started posting a summary at the end of the year of what I read during the year (see my posts from 2009, 2010, 2011, 2012, 2013) and people have been enjoying it, so I present the 2014 end-of-year post. I set a moderate goal of 40 books this year and I managed 53. I wanted to get through some of my larger non-fiction books but ended up not doing as many as I thought (reading more, shorter books), and it’s just coincidence that my total works out to about a book per week. Next year I’m setting myself a goal of reading 60 books.

For the record, I read ‘real’ books – i.e. not in electronic form – I don’t like reading off a screen. Yes, I’ve seen electronic readers – we both have iPads – and I’m not interested in ever reading electronically. I also don’t ‘speed read’ – I read quickly and make lots of time for reading.

Why do I track metrics? Because I like doing it, and being able to compare against previous years. Some people don’t understand the logic in that – each to their own :-)

There were several strong candidates for my favorite book this year (including Command and Control by Eric Schlosser and Reamde by Neal Stephenson) but I’m choosing Natchez Burning by Greg Iles. It’s a fantastic and shocking drama, and I knew as I read it that it was the best book I’d read this year. You can read my review of it in the top-10 list below.

Now the details. I enjoy putting this together as it will also serve as a record for me many years from now. I hope you get inspired to try some of these books – push yourself with new authors and very often you’ll be surprisingly pleased.

Once again I leave you with a quote that describes a big part of my psychological make-up:

In omnibus requiem quaesivi, et nusquam inveni nisi in angulo cum libro! (I have sought rest everywhere, and have found it nowhere, save in a corner with a book.)

Analysis of What I Read

I read 28,407 pages, or 77.8 pages a day, and a book every 6.9 days or so. The chart below shows the number of pages (y-axis) in each book I read (x-axis).

The average book length was 536 pages, almost 100 pages longer than last year. Quite interestingly, I dropped travel and history this year in favor of non-fiction books, which was my goal. Fiction and historical fiction were the same as last year, and sci-fi was almost double.

The Top-10

I read a lot of really good books this year, but I didn’t struggle to get to a top-10 as I had in previous years. If you don’t read much, at least consider looking at some of these in 2015. It’s impossible to put them into a priority order so I’ve listed them in the order I read them, along with the short Facebook review I wrote at the time.

book1 #7 Command and Control; Eric Schlosser; 640pp; Non-Fiction; February 16; (Absolutely excellent history of US nuclear weapon control (or lack of it), accidents involving US nuclear weapons, and US nuclear strategy from the 40s to the 80s. Really shocking how close the US came to an accidental nuclear detonation several times. It also covers the Damascus accident in great detail, along with the almost laughable ineptitude of the Air Force and SAC in dealing with it. There are very comprehensive notes and sources, which actually led me to buy another 15 or so books over the last two weeks on related topics. Highly recommended – excellent book.)

book2 #9 Line War; Neal Asher; 566pp; Science Fiction; February 27; (This is an excellent finale to the Agent Cormac series, nicely wrapping up all the plot lines and leaving things set for continuations with some of the characters in other books. Hard to say anything else about the book without giving away major spoilers, but plenty of good battles and AI intrigue. Looking forward to reading The Gabble and Technician by him next. Strongly recommended (reading the Agent Cormac series in order).)

book3 #11 The Martian; Andy Weir; 369pp; Science Fiction; March 7; (Really excellent modern-day Robinson Crusoe story about an astronaut accidentally stranded on Mars after a hurried mission abort during a major dust storm. Very believable and no credibility stretching – a brilliant debut from Weir. I cruised through this in about 8 hours. Highly recommended and I’m really looking forward to Weir’s second book.)

book4 #14 The Dead Hand: The Untold Story of the Cold War Arms Race and Its Dangerous Legacy; David Hoffman; 608pp; Non-Fiction; March 23; (Really excellent book I picked up while reading Command and Control earlier this year. This book details the hidden Soviet bioweapons programs that existed all the way up to the 90s (anthrax, plague, tularemia, and even smallpox, after it had been eradicated by WHO, for goodness sake!) during the Cold War. The Soviets lied to the world, and Gorbachev and Yeltsin continued to lie after the fall of the USSR in ’91. It also explains the ‘dead hand’ semi-automatic mechanism to allow retaliation by the Soviets after a decapitative nuclear strike against them. Most alarming is the telling of how the Soviet Union fell apart and left thousands of tons of bioweapons, enriched uranium and plutonium, and nuclear weapons spread across the various republics and Russia without adequate storage, security, and safeguards to stop profiteering and proliferation to rogue states like Iran. Interesting to read about how some parts of the US govt stepped in to buy uranium from Kazakhstan to stop it falling into the wrong hands, and built secure storage for Russia. Well worth reading. Not an alarmist, sensational book at all, but an insightful and level-headed description of what went on. Contains a lot about the transfer of power to Gorbachev, his meetings with Reagan, and his fall from power that I hadn’t read about before either. Strongly recommended.)

book5 #17 The Sunne in Splendour: A Novel of Richard III; Sharon Kay Penman; 944pp; Historical Fiction; April 26; (Sharon Kay Penman’s novels are excellent historical fiction, meticulously based on actual history. In this book she follows the life of Richard III, steeped in betrayal and treason against both himself and his brother, the earlier Edward IV (with Edward V only reigning for a few months until pronounced illegitimate). At more than 900 pages of small font, the book is huge and a long, engrossing read. Penman puts the death of the infamous ‘princes in the Tower’ at the hands of the Duke of Buckingham, which makes a lot of sense given that neither Richard nor Henry Tudor (later Henry VII) had both motivation *and* opportunity to do the deed. Altogether a splendid book, and I strongly recommend it for all fans of history.)

book6 #19 The Quarry; Iain Banks; 326pp; Fiction; May 10; (This is the last book that my favorite author of all time, Iain Banks, wrote before he died last year. It’s quite excellent. The dying character is irascible, bitter, and filled with justifiable invective against elements of the world and I have to imagine this is semi-autobiographical. It’s a great story and a fitting final novel. I think I’ll have to go back and re-read Banks’ non-sci-fi books (already read the sci-fi ones several times each). Thanks Iain – RIP.)

book7 #21 Duty: Memoirs of a Secretary at War; Robert M. Gates; 617pp; Non-Fiction; May 25; (This memoir describes Gates’ time as the Secretary of Defense under Presidents Bush (43) and Obama. Gates comes across as someone who really cares about the service men and women under his command (as second in the chain under the president). This is reiterated many times through the book when he describes budget battles to get more equipment, better equipment, the right equipment, unwanted/unneeded equipment cancelled, and more against parochial members of Congress and inexperienced White House staffers. He seems to have been a very, very different SecDef than his predecessor Rumsfeld, whose autobiography I now look back on with a much more negative view. Different how? Nicer is probably the easiest way to say it, just a nicer, more caring person. He looks back several times and wishes he’d handled a situation better and done things differently for the benefit of the troops. He takes personal responsibility for everything that happened on his watch, rather than blaming others (a failing of Rumsfeld). The book also gives a lot of insight into the workings of the Department of Defense, the largest organization on the planet, with 3 million ‘staff’, and a budget of more than $700 billion in 2011. Altogether a very interesting book and well worth reading. I’m looking forward to reading the memoirs of the presidents and other cabinet members from the last 10-15 years too. Highly recommended.)

book8 #34 Reamde: A Novel; Neal Stephenson; 1042pp; Fiction; August 18; (Absolutely excellent, and a monster 1000+ pages. Terrorists, MMORPG, Chinese and Hungarian hackers, MI6, Idahoan survivalists, private jets, Russian mercenaries, lots of guns. Mix well with superlative storytelling and you get Stephenson’s usual top-notch page turner. Best book so far this year. Very, very strongly recommended.)

book9 #40 D-Day: The Battle For Normandy; Antony Beevor; 608pp; History; September 10; (Absolutely excellent narrative of the D-Day landings and the following several months of fighting up to the liberation of Paris. What seems obvious is that there was incompetence on both sides, especially around Field Marshal Montgomery on the Allied side and Hitler’s incessant micro-management on the German side, that led to greater losses of troops and French civilians, and widespread destruction in Normandy. I’m not a big war-history buff (yet!) but I found this a thoroughly engrossing book rather than just a dry battle chronology. I really liked the way Beevor could conjure up little vignettes of action to highlight interesting points in the wider battlefield. Highly recommended!)

book10 #50 Natchez Burning; Greg Iles; 800pp; Fiction; December 21; (This is easily the best book I’ve read this year, and is fantastic. It’s long, really involved, and deals with a murder case on the Mississippi/Louisiana border. Apparently it’s the fourth in a series about the main character – so I just bought the first three to add to my pile of unread books. The book is centered on the extreme racism in that area in the late 1960s and its continuation in some cases today, and is quite shocking in parts. The writing is superb, with rich characters and excellent dialog. I’m very glad I took a chance on this after seeing it reviewed in the Wall Street Journal. Strongly recommended, but definitely not for the faint-hearted.)

The Complete List

And the complete list, with links to Amazon so you can explore further. One thing to bear in mind: the dates are when I finished each book – they don’t mean I started, say, book #2 after finishing book #1. I usually have anywhere from 10-15 books on the go at any one time so I can dip into whatever suits my mood that day. Some books I read start to finish without picking up another one, and some books take me over a year. Lots of long airplane flights help too!

  1. Travels in a Thin Country: A Journey Through Chile; Sara Wheeler; 336pp; Travel; January 3
  2. The Hundred Secret Senses; Amy Tan; 358pp; Fiction; January 7
  3. Zima Blue; Alastair Reynolds; 455pp; Science Fiction; January 11
  4. The Paleo Manifesto; John Durant; 300pp; Non-Fiction; January 15
  5. The Astronomer; Lawrence Goldstone; 304pp; Historical Fiction; January 19
  6. Blink: The Power of Thinking Without Thinking; Malcolm Gladwell; 266pp; Non-Fiction; January 27
  7. Command and Control; Eric Schlosser; 640pp; Non-Fiction; February 16
  8. Service: A Navy SEAL at War; Marcus Luttrell; 360pp; Non-Fiction; February 21
  9. Line War; Neal Asher; 566pp; Science Fiction; February 27
  10. Known and Unknown: A Memoir; Donald Rumsfeld; 832pp; Autobiography; March 3
  11. The Martian; Andy Weir; 369pp; Science Fiction; March 7
  12. Leviathan Wakes; James S. A. Corey; 592pp; Science Fiction; March 17
  13. The Sixth Extinction: An Unnatural History; Elizabeth Kolbert; 319pp; Non-Fiction; March 19
  14. The Dead Hand: The Untold Story of the Cold War Arms Race and Its Dangerous Legacy; David Hoffman; 608pp; Non-Fiction; March 23
  15. Caliban’s War; James S. A. Corey; 624pp; Science Fiction; April 2
  16. 2312; Kim Stanley Robinson; 576pp; Science Fiction; April 11
  17. The Sunne in Splendour: A Novel of Richard III; Sharon Kay Penman; 944pp; Historical Fiction; April 26
  18. A Game of Thrones: A Song of Ice and Fire: Book One; George R. R. Martin; 704pp; Fiction; May 8
  19. The Quarry; Iain Banks; 326pp; Fiction; May 10
  20. The Casual Vacancy; J.K. Rowling; 512pp; Fiction; May 15
  21. Duty: Memoirs of a Secretary at War; Robert M. Gates; 617pp; Non-Fiction; May 25
  22. Asia’s Cauldron: The South China Sea and the End of a Stable Pacific; Robert D. Kaplan; 225pp; Non-Fiction; June 4
  23. 1356: A Novel; Bernard Cornwell; 382pp; Historical Fiction; June 18
  24. Lady of the English; Elizabeth Chadwick; 544pp; Historical Fiction; June 29
  25. The Son; Philipp Meyer; 562pp; Historical Fiction; July 4
  26. The Operators: The Wild and Terrifying Inside Story of America’s War in Afghanistan; Michael Hastings; 432pp; Non-Fiction; July 18
  27. Legacy of Ashes: The History of the CIA; Tim Weiner; 848pp; History; July 21
  28. The Greatest Knight: The Unsung Story of the Queen’s Champion; Elizabeth Chadwick; 560pp; Historical Fiction; July 24
  29. A Clash of Kings: A Song of Ice and Fire: Book Two; George R.R. Martin; 784pp; Fiction; July 28
  30. The Divide: American Injustice in the Age of the Wealth Gap; Matt Taibbi; 448pp; Non-Fiction; August 3
  31. Ghost in the Wires: My Adventures as the World’s Most Wanted Hacker; Kevin Mitnick; 448pp; Non-Fiction; August 7
  32. Dear Leader: Poet, Spy, Escapee – A Look Inside North Korea; Jang Jin-sung; 368pp; Non-Fiction; August 8
  33. The Chemistry of Tears; Peter Carey; 229pp; Fiction; August 9
  34. Reamde: A Novel; Neal Stephenson; 1042pp; Fiction; August 18
  35. Outlaw Platoon: Heroes, Renegades, Infidels, and the Brotherhood of War in Afghanistan; Sean Parnell; 384pp; Non-Fiction; August 20
  36. The Scarlet Lion; Elizabeth Chadwick; 576pp; Historical Fiction; August 24
  37. Niccolò Rising: The First Book of The House of Niccolò; Dorothy Dunnett; 496pp; Historical Fiction; September 1
  38. The Spring of the Ram: The Second Book of The House of Niccolò; Dorothy Dunnett; 496pp; Historical Fiction; September 4
  39. Americana: Despatches from the New Frontier; Hampton Sides; 450pp; Non-Fiction; September 7
  40. D-Day: The Battle For Normandy; Antony Beevor; 608pp; History; September 10
  41. A Storm of Swords: A Song of Ice and Fire: Book Three; George R.R. Martin; 992pp; Fiction; September 14
  42. Abaddon’s Gate; James S. A. Corey; 539pp; Science Fiction; September 22
  43. A King’s Ransom; Sharon Kay Penman; 685pp; Historical Fiction; October 23
  44. The Forest Laird: A Tale of William Wallace; Jack Whyte; 512pp; Historical Fiction; November 22
  45. Wired for War: The Robotics Revolution and Conflict in the 21st Century; P.W. Singer; 512pp; Non-Fiction; November 28
  46. The Universe: 365 Days; Nemiroff & Bonnell; 750pp; Photography; November 30
  47. The Absolute Sandman: Volume 1; Neil Gaiman; 612pp; Comics; November 30
  48. Stealing Light; Gary Gibson; 603pp; Science Fiction; December 2
  49. Nova War; Gary Gibson; 568pp; Science Fiction; December 15
  50. Natchez Burning; Greg Iles; 800pp; Fiction; December 21
  51. Makers: All Kinds of People Making Amazing Things In Garages, Basements, and Backyards; Bob Parks; 184pp; Non-Fiction; December 27
  52. Empire of Light; Gary Gibson; 544pp; Science Fiction; December 29
  53. The Absolute Sandman: Volume 2; Neil Gaiman; 616pp; Comics; December 30

Christmas Eve RunAs Radio show 400

Last week we joined our wonderful friend (he MC’d our wedding in 2007) Richard Campbell for RunAs Radio show #400, where we jet-laggedly discuss all kinds of stuff, even occasionally getting technical!

You can download it here.

Enjoy and Happy Holidays!

PS We also did the 200th show back in 2011 – see here.

SQLskills hires all-round SQL expert Tim Radney

It’s hiring time again, as our consulting volume has reached the point where we need more help on our close-knit, expert team!

Specifically, we’ve asked well-known SQL expert and MVP Tim Radney to join us and we’re extremely pleased that he accepted our offer. He’ll bring our team up to seven people when he starts with us on Monday, January 19th, adding a wealth of experience and knowledge to the team. You can reach him at tim@SQLskills.com.


I’ve known Tim for years (a prerequisite for working here) and he’s a huge contributor to the SQL Community: on Twitter (@TRadney), on his blog (here), as a frequent speaker at user groups, SQL Saturdays, and PASS Summits, as the Chapter Leader for the Columbus, GA SQL Server User Group, and as PASS Regional Mentor for the SE USA. Wow! Tim was also named a PASS Outstanding Volunteer in 2012 in recognition of all that he does.

Tim’s been working with SQL Server for 15 years, in a variety of roles including DBA, Lead DBA, and multi-department manager, which have given him expertise in areas like HA/DR, virtualization, SSIS, SSRS, and performance tuning, among everything else SQL Server-related. As a manager, Tim’s gained extensive experience planning and implementing large-scale environment changes and upgrades, which will be invaluable for working with some of our larger Fortune 20-100 clients.

Outside of SQL Server, Tim is married with three kids and also shares my passion for electronics and messing around with a soldering iron. He also farms chickens and tilapia in his spare time!

Tim is an excellent addition to our world-class consulting team and we’re very excited that he’s coming on board – welcome!

Thanks as always,

Paul and Kimberly

PS You can read Tim’s post here.

SQLskills holiday gift to you: all 2013 Insider videos

As we all wind down for the 2014 holiday season, we want to give the SQL Server community a holiday gift to say ‘thank you’ for all your support during 2014, and what better gift than more free content?!

As many of you know, I publish a bi-weekly newsletter to more than 11,500 subscribers that contains an editorial on a SQL Server topic, a demo video, and a book review of my most recently completed book. We’re making all the 2013 demo videos available so everyone can watch them – 24 videos in all, WMV format only. I did the same thing the last few years for the 2012 videos and 2011 videos.

Here are the details:

  1. January 2013: Recreating missing log files on attach (from Pluralsight) (video | demo code)
  2. January 2013: Using the sys.dm_db_stats_properties DMV (video | demo code)
  3. February 2013: Using Microsoft Data Link (video)
  4. February 2013: Linked servers and statistics (video | demo code)
  5. March 2013: Using the system_health Extended Event session (video | demo code)
  6. March 2013: Moving from SQL Trace to Extended Events (video | demo code)
  7. April 2013: Color coding and other SSMS features (video)
  8. April 2013: DISTINCT aggregate improvements (video | demo code)
  9. May 2013: Using the tsql_stack Extended Event action (from Pluralsight) (video | demo code)
  10. May 2013: Deferred drop behavior (from Pluralsight) (video | demo code)
  11. June 2013: Finding duplicate statistics (video | demo code)
  12. June 2013: Using data compression (video | demo code)
  13. July 2013: Undetectable performance problems (video | demo code)
  14. July 2013: SSMS Object Explorer features (video)
  15. August 2013: Parallel crash recovery (from Pluralsight) (video | demo code)
  16. August 2013: Tracking tempdb space usage (video | demo code)
  17. September 2013: Enabling instant file initialization (video | demo code)
  18. September 2013: Using query hashes (video | demo code)
  19. September 2013: Recovering from data purity corruptions (from Pluralsight) (video | demo code)
  20. October 2013: Implicit conversions (from Pluralsight) (video | demo code)
  21. October 2013: Extended Events templates (video | demo code)
  22. November 2013: Using the missing index DMVs (video | demo code)
  23. November 2013: Using older backups to retrieve data after corruption (from Pluralsight) (video | demo code)
  24. December 2013: Enabling database mail (video)

If you want to see the 2014 videos before next December, get all the newsletter back-issues, and follow along as the newsletters come out, just sign up at http://www.SQLskills.com/Insider. No strings attached, no marketing or advertising, just free content.

Happy Holidays and enjoy the videos!

Problems from having lots of server memory

A month ago I kicked off a survey asking how much memory is installed on your largest server that’s running SQL Server. Thank you to everyone that responded.

Here are the results:

The “other” values are:

  • 3 more for the ‘128 GB or more, but less than 256 GB’ count
  • 1 more for the ‘Less than 16 GB’ count
  • One poor soul who only has 512 MB in their server!

This is very interesting:

  • I expected the majority of servers to fall into the middle of the range (around 128GB), and it’s actually only 37% that fit into the 64 GB to 256 GB range.
  • I’m surprised at the percentage (41%) of servers with 256 GB or more.
  • I didn’t know what percentage would have more than 1 TB, so almost 10% is really cool to see.

So what do these results mean? More than half of all respondents have servers with lots of memory (more than 128 GB). The more memory you have, the more important it is to make sure the memory is being used efficiently: that you’re not wasting space in the buffer pool (see here), and that you’re not churning the buffer pool with poor query plans causing lots of reads (see here).

What other things could be problems with large amounts of memory?

  • Shutting down the instance. This will checkpoint all the databases, which could take quite a long time (minutes to hours) if suddenly all databases have lots of dirty pages that all need to be flushed out to disk. This can eat into your maintenance window, if you’re shutting down to install an SP or a CU.
  • Starting up the instance. If the server’s POST checks memory, the more memory you have, the longer that will take. This can eat into your allowable downtime if a crash occurs.
  • Allocating the buffer pool. We’ve worked with clients with terabyte+ buffer pools where they hit a bug on 2008 R2 (also in 2008 and 2012) around NUMA memory allocations that would cause SQL Server to take many minutes to start up. That bug has been fixed in all affected versions and you can read about it in KB 2819662.
  • Warming up the buffer pool. Assuming you don’t hit the memory allocation problem above, how do you warm up such a large buffer pool so that you’re not waiting a long time for your ‘working set’ of data file pages to be memory resident? One solution is to analyze your buffer pool when it’s warm, to figure out which tables and indexes are in memory, and then write some scripts that will read much of that data into memory quickly as part of starting up the instance. For one of the same customers that hit the allocation bug above, doing this produced a big boost in getting to the steady-state workload performance compared to waiting for the buffer pool to warm up naturally.
  • Complacency. With a large amount of memory available, there might be a tendency to slacken off on proactive tuning – looking for unused and missing index opportunities, plan cache bloat, or wasted buffer pool space (mentioned above) – thinking that having all that memory will be more forgiving. Don’t fall into this trap. If one of these things becomes such a problem that it’s noticeable on your server with lots of memory, it’s a *big* problem that may be harder to get under control quickly.
  • Disaster recovery. If you’ve got lots of memory, it probably means your databases are getting larger. You need to start considering the need for multiple filegroups to allow small, targeted restores for fast disaster recovery. This may also mean you need to think about breaking up large tables, using partitioning for instance, or archiving old, unused data so that tables don’t become unwieldy.
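To make the buffer pool warm-up idea above concrete, here’s a hypothetical sketch of the kind of analysis query involved – it uses the sys.dm_os_buffer_descriptors DMV to see which tables and indexes currently occupy the buffer pool in the current database (simplified to in-row and row-overflow allocation units only):

```sql
-- Hypothetical sketch: summarize which objects in the current database
-- are occupying the buffer pool, as input to a warm-up script.
SELECT
    OBJECT_NAME (p.[object_id]) AS [ObjectName],
    p.index_id,
    COUNT (*) * 8 / 1024 AS [BufferPoolMB]  -- each data file page is 8KB
FROM sys.dm_os_buffer_descriptors AS bd
JOIN sys.allocation_units AS au
    ON au.allocation_unit_id = bd.allocation_unit_id
JOIN sys.partitions AS p
    ON p.hobt_id = au.container_id
WHERE bd.database_id = DB_ID ()
    AND au.[type] IN (1, 3)  -- in-row and row-overflow data; LOB maps via partition_id
GROUP BY p.[object_id], p.index_id
ORDER BY [BufferPoolMB] DESC;
```

A warm-up script would then read the top entries back into memory (index scans, for instance) as part of starting up the instance.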

Adding more memory is one of the easiest ways to alleviate some performance issues (as a band-aid, or seemingly risk-free temporary fix), but don’t think it’s a simple thing to just max out the server memory and then forget about it. As you can see, more memory leads to more potential problems, and these are just a few things that spring to mind as I’m sitting in the back of class here in Sydney.

Be careful out there!


T-SQL Tuesday: Giving back in 2015

My friend Wayne Sheffield (b|t) is hosting T-SQL Tuesday this month and it’s been a long time since I’ve taken part (wow – since January 2011!). His theme is about how you’re going to give back or continue giving back to the SQL Server community in 2015. I’m going to talk about stuff we at SQLskills do in the community, in the spirit of the T-SQL Tuesday theme, not as any kind of marketing or self-aggrandizement.

We do a lot (I think) for the SQL Server community, both because we’re just nice like that :-), and because we appreciate the people in the community as they provide our livelihood. To be honest, for me it’s mostly because I like helping people with SQL Server problems. It’s like an irresistible urge when I see someone with a problem I know how to help with.

We’re going to continue with these things in 2015:

  • Helping out on the fantastic #sqlhelp alias on Twitter
  • Blogging on all our blogs
  • Publishing our bi-weekly newsletter with a demo video and editorial (see the link on the RHS of your screen)
  • Glenn publishes his monthly DMV queries for all versions of SQL Server from 2005 onward
  • Erin helps run the Cleveland User Group

And we’re also going to be doing a lot more:

  • Remote user groups. I had this mad idea to do remote user group sessions for anyone that asks – so we’ve currently got 49(!) remote user group sessions scheduled in 2015 at user groups in the US, UK, Ireland, Belgium, Netherlands, New Zealand, South Africa, Israel, Australia, Canada, Poland, and PASS virtual chapters. I’m personally doing 20 of them. Check if your user group has signed up, and if you’re in a country not listed above definitely hit us up for a session. See here for details.
  • I’m putting together a comprehensive web encyclopedia of all wait and latch types that exist, across all versions, slated to go live in January/February. Shoot me an email if you’d like to help provide data (involves installing debug symbols and XEvent sessions on your prod server). That’s going to be cool!
  • Kimberly and I have another special project coming up… more details in January/February

As Wayne says in his blog post, I encourage you to give back to the community. We’ve got one of the best technical communities in the world – I know we’re the envy of many others.

Thank you to everyone who blogs, tweets, runs/speaks at/attends user groups/SQL Saturdays/PASS – long may we all continue!


October 2015 Dublin IE2/IEPTO-2 class open for registration

Due to popular demand we’ve managed to juggle a bit more of our schedule around and found space to fit in another European class in 2015, and it’s open for registration!

We’ll be teaching our signature IEPTO-2 (formerly IE2) Immersion Event on Performance Tuning and Optimization, in partnership with our great friends Bob and Carmel Duffy of Prodata. This is the first time we’ve been back in Ireland since 2010.

The class will be October 12-16, and there are multiple discounts available depending on when you register:

  • Special Price (Before January 31st 2015) €2,195
  • Early Bird (Before June 30th 2015) €2,395
  • Full Price (After June 30th 2015) €2,795

You can get all the details on the class page here.

We hope to see you there!

Paul’s awesome ribs recipe

On Friday I made my signature pork ribs for our good friend Bob Ward (along with Kimberly’s truffled cauliflower mash and caramelized broccoli) and I promised to blog the ribs recipe. I’ve been making ribs like this for about ten years – they’re dry and you have to tear the meat off the ribs with your teeth, rather than the usual American ribs where the meat falls off on its own. I don’t remember which BBQ recipe book I originally got the recipe from but it’s evolved over the years into what’s below.

Let me know if you try it – enjoy!

Ingredients:
  • 1 large rack of baby-back pork ribs
  • 1 full bulb of garlic
  • 1/2 cup soy sauce
  • 1/4 cup cream sherry (e.g. Sheffield)
  • 2 tablespoons honey

Prepare the ribs: wash the rack and cut it into individual ribs. Arrange in a deep rectangular dish (e.g. Pyrex or Le Creuset) on their side – laid out like ( ( ( ( (. Optional: discard weird-shaped or extra pieces of meat.

Prepare the sauce: pour the soy sauce into a Pyrex measuring jug. Pour the sherry in too. Pour the honey into the mixture. It’ll settle at the bottom of the jug. Peel the garlic (vary the amount based on your garlic-loving preference) and use a garlic press (I use this one) to get the ‘meat’ of each garlic clove into the sauce jug. Discard the skin of each clove.

Marinate the ribs: Mix up the sauce with a small fork to get the honey mixed in. Pour the sauce evenly over the ribs. Cover the rib dish with cling wrap and let it sit in the fridge for 4-24 hours.

Prepare to cook: set the oven to broil and set the top oven shelf to be one level down from the very top. Line a deep rectangular baking/broiling pan with aluminum foil. Arrange the ribs on the side on the foil in two rows like ( ( ( ( (.

Broil: broil the ribs as prepared for 5 minutes. Remove from the oven and turn the ribs over in place. Drizzle the remains of the sauce and garlic over each rib using a small spoon. Broil for 4 more minutes. Remove from the oven.

Bake: after broiling the ribs the second time, set the same oven to bake at 325F. Turn the ribs a second time and drizzle any remaining sauce on them. Bake on the bottom shelf for 20 minutes. Increase heat to 350F and bake for 10 more minutes. Increase heat to 375F and bake for 10 more minutes. You want the outside of the ribs to be blackening and caramelizing.

Remove and place in a serving dish. Eat immediately with your hands.

For garlic lovers, get a teaspoon and scoop the concentrated garlic mush and sauce from the bottom of the baking pan – yummmmm!

Extra ribs can be eaten cold over the next few days or reheated by baking for 5-8 minutes at 375.

Delayed Durability in SQL Server 2014

One of the cool new features in SQL Server 2014 is delayed durability (available in all Editions), which is described in detail in Books Online here.

I think I’m going to see a lot of people turn this on, as you can get a profound increase in transaction throughput with the right workload. However, I also think a lot of people are going to turn this on without realizing the potential for data loss and making the appropriate trade off.
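For reference, the database-level switch is a simple ALTER DATABASE option (the database name below is a placeholder):

```sql
-- DISABLED is the default; ALLOWED permits per-transaction opt-in;
-- FORCED makes every transaction in the database delayed durable.
ALTER DATABASE [YourDatabase] SET DELAYED_DURABILITY = FORCED;
```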

Why can it give a throughput boost?

I put together a contrived workload with a small table where 50 concurrent clients are updating the same rows, and the database log is on a slow I/O subsystem. Here’s a graph showing my test:


At the obvious change point, that’s where I enabled delayed durability, with all transactions being forced to use it. Before the change, the number of Transactions/sec is equal to the number of Log Flushes/sec, as each transaction is holding locks that block all other transactions (I told you it’s a contrived workload). So why the profound jump in Transactions/sec when I forced delayed durability?

Under normal circumstances, when a transaction commits, the commit doesn’t complete until the log block (see this blog post for more details) containing the LOP_COMMIT_XACT log record for the transaction has been flushed to disk and the write is acknowledged back to SQL Server as having completed, providing the durability of the transaction (the D in the ACID properties of the transaction). The transaction’s locks cannot be dropped until the log flush completes.

In my workload, all the other transactions are waiting for the one that is committing, as they all need the same locks, so Transactions/sec is tied to Log Flushes/sec in this case.

With delayed durability, the transaction commit proceeds without the log block flush occurring – hence the act of making the transaction durable is delayed. Under delayed durability, log blocks are only flushed to disk when they reach their maximum size of 60KB. This means that transactions commit a lot faster, hold their locks for less time, and so Transactions/sec increases greatly (for this workload). You can also see that the Log Flushes/sec decreased greatly as well, as previously it was flushing lots of tiny log blocks and then changed to only flush maximum-sized log blocks.
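If you want to watch the same two counters I graphed, they’re also exposed inside SQL Server through sys.dm_os_performance_counters. A sketch, assuming a database named Company (note these are cumulative counters, so sample them twice and take the difference to get a per-second rate):

```sql
-- Raw values for the two counters from my test; the database
-- name 'Company' is just an example
SELECT counter_name, cntr_value
FROM sys.dm_os_performance_counters
WHERE object_name LIKE '%:Databases%'
  AND counter_name IN ('Transactions/sec', 'Log Flushes/sec')
  AND instance_name = 'Company';
```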

A couple of points to note:
  • I was forcing all transactions to be delayed durable, but the facility exists to make the delayed durability choice per transaction too (see Books Online for more details).
  • There’s a bit more to the log block flushing too: under delayed durability, a log block will flush when it fills up, or if a non-delayed durable transaction commits, or if the new sp_flush_log proc is executed.
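The per-transaction form can be sketched like this (the table and update are hypothetical, and the database must first be set to DELAYED_DURABILITY = ALLOWED):

```sql
-- Only this transaction's commit skips waiting for the log flush;
-- other transactions in the database remain fully durable
BEGIN TRANSACTION;

UPDATE dbo.SomeTable
SET SomeColumn = SomeColumn + 1
WHERE SomeKey = 1;

COMMIT TRANSACTION WITH (DELAYED_DURABILITY = ON);
```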

My good friend Aaron Bertrand over at SQL Sentry has a long post about delayed durability that looks into its performance implications in a little bit more depth so I recommend you check out his post as well.

So this looks great, for the right type of workload. But I bet you’re thinking:

What’s the catch?

Your transactions aren’t durable when they commit. Simple.

Now you may be thinking that if the system crashes, the most you’re going to lose is up to 60KB of transaction log. Wrong. If that last log block contains the LOP_COMMIT_XACT log record for a long-running transaction, when the system crashes, and that log block isn’t on disk, that whole transaction will roll back during crash recovery. So the potential for work/data loss is greater than just 60KB.

And there’s more:

  • Log backups will not back up that unflushed log block, as it’s not on disk, so non-durable committed transactions may not be contained within a log backup.
  • Non-durable transactions that have committed are not protected by synchronous database mirroring or a synchronous availability group either, as these rely on log block flushes (and transmission to the mirror/replica).

For critical transactions, you can call sp_flush_log, or use per-transaction delayed durability instead.
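As a sketch of the first option, calling sp_flush_log after a batch of delayed durable work hardens everything committed so far:

```sql
-- Flush all as-yet-unflushed log blocks for the current database to disk,
-- making all previously committed delayed durable transactions durable
EXEC sys.sp_flush_log;
```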

So the million-dollar question is:

Should I enable delayed durability?

It depends. Is your business comfortable making the throughput vs. durability trade off? Does enabling it give a throughput boost? If yes to both, go ahead. If no to either, don’t enable it. That’s a very simplistic way of thinking about it, but that’s what it boils down to really.

There are lots of other things you can do to increase the throughput and performance of the transaction log, and I explained them in a previous blog post series.

As I stated above though, I think a lot of people are going to be wowed by the throughput boost (if their workload benefits) from delayed durability and see this as a no-brainer, without considering the potential for data loss.

Tempting, isn’t it?

Bug: Error: 3449 and server restart during DBCC CHECKDB

This is a quick post to let you know about a bug that a few people are hitting when running DBCC CHECKDB. The symptoms are a series of errors in the error log plus SQL Server forcibly shuts itself down and restarts. I’ve heard of people hitting the bug on SQL Server 2014 and SQL Server 2012 SP1.

[Update 2/24/15] Microsoft has confirmed that it’s a bug in 2014 and 2012 and they’re planning a fix for the next CU of both.

[Update 3/18/15] The fix for SQL Server 2012 is in CU5 of SP2 and is described in KB 3044958.

The error log symptoms look something like I show below:

2014-11-18 09:04:15.69 spid64      The operating system returned error 665(The requested operation could not be completed due to a file system limitation) to SQL Server during a write at offset 0x00001EE6FD8000 in file 'C:\SQLskills\Company_file2.ndf:MSSQL_DBCC23'. Additional messages in the SQL Server error log and system event log may provide more detail. This is a severe system-level error condition that threatens database integrity and must be corrected immediately. Complete a full database consistency check (DBCC CHECKDB). This error can be caused by many factors; for more information, see SQL Server Books Online.
2014-11-18 09:04:15.72 spid64      Error: 3314, Severity: 17, State: 3.
2014-11-18 09:04:15.72 spid64      During undoing of a logged operation in database 'Company', an error occurred at log record ID (887654:3321:14). Typically, the specific failure is logged previously as an error in the Windows Event Log service. Restore the database or file from a backup, or repair the database.
2014-11-18 09:04:34.38 spid64      Error: 831, Severity: 20, State: 1.
2014-11-18 09:04:34.38 spid64      Unable to deallocate a kept page.
2014-11-18 09:04:34.40 spid64      Error: 3449, Severity: 21, State: 1.
2014-11-18 09:04:34.40 spid64      SQL Server must shut down in order to recover a database (database ID 23). The database is either a user database that could not be shut down or a system database. Restart SQL Server. If the database fails to recover after another startup, repair or restore the database.

The 665 error is from the snapshot file that DBCC CHECKDB creates hitting an NTFS limitation on the number of file fragments in a sparse file. This causes the snapshot creation to fail. The failure causes the undo of a log record in the snapshot to fail (remember that a database snapshot undergoes crash recovery to make it transactionally consistent). This failure then leads to SQL Server thinking it has to forcibly restart to recover the snapshot database, which it should never do for a snapshot – and that’s the bug.

I’ll update this post as soon as I hear about the builds that the fix is in.

This is a rare bug to hit, but it’s a regression (based on the builds people are reporting), and you can help yourself avoid it by:

  • Creating your own database snapshot, on a volume without file-system free space fragmentation, and running DBCC CHECKDB against the snapshot
  • OR, trying to run DBCC CHECKDB when there isn’t a significant amount of change occurring in the database, so the database snapshot doesn’t have to become very large
  • OR, using the backup-copy-restore-check method of running DBCC CHECKDB on a restored backup of the database on another server
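The first workaround can be sketched like this, assuming the source database has a single data file with logical name Company_data (both names and the path are examples only; every data file of the source database needs its own entry in the ON clause):

```sql
-- Create your own snapshot on a volume with little free-space
-- fragmentation, so the sparse file doesn't hit the NTFS fragment limit
CREATE DATABASE Company_CheckSnap ON
    (NAME = Company_data, FILENAME = 'E:\Snapshots\Company_CheckSnap.ss')
AS SNAPSHOT OF Company;

-- Run the consistency checks against the snapshot instead of letting
-- DBCC CHECKDB create its own hidden snapshot
DBCC CHECKDB (Company_CheckSnap) WITH NO_INFOMSGS, ALL_ERRORMSGS;

-- Clean up when done
DROP DATABASE Company_CheckSnap;
```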