Operating System Support for SQL Server Versions


There are currently six major versions of SQL Server that I commonly see being used in Production, along with five major versions of Windows Server. Only certain combinations of SQL Server and Windows Server are officially supported by Microsoft, but tracking down this information is a little tedious.

Table 1 shows these possible combinations and whether they are officially supported by Microsoft. One possibly surprising entry is that SQL Server 2012 is not officially supported on Windows Server 2016. Perhaps this is less surprising if you keep in mind that SQL Server 2012 will fall out of mainstream support on July 11, 2017, which is not that far away.


Table 1: OS Support for Recent Versions of SQL Server

The relevant Microsoft documentation links are listed below:

Hardware and Software Requirements for Installing SQL Server 2008 R2

Hardware and Software Requirements for Installing SQL Server 2012

Hardware and Software Requirements for Installing SQL Server 2014

Hardware and Software Requirements for Installing SQL Server (for 2016 and later)

If you are getting ready to deploy a new instance of SQL Server 2014 or SQL Server 2016, then you should prefer Windows Server 2016, even though they are also supported on older operating systems. If you are getting ready to deploy a new instance of SQL Server 2008 through SQL Server 2012, then you should prefer Windows Server 2012 R2, even though they are supported on older operating systems.
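
If you are not sure exactly which Windows version and SQL Server build an existing instance is running on, you can check from T-SQL. Here is a minimal sketch (sys.dm_os_windows_info requires SQL Server 2008 R2 SP1 or later):

-- Check the SQL Server build, patch level, and edition
SELECT SERVERPROPERTY('ProductVersion') AS [ProductVersion],
       SERVERPROPERTY('ProductLevel') AS [ProductLevel],
       SERVERPROPERTY('Edition') AS [Edition];

-- Check the underlying Windows release
-- (6.2 = Windows Server 2012, 6.3 = Windows Server 2012 R2, 10.0 = Windows Server 2016)
SELECT windows_release, windows_service_pack_level, windows_sku
FROM sys.dm_os_windows_info;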

Finally, if you are getting ready to deploy a new instance of SQL Server 2005, then I feel a little sorry for you! SQL Server 2005 is out of extended support, and it is missing so many useful features that were added in newer versions of SQL Server.

Actually, I recently helped a client deploy some new instances of SQL Server 2005 for some pretty valid business reasons. We ended up deploying to a VM (on new, very fast host hardware) that was running Windows Server 2008 R2, which worked perfectly fine.

Using Windows Resource Monitor to Find Unusual Performance Bottlenecks

I recently had an interesting issue with a new SQL Server 2014 Enterprise Edition instance that was going to be hosting about 200 user databases that each use the SQL Server FileTable feature. The reported problem was that this new instance was taking two to three times longer to restore each full database backup than the restores were taking on a different instance (that had similar hardware and storage resources).

I had several initial ideas about what the problem might be, including:

  1. Windows instant file initialization was not enabled
    1. This would force Windows to “zero-initialize” the database data file after it is created during the restore, which slows down the restore
    2. It was already enabled on this instance, so that was not the problem
  2. The transaction log files were particularly large for these user databases
    1. Windows instant file initialization does not work on log files, only on data files
    2. This issue would be present on any database instance, not just one
  3. The VLF counts were particularly high in the log files for these databases (a quick way to check this is shown after the list)
    1. This causes the recovery portion for a full database restore (or crash recovery) to take much longer
    2. None of the VLF counts were above 300, plus this would behave the same on any database instance
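
As a side note, here is a quick way to check the VLF count for a database on SQL Server 2014. This is just a sketch; DBCC LOGINFO returns one row per VLF, so the row count is the VLF count, and newer versions of SQL Server also have a documented DMF (sys.dm_db_log_info) for this purpose.

-- Check the VLF count for a user database
-- DBCC LOGINFO returns one row per VLF, so the number of rows returned is the VLF count
USE [TestUserDatabase];
GO
DBCC LOGINFO;
GO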

After my initial ideas were found to be incorrect, I tried restoring a full backup of one of the user databases to the new database server, making sure to use the STATS = 1 option so that the restore progress statistics would update every 1%. This gives you more immediate and fine-grained feedback about the progress of the restore, which is always a good idea.

While the restore was running, I carefully watched the disk activity in Windows Resource Monitor. You can sort the columns in Windows Resource Monitor to see the highest Reads and Writes (in bytes/second) while the database restore is running. This showed the reads from the database backup file, then the writes to the database data file(s) and the database log file. It also showed the writes to the FileTable files, and then something that I did not expect…

The full database restore went very fast until it got to 92%. Then I saw a huge number of parallel reads by the MsMpEng.exe process (which is the Microsoft Antimalware service), which slowed the restore progress to an absolute crawl, also pegging one of the CPU cores on the instance (so it looks like that service is single-threaded). This seemed to be a major factor in the slowness of the database restore, since the MsMpEng.exe process was scanning the 8,684 FileTable documents that were in this particular user database.

The solution to this issue was simply changing the exclusion rules for the Microsoft Antimalware service so that it does not scan the files in the FileTable directories. This one change decreased the database restore time by 56%.
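
If you need to set up similar exclusions, the first step is finding out where the FILESTREAM/FileTable data actually lives on disk for each database. Here is a simple sketch that lists the container directories for the current database:

-- List the on-disk FILESTREAM container location(s) for the current database
-- These are the directories to consider excluding from antimalware scanning
SELECT name AS [Container Name],
       physical_name AS [Container Path]
FROM sys.database_files
WHERE type_desc = N'FILESTREAM';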

Here is the restore command and the results of the test.

-- Test restore of TestUserDatabase
-- This database uses Filetables
RESTORE DATABASE [TestUserDatabase] 
FROM  DISK = N'T:\BackupRestoreTest\TestUserDatabase_FULL.bak' WITH  FILE = 1,  
MOVE N'TestUserDatabase' TO N'D:\SQLDATA\TestUserDatabase.mdf',  
MOVE N'TestUserDatabase_log' TO N'L:\Logs\TestUserDatabase_log.ldf',  
MOVE N'DOCUMENTS' TO N'D:\SQLDATA\TestUserDatabase_DOCUMENTS',  NOUNLOAD,  STATS = 1;
GO

-- Actual elapsed time 6:21 (Before directory exclusion change)
--Processed 2358856 pages for database 'TestUserDatabase', file 'TestUserDatabase' on file 1.
--Processed 4 pages for database 'TestUserDatabase', file 'TestUserDatabase_log' on file 1.
--Processed 208905 pages for database 'TestUserDatabase', file 'DOCUMENTS' on file 1.
--RESTORE DATABASE successfully processed 2567764 pages in 378.708 seconds (52.971 MB/sec).


-- Actual elapsed time 2:52 (After directory exclusion change)
--Processed 2358856 pages for database 'TestUserDatabase', file 'TestUserDatabase' on file 1.
--Processed 4 pages for database 'TestUserDatabase', file 'TestUserDatabase_log' on file 1.
--Processed 208905 pages for database 'TestUserDatabase', file 'DOCUMENTS' on file 1.
--RESTORE DATABASE successfully processed 2567764 pages in 167.607 seconds (119.688 MB/sec).


Figure 1: T-SQL for Database Restore Test


One thing you might notice is that the FileTable directory is on the same LUN as the primary data file for this database, which is not really the best location. Ideally, this directory would be on a separate LUN from any other SQL Server database files, but this may not always be possible due to cost constraints.

Here is some more information about FILESTREAM performance:

High-performance FILESTREAM tips and tricks

Best Practices on FILESTREAM implementations

A SQL Server Hardware Tidbit a Day – Day 16

For Day 16 of this series, I am going to talk about a few useful tools you can use to identify some hardware details about a database server. These tools all require that you are able to log in to that server, which might be a problem for some people, depending on the policies of their organization. If you are barred from directly accessing your server, my Day 15 post from this series gives you another option for getting some of this information from T-SQL.
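
For example, a query along these lines pulls back some basic hardware information from T-SQL (just a sketch; the physical_memory_kb column is named physical_memory_in_bytes on SQL Server 2008 R2 and earlier):

-- Basic hardware information that is visible to SQL Server
SELECT cpu_count AS [Logical CPU Count],
       hyperthread_ratio AS [Hyperthread Ratio],
       physical_memory_kb / 1024 AS [Physical Memory (MB)],
       sqlserver_start_time AS [SQL Server Start Time],
       virtual_machine_type_desc AS [Virtual Machine Type]
FROM sys.dm_os_sys_info;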

The first tool is msinfo32.exe, which is built into all recent versions of Windows. You can simply type msinfo32 in a Run window, and you will see the System Information dialog shown in Figure 1.


Figure 1: System Information Dialog

The System Information dialog shows that this is a Dell PowerEdge R720 with two Intel Xeon E5-2670 processors and 64GB of RAM, running Windows Server 2012 Datacenter Edition.

The second tool is the Computer Properties dialog shown in Figure 2. You can get there by bringing up the Charms bar with Windows + C, typing Computer, then right-clicking on Computer in the search results and choosing Properties. This shows the version and edition of Windows, the computer name, the processor model, and the amount of installed RAM.


Figure 2: Computer Properties Dialog

The third tool is Windows Task Manager, which is shown in Figure 3. You can get there by right-clicking on the Task Bar and choosing Start Task Manager. The Performance tab now has different pages for CPU, Memory, and some other items, depending on your hardware.

The CPU page tells you how many logical processors are visible to Windows (the number of sections you see in CPU Usage History) if you right-click and change the display to show logical processors. You can also see overall CPU usage and CPU usage by NUMA node by selecting the appropriate display type. What is even more useful in Windows Server 2012 is that you can see the processor model number and base clock speed, along with the total number of sockets, physical cores, and logical processors. You can also see whether hardware virtualization is enabled, and the cumulative size of your L1, L2, and L3 caches across all of your processors.


Figure 3: Windows Server 2012 Task Manager, Performance Tab, CPU Page

Figure 4 gives you some pretty detailed information about your memory, including the amount of RAM that is installed, the type and speed of the RAM, and how many memory slots are in use out of the total number available in the system. This is all very useful information to know about the server.


Figure 4: Windows Server 2012 Task Manager, Performance Tab, Memory Page
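
If you cannot get to the server's desktop, you can at least cross-check the total and available physical memory from T-SQL with a standard DMV (a minimal sketch; it will not show you the memory type, speed, or slot usage that Task Manager does):

-- Total and available physical memory as seen by the operating system
SELECT total_physical_memory_kb / 1024 AS [Total Physical Memory (MB)],
       available_physical_memory_kb / 1024 AS [Available Physical Memory (MB)],
       system_memory_state_desc AS [Memory State]
FROM sys.dm_os_sys_memory;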