
Hardware Selection for a Home Lab – Part 1

I have long believed that if you want to be more successful as a developer, database professional, or in any technically-oriented I.T. career, it is a very good idea to have some sort of home lab environment to use for experimentation and learning.

Depending on your goals, needs, budget, available space, etc., there are many different ways to actually go about doing this. Some of these include using actual rack-mount or tower servers (which might be old, used equipment, or newer equipment from a server vendor’s outlet), building desktop or workstation tower machines from parts (or buying them used), and even using old laptop machines.

Again, based on your needs, preferences, and available overall infrastructure, it may make more sense to use virtualization, with one or more fairly robust virtualization host machines that can run multiple concurrent virtual machines (VMs), or to just have a number of “bare metal” machines.

One factor that is often ignored as people think about this is the spouse acceptance factor (SAF), where what you might like to do must be tempered by what your spouse will actually accept in terms of noise, space, heat, and power usage. Having a number of rack-mount servers running 24 x 7 can have a very negative impact on your electrical bill!

You also need to think about the network and storage infrastructure required to properly support your home lab. At a bare minimum, you will probably want wired Gigabit Ethernet, but you may want to think about having some 10 Gigabit Ethernet capability, especially as 10Gb equipment prices continue to decline. You also might want a decent NAS device to support your lab.

Using a NUC

One increasingly viable method for equipping a home lab is to use one or more Small Form Factor (SFF) machines, such as the Intel NUC series. These are very small-footprint, bare-bones kits that include everything you need except RAM, storage, and an operating system. You buy the NUC kit, then buy and install your own RAM and storage, install the operating system of your choice (with some important limitations), and then you are ready to go.

These machines are typically based on low-power, dual-core laptop processors (with hyper-threading) and laptop SO-DIMM RAM, and they have had somewhat limited storage capability, at least in earlier generations. Machines of this type have been around for about four years now, with several generations released by Intel and by smaller companies such as Gigabyte. Each new generation has had various improvements that make an SFF machine a more realistic choice for many lab scenarios.

Intel has recently announced a new generation of NUCs based on 7th-generation Kaby Lake processors. The main potential advantage of this new generation is increased storage space and performance. The Core i7 and Core i5 models have 40Gbps Thunderbolt 3 support built in, with an Alpine Ridge controller that connects to a USB-C connector. You can get a primer about the various USB and Thunderbolt connectors and standards here.

Having Alpine Ridge built in gives you a lot more I/O flexibility and performance. For example, you could plug in an external enclosure that uses USB 3.1 Gen 2 (up to 10 Gbps) or Thunderbolt 3 (up to 40 Gbps) connectivity to get a lot of additional I/O throughput from your NUC.
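
To put those link speeds in perspective (including the Gigabit and 10 Gigabit Ethernet numbers mentioned earlier), here is a minimal Python sketch that converts the nominal interface rates into rough MB/s figures. The 20% protocol-overhead factor is purely an assumption for illustration, not a measured value:

```python
# Rough comparison of nominal link rates for the interfaces mentioned in this post.
# The 20% protocol-overhead figure is an assumption for illustration, not a measurement.
NOMINAL_GBPS = {
    "Gigabit Ethernet": 1,
    "10 Gigabit Ethernet": 10,
    "USB 3.1 Gen 2": 10,
    "Thunderbolt 3": 40,
}

OVERHEAD = 0.20  # assumed fraction of nominal bandwidth lost to protocol overhead

for name, gbps in NOMINAL_GBPS.items():
    raw_mb_per_s = gbps * 1000 / 8               # Gbps -> MB/s (decimal megabytes)
    usable_mb_per_s = raw_mb_per_s * (1 - OVERHEAD)
    print(f"{name:20} ~{raw_mb_per_s:6.0f} MB/s raw, ~{usable_mb_per_s:6.0f} MB/s usable")
```

Even with a generous overhead allowance, Thunderbolt 3 offers roughly four times the raw bandwidth of USB 3.1 Gen 2 or 10 Gigabit Ethernet, which is why having it built in matters so much for external storage.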

The two best choices from this generation (for a home lab) are the Intel NUC7i7BNH and the Intel NUC7i5BNH, which are the taller Core i7 and Core i5 versions, respectively. The reason you want the taller models is that they let you install a 2.5” SATA III SSD in addition to an M.2 2280 PCIe 3.0 NVMe storage card, which gives you additional storage space and performance.

The Intel NUC7i7BNH uses the Intel Core i7-7567U processor, which has a base clock speed of 3.5GHz, a Turbo clock speed of 4.0GHz, a TDP of 28W, and a 4MB L3 cache. The Intel NUC7i5BNH uses the Intel Core i5-7260U processor, which has a base clock speed of 2.2GHz, a Turbo clock speed of 3.4GHz, a TDP of 15W, and a 4MB L3 cache. Both of these processors have integrated Iris Plus graphics with 64MB of eDRAM, which shows up as L4 cache. Whether or not Windows and SQL Server will take advantage of that L4 cache for general computing purposes is unknown to me right now.

A fully loaded Intel NUC7i7BNH would have four logical processor cores, 32GB of 2133MHz DDR4 RAM, wired Gigabit Ethernet, wireless 802.11ac networking, up to a 4TB SATA III SSD, up to a 2TB M.2 PCIe NVMe SSD, plus whatever storage you plugged into the Thunderbolt 3 USB-C port. A system configured like this would be pretty expensive, mainly because of the storage costs for the top-of-the-line components. The ultimate performance bottleneck for most scenarios would be the number of available processor cores, even though those will be pretty fast cores.
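
To illustrate why core count, rather than memory, tends to be the practical ceiling on a box like this, here is a simple back-of-the-envelope Python sketch. The VM sizes and the 2:1 vCPU over-commit ratio are hypothetical assumptions I am using for illustration, not recommendations from this post:

```python
# Back-of-the-envelope VM capacity check for a single small virtualization host.
# Host figures match the fully loaded NUC7i7BNH described above; the VM sizing
# and the vCPU over-commit ratio are illustrative assumptions only.
HOST_LOGICAL_CORES = 4
HOST_RAM_GB = 32

VM_VCPUS = 2          # assumed vCPUs per lab VM
VM_RAM_GB = 4         # assumed RAM (GB) per lab VM
VCPU_OVERCOMMIT = 2   # assumed 2:1 vCPU-to-logical-core over-commit

max_by_cpu = (HOST_LOGICAL_CORES * VCPU_OVERCOMMIT) // VM_VCPUS
max_by_ram = HOST_RAM_GB // VM_RAM_GB

limiting = "CPU" if max_by_cpu < max_by_ram else "RAM"
print(f"VMs allowed by CPU budget: {max_by_cpu}")
print(f"VMs allowed by RAM budget: {max_by_ram}")
print(f"Practical ceiling: {min(max_by_cpu, max_by_ram)} VMs (limited by {limiting})")
```

With these assumptions, the host runs out of CPU budget well before it runs out of RAM, which is exactly the core-count bottleneck described above.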

Another potential issue is that you might not be able to use a server operating system such as Windows Server 2016 or Windows Server 2012 R2, not because the OS won’t install, but because of driver issues with the NIC. Quite often, Intel won’t let you install server-class NIC drivers on client-grade NIC hardware. Sometimes people figure out ways to hack around this, but often it is much easier to install Windows 10 Professional, and then use Windows Hyper-V to host VMs.

 

Figure 1: Rear Panel of Intel NUC7i7BNH

 

On the software side of your home lab, there are many free resources available from Microsoft. If you are a student or faculty member at a high school or university, you can take advantage of Microsoft Imagine (formerly called DreamSpark) to get a lot of valuable software and other useful resources (such as a free, three-month subscription to Pluralsight). Another free program is Microsoft Visual Studio Dev Essentials, and yet another free program is Microsoft IT Pro Cloud Essentials.

In future posts, I’ll talk about some other hardware options for a home lab.

 

 

 
