Businesses are increasingly turning to SQL Server virtualization to reduce the number of physical servers they have to maintain, provide better availability, and enable fault tolerance and disaster recovery. Your SQL Server BI system is critical to the company’s success, so why not install that system into a virtual machine?
Caution is advisable, so consult with your BI system vendor before taking this step. Keep in mind that virtualization is viable because most physical servers’ computing resources go underused; applications aren’t used evenly throughout the day. These days, computers are fast enough to switch between several different tasks, making virtualization the ultimate expression of multitasking.
Today’s BI systems are a special case. You probably won’t use your SQL Server BI system to full capacity every moment of the day, but when you use it, you’ll want it to respond quickly. Because of the way BI systems use computing resources, they may not get the power they need if they’re sharing resources with other virtual machines on a virtualization host.
That’s especially true of modern BI systems, which use in-memory analytics. Rather than relying entirely on specially constructed data warehouses to store and crunch through data, in-memory analytics builds analysis models on the fly, in the server’s memory, as the name suggests. Analytics servers therefore commonly carry copious memory -- far more than even a hard-working database server might need. That’s because the database server relies mainly on disk for its storage, using memory only to process current queries and to cache small amounts of data. An analytics server with 32 GB of memory isn’t at all unusual, and carving that 32 GB out of a virtualization host can be complicated.
Compounding the complexity, most SQL Server virtualization hosts are configured to use memory overcommit. That means a host containing 64 GB of memory might allocate 16 GB to each of two virtual database servers, 32 GB to a virtual analytics server, 8 GB apiece to a few virtual messaging servers and perhaps another 8 GB each to some collaboration servers. That’s 112 GB of memory allocated -- far more than the server physically contains. The idea, again, is that no one virtual server needs all of its allocated memory at once, so the host shifts free memory around as it’s called for. An analytics server, however, might use very little memory when it’s not being accessed and all of its allocated memory when it is, breaking the overcommit model and killing performance.
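The overcommit arithmetic above can be sketched in a few lines. This is purely illustrative: the VM names are invented, and since the article says only "a few" messaging and "some" collaboration servers at 8 GB, three of each is an assumption chosen so the totals match the 112 GB figure in the text.

```python
PHYSICAL_GB = 64  # physical memory in the example host

# Hypothetical VM memory allocations; three messaging and three
# collaboration servers (at 8 GB each) is an assumption.
allocations_gb = {
    "sql-db-1": 16,
    "sql-db-2": 16,
    "analytics": 32,
    **{f"messaging-{i}": 8 for i in range(1, 4)},
    **{f"collab-{i}": 8 for i in range(1, 4)},
}

total_gb = sum(allocations_gb.values())
ratio = total_gb / PHYSICAL_GB
print(f"{total_gb} GB allocated on a {PHYSICAL_GB} GB host "
      f"({ratio:.2f}x overcommit)")
# -> 112 GB allocated on a 64 GB host (1.75x overcommit)
```

The bet fails when the memory VMs actually *use* at the same moment exceeds the physical 64 GB: if the analytics server suddenly touches its full 32 GB while both database servers use their 16 GB each, demand from just those three VMs consumes the whole host, forcing it to reclaim or swap memory and degrading performance for everything.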
In addition, SMB-targeted BI systems often bundle their own Web server, database engine and other components, which can make them even less viable inside a virtual machine.
The moral here is simple: Don’t assume that your BI system is a good candidate for SQL Server virtualization. Work closely with the vendor, consider its recommendations and make sure that -- should you decide to go the virtual route -- your virtual BI server has all the resources it needs to work for you.
ABOUT THE AUTHOR
Don Jones is a senior partner and principal technologist at strategic consulting firm Concentrated Technology. Contact him via www.ConcentratedTech.com.