Guide to SQL Server virtualization best practices
A comprehensive collection of articles, videos and more, hand-picked by our editors
The stream of new Microsoft products seldom slows, as evidenced recently at the Microsoft Management Summit 2011 in Las Vegas, where the cloud-based monitoring and configuration system previously known as Project Atlanta was given full product status. In this edition of “SQL in Five,” Mark Kromer, database platform specialist at Microsoft, discusses the upgrading of Atlanta and other new releases -- such as a self-service portal for virtual machines -- as well as what effect they will have on the SQL Server world.
But Microsoft isn’t the only company interested in improving SQL Server. Third-party components for monitoring, performance and just about anything else make life a little easier for database administrators (DBAs). Over the past few weeks, SearchSQLServer.com has examined several of these tools and the jobs they fulfill, and that got us thinking about the unspoken contract between Microsoft and third-party providers. Kromer sheds some light on the company’s thinking before it packs up and ships out SQL Server.
The configuration-monitoring system Project Atlanta, first announced at the Professional Association for SQL Server Summit in November, was reborn at the Microsoft Management Summit in Las Vegas as a full-fledged product, System Center Advisor. What’s new? What were the changes to the beta version based on?
Kromer: For everything that’s new in the release candidate (RC), I’ll direct you [to the System Center Advisor website]. You can also get right to the RC site for System Center Advisor and try it out. More and more of the Microsoft System Center product line is moving to the IaaS (Infrastructure as a Service) model, with the many configuration management, operations management and other tools in the product line now being offered in the cloud. With System Center Advisor RC, you can track and assess configuration changes over time to your SQL Server 2008 and Windows Server 2008 infrastructure. When you set up Advisor, you only need to configure gateway agents to use the service.
Another product released in Las Vegas was System Center Orchestrator 2012, which allows users to automate workflows across different systems. What will this product mean for data center managers?
Kromer: System Center Orchestrator 2012 is the next generation of the Opalis acquisition. Opalis is an automation framework meant for data center administrators to automate and integrate systems with a workflow designer. The Opalis software for automation is now part of System Center, but for SQL Server DBAs, we’ve had automation and workflow designers for a number of years now directly through SQL Server and SSIS [SQL Server Integration Services]. For infrastructure-related activities such as patching, though, System Center Orchestrator can be used to build automation workflows. Here is an example of automating SQL Server cluster patching with Opalis.
And earlier last month came another new Microsoft product, System Center Virtual Machine Manager Self-Service Portal 2.0 Service Pack 1. How will this offering help IT professionals manage a virtualized environment?
Kromer: This is an area where I see a lot of momentum and excitement related to SQL Server environments. The VMM Self-Service Portal is key to enabling the “private cloud” or “optimized data center,” where you build a streamlined, efficient system of automation, workflow and virtualization. The idea is to enable requirements-based provisioning of your databases and an elastic infrastructure that can grow and shrink with demand. With the use of workflow forms to request and approve standing up databases for applications, along with the management of virtual machine templates, you get a clearer picture of your database environment and more control over the size and scope of database deployments. The virtual machines utilized for each database instance are right-sized based upon application requirements, resulting in maximum utilization of server resources.

The three cloud models -- private, public and hybrid -- were very big focus areas at the Microsoft Management Summit this year, and as SQL Server DBAs move more and more toward virtualization, I encourage you all to look into the complete private cloud architecture as a way to do more with a virtual environment. This is a way to maximize investments and gain control over your ever-growing environments, and the System Center products mentioned here, such as Virtual Machine Manager and the Self-Service Portal, are key to Microsoft’s private cloud solution for your data center.
Let’s turn for a moment to third-party components. Are there capabilities that people go for in such tools that Microsoft might consider adding to native tools?
Kromer: Some of that has been added since SQL Server 2008. Specifically, I’m thinking of the server audit capability, management data warehouse, utility control point and backup compression. But the third-party tools that I see SQL Server customers using on a regular basis are key to enhancing the DBA’s overall experience, improving efficiency and minimizing complexity. Many of those features added to SQL Server were geared more toward overall database market competitiveness in terms of the core product. The third-party products that supplement a SQL Server DBA’s activities go beyond those capabilities and have been doing so for a long time. So I think that trend will continue. In Denali, for example, features like data integration lineage, data quality modules and workload replay are examples of things that other tools can do today but that SQL Server will now include out of the box.
How does Microsoft go about determining which capabilities it needs to pack with SQL Server?
Kromer: Microsoft employs a classic software factory model, with technical product managers, marketing managers and product strategists working in conjunction with the SQL Server development team’s program managers, who coordinate throughout the product lifecycle to manage product requirements. These requirements come from a number of sources, including market trends, direct customer feedback, competitive drivers, long-range product strategy and customer advocacy groups like PASS and MVPs [Most Valuable Professionals]. As the product teams move through release cycles, those requirements get translated into feature capabilities, and then comes the hard work of scoping and sizing the efforts to get those requirements planned, architected, developed and tested for each planned release cycle.
Editor’s note: For more from Mark, check out his blog at MSSQLDUDE.
Mark Kromer has over 16 years of experience in IT and software engineering and is well known in the business intelligence (BI), data warehouse and database communities. He is the Microsoft data platform technology specialist for the mid-Atlantic region.