Journeys always require a starting point. For IT shops moving SQL Server databases to the cloud, a clear picture of that starting point, in the form of a SQL Server performance baseline, is crucial to eventual success.
Even though database administrators usually have a general idea of the status of the SQL Server databases in their shops, a more precise SQL Server performance baseline is useful when going to the cloud. That emphasis on good performance baselines comes by way of a data professional at a hallowed U.S. university.
"Our biggest current challenge is the transition to cloud computing," said John W. Grover, a database administrator (DBA) at the University of Notre Dame in South Bend, Ind., adding that a good baseline provides a target for migrated systems, one that helps size cloud-based infrastructure to match on-premises performance.
From a database performance standpoint, this means translating, in most cases, from physical processors connected to a dedicated NAS to the vendor's presentation of a core and a disk, Grover said in an email interview. It also means using monitoring tools that can characterize that performance.
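Translating on-premises performance into cloud instance sizes typically means reducing sampled metrics to a few summary statistics. Below is a minimal sketch of that idea; the CPU-utilization samples are hypothetical, and in practice they would come from a monitoring tool such as the one Grover's team uses:

```python
# Reduce sampled on-premises CPU utilization (percent) to baseline
# statistics useful when sizing a cloud instance. The samples here are
# invented for illustration; a monitoring tool would supply real ones.
import statistics

def baseline(samples, percentile=0.95):
    """Return average, approximate percentile, and peak of the samples."""
    ordered = sorted(samples)
    idx = round(percentile * (len(ordered) - 1))
    return {
        "avg": statistics.mean(ordered),
        "p95": ordered[idx],
        "peak": ordered[-1],
    }

cpu_samples = [22, 35, 18, 41, 55, 30, 62, 48, 38, 71, 27, 44]
stats = baseline(cpu_samples)
# Size the cloud core count to the high percentile, not the average,
# so migrated workloads match on-premises performance under load.
print(stats)
```

A historical view, as in the monitoring tool the article describes, would apply the same reduction over comparable load windows on old and new hardware.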
After considerable evaluation, Grover's team selected IDERA Inc.'s SQL Diagnostic Manager software to provide the all-important SQL Server performance baseline -- the "You are here" picture -- that underlies a cloud migration strategy. Diagnostic Manager, among its SQL Server monitoring capabilities, supports capacity planning, which can be useful in platform migrations.
"The historical views provided by Diagnostic Manager are invaluable to compare similar loads over time on new hardware," Grover said.
That's particularly useful because the Notre Dame database services department runs more than 40 instances of SQL Server containing more than 575 databases. These support applications ranging from ERP and CRM to document management and classroom equipment control.
Grover cited ease of installation and configuration, cost, completeness of desired features, ease of use and support as the key factors his team used in selecting a database monitoring tool. He emphasized the importance of the last two.
"I would recommend to anyone looking for a monitoring tool not to ignore the last two items. When you get stuck, a timely and informed response from the vendor will save you hours of troubleshooting on your own," he said.
People need to have a better handle on what is happening on premises before they move to the cloud, agreed Vicky Harp, senior manager for products at IDERA. In some part, that is because things that are nice to have on premises may be drawbacks in the cloud.
For instance, take the extra headroom commonly provisioned for CPU and disk usage in data shops. While renting services in the cloud may have accounting benefits over owning your own hardware, there are ramifications.
"Things you can get away with on premises, you can't get away with in the cloud," said Harp. "That's because [any] CPU or disk overhead you have now may have to be paid for a different way on the cloud."
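Harp's point about paying for headroom can be illustrated with back-of-the-envelope arithmetic. The hourly rate below is invented for the example, not any vendor's actual pricing:

```python
# Hypothetical illustration of paying for unused headroom by the hour.
# The per-core rate is an assumption for the example, not real pricing.
HOURLY_RATE_PER_CORE = 0.05  # assumed cost per vCPU-hour
HOURS_PER_MONTH = 730

def monthly_cost(cores):
    """Monthly cost of renting the given number of cores full time."""
    return cores * HOURLY_RATE_PER_CORE * HOURS_PER_MONTH

# A server provisioned with 16 cores whose workload peaks near 8 cores
# of work carries 8 cores of headroom. On premises that headroom is a
# sunk cost; rented by the hour, it shows up as a monthly line item.
provisioned = monthly_cost(16)
right_sized = monthly_cost(8)
print(provisioned - right_sized)  # the recurring cost of the headroom
```

This is why the baseline matters: without measured utilization, there is no defensible way to choose the right-sized figure.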
Moreover, on most clouds, she noted, the ability of DBAs to change setups can vary depending on whether the cloud database is deployed as infrastructure as a service (IaaS) or as database as a service (DBaaS). With IaaS, administrators have more granular control of, for example, database block sizes.
"If it is [DBaaS], you don't have insight into the operating system setup. You can control the knobs on the SQL Server, but not on the operating system," Harp said. That's true for both Amazon's Relational Database Service and Microsoft's Azure SQL Database, she added.
Cloud is no panacea
Notre Dame's Grover concurred that things are different on the cloud when it comes to databases, performance and provisioning. Transitioning to cloud computing includes not only a change in technology, but a change in mindset, as well.
"There is a learning curve related to tuning hardware to your load, but it no longer requires the lead time of ordering hardware. We just click on the console and the magic happens," he said.
That is not to say, however, that the cloud is appropriate for everything. Some health and safety applications, for example, need to be up and running, even if output connections are unavailable, Grover said. As a result, Notre Dame runs such applications in a hybrid, high-availability mode or completely on premises.
The issue of how databases run either in the cloud or on premises may be important, but it should not obscure efforts to address the broader trend in data today, according to Nancy Gohring, an analyst for application and infrastructure performance at 451 Research. That broader trend sees databases and their underlying performance moving to the heart of operations.
"Databases are increasingly important to applications. As a result, they have to perform well or they'll negatively impact the overall performance of the application," Gohring said.
This is driving demand for advanced tools that can monitor database performance, she said, adding that, used correctly, such tools help both DBAs and developers understand the root cause of issues that slow down databases.
Find out more about SQL-Azure migration
Learn about SQL Server 2016 updates for DBAs
Check out SQL Server Reporting Services 2016