In 2011, SQL Server market trending toward cloud, virtualization

Greater SQL Server cloud adoption, stiffer competition and SQL Server virtualization will get top billing in the SQL Server market in 2011, experts predict. Self-service BI and solid-state devices are also on the rise.

The new year is here, and many experts in the SQL Server universe see some big developments ahead.

Predictions for the SQL Server market range from fiercer competition for Microsoft in the cloud, as more companies move away from premises-based servers, to widespread adoption of virtualization technology and the growing popularity of self-service business intelligence (BI) tools.

“First and foremost, I think we are going to see continued increase in vendors such as Microsoft, Amazon, IBM and other big players pushing toward the cloud,” said Adam Machanic, a Boston-based independent database consultant. “Microsoft has been making a lot of noise about its Azure product. And, from my vantage point at least, it appears that they are putting a lot of money into that product and that they are gearing up for a much, much greater push going into 2011.”

Microsoft went live with SQL Azure early last year, and despite security and privacy concerns, adoption of cloud-based databases has been steadily growing, with more businesses drawn to the reduced costs of the highly automated and highly mobile platform.

“There will be continued adoption where it makes sense, maybe even some places it doesn’t,” said Andrew Novick, a consultant at Novick Software Inc. and a Microsoft Most Valuable Professional (MVP). “But it does make sense for a lot of people who want SQL Server but don’t want to have a server.”

Craig Utley, of consulting firm Solid Quality Mentors, said that as more businesses and organizations push their data into the cloud, native application support for cloud storage and better tools for managing it will follow.

“All of that, I think, is going to be very common in the near future,” Utley said.

Pricing is due for a change

And then there’s the pricing. Microsoft charges $9.99 per month for up to 1 GB, and $99.99 per month for up to 10 GB. Still, money is money.

“It’s not trivially cheap,” Novick said. “I think we might see them changing the pricing model a little bit. They ask questions about it a lot but they haven’t come out with anything definite. And if they do then they get more people to adopt it more quickly.”

Rivaling the cost benefits of cloud-based databases is virtualization, which companies are using to cut down on physical servers and save on licensing.

“Virtualization I certainly expect to become almost the norm in large corporations with large data centers,” Novick said. “They’re sort of insistent on it, the data center people, and it works for them.”

Utley said many of his customers are running at least some of their SQL Server instances as virtualized servers, and that makes sense in terms of licensing. It’s attractive from a management perspective, as well.

“If something gets messed up, it’s very easy to recover an entire instance,” he said.

Another way to save cash and boost ROI this year, according to Pradeep Adiga, a subject matter expert in SQL Server, will be IT migration projects.

“Many of the SQL Server migration projects which were held back will be implemented this year,” he wrote in an email. “With the release of 2008 and 2008 R2, servers running on SQL Server 2000 can directly be migrated to 2008 to reduce the cost spent on buying the licenses for 2005.”

BI gives SQL Server a boost

Additionally, 2011 may see a quickening in the SQL Server adoption rate thanks to Microsoft’s BI offerings, said Rita Sallam, a BI analyst and research director at Gartner Inc.

“With a license cost profile that is comparable or less than open source BI vendors and considerably less than its commercial competitors,” Sallam wrote in an email, “the functional premium for alternatives will be increasingly difficult to justify for many organizations. This will drive SQL Server adoption.”

Utley also sees BI as a wave of the near future, with self-service BI -- designed to let users access and analyze data without being dependent on the IT department -- rising in popularity. But first, he said, Microsoft has some work to do.

“Microsoft has a somewhat confusing landscape of front-end tools for accessing data, especially BI data,” he said. “They have Excel and PowerPivot and Reporting Services and PerformancePoint.”

One of Utley’s recent BI consulting projects became a challenge because users had to toggle among tools before finding the right one for the job.

“And so, I think that what you’re going to see is some clarification and some unification of some of those particular tools, with a goal of the self-serve BI that we saw with PowerPivot,” he said.

But PowerPivot’s capabilities have led many to ask whether the in-memory cube PowerPivot builds means SQL Server Analysis Services is going away.

“The answer to that is no because there are a lot of things that the PowerPivot cubes can’t do on the fly, that you have to build in when you’re building an Analysis Services cube,” Utley said. “Until it changes, I fully expect [Microsoft] to enhance Analysis Services while they also improve on things like PowerPivot and the self-serve stuff.”

In fact, Sallam sees these online analytical processing (OLAP) and data-mining capabilities as a major selling point for SQL Server, continuing to drive up adoption.

“Even though organizations are increasingly turning to newer in memory OLAP architectures over traditional MOLAP [multidimensional online analytical processing] architectures to support dynamic and interactive analysis of large data sets, Microsoft has put an initial stake in the in-memory ground (albeit late) with the introduction this year of SQL Server PowerPivot, which requires SQL Server 2008 R2,” she wrote.

Third-party tools to challenge Microsoft

And the third-party front will see a lot of action this year, Machanic said. Look out for upstarts like Burlington, Mass.-based Expressor coming out with extract, transform and load (ETL) tools and competing directly with SQL Server Integration Services.

“I think we are going to see more and more of that coming up as shops are looking to be a lot leaner and solve problems more quickly, potentially using a number of third-party solutions instead of expecting to go to just Microsoft or just Oracle or just to whoever else to solve the problems,” he said.

The market will also see greater adoption of solid-state devices in 2011, Machanic said. He pointed to the 5 TB version of Fusion-io’s device.

“That’s a real game changer. It means that we can finally take the largest data warehouses, the largest database instances, and put them on solid-state devices,” he said. “So, I think we are going to see in 2011 the size continue to increase of those devices and the prices continue to come down to the point where it’ll become a no-brainer.”

One thing that won’t spread like wildfire in the new year is adoption of Master Data Services, Novick said. Because it is included with SQL Server 2008 R2, some will give it a try, but he calls master data management a long-term process.

“Master data management is a long process for any corporation no matter what product you use,” he said. “Because it’s almost always about your corporate organization, your processes, getting control of them and documenting them and finding out where you’ve got data entered one way in one system, a different way in a different system.”

And will we see Denali, the new version of SQL Server announced at the Professional Association for SQL Server (PASS) conference in Seattle? Not anytime soon, Novick said.

“Because CTPs [community technology previews] take a while, and they’re very careful about the process,” he said.
