

Microsoft technology refresh touches SQL Server, integration tooling

Microsoft is at work on a delicate technology refresh affecting database tuning and architecture, as well as data integration and business intelligence.

Startups in the data industry rely on shifts in technology. Their new products claim to be better than whatever they are supposed to replace. Their approach to a technology refresh is to throw out the old. Their problem, though, is that they start out with no customers.

An established industry player like Microsoft, however, actually has customers. But that can be a problem, too.

Customers have to be guided to new technology at their individual pace, without unnecessarily throwing out what came before. For Microsoft, a technology refresh has more than 50 shades of gray. Keeping track of all those shades is not easy, and it was a major preoccupation of many attendees at the recent PASS Summit 2017 in Seattle.

As data centers are shrinking and cloud computing is growing, Microsoft is trying to map a course toward cloud-based, subscription-oriented business software. Much of what went on at PASS Summit 2017 was about maintaining balance, while moving forward.

Big tent, two cities

Much of the content was about moving to the cloud and not forgetting any key pieces during the course of that migration.

In a way, it is a tale of two cities, one on premises and one in the cloud. Clearly, there is a wide range of constituencies to support under Microsoft's big tent.

A lot of different skills come into play in all this. SQL Server database administrators crowded into sessions on relational database tuning in the here and now, but they were also fed information on NoSQL alternatives.

While a small cadre of BI developers learned about new R and Python tools for analyzing big data, larger groups of BI programmers were learning about Microsoft's U-SQL tools for data lake programming with C#.

The news also included word that some useful tools would finally make their way to the cloud or, alternatively, on premises. For example, the company rolled out a Power BI Report Server update that expands support for on-premises publishing and distribution of reports created as part of cloud-based Power BI workloads.

The company also confirmed that familiar on-premises integration methods were finally being included in Microsoft's Azure services cavalcade. This is an interesting case in point for what today's technology refresh can mean to Microsoft and its customers.

Integrating a data factory in the cloud

At PASS Summit, Microsoft made SQL Server Integration Services (SSIS) a first-class citizen within Azure Data Factory. That is no small feat. SSIS arose along with SOAP at a time when XML APIs were synonymous with web services. Extract, transform and load (ETL) is an essential part of an SSIS workflow.

But a lot has changed since the first heady days of web services. With Azure Data Factory, use of REST APIs takes prominence, along with a data pipeline that supports Hive, Pig, Hadoop and other new-age data tools. In the cloud, extract, load and transform (ELT) is the approach that tends to win out -- a slight adjustment in focus perhaps, but representative of larger changes taking place.
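The ETL-versus-ELT distinction above comes down to where and when the transform step runs: before loading into the target, or after raw data has already landed there. A minimal sketch, using toy in-memory data and hypothetical function names (none of this is an Azure Data Factory or SSIS API), illustrates the difference in ordering:

```python
# Toy "source" rows with raw string amounts; a list stands in for a
# real warehouse. All names here are illustrative, not real APIs.
source_rows = [
    {"sale_id": 1, "amount": "19.99"},
    {"sale_id": 2, "amount": "5.00"},
]

def transform(row):
    """Normalize a row, e.g. cast the amount from string to float."""
    return {"sale_id": row["sale_id"], "amount": float(row["amount"])}

def etl(rows):
    """ETL: transform in the integration layer, then load clean data."""
    clean = [transform(r) for r in rows]  # transform happens before load
    warehouse = list(clean)               # load only the transformed rows
    return warehouse

def elt(rows):
    """ELT: load raw data first, then transform inside the target store."""
    staging = list(rows)                         # load raw, untouched rows
    warehouse = [transform(r) for r in staging]  # transform in the target
    return warehouse
```

Both paths end with the same clean rows; what changes is which system bears the transformation work. In the cloud, pushing the transform into a scalable target store (the ELT path) is often the more natural fit, which is the "slight adjustment in focus" described above.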

Now, with Azure Data Factory Version 2 recently available as a technology preview, SSIS ETL developers will be able to map their ETL skills more easily to the cloud environment, according to Carlos Bossy, who led a PASS session on implementing BI in the cloud.

For example, some SSIS work can move to the cloud without rewriting.

"Azure data integration was somewhat of a problem until this year with SSIS changes," said Bossy, who is senior managing partner and an architect at BI consultancy Datalere, based in Denver. Bossy showed BI developers a smorgasbord of data frameworks, including Apache Kafka, Apache Spark and a host of others.

That is the rub in distributed data processing on the cloud. As Bossy said, "There's so much stuff -- it's hard to tell how to use it."

He advised BI developers to proceed at a reasonable pace, which seemed to resonate with conference attendees who have to bring in the new while keeping existing systems running.

"Not everybody needs all of this," Bossy said. For that matter, he quipped, "some people are still trying to get yesterday's sales report out."

Whither the technology dumpster?

A technology refresh can sometimes really mean a new start, according to Mike Walsh, a PASS Summit presenter, as well as the founder and owner of Straight Path IT Solutions, a SQL Server consultancy and managed service provider in Milton, N.H. Change begets change.

"I see a lot of clients asking, 'Why do we do it that way?' or 'Can we make some simple changes on the way up to cloud?'" he wrote in an email interview.

"It's like moving into a newer apartment or house. What boxes have you just been carrying around from move to move?" he asked, then answered, "Well, there's an incentive to rent a dumpster sometimes and leave some of the baggage behind -- it hurts at first, but you can sense the peace on the other side of that move."

Clearly, helping users forward while supporting or jettisoning existing methods is not a one-size-fits-all undertaking. And not everyone will make the journey. Here is a made-up fact that hopefully holds some truth: Most surveys will tell you that 30% of people don't do anything new.

Without question, however, the crowd at PASS Summit 2017 was not a do-nothing crowd. The event was a valuable showcase for many of the moving parts involved in Microsoft's technology efforts. That was refreshing. 

Next Steps

Find out more about data migration and cloud

Look at the roots of Microsoft SSIS

Listen in as our editors talk about the Amazon-Microsoft cloud competition
