One of the hottest IT trends today is augmenting traditional business applications with artificial intelligence or machine learning capabilities. I predict the next generation of data center application platforms will natively support the real-time convergence of online transaction processing with analytics. Why not bring the sharp edge of operational insight to the front line, where business actually happens?
But modifying production application code that is optimized for handling transactions to embed machine learning algorithms is a tough slog. As most IT folks are reluctant -- OK, absolutely refuse -- to take apart successfully deployed operational applications to fundamentally rebuild them from the inside out, software vendors have rolled out some new ways to insert machine intelligence into business workflows. Microsoft is among them, pushing SQL Server machine learning tools tied to its database software.
Basically, adding intelligence to an application means folding in a machine learning model to recognize patterns in data, automatically label or categorize new information, recommend priorities for action, score business opportunities or make behavioral predictions about customers. Sometimes this intelligence is overtly presented to the end user, but it can also transparently supplement existing application functionality.
In conventional data science and analytics activities, machine learning models typically are built, trained and run in separate analytics systems. But models applied to transactional workflows require a method that enables them to be used operationally at the right time and place, and they may need another operational method to support ongoing training (e.g., to learn from new data).
Closeness counts in machine learning
In the broader IT world, many organizations are excited by serverless computing and lambda function cloud services in which small bits of code are executed in response to data flows and event triggers. But this isn't really a new idea in the database world, where stored procedures have been around for decades. They effectively bring compute processes closer to data, the core idea behind many of today's big data tools.
Database stored procedures can offload data-intensive modeling tasks such as training, but they can also integrate machine learning functionality directly into application data flows. With such injections, some transactional applications may be able to take advantage of embedded intelligence without modifying any upstream application code. Additionally, applying machine learning models close to the data in a database allows the operational intelligence to be readily shared among different downstream users.
What does this all mean for SQL Server users? Starting with its 2015 acquisition of Revolution Analytics, Microsoft has made a big investment in R, a powerful data science and analytics environment. There's a good reason for that: Many data analysts first learned to do advanced statistical work with R, such as building advanced visualizations for data exploration and coding machine learning models for predictive analytics uses.
Microsoft added R execution capabilities to the SQL Server database engine via its SQL Server R Services software. As a result, SQL Server machine learning applications can use stored procedures to tightly integrate with an R server, which lets business analysts leverage R and its extensive machine learning libraries in operational data pipelines.
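As a minimal illustration, the sketch below shows how R can be invoked in-database through the sp_execute_external_script interface that SQL Server R Services provides. It assumes R Services is installed and that the instance's "external scripts enabled" option has been turned on via sp_configure:

```sql
-- Enable external script execution once per instance (takes effect after
-- the SQL Server Launchpad service restarts).
EXEC sp_configure 'external scripts enabled', 1;
RECONFIGURE;

-- Minimal check that R runs inside the database engine: the R data frame
-- assigned to OutputDataSet comes back to the caller as a result set.
EXEC sp_execute_external_script
    @language = N'R',
    @script   = N'OutputDataSet <- data.frame(r_version = R.version.string)';
```

The same pattern scales up: @input_data_1 can feed a T-SQL query's rows into the R script as a data frame, which is how operational data pipelines hand work to R without leaving the database.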
SQL Server machine learning gets easier
In addition, MicrosoftML, a SQL Server machine learning library added to the vNext update of the database software, minimizes the coding needed to integrate R-based models with data stored in SQL Server. MicrosoftML lowers the machine learning bar for just about any enterprise: It currently offers several types of prebaked machine learning models that Microsoft said can be implemented in only six or seven lines of R code within a stored procedure. That means models can be applied dynamically and efficiently at scale, and maintained independently from core application code. In a similar manner, new data events can trigger other stored R code to train and update models dynamically.
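To make that concrete, here is a hedged sketch of what such a training procedure might look like. The table, column and model names are hypothetical, and the MicrosoftML call (rxFastTrees here, a boosted-trees learner) stands in for whichever prebuilt model type fits the problem:

```sql
-- Hypothetical example: train a churn model in-database and hand the
-- serialized model back to T-SQL through an output parameter.
CREATE PROCEDURE dbo.TrainChurnModel
AS
BEGIN
    DECLARE @model VARBINARY(MAX);

    EXEC sp_execute_external_script
        @language = N'R',
        @script = N'
            library(MicrosoftML)
            fit <- rxFastTrees(churned ~ tenure + monthly_spend,
                               data = training_data, type = "binary")
            model_out <- as.raw(serialize(fit, NULL))',
        @input_data_1      = N'SELECT churned, tenure, monthly_spend
                               FROM dbo.Customers',
        @input_data_1_name = N'training_data',
        @params    = N'@model_out VARBINARY(MAX) OUTPUT',
        @model_out = @model OUTPUT;

    -- Persist the trained model alongside the data it was trained on.
    INSERT INTO dbo.Models (name, model) VALUES (N'churn', @model);
END;
```

Because the model arrives in T-SQL as an ordinary VARBINARY value, retraining is just re-running the procedure, which is what makes event-triggered model updates practical.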
Managing any kind of code embedded in database stored procedures, especially code stored as serialized strings, can be challenging at scale. But Microsoft touts successful examples of using SQL Server itself to manage model data. In this way, machine learning models become just another type of managed data object in SQL Server.
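Under that model-as-data approach, scoring becomes a matter of reading the serialized model back out of a table and deserializing it inside the R script. Again, a sketch with hypothetical object names:

```sql
-- Hypothetical example: fetch a stored model and score new rows with it.
DECLARE @model VARBINARY(MAX) =
    (SELECT model FROM dbo.Models WHERE name = N'churn');

EXEC sp_execute_external_script
    @language = N'R',
    @script = N'
        fit <- unserialize(as.raw(model_bytes))
        scores <- rxPredict(fit, data = new_data)
        OutputDataSet <- cbind(new_data, scores)',
    @input_data_1      = N'SELECT tenure, monthly_spend FROM dbo.NewCustomers',
    @input_data_1_name = N'new_data',
    @params      = N'@model_bytes VARBINARY(MAX)',
    @model_bytes = @model;
```

Because the model lookup is an ordinary SQL query, swapping in a per-customer or per-device model is just a different WHERE clause.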
For example, in a large-scale internet of things application scenario, unique predictive models might be created to analyze and forecast the operational behavior of each individual connected device. If there are millions of devices, that could mean managing millions of models -- no small task, but one with potentially huge operational benefits.
Folding R-based predictive modeling into SQL Server shows Microsoft is serious about providing a platform for the next generation of business applications -- those that combine transaction processing with dynamic advanced analytics for operational-speed intelligence. With these SQL Server machine learning approaches, any existing application can integrate with and benefit from machine learning tools. And by using stored procedures in the database you might not need to change any application-side code at all.
All of this can be run in the public cloud, too. Microsoft Azure is fast becoming a go-to data science platform in the cloud, and with cloud-based SQL Server now capable of embedding operational machine learning, Azure users can readily apply the results of advanced analytics discoveries to optimize and accelerate transactional business processes.