
The case for data stewardship

What does bad data cost your organization? Where does it come from? How can you fix it? A data steward is the answer.

One of the nation's largest insurers wanted to create a more integrated, customer-focused organization. It began by moving some 4 terabytes of premium, loss, customer, policy, Web traffic, and external data from disparate legacy systems "owned" by individual business units into an enterprise data warehouse (EDW).

As the project evolved, a host of data quality issues emerged: repetition, different formats, different metrics and multiple business definitions. Rather than deal with these problems piecemeal, the company recognized that data quality would be an ongoing concern that could make or break the success of its initiative. In response, it instituted a "data stewardship" program that drew on the expertise of high-level business and IT executives across the organization.
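The issues above - duplicated records, inconsistent formats, conflicting definitions - are the bread and butter of warehouse consolidation. As a minimal sketch (the field names, date formats, and records below are illustrative, not from the insurer's actual systems), normalizing legacy records into one canonical form often reveals that apparently distinct records are duplicates:

```python
# Hedged sketch: the kinds of data quality fixes a consolidation project
# runs into. Field names and formats are illustrative assumptions.
from datetime import datetime

def normalize_record(rec):
    """Normalize a legacy policy record into a common warehouse format."""
    return {
        "policy_id": rec["policy_id"].strip().upper(),
        # Legacy systems disagree on date formats; pick one canonical form.
        "effective_date": datetime.strptime(rec["effective_date"], rec["date_format"])
                                  .strftime("%Y-%m-%d"),
        "premium_usd": round(float(rec["premium"]), 2),
    }

# Two legacy systems record the same policy in different formats.
legacy = [
    {"policy_id": " p-100 ", "effective_date": "03/15/2002",
     "date_format": "%m/%d/%Y", "premium": "1200.5"},
    {"policy_id": "P-100",   "effective_date": "2002-03-15",
     "date_format": "%Y-%m-%d", "premium": "1200.50"},
]

normalized = [normalize_record(r) for r in legacy]
# After normalization the two records are exact duplicates and collapse to one.
deduped = {(r["policy_id"], r["effective_date"]): r for r in normalized}
print(len(normalized), len(deduped))  # 2 1
```

Dealing with such issues record by record is exactly the "piecemeal" approach the insurer chose to avoid; a stewardship program makes the normalization rules an agreed, documented standard instead.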

Four years later, by using data in ways "never before imagined or possible," the warehouse has helped the company achieve the product innovation, customer responsiveness, regulatory compliance, and smart underwriting that characterize today's most successful insurers. At the same time, it has reduced costs through labor force optimization, replacement of legacy applications, and faster retirement of maintenance hardware.

Without a data stewardship program, the initiative might never have achieved those results.

Where does it come from? What does it cost?

The single enterprise view a data warehouse enables has long promised breakthrough insight, especially for insurers, who rely so heavily on data. Nevertheless, many data consolidation projects underperform, and quite often the cause is bad data.

In March 2002, National Underwriter concluded that a mid-size insurer (approximately 2 million claims transactions per month) can lose $10 million per year in direct costs of analyzing and correcting data errors. (The estimate was based on a Data Warehousing Institute study that put the cost of data quality issues to US industries at more than $600 billion per year.) In reaching its conclusion, National Underwriter assumed that only .001 percent of the company's data is bad, and that companies would attempt to fix only the 10 percent of errors that are critical to the business.
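The stated assumptions can be made concrete with a little arithmetic (the variable names below are our own; the figures are the ones quoted above):

```python
# Error volumes implied by the National Underwriter assumptions quoted above.
monthly_transactions = 2_000_000
bad_rate = 0.001 / 100   # ".001 percent" of data assumed bad
critical_share = 0.10    # only the 10% of errors critical to the business get fixed

annual_bad = monthly_transactions * 12 * bad_rate    # ~240 bad transactions/year
errors_fixed = annual_bad * critical_share           # ~24 corrections/year
print(f"{annual_bad:.0f} bad transactions/year, {errors_fixed:.0f} corrected")
```

How these volumes add up to $10 million depends on the per-error analysis and remediation costs, which the study does not break out.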

But there would likely be indirect costs as well. When errors are exposed to customers and regulators, fines follow, and the backlash can force an avalanche of expensive changes to how a company conducts its business.

Even worse, in the insurance industry, the sources for bad data are proliferating. External sources include government organizations, credit and claims bureaus, broker channels and online consumers.

Internally, as insurers have recognized the business potential of making data accessible beyond the "data elite," sales and service personnel, captive agents, and back-office operations have all begun to gather and input data. The Data Warehousing Institute has found that employees make 76 percent of the data entry errors that account for bad data.

The benefits of good data

Equally important, good quality data has always offered significant business advantages. Product managers use customer data to develop new products and gain market share. Actuaries use it to accurately price risk and evaluate loss reserves. Agents use it to provide a good customer experience and grow and maintain customer relationships.

Today's environmental factors make this an even more pressing issue. "Reinsurers are demanding that property and casualty insurers demonstrate their degree of catastrophic exposure and overall risk profile. Those companies that can quickly and reliably capture and organize that data can change their risk profile, if necessary, or use the data to negotiate for better reinsurance rates," says Jenny Emery, a senior vice president at Towers Perrin Reinsurance.

And according to Brad Fluegel, a principal at Reden and Anders, "Health insurers need good data to monitor trend so they can identify and address those areas that are unnecessarily driving up costs."

The answer: A data stewardship program

In today's environment, therefore, maximizing the accessibility, reusability, and quality of a company's data is an essential survival tool. A data stewardship program can achieve those goals. A number of best practices for such programs have now emerged.

First, as with any major corporate initiative, senior management must be fully engaged. Second, the business and technical sides must work closely together in cross-functional teams that include representatives from marketing, underwriting, finance, legal, and IT. These teams are responsible for:

  • Establishing business naming standards
  • Creating consistent data definitions
  • Maintaining data aliases
  • Developing standard calculations and derivations
  • Documenting the corporation's business rules
  • Monitoring the quality of the data in the data warehouse
  • Setting security requirements
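The monitoring responsibility in particular lends itself to automation. As a minimal sketch (the field names, rules, and reference list below are illustrative assumptions, not prescribed by any stewardship standard), a team might run rule-based checks against warehouse extracts and track violation counts over time:

```python
# Hedged sketch of automated data quality monitoring: a few rule-based
# checks a stewardship team might agree on. Fields and rules are
# illustrative assumptions.

def run_quality_checks(rows):
    """Return a dict mapping rule name to the number of violating rows."""
    violations = {"missing_policy_id": 0, "negative_premium": 0, "bad_state_code": 0}
    valid_states = {"CA", "NY", "TX"}  # stand-in for an agreed reference list
    for row in rows:
        if not row.get("policy_id"):
            violations["missing_policy_id"] += 1
        if row.get("premium_usd", 0) < 0:
            violations["negative_premium"] += 1
        if row.get("state") not in valid_states:
            violations["bad_state_code"] += 1
    return violations

sample = [
    {"policy_id": "P-1", "premium_usd": 500.0, "state": "CA"},
    {"policy_id": "",    "premium_usd": -10.0, "state": "ZZ"},
]
print(run_quality_checks(sample))
# {'missing_policy_id': 1, 'negative_premium': 1, 'bad_state_code': 1}
```

The value of the stewardship program is less in the code than in the cross-functional agreement on what the rules should be and who fixes the violations.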

Cross-functional teams ensure that business people understand their role in maintaining the quality of the data. Moreover, as business people become more aware of what data exists and where to find it, they are more likely to make full use of it. From the other side, technical people gain insights that allow them to fully align their work priorities with the company's overall business strategy.

Getting started

Unfortunately, fewer than half of all companies nationwide have a formal data stewardship program in place. Even if a company has a working program, it is worth evaluating that program to ensure it is in line with best practices. An evaluation can help a company measure data quality costs and benefits, understand the data value chain, readily view where company data resides, and prioritize data quality efforts.

Evaluations can include an examination of overall data quality, a gap analysis that identifies organizational holes that lead to bad data, and a process review of existing data management programs.

Given the number of people who touch data throughout an organization, companies should also evaluate whether they have an effective communication plan, so that everyone knows their role in ensuring good quality data - and the importance of doing so. Finally, companies might spend some time identifying additional business users to increase the value of any data-related projects and realize a greater return on investment.

These evaluations are the first step in creating an effective data stewardship program. As in the real-life example cited in the introduction, creating a visible and vigilant process to guard against the "garbage in, garbage out" syndrome is one key to achieving the kind of return on data and technology investments that the insurance industry has sought for years.

ABOUT THE AUTHOR
Alan Greenspan
is Product Marketing Manager at Teradata. Alan started his career in the computer industry over 25 years ago as a software developer in a product development organization and holds a B.S. degree in Math and Computer Science from U.C.L.A. and an MBA in Finance from the Wharton School, University of Pennsylvania.

This was last published in November 2004
