Rethinking business intelligence

BI has a reputation for being a resource sink that delivers reports almost no one reads. It doesn’t have to be that way. And you can no longer afford to let it be.

It can seem like a no-win situation. Business execs want more reports to glean insight into how to manage the company. So IT invests in new BI point solutions — even as it spends more and more time cleansing data and producing reports — only to be asked for changes again, since the reports IT delivers keep missing the mark.

That vicious cycle has bedeviled many an organization hungry for meaningful data. At the diversified manufacturer Ingersoll Rand, for example, an escalating quantity of resources was feeding a mélange of BI solutions used by various business units. The company had bolted BI onto all sorts of systems, from ERP to finance to CRM, “but we were not getting the value out,” says Rob Martens, global director of front-office technology.

Boris Evelson, a BI analyst at Forrester Research, has heard such laments repeatedly. “Over the years, traditional BI deployments have encouraged a perception that it’s a costly, complex environment,” he says. In reaction, many enterprises have focused on dashboards — simplified graphical displays that bypass full-blown BI systems and provide executives with business metrics in near real time. But the dashboard solution can be a false economy, Evelson says, because under any meaningful dashboard lurks the same hard groundwork to deploy reporting, analytics, and data integration, which together incur “80 percent of the cost.”

Blame for BI’s hit-or-miss ROI lies not with the technology itself but with a fundamental disconnect, says IDC analyst Dan Vesset. “To IT, BI means reporting, query tools, multidimensional analysis, OLAP tools, and maybe data mining,” he says. “To an end-user, it could mean anything that supports their decisions.” By treating BI as a set of technologies, most organizations veer off track, building ever-more-complex systems that fail to meet user needs — while what’s really needed is a better understanding of the underlying data and business requirements.

“Don’t start with a data warehouse or analytics engine. Start with understanding the business issue,” Forrester’s Evelson advises.


Getting a handle on BI is increasingly essential. Forrester, Gartner, and IDC all identify increasing demand for BI in large organizations, as rising competitiveness pushes execs and line-of-business managers to stay on top of key performance indicators. Not only is a poor BI implementation “a Band-Aid on a hemorrhaging wound,” says Ingersoll Rand’s Martens, it can’t meet the growing need to monitor how well processes are working, how customer demand is evolving, how current sales practices are affecting a company’s financial health, and so on.

The message, Martens says, is to fix what you have before you try to expand it. “If you don’t, you’ll have an absolute train wreck on the way,” he says.

Focusing on the core

Most corporations with major BI deployments create large data warehouses or collections of data marts whose incoming data must be cleansed to ensure integrity and consistency — and whose relationships must be clearly defined in data cubes to ensure that analysis tools can run queries embedded in standard reports.
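
To make that concrete, here is a toy sketch in Python (tables and figures invented) of the kind of predefined roll-up such a deployment bakes in: the dimensions are modeled up front, and standard reports simply slice along them.

```python
import pandas as pd

# Hypothetical fact table: one row per sale, keyed to a dimension table.
sales = pd.DataFrame({
    "region_id": [1, 1, 2, 2, 2],
    "quarter":   ["Q1", "Q2", "Q1", "Q1", "Q2"],
    "revenue":   [120.0, 95.0, 200.0, 40.0, 310.0],
})
regions = pd.DataFrame({"region_id": [1, 2], "region": ["East", "West"]})

# The "cube": relationships resolved up front, then aggregated along the
# predefined dimensions so a canned report can simply slice the result.
cube = (sales.merge(regions, on="region_id")
             .pivot_table(index="region", columns="quarter",
                          values="revenue", aggfunc="sum"))
print(cube)
```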

But if you start with that sort of architectural model, you’re likely to fail, says Scott Sognefest, a partner in Deloitte Consulting’s BI practice. “There’s a growing realization that you can’t put BI technology on top of a big pile of data. It’s expensive and inefficient,” he says. “You wouldn’t build a factory and then decide what products you want to produce after it’s built, but that’s what people do in the BI space.”

So understand the business case first. Then you can begin the messy work IT organizations have struggled with for years: building and refining a common data model and ensuring the data you need from multiple systems is consistent. “Data quality and data integrity are not going away. There’s no easy way to solve them,” says Betsy Burton, a Gartner vice president.

Forrester’s Evelson agrees. Before launching a BI initiative, he says, “I would have a data governance effort — and drop everything else.”

BI vendors have tried to address data quality and integration issues with MDM (master data management) solutions, but efforts to govern, cleanse, and reconcile data go beyond BI to affect every corner of the organization. In many instances, BI stakeholders have lacked the clout to drive enterprisewide MDM, yielding frustration when business execs want to scale BI beyond the original requirements that drove adoption.

Until a company cleans up its data act globally — a long-running project if there ever was one — the best strategy is to reduce the data sources to those that serve well-defined business objectives. “You’ve got no business putting in BI unless you’ve whittled down those core systems,” Martens says. That can eliminate conflicting sources and yield manageable data integration and cleansing. Keeping data close to home also keeps it closer to its context and metadata, something that can get lost when data is transformed for storage in a data warehouse. “ETL [extract, transform, load] will cost you hugely,” Martens adds, referring to the common method of pulling huge chunks of static data from legacy systems.

Reducing the number of data sources helps avoid grunt work, but data quality must still be up to par. Some data will always be dirty, perhaps because it comes from outside sources or perhaps because you’re seeking something difficult to extract. One common example is getting birth dates of customers, who see no reason to share their age, notes Anne Milley, director of technology product marketing at SAS Institute, so you get false data, such as the easy-to-enter 11/11/11, or no information at all.

In such cases, thought should be given to whether you really need that information for your analysis and, if so, how your analysis will account for the missing data so results remain meaningful, she says. This kind of thinking should be done before you deploy data collection, transformation, mining, analysis, or reporting systems, she adds.
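
A minimal sketch of that up-front thinking, in Python: screen out obvious placeholder values so downstream analysis treats them as missing rather than as facts. The field handling and placeholder list are assumptions for illustration, not anything Milley prescribes.

```python
from datetime import date

# Obvious placeholder values; this list is an assumption for illustration.
PLACEHOLDERS = {date(1911, 11, 11), date(2011, 11, 11), date(1900, 1, 1)}

def clean_birth_date(raw):
    """Return the birth date, or None for missing or placeholder values,
    so downstream analysis can account for the gap explicitly."""
    if raw is None or raw in PLACEHOLDERS:
        return None
    if not (1900 <= raw.year <= date.today().year):
        return None  # implausible year, likely mistyped
    return raw

print(clean_birth_date(date(2011, 11, 11)))  # None: the easy-to-enter 11/11/11
print(clean_birth_date(date(1975, 3, 2)))    # 1975-03-02
```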

Fleet management services provider PHH Arval provides a simple example of how such compromises can be reached. The company tracks odometer readings when truckers refuel to aid customer analyses of vehicle efficiency, delivery costs, and conformance to safety standards. But many drivers don’t take the time to transcribe odometer readings and instead enter guesstimates at the fuel terminals where this data is collected. To adjust analyses appropriately, PHH Arval created a statistical processing model that took this data weakness into account, says Greg Corrigan, the company’s vice president of BI.
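
Corrigan doesn’t detail the model, but one simple, hypothetical way to account for guessed readings is to discard implausible odometer jumps and lean on a robust summary statistic, as in this Python sketch:

```python
from statistics import median

def robust_mpg(readings, gallons, max_delta=2000.0):
    """Median miles per gallon across refueling intervals, skipping odometer
    entries that jump backward or implausibly far (likely driver guesses).
    readings[i] is the odometer at stop i; gallons[i] is fuel added there."""
    mpgs = []
    for i in range(1, len(readings)):
        delta = readings[i] - readings[i - 1]
        if 0 < delta <= max_delta and gallons[i] > 0:
            mpgs.append(delta / gallons[i])
    return median(mpgs) if mpgs else None

# The 99999 entry is a guesstimate; the filter drops both intervals it touches.
print(robust_mpg([10000, 10400, 99999, 10810, 11200, 11650],
                 [0, 60, 55, 58, 59, 62]))
```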

Downsizing solutions

Simplification should go beyond data, says Kirk Hewitt, director of reporting and finance at Valero Energy, an oil refiner. Consolidate your BI tools as well. After a decade of acquisitions, Valero found itself with five BI tools in use. The company had already simplified its data environment through the adoption of a common ERP system, common financial management artifacts (such as chart of accounts and management software), and unified databases such as those for customer or refinery information. “We are really a big believer in master data management and in cleaning data at the source,” Hewitt says.

But having multiple BI tools meant that the analytics themselves differed across departments, leading to different results even from the same data. “You often had two different people asking for information, running reports on two different tools, and getting different answers,” he recalls. So Hewitt convinced management to replace the five tools with one from Information Builders. He justified the effort by demonstrating that the license and maintenance savings alone would pay back the consolidation effort in two years. But the lasting value is deeper, he says: “Instead of exploring differences in people’s numbers, analysts can now spend the time actually analyzing the reports.”

Another common BI mistake is to believe all data must live in a warehouse before it can be analyzed. “Today’s BI tools can point to any data store,” says Martens, who uses Oracle’s BI tools.

At the Hillman Group, a metal-products distributor, CIO Jim Honerkamp reached the same conclusion. In its Information Builders implementation, “We’re not using a data warehouse at all. We’re looking right into the databases supporting the transactional systems,” such as for finance and shipping, he notes.

Data warehouses and other historical data stores have their place, however. “You do need to store the data somewhere,” says Forrester’s Evelson, whether it resides in a data warehouse, a database, or a cache. The key is to determine what you need to use for which type of data.

At Valero, Hewitt’s BI applications tap into the SAP Business Warehouse for transactional data, into SAP R/3 directly for sensitive information such as that used by human resources, into an Oracle data warehouse for financial data, and into various SQL databases for departmental data. “There’s no need to pull data from a source into a data warehouse for cleanup and roll-up, and then run the analysis from that intermediate source,” he says. Not only does that add cost and complexity, transforming all data into an intermediate form for the convenience of the BI tool risks losing the associated metadata and relationships.
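
A rough Python sketch of that direct, multisource approach (databases and tables here are invented stand-ins): query each system of record live and join at analysis time, with no staging copy in between.

```python
import sqlite3

# Two stand-in "live" sources; a real shop would connect over ODBC/JDBC.
finance = sqlite3.connect(":memory:")
finance.execute("CREATE TABLE order_costs (order_id INTEGER, cost REAL)")
finance.executemany("INSERT INTO order_costs VALUES (?, ?)",
                    [(1, 40.0), (2, 55.0), (3, 38.0)])

shipping = sqlite3.connect(":memory:")
shipping.execute("CREATE TABLE shipments (order_id INTEGER, ship_days INTEGER)")
shipping.executemany("INSERT INTO shipments VALUES (?, ?)",
                     [(1, 2), (2, 5), (3, 2)])

# Join at analysis time; each source keeps its own context and metadata.
costs = dict(finance.execute("SELECT order_id, cost FROM order_costs"))
for order_id, days in shipping.execute("SELECT order_id, ship_days FROM shipments"):
    print(order_id, days, costs.get(order_id))
```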

Pushing BI closer to operations

Both vendors and users have become enamored with so-called operational BI. This typically means analysis “in line” with a business process, such as identifying unusual supplier activity that might require a change in pricing or manufacturing schedules, or noting higher-than-expected sales activity of lower-margin products that may indicate a problem in marketing, sales, or distribution.
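
A minimal illustration of such in-line monitoring, with an assumed three-sigma threshold and invented figures: flag an incoming supplier volume that deviates sharply from recent history.

```python
from statistics import mean, stdev

def unusual(history, latest, sigma_limit=3.0):
    """True if the latest figure deviates sharply from recent history."""
    if len(history) < 5:
        return False  # too little history to judge
    mu, sd = mean(history), stdev(history)
    return sd > 0 and abs(latest - mu) > sigma_limit * sd

# Wired into the order-intake step of a business process:
orders = [102.0, 98.0, 110.0, 95.0, 101.0, 99.0]
if unusual(orders, latest=240.0):
    print("Unusual supplier volume: review pricing or manufacturing schedules")
```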

Vendors such as Business Objects foresee BI engines that hook into all sorts of processes and workflows to monitor anomalies and changes in trends, perhaps even using business rules engines to automate adjustments instead of just alerting people. “We see it as a layer that sits across all applications,” says James Thomas, senior vice president for corporate product marketing. Cognos has a similar vision, notes Don Campbell, vice president of platform strategies and technology.

Of course, Thomas and Campbell have software to sell. Instead of adding a BI layer on top, wouldn’t it be more effective to build BI functionality into the apps themselves, so they pay attention to the results of the processes they execute? That, in fact, is part of what’s needed. And that’s why ERP, SCM, and other business applications are embedding more analytics for the transactions they manage, notes Ian Charlesworth, a BI analyst at the market research firm Ovum.

But users of app-specific solutions must keep their limitations in mind. “There’s no, for example, ERP application where reports roll up to a cross-process view like customer profitability,” says Deloitte’s Sognefest. In other words, applications that monitor certain processes may be immediately useful for certain managers who use those apps, but those same processes may also need to be monitored by software that works across multiple platforms. And IT should prepare to provide that operational visibility, he says, “now that the business case [for BI] has crystallized in users’ minds.”

One fashionable way to get the operational view is through the use of dashboards. But different dashboards may have different metrics under the hood, killing any shared understanding of what’s going on, notes Dan Thorpe, senior vice president of statistics and modeling for Wachovia bank.

The problem lies with how dashboards are typically deployed, either as canned products whose metrics are unknown to IT or as tools brought in at a departmental level by frustrated users. In either case, inconsistent views of what’s happening in the enterprise often result. “Anyone can get what they want and then argue over who’s right,” Sognefest says.

“There are more bad examples than good examples of dashboards,” PHH Arval’s Corrigan adds.

Ad hoc adoption of dashboards, KPIs, reports, and so forth is a warning sign that an enterprisewide BI strategy is failing — or that it doesn’t exist. “BI’s really the responsibility of the organization’s operating committee. But it’s often not a priority, so the technology moves into the middle,” Thorpe says, letting silos grow and confusion reign.

The bad news for IT is that local analytics can sneak easily into the enterprise, whether through Web-based offerings or common business tools. Microsoft Excel has long been used as a personal analytics tool, creating multiple views of the same information, says Mike Davis, a BI analyst at Ovum — and it will only get worse if enterprises deploy the new Excel server that includes powerful BI functions Microsoft acquired in April 2006 when it snapped up ProClarity, an analytics pure-play.

The commercial real estate firm General Investments and Development faced this dilemma, recalls CIO Shawn Mahoney. Different financial analysts had their own Excel formulas for calculating items such as internal rate of return, leading to inconsistent investment decisions. Rather than fight Excel, Mahoney implemented OutlookSoft, which uses Excel as a front end to an analysis engine and database, ensuring that everyone has the same data models and formulas for these decisions. “We got a standard process that everyone uses,” he says.
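
The point generalizes: one shared definition of a calculation beats a per-analyst spreadsheet formula. This Python sketch shows a single IRR routine, computed by bisection on net present value, that every analysis would call. It is an illustration of the idea, not OutlookSoft’s implementation.

```python
def npv(rate, cash_flows):
    """Net present value, where cash_flows[0] lands at time zero."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def irr(cash_flows, lo=-0.99, hi=10.0):
    """Internal rate of return: the rate at which NPV crosses zero."""
    if npv(lo, cash_flows) * npv(hi, cash_flows) > 0:
        raise ValueError("NPV does not change sign in the search interval")
    for _ in range(100):  # bisect well past float precision for rates
        mid = (lo + hi) / 2
        if npv(lo, cash_flows) * npv(mid, cash_flows) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

# Everyone calls the same routine, so everyone gets the same answer.
print(f"{irr([-1000.0, 300.0, 420.0, 680.0]):.4%}")
```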

Weaving the BI fabric

The good news for IT is that it is easier to apply consistent BI technology to more operational systems, says Ovum’s Charlesworth, thanks to Web services, increased use of standards, more common APIs, and emerging concepts such as SOA. These newer approaches also help support consolidation of BI tools in the enterprise so that there can be a common analytics engine for typical processes such as finance and manufacturing.

Not only is it easier to use a common BI engine for many applications, “It’s easier to support a more dynamic approach to how we surface BI technology to users,” Charlesworth says. Honerkamp’s goal was to get his IT group out of the query-and-reports business, and he accomplished much of it by making his BI tool available to users via an enterprise portal. Rather than create queries and execute reports, developers at the Hillman Group created BI applications that can analyze specific business areas — and let business staff build their own queries on the fly using check boxes and pull-down menus.
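
The self-service pattern is simple in outline: menu selections are validated against a whitelist and assembled into a query, so business staff compose reports without writing SQL by hand. The table and column names in this Python sketch are invented.

```python
ALLOWED_DIMENSIONS = {"region", "product_line", "quarter"}  # pull-down options
ALLOWED_MEASURES = {"revenue": "SUM(revenue)", "units": "SUM(units)"}

def build_query(group_by, measure):
    """Turn validated menu selections into SQL; identifiers come only from
    the whitelists above, which keeps the generated query injection-safe."""
    if group_by not in ALLOWED_DIMENSIONS or measure not in ALLOWED_MEASURES:
        raise ValueError("selection not offered in the UI")
    return (f"SELECT {group_by}, {ALLOWED_MEASURES[measure]} "
            f"FROM sales GROUP BY {group_by}")

print(build_query("region", "revenue"))
# SELECT region, SUM(revenue) FROM sales GROUP BY region
```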

That project also helped the company better understand its own business. The first app IT created analyzed revenue — but in the definition phase, it became clear that the company had multiple ways of defining exactly what revenue was. “IT became the catalyst to get the groups together to agree upon a common definition of revenue before we would agree to build the app,” Honerkamp says. That not only eliminated a lot of data cleansing, it got the business on the same page about a fundamental financial issue for the first time.

Rather than filling endless requests for reports, Honerkamp’s team is now focused on working with economists and modelers to develop predictive modeling, a major shift in focus from plumbing history to preparing for future business activity. “The trick for us is to understand not just our lagging indicators but our leading indicators,” he says.

Going forward, enterprises should look for search and unstructured analytics tools that help make sense of text data and other information external to databases, says Ovum’s Davis. Such tools, most of which remain in the development phase, can augment BI’s quantitative analysis with qualitative analysis. A simple example: Call-center records can be analyzed for references to competitors to see, for example, which seem to be most attractive to new customers or which appear to be making good impressions on high-value customers, he says.
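
A toy version of that qualitative pass might look like the following Python sketch, which counts competitor mentions in call notes and tags rough sentiment. The competitor names and tiny word lists stand in for the far richer models a real text-analytics product would use.

```python
from collections import Counter

COMPETITORS = {"acmecorp", "widgetco"}  # hypothetical rivals
FAVORABLE = {"cheaper", "faster", "better"}
UNFAVORABLE = {"slow", "expensive", "worse"}

def scan_notes(notes):
    tally = Counter()
    for note in notes:
        words = {w.strip(".,!?") for w in note.lower().split()}
        for rival in COMPETITORS & words:
            if words & FAVORABLE:
                tally[(rival, "favorable")] += 1
            elif words & UNFAVORABLE:
                tally[(rival, "unfavorable")] += 1
            else:
                tally[(rival, "neutral")] += 1
    return tally

print(scan_notes(["Customer says AcmeCorp is cheaper",
                  "WidgetCo support was slow and expensive"]))
```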

Beyond structured data

Communications equipment maker Harris offers one example. The company has augmented its internal search capabilities with more traditional analytics, says Janice Lindsay, director of supply chain management. When engineers do a search for parts based on criteria such as power consumption or interface, an Endeca Technologies search engine looks at the raw results, then looks up quantitative information such as defect rates, available discounts, reliability ratings, and how much longer the part is expected to be manufactured. It then uses those factors to recommend which parts engineers should use. The results returned are filtered and ranked based on as many as 200 criteria, using information from ERP, manufacturing, product design, and other internal systems as well as from supplier systems and industry databases.
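
Endeca’s implementation is proprietary, but the general shape of multicriteria ranking is easy to sketch in Python: score each candidate part on weighted factors and sort. The criteria, weights, and parts below are invented for illustration.

```python
parts = [
    {"id": "R-100", "defect_rate": 0.010, "discount": 0.05, "years_left": 6},
    {"id": "R-200", "defect_rate": 0.002, "discount": 0.00, "years_left": 2},
    {"id": "R-300", "defect_rate": 0.004, "discount": 0.10, "years_left": 9},
]

# Higher score is better; the defect rate counts against a part.
WEIGHTS = {"defect_rate": -50.0, "discount": 10.0, "years_left": 0.5}

def score(part):
    return sum(weight * part[field] for field, weight in WEIGHTS.items())

for part in sorted(parts, key=score, reverse=True):
    print(part["id"], round(score(part), 3))
```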

Through the use of dynamic summarization — a technique that does not require data cubes to be defined up front for the analysis tool to traverse — the Endeca Information Access Platform can analyze any data source for patterns, says Endeca’s Matt Eichner, vice president of strategic development.
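
A rough analogue of summarizing without predefined cubes, in Python: aggregate over whatever fields are chosen at query time rather than at design time. The records are invented for illustration.

```python
from collections import defaultdict

records = [
    {"supplier": "A", "site": "Tulsa", "defects": 3},
    {"supplier": "A", "site": "Reno", "defects": 1},
    {"supplier": "B", "site": "Tulsa", "defects": 7},
]

def summarize(rows, by, measure):
    """Group and total on fields chosen at query time, not design time."""
    totals = defaultdict(int)
    for row in rows:
        if by in row and measure in row:
            totals[row[by]] += row[measure]
    return dict(totals)

print(summarize(records, by="supplier", measure="defects"))  # {'A': 4, 'B': 7}
print(summarize(records, by="site", measure="defects"))      # {'Tulsa': 10, 'Reno': 1}
```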

Davis cites Factiva as another example of the unstructured analysis that might be brought to bear. The service searches Web sites and blogs to find mentions of companies, then analyzes the text to determine whether the reference is favorable, ultimately producing a reputation index. Marrying this capability to traditional BI “is an interesting idea,” says Ovum colleague Charlesworth, “but it’s very early.” Davis estimates it to be about five years out.

In the meantime, IT has plenty to do rationalizing its BI environment, meeting increased demand, and bringing BI into more operational aspects of the business. Remember one fundamental truth through it all, says Wachovia’s Thorpe: “You need to be business-driven, not IT-driven. Otherwise, you get a tool that no one uses.”
