It can seem like a no-win situation. Business execs want more reports to glean insight into how to manage the company. So IT invests in new BI point solutions — even as it spends more and more time cleansing data and producing reports — only to be asked for changes again, since the reports IT delivers keep missing the mark.
That vicious cycle has bedeviled many an organization hungry for meaningful data. At the diversified manufacturer Ingersoll Rand, for example, an escalating quantity of resources was feeding the mélange of BI solutions that various business units used. The company had bolted BI onto all sorts of systems, from ERP to finance to CRM, “but we were not getting the value out,” says Rob Martens, global director of front-office technology.
Boris Evelson, a BI analyst at Forrester Research, has heard such laments repeatedly. “Over the years, traditional BI deployments have encouraged a perception that it’s a costly, complex environment,” he says. In reaction, many enterprises have focused on dashboards — simplified graphical displays that bypass full-blown BI systems and provide executives with business metrics in near real time. But the dashboard solution can be a false economy, Evelson says, because under any meaningful dashboard lurks the same hard groundwork to deploy reporting, analytics, and data integration, which together incur “80 percent of the cost.”
Blame for BI’s hit-or-miss ROI lies not with the technology itself but with a fundamental disconnect, says IDC analyst Dan Vesset. “To IT, BI means reporting, query tools, multidimensional analysis, OLAP tools, and maybe data mining,” he says. “To an end-user, it could mean anything that supports their decisions.” By treating BI as a set of technologies, most organizations veer off track, building ever-more-complex systems that fail to meet user needs — while what’s really needed is a better understanding of the underlying data and business requirements.
“Don’t start with a data warehouse or analytics engine. Start with understanding the business issue,” Forrester’s Evelson advises.
Getting a handle on BI is increasingly essential. Forrester, Gartner, and IDC all identify increasing demand for BI in large organizations, as rising competitiveness pushes execs and line-of-business managers to stay on top of key performance indicators. Not only is a poor BI implementation “a Band-Aid on a hemorrhaging wound,” says Ingersoll Rand’s Martens, but it also can’t meet the growing need to monitor how well processes are working, how customer demand is evolving, how current sales practices are affecting a company’s financial health, and so on.
The message, Martens says, is to fix what you have before you try to expand it. “If you don’t, you’ll have an absolute train wreck on the way,” he says.
Focusing on the core
Most corporations with major BI deployments create large data warehouses or collections of data marts whose incoming data must be cleansed to ensure integrity and consistency — and whose relationships must be clearly defined in data cubes to ensure that analysis tools can run queries embedded in standard reports.
But if you start with that sort of architectural model, you’re likely to fail, says Scott Sognefest, a partner in Deloitte Consulting’s BI practice. “There’s a growing realization that you can’t put BI technology on top of a big pile of data. It’s expensive and inefficient,” he says. “You wouldn’t build a factory and then decide what products you want to produce after it’s built, but that’s what people do in the BI space.”
So understand the business case first. Then you can begin the messy work IT organizations have struggled with for years: building and refining a common data model and ensuring the data you need from multiple systems is consistent. “Data quality and data integrity are not going away. There’s no easy way to solve them,” says Betsy Burton, a Gartner vice president.
Forrester’s Evelson agrees. Before launching a BI initiative, he says, “I would have a data governance effort — and drop everything else.”
BI vendors have tried to address data quality and integration issues with MDM (master data management) solutions, but efforts to govern, cleanse, and reconcile data go beyond BI to affect every corner of the organization. In many instances, BI stakeholders have lacked the clout to drive enterprisewide MDM, yielding frustration when business execs want to scale BI beyond the original requirements that drove adoption.
Until a company cleans up its data act globally — a long-running project if there ever was one — the best strategy is to reduce the data sources to those that serve well-defined business objectives. “You’ve got no business putting in BI unless you’ve whittled down those core systems,” Martens says. That can eliminate conflicting sources and yield manageable data integration and cleansing. Keeping data close to home also keeps it closer to its context and metadata, something that can get lost when data is transformed for storage in a data warehouse. “ETL [extract, transform, load] will cost you hugely,” Martens adds, referring to the common method of pulling huge chunks of static data from legacy systems.
Reducing the number of data sources helps avoid grunt work, but data quality must still be up to par. Some data will always be dirty, perhaps because it comes from outside sources or perhaps because you’re seeking something difficult to extract. One common example is getting birth dates of customers, who see no reason to share their age, notes Anne Milley, director of technology product marketing at SAS Institute, so you get false data, such as the easy-to-enter 11/11/11, or no information at all.
In such cases, thought should be given to whether you really need that information for your analysis and, if so, how your analysis will account for the missing data so results remain meaningful, she says. This kind of thinking should be done before you deploy data collection, transformation, mining, analysis, or reporting systems, she adds.
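To make Milley’s point concrete, here is a minimal sketch of that kind of forethought, written with pandas; the column names and sentinel values are hypothetical illustrations, not anything SAS prescribes:

```python
import pandas as pd

# Hypothetical customer extract; column names are illustrative only.
customers = pd.DataFrame({
    "customer_id": [101, 102, 103, 104],
    "birth_date": ["1970-03-14", "1911-11-11", None, "2011-11-11"],
})
customers["birth_date"] = pd.to_datetime(customers["birth_date"])

# Treat the easy-to-enter sentinel dates Milley describes as missing,
# rather than letting them skew any age-based analysis.
SENTINELS = {pd.Timestamp("1911-11-11"), pd.Timestamp("2011-11-11")}
customers.loc[customers["birth_date"].isin(SENTINELS), "birth_date"] = pd.NaT

# Make the gap explicit so downstream analysis can account for it,
# for example by segmenting on age only where age is actually known.
customers["age_known"] = customers["birth_date"].notna()
```

The decision encoded in that last line (whether missing ages are excluded, imputed, or reported separately) is exactly the question Milley says should be settled before the collection and reporting systems are deployed.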
Downsizing solutions
Simplification should go beyond data, says Kirk Hewitt, director of reporting and finance at Valero Energy, an oil refiner. Consolidate your BI tools as well. After a decade of acquisitions, Valero found itself with five BI tools in use. The company had already simplified its data environment through the adoption of a common ERP system, common financial management artifacts (such as a shared chart of accounts and management software), and unified databases such as those for customer or refinery information. “We are really a big believer in master data management and in cleaning data at the source,” Hewitt says.
But having multiple BI tools meant that the analytics themselves differed across departments, leading to different results even from the same data. “You often had two different people asking for information, running reports on two different tools, and getting different answers,” he recalls. So Hewitt convinced management to replace the five tools with one from Information Builders. He justified the effort by demonstrating that the license and maintenance savings alone would pay back the consolidation effort in two years. But the lasting value is deeper, he says: “Instead of exploring differences in people’s numbers, analysts can now spend the time actually analyzing the reports.”
Another common BI mistake is to believe all data must live in a warehouse before it can be analyzed. “Today’s BI tools can point to any data store,” says Martens, who uses Oracle’s BI tools.
At the Hillman Group, a metal-products distributor, CIO Jim Honerkamp reached the same conclusion. In its Information Builders implementation, “We’re not using a data warehouse at all. We’re looking right into the databases supporting the transactional systems,” such as for finance and shipping, he notes.
Data warehouses and other historical data stores have their place, however. “You do need to store the data somewhere,” says Forrester’s Evelson, whether it resides in a data warehouse, a database, or a cache. The key is to determine what you need to use for which type of data.
At Valero, Hewitt’s BI applications tap into the SAP Business Warehouse for transactional data, into SAP R/3 directly for sensitive information such as that used by human resources, into an Oracle data warehouse for financial data, and into various SQL databases for departmental data. “There’s no need to pull data from a source into a data warehouse for cleanup and roll-up, and then run the analysis from that intermediate source,” he says. Not only does that add cost and complexity, the act of transforming all data into an intermediate form for the convenience of the BI tool risks losing the associated metadata and relationships.
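As a sketch of what that direct approach looks like in practice, consider a report that queries the system of record itself rather than a staged copy. The connection string, table, and column names below are invented for illustration (using SQLAlchemy):

```python
from sqlalchemy import create_engine, text

# Hypothetical connection: the report reads the system of record directly,
# not a transformed copy sitting in an intermediate store.
finance = create_engine("oracle+oracledb://user:pw@finance-db/prod")

# The query runs where the data (and its context and metadata) lives;
# only the aggregated result travels to the report.
REVENUE_BY_REGION = text("""
    SELECT region, SUM(invoice_amount) AS revenue
    FROM invoices
    WHERE invoice_date >= :start
    GROUP BY region
""")

with finance.connect() as conn:
    rows = conn.execute(REVENUE_BY_REGION, {"start": "2007-01-01"}).fetchall()
```

The trade-off Evelson notes still applies: a query like this hits the source system at read time, so heavily used sources may still warrant a cache or store of their own.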
Pushing BI closer to operations
Both vendors and users have become enamored with so-called operational BI. This typically means analysis performed “in line” with a business process, such as identifying unusual supplier activity that might require a change in pricing or manufacturing schedules, or noting higher-than-expected sales activity of lower-margin products that may indicate a problem in marketing, sales, or distribution.
Vendors such as Business Objects foresee BI engines that hook into all sorts of processes and workflows to monitor anomalies and changes in trends, perhaps even using business rules engines to automate adjustments instead of just alerting people. “We see it as a layer that sits across all applications,” says James Thomas, senior vice president for corporate product marketing. Cognos has a similar vision, notes Don Campbell, vice president of platform strategies and technology.
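In code terms, that vision amounts to a check embedded in the process itself rather than a report run after the fact. Here is a minimal sketch; the thresholds, product names, and downstream action are all invented for illustration, not any vendor’s actual API:

```python
# Hypothetical in-line check: the order flow itself watches for
# higher-than-expected sales of low-margin products and reacts by rule,
# instead of waiting for someone to read a report.
LOW_MARGIN_SKUS = {"SKU-1042", "SKU-2210"}
EXPECTED_DAILY_UNITS = 500

def flag_for_pricing_review(sku: str) -> None:
    # Stand-in for the automated adjustment (or for the alert to a person,
    # the weaker form of operational BI).
    print(f"Pricing review queued for {sku}")

def on_order_booked(sku: str, units: int, daily_totals: dict) -> None:
    daily_totals[sku] = daily_totals.get(sku, 0) + units
    if sku in LOW_MARGIN_SKUS and daily_totals[sku] > 2 * EXPECTED_DAILY_UNITS:
        flag_for_pricing_review(sku)
```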
Of course, Thomas and Campbell have software to sell. Instead of adding a BI layer on top, wouldn’t it be more effective to build BI functionality into the apps themselves, so they pay attention to the results of the processes they execute? That, in fact, is part of what’s needed. And that’s why ERP, SCM, and other business applications are embedding more analytics for the transactions they manage, notes Ian Charlesworth, a BI analyst at the market research firm Ovum.
One fashionable way to get the operational view is through the use of dashboards. But different dashboards may have different metrics under the hood, killing any shared understanding of what’s going on, notes Dan Thorpe, senior vice president of statistics and modeling for Wachovia bank.
The problem lies with how dashboards are typically deployed, either as canned products whose metrics are unknown to IT or as tools brought in at a departmental level by frustrated users. In either case, inconsistent views of what’s happening in the enterprise often result. “Anyone can get what they want and then argue over who’s right,” Sognefest says.
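The inconsistency is easy to reproduce. Suppose two departmental dashboards both report “revenue” from the same orders, but one nets out returns and the other doesn’t (the figures here are invented for illustration):

```python
orders = [
    {"amount": 100.0, "returned": False},
    {"amount": 250.0, "returned": True},
    {"amount": 75.0,  "returned": False},
]

# Dashboard A defines revenue as everything booked.
revenue_a = sum(o["amount"] for o in orders)                       # 425.0

# Dashboard B nets out returns.
revenue_b = sum(o["amount"] for o in orders if not o["returned"])  # 175.0

# Same data, two "revenue" numbers: the argument Sognefest describes.
```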
“There are more bad examples than good examples of dashboards,” PHH Arval’s Corrigan adds.
Ad hoc adoption of dashboards, KPIs, reports, and so forth is a warning sign that an enterprisewide BI strategy is failing — or that it doesn’t exist. “BI’s really the responsibility of the organization’s operating committee. But it’s often not a priority, so the technology moves into the middle,” Thorpe says, letting silos grow and confusion reign.
The bad news for IT is that local analytics can sneak easily into the enterprise, whether through Web-based offerings or common business tools. Microsoft Excel has long been used as a personal analytics tool, creating multiple views of the same information, says Mike Davis, a BI analyst at Ovum — and it will only get worse if enterprises deploy the new Excel server that includes powerful BI functions Microsoft acquired in April 2006 when it snapped up ProClarity, an analytics pure-play.