What if the LOB followed the right analytics strategy?

Part of that strategy means the business users will need to be complemented by people who understand the analytics tools and processes, and who know how to optimize their use.


There’s no shortage of strategic advice for the LOB manager when it comes to formulating a strategy for bringing IoT analytics into the division. Any tools vendor worth its PowerPoints has a quick answer, often before the question is asked … and the answer typically suits that vendor best of all.

Make no mistake, the outpouring of self-service analytic tools is compelling. But a collection of tools does not equal a strategy. LOB managers are better advised to start with a basic understanding of the so-called big picture, of the functions, processes and people they’ll need to succeed in this brave new world. That requires taking a close look at the functions of the analytics pipeline itself, then moving on to an assessment of the skills of the collaborative citizen-analyst teams that will run the analytics.

A five-step analytics pipeline

Internet of things analytics processes are complex and multifaceted, far from a simple, sequential series of steps. Like businesses themselves, IoT analytics applications differ markedly and often change as they grow more adept, and many will use only a subset of these five pipeline components. But for planning purposes it’s best to view the pipeline as a linear, logically connected entity, featuring these main stages.

1. Descriptive analytics

Data coming in from IoT devices—often at the rate of a quarter-million or more events per second—hits the first stage, descriptive analytics. Here, data is correlated and categorized via processes such as pattern matching and anomaly detection.

Behavioral analytics would typically be in play for a customer-journey application; geospatial analytics for an autonomous-driving application. These analytics use contextual awareness of historical data, combined with situational intelligence that comes from overlaying current conditions. Because the processes involve data that is already known, the analytics are typically rule-based, so performance characteristics mainly have to do with the speed and quality of the data and the rules engines.
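As a minimal sketch of the rule-based character of this stage, consider a rolling-threshold anomaly detector applied to a stream of sensor readings. The window size, threshold factor, and data here are illustrative assumptions, not details from any particular product:

```python
from collections import deque

def detect_anomalies(events, window=5, factor=2.0):
    """Flag readings that exceed `factor` times the rolling mean
    of the last `window` readings -- a simple, rule-based check."""
    recent = deque(maxlen=window)
    anomalies = []
    for timestamp, value in events:
        if len(recent) == window:
            mean = sum(recent) / window
            if value > factor * mean:
                anomalies.append((timestamp, value))
        recent.append(value)
    return anomalies

# (timestamp, reading) pairs; the spike at t=5 stands out from the baseline
stream = [(0, 10), (1, 11), (2, 9), (3, 10), (4, 12), (5, 55), (6, 10)]
print(detect_anomalies(stream))  # → [(5, 55)]
```

Because the rule is fixed in advance, throughput depends mainly on the speed of the data and the rules engine, just as the text describes.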

2. Diagnostic analytics

The diagnostic process asks the question why: Why did events occur as they did? Because the answers are unknown, these analytics typically rely on probability rather than rules-based functions. Here is where business domain expertise is especially valuable, and where well-constructed teams of business experts and analytics developers can have significant impact on results. Root-cause analysis might play a part in a manufacturing scenario, where analysts are investigating the causes of machine slowdowns. In a customer-journey application, analysts might ask why a large number of customers have decided to switch to a lower-cost product, giving up certain high-end features that the designers thought would be attractive.
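One simple probabilistic building block for this kind of diagnosis is a lift calculation: how much more likely is an outcome (a machine slowdown, say) when a suspected factor is present? The event log below is entirely hypothetical:

```python
def lift(events, factor, outcome):
    """Ratio of the outcome rate when `factor` is present to the
    overall outcome rate. Values well above 1.0 suggest the factor
    is worth a closer look as a root cause."""
    with_factor = [e for e in events if e[factor]]
    base_rate = sum(e[outcome] for e in events) / len(events)
    cond_rate = sum(e[outcome] for e in with_factor) / len(with_factor)
    return cond_rate / base_rate

# hypothetical machine logs: did it overheat, and did a slowdown follow?
logs = [
    {"overheated": True,  "slowdown": True},
    {"overheated": True,  "slowdown": True},
    {"overheated": False, "slowdown": False},
    {"overheated": False, "slowdown": True},
    {"overheated": False, "slowdown": False},
]
print(round(lift(logs, "overheated", "slowdown"), 2))  # → 1.67
```

A lift of 1.67 says slowdowns are about two-thirds more likely after overheating than in general; deciding whether that matters is exactly where the business expert on the team earns their keep.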

3. Predictive analytics

Also largely probability-based, predictive analytics use the results generated by the descriptive and diagnostic functions to suggest future outcomes. Again, business-domain knowledge is essential in configuring the various analyses.

Here, analysts typically employ sophisticated, compute-hungry modeling—to explore decision trees, for instance, or regression and classification tests. In the customer journey application, this is where the business unit manager might want to find out if a new set of high-end features would turn customers back to the higher-priced product. Or a railway might ask if a specific cost incentive would be enough to promote greater usage of the railroad during weekdays.

4. Prescriptive analytics

Prescriptive analytics essentially suggest what actions are most likely to succeed, based on results from the previous tests. Machine learning and other rules-based processes use classification and optimization algorithms to score prescriptive models.

The decision processes may seem simple, since many are based on if-then logic. But the computations can be immense because of the many factors to be considered. Prescriptive analytics must also abide by regular organizational policies regarding appropriate courses of action in specific circumstances. After all, they typically represent the last step before action is taken.
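The interplay of if-then logic and policy constraints can be sketched in a few lines. The actions, benefit and risk scores, and policy threshold below are hypothetical placeholders for what would, in practice, come from scored prescriptive models and organizational policy:

```python
# candidate actions with model-scored benefit and risk (hypothetical values)
ACTIONS = [
    {"name": "shut_down",       "benefit": 0.9, "risk": 0.10},
    {"name": "throttle_load",   "benefit": 0.6, "risk": 0.05},
    {"name": "schedule_repair", "benefit": 0.4, "risk": 0.02},
]

def prescribe(failure_probability, max_risk=0.08):
    """Pick the highest-benefit action whose risk stays within policy.
    Low failure probabilities prescribe no action at all."""
    if failure_probability < 0.5:
        return None
    allowed = [a for a in ACTIONS if a["risk"] <= max_risk]
    return max(allowed, key=lambda a: a["benefit"])["name"]

print(prescribe(0.85))  # → throttle_load (policy rules out a shutdown)
print(prescribe(0.20))  # → None (risk too low to act)
```

Note how the policy cap, not the raw benefit score, decides the outcome: the highest-benefit action is excluded because its risk exceeds what policy allows.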

5. Process automation

Here is where the rubber meets the road or the product reaches the shopper’s online cart. To be effective, prescriptive analytics should typically trigger instant action, launching a process that shuts down a faulty machine before it can cause any additional damage or putting a product offer before the eyes of a prospect before that person leaves the webpage.

Close integration with business processes is vital here, as is superfast computer processing. A healthy, end-to-end analytics pipeline might have undertaken several thousand descriptive-diagnostic-predictive-prescriptive and automated processes in the time it took to read this last paragraph.

In a real-world scenario for an energy utility the five-step pipeline might, while ingesting smart-grid data, look like this:

  1. Correlate all contextual data needed for advanced analytics, such as equipment profiles, recent maintenance history and grid topology.
  2. Correlate real-time weather conditions and near-term forecasts of severe weather to provide real-time situational awareness of what is happening to the grid at the moment.
  3. Predict at-jeopardy equipment and any likely imminent failures using predictive failure models that are based on machine learning of historical data. The machine learning uses years’ worth of data on equipment failures and weather events to produce a predictive model.
  4. Prescriptive analytics then determines the best action to take.
  5. A process automation workflow may send a work crew or shut down the equipment in time to minimize the threat.
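The five steps above can be strung together in a toy end-to-end pass. Everything here is an illustrative stand-in: the baseline math replaces the descriptive correlation stage, the hand-built risk formula stands in for a trained failure model, and the returned action stands in for the automation workflow:

```python
def run_pipeline(reading, history, weather_alert):
    """Hypothetical end-to-end pass for one smart-grid sensor reading."""
    # 1-2. descriptive: correlate the reading with context and conditions
    baseline = sum(history) / len(history)
    context = {"deviation": reading - baseline, "storm": weather_alert}
    # 3. predictive: a stand-in failure score (a trained model in practice)
    risk = min(1.0, max(0.0, context["deviation"] / baseline))
    if context["storm"]:
        risk = min(1.0, risk + 0.3)
    # 4. prescriptive: choose an action within policy
    action = "dispatch_crew" if risk > 0.5 else "monitor"
    # 5. automation: trigger the workflow (here, just return the decision)
    return action, round(risk, 2)

print(run_pipeline(reading=140, history=[100, 98, 102], weather_alert=True))
# → ('dispatch_crew', 0.7)
```

The point of the sketch is the shape, not the numbers: each stage consumes the previous stage’s output, and the whole pass runs in microseconds, which is how a real pipeline sustains thousands of such passes per second.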

Building a citizen-analyst team

Armed with an understanding of the tools and their functions, you’re ready to fill in the human aspect of the strategy by identifying, recruiting and, if necessary, hiring the best talent possible. This is not as easy as it may sound, because the best candidates may well be coming from different parts of the corporate universe.

That means the business users will need to be complemented by people who understand the analytics tools and processes, and who know how to optimize their use. These would be data scientists and developers. Put them all together and you’ve got a winning team—maybe. Here are the skills you should be looking for:

  • Willingness to communicate. This isn’t as simple as it may sound. It requires measures of fearlessness and patience. A business-domain person shouldn’t be reluctant to ask so-called technical questions, nor should the tech expert be unwilling to answer such questions with clarity. That works both ways, with the business person willing to take the time to help the data scientist understand the subtleties of a particular project.
  • Ability to see—and tell—the big picture. Team members should be enthusiastic about communicating to others outside the team, whether other teams’ members or other business-unit or management groups. They should be able to put their knowledge into a larger context, say for a report to C-level executives. This means that team personnel should learn the organization’s business goals and strategic interests and be able to explain to others how their work fits into these larger contexts.
  • Curiosity tempered with skepticism. Curiosity largely goes without saying, but it should be combined with healthy doses of skepticism about the data that’s being analyzed. Does it represent a true picture? Are shopping-bots or other artificial influences present? Is there sufficient data to populate the model, or do we need more?

With your teams up and running, you’ll need to think about your own skills—and your management priorities—to maximize their success. Be ready to match analysis tasks with the skill sets of team personnel, and to support team members’ efforts to develop new skills through training and education. You’ll want to promote standardization where possible, by fostering cross-team consistency in data sets and reporting styles, for example, and building libraries of reusable routines and templates.

And always be looking for opportunities to communicate your teams’ efforts outside of the business unit. Encourage teams to present their findings and methods to other business units and management groups. That’s a great way to amplify your strategy by fostering the synergies that may lead to cross-organizational analytics teams for future initiatives.

This article is published as part of the IDG Contributor Network.
