Predictive analytics: Where big data and messy humans meet

In the real world, big data sometimes bumps up against Big Ideas

Last week, I wrote a blog post about big automotive companies and the challenge of ensuring that consumers complete repairs. Today, the Wall Street Journal covered the news that the death toll from faulty GM ignition switches has hit 100.

That's sad news, and it underlines a key point about the reality of big data: behind every data point is a person, and when people get mixed up with technology and analytics, things get messy. So the smart systems we build to unlock big data have to account for the very human, very messy Big Ideas that sometimes complicate things.

UPS provides an example of the way that people complicate analytics at even the most sophisticated logistics companies, for which data is a form of currency. A couple of weeks ago, UPS announced that earnings were up for the first quarter of 2015, due in part to a price increase the company had put in place.

That's not entirely intuitive (some might have guessed that higher prices would hurt sales), but it doesn't have to be: it was born of data analytics showing UPS executives that customers would be willing to pay higher prices because those increases came alongside other changes -- like the ability to accept multiple packages from consumers at retail Access Point locations -- that work in customers' favor.

Data and analytics also get put to work at UPS to inform how drivers actually make deliveries. For example: the company's no-left-turns policy has saved it an estimated 10 million gallons of gas a year. And the company has developed a highly sophisticated, proprietary system for optimizing driver routes.
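To make the idea concrete, here's a toy sketch in Python of how a routing model can treat left turns as more expensive than right turns (a left turn usually means idling while waiting to cross traffic). The coordinates, penalty value, and brute-force search below are invented for illustration; this is not UPS's proprietary system, which juggles vastly more stops and constraints.

    # Toy illustration (not UPS's actual routing system): score a delivery
    # route so that left turns -- idling while waiting to cross traffic --
    # carry an extra cost. All coordinates and penalties are made up.
    from itertools import permutations

    LEFT_TURN_PENALTY = 0.5   # hypothetical extra cost per left turn

    def distance(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

    def turn_is_left(prev_pt, cur_pt, next_pt):
        # Cross product of the incoming and outgoing segments; a positive
        # z-component means a left (counterclockwise) turn in x/y coordinates.
        v1 = (cur_pt[0] - prev_pt[0], cur_pt[1] - prev_pt[1])
        v2 = (next_pt[0] - cur_pt[0], next_pt[1] - cur_pt[1])
        return (v1[0] * v2[1] - v1[1] * v2[0]) > 0

    def route_cost(route):
        cost = sum(distance(route[i], route[i + 1]) for i in range(len(route) - 1))
        for i in range(1, len(route) - 1):
            if turn_is_left(route[i - 1], route[i], route[i + 1]):
                cost += LEFT_TURN_PENALTY
        return cost

    # A depot plus a handful of delivery stops (hypothetical coordinates).
    depot = (0, 0)
    stops = [(2, 1), (3, 4), (1, 3), (4, 2)]

    # Brute force is fine for 4 stops and hopeless for 120 -- which is why
    # real carriers need far more sophisticated optimization.
    best = min(permutations(stops),
               key=lambda order: route_cost((depot,) + order + (depot,)))
    print("Best order:", best,
          "cost:", round(route_cost((depot,) + best + (depot,)), 2))

Even in a toy like this, the cheapest route isn't always the shortest one -- exactly the kind of counterintuitive answer a driver is tempted to second-guess.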

Also: if Google can aggregate mobile phone data to provide drivers with real-time traffic information, UPS should be able to gather a lot of data about optimizing delivery routes from every driver who's out on the road with a smartphone on board.

Yet the company is "aiming for 70 percent of its U.S. drivers to use the system by year-end." Why only 70 percent? Are drivers reluctant to adopt the system because they think they can make better decisions than a big-data predictive routing algorithm?

Put another way: how many times have you decided to take a detour on your way to get somewhere, because you knew you'd hit traffic and, based on past experience, you thought you'd have a better chance of arriving on time if you took an alternate route? Or second-guessed the GPS because you know that computers are only as smart as their programming?

Companies are constantly facing this challenge: how to manage the tension between predictive analytics and human nature. It's why, on a gut level, we all have some suspicion of an artificial-intelligence future. If you think about it, big data, business intelligence, and predictive analytics are all just points on a spectrum that stretches from human intelligence to artificial intelligence. And as much as the data scientists would like to convince us otherwise, the relationship between human intelligence and AI can't be plotted on the same circle. A straight line, maybe. Possibly a Venn diagram.

There's an economic corollary to the big-data-meets-messy-humans challenge: the concept of irrelevancy and its importance in human economic behavior. Accounting for “irrelevancy” -- the seemingly immaterial factors that nonetheless shape people's decisions -- is essential if economic models are to predict outcomes accurately.

In the same way, companies that rely heavily on data and analytics must account for human influence on even the most elegant systems. Those systems have to be like children's furniture: rigorously tested to withstand a beating from hard-to-predict humans who sometimes act on data that seems irrelevant to the predictive algorithmic model.

I think we can safely predict that big data will revolutionize the future, but the degree to which data determines that future depends entirely on how well we account for messy human behavior in our data models. As Michael Scott showed us on The Office when he steered his car into a lake because the GPS told him to: the machines are not always right.
