Without a predictive analytics (PA) package looking for patterns in the Twittersphere that correlate your brand with geographic location and factors such as the number of mentions, you could miss a small but valuable window of opportunity to move merchandise.
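As a rough illustration of the kind of pattern a PA package hunts for, the sketch below counts recent brand mentions per city and flags a geographic spike. The tweet records, window, and threshold are all hypothetical stand-ins; a real system would consume a streaming feed and use far more sophisticated statistics.

```python
from collections import Counter
from datetime import datetime, timedelta

# Hypothetical tweet records: (timestamp, city, text). Hard-coded here
# for illustration; a real pipeline would read from a streaming API.
now = datetime(2012, 6, 1, 12, 0)
tweets = [
    (now - timedelta(minutes=m), city, "love the new BrandX sneakers")
    for m, city in [(2, "Boston"), (5, "Boston"), (7, "Boston"),
                    (9, "Chicago"), (40, "Boston"), (75, "Denver")]
]

def mention_spikes(tweets, window_minutes=30, threshold=3):
    """Count mentions per city inside a recent time window and flag
    cities whose counts meet a simple threshold."""
    cutoff = now - timedelta(minutes=window_minutes)
    recent = Counter(city for ts, city, _ in tweets if ts >= cutoff)
    return {city: n for city, n in recent.items() if n >= threshold}

print(mention_spikes(tweets))  # Boston: 3 mentions in the last 30 minutes
```

A result like this might prompt a local promotion while the buzz is still live, which is the "small window" the passage describes.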
"In the past, we would have based [our decisions] on historical data-and, by the time we did it, that trend may have already passed us," says Barnes. "So that's PA on steroids, at warp speed."
How this is accomplished is a marriage of open source technologies (where most of the big data platforms originate these days), Moore's Law, commodity hardware, the cloud, and the ability to capture and store huge volumes of non-transactional data that was once discarded because no one knew what to do with it.
Unstructured data such as video and email, often cited as a driving force behind big data, barely plays a part in this. Scour blog posts and user forums, though, and correlate that information with geographic data; couple it with flat files of your existing structured customer data; then bring in streams from new sources such as the MicroStrategy Wisdom engine, which tracks what some 14 million Facebook users are saying about your brand, and now you've got a new and powerful tool.
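The fusion described here, joining signals scraped from unstructured sources onto flat files of structured customer data, can be sketched in a few lines. The mention records, the CSV columns, and the ZIP-code join key are all illustrative assumptions, not the layout of any real system.

```python
import csv
import io

# Hypothetical scraped blog/forum mentions: (source, zip_code, sentiment)
mentions = [
    ("blog",  "02110", 0.9),
    ("forum", "60601", -0.4),
    ("blog",  "02110", 0.7),
]

# A flat file of existing structured customer data, as the article
# describes; the column names here are made up for illustration.
customers_csv = """zip_code,customers,avg_spend
02110,1200,85.50
60601,800,42.00
"""
customers = {row["zip_code"]: row
             for row in csv.DictReader(io.StringIO(customers_csv))}

# Aggregate the unstructured signal by geography...
by_zip = {}
for _, zip_code, score in mentions:
    by_zip.setdefault(zip_code, []).append(score)

# ...then join it onto the structured customer records.
for zip_code, scores in sorted(by_zip.items()):
    cust = customers.get(zip_code, {})
    avg_sentiment = sum(scores) / len(scores)
    print(zip_code, round(avg_sentiment, 2),
          cust.get("customers"), cust.get("avg_spend"))
```

The point is not the toy join itself but that neither dataset had to be loaded into a shared relational schema before being combined.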
R.K. Paleru, director of industry marketing for BI vendor MicroStrategy, says two things have happened with big data. "You're able to bring in more variety of data from different sources, but [you] can also take all that data and...micro-optimize. [For example,] how can you transform behavior using tools like the iPad or smartphones at the point where this tactical business decision has to be made?"
Shortening "time to answer" key to big data analytics
One big advantage of this type of analytics is the shortening of "time to answer" (TTA), according to Paul Barth, founder and managing partner of New Vantage Partners, a boutique information management and analytics consulting firm. The queries, or models, that used to take data scientists months to build in order to answer forward-looking business questions about supply chains or production schedules can now be built, in some instances, in hours, and in bulk.
This happens because big data technologies allow information to be worked with before it is optimized, rationalized or "relational-ized." This, coupled with advanced analytics, lets line-of-business managers ask and answer questions in very short cycles. (It's not quite plug-and-play yet, though, so IT workers and data modelers will have to lend a hand.)
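Working with data before it is "relational-ized" is often called schema-on-read: structure is imposed at query time rather than at load time. A minimal sketch, with made-up log lines and field names:

```python
import json

# Raw, un-modeled event records, as they might land in a big data store.
# No schema was defined up front; each line is just text.
raw_lines = [
    '{"event": "view", "sku": "A1", "region": "NE"}',
    '{"event": "buy",  "sku": "A1", "region": "NE", "qty": 2}',
    '{"event": "buy",  "sku": "B7", "region": "SW", "qty": 1}',
]

# An ad hoc "query" over the raw data: total units bought per region.
totals = {}
for line in raw_lines:
    rec = json.loads(line)  # structure is imposed only here, at read time
    if rec["event"] == "buy":
        totals[rec["region"]] = totals.get(rec["region"], 0) + rec["qty"]

print(totals)  # {'NE': 2, 'SW': 1}
```

Because no upfront modeling was required, a new question (say, views per SKU) needs only a new loop, not a new schema, which is what shortens the question-and-answer cycle.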
"These folks are using big data to automate machine-learning, turn-the-crank processes," Barth says. Doing so can generate upwards of 20,000 data models for each product line, in each market around the world, letting users look up to 18 months forward. "That's a big change. The reason they can do that is because big data technology can automate a lot of the modeling steps and execute it in a lights-out fashion."
Not long ago, this would have been nearly impossible. It took statistical analysts weeks or even months to build a single model. If you sold 100 products, you realistically couldn't build more than 1,000 models for your entire product line, which meant the information those models returned wasn't nearly as accurate, or as timely, as what big data models deliver today.
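The "turn-the-crank" bulk modeling Barth describes amounts to fitting one small model per (product, market) pair in an automated loop. The sketch below uses a simple least-squares trend as a stand-in model; the sales history, the product and market names, and the linear-trend choice are all assumptions for illustration.

```python
def fit_trend(series):
    """Ordinary least-squares slope/intercept for y over t = 0..n-1."""
    n = len(series)
    t_mean = (n - 1) / 2
    y_mean = sum(series) / n
    num = sum((t - t_mean) * (y - y_mean) for t, y in enumerate(series))
    den = sum((t - t_mean) ** 2 for t in range(n))
    slope = num / den
    return slope, y_mean - slope * t_mean

# Monthly unit sales per (product, market); illustrative numbers only.
history = {
    ("sneaker", "US"): [100, 110, 120, 130],
    ("sneaker", "EU"): [80, 78, 76, 74],
}

# "Turn the crank": one model per pair. In a real pipeline this loop
# would run lights-out over thousands of product/market combinations.
models = {key: fit_trend(series) for key, series in history.items()}

def forecast(key, months_ahead):
    """Project the fitted trend forward from the end of the history."""
    slope, intercept = models[key]
    t = len(history[key]) - 1 + months_ahead
    return intercept + slope * t

print(round(forecast(("sneaker", "US"), 18)))  # linear trend 18 months out
```

The automation, not the sophistication of any single model, is what makes 20,000 models per product line feasible: each fit is cheap, and no human sits in the loop.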
"Big Data is as much as about big analytics as it is about big data," Barth says. "This is what data scientist's love. They can iterate and iterate and iterate while they are learning the data and getting some initial insights during discovery."