The shut-off of Twitter's firehose is a hazard of the API economy

Twitter wants to sell access to its data directly via its own API set, but some see this as an innovation-destroying idea


Social media giant Twitter announced Friday, via its data-analytics acquisition Gnip, that it is ending third-party agreements for the resale of Twitter's "firehose" data -- the full, unfiltered stream of tweets available from the service.

Call it one of the occupational hazards of the API economy: The more widespread and multifaceted the reliance on a single entity -- be it as a data source, an analytics layer, or an infrastructure provider -- the easier it is to have the rug yanked out from under your feet.

In place of agreements with third-party resellers such as Gnip (now owned by Twitter), Datasift, and NTT Data, Twitter instead plans to sell access to its firehose data directly via its own API set. That means customers relying on the meta-analytics provided by those resellers, such as Datasift's industry-specific analyses, will have to either reinvent those analytics, wait for a reseller to reimplement them against the new APIs, or do without.

Twitter's motives are plain enough. The company is determined to generate more revenue by turning its data stream into a licensable resource for real-time sentiment analysis -- and in so doing become an API economy king as an indispensable source of real-world data. As Chris Moody, Twitter’s vice president for data strategy (and former chief of Gnip) said in the New York Times' Bits Blog, "In the future, every significant business decision will have Twitter data as an input, because why wouldn't you?"

This plan, which embodies the API economy, puts Twitter first and everyone else second -- and could do long-term damage to Twitter itself as much as to any of its partners.

Nick Halstead of Datasift was irked by the changes, but has resolved to do the heavy lifting for his customers; Datasift plans to set up a connector to Twitter's new APIs so existing Datasift customers will continue to be served. "This will not fix many of the failings of the GNIP processing + filtering," he wrote in a blog post, "but it will still allow you to continue to use our APIs to receive the data."

Ben Kepes of Forbes was far less charitable, calling Twitter's plan nothing less than "an evil move" by "a company with a long history of making moves that are counter to the best interests of its ecosystem."

Steven Willmott of API management vendor 3Scale believes this is as innovation-destroying an idea as Twitter's restriction of client access back in 2012. "Less parties will now be building tools to realize [the overall amount of value created on the Twitter firehose]," Willmott wrote. "Less experimentation will take place (since there is no business model for gain). All that decreases value to customers."

Not all third parties that rely on Twitter data are affected by this. IBM Insights for Twitter, for instance, relies on the Twitter "decahose," a random sampling of one out of every 10 tweets rather than the entire firehose. That particular feed remains untouched -- a sign that more of Twitter's future API monetization will be heavily tiered, with the most useful tiers also being the most expensive ones, making them more restricted -- and thus less useful in the long run.
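The decahose idea -- a deterministic, roughly one-in-ten sample of a high-volume stream -- is easy to sketch in general terms. The hash-based selection below is purely illustrative and an assumption on my part; it is not Twitter's documented sampling method, and the `tweet_id` parameter is a stand-in for any stable record identifier.

```python
import hashlib

def in_decahose(tweet_id: int) -> bool:
    """Keep roughly 1 in 10 records by hashing a stable ID.

    Illustrative only: hashing gives a deterministic, evenly spread
    sample, so the same record is always either in or out of the feed.
    """
    digest = hashlib.sha256(str(tweet_id).encode()).digest()
    return digest[0] % 10 == 0

# Over a large ID range, roughly 10 percent of records pass the filter.
sample = [tid for tid in range(100_000) if in_decahose(tid)]
print(len(sample))
```

A deterministic hash (rather than `random.random()`) is the natural choice for this kind of feed: every consumer of the sampled stream sees the same subset, and the sample stays stable across reconnects.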

Twitter is in a position of relative strength, since few other social-media companies can claim such a wealth of real-time data. It's also growing more determined to capitalize on its APIs. As their overall value to the company heads up, so do the odds of Twitter becoming a more volatile data merchant in the API economy.

Copyright © 2015 IDG Communications, Inc.
