Google has already achieved the enviable marketing distinction of turning its name into a verb. There's probably not an Internet user in the world (even Steve Ballmer) who hasn't accessed it frequently for search and mapping. But its enormous popularity and global reach place an unintended burden on the search giant: When it goes down, the entire Web is shaken.
That's exactly what happened on Thursday, May 14, when Google suffered a major failure. A routing error sent traffic to servers in Asia, creating what Google called "a traffic jam." No kidding. According to the company, 14 percent of its users experienced slowdowns or outages; many accounts put the number of those inconvenienced quite a bit higher. And we can't even guess at how many people were seriously put out by subsequent outages, including the Gmail failure last month. But this isn't the day to beat on Google or fret about the implications Google outages have for cloud computing.
What got my attention this week was a study of Internet usage by Arbor Networks, to be formally presented on Oct. 19, which found that just 100 ASNs (autonomous system numbers) out of about 35,000 account for some 60 percent of traffic on the public Internet. The two-year study also found that out of the roughly 40,000 routed sites on the Internet, just 30 large companies now generate and consume a disproportionate 30 percent of all Internet traffic.
Not surprisingly, the biggest kahuna of all the big kahunas is Google, which accounts for about 6 percent of all Internet traffic globally. The other big guys include Level 3, Limelight, Akamai, and Microsoft, in that order.
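To make the concentration figures above concrete, here's a minimal sketch of how a "top 100 ASNs carry 60 percent of traffic" style number is derived from per-network byte counts. The data below is entirely made up for illustration; the real study drew on measurements from 110 providers.

```python
# Hypothetical sketch: computing a traffic-concentration figure like
# "the top k ASNs carry X percent of traffic" from per-ASN byte counts.
# All numbers here are invented for illustration only.

def top_share(traffic_by_asn, k):
    """Fraction of total traffic carried by the k busiest ASNs."""
    volumes = sorted(traffic_by_asn.values(), reverse=True)
    return sum(volumes[:k]) / sum(volumes)

# Toy distribution: a few "hyper-giants" plus a long tail of small networks.
traffic = {f"AS{i}": 1_000_000 for i in range(1, 6)}        # 5 giants
traffic.update({f"AS{i}": 1_000 for i in range(6, 1006)})   # 1,000 small ASNs

print(f"Top 5 ASNs carry {top_share(traffic, 5):.0%} of traffic")  # 83%
```

Even in this toy example, a handful of networks dominate the total; the study's point is that the real Internet's distribution has shifted toward exactly this shape.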
Yes, the Internet is stronger -- in a structural sense -- than ever. But the concentration of traffic in so few hands raises troubling questions about the ability of the Internet to function when a major originator of traffic goes down or becomes infected. Simply put, Google may be too big to fail, and as we learned during the financial meltdown, that ain't good.
The flat Internet
I tend not to be impressed by studies conducted by vendors, but this one strikes me as quite credible. Arbor -- in collaboration with the University of Michigan and Merit Network -- looked at two years of Internet traffic across 110 large and geographically diverse cable operators, international transit backbones, regional networks, and content providers. The results were based on an analysis of 2,949 peering routers across nine Tier-1 providers, 48 Tier-2 providers, and 33 consumer and content networks in the Americas, Asia, and Europe.
The implications of the results are, well, scary. In part that's because the structure of the Internet has changed significantly in the past few years, says Danny McPherson, Arbor's chief security officer and a co-author of the study. Network traffic used to go up and down the food chain of transit providers, an inefficient situation, but one that did not create single points of failure.
These days, networks are far more likely to be directly interconnected. That makes them more efficient and generally more robust. But because so many are interconnected -- McPherson calls it a "flattening of the Internet" -- when a big one goes down, lots and lots of sites are affected. The results can be far-reaching.
Take the Gmail failure. Not only were millions of Gmail users unable to send or receive mail, but users of other systems who needed to exchange mail with Gmail users were also out of luck. Given businesses' and consumers' dependence on e-mail, that's troubling.
Then there's the issue of ads. Because Google serves enormous numbers of ads for countless Web sites, what happens when Google's servers are on the fritz? Ad revenue, of course, would take a big hit. So would performance, as browsers try to load ads from unresponsive servers. And if Google is hit by an uncontrollable malware attack, we're all in trouble.
And what's true of Google is equally true of the other companies McPherson calls "hyper-giants." As recently as five years ago, this wasn't the case: Internet traffic was proportionally distributed across tens of thousands of enterprise-managed Web sites and servers around the world. But content has increasingly migrated to a small number of very large hosting, cloud, and content providers.
More than access and economics may be at stake
The authors of the study don't examine the impact of this concentration on social and political life. But it's not a huge stretch to conclude that a handful of providers now have enormous influence over the Internet economy, as well as a good deal of social and political power should they choose to exercise it. I'm not at all sure I like that.
(There's lots more meat in this report concerning, among other things, new business models and changes in Web applications. When it's publicly available, I'll add a link to this post.)
I welcome your comments, tips, and suggestions. Reach me at firstname.lastname@example.org.