Gauging Net consumption

Peel back the energy onion, and you’ll find the Internet sucks up a lot of juice

However it is that urban myths get started, it’s kind of a bummer when one of them gets dispelled. A couple of years ago, for example, a Chinese astronaut went into space and debunked the myth that the Great Wall is visible from up there.

Now comes an article in the San Francisco Chronicle — my hometown rag — purporting to dispel the belief that the Internet and computers consume a huge chunk of America’s energy resources and are therefore a key driver of the current energy crunch.

The article cited a just-released study by a consulting professor at Stanford University, who’s also a staff scientist at Lawrence Berkeley National Laboratory. The study, funded by AMD, basically said that total datacenter power consumption — including servers, cooling, and auxiliary infrastructure — accounts for about 1.2 percent of all U.S. electricity consumption.

Now, 1.2 percent sounds pretty small — like on par with clock radios or toaster ovens (actually, the study says that it is equivalent to color-TV power consumption). This led the Chronicle to pen the grossly inaccurate headline teaser, “Net uses barely over 1 percent of U.S. electricity.”

So, a casual reader would walk away thinking the myth has been adequately shattered — that computers do not, in the end, use much electricity. And in fact, if you read the full Chronicle article, there are quotes from officials of the California Energy Commission and the University of California Energy Institute that shamefully seem to reinforce this notion.

But hold on a minute. The study was based only on machines actually located in datacenters — about 10 million of them. That’s nothing compared with the 75 million household PCs — 75 percent penetration of the roughly 100 million U.S. households — out there drawing power, plus some equally large number of corporate and small-business desktops. Not to mention printers, routers, LCD screens, and other peripherals.

So, you don’t need a Stanford Ph.D. to realize that the share of U.S. electricity consumed by computers and the Internet could easily run into double digits, way higher than 1.2 percent. And although it’s nice to have an academic estimate of the datacenter drain, let’s not kid ourselves: Cyberspace is sucking a lot of juice.
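To see why, it helps to run the numbers the argument is gesturing at. Here’s a quick back-of-envelope sketch in Python: the 10 million datacenter machines, the study’s 1.2 percent figure, and the roughly 150 million household and business desktops come from the paragraphs above, but the per-machine power ratio is purely an assumption on my part, so treat the output as an illustration of the reasoning, not a measurement.

    # Back-of-envelope sketch of the argument above. The machine counts and the
    # 1.2 percent figure come from the column; the power ratio is an assumption.
    DATACENTER_MACHINES = 10e6   # machines covered by the AMD-funded study
    DATACENTER_SHARE = 1.2       # percent of U.S. electricity, per the study
    OTHER_MACHINES = 150e6       # ~75M household PCs plus a similar number of business desktops
    POWER_RATIO = 0.25           # assume a desktop draws ~1/4 of a datacenter machine
                                 # (server plus cooling and overhead) -- a pure guess

    other_share = DATACENTER_SHARE * (OTHER_MACHINES / DATACENTER_MACHINES) * POWER_RATIO
    total_share = DATACENTER_SHARE + other_share
    print(f"Desktops and the rest: ~{other_share:.1f}% of U.S. electricity")
    print(f"Combined estimate:     ~{total_share:.1f}%")

With these debatable inputs the combined figure lands around 6 percent; nudge the power ratio upward or add printers, routers, and networking gear, and double digits stops looking far-fetched.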

One other interesting thing here is that the study and the newspaper article are rife with the usual energy stats — megawatts and billions of kilowatt-hours. As a real person, I have no way of grasping what these abstract numbers really mean. They’re even worse than billions and trillions of dollars, because at least there I have a solid, personal understanding of the fundamental unit: a dollar.

Abstraction that leads to numbness is part of the problem we all face when it comes to understanding technology and large-scale complex systems — what is a “zombie bot network,” anyway? Ever seen one? The energy people must figure out a way to put energy consumption in terms that real people can understand. BTUs don’t work. Barrels of oil don’t do it. Gigawatts are meaningless.

My nomination: Just use dollars. Your laptop consumes $35 worth of energy a year (total guess on my part). Datacenters consumed about $3 billion worth of electricity in 2005 (real data from the study), compared with roughly $300 billion in total U.S. electricity spending that year. Why can’t we just have a sticker on everything with a best guesstimate, the way refrigerators do? “If you leave this turned on for a whole year, it’ll cost X.” Maybe that would be too much of a turnoff?
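For what it’s worth, the sticker arithmetic is trivial to automate. Here’s a minimal sketch of such a calculator; the wattages and the price per kilowatt-hour are illustrative assumptions of mine, not figures from the column or the study.

    # Minimal "dollar sticker" sketch: annual electricity cost of leaving a device on.
    # The wattages and the $0.10/kWh price are assumed, illustrative values.
    PRICE_PER_KWH = 0.10  # assumed average U.S. retail electricity price, in dollars

    def yearly_cost(watts, hours_per_day=24.0):
        """Dollars per year for a device drawing `watts` for `hours_per_day`."""
        kwh_per_year = watts * hours_per_day * 365 / 1000
        return kwh_per_year * PRICE_PER_KWH

    for label, watts in [("Laptop", 40), ("Desktop + LCD", 150), ("Broadband router", 10)]:
        print(f"{label}: about ${yearly_cost(watts):.0f} a year, left on around the clock")

At those assumed numbers, a laptop left on 24/7 works out to roughly $35 a year, which happens to land right on my guess above.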
