Does “built to last” apply to IT?

Today’s innovation influences tomorrow’s products, but does Google have 100 years left in it?

Over the weekend, I bought an amazing antique chair: a fancy wooden office swivel chair in practically mint condition, including all its original cast-iron hardware. Although probably made between 1900 and 1915 (the patent date is 1897), it’s remarkably modern, with fully adjustable height, tilt, and back support, like the best Aeron chairs of today (well, its wooden surfaces are a tad stiffer). With any luck, it will last another 100 years and be just as functional.


Which made me wonder -- what are we building today in IT that could last 100 years? And in an age of Moore’s law, modular systems, and adaptive architectures, should we even be trying? Sure, some mainframes and legacy languages such as COBOL have hung around for a few decades. But beyond those, IT becomes obsolete pretty quickly. Will this ever change? Has there ever been long-term craftsmanship in IT? Or is craftsmanship a different thing from longevity anyway? I’d be curious to hear what you think.

Jim Collins, in his business bestseller Built to Last, argues that great companies are less the result of great products and charismatic leaders than of sustainable cultures that foster innovation and excellence over the decades. The company that made my excellent turn-of-the-century chair, as far as I can tell, is long dead. It doesn’t even show up in Google. Hmm -- 100 years from now, will Google show up in Google?

Power Feedback Loops

With power grids across the country straining under record heat, I started to wonder whether there is a way to apply IT thinking and innovation to the energy cost crunch and the carbon emissions problem. The power grid is similar to the Internet but with a kind of reverse Metcalfe’s law: instead of becoming exponentially more valuable as users are added, it becomes exponentially more volatile. The hotter it gets outside, the more users crank up their AC, putting that much more strain on an already peak-loaded system.
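To make the contrast concrete, here’s a toy model in Python. Every number and coefficient below is invented purely for illustration -- this is a sketch of the feedback-loop idea, not real grid physics:

```python
# Toy model: Metcalfe-style network value vs. hot-day grid strain.
# All coefficients are made up for illustration.

def metcalfe_value(users: int) -> int:
    """Metcalfe's law: a network's value grows roughly with users squared."""
    return users * users

def peak_grid_load(users: int, heat_index: float) -> float:
    """The feedback loop: the hotter it gets, the more each user draws
    (everyone cranks the AC), so peak load compounds with the weather
    instead of just tracking the user count."""
    per_user_kw = 1.0 + 0.05 * heat_index  # invented per-user draw
    return users * per_user_kw

for users in (1_000, 10_000, 100_000):
    print(f"{users:>7} users: value ~{metcalfe_value(users):,}, "
          f"peak load at heat index 40 ~{peak_grid_load(users, 40):,.0f} kW")
```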

I’ll leave the matter of clean-energy innovation to others, but it strikes me that, just as in pre-virtualization datacenters, there must be a lot of waste and underutilization hidden in this system, waiting to be uncovered. And the most efficient way to uncover it should be IT enablement. What if, for example, there were a new way to leverage distributed workload management technology to make a quick, big impact on efficiency -- the energy-grid equivalent of server virtualization?
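One back-of-envelope reading of "grid virtualization" is demand-response scheduling: run essential loads now, defer flexible ones off-peak, the way virtualization consolidates VMs onto fewer hosts. The sketch below is entirely hypothetical -- the loads, the capacity figure, and the greedy policy are all invented to illustrate the shape of the idea:

```python
from dataclasses import dataclass

@dataclass
class Load:
    name: str
    kw: float
    deferrable: bool  # can this job run off-peak (e.g., EV charging)?

CAPACITY_KW = 500.0  # invented feeder capacity

def schedule(loads: list[Load]) -> tuple[list[Load], list[Load]]:
    """Greedy demand-response: essentials always run; flexible loads run
    only while the feeder has headroom, and are deferred once it doesn't."""
    now, deferred = [], []
    used = 0.0
    # Essentials first (deferrable=False sorts ahead of True),
    # then flexible loads in whatever room is left.
    for load in sorted(loads, key=lambda l: l.deferrable):
        if not load.deferrable or used + load.kw <= CAPACITY_KW:
            now.append(load)
            used += load.kw
        else:
            deferred.append(load)
    return now, deferred

run_now, run_later = schedule([
    Load("hospital HVAC", 200, deferrable=False),
    Load("office AC", 150, deferrable=False),
    Load("EV fleet charging", 120, deferrable=True),
    Load("water heaters", 90, deferrable=True),
])
print("now:", [l.name for l in run_now], "| later:", [l.name for l in run_later])
```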

One idea: a waste-reporting Web site (energyhogs.com?) where individuals could snitch on large commercial energy users squandering power through carelessness or mismanagement. Think a comprehensive public database (a la Chicagocrime.org) crossed with an insider dish site (such as The Smoking Gun or F----d Company).
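A minimal sketch of what the tip database might look like, assuming the hypothetical energyhogs.com site from above -- the schema, fields, and sample tip are all invented for illustration:

```python
import sqlite3

# Hypothetical energyhogs.com tip database -- schema invented for illustration.
conn = sqlite3.connect("energyhogs.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS reports (
        id          INTEGER PRIMARY KEY,
        company     TEXT NOT NULL,
        zip_code    TEXT NOT NULL,
        description TEXT NOT NULL,    -- what waste the tipster observed
        reported_at TEXT DEFAULT CURRENT_TIMESTAMP
    )
""")

def drop_a_dime(company: str, zip_code: str, description: str) -> None:
    """Record an anonymous tip about large-scale energy waste."""
    conn.execute(
        "INSERT INTO reports (company, zip_code, description) VALUES (?, ?, ?)",
        (company, zip_code, description),
    )
    conn.commit()

drop_a_dime("Acme Warehousing", "60614",
            "Loading-dock doors open all day with the AC running full blast")
```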

Gross energy inefficiencies that otherwise would have gone unnoticed could get eliminated pretty quickly under that kind of spotlight -- maybe there’s a mashup with Google Maps so that local reporters can search by ZIP code or company.
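Building on the hypothetical database sketched above, the ZIP-code search a local reporter might run could be as simple as this -- the data it returns is exactly what a map mashup would plot:

```python
import sqlite3

def reports_by_zip(zip_code: str) -> list[tuple[str, str, str]]:
    """All tips for one ZIP code, newest first -- the raw material
    for a map mashup or a local news story."""
    conn = sqlite3.connect("energyhogs.db")  # same hypothetical DB as above
    rows = conn.execute(
        "SELECT company, description, reported_at FROM reports "
        "WHERE zip_code = ? ORDER BY reported_at DESC",
        (zip_code,),
    ).fetchall()
    conn.close()
    return rows

for company, description, reported_at in reports_by_zip("60614"):
    print(reported_at, company, "--", description)
```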

And it would empower individuals on the perimeter to have an impact on something they otherwise couldn’t influence. Just a thought. Drop a dime, cut some CO2 emissions? Why hasn’t this already been built?
