If you work in IT for a big oil company, you know that your company (and probably your IT shop) is divided into two parts: upstream and downstream. Upstream is exploration and production of the oil. Downstream is the distribution through retail channels (aka gas stations) to the consumer.
Actually, pretty much everything in life has an upstream and downstream — everything comes from somewhere and ends up somewhere. But tracking things accurately starting far upstream may be one of the more daunting challenges facing IT during the next decade.
Case in point: My esteemed colleague Jon Udell’s provocative recent column, “The Carbon-Adjusted Supply Chain.” Jon posited that SOA-enabled supply chain optimization might someday allow Amazon.com to tell the consumer (downstream): Here’s the price if you want this product delivered with less carbon emission impact, and a different price if you don’t care how much carbon was emitted.
Great idea, in theory. The problem is, most of the carbon emissions don’t come from the ships and trains and trucks that got the product to you. They come from the dirty-coal-fired plant that powered the Chinese factory that made it to begin with.
Tracking carbon that far upstream poses the same challenges the food industry faces when it wants to know what farm a specific chicken or cow or grain shipment came from to be able to track disease or purity or pesticide use or whatever. Even if you had tiny embedded chips or sensors all the way back to the farm or factory, how do you get the upstream data populated and standardized? Who will pay for the tracking? And what if the upstream people have some secrets they want to keep?
Still, there is hope. IT is reaching its tentacles further and further upstream and downstream, shining light where there was once darkness and ignorance. I recently spoke with two datacenter managers from Pacific Gas and Electric at a conference, and was stunned to realize how far upstream and downstream their thoughts went.
PG&E’s IT department is undergoing a massive upgrade, they said, to support the utility’s transformation into a smarter, greener company. On the upstream side, they knew the specs of the blades at the newest wind energy farm (90 feet long) and how much maintenance those blades need (a lot). They told me how often the lakes behind hydropower dams need to be dredged, and at what hours the water flow is regulated for fishermen.
As for downstream, they told me that tens of thousands of smart meters are already being installed in Bakersfield, Calif. With them, people can see in real time how much energy they’re consuming, repair crews can instantly know the location and nature of an outage, and transformers can be upgraded to much more efficient ones with less risk of outages, because the network is smarter. Of course, all this takes a lot more IT — more data, more servers, more analytics, you name it. But the payoff is huge.
Incidentally, PG&E is also offering a new rebate program for datacenter virtualization projects, promising qualifying California projects up to four million dollars per site. Why? Because the utility wants to cut energy demand so it doesn’t have to build more multibillion-dollar upstream power plants. Which just reinforces the adage: follow the money. It always starts upstream.