A respected mentor of mine, Steven Northcutt of SANS, once told me, "Eat the watermelon and spit out the seeds." It’s an appropriate metaphor: There is often truth buried in statements and concepts we disagree with.
The concepts of trusted computing and DRM (digital rights management) often require a watermelon approach. Trusted computing is the idea that integrity and authentication -- and at times encryption -- are built into all parts of a computing system. It's built into the hardware, into the CPU, and into supporting motherboard chips. It's supported by the OS and implemented all the way up into the software application layers. Even the network and all the related network devices have authentication and trust built in. Default authentication and encryption are built into every process and every computing cycle, from bit to terabyte.
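To make the chain-of-trust idea concrete, here is a minimal Python sketch of the hash-chaining "extend" operation at the heart of TCG's Trusted Platform Module design. It's an illustration only, not real TPM code: the component names are hypothetical, and an actual TPM performs the extend in hardware against its Platform Configuration Registers (PCRs).

```python
import hashlib

def extend(pcr: bytes, measurement: bytes) -> bytes:
    """TPM-style extend: new PCR = SHA-1(old PCR || measurement).

    A real TPM does this in hardware; this sketch only shows the hash
    chaining that lets a verifier detect any change in the boot chain.
    """
    return hashlib.sha1(pcr + measurement).digest()

# Hypothetical boot chain: each stage measures the next before running it.
boot_chain = [b"firmware", b"bootloader", b"os-kernel", b"app"]

pcr = b"\x00" * 20  # PCRs start zeroed at platform reset
for component in boot_chain:
    pcr = extend(pcr, hashlib.sha1(component).digest())

# Tampering with any stage -- say, a rootkit patching the kernel --
# yields a different final PCR value, so the platform no longer
# matches its expected "trusted" configuration.
tampered = b"\x00" * 20
for component in [b"firmware", b"bootloader", b"rootkitted-kernel", b"app"]:
    tampered = extend(tampered, hashlib.sha1(component).digest())

assert tampered != pcr
print("clean:", pcr.hex())
print("tampered:", tampered.hex())
```

Because each stage's hash is folded into the running value, no later stage can hide an earlier modification -- which is exactly why a vendor slipping unmeasured, secretive software onto the platform defeats the whole scheme.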
The Trusted Computing Group hosts great discussions of trusted computing's goals. If you really want to be a serious computer security professional, take the time to read and understand TCG's Trusted Platform Module, Trusted Network Connect, and PC Client specifications.
Trusted computing, if implemented correctly, is a beautiful thing. But can we trust our trusted vendors to be trustworthy? If the applications we install on our trusted platforms contain untrustworthy code, the whole exercise is for naught.
As for DRM, I support its overall objectives. Content providers and copyright holders should be able to charge for their content if they so choose -- as long as they are following commonly accepted laws and traditions. It's the American way.
But DRM is often swung as a sword by a drunken warrior blithely unaware of the damage he is causing. What's got my dander up this week is Sony's incredibly bad DRM decision-making.
Two weeks ago, Sysinternals' Mark Russinovich discovered that many of Sony's music CDs are protected by a DRM mechanism that mimics a malicious rootkit program (see sysinternals.com/Blog for his excellent details on the issue). Sony's handling of the issue has gone from bad to worse. First, the company basically said most people don't care about the issue and released flawed removal instructions. Then several worms and viruses started appearing that used Sony's rootkit hiding mechanisms.
Today, I read that Sony's new removal software leaves any computer it has been executed on in a very insecure state. Namely, the browser on any affected machine will allow any Web site to execute any program on the computer -- malicious or otherwise.
The removal process apparently leaves an insecure ActiveX control installed that can be remotely scripted by any Web site to do anything. Lest you think this is a joke, the post on the vulnerability was created (partially) by Ed Felten, one of the world's most respected computer security experts. Add to that the many insane statements Sony made in its EULA -- such as that people filing for bankruptcy must delete the music and that you cannot play the music on a work computer -- and the issue of DRM's reach and who to trust becomes a lot seedier.
Our only hope is that consumers will pay enough attention to cause serious adverse financial consequences to entities that abuse our trust. Unfortunately, if history is any guide, most of the public is not paying attention and doesn't care -- although the Department of Homeland Security's almost uncloaked rebuke was meaningful.
What makes this situation worse is that Sony is far from alone in abusing consumers' trust and DRM's real objectives. InfoWorld's own Ed Foster's Gripe Line column and blog are full of companies that apparently share Sony's warped treatment of customers.
Still, it is Sony’s continued poor handling that gets my ire up at the moment. Sony probably sees this as merely a minor spot of bad publicity and doesn't really "get it" about how wrong it was to install secretive software in the first place.
I think what my friend Chris Quirke said in an e-mail forum sums up the problem nicely: "Sony releases all manner of things under their brand, including IT products. If they are happy to leverage the brand association, then they can't disassociate themselves from the fallout. Would you buy or resell Sony DVD writers and other optical drives? Would you trust software bundled with Sony digital cameras, media playing devices, etc.?”
Trusted computing can fail at any one of several layers, and it's no good worrying about the deepest of these -- program code, hardware, and the like -- if the topmost layer is blown away. Trusted computing is going to be designed and built by entities that have proved they cannot be trusted, out of materials that are notoriously prone to bad behavior.
When audio CDs drop rootkits on PCs, "documents" auto-run macros, and JPEG image files run as raw code through some deep code bug, you don't have to look far to understand why we are scratching at the door to escape whenever we hear the term "trusted computing."
One thing is for sure: Trying to ensure that only good comes out of the trustworthy computing and DRM initiatives will be no picnic.