Enter the Semantic Web, an effort spearheaded by Tim Berners-Lee in 1999 to extend the Web to enable machines to take this mantle. At the outset, the idea -- to transform the Web into something machines can readily analyze -- seemed hopelessly academic. Yet with significant public data sets surfacing in Semantic Web form, the once crazy notion now stands to revolutionize how enterprise IT accesses and disseminates data via the Web.
RDF (Resource Description Framework) -- the Semantic Web's standard format for data interchange -- extends the Web's URI-based linking structure to name not only the two ends of a link but the relationship between them, so that relationships among all manner of resources can be expressed as simple subject-predicate-object statements. But the key to the Semantic Web -- and where most people's eyes glaze over -- is its use of ontologies. If specialized communities can successfully create ontologies for classifying data within their domains of expertise, the Semantic Web can knit together these ontologies, which are written using RDF Schema, SKOS (Simple Knowledge Organization System), and OWL (Web Ontology Language), thereby facilitating machine-based discovery and distribution of data.
Buy-in is essential to the success of the Semantic Web. And if it continues to show promise, that buy-in seems likely.
-- Martin Heller
12. Total information awareness
When the DoD's Information Awareness Office rolled out its high-tech scheme to track down terrorists in 2002, the program had all the hallmarks of a government boondoggle, invoking imagined -- and sometimes unimaginable -- future technologies to solve an immediate problem.
First, there was the hyperbolic, Orwellian name, Total Information Awareness (TIA); then there was the project leader, convicted Iran-Contra felon Rear Admiral John Poindexter. And finally there was the bloated goal: to aggregate, store, and analyze public and private data on an unimaginably massive scale, applying a predictive model that would correlate past activities to predict future acts. Minority Report, anyone?
The project eventually got a PR makeover, emerging as "Terrorism Information Awareness." Even so, the idea was still technically far-fetched. To create a system that could scoop up and analyze citizens' or foreign nationals' credit card transactions, medical records, Web site activity, travel itineraries, e-mails, or anything with an electronic fingerprint, Poindexter called for a "total reinvention of technologies for storing and accessing information." That's the IT equivalent of a Hail Mary pass.
Ultimately, the technical hurdles became moot. Privacy advocates howled, public sentiment turned, and the Feds officially pulled the plug in 2003. Yet for all the program's sci-fi underpinnings, many of the technologies that constituted TIA aren't as nutty as they sound.
For instance, companies such as Teradata offer solutions that can migrate petabytes of data from disparate databases to a massive, integrated data repository, where customers can employ sophisticated data mining. Meanwhile, speech analytics software from CallMiner and others enables companies to mine customer phone calls for business intelligence. And although today's predictive analysis tools may not be able to foretell a terrorist attack, they can, for example, analyze the failure rates of mechanical parts so that companies can adjust their inventories accordingly. Not too bad a technical legacy for such a mixed bag of seemingly crackpot notions.
-- Steve Fox