What do World War I and business intelligence have in common? WWII and Pretty Good Privacy (PGP) encryption? The Cold War and CDMA? Actually, a great deal. Many information technologies have their roots in wartime and defense-related efforts of the 20th century, from supercomputing to the Internet, from advanced microprocessors to fiber optics.
"A lot of the stuff people have built on to make the IT industry we know [today] has grown out of military applications," explains Jim Lewis, a senior fellow and military IT expert at the Center for Strategic and International Studies (CSIS), a Washington-based think tank. Lewis explains that for most of the 20th century, technology innovation flowed predominantly from the military to the private sector. Enterprises today are still reaping the benefits of past U.S. government defense-related R&D spending.
During World War I, for example, the U.S. Army Research Institute developed rudimentary data-mining technology, the forerunner to today's BI systems. Using a combination of punch-card machines, a precursor to the modern computer, and pencil and paper, the army sifted through and analyzed the mass of data from the standardized IQ tests it had administered to its troops.
Cryptography was another wartime pursuit that evolved into the advanced security algorithms used today to protect enterprise data. The successful British effort to break the German Enigma cipher in World War II had its origins in the 1920s, when military technologists started thinking about how to overcome the radio communication security problems of World War I. "It was the start of moving cryptography into an automated environment, instead of some guy doing the coding by hand," Lewis says.
But it was in World War II that the development of core computing technologies that would later power the enterprise really got cooking. On both sides of the Atlantic, attention focused on developing powerful calculating machines for various wartime tasks, machines that would later evolve into the ENIAC (Electronic Numerical Integrator and Computer)- and UNIVAC (Universal Automatic Computer)-style computers of the 1950s.
To crack the German codes, the British developed a machine called the Turing Bombe -- essentially a mechanical ASIC (application-specific integrated circuit). According to Lewis, the machine was later turned over to the Americans, who tried to figure out how to advance it. "That's where you began to see some of the early, early computers."
More importantly, the U.S. government set up wartime institutions like the National Defense Research Committee and the Office of Scientific Research and Development. "In WWII they learned that better technology gives you an advantage, and innovation is the key and happens continuously," explains Lewis. "The guy who has the best technology wins."
Once the war ended, these institutions morphed into better-known agencies like the Advanced Research Projects Agency (ARPA, later renamed the Defense Advanced Research Projects Agency, or DARPA), the Office of Naval Research (ONR), and the Department of Energy's national laboratories (such as Los Alamos, Lawrence Livermore, and Oak Ridge), which were heavily funded by the Defense Department to improve nuclear weapons and related computing technology. There were also federally funded research and development centers such as MIT's Lincoln Laboratory, the Institute for Defense Analyses (IDA), and the MITRE Corporation, a Lincoln Laboratory spinoff.