Tech luminaries we lost in 2011

Some helped build an industry, while others helped save lives through technology, but all 13 of these tech pioneers shaped the future

It's been a rough year for the IT industry. The death of Apple co-founder Steve Jobs in October grabbed international headlines. But we also lost other major figures from almost every area of technology, including Xerox PARC founder Jacob E. Goldman, who died in late December. Here's one last look at some of the people who made a big difference.


Dennis M. Ritchie: Godfather of Unix, father of C
September 1941-October 2011
Arguably the most influential programmer of the past 50 years, Dennis Ritchie helped create the Unix operating system and designed the C programming language. And he promoted both, starting in the 1970s.

Ritchie worked closely with Unix designer Ken Thompson starting in 1969, integrating work by other members of the Bell Labs research group. And in 1971, when Thompson wanted to make Unix more portable, Ritchie radically expanded a simple language Thompson had created, called B, into the much more powerful C. Just how influential has all that work been? Unix spawned lookalikes such as Linux and Apple's Mac OS X, which run devices ranging from smartphones to supercomputers. And by one account, eight of today's top 10 programming languages are direct descendants of C.

While Ritchie was serious about Unix and its potential for creating a computing community, he knew better than to take himself too seriously. He quipped that Unix was simple, "but you have to be a genius to understand the simplicity." And Ritchie wasn't above an office prank. In 1989, he and Bell Labs cohort Rob Pike, with the help of magicians Penn and Teller, played an elaborate practical joke on their Nobel Prize-winning boss, Arno Penzias.

Robert Morris: A knack for encryption
July 1932-June 2011
Among the Bell Labs researchers who worked on Unix with Thompson and Ritchie was Bob Morris, who developed Unix's password system, math library, text-processing applications, and crypt function.

Morris joined the Bell Labs research group in 1960 to work on compiler design, but by 1970 he was interested in encryption. He found a World War II U.S. Army encryption machine, the M-209, in a lower Manhattan junk shop. Morris, Ritchie, and University of California researcher Jim Reeds developed a way to break the machine's encryption system and planned to publish a paper on the subject in 1978.

Before they did, they sent a copy to the National Security Agency, the U.S. government's code-breaking arm -- and soon received a visit from a "retired gentleman from Virginia," according to Ritchie. The "gentleman" didn't threaten them, but he suggested discretion because the encryption techniques were still being used by some countries. The researchers decided not to publish the paper -- and eight years later, Morris left to join the NSA, where he led the agency's National Computer Security Center until 1994.

Ironically, it was Morris's son, Robert Tappan Morris, who brought him into the national spotlight: In 1988, the younger Morris, then 22, released an early computer worm that brought much of the Internet to its knees. The senior Morris said at the time that he hadn't paid much attention to his son's interest in programming: "I had a feeling this kind of thing would come to an end the day he found out about girls," he said. "Girls are more of a challenge."

John McCarthy: Intelligence, artificial and otherwise
September 1927-October 2011
He may be best known as the creator of the Lisp programming language and as the father of artificial intelligence (he coined the term in 1956), but John McCarthy's influence in IT reached far beyond would-be thinking machines. For example, in 1957 McCarthy started the first project to implement time-sharing on a computer, and that initiative sparked more elaborate time-sharing projects including Multics, which in turn led to the development of Unix.

In an early 1970s presentation, McCarthy suggested that people would one day buy and sell goods online, which led researcher Whitfield Diffie to develop public-key cryptography for authenticating e-commerce documents. In 1982, McCarthy even proposed a space elevator that was eventually considered by a government lab as an alternative to rockets.

But McCarthy's first love was AI, which turned out to be harder than he first thought. In the 1960s, McCarthy predicted that, with Pentagon funding, working AI would be achieved within a decade. It wasn't -- as McCarthy later joked, real AI would require "1.8 Einsteins and one-tenth of the resources of the Manhattan Project."

Ken Olsen: The digital man
February 1926-February 2011
As an engineer working at MIT's Lincoln Laboratory in the 1950s, Ken Olsen noticed that students lined up to use an outdated computer called the TX-0, even though a much faster mainframe was available. The difference? The mainframe ran batch jobs, while the TX-0 (which Olsen had helped develop as a grad student) allowed online interactivity.

In 1957, Olsen and colleague Harlan Anderson took that insight and $70,000 in venture capital money and started Digital Equipment Corp. (DEC) to make smaller, more interactive machines. The company's PDP minicomputers were inexpensive enough that a corporate department could own one (a PDP-7 was used to develop the first version of Unix at Bell Labs).

Olsen's management approach as CEO -- hire very smart people and expect them to perform as adults -- helped DEC become the second biggest computer maker after IBM. But Olsen was also opinionated and sometimes stubborn. He publicly grumbled about Unix (calling it "snake oil") even as his company sold lots of Unix workstations, and DEC was late to join the move to PCs. DEC's sales declined, and in July 1992, Olsen was forced to resign from the company he co-founded. DEC was sold to Compaq six years later.

Paul Baran: Packet thinker
April 1926-March 2011
Working to make electronic communications bulletproof at the height of the Cold War, Paul Baran developed what would eventually become a core technology of the Internet: packet switching. Baran was a researcher at the Rand think tank in 1961 when he suggested that messages could be broken into pieces, sent to a destination by multiple routes if necessary, and then reassembled upon arrival -- so that communications could keep flowing even if parts of the network were knocked out.
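The mechanism Baran described -- break a message into numbered pieces, let them travel independently, and reassemble them by sequence number at the destination -- can be sketched in a few lines. This is a toy illustration only; real networks add headers, checksums, routing, and retransmission.

```python
import random

def packetize(message: str, size: int = 8) -> list[tuple[int, str]]:
    """Break a message into (sequence_number, chunk) packets."""
    return [(i, message[i:i + size]) for i in range(0, len(message), size)]

def reassemble(packets: list[tuple[int, str]]) -> str:
    """Rebuild the message regardless of the order packets arrived in."""
    return "".join(chunk for _, chunk in sorted(packets))

msg = "Pieces may take different routes and still arrive whole."
packets = packetize(msg)
random.shuffle(packets)  # simulate out-of-order arrival over multiple routes
assert reassemble(packets) == msg
```

Because each packet carries its own sequence number, no single path has to survive intact: the destination can put the message back together from whatever arrives.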

Baran wasn't the only one to think of the idea -- U.K. researcher Donald Davies came up with a remarkably similar idea at about the same time and gave it the name "packet switching." But the U.S. Air Force liked Baran's version of what was essentially an inexpensive, unreliable network with intelligence at the edges. AT&T, the dominant U.S. telephone company, didn't -- it had an expensive, reliable network -- and company engineers publicly scoffed at Baran's idea.

However, packet switching was adopted for ARPAnet, the predecessor to the Internet, and eventually for local-area networks in the form of Ethernet. Today, even phone calls are typically sent in digital packets. (Baran was named a Computer History Museum Fellow in 2005.)

Jean Bartik: Last of the first programmers
December 1924-March 2011
Jean Bartik was the last surviving member of the original programming team for the Eniac, the first general-purpose electronic computer. But that understates her work. Bartik, the only female math graduate in her 1945 college class, was hired to make the physical connections that let the Eniac perform artillery calculations, and she served as a lead programmer on the project. But Bartik also developed circuit logic and did design work under the direction of Eniac's hardware developer, J. Presper Eckert.

After Eniac, Bartik followed Eckert to work on both hardware and software for the commercial Univac I mainframe and the specialized Binac (Binary Automatic Computer). But once the Univac was complete, Bartik retired at age 26 in 1951 to raise a family. She returned to a much-changed IT industry in 1967 and worked as an editor at several analyst companies until she was laid off in 1985, when she was in her 60s.

Jack Keil Wolf: Disk drivin' man
March 1935-May 2011
There's a reason why the amount of information we can store on hard disks keeps growing -- and its name is Jack Wolf. That may be an overstatement, but it's not too much to say that Wolf did more than almost anyone else to use math to cram more data into magnetic drives, flash memory, and electronic communications channels.

Wolf began his professional life as an information theorist, teaching and working at RCA and Bell Labs, with much of his work relating to compressing information. But in 1984, he moved to the new Center for Magnetic Recording Research at the University of California at San Diego. "I knew nothing about magnetic recording," he admitted in a 2010 lecture. "Not only did I not know how to spell 'coercivity,' but the first time I mentioned it in a talk I mispronounced it. But UCSD reluctantly made me an offer as the first faculty member in CMRR."

It was a good choice. Wolf and his students, dubbed the "Wolf pack," cross-pollinated magnetic drive design with information theory, applying compression in increasingly creative ways, and spread Wolf's ideas throughout the industry.

Julius Blank: Silicon machinist
June 1925-September 2011
Silicon Valley had many builders, but one of them literally built some of the high-tech hub's first silicon-making machines. Julius Blank was one of the "Traitorous Eight" engineers who founded Fairchild Semiconductor in 1957. He and his seven colleagues had acquired that unflattering sobriquet because they decided to strike out on their own just a year after Nobel Prize-winning physicist William Shockley had recruited them to create a new kind of transistor at Shockley Labs.

The Eight included future Intel founders Gordon Moore and Robert Noyce, but the lesser-known Blank had skills critical to the new venture: Before going to college, he had been trained as a machinist. Along with eventual venture capitalist Gene Kleiner, Blank built Fairchild's machine shop, created the manufacturing machinery and outfitted the rest of the fab. Within nine months, Fairchild went from occupying an empty building in Mountain View, Calif., to shipping its first transistor.

How well did that first hand-built equipment hold up? In 1962, Fairchild set up its first offshore plant in Hong Kong, and no new equipment was required. "We took the old, ancient equipment from Mountain View," Blank told an interviewer in 2008. "They just put it in crates and shipped it overseas. It came over there rusty, but they just sandblasted it, put a coat of paint on it and put it together. It worked fine."

Robert Galvin: No more mobile monopoly
October 1922-October 2011
Motorola CEO Bob Galvin didn't design the first working handheld mobile phone -- one of his researchers, Marty Cooper, did that in 1973. But Galvin broke AT&T's monopoly on mobile-phone service in the U.S. by demonstrating a Motorola phone at the White House in 1981, spurring then-President Ronald Reagan to push the FCC to approve Motorola's proposal for a competing cellular network, just three years after AT&T had lost its long-distance monopoly.

Galvin, whose father and uncle started the business that would become Motorola, took the company's reins in 1956 and led it for more than three decades. During that time, Motorola went from the car radios and walkie-talkies that the company had been making to microprocessors (including the 68000 chips in early Apple Macintoshes and, later, PowerPC CPUs), TVs, and satellite communication systems.

Galvin also pushed to make Motorola's manufacturing competitive with non-U.S. companies, supporting development of the Six Sigma quality system starting in the 1970s. By the time Galvin retired as Motorola's chairman in 1990, the company dominated the cellphone hardware business.

Gerald A. Lawson: Cartridge creator
December 1940-April 2011
The man who created the first home video-game system that used interchangeable game cartridges wasn't a typical Silicon Valley engineer. Jerry Lawson was six-and-a-half feet tall, weighed more than 250 pounds, and was African-American -- even more of an IT industry rarity in the 1970s than today. Lawson's creation, the Fairchild Channel F, arrived in 1976, a year before Atari's first home game system, and sparked an industry of third-party video games.

That wasn't as simple as it sounds. Lawson, who worked for a succession of government contractors before joining Fairchild Semiconductor, discovered that the biggest challenge with plug-in cartridges was satisfying the FCC's radio-frequency interference requirements. "It was the first microprocessor device of any nature to go through FCC testing," Lawson said in a 2006 interview. "We had to put the whole motherboard in aluminum. We had a metal chute that went over the cartridge adapter to keep radiation in. Each time we made a cartridge, the FCC wanted to see it, and it had to be tested."

The resulting game system was a moderate market success, but its biggest impact was on Lawson's friends at Atari, who rushed their own cartridge-based home system into production. The rise of the video game had begun.
