Interview with Woz: To innovate, get personal

The Apple co-founder explains why being human is key to good tech and why technology alone won't fix our schools

Many of us in the tech press say that we've hit a lull in technology innovation after an amazing decade-long run of truly disruptive technologies, from cloud computing to social networking, from mobile devices to voice services. Steve "Woz" Wozniak is not so sure, but he does believe that innovations can't be scheduled or even predicted with any certainty. They take off only when many factors come together.

Wozniak, of course, knows a little something about innovation. The 63-year-old engineer is a co-founder of Apple and helped invent personal computing in the forms of the Apple II and Macintosh. He has been an adviser to and sounding board for Apple over much of its history, as well as a consultant to other tech companies. He founded a company called CL 9, which in 1987 produced the first programmable universal TV remote control. He has been an elementary school teacher. Now he is chief scientist at Fusion-io and speaks on technology and innovation throughout the world; this week, he's a featured speaker at the Apps World conference in San Francisco.

There's no question that companies are doggedly pursuing the next big thing in technology, whatever that may be. For example, "everyone is talking about wearable computing. There are about 30 companies that seem to be doing the same thing. But nothing seems to be pointing to the right way," Wozniak says. One reason is simple: "You tend to deal with the past," replicating what you know in a new form. Consider the notion of computing eyewear like Google Glass: "People have been marrying eyewear with TV inputs for 20 years."

What it takes for innovation to take root
What does it take for technology innovation to flower in the same way the PC did in the mid-1980s, the Internet in the early 2000s, smartphones in the late 2000s, and cloud computing, social networking, and tablet computing in the early 2010s? For one, "the enabling technology has to become cheap enough," Wozniak says.

For example, you can buy tiny projectors today, but they're useful only if you have a blank wall to project onto -- they're basically refinements of the projectors we've used for decades. But imagine if a tiny projector, perhaps built into your smartphone, could project holographic images, à la the "Star Wars" movies. That would be a big shift: You could then project anywhere you are, whether to show video or to conduct a virtual face-to-face meeting. "It's too expensive to do that today -- we need to wait until it becomes affordable. But you can't predict when that happens." That's why companies like Apple, Google, Microsoft, and IBM have all sorts of research projects going on: to see whether enabling technologies can be made affordable, and to pounce on them when they are.

Of course, having the enabling technology is not enough. Wozniak reminded me of the Segway scooter, promised at its 2001 debut as the next revolution in urban transportation: a personal vehicle that could go greater distances than a bicycle but was easier to use than a motorcycle and took much less road space than a car. Segways are used today only in limited ways, such as for city tours. They never quite clicked, so society didn't adopt them or create the pressure to change the other key factors, such as making them street-legal or providing standard insurance.
