A computer science freshman today should, within four years, be able to see the pathway to an exascale system. By the time that same student completes graduate work, the discussion will have turned to zettascale systems, machines one thousand times more powerful.
If high-performance computing maintains its historic pace of development, a zettascale system can be expected around 2030. But no one knows what a zettascale system will look like, or whether one is even possible. Zettascale computing may require entirely new approaches, such as quantum computing.
The White House says it doesn't want to be in an "arms race" in building ever faster computers, and warned in a report a year ago this month that a focus on speed "could divert resources away from basic research aimed at developing the fundamentally new approaches to HPC that could ultimately allow us to 'leapfrog' other nations."
But the United States is in a computing arms race whether it wants one or not. To develop technology that leapfrogs other nations, the United States will need sustained basic research funding as well as the investment required to build an exascale system.
"A lot of countries have realized that one of the reasons the U.S. became so great was because of things like federally funded research," said Luis von Ahn, an associate professor of computer science at Carnegie Mellon University and a staff research scientist at Google, in an earlier interview. "There are a lot of countries that are trying to really invest in science and technology. I think it's important to continue funding that in the U.S. Otherwise it is just going to lose the edge -- it's as simple as that."
5: The United States hasn't explained what's at stake.
President Barack Obama was the first U.S. president to mention exascale computing, but he didn't really explain the potential of such systems.
Supercomputers can help scientists create models, at an atomic level, of human cells and how a virus may attack them. They can be used to model earthquakes and help find ways to predict them, as well as to design structures that can withstand them. They are increasingly used by industry to create products and test them in virtual environments.
Supercomputers can be applied to almost any scientific or engineering problem imaginable, and the more power -- the more compute capability -- the more precise the science.
Today, the United States dominates the market. IBM alone accounts for nearly 45 percent of the system share of the Top 500 systems, followed by HP at 28 percent. Nearly 53 percent of the most powerful systems on the list are in the United States.
At the SC11 supercomputing conference held earlier this month in Seattle, there were 11,000 attendees, more than double the number from five years ago. A key reason: The growing importance of visualization and modeling.
This conference draws people from around the globe because the United States today is the center of high-performance computing, a position the rest of the world is beginning to challenge on the path to exascale.
Patrick Thibodeau covers SaaS and enterprise applications, outsourcing, government IT policies, data centers and IT workforce issues for Computerworld. Follow Patrick on Twitter at @DCgov, or subscribe to Patrick's RSS feed. His email address is firstname.lastname@example.org.