Strohmaier: We've seen record turnovers a few times in the last three or four lists; that's a reflection of the market adopting the new quad-core processors. Quad-core is now the dominant architecture in terms of how many cores are used, and it became dominant very quickly. Many of these quad-cores are Intel Harpertown (the Xeon 5400 series); there are already more Harpertown systems on the list than Clovertown (the earlier Xeon 5300 series). It shows that our supercomputing community is ready to use those processors, and Linpack (the benchmark used to rank the supercomputers) can use a lot of features of the Harpertown and Clovertown quad-cores.
IDGNS: Intel seems to be increasingly dominant on the list. Is that because AMD's quad-core chips were delayed coming to market?
Strohmaier: Yes, I certainly agree with that. When AMD came out with their dual-core processors they had a head start compared to Intel and gained a larger share of the list. In the last year to a year and a half that has reversed, and Intel's share has grown. One reason has been the delays in AMD's quad-cores; the other is that with Clovertown, Intel introduced four floating-point operations per cycle per core. AMD was late doing that; they do it now with the new quad-core, but they didn't do it with the dual-core. The Linpack benchmark and applications similar to it can use this four-floating-point feature, so those systems show up better on the list.
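The peak-rate advantage Strohmaier describes is simple arithmetic: theoretical peak equals cores × clock speed × floating-point operations per cycle per core. A minimal sketch of that calculation follows; the clock speeds and core counts used here are illustrative assumptions, not figures from the interview.

```python
# Theoretical peak performance in GFLOPS:
#   peak = cores * clock (GHz) * FLOPs per cycle per core
def peak_gflops(cores, clock_ghz, flops_per_cycle):
    return cores * clock_ghz * flops_per_cycle

# Illustrative comparison (chip parameters are assumptions, not from the interview):
# a quad-core issuing 4 FP ops/cycle/core vs. a dual-core issuing 2.
quad = peak_gflops(cores=4, clock_ghz=2.33, flops_per_cycle=4)
dual = peak_gflops(cores=2, clock_ghz=2.33, flops_per_cycle=2)
print(f"quad-core peak: {quad:.2f} GFLOPS")  # roughly 4x the dual-core figure
print(f"dual-core peak: {dual:.2f} GFLOPS")
```

This is why, on a benchmark like Linpack that can keep the floating-point units busy, doubling both the core count and the FLOPs-per-cycle rate compounds into a large gap in measured performance.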
IDGNS: Was it a scramble to get the results in on time? Some people wondered if Roadrunner would be ready.
Strohmaier: For Roadrunner it wasn't too much of a scramble; they submitted it in time. But they still haven't used the full machine. The machine is in 18 segments and they used only 17 of those, so they still have room to grow in terms of doing a new measurement and squeezing out a little more. It was amazing they managed to do the petaflop.
IDGNS: Why did you start the list? Is it just for fun, or does it serve another purpose?
Strohmaier: It was fun, and also a way to get a handle on market shares for supercomputers. My colleague Professor Hans Meuer started doing statistics in the late 1980s. That was the golden age of vector systems, so it was easy to count supercomputers: you just counted the vector systems. Then in the early 90s, when the first parallel systems were becoming important, that method didn't work, so we scratched our heads and asked, 'What is the definition of a supercomputer?' We wanted a definition that would scale over time, because performance scales so quickly -- it has scaled 10,000-fold since we started the list. So we said, 'OK, let's pick a fixed number of computers that we know are supercomputers,' and there were 500 vector systems at the time. That's why we picked the number 500.
IDGNS: Have you thought about making the list longer or shorter?