IBM pitches a supercomputing Fastball

Big Blue project achieves 102GB per second of sustained read/write performance

IBM on Thursday said it had developed technology to speed up the way large computer networks access and share information.

Under a project code-named "Fastball," IBM's ASC Purple supercomputer has been able to achieve 102GB per second of sustained read-and-write performance to a single file -- the equivalent of downloading 25,000 songs in a second over the Internet, according to IBM.
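IBM's songs-per-second comparison can be sanity-checked with quick arithmetic. The snippet below assumes a decimal gigabyte (10^9 bytes) and works backward to the implied average song size; both assumptions are ours, not stated in the article.

```python
# Back out the song size implied by IBM's comparison:
# 102 GB/s of throughput vs. 25,000 songs per second.
throughput_bytes = 102 * 10**9      # assuming GB = 10**9 bytes
songs_per_second = 25_000
implied_song_size_mb = throughput_bytes / songs_per_second / 10**6
print(implied_song_size_mb)          # ≈ 4.08 MB per song
```

An average of roughly 4MB per song is consistent with a typical compressed music file, so the comparison holds up.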

IBM's General Parallel File System (GPFS) software was used to manage the transfer of data between thousands of processors and disk storage devices. IBM said it had to enhance the software in several areas to handle such fast data rates.

For example, it employed new fencing techniques to prevent individual hardware failures from causing the overall system to fail and added new capabilities to orchestrate flow control between all of the different hardware components in the system.
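The idea behind fencing is that a suspected-failed node must have its access to shared storage revoked before its workload is recovered elsewhere, so a half-dead node cannot issue stale writes. The sketch below illustrates the concept only; the class and method names are hypothetical and are not GPFS APIs.

```python
# Illustrative I/O fencing (our sketch, not IBM's implementation):
# a controller tracks which nodes may write, and a fenced node's
# writes are rejected rather than corrupting shared disks.

class DiskController:
    def __init__(self):
        self.allowed_nodes = set()

    def grant(self, node):
        self.allowed_nodes.add(node)

    def fence(self, node):
        # Revoke access; any later write from this node is refused.
        self.allowed_nodes.discard(node)

    def write(self, node, data):
        if node not in self.allowed_nodes:
            raise PermissionError(f"node {node} is fenced")
        return len(data)


controller = DiskController()
controller.grant("node-7")
controller.fence("node-7")          # node-7 is suspected to have failed
try:
    controller.write("node-7", b"stale data")
except PermissionError:
    print("stale write blocked")
```

The key property is that fencing isolates the failure: the rest of the system keeps running while the fenced node's work is safely reassigned.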

"If they all go real fast at the same time you get a traffic jam and performance goes down," said Chris Maher, director of high-performance computing development for IBM's Systems and Technology Group.
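One common way to orchestrate that kind of flow control is a credit scheme: a shared pool of transfer credits caps how many components stream data at once, applying back-pressure to the rest. This is a generic sketch of the technique, not a description of IBM's actual design.

```python
# Illustrative credit-based flow control (a generic technique,
# assumed here for the sake of example): senders must acquire a
# credit before transmitting, which prevents the "traffic jam"
# of every component going full speed at the same time.

class CreditPool:
    def __init__(self, credits):
        self.credits = credits

    def acquire(self):
        if self.credits == 0:
            return False            # sender must wait (back-pressure)
        self.credits -= 1
        return True

    def release(self):
        self.credits += 1


pool = CreditPool(credits=2)
senders = ["ctrl-0", "ctrl-1", "ctrl-2"]
admitted = [s for s in senders if pool.acquire()]
print(admitted)                     # only two senders may transmit at once
```

When a transfer finishes, the sender releases its credit and a waiting component can proceed, keeping aggregate load below the point where throughput collapses.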

ASC Purple, the world's third most powerful supercomputer according to the Top500 list (http://www.top500.org/), is housed at Lawrence Livermore National Laboratory (LLNL), where the Fastball project's capabilities were demonstrated. IBM supplied the computer to the U.S. Department of Energy and LLNL for use in nuclear weapons research.

The Fastball project combined IBM servers, a high-performance computing switch network, and storage subsystems tied together through the enhanced version of the GPFS software. IBM used 416 individual storage controllers combined with 104 Power-based eServer p575 nodes.

In the Fastball demonstration, 1,000 clients requested a single file at the same time. Through virtualization techniques, the software then spread that file across hundreds of disk drives. The resulting file system was 1.6 petabytes in size.
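Spreading one file across many disks is typically done by striping: the file is cut into fixed-size blocks that are distributed round-robin, so many clients can read different blocks in parallel. The sketch below is a simplified illustration of that idea; the block size and layout are our assumptions, not GPFS internals.

```python
# Simplified round-robin striping, in the spirit of what a parallel
# file system like GPFS does: map each block of a file to a disk so
# reads of different blocks can proceed on different disks at once.

def stripe(file_size, block_size, num_disks):
    """Return a mapping from block index to the disk storing it."""
    num_blocks = -(-file_size // block_size)    # ceiling division
    return {block: block % num_disks for block in range(num_blocks)}


# A 10MB file in 1MB blocks across 4 disks (illustrative numbers).
layout = stripe(file_size=10 * 2**20, block_size=2**20, num_disks=4)
print(layout)   # {0: 0, 1: 1, 2: 2, 3: 3, 4: 0, 5: 1, 6: 2, 7: 3, 8: 0, 9: 1}
```

With hundreds of disks in the layout, a thousand clients requesting the same file mostly hit different spindles, which is what makes the aggregate 102GB-per-second figure possible.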

Researchers on the project say this kind of computing could be applied to a range of data-intensive applications.

"You can imagine the kind of problems you can solve with this, like a tsunami warning device that would scrutinize huge amounts of information from the ocean and then analyze that quite quickly, or for homeland security applications, where you need to scan images of people and match those images against large databases," Maher said.

Other applications could include medical research and online gaming, Maher said.

A future area of focus for the researchers is automatically matching appropriate storage resources to data as it is generated, based on predefined policies, Maher said.
