IBM is making its Blue Gene supercomputer, ranked the fastest in the world, available on demand so that high-performance computing customers can get the processing power they need when they need it without having to worry about high upfront costs or management headaches.
IBM has been offering supercomputing on demand for nearly two years, but on traditional hardware based on Xeon, Opteron, and Power processors. Friday’s announcement marks the first time end users will have access to Blue Gene, a supercomputing system designed to deliver extreme processing muscle at a fraction of the size and power consumption of comparable high-end systems.
End-users can tap into Blue Gene to run Linux-based workloads via a dedicated VPN into a new Deep Computing Capacity on Demand Center in Rochester, Minn. IBM develops and builds Blue Gene in Rochester, said David Gelardi, vice president of Deep Computing Capacity on Demand at IBM.
“This is where the manufacturing and development of Blue Gene largely takes place, so we wanted to center the location of the physical resource with the best possible skills inside IBM. And then we’ll supplement them with application experts, who come from all over the company,” Gelardi said. “So we’re going to centralize the building of this new ecosystem for Blue Gene all around the Rochester facility.”
Blue Gene is unique, made up of specially designed Power-based nodes that include only a processor and a small amount of memory. The key feature of Blue Gene is that it is extremely dense: a single rack includes 1,024 dual-processor nodes that can reach peak performance of 5.7 teraflops.
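Some back-of-the-envelope arithmetic on those rack figures, sketched below in Python: the node count and peak rating come from the announcement, while the per-processor number is simply derived from them and is not a figure IBM quoted.

```python
# Derive per-processor peak performance from the per-rack figures
# quoted in the article (1,024 dual-processor nodes, 5.7 teraflops).

nodes_per_rack = 1024          # dual-processor Blue Gene nodes in one rack
procs_per_node = 2
peak_tflops = 5.7              # quoted peak performance of a single rack

procs_per_rack = nodes_per_rack * procs_per_node
gflops_per_proc = peak_tflops * 1000 / procs_per_rack

print(f"{procs_per_rack} processors per rack")              # 2048
print(f"~{gflops_per_proc:.2f} GFLOPS peak per processor")  # ~2.78
```

The modest per-processor figure is the point of the design: density and power efficiency, not single-chip speed, are what push the rack to its aggregate rating.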
The system is well suited for applications in the life sciences, such as genome research; in the petroleum industry, for seismic analysis and reservoir modeling; and in automotive and aerospace manufacturing, for computer-aided engineering.
“We’re looking at some more traditional commercial things like risk analysis, so there might be some analytics and business intelligence applications,” said Gelardi, who added that a single rack will be available at the Rochester facility initially, with processors added over time as demand grows.
Lawrence Livermore National Laboratory in Livermore, Calif., has been running a 32,000-processor Blue Gene system since December and is in the process of doubling its size.
In November, IBM made a commercial version of Blue Gene available for about $2 million to customers that want to bring the processing powerhouse into their data centers. By adding Blue Gene to its supercomputing-on-demand portfolio, IBM is giving users a more economical entryway into the system, Gelardi said.
“Customers can either buy a machine or they can buy capacity on demand, or both,” he said. “It depends on whether they want a little bit of Blue Gene all the time or a lot of Blue Gene some of the time.”
Many of the applications suited to Blue Gene -- in the life sciences, for example -- go through peaks and troughs in demand, making an on-demand arrangement more cost-effective, Gelardi said.
IBM has three other Deep Computing Capacity on Demand Centers -- in Poughkeepsie, N.Y.; Houston; and Montpellier, France -- giving end-users access to more than 5,200 CPUs in Intel-, AMD- and Power-based clusters. The systems run Linux, Windows and AIX. The cost to tap into the systems is about 50 cents per CPU per hour, Gelardi said.
He said the per-CPU cost of the Blue Gene system will be similar, though the exact cost will differ because of the power of the Blue Gene processors.
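That trade-off between buying and renting can be roughed out from the numbers in the article: the roughly $2 million commercial system price and the roughly 50-cents-per-CPU-per-hour on-demand rate (quoted for IBM's existing centers, with Blue Gene pricing said to be similar). The sketch below ignores operating costs, support contracts, and any volume discounts, so it is only an illustration of the "a little bit all the time versus a lot some of the time" calculus, not a real pricing model.

```python
# Rough break-even between buying a Blue Gene system (~$2M, per the
# article) and renting equivalent capacity on demand (~$0.50 per CPU
# per hour, the rate quoted for IBM's existing on-demand centers).
# All figures are approximations from the article; ongoing operating
# and support costs are deliberately ignored.

purchase_price = 2_000_000       # USD, commercial Blue Gene system
rate_per_cpu_hour = 0.50         # USD, approximate on-demand rate
cpus = 2048                      # processors in a single rack

hourly_rental = rate_per_cpu_hour * cpus            # $1,024 per hour
break_even_hours = purchase_price / hourly_rental   # ~1,953 hours

print(f"Renting a full rack: ${hourly_rental:,.0f}/hour")
print(f"Break-even vs. purchase: ~{break_even_hours:,.0f} hours "
      f"(~{break_even_hours / 24:.0f} days of continuous use)")
```

Under these assumptions a customer running the full rack around the clock would pass the purchase price in under three months, while one with bursty, occasional workloads could rent for years before matching it -- exactly the split Gelardi describes.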
IBM is also offering Blue Gene to ISVs and end users as a platform to test-drive and tune applications for the system. In many cases, customers in the life sciences, financial, and manufacturing industries, which would need such processing power, write their own custom applications, analysts say.
“Having the remote access capabilities to Blue Gene enables companies to test their own workloads so they can have a real-world demonstration of performance. And then they can do capacity on demand,” said Stacey Quandt, an analyst at Robert Frances Group. “The coupling of those two offerings makes this cost-effective and provides a level of innovation that has not been available before to enterprise customers.”