IBM battles dire mainframe myths

Mainframes still pack a ton of bang for your buck, says IBM's chief architect for cloud computing

The perception of mainframe technology as outmoded or inefficient is wildly inaccurate in a number of important ways, according to IBM's chief architect for cloud computing, Frank DeGilio.

In a presentation at the Share user conference this week entitled "Hex, Lies and Videoblogs," the IBM mainframe expert argued that the conventional wisdom on big iron is plagued by several myths.



  • The price is wrong

"When people talk about the cost of computing, what they're generally thinking about is the cost of the hardware and the cost of the software," DeGilio said. However, this ignores several other major expenses. Particularly for large-scale infrastructures, management complexity and personnel costs are often critical components of a system's final price tag.

As infrastructures expand, the number of people required to run a distributed system tends to remain significantly higher than for a comparable mainframe-based alternative, he asserted.

  • Ancient languages and old frameworks

According to the IBM expert, the notion that mainframes deal only in outdated programming languages like Cobol and assembly is also a myth. Java EE (J2EE), Linux, and other modern platforms and open standards are widely supported, though Cobol remains important.
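As a small illustration of that point: Java written for any platform compiles and runs unchanged on a z/OS JVM, since the language and standard library are platform-neutral. This is a generic sketch, not an IBM-specific API; the class and method names are invented for the example.

```java
// Minimal sketch: nothing in this program is mainframe-specific,
// yet it runs as-is on a z/OS JVM, a Linux server, or a laptop.
public class PortabilityDemo {
    // Build a greeting for whatever platform the JVM reports.
    static String greet(String platform) {
        return "Hello from " + platform;
    }

    public static void main(String[] args) {
        // On a mainframe JVM, os.name reports "z/OS"; elsewhere,
        // "Linux", "Mac OS X", and so on.
        System.out.println(greet(System.getProperty("os.name")));
    }
}
```

The same portability argument applies to Linux workloads, since Linux distributions run natively on IBM Z hardware.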

What's more, he added, there's nothing outdated about the way mainframes handle workload management. In fact, their ability to fine-tune resource allocation based on application need is far more granular and sophisticated than that of most distributed systems.

  • Break, don't bend

That same highly advanced ability to balance workloads, DeGilio said, gives the lie to the idea that mainframes are inflexible. Moreover, the very concept of capacity upgrade on demand was "pioneered" by the mainframe, he noted.

  • Slow and steady wins nothing

Benchmarks that demonstrate an ostensible performance superiority for distributed systems can be misleading, according to DeGilio. Once again, the mainframe's flexibility means that its speed in handling multiple real-world tasks is greater than what would be indicated by a rote test of a single activity, he argued. "All computers wait at the same speed," a presentation slide read.

While DeGilio said that no single model is a panacea for infrastructure needs ("You shouldn't expect everything to run in a mainframe," he cautioned), there's apparently no shortage of applications for the technology even in today's IT climate.

If the IBM expert is correct about the mainframe's future, however, the results of a recent Computerworld survey may point to a conundrum ahead: nearly half of the IT professionals polled said they had already noticed a shortage of trained Cobol programmers.

Email Jon Gold at jgold@nww.com and follow him on Twitter at @NWWJonGold.


This story, "IBM battles dire mainframe myths" was originally published by Network World.
