A surprising thing about Goldman Sachs, one of the globe's most influential investment banks, may be the sheer size of its technology organization, which makes up a major part of its workforce.
Another big surprise from the firm, a major player in one of the most risk-averse industries, may be its philosophy that embraces commodity approaches, open systems and platforms, and ultimately perhaps, the public cloud.
Goldman Sachs has 36,000 employees. About 10,000 of those -- more than 25 percent of the workforce -- work, effectively, in technology.
Of those tech workers, about 6,000 are developers, and the bulk of the remainder work as quantitative strategists on the business side, though they are fundamentally technologists or engineers, according to Donald Duet, global co-chief operating officer of the Goldman Sachs technology division.
The investment bank's systems run on about 500,000 compute cores in total, a figure that has grown by about 150 percent over the last three to four years, said Duet. In terms of data, the company now has more than 30 petabytes under management.
Duet, who appeared at Gartner's Data Center Conference to answer questions about the firm's IT organization, said Goldman Sachs believes strongly in commodity computing and tries to stay away from "the heavier appliance-type models." The company aims "to stick to, as best we can, really, the commodity curve."
The emphasis on commodity hardware is in sync with Goldman Sachs' approach to managing risk.
"We work in an environment where risk is first and foremost the most important factor," Duet said, adding that "even things like cost and efficiency are lower."
Duet was asked by Gartner analyst Ray Paquet how he can justify going to commodity-based environments that are perceived to be cheap and more likely to break -- "and therefore cause more risk?"
"One big aspect of risk that we look at is agility," said Duet. "Having more agility, having more ability to make changes rapidly, having more ability to move quickly, is a great risk mitigator."
Goldman Sachs is a backer of the Open Compute Project, which was created by Facebook. The group is working to reduce hardware and data center costs with a more modular and open approach to hardware and management software.
Duet explained the Open Compute approach this way: "How do we take the paradigms of open source that work so well in the software space, and bring that to specifications in hardware?"
Although a company such as Facebook may be a strange bedfellow for an investment bank, Duet said Open Compute has been very helpful, and Goldman Sachs hopes to go live, "very near term," with a modular data center with servers and racks built to Open Compute standards. The goal is to realize benefits in energy savings and efficiency.
Duet also believes that most computing can be conducted in a public cloud once challenges such as legacy architecture, security, and data migration are resolved. He suggests that widespread use of the public cloud is some five years away.
Patrick Thibodeau covers SaaS and enterprise applications, outsourcing, government IT policies, data centers, and IT workforce issues for Computerworld. Follow Patrick on Twitter at @DCgov, or subscribe to Patrick's RSS feed. His email address is firstname.lastname@example.org.