Google is well-known for building its own server hardware to meet the unique needs of its massive compute network, but that won't always be the case, the head of its infrastructure team said Wednesday.
As cloud deployments get bigger and more widespread, the industry will eventually catch up to Google's style of computing and the company will no longer need to build its own systems.
That's according to Urs Hölzle, the executive in charge of Google's technical infrastructure, in an on-stage interview at the Gigaom Structure conference Wednesday.
Google initially built its own servers just to save money, Hölzle said. It hasn't been short of money for a long time, but it still designs its own servers today.
That's because it thinks about data centers more holistically than most other companies, and designs its servers to work in tandem with the power and cooling equipment to minimize energy costs.
"We were doing our own hardware because ... we were thinking much more about the data center as a computer, rather than a single box as the computer, and that really pushes you in a different direction," Hölzle said.
"With the cloud, these things in the next five to 10 years converge again, because 'normal' workloads -- not just Google-scale workloads -- are going to run on the same infrastructure, and therefore can use the same hardware, pretty much," he said.
The implication for big businesses is that they'll become more Google-like, so equipment vendors will supply them with products that Google can use as well.
That's not to say they'll be running at Google's scale. Take one example: Google provides many of the services that run on Android smartphones, like Gmail and Hangouts.
Almost every Android device has one or two TCP connections open to Google, meaning there are one or two billion active connections hitting Google's servers at any given time, Hölzle said.
That would have been a problem five years ago, he said, but Google has learned a lot in that time about how to use software to scale its infrastructure.
It's now adding new devices to its network, like satellites and connected thermostats.
"It's not actually that scary any more, because if you have 15 years behind you and every year you're growing at 50 percent or 100 percent, by now we've learned how to scale, and I think that will work out fine," Hölzle said.
Maybe one day, "normal" businesses will be that confident as well.