Google tech guru Urs Hölzle explained the company’s plan to beat one of the oldest laws of technology
‘That is a big problem for us internally, but it’s a much bigger problem for the IT space overall,’ warns the tech guru of Moore’s Law slowdown
For the last few years, there’s been increasing evidence that we’re reaching the limits of Moore’s Law, the prediction made by Intel co-founder Gordon Moore in 1965 that computing power will double every two years or so.
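The stakes of that doubling are easy to see with some back-of-the-envelope math; here is a minimal Python sketch (the specific numbers are illustrative, not from the article):

```python
# Illustrative only: how fast computing power compounds under Moore's Law,
# and how much a slowdown in the doubling period costs over a decade.
def moores_law_factor(years, doubling_period=2):
    """Relative computing power after `years`, doubling every `doubling_period` years."""
    return 2 ** (years / doubling_period)

print(moores_law_factor(10))                     # 32.0 -- a 32x gain in ten years
print(moores_law_factor(10, doubling_period=4))  # ~5.66x if doubling slows to every four years
```

Stretching the doubling period from two years to four doesn’t halve the decade-long gain, it cuts it by more than 80%, which is why even a modest slowdown hits infrastructure planning so hard.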
That’s a potentially huge problem for companies like Google, which needs ever more computing brawn to power its various web services, even as it looks to next-generation problems like artificial intelligence.
On stage at the Structure conference on Tuesday, Google’s 8th employee and all-around tech guru Urs Hölzle explained the steps Google is taking to circumvent the limits of Moore’s Law — and why he thinks Google Cloud can serve as a kind of escape hatch for businesses struggling with the same issue.
“Moore’s Law is slowing down for a number of reasons,” Hölzle says. “That is a big problem for us internally, but it’s a much bigger problem for the IT space overall.”
It’s a well-timed message, as Google redoubles its efforts to topple Amazon’s massive lead in the cloud computing market. In fact, almost exactly a year ago at last year’s Structure, Hölzle said he thought cloud computing could eventually generate more revenue for Google than the advertising business that currently provides the majority of its revenue.
Essentially, Hölzle says, as Moore’s Law slows down, it means that IT departments are seeing their costs go up: As companies come to depend more on their computing infrastructure, and as they look to do heavy-duty analysis on their business data, the IT departments need more servers just to meet demand.
And running all that infrastructure isn’t getting cheaper or more efficient as quickly as it used to, he says. More servers mean more room, more energy, and more manpower to manage. Google, though, has more resources and engineering ability to throw at the problem than your average IT department.
The idea, Hölzle explains, is that if you can’t count on standard processors doubling in speed every two years, you need to build systems designed to squeeze out performance gains for specific tasks. Even if a new system is only 30% more efficient, you need to take your wins where you can get them.
“If you see a 30% opportunity, you have to take it,” Hölzle says.
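Those 30% opportunities add up faster than they might seem; a short Python illustration (the scenario of stacking three such wins is ours, not Hölzle’s):

```python
# Hypothetical illustration: several modest, independent efficiency wins
# compound multiplicatively, not additively.
def compounded_speedup(gains):
    """Overall speedup from a list of fractional gains, e.g. 0.3 for a 30% win."""
    total = 1.0
    for g in gains:
        total *= 1.0 + g
    return total

print(compounded_speedup([0.3, 0.3, 0.3]))  # ~2.2x -- three 30% wins beat one doubling
```

Three separate 30% improvements yield roughly a 2.2x speedup overall, which is more than the 2x a single Moore’s Law doubling would have delivered.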
For instance, Google is building custom chips, called TPUs (Tensor Processing Units), designed for artificial intelligence, as standard processors fall short of the task. Microsoft is building out similar specialized hardware, using chips called FPGAs, toward similar ends.
First, this increases Google’s performance for its own services, which matters given its huge bet on artificial intelligence. Critically, though, it also means that customers of Google Cloud, where companies offload huge chunks of their server infrastructure to the search giant, can reap the benefits for their own AI tasks.
In other words, as Google works around the clock to beat Moore’s Law and squeeze out efficiencies where it can, its customers get that new technology as well. Every time Google figures out a better way to do things, Hölzle says, its customers benefit.
“In the cloud, it’s much easier to sort of insert new technology,” Hölzle says.