
CONFIRMED: Google bakes custom data centre chips

Not CPUs, but a custom ASIC for machine learning applications

Google's long-rumoured efforts to build its own silicon have come to fruition.

The Alphabet subsidiary today revealed it has baked a custom ASIC it calls a “Tensor Processing Unit” (TPU) and has been using the chips for a year in the machine learning applications that fuel many of its services.

“TPU is tailored to machine learning applications, allowing the chip to be more tolerant of reduced computational precision, which means it requires fewer transistors per operation,” writes Norm Jouppi, a distinguished hardware engineer at Google. “Because of this, we can squeeze more operations per second into the silicon, use more sophisticated and powerful machine learning models and apply these models more quickly, so users get more intelligent results more rapidly.”
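The “reduced computational precision” Jouppi mentions is the now-familiar idea of running neural-network inference with narrower number formats, such as 8-bit integers instead of 32-bit floats. Here's a minimal sketch of that idea in plain NumPy, purely for illustration, not anything TPU-specific (the function names are our own):

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Map float32 weights onto int8 with a single scale factor."""
    scale = np.max(np.abs(weights)) / 127.0                     # largest magnitude maps to 127
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float32 weights from the int8 representation."""
    return q.astype(np.float32) * scale

# Quantising trades a little accuracy for much cheaper arithmetic.
weights = np.random.randn(4, 4).astype(np.float32)
q, scale = quantize_int8(weights)
print("max quantisation error:", np.max(np.abs(weights - dequantize(q, scale))))
```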

Intriguingly, Jouppi says “a board with a TPU fits into a hard disk drive slot in our data center racks.” He also says Google moved “from first tested silicon” to using the silicon in production “within 22 days.”

The TPUs deliver “an order of magnitude better-optimized performance per watt for machine learning,” Jouppi says, or “roughly equivalent to fast-forwarding technology about seven years into the future (three generations of Moore’s Law).”
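Those two figures line up, roughly. A back-of-the-envelope check (our arithmetic, not Google's): three Moore's Law generations at a doubling apiece gives about 8x, in the same ballpark as “an order of magnitude”:

```python
# Back-of-the-envelope check of the "three generations of Moore's Law" framing
# (our arithmetic, not Google's): one doubling per generation, ~2 years apiece.
generations = 3
factor = 2 ** generations      # 8x -- roughly "an order of magnitude"
years = generations * 2        # ~6-7 years, matching Jouppi's comparison
print(f"~{factor}x in about {years} years")
```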

Jouppi also sheds light on all those reports about Google developing silicon, writing “... great software shines brightest with great hardware underneath. That’s why we started a stealthy project at Google several years ago to see what we could accomplish with our own custom accelerators for machine learning applications.”

Google's also playing around with Power9 servers for its bit barns and often makes goo-goo eyes at ARM-powered servers.

TPUs can therefore account for some of Google's job ads seeking silicon designers. But perhaps not all of them, if its custom CPU ambitions are still alive. ®
