Google Touts Speedy Custom AI Chips That Trounce GPUs And CPUs


Chip boast – Google claims its own AI chips spared it from having to double the number of its data centres

Google has revealed how far it has come since it began developing its own in-house custom accelerators for machine learning workloads.

Last May, Google said it had started a stealth project several years earlier to develop a custom ASIC built specifically for machine learning.

Google called it a Tensor Processing Unit (TPU), and said that it had been using them in its data centres for more than a year, and had “found them to deliver an order of magnitude better-optimized performance per watt for machine learning”.

Benchmark Speeds

Now, a year later, Google has published benchmark results for these TPUs that show just how powerful the chips are.

Google made the announcement in a blog post this week, revealing that its TPUs are 15x to 30x faster than contemporary GPUs and CPUs.

The chips are also far more energy efficient.

“The TPU also achieves much better energy efficiency than conventional chips, achieving 30x to 80x improvement in TOPS/Watt measure (tera-operations [trillion, or 10^12, operations] of computation per Watt of energy consumed),” blogged Norm Jouppi, a distinguished hardware engineer at Google.
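To make the metric concrete, here is a minimal sketch of how a TOPS/Watt figure and an improvement ratio are derived. The throughput and power numbers below are purely hypothetical illustrations, not figures from Google's benchmarks.

```python
# Minimal sketch of the TOPS/Watt metric Jouppi describes.
# TOPS/Watt = tera-operations per second of throughput / watts consumed.
# All numbers here are hypothetical, not Google's measurements.

def tops_per_watt(ops_per_second: float, power_watts: float) -> float:
    """Return throughput in tera-operations (10**12 ops) per second per watt."""
    return (ops_per_second / 1e12) / power_watts

# Hypothetical accelerator: 90 trillion ops/s at 75 W.
chip = tops_per_watt(90e12, 75.0)       # 1.2 TOPS/W
# Hypothetical conventional chip: 3 trillion ops/s at 150 W.
baseline = tops_per_watt(3e12, 150.0)   # 0.02 TOPS/W

print(f"Accelerator: {chip:.2f} TOPS/W")
print(f"Baseline:    {baseline:.2f} TOPS/W")
print(f"Improvement: {chip / baseline:.0f}x")
```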

And Jouppi wrote that the neural networks powering these applications require a surprisingly small amount of code: just 100 to 1500 lines.
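As a rough illustration of how compact a neural network definition can be, the sketch below builds a toy feed-forward classifier in about a dozen lines using TensorFlow's Keras API. The choice of framework is an assumption for illustration only; this is not the production code Jouppi refers to.

```python
# Toy example only: illustrates that a model definition can be very short.
# This is NOT Google's production code, and the framework choice is assumed.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(256, activation="relu", input_shape=(784,)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()  # the whole definition fits in roughly a dozen lines
```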

“The need for TPUs really emerged about six years ago (2011), when we started using computationally expensive deep learning models in more and more places throughout our products,” wrote Jouppi. “The computational expense of using these models had us worried. If we considered a scenario where people use Google voice search for just three minutes a day and we ran deep neural nets for our speech recognition system on the processing units we were using, we would have had to double the number of Google data centres!”

Jouppi also said that the TPUs allowed Google to design products that respond in fractions of a second.

“TPUs are behind every search query; they power accurate vision models that underlie products like Google Image Search, Google Photos and the Google Cloud Vision API; they underpin the groundbreaking quality improvements that Google Translate rolled out last year; and they were instrumental in Google DeepMind’s victory over Lee Sedol, the first instance of a computer defeating a world champion in the ancient game of Go,” he wrote.

Secretive Google

Google has in the past been notoriously secretive about the technology it uses, and even about the design of its data centres.

And it seems unlikely that these TPUs will be appearing outside a Google data centre any time soon.

That said, over the past couple of years Google has become more open, and it has promised to provide more details about its TPUs at a later date.

“We’re committed to building the best infrastructure and sharing those benefits with everyone,” wrote Google’s Jouppi. “We look forward to sharing more updates in the coming weeks and months.”
