Google Touts Speedy Custom AI Chips That Trounce GPUs And CPUs

Google has revealed the great strides it has made since it began developing its own in-house custom accelerators for machine learning applications.

Last May Google said that it had started a stealthy project several years ago to develop a custom ASIC, built specifically for machine learning.

Google called it a Tensor Processing Unit (TPU), and said that it had been using them in its data centres for more than a year, and had “found them to deliver an order of magnitude better-optimized performance per watt for machine learning”.

Benchmark Speeds

Now a year later it has conducted some benchmarks for these TPUs, which show the astonishing power of these chips.

Google made the announcement in a blog posting this week, and revealed that its TPUs are 15x to 30x faster than contemporary GPUs and CPUs.

And these devices are also much more energy efficient.

“The TPU also achieves much better energy efficiency than conventional chips, achieving 30x to 80x improvement in TOPS/Watt measure (tera-operations [trillion or 10^12 operations] of computation per Watt of energy consumed),” blogged Norm Jouppi, a distinguished hardware engineer at Google.
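The TOPS/Watt metric Jouppi cites is simply throughput divided by power draw. The sketch below illustrates how such a comparison is computed; the throughput and power figures in it are illustrative placeholders, not Google's published measurements.

```python
def tops_per_watt(ops_per_second: float, watts: float) -> float:
    """Efficiency in tera-operations (10**12 ops) per second per watt."""
    return (ops_per_second / 1e12) / watts

# Hypothetical example numbers for an accelerator vs. a contemporary GPU:
accel_eff = tops_per_watt(ops_per_second=92e12, watts=75)   # ~1.2 TOPS/W
gpu_eff = tops_per_watt(ops_per_second=6e12, watts=250)     # ~0.024 TOPS/W

# With these assumed figures the ratio lands in the 30x-80x range
# Google describes.
print(f"relative efficiency: {accel_eff / gpu_eff:.0f}x")
```

A chip can therefore win on TOPS/Watt either by doing more operations per second or by drawing less power for the same work, which is why a fixed-function ASIC can beat a general-purpose GPU on this measure even without a raw-speed advantage.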

And Jouppi wrote that the neural networks powering these applications require a surprisingly small amount of code: just 100 to 1500 lines.

“The need for TPUs really emerged about six years ago (2011), when we started using computationally expensive deep learning models in more and more places throughout our products,” wrote Jouppi. “The computational expense of using these models had us worried. If we considered a scenario where people use Google voice search for just three minutes a day and we ran deep neural nets for our speech recognition system on the processing units we were using, we would have had to double the number of Google data centres!”

Jouppi also said that the TPUs allowed Google to design products that respond in fractions of a second.

“TPUs are behind every search query; they power accurate vision models that underlie products like Google Image Search, Google Photos and the Google Cloud Vision API; they underpin the groundbreaking quality improvements that Google Translate rolled out last year; and they were instrumental in Google DeepMind’s victory over Lee Sedol, the first instance of a computer defeating a world champion in the ancient game of Go,” he wrote.

Secretive Google

Google has in the past been notoriously secretive about the technology it uses, and even about the design of its data centres.

And it seems unlikely that these TPUs will be appearing outside a Google data centre any time soon.

That said, Google has appeared more open over the past couple of years, and it has promised to provide more details about its TPUs at a later date.

“We’re committed to building the best infrastructure and sharing those benefits with everyone,” wrote Google’s Jouppi. “We look forward to sharing more updates in the coming weeks and months.”


Tom Jowitt

Tom Jowitt is a leading British tech freelancer and long standing contributor to Silicon UK. He is also a bit of a Lord of the Rings nut...
