Spotify, Paramount Sign Up To Use Google Cloud ARM Chips

Google Cloud said Spotify and Paramount Global are amongst the first major customers using a new data centre processing chip it designed using ARM technology.

The company announced the Axion chip, its first ARM-based central processing unit, in April of this year. It complements the custom Tensor Processing Units (TPUs) that Google has used to accelerate AI workloads since 2015.

Axion competes with Intel and AMD CPUs, but uses far less power – a key concern as companies build out data centres for electricity-hungry AI applications.

Amazon and Microsoft have also developed ARM-based CPUs that they offer to developers via their cloud infrastructure, while start-up Ampere Computing is selling such chips to Oracle’s cloud unit.

Image credit: Arm

Power efficiency

Google said the chip is about 60 percent more energy-efficient than comparable x86-based processors.

The power saved can be used to drive resource-intensive tasks such as AI computing, the company said.

Before offering Axion to outside customers, Google used it internally to power services including YouTube ads and Google Earth Engine.

Google, Microsoft and Amazon have all reached deals aiming to use nuclear power to secure energy for their data centre needs.

When it introduced Axion, Google also announced a new generation of its TPU chips, which it uses to drive AI tasks.

Intel and AMD have both introduced AI-focused accelerators that aim to compete with those from market leader Nvidia.

AI chips

Nvidia itself announced the next-generation “Blackwell” GPU platform earlier this year.

ARM-based chips are currently deployed alongside Nvidia's processors, but SoftBank, the majority owner of ARM, plans for the company's technology to power AI chips that compete directly with Nvidia's, the Financial Times reported this week.

SoftBank head Masayoshi Son wants to put ARM technology at the centre of a new network of data centres purpose-built to train and run AI systems, the report said.

ARM chief executive Rene Haas has not commented on the plans, but told a Bloomberg event last week: “All those AI workloads are going to run on Arm somewhere, somehow. That’s the reason we spend a lot of time talking to SoftBank about the future.”

Matthew Broersma

Matt Broersma is a long-standing tech freelancer who has worked for Ziff-Davis, ZDNet and other leading publications.
