ARM Touts Performance Per Watt As New ‘Moore’s Law’


Traditional Moore’s Law is no longer relevant in today’s world, argues ARM, which says ‘performance per watt’ is the better metric

ARM Holdings has warned that ‘Moore’s Law’ is nearing its end, and the time has come for a new metric when discussing computing progress in the years ahead.

In a blog post, ARM fellow and director of technology, Rob Aitken, said the need to decarbonise compute for the sake of the planet means the technology roadmap can no longer prioritise processing power alone.

Moore’s Law has been around since the 1960s. It is not really a law but an observation about integrated circuit (IC) development made by Intel co-founder Gordon Moore in 1965, which then became the company’s mission in the following decades.


Moore’s Law

Intel’s Moore noted that the number of transistors in ICs doubled every two years or so.

Intel’s obsession with maintaining Moore’s Law has seen transistor counts expand from a few thousand on 1970s chips to the massive densities of today.

“Moore’s Law has long acted as a roadmap for the tech industry,” said Aitken. “What started as a prediction – that the number of transistors on a chip would double approximately every two years while the relative cost decreased – became a blueprint for progress, a self-fulfilling prophecy that has seen processors shrink from 2,250 transistors in an area of 12 mm² to current designs with more than a hundred million transistors per square millimeter.”
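Aitken’s figures compound quickly. As a back-of-envelope sketch (not from the article), assuming a strict two-year doubling from the 2,250-transistor count he quotes for the early 1970s:

```python
def transistors(start_count, start_year, year, doubling_years=2):
    """Project transistor count under an idealised two-year Moore's Law."""
    doublings = (year - start_year) / doubling_years
    return start_count * 2 ** doublings

# 25 doublings over 50 years: 2,250 * 2**25 is roughly 75 billion
# transistors, the right order of magnitude for the largest single
# chips of the early 2020s.
print(f"{transistors(2250, 1971, 2021):,.0f}")
```

Real progress was never this smooth, of course; the point is only that a steady doubling produces the ten-order-of-magnitude leap Aitken describes.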

But ARM’s Aitken thinks Moore’s Law is no longer suitable on its own, and that energy must now be part of the equation.

“But, as I wrote back in 2019, Moore’s Law is nearing its end,” wrote Aitken. “The rate of progress is slowing – transistors are getting so small that there are just a few dozen atoms along their gates, and the structure of individual grains of polycrystalline copper is a key consideration in signal timing.”

“When Moore’s Law was originally conceived, computing was relatively new and exciting,” he wrote. “Any correlation between the climate and the energy consumed by computers was not really on anyone’s radar – mostly because while computers were energy-hungry beasts, there were relatively few of them about.”

“The correlation between digital exclusion and social exclusion is well established, yet 3.7 billion people worldwide still do not have full access to digital technology,” wrote Aitken. “Closing the digital divide is a moral imperative, but it also poses a new conundrum for the tech industry: how do we mitigate the environmental impact of 3.7 billion new digital consumers, connecting everyone, everywhere without catastrophically accelerating climate change?”

Koomey’s Law

Aitken therefore said it is clear that the future computing roadmap can no longer focus on increasing processing power alone.

“Squeezing more performance out of the same chips remains a top priority, and performance per watt is where it’s at,” wrote Aitken. “But it’s more than just watts – it’s also energy, the amount of power consumed over time.”

“Both are important but in most situations, one will be more important than the other,” said Aitken. “Data centres as a whole are limited by their ability to dissipate heat energy, while individual servers have power limitations. Mobile devices are limited by the energy stored in their batteries, while their instantaneous power is limited by thermal constraints.”

Aitken pointed out that a sensor running off a solar cell can tap into vast amounts of available energy, but can typically generate only a small amount of power. He said that power, energy, and heat constraints limit all computing devices.
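The power/energy distinction Aitken draws can be made concrete with a small sketch (the numbers below are illustrative assumptions, not from the article): power is an instantaneous rate, measured in watts, while energy is power consumed over time, measured in joules or watt-hours, and each constrains a device differently.

```python
battery_wh = 15.0      # assumed phone battery capacity, watt-hours
chip_power_w = 5.0     # assumed sustained chip power draw, watts

# Energy-limited: how long the battery lasts at that draw.
runtime_h = battery_wh / chip_power_w

# The same stored energy expressed in joules (1 Wh = 3,600 J).
energy_joules = battery_wh * 3600

print(f"runtime: {runtime_h} h, stored energy: {energy_joules:,.0f} J")
```

A data centre faces the mirror image: its instantaneous power (and the heat it must dissipate) is the binding limit, while total energy over time shows up mainly as cost and emissions.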

The goal at a product level is to pack more performance per watt into those limits, while globally it’s to rein those limits in.

“In some ways, this isn’t a new concept,” wrote Aitken. “Koomey’s Law, coined in 2010 and named for Stanford professor Jonathan Koomey, describes a trend in the number of computations per joule of dissipated energy. This number doubled every 18 months from 1945 to 2000 (100x per decade), then slowed, doubling every 2.6 years or so since (16x per decade).”
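Those rates can be sanity-checked: a doubling every d years compounds to 2^(10/d) per decade. An 18-month doubling gives almost exactly the 100x Aitken cites, while a 2.6-year doubling works out nearer 14x than the quoted 16x, the exact figure depending on the doubling period assumed.

```python
def per_decade(doubling_years):
    """Compound growth per decade for a given doubling period in years."""
    return 2 ** (10 / doubling_years)

print(f"18-month doubling: {per_decade(1.5):.0f}x per decade")
print(f"2.6-year doubling: {per_decade(2.6):.0f}x per decade")
```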

Aitken feels that Koomey’s Law reflects the prevailing metric of computing power – peak efficiency initially then, more recently, typical use efficiency – and also the relatively static nature of power budgets.

“We expect our devices to have a certain power profile, even as Moore’s Law increases their compute capability,” he wrote. “Koomey’s Law is arguably more relevant to the way consumers experience computing today – and the way we should be constructing tech roadmaps. Our digital life tends to span multiple devices, where battery life and performance per watt are more important than gross performance alone.”

“Moore’s Law and Koomey’s Law are not laws of nature but observations on technology direction, and we can use them to see where things might be headed,” he added. “Extrapolating Koomey’s Law, for example, we would expect devices to continue to become increasingly power-efficient, with processors so low powered that they could harvest energy from their environment.”

“As a provider of foundational technology, ARM is uniquely placed to reduce the environmental impact of compute while maximizing its benefits,” said Aitken. “We can empower our ecosystem to drive up performance per watt and drive down emissions, helping technology stay on the right side of history as part of the climate solution and a more sustainable future.”

Author: Tom Jowitt