Surprisingly simple coding trick can slash data center energy usage by 30%
Scientists discovered that by adding roughly 30 lines of code to the Linux operating system, they could dramatically reduce the amount of energy that data centers consume.
![3d rendered image of a computer network.](https://cdn.mos.cms.futurecdn.net/VWFzjFQuGnZvgqALnpDqxG-1200-80.jpg)
Researchers in Canada have discovered a method to reduce the energy that some data centers consume by as much as 30%.
In 2022, global electricity consumption by data centers was estimated at between 240 and 340 terawatt-hours (TWh), according to the International Energy Agency (IEA). That is two to three times as much as cryptocurrency mining consumes, while computing as a whole accounts for about 5% of global energy consumption, the scientists said.
What’s more, data center energy consumption is expected to grow further, according to Goldman Sachs, driven by the exponential growth of artificial intelligence (AI).
But researchers at the University of Waterloo say they have developed a simple, low-cost solution that could cut that consumption by almost one-third, and it centers on adding just 30 lines of new code to the Linux operating system.
Improving packet allocation
Nearly all web traffic is routed through data centers, the majority of which use the open-source operating system Linux. Information arrives in "packets," which are then distributed and allocated by the data center's "front end," Martin Karsten, professor of computer science at the University of Waterloo, explained Jan. 20 in a statement.
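In practice, a Linux front-end service typically waits for that packet-driven work with an event loop built on the kernel's epoll interface: the network stack turns arriving packets into readable sockets, and epoll tells the process which connections need attention. The sketch below is a minimal, hypothetical illustration of that pattern (the port number and single listening socket are assumptions for the example), not the software any particular data center runs.

```c
/* Minimal sketch of a packet-driven "front end" event loop on Linux.
 * Hypothetical single listening socket on an arbitrary port; real
 * front-end servers are far more elaborate. */
#include <unistd.h>
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/epoll.h>
#include <sys/socket.h>

int main(void)
{
    /* Listening TCP socket (port 8080 is just an example). */
    int lsock = socket(AF_INET, SOCK_STREAM, 0);
    struct sockaddr_in addr = { .sin_family = AF_INET,
                                .sin_port = htons(8080),
                                .sin_addr.s_addr = htonl(INADDR_ANY) };
    bind(lsock, (struct sockaddr *)&addr, sizeof(addr));
    listen(lsock, 128);

    /* epoll instance: the kernel signals readiness, the app reacts. */
    int epfd = epoll_create1(0);
    struct epoll_event ev = { .events = EPOLLIN, .data.fd = lsock };
    epoll_ctl(epfd, EPOLL_CTL_ADD, lsock, &ev);

    struct epoll_event events[64];
    for (;;) {
        /* Block until arriving packets make one of our sockets readable. */
        int n = epoll_wait(epfd, events, 64, -1);
        for (int i = 0; i < n; i++) {
            if (events[i].data.fd == lsock) {
                /* New connection: register it with epoll too. */
                int c = accept(lsock, NULL, NULL);
                struct epoll_event cev = { .events = EPOLLIN, .data.fd = c };
                epoll_ctl(epfd, EPOLL_CTL_ADD, c, &cev);
            } else {
                /* Data arrived: read it and hand it to application logic. */
                char buf[4096];
                ssize_t r = read(events[i].data.fd, buf, sizeof(buf));
                if (r <= 0)
                    close(events[i].data.fd);
                /* ... otherwise process the request ... */
            }
        }
    }
}
```

Each pass through epoll_wait() is where the kernel's interrupt handling meets the application's processing, which is roughly the boundary the Waterloo change targets.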
Karsten and the study's co-author, computer science graduate student Peter Cai, devised a small change to make data processing more efficient. The method was first outlined in a study presented in December 2023 in the journal Proceedings of the ACM on Measurement and Analysis of Computing Systems (POMACS) — but the code itself was published this month as part of Linux version 6.13.
"We rearranged what is done and when, which leads to much better usage of the data center’s CPU caches. It’s kind of like rearranging the pipeline at a manufacturing plant, so that you don’t have people running around all the time," Karsten said in the statement.
He teamed up with Joe Damato, distinguished engineer at Fastly, the cloud computing services provider, to develop a small section of code — approximately 30 lines — that would improve Linux’s network traffic processing.
The method identifies and quantifies the direct and indirect costs of asynchronous hardware interrupt requests (IRQs), the signals network hardware sends to tell the processor that packets have arrived, as a major source of overhead. It also proposes that a small modification to the Linux kernel would significantly improve the efficiency and performance of traditional kernel-based networking, by up to 45%, without compromising operational effectiveness.
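The kernel change that grew out of this work is, per the Linux 6.13 networking changes, an "IRQ suspension" mechanism layered on top of epoll-based busy polling: while an application keeps up with incoming traffic by polling, the hardware interrupts that would otherwise fire for each batch of packets can stay suspended. The snippet below is a hedged sketch of only the generic, application-side busy-poll configuration that this builds on (the EPIOCSPARAMS ioctl has been available since around Linux 6.9); the parameter values are illustrative rather than the researchers' settings, and the per-queue irq_suspend_timeout itself is configured separately through the netdev netlink interface, which is not shown here.

```c
/* Hedged sketch: generic epoll busy-poll configuration that the
 * IRQ-suspension mechanism builds on. Needs uapi headers from a
 * recent kernel (roughly 6.9+) for struct epoll_params/EPIOCSPARAMS.
 * Values are illustrative only, not the researchers' settings. */
#include <stdio.h>
#include <sys/ioctl.h>
#include <linux/eventpoll.h>   /* struct epoll_params, EPIOCSPARAMS */

int enable_busy_poll(int epfd)
{
    struct epoll_params params = {
        .busy_poll_usecs  = 200, /* poll the NIC up to 200 us per epoll_wait */
        .busy_poll_budget = 64,  /* process at most 64 packets per poll */
        .prefer_busy_poll = 1,   /* favor polling over per-packet interrupts */
    };

    /* Attach busy-poll parameters to this epoll instance; with the
     * Linux 6.13 work, pairing this with a per-queue irq_suspend_timeout
     * lets interrupts stay off while the application keeps up. */
    if (ioctl(epfd, EPIOCSPARAMS, &params) < 0) {
        perror("ioctl(EPIOCSPARAMS)");
        return -1;
    }
    return 0;
}
```

An epoll file descriptor created with epoll_create1(), like the one in the earlier event-loop sketch, would be passed in before the loop starts; a program that never opts in keeps the default interrupt-driven behavior.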
"All these big companies — Amazon, Google, Meta — use Linux in some capacity, but they’re very picky about how they decide to use it," said Karsten in the statement. "If they choose to 'switch on' our method in their data centers, it could save gigawatt hours of energy worldwide. Almost every single service request that happens on the Internet could be positively affected by this."
Ruari McCallion has been writing about manufacturing, supply chains, automation and related topics for over a quarter of a century. His reports, articles and commentaries have been published in newspapers, magazines and online in the UK and across the world. He has been a contributing editor of PETplanet Insider since 2008, editor of the UK Manufacturing Review and is a founding director of Industrio Ltd, which provides content for companies involved in manufacturing and associated activities.