AI could gobble up a quarter of all electricity in the U.S. by 2030 if it doesn’t break its energy addiction, says Arm Holdings exec

Before artificial intelligence can transform society, the technology will first have to learn how to live within its means.

Right now, generative AI has an “insatiable demand” for electricity to power the tens of thousands of compute clusters needed to operate large language models like OpenAI’s GPT-4, warned Ami Badani, chief marketing officer at chip design company Arm Holdings.

If generative AI is ever going to run on every mobile device, from laptops and tablets to smartphones, it will have to scale without overwhelming the electricity grid.

“We won’t be able to continue the advancements of AI without addressing power,” Badani told Fortune’s Brainstorm AI conference in London on Monday. “ChatGPT requires 15 times more energy than a traditional web search.” 
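Badani’s 15x figure can be turned into a rough per-query energy estimate. The sketch below is hypothetical: the ~0.3 Wh baseline for a traditional web search is an assumed, commonly cited estimate, and the query volume is an arbitrary illustration; only the 15x multiplier comes from her remark.

```python
# Hedged back-of-envelope sketch of per-query energy cost.
# Assumptions (not from the article): ~0.3 Wh per web search,
# 10 million queries/day as an illustrative volume.
search_wh = 0.3                      # assumed energy per traditional web search, Wh
chatgpt_wh = search_wh * 15          # Badani: ChatGPT uses 15x a web search
daily_queries = 10_000_000           # hypothetical daily query volume

daily_kwh = chatgpt_wh * daily_queries / 1000  # Wh -> kWh

print(f"Energy per ChatGPT query: {chatgpt_wh:.1f} Wh")
print(f"Daily total at {daily_queries:,} queries: {daily_kwh:,.0f} kWh")
```

At these assumed numbers, the 15x multiplier alone turns a modest per-query cost into tens of megawatt-hours per day.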

Not only are more businesses using generative AI, but the tech industry is in a race to develop new and more powerful tools that will mean compute demand is only going to grow—and power consumption with it, unless something can be done. 

The latest breakthrough from OpenAI, the company behind ChatGPT, is Sora. It can create super-realistic or stylized video clips up to 60 seconds long based purely on user text prompts.

The marvel of GenAI comes at a steep cost

“It takes 100,000 AI chips working at full compute capacity and full power consumption in order to train Sora,” Badani said. “That’s a huge amount.”

Data centers, where most AI models are trained, currently account for 2% of global electricity consumption, according to Badani. But with generative AI expected to go mainstream, she predicts it could end up devouring a quarter of all power in the United States by 2030.
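The scale of that projection becomes clearer with a rough growth calculation. This is a hypothetical sketch: the ~4,000 TWh figure for annual US electricity consumption is an assumption, the article’s 2% share is a global rather than US number, and the six-year horizon is an approximation; only the 2% and one-quarter shares come from Badani.

```python
# Hedged back-of-envelope check of the 2030 projection.
# Assumptions (not from the article): ~4,000 TWh/yr US electricity use,
# 2% share applied to the US as a starting point, ~6-year horizon.
us_twh_per_year = 4000      # assumed annual US electricity consumption, TWh
current_share = 0.02        # data centers' share today (Badani's global figure)
projected_share = 0.25      # projected AI share of US power by 2030 (per Badani)

current_twh = us_twh_per_year * current_share
projected_twh = us_twh_per_year * projected_share

# Implied compound annual growth rate over ~6 years
years = 6
cagr = (projected_twh / current_twh) ** (1 / years) - 1

print(f"Current: {current_twh:.0f} TWh/yr, projected: {projected_twh:.0f} TWh/yr")
print(f"Implied annual growth: {cagr:.0%}")
```

Under these assumptions, the projection implies AI-related consumption growing by roughly half again each year for six straight years, which illustrates why Badani frames power as the binding constraint.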

One solution to this conundrum is to design semiconductor chips optimized to run on minimal energy.

That’s where Arm comes in: its RISC processor designs currently power 99% of all smartphones, in contrast to the rival x86 architecture developed by Intel. The latter has long been the standard for desktop PCs, but proved too inefficient for battery-powered handheld devices like smartphones and tablets.

Arm is adopting that same design philosophy for AI.

“If you think about AI, it comes with a cost,” Badani said, “and that cost is unfortunately power.”  
