As artificial intelligence powers more of the technologies we rely on, one expert is warning: “We do not have the energy today to support the demand.”
In an interview with The Cool Down, Dr. Chris Mattmann, UCLA’s chief data and AI officer and the author of one of the leading books on AI and machine learning, explained why AI will demand so much energy, but he also argued that it’s already improving our modern standard of living. The bigger focus, he argued, should be how to responsibly power AI — rather than whether it should exist at all.
He also responded to the surprise announcement from Chinese startup DeepSeek, which claimed it had created an AI model at a fraction of the usual price and power, with only minor sacrifices in the thoroughness of its results.
Hot take: AI is just search on steroids
“The future is [that] every search will be AI-enabled … [and] it will just be called ‘search,'” Mattmann said. As the technology improves, it will be much faster and better than what you experience today, which he equates to the seventh page of Google (read: not always great).
“The reality is … the future is [that] this [AI] is integrated into everything … intelligent assistance, your cars, your fridges, and I can only see the demand increasing.”
Put simply, AI is really just search on steroids, and while training AI models is expensive and energy-intensive, it’s not much different from when Google introduced search for information retrieval and data gathering decades ago.
“I think calling it AI became a buzzword for high capitalization and investment over the last three years, but now calling it AI is a ‘no-no,'” he said, referring to the fear and trepidation surrounding AI.
Why does AI use so much energy?
A new report from the International Energy Agency calls the next few years “a new Age of Electricity,” predicting that global electricity demand will grow nearly 4% every year through 2027 — the equivalent of adding Japan’s entire annual electricity consumption each year.
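The IEA’s comparison can be sanity-checked with rough public figures. The consumption numbers below are assumptions for illustration, not figures from the article:

```python
# Sanity-check the IEA comparison with rough public figures
# (both consumption values are assumptions, not from the article).

WORLD_ELECTRICITY_TWH = 27_000   # global annual electricity use, ~2022 (assumption)
GROWTH_RATE = 0.04               # ~4% per year, per the IEA projection
JAPAN_ELECTRICITY_TWH = 940      # Japan's annual electricity use (assumption)

added_per_year = WORLD_ELECTRICITY_TWH * GROWTH_RATE
print(f"New demand each year: ~{added_per_year:,.0f} TWh")
# ~1,080 TWh of new demand per year -- roughly on the order of Japan's
# entire annual consumption, which is the IEA's point of comparison.
```

On these rough numbers, 4% annual growth does indeed work out to roughly one Japan’s worth of new electricity demand per year.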
Some — though certainly not all — of that is due to the growth of data centers that power artificial intelligence, in addition to the broader electrification of our world.
To explain why AI demands so much energy, Mattmann says to think of the expression “data is the new oil,” a term reportedly coined by British mathematician Clive Humby in 2006 and later adapted by businessman and former presidential candidate Andrew Yang.
“Before we can use oil or energy products, we have to refine it,” Mattmann explained.
“AI expects the world to look like tables with rows and columns … [but] the world doesn’t look that way. It’s messy, it’s multimodal, it’s video, image, sound, text,” he said, and making sense of all that information and “training AI models” takes the most energy.
Big companies like Amazon, Google, and Meta have “made really a business off of collecting people’s data, for many, many decades now, to develop those refined pipelines that turn that data into structured knowledge for AI.”
“That costs a lot of energy, and that’s why … all these people are looking into nuclear power, and putting data centers under lakes and all of those things,” Mattmann explained.
Once the model is trained, querying it through a service like ChatGPT or Perplexity uses far less energy than the training itself.
Still, those unwanted AI results can feel really wasteful.
For one thing, Mattmann said, those are often just regular search results labeled as AI, rather than the result of custom processing of your question as if it had never been asked before. In other words, major search engines like Google can retain past AI results and serve them back up in a more static, energy-efficient way — which is good for their own business model as well.
That doesn’t mean those results are never wasteful, distracting, or inaccurate — the latter of which creates its own set of problems, including misinformation — but Mattmann thinks this will improve over time.
What about AI images and videos?
Mattmann was not as quick to defend the use of AI tools for recreational image and video creation — particularly in light of estimates that creating an AI image takes about as much energy as fully charging a cellphone, with videos taking over a hundred times that — but he said it was important to put it all into context.
“There have been many studies on regular old iPhone apps and how fast they drain your iPhone. No one ever had a problem with Roblox or Solitaire draining your iPhone battery, but they did,” he said. “I think in short just calling it ‘AI’ has this spooky thing to it, but the fact remains training is the high energy — causing people concern — part of AI, whereas inference is orders of magnitude less, and my feeling [is they are] similar to apps that just drain your iPhone battery that we all didn’t have problems with before.”
Part of what’s relevant to Mattmann’s point, then, is that energy use needs a point of comparison. Spending 100 cellphone charges’ worth of energy on a silly video may not sound like a good use of energy, but making the same video manually with editing tools could mean sitting at a power-hungry computer for days on end.
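That comparison can be made concrete with a back-of-envelope sketch. The article supplies the image and video estimates; the hardware figures below (battery capacity, desktop power draw, editing time) are assumptions:

```python
# Back-of-envelope energy comparison using the article's estimates plus
# assumed hardware figures (all values are rough).

PHONE_CHARGE_KWH = 0.015           # ~15 Wh: typical smartphone battery (assumption)
AI_IMAGE_KWH = PHONE_CHARGE_KWH    # article: one AI image ~ one full phone charge
AI_VIDEO_KWH = 100 * AI_IMAGE_KWH  # article: a video is over 100x an image

DESKTOP_WATTS = 200                # editing workstation draw (assumption)
EDITING_HOURS = 8                  # one working day at the computer (assumption)
manual_edit_kwh = DESKTOP_WATTS * EDITING_HOURS / 1000  # convert Wh to kWh

print(f"AI video:       {AI_VIDEO_KWH:.1f} kWh")
print(f"Manual editing: {manual_edit_kwh:.1f} kWh per day")
# On these rough numbers, one AI-generated video costs about as much energy
# as a single day of manual editing at a desktop -- and manual editing
# often takes more than one day.
```

The point is not the exact figures but the order of magnitude: the AI route and the manual route land in the same ballpark.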
While The Cool Down will continue to report on inefficient uses of AI, it’s also fair to demystify AI as more like “a computer program” and to consider its energy use in a different light if and when it is a tool to replace other work or entertainment. Creating an AI image may often seem like it’s not a justified use of energy, but Mattmann is essentially saying: “Is it much more or less justified than playing video games or watching movies?”
That doesn’t mean concerns about AI media creation tools are worth waving away, by any means — fears include the replacement of good jobs, copyright infringement, and kids generating dozens of fake movie trailers a day without a thought for the cost, far beyond anything they could produce on their own.
Brands that use AI to replace real artists have generally paid the price with bad PR, while artists are also suing AI companies over copyright infringement, leading to a sense that the value of real artists will remain high. On the energy side, the hope is that AI services charge users enough for image and video creation to at least account for the electricity and deter frivolous usage, too — but it does take it all back to Mattmann’s point of view that the conversation worth the most attention is how to power these tools, versus whether they should exist.
But what’s the deal with DeepSeek — won’t that use less energy?
When Chinese AI company DeepSeek claimed it had created an AI model that takes far less money and time to train, it stunned tech companies around the world.
“DeepSeek kind of caught the industry off-guard,” Mattmann said.
Decrypt summarized the cost implications as “staggering,” as “DeepSeek’s API charges just 10 cents per million tokens, compared to $4.40 for similar services.”
While some of the previously reported cost reductions in training the model have been debunked, as Interesting Engineering reported, the lower computational power — and thus lower energy and cost — needed to run the AI appears to be holding up so far. That means the development still has major implications for the AI world, including how much power we can expect to go toward AI in the long haul.
“It’s become clear that some of those claims [from DeepSeek] were not true,” Mattmann said. Even so, the revelation does show that AI models will be developed much faster. “The velocity of the moment is what people should be concerned about — they’re going to come out much faster,” he added.
While some might assume that more efficient AI will use less energy overall, that view overlooks the fact that “open source models themselves have used a ton of energy to train” — essentially a sunk cost separate from ongoing operational costs.
And if greater efficiency brings more people to AI, overall demand will only increase.
“The more people are going to be motivated to train other things, and … it’ll propagate. Potentially more energy could be used for future downstream trainings,” Mattmann explained.
“The demands for AI are only going to increase,” he added, and not just from the big companies that are commoditizing AI to make it available in the general language of every user, but also in research labs and educational institutions.
Still, AI can make our lives easier and protect our most precious commodity
Still, the opportunity for AI to dramatically improve our quality of life might offset some of the concern.
“Seeing AI as assistive and not a replacement is the take that I have that people [can] feel good about,” Mattmann explained, describing “AI as an accelerator of a future lifestyle … where it makes a difference in people’s lives, giving back time to people.”
His three kids under age 15 use AI devices like the Amazon Echo to learn things, and Mattmann uses a Timekettle earbud device to translate up to 40 languages in real time, which he calls “an AI device at the edge.”
“I’m excited about traveling. I’m excited about what it will do for our national security, what it will mean for language.”
It will be transformative for “those tasks that [require] robotic process automation, intelligent assistance, or whatever can give us back time, which is our only precious commodity here on this planet,” he said.
We need a common AI language
While we should be concerned about finding more responsible alternatives to fossil fuels to power AI’s growing energy needs, Mattmann said he has other concerns about the country’s strategy around AI.
Mattmann argues that the Department of Energy needs to make big investments in the next generation of AI supercomputers to keep big tech companies from controlling “everything.” Developing national capabilities for the United States and empowering companies as thought leaders and private partners is a better model for the future, he said.
If he were leading a national strategy on AI, Mattmann’s focus would be on increasing AI literacy.
“I just want to explain this in a common way — we don’t even have that,” he said.
Tim Coughlin contributed to this report.