Every time someone uses ChatGPT to write an essay, create an image or advise them on planning their day, the environment pays a price.
A query on the artificial intelligence-powered chatbot is estimated to require at least 10 times more electricity than a standard Google search.
If all Google searches similarly used generative AI, they might consume as much electricity as a country the size of Ireland, calculates Alex de Vries, the founder of Digiconomist, a website that aims to expose the unintended consequences of digital trends.
Yet someone using ChatGPT or another artificial intelligence application has no way of knowing how much power their questions will consume as they are processed in the tech companies’ enormous data centers.
De Vries said the skyrocketing energy demand of AI technologies will no doubt require the world to burn more climate-warming oil, gas and coal.
“Even if we manage to feed AI with renewables, we have to realize those are limited in supply, so we’ll be using more fossil fuels elsewhere,” he said. “The ultimate outcome of this is more carbon emissions.”
AI is also thirsty for water. ChatGPT gulps roughly a 16-ounce bottle of water for as few as 10 queries, according to calculations by Shaolei Ren, an associate professor of electrical and computer engineering at UC Riverside, and his colleagues.
The increasing consumption of energy and water by AI has raised concerns in California and around the globe. Experts have detailed how it could stall the transition to green energy, while increasing consumers’ electric bills and the risk of blackouts.
To try to prevent those consequences, De Vries, Ren and other experts are calling on the tech companies to disclose to users how much power and water their queries will consume.
“I think the first step is to have more transparency,” Ren said. The AI developers, he said, “tend to be secretive about their energy usage and their water consumption.”
Ren said the websites where users type in their queries should tell them how much energy and water their requests will require. He said this would be similar to how Google now tells people searching for airline flights how much carbon the trip will emit.
“If we had that knowledge,” he said, “then we could make more informed decisions.”
Data centers — enormous warehouses of computer servers that support the internet — have long been big power users. But the specialized computer chips required for generative AI use far more electricity because they are designed to read through vast amounts of data.
The new chips also generate so much heat that even more power and water are needed to keep them cool.
Even though the benefits and risks of AI aren’t yet fully known, companies are increasingly incorporating the technology into existing products.
In May, for example, Google announced that it was adding what it called “AI Overviews” to its search engine. Whenever someone types a question into Google search, the company’s AI now generates an answer from the search results and highlights it at the top of the page.
Not all of Google’s AI-generated answers have been correct, including when it told a user to add Elmer’s glue to pizza sauce to keep cheese from sliding off the crust.
But searchers who don’t want those AI-generated answers or want to avoid the extra use of power and water can’t turn off the feature.
“Right now, the user doesn’t have the option to opt out,” Ren said.
Google did not respond to questions from The Times.
OpenAI, the company that created ChatGPT, responded with a prepared statement but declined to answer specific questions, such as how much power and water the chatbot uses.
“AI can be energy-intensive and that’s why we are constantly working to improve efficiencies,” OpenAI said. “We carefully consider the best use of our computing power and support our partners’ efforts to achieve their sustainability goals. We also believe that AI can play a key role in accelerating scientific progress in the discovery of climate solutions.”
Three years ago, Google vowed to reach “net-zero” — where its emissions of greenhouse gases would be equal to what it removed — by 2030.
The company isn’t making progress toward that goal. In 2023, its total carbon emissions increased by 13%, the company disclosed in a July report. Since 2019, its emissions are up 48%.
“As we further integrate AI into our products, reducing emissions may be challenging due to increasing energy demands from the greater intensity of AI compute,” the company said in the report.
Google added that it expects its emissions to continue to rise before dropping sometime in the future. It did not say when that may be.
The company also disclosed that its data centers consumed 6.1 billion gallons of water in 2023 — 17% more than the year before.
“We’re committed to developing AI responsibly by working to address its environmental footprint,” the report said.
De Vries said he was disappointed Google had not disclosed in the report how much AI was adding to its power needs. The company said in the report that such a “distinction between AI and other workloads” would “not be meaningful.”
Because Google does not report AI’s power use separately, he said, it is impossible to calculate just how much more electricity Google search now uses with the addition of AI Overviews.
“While capable of delivering the required info,” he said, “they are now withholding it.”