AI · 5 min read · 9.9.2025

Three big things we still don’t know about AI’s energy burden

At the beginning of this year, my colleague Casey Crownhart and I spent six months examining AI's climate and energy burden, and one number became our white whale: how much energy leading AI models such as ChatGPT or Gemini use to generate a single answer. That basic figure remained elusive even as the scramble to power AI escalated all the way to the White House and the Pentagon, and even as forecasts showed that within three years AI could consume as much electricity as 22% of all US households.

The problem with finding that number, as we explained in our article published in May, is that AI companies are the only ones who have it. We pestered Google, OpenAI, and Microsoft, but each company declined to share its figure. Researchers we spoke to who study AI's impact on energy grids compared the exercise to trying to measure a car's fuel efficiency without ever being able to drive it, guessing based on rumors about its engine size and what it sounds like on the highway.

This story is part of "Power Hungry: AI and Our Energy Future," an MIT Technology Review series on the energy demands and carbon costs of the artificial intelligence revolution.

Then, this summer, after we published, a strange thing began to happen. In June, Sam Altman of OpenAI wrote that an average ChatGPT query uses 0.34 watt-hours of energy. In July, the French AI startup Mistral didn't publish a direct number but released an estimate of the emissions generated. In August, Google revealed that answering a question with Gemini consumes about 0.24 watt-hours of energy. Google's and OpenAI's figures were in line with what Casey and I had estimated for medium-sized AI models.

With this newfound transparency, is our job complete? Have we finally harpooned our white whale, and if so, what happens next for the people studying AI's climate impact? I reached out to some of our old and new sources to find out.
The numbers are vague, and chatbot-only

First of all, my sources told me, the figures the tech companies published this summer leave a lot out. OpenAI's number, for example, did not appear in a detailed technical paper but in a blog post by Altman, which leaves many questions unanswered. As Crownhart points out, Google's number refers to the median amount of energy per query, which gives us no sense of the more energy-intensive Gemini responses, such as when the model uses a reasoning mode to "think" through a hard problem or generates a really long answer.

The numbers also cover only interactions with chatbots, not the other ways people are increasingly relying on generative AI. "As more and more people use AI to generate video and images, we need numbers for the different modalities, and we need them to be measured," says Sasha Luccioni, AI and climate lead at the AI platform Hugging Face.

This matters in part because the numbers for asking a chatbot a question are, as expected, undeniably small: roughly the amount of electricity a microwave uses in just a few seconds. That is part of why AI and climate researchers do not suggest that any single person's AI use creates a significant climate impact.

A complete accounting of AI's energy demands, one that goes beyond what's used to answer a single query to capture the technology's full net effects on the climate, would require application-specific information about how all this AI is being used. Ketan Joshi, an analyst for climate and energy groups, acknowledges that researchers don't usually get such specific information from other industries, but says this case might justify an exception. "The rate of data-center growth is undeniably unusual," Joshi says. "Companies should be scrutinized considerably more."
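To make the microwave comparison concrete, here is a back-of-envelope sketch. The per-query figures come from the companies' disclosures quoted above; the ~1,000-watt microwave rating is my own assumption, not something stated in the article:

```python
# Rough sense of scale for the disclosed per-query figures.
# Assumption (mine): a typical household microwave draws about 1,000 W.
MICROWAVE_WATTS = 1000
GEMINI_WH_PER_QUERY = 0.24   # Google's reported median for Gemini
CHATGPT_WH_PER_QUERY = 0.34  # Altman's reported average for ChatGPT

for name, wh in [("Gemini", GEMINI_WH_PER_QUERY),
                 ("ChatGPT", CHATGPT_WH_PER_QUERY)]:
    # Energy (Wh) divided by power (W) gives hours; convert to seconds.
    seconds = wh / MICROWAVE_WATTS * 3600
    print(f"{name}: ~{seconds:.1f} s of microwave time")
    # Gemini: ~0.9 s of microwave time
    # ChatGPT: ~1.2 s of microwave time
```

Both figures land around one second of microwave use, which is why researchers say an individual query is not where the climate story lies.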
We have questions about energy efficiency

Companies making billion-dollar investments in AI must square this growth in energy demand with their sustainability goals. In May, Microsoft said its emissions had risen by over 23% since 2020, even though the company has pledged to be carbon negative by 2030. "It has become clear that our journey toward being carbon negative is a marathon, not a sprint," Microsoft wrote.

Tech companies often justify these rising emissions by arguing that AI itself will soon unlock efficiencies that make it a net positive for the climate. Perhaps, the thinking goes, the right AI system could design more efficient heating and cooling systems for a building, or discover the minerals needed for electric vehicles. So far, though, there is little sign that AI has been usefully applied to these things. Companies have shared anecdotes about using AI to find methane emissions hotspots, for example, but they are not transparent enough for us to know whether these successes outweigh the surge in electricity demand and emissions that Big Tech has produced in the AI boom. In the meantime, more data centers are planned, and AI's energy demand keeps climbing.

The "bubble" question

One of the great unknowns in AI energy is whether society will ever adopt AI at the levels implied by tech companies' plans. OpenAI has said that ChatGPT receives 2.5 billion prompts per day. It's possible that this number, and the equivalent figures at other AI companies, will keep rising in the coming years. Forecasts published last year by Lawrence Berkeley National Laboratory suggest that AI alone could eventually consume as much electricity each year as 22% of all US households. But OpenAI's launch of GPT-5 was largely viewed as a flop, even by the company itself, and that flop has led critics to ask whether AI might be hitting a wall.
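Combining the two disclosed numbers above gives a rough sense of what chat alone adds up to at scale. This is my own back-of-envelope estimate, not a company figure, and it covers only text queries at Altman's stated average:

```python
# Back-of-envelope: scale the per-query figure to the reported daily volume.
# Both inputs are from OpenAI's own disclosures; the multiplication is mine.
WH_PER_QUERY = 0.34        # average watt-hours per ChatGPT query (Altman)
QUERIES_PER_DAY = 2.5e9    # reported daily ChatGPT prompt count

daily_mwh = WH_PER_QUERY * QUERIES_PER_DAY / 1e6   # Wh -> MWh
yearly_gwh = daily_mwh * 365 / 1000                # MWh -> GWh

print(f"~{daily_mwh:.0f} MWh per day")    # ~850 MWh per day
print(f"~{yearly_gwh:.0f} GWh per year")  # ~310 GWh per year
```

Even at roughly 850 MWh a day, chat queries are a modest slice of the data-center buildout being planned, which is exactly why the unanswered questions about video, images, and agentic use matter.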
Stocks wavered when one group found that 95% of companies are seeing no return on their massive AI investments. The buildout of AI-specific data centers could prove a hard investment to recoup, especially if revenue for AI companies remains elusive.

One of the biggest unknowns about AI's future energy burden is not how much energy a single query uses, or any other number that could be disclosed. It is whether demand will ever reach the scale companies are building for, or whether the technology collapses under its own hype. The answer will determine whether today's buildout becomes a permanent shift in our energy system or a short-lived spike.
