For the first time, a large technology company has revealed how much energy a single request to an artificial intelligence chatbot really consumes. Google has published a technical report on its Gemini model, explaining that an average "query" (a request) uses about 0.24 watt-hours of electricity. How much is that?
About one second of microwave use, or nine seconds of television viewing. These are tiny numbers at the individual level, but enormous when multiplied by the billions of daily questions (a figure, moreover, that keeps growing) that users put to their new virtual assistants.
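The everyday comparisons can be sanity-checked with a few lines of arithmetic. The appliance wattages below (a 1,000 W microwave, a 100 W television) are illustrative assumptions, not figures from the Google report:

```python
# Sanity check of the everyday-energy comparisons.
# Assumed appliance power draws (illustrative, not from Google's report):
# microwave ~1000 W, TV ~100 W.
QUERY_WH = 0.24  # energy per average Gemini query, per Google's report

def seconds_of_use(appliance_watts: float, energy_wh: float = QUERY_WH) -> float:
    """How many seconds the appliance could run on the same energy."""
    return energy_wh / appliance_watts * 3600

print(f"Microwave: {seconds_of_use(1000):.2f} s")  # roughly one second
print(f"TV:        {seconds_of_use(100):.2f} s")   # roughly nine seconds
```

With those assumed wattages, 0.24 Wh works out to about 0.9 seconds of microwave use and 8.6 seconds of television, matching the article's rounded figures.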
Until now, no big tech company had committed itself this openly, so the figure represents a starting point against which Google, OpenAI, Microsoft, Meta, Mistral and the other operators can measure one another, in the context of the enormous energy impact of their services.
Inside the numbers. Google's report is the most detailed published so far, since it considers not only the chips that run the models (the so-called TPUs, responsible for 58% of consumption) but also the CPUs and memory of the servers inside the data centers (25%), the backup machines kept ready in case of failure (10%), and the cooling and power-conversion systems (8%).
Even the water needed to lower temperatures enters the accounting: every question put to Gemini is equivalent to about five drops. In technical jargon, what is described here is "inference", that is, the use of an already-trained model to produce an answer; the cost of training, which can require weeks of computation and far larger amounts of energy, is not included in these estimates.
The step forward. In recent years, analysts were forced to rely on rough hypotheses: there was talk of 3 Wh per query, twelve times the current estimate. Sam Altman, CEO of OpenAI, has instead mentioned a consumption of 0.34 Wh for ChatGPT, a figure closer to Google's, and the company Epoch AI has published similar values (0.3 Wh), while Mistral AI has released data focused more on emissions, revealing that each page of text produces about 1 gram of CO₂. Other summary figures are known: a query to Llama would require on average 1 Wh, the generation of an image between 0.3 and 1.2 Wh, and the production of a short video with Sora as much as nearly 1 kWh per five seconds. Google admits that its report does not include the consumption of the external network or of user devices, but it remains the most solid calculation ever published.
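To put these scattered estimates on a common scale, one can express each task in "Gemini query equivalents". All figures below are the ones quoted in the text; the comparison itself is a back-of-envelope sketch:

```python
# Published per-task energy estimates, as quoted in the article,
# expressed as multiples of one average Gemini text query (0.24 Wh).
QUERY_WH = 0.24

estimates_wh = {
    "Gemini text query (Google)": 0.24,
    "ChatGPT query (Altman)": 0.34,
    "Epoch AI estimate": 0.3,
    "Llama query (average)": 1.0,
    "Image generation (upper bound)": 1.2,
    "Sora, 5 s of video (~1 kWh)": 1000.0,
}

for task, wh in estimates_wh.items():
    print(f"{task}: {wh} Wh ≈ {wh / QUERY_WH:.0f} Gemini queries")
```

By this measure, five seconds of Sora video would cost roughly as much energy as four thousand Gemini text queries, which is why the article's picture changes so sharply for video generation.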
Who really pays. Supporting this consumption are the big tech companies that run the data centers: Google for Gemini, OpenAI with Microsoft for ChatGPT, Meta for its own models, Mistral in Europe.
The growth of electrical loads has raised concerns in various parts of the world, where there has even been talk of reactivating coal or nuclear power plants to meet demand. In its report, Google highlights purchase agreements for over 22 gigawatts from advanced renewable and nuclear sources, which it says allow the company to cut its carbon footprint.
Thanks to these policies, each request to Gemini would generate just 0.03 grams of CO₂, a value that, if confirmed, would be 44 times lower than the previous year and a hundred times lower than the first independent estimates.
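The "44 times" and "100 times" multiples quoted in the text imply rough figures for the earlier emissions, which a quick calculation makes explicit:

```python
# Implied earlier CO2 figures, derived from the multiples quoted in
# the article (0.03 g per query today; 44x and 100x reductions).
CO2_PER_QUERY_G = 0.03  # Google's reported figure

previous_year_g = CO2_PER_QUERY_G * 44      # implied: ~1.32 g per query
first_estimates_g = CO2_PER_QUERY_G * 100   # implied: ~3 g per query

print(f"Previous year (implied): {previous_year_g:.2f} g CO2 per query")
print(f"First independent estimates (implied): {first_estimates_g:.0f} g CO2 per query")
```

So the claimed improvement takes Gemini from an implied figure on the order of grams per query down to hundredths of a gram.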
Future perspectives. Translated into practical examples, the figures look less alarming than one might think: 10 questions to our trusted chatbot would be equivalent to about a minute and a half of television, 100 to about a quarter of an hour.
In the United Kingdom, 100 daily requests would account for just 0.3% of an average citizen's daily electricity consumption; in the United States, just 0.1%. But the picture changes for heavier operations: images, videos and "deep research" require far more energy.
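The percentages can be reproduced if one assumes average per-capita daily electricity consumption of about 8 kWh in the UK and 24 kWh in the US. These are illustrative round figures chosen to be consistent with the article's percentages, not sourced statistics:

```python
# 100 daily Gemini queries as a share of assumed per-capita daily
# electricity use. Consumption figures are illustrative assumptions
# consistent with the article's 0.3% / 0.1%, not sourced data.
QUERY_WH = 0.24
daily_queries = 100
daily_wh = daily_queries * QUERY_WH  # 24 Wh per day

assumed_daily_consumption_wh = {"UK": 8_000, "US": 24_000}

for country, wh in assumed_daily_consumption_wh.items():
    share_pct = daily_wh / wh * 100
    print(f"{country}: {share_pct:.1f}% of assumed daily consumption")
```

Even a hundred text queries a day stays well under one percent of an individual's electricity use under these assumptions; the heavier image and video workloads are where the share climbs.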
For this reason, several researchers would like a single evaluation standard, similar to the energy labels on household appliances, that would make it possible to truly compare the impact of different AIs. For now, Google's figures offer a valuable starting point, but artificial intelligence's energy challenge has only just begun.
