Author: J. Debusscher
The average energy consumption for a single ChatGPT query is estimated to be around 0.34 watt-hours (Wh). This figure, recently released by OpenAI, is significantly lower than some earlier academic and media estimates, which ranged from 0.001 to 0.01 kilowatt-hours (1 to 10 Wh).
This newer, lower estimate is considered more plausible by many experts; it is roughly a tenth of an older, widely circulated estimate of 3 Wh. For perspective, 0.34 Wh is about the amount of electricity a high-efficiency LED lightbulb uses in a couple of minutes (a quick calculation below checks this).
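As a sanity check on that comparison, here is the arithmetic in a short Python sketch. The 10 W bulb wattage is an assumption (a typical value for a bright LED bulb; no wattage is given above):

```python
# Back-of-the-envelope check of the LED comparison.
# Assumption: a high-efficiency LED bulb draws about 10 W
# (a typical value; no wattage is given in the article).
query_wh = 0.34    # estimated energy per ChatGPT query, in Wh
led_watts = 10     # assumed bulb power draw, in W

minutes = query_wh / led_watts * 60
print(f"One query ~= running the bulb for {minutes:.1f} minutes")  # ~2.0 minutes

# And the ratio to the older 3 Wh estimate:
print(f"3 Wh / {query_wh} Wh ~= {3 / query_wh:.1f}x")  # ~8.8x, i.e. roughly a tenth
```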
The energy use of AI is a complex and rapidly evolving issue for IT departments, with significant demands concentrated in two phases: training and inference. Both take place primarily in data centers, which are already major consumers of electricity and water.
1. Training vs. Inference
The lifecycle of an AI model involves two primary stages, each with its own energy profile:
- Training: This is the process of building the AI model by feeding it massive datasets and adjusting billions of parameters. It’s an incredibly resource-intensive and time-consuming process. For example, training a model like GPT-4 can consume tens of millions of kilowatt-hours (kWh) of electricity over several months. While the energy cost of a single training session is immense, it’s a one-time event for each model version.
- Inference: This is the process of using a trained model to generate an output from a user’s prompt, which happens every time you send a command to a chatbot like ChatGPT. While the energy consumption per query is small (around 0.34 Wh for a typical ChatGPT query), the sheer volume of daily queries (billions of them) means the total energy used for inference can quickly surpass the energy used for training; the sketch after this list makes that concrete. Inference energy use is also expected to keep growing rapidly as AI models become more widely adopted.
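To make “quickly surpass” concrete, here is a rough break-even sketch. Both the 50 GWh training cost and the one-billion-queries-per-day volume are illustrative assumptions chosen to be consistent with the ranges above, not reported figures:

```python
# Illustrative break-even between one-time training energy and
# cumulative inference energy. The training cost and query volume
# are assumptions, not reported figures.
training_kwh = 50_000_000         # assumed training cost: 50 GWh
queries_per_day = 1_000_000_000   # assumed daily query volume
wh_per_query = 0.34               # per-query estimate cited above

inference_kwh_per_day = queries_per_day * wh_per_query / 1000  # Wh -> kWh
days_to_match = training_kwh / inference_kwh_per_day

print(f"Inference energy: {inference_kwh_per_day:,.0f} kWh per day")  # 340,000 kWh
print(f"Cumulative inference matches training after ~{days_to_match:.0f} days")  # ~147
```

Under these assumptions, cumulative inference overtakes the entire training run in about five months.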
2. The Role of Data Centers
AI models are hosted in specialized facilities known as data centers. These are the physical hubs of the AI boom, and their energy demands are soaring.
- High Power Consumption: A modern data center can consume as much electricity as a small town. The International Energy Agency (IEA) projects that global electricity demand from data centers could roughly double between 2022 and 2026, driven largely by AI, and data centers could account for a significant share of global electricity use by the end of the decade (the implied growth rate is worked out after this list).
- Cooling Systems: The powerful hardware used for AI, particularly GPUs (graphics processing units), generates immense heat. To prevent overheating and maintain performance, data centers require extensive cooling systems. These systems are themselves major energy consumers and are often highly water-intensive. It’s estimated that data centers consume billions of gallons of fresh water annually for cooling.
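For a sense of how steep the IEA projection is, the implied compound annual growth rate can be derived from the doubling (the rate itself is a derived figure, not one the IEA states this way):

```python
# Implied compound annual growth rate if data-center electricity
# demand doubles over the four years from 2022 to 2026.
years = 2026 - 2022
cagr = 2 ** (1 / years) - 1
print(f"Implied growth: ~{cagr:.1%} per year")  # ~18.9% per year
```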
3. Key Factors Influencing Energy Consumption
The exact energy footprint of an AI task depends on several factors:
- Model Size and Complexity: Larger, more sophisticated models like GPT-4 require significantly more energy for both training and inference than smaller models. As AI models become more complex to improve their performance, their energy consumption increases.
- Prompt Complexity and Length: A longer, more complex user prompt requires more computational resources and therefore more energy to process.
- Hardware Efficiency: The efficiency of the processors (CPUs and GPUs) and the overall data center infrastructure plays a huge role. Newer, more efficient hardware can reduce energy consumption, but this is often offset by the increasing demand for more powerful systems.
- Location and Energy Source: The environmental impact of AI’s energy use also depends on the grid a data center is connected to. A data center running on renewable energy will have a much lower carbon footprint than one powered by fossil fuels; the worked example after this list shows how strongly grid mix changes the per-query footprint.
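A simple way to quantify the grid effect is to multiply per-query energy by grid carbon intensity. The intensities below are illustrative round numbers for different grid mixes, not figures from this article:

```python
# Per-query carbon footprint = energy per query x grid carbon intensity.
# Intensities are illustrative round numbers in gCO2 per kWh.
wh_per_query = 0.34
grids = {
    "coal-heavy grid": 800,
    "world average grid": 450,
    "mostly renewable grid": 50,
}
for name, g_co2_per_kwh in grids.items():
    grams = wh_per_query / 1000 * g_co2_per_kwh
    print(f"{name}: ~{grams:.3f} g CO2 per query")
```

The spread is more than an order of magnitude, which is why siting and power sourcing matter as much as hardware efficiency.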
4. The Broader Environmental Impact
The rapid rise of AI has environmental implications beyond just electricity and water consumption.
- Carbon Emissions: When data centers are powered by non-renewable energy sources, their high electricity consumption translates directly into carbon emissions.
- E-waste: The specialized and expensive hardware used for AI has a relatively short lifespan before it becomes obsolete, contributing to a growing e-waste problem.
- The “Rebound Effect”: As AI models become more energy-efficient, they also become cheaper and easier to use. This can drive increased adoption and usage, which in turn pushes total energy consumption up. This phenomenon, known as the Jevons paradox, is a major concern for long-term sustainability; the toy model below illustrates it.
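A toy model makes the rebound effect concrete: total energy is usage times energy per query, so a 2x efficiency gain is wiped out if usage grows 3x. All numbers here are purely illustrative:

```python
# Toy model of the Jevons paradox: efficiency gains outpaced by usage growth.
# All numbers are purely illustrative.
baseline_queries = 1_000_000_000   # assumed queries per day
baseline_wh = 0.34                 # Wh per query

efficiency_gain = 2.0              # queries become 2x more efficient...
usage_growth = 3.0                 # ...but usage grows 3x

before = baseline_queries * baseline_wh
after = (baseline_queries * usage_growth) * (baseline_wh / efficiency_gain)
print(f"Total energy changes by {after / before:.1f}x")  # 1.5x: up, not down
```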