How Much Electricity Does AI Use—And What Does It Cost?
Artificial Intelligence (AI) is no longer just a buzzword. It is a revolutionary force transforming how we work, search, write, and even create art. However, powering these capabilities, especially visual applications like AI image generation, comes with substantial energy demands and financial costs.
Training a large-scale AI model, like OpenAI's GPT-4 or image generators such as DALL·E or Midjourney, requires hundreds to thousands of megawatt-hours (MWh) of electricity. Training just one of these advanced models can cost $500,000 to over $1 million in electricity alone, because of the intense computing power needed to process vast amounts of data across powerful GPUs over days or weeks.
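To see how a training run reaches those figures, consider the back-of-envelope sketch below. Every number in it (cluster size, per-GPU power draw, duration, overhead factor, and electricity rate) is an illustrative assumption, not a reported figure for any specific model; real runs vary widely.

```python
# Back-of-envelope estimate of a training run's electricity bill.
# All figures here are illustrative assumptions, not reported numbers
# for GPT-4, DALL·E, Midjourney, or any other specific model.

GPUS = 10_000            # GPUs in the hypothetical training cluster
WATTS_PER_GPU = 700      # draw per GPU under full load
HOURS = 24 * 30          # a month-long training run
PUE = 1.3                # facility overhead: cooling, networking, etc.
RATE_USD_PER_KWH = 0.08  # assumed industrial electricity rate

it_energy_kwh = GPUS * WATTS_PER_GPU / 1_000 * HOURS  # compute load only
total_energy_kwh = it_energy_kwh * PUE                # add facility overhead
cost_usd = total_energy_kwh * RATE_USD_PER_KWH

print(f"Compute energy: {it_energy_kwh / 1_000:,.0f} MWh")   # 5,040 MWh
print(f"Total energy:   {total_energy_kwh / 1_000:,.0f} MWh")  # 6,552 MWh
print(f"Electricity:    ${cost_usd:,.0f}")                     # $524,160
```

Under these assumptions, a month on ten thousand GPUs lands right around the half-million-dollar mark, which is how multi-week runs on large clusters reach the cost range cited above.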
However, the cost does not stop after training. Once deployed, these models perform inference every time someone interacts with them. Inference, in the context of AI, is the process of applying a trained model to new, unseen data to make predictions or generate new content. For text, inference is relatively light. For AI-generated images, though, the electricity usage is significantly higher.
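To make the distinction concrete, here is a minimal sketch of inference, assuming the Hugging Face transformers library and the small gpt2 checkpoint (both are assumptions for illustration; any comparable model would do). No weights are updated; the model simply runs a forward pass per request, and every such request draws power on some server.

```python
# A minimal illustration of inference: applying an already-trained model
# to new input. Assumes the `transformers` package is installed and uses
# the small `gpt2` checkpoint purely as an example.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

# No training happens here; the model only runs a forward pass.
result = generator("AI data centers consume", max_new_tokens=20)
print(result[0]["generated_text"])
```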
Creating a single high-quality image with a diffusion model (like Midjourney, DALL·E 3, or Stable Diffusion) can use between 0.01 and 0.05 kilowatt-hours (kWh), depending on resolution and model complexity. That may seem small, but the costs add up fast at scale. At an average electricity rate of $0.10 per kWh, each image costs roughly $0.001 to $0.005 in electricity. Multiply that by millions of images generated daily, and companies are spending thousands to tens of thousands of dollars per day on power for image generation alone.
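That arithmetic scales linearly, as the short sketch below shows. The per-image energy range, electricity rate, and daily volumes are simply the assumed figures from this paragraph.

```python
# Daily electricity cost of image generation at scale, using this
# paragraph's assumed figures: 0.01-0.05 kWh per image at $0.10/kWh.
RATE_USD_PER_KWH = 0.10

def daily_power_cost(images_per_day: int, kwh_per_image: float) -> float:
    """Electricity cost in USD for one day of image generation."""
    return images_per_day * kwh_per_image * RATE_USD_PER_KWH

for images in (1_000_000, 10_000_000):
    low = daily_power_cost(images, kwh_per_image=0.01)
    high = daily_power_cost(images, kwh_per_image=0.05)
    print(f"{images:>10,} images/day: ${low:,.0f} to ${high:,.0f}")
# 1,000,000 images/day:  $1,000 to $5,000
# 10,000,000 images/day: $10,000 to $50,000
```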
Data centers, where these models run, require even more energy for cooling and stable operation. These facilities already account for roughly 1–1.5% of global electricity consumption, and with generative AI on the rise, that share is expected to grow rapidly.
To reduce costs and environmental impact, companies are investing in efficiency: optimizing model architectures, adopting specialized hardware like NVIDIA's H100 GPUs, building custom chips like Google's TPUs, and relocating data centers to regions with abundant renewable energy. This proactive approach cuts costs and paves the way for a more sustainable future.
In short, AI does not just consume data; it consumes energy and money, too. As AI-generated content becomes more common, the tech world must carefully balance innovation with sustainability and cost control, highlighting the need for responsible and efficient use of AI.