New MIT report reveals energy costs of AI tools like ChatGPT
MIT’s energy accounting shows AI convenience is backed by hidden electricity demand, with video generation consuming orders of magnitude more power than ordinary text outputs.
May 21, 2025
Sources
Summary
An MIT Technology Review report quantified the energy used per response by large language models like ChatGPT and found a range from 114 to 6,706 joules. The report maps AI performance differences to resource intensity, showing that larger, more accurate models tend to consume more energy. The practical consequence is that routine use of AI text and especially AI video generation can impose materially higher electricity demand than many users assume.
Reality Check
When the public cannot see the real resource costs of digital services, institutions and markets can drift into major infrastructure decisions without informed consent. Normalizing opaque energy consumption in widely used AI tools weakens accountability for the downstream burdens placed on power systems and communities. The risk is a slow transfer of costs from private product choices to shared public capacity, with little transparency about who pays and who benefits.
Detail
<p>MIT Technology Review released a report examining how the artificial intelligence industry uses energy and estimating the energy cost of interacting with large-language-model services such as ChatGPT.</p>
<p>The report calculated that a single response from a large language model can require between 114 and 6,706 joules, and it linked lower energy use to models with fewer parameters, which it said tend to produce less accurate answers.</p>
<p>The report also measured substantially higher energy use for AI-generated video. It found that generating a five-second video with a newer AI model used about 3.4 million joules, which it described as more than 700 times the energy required to generate a high-quality image and compared to running a microwave for over an hour.</p>
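The report's figures can be sanity-checked with simple unit conversion. The sketch below uses only the numbers cited above; the microwave's power draw is an assumption for illustration, not a figure from the report:

```python
# Back-of-envelope check of the energy figures reported by MIT Technology Review.
TEXT_RESPONSE_J = (114, 6_706)   # reported range of joules per LLM text response
VIDEO_5S_J = 3.4e6               # reported joules for a five-second AI video

# Joules -> kilowatt-hours (1 kWh = 3.6 million joules)
video_kwh = VIDEO_5S_J / 3.6e6

# Assumed microwave power draw (~800 W is typical; not from the report)
MICROWAVE_W = 800
microwave_minutes = VIDEO_5S_J / MICROWAVE_W / 60

ratio = VIDEO_5S_J / TEXT_RESPONSE_J[1]
print(f"5-second video: ~{video_kwh:.2f} kWh")
print(f"Equivalent to ~{microwave_minutes:.0f} min of an {MICROWAVE_W} W microwave")
print(f"~{ratio:,.0f}x the most energy-hungry text response in the range")
```

At an assumed 800 W, 3.4 million joules works out to roughly 71 minutes of microwave use, consistent with the article's "over an hour" comparison, and to about 500 times the top of the reported text-response range.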