OpenAI chief executive Sam Altman has tried to put the environmental cost of a single chat into plain numbers. He writes that each prompt uses a tiny splash of cooling water in the data centres that power the model.
“The average query uses about 0.000085 gallons of water, or roughly one fifteenth of a teaspoon,” Altman explains in his blog post “The Gentle Singularity”. The line arrives as cloud operators face mounting questions over local resources.
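Altman's figure can be sanity-checked with a quick unit conversion, using the standard 768 US teaspoons per US gallon:

```python
# Convert Altman's per-query water figure into teaspoons.
GALLONS_PER_QUERY = 0.000085  # Altman's stated figure
TSP_PER_GALLON = 768          # 1 US gallon = 768 US teaspoons

tsp_per_query = GALLONS_PER_QUERY * TSP_PER_GALLON
print(f"{tsp_per_query:.4f} teaspoons per query")          # 0.0653
print(f"roughly 1/{1 / tsp_per_query:.0f} of a teaspoon")  # roughly 1/15
```

The result, about 0.065 teaspoons, rounds to the "one fifteenth of a teaspoon" in the quote.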
The Washington Post calculated that producing a 100-word email with GPT-4 uses slightly more than a bottle of drinking water. Its reporters added that usage varies with the climate and cooling design at each site.
Does One Chat Trouble The Power Grid?
Altman also laid out an energy figure: 0.34 watt-hours per request. He matched this to an oven running for a single second, or a low-energy bulb shining for a couple of minutes.
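Those equivalences can be checked against rough appliance wattages. The 1,200 W oven and 10 W bulb below are illustrative assumptions, not figures from the post:

```python
WH_PER_QUERY = 0.34  # Altman's stated figure

OVEN_WATTS = 1200    # assumed draw of a typical electric oven
BULB_WATTS = 10      # assumed draw of an LED "low-energy" bulb

# How long each appliance takes to consume 0.34 Wh.
oven_seconds = WH_PER_QUERY * 3600 / OVEN_WATTS
bulb_minutes = WH_PER_QUERY * 60 / BULB_WATTS

print(f"oven: {oven_seconds:.2f} s")    # 1.02 s
print(f"bulb: {bulb_minutes:.2f} min")  # 2.04 min
```

Both work out close to Altman's comparisons: about one second of oven time, or about two minutes of bulb time.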
Researchers quoted by the same newspaper said that AI could overtake Bitcoin mining in electricity demand before the end of the year. The comparison shows how quickly total usage can grow when hundreds of millions of chats happen each day.
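To see how a tiny per-query number compounds, here is a sketch scaling 0.34 Wh to a hypothetical one billion queries a day. The query volume is an assumption for illustration, not a figure from the article:

```python
WH_PER_QUERY = 0.34                # Altman's per-request figure
QUERIES_PER_DAY = 1_000_000_000    # hypothetical daily volume

daily_mwh = WH_PER_QUERY * QUERIES_PER_DAY / 1e6  # Wh -> MWh
yearly_gwh = daily_mwh * 365 / 1e3                # MWh -> GWh

print(f"{daily_mwh:.0f} MWh per day")    # 340 MWh
print(f"{yearly_gwh:.0f} GWh per year")  # 124 GWh
```

Even at a billion queries a day, inference alone sums to roughly 124 GWh a year; electricity comparisons with Bitcoin mining also fold in training and the wider AI industry.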
Even so, Altman says the price of intelligence should fall until it sits almost level with the cost of the electricity alone. In his view, cheaper renewable power and faster chips will tilt that line downward.
Why Could Thinking Grow Cheaper?
Altman sees a reinforcing loop between progress and investment. New breakthroughs make stronger models. Those models, in turn, help designers hunt for new scientific insights, shorter algorithms, and lighter hardware.
As performance improves, the value created pays for ever larger clusters. Suppliers race to build faster processors, while cloud companies add cooling pipes and substations. The cycle repeats, trimming the bill for every extra question.
He believes that, once data centre assembly itself becomes automated, intelligence “too cheap to meter” will feel realistic rather than fanciful. At that stage, the ceiling on human ambition may be set more by imagination than by resource limits.
How Soon Could Abilities Move Forward?
Altman says agents that write solid code have already arrived in 2025. He expects systems in 2026 to uncover novel insights across medicine, physics and more. Robots that handle everyday tasks may follow in 2027, he predicts. That schedule would let one worker finish projects in hours that once demanded weeks.
Family life, play and art will still matter, he adds, yet the tools behind those pastimes will feel almost unrecognisable compared with those of 2020.
Could Robots Build More Robots?
Altman sets out a picture of the first million humanoid machines leaving ordinary factories before the decade ends.
After that milestone, those units could dig minerals, drive freight lorries, and run assembly lines without human help.
Once automated workshops start turning out data centre racks and chip wafers, capacity could mushroom at a pace mankind has never seen.
He likens the prospect to early chip makers who relied on machine tools to craft better machine tools, creating self-reinforcing progress. Water and power use would grow, though automation might also unlock cleaner grids and closed-loop cooling.