
AI won't use as much electricity as we are told

https://johnquigginblog.substack.com/p/ai-wont-use-as-much-electricity-as
39•hirpslop•1h ago

Comments

JohnFen•1h ago
> By contrast, the unglamorous and largely disregarded business of making cement accounts for around 7 per cent of global emissions.

Oh, that's not a good example of the point they're trying to make. The emissions from concrete are a point of major concern and are frequently discussed. A ton of effort is being put into trying to reduce the problem, and there are widespread calls to reduce the use of the material as much as possible.

dsr_•49m ago
The only useful point that they make is that predictions about unending growth are always wrong in detail. Every actual hockey stick turns into a sigmoid, then falls. Meanwhile, a new hockey stick comes along.
Mistletoe•39m ago
But AI training has been behaving like Bitcoin mining, which constantly increases the difficulty. AI companies have so far had to release costlier and costlier models to keep up with the Joneses. We don’t want the final iteration to be a Dyson sphere around the sun or the black hole at the center of our galaxy so Gemini 10,000 Pro can tell us “Let there be light.” Or maybe we do, I don’t know.
Kye•24m ago
The previous 9,999 Geminis promised they'd solved entropy and said the words with no real effect so people stopped listening to it. It's very lonely now.
nerdponx•48m ago
Also modern infrastructure is literally built on concrete. Whereas the broad benefits of AI are dubious by comparison.
beepbooptheory•35m ago
In general there seems to be a big given in the argument that I don't think is obvious:

> At the other end of the policy spectrum, advocates of “degrowth” don’t want to concede that the explosive growth of the information economy is sustainable, unlike the industrial economy of the 20th century.

This seems to imply we all must agree that the industrial economy of the 20th century was sustainable, and that strikes me as an odd point of agreement to try to make. Isn't it just sidestepping the whole point?

PTOB•25m ago
Has he considered exactly how much concrete is needed to build a datacenter campus?
Diggsey•20m ago
Essentially zero as a fraction of global concrete usage...
altcognito•45m ago
Many "new" expenditures replace existing stuff. The initial versions are often the worst iterations we'll see so even though the capability is going up, the energy usage will go down over time. It isn't universal (as we've seen a lot of new true growth), but it is common.
vikramkr•45m ago
And what about the predictions of energy use that did pan out, like air conditioning and stuff? Also, in 1999, how many personal computer companies were restarting nuclear power plants to fuel their projected energy consumption? It feels like a weird argument to make when the investments into AI infrastructure are literally measured in gigawatts. It feels like a weird argument in general: AI consuming lots of energy isn't some weird degrowth conspiracy theory.
Mistletoe•42m ago
Let’s not forget Sam Altman tried to raise $7 trillion for it somehow as well.
palata•41m ago
> But we have been here before. Predictions of this kind have been made ever since the emergence of the Internet

I don't think I live in the same world as the author. Ever since the emergence of the Internet, "stuff related to IT" has been using more and more energy.

It's like saying "5G won't use as much electricity as we are told! In fact 5G is more efficient than 4G". Yep, except that 5G enables us to use a lot more of it, and therefore we use more electricity.

It's called the rebound effect.

bicepjai•32m ago
Sounds similar to Jevons Paradox
onlyrealcuzzo•26m ago
If you're using more of it because it's replacing corporate travel, going into the office, and driving across town to see your friends and family (FaceTiming instead), then you are still MASSIVELY reducing your total energy use.

It's not like the majority of electricity use by computers is complete waste.

You can pooh-pooh this and say you don't want to live in the digital world, that you want to spend more time flying around the world to work with people in person or actually see your mom, or buy physical paper in a store it was shipped to, write physical words on it, and have the USPS physically deliver it, but that's just wildly, almost unfathomably, less efficient.

If Google didn't exist, who knows how many more books I'd need to own, how much time I'd spend buying those books, how much energy I'd spend going to the stores to pick them up, or having them shipped.

It's almost certainly a lot less than how much energy I spend using Google.

While we all like to think that Facebook is a complete waste of time, what would you be spending your time doing otherwise? Probably something that requires more energy than the near-nothing it takes to look at memes on your phone.

Not to mention, presumably, at least some people are getting some value from even the most wasteful pits of the Internet.

Not everything is Bitcoin.

wahnfrieden•17m ago
How do you account for overall energy use being up massively, and rising at a record-breaking pace?
taeric•26m ago
Do we use more electricity because of 5G? I confess I'd assume modern phones and repeater networks use less power than older ones, even in aggregate.

I can easily agree that phones that have internet capabilities use more, as a whole, than those that didn't. The infrastructure needs were very different. But, especially if you are comparing to 4G technology, much of that infrastructure already had to distribute content that was driving the extra use.

I would think this would be like cars. If you had taken estimates of how much pollution vehicles produced 40 years ago and assumed that per-vehicle pollution would stay constant even as the number of cars went up, you'd probably assume we are living with the worst air imaginable. Instead, even gas cars got far better as time went on.

Doesn't mean the problem went away, of course. And some sources of the pollution, like tires, did make up a larger share of the total as we scaled up. Hopefully we can find ways to make that better, as well.

ElevenLathe•17m ago
The phones, towers, and networks are only the tip of the power iceberg. How much electricity are we burning to run the servers to service the requests that all these 5G phones can now make because of all the wonderfully cheap wireless connectivity?
aceazzameen•7m ago
As a data point, I turn 5G off on my phone and get several hours more battery life using 4G. I'm pretty sure the higher bandwidth is consuming more energy, especially since 5G works at shorter distances and probably needs more power to stay connected to cell towers.
Majestic121•25m ago
This is countered in the article.

"Yet throughout this period, the actual share of electricity use accounted for by the IT sector has hovered between 1 and 2 per cent, accounting for less than 1 per cent of global greenhouse gas emissions."

Arnt•22m ago
Nothing forces the rebound effect to dominate. Computers grow cheaper, we rebound by buying ones with higher capacity, but the overall price still shrinks. I bet the computer you used to post today cost much less than Colossus.

Similarly, nothing forces AI or 5G to use more power than whatever you would have done instead. You can stream films via 5G that you might not have done via 4G, but you might've streamed via WLAN or perhaps a cat5 cable instead. The rebound effect doesn't force 5G to use more power than WLAN/GbE. Or more power than driving to a cinema, if you want to compare really widely. The film you stream is what makes the comparison meaningful, no?

bilekas•18m ago
> Similarly, nothing forces AI or 5G to use more power than whatever you would have done instead

Am I missing something, or has the need for vast GPU horsepower been solved? Those requirements weren't in data centers before, and they're only going up. Whichever way you look at it, there's got to be an increase in power consumption somewhere, no?

Arnt•10m ago
Not necessarily, no.

You can pick and choose your comparisons, and make an increase appear or not.

Take weather forecasts as an example. Weather forecasting uses massively powerful computers today. If you compare that forecasting with the lack of forecasts two hundred years ago there obviously is an increase in power usage (no electricity was used then) or there obviously isn't (today's result is something we didn't have then, so it would be an apples-to-nothing comparison).

If you say "the GPUs are using power now that they weren't using before" you're implicitly doing the former kind of comparison. Which is obviously correct or obviously wrong ;)

timschmidt•3m ago
GPU compute in datacenters has been a thing for at least 20 years. Many of the top500 have included significant GPU clusters for that long. There's nothing computationally special about AI compared to other workloads, and in fact it seems to lend itself to multiplexing quite efficiently - it's possible to process thousands of prompts for a negligible memory bandwidth increase over a single prompt.

AI is still very near the beginning of the optimization process. We're still using (relatively) general-purpose processors to run it. Dedicated accelerators are beginning to appear. Many software optimizations will be found. FPGAs and ASICs will be designed and fabbed. Process nodes will continue to shrink. Moore's law will continue to decrease costs exponentially over time, as with all other workloads.
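
A rough back-of-envelope sketch of the batching point above, in Python. The weight and per-sequence traffic numbers are illustrative assumptions, not measurements of any real model; the point is only that the weights are streamed from memory once per decode step regardless of batch size, so the bandwidth cost per prompt shrinks as the batch grows.

  # Back-of-envelope: memory traffic per decode step for a batched LLM server.
  # All numbers are illustrative assumptions, not measurements of a real model.
  WEIGHT_BYTES = 140e9   # e.g. a ~70B-parameter model at ~2 bytes per weight
  PER_SEQ_BYTES = 50e6   # assumed per-sequence KV-cache/activation traffic per step

  def bytes_per_prompt(batch_size: int) -> float:
      """Memory bytes moved per decode step, divided across the batch."""
      return (WEIGHT_BYTES + PER_SEQ_BYTES * batch_size) / batch_size

  for b in (1, 8, 64, 512):
      print(f"batch={b:4d}  ~{bytes_per_prompt(b) / 1e9:7.2f} GB moved per prompt per step")

  # batch=1 moves ~140 GB for one prompt; batch=512 moves ~0.3 GB per prompt,
  # because the dominant weight traffic is shared across the whole batch.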

Analemma_•12m ago
There is some limit to the rebound effect because people only have so many hours in the day, but we’re nowhere near the ceiling of how much AI compute people could use.

Note how many people pay for the $200/month plans from Anthropic, OAI etc. and still hit limits because they constantly spend $8000 worth of tokens letting the agents burn and churn. It’s pretty obvious that as compute gets cheaper via hardware improvements and power buildout, usage is going to climb exponentially as people go “eh, let the agent just run on autopilot, who cares if it takes 2MM tokens to do [simple task]”.

I think for the foreseeable future we should consider the rebound effect in this sector to be in full force and not expect any decreases in power usage for a long time.

everdrive•7m ago
>Nothing forces the rebound effect to dominate.

Human nature does. We're like a gas: we expand to fill the space we're in. If technology uses less power, in general we'll just use more of it until we hit whatever natural limits are present (usually cost or availability). I'm not sure I'm a proponent of usage taxes, but they definitely have the right idea; people will just keep doing more things until it becomes too expensive or they are otherwise restricted. The problem you run into is how the public reacts when "they" are trying to force a bunch of limitations on you that you didn't previously need to live with. It's politically impossible, even in a case where it's the right choice.

pwarner•38m ago
Hopefully the panic continues and we get a lot of extra electricity, ideally via nuclear, wind, solar - and then if AI is a flop at least we get big progress on global warming.
blain•25m ago
I thought you were going to say cheaper energy, but global warming works too.

Also, it's called climate change now.

wahnfrieden•13m ago
How does an urgent need for more energy use lead to overall cleaner energy? Won’t it also accelerate unclean energy use to saturation, even if additional clean sources are needed for capacity?
newsclues•35m ago
For humanity to continue increasing the quality of life for more people, more energy is required.
SketchySeaBeast•26m ago
At the risk of being called a luddite or a carriage driver, does the current iteration of AI actually increase quality of life that much?
bilekas•15m ago
> For humanity to continue increasing the quality of life for more people, more energy is required

I'm not 100% sure that's strictly true. We naturally assume, for the moment, that more energy = more quality.

It's like the Kardashev scale, which basically says you can't advance without consuming more and more energy. Is that a proven thing? Does the line always need to go up indefinitely?

sollewitt•33m ago
“You may not know about the issue but I bet you reckon something, so why not tell us what you reckon. Let us enjoy the full majesty of your uninformed ad-hoc reckon” - David Mitchell.
cph123•30m ago
"Let us enjoy the full majesty of your uninformed ad-hoc reckon, by going to bbc.co.uk… clicking on ‘what I reckon’ and then simply beating on the keyboard with your fists or head."
bobbyraduloff•32m ago
> But far from demanding more electricity personal computers have become more efficient with laptops mostly replacing large standalone boxes, and software improvements reducing waste.

If only it were true. I reckon we're using multiple orders of magnitude more compute per $ of business objective, simply because of the crazy abstractions. For example, I know of multiple small HFT firms that are crypto market makers with their trading bots in Python. Many banks in my country have Excel macros on top of SQL extensions on top of COBOL. We've not reduced waste in software but rather quite the opposite.

I don't think this is super relevant to the article's point, but I think it's an under-discussed topic.

kalleboo•17m ago
Excel has already added an =COPILOT() function. Imagine the waste of all those formulas that probably amount to some basic mathematical formula that could be run on a 386.
timeon•27m ago
Sorry for the off-topic question - is Substack competing with Medium on the number of pop-ups?
dheera•24m ago
Even if AI doesn't use more electricity, electric cars and clean energy flight will need it.
SketchySeaBeast•1m ago
That's my current, probably misguided hope. They couldn't justify getting the grid ready for electric vehicles, but frame it as a way to make a bunch of money and everyone's going to jump on board.

Of course, the fact that xAI is throwing up gas turbines at their data centres seems to indicate that clean energy isn't a given.

maerF0x0•24m ago
AI helped me fix my own car, no new parts, no driving to the stealership, no comfy lobby to light, no extra building to heat, no IT system to book me into...

It's my opinion that AI, like many technologies since the 1950s, will lead to more dematerialization of the economy, meaning it will net-net save electricity and be "greener".

This is an extension of what Steven Pinker says in Enlightenment Now.

jerf•23m ago
It's been a while, but I don't recall any of the dotcom startups making deals with nuclear energy companies to buy out entire nuclear power stations: https://www.npr.org/2024/09/20/nx-s1-5120581/three-mile-isla...

And that's just an example, there are many power-related deals of similar magnitude.

The companies building out capacity certainly believe that AI is going to use as much power as we are told. We are told this not on the basis of hypothetical speculation, but on the basis of billions of real dollars being spent on real power capacity for real data centers by real people who'd really rather keep the money in question. Previous hypotheses not backed by billions of dollars are not comparable predictions.

wheelerwj•19m ago
100% this.
kyledrake•10m ago
> The companies building out capacity certainly believe that AI is going to use as much power as we are told.

The same could be said of dark fiber laid during the dot-com boom, or unused railroads, etc. Spending during a boom is not indicative of properly recognized future demand for resources.

skybrian•8m ago
Yes, big bets tell us something but they are not a crystal ball. Some of the same companies hired lots of people post-pandemic and then reversed. People who control enormous amounts of money can make risky bets that turn out to be wrong.
afavour•18m ago
That's an awful lot of certainty for something that isn't backed by much at all, just "previous claims about inefficiency in tech have ended up being incorrect".

As a counterpoint: look at crypto. The amount of power used by cryptocurrency has _not_ gone down, in fact it's increased.

patapong•7m ago
While I don't disagree with your overall point, I don't think crypto is a good counterpoint here. Crypto is conditioned on using more and more energy to secure the network. As the value increases, more mining hardware can be thrown at it, which increases security and thus value - there is no upper bound.

AI, on the other hand, aims at both increased quality and reduced energy consumption. While there are certainly developments that favour the former at the cost of the latter (e.g. reasoning models), there are also indications that companies are finding ways to make the models more efficient while maintaining quality. For example, the moves from GPT-4 -> GPT-4-turbo and 4o -> 5 were speculated to be in the service of efficiency. Hopefully the market forces that make computing cheaper and more energy efficient will also push AI to become more energy efficient over time.

more_corn•18m ago
I don’t believe it
skybrian•17m ago
There are contrary trends: LLMs are getting lots of efficiency improvements, but they're being used more.

Which is more important? Understanding what happened so far is impossible without data, and those trends can change. It depends on what new technologies people invent, and there are lots of smart researchers out there.

Armchair reasoning isn't going to tell us which trend is more important in the long term. We can imagine scenarios, but we shouldn't be very confident about such predictions, and we should distrust other people's confidence.

stevenjgarner•17m ago
> Most of the increase could be fully offset if the world put an end to the incredible waste of electricity on cryptocurrency mining (currently 0.5 to 1 per cent of total world electricity consumption, and not normally counted in estimates of IT use).

I do not accept this. It was once true under Proof-of-Work (typically ~1,000–2,000 kWh per transaction), not so much under Proof-of-Stake (typically 0.03–0.05 kWh per transaction).

Note that proof-of-stake may actually have a lower energy footprint than credit card or fiat banking transactions. An IMF analysis [1] pegged core processing for credit card companies at ~0.04 kWh per transaction (based on data centers and settlement systems), but noted that including user payment means like physical cards and terminals could increase this by about two orders of magnitude—though even then, it doesn't extend to bank branches or employee overhead - an overhead not implicit in decentralized finance.

[1] https://www.elibrary.imf.org/view/journals/063/2022/006/arti...
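
For scale, a quick comparison in Python using only the per-transaction figures quoted above (the proof-of-work, proof-of-stake, and credit-card numbers come from the comment and the IMF analysis it cites; they are not independently verified here):

  # Per-transaction energy figures as quoted above (kWh per transaction).
  figures_kwh = {
      "Proof-of-work (quoted ~1,000-2,000)": 1500.0,
      "Proof-of-stake (quoted 0.03-0.05)": 0.04,
      "Credit card core processing (IMF)": 0.04,
  }

  baseline = figures_kwh["Proof-of-stake (quoted 0.03-0.05)"]
  for name, kwh in figures_kwh.items():
      print(f"{name:38s} {kwh:10.2f} kWh/tx  (~{kwh / baseline:,.0f}x proof-of-stake)")

  # Proof-of-work comes out tens of thousands of times above proof-of-stake,
  # which is the gap driving the disagreement with the article's claim.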

catlikesshrimp•16m ago
Datacenters didn't need water cooling before the AI explosion. (Air cooling was still possible)

At first, DW's estimate was that one drop of potable water was consumed for each query (normal queries, not the more expensive ones).

Then Google, I don't know who approved that level of candor, God bless them, released a first-hand analysis of their water consumption, and it is higher than the one-drop estimate: 5 drops.

https://services.google.com/fh/files/misc/measuring_the_envi...
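
A rough sanity check on the "drops" framing, assuming about 0.05 mL per drop (a common approximation, roughly 20 drops per millilitre) and taking the roughly quarter-millilitre-per-median-prompt figure from the linked Google paper:

  # Convert the reported per-prompt water use into "drops", under assumptions.
  ML_PER_PROMPT = 0.26   # approximate per-median-prompt figure from the linked paper
  ML_PER_DROP = 0.05     # assumed drop size; drops are not a precise unit

  drops = ML_PER_PROMPT / ML_PER_DROP
  print(f"~{drops:.1f} drops per median prompt")          # ~5.2 drops

  litres_per_billion = ML_PER_PROMPT * 1e9 / 1000
  print(f"~{litres_per_billion:,.0f} litres per billion prompts")   # ~260,000 litres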

josefritzishere•14m ago
This claim is based on the idea that the use of AI will plateau. I hope that is true. The alternatives are ominous.
runako•14m ago
OpenAI yesterday announced[1] a partnership to deploy computer chips, but chose to denominate the size of the deal in gigawatts (instead of dollars, or some measure of computing capacity, or some measure of capability). They certainly seem to think about this in terms of electricity requirements, and seem to think they require a lot of it.

(I may have the units off a bit, but it looks like OpenAI's recent announcement would consume a bit more than the total residential electricity usage of Seattle.)

1 - https://openai.com/index/openai-nvidia-systems-partnership/
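
A quick unit conversion, taking the announcement's widely reported headline of 10 GW at face value; the utilization factor and the national comparison are assumptions, and the point is only to show the scale implied when deals are denominated in gigawatts:

  # Convert a gigawatt-denominated buildout into annual energy, under assumptions.
  ANNOUNCED_GW = 10        # widely reported headline figure for the deal
  HOURS_PER_YEAR = 8760
  UTILIZATION = 0.7        # assumed average utilization of the installed capacity

  twh_per_year = ANNOUNCED_GW * HOURS_PER_YEAR * UTILIZATION / 1000
  print(f"~{twh_per_year:.0f} TWh per year")   # ~61 TWh/year under these assumptions

  # For context, total US electricity consumption is on the order of 4,000 TWh/year,
  # so this single buildout would be roughly 1-2 percent of it.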

Tycho•10m ago
What’s the energy profile of running inference in a typical ChatGPT prompt compared to:

  - doing a google search and loading a linked webpage
  - taking a photo with your smartphone and uploading it to social media for sharing
  - playing Fortnite for 20 minutes
  - hosting a Zoom conference with 15 people
  - sending an email to a hundred colleagues
I'd be curious. AI inference is massively centralised, so of course the data centres will be using a lot of energy, but less centralised use cases may be less power efficient from a holistic perspective.
js8•7m ago
I found this video https://youtu.be/IQvREfKsVXM interesting, especially because it mentions a couple of AI studies/papers that argue in favor of much smaller (and more efficient) models. (And I had never heard of them before.)

I suspect that yes, for AGI much smaller models will eventually prove to be sufficient. I think in 20 years everyone will have an AI agent in their phone, busily exchanging helpful information with other AI agents of people who you trust.

I think the biggest problem with tech companies is they effectively enclosed and privatized the social graph. I think it should be public, i.e. one shouldn't have to go through a 3rd party to make an inquiry for how much someone trusts a given source of information, or where the given piece of information originated. (There is more to be written about that topic but it's only marginally related to AI.)

j45•6m ago
Microsoft: https://finance.yahoo.com/news/microsoft-goes-nuclear-bigges...

Google: https://interestingengineering.com/energy/google-gen4-nuclea...

Amazon: https://techcrunch.com/2024/10/16/amazon-jumps-on-nuclear-po...

OpenAI/Sam Altman: https://interestingengineering.com/energy/oklo-to-generate-1...

More: https://www.technologyreview.com/2025/05/20/1116339/ai-nucle...

cratermoon•4m ago
This article was written over a year ago. How have the author's assessments worked out?
