I don't think I live in the same world as the author. Ever since the emergence of the Internet, "stuff related to IT" has been using more and more energy.
It's like saying "5G won't use as much electricity as we are told! In fact 5G is more efficient than 4G". Yep, except that 5G lets us move a lot more data, and so we end up using more electricity overall.
It's called the rebound effect.
It's not like the majority of electricity use by computers is complete waste.
You can pooh-pooh this and say "I don't want to live in the digital world; I want to spend more time flying around the world to work with people in person or actually see my mom, or buy physical paper in a store it was shipped to, write physical words on it, and have the USPS physically carry it", but that's just wildly, almost unfathomably, less efficient.
If Google didn't exist, who knows how many more books I'd need to own, how much time I'd spend buying those books, how much energy I'd spend going to the stores to pick them up, or having them shipped.
It's almost certainly a lot less than how much energy I spend using Google.
While we all like to think that Facebook is a complete waste of time, what would you be spending your time doing otherwise? Probably something that requires more energy than the close-to-nothing it takes to look at memes on your phone.
Not to mention, presumably, at least some people are getting some value from even the most wasteful pits of the Internet.
Not everything is Bitcoin.
1: https://www1.eere.energy.gov/buildings/publications/pdfs/cor...
2: https://www.eia.gov/energyexplained/use-of-energy/industry.p...
That has nothing to do with how much energy is spent on Google and the Internet vs how many more people there are, and how much more stuff the average person in developing economies has.
I can easily agree that phones with internet capabilities use more energy, as a whole, than those without. The infrastructure needs were very different. But, especially if you are comparing to 4G, much of that infrastructure already had to exist to distribute the content that was driving the extra use.
I would think this is like cars. If you had taken estimates of how much pollution vehicles produced 40 years ago and assumed that per-car figure would stay constant even as the number of cars went up, you'd probably assume we are living in the worst air imaginable. Instead, even gas cars got far better as time went on.
Doesn't mean the problem went away, of course. And some sources of the pollution, like tires, did get worse as a share of the total as we scaled up. Hopefully we can find ways to make that better, as well.
"Yet throughout this period, the actual share of electricity use accounted for by the IT sector has hovered between 1 and 2 per cent, accounting for less than 1 per cent of global greenhouse gas emissions."
Similarly, nothing forces AI or 5G to use more power than whatever you would have done instead. You can stream films via 5G that you might not have done via 4G, but you might've streamed them via WLAN or perhaps a cat5 cable instead. The rebound effect doesn't force 5G to use more power than WLAN/GbE. Or more power than driving to a cinema, if you want to compare really widely. The film you stream is what makes the comparison meaningful, no?
Am I missing something, or has the need for vast GPU horsepower been solved? Those requirements were not in data centres before, and they're only going up. Whichever way you look at it, there's got to be an increase in power consumption somewhere, no?
You can pick and choose your comparisons, and make an increase appear or not.
Take weather forecasts as an example. Weather forecasting uses massively powerful computers today. If you compare that forecasting with the lack of forecasts two hundred years ago there obviously is an increase in power usage (no electricity was used then) or there obviously isn't (today's result is something we didn't have then, so it would be an apples-to-nothing comparison).
If you say "the GPUs are using power now that they weren't using before" you're implicitly doing the former kind of comparison. Which is obviously correct or obviously wrong ;)
AI is still very near the beginning of the optimization process. We're still using (relatively) general-purpose processors to run it. Dedicated accelerators are beginning to appear. Many software optimizations will be found. FPGAs and ASICs will be designed and fabbed. Process nodes will continue to shrink. Moore's law will continue to decrease costs exponentially over time, as it has for all other workloads.
There's absolutely no guarantee of this. The continuation of Moore's law is far from certain (NVIDIA think it's dead already).
Note how many people pay for the $200/month plans from Anthropic, OAI etc. and still hit limits because they constantly spend $8000 worth of tokens letting the agents burn and churn. It’s pretty obvious that as compute gets cheaper via hardware improvements and power buildout, usage is going to climb exponentially as people go “eh, let the agent just run on autopilot, who cares if it takes 2MM tokens to do [simple task]”.
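Rough back-of-envelope of how that autopilot burn adds up. All of the prices and usage figures below are assumptions for illustration, not Anthropic or OpenAI list prices:

    # Illustrative only: assumed token counts, prices, and task volume.
    tokens_per_task = 2_000_000      # the "2MM tokens per simple task" figure above
    blended_price_per_1m = 4.00      # assumed blended $/1M tokens (input + output)
    tasks_per_day = 33               # assumed: agents left running largely unattended

    cost_per_task = tokens_per_task / 1_000_000 * blended_price_per_1m
    monthly_cost = cost_per_task * tasks_per_day * 30
    print(f"~${cost_per_task:.0f} per task, ~${monthly_cost:,.0f}/month at API rates")
    # ~$8 per task and roughly $7,900/month at these assumed numbers, which is why
    # a flat $200/month plan with rate limits looks like a bargain, and why usage climbs.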
I think for the foreseeable future we should consider the rebound effect in this sector to be in full force and not expect any decreases in power usage for a long time.
Human nature does. We're like a gas: we expand to fill the space we're in. If technology uses less power, in general we'll just use more of it until we hit whatever natural limits are present (usually cost, or availability). I'm not sure I'm a proponent of usage taxes, but they definitely have the right idea; people will just keep doing more things until it becomes too expensive or they are otherwise restricted. The problem you run into is how the public reacts when "they" are trying to force a bunch of limitations on you that you didn't previously need to live with. It's politically impossible, even in a case where it's the right choice.
Also, it's called climate change now.
I'm not 100% sure that's strictly true. We naturally assume, for the moment, that more energy = more quality.
It's like the Kardashev scale, which basically says you can't advance without consuming more and more energy. Is this a proven thing? Does the line always need to go up indefinitely?
If only that were true. I reckon we're using multiple orders of magnitude more compute per $ of business objective, simply because of the crazy abstractions. For example, I know of multiple small HFT firms that are crypto market makers running their trading bots in Python. Many banks in my country have Excel macros on top of SQL extensions on top of COBOL. We've not reduced waste in software but rather quite the opposite.
I don't think this is super relevant to the article's point, but I think it's an under-discussed topic.
Of course, the fact that xAI is throwing up gas turbines at their data centres seems to indicate that clean energy isn't a given.
It's my opinion that AI, like many technologies since the 1950s, will lead to further dematerialization of the economy, meaning it will net-net save electricity and be "greener".
This is an extension of what Steven Pinker argues in Enlightenment Now.
And that's just an example, there are many power-related deals of similar magnitude.
The companies building out capacity certainly believe that AI is going to use as much power as we are told. We are told this not on the basis of hypothetical speculation, but on the basis of billions of real dollars being spent on real power capacity for real data centers by real people who'd really rather keep the money in question. Previous hypotheses not backed by billions of dollars are not comparable predictions.
The same could be said of dark fiber laid during the dot com boom, or unused railroads, etc. Spending during a boom is not indicative of properly recognized future demand of resources.
There are new commitments.
Microsoft: https://finance.yahoo.com/news/microsoft-goes-nuclear-bigges...
Google: https://interestingengineering.com/energy/google-gen4-nuclea...
Amazon: https://techcrunch.com/2024/10/16/amazon-jumps-on-nuclear-po...
OpenAI/Sam Altman: https://interestingengineering.com/energy/oklo-to-generate-1...
More: https://www.technologyreview.com/2025/05/20/1116339/ai-nucle...
As a counterpoint: look at crypto. The amount of power used by cryptocurrency has _not_ gone down, in fact it's increased.
AI, on the other hand, aims at both increased quality and reduced energy consumption. While there are certainly developments that favour the former at the cost of the latter (e.g. reasoning models), there are also indications that companies are finding ways to make the models more efficient while maintaining quality. For example, the moves from GPT-4 to GPT-4-turbo and from 4o to 5 were speculated to be in the service of efficiency. Hopefully the market forces that make computing cheaper and more energy-efficient will also push AI to become more energy-efficient over time.
Which is more important? Understanding what happened so far is impossible without data, and those trends can change. It depends on what new technologies people invent, and there are lots of smart researchers out there.
Armchair reasoning isn't going to tell us which trend is more important in the long term. We can imagine scenarios, but we shouldn't be very confident about such predictions, and we should distrust other people's confidence.
I do not accept this. It was once true under Proof-of-Work (typically ~1,000–2,000 kWh per transaction), not so much under Proof-of-Stake (typically 0.03–0.05 kWh per transaction).
Note that proof-of-stake may actually have a lower energy footprint than credit card or fiat banking transactions. An IMF analysis [1] pegged core processing for credit card companies at ~0.04 kWh per transaction (based on data centers and settlement systems), but noted that including user payment instruments like physical cards and terminals could increase this by about two orders of magnitude. Even then, it doesn't extend to bank branches or employee overhead, an overhead not implicit in decentralized finance.
[1] https://www.elibrary.imf.org/view/journals/063/2022/006/arti...
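To put those orders of magnitude side by side, a quick sketch using the per-transaction figures quoted above and in the IMF note (all of them rough estimates, not measurements):

    # Rough per-transaction energy estimates (kWh), as quoted above.
    estimates_kwh = {
        "Bitcoin (proof-of-work)":       1_500.0,   # midpoint of the ~1,000-2,000 range
        "Ethereum (proof-of-stake)":     0.04,      # midpoint of the ~0.03-0.05 range
        "Credit card (core processing)": 0.04,      # IMF figure, excludes cards/terminals
    }

    baseline = estimates_kwh["Credit card (core processing)"]
    for name, kwh in estimates_kwh.items():
        print(f"{name:32s} {kwh:10,.2f} kWh  ({kwh / baseline:,.0f}x card core processing)")

At these numbers, proof-of-work comes out tens of thousands of times more energy-hungry per transaction than either proof-of-stake or card core processing, which is the whole point of the comparison.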
At first, DW's estimate was that one drop of potable water was consumed for each query (normal queries, not the more expensive ones).
Then Google (I don't know who signed off on that much candour, God bless them) released a first-hand analysis of their water consumption, and it is higher than the one-drop estimate: about 5 drops per query.
https://services.google.com/fh/files/misc/measuring_the_envi...
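For a sense of what "drops per query" means in aggregate, a small conversion sketch. The drop size and the daily query volume below are assumptions; the linked Google paper gives its per-prompt figure in millilitres:

    # Convert "drops per query" into aggregate volume. Assumptions only.
    ml_per_drop = 0.05                 # a water drop is roughly 0.05 mL
    drops_per_query = 5                # the "5 drops" figure above (~0.25 mL per query)
    queries_per_day = 1_000_000_000    # assumed: one billion queries per day

    litres_per_day = drops_per_query * ml_per_drop * queries_per_day / 1000
    print(f"{litres_per_day:,.0f} litres/day")   # 250,000 L/day, about 250 cubic metres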
(I may have the units off a bit, but it looks like OpenAI's recent announcement would consume a bit more than the total residential electricity usage of Seattle.)
1 - https://openai.com/index/openai-nvidia-systems-partnership/
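If anyone wants to check that back-of-envelope, the conversion itself is simple; the capacity figure, capacity factor, and city consumption number below are all assumptions to be swapped for real data:

    # Sketch: announced data-centre capacity (GW) -> annual energy (TWh),
    # compared with a city's residential consumption. All inputs are assumptions.
    announced_gw = 10.0          # headline "10 GW of systems" figure, assumed fully built
    capacity_factor = 0.5        # assumed average utilisation of that capacity
    hours_per_year = 8_760

    annual_twh = announced_gw * capacity_factor * hours_per_year / 1000
    city_residential_twh = 3.0   # rough placeholder for a Seattle-sized city; check EIA data
    print(f"{annual_twh:.1f} TWh/yr vs ~{city_residential_twh:.1f} TWh/yr residential")
    # 43.8 TWh/yr at these assumptions; the answer is extremely sensitive to how much
    # of the announced capacity actually gets built and how hard it runs.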
- doing a google search and loading a linked webpage
- taking a photo with your smartphone and uploading it to social media for sharing
- playing Fortnite for 20 minutes
- hosting a Zoom conference with 15 people
- sending an email to a hundred colleagues
I'd be curious. AI inference is massively centralised, so of course the data centres will be using a lot of energy, but less centralised use cases may be less power-efficient from a holistic perspective.

AI energy use is negligible compared with other everyday activities. This is a great article on the subject:
https://andymasley.substack.com/p/a-cheat-sheet-for-conversa...
The same author has published a series of articles that go into a lot of depth when it comes to AI energy and water use:
I suspect that yes, for AGI much smaller models will eventually prove to be sufficient. I think in 20 years everyone will have an AI agent in their phone, busily exchanging helpful information with other AI agents of people who you trust.
I think the biggest problem with tech companies is they effectively enclosed and privatized the social graph. I think it should be public, i.e. one shouldn't have to go through a 3rd party to make an inquiry for how much someone trusts a given source of information, or where the given piece of information originated. (There is more to be written about that topic but it's only marginally related to AI.)
It seems like a lot of the hyperbolic angles are looking at this as a constant draw of power over time. There is no reason for a GPU inference farm to be ramped up to 100% clock speed when all of its users are in bed. The 5700XT in my computer is probably pulling a mere 8~12W right now since it is just sitting on an idle desktop. A hyperscaler could easily power down entire racks based upon anticipated demand and turn that into 0W.
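As a toy illustration of how much load-following matters, here's a sketch with made-up wattages and a made-up demand profile (nothing here is measured hyperscaler data):

    # Toy model: energy from racks that idle vs racks powered off when demand drops.
    # All wattages and the utilisation profile are illustrative assumptions.
    racks = 100
    watts_full = 40_000           # assumed draw of a dense GPU rack at full load
    watts_idle = 8_000            # assumed draw of the same rack idling

    # Assumed fraction of racks needed in each 6-hour block: night, morning, peak, evening.
    demand_profile = [0.2, 0.6, 1.0, 0.7]

    def daily_kwh(idle_watts):
        total_wh = 0.0
        for utilisation in demand_profile:
            active = racks * utilisation
            idle = racks - active
            total_wh += 6 * (active * watts_full + idle * idle_watts)
        return total_wh / 1000

    always_idling = daily_kwh(watts_idle)   # idle racks keep drawing power
    powered_down = daily_kwh(0)             # idle racks switched off entirely
    print(f"{always_idling:,.0f} kWh/day idling vs {powered_down:,.0f} kWh/day powering down")
    # ~67,200 vs ~60,000 kWh/day here, i.e. roughly a 10% saving even with fairly
    # generous idle draw, before any clock/voltage scaling on the active racks.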
Isn't this space a bit too fast-moving to be submitting year-old posts on it?
Plenty of grid-draining articles since:
Electricity prices are climbing more than twice as fast as inflation
https://news.ycombinator.com/item?id=44931763
Big Tech's A.I. Data Centers Are Driving Up Electricity Bills for Everyone
https://news.ycombinator.com/item?id=44905595
The U.S. grid is so weak, the AI race may be over
https://news.ycombinator.com/item?id=44910562
And nuclear ambitions:
Microsoft doubles down on small modular reactors and fusion energy
https://news.ycombinator.com/item?id=45172609
Google to back three new nuclear projects
https://news.ycombinator.com/item?id=43925982
Google commits to buying power generated by nuclear-energy startup Kairos Power
https://news.ycombinator.com/item?id=41840769
Three Mile Island nuclear plant restart in Microsoft AI power deal
https://news.ycombinator.com/item?id=41601443
Amazon buys stake in nuclear energy developer in push to power data centres
The electricity spend on AI datacenters won't be uniformly distributed. It will probably concentrate in areas that currently have cheaper (and dirtier) electricity, like what xAI is doing in Tennessee.
That will likely drive up local energy prices in those places, which will be further exacerbated by the US's disinvestment in renewable energy and resulting increased reliance on high cost fossil fuels.
Oh, that's not a good example of the point they're trying to make. The emissions from concrete are a point of major concern and are frequently discussed. A ton of effort is being put into trying to reduce the problem, and there are widespread calls to reduce the use of the material as much as possible.
> At the other end of the policy spectrum, advocates of “degrowth” don’t want to concede that the explosive growth of the information economy is sustainable, unlike the industrial economy of the 20th century.
This seems to imply we all must agree that the industrial economy of the 20th century was sustainable, and that strikes me as an odd point of agreement to try to make. Isn't it just sidestepping the whole point?