I really dislike the "people should be better and use less energy" argument for solving macro-problems like this
> My naive optimism led me to believe that technology would help us fight climate change. I was wrong: AI and Crypto are net negatives in this regard.
...why? why would technology that specifically requires a lot of energy help "fight climate change"?
this entire article misses the point for me -- it's very vague and speaks in generalities.
> “If you’re going to use generative tools powered by large language models, don’t pretend you don’t know how your sausage is made.”
why? if you generate code with a LLM, then read and deeply understand the code, what's wrong?
> I can’t help but feel the web would be a better place if the technology had never existed in the first place.
if we didn't invent the wheel, we wouldn't have so many cars/trucks polluting the planet
There's no good reason it would. Nevertheless, proponents of both AI and crypto have claimed multiple times it would. So I think it is quite fair to bring it up.
I also dislike energy frugality arguments. They come from a Luddite place. A civilization is defined by its energy usage. Let's be an advanced civilization.
... get Bitcoin to do 3% of what ANY PoS blockchain could do.
Yes, of course AI uses a lot of energy. But we have to give it a bit of time to see if there are benefits that come with this cost. I think there will be. Whether the tradeoff was worthwhile, I think we are not even close to being able to conclude.
Something like social media, which has a good long while behind it, is a case where I could accept starting to close the book on the plus-minus.
https://www.science.org/content/article/ancient-romans-likel...
> Global energy consumption in 2023 was around 500,000 GWh per day. That means that ChatGPT’s global 3GWh per day is using 0.0006% of Earth’s energy demand.
https://andymasley.substack.com/p/a-cheat-sheet-for-conversa...
(to be clear: I'm not vegan. but most people, incl. myself, could be quite easily, and it would save several orders of magnitude more energy and water)
But the environmental aspects aren't even my main concern: my main concern is where that energy is coming from - and how the idea of Three Mile Island being restarted solely for Microsoft's benefit marks a huge shift (that no-one seems to have noticed) towards infrastructure that was previously at least nominally for public use being reserved for the use of a private corporation.
I'm also annoyed by the kleptocratic attitude towards training data, but I'll happily accept that reasonable minds may differ on that subject.
So while I don't use LLMs myself for ethical reasons, I don't have a problem with other people choosing to use them.
I suspect nearly all of it. You framed it such that it appears insignificant, but with this framing nothing is significant.
Also, though this is not really relevant to your major incorrect point, airlines aren't part of civil aviation.
if you are actually worried about your ChatGPT energy usage, skip a hot shower or spend a few hours less playing Cyberpunk 2077 and a few more hours reading an old book.
I suspect at a "sensible" breakdown (trying to avoid the "How long is my coastline?" problem), which is presumably something akin to Zipfian, the main uses will actually account for most energy usage.
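To make the Zipfian intuition concrete, here's a toy calculation (the number of distinct "uses" and the 1/rank falloff are assumptions, not data):

```python
# Toy illustration: if the energy shares of distinct "uses" fall off
# like 1/rank (Zipf-like), the head of the distribution dominates.
N = 1000                               # assumed number of distinct uses
weights = [1 / k for k in range(1, N + 1)]
top10_share = sum(weights[:10]) / sum(weights)
print(round(top10_share, 2))           # ~0.39: the top 10 uses carry ~39% of the total
```

Under that (purely illustrative) distribution, a handful of head uses account for most of the total, which is the point about sensible breakdowns.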
If anyone can find a breakdown at the level of granularity that "all deep learning" would make sense, let me know: so far my Googling has led primarily to either detailed breakdowns of energy _sources_, or high-level breakdowns of use at the level of "industry", "agriculture", or "residential use".
The Black Death alone was 25-50 million people in 7 years
Does it?
https://prospect.org/environment/2024-09-27-water-not-the-pr...
> training GPT-3 used as much water as just under twice what is required for the average American’s beef consumption. In other words, just two people swearing off beef would more than compensate.
https://andymasley.substack.com/p/a-cheat-sheet-for-conversa...
> It would be a sad meaningless distraction for people who care about the climate to freak out about how often they use Google search. Imagine what your reaction would be to someone telling you they did ten Google searches. You should have the same reaction to someone telling you they prompted ChatGPT.
That beef consumption and other activities also require massive amounts of resources is independent. Both can be true. And both need to be addressed. We no longer have the luxury to pick and choose which climate interventions we find most convenient; we need to succeed on all fronts. If the patient is dehydrated and bleeding, do doctors sit around debating if they should give them water or gauze? No, they do both immediately.
Regardless, I’m not convinced by the parent comment. It basically suggests we solve this without analysis of costs and effects. Sounds insane to me.
Especially since the most promising solutions come from rapidly developing technologies and research. Turning off the lights just sends us backwards.
What solutions are you referring to?
This sentiment baffles me when I see it. It's not like we learned we had this problem yesterday and now have to guess at how we might solve it.
For 35 years, "rapidly developing technologies and research" hasn't improved the numbers. I understand that there's a regular stream of news which touts one thing or another as a possible solution, but which one of them has done anything? How long do we wait for this techno-optimism to bear fruit?
The latter is outpacing expectations.
And it’s not just engineering and manufacturing. These things have to be installed, maintained etc. In different types of places, which means you need specialized knowledge for all sorts of things.
On the research side there has been much progress on climate and weather models.
Software also plays a role in all of these, and material science and…
I’m not a techno optimist, but rather see these developments as incredibly necessary. Yes they should have happened yesterday. Yes I’m incredibly frustrated with political leadership all around the world and especially with outdated economic models.
But the only way is forward if it’s not too late.
I’ve heard the answers pre-AI, but I wonder how this new general purpose use of electricity changes the calculus?
Datacenters used for training or batch jobs can be placed where water and power are plentiful, ambient temperatures are relatively low, and government/society are stable.
A datacenter is going to be built at the bottom of New Zealand, where all of these things hold. There must be plenty of other places in the world like it.
At least for training, I think it is possible to have our cake and eat it. For real-time inference, probably not.
There are many datacenters being built in Finland. Water is not running out, and one of the cheapest electricity prices in Europe thanks to plentiful wind power.
We might need to lay a few more cables to New Zealand though.
What I'm saying is that casual LLM use is also essentially trivial, and scolding people for querying ChatGPT when you wouldn't scold them for, say, making a dozen Google searches, watching a Netflix show, taking a long shower, or driving to the mall, is quite silly and a waste of time.
Similarly, there's a reason why doctors will tell you to worry more about your cigarette habit than almost any other bad habit you have: it's so much more likely to kill you that it practically demands to be prioritized.
I returned to that place to visit family and friends last year. It was an eye opening experience. The people who live there have taken a keen interest in curtailing the development of any new data centers. One of the chief complaints is the constant power issues that didn't exist before 2021-2022.
The locals argue that this is the result of the dramatic infusion of AI into every technology product. They're likely not wrong. The communities in the area became quite politically active over the issue and have retained all manner of analysts and scientists, journalists and investigators, and so on, to aid them in making a political case against future data center development.
The thing that got me was the complaints about the noise. Over the past few years the locals and those in their employ have been monitoring noise levels near the data centers and they’ve tracked sustained increases in noise pollution with the timeline over which the use of AI has exploded. There it has become sort of a trope to measure the working day by how much noise pollution is produced by any data center nearby. Mostly belonging to a hyperscaler known by an acronym.
The data centers have reportedly not been the boon that was promised, which seems to be seen as insult to injury. The area already has a vibrant tech scene independent of data center operations. So the locals don’t really see value in allowing more data centers to be built, and they’re starting to organize politically around the idea of preventing future data center construction and implementing heavy usage based taxation on utilities used by the existing ones.
Two average Americans' beef consumption would place the water cost of training GPT-3 at about 1/150,000,000 of the US beef industry's average water consumption. That means it's a rounding error relative to all the other uses we place on water, and pointing that out is not whataboutism, it's putting the problem in its proper scale.
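Spelling that fraction out (the population figure is a rough assumption for scale, not a sourced statistic):

```python
# "Just under twice" one American's annual beef water use went into training
# GPT-3, per the quote upthread; compare against a rough US consumer base.
gpt3_water_in_person_beef_years = 2
us_beef_consumers = 300_000_000          # rough assumption for scale
fraction = gpt3_water_in_person_beef_years / us_beef_consumers
print(f"{fraction:.1e}")                 # 6.7e-09, i.e. about 1 part in 150 million
```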
You use a laptop, you probably play some video games, you see movies, you probably eat meat, you probably drive a car around town instead of a velomobile.
So the question is, how much energy is used by AI and what does it get us? Google's latest TPUs consume ~250W per fp8-petaflop at the data-center level (ie including cooling etc, according to their recent presentations). Even assuming 50% utilization which is a bit poor, that's say 0.5kW per fp8-petaflop, or ~1kW per 10^15 dense-parameter-equivalent tokens per second. So using a huge model, say 10T dense-equivalent parameters, and doing inference at the rate of o3-mini or Gemini 2.5 Pro (around 150 tokens a second) consumes 1.5kW during generation time. But maybe a better way to think about it is just that you'd be using 10J of energy per token, or ~15J of energy per word.
In that context then, generating a book the length of Fellowship of the Ring (~180k words) would consume ~0.75kWh of energy. That's about like playing a video game on a PS5 for ~3 hours. In the US, that's on average ~270g of CO2, like driving ~1.5 miles or so in a Prius. It'd be about like riding ~30 miles on an e-bike.
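A quick back-of-envelope check of those figures, using only the inputs stated above (the grid-intensity number is an assumed US average):

```python
# Verify the FOTR-length-book estimate from the per-token energy figure.
J_PER_TOKEN = 10
TOKENS_PER_WORD = 1.5                   # so ~15 J per word, as stated
WORDS_FOTR = 180_000                    # rough length of Fellowship of the Ring
energy_kwh = J_PER_TOKEN * TOKENS_PER_WORD * WORDS_FOTR / 3.6e6   # 1 kWh = 3.6 MJ
print(round(energy_kwh, 2))             # 0.75 kWh
# Assumed US average grid intensity ~0.36 kg CO2 per kWh:
print(round(energy_kwh * 360))          # 270 g CO2
```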
Another way to think of it might be that if you are texting with a friend, wherein lets say you both aggressively type at ~120wpm, you'd be using ~15W of power to chat with an AI at a similar pace, about the power draw of your friend's Macbook Air he would be typing on. That's around 0.5-0.7 miles on an ebike per hour of chatting.
So even compared to your typical leisure activities, chatting with an AI is comparable in resource use to whatever else you'd most likely be doing in that time. And of course, a human in the US on average generates ~1.5kg of CO2 per hour they're alive. In Bolivia it's ~150g/hr, and in Sudan it's ~50g/hr or so. So if you're a company that wants to be climate conscious, merely hiring someone who is roughly a subsistence farmer, such that their standard of living rises enough to emit the CO2 of the average Bolivian, would mean they emit more carbon in a typical day than running an AI to produce as many words as are in the entire Game of Thrones book series to date. And if you were going to help someone immigrate from Bolivia to the US, you could have the AI write ~10 Songs of Ice and Fire a day for the same net CO2 output.
Not to say that we shouldn't do those things because of climate or whatever, but I'm just saying that the objective energy and climate impact of using these models for things is small compared to basically any alternative for completing a task and most entertainment activities people do (movie theaters for instance use ~7kW or so while running, so in the time you spend watching the Fellowship of the Ring in a theater, an AI could write a text as long as FOTR 28 times).
Water usage is also a nuanced concept because water isn’t destroyed when we use it. You have to consider the source and incidentals of processing it, as well as what alternatives it’s taking away from. None of this fits into convenient quotes though.
I think energy usage is a much better metric because we can quantify the impacts of energy usage by considering the blended sum of the sources going into the grid.
But let's look at what has happened with Grok, for example:
From May 6, 2025
https://www.yahoo.com/news/elon-musk-xai-memphis-35-14321739...
>The company has no Clean Air Act permits.
> In just 11 months since the company arrived in Memphis, xAI has become one of Shelby County's largest emitters of smog-producing nitrogen oxides, according to calculations by environmental groups whose data has been reviewed by POLITICO's E&E News. The plant is in an area whose air is already considered unhealthy due to smog.
> The turbines spew nitrogen oxides, also known as NOx, at an estimated rate of 1,200 to 2,000 tons a year — far more than the gas-fired power plant across the street or the oil refinery down the road.
The details are in the specifics here. People are _already_ feeling the effects of the AI race, the consequences just aren't evenly distributed.
And if we look at the "clean" nuclear deals to power these data centers:
https://www.reuters.com/business/energy/us-regulators-reject...
> The Talen agreement, however, would divert large amounts of power currently supplying the regional grid, which FERC said raised concerns about how that loss of supply would affect power bills and reliability. It was also unclear how transmission and distribution upgrades would be paid for.
The scale of environmental / social impacts comes down to how aggressive the AI race gets.
The whole issue of "using" water is meaningless to me in the context of the water cycle. Does a data center "use" water? Whatever water evaporates from their cooling systems falls again as rain and becomes someone else's water. Same with farming: it all either evaporates (sometimes frustratingly right from the field it was applied to, or from the surface of the river it eventually runs into, or from the food you bite into, or from your sweat, or from your excretions/the sewage system/rivers again), or ends up supplementing a (typically badly depleted) aquifer, or gets temporarily used by animals, including humans, to e.g. hydrolyze fats (though full metabolism of those fats and fatty acids actually returns MORE water on a net basis than it took), and so on.
In short, water is never passing into anything or anyone. It's passing through it. You don't own it, you're just borrowing it.
Even water recirculated as a coolant in a data center (the closest thing to actually "using" water) is a finite quantity, needed only one time, with maybe small top-ups due to losses, all of which end up, you guessed it, evaporating into the commons.
Some places rely on deep aquifers that don't refill on human timescales, essentially fossil water. But that's mostly a local problem, and we should simply stop building water-hungry industry and agriculture in those places because it's stupid.
Trained on all the things/ideas of the past and creating a larger barrier of entry for new ideas.
Thinking about new computer languages, who would use one that AI couldn’t help you code in?
This isn't a very hopeful quote given how many people continue to eat sausages even though we all know how sausages are made.
I'm doubtful. There's a few documentaries showing the process and I've had people tell me multiple times that they were genuinely shocked by it. I'm assuming those people "knew" and it's just that knowing "it's all the waste at the butcher getting stuffed into guts" is not quite the same as seeing it first hand.
Natural sausage casings are specialty items. If you're buying it at the grocery store, it's probably collagen (closely related to gelatin).
And it's not "all the waste". It includes fatty cuts that people wouldn't want to eat whole, but it doesn't include organ meats outside of specialty items.
Perhaps people find meat-grinding distressing, though it's really not all that different from ground beef. The emulsified filling of hot dogs and bologna looks odd, but the ingredients are inoffensive.
I'm less disturbed by sausage-making than by the slaughter and prime butchering of animals. They're no less dead and dismembered if you're eating a steak or pot roast. I'd rather we at least make use of all of the other parts.
Then again, some people find seeing a huge industrial room full of raw meat distressing, perhaps somewhat analogously to how they might not be afraid of one spider but suddenly panic upon seeing hundreds in one spot.
Hunger is an unforgiving teacher, and the reality that our forebears grew expert at squeezing every last calorie out of everything in sight might afford a lesson.
Surely, like everything else in tech, this too shall pass. I expect power requirements to fall, since there are no doubt strong incentives to reduce them.
> My naive optimism led me to believe that technology would help us fight climate change.
Yeah, proceeding full steam ahead with planet destruction and praying tech will save us is kind of naive. You're not alone though.
> There are also ethical concerns regarding the methods used to obtain data for training AI models.
This one hasn't registered as problematic for me. Maybe I'm unethical?
> Content creators can't even determine which parts of their work were used to train the model…
"Content creators" don't develop their style from a vacuum either. I'm not conflicted about this one either.
> LLMs are also contaminating the web with generic content produced without care or thought.
The web has been contaminated ever since SEO. Maybe AI will kill the web. So it goes.
This sounds extremely naive to my ears. Performance demands currently outpace resource utilization reductions and it's not clear that we're at the point where that will change soon. Also: https://en.wikipedia.org/wiki/Jevons_paradox
> > There are also ethical concerns regarding the methods used to obtain data for training AI models.
> This one hasn't registered as problematic for me. Maybe I'm unethical?
Yeah, the real ethical concern comes from the explosion of pernicious slop output that is utterly destroying everything good about the internet and media, not the training.
The problem is it was falling apart before generative AI. This AI has just hastened its demise.
AI might make things 1-2% worse, but we weren't suffering from a lack of legitimate content or media. We were suffering from centralized control and walled gardens force feeding us slop in the name of profit, the total enshittification of the internet without regard for the damage done - tragedy of the commons, maybe.
AI could also make things better, because it's no longer worth slogging through the bullshit with search - I'll have AI do my searching for me, or follow its recommendation directly, and never interact with any of the monetized bullshit the search engine tries to pawn off.
SEO content will have to get through ever more competent and discerning AI in order to get eyeballs. That eliminates a broad class of low effort trash, directly demonetizing bad faith actors.
So now you have content creators producing more plausible but mediocre content. However, OAI and other AI providers will have direct access to their logs: if they register someone as having produced bulk trash content, they can shut down the account and report them to authorities or other companies if there's some sort of fraud or illicit behavior going on.
There are competing pressures, and with Google also losing its monopoly on adtech, maybe we'll see companies forced to compete on quality products instead of exploitation of user data, all the while seeing the general quality of the internet improve.
Or maybe AI will just accelerate the race to the bottom and the internet's dead already.
This statement kinda blows my mind. You contemplate the destruction of the web, the single most world-changing invention of the last century, possibly the last millennium, a force that has democratized access to information, accelerated scientific progress and connected people around the world, the technology that one way or another played a huge role in the lives of every person on this site, and you shrug?
I see your Kurt Vonnegut and raise you Dylan Thomas: Rage, rage against the dying of the light.
It may be age related too. I had lived some 30-plus years before the Web became a thing ... and I kind of liked the 70's and 80's.
Halving power requirements is only a net gain if demand isn't doubled. Unfortunately, using less power means cheaper service, so demand will increase, and not necessarily smoothly. If there is a massive use case for which AI is currently too expensive, simply moving under that threshold could decuple demand -- a small efficiency gain leading to a giant spike in total energy use.
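A minimal sketch of that rebound effect, assuming constant-elasticity demand (the elasticity value is purely illustrative):

```python
# Jevons-style rebound: cheaper queries -> more queries, scaling as
# cost ** elasticity. With elasticity < -1, total energy use rises.
def total_energy(energy_per_query, elasticity=-1.5):
    queries = energy_per_query ** elasticity
    return queries * energy_per_query

rebound = total_energy(0.5) / total_energy(1.0)
print(round(rebound, 2))   # 1.41: halving per-query energy *raises* total use ~41%
```

With an assumed elasticity of -1.5, the efficiency gain more than pays for itself in demand, which is the Jevons paradox in miniature.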
Not sure how anyone can with certainty say that AI will be a net negative in the long run for climate change. Logic I think says the opposite.
If you generate a terawatt hour of electricity with natural gas, most of the cost will be from the fuel. A nuclear plant will have a tiny fraction of the cost come from fuel for the same amount of energy. A solar farm will have none of the cost come from fuel.
If AI lowers construction costs, it will improve the relative economics of non-fossil energy compared to fossil energy. A natural gas plant constructed at half the cost will have its final energy cost decrease just a little whereas a half-as-expensive solar farm will have its final energy cost decrease nearly by half. Making clean energy cheaper than fossils means that it will out-compete dirty energy even in locations where there are no explicit policies to reduce CO2 emissions.
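The asymmetry can be shown with illustrative-only numbers (not taken from any real dataset):

```python
# Why halving construction cost helps capital-dominated sources most:
# final cost per MWh = capital component + fuel component.
def cost_per_mwh(capital, fuel):
    return capital + fuel

gas = cost_per_mwh(capital=15, fuel=35)        # fuel-dominated mix
gas_cheap = cost_per_mwh(capital=7.5, fuel=35)
solar = cost_per_mwh(capital=40, fuel=0)       # capital-dominated mix
solar_cheap = cost_per_mwh(capital=20, fuel=0)

print(round(gas_cheap / gas, 2))      # 0.85: gas gets only ~15% cheaper
print(round(solar_cheap / solar, 2))  # 0.5:  solar gets fully 50% cheaper
```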
You can see the effects on pricing advantage with this interactive simulation of electricity supply in the United States. If you cut the overnight construction cost in half for all generating technologies, solar and wind dominate the country:
https://calculators.energy.utexas.edu/lcoe_map/#/county/tech
Some example modeling of gas/solar electricity economics in the United Kingdom here:
https://electrotechrevolution.substack.com/cp/160279905
Companies have already started using robotics and AI to construct solar farms faster and at lower cost:
https://www.aes.com/press-release/AES-Launches-First-AI-Enab...
https://cleantechnica.com/2025/02/27/leaptings-ai-powered-ro...
https://www.renewableenergyworld.com/solar/cool-solar-tech-w...
I've been using Perplexity, and it annotates its recommendations, as to sources. Maybe it isn't very complete, though.
The argument about power usage for developers, as opposed to consumers, is probably insignificant compared to the inefficient use of compute resources for deployed software today.
LLMs are arguably enabling some web developers to create native applications rather than Electron monstrosities, saving many P-core CPU cycles multiplied by the number of their users.
Optimising server applications with an LLM could eliminate unnecessary cloud servers.
Of course all the above could be done without LLMs, but LLMs can empower people to do this kind of work when they were not able before.
Except with MCP, essentially most consumers will become "programmers". See my other comment for the rationale (https://news.ycombinator.com/item?id=43997227).
(TLDR: MCP lets non-programmers convert their prompts into programs, for all practical purposes. Currently there is a barrier to entry to automate simple tasks: The need to learn programming. That barrier will go away)
You are more optimistic than I am. Most people I have seen use LLMs at best as an alternative to Grammarly or as a document/web summarizer, and at worst to make decisions based on outdated LLM advice or to treat them as an inaccurate fact-engine.
The average person could code using Excel, but most don't even if they know how to use IF() and VLOOKUP().
> The average person could code using Excel, but most don't even if they know how to use IF() and VLOOKUP().
Using Excel (even without IF) is way more complicated than what I am saying. MCPs will enable people to program with natural language. It's not like vibe coding where the natural language will produce code we'll run. The prompt will be the program. You need to put in a lot more effort to learn the basics of Excel.
Does any such project exist, or are developers using LLMs to help them develop new Electron monstrosities? I think that if a developer has the sensibility to want to develop a native app, they will do so regardless, it's not that much harder.
When I code with AI, it's more convenient to type a prompt to change the name of a variable. This sends a request to the LLM provider, performs very expensive computations, and then does a buggy job of it, even though my IDE can do the same for maybe 0.1% of the energy, or even less. Try running an LLM locally on a CPU and you'll get a glimpse of how much energy this simple task consumes.
But coding with AI isn't that huge. What will become huge this year or next is MCP. It will bring "programming" to the masses, all of whom will do stupid queries like the above.
Consider this: I wrote an MCP server to fetch the weather forecast, and have separate tools to get a broad forecast, an hourly forecast, etc. I often want to check things like "I'm thinking of going to X tomorrow. It will be a bummer if it's cloudy. Which hours tomorrow have less than 50% cloud cover?" I could go to a weather web site, but that's more effort (lots of clicks to get to this detail). Way easier if I have a prompt ready.
OK - that doesn't sound too bad. Now let's say I want to do this check daily. What you have to realize is that with MCP the prompt above is as good as a program! It's trivial for an average non-programmer to write that prompt and put it as part of a cron job, and have the LLM email/text you when the weather hits a predefined criteria.
Consider emails. I sign up for deals from a retailer.[1] Now deals from them are a dime a dozen so I've been programmed to ignore those emails. But now with MCP, I can set a simple rule: Any email from that retailer goes to the LLM, and I've written a "program" that loosely describes what I think is a great deal, and let the LLM decide if it should notify me.
Everyone will do this - no programming required! That prompt + cron is the program.
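The "prompt + cron" idea could be sketched roughly like this; every class and name below is a made-up placeholder, not a real MCP client API:

```python
# Hypothetical sketch of "the prompt is the program".
class StubMcpClient:
    """Stand-in for an MCP-capable LLM client with tools attached."""
    def __init__(self, tools):
        self.tools = tools

    def run(self, prompt):
        # A real client would hand the prompt to the LLM, which decides
        # when to call the attached weather/email tools.
        return f"would run {prompt!r} with tools {self.tools}"

PROMPT = ("Check tomorrow's hourly forecast for X. If any daylight hour has "
          "less than 50% cloud cover, email me those hours; otherwise stay silent.")

client = StubMcpClient(tools=["weather", "email"])
print(client.run(PROMPT))
# Scheduled with an ordinary cron entry, e.g.:  0 7 * * *  python check_weather.py
```

The prompt string is the entire "program"; the non-programmer never touches code, only the schedule and the natural-language rule.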
Compared to traditional programming, this produces 100-1000x more CO2 emissions. And because there is no barrier to entry, easily 1000x more people will be doing programming than are doing now. So it's almost a millionfold in CO2 emissions for tasks like these.
[1] OK, I don't do it, but most people do.
bitpush•8mo ago
> Firstly, there’s the environmental impact
Their own blog contributes to the climate crisis they are now crying about. Someone in a developing country could write a similar article saying "all these self-publishing technologists are making the climate crisis worse", and it would have a stronger point.
I say this without discounting the real environmental costs associated with technology, but LLMs / AI aren't uniquely problematic.
Your latest macbook, iphone, datacenter, ssds.. all have impact.
kelseyfrog•8mo ago
If we choose to affect any lower-priority issue, it is an example of hypocrisy that de-legitimizes the whole project.
criddell•8mo ago
https://www.fb.org/in-the-news/modern-farmer-farmers-face-a-...
zdragnar•8mo ago
Individuals value things differently, so attempting to do society-wide prioritization is always going to be a reductive exercise.
For example: Your local cafe doesn't need to exist at all. You could still drink coffee, you'd just have to make it yourself. That cafe is taking up space, running expensive commercial equipment, keeping things warm even when there aren't customers ordering, keeping food items cool that aren't going to be eaten, using harsh commercial chemicals for regular sanitization, possibly inefficient cooling or heating due to heavy traffic going in and out the door, so on and so forth.
Imagine the environmental impact of turning all cafes into housing and nobody driving to go get a coffee.
zdragnar•8mo ago
If by "we're going to do anything" you mean presumably fiat power to ban LLMs, then you're better off using that fiat power to just put a sin tax on carbon emissions and letting people decide where they want to cut back.
Uehreka•8mo ago
Thus LLMs don’t have to be that useful to be worth it. And if used in certain ways they can be very useful.
Source (with links to further sources): https://andymasley.substack.com/p/a-cheat-sheet-for-conversa...