I really dislike the "people should be better and use less energy" argument for solving macro-problems like this
> My naive optimism led me to believe that technology would help us fight climate change. I was wrong: AI and Crypto are net negatives in this regard.
...why? why would technology that specifically requires a lot of energy help "fight climate change"?
this entire article misses the point for me -- it's very vague and speaks in generalities.
> “If you’re going to use generative tools powered by large language models, don’t pretend you don’t know how your sausage is made.”
why? if you generate code with a LLM, then read and deeply understand the code, what's wrong?
> I can’t help but feel the web would be a better place if the technology had never existed in the first place.
if we didn't invent the wheel, we wouldn't have so many cars/trucks polluting the planet
There's no good reason it would. Nevertheless, proponents of both AI and crypto have claimed multiple times it would. So I think it is quite fair to bring it up.
I also dislike energy frugality arguments. They come from a Luddite place. A civilization is defined by its energy usage. Let's be an advanced civilization.
Yes, of course AI uses a lot of energy. But we have to give it a bit of time to see if there are benefits that come with this cost. I think there will be. Whether the tradeoff was worthwhile, I think we are not even close to being able to conclude.
Something like social media, which has a good long while behind it, I could accept if you started to close the book on the plus-minus.
“I’ve never been so conflicted about a technology. Of course we are talking about iron smelting and its effects on the environment. Look I used an iron hoe and was impressed but have you seen how it was made? Look at all the waste and smoke and wood burned.
If you use an iron tool yourself, at least know how it was made.”
https://www.science.org/content/article/ancient-romans-likel...
> Global energy consumption in 2023 was around 500,000 GWh per day. That means that ChatGPT’s global 3GWh per day is using 0.0006% of Earth’s energy demand.
https://andymasley.substack.com/p/a-cheat-sheet-for-conversa...
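Sanity-checking the quoted arithmetic (both figures are taken directly from the quote; this is a rough check, not an independent estimate):

```python
# Figures from the quoted claim, both in GWh per day
global_daily_gwh = 500_000   # global energy consumption, 2023
chatgpt_daily_gwh = 3        # ChatGPT's estimated usage

share = chatgpt_daily_gwh / global_daily_gwh * 100
print(f"{share:.4f}%")  # prints 0.0006%
```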
(to be clear: I'm not vegan. but most people, incl. myself, could be quite easily, and it would save several orders of magnitude more energy and water)
I suspect nearly all of it. You framed it such that it appears insignificant, but with this framing nothing is significant.
Also, though this is not really relevant to your major incorrect point, airlines aren't part of civil aviation.
if you are actually worried about your ChatGPT energy usage, skip a hot shower or spend a few hours less playing Cyberpunk 2077 and a few more hours reading an old book.
I suspect at a "sensible" breakdown (trying to avoid the "How long is my coastline?" problem), which is presumably something akin to Zipfian, the main uses will actually account for most energy usage.
If anyone can find a breakdown at the level of granularity that "all deep learning" would make sense, let me know: so far my Googling has led primarily to either detailed breakdowns of energy _sources_, or high-level breakdowns of use at the level of "industry", "agriculture", or "residential use".
The Black Death alone was 25-50 million people in 7 years
Does it?
https://prospect.org/environment/2024-09-27-water-not-the-pr...
> training GPT-3 used as much water as just under twice what is required for the average American’s beef consumption. In other words, just two people swearing off beef would more than compensate.
https://andymasley.substack.com/p/a-cheat-sheet-for-conversa...
> It would be a sad meaningless distraction for people who care about the climate to freak out about how often they use Google search. Imagine what your reaction would be to someone telling you they did ten Google searches. You should have the same reaction to someone telling you they prompted ChatGPT.
That beef consumption and other activities also require massive amounts of resources is independent. Both can be true. And both need to be addressed. We no longer have the luxury to pick and choose which climate interventions we find most convenient; we need to succeed on all fronts. If the patient is dehydrated and bleeding, do doctors sit around debating if they should give them water or gauze? No, they do both immediately.
Regardless, I’m not convinced by the parent comment. It basically suggests we solve this without analysis of costs and effects. Sounds insane to me.
Especially since the most promising solutions come from rapidly developing technologies and research. Turning off the lights just sends us backwards.
I’ve heard the answers pre-AI, but I wonder how this new general purpose use of electricity changes the calculus?
Datacenters used for training or batch jobs can be placed where water and power are plentiful, ambient temperatures are relatively low, and government/society are stable.
A datacenter is going to be built at the bottom of New Zealand, where all these things are true. There must be plenty of other places in the world where this holds.
At least for training, I think it is possible to have our cake and eat it. For real-time inference, probably not.
There are many datacenters being built in Finland. Water is not running out, and one of the cheapest electricity prices in Europe thanks to plentiful wind power.
What I'm saying is that casual LLM use is also essentially trivial, and scolding people for querying ChatGPT when you wouldn't scold people for, say, making a dozen Google searches or watching a Netflix show or taking a long shower or driving to the mall is quite silly and a waste of time.
Similarly, there's a reason why doctors will tell you to worry more about your cigarette habit than almost any other bad habits you have, because it's so much more likely to kill you that it practically demands to be prioritized.
I returned to that place to visit family and friends last year. It was an eye opening experience. The people who live there have taken a keen interest on curtailing further development of any new data centers. One of the chief complaints is the constant power issues that didn’t exist before 2021-2022.
The locals argue that this is the result of the dramatic infusion of AI into every technology product. They're likely not wrong. The communities in the area became quite politically active over the issue and have retained all manner of analysts and scientists, journalists and investigators, and so on, to aid them in making a political case against future data center development.
The thing that got me was the complaints about the noise. Over the past few years the locals and those in their employ have been monitoring noise levels near the data centers, and they've tracked sustained increases in noise pollution that match the timeline over which the use of AI has exploded. There it has become something of a trope to measure the working day by how much noise pollution is produced by any data center nearby, mostly belonging to a hyperscaler known by an acronym.
The data centers have reportedly not been the boon that was promised, which seems to be seen as insult to injury. The area already has a vibrant tech scene independent of data center operations. So the locals don’t really see value in allowing more data centers to be built, and they’re starting to organize politically around the idea of preventing future data center construction and implementing heavy usage based taxation on utilities used by the existing ones.
Water usage is also a nuanced concept because water isn’t destroyed when we use it. You have to consider the source and incidentals of processing it, as well as what alternatives it’s taking away from. None of this fits into convenient quotes though.
I think energy usage is a much better metric because we can quantify the impacts of energy usage by considering the blended sum of the sources going into the grid.
But let's look at what has happened with Grok, for example:
From May 6, 2025
https://www.yahoo.com/news/elon-musk-xai-memphis-35-14321739...
> The company has no Clean Air Act permits.
> In just 11 months since the company arrived in Memphis, xAI has become one of Shelby County's largest emitters of smog-producing nitrogen oxides, according to calculations by environmental groups whose data has been reviewed by POLITICO's E&E News. The plant is in an area whose air is already considered unhealthy due to smog.
> The turbines spew nitrogen oxides, also known as NOx, at an estimated rate of 1,200 to 2,000 tons a year — far more than the gas-fired power plant across the street or the oil refinery down the road.
The devil is in the details here. People are _already_ feeling the effects of the AI race, the consequences just aren't evenly distributed.
And if we look at the "clean" nuclear deals to power these data centers:
https://www.reuters.com/business/energy/us-regulators-reject...
> The Talen agreement, however, would divert large amounts of power currently supplying the regional grid, which FERC said raised concerns about how that loss of supply would affect power bills and reliability. It was also unclear how transmission and distribution upgrades would be paid for.
The scale of environmental / social impacts comes down to how aggressive the AI race gets.
Trained on all the things/ideas of the past, and creating a larger barrier to entry for new ideas.
Thinking about new computer languages, who would use one that AI couldn’t help you code in?
This isn't a very hopeful quote given how many people continue to eat sausages even though we all know how sausages are made.
I'm doubtful. There are a few documentaries showing the process, and I've had people tell me multiple times that they were genuinely shocked by it. I'm assuming those people "knew", and it's just that knowing "it's all the waste at the butcher getting stuffed into guts" is not quite the same as seeing it first hand.
Surely, like everything else in tech, this too shall pass. I expect power requirements to fall away since there is no doubt strong incentives to do so.
> My naive optimism led me to believe that technology would help us fight climate change.
Yeah, proceeding full steam ahead with planet destruction and praying tech will save us is kind of naive. You're not alone though.
> There are also ethical concerns regarding the methods used to obtain data for training AI models.
This one hasn't registered as problematic for me. Maybe I'm unethical?
> Content creators can't even determine which parts of their work were used to train the model…
"Content creators" don't develop their style from a vacuum either. I'm not conflicted about this one either.
> LLMs are also contaminating the web with generic content produced without care or thought.
The web has been contaminated ever since SEO. Maybe AI will kill the web. So it goes.
This sounds extremely naive to my ears. Performance demands currently outpace resource utilization reductions and it's not clear that we're at the point where that will change soon. Also: https://en.wikipedia.org/wiki/Jevons_paradox
> > There are also ethical concerns regarding the methods used to obtain data for training AI models.
> This one hasn't registered as problematic for me. Maybe I'm unethical?
Yeah, the real ethical concern comes from the explosion of pernicious slop output that is utterly destroying everything good about the internet and media, not the training.
The problem is it was falling apart before generative AI. This AI has just hastened its demise.
AI might make things 1-2% worse, but we weren't suffering from a lack of legitimate content or media. We were suffering from centralized control and walled gardens force feeding us slop in the name of profit, the total enshittification of the internet without regard for the damage done - tragedy of the commons, maybe.
AI could also make things better, because it's no longer worth slogging through the bullshit with search - I'll have AI do my searching for me, or follow its recommendation directly, and never interact with any of the monetized bullshit the search engine tries to pawn off.
SEO content will have to get through ever more competent and discerning AI in order to get eyeballs. That eliminates a broad class of low effort trash, directly demonetizing bad faith actors.
So now you have content creators producing more plausible, but mediocre content, but OAI and other AI providers will have direct access to their logs - if they register someone as having produced bulk trash content, they can shut down the account, or report to authorities or other companies if there's some sort of fraud or illicit behavior going on.
There are competing pressures, and with Google also losing its monopoly on adtech, maybe we'll see companies forced to compete on quality products instead of exploitation of user data, all the while seeing the general quality of the internet improve.
Or maybe AI will just accelerate the race to the bottom and the internet's dead already.
This statement kinda blows my mind. You contemplate the destruction of the web, the single most world-changing invention of the last century, possibly the last millennium, a force that has democratized access to information, accelerated scientific progress and connected people around the world, the technology that one way or another played a huge role in the lives of every person on this site, and you shrug?
I see your Kurt Vonnegut and raise you Dylan Thomas: Rage, rage against the dying of the light.
Not sure how anyone can with certainty say that AI will be a net negative in the long run for climate change. Logic I think says the opposite.
I've been using Perplexity, and it annotates its recommendations with sources. Maybe it isn't very complete, though.
The argument about power usage for developers, as opposed to consumers, is probably insignificant compared to the inefficient use of compute resources for deployed software today.
Arguably, LLMs are enabling some web developers to create native applications rather than Electron monstrosities, saving many P-core CPU cycles multiplied by the number of their users.
Optimising server applications with an LLM could eliminate unnecessary cloud servers.
Of course all the above could be done without LLMs, but LLMs can empower people to do this kind of work when they were not able before.
Except with MCP, essentially most consumers will become "programmers". See my other comment for the rationale (https://news.ycombinator.com/item?id=43997227).
(TLDR: MCP lets non-programmers convert their prompts into programs, for all practical purposes. Currently there is a barrier to entry to automate simple tasks: The need to learn programming. That barrier will go away)
You are more optimistic than I am. Most people I have seen use LLMs at best as an alternative to Grammarly or as a document/web summarizer, and at worst as an inaccurate fact-engine or to make decisions based on outdated LLM advice.
The average person could code using Excel, but most don't even if they know how to use IF() and VLOOKUP().
When I code with AI, it's more convenient for me to type in a prompt to change the name of a variable. This involves sending a request to the LLM provider, doing very expensive computations, and then doing a buggy job of it, even though my IDE can do it for what, 0.1% of the energy? Or even less? Try running an LLM locally on a CPU and you'll get a glimpse of how much energy it uses to do this simple task.
But coding with AI isn't that huge. What will become huge this year or next is MCP. It will bring "programming" to the masses, all of whom will do stupid queries like the above.
Consider this: I wrote an MCP server to fetch the weather forecast, and have separate tools to get a broad forecast, an hourly forecast, etc. I often want to check things like "I'm thinking of going to X tomorrow. It will be a bummer if it's cloudy. Which hours tomorrow have less than 50% cloud cover?" I could go to a weather web site, but that's more effort (lots of clicks to get to this detail). Way easier if I have a prompt ready.
OK - that doesn't sound too bad. Now let's say I want to do this check daily. What you have to realize is that with MCP the prompt above is as good as a program! It's trivial for an average non-programmer to write that prompt and put it in a cron job, and have the LLM email/text you when the weather hits a predefined criterion.
Consider emails. I sign up for deals from a retailer.[1] Now deals from them are a dime a dozen so I've been programmed to ignore those emails. But now with MCP, I can set a simple rule: Any email from that retailer goes to the LLM, and I've written a "program" that loosely describes what I think is a great deal, and let the LLM decide if it should notify me.
Everyone will do this - no programming required! That prompt + cron is the program.
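A minimal sketch of that "prompt plus cron" pattern. All names here (`ask_llm`, `hourly_forecast`) are hypothetical stand-ins, not a real MCP API, and the LLM round trip is faked deterministically just to show the shape:

```python
# Sketch: the prompt itself is the "program". A cron entry such as
#   0 7 * * *  python check_weather.py
# would run it daily; here the LLM+MCP round trip is a deterministic stub.

PROMPT = (
    "I'm thinking of going out tomorrow. "
    "Which hours tomorrow have less than 50% cloud cover?"
)

def hourly_forecast():
    """Stand-in for an MCP weather tool: (hour, cloud cover %)."""
    return [(9, 80), (10, 40), (11, 30), (12, 75)]

def ask_llm(prompt, tools):
    """Stand-in for the real LLM: it would read the prompt, decide to
    call the forecast tool, and summarize. We fake that decision here."""
    forecast = tools["hourly_forecast"]()
    clear = [hour for hour, cloud in forecast if cloud < 50]
    return f"Clear hours: {clear}" if clear else "Stay home."

if __name__ == "__main__":
    # The cron job's whole body: run the prompt, then email/text the result.
    print(ask_llm(PROMPT, {"hourly_forecast": hourly_forecast}))
```

The point of the sketch is only that the prompt, not any code, carries the logic: everything the non-programmer writes is the `PROMPT` string and one crontab line.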
Compared to traditional programming, this produces 100-1000x more CO2 emissions. And because there is no barrier to entry, easily 1000x more people will be doing programming than are doing it now. So it's up to a millionfold increase in CO2 emissions for tasks like these.
[1] OK, I don't do it, but most people do.
> Firstly, there’s the environmental impact
Their own blog contributes to the climate crisis they are now crying about. Someone in a developing country could write a similar article saying "all these self-publishing technologists are making the climate crisis worse", and it would have a stronger point.
I say this without discounting the real environmental costs associated with technology, but LLMs / AI aren't uniquely problematic.
Your latest macbook, iphone, datacenter, ssds.. all have impact.
If we choose to act on any lower-priority issue, it is an example of hypocrisy that de-legitimizes the whole project.
https://www.fb.org/in-the-news/modern-farmer-farmers-face-a-...
Individuals value things differently, so attempting to do society-wide prioritization is always going to be a reductive exercise.
For example: Your local cafe doesn't need to exist at all. You could still drink coffee, you'd just have to make it yourself. That cafe is taking up space, running expensive commercial equipment, keeping things warm even when there aren't customers ordering, keeping food items cool that aren't going to be eaten, using harsh commercial chemicals for regular sanitization, possibly inefficient cooling or heating due to heavy traffic going in and out the door, so on and so forth.
Imagine the environmental impact of turning all cafes into housing and nobody driving to go get a coffee.
If by "we're going to do anything" you mean presumably fiat power to ban LLMs, then you're better off using that fiat power to just put a sin tax on carbon emissions and letting people decide where they want to cut back.
Thus LLMs don’t have to be that useful to be worth it. And if used in certain ways they can be very useful.
Source (with links to further sources): https://andymasley.substack.com/p/a-cheat-sheet-for-conversa...