I mean, should we? Yeah. But we're not gonna.
Here's some stats showing the growth of the millionaire class, now up to 7% of the population: https://www.statista.com/chart/30671/number-of-millionaires-...
At the same time, here's some stats showing extreme poverty falling off a cliff: https://www.statista.com/statistics/1341003/poverty-rate-wor...
Such a thing was completely unthinkable before disruptive tech and the associated mega-rich became the new normal 200 years ago.
Having said that, you're correct to point out that negative externalities didn't really enter our minds until about 50-75 years ago, but it seems tech progress has even made clean, green living at scale possible, at least in principle.
This is 80% due to the Chinese government; if it were up to the billionaires, they would all be as poor as before.
Also, ~100% of China's growth started when they embraced market economics in 1990. Read US business books from the 80's. It's rare to even see China mentioned at all until the late 90's. Everybody was worried about Japan overtaking the US and nobody talked about the Chinese economy, because it barely existed.
Speak for yourself. Environmentalism has been a thing for longer than I have been alive. Clearly we care.
And before you reply with even more toxic cynicism, stand behind an idling 1960, 1980, 2000, and 2020 sedan and tell me you can't tell the difference.
https://youtu.be/Wp-WiNXH6hI?si=3uhneUSoiZaUKS9M
So, after 40 years I’m a little tired of hearing it takes too long to build nuclear power plants.
On the bright side, we’ve almost reached peak coal:
https://www.theguardian.com/business/2024/dec/18/coal-use-to...
This is irrelevant because most "Xboxes, smartphones, and personal computers" are powered by centralized fossil fuel power plants that could plausibly be replaced with nuclear reactors, just like the power plant for a datacenter can be replaced with nuclear reactors.
Here's some napkin math (spelled out in the sketch below):
H100: 61% utilization × 700W ≈ 3.7 MWh/year
RTX 3080: 10% utilization × 320W ≈ 0.28 MWh/year
Which is why the power used is so much higher than for a single gaming PC.
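A minimal sketch of that arithmetic, using only the utilization and wattage figures above (the 8,760 hours per year is the only number I'm adding):

    # Rough annual energy for a GPU at a given average utilization and TDP
    HOURS_PER_YEAR = 24 * 365  # 8,760

    def annual_energy_mwh(tdp_watts: float, utilization: float) -> float:
        """Average draw (W) times hours in a year, converted to MWh."""
        return tdp_watts * utilization * HOURS_PER_YEAR / 1e6

    print(f"H100:     {annual_energy_mwh(700, 0.61):.2f} MWh/year")  # ~3.74
    print(f"RTX 3080: {annual_energy_mwh(320, 0.10):.2f} MWh/year")  # ~0.28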
The GPUs in AI data centers never stop running and are driven at or near full capacity. The difference in power usage is astronomical.
I don’t claim to know, but we ought to be able to have a rational debate on this.
A typical NVIDIA server GPU consumes 700W, and a server might have eight of them, so 5.6kW.
A PlayStation 5 consumes 200W total.
https://www.technologyreview.com/2025/05/20/1116327/ai-energ...
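Spelling that comparison out (figures as stated above):

    gpu_tdp_w = 700          # typical NVIDIA server GPU
    gpus_per_server = 8
    ps5_w = 200              # PlayStation 5 total system draw

    server_w = gpu_tdp_w * gpus_per_server                                # 5,600 W
    print(f"One 8-GPU server draws about {server_w / ps5_w:.0f}x a PS5")  # ~28x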
They are a big thing. Old people still donate to them.
They are a big reason Africans don't grow GMOs that can help children avoid blindness.
Which also increased reliance on Russian gas. If you thought Greenpeace couldn't be more evil.
Why is HN listening to dirty activists who can't even bathe talk about tech?
Is this how low HN is now? Redditors going to Greenpeace to do project estimates for them? This is your tech level?
djoldman•2h ago
> 1. An energy-efficient AI infrastructure powered 100% by renewable energy. This green power must be additionally generated.
> 2. AI companies must disclose: a. How much electricity is used in operating their AI. b. How much power is consumed by users during their use of AI. c. The goals under which their models were trained, and which environmental parameters were considered.
> 3. AI developers must take responsibility for their supply chains. They must contribute to the expansion of renewable energy in line with their growth and ensure that local communities do not suffer negative consequences (e.g., lack of drinking water, higher electricity prices).
Is there a term for "energy neutrality," the cousin of "net neutrality"?
Do we as a society want to wade into the morass of telling people what kinds of activities they can use energy for?
If we care about saving a watt-hour, there are lots of places to look. Pointing fingers at the incredible energy consumption of internet-delivered HD video might not feel very comfortable to lots of folks.
doener•2h ago
Doesn't training the model consume the most energy in most cases?
gruez•2h ago
Also,
>Brent Thill of Jefferies, an analyst, estimates that [inference] accounts for 96% of the overall energy consumed in data centres used by the AI industry.
https://archive.is/GJs5n
Zacharias030•2h ago
Google announced they are serving 500T tokens per month. State-of-the-art models are currently trained on fewer than 30T tokens. Even if training tokens are more costly to process (e.g., a factor of 3x for forward, backward, and weight updates, and another factor of 2x for missing quantization), you end up in a situation where inference compute dominates training after a very short amortization period.
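A quick sketch of that amortization argument, using only the figures stated above (the 30-day month is my simplification):

    inference_tokens_per_month = 500e12  # Google: ~500T tokens served per month
    training_tokens = 30e12              # state-of-the-art runs: < ~30T training tokens
    cost_factor = 3 * 2                  # 3x forward/backward/update, 2x for no quantization

    training_token_equivalents = training_tokens * cost_factor  # 180T
    months = training_token_equivalents / inference_tokens_per_month
    print(f"Training amortized after ~{months:.2f} months (~{months * 30:.0f} days)")  # ~0.36, ~11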
jillesvangurp•2h ago
My view is that increased energy demand is not necessarily a bad thing in itself. First, AI is by no means the dominant source of such demand; other things (transport, shipping, heating, etc.) outrank it, so a little extra pressure from AI won't move the needle much. Our main problem remains the same: too much CO2 being emitted. Second, meeting increased demand is typically done with renewables these days. Not because it's nice to do so but because it's cheap to do so. That's why renewables are popular in places like Texas. They don't care about the planet there. But they love cheap energy. And the more cheap, clean power we bring online, the worse expensive dirty power looks.
Increased demand leads mostly to new clean generation and increased pressure to deprecate dirty, expensive generation. That's why coal is all but gone from most energy markets. That has nothing to do with how dirty it is and everything to do with how expensive it is. Gas-based generation is heading the same direction. Any investment in such generation should be considered very risky.
Short term, of course, you get some weird behavior like data centers being powered by gas turbines. Not because it's cheap but because it's easy and quick. Long term, a cost optimization would be getting rid of the gas generators. And with inference increasingly becoming the main thing in terms of energy and tokens, energy also becomes the main differentiator for the profitability of AI services. Which again points at using cheap renewables to maximize profit. The winners in this market will be working on efficiency, and part of that is energy efficiency, because that and the hardware are the main costs involved here.
mbgerring•2h ago
All serious, viable plans for decarbonization include a massive increase in electricity consumption, due to electrification of transportation, industrial processes, etc, along with an increase in renewable energy production. This isn't new, but AI datacenters are a very large net new single user of electricity.
If the amount of money already poured into AI had gone into the rollout of clean energy infrastructure, we wouldn't even be having this conversation, but here we are.
It makes perfect sense from a policy perspective, given that there are a small number of players in this space with more resources than most governments, to piggyback on this wave of infrastructure buildout.
It also makes plenty of financial sense. Initial capex for adding clean energy generation is high, but given both the high electricity usage of AI datacenters, and the long-term impact on the grid that someone will eventually have to pay for, companies deploying AI infrastructure would be smart to use the huge amount of capital at their disposal to generate their own electricity.
It's also, from a deployment standpoint, pretty straightforward — we're talking about massive, rectangular, warehouse-like buildings with flat roofs. We should have already mandated that all such buildings be covered in solar panels with on-site storage, at a minimum.
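For a sense of scale, here's a rough sketch; the roof area, panel output, capacity factor, and campus load below are my own assumptions, not numbers from this thread:

    roof_area_m2 = 100_000   # assume a ~100,000 m^2 flat roof (roughly 25 acres)
    panel_w_per_m2 = 200     # assume ~200 W/m^2 of usable panel output at peak
    capacity_factor = 0.20   # assume ~20% average output vs. nameplate
    campus_load_mw = 100     # assume a ~100 MW AI campus

    peak_mw = roof_area_m2 * panel_w_per_m2 / 1e6   # 20 MW peak
    avg_mw = peak_mw * capacity_factor              # ~4 MW average
    print(f"Rooftop solar: ~{peak_mw:.0f} MW peak, ~{avg_mw:.0f} MW average, "
          f"~{100 * avg_mw / campus_load_mw:.0f}% of campus load")

Under those assumptions rooftop panels offset only a slice of the load, which is why "at a minimum" is doing a lot of work here.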
phillipcarter•2h ago
I agree that in general, if the goal is to limit CO2 emissions and use renewable sources of energy, we ought not to focus on AI first, because it is dwarfed by many other things that we take for granted today. The canonical example I give folks is that the latte they order from Starbucks every day involves substantially more energy and water than most people's daily use of ChatGPT.
But as we move to digitize more and more of this world, and now create automated cognitive labor, we should start with the right foundations. I'd rather we not end up having to disentangle critical AI infrastructure from coal power plants after the fact, and I'd rather we try to limit the compute available to workloads in ways that encourage people to use the tech that actually fits their use case rather than throw everything at the most expensive model every time.
nico_h•2h ago
More seriously, I'm not too sure the energy cost and the IP infringed during training are justified by the value added to society from providing generic, mostly accurate, but sometimes wildly wrong answers. Or from generating text or pretty pictures for a few milli-cents in cooling and electricity versus asking a human to do the same for a few kilo-cents.
It’s a lot of ladder kicking in the software industry these days.
atonse•2h ago
So it seems like the better goal is to just aim for more clean energy.
bee_rider•2h ago
There’s a lot of focus on the carbon cost of various digital goods. I get it. Destroying the environment is a big problem. But like, maybe we also should not make a bunch of plastic crap and ship it around the world a bunch of times.
bbor•2h ago
For anyone who's curious on specifics re:AI emissions, the recent MIT article is the gold standard in terms of specificity, neutrality, and nuance: https://www.technologyreview.com/2025/05/20/1116327/ai-energ... .
I also did some napkin math here in 2024.12: https://bsky.app/profile/robb.doering.ai/post/3lckwra33vk2t TL;DR: Eating one less burger affords you ~300 chatbot inferences, and avoiding a flight from ATL to SFO affords you ~16,000.
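Reverse-engineering the per-query footprint those ratios imply; the burger and flight CO2e figures below are my assumptions, not from the post:

    burger_kg_co2e = 3.0     # assume ~3 kg CO2e per beef burger
    flight_kg_co2e = 160.0   # assume ~160 kg CO2e per ATL->SFO economy seat

    per_query_burger = burger_kg_co2e / 300 * 1000      # ~10 g CO2e per query
    per_query_flight = flight_kg_co2e / 16_000 * 1000   # ~10 g CO2e per query
    print(f"Implied footprint: ~{per_query_burger:.0f} g/query (burger), "
          f"~{per_query_flight:.0f} g/query (flight)")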
jcynix•1h ago
Air conditioning, for example, would be a good place to save energy, as its worldwide consumption is a multiple of AI's. But climate change will push the need (not luxury) for air conditioning up, which is the Catch-22 in this case.
The International Energy Agency (IEA) estimates that about 10% of globally generated electricity is used for air conditioning. But it would nevertheless be a good idea to require AI companies to invest in renewable energy before they reach similar consumption levels.
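A rough scale comparison; the 10% share is the IEA figure above, while the absolute generation and data-center numbers are my assumptions:

    global_electricity_twh = 30_000          # assume ~30,000 TWh of electricity generated per year
    ac_twh = 0.10 * global_electricity_twh   # ~3,000 TWh/yr for cooling (IEA share)
    datacenter_twh = 415                     # assume ~415 TWh/yr for all data centers combined

    print(f"Cooling ~{ac_twh:.0f} TWh/yr vs. all data centers ~{datacenter_twh} TWh/yr "
          f"-> roughly {ac_twh / datacenter_twh:.0f}x")  # ~7x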
Regarding the "morass" … we already tell people how fast they can drive and tell companies to limit air pollution (at least in some countries), so no problem here.
masswerk•1h ago
This really applies to any application that consumes a high percentage of the available resources. (Compare: data centers are responsible for almost 80% of the electricity consumption in the Dublin area, according to the paper.) The rationale of purpose, resource demand, and expected effects is secondary to this. The primary question is about (significant) quantities.