I mean, should we? Yeah. But we're not gonna.
Here are some stats showing the growth of the millionaire class, now up to 7% of the population: https://www.statista.com/chart/30671/number-of-millionaires-...
At the same time, here's some stats showing extreme poverty falling off a cliff: https://www.statista.com/statistics/1341003/poverty-rate-wor...
Such a thing was completely unthinkable before disruptive tech and the associated mega-rich became the new normal 200 years ago.
Having said that, you're correct to point out that negative externalities didn't really enter our minds until about 50-75 years ago, but it seems tech progress has even made clean, green living at scale possible, at least in principle.
This is 80% due to the Chinese government; if it were up to the billionaires, they would all be as poor as before.
Also, ~100% of China's growth started when they embraced market economics around 1990. Read US business books from the '80s: it's rare to even see China mentioned at all until the late '90s. Everybody was worried about Japan overtaking the US, and nobody talked about the Chinese economy, because it barely existed.
Speak for yourself. Environmentalism has been a thing for longer than I have been alive. Clearly we care.
And before you reply with even more toxic cynicism, stand behind an idling 1960, 1980, 2000, and 2020 sedan and tell me you can’t tell the difference.
But it's a tough analysis to make regardless, because every widget I buy here in the West that comes from China (and that's pretty much 100% of the widgets) spits out CO2 in China. Now I look cleaner and they look dirtier than would be the case if the widget were made here.
https://youtu.be/Wp-WiNXH6hI?si=3uhneUSoiZaUKS9M
So, after 40 years I’m a little tired of hearing it takes too long to build nuclear power plants.
On the bright side, we’ve almost reached peak coal:
https://www.theguardian.com/business/2024/dec/18/coal-use-to...
But whatever the causes, too little has been done over those 40 years, and at the moment, solar is far cheaper and faster to deploy than nuclear. I'm not stopping anyone from building nuclear power plants, but I think its window has closed. It's too expensive and too slow. But feel free to prove me wrong.
Just stop burning more oil and coal.
This is irrelevant because most "Xboxes, smartphones, and personal computers" are powered by centralized fossil fuel power plants that could plausibly be replaced with nuclear reactors, just like the power plant for a datacenter can be replaced with nuclear reactors.
Here's some napkin math:
H100: 61% utilization × 700W ≈ 3.7 MWh/year
RTX 3080: 10% utilization × 320W ≈ 0.28 MWh/year
Which is why the energy used is so much higher than for a single gaming PC.
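To make the units explicit (watts sustained over a year give watt-hours, not watts), here's that math as a minimal Python sketch; the 61%/10% utilization and 700W/320W figures are the assumptions above, not measurements:

    HOURS_PER_YEAR = 24 * 365  # 8,760 hours

    def annual_energy_mwh(tdp_watts, utilization):
        # Average watts sustained for a year -> watt-hours, converted to MWh
        return tdp_watts * utilization * HOURS_PER_YEAR / 1e6

    print(f"H100:     {annual_energy_mwh(700, 0.61):.2f} MWh/year")  # ~3.74
    print(f"RTX 3080: {annual_energy_mwh(320, 0.10):.2f} MWh/year")  # ~0.28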
I can't believe I have to point this out on HN of all places.
If you're only talking about the GPUs used for inference, then that's a different story. Not nearly as much hardware is required for inference.
But the number of GPUs needed to train models is in the tens of thousands, and there are rumors that some shops (Meta) are already using 100k+ GPUs, just for training.
Those are likely all/mostly H100s, running at least 60% of the time. Consider that OpenAI, Anthropic, Google, Meta, Tesla, X.com, etc. are all within an order of magnitude of each other in terms of compute.
For argument's sake, that's 6 companies approaching 100K H100s' worth of compute for their next-gen models.
Now consider that GPT-4 used roughly 100x more compute to train than GPT-3, and GPT-5 is rumored to follow this trend, using 100x more compute than GPT-4. Extrapolating, GPT-6 might also use 100x more compute than GPT-5.
Even if the next generation of AI GPUs is 10x as powerful as the H100 for the same amount of electricity, a 100x jump in compute would still require 10x as many GPUs (and thus 10x as much electric demand).
Extrapolate that to GPT-7, 8, 9, etc., and you can see why people are worried about the power usage.
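A minimal sketch of that extrapolation in Python, treating the 100x-compute-per-model-generation and 10x-efficiency-per-GPU-generation figures above purely as assumptions:

    # Assumptions from the comment: each model generation needs ~100x the
    # training compute; each GPU generation delivers ~10x the compute per watt.
    COMPUTE_GROWTH_PER_GEN = 100
    EFFICIENCY_GAIN_PER_GEN = 10

    power_multiplier = 1.0
    for model in ["GPT-5", "GPT-6", "GPT-7"]:
        power_multiplier *= COMPUTE_GROWTH_PER_GEN / EFFICIENCY_GAIN_PER_GEN
        print(f"{model}: ~{power_multiplier:,.0f}x today's training power demand")
    # GPT-5: ~10x, GPT-6: ~100x, GPT-7: ~1,000x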
This isn't even theoretical. As mentioned in this thread already, these companies are signing deals to buy all the capacity of power plants in some areas.
That's a tiny drop in the sea of the almost 2 billion PC gamers [0], and hundreds of millions of gaming consoles [1] in the world. Not to mention the energy required to manufacture all that hardware. Plus the datacenters required for online gaming, which must also be considerable.
It's weird to be concerned about power usage of AI, but turning a blind eye to the massive amounts of power required by the gaming industry.
[0] https://www.statista.com/statistics/420621/number-of-pc-game...
[1] https://en.wikipedia.org/wiki/List_of_best-selling_game_cons...
It's not that AI uses too much power today; it's that at the current trend, it'll be using somewhere between 100x and 1000x as much power by 2030/2035, which would place it at somewhere between 2% and 20% of total power consumption.
AI provides tangible value to businesses and private users beyond mere entertainment. We'll see how much power it consumes in the future, and where that power comes from.
"The GPU's used for AI have significantly higher utilization rates than gaming GPU's... Here's some napkin math:
H100: 61% utilization / 700W ~ 3.7MW/year
RTX 3080: 10% utilization / 320W ~ 0.27MW/year"
The H100 uses at least 10x as much power as a 3080 over its lifetime. And most gamers aren't playing on 3080s.
Those in AI data centers never stop running and completely utilize their capacity. The difference in power usage is astronomical.
I don’t claim to know, but we ought to be able to have a rational debate on this.
There's nothing irrational about suggesting that AI GPUs consume far more power than gaming GPUs.
Apparently a single gaming GPU can be used to run an LLM that serves hundreds of concurrent requests.
> Benchmarking Llama 3.1 8B (fp16) on our 1x RTX 3090 instance suggests that it can support apps with thousands of users by achieving reasonable tokens per second at 100+ concurrent requests.
You're essentially arguing that shipping naval diesel generators must be trivial because you can fit a dozen moped motors in the bed of your pickup truck just fine.
I have no insight into how many GPT-4 users are served per GPU, but I would assume OpenAI heavily optimizes for that, considering the cost to run that thing. It's probably in the same ballpark: hundreds to thousands of concurrent user requests per GPU. Still better than one GPU per gamer, even if it requires 10x the energy.
A typical NVIDIA server GPU consumes 700W, and a server might have eight of them, so 5.6kW.
A PlayStation 5 consumes 200W total.
https://www.technologyreview.com/2025/05/20/1116327/ai-energ...
I agree: ignoring the carbon footprint of the gaming industry is irresponsible.
Given an average of ~8 hours of work/school and ~8 hours of sleep, gaming GPUs likely don't use anywhere near as much power. Plus, even when they are on, they probably idle near 30-60W for a lot of the time, while the user browses the web or watches videos.
There are more gaming GPUs in existence right now, but the number of AI chips is likely closing that gap rapidly.
And of course, what is that energy being used for? People playing games are typically having fun, bonding with friends, or engaging in social behavior. A huge amount of AI is illegally trained on copyrighted works without license to use them, causing significant harm to various fields. Plus the deluge of AI slop bogging down the internet, social media, forums, image/art-hosting sites, search, and more.
I think it will be a while before modern generative AI is even close to providing value in aggregate.
They are a big thing. Old people still donate to them.
They are a big reason Africans don't grow GMOs that can help children avoid blindness.
Headcounts:
- Gaming GPUs: installed base of 700-900M, with 100-200M in active gaming rigs. Assumption of 250M active gamers worldwide.
- AI GPUs: ~3 million high-performance AI GPUs currently in active use globally.
Average power usage:
- Gaming: 3 hours gaming at 300W → 37.5W average over 24h
- AI: 16h × 600W + 8h × 100W idle = 10,400Wh → ~433W average over 24h
Total global GPU power consumption:
- Gaming: 250M GPUs × 37.5W avg ≈ 9.4 GW → 9.4 GW × 24h × 365 ≈ 82.4 TWh/year
- AI: 3M GPUs × 433W avg ≈ 1.3 GW → 1.3 GW × 24h × 365 ≈ 11.4 TWh/year
Even taking into account that data centers also require power for cooling, which doubles AI GPU energy impact, gaming >> AI by a wide margin.
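For anyone who wants to poke at these numbers, here's the same napkin math as a runnable Python sketch; every headcount, wattage, and duty cycle is an assumption carried over from the estimate above, not a measurement:

    HOURS_PER_YEAR = 24 * 365

    def avg_watts(segments):
        # Average power over 24h from (hours, watts) duty-cycle segments
        return sum(h * w for h, w in segments) / 24

    gaming_avg = avg_watts([(3, 300)])           # 3 h/day at 300 W -> 37.5 W
    ai_avg = avg_watts([(16, 600), (8, 100)])    # 16 h load + 8 h idle -> ~433 W

    gaming_twh = 250e6 * gaming_avg * HOURS_PER_YEAR / 1e12  # ~82 TWh/year
    ai_twh = 3e6 * ai_avg * HOURS_PER_YEAR / 1e12            # ~11 TWh/year
    print(f"Gaming: {gaming_twh:.1f} TWh/year, AI: {ai_twh:.1f} TWh/year")
    # Doubling AI for cooling overhead (~23 TWh) still leaves gaming far ahead.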
djoldman•7mo ago
> 1. An energy-efficient AI infrastructure powered 100% by renewable energy. This green power must be additional, newly built generation.
> 2. AI companies must disclose: a. How much electricity is used in operating their AI. b. How much power is consumed by users during their use of AI. c. The goals under which their models were trained, and which environmental parameters were considered.
> 3. AI developers must take responsibility for their supply chains. They must contribute to the expansion of renewable energy in line with their growth and ensure that local communities do not suffer negative consequences (e.g., lack of drinking water, higher electricity prices).
Is there a term for "energy neutrality," the cousin of "net neutrality"?
Do we as a society want to wade into the morass of telling people what kinds of activities they can use energy for?
If we care about saving a watt-hour, there are lots of places to look. Pointing fingers at the incredible energy consumption of internet-delivered HD video might not feel very comfortable to lots of folks.
doener•7mo ago
Doesn't training the model consume the most energy in most cases?
gruez•7mo ago
Also,
>Brent Thill of Jefferies, an analyst, estimates that [inference] accounts for 96% of the overall energy consumed in data centres used by the AI industry.
https://archive.is/GJs5n
Zacharias030•7mo ago
Google announced they are serving 500T tokens per month. State-of-the-art models are currently trained on less than 30T tokens. Even if training tokens are more costly to process (e.g., a factor of 3x for forward, backward, and weight updates, plus another factor of 2x for missing quantization), you end up in a situation where inference compute dominates training after a very short amortization period.
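A back-of-the-envelope check of that claim in Python, using the figures above (30T training tokens, 3x and 2x cost factors, 500T inference tokens/month) purely as assumptions:

    # Assumptions from the comment, not official figures
    training_tokens = 30e12
    training_cost_factor = 3 * 2           # fwd+bwd+update, plus no quantization
    inference_tokens_per_month = 500e12

    training_equiv = training_tokens * training_cost_factor  # ~180T token-equivalents
    months = training_equiv / inference_tokens_per_month
    print(f"Training compute amortized after ~{months * 30:.0f} days of serving")
    # ~11 days; after that, cumulative inference compute exceeds training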
jillesvangurp•7mo ago
My view is that increased energy demand is not necessarily a bad thing in itself. First, AI is by no means the dominant source of such demand; other things (transport, shipping, heating, etc.) outrank it, so a little bit of pressure from AI won't move the needle too much. Our main problem remains the same: too much CO2 being emitted. Second, meeting increased demand is typically done with renewables these days. Not because it's nice to do so, but because it's cheap to do so. That's why renewables are popular in places like Texas. They don't care about the planet there, but they love cheap energy. And the more cheap, clean power we bring online, the worse expensive, dirty power looks.
Increased demand leads to mostly new clean generation and increased pressure to deprecate dirty, expensive generation. That's why coal is all but gone from most energy markets. That has nothing to do with how dirty it is and everything to do with how expensive it is. Gas-based generation is heading in the same direction. Any investment in such generation should be considered very risky.
Short term, of course, you get some weird behavior like data centers being powered by gas turbines. Not because it's cheap, but because it's easy and quick. Long term, a cost optimization would be getting rid of the gas generators. And with inference increasingly becoming the main thing in terms of energy and tokens, energy also becomes the main differentiator for the profitability of AI services. Which again points at using cheap renewables to maximize profit. The winners in this market will be working on efficiency, and part of that is energy efficiency, because that and the hardware are the main costs involved here.
mbgerring•7mo ago
All serious, viable plans for decarbonization include a massive increase in electricity consumption, due to the electrification of transportation, industrial processes, etc., along with an increase in renewable energy production. This isn't new, but AI datacenters are a very large net-new single user of electricity.
If the amount of money already poured into AI had gone into the rollout of clean energy infrastructure, we wouldn't even be having this conversation, but here we are.
It makes perfect sense from a policy perspective, given that there are a small number of players in this space with more resources than most governments, to piggyback on this wave of infrastructure buildout.
It also makes plenty of financial sense. Initial capex for adding clean energy generation is high, but given both the high electricity usage of AI datacenters, and the long-term impact on the grid that someone will eventually have to pay for, companies deploying AI infrastructure would be smart to use the huge amount of capital at their disposal to generate their own electricity.
It's also, from a deployment standpoint, pretty straightforward — we're talking about massive, rectangular, warehouse-like buildings with flat roofs. We should have already mandated that all such buildings be covered in solar panels with on-site storage, at a minimum.
phillipcarter•7mo ago
I agree that in general, if the goal is to limit CO2 emissions and use renewable sources of energy, we ought not to focus on AI first, because it is dwarfed by many other things that we take for granted today. My canonical example I give folks is that the latte they order every day from Starbucks involves substantially more energy and water use than most uses of ChatGPT on a daily basis.
But as we move to digitize more and more of this world, and now to create automated cognitive labor, we should start with the right foundations. I'd rather we not have to disentangle critical AI infrastructure from coal power plants later, and I'd rather we try to limit the compute available to workloads in ways that encourage people to use the tech that actually fits their use case rather than throw everything at the most expensive model every time.
nico_h•7mo ago
More seriously, I'm not too sure about the energy cost and the IP infringed during training, versus the value added to society by providing generic, mostly accurate but sometimes wildly wrong answers. Or by generating text or pretty pictures for a few milli-cents in cooling and electricity vs. asking a human to do the same for a few kilo-cents.
There's a lot of ladder-kicking in the software industry these days.
atonse•7mo ago
So it seems like the better goal is to just aim for more clean energy.
bee_rider•7mo ago
There’s a lot of focus on the carbon cost of various digital goods. I get it. Destroying the environment is a big problem. But like, maybe we also should not make a bunch of plastic crap and ship it around the world a bunch of times.
bbor•7mo ago
For anyone who's curious about the specifics of AI emissions, the recent MIT Technology Review article is the gold standard in terms of specificity, neutrality, and nuance: https://www.technologyreview.com/2025/05/20/1116327/ai-energ...
I also did some napkin math here in 2024.12: https://bsky.app/profile/robb.doering.ai/post/3lckwra33vk2t TL;DR: Eating one less burger affords you ~300 chatbot inferences, and avoiding a flight from ATL to SFO affords you ~16,000.
jcynix•7mo ago
Air conditioning for example would be a good place to save energy, as the world wide energy consumption is a multiple of AI's consumption. But climate change will push the need (not luxury) for air conditioning up, which is the Catch-22 in this case.
The International Energy Agency (IEA) estimates that 10% of globally generated electricity is used for air conditioning. But it would nevertheless be a good idea to require AI companies to secure renewable energy before they reach similar consumption levels.
Regarding the "morass": we already tell people how fast they can drive and require companies to limit air pollution (at least in some countries), so no problem here.
masswerk•7mo ago
This really applies to any application that consumes a high percentage of the available resources. (Compare: data centers are responsible for almost 80% of the electricity consumption in the Dublin area, according to the paper.) The rationale of purpose, resource demand, and expected effects is secondary to this. The primary question is about (significant) quantities.