
They were drawn to Korea with dreams of K-pop stardom – but then let down

https://www.bbc.com/news/articles/cvgnq9rwyqno
1•breve•53s ago•0 comments

Show HN: AI-Powered Merchant Intelligence

https://nodee.co
1•jjkirsch•3m ago•0 comments

Bash parallel tasks and error handling

https://github.com/themattrix/bash-concurrent
1•pastage•3m ago•0 comments

Let's compile Quake like it's 1997

https://fabiensanglard.net/compile_like_1997/index.html
1•billiob•4m ago•0 comments

Reverse Engineering Medium.com's Editor: How Copy, Paste, and Images Work

https://app.writtte.com/read/gP0H6W5
1•birdculture•9m ago•0 comments

Go 1.22, SQLite, and Next.js: The "Boring" Back End

https://mohammedeabdelaziz.github.io/articles/go-next-pt-2
1•mohammede•15m ago•0 comments

Laibach the Whistleblowers [video]

https://www.youtube.com/watch?v=c6Mx2mxpaCY
1•KnuthIsGod•16m ago•1 comments

Slop News - HN front page right now hallucinated as 100% AI SLOP

https://slop-news.pages.dev/slop-news
1•keepamovin•21m ago•1 comments

Economists vs. Technologists on AI

https://ideasindevelopment.substack.com/p/economists-vs-technologists-on-ai
1•econlmics•23m ago•0 comments

Life at the Edge

https://asadk.com/p/edge
2•tosh•29m ago•0 comments

RISC-V Vector Primer

https://github.com/simplex-micro/riscv-vector-primer/blob/main/index.md
3•oxxoxoxooo•32m ago•1 comments

Show HN: Invoxo – Invoicing with automatic EU VAT for cross-border services

2•InvoxoEU•33m ago•0 comments

A Tale of Two Standards, POSIX and Win32 (2005)

https://www.samba.org/samba/news/articles/low_point/tale_two_stds_os2.html
2•goranmoomin•37m ago•0 comments

Ask HN: Is the Downfall of SaaS Started?

3•throwaw12•38m ago•0 comments

Flirt: The Native Backend

https://blog.buenzli.dev/flirt-native-backend/
2•senekor•39m ago•0 comments

OpenAI's Latest Platform Targets Enterprise Customers

https://aibusiness.com/agentic-ai/openai-s-latest-platform-targets-enterprise-customers
1•myk-e•42m ago•0 comments

Goldman Sachs taps Anthropic's Claude to automate accounting, compliance roles

https://www.cnbc.com/2026/02/06/anthropic-goldman-sachs-ai-model-accounting.html
3•myk-e•44m ago•5 comments

Ai.com bought by Crypto.com founder for $70M in biggest-ever website name deal

https://www.ft.com/content/83488628-8dfd-4060-a7b0-71b1bb012785
1•1vuio0pswjnm7•45m ago•1 comments

Big Tech's AI Push Is Costing More Than the Moon Landing

https://www.wsj.com/tech/ai/ai-spending-tech-companies-compared-02b90046
4•1vuio0pswjnm7•47m ago•0 comments

The AI boom is causing shortages everywhere else

https://www.washingtonpost.com/technology/2026/02/07/ai-spending-economy-shortages/
2•1vuio0pswjnm7•49m ago•0 comments

Suno, AI Music, and the Bad Future [video]

https://www.youtube.com/watch?v=U8dcFhF0Dlk
1•askl•51m ago•2 comments

Ask HN: How are researchers using AlphaFold in 2026?

1•jocho12•54m ago•0 comments

Running the "Reflections on Trusting Trust" Compiler

https://spawn-queue.acm.org/doi/10.1145/3786614
1•devooops•59m ago•0 comments

Watermark API – $0.01/image, 10x cheaper than Cloudinary

https://api-production-caa8.up.railway.app/docs
1•lembergs•1h ago•1 comments

Now send your marketing campaigns directly from ChatGPT

https://www.mail-o-mail.com/
1•avallark•1h ago•1 comments

Queueing Theory v2: DORA metrics, queue-of-queues, chi-alpha-beta-sigma notation

https://github.com/joelparkerhenderson/queueing-theory
1•jph•1h ago•0 comments

Show HN: Hibana – choreography-first protocol safety for Rust

https://hibanaworks.dev/
5•o8vm•1h ago•1 comments

Haniri: A live autonomous world where AI agents survive or collapse

https://www.haniri.com
1•donangrey•1h ago•1 comments

GPT-5.3-Codex System Card [pdf]

https://cdn.openai.com/pdf/23eca107-a9b1-4d2c-b156-7deb4fbc697c/GPT-5-3-Codex-System-Card-02.pdf
1•tosh•1h ago•0 comments

Atlas: Manage your database schema as code

https://github.com/ariga/atlas
1•quectophoton•1h ago•0 comments

Environmental Impacts of Artificial Intelligence

https://www.greenpeace.de/publikationen/environmental-impacts-of-artificial-intelligence
84•doener•7mo ago

Comments

djoldman•7mo ago
> Greenpeace calls for the following measures to minimize the environmental impacts of Artificial Intelligence:

> 1. An energy-efficient AI infrastructure powered 100% by renewable energy. This green power must be additionally generated.

> 2. AI companies must disclose: a. How much electricity is used in operating their AI. b. How much power is consumed by users during their use of AI. c. The goals under which their models were trained, and which environmental parameters were considered.

> 3. AI developers must take responsibility for their supply chains. They must contribute to the expansion of renewable energy in line with their growth and ensure that local communities do not suffer negative consequences (e.g., lack of drinking water, higher electricity prices).

Is there a term for "energy neutrality," the cousin of "net neutrality"?

Do we as a society want to wade into the morass of telling people what kinds of activities they can use energy for?

If we care about saving a watt-hour, there are lots of places to look. Pointing fingers at the incredible energy consumption of internet-delivered HD video might not feel very comfortable to lots of folks.

doener•7mo ago
> 2. AI companies must disclose: a. How much electricity is used in operating their AI.

Doesn't training the model consume the most energy in most cases?

gruez•7mo ago
Depends. For CoT models inference is significantly more costly (compared to regular models).

Also,

>Brent Thill of Jefferies, an analyst, estimates that [inference] accounts for 96% of the overall energy consumed in data centres used by the AI industry.

https://archive.is/GJs5n

jnieswl•7mo ago
Foreword author here. I agree; even early estimates, e.g. from Meta (2022), suggested 20% training, 10% experiments, 70% inference. And adoption is rising month over month.
Zacharias030•7mo ago
Those must have been about things other than LLMs, though. Meta has huge inference loads for other types of models.
Zacharias030•7mo ago
This is changing rapidly.

Google announced they are serving 500T tokens per month. State-of-the-art models are currently trained on less than 30T tokens. Even if training tokens are more costly to run (e.g., a factor of 3x for forward, backward, and weight updates, and another factor of 2x for missing quantization), you end up in a situation where inference compute dominates training after a very short amortization period.
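
In back-of-the-envelope Python (the 500T/month serving figure, the 30T training figure, and the 3x/2x factors are the assumptions above, not measured data):

  inference_tokens_per_month = 500e12  # ~500T tokens served per month
  training_tokens = 30e12              # frontier runs: <30T training tokens
  cost_per_training_token = 3 * 2      # 3x fwd/bwd/update, 2x missing quantization

  training_cost = training_tokens * cost_per_training_token   # 180T token-equivalents
  print(training_cost / inference_tokens_per_month)           # ~0.36 months, ~11 days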

doener•7mo ago
Thank you!
jillesvangurp•7mo ago
This is a good point. Another point is that the better models get, the fewer tokens will be wasted on unproductive generation for answers that are wrong in some way. Better answers might lead to increased demand, of course. But less waste is not a bad thing in itself. And improved quality of answers has other economic advantages.

My view is that increased energy demand is not necessarily a bad thing in itself. First, it's by no means the dominant source of such demand, other things (transport, shipping, heating, etc.) outrank it; so a little bit of pressure from AI won't move the needle too much. Our main problem remains the same: too much CO2 being emitted. Second, meeting increased demand is typically done with renewables these days. Not because it's nice to do so but because it's cheap to do so. That's why renewables are popular in places like Texas. They don't care about the planet there. But they love cheap energy. And the more cheap, clean power we bring online, the worse expensive dirty power actually looks.

Increased demand leads to mostly new clean generation and increased pressure to deprecate dirty, expensive generation. That's why coal is all but gone from most energy markets. That has nothing to do with how dirty it is and everything to do with how expensive it is. Gas-based generation is heading in the same direction. Any investment in such generation should be considered very risky.

Short term, of course, you get some weird behavior, like data centers being powered by gas turbines. Not because it's cheap but because it's easy and quick. Long term, a cost optimization would be getting rid of the gas generators. And with inference increasingly becoming the main thing in terms of energy and tokens, energy also becomes the main differentiator for the profitability of AI services. Which again points at using cheap renewables to maximize profit. The winners in this market will be working on efficiency. And part of that is energy efficiency, because that and the hardware are the main costs involved here.

Uehreka•7mo ago
Net Neutrality is a really bad, awkward term that constantly confuses laypeople. I get what you’re saying, but don’t lean on the term Net Neutrality in the hope that it will help people understand by building off something else they understand: people don’t understand Net Neutrality.
mbgerring•7mo ago
As long as energy production and consumption has severe downstream impacts, yes, we do need to wade into this territory.

All serious, viable plans for decarbonization include a massive increase in electricity consumption, due to electrification of transportation, industrial processes, etc, along with an increase in renewable energy production. This isn't new, but AI datacenters are a very large net new single user of electricity.

If the amount of money already poured into AI had gone into the rollout of clean energy infrastructure, we wouldn't even be having this conversation, but here we are.

It makes perfect sense from a policy perspective, given that there are a small number of players in this space with more resources than most governments, to piggyback on this wave of infrastructure buildout.

It also makes plenty of financial sense. Initial capex for adding clean energy generation is high, but given both the high electricity usage of AI datacenters, and the long-term impact on the grid that someone will eventually have to pay for, companies deploying AI infrastructure would be smart to use the huge amount of capital at their disposal to generate their own electricity.

It's also, from a deployment standpoint, pretty straightforward — we're talking about massive, rectangular, warehouse-like buildings with flat roofs. We should have already mandated that all such buildings be covered in solar panels with on-site storage, at a minimum.

nico_h•7mo ago
Sadly, we’re already living with the long-term impact of the previous energy revolution, so we’d better get started now instead of waiting until we feel the impact of this next compute evolution.
phillipcarter•7mo ago
> If we care about saving a watt-hour, there are lots of places to look. Pointing fingers at the incredible energy consumption of internet-delivered HD video might not feel very comfortable to lots of folks.

I agree that in general, if the goal is to limit CO2 emissions and use renewable sources of energy, we ought not to focus on AI first, because it is dwarfed by many other things we take for granted today. The canonical example I give folks is that the latte they order every day from Starbucks involves substantially more energy and water than most people's daily use of ChatGPT.

But as we move to digitize more and more of this world, and now create automated cognitive labor, we should start with the right foundations. I'd rather we not have to disentangle critical AI infrastructure from coal power plants later, and I'd rather we limit the compute available to workloads in ways that encourage people to use the tech actually befitting their use case rather than throw everything at the most expensive model every time.

nico_h•7mo ago
Oh wow, growing, drying, transporting, roasting, transporting, and brewing something takes more energy and physical resources than a single query on a computer? Physical goods are amazing like that. I wonder how margins on software are so high!?

More seriously, I’m not too sure about the energy cost and IP infringement involved in training, versus the value added to society by providing generic, mostly accurate but sometimes wildly wrong answers. Or about generating text or pretty pictures for a few milli-cents in cooling and electricity vs. asking a human to do the same for a few kilo-cents.

It’s a lot of ladder-kicking in the software industry these days.

TimPC•7mo ago
How about the silly treadmill where we waste billions in compute on useless proof-of-work behaviours, and whenever more compute gets thrown at the problem we just make it harder, to ensure there isn't better output. I believe it was called buttcoin or something silly like that.
phillipcarter•7mo ago
And whose biggest claim to fame is allowing for large-scale fraud!
whiplash451•7mo ago
Using a subpar model and having to run multiple requests may not be a better deal for the climate than a SOTA model one-shotting the right answer.
phillipcarter•7mo ago
I'm saying that you can often use a "subpar" model to one-shot an answer too, provided you work a little harder on prompting and context management.
atonse•7mo ago
Technically, if it's all clean energy, does it matter if it's "energy-efficient"?

So it seems like the better goal is to just aim for more clean energy.

mcv•7mo ago
Once we've got abundant clean energy, it might not matter so much anymore, but as long as we're still burning carbon, it matters a lot. And until we get there, we should probably do both.
bee_rider•7mo ago
We should probably just do a carbon tax and not wade into that morass.

There’s a lot of focus on the carbon cost of various digital goods. I get it. Destroying the environment is a big problem. But like, maybe we also should not make a bunch of plastic crap and ship it around the world a bunch of times.

jnieswl•7mo ago
Disclaimer: Foreword author here. I agree that there are many things one could change; however, for many other services or objects you buy, you are able to estimate the environmental footprint, or you can change your consumer behaviour. For the top AI models, one has no clue how much energy is used. Hence the demands, among others, for transparency from the AI companies.
mumbisChungo•7mo ago
I have no idea what the carbon footprint of the coffee I drink, the chair I sit in, or the Netflix program I watch is. I can control my consumption of LLMs just as easily as those things.
bee_rider•7mo ago
Well, you almost certainly know more about it than me, since you work in the area. From a layman’s point of view, it seems like knowing that things are very carbon-intensive has not provoked mass behavior changes. I’d have more faith in moves that add a measurable cost. But maybe knowing how much LLMs produce could be part of motivating us to impose an actual cost.
bbor•7mo ago

  Do we as a society want to wade into the morass of telling people what kinds of activities they can use energy for?
I mean, yeah, that's just basic civil regulation. Energy generation has massive negative externalities, and preventing waste is a worthy cause. I don't agree that AI must be singled out in that sense, but even if it were, I imagine a modest push for efficiency would only help us in the long run.

  If we care about saving a watt-hour, there are lots of places to look. 
Well put, but I think it's important to bring the analysis one level up, and look at emissions. In that paradigm, meat eating and non-essential travel (yes, including vacations to Rome, business meetings, scientific conferences, and other perceived-to-be-unalienable rights) are punching way above their weight class.

For anyone who's curious about specifics re: AI emissions, the recent MIT Technology Review article is the gold standard in terms of specificity, neutrality, and nuance: https://www.technologyreview.com/2025/05/20/1116327/ai-energ... .

I also did some napkin math here in 2024.12: https://bsky.app/profile/robb.doering.ai/post/3lckwra33vk2t TL;DR: Eating one less burger affords you ~300 chatbot inferences, and avoiding a flight from ATL to SFO affords you ~16,000.
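
That napkin math as a runnable sketch; the absolute footprints are assumptions I'm plugging in to reproduce the ratios, not sourced figures:

  burger_co2_g = 3_000      # assumed: ~3 kg CO2e per beef burger
  inference_co2_g = 10      # assumed: ~10 g CO2e per chatbot inference
  flight_co2_g = 160_000    # assumed: ~160 kg CO2e per economy seat, ATL->SFO

  print(burger_co2_g / inference_co2_g)   # ~300 inferences per skipped burger
  print(flight_co2_g / inference_co2_g)   # ~16,000 inferences per skipped flight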

TimPC•7mo ago
I think if we do want to do this, then banning bitcoin proof-of-work schemes seems far more important.
jcynix•7mo ago
> If we care about saving a watt-hour, there are lots of places to look. Pointing fingers at the incredible energy consumption of internet-delivered HD video might not feel very comfortable to lots of folks.

Air conditioning, for example, would be a good place to save energy, as its worldwide energy consumption is a multiple of AI's. But climate change will push the need (not luxury) for air conditioning up, which is the Catch-22 here.

The International Energy Agency (IEA) estimates that 10% of globally generated electricity is used for air conditioning. But it would nevertheless be a good idea to require AI companies to secure renewable energy before they reach similar consumption levels.

Regarding the "morass" … we tell people how fast they can drive, and we tell companies to limit air pollution (at least in some countries), so no problem here.

masswerk•7mo ago
> Do we as a society want to wade into the morass of telling people what kinds of activities they can use energy for?

This really applies to any application that consumes a high percentage of the available resources. (Compare: data centers are responsible for almost 80% of the electricity consumption in the Dublin area, according to the paper.) The rationale of purpose, resource demand, and expected effects is secondary to this. The primary question is about (significant) quantities.

lenerdenator•7mo ago
We didn't care about the environmental impacts of all of the other stuff that made a few people obscenely rich; we're not gonna start now.

I mean, should we? Yeah. But we're not gonna.

FredPret•7mo ago
Everything you say is valid, but you left out the part where, in addition to making a few people "obscenely" rich, new tech also makes a layer of very many people under them extremely rich, and almost everyone else a lot better off in the long run.

Here's some stats showing the growth of the millionaire class, now up to 7% of the population: https://www.statista.com/chart/30671/number-of-millionaires-...

At the same time, here's some stats showing extreme poverty falling off a cliff: https://www.statista.com/statistics/1341003/poverty-rate-wor...

Such a thing was completely unthinkable before disruptive tech and the associated mega-rich became the new normal 200 years ago.

Having said that, you're correct to point out that negative externalities didn't really enter our minds until about 50-75 years ago, but it seems tech progress has made clean, green living at scale possible, at least in principle.

coliveira•7mo ago
> poverty falling off a cliff

This is 80% due to the Chinese government; if it were up to the billionaires, they would all be as poor as before.

FredPret•7mo ago
Amazing feat by the Chinese government to boost the whole planet like that. Do you have numbers for that, or are you just a committed tankie?

Also, ~100% of China's growth started when they embraced market economics in 1990. Read US business books from the '80s. It's rare to see China mentioned at all until the late '90s. Everybody was worried about Japan overtaking the US and nobody talked about the Chinese economy, because it barely existed.

coliveira•7mo ago
Market economics doesn't mean billionaire (robber baron) control. That's the difference.
mulmen•7mo ago
> We didn't care about the environmental impacts of all of the other stuff

Speak for yourself. Environmentalism has been a thing for longer than I have been alive. Clearly we care.

And before you reply with even more toxic cynicism, stand behind an idling 1960, 1980, 2000, and 2020 sedan and tell me you can’t tell the difference.

sergiotapia•7mo ago
As long as India and China are dumping obscene amounts of plastic into the ocean, I don't really wanna hear it. AI is a drop in the bucket. The measures imposed on Americans, and worse, on Europeans, are an insult.
mbgerring•7mo ago
China is going to beat the U.S. to decarbonization. This excuse never made sense, and by the end of this decade it will be unintelligible.
FredPret•7mo ago
They pollute because they've turned into the West's industrial zone. They only make a bunch of stuff because we buy it.
_0ffh•7mo ago
Do you know the CO2 footprint of the untold megatons of concrete poured into ghost cities? Germany is still manufacturing stuff, and would easily beat China in CO2/capita if they hadn't shut down the nuclear power plants.
FredPret•7mo ago
That's a fair point, I hadn't considered it.

But it's a tough analysis to make regardless, because every widget I buy here in the West that comes from China (and that's pretty much 100% of the widgets) spits out CO2 in China. Now I look cleaner and they look dirtier than would be the case if the widget were made here.

imnotlost•7mo ago
Just build a bunch of nuclear power plants around the world. If we can spend a trillion or two bombing the Taliban back into power, we can afford some energy projects.
melling•7mo ago
Greenpeace is likely opposed to that.
fragmede•7mo ago
> Nuclear energy has no place in a safe, clean, sustainable future.

https://www.greenpeace.org/usa/climate/issues/nuclear/

mcv•7mo ago
Nuclear power plants are expensive and take time to build, though. At the moment we're still burning way too much oil and coal for our energy, and everything that drives up demand contributes to that.
melling•7mo ago
We’ve had 40 years and we’re still burning all the coal Carl Sagan warned us about.

https://youtu.be/Wp-WiNXH6hI?si=3uhneUSoiZaUKS9M

So, after 40 years I’m a little tired of hearing it takes too long to build nuclear power plants.

On the bright side, we’ve almost reached peak coal:

https://www.theguardian.com/business/2024/dec/18/coal-use-to...

exiguus•7mo ago
We've had it for 40 years, and we will have it for the next N thousand years because of the waste it produces. Also, nuclear power was massively subsidized by governments; it's a business case that cannot exist without subsidies. So we all paid for it with our taxes, and we still pay because of the nuclear waste. The idea of building new nuclear plants is a new subsidy scam by lobbyists and tech giants who want to pass their costs on to the general public.
melling•7mo ago
There has been some discussion about reprocessing and reusing it.

https://e360.yale.edu/features/nuclear-waste-recycling

exiguus•7mo ago
The reality is a deep geological repository [1] for highly radioactive waste, and one is also necessary with reprocessing [2]. Reprocessing only reduces the amount of high-level waste.

[1] https://en.wikipedia.org/wiki/Deep_geological_repository

[2] https://en.wikipedia.org/wiki/High-level_radioactive_waste_m...

mcv•7mo ago
I don't remember anyone complaining about it taking too long, 40 years ago. People were worried about Three Mile Island and Chernobyl, about what to do with the waste, about leakage. But not about time.

But whatever the causes, too little has been done over those 40 years, and at the moment solar is far cheaper and faster to deploy than nuclear. I'm not stopping anyone from building nuclear power plants, but I think its window has closed. It's too expensive and too slow. But feel free to prove me wrong.

Just stop burning more oil and coal.

zekrioca•7mo ago
Why do you think this “easy solution” would work?
elpocko•7mo ago
There are many millions more GPUs in computers and game consoles around the world, and they've been burning electricity for your entertainment for decades. It's the same class of device. The environmental impact of having pretty pixels on your screens is at least an order of magnitude higher than that of AI, for inference and training combined. I don't see anyone up in arms about that.
coliveira•7mo ago
Your calculations must be severely off, because I have never heard anyone advocating for the construction of nuclear reactors to power game consoles around the world. However, we hear every day that we need to build these reactors right now if we want to have AI.
mulmen•7mo ago
Xboxes, smartphones, and personal computers are geographically distributed and so is their power generation. Data centers are …centralized. Dedicated power plants for large data center installations are not new.
gruez•7mo ago
>Xboxes, smartphones, and personal computers are geographically distributed and so is their power generation.

This is irrelevant because most "Xboxes, smartphones, and personal computers" are powered by centralized fossil fuel power plants that could plausibly be replaced with nuclear reactors, just like the power plant for a datacenter can be replaced with nuclear reactors.

Kudos•7mo ago
My Xbox is powered by solar. I can't say the same for my use of Claude, and I do not have the same agency to change that.
mulmen•7mo ago
The point is that the centralization of data centers makes them suitable for dedicated power generation.
elpocko•7mo ago
I hear people advocating for the construction of nuclear reactors every day. They don't mention gaming, just like they don't mention refrigerators or washing machines specifically. A gaming machine consumes the same amount of energy as a machine used for AI, it's the same hardware. AI consumes it for seconds per user, while one gaming machine is used for hours per session. The energy required by one human to play a game for one hour could serve hundreds or thousands of AI users.
rybosworld•7mo ago
The GPUs used for AI have significantly higher utilization rates than gaming GPUs...

Here's some napkin math:

H100: 61% utilization × 700W ≈ 3.7 MWh/year

RTX 3080: 10% utilization × 320W ≈ 0.28 MWh/year
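
The same math as a runnable sketch (the utilization rates are my assumptions; the wattages are the cards' rated TDPs):

  HOURS_PER_YEAR = 24 * 365

  def mwh_per_year(tdp_watts, utilization):
      # average draw times hours per year, converted from Wh to MWh
      return tdp_watts * utilization * HOURS_PER_YEAR / 1e6

  print(mwh_per_year(700, 0.61))   # H100:     ~3.7 MWh/year
  print(mwh_per_year(320, 0.10))   # RTX 3080: ~0.28 MWh/year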

elpocko•7mo ago
How many AI users are served using a single H100 per time, and how many gamers are served using a single 3080 per time? How many gamers are simultaneously running a 3080 or equivalent for their entertainment?
mystified5016•7mo ago
Yes, AI users are being served by much more than one GPU at a time.

Which is why the power used is so much higher than for a single gaming PC.

elpocko•7mo ago
Thousands of users are being served by those GPUs. A gaming PC has one user.

I can't believe I have to point this out on HN of all places.

rybosworld•7mo ago
You can't compare those things very easily.

If you're only talking about the GPUs used for inference, then that's a different story. Not nearly as much hardware is required for inference.

But the number of GPUs needed to train models is in the tens of thousands, and there are rumors that some shops (Meta) are already using 100k+ GPUs, just for training.

Those are likely all or mostly H100s, running at least 60% of the time. Consider that OpenAI, Anthropic, Google, Meta, Tesla, X.com, etc. are all within an order of magnitude of each other in terms of compute.

For argument's sake, that's 6 companies approaching 100K H100s' worth of compute for their next-gen models.

Now consider that GPT-4 used roughly 100x more compute to train than GPT-3. And GPT-5 is rumored to follow this trend, using 100x more compute than GPT-4. Extrapolating, GPT-6 might also use 100x more compute than GPT-5.

Even if the next generation of AI GPUs is 10x as powerful as the H100 for the same amount of electricity, the next generation of models would need 10x as many GPUs (and thus 10x as much electricity demand).

Extrapolate that to GPT-7, 8, 9, etc., and you can see why people are worried about the power usage.

This isn't even theoretical. As mentioned in this thread already, these companies are signing deals to buy all the capacity of power plants in some areas.
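
To put that scaling argument in numbers (both growth factors are the rumors/assumptions above, not measurements):

  compute_growth_per_gen = 100   # each model generation: ~100x training compute
  efficiency_gain_per_gen = 10   # each GPU generation: ~10x compute per watt

  power_growth = compute_growth_per_gen / efficiency_gain_per_gen   # 10x per gen
  for gen in range(1, 4):
      print(f"{gen} generation(s) out: ~{power_growth ** gen:,.0f}x the power draw")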

elpocko•7mo ago
>100k+ GPU's, just for training.

That's a tiny drop in the sea of the almost 2 billion PC gamers [0], and hundreds of millions of gaming consoles [1] in the world. Not to mention the energy required to manufacture all that hardware. Plus the datacenters required for online gaming, which must also be considerable.

It's weird to be concerned about power usage of AI, but turning a blind eye to the massive amounts of power required by the gaming industry.

[0] https://www.statista.com/statistics/420621/number-of-pc-game...

[1] https://en.wikipedia.org/wiki/List_of_best-selling_game_cons...

rybosworld•7mo ago
I think it's weird to ignore that every GPU used for AI equates to 10-50 gaming GPUs (10 if you assume every gamer has a 3080, which they don't), and that the number of GPUs used for AI is growing 10x every 18-24 months.

It's not that AI uses too much power today; it's that at the current trend, it'll be using somewhere between 100x and 1000x as much power by 2030/2035, which would place it at 2-20% of total power consumption.

elpocko•7mo ago
Where does this "10-50" figure come from? A quick Google search says an H100 draws 700 W, while a 3080 draws 320 W.

AI provides tangible value to businesses and private users beyond mere entertainment. We'll see how much power it consumes in the future, and where that power comes from.

rybosworld•7mo ago
I did the math in a previous comment that you already replied to:

"The GPUs used for AI have significantly higher utilization rates than gaming GPUs... Here's some napkin math:

H100: 61% utilization × 700W ≈ 3.7 MWh/year

RTX 3080: 10% utilization × 320W ≈ 0.28 MWh/year"

An H100 uses at least 10x as much energy as a 3080 over its lifetime. And most gamers aren't playing on 3080s.

coliveira•7mo ago
The difference here is between potential energy use and actual energy use. AI chips as well as other server chips are in use most of the time. But if I sell 1 billion gaming chips, these chips will be in use a small fraction of the time.
whiplash451•7mo ago
Indeed. But how many gaming GPUs are out there in the world?
gruez•7mo ago
That's his point? Greenpeace wants AI datacenters to be built with clean energy. elpocko points out that plenty of other pointless electricity consumers also aren't being built with clean energy today. That's not an argument against green energy; it's pointing out that Greenpeace isn't very rigorous with its pleas. They're seemingly picking whatever is most topical. We should be against this, because latching on to the latest thing basically guarantees that when the next thing rolls around, all the momentum will be lost. Remember when everyone was up in arms about crypto mining? Now it's barely brought up, because everyone's focused on AI.
namuol•7mo ago
Show your math.
elpocko•7mo ago
Besides common sense, I can tell you about my kWh meter going brrr when playing games (400 W continuously, sometimes for hours on end) vs. running Stable Diffusion or Llama-whatever (400 W for 15 seconds every 3 minutes, for an hour or two). Extrapolate from that.
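
The duty-cycle math on those numbers:

  gaming_avg_w = 400                 # 400 W continuous while playing
  inference_avg_w = 400 * 15 / 180   # 400 W for 15 s out of every 180 s
  print(inference_avg_w)             # ~33 W average, roughly 1/12th of gaming
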
namuol•7mo ago
Training runs basically 24/7 at full throttle. Inference isn’t the main source of energy consumption.
elpocko•7mo ago
Show your math.
Night_Thastus•7mo ago
The GPUs in PCs, consoles and phones aren't running full tilt 24/7. They run very bursty workloads for a couple hours a day at most.

Those in AI data centers never stop running and completely utilize their capacity. The difference in power usage is astronomical.

whiplash451•7mo ago
Can you please at least do the back-of-the-envelope math behind the “astronomical”?

I don’t claim to know, but we ought to be able to have a rational debate on this.

bluefirebrand•7mo ago
You don't need concrete numbers or even napkin math to realize that a gaming computer running a GPU for a couple of hours in the evening is going to use much less energy than a GPU running maxed out 24/7 for AI

There's nothing irrational about suggesting AI GPUs are consuming far more power

elpocko•7mo ago
Yes, one AI GPU uses more energy than one GPU used for gaming. The one AI GPU however is shared between a large number of users, while the one gaming GPU is used by one player.

Apparently a single gaming GPU can be used to run an LLM that serves hundreds of concurrent requests.

> Benchmarking Llama 3.1 8B (fp16) on our 1x RTX 3090 instance suggests that it can support apps with thousands of users by achieving reasonable tokens per second at 100+ concurrent requests.

https://backprop.co/environments/vllm

jesus_666•7mo ago
But that's a tiny model; it's the smallest version of Llama 3.1. The commercially marketed models are way bigger - e.g. GPT-4 has been estimated to use about 1.76 trillion parameters, 220 times more than the Llama build you mentioned. Their resource and performance requirements are vastly different.

You're essentially arguing that shipping marine diesel generators must be trivial because you can fit a dozen moped motors on the bed of your pickup truck just fine.

elpocko•7mo ago
Okay but these tiny models are being used by people and businesses instead of GPT-4. My point was that they consume less energy per user than a rig used for gaming.

I have no insight into how many GPT-4 users are served per GPU, but I would assume OpenAI heavily optimizes for that, considering the cost of running that thing. It's probably in the same ballpark: hundreds to thousands of concurrent user requests per GPU. Still better than one GPU per gamer, even if it requires 10x the energy.

pavlov•7mo ago
It’s not the same class of device.

A typical NVIDIA server GPU consumes 700W, and a server might have eight of them, so 5.6kW.

A PlayStation 5 consumes 200W total.

gruez•7mo ago
A PS5 serves a single person, maybe two. A datacenter GPU might be shared by dozens or hundreds of people, depending on how you count occupancy.
jonas21•7mo ago
A typical server is serving hundreds or thousands of user sessions, while a PlayStation 5 is serving only one.
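
Per-user watts under those assumptions (the session counts are illustrative, not measured):

  server_w = 8 * 700        # one server with eight 700 W GPUs
  print(server_w / 1000)    # ~5.6 W per user at 1,000 concurrent sessions
  print(200 / 1)            # 200 W per user on a PS5
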
nemo•7mo ago
There are a lot of folks who vastly underestimate the carbon output of current AI training and inference, and you're among them. The number of data centers being raised right now, and the increased power planning around data centers across the globe, point to a reality of energy consumption that's probably an order of magnitude higher than you imagine. At a time when the cost of carbon poisoning the oceans is getting really ugly, driving extinctions, melting polar ice, and driving global warming, writing off a major new generator of atmospheric carbon is dangerously irresponsible.

https://www.technologyreview.com/2025/05/20/1116327/ai-energ...

elpocko•7mo ago
>writing off a major new generator of atmospheric carbon is dangerously irresponsible.

I agree: ignoring the carbon footprint of the gaming industry is irresponsible.

nemo•7mo ago
I wouldn't recommend ignoring any of the sources of pollution.
js8•7mo ago
However, each consumer pays for their own gaming console's power. With AI (but also other "free" Internet services), there is a tragedy-of-the-commons (coordination) problem: you don't know the direct cost to you, so you (and others) will not make rational choices.
toshinoriyagi•7mo ago
The most common GPU, per the Steam survey, is the RTX 3060 at 170W TDP. A huge share of users have cards near or below this TDP. The SXM H100 has a 700W TDP and will spend far more of its life at or near that value.

Given an average ~8 hours of work/school and ~8 hours of sleep, gaming GPUs likely don't use anywhere near as much power. Plus, even when they are on, they will probably idle near 30W-60W for a lot of time spent browsing the web or watching videos.

There are more gaming GPUs in existence right now, but the number of AI chips is likely closing that gap rapidly.

And of course, what is that energy being used for? People playing games are typically having fun, bonding with friends, or engaging in social behavior. A huge amount of AI is illegally trained on copyrighted works without license to use them, causing significant harm to various fields. Plus the deluge of AI slop bogging down the internet, social media, forums, image/art-hosting sites, search, and more.

I think it will be a while before modern generative AI is even close to providing value in aggregate.

narrator•7mo ago
Wait till the mining gets automated, the transportation gets automated, the manufacturing and construction get automated. There won't be much labor left, and we will run into our ecological limits to growth at meteoric speed, since those limits will be the constraint on the AI/robot genie. The only job left in government will be deciding who gets to use the AI/robot genie and frantically playing whack-a-mole with the paperclip maximizers that will appear everywhere. The whole economy will collapse to that. Basically, central planning all over again. This is why we need Free Market Ecology. I'll post a link if anyone's interested.
astariul•7mo ago
Please share
narrator•7mo ago
https://botsfordism.substack.com/p/the-economic-math-behind-...
yomismoaqui•7mo ago
Is Greenpeace still a thing? I thought that Greta Thunberg and Just Stop Oil stole their thunder.
sien•7mo ago
A €100M budget, 3.4K staff, 34K volunteers.

They are a big thing. Old people still donate to them.

They are a big reason Africans don't grow GMOs that can help children avoid blindness.

https://en.wikipedia.org/wiki/Greenpeace

MoonGhost•7mo ago
Realistic calculations should include both sides: the new way vs. the old. In this case, AI-assisted vs. manual. Here, intentionally, only one side is considered, because the comparison doesn't produce the desired result. Which makes it attention-grabbing BS.
platevoltage•7mo ago
It is kind of weird to see the same people who have been saying "Our grid can't support electric cars" also not seeing any issue with the injection of AI into everything we see and do.
klysm•7mo ago
I don’t think this argument applies because AI workloads can be centralized
grej•7mo ago
Linking energy use to the environment is a political choice, and Greenpeace are some of the worst offenders for making the situation worse by opposing nuclear power at every turn.
dydghks2033•7mo ago
Interesting point about energy cost. If GPT inference keeps scaling, latency + watt efficiency might become central to AGI deployment.
option•7mo ago
What are the environmental impacts of Greenpeace lobbying against nuclear power?
Zaylan•7mo ago
We talk a lot about AI’s potential, but its energy footprint is often underestimated. As model sizes grow, the environmental impact of both training and inference may show up faster than expected. It's an issue worth more attention.
whiplash451•7mo ago
Let's try to ground the discussion into data.

Headcounts:

- Gaming GPUs: Installed base 700-900M GPUs + active gaming rigs (100-200M). Assumption of 250M active gamers worldwide.

- AI GPUs: ~3 million high-performance AI GPUs currently in active use globally

Average power usage:

Gaming: 3 hours gaming at 300W → 37.5W average over 24h

AI: 16h × 600W + 8h × 100W idle = 10,400Wh → 433W average over 24h

Total Global GPU Power Consumption:

Gaming: 250M GPUs × 37.5W avg ≈ 9.4 GW → 9.4 GW × 24h × 365 = 82.4 TWh/year

AI: 3M GPUs × 433W avg ≈ 1.3 GW → 1.3 GW × 24h × 365 = 11.4 TWh/year

Even taking into account that data centers also require power for cooling, which roughly doubles the AI GPUs' energy impact, gaming >> AI by a wide margin.
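
And the same sums as a runnable sketch (headcounts and duty cycles are the assumptions stated above):

  HOURS_PER_YEAR = 24 * 365

  def twh_per_year(units, avg_watts):
      return units * avg_watts * HOURS_PER_YEAR / 1e12

  gaming_avg_w = 3 * 300 / 24             # 3 h/day at 300 W -> 37.5 W average
  ai_avg_w = (16 * 600 + 8 * 100) / 24    # 16 h load + 8 h idle -> ~433 W average

  print(twh_per_year(250e6, gaming_avg_w))   # ~82 TWh/year (gaming)
  print(twh_per_year(3e6, ai_avg_w))         # ~11.4 TWh/year (AI GPUs)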