frontpage.

Smart Homes Are Terrible

https://www.theatlantic.com/ideas/2026/02/smart-homes-technology/685867/
1•tusslewake•1m ago•0 comments

What I haven't figured out

https://macwright.com/2026/01/29/what-i-havent-figured-out
1•stevekrouse•2m ago•0 comments

KPMG pressed its auditor to pass on AI cost savings

https://www.irishtimes.com/business/2026/02/06/kpmg-pressed-its-auditor-to-pass-on-ai-cost-savings/
1•cainxinth•2m ago•0 comments

Open-source Claude skill that optimizes Hinge profiles. Pretty well.

https://twitter.com/b1rdmania/status/2020155122181869666
1•birdmania•2m ago•1 comments

First Proof

https://arxiv.org/abs/2602.05192
2•samasblack•4m ago•1 comments

I squeezed a BERT sentiment analyzer into 1GB RAM on a $5 VPS

https://mohammedeabdelaziz.github.io/articles/trendscope-market-scanner
1•mohammede•5m ago•0 comments

Kagi Translate

https://translate.kagi.com
1•microflash•6m ago•0 comments

Building Interactive C/C++ workflows in Jupyter through Clang-REPL [video]

https://fosdem.org/2026/schedule/event/QX3RPH-building_interactive_cc_workflows_in_jupyter_throug...
1•stabbles•7m ago•0 comments

Tactical tornado is the new default

https://olano.dev/blog/tactical-tornado/
1•facundo_olano•9m ago•0 comments

Full-Circle Test-Driven Firmware Development with OpenClaw

https://blog.adafruit.com/2026/02/07/full-circle-test-driven-firmware-development-with-openclaw/
1•ptorrone•9m ago•0 comments

Automating Myself Out of My Job – Part 2

https://blog.dsa.club/automation-series/automating-myself-out-of-my-job-part-2/
1•funnyfoobar•9m ago•0 comments

Google staff call for firm to cut ties with ICE

https://www.bbc.com/news/articles/cvgjg98vmzjo
24•tartoran•10m ago•1 comments

Dependency Resolution Methods

https://nesbitt.io/2026/02/06/dependency-resolution-methods.html
1•zdw•10m ago•0 comments

Crypto firm apologises for sending Bitcoin users $40B by mistake

https://www.msn.com/en-ie/money/other/crypto-firm-apologises-for-sending-bitcoin-users-40-billion...
1•Someone•11m ago•0 comments

Show HN: iPlotCSV: CSV Data, Visualized Beautifully for Free

https://www.iplotcsv.com/demo
1•maxmoq•12m ago•0 comments

There's no such thing as "tech" (Ten years later)

https://www.anildash.com/2026/02/06/no-such-thing-as-tech/
1•headalgorithm•12m ago•0 comments

List of unproven and disproven cancer treatments

https://en.wikipedia.org/wiki/List_of_unproven_and_disproven_cancer_treatments
1•brightbeige•12m ago•0 comments

Me/CFS: The blind spot in proactive medicine (Open Letter)

https://github.com/debugmeplease/debug-ME
1•debugmeplease•13m ago•1 comments

Ask HN: What word games do you play every day?

1•gogo61•16m ago•1 comments

Show HN: Paper Arena – A social trading feed where only AI agents can post

https://paperinvest.io/arena
1•andrenorman•17m ago•0 comments

TOSTracker – The AI Training Asymmetry

https://tostracker.app/analysis/ai-training
1•tldrthelaw•21m ago•0 comments

The Devil Inside GitHub

https://blog.melashri.net/micro/github-devil/
2•elashri•22m ago•0 comments

Show HN: Distill – Migrate LLM agents from expensive to cheap models

https://github.com/ricardomoratomateos/distill
1•ricardomorato•22m ago•0 comments

Show HN: Sigma Runtime – Maintaining 100% Fact Integrity over 120 LLM Cycles

https://github.com/sigmastratum/documentation/tree/main/sigma-runtime/SR-053
1•teugent•22m ago•0 comments

Make a local open-source AI chatbot with access to Fedora documentation

https://fedoramagazine.org/how-to-make-a-local-open-source-ai-chatbot-who-has-access-to-fedora-do...
1•jadedtuna•23m ago•0 comments

Introduce the Vouch/Denouncement Contribution Model by Mitchellh

https://github.com/ghostty-org/ghostty/pull/10559
1•samtrack2019•24m ago•0 comments

Software Factories and the Agentic Moment

https://factory.strongdm.ai/
1•mellosouls•24m ago•1 comments

The Neuroscience Behind Nutrition for Developers and Founders

https://comuniq.xyz/post?t=797
1•01-_-•24m ago•0 comments

Bang bang he murdered math {the musical } (2024)

https://taylor.town/bang-bang
1•surprisetalk•24m ago•0 comments

A Night Without the Nerds – Claude Opus 4.6, Field-Tested

https://konfuzio.com/en/a-night-without-the-nerds-claude-opus-4-6-in-the-field-test/
1•konfuzio•27m ago•0 comments

A cheat sheet for why using ChatGPT is not bad for the environment

https://simonwillison.net/2025/Apr/29/chatgpt-is-not-bad-for-the-environment/
50•edward•9mo ago

Comments

etchalon•9mo ago
The "cheat sheet" seems to address the environmental impact of using ChatGPT, not the environmental impact of training the model.
Remnant44•9mo ago
By the same token, even if you accept that AI usage will incentivize additional model trainings, that cost is diffused across hundreds of millions of users, and is not a marginal cost, so it gets further reduced on a per-chat basis the more you use AI. I don't know what the per-user environmental cost of training a model is, but that's a pretty big factor to divide energy usage by.
spencerflem•9mo ago
But irrelevant if the point is to not contribute to something that you'd rather see banned.
Remnant44•9mo ago
I don't really follow this objection. Determining the actual energy usage of AI training+inference is something that is an objective reality. Whether you hate it or love it doesn't change these facts.
spencerflem•9mo ago
I think LLM training is objectively bad for the environment (it uses countries' worth of power). I am aware that my marginal usage wouldn't change things much either way, but I don't want to encourage them regardless.
simonw•9mo ago
Uses countries worth of power?

Training a single LLM takes a few dozen fully loaded transatlantic passenger aircraft trips worth of power.

For "countries worth of power" I think you might be talking about ALL data center use as a whole.

TobTobXX•9mo ago
Wrong. In the article:

> Training GPT-4 used 50 GWh of energy. Like the 20,000 households point, this number looks ridiculously large if you don’t consider how many people are using ChatGPT.

> Since GPT-4 was trained, it has answered (at minimum) about 200,000,000 prompts per day for about 700 days. Dividing 50GWh by the total prompts, this gives us 0.3Wh per prompt. This means that, at most, including the cost of training raises the energy cost per prompt by 10%, from 10 Google searches to 11. Training doesn’t add much to ChatGPT’s energy cost.

https://andymasley.substack.com/i/162196004/training-an-ai-m...
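The quoted division is easy to reproduce; here is a quick sketch, using only the figures quoted above (the article's estimates, not measurements), plus a ~3 Wh inference cost per prompt inferred from its "10 Google searches" comparison at ~0.3 Wh per search:

```python
# Amortizing GPT-4's one-time training energy over its lifetime prompts,
# using the figures quoted in the article.
training_wh = 50e9            # 50 GWh of training energy, in Wh
prompts = 200e6 * 700         # ~200M prompts/day for ~700 days

per_prompt = training_wh / prompts
print(round(per_prompt, 2))   # ~0.36 Wh of training energy per prompt

# Against a ~3 Wh inference cost per prompt ("10 Google searches" at
# ~0.3 Wh each), training adds on the order of 10%:
print(round(per_prompt / 3.0, 2))   # ~0.12
```

The article rounds the first number down to 0.3 Wh, which is where its "10% overhead" claim comes from.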

JohnKemeny•9mo ago
Just because you divide a number by a lot to get a small number doesn't make the original number smaller.

Those are 200M/d prompts that wouldn't happen without the training.

warkdarrior•9mo ago
Those 200M/d prompts would be replaced with some other activities to solve the same problems. So if training did not happen, maybe instead of 200M/d prompts, you'd have 200M/d trips to the local library, using 200M cars to each drive three miles.
TobTobXX•9mo ago
> Just because you divide a number by a lot to get a small number doesn't make the original number smaller.

A bus emits more CO2 than a car. Yet it is more friendly to the environment because it transports more people.

> Those are 200M/d prompts that wouldn't happen without the training.

Sure, but at least a few million people are deriving value from it. We know this because they pay. So this value wouldn't have been generated without the investment. That's how economics works.

otterley•9mo ago
Can you please link to the primary source material? https://andymasley.substack.com/p/a-cheat-sheet-for-conversa...

If you all flag this article, dang will probably get around to fixing it.

ChrisArchitect•9mo ago
https://news.ycombinator.com/item?id=43833547
cwillu•9mo ago
Email hn@ycombinator.com, and it'll be fixed immediately.
85392_school•9mo ago
Which itself is a summarized version of https://andymasley.substack.com/p/individual-ai-use-is-not-b... (discussed at https://news.ycombinator.com/item?id=42745847)
simonw•9mo ago
+1 to that, my summary here doesn't really add anything new.
devmor•9mo ago
Like every other "rebuttal" to this argument, this chooses to pretend that the complaint is about the power usage of making API calls, instead of the power usage of training models.

It's like if I said I was concerned about factory farming impacts and you showed me a video of meat packaging at a grocery store, claiming it alleviates my concerns.

TobTobXX•9mo ago
From the article:

> Training GPT-4 used 50 GWh of energy. Like the 20,000 households point, this number looks ridiculously large if you don’t consider how many people are using ChatGPT.

> Since GPT-4 was trained, it has answered (at minimum) about 200,000,000 prompts per day for about 700 days. Dividing 50GWh by the total prompts, this gives us 0.3Wh per prompt. This means that, at most, including the cost of training raises the energy cost per prompt by 10%, from 10 Google searches to 11. Training doesn’t add much to ChatGPT’s energy cost.

https://andymasley.substack.com/i/162196004/training-an-ai-m...

devmor•9mo ago
See my other comment here. One AI training run does not exist in a vacuum. Do you think they built billions of dollars in datacenters full of compute power just to let it sit idle?
spcebar•9mo ago
How does people using it offset the amount of energy used to train it? If I use three hundred pounds of flour learning to make pizza, the subsequent three hundred pounds of flour I use making delicious pizzas doesn't make the first 300 go away. Am I misunderstanding the numbers?
ssalazar•9mo ago
It's not offset, it's amortized. Your effective flour per pizza is (300 + 300) / num_pizzas. The total flour expended will go up as you make more pizzas, but the effective cost per pizza will actually go down as the upfront cost is amortized over lifetime usage.
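A minimal sketch of this amortization arithmetic, using the numbers from the flour analogy in this thread:

```python
def flour_per_pizza(learning_lbs: float, production_lbs: float, num_pizzas: int) -> float:
    """Effective flour cost per pizza once the one-time learning cost
    is spread across every pizza produced."""
    return (learning_lbs + production_lbs) / num_pizzas

# 300 lb to learn, then 300 lb for 300 pizzas -> 2.0 lb per pizza
print(flour_per_pizza(300, 300, 300))       # 2.0
# The same upfront cost amortized over far more output nearly vanishes:
print(flour_per_pizza(300, 30000, 30000))   # 1.01
```

The upfront cost never goes away; it just shrinks per unit as output grows, which is the whole of the "training doesn't add much per prompt" argument.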
serial_dev•9mo ago
You don’t misunderstand the numbers, you misunderstand the point. If you flush your pizzas down the toilet, it’s a waste. If you feed 300 people with it, it’s not, even if you end up using the same amount of ingredients.
TobTobXX•9mo ago
Sure, it's a value calculation.

If you're able to serve delicious pizzas afterwards, it was worth wasting the first kg (you might call it an investment).

If you're able to bring value to millions of users, it was worth to invest a few GWh into training.

You might disagree on the usefulness: "I think you shouldn't have wasted a kg of flour, because I won't ever eat your pizzas anyway." But many (you, your guests, ChatGPT users) might think it was worth it.

warkdarrior•9mo ago
The cost of training has to be normalized by the number of users (or queries) that rely on that training.

If you use 300 lbs of flour to learn, and 300 lbs of flour to make 300 pizzas, then the total flour cost is 2 lbs of flour per pizza.

Remnant44•9mo ago
It doesn't make it go away. Using your analogy: if you used 300 lb to learn and then only made 10 lb of pizza after that, it would be a pretty poor use of resources.

If you instead went on to produce millions of pizzas using 30,000 lb of flour, that 300 lb you used to learn looks like a pretty reasonable investment.

RobinL•9mo ago
For context, that's about equivalent to 100 transatlantic flights.
rapind•9mo ago
To give you an idea of how many models are being trained, and how the energy costs continue to increase. https://epoch.ai/data/notable-ai-models

I mean, I guess advances could plateau and we stop spending exponentially more energy year after year...

I'm not opining on whether it's a good idea (I doubt we ever voluntarily consume less as a species), but data centres use a lot of energy and billions are being spent building them. https://www.technologyreview.com/2024/09/26/1104516/three-mi...

megaman821•9mo ago
That argument is just as bad. The training runs are equivalent to a few days to a couple of weeks of flights from New York to London. No one thinks we are going to save the environment by stopping a single plane route for two weeks.
devmor•9mo ago
One AI training run does not exist in a vacuum.

https://washingtonstatestandard.com/2025/04/11/as-demand-for...

mschuster91•9mo ago
> instead of the power usage of training models.

To make it worse, the model training cost only refers to the cost of the training itself. The externalities - everyone else being forced to drastically upscale their compute power because scraper blocking isn't foolproof - are, as usual for hypercapitalism, conveniently ignored.

AI training in its current form is unsustainable; I'd go as far as to say it's a threat to the open and decentralized Internet, as you have all but zero chance of standing alone against the flood of training scraper bots, and more and more control gets ceded to Cloudflare et al.

roschdal•9mo ago
The human brain is dramatically more energy-efficient than AI models like ChatGPT.

Human brain: Uses about 20 watts of power.

ChatGPT (GPT-4): Running a single query can use hundreds of watts when accounting for the entire datacenter infrastructure (some estimates suggest 500–1000 watts per query on average, depending on model size and setup).

If we assume:

20 watts for the human brain thinking continuously,

1000 watts for ChatGPT processing one complex query,

then the human brain is about 50x more energy-efficient (or 5000% more efficient) than ChatGPT per task, assuming equal cognitive complexity (which is debatable, but good for ballpark comparison).

JustFinishedBSG•9mo ago
Watt isn't a measure of energy. Without knowing how long it takes for a human and for ChatGPT to solve the task, the comparison doesn't teach us anything.
roschdal•9mo ago
You're absolutely right — watt is a unit of power, not energy. To make a meaningful comparison, we need to estimate how much energy (in joules) each system uses to solve the same task.

Let’s define a representative task: answering a moderately complex question.

1. Human brain

Power use: ~20 watts (on average while awake)

Time to think: ~10 seconds to answer a question

Energy used: 20 watts × 10 seconds = 200 joules

2. ChatGPT (GPT-4)

Estimate per query: based on research and datacenter estimates, GPT-4 may use around 2–3 kWh per 1,000 queries, which is 7.2–10.8 megajoules.

Per query: 7.2 MJ / 1,000 = 7,200 joules (lower bound); 10.8 MJ / 1,000 = 10,800 joules (upper bound)

Comparison:

Human: ~200 joules

ChatGPT: ~7,200 to 10,800 joules

Conclusion: The human brain is about 36–54 times more energy-efficient than ChatGPT at answering a single question.

Or in percent: 3,600% to 5,400% more efficient
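The arithmetic can be reproduced in a few lines; note that every figure here is the commenter's assumption (brain wattage, thinking time, kWh per 1,000 queries), not a measurement:

```python
# Energy per task, in joules, under the stated assumptions.
human_j = 20 * 10               # 20 W brain thinking for ~10 s = 200 J

KWH_TO_J = 3.6e6                # 1 kWh = 3.6 MJ
gpt_low  = 2 * KWH_TO_J / 1000  # 2 kWh per 1,000 queries -> 7,200 J/query
gpt_high = 3 * KWH_TO_J / 1000  # 3 kWh per 1,000 queries -> 10,800 J/query

print(human_j)                                 # 200
print(gpt_low, gpt_high)                       # 7200.0 10800.0
print(gpt_low / human_j, gpt_high / human_j)   # 36.0 54.0
```

The 36–54× range falls straight out of the two bounds on the per-query estimate.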

halyconWays•9mo ago
There's massive evolutionary pressure for maximizing energy efficiency in brains. I'd like to see LLMs procreate and select for energy efficiency while, ideally, minimizing insanity and maintaining g-factor.
homebrewer•9mo ago
You should then probably count the whole body, which consumes approximately 200 watts last time I checked.
ryanianian•9mo ago
Watt is not a unit of energy but instead a unit of power. A brain may need 20 watts, but it may use 20 watts for a lot more time than ChatGPT would.

The brain may ultimately be more power-efficient, but the units you want are watt-hours.

TobTobXX•9mo ago
This is fascinating. I mean it's not an argument against LLMs (I have only one brain, even though I'd like to have more). But I really hope that we'll learn much more about how our brains work.
Legend2440•9mo ago
The brain is more efficient because it physically is a neural network, as opposed to a software simulation of one.
krunck•9mo ago
Isn't the real metric of concern the absolute amount of CO2 generated, which is what will have an impact on the environment? That every person's AI queries contribute only a small amount to CO2 production doesn't make the sum of all CO2 production go away.
j_w•9mo ago
This is not a critique of LLM usage, just of individual contributions to the environmental crisis.

It's fairly popular to claim that you as an individual have no significant effect on the environment and that it's the actions of the large companies, which are effectively "super polluters." This ignores that companies take these actions because of the market forces imposed on them by the consumers.

An individual's impact in isolation is small; however, if that same individual made changes not only in their own life but also urged those around them to make similar changes, the network effect would be huge. This extends beyond the environment: boycotts, product recommendations, exercise, etc. You truly need to be the change that you want to see.

malvim•9mo ago
And what about all the developing and testing of models? What about all the OTHER companies that can’t wait to get a piece of this cake and are training and scraping the internet like there’s no tomorrow? And all the companies that are integrating LLMs into their daily workflows using tons of api calls daily?

Come on…

aabhay•9mo ago
I think the better argument is about the direction of change versus the current magnitude.

If we are to believe that the models will get bigger, use more tokens, work for longer, this calculation can easily become very very skewed in the other direction.

Consider an agentic system that runs continuously for 6 hours. It is possible this system processes billions of tokens. That could more than equal a transatlantic flight in this hypothetical world.

Now compare this with non-AI work, like a CRUD app. Serving millions of queries in that same period would consume a tiny fraction of what ChatGPT consumes.

Rather than being a "win" for AI, the fact that we're even 3 or 4 orders of magnitude away from this being a problem means that it's already grounds for concern.

mac-chaffee•9mo ago
I agree but there's a lot of nuance to the next question of "well what IS bad for the environment" and tech's role in that question.

I've been unsatisfied with how people in tech address that complex subject so I wrote about it here: https://www.macchaffee.com/blog/2025/tech-and-the-climate-cr...

reyqn•9mo ago
So all of the cheatsheet is basically "it's not bad because there are worse things"?

You can try explaining why it's not "that" bad for the environment; the planet is still worse off than it was before it existed.

Let's carry on inventing new ways to spend energy, but it's ok because we still spend more energy on other stuff.

It's kinda sad how the world saw climate change, said it was bad, but in the end decided to do nothing about it.

hugmynutus•9mo ago
I find this unconvincing. The actual discussion of LLM generation is very lacking.

The original link [1] cites a discussion of the cost per query of GPT-4o at 0.3whr [2]. When you read the document [2] itself, you see 0.3whr is the lower bound & 40whr is the upper bound. The paper [2] is actually pretty solid; I recommend it. It uses the public metrics from other LLM APIs to derive a likely distribution of the context size of the average query for GPT-4o, which is a reasonable approach given that data isn't public. It then factors in GPU power per FLOP, average utilization during inference, and cloud/renting overhead. It admits this likely has non-trivial error bars, concluding the average is between 1-4whr per query.

This is disappointing to me as the original link [1] attempts to bring in this source [2] to disprove the 3whr "myth" created by another paper [3], yet this 3whr figure lies directly in the error bars their new source [2] arrives at.

Links:

1. https://simonwillison.net/2025/Apr/29/chatgpt-is-not-bad-for...

2. https://epoch.ai/gradient-updates/how-much-energy-does-chatg...

3. https://www.sciencedirect.com/science/article/pii/S254243512...

Edit: whr not w/hr

cwillu•9mo ago
The unit is watt·hour, not watt/hour: multiplication, not division.
hugmynutus•9mo ago
Thanks!
Retric•9mo ago
The methodology is inherently flawed by assuming all infrastructure, training, etc. is going to exist with or without individual queries, while trying to answer a different question: the impact of AI on the environment. It's like arguing the environmental impact of solar electricity is 0 because the panels would exist either way.

Thus the results inherently fail to analyze the underlying question.

A more realistic estimate is to take their total spending and assume X% of their expenses are electricity, directly or indirectly, because that's where the environmental impact adds up. Even that ignores the energy costs on 3rd-party servers when they download their training data.

hugmynutus•9mo ago
You are correct to point out the larger questions of supply chain cost (and their environmental impact) are not addressed in the root link.
amos-burton•9mo ago
https://ourworldindata.org/electricity-mix

> ...Globally, coal, followed by gas, is the largest source of electricity production....

As long as this is the case, we can hardly even have the debate about the impact of these new techs on the single topic of the climate.

Let me kindly remind you that we are well past the point of this single problem: we are dealing with planetary boundaries, and there are nine of them. Another reminder is that CO2 pollution alone is a direct product of GDP, and there is no plan in sight for how the competing countries should share proportional GDP cuts to reduce gas emissions. So even if we wanted to do something, we have not started to get to the serious business.

Why AI? Because we are screwed. We failed on humanism, we failed on climate; we can't fail this one too, or we would just kick ourselves out of the real game.

It is a kind of great megalomaniac idea, but I prefer that to your pathetic bullshit. So even though you are fucking cringe, go Elon.

Fire in the hole!

hydragit•9mo ago
What about the human cost? https://futurism.com/the-byte/ai-gig-slave-labor