frontpage.

The Rise of Spec Driven Development

https://www.dbreunig.com/2026/02/06/the-rise-of-spec-driven-development.html
1•Brajeshwar•1m ago•0 comments

The first good Raspberry Pi Laptop

https://www.jeffgeerling.com/blog/2026/the-first-good-raspberry-pi-laptop/
2•Brajeshwar•1m ago•0 comments

Seas to Rise Around the World – But Not in Greenland

https://e360.yale.edu/digest/greenland-sea-levels-fall
1•Brajeshwar•1m ago•0 comments

Will Future Generations Think We're Gross?

https://chillphysicsenjoyer.substack.com/p/will-future-generations-think-were
1•crescit_eundo•4m ago•0 comments

State Department will delete Xitter posts from before Trump returned to office

https://www.npr.org/2026/02/07/nx-s1-5704785/state-department-trump-posts-x
1•righthand•8m ago•0 comments

Show HN: Verifiable server roundtrip demo for a decision interruption system

https://github.com/veeduzyl-hue/decision-assistant-roundtrip-demo
1•veeduzyl•9m ago•0 comments

Impl Rust – Avro IDL Tool in Rust via Antlr

https://www.youtube.com/watch?v=vmKvw73V394
1•todsacerdoti•9m ago•0 comments

Stories from 25 Years of Software Development

https://susam.net/twenty-five-years-of-computing.html
2•vinhnx•10m ago•0 comments

minikeyvalue

https://github.com/commaai/minikeyvalue/tree/prod
3•tosh•14m ago•0 comments

Neomacs: GPU-accelerated Emacs with inline video, WebKit, and terminal via wgpu

https://github.com/eval-exec/neomacs
1•evalexec•19m ago•0 comments

Show HN: Moli P2P – An ephemeral, serverless image gallery (Rust and WebRTC)

https://moli-green.is/
2•ShinyaKoyano•23m ago•1 comment

How do I grow my X presence?

https://www.reddit.com/r/GrowthHacking/s/UEc8pAl61b
2•m00dy•24m ago•0 comments

What's the cost of the most expensive Super Bowl ad slot?

https://ballparkguess.com/?id=5b98b1d3-5887-47b9-8a92-43be2ced674b
1•bkls•25m ago•0 comments

What if you just did a startup instead?

https://alexaraki.substack.com/p/what-if-you-just-did-a-startup
4•okaywriting•32m ago•0 comments

Hacking up your own shell completion (2020)

https://www.feltrac.co/environment/2020/01/18/build-your-own-shell-completion.html
2•todsacerdoti•35m ago•0 comments

Show HN: Gorse 0.5 – Open-source recommender system with visual workflow editor

https://github.com/gorse-io/gorse
1•zhenghaoz•35m ago•0 comments

GLM-OCR: Accurate × Fast × Comprehensive

https://github.com/zai-org/GLM-OCR
1•ms7892•36m ago•0 comments

Local Agent Bench: Test 11 small LLMs on tool-calling judgment, on CPU, no GPU

https://github.com/MikeVeerman/tool-calling-benchmark
1•MikeVeerman•37m ago•0 comments

Show HN: AboutMyProject – A public log for developer proof-of-work

https://aboutmyproject.com/
1•Raiplus•37m ago•0 comments

Expertise, AI and the Work of the Future [video]

https://www.youtube.com/watch?v=wsxWl9iT1XU
1•indiantinker•38m ago•0 comments

So Long to Cheap Books You Could Fit in Your Pocket

https://www.nytimes.com/2026/02/06/books/mass-market-paperback-books.html
3•pseudolus•38m ago•1 comment

PID Controller

https://en.wikipedia.org/wiki/Proportional%E2%80%93integral%E2%80%93derivative_controller
1•tosh•42m ago•0 comments

SpaceX Rocket Generates 100GW of Power, or 20% of US Electricity

https://twitter.com/AlecStapp/status/2019932764515234159
2•bkls•43m ago•0 comments

Kubernetes MCP Server

https://github.com/yindia/rootcause
1•yindia•44m ago•0 comments

I Built a Movie Recommendation Agent to Solve Movie Nights with My Wife

https://rokn.io/posts/building-movie-recommendation-agent
4•roknovosel•44m ago•0 comments

What were the first animals? The fierce sponge–jelly battle that just won't end

https://www.nature.com/articles/d41586-026-00238-z
2•beardyw•52m ago•0 comments

Sidestepping Evaluation Awareness and Anticipating Misalignment

https://alignment.openai.com/prod-evals/
1•taubek•52m ago•0 comments

OldMapsOnline

https://www.oldmapsonline.org/en
2•surprisetalk•55m ago•0 comments

What It's Like to Be a Worm

https://www.asimov.press/p/sentience
2•surprisetalk•55m ago•0 comments

Don't go to physics grad school and other cautionary tales

https://scottlocklin.wordpress.com/2025/12/19/dont-go-to-physics-grad-school-and-other-cautionary...
2•surprisetalk•55m ago•0 comments

Apple Research unearthed a forgotten AI technique and is using it to generate images

https://9to5mac.com/2025/06/23/apple-ai-image-model-research-tarflow-starflow/
129•celias•7mo ago

Comments

celias•7mo ago
Paper at https://machinelearning.apple.com/research/normalizing-flows
rfv6723•7mo ago
The Apple AI team keeps going against the bitter lesson and focusing on small on-device models.

Let's see how this turns out in the long term.

echelon•7mo ago
Edge compute would be clutch, but Apple feels a decade too early.
7speter•7mo ago
Maybe for a big LLM, but if they add some GPU cores and an order of magnitude or two more unified memory to their iDevices, or shoehorn M-series SoCs into high-tier iDevices (especially as their lithography process advances), image generation becomes more viable, no? Also, I thought I read somewhere that Apple wanted to infer simpler queries locally and switch to datacenter inference when the request was more complicated.

If they approach things this way, and transistor progress continues linearly (relative to the last few years), maybe they can make their first devices that meet these goals in… 2-3 years?

sipjca•7mo ago
Somewhat hard to say how the cards fall when the cost of 'intelligence' is coming down 1000x year over year while at the same time compute continues to scale. The bet should probably be made on both sides.
furyofantares•7mo ago
10x year over year, not 1000x, right? The 1000x is from this 10x observation having held for 3 years.
sipjca•7mo ago
I believe the 1000x number I pulled is from SemiAnalysis or similar, using MMLU as the baseline benchmark and comparing the cost per token from a year ago to today at the same score. Model, hardware, and software improvements all make a massive difference when combined, yielding much greater than 10x gains in terms of intelligence/$.
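
A back-of-envelope sketch of how the two framings relate (a steady ~10x/year decline compounding over three years); the dollar figures here are hypothetical, not from SemiAnalysis:

    # Illustrative only: compound a ~10x/year cost decline over 3 years.
    cost_per_mtok = 10.0     # hypothetical $/M tokens at a fixed MMLU score
    annual_decline = 10.0    # the observed ~10x/year trend
    years = 3
    print(cost_per_mtok / annual_decline ** years)  # 0.01 -> a 1000x reduction
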
peepeepoopoo137•7mo ago
"""The bitter lesson""" is how you get the current swath of massively unprofitable AI companies that are competing with each other over who can lose money faster.
furyofantares•7mo ago
I can't tell if you're perpetuating the myth that these companies are losing money on their paid offerings, or just overestimating how much money they lose on their free offerings.
janalsncm•7mo ago
If it costs you a billion dollars to train a GPT-5 and I can distill your model for a million dollars and get 90% of the performance, that’s a terrible deal for you. Or more realistically, for whoever you borrowed from.
rfv6723•7mo ago
Then if you offer your distilled model as a commercial service, you would get sued by OpenAI in court.
janalsncm•7mo ago
The bitter-er lesson is that distillation from bigger models works pretty damn well. It’s great news for the GPU poor, not great for the guys training the models we distill from.
rfv6723•7mo ago
Distillation is great for researchers and hobbyists.

But nearly all frontier models have anti-distillation ToS, so distillation is out of the question for Western commercial companies like Apple.

janalsncm•7mo ago
Even if Apple needs to train an LLM from scratch, they can distill it and deploy it on edge devices. From that point, inference is free to them.
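
For readers unfamiliar with the mechanics being debated here, a minimal sketch of logit distillation (Hinton et al.'s soft-target formulation) in PyTorch; the modules are hypothetical stand-ins, not any lab's actual pipeline:

    import torch
    import torch.nn.functional as F

    def distillation_loss(student_logits, teacher_logits, T=2.0):
        # KL(teacher || student) over temperature-softened distributions.
        # The T**2 factor keeps gradient magnitudes comparable across
        # temperatures (Hinton et al., 2015).
        s = F.log_softmax(student_logits / T, dim=-1)
        t = F.softmax(teacher_logits / T, dim=-1)
        return F.kl_div(s, t, reduction="batchmean") * T ** 2

    teacher = torch.nn.Linear(16, 8)   # stand-in for the big model
    student = torch.nn.Linear(16, 8)   # stand-in for the small on-device model
    x = torch.randn(4, 16)
    with torch.no_grad():
        t_logits = teacher(x)          # teacher is frozen; only queried
    distillation_loss(student(x), t_logits).backward()
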
yorwba•7mo ago
They took a simple technique (normalizing flows), instantiated its basic building blocks with the most general neural network architecture known to work well (transformer blocks), and trained models of different sizes on various datasets to see whether it scales. Looks very bitter-lesson-pilled to me.

That they didn't scale beyond AFHQ (high-quality animal faces: cats, dogs and big cats) at 256×256 is probably not due to an explicit preference for small models at the expense of output resolution, but because this is basic research to test the viability of the approach. If this ever makes it into a product, it'll be a much bigger model trained on more data.

EDIT: I missed the second paper https://arxiv.org/abs/2506.06276 where they scale up to 1024×1024 with a 3.8-billion-parameter model. It seems to do about as well as diffusion models of similar size.
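
For context on the technique itself: a normalizing flow is an invertible network trained by exact maximum likelihood via the change of variables formula, log p(x) = log p(z) + log|det ∂z/∂x|. A minimal affine-coupling sketch (an even feature dimension is assumed; Apple's TarFlow/STARFlow use transformer blocks where this uses a small MLP):

    import torch
    import torch.nn as nn

    class AffineCoupling(nn.Module):
        # Transform half the dims conditioned on the other half.
        # Invertible by construction; the Jacobian is triangular,
        # so its log-determinant is just log_scale.sum().
        def __init__(self, dim, hidden=64):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(dim // 2, hidden), nn.ReLU(),
                nn.Linear(hidden, dim),   # per-dim shift and log-scale
            )

        def forward(self, x):
            x1, x2 = x.chunk(2, dim=-1)
            shift, log_scale = self.net(x1).chunk(2, dim=-1)
            log_scale = torch.tanh(log_scale)   # keep scales bounded
            y2 = x2 * log_scale.exp() + shift
            return torch.cat([x1, y2], dim=-1), log_scale.sum(-1)

        def inverse(self, y):
            y1, y2 = y.chunk(2, dim=-1)
            shift, log_scale = self.net(y1).chunk(2, dim=-1)
            log_scale = torch.tanh(log_scale)
            return torch.cat([y1, (y2 - shift) * (-log_scale).exp()], dim=-1)

Stacking such blocks (alternating which half is transformed) and summing their log-determinants gives the exact likelihood that flows are trained on, which is what lets them be scaled like any other maximum-likelihood model.
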

nextaccountic•7mo ago
This subject is fascinating and the article is informative, but I wish HN had a "flag"-like button specifically for articles that seem written by AI (well, at least the section "How STARFlow compares with OpenAI’s 4o image generator" sounds like it).
CharlesW•7mo ago
FWIW, you can always report any HN quality concerns to hn@ycombinator.com and it'll be reviewed promptly and fairly (IMO).
Veen•7mo ago
It reads like the work of a professional writer who uses a handful of variant sentence structures and conventions to quickly write an article. That’s what professional writers are trained to do.
janalsncm•7mo ago
I had the opposite reaction, it definitely reads like a tech journalist who doesn’t have a great understanding of the tech. AI would’ve written a less clunky (and possibly incorrect) explanation.
lukan•7mo ago
If you enjoyed the article, why would you want to flag or tag it? For what purpose?
nextaccountic•7mo ago
Well, maybe this article isn't AI-written after all. But the intent was to add an "(AI)" beside the title.
kelseyfrog•7mo ago
Forgotten from like 2021? NVAE[1] was a great paper but maybe four years is long enough to be forgotten in the AI space? shrug

1. NVAE: A Deep Hierarchical Variational Autoencoder https://arxiv.org/pdf/2007.03898

bbminner•7mo ago
Right, it is bizarre to read that someone "unearthed a forgotten AI technique" that you happened to have worked with/on when it was still hot - when did I become a fossil? :D

Also, if we're being nitpicky, diffusion model inference has been proven equivalent to (and is often used as) a particular NF so.. shrug
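
The equivalence in question is the probability flow ODE of Song et al. (2021): for a forward diffusion SDE with drift f and diffusion coefficient g, the deterministic ODE

    dx = \left[ f(x, t) - \tfrac{1}{2}\, g(t)^2\, \nabla_x \log p_t(x) \right] dt

shares the same marginals p_t as the stochastic diffusion, so integrating it is a continuous normalizing flow whose log-likelihood can be evaluated exactly.
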

nabla9•7mo ago
They are both variational inference, but Normalizing Flow (NF) is not VAE.
kelseyfrog•7mo ago
If you read the paper, you'll find "More Expressive Approximate Posteriors with Normalizing Flows" is in the methods section. The authors are in fact using (inverse) normalizing flows within the context of VAEs.

The appendix goes on to explain, "We apply simple volume-preserving normalizing flows of the form z′ = z + b(z) to the samples generated by the encoder at each level".
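
A minimal sketch of a volume-preserving flow of that z′ = z + b(z) shape, done NICE-style so that b sees only the untouched half of the coordinates; the Jacobian is then unit-triangular and the log-determinant is exactly zero (NVAE's actual b is a learned network over its hierarchical latents, so this is only an illustration of the form):

    import torch
    import torch.nn as nn

    class AdditiveCoupling(nn.Module):
        # z2' = z2 + b(z1), z1 unchanged: reshapes the approximate
        # posterior at zero cost in the likelihood's log-det term.
        def __init__(self, dim, hidden=64):
            super().__init__()
            self.b = nn.Sequential(
                nn.Linear(dim // 2, hidden), nn.ReLU(),
                nn.Linear(hidden, dim // 2),
            )

        def forward(self, z):
            z1, z2 = z.chunk(2, dim=-1)
            return torch.cat([z1, z2 + self.b(z1)], dim=-1)

        def inverse(self, z):
            z1, z2 = z.chunk(2, dim=-1)
            return torch.cat([z1, z2 - self.b(z1)], dim=-1)
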

bitpush•7mo ago
I find it fascinating that Apple-centric media sites are stretching so much to position the company in the AI race. The title is meant to say that Apple found something unique that other people missed, when the simplest explanation is that they started working on this a while back (2021 paper, after all) and just released it.

A more accurate headline would be: Apple starting to create images using 4-year-old techniques.

danhau•7mo ago
This „4 year old technique“ apparently could give Apple an edge for on-device workloads.

> In short: both Apple and OpenAI are moving beyond diffusion, but while OpenAI is building for its data centers, Apple is clearly building for our pockets.

bitpush•7mo ago
The same edge Apple had summarizing notifications so poorly that they had to turn it off?

https://arstechnica.com/apple/2024/11/apple-intelligence-not...

janalsncm•7mo ago
That was a bad and unnecessary feature but the privacy benefits of running a model on device rather than in the cloud are undeniable.
bitpush•7mo ago
The fact that they shipped it shows they didn't know what they were doing, private or not.
janalsncm•7mo ago
That’s a little unfair imo. Statistical models make mistakes and have failure modes which are difficult to predict.

When the bug popped up, turning the feature off was easier than retraining and redeploying.

politelemon•7mo ago
> I find it fascinating that Apple-centric media sites are stretching so much to position the company in the AI race.

A glance through the comments also shows HNers doing their best too. The mind still boggles as to why this site is so willing to perform mental gymnastics for a corporation.

amelius•7mo ago
We seriously need an AI to dampen the reality distortion field and bring back common sense. Maybe it can be something that people install in their browsers.
rTX5CMRXIfFG•7mo ago
That site's target market is what we know as "Apple fanboys". I'm not one to consider 9to5 serious journalism (nor even worthy of posting to HN), but even the publications I consider serious are businesses too, and need to pander to their markets in order to make money.
darkstar_16•7mo ago
I think it's just Apple PR pushing these out now to get Apple's name out in the AI era.
coldtea•7mo ago
>I find it fascinating that Apple-centric media sites are stretching so much to position the company in the AI race

Or, you know, just posting an article based on an Apple press release about a new technique that falls squarely within their target audience (people reading Apple-centric news) and is a great fit for currently fashionable technologies (AI) that people will show interest in.

Without giving a fuck about "positioning the company in the AI race". They'd post about Apple's sewers having an issue at their HQ, if that news story were available.

Besides, when did Apple ever come first in some particular tech race (say, the mp3 player, the smartphone, the store, the tablet, the smartwatch, maybe VR now)? What they typically do is wait for the dust to settle and sweep the end-user end of that market.

bitpush•7mo ago
Precisely. Remember how they waited for the AR/VR space to settle and then swept the end-user market?

Or the smash-hit HomePods.

Or Siri :)

npinsker•7mo ago
The VR space doesn’t seem settled to me. And I think Apple could win — having tried most of the major models, Apple’s is noticeably better. It really does feel like magic (other than the weight).
coldtea•7mo ago
They do have the only VR set that is user-friendly enough, with enough of an ecosystem and apps, for the mass market. When they put out a future generation at lower prices, they'll capture as much of it as they can.

Let's not forget that the iPod was a Mac-only product when the Mac had just 3% of the market, with no Windows support and limited sales in its first versions. Or that the iPhone took a few years to dominate the market. BlackBerry, Nokia and Microsoft even thought they had a chance for 3-4 years.

I wouldn't be surprised if their VR sales are already above all the other players (perhaps even combined).

niyyou•7mo ago
It's not even some "forgotten AI technique" (sigh...). It's been a hot topic for the last 5 years, used a lot with variational autoencoders, etc. Such bad journalism.
msgodel•7mo ago
Given the tiny amount of funding they have, Apple's ML team does some really amazing stuff. I think they're actually underappreciated by the public.
tomhow•7mo ago
Comments moved to https://news.ycombinator.com/item?id=44400105.