
Leaving Meta and PyTorch

https://soumith.ch/blog/2025-11-06-leaving-meta-and-pytorch.md.html
204•saikatsg•2h ago

Comments

msmd74•2h ago
Sounds like you had a momentous run.

If you take advice from reformed Internet trolls, consider turning off all your devices and trying to give yourself at least a week, but ideally a month offline staring at your new baby. You'll never get that time back and there's nothing your brain will appreciate more than loading up those memories as they grow.

Good luck.

qmatch•2h ago
As a loyal JAX user, I hope they can play catch-up. PyTorch has dominated the AI scene since TF1 fumbled the ball at the 10-yard line. What Matt Johnson has done in turning Autograd into JAX is hopefully going to be worthy of as much praise as what Soumith has received.
n_u•2h ago
> PyTorch has dominated the AI scene since TF1 fumbled the ball at the 10-yard line

can you explain why you think TensorFlow fumbled?

zapnuk•1h ago
For me it was about 8 years ago. Back then TF was already bloated, and its bet on static compute graphs brought two weaknesses: writing code was verbose, and debugging was difficult.

The few people I knew back then used Keras instead. I switched to PyTorch for my next project, which was more "batteries included".
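The verbosity gap the comment describes can be caricatured in a few lines of plain Python (a hypothetical toy, not the real TF1 or PyTorch APIs): the static style builds symbolic nodes and only evaluates them later through a run call, while the eager style computes real values line by line.

```python
# Illustrative sketch (plain Python, invented names): "define-then-run"
# graphs as in TF1 vs. eager execution as in PyTorch.

# --- static-graph style: build symbolic nodes, evaluate later ---
class Node:
    def __init__(self, op, inputs):
        self.op, self.inputs = op, inputs

def placeholder():
    return Node("input", [])

def add(a, b):
    return Node("add", [a, b])

def run(node, feed):
    """Evaluate the graph only now; errors surface far from where it was built."""
    if node.op == "input":
        return feed[node]
    return sum(run(i, feed) for i in node.inputs)

px = placeholder()
graph = add(add(px, px), px)   # nothing computed yet; just symbols
print(run(graph, {px: 2}))    # -> 6, but no intermediate was ever inspectable

# --- eager style: every line produces a real value you can print/debug ---
x = 2
t = x + x                     # t == 4, inspectable right here
print(t + x)                  # -> 6
```

The same three-term sum takes a node class, two constructors, and an interpreter in the static style, versus two ordinary lines eagerly; that difference compounds quickly in real models.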

michaelt•32m ago
Imagine a total newbie trying to fine-tune an image classifier, reusing some open source example code, about a decade ago.

If their folder of 10,000 labelled images contains one image that's a different size to the others, the training job will fail with an error about unexpected dimensions while concatenating.

But it won't name the file, or say that the problem is an input image of the wrong size. It'll just say it can't concatenate tensors of different sizes.

An experienced user will recognise the error immediately, and will have run a data cleansing script beforehand anyway. But it's not experienced users who bounce from frameworks, it's newbies.
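That failure mode can be sketched in pure Python (the `collate` helpers and the `odd_one.jpg` filename are made up for illustration): the same size check can raise a generic error, or one that points at the offending file.

```python
# Hypothetical sketch of the failure mode: batching fails on one
# odd-sized item, and the generic error never says which file caused it.

def collate(batch):
    """Framework-style check: correct, but names no file."""
    sizes = {len(img) for _, img in batch}
    if len(sizes) != 1:
        raise ValueError(f"cannot concatenate tensors of sizes {sorted(sizes)}")
    return [img for _, img in batch]

def collate_with_context(batch):
    """Newbie-friendly check: same condition, but points at the file."""
    expected = len(batch[0][1])
    for name, img in batch:
        if len(img) != expected:
            raise ValueError(f"{name!r} has size {len(img)}, expected {expected}")
    return [img for _, img in batch]

batch = [("cat_001.jpg", [0] * 224), ("cat_002.jpg", [0] * 224),
         ("odd_one.jpg", [0] * 200)]

try:
    collate(batch)
except ValueError as e:
    print(e)   # no mention of odd_one.jpg

try:
    collate_with_context(batch)
except ValueError as e:
    print(e)   # names odd_one.jpg directly
```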

mschuster91•13m ago
> An experienced user will recognise the error immediately, and will have run a data cleansing script beforehand anyway. But it's not experienced users who bounce from frameworks, it's newbies.

Even seasoned developers will bounce away from frameworks or libraries - old dogs and the next hot thing alike - if the documentation isn't up to speed, or if simple, common tasks require wading through dozens of pages of documentation.

Writing good documentation is hard enough, writing relevant "common usage examples" is even harder... but keeping them up to date and working is a rarely seen art.

And the greatest art of all of it is logging. Soooo many libraries refuse to implement detailed structured logging in internal classes (despite particularly Java and PHP offering very powerful mechanisms), making it much more difficult to troubleshoot problems in the field.
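In Python, the stdlib already supports the kind of internal structured logging the comment asks for. A minimal sketch (the `mylib.fetcher` logger and `Fetcher` class are hypothetical): the library attaches key/value context to its records, and the application decides whether and how to consume them.

```python
# Sketch: a library logs structured context from its internals via the
# stdlib `extra` mechanism; the app opts in with its own handler.

import logging

logger = logging.getLogger("mylib.fetcher")   # hypothetical library logger
logger.addHandler(logging.NullHandler())      # library default: stay silent

class Fetcher:
    def fetch(self, url, attempt):
        # Structured fields travel on the LogRecord itself, so operators
        # can filter/ship them without parsing message strings.
        logger.debug("fetch start", extra={"url": url, "attempt": attempt})
        return f"payload from {url}"

# Application side: opt in to the library's internal logs.
records = []

class Capture(logging.Handler):
    def emit(self, record):
        records.append((record.name, record.msg, record.url, record.attempt))

logger.setLevel(logging.DEBUG)
logger.addHandler(Capture())
Fetcher().fetch("https://example.com/data", attempt=1)
print(records[0])
```

The `NullHandler` default is the key courtesy: the library stays quiet unless the application asks, which is exactly the "detailed logging in internal classes" pattern done without spamming anyone.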

intermerda•2h ago
Do you have experience in both JAX and PyTorch? Why do you prefer JAX?
chopete3•2h ago
>>Every major AI company and hardware vendor are on a speed dial. This kind of power is really hard to give up. But curiosity ultimately won out in my head.

A simple feeling has such power. May he get an opportunity to create one more powerful tool before retiring.

perfmode•2h ago
Respect.
mxkopy•2h ago
PyTorch is one of those tools that’s so simple and easy to take apart that you feel like you might’ve been able to make it yourself. I can’t imagine how much engineering effort was behind all those moments where I thought to myself, “of course it should work like that, how can it be any other way?”
TechnicolorByte•2h ago
Can anyone recommend a technical overview describing the design decisions PyTorch made that led it to win out?
huevosabio•2h ago
I don't know the full list, but back when it came out, TF felt like a crude set of bindings to the underlying C++/CUDA workhorse. PyTorch, in contrast, felt pythonic. It was much closer in feeling to numpy.
puttycat•1h ago
I think it was mostly the eager evaluation that made it possible to debug every step in the network's forward/backward passes. TensorFlow didn't have that at the time, which made debugging practically impossible.
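That debuggability is easy to see in a toy define-by-run autograd (pure Python, invented for illustration; a real framework topologically sorts the graph rather than assuming a tree): every intermediate is a concrete value you can print mid-computation, and the backward pass is ordinary code you can step through.

```python
class Value:
    """A scalar that records how it was produced, define-by-run style."""
    def __init__(self, data):
        self.data, self.grad = data, 0.0
        self.parents, self.backward_fn = (), None

    def __add__(self, other):
        out = Value(self.data + other.data)
        out.parents = (self, other)
        def bwd():
            self.grad += out.grad
            other.grad += out.grad
        out.backward_fn = bwd
        return out

    def __mul__(self, other):
        out = Value(self.data * other.data)
        out.parents = (self, other)
        def bwd():
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out.backward_fn = bwd
        return out

    def backward(self):
        # Simple tree walk; fine for this expression.
        self.grad = 1.0
        stack = [self]
        while stack:
            v = stack.pop()
            if v.backward_fn:
                v.backward_fn()
            stack.extend(v.parents)

x = Value(3.0)
t = x * x            # eager: t.data is a real number right now
print(t.data)        # -> 9.0, inspectable before the "graph" is finished
y = t + x
y.backward()
print(x.grad)        # -> 7.0, i.e. d(x^2 + x)/dx at x = 3
```

In a static-graph framework, `t` would be a symbol with no value until session run time, so there is nothing to print at that line; that is the whole debugging difference in miniature.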
GistNoesis•41m ago
The choice of the dynamic computation graph [1] of PyTorch made it easier to debug and implement, leading to higher adoption, even though running speed was initially slower (and therefore training cost higher).

Other decisions follow from this one.

TensorFlow started with static graphs and had to move to dynamic execution in version 2.0, which broke everything and left fragmentation between TensorFlow 1, TensorFlow 2, Keras, and JAX.

PyTorch's later compilation of this computation graph erased TensorFlow's remaining edge.

Is the battle over? From a purely computational point of view, PyTorch's solution is very far from optimal, and billions of dollars of electricity and GPUs are burned every year, but major players are happy with circular deals that entrench their positions. So at the pace of current AI code development, it's probably one or two years before PyTorch is old history.

[1] https://www.geeksforgeeks.org/deep-learning/dynamic-vs-stati...

mxkopy•41m ago
I’m not sure such an overview exists, but back when Caffe2 was still a thing and JAX was a big contender, dynamic vs. static computation graphs seemed to be a major focus for people ranking the frameworks.
BoredPositron•2h ago
The last few years must have been incredibly exhausting. Thanks for your work, good luck, and 73.
vintermann•1h ago
That man has an infectious enthusiasm. I remember the DCGAN paper inspired me to try getting the (Lua) Torch code to work, and I tried it on the Oxford flowers dataset early on. It worked surprisingly well, and Soumith Chintala even shared it around on social media, surprised at how well it worked on such a small dataset. Of course, back then we didn't really appreciate the problem of mode collapse.

Pytorch and old Lua Torch were a pleasure to work with compared to the contemporary Tensorflow. Lots of S.C's code was copied around liberally, it had its quirks (I remember the DCGAN code had a pretty odd way of doing parameter passing) but it was also really easy to understand and made random people like me feel like we had suddenly stumbled onto something crazy powerful (which we had!). It was wonderfully hackable.

utopiah•48m ago
What I find most interesting about this is that it suggests they believe there is nothing unique at Meta related to AI. There is no resource, whether people or computing power, that they can't get elsewhere for whatever they believe would be more interesting to them.

I mention this because it feels analogous to military research, where people "dream" of how advanced the military is, how forward they are compared to public research... and yet, it seems to be a recurring myth they love to sustain.

So the signal I get here is that AI "labs" in BigTech have nothing worth waiting for around the corner; it's just more of the same, and boring for the people who stick around.

rtpg•36m ago
I don't think that's the read? Guy says he wants to work on something small. If you want to work on something big you probably want to be in a big corp to have the resources to do the big thing.

Also absolutely unknown if the "new thing" is AI-related at all!

utopiah•27m ago
Well, he left, so whatever is coming next, AI related or not, "small" or not (small for him might be reaching just a million people; he wrote that he "lead the software layer that powers the entire AI industry", so his notion of scale is probably unlike mine, maybe yours too), is more exciting to him than whatever he could do with all of Meta's resources.

Edit: to be clear, I didn't mean to imply their next thing is AI related, only that they obviously know more about AI at Meta than, say, XR at Meta, just because that's their expertise.

aabhay•43m ago
For anyone who's curious, the underlying Torch library is also a joy to work with, as are the many other Torch bindings. For example, Rust has tch and Burn, which both work with libtorch.

PyTorch of course has the benefit of being dynamically debuggable. I can't forget the first time I breakpointed my PyTorch model and wrote PyTorch calls in the terminal to inspect its behavior. That's still something I miss a lot now that I'm working only with "fast" compiled code.
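That workflow reads roughly like this sketch (a hypothetical `TinyModel`, with the actual `breakpoint()` call commented out so the script also runs non-interactively): pause mid-forward, then call into the model from the debugger prompt.

```python
# Sketch of interactive debugging of an eager model: because forward() is
# ordinary Python, you can stop inside it and poke at live values.

class TinyModel:
    def __init__(self, weight):
        self.weight = weight

    def forward(self, x):
        hidden = [self.weight * v for v in x]
        # breakpoint()   # at the pdb prompt you could type: hidden, len(x),
        #                # or even self.forward([1.0]) to probe behavior live
        return sum(hidden)

model = TinyModel(weight=2.0)
print(model.forward([1.0, 2.0, 3.0]))   # -> 12.0
```

With an ahead-of-time-compiled model there is no Python frame to stop in at that point, which is exactly what the comment misses about "fast" compiled code.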

lysecret•22m ago
I wrote some truly awful code back in the day because of that, but god, it was glorious.
irthomasthomas•35m ago
Counterfactual Regret Minimization irl
numice•32m ago
I read one post on his blog and found that Adam Paszke reached out to the author and got an internship. I wonder if it was really that easy to get an internship at FAIR; I thought they hired only PhDs.
gdiamos•13m ago
This is the end of an era. Amazing work, Soumith.

Leaving Meta and PyTorch

https://soumith.ch/blog/2025-11-06-leaving-meta-and-pytorch.md.html
209•saikatsg•2h ago•26 comments

A Fond Farewell from Farmers' Almanac

https://www.farmersalmanac.com/fond-farewell-from-farmers-almanac
199•erhuve•6h ago•66 comments

Lessons from Growing a Piracy Streaming Site

https://prison.josh.mn/lessons
48•zuhayeer•1h ago•4 comments

You should write an agent

https://fly.io/blog/everyone-write-an-agent/
651•tabletcorry•12h ago•282 comments

Two billion email addresses were exposed

https://www.troyhunt.com/2-billion-email-addresses-were-exposed-and-we-indexed-them-all-in-have-i...
446•esnard•12h ago•311 comments

Kimi K2 Thinking, a SOTA open-source trillion-parameter reasoning model

https://moonshotai.github.io/Kimi-K2/thinking.html
733•nekofneko•18h ago•303 comments

Text case changes the size of QR codes

https://www.johndcook.com/blog/2025/10/31/smaller-qr-codes/
17•ibobev•5d ago•2 comments

Game design is simple

https://www.raphkoster.com/2025/11/03/game-design-is-simple-actually/
280•vrnvu•10h ago•79 comments

Show HN: I scraped 3B Goodreads reviews to train a better recommendation model

https://book.sv
389•costco•1d ago•126 comments

Photoroom (YC S20) Is Hiring a Senior AI Front End Engineer in Paris

https://jobs.ashbyhq.com/photoroom/7644fc7d-7840-406d-a1b1-b9d2d7ffa9b8
1•ea016•2h ago

A Note on Fil-C

https://graydon2.dreamwidth.org/320265.html
159•signa11•8h ago•67 comments

We built a cloud GPU notebook that boots in seconds

https://modal.com/blog/notebooks-internals
29•birdculture•4d ago•6 comments

From web developer to database developer in 10 years

https://notes.eatonphil.com/2025-02-15-from-web-developer-to-database-developer-in-10-years.html
61•pmbanugo•3d ago•18 comments

Analysis indicates that the universe’s expansion is not accelerating

https://ras.ac.uk/news-and-press/research-highlights/universes-expansion-now-slowing-not-speeding
165•chrka•12h ago•139 comments

JermCAD: Browser-Based CAD Software

https://github.com/jeremyaboyd/jerm-cad
14•azhenley•4h ago•8 comments

Cryptography 101 with Alfred Menezes

https://cryptography101.ca
28•nmadden•3d ago•3 comments

Open Source Implementation of Apple's Private Compute Cloud

https://github.com/openpcc/openpcc
395•adam_gyroscope•1d ago•87 comments

Dead Framework Theory

https://aifoc.us/dead-framework-theory/
42•jhuleatt•5h ago•35 comments

HTML Slides with notes

https://nbd.neocities.org/slidepresentation/Slide%20presentation%20about%20slides
41•Curiositry•7h ago•9 comments

Time Immemorial turns 750: The Medieval law that froze history at 1189

https://www.ianvisits.co.uk/articles/time-immemorial-turns-750-the-medieval-law-that-froze-histor...
26•zeristor•7h ago•4 comments

Swift on FreeBSD Preview

https://forums.swift.org/t/swift-on-freebsd-preview/83064
204•glhaynes•15h ago•129 comments

A startup’s quest to store electricity in the ocean

https://techcrunch.com/2025/10/22/one-startups-quest-to-store-electricity-in-the-ocean/
4•rbanffy•17m ago•1 comments

LLMs encode how difficult problems are

https://arxiv.org/abs/2510.18147
138•stansApprentice•14h ago•27 comments

Eating stinging nettles

https://rachel.blog/2018/04/29/eating-stinging-nettles/
207•rzk•21h ago•190 comments

A prvalue is not a temporary

https://blog.knatten.org/2025/10/31/a-prvalue-is-not-a-temporary/
27•ingve•6h ago•53 comments

FBI tries to unmask owner of archive.is

https://www.heise.de/en/news/Archive-today-FBI-Demands-Data-from-Provider-Tucows-11066346.html
849•Projectiboga•16h ago•425 comments

I analyzed the lineups at the most popular nightclubs

https://dev.karltryggvason.com/how-i-analyzed-the-lineups-at-the-worlds-most-popular-nightclubs/
153•kalli•19h ago•72 comments

The Geometry of Schemes [pdf]

https://webhomes.maths.ed.ac.uk/~v1ranick/papers/eisenbudharris.pdf
41•measurablefunc•6d ago•9 comments

Word2Vec-style vector arithmetic on docs embeddings

https://technicalwriting.dev/embeddings/arithmetic/index.html
15•surprisetalk•6d ago•2 comments

Mathematical exploration and discovery at scale

https://terrytao.wordpress.com/2025/11/05/mathematical-exploration-and-discovery-at-scale/
251•nabla9•23h ago•116 comments