frontpage.

How uv got so fast

https://nesbitt.io/2025/12/26/how-uv-got-so-fast.html
342•zdw•5h ago•120 comments

How Lewis Carroll computed determinants (2023)

https://www.johndcook.com/blog/2023/07/10/lewis-carroll-determinants/
116•tzury•3h ago•21 comments

Experts explore new mushroom which causes fairytale-like hallucinations

https://nhmu.utah.edu/articles/experts-explore-new-mushroom-which-causes-fairytale-hallucinations
235•astronads•5h ago•101 comments

The Best Things and Stuff of 2025

https://blog.fogus.me/2025/12/23/the-best-things-and-stuff-of-2025.html
23•adityaathalye•3d ago•2 comments

My insulin pump controller uses the Linux kernel. It also violates the GPL

https://old.reddit.com/r/linux/comments/1puojsr/the_device_that_controls_my_insulin_pump_uses_the/
221•davisr•3h ago•73 comments

Drawing with zero-width characters

https://zw.swerdlow.dev
23•benswerd•3h ago•10 comments

Package managers keep using Git as a database, it never works out

https://nesbitt.io/2025/12/24/package-managers-keep-using-git-as-a-database.html
528•birdculture•10h ago•305 comments

Show HN: Witr – Explain why a process is running on your Linux system

https://github.com/pranshuparmar/witr
124•pranshuparmar•7h ago•15 comments

LearnixOS

https://www.learnix-os.com
173•gtirloni•9h ago•66 comments

Gaussian Splatting 3 Ways

https://github.com/NullandKale/NullSplats
33•nullandkale•3h ago•3 comments

FFmpeg has issued a DMCA takedown on GitHub

https://twitter.com/FFmpeg/status/2004599109559496984
309•merlindru•5h ago•64 comments

Perfect Aircrete, Kitchen Ingredients [video]

https://www.youtube.com/watch?v=z4_GxPHwqkA
54•surprisetalk•6d ago•20 comments

MongoBleed

https://github.com/joe-desimone/mongobleed/blob/main/mongobleed.py
42•gpi•4h ago•5 comments

Parasites plagued Roman soldiers at Hadrian's Wall

https://arstechnica.com/science/2025/12/study-roman-soldiers-battled-parasites-at-hadrians-wall/
14•sipofwater•1w ago•9 comments

How I think about Kubernetes

https://garnaudov.com/writings/how-i-think-about-kubernetes/
46•todsacerdoti•2h ago•28 comments

Migrating my web analytics from Matomo to Umami

https://stanislas.blog/2025/12/migrating-matomo-to-umami-web-analytics/
23•angristan•2d ago•0 comments

Ask HN: What did you read in 2025?

114•kwar13•9h ago•135 comments

Sandbox: Run untrusted AI code safely, fast

https://github.com/PwnFunction/sandbox
41•vortex_ape•1w ago•6 comments

Show HN: Xcc700: Self-hosting mini C compiler for ESP32 (Xtensa) in 700 lines

https://github.com/valdanylchuk/xcc700
69•isitcontent•7h ago•16 comments

Unix "find" expressions compiled to bytecode

https://nullprogram.com/blog/2025/12/23/
94•rcarmo•10h ago•12 comments

Rob Pike goes nuclear over GenAI

https://skyview.social/?url=https%3A%2F%2Fbsky.app%2Fprofile%2Frobpike.io%2Fpost%2F3matwg6w3ic2s&...
1083•christoph-heiss•8h ago•1386 comments

A Proclamation Regarding the Restoration of the Dash

https://blog.nawaz.org/posts/2025/Dec/a-proclamation-regarding-the-restoration-of-the-dash/
93•BeetleB•5h ago•100 comments

The Algebra of Loans in Rust

https://nadrieril.github.io/blog/2025/12/21/the-algebra-of-loans-in-rust.html
177•g0xA52A2A•4d ago•79 comments

What happened to all the gold Spain got from the New World? (1985)

https://www.straightdope.com/21341789/what-happened-to-all-the-gold-spain-got-from-the-new-world
58•titaniumtown•4d ago•92 comments

Show HN: AutoLISP interpreter in Rust/WASM – a CAD workflow invented 33 yrs ago

https://acadlisp.de/noscript.html
97•holg•6h ago•29 comments

ZJIT is now available in Ruby 4.0

https://railsatscale.com/2025-12-24-launch-zjit/
72•ibobev•5h ago•24 comments

C/C++ Embedded Files (2013)

https://www.4rknova.com//blog/2013/01/27/cpp-embedded-files
40•ibobev•5h ago•35 comments

Joan Didion and Kurt Vonnegut had something to say. We have it on tape

https://www.nytimes.com/2025/12/19/books/james-baldwin-joan-didion-92ny-recordings.html
87•tintinnabula•4d ago•18 comments

Overlooked No More: Inge Lehmann, Who Discovered the Earth's Inner Core

https://www.nytimes.com/2025/12/20/obituaries/inge-lehmann-overlooked.html
69•Hooke•4d ago•14 comments

High school student discovers 1.5M potential new astronomical objects

https://www.smithsonianmag.com/smart-news/high-school-student-discovers-1-5-million-potential-new...
101•mhb•7h ago•92 comments

How uv got so fast

https://nesbitt.io/2025/12/26/how-uv-got-so-fast.html
339•zdw•5h ago

Comments

yjftsjthsd-h•2h ago
> No bytecode compilation by default. pip compiles .py files to .pyc during installation. uv skips this step, shaving time off every install. You can opt in if you want it.

Are we losing out on performance of the actual installed thing, then? (I'm not 100% clear on .pyc files TBH; I'm guessing they speed up start time?)

woodruffw•2h ago
No, because Python itself will generate bytecode for packages once you actually import them. uv just defers that to first-import time, but the cost is amortized in any setting where imports are performed over multiple executions.
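(For reference, the install-time step being discussed is plain standard-library bytecode compilation; a minimal sketch, with a hypothetical file path:)

  import py_compile

  # Compile one source file to a cached .pyc. pip does this for every
  # installed file up front; with uv's default, the interpreter instead
  # does it lazily the first time the module is imported.
  py_compile.compile("example.py", doraise=True)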
yjftsjthsd-h•2h ago
That sounds like yes? Instead of doing it once at install time, it's done once at first use. It's only once so it's not persistently slower, but that is a perf hit.

My first cynical instinct is to say that this is uv making itself look better by deferring the costs to the application, but it's probably a good trade-off: if any significant percentage of the compiled files might never be used, the overall cost is lower when you defer to run time.

woodruffw•2h ago
> It's only once so it's not persistently slower, but that is a perf hit.

Sure, but you pay that hit either way. Real-world performance is always usage based: the assumption that uv makes is that people run (i.e. import) packages more often than they install them, so amortizing at the point of the import machinery is better for the mean user.

(This assumption is not universal, naturally!)

dddgghhbbfblk•1h ago
Ummm, your comment is backwards, right?
woodruffw•1h ago
Which part? The assumption is that when you `$TOOL install $PACKAGE`, you run (i.e. import) `$PACKAGE` more than you re-install it. So there's no point in slowing down (relatively less common) installation events when you can pay the cost once on import.

(The key part being that 'less common' doesn't mean a non-trivial amount of time.)

beacon294•2h ago
Probably for any case where an actual human is doing it. On an image you obviously want to do it at bake time, so I feel default off with a flag would have been a better design decision for pip.

I just read the thread and use Python; I can't comment on the % speedup attributed to uv that comes from this optimization.

Epa095•1h ago
Images are a good example where doing it at install time is probably best, yeah, since every run of the image starts 'fresh', losing the compilation that happened the last time the image was started.

If it were an optional toggle, it would probably become best practice to enable compilation in Dockerfiles.

saidnooneever•1h ago
You are right. It depends on how often this first start happens, and whether it's bad or not. In most use cases, I'd guess (total guess, I have limited experience with Python projects professionally) it's not an issue.
tedivm•39m ago
You can change it to compile the bytecode on install with a simple environment variable (which you should do when building docker containers if you want to sacrifice some disk space to decrease initial startup time for your app).
VorpalWay•19m ago
I think they are making the bet that most modules won't be imported. For example if I install scipy, numpy, Pillow or such: what are the chances that I use a subset of the modules vs literally all of them?

I would bet on a subset for pretty much any non-trivial package (i.e. larger than one or two user facing modules). And for those trivial packages? Well they are usually small, so the cost is small as well. I'm sure there are exceptions: maybe a single gargantuan module that consists of autogenerated FFI bindings for some C library or such, but that is likely the minority.

salviati•2h ago
Historically, the practice of producing .pyc files on install started with system-wide installed packages, I believe, when the user running the program might lack privileges to write them. If the installer can write the .py files it can also write the .pyc files, while the user running them might not be able to write in that location.
plorkyeran•1h ago
If you have a dependency graph large enough for this to be relevant, it almost certainly includes a large number of files which are never actually imported. At worst the hit to startup time will be equal to the install time saved, and in most cases it'll be a lot smaller.
hauntsaninja•1h ago
Yes, uv skipping this step is a one-time, significant hit to startup time. E.g. if you're building a Dockerfile I'd recommend setting `--compile-bytecode` / `UV_COMPILE_BYTECODE`
thundergolfer•57m ago
This optimization hits serverless Python the worst. At Modal we ensure users of uv are setting UV_COMPILE_BYTECODE to avoid the cold start penalty. For large projects .pyc compilation can take hundreds of milliseconds.
blintz•2h ago
> PEP 658 went live on PyPI in May 2023. uv launched in February 2024. The timing isn’t coincidental. uv could be fast because the ecosystem finally had the infrastructure to support it. A tool like uv couldn’t have shipped in 2020. The standards weren’t there yet.

How/why did the package maintainers start using all these improvements? Some of them sound like a bunch of work, and getting a package ecosystem to move is hard. Was there motivation to speed up installs across the ecosystem? If setup.py was working okay for folks, what incentivized them to start using pyproject.toml?

yjftsjthsd-h•2h ago
Because static declaration was clearly safer and more performant? My question is why pip isn't fully taking advantage
eesmith•2h ago
Because pip contains decades of built-up code and lacks the people willing to work on updating it.
zahlman•1h ago
> If setup.py was working okay for folks, what incentivized them to start using pyproject.toml?

It wasn't working okay for many people, and many others haven't started using pyproject.toml.

For what I consider the most egregious example: Requests is one of the most popular libraries, under the PSF's official umbrella, which uses only Python code and thus doesn't even need to be "built" in a meaningful sense. It has a pyproject.toml file as of the last release. But that file isn't specifying the build setup following PEP 517/518/621 standards. That's supposed to appear in the next minor release, but they've only done patch releases this year and the relevant code is not at the head of the repo, even though it already caused problems for them this year. It's been more than a year and a half since the last minor release.

woodruffw•2h ago
I think this post does a really good job of covering how multi-pronged performance is: it certainly doesn't hurt uv to be written in Rust, but it benefits immensely from a decade of thoughtful standardization efforts in Python that lifted the ecosystem away from needing `setup.py` on the hot path for most packages.
yjftsjthsd-h•2h ago
I think a lot of rust rewrites have this benefit; if you start with hindsight you can do better more easily. Of course, rust is also often beneficial for its own sake, so it's a one-two punch:)
woodruffw•2h ago
Completely agreed!
pxc•2h ago
Succinctly, perhaps with some loss of detail:

"Rewrite" is important as "Rust".

pixelpoet•1h ago
as important as
s_ting765•1h ago
Rust rewrites are known for breaking (compatibility with) working software. That's all there is to them.
pxc•1h ago
In Python's case, as the article describes quite clearly, the issue is that the design of "working software" (particularly setup.py) was bad to the point of insane (in much the same way as the NPM characteristics that enabled the recent Shai Hulud supply chain attacks, but even worse). At some point, compatibility with insanity has got to go.

Helpfully, though, uv retains compatibility with newer (but still well-established) standards in the Python community that don't share this insanity!

s_ting765•49m ago
My gripe is with Rust rewrites. Not uv. Though I very much think uv is overhyped.
Lammy•21m ago
I would say the downside of them is that they're known for replacing GPL software with MIT software
Levitating•1h ago
> I think a lot of rust rewrites have this benefit

I think Rust itself has this benefit

glaslong•1h ago
Someone once told me a benefit of staffing a project for Haskell was it made it easy to select for the types of programmers that went out of their way to become experts in Haskell.

Tapping the Rust community is a decent reason to do a project in Rust.

steve_adams_86•1h ago
In my experience this is definitely where rust shined. The language wasn't really what made the project succeed so much as having relatively curious, meticulous, detail-oriented people on hand who were interested in solving hard problems.

Sometimes I thought our teams would be a terrible fit for more cookie-cutter applications where rapid development and deployment was the primary objective. We got into the weeds all the time (sometimes because of rust itself), but it happened to be important to do so.

Had we built those projects with JavaScript or Python I suspect the outcomes would have been worse for reasons apart from the language choice.

zahlman•38m ago
> having relatively curious, meticulous, detail-oriented people on hand who were interested in solving hard problems.... Had we built those projects with JavaScript or Python I suspect the outcomes would have been worse for reasons apart from the language choice.

I genuinely can't understand why you suppose that has to do with the implementation language at all.

KPGv2•9m ago
> I genuinely can't understand why you suppose that has to do with the implementation language at all.

Languages that attract novice programmers (JS is an obvious one; PHP was one 20 years ago) have a higher noise to signal ratio than one that attracts intermediate and above programmers.

If you grabbed an average Assembly programmer today, and an average JavaScript programmer today, who do you think is more careful about programming? The one who needs to learn arcane shit to do basic things and then has to compile it in order to test it out, or the one who can open up Chrome's console and console.log("i love boobies")

How many embedded systems programmers suck vs full stack devs? I'm not saying full stack devs are inferior. I'm saying that more inferior coders are attracted to the latter because the barriers to entry are SO much easier to bypass.

Calavar•59m ago
Paul Graham said the same thing about Python 20 years ago [1], and back then it was true. But once a programming language hits mainstream, this ceases to be a good filter.

[1] https://paulgraham.com/pypar.html

jghn•53m ago
This is important. The benefit here isn't the language itself. It's the fact that you're pulling from an esoteric language. People should not overfit and feel that whichever language is achieving that effect today is special in this regard.
bri3d•50m ago
It's an interesting debate. The flip side of this coin is getting hires who are more interested in the language or approach than the problem space and tend to either burn out, actively dislike the work at hand, or create problems that don't exist in order to use the language to solve them.

With that said, Rust was a good language for this in my experience. Like any "interesting" thing, there was a moderate bit of language-nerd side quest thrown in, but overall, a good selection metric. I do think it's one of the best Rewrite it in X languages available today due to the availability of good developers with Rewrite in Rust project experience.

The Haskell commentary is curious to me. I've used Haskell professionally but never tried to hire for it. With that said, the other FP-heavy languages that were popular ~2010-2015 were absolutely horrible for this in my experience. I generally subscribe to a vague notion that "skill in a more esoteric programming language will usually indicate a combination of ability to learn/plasticity and interest in the trade," however, using this concept, I had really bad experiences hiring both Scala and Clojure engineers; there was _way_ too much academic interest in language concepts and way too little practical interest in doing work. YMMV :)

nurettin•2h ago
> When a package says it requires python<4.0, uv ignores the upper bound and only checks the lower.

I will bring popcorn on python 4 release date.

yjftsjthsd-h•2h ago
If it's really not doing any upper bound checks, I could see it blowing up under more mundane conditions; Python includes breaking changes on .x releases, so I've had e.g. packages require (say) Python 3.10 when 3.11/12 was current.
dev_l1x_be•2h ago
I always bring popcorn on major version changes for any programming language. I hope Rust's never 2.0 stance holds.
zahlman•1h ago
It would be popcorn-worthy regardless, given the rhetoric surrounding the idea in the community.
ethin•2h ago
> Zero-copy deserialization. uv uses rkyv to deserialize cached data without copying it. The data format is the in-memory format. This is a Rust-specific technique.

This (zero-copy deserialization) is not a rust-specific technique, so I'm not entirely sure why the author describes it as one. Any good low level language (C/C++ included) can do this from my experience.

nemothekid•2h ago
Given the context of the article, I think "Rust specific" here means that "it couldn't be done in python".

For example "No interpreter startup" is not specific to Rust either.

kbd•2h ago
It's Rust vs Python in this case.
woodruffw•2h ago
I think the framing in the post is that it's specific to Rust, relative to what Python packaging tools are otherwise written in (Python). It's not very easy to do zero-copy deserialization in pure Python, from experience.

(But also, I think Rust can fairly claim that it's made zero-copy deserialization a lot easier and safer.)

stefan_•1h ago
I suppose it can fairly claim that now every other library and blog post invokes "zero-copy" this and that, even in the most nonsensical scenarios. It's a technique for when you literally can't afford the memory bandwidth, because you are trying to saturate a 100Gbps NIC or handling 8k 60Hz video, not for compromising your data serialization scheme's portability for marketing purposes while all applications hit the network first, disk second and memory bandwidth never.
woodruffw•1h ago
Many of the hot paths in uv involve an entirely locally cached set of distributions that need to be loaded into memory, very lightly touched/filtered, and then sunk to disk somewhere else. In those contexts, there are measurable benefits to not transforming your representation.

(I'm agnostic on whether zero-copy "matters" in every single context. If there's no complexity cost, which is what Rust's abstractions often provide, then it doesn't really hurt.)

vlovich123•44m ago
You’ve got this backward. The vast majority of the time, due to spatial and temporal locality, any application in practice is actually doing CPU registers first, cache second, memory third, disk fourth, network cache fifth, and network origin sixth. So this stuff does actually matter for performance.

Also, aside from memory bandwidth, there’s a latency cost inherent in traversing object graphs - 0 copy techniques ensure you traverse that graph minimally, just what’s needed to actually be accessed which is huge when you scale up. There’s a difference between one network request and fetching 1 MB vs making 100 requests to fetch 10kib and this difference also appears in memory access patterns unless they’re absorbed by your cache (not guaranteed for object graph traversal that a package manager would be doing).

zahlman•22m ago
The point is that the packaging tool can analyze files from within the archives it downloads, without writing them to disk.
zahlman•22m ago
I can't even imagine what "safety" issue you have in mind. Given that "zero-copy" apparently means "in-memory" (a deserialized version of the data necessarily cannot be the same object as the original data), that's not even difficult to do with the Python standard library. For example, `zipfile.ZipFile` has a convenience method to write to file, but writing to in-memory data is as easy as

  with zipfile.ZipFile(archive_name) as a:
      with a.open(file_name) as f, io.BytesIO() as b:
          b.write(f.read())
          return b.getvalue()
(That does, of course, copy data around within memory, but.)
woodruffw•8m ago
> Given that "zero-copy" apparently means "in-memory" (a deserialized version of the data necessarily cannot be the same object as the original data), that's not even difficult to do with the Python standard library

This is not what zero-copy means. Here's a working definition[1].

Specifically, it's not just about keeping things in memory; copying in memory is normal. The goal is to not make copies (or more precisely, what Rust would call "clones"), but to instead convey the original representation/views of that representation through the program's lifecycle where feasible.

> a deserialized version of the data necessarily cannot be the same object as the original data

rust-asn1 would be an example of a Rust library that doesn't make any copies of data unless you explicitly ask it to. When you load e.g. a Utf8String[2] in rust-asn1, you get a view into the original input buffer, not an intermediate owning object created from that buffer.

> (That does, of course, copy data around within memory, but.)

Yes, that's what makes it not zero-copy.

[1]: https://rkyv.org/zero-copy-deserialization.html

[2]: https://docs.rs/asn1/latest/asn1/struct.Utf8String.html
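(A toy Python-level illustration of the distinction: slicing a bytes object copies the data, while slicing a memoryview yields a zero-copy view into the original buffer.)

  data = bytes(range(256))        # stand-in for some deserialized payload

  copied = data[16:32]            # a new bytes object: the bytes are copied
  view = memoryview(data)[16:32]  # zero-copy: a view into the same buffer

  assert copied == bytes(view)    # same contents...
  assert view.obj is data         # ...but the view still references `data`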

agumonkey•2h ago
very nice article, always good to get a review of what a "simple" looking tool does behind the scenes

about rust though

some say a nicer language helps in finding the right architecture (heard that about a cpp veteran dropping it for ocaml: any attempted idea would take weeks in cpp but only a few days in ocaml, so they could explore more)

also the parallelism might be a benefit of the language's orientation

enough semi fanboyism

aswegs8•2h ago
uv seems to be a pet peeve of HN. I always thought pipenv was good but yeah, seems like I was being ignorant
aw1621107•1h ago
> uv seems to be a pet peeve of HN.

Unless I've been seeing very different submissions than you, "pet peeve" seems like the exact opposite of what is actually the case?

VerifiedReports•1h ago
Indeed; I don't think he knows what "peeve" means...
glaucon•1h ago
I too use pipenv unless there's a reason not to. I hope people use whatever works best for them.

I feel that sometimes there's a desire on the part of those who use tool X that everyone should use tool X. For some types of technology (car seat belts, antibiotics...) that might be reasonable but otherwise it seems more like a desire for validation of the advocate's own choice.

epage•2h ago
> uv is fast because of what it doesn’t do, not because of what language it’s written in. The standards work of PEP 518, 517, 621, and 658 made fast package management possible. Dropping eggs, pip.conf, and permissive parsing made it achievable. Rust makes it a bit faster still.

Isn't assigning out what all made things fast presumptive without benchmarks? Yes, I imagine a lot is gained by the work of those PEPs. I'm more questioning how much weight is put on dropping of compatibility compared to the other items. There is also no coverage for decisions influenced by language choice which likely influences "Optimizations that don’t need Rust".

This also doesn't cover subtle things. Unsure if rkyv is being used to reduce the number of times that TOML is parsed but TOML parse times do show up in benchmarks in Cargo and Cargo/uv's TOML parser is much faster than Python's (note: Cargo team member, `toml` maintainer). I wish the TOML comparison page was still up and showed actual numbers to be able to point to.

zahlman•1h ago
> Isn't assigning out what all made things fast presumptive without benchmarks?

We also have the benchmark of "pip now vs. pip years ago". That has to be controlled for pip version and Python version, but the former hasn't seen a lot of changes that are relevant for most cases, as far as I can tell.

> This also doesn't cover subtle things. Unsure if rkyv is being used to reduce the number of times that TOML is parsed but TOML parse times do show up in benchmarks in Cargo and Cargo/uv's TOML parser is much faster than Python's (note: Cargo team member, `toml` maintainer). I wish the TOML comparison page was still up and showed actual numbers to be able to point to.

This is interesting in that I wouldn't expect that the typical resolution involves a particularly large quantity of TOML. A package installer really only needs to look at it at all when building from source, and part of what these standards have done for us is improve wheel coverage. (Other relevant PEPs here include 600 and its predecessors.) Although that has also largely been driven by education within the community, things like e.g. https://blog.ganssle.io/articles/2021/10/setup-py-deprecated... and https://pradyunsg.me/blog/2022/12/31/wheels-are-faster-pure-... .

pecheny•2h ago
The content is nice and insightful! But God I wish people stopped using LLMs to 'improve' their prose... Ironically, some day we might employ LLMs to re-humanize texts that had been already massacred.
laidoffamazon•1h ago
Interestingly I didn’t catch this, I liked it for not looking LLM written!
yunohn•1h ago
“Why this matters” being the final section is a guaranteed giveaway, among innumerable others.
rick_dalton•13m ago
I realized once I was in the "optimizations that don't need rust" section. Specifically "This is concurrency, not language magic."
yunohn•1h ago
I have reached a point where any AI smell (of which this articles has many) makes me want to exit immediately. It feels tortuous to my reading sensibilities.

I blame fixed AI system prompts - they forcibly collapse all inputs into the same output space. Truly disappointing that OpenAI et al. have no desire to change this before everything on the internet sounds the same forever.

fleebee•1h ago
You're probably right about the latter point, but I do wonder how hard it'd be to mask the default "marketing copywriter" tone of the LLM by asking it to assume some other tone in your prompt.

As you said, reading this stuff is taxing. What's more, this is a daily occurrence by now. If there's a silver lining, it's that the LLM smells are so obvious at the moment; I can close the tab as soon as I notice one.

SatvikBeri•1h ago
> do wonder how hard it'd be to mask the default "marketing copywriter" tone of the LLM by asking it to assume some other tone in your prompt.

Fairly easy, in my wife's experience. She repeatedly got accused of using chatgpt in her original writing (she's not a native english speaker, and was taught to use many of the same idioms that LLMs use) until she started actually using chatgpt with about two pages of instructions for tone to "humanize" her writing. The irony is staggering.

mattkevan•23m ago
It’s pretty easy. I’ve written a fairly detailed guide to help Claude write in my tone of voice. It also coaxes it to avoid the obvious AI tells such as ‘It’s not X it’s Y’ sentences, American English and overuse of emojis and em dashes.

It’s really useful for taking my first drafts and cleaning them up ready for a final polish.

efilife•26m ago
I also don't read AI slop. It's disrespectful to any reader.
captn3m0•49m ago
The author’s blog was on HN a few days ago as well for an article on SBOMs and Lockfiles. They’ve done a lot of work on the supply-chain security side and are clearly knowledgeable, and yet the blog post got similarly “fuzzified” by the LLM.
ec109685•1h ago
The article info is great, but why do people put up with LLM tics and slop in their writing? These sentences add no value and treat the reader as stupid.

> This is concurrency, not language magic.

> This is filesystem ops, not language-dependent.

Duh, you literally told me that in the previous sentence and 50 million other times.

aurumque•1h ago
This kind of writing goes deeper than LLMs, and reflects a decline in reading ability, patience, and attention. Without passing judgement, there are just more people now who benefit from repetition and summarization embedded directly in the article. The reader isn't 'stupid', just burdened.
twoodfin•56m ago
Indeed, over the past few weeks I have come around to the realization and acceptance that the LLM editorial voice is a benefit to an order of magnitude more HN readers than those (like us) for whom it is ice-pick-in-the-nostril stuff.

Oh well, all I can do is flag.

hallvard•1h ago
Great post, but the blatant chatgpt-esque feel hits hard… Don’t get me wrong, I love astral! and the content, but…
hallvard•1h ago
Reading the other replies here makes it really obvious that this is some LLM’s writing. Maybe even all of it…
skywhopper•1h ago
This is great to read because it validates my impression that Python packaging has always been a tremendous overengineered mess. Glad to see someone finally realized you just need a simple standard metadata file per package.
looneysquash•1h ago
I don't have any real disagreement with any of the details the author said.

But still, I'm skeptical.

If it is doable, the best way to prove it is to actually do it.

If no one implements it, was it ever really doable?

Even if there is no technical reason, perhaps there is a social one?

stevemk14ebr•1h ago
What are you talking about, this all exists
VerifiedReports•1h ago
So... will uv make Python a viable cross-platform utility solution?

I was going to learn Python for just that (file-conversion utilities and the like), but everybody was so down on the messy ecosystem that I never bothered.

zahlman•1h ago
It has been viable for a long time, and the kinds of projects you describe are likely well served by the standard library.
pseudosavant•1h ago
I write all of my scripts in Python with PEP 723 metadata and run them with `uv run`. Works great on Windows and Linux for me.
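(For anyone unfamiliar, a minimal PEP 723-style script looks roughly like this, with the dependency chosen purely as an example; `uv run script.py` then builds the environment and runs it:)

  # /// script
  # requires-python = ">=3.12"
  # dependencies = [
  #     "requests",
  # ]
  # ///
  import requests

  # Example only: fetch a page and print its status code.
  print(requests.get("https://example.com").status_code)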
IshKebab•39m ago
Yes, uv basically solves the terrible Python tooling situation.

In my view that was by far the biggest issue with Python - a complete deal-breaker really. But uv solves it pretty well.

The remaining big issues are a) performance, and b) the import system. uv doesn't do anything about those.

Performance may not be an issue in some cases, and the import system is ... tolerable if you're writing "a python project". If you're writing some other project and considering using Python for its scripting system, e.g. to wrangle multiple build systems or whatever, then the import mess is a bigger issue and I would think long and hard before picking it over Deno.

pwdisswordfishy•1h ago
> Some of uv’s speed comes from Rust. But not as much as you’d think. Several key optimizations could be implemented in pip today: […] Python-free resolution

Umm…

andy99•1h ago
I remain baffled about these posts getting excited about uv’s speed. I’d like to see a real poll, but I personally can’t imagine people listing speed as one of their top ten concerns about python package managers. What are the common use cases where the delay due to package installation is at all material?

Edit to add: I use python daily

pants2•1h ago
The biggest benefit is in CI environments and Docker images and the like where all packages can get reinstalled on every run.
gordonhart•1h ago
`poetry install` on my dayjob’s monolith took about 2 minutes, `uv sync` takes a few seconds. Getting 2 minutes back on every CI job adds up to a lot of time saved
toenail•1h ago
The speed is nice, but I switched because uv supports "pip compile" from pip-tools, and it is better at resolving dependencies. Also, pip-tools uses (used?) internal pip methods and breaks frequently because of that; uv doesn't.
recov•1h ago
Docker builds are a big one, at least at my company. Any tool that reduces wait time is worth using, and uv is an amazing tool that removes that wait time. I take it you might not use python much; uv solves almost every pain point, and is fast, which feels rare.
stavros•1h ago
I can run `uvx sometool` without fear because I know that it'll take a few seconds to create a venv, download all the dependencies, and run the tool. uv's speed has literally changed how I work with Python.
quectophoton•25m ago
I wouldn't say without fear, since you're one typo away from executing a typo-squatted malicious package.

I do use it on CI/CD pipelines, but I wouldn't dare type uvx commands myself on a daily basis.

stavros•5m ago
uvx isn't more risky than `pip install`, which is what I used before.
rsyring•1h ago
As a multi decade Python user, uv's speed is "life changing". It's a huge devx improvement. We lived with what came before, but now that I have it, I would never want to go back and it's really annoying to work on projects now that aren't using it.
thraxil•1h ago
Working heavily in Python for the last 20 years, it absolutely was a big deal. `pip install` has been a significant percentage of the deploy time on pretty much every app I've ever deployed and I've spent countless hours setting up various caching techniques trying to speed it up.
SatvikBeri•1h ago
Setting up a new dev instance took 2+ hours with pip at my work. Switching to uv dropped the Python portion down to <1 minute, and the overall setup to 20 minutes.

A similar, but less drastic speedup applied to docker images.

techbruv•1h ago
At a previous job, I recall updating a dependency via poetry would take on the order of ~5-30m. God forbid after 30 minutes something didn’t resolve and you had to wait another 30 minutes to see if the change you made fixed the problem. Was not an enjoyable experience.

uv has been a delight to use

pxc•1h ago
> updating a dependency via poetry would take on the order of ~5-30m. God forbid after 30 minutes something didn’t resolve and you had to wait another 30 minutes to see if the change you made fixed the problem

I'd characterize that as unusable, for sure.

patrick91•1h ago
for me it's being able to do `uv run whatever` and always know I have the correct dependencies

(also switching python version is so fast)

pseudosavant•1h ago
I avoided Python for years, especially because of package and environment management. Python is now my go to for projects since discovering uv, PEP 723 metadata, and LLMs’ ability to write Python.
adammarples•44m ago
It's annoying. Do you use poetry? Pipenv? It's annoying.
IshKebab•43m ago
Do you still remain baffled after the many replies that people actually do like their tooling to be not dog slow like pip is?
VorpalWay•32m ago
CI: I changed a pipeline at work from pip and pipx to uv, it saved 3 minutes on a 7 minute pipeline. Given how oversubscribed our runners are, anything saving time is a big help.

It is also really nice when working interactively to have snappy tools that don't take you out of the flow more than absolutely necessary. But then I'm quite sensitive to this; I'm one of those people who turn off all GUI animations because they waste my time and make the system feel slow.

zahlman•14m ago
It's not just about delays being "material"; waiting on the order of seconds for a venv creation (and knowing that this is because of pip bootstrapping itself, when it should just be able to install cross-environment instead of having to wait until 2022 for an ugly, limited hack to support that) is annoying.

But small efficiencies do matter; see e.g. https://danluu.com/productivity-velocity/.

zahlman•1h ago
I've talked about this many times on HN this year but got beaten to the punch on blogging it seems. Curses.

... Okay, after a brief look, there's still lots of room for me to comment. In particular:

> pip’s slowness isn’t a failure of implementation. For years, Python packaging required executing code to find out what a package needed.

This is largely refuted by the fact that pip is still slow, even when installing from wheels (and getting PEP 600 metadata for them). Pip is actually still slow even when doing nothing. (And when you create a venv and allow pip to be bootstrapped in it, that bootstrap process takes in the high 90s percent of the total time used.)

didibus•1h ago
There's an interesting psychology at play here as well: if you are a programmer who chooses a "fast language", that's indicative of your priorities already. It's often not so much the language, but that the programmer has decided to optimize for performance from the get-go.
bastawhiz•1h ago
> When a package says it requires python<4.0, uv ignores the upper bound and only checks the lower. This reduces resolver backtracking dramatically since upper bounds are almost always wrong. Packages declare python<4.0 because they haven’t tested on Python 4, not because they’ll actually break. The constraint is defensive, not predictive.

This is kind of fascinating. I've never considered runtime upper bound requirements. I can think of compelling reasons for lower bounds (dropping version support) or exact runtime version requirements (each version works for exact, specific CPython versions). But now that I think about it, it seems like upper bounds solve a hypothetical problem that you'd never run into in practice.

If PSF announced v4 and declared a set of specific changes, I think this would be reasonable. In the 2/3 era it was definitely reasonable (even necessary). Today though, it doesn't actually save you any trouble.
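(For concreteness, this is the kind of constraint being discussed, checked here with the `packaging` library; the version numbers are only examples:)

  from packaging.specifiers import SpecifierSet
  from packaging.version import Version

  # A defensive requires-python constraint of the kind many packages declare.
  spec = SpecifierSet(">=3.9,<4.0")

  print(Version("3.13") in spec)  # True: satisfies both bounds
  print(Version("4.0") in spec)   # False: rejected only by the upper bound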

wging•42m ago
I think the article is being careful not to say uv ignores _all_ upper bound checks, but specifically 4.0 upper bound checks. If a package says it requires python < 3.0, that's still super relevant, and I'd hope for uv to still notice and prevent you from trying to import code that won't work on python 3. Not sure what it actually does.
breischl•32m ago
I read the article as saying it ignores all upper-bounds, and 4.0 is just an example. I could be wrong though - it seems ambiguous to me.

But if we accept that it currently ignores any upper-bounds checks greater than v3, that's interesting. Does that imply that once Python 4 is available, uv will slow down due to needing to actually run those checks?

VorpalWay•26m ago
Are there any plans to actually make a 4.0 ever? I remember hearing a few years ago that after the transition to 3.0, the core devs kind of didn't want to repeat that mess ever again.

That said, even if it does happen, I highly doubt that is the main part of the speed up compared to pip.

ofek•1h ago
> pip could implement parallel downloads, global caching, and metadata-only resolution tomorrow. It doesn’t, largely because backwards compatibility with fifteen years of edge cases takes precedence.

pip is simply difficult to maintain. Backward compatibility concerns surely contribute to that but also there are other factors, like an older project having to satisfy the needs of modern times.

For example, my employer (Datadog) allowed me and two other engineers to improve various aspects of Python packaging for nearly an entire quarter. One of the items was to satisfy a few long-standing pip feature requests. I discovered that the cross-platform resolution feature I considered most important is basically incompatible [1] with the current code base. Maintainers would have to decide which path they prefer.

[1]: https://github.com/pypa/pip/issues/13111

eviks•56m ago
> Every code path you don’t have is a code path you don’t wait for.

No, every code path you don't execute is that. Like

> No .egg support.

How does that explain anything if the egg format is obsolete and not used?

Similarly with the spec-strictness fallback logic - it's only slow if the packages you're installing are malformed; otherwise the logic will not run and will not slow you down.

And in general, instead of a list of irrelevant and potentially relevant things, it would be great to understand the actual time savings per item (at least for those that deliver the most speedup)!

But otherwise great and seemingly comprehensive list!

zahlman•7m ago
> No, every code path you don't execute is that.

Even in compiled languages, binaries have to get loaded into memory. For Python it's much worse. On my machine:

  $ time python -c 'pass'

  real 0m0.019s
  user 0m0.013s
  sys 0m0.006s

  $ time pip --version > /dev/null

  real 0m0.202s
  user 0m0.182s
  sys 0m0.021s
Almost all of that extra time is the module import process and then garbage collection. Even with cached bytecode, the former requires finding and reading from literally hundreds of files, deserializing via `marshal.loads` and then running top-level code, which includes creating objects to represent the functions and classes.

It used to be even worse than this; in recent versions, imports related to Requests are deferred to the first time that an HTTPS request is needed.

efilife•52m ago
this shit is ChatGPT-written and I'm really tired of it. If I wanted to read chatgpt I would have asked it myself. Half of the article is nonsensical repeated buzzwords thrown in for absolutely no reason
IshKebab•46m ago
Mmm I don't buy it. Not many projects use setup.py now anyway and pip is still super slow.

> Plenty of tools are written in Rust without being notably fast.

This also hasn't been my experience. Most tools written in Rust are notably fast.

ggm•42m ago
Some of these speed-ups looked viable to backport into pip, including parallel downloads, delayed .pyc compilation, ignoring eggs, and the version checks.

Not that I'd bother since uv does venv so well. But, "it's not all rust runtime speed" implies pip could be faster too.

robertclaus•30m ago
At Plotly we did a decent amount of benchmarking to see how much the different defaults `uv` uses contribute to its performance. This was necessary so we could advise our enterprise customers on the transition. We found you lost almost all of the speed gains if you configured uv to behave as much like pip as you could. A trivial example is the precompile flag, which can easily be 50% of pip's install time for a typical data science venv.

https://plotly.com/blog/uv-python-package-manager-quirks/

zahlman•16m ago
The precompilation thing was brought up to the uv team several months ago IIRC. It doesn't make as much of a difference for uv as for pip, because when uv is told to pre-compile it can parallelize that process. This is easily done in Python (the standard library even provides rudimentary support, which Python's own Makefile uses); it just isn't in pip yet (I understand it will be soon).
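(The rudimentary standard-library support referred to is presumably `compileall`'s worker pool; a sketch with a hypothetical site-packages path:)

  import compileall

  # Byte-compile an installed tree in parallel; workers=0 means one worker
  # process per CPU core.
  compileall.compile_dir(
      "/path/to/venv/lib/python3.12/site-packages",
      quiet=1,
      workers=0,
  )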
w10-1•30m ago
I like the implication that we can have an alternative to uv speed-wise, but I think reliability and understandability are more important in this context (so this comment is a bit off-topic).

What I want from a package manager is that it just works.

That's what I mostly like about uv.

Many of the changes that made speed possible were to reduce the complexity and thus the likelihood of things not working.

What I don't like about uv (or pip or many other package managers), is that the programmer isn't given a clear mental model of what's happening and thus how to fix the inevitable problems. Better (pubhub) error messages are good, but it's rare that they can provide specific fixes. So even if you get 99% speed, you end up with 1% perplexity and diagnostic black boxes.

To me the time that matters most is time to fix problems that arise.

zahlman•19m ago
> the programmer isn't given a clear mental model of what's happening and thus how to fix the inevitable problems.

This is a priority for PAPER; it's built on a lower-level API so that programmers can work within a clear mental model, and I will be trying my best to communicate well in error messages.

pkaodev•16m ago
AI slop
rvz•8m ago
TLDR: Because Rust.

This entire AI generated article with lots of text just to just say the obvious.