Also love Ruff from the Astral team. We just cut our linting + formatting over from pylint + Black to Ruff.
Saw lint times drop from 90 seconds to under 1.5 seconds. Crazy stuff.
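For anyone making the same move, the mechanical part of the cutover is tiny (a sketch; rule configuration and CI wiring will vary):

$ ruff check .     # replaces pylint
$ ruff format .    # replaces Black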
uv add <mydependencies> --script mycoolscript.py
And then shoving #!/usr/bin/env -S uv run
on top so I can run Python scripts easily. It's great! - https://everything.intellectronica.net/p/the-little-scripter
~~That mutates the project/env in your cwd. They have a lot in their docs, but I think you'd like `uv run --with` or uv's PEP 723 support a lot more.~~
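For example, either of these runs a script with its deps without touching the project in your cwd (httpx here is just a stand-in dependency; the script name is from above):

$ uv run --with httpx mycoolscript.py
$ uv add --script mycoolscript.py httpx   # writes PEP 723 inline metadata into the script instead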
Instant reactive reproducible app that can be sent to others with minimal prerequisites (only uv needs to be installed).
Such a hot combo.
Claude 4's training cutoff date is March 2025, though. I just checked, and it turns out Claude Sonnet 4 can do this without needing any extra instructions:
Python script using uv and inline script dependencies
where I can give it a URL and it scrapes it with httpx
and beautifulsoup and returns a CSV of all links on
the page - their URLs and their link text
Here's the output; it did the right thing with regard to those dependencies: https://claude.ai/share/57d5c886-d5d3-4a9b-901f-27a3667a8581
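For reference, a minimal sketch of the kind of script that prompt should produce (my reconstruction, not Claude's actual output; note that csv, being stdlib, stays out of the dependencies array):

# /// script
# requires-python = ">=3.12"
# dependencies = [
#     "httpx",
#     "beautifulsoup4",
# ]
# ///
import csv
import sys

import httpx
from bs4 import BeautifulSoup

url = sys.argv[1]
response = httpx.get(url, follow_redirects=True)
response.raise_for_status()

# Emit a CSV of every link on the page: its URL and its link text.
soup = BeautifulSoup(response.text, "html.parser")
writer = csv.writer(sys.stdout)
writer.writerow(["url", "text"])
for anchor in soup.find_all("a", href=True):
    writer.writerow([anchor["href"], anchor.get_text(strip=True)])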
I also add this to the prompt: "If you need to run these scripts, use 'uv run script-name.py'. It will automatically install the dependencies. Stdlibs don't need to be specified in the dependencies array." That helps since e.g. Cursor often gets confused because the dependencies are not installed and it doesn't know how to start the script. The last sentence is for when LLMs get confused and want to add "json", for example, to the dependency array.

[1]: https://old.reddit.com/r/Python/comments/12rk41t/astral_next...
It seems easy to imagine Astral following a similar path and making a significant amount of money in the process.
One day they're going to tell me I have to pay $10/month per user and add a bunch of features I really don't need just because nobody wants to prioritize the speed of pip.
And most of that fee isn't going to go towards engineers maintaining "pip but faster", it's going to fund a bunch of engineers building new things I probably don't want to use, but once you have a company and paying subscribers, you have to have developers actively doing things to justify the cost.
I don't want to charge people money to use our tools, and I don't want to create an incentive structure whereby our open source offerings are competing with any commercial offerings (which is what you see with a lot of hosted-open-source-SaaS business models).
What I want to do is build software that vertically integrates with our open source tools, and sell that software to companies that are already using Ruff, uv, etc. Alternatives to things that companies already pay for today.
An example of what this might look like (we may not do this, but it's helpful to have a concrete example of the strategy) would be something like an enterprise-focused private package registry. A lot of big companies use uv. We spend time talking to them. They all spend money on private package registries, and have issues with them. We could build a private registry that integrates well with uv, and sell it to those companies. [...]
But the core of what I want to do is this: build great tools, hopefully people like them, hopefully they grow, hopefully companies adopt them; then sell software to those companies that represents the natural next thing they need when building with Python. Hopefully we can build something better than the alternatives by playing well with our OSS, and hopefully we are the natural choice if they're already using our OSS.
Rust's speed advantages typically come from one of a few places:
1. Fast start-up times, thanks to pre-compiled native binaries.
2. Large amounts of CPU-level concurrency with many fewer bugs. I'm willing to do ridiculous threading tricks in Rust I wouldn't dare try in C++.
3. Much lower levels of malloc/free in Rust compared to some high-level languages, especially if you're willing to work a little for it. Calling malloc in a multithreaded system is basically like watching the Millennium Falcon's hyperdrive fail. Also, Rust encourages abusing the stack to a ridiculous degree, which further reduces allocation. It's hard to "invisibly" call malloc in Rust, even compared to a language like C++.
4. For better or worse, Rust exposes a lot of the machinery behind memory layout and passing references. This means there's a permanent "Rust tax" where you ask yourself "Do I pass this by value or reference? Who owns this, and who just borrows it?" But the payoff for that work is good memory locality.
So if you put in a modest amount of effort, it's fairly easy to make Rust run surprisingly fast. It's not an absolute guarantee, and there are a couple of traps for the unwary (like accidentally forgetting to buffer I/O, or benchmarking debug binaries).
Conda rewrote their package resolver for similar reasons.
The improvements came from lots of work from the entire Python build-system ecosystem and consensus building.
Sure, other tools could handle the situation, but being baked into the tooling makes it much easier to bootstrap different configurations.
uv does the Python ecosystem better than any other tool, but it's still the standard Python ecosystem as defined in the relevant PEPs.
Off topic, but I wonder why that phrase gets used rather than "10x", which is much shorter.
- 10x is a meme
- what if it's 12x better
Order of magnitude carries less of that baggage, until it does :)
In common conversation, the multiplier can vary from 2x to 10x. In the context of some algorithms, an order of magnitude can be over the delta rather than the absolute: e.g., an algorithm sees a 1.1x improvement over the previous 10 years, so a change that shows a 1.1x improvement by itself overshadows an order of magnitude more effort.
For salaries, I've used order-of-magnitude to mean 2x. Good way to show a step change in a person's perceived value in the market.
Long answer: Because if you put a number, people expect it to be accurate. If it was 6x faster, and you said 10x, people may call you out on it.
A metal wheel is still just a wheel. A faster package manager is still just a package manager.
My primary vehicle has off-road capable tires that offer as much grip as a road-only tire would have 20-25 years ago, thanks to technology allowing Michelin to reinvent what a dual-purpose tire can be!
The good thing about reinventing the wheel is that you can get a round one.
https://scripting.wordpress.com/2006/12/20/scripting-news-fo...
Just `git clone someproject`, `uv run somescript.py`, then mic drop and walk away.
Maybe that functionality isn't implemented the same way for uvx.
You could try this equivalent command that is under "uv run" to see if it behaves differently: https://docs.astral.sh/uv/concepts/tools/#relationship-to-uv...
e.g.
$ uv tool install asciinema
$ asciinema play example.cast
You don't have that problem with Poetry. You go make a cup of coffee for a couple minutes, and it's usually done when you come back.
Other than that, it's invaluable to me, with the best features being uvx and PEP 723.
What I want is, if my project depends on `package1==0.4.0` and there are new versions of package1, for uv to try to install the newer version, and to do that (a) for all the deps simultaneously, and (b) without me explicitly stating the dependencies on the command line, since they're already written in the pyproject.toml. An `uv refresh` of sorts.
I think you're just specifying your dependency constraints wrong. What you're asking for is not what the `==` operator is for; you probably want `~=`.
pyproject.toml is meant to encode the actual constraints for when your app will function correctly, not hardcode exact versions, which is what the lockfile is for.
[1]: I do sometimes write the title or the description. But never the deps themselves
pyproject.toml’s dependency list specifies compatibility: we expect the program to run with versions that satisfy constraints.
If you want to specify an exact version as a validated configuration for a reproducible build with guaranteed functionality, well, that’s what the lock file is for.
In serious projects, I usually write that dependency section by hand so that I can specify the constraints that match my needs (e.g., what is the earliest version receiving security patches or the earliest version with the functionality I need?). In unserious projects, I’ll leave the constraints off entirely until a breakage is discovered in practice.
If `uv` is adding things with `==` constraints, that’s why upgrades are not occurring, but the solution is to relax the constraints to indicate where you are okay with upgrades happening.
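Concretely (package name is a placeholder), the compatible-release constraint plus uv's built-in upgrade flow already gets most of the way to a `uv refresh`:

[project]
dependencies = [
    "package1~=0.4.0",  # any 0.4.x: allows patch upgrades, not 0.5
]

$ uv lock --upgrade   # re-resolve all deps to the newest versions the constraints allow
$ uv sync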
Yeah, that's pretty much what I've been doing with my workaround script. And btw most of my projects are deeply unserious, and I do understand why one should not do that in any other scenario.
Still, I dream of `uv refresh` :D
Much prefer not thinking about venvs.
I've written a lightweight replacement script to manage named central virtual envs using the same command syntax as virtualenvwrapper. Supports tab completion for zsh and bash: https://github.com/sitic/uv-virtualenvwrapper
Perhaps uv will continue its ascendancy and get there naturally. But I’d like to see uv be a little more aggressive with “uv native” workflows. If that makes sense.
It's nice software.
I don't see a way to change the current and global Python/venv versions used to run scripts, so that when I type "python" it uses that version, without making an alias.
https://docs.astral.sh/uv/guides/scripts/#declaring-script-d...
* Redis -> Redict, Valkey
* Elasticsearch -> OpenSearch
* Terraform -> OpenTofu
(Probably a few more but those are the ones that come to mind when they "go rogue")
Or would it be possible to go this fast in Python if you cared enough about speed?
Is it a specific thing that Rust has an amazing library for? Like networking or serde or something?
pip could be made faster based on this, but maybe not quite as fast.
Using Rust is responsible for a lot of speed gains too, but I believe it's the hard linking trick (which could be implemented in any language) that's the biggest win.
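A minimal sketch of that trick (hypothetical helper; uv's real implementation is in Rust and far more careful): materialize a package into a venv by hard-linking files out of a global cache, copying only when linking fails, e.g. across filesystems:

import os
import shutil
from pathlib import Path

def link_tree(cache_dir: Path, target_dir: Path) -> None:
    """Recreate cache_dir under target_dir using hard links instead of copies."""
    for src in cache_dir.rglob("*"):
        if not src.is_file():
            continue
        dest = target_dir / src.relative_to(cache_dir)
        dest.parent.mkdir(parents=True, exist_ok=True)
        try:
            os.link(src, dest)  # same inode: no bytes copied, shared by every venv
        except OSError:
            shutil.copy2(src, dest)  # fallback, e.g. cache on a different filesystem

Because every venv's copy of a package is just another link to the same cached file, "installing" an already-downloaded package is almost pure metadata work.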
I now use uv for everything Python. The reason for the switch was a shared server where I did not have root and there were all sorts of broken packages/drivers, and I needed PyTorch. Nothing was working and pip was taking ages. Each user had 10GB of storage allocated, and pip's cache was taking up a ton of space and not letting me change its location properly. Switched to uv and everything just worked.
If you're still holding out, really just spend 5 minutes trying it out, you won't regret it.
Really? :)
requirements.txt is just hell and torture. If you've ever used modern project/dependency management tools like uv, Poetry, PDM, you'll never go back to pip+requirements.txt. It's crazy and a mess.
uv is super fast and a great tool, but it still has rough edges and bugs.
There are times when you do NOT want the wheel version to be installed (which is what --no-binary implements in pip), but so many package managers including uv don't provide that core, basic functionality. At least for those that do use pip behind the scenes, like pipenv, one can still use the PIP_NO_BINARY environment variable to ensure this.
So I'll not be migrating any time soon.
See https://docs.astral.sh/uv/reference/environment/#uv_no_binar...
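e.g. (somepkg is a placeholder; uv's flag mirrors pip's spelling, and UV_NO_BINARY is the env-var equivalent per those docs):

$ uv pip install --no-binary :all: somepkg
$ UV_NO_BINARY=1 uv pip install somepkg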
uv is still quite new though. Perhaps you can open an issue and ask for that?
When, why? Should I be doing this?
I can see how if you've had issues with dependencies you would rave about systems that let you control down to the commit what an import statement actually means, but I like the system that requires the least amount of typing/thinking and I imagine I'm part of a silent majority.
uv pip install --system requests
but it's more typing. If I type 5 characters per second, making me also type "uv --system" is the same as adding 2 seconds of runtime to the actual command, except even worse because the chance of a typo goes up and typing takes energy and concentration and is annoying.

Also, it seems like a sign that even Python tooling needs to not be written in Python now to get reasonable performance.
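If the typing is the real objection, a one-line shell alias (name made up) gets it under pip's length:

$ alias pipi='uv pip install --system'
$ pipi requests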
I also appreciate that it handles most package conflicts and constantly maintains the list of packages as you go. I have gotten myself into a hole or two with packages and dependencies; I can usually solve it by deleting the venv and just using uv to reinstall.
#!/usr/bin/env -S uv --quiet run --script
# /// script
# requires-python = ">=3.13"
# dependencies = [
# "python-dateutil",
# ]
# ///
#
# [python script that needs dateutil]
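The bracketed body could be as little as this (a made-up stand-in):

from dateutil.parser import parse

print(parse("March 14, 2025 4:20 PM").isoformat())  # 2025-03-14T16:20:00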
Rather, pip was broken intentionally two years ago and they are still not interested in fixing it:
https://github.com/pypa/packaging/issues/774
I tried uv and it just worked.
Fast is a massive factor.
I haven't used it much, but being so fast, I didn't even stop to think "is it perfect at dependency management?" or "does it lack any features?".
Just today I set it up on 20 PCs in a computer lab that doesn't have internet, along with VS Code and some main packages. Just downloaded the files, made a PowerShell script, and it's all working great with Jupyter etc. Now to get the kids interested in it...