UV doesn't change any of that for me - it just wraps virtualenv and pip, and downloads dependencies (much, much) more quickly - the conversion was immediate and required zero changes.
UV is a pip / virtualenv wrapper. And it's a phenomenal wrapper - it absolutely changed everything about how I do development - but under the hood it's still just virtualenv + pip - nothing changed there.
Can you expand on the pain you've experienced?
Regarding "things that need to be deployed" - internally all our repos have standardized on direnv (and in some really advanced environments, nix + direnv, but direnv alone does the trick 90% of the time) - so you just "cd <somedir>", direnv executes your virtualenv and you are good to go. UV takes care of the pip work.
This has eliminated 100% of our use of virtualenvwrapper and direct calls to pip. I'd love to hear a use case where that doesn't work for you - we haven't tripped across it recently.
Not quite; it reimplements the pip functionality (in a much smarter way). I'm pretty sure it reimplements the venv functionality, too, although I'm not entirely sure why (there's not a lot of room for improvement).
("venv" is short for "virtual environment", but "virtualenv" is specifically a heavyweight Python package for creating them with much more flexibility than the standard library option. Although the main thing making it "heavyweight" is that it vendors wheels for pip and Setuptools — possibly multiple of each.)
No, it doesn't. It specifically avoids the problem of environment pollution by letting you just make another environment. And it avoids the problem of interfering with the system by not getting added to sys.path by default, and not being in a place that system packages care about. PEP 668 was specifically created in cooperation between the Python team and Linux distro maintainers so that people would use the venvs instead of "globally pip installing packages".
> requires a bit of path mangling to work right, or special python configs, etc. In the past, it's also had a bad habit of leaking dependencies, though that was in some weird setups.
Genuinely no idea what you're talking about and I've been specifically studying this stuff for years.
> It's one of the reasons I would recommend against python for much of anything that needs to be "deployed" vs throw away scripts. UV seems to handle all of this much better.
If you're using uv, you're using Python.
Could you elaborate?
Source? That's an option, but it's not even explicitly mentioned in the related documentation [1].
You kind of have to read between the lines and "know" this is a good solution, and then when you see it, it's like "right, of course".
And lack of non-local venv support [2].
In my Dockerfiles I use `uv sync` to install deps vs `pip install -r requirements.txt`
And then set my command to `uv run my_command.py` vs calling Python directly.
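Roughly, that pattern looks like this (a hypothetical sketch; the base image, paths and the two-stage sync are my assumptions, with my_command.py as the entry point from above):

    FROM python:3.12-slim
    # uv publishes a tiny image you can copy the static binary from
    COPY --from=ghcr.io/astral-sh/uv:latest /uv /usr/local/bin/uv
    WORKDIR /app
    # dependency layer first, so it caches between code changes
    COPY pyproject.toml uv.lock ./
    RUN uv sync --frozen --no-install-project
    # then the application itself
    COPY . .
    RUN uv sync --frozen
    CMD ["uv", "run", "my_command.py"]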
The biggest wins are speed and a dependable lock file. Dependencies get installed ~10x faster than with pip, at least on my machine.
Both of my Docker Compose starter app examples for https://github.com/nickjj/docker-flask-example and https://github.com/nickjj/docker-django-example use uv.
I also wrote about making the switch here: https://nickjanetakis.com/blog/switching-pip-to-uv-in-a-dock...
What's the problem with that?
You just make your script's entry point be something like this:
uv venv --clear
uv sync
uv run main.py
ENV UV_SYSTEM_PYTHON=1
A virtual environment, minimally, is a folder hierarchy and a pyvenv.cfg file with a few lines of plain text. (Generally they also contain a few dozen kilobytes of activation scripts that aren't really necessary here.) If you're willing to incur the overhead of using a container image in the first place, plus the ~35 megabyte compiled uv executable, what does a venv matter?
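(For scale, the pyvenv.cfg being described is typically just a few lines like these; paths and version numbers will differ per machine:)

    home = /usr/local/bin
    include-system-site-packages = false
    version = 3.12.3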
What caused Python to go through these issues? Is there any fundamental design flaw?
UV is a step in the right direction, but legacy projects without a Dockerfile can be tricky to start.
easy_install never even made it to 1.0
Still, not bad for a bunch of mostly unpaid volunteers.
More recent languages like Node.js and Rust and Go all got to create their packaging ecosystems learning from the experiences of Perl and Python before them.
There is one part of Python that I consider a design flaw when it comes to packaging: the sys.modules global dictionary means it's not at all easy in Python to install two versions of the same package at the same time. This makes it really tricky if you have dependency A and dependency B both of which themselves require different versions of dependency C.
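A tiny illustration of the constraint, using a stdlib module as a stand-in for "dependency C":

    import sys
    import json            # stands in for "dependency C"
    import json as json2   # a second import of the same name

    print(json is json2)                # True: both names are bound to one module object
    print(sys.modules["json"] is json)  # True: the process-wide cache is keyed by name alone
    # Because the cache key is just the name, "A needs C==1.0" and "B needs C==2.0"
    # can't both be satisfied inside one interpreter.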
[0] https://learnpythonthehardway.org/book/nopython3.html#the-py...
He’s like that uncle you see at family gatherings whom you nod along politely to.
All the languages of today gain their improvements from the same three rules:
1. Nothing should be global, but if it is it's only a cache (and caches are safe to delete since they're only used as a performance optimization)
2. You have to have extremely explicit artifact versioning, which means everything needs checksums, which means mostly reproducible builds
3. The "blessed way" is to distribute the source (or a mostly-source dist) and compile things in; the happy path is not distributing pre-computed binaries
Now, everything I just said above is also wrong in many aspects or there's support for breaking any and all of the rules I just outlined, but in general, everything's built to adhere to those 3 rules nowadays. And what's crazy is that for many decades, those three rules above were considered absolutely impossible, or anti-patterns, or annoying, or a waste, etc (not without reason, but still we couldn't do it). That's what made package managers and package management so awful. That's why it was even possible to break things with `sudo pip install` vs `apt install`.
Now that we've abandoned the old ways in e.g. JS/Rust/Go and adopted the three rules, all kinds of delightful side effects fall out. Tools now which re-build a full dependency tree on-disk in the project directory are the norm (it's done automatically! No annoying bits! No special flags! No manual venv!). Getting serious about checksums for artifacts means we can do proper versioning, which means we can do aggressive caching of dependencies across different projects safely, which means we don't have to _actually_ have 20 copies of every dependency, one for each repo. It all comes from the slow distributed Gentoo/FreeBSD-ification of everything and it's great!
All of this happened alongside the rise of GitHub and free CI builders, it becoming trivial to depend on lots of other packages of unknown provenance, and stdlib packages being completely sidelined by stuff like requests.
It’s really only in the last ten years or so that there’s been the clarity of what is a build backend vs frontend, what a lock file is and how workspace management fits into the whole picture. Distutils and setuptools are in there too.
Basically, Python’s packaging has been a mess for a long time, but uv getting almost everything right all of a sudden isn’t an accident; it’s an abrupt gelling of ideas that have been in progress for two decades.
Please don't use this. You need to be careful about how you place any secondary installation of Python on Ubuntu. Meanwhile, it's easy to build from source on Ubuntu and you can easily control its destination this way (by setting a prefix when you ./configure, and using make altinstall) and keep it out of Apt's way.
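The shape of that, with the prefix as an arbitrary example (install the usual build deps first, e.g. libssl-dev, zlib1g-dev, libffi-dev, libsqlite3-dev, or the corresponding optional stdlib modules will be silently skipped):

    ./configure --prefix=/opt/python3.13
    make -j"$(nproc)"
    sudo make altinstall   # altinstall avoids overwriting the system `python3`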
> and venvs, plus the ongoing awkwardness about whether pip should be writing stuff into /usr/local or ~/.local or something else.
There is not really anything like this. You just use venvs now, which should have already been the rule since 3.3. If you need to put the package in the system environment, use an Apt package for that. If there isn't an Apt package for what you want, it shouldn't live in the system environment and also shouldn't live in your "user" site-packages — because that can still cause problems for system tools written in Python, including Apt.
You only need to think about venvs as the destination, and venvs are easy to understand (and are also fundamental to how uv works). Start with https://chriswarrick.com/blog/2018/09/04/python-virtual-envi... .
> It’s really only in the last ten years or so that there’s been the clarity of what is a build backend vs frontend
Well no; it's in that time that the idea of separating a backend and frontend emerged. Before that, it was assumed that Setuptools could just do everything. But it really couldn't, and it also led to people distributing source packages for pure-Python projects, resulting in installation doing a ton of ultimately useless work. And now that Setuptools is supposed to be focused on providing a build backend, it's mostly dead code in that workflow, but they still can't get rid of it for backwards compatibility reasons.
(Incidentally, uv's provided backend only supports pure Python — they're currently recommending heavyweight tools like maturin and scikit-build-core if you need to compile something. Although in principle you can use Setuptools if you want.)
Yes, the point of my post wasn’t to give current best practice counsel but rather to illustrate how much it has changed over the years as the needs and desires of the maintainers, distro people, developers, and broader community have evolved.
word of warning: I spent a lot of years working off of "built from source" Python on Ubuntu, and every once in a while I'd hit really awkward issues downstream of not realizing I was missing some system library when I built Python, so some random standard-library module was just missing for me.
I think it's all generally good, but real easy to miss optional package stuff.
2. There is tons of code in the Python ecosystem not written in Python. One of the most popular packages, NumPy, depends on dozens of megabytes of statically compiled C and Fortran code.
3. Age again; things were designed in an era before the modern conception of a "software ecosystem", so there was nobody imagining that one day you'd be automatically fetching all the transitive dependencies and trying to build them locally, perhaps using build systems that you'd also fetch automatically.
4. GvR didn't seem to appreciate the problem fully in the early 2010s, which is where Conda came from.
5. Age again. Old designs overlooked some security issues and bootstrapping issues (this ties into all the previous points); in particular, it was (and still is) accepted that because you can include code in any language and all sorts of weird build processes, the "build the package locally" machinery needs to run arbitrary code. But that same system was then considered acceptable for pure-Python packages for many years, and the arbitrary code was even used to define metadata. And in that code, you were expected to be able to use some functionality provided by a build system written in Python, e.g. in order to locate and operate a compiler. Which then caused bootstrapping problems, because you couldn't assume that your users had a compatible version of the main build system (Setuptools) installed, and it had to be installed in the same environment as the target for package installation. So you also didn't get build isolation, etc. It was a giant mess.
5a. So they invented a system (using pyproject.toml) that would address all those problems, and also allow for competition from other build back-ends. But the other build back-end authors mostly wanted to make all-in-one tools (like Poetry, and now, er, uv); and meanwhile it was important to keep compatibility, so a bunch of defaults were chosen that enabled legacy behaviour — and ended up giving old packages little to no reason to fix anything. Oh, and also they released the specification for the "choose the build back-end system" and "here's how installers and build back-ends communicate" years before the specification for "human-friendly input for the package metadata system".
Given that, plus the breadth and complexity of its ecosystem, it makes sense that its tooling is also complex.
Funny thing is that decision was for modularity, but uv didn't even reuse pip.
To be fair, that's justified by pip's overall lack of good design. Which in turn is justified by its long, organic development (I'm not trying to slight the maintainers here).
But I'm making modular pieces that I hope will showcase the original idea properly. Starting with an installer, PAPER, and build backend, bbbb. These work together with `build` and `twine` (already provided by PyPA) to do the important core tasks of packaging and distribution. I'm not trying to make a "project manager", but I do plan to support PEP 751 lockfiles.
Rarely do I need a different version of Python; in case I do, I either let the IDE take care of it or just use pyenv.
I know there's the argument of being fast with uv, but most of the time, the actual downloading is the slowest part.
I'm not sure how big a project has to be before pip feels slow to me.
Currently, I have a project with around 50 direct dependencies and everything is installed in less than a min with a fresh venv and without pip cache.
Also, if I ever, ever need lock file stuff, I use pipx. I've never needed the hashes of the packages the way it's done in package-lock.json.
Maybe, I'm just not the target audience of uv.
I noticed the comment from andy99 got several downvotes (became grey) and mine here also immediately got some.
I couldn’t care less that it’s written in rust. It could be conjured from malbolge for all I care. It works as advertised, whatever it’s written in.
While I like the idea of pip or uv being insanely fast, I still don't see it revolutionizing my development experience.
Installing and uninstalling packages is not something I do every 1 to 10 minutes, so it doesn't save me much time. Also, activating a venv happens once per terminal session, and sometimes a week goes by without my ever activating one, because the IDE does that automatically for whatever I do.
That's why, personally for me it really doesn't change much.
Where I do like things being fast in my development workflow is pre-commit and linting, where ruff shines. But even that I don't use: even though I work on a small-to-medium 600k LoC project, I only pass the changed files to isort, flake8 and black, and it's all done in less than 5 seconds.
To me, the only advantage of uv is being fast, which is something that hasn't bothered me so far, since 99% of things happen in less than one second or at most a couple of seconds.
Where things get annoying is when I push to GitHub and Tox runs through GitHub Actions. I've set up parallel runs for each Python version, but the "Prepare Tox" step (which is where Python packages are downloaded & installed) can take up to 3 minutes, where the "Run Tox" step (which is where pytest runs) takes 1½ minutes.
GitHub Actions has a much better network connection than me, but the free worker VMs are much slower. That is where I would look at making a change, continuing to use pip locally but using uv in GitHub Actions.
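A sketch of what that switch could look like in the workflow file (the action name/version and the tox-uv plugin are assumptions to double-check, not something from this thread):

    # .github/workflows/test.yml (fragment)
    - uses: astral-sh/setup-uv@v5      # installs uv and can cache its downloads
      with:
        enable-cache: true
    - run: uv tool install tox --with tox-uv   # tox-uv makes tox build its envs with uv
    - run: tox -p auto                         # same parallel tox runs as before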
If your project requires creating an env and switching to it and then running, it's a bad program and you should feel bad.
Quite frankly the fact that Python requires explaining and understanding a virtual environment is an embarrassing failure.
uv run foo.py
I never ever want running any python program to ever require more than that. And it better work first time 100%. No missing dependencies errors are ever permitted.
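For what it's worth, uv already supports this for single-file scripts via PEP 723 inline metadata: declare the dependencies in a comment block and `uv run foo.py` resolves them on the fly. A sketch, with requests as an arbitrary example dependency:

    # foo.py
    # /// script
    # requires-python = ">=3.12"
    # dependencies = ["requests"]
    # ///
    import requests

    print(requests.get("https://example.com").status_code)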
Also, Conda can fucking die in a fire. I will never ever ever install conda or mini-conda onto my system ever again. Keep those abominations away.
1. Write code that crosses a certain complexity threshold. Let's say you also need compiled wheels for a performance-critical section of a library written in Rust, and have some non-public dependencies on a company-internal git server
2. Try deploying said code on a fleet of servers whose exact operating system versions (and Python versions!) are totally out of your control. Bonus points for when your users need to install it themselves
3. Wait for the people to contact you
4. Now do monthly updates on their servers while updating dependencies for your python program
If that was never your situation, congrats on your luck, but it just means you really weren't in a situation where the strengths of uv come into play. I had to wrestle with this for years.
This is where uv shines. Install uv, run with uv. Everything else just works, including getting the correct python binary, downloading the correct wheel, downloading dependencies from the non-public git repo (provided the access has been given), ensuring the updates go fine, etc.
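Concretely, the on-server side collapses to something like this (paths, the myapp name and entry point are placeholders, not from the comment above):

    # the standalone installer is the only prerequisite
    curl -LsSf https://astral.sh/uv/install.sh | sh
    cd /opt/myapp
    uv sync --frozen    # pulls the locked deps (and a managed Python if needed),
                        # including git-hosted packages, exactly as pinned in uv.lock
    uv run myapp        # run the project's entry point inside that environment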
While I understand that some have acclimated well to the prior situation and see no need to change their methods, is there really no objective self-awareness that perhaps having one fast tool over many tools may be objectively better?
`uv install` = `uv sync`
`uv install rich` = `uv add rich`
bognition•2h ago
Python dependency management and environments have been a pain for 15 years. Poetry was nice but slow and sometimes difficult.
Uv is lightning fast and damn easy to use. It’s so functional and simple.
bognition•1h ago
Then I gave it a try and it just worked! It’s so much better that I immediately moved all my Python projects to it.
zahlman•1h ago
Pip, venv and virtualenvwrapper (people still use this?) are not meaningfully "dependency managers". A venv is just a place to put things, and pip does only basic tracking and tries to maintain a consistent environment. It isn't trying to help you figure out what dependencies you need, create new environments from scratch, update pyproject.toml....
Pip's core capability is the actual installation of packages, and uv does a far better job of that part, using smarter caching, hard links to share files, parallelized pre-compilation of .pyc files, etc. Basically it's designed from the ground up with the intention to make lots of environments and expect starting a new one to be cheap. Poetry, as far as I was able to determine, does it basically the same way as pip.
WD-42•1h ago
Poetry, which I think is the closest analogue, still requires a [tool.poetry.dependencies] section afaik.
greenavocado•1h ago
If you inherit a codebase made this way from someone else, merely running uv run program.py will automatically create and activate the venv, install the packages, and run your script, seamlessly on first launch.
Uv lets you almost forget virtual environments exist. Almost.
kstrauser•35m ago
For everyone else, just try uv and don’t look back.
rtpg•1h ago
It's probably worth mentioning that Astral (The team behind uv/etc) has a team filled with people with a history of making very good CLI tooling. They probably have a very good sense for what matters in this stuff, and are thus avoiding a lot of pain.
Motivation is not enough, there's also a skill factor. And being multiple people working on it "full time"-ish means you can get so much done, especially before the backwards compat issues really start falling into place
lukeschlather•1h ago
It sounds like uv is a drop-in replacement for pip, pipx, and poetry with all of their benefits and none of the downsides, so I don't see why I wouldn't migrate to it overnight.
andy99•1h ago
I’ve tried uv a couple places where it’s been forced on me, and it didn’t work for whatever reason. I know that’s anecdotal and I’m sure it mostly works, but it was obviously off-putting. For better or worse I know how to use conda, and despite having no special attachment to it, slightly faster with a whole different set of rough edges is not at all compelling.
I have a feeling this is some kind of Rust fan thing and that’s where the push comes from, to try and insinuate it into more people’s workflows.
I’d like to hear a real reason I would ever migrate to it, and honestly if there isn’t one, am super annoyed about having it forced on me.
morshu9001•1h ago
The speed usually doesn't matter, but one time I did have to use it to automatically figure out compatible deps in a preexisting project, because the pip equivalent with backtracking was taking forever with the CPU pegged at 100%.
eslaught•1h ago
1. If you edit any dependency, you have to resolve the environment from scratch. There is no way to update just one dependency.
2. Conda "lock" files are just the hashes of the all the packages you happened to get, and that means they're non-portable. If you move from x86 to ARM, or Mac to Linux, or CPU to GPU, you have to throw everything out and resolve.
Point (2) has an additional hidden cost: unless you go massively out of your way, all your platforms can end up on different versions. That's because solving every environment is a manual process and it's unlikely you're taking the time to run through 6+ different options all at once. So if different users solve the environments on different days from the same human-readable environment file, there's no reason to expect them to be in sync. They'll slowly diverge over time and you'll start to see breakage because the versions diverge.
P.S. if you do want a "uv for Conda packages", see Pixi [1], which has a lot of the benefits of uv (e.g., lock files) but works out of the box with Conda's package ecosystem.
[1]: https://pixi.sh/latest/
simonw•1h ago
uv uses some very neat tricks involving hard links such that if you start a new uv-managed virtual environment and install packages into it that you've used previously, the packages are hard-linked in from uv's cache. This means the new environment becomes usable almost instantly and you don't end up wasting filesystem space on a bunch of duplicate files.
This means it's no longer expensive to have dozens, hundreds or even thousands of environments on a machine. This is fantastic for people like myself who work on a lot of different projects at once.
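One way to convince yourself of the sharing on Linux (assuming uv's default hardlink mode; on APFS it uses copy-on-write clones instead, and the paths here are just examples):

    stat -c '%i' projectA/.venv/lib/python3.12/site-packages/requests/__init__.py
    stat -c '%i' projectB/.venv/lib/python3.12/site-packages/requests/__init__.py
    # identical inode numbers mean one copy of the file on disk, shared by both venvs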
Then you can use "uv run" to run Python code in a brand new temporary environment that gets created on-demand within ms of you launching it.
I wrote a Bash script the other day that lets me do this in any Python project directory that includes a setup.py or pyproject.toml file:
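(A rough approximation of the idea, not the exact script, using uv run's --python / --with / --with-editable flags:)

    #!/bin/bash
    # uv-test: run pytest under a chosen Python, with the current project installed
    # into a throwaway environment. Usage: uv-test 3.11 [extra pytest args]
    uv run --python "${1:-3.11}" --with pytest --with-editable . pytest "${@:2}"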
That will run pytest with Python 3.11 (or 3.12/3.13/3.14/whatever version you like) against the current project, in a fresh isolated environment, without any risk of conflicting with anything else. And it's fast - the overhead of that environment setup is negligible. Which means I can test any code I like against different Python versions without any extra steps.
https://til.simonwillison.net/python/uv-tests
Alir3z4•57m ago
On my machine, there are like 100s if not thousands of venvs.
I simply have all of them under ~/.python_venvs/<project_name>/
Does that mean that no matter how many projects I install pytorch and tensorflow and huggingface and all the heavy machinery into, they'll only be stored once as long as the versions are identical?
If that's the case, then I can leave my habit of pip and move to uv.
This is something that always bugged my mind about virtual environments in almost all the package managers.
simonw•29m ago
I think so, based on my understanding of how this all works. You may end up with different copies for different Python versions, but it should still save you a ton of space.
testdelacc1•47m ago
> could care less
I think “couldn’t care less” works better.
zbentley•4m ago
When I first started using uv, I did not know what language it was written in; it was a good tool which worked far better than its predecessors (and I used pdm/pipenv/pyenv/etc. pretty heavily and in non-basic ways). I still don’t particularly care if it’s written in Rust or Brainfuck, it works well. Rust is just a way to get to “don’t bootstrap Python environments in Python or shell”.
> I’ve tried uv a couple places where it’s been forced on me, and it didn’t work for whatever reason.
I’m curious what issues you encountered. Were these bugs/failures of uv, issues using it in a specific environment, or workflow patterns that it didn’t support? Or something else entirely?
ziml77•1h ago
I switched to using uv just 2 weeks ago. Previously I had been dealing with maintaining a ton of batch jobs that used: global packages (yes, sudo pip install), manually managed virtualenvs, and docker containers.
uv beats all of them easily. Automatically handling the virtualenv means running a project that uses uv feels as easy as invoking the system Python.
atoav•47m ago
Absolute no-brainer.
walkabout•45m ago
Or is the superior replacement actually up to the job this time?
kstrauser•38m ago
I just found out they’re still making pipenv. Yes, if you’re using pipenv, I’m confident that uv will be a better experience in every way, except maybe “I like using pipenv so I can take long coffee breaks every time I run it”.