No, the same uv that people have been regularly (https://hn.algolia.com/?q=uv) posting about on HN since its first public releases in February of 2024 (see e.g. https://news.ycombinator.com/item?id=39387641).
> How many are there now?
Why is this a problem? The ecosystem has developed usable interoperable standards (for example, fundamentally uv manages isolated environments by using the same kind of virtual environment created by the standard library — because that's the only kind that Python cares about; the key component is the `pyvenv.cfg` file, and Python is hard-coded to look for and use that); and you don't have to learn or use more than one.
There are competing options because people have different ideas about what a "package manager" should or shouldn't be responsible for, and about the expectations for those tasks.
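To see the "only one kind of environment" point concretely, here is a minimal sketch you can run inside any venv (the printed paths will obviously differ on your machine):
import sys
# Inside any virtual environment (created by venv, virtualenv, or uv),
# sys.prefix points at the venv directory (the one containing pyvenv.cfg),
# while sys.base_prefix still points at the underlying Python installation.
print("in a venv:", sys.prefix != sys.base_prefix)
print("environment:", sys.prefix)
print("base interpreter:", sys.base_prefix)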
I don't really get it; uv solves all these problems I've never encountered. Just making a venv and using it seems to work fine.
I don't love that UV is basically tied to a for profit company, Astral. I think such core tooling should be tied to the PSF, but that's a minor point. It's partially the issue I have with Conda too.
Edit: or was it ruff? Either way. I thought they created the tools first, then the company.
I just... build from source and make virtual environments based off them as necessary. Although I don't really understand why you'd want to keep older patch versions around. (The Windows installers don't even accommodate that, IIRC.) And I can't say I've noticed any of those "significant improvements and differences" between patch versions ever mattering to my own projects.
> I don't love that UV is basically tied to a for profit company, Astral. I think such core tooling should be tied to the PSF, but that's a minor point. It's partially the issue I have with Conda too.
In my book, the less under the PSF's control, the better. The meager funding they do receive now is mostly directed towards making PyCon happen (the main one; others like PyCon Africa get a pittance) and to certain grants, and to a short list of paid staff who are generally speaking board members and other decision makers and not the people actually developing Python. Even without considering "politics" (cf. the latest news turning down a grant for ideological reasons) I consider this gross mismanagement.
The PSF is busy with social issues and doesn't concern itself with trivia like this.
Wonderful project
I'm interested if you have any technical documentation about how conda environments are structured. It would be nice to be able to interact with them. But I suspect the main problem is that if you use a non-conda tool to put something into a conda environment, there needs to be a way to make conda properly aware of the change. Fundamentally it's the same issue as with trying to use pip in the system environment on Linux, which will interfere with the system package manager (leading to the PEP 668 protections).
uv has implemented experimental support, which they announced here [3].
[0] https://wheelnext.dev/proposals/pepxxx_wheel_variant_support...
[1] https://us.pycon.org/2025/schedule/presentation/100/
Or by asyncio.
The "install things that have complex non-Python dependencies using pip" story is much better than several years ago, because of things like pip gaining a new resolver in 2020, but in large part simply because it's now much more likely that the package you want offers a pre-built wheel (and that its dependencies also do). A decade ago, it was common enough that you'd be stuck with source packages even for pure-Python projects, which forced pip to build a wheel locally first (https://pradyunsg.me/blog/2022/12/31/wheels-are-faster-pure-...).
Another important change is that for wheels on PyPI the installer can now obtain separate .metadata files, so it can learn what the transitive dependencies are for a given version of a given project from a small plain-text file rather than having to speculatively download the entire wheel and unpack the METADATA file from it. (This is also possible for source distributions that include PKG-INFO, but they aren't forced to do so, and a source distribution's metadata is allowed to have "dynamic" dependencies that aren't known until the wheel is built (worst case) or a special metadata-only build hook is run (requires additional effort for the build system to support and the developer to implement)).
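For the curious, here is a minimal sketch (not what pip or uv literally do) of fetching one of those standalone metadata files via PyPI's JSON Simple API (PEP 691); the "core-metadata" key and the ".metadata" URL suffix come from PEP 714 and PEP 658 respectively, and "requests" is just an arbitrary example project:
import json
import urllib.request

# Ask the index for its file listing in the JSON format (PEP 691).
req = urllib.request.Request(
    "https://pypi.org/simple/requests/",
    headers={"Accept": "application/vnd.pypi.simple.v1+json"},
)
listing = json.load(urllib.request.urlopen(req))

for f in listing["files"]:
    # Files that advertise "core-metadata" have their METADATA published
    # separately at <file URL> + ".metadata" (PEP 658), so a resolver can
    # read the dependency list without downloading the wheel itself.
    if f.get("core-metadata"):
        metadata = urllib.request.urlopen(f["url"] + ".metadata").read().decode()
        print(f["filename"])
        print(metadata.splitlines()[0])  # first line is "Metadata-Version: ..."
        break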
I'm still mostly on poetry
Currently they are a bit pointless. Sure, they aid documentation, but they take effort and cause you pain when making modifications (mind you, with half-arsed agentic coding it's probably less of a problem).
What would be better is a strict mode where, instead of duck typing, it's pre-declared. It would also make a bunch of things faster (along with breaking everything and the spirit of the language).
I still don't get the appeal of UV, but that's possibly because I'm old and have been using pyenv and venv for many, many years. This means that anything new is an attack on my very being.
However, if it means that conda fucks off and dies, then I'm willing to move to UV.
I've been using it professionally and it's been a big improvement for code quality.
/just guessing, haven't tried it
Maybe if you trust the software, then trusting the install script isn't that big of a stretch?
Also, many of the "distribution" tools like brew, scoop, winget, and more are just "PR a YAML file with your zip file URL, the name of your EXE to add to PATH, and a checksum hash of the zip to this git repository". We're at about the minimum effort ever needed to produce a "distribution" in software history, so it's interesting that shell scripts to install things seem to have picked up instead.
There have also been PoCs on serving malicious content only when piped to sh rather than saved to file.
If you want to execute shell code from the internet, at the very least save it to a file first, and put that file somewhere persistent before executing it. It will make forensics easier.
Versioning OTOH is often more problematic with distro package managers that can't support multiple versions of the same package.
Also inability to do user install is a big problem with distro managers.
Also, most reasonable developers should already be running with the ExecutionPolicy RemoteSigned; it would be nice if code-signing these install scripts were a little more common, too. (There was even a proposal for icm [Invoke-Command] to take signed script URLs directly, as a much safer, code-golfed alternative to iwr|iex. Maybe that proposal should be picked back up.)
No. That's how you get malware. Make a package. Add it to a distro. Then we will talk.
#!/usr/bin/env -S uv run --script
# /// script
# requires-python = ">=3.11"
# dependencies = [ "modules", "here" ]
# ///
The script now works like a standalone executable, and uv will magically install and use the specified modules.
This isn't a knock against UV, but more a criticism of dynamic dependency resolution. I'd feel much better about this if UV had a way to whitelist specific dependencies/dependency versions.
If you don't care about being ecosystem-compliant (and I am sure malware does not), it's only a few lines of Python to download the code and eval it.
import shutil
shutil.rmtree('/')
aren't in the file, so I don't need to read the code. I only read the code when there are dependencies. This is because I have my security hat on that sorted me into the retard house.
As long as you have internet access, and whatever repository it's drawing from is online, and you may get a different version of Python each time, ...
But, yes, python scripts with in-script dependencies plus uv to run them doesn't change dependency distribution, just streamlines use compared to manual setup of a venv per script.
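To make that concrete, here's a complete, hypothetical example of such a script (the dependency choice is arbitrary); save it, mark it executable, and either run it directly or via `uv run hello.py`:
#!/usr/bin/env -S uv run --script
# /// script
# requires-python = ">=3.11"
# dependencies = ["rich"]
# ///
# On first run, uv resolves "rich", builds a cached environment for the
# script, and runs the script inside it; later runs reuse the cache.
from rich import print

print("[bold green]hello from a self-contained script[/bold green]")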
The man page tells me:
-S, --split-string=S
process and split S into separate arguments; used to pass multiple arguments on shebang lines
Without that, the system may try to treat the entirety of "uv run --script" as the program name, and fail to find it. Depending on your env implementation and/or your shell, this may not be needed.
-S causes the string to be split on spaces and so the arguments are passed correctly.
It will install and use distribution packages, to use PyPA's terminology; the term "module" generally refers to a component of an import package. Which is to say: the names you write here must be the names that you would use in a `uv pip install` command, not the names you `import` in the code, although they may align.
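A hypothetical inline-metadata snippet where the two names differ, using Pillow as the example distribution:
# /// script
# requires-python = ">=3.11"
# dependencies = ["pillow"]   # the distribution name, as you'd give to `uv pip install`
# ///
from PIL import Image         # the import package that distribution provides

print(Image.new("RGB", (16, 16)).size)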
This is an ecosystem standard (https://peps.python.org/pep-0723/) and pipx (https://pipx.pypa.io) also supports it.
Linux coreutils has supported this since 2018 (coreutils 8.30); amusingly, it is the same release that added `cp --reflink`. AFAIK you have to opt out by having `POSIXLY_CORRECT=1` or `POSIX_ME_HARDER=1` or `--pedantic` set in your environment. [1]
freebsd core utils have supported this since 2008
MacOS has basically always supported this.
---
1. Amusingly, despite `POSIX_ME_HARDER` not being official, a large swath of coreutils supports it. https://www.gnu.org/prep/standards/html_node/Non_002dGNU-Sta...
I want to be able to ship a bundle which needs zero network access to run, but will run.
It is still frustratingly difficult to make portable Python programs.
UV means getting more strings attached with VC funded companies and leaning on their infrastructure. This is a high risk for any FOSS community and history tells us how this ends….
uv is MIT licensed so if they rug pull, you can fork.
1. It tries to do too many things. Please just do one thing and do it well. It's simultaneously trying to replace pip, pyenv, virtualenv, and ruff in one command.
2. You end up needing to use `uv pip` so it's not even a full replacement for pip.
3. It does not play well with Docker.
4. It adds more complexity. You end up needing to understand all of these new environmental variables: `UV_TOOL_BIN_DIR`, `UV_SYSTEM_PYTHON`, `UV_LINK_MODE`, etc.
- uv add <package_name>
- uv sync
- uv run <command>
Feels very ergonomic, I don't need to think much, and it's so much faster.
pip and virtualenv also add a ton of complexity and when they break (which happens quite often) debugging it is even harder despite them being "battle tested" tools.
Yep: Nix
It's the same sort of deal with pyenv--the Python version is itself a dependency of most libraries, so it's a little silly to have a dependency manager that only manages some dependencies.
`virtualenv` is a heavy-duty third-party library that adds functionality to the standard library venv. Or rather, venv was created as a subset of virtualenv in Python 3.3, and the projects have diverged since.
The standard library `venv` provides "obvious thing that a dependency manager does" functionality, so that every dependency manager has the opportunity to use it, and so that developers can also choose to work at a lower level. And the virtual-environment standard needs to exist so that Python can know about the pool of dependencies thus stored. Otherwise you would be forced to... depend on the dependency manager to start Python and tell it where its dependency pool is.
Fundamentally, the only things a venv needs are the `pyvenv.cfg` config file, the appropriate folder hierarchy, and some symlinks to Python (stub executables on Windows). All it's doing is providing a place for that "pool of dependencies" to exist, and providing configuration info so that Python can understand the dependency path at startup. The venvs created by the standard library module — and by uv — also provide "activation" scripts to manipulate some environment variables for ease of use; but these are completely unnecessary to making the system work.
Fundamentally, tools like uv create the same kind of virtual environment that the standard library does — because there is only one kind. Uv doesn't bootstrap pip into its environments (since that's slow and would be pointless), but you can equally well disable that with the standard library: `python -m venv --without-pip`.
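If you want to see that for yourself, a minimal sketch using only the standard library (the target directory name is arbitrary):
import venv
from pathlib import Path

target = Path("demo-venv")
# Equivalent in spirit to `python -m venv --without-pip demo-venv`:
# a folder hierarchy, interpreter symlinks/stubs, and a pyvenv.cfg.
venv.create(target, with_pip=False)
print((target / "pyvenv.cfg").read_text())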
> the Python version is itself a dependency of most libraries
This is a strange way of thinking about it IMO. If you're trying to obtain Python libraries, it's normally because you already have Python, and want to obtain libraries that are compatible with the Python you already have, so that you can write Python code that uses the libraries and works under that Python.
If you're trying to solve the problem of deploying an application to people who don't have Python (or to people who don't understand what Python is), you need another layer of wrapping anyway. You aren't going to get end users to install uv first.
The alternative, of course, is having Python natively support a combined tool. Which you can support while also not liking `uv` for the above reason.
In my experience it generally does all of those well. Are you running into issues with the uv replacements?
> 2. You end up needing to use `uv pip` so it's not even a full replacement for pip.
What do you end up needing to use `uv pip` for?
Needing pip and virtualenvs was enough to make me realize uv wasn't what I was looking for. If I still need to manage virtualenvs and call pip I'm just going to do so with both of these directly.
I had been hoping someone would introduce the non-virtualenv package management solution that every single other language has where there's a dependency list and version requirements (including of the language itself) in a manifest file (go.mod, package.json, etc) and everything happens in the context of that directory alone without shell shenanigans.
Isn't that exactly a pyproject.toml via the uv add/sync/run interface? What is that missing that you need?
Ah ok I was missing this and this does sound like what I was expecting. Thank you!
If you are using uv, you don’t need to do shell shenanigans, you just use uv run. So I'm not sure how uv with pyproject.toml doesn't meet this description (yes, the venv is still there, it is used exactly as you describe.)
I disagree with this principle. Sometimes what I need is a kitset. I don't want to go shopping for things, or browse multiple docs. I just want it taken care of for me. I don't use uv so I don't know if the pieces fit together well but the kitset can work well and so can a la carte.
I think there are more cases where pip, pyenv, and virtualenv are used together than not. It makes sense to bundle the features of the three into one. uv does not replace ruff.
> 2. You end up needing to use `uv pip` so it's not even a full replacement for pip.
uv pip is there for compatibility and to facilitate migration, but once you are fully on the uv workflow you rarely need `uv pip`, if ever.
> 3. It does not play well with Docker.
In what sense?
> 4. It adds more complexity. You end up needing to understand all of these new environmental variables: `UV_TOOL_BIN_DIR`, `UV_SYSTEM_PYTHON`, `UV_LINK_MODE`, etc.
You don't need to touch them at all
uv doesn’t try to replace ruff.
> You end up needing to use `uv pip` so it's not even a full replacement for pip.
"uv pip" doesn't use pip, it provides a low-level pip-compatible interface for uv, so it is, in fact, still uv replacing pip, with the speed and other advantages of uv when using that interface.
Also, while I’ve used uv pip and uv venv as part of familiarizing myself with the tool, I’ve never run into a situation where I need either of those low-level interfaces rather than the normal high-level interface.
> It does not play well with Docker.
How so?
No you don't. That's just a set of compatibility approaches for people who can't let go of pip/venv. Move to uv/PEP723, world's your oyster.
> It does not play well with Docker.
Huh? I use uv both during container build and container runtime, and it works just fine?
> You end up needing to understand all of these new environmental variables
Not encountered the need for any of these yet. Your comments on uv are so far out of line of all the uses I've seen, I'd love to hear what you're specifically doing that these become breaking points.
I'm using uv in two dozen containers with no issues at all. So not sure what you mean that it doesn't play well with Docker.
Happened to buy a new machine and decided to jump in the deep end and it's been glorious. I think the difference from your comment (and others in this chain) and my experience is that you're trying to make uv fit how you have done things. Jumping all the way in, I just . . . never needed virtualenvs. Don't really think about them once I sorted out a mistake I was making. uv init and you're pretty much there.
>You end up needing to use `uv pip` so it's not even a full replacement for pip
The only time I've used uv pip is on a project at work that isn't a uv-powered project. uv add should be doing what you need and it really fights you if you're trying to add something to global because it assumes that's an accident, which it probably is (but you can drop back to uv pip for that).
>`UV_TOOL_BIN_DIR`, `UV_SYSTEM_PYTHON`, `UV_LINK_MODE`, etc.
I've been using it for six months and didn't know those existed. I would suggest this is a symptom of trying to make it be what you're used to. I would also gently suggest those of us who have decades of Python experience may have a bit of Stockholm Syndrome around package management, packaging, etc.
The uv docs even have a whole page dedicated to Docker; you should definitely check that out if you haven't already: https://docs.astral.sh/uv/guides/integration/docker/
I also like how you can manage Python versions very easily with it. Everything feels very "batteries-included" and yet local to the project.
I still haven't used it long enough to tell whether it avoids the inevitable bi-yearly "debug a Python environment day" but it's shown enough promise to adopt it as a standard in all my new projects.
You can also prepend the path to the virtual environment's bin/ (or Scripts/ on Windows). Literally all that "activating an environment" does is to manipulate a few environment variables. Generally, it puts the aforementioned directory on the path, sets $VIRTUAL_ENV to the venv root, configures the prompt (on my system that means modifying $PS1) as a reminder, and sets up whatever's necessary to undo the changes (on my system that means defining a "deactivate" function; others may have a separate explicit script for that).
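Expressed in Python for the current process and its children (paths are hypothetical), the whole of "activation" is roughly:
import os

venv = "/home/me/project/.venv"          # wherever your venv lives
bin_dir = os.path.join(venv, "bin")      # "Scripts" on Windows

os.environ["VIRTUAL_ENV"] = venv
os.environ["PATH"] = bin_dir + os.pathsep + os.environ["PATH"]
# The shell script additionally tweaks the prompt and defines `deactivate`
# to undo these changes; nothing else is going on.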
I personally don't like the automatic detection of venvs, or the pressure to put them in a specific place relative to the project root.
> I also like how you can manage Python versions very easily with it.
I still don't understand why people value this so highly, but so it goes.
> the inevitable bi-yearly "debug a Python environment day"
If you're getting this because you have venvs based off the system Python and you upgrade the system Python, then no, uv can't do anything about that. Venvs aren't really designed to be relocated or to have their underlying Python modified. But uv will make it much faster to re-create the environment, and most likely that will be the practical solution for you.
However, I also think many people, even many programmers, basically consider such external state "too confusing" and also don't know how they'd debug such a thing. Which I think is a shame since once you see that it's pretty simple it becomes a tool you can use everywhere. But given that people DON'T want to debug such, I can understand them liking a tool like uv.
I do think automatic compiler/interpreter version management is a pretty killer feature though; otherwise that's typically really annoying, AFAICT, mostly because non-system-wide installs typically seem to require compiling Python yourself.
``uv`` accomplishes the same thing, but it is another dependency you need to install. In some envs it's nice that you can do everything with the built-in Python tooling.
How does the rest of the world manage to survive without venvs? Config files in the directory. Shocking, really :-)))
not that it's great to start with, but it does happen, no?
source - why are we using an OS level command to activate a programming language's environment
.venv - why is this hidden anyway, doesn't that just make it more confusing for people coming to the language
activate - why is this the most generic name possible as if no other element in a system might need to be called the activate command over something as far down the chain as a python environment
It feels dirty every time I've had to type it out, and I find it particularly annoying that Python is pushed so much as a good first language yet I see people paid at a senior level who don't understand this command.
Because "activating an environment" means setting environment variables in the parent process (the shell that you use to run the command), which is otherwise impossible on Linux (see for example https://stackoverflow.com/questions/6943208).
> why is this hidden anyway, doesn't that just make it more confusing for people coming to the language
It doesn't have to be. You can call it anything you want, hidden or not, and you can put it anywhere in the filesystem. It so happens that many people adopted this convention because they liked having the venv in that location and hidden; and uv gives such venvs special handling (discovering and using them by default).
> why is this the most generic name possible as if no other element in a system might need to be called the activate command over something as far down the chain as a python environment
Because the entire point is that, when you need to activate the environment, the folder in question is not on the path (the purpose of the script is to put it on the path!).
If activating virtual environments shadows e.g. /usr/bin/activate on your system (because the added path will be earlier in $PATH), you can still access that with a full absolute path; or you can forgo activation and do things like `.venv/bin/python -m foo`, `.venv/bin/my-program-wrapper`, etc.
> Feels dirty every time I've had to type it out
I use this:
$ type activate-local
activate-local is aliased to `source .local/.venv/bin/activate'
Notice that, again, you don't have to put it at .venv. I use a .local folder to store notes that I don't want to publish in my repo nor mention in my project's .gitignore; it in turn has
$ cat .local/.gitignore
# Anything found in this subdirectory will be ignored by Git.
# This is a convenient place to put unversioned files relevant to your
# working copy, without leaving any trace in the commit history.
*
> and I see people paid at a senior level not understand this command.
If you know anyone who's hiring....
The problem is, that would require support from the Python runtime itself (so that `sys.path` can be properly configured at startup) and it would have to be done in a way that doesn't degrade the experience for people who aren't using a proper "project" setup.
One of the big selling points of Python is that you can just create a .py file anywhere, willy-nilly, and execute the code with a Python interpreter, just as you would with e.g. a Bash script. And that you can incrementally build up from there, as you start out learning programming, to get a sense of importing files, and then creating meaningful "projects", and then thinking about packaging and distribution.
So far it seems like they have a bunch of these high performance tools. Is this part of an upcoming product suite for python or something? Just curious. I'm not a full-time python developer.
"What I want to do is build software that vertically integrates with our open source tools, and sell that software to companies that are already using Ruff, uv, etc. Alternatives to things that companies already pay for today. An example of what this might look like [...] would be something like an enterprise-focused private package registry."
There's also this interview with Charlie Marsh (Astral founder): https://timclicks.dev/podcast/supercharging-python-tooling-a... (specifically the "Building a commercial company with venture capital" section)
(Transparently, I'm posting this before I've completed the article.)
uv's biggest advantage is speed. It claims a 10-100x performance speedup over pip and Conda [1]. uv can also manage python versions and supports using Python scripts as executables via inline dependencies [2].
But Conda is better for non-Python usage and is more mature, especially for data science related uses.
[1]: https://github.com/astral-sh/uv/blob/main/BENCHMARKS.md [2]: https://docs.astral.sh/uv/#scripts
pyenv was problematic because you needed the right concoction of system packages to ensure it compiled python with the right features, and we have a mix of MacOS and Linux devs so this was often non-trivial.
uv is much faster than both of these tools, has a more ergonomic CLI, and solves both of the issues I just mentioned.
I'm hoping astral's type checker is suitably good once released, because we're on mypy right now and it's a constant source of frustration (slow and buggy).
> uv is much faster than both of these tools
conda is also (in)famous for being slow at this, although the new mamba solver is much faster. What does uv do in order to resolve dependencies much faster?
- Representing version numbers as a single integer for fast comparison (see the toy sketch after this list)
- Being implemented in Rust rather than Python (compared to Poetry)
- Parallel downloads
- Caching individual files rather than zipped wheels, so installation is just hard-linking files, zero copy (on Unix at least). Also makes it very storage efficient.
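A toy illustration of the first point (not uv's actual encoding): pack the release segments into fixed-width fields, and version ordering becomes a single integer comparison.
# Toy sketch only; real PEP 440 versions (epochs, pre/post/dev releases)
# need a richer encoding than this.
def pack(major: int, minor: int, patch: int) -> int:
    return (major << 32) | (minor << 16) | patch

assert pack(2, 10, 0) > pack(2, 9, 3)      # 2.10.0 > 2.9.3
assert not ("2.10.0" > "2.9.3")            # naive string comparison gets it wrong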
Not a Python developer, so not sure it's equivalent, as the npm registry is shared between all.
> Reminds me of that competing standards xkcd.
Yes, for years I've sat on the sidelines avoiding the fragmented Poetry, pyenv, pipenv, pipx, pip-tools/pip-compile, rye, etc., but uv does now finally seem to be the all-in-one solution that is succeeding where other tools have failed.
Definitely lightyears faster than mypy though.
My hope is that conda goes away completely. I run an ML cluster and we have multi-gigabyte conda directories and researchers who can't reproduce anything because just touching an env breaks the world.
https://docs.metaflow.org/scaling/dependencies https://outerbounds.com/blog/containerize-with-fast-bakery
The curmudgeon in me feels the need to point out that fast, lightweight software has always been possible, it's just becoming easier now with package managers.
uv is a clear improvement over pip and venv, for sure.
But I do everything in dev containers these days. Very few things get to install on my laptop itself outside a container. I've gotten so used to this that tools that uninstall/install packages on my box on the fly give me the heebie-jeebies.
Yes, it was the NPM supply-chain issues that really forced this on me. Now I install, fetch, and build in an interactive Docker container.
I had to update some messy Python code and I was looking for a tool that could handle Python versions, package updates, etc. with the least possible amount of documentation reading and troubleshooting.
Rye was that for me! Next time I write python I'm definitely going to use uv.
You can go from no virtual environment, and just "uv run myfile.py" and it does everything that's needed, nearly instantly.
$ time pip install
ERROR: You must give at least one requirement to install (see "pip help install")
real 0m0.356s
user 0m0.322s
sys 0m0.036s
(Huh, that's a slight improvement from before; I guess pip 25.3 is a bit better streamlined.)
It has always been enough to place installations in separate directories and use the same bash scripts for environment-variable configuration, all these years.
Have you tried uv?
Other than speed and consolidation, pip, pipx, hatch, virtualenv, and pyenv together roughly do the job (though pyenv itself isn’t a standard python tool.)
> Why uv over, lets say, conda?
Support for Python standard packaging specifications and consequently also easier integration with other tools that leverage them, whether standard or third party.
I don’t think people would think twice about the legitimacy (if you want to call it that) of uv except for all the weird fawning over it that happens, as you noticed. It makes it seem more like a religion or something.
> Do they have good influence on what python's main ecosystem is moving to?
Yes, they're an early adopter/implementer of the recent pyproject.toml standards.
Agree that uv is way way way faster than any of that and really just a joy to use in the simplicity
Also, the ability to have a single script with deps, using TOML in the header, super easily.
Also, also: the ability to use a random Python tool in effectively seconds with no faffing about.
good god no thank you.
>cargo
more like it.
This is the most insulting take in the ongoing ruination of Python. You used to be able to avoid virtualenvs and install scripts and dependencies that were directly runnable from any shell. Now you get endlessly chastised for trying to use Python as a general-purpose utility. Debian was a bastion of sanity with the split between dist-packages and site-packages, but that's ruined now too.
I do agree it is annoying, and what they need to do is just provide an automatic "userspace" virtualenv for anything a user installs themselves... but that is a Pandora's box, tbh. (Do you do it per user? How does the user become aware of this?)
But that's probably not practical to retrofit given the ecosystem as it is now.
With PEP 723 and comfortable tooling (like uv), you now get scripts that are "actually directly runnable", not just "fake directly runnable, oops forgot to apt-get install something, sorta runnable", and they work reliably even when stuff around you is updated.
This ideology is what caused all the problems to begin with: the base Python is built as if it's the only thing in the entire operating system's environment, when its entire packaging system is also built in a way that makes that impossible to do without manually juggling package conflicts/incompatibilities.
This wasn't really the case; in principle anything you installed in the system Python environment, even "at user level", had the potential to pollute that environment and thus interfere with system tools written in Python. And if you did install it at system level, that became files within the environment your system package manager is managing, that it doesn't know how to deal with, because they didn't come from a system package.
But it's worse now because of how many system tools are written in Python — i.e., a mark of Python's success.
Notably, these tools commonly include the system package manager itself. Since you mentioned Debian (actually this is Mint, but ya know):
$ file `which apt`
/usr/local/bin/apt: Python script, ASCII text executable
> Now you get endlessly chastised for trying to use Python as a general purpose utility.
No, you don't. Nothing prevents you from running scripts with the system Python that make use of system-provided libraries (including ones that you install later with the system package manager).
If you need something that isn't packaged by your distro, then of course you shouldn't expect your distro to be able to help with it, and of course you should expect to use an environment isolated from the distro's environment. In Python, virtual environments are the method of isolation. All reasonable tooling uses them, including uv.
> Debian was a bastion of sanity with the split between dist_packages and site_packages but that's ruined now too.
It's not "ruined". If you choose to install the system package for pip and to use it with --break-system-packages, the consequences are on you, but you get the legacy behaviour back. And the system packages still put files separately in dist-packages. It's just that... doing this doesn't actually solve all the problems, fundamentally because of how the Python import system works.
Basically the only thing missing from pip install being a smooth experience is something like npx to cleanly run modules/binary files that were installed to that directory. It's still futzing with the PATH variable to run those scripts correctly.
UV is making me give Python a chance for the first time since the Ren'Py project I did for fun in 2015.
pip freeze > requirements.txt
pip install -r requirements.txt
Way before "official" lockfile existed.
Your requirements.txt becomes a lockfile, as long as you accept to not use ranges.
Having this in a single tool, etc., why not; but I don't understand the hype when it was basically already there.
With pip, you update a dependency: it won't work if it's not compatible, and it'll work if it is. Not sure where the issue is?
Pip's solver could still cause problems in general on changes.
UV having a better solver is nice. Being fast is also nice. Mainly, though, it feeling like a tool that is maintained and can be improved upon without ripping one's hair out is a godsend.
I would probably use something like this: https://stackoverflow.com/questions/17803829/how-to-customiz...
Pip is also not conda, but uv is way faster than pip.
I'm very happy the python community has better tooling.
So you are back having to use conda and the rest. Now, you have yet another package manager to handle.
I wouldn't be harsh to engineers at astral who developed amazing tooling, but the issue with the python ecosystem isn't lack of tooling, it is the proliferation and fragmentation. To solve dependency management fully would be to incorporate other package descriptors, or convert them.
Rsbuild, another Rust library for the Node ecosystem, did just that for building and bundling. They came up with rspack, which is largely compatible with the webpack config.
You find a webpack repo? Just add rsbuild and rspack, and you are pretty much ready to go, without the slow (Node-native) webpack.
I referred to the interfaces of other packaging tools. I use uv and it's excellent on its own.
You get a repo, it's using Playwright, what do you do now? You install all the dependencies found in the dependency descriptor, then sync to create a uv descriptor; or you compose a descriptor that uv understands.
It's repetitive and rather systematic, so it could be automated. I should volunteer for a PR, but my point is that introducing yet another tool to an ecosystem suffering a proliferation of build and dependency-management tooling compounds the issue. It would have been helpful from the get-go to support existing and prolific formats.
pnpm understands package.json. It didn't reinvent the wheel, because we have millions of wheels out there. It created its own pnpm lockfile, but that's a file users aren't meant to touch, so the transition from npm to pnpm is seamless. Almost the same when migrating from webpack to rsbuild.
But you don't have to. Brew and other package managers hold uv in their registries.
For example, installing on an air gapped system, where uv barely has support.
How many commands are required to build up a locally consistent workspace?
Modern package managers do that for you.
Pip also generates PEP 751 lockfiles, and installing from those is on the roadmap still (https://github.com/pypa/pip/issues/13334).
venv is lower-level tooling. Literally all it does is create a virtual environment — the same kind that uv creates and manages. There's nothing to "integrate".
> I'd get suspicious if a developer is picky about python versions or library versions
Certain library versions only support certain python versions. And they also break API. So moving up/down the python versions also means moving library versions which means stuff no longer works.
The home page should be a simplified version of this page buried way down in the docs: https://docs.astral.sh/uv/guides/projects/
I've always wondered why Linux OSes that rely on python scripts don't make their own default venv and instead clobber the user's default python environment...
Now with uv everything just works and I can play around easily with all the great Python projects that exist.
In principle, you can ‘activate’ this new virtual environment like any typical virtual environment that you may have seen in other tools, but the most ‘uv-onic’ way to use uv is simply to prepend any command with uv run. This command automatically picks up the correct virtual environment for you and runs your command with it. For instance, to run a script — instead of
source .venv/bin/activate
python myscript.py
you can just do uv run myscript.py
You may have a library that's been globally installed, and you have multiple projects that rely on it. One day you may need to upgrade the library for use in one project, but there are backward-incompatible changes in the upgrade, so now all of your other projects break when you upgrade the global library.
In general, when projects are used by multiple people across multiple computers, it's best to have the specific dependencies and versions specified in the project itself so that everyone using that project is using the exact same version of each dependency.
For recreational projects it's not as big of a deal. It's just harder to do a recreation of your environment.
Because it being available in the system environment could cause problems for system tools, which are expecting to find something else with the same name.
And because those tools could include your system's package manager (like Apt).
> So there is a massive possibility I am simply wrong and pip-installing something globally is a huge risk. I'm just not understanding it.
I assume you're referring to the new protections created by the EXTERNALLY-MANAGED marker file, which will throw up a large boilerplate warning if you try to use pip to install packages in the system environment (even with --user, where they can still cause problems when you run the system tools without sudo).
You should read one or more of:
* the PEP where this protection was introduced (https://peps.python.org/pep-0668/);
* the Python forum discussion explaining the need for the PEP (https://discuss.python.org/t/_/10302);
* my blog post (https://zahlman.github.io/posts/2024/12/24/python-packaging-...) where I describe it in a bit more detail (along with addressing a few other common grumblings about how Python packaging works);
* my Q&A on Codidact (https://software.codidact.com/posts/291839/) where I explain more comprehensively;
* the original motivating Stack Overflow Q&A (https://stackoverflow.com/questions/75608323/);
* the Python forum discussion (https://discuss.python.org/t/_/56900) where it was originally noticed that the Stack Overflow Q&A was advising people to circumvent the protection without understanding it, and a coordinated attempt was made to remedy that problem.
Or you can watch Brodie Robertson's video about the implementation of the PEP in Arch: https://www.youtube.com/watch?v=35PQrzG0rG4.
> Instead of
>
> source .venv/bin/activate
> python myscript.py
>
> you can just do
>
> uv run myscript
>
This is by far the biggest turn-off for me. The whole point of an environment manager is to set the environment so that the commands I run work. They need to run natively, the way they are supposed to when the environment is set, not be put through a translation layer.
Side rant: yes, I get triggered whenever someone tells me "you can just" do this thing that is actually longer and worse than the original.
There is a new standard mechanism for specifying the same things you would specify when setting up a venv with a python version and dependencies in the header of a single file script, so that tooling can setup up the environment and run the script using only the script file itself as a spec.
uv (and PyPA’s own pipx) support this standard.
> yes I get triggered whenever someone tells me "you can just" do this thing that is actually longer and worse than the original.
"uv run myscript" is neither longer nor worse than separately manually building a venv, activating it, installing dependencies into it, and then running the script.
The `uv run` command is an optional shortcut for avoiding needing to activate the virtual environment. I personally don't like the whole "needing to activate an environment" before I can run commands "natively", so I like `uv run`. (Actually for the last 10 years I've had my `./manage.py` auto-set up the virtual environment for me.)
The `uv add` / `uv lock` / `uv sync` commands are still useful without `uv run`.
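For what it's worth, a rough sketch of that kind of self-bootstrapping wrapper (hypothetical paths and requirements file, Unix-only for brevity; not the actual script referred to above):
#!/usr/bin/env python3
import os
import subprocess
import sys
import venv
from pathlib import Path

VENV = Path(__file__).resolve().parent / ".venv"

if sys.prefix == sys.base_prefix:                 # not yet running inside the venv
    if not VENV.exists():
        venv.create(VENV, with_pip=True)
        subprocess.check_call([str(VENV / "bin" / "pip"), "install", "-r", "requirements.txt"])
    python = str(VENV / "bin" / "python")
    os.execv(python, [python, *sys.argv])         # re-exec this script inside the venv

# ...the actual command handling would go here, now running inside the venv...
print("running under", sys.prefix)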
dependencies = [
"torch==2.8.0+rocm6.4",
"torchvision==0.23.0+rocm6.4",
"pytorch-triton-rocm==3.4.0",
...
]
There is literally no easy way to also have a configuration for CUDA; you have to have a second config and, worse, manually copy/symlink them into the hardcoded pyproject.toml file.
Over the years, I've tried venv, conda, pipenv, poetry, and plain pip with requirements.txt. I've played with uv on some recent projects and it's a definite step up. I like it.
Uv actually fixes most of the issues with what came before and actually builds on existing things, which is not a small compliment, because the state of the art before uv was pretty bad. Venv, pip, etc. are fine; they are just not enough by themselves. Uv embraces both. Without that, all we had was a lot of puzzle pieces that barely worked together and didn't really fit together that well. I tried making conda + pipenv work at some point. Pipenv shell just makes your shell stateful, which adds a lot of complexity; none of the IDEs I tried figured that out properly. I had high hopes for poetry, but it ended up a bit underwhelming and still left a lot of stuff to solve. Uv succeeds in providing more of an end-to-end solution: everything from a project-specific Python installation, to a venv by default without hassle, to dependency management, etc.
My basic needs are simple. I don't want to pollute my system python with random crap I need for some project. So, like uv, I need to have whatever solution deal with installing the right python version. Besides, the system python is usually out of date and behind the current stable version of python which is what I would use for new projects.
UV is great, but I use it as a more convenient pip+venv. Maybe I'm not using it to its full potential.
You aren't, but that's fine. Everyone has their own idea about how tooling should work and come together, and I happen to be in your camp (from what I can tell). I actively don't want an all-in-one tool to do "project management".
But where it isn't a matter of opinion is speed. I've never met anyone who, given the same interface, would prefer a process taking 10x longer to execute.
uv is probably much more of a game changer for beginner python users who just need to install stuff and don't need to lint. So it's a bigger deal for the broader python ecosystem.