> Beyond the product itself, pyx is also an instantiation of our strategy: our tools remain free, open source, and permissively licensed — forever. Nothing changes there. Instead, we'll offer paid, hosted services that represent the "natural next thing you need" when you're already using our tools: the Astral platform.
pyx itself is not a tool, it's a service.
To be precise: pyx isn't intended to be a public registry or a free service; it's something Astral will be selling. It'll support private packages and corporate use cases that are (reasonably IMO) beyond PyPI's scope.
(FD: I work on pyx.)
https://news.ycombinator.com/item?id=44712558
Now what do we have here?
(I work at Astral)
I'm glad to see Astral taking steps towards that.
In all seriousness, I'm all in on uv. Better than any competition by a mile. Also makes my training and clients much happier.
Of course if you are on one of the edge cases of something only uv does, well... that's more of an issue.
> "An example of what this might look like (we may not do this, but it's helpful to have a concrete example of the strategy) would be something like an enterprise-focused private package registry."
https://hachyderm.io/@charliermarsh/113103605702842937
Astral has been very transparent about their business model.
But if you want, you can have a non-flat namespace for imports using PEP 420 – Implicit Namespace Packages – so all your different packages "company-foo", "company-bar", etc. can be installed into the "company" namespace and all just work.
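A minimal, self-contained sketch of that idea (the "company-foo"/"company-bar" distributions here are hypothetical and are built in a temp directory just to show the merged namespace; note there is deliberately no company/__init__.py anywhere):

    # PEP 420 implicit namespace packages: two separate "distributions"
    # contribute modules to the same "company" import namespace.
    import pathlib
    import sys
    import tempfile

    root = pathlib.Path(tempfile.mkdtemp())
    for dist, mod in [("company-foo", "foo"), ("company-bar", "bar")]:
        pkg = root / dist / "company"
        pkg.mkdir(parents=True)
        (pkg / f"{mod}.py").write_text(f"NAME = {mod!r}\n")

    # Simulate both distributions being installed, each on sys.path as a
    # site-packages entry would be.
    sys.path[:0] = [str(root / "company-foo"), str(root / "company-bar")]

    from company import foo, bar  # both resolve under the shared namespace
    print(foo.NAME, bar.NAME)     # -> foo bar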
Nothing stops an index from validating that wheels use the same name or namespace as their package names. Sdists with arbitrary backends would not be possible, but you could enforce what backends were allowed for certain users.
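As a rough illustration of that kind of index-side check (the "company" namespace prefix and the bare filename parsing are hypothetical; a real index would parse wheel filenames with the packaging library):

    # Accept an uploaded wheel only if its distribution name sits under an
    # approved namespace prefix.
    ALLOWED_NAMESPACE = "company"

    def wheel_in_namespace(filename: str) -> bool:
        # Wheel filenames look like:
        # {distribution}-{version}(-{build})?-{python}-{abi}-{platform}.whl
        distribution = filename.split("-", 1)[0].replace("_", "-").lower()
        return distribution == ALLOWED_NAMESPACE or distribution.startswith(ALLOWED_NAMESPACE + "-")

    print(wheel_in_namespace("company_foo-1.0-py3-none-any.whl"))   # True
    print(wheel_in_namespace("requests-2.32.0-py3-none-any.whl"))   # False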
import io
from contextlib import redirect_stdout

# Capture the output of `import this` (the Zen of Python) and print its last line.
buf = io.StringIO()
with redirect_stdout(buf):
    import this
print(buf.getvalue().splitlines()[-1])
> Namespaces are one honking great idea -- let's do more of those!

We've been fed promises like these before. They will inevitably get acquired. Years of documentation, issues, and pull requests will be deleted with little-to-no notice. An exclusively commercial replacement will materialize from the new company that is inexplicably missing the features you relied on in the first place.
> Beyond the product itself, pyx is also an instantiation of our strategy: our tools remain free, open source, and permissively licensed — forever. Nothing changes there. Instead, we'll offer paid, hosted services that represent the "natural next thing you need" when you're already using our tools: the Astral platform.
Basically, we're hoping to address this concern by building a separate sustainable commercial product rather than monetizing our open source tools.
Only viral licenses are forever.
But, to refocus on the case at hand: Astral's tools don't require contributors to sign a CLA. I understand (and am sympathetic to) the suspicion here, but the bigger picture is that Astral wants to build services as a product rather than compromising the open source nature of its tools. That's why the announcement tries to cleanly differentiate between the two.
Because FWIW, CPython is not GPL. It has its own license, which does not require modifications to be made public.
jfrog artifactory suddenly very scared for its safety
> or is one simply paying for access to Astral's particular server that runs it?
pyx is currently a service being offered by Astral. So it's not something you can currently self-host, if that's what you mean.
... Then how can it make decisions about how to serve the package request that PyPI can't? Is there not some extension to the protocol so that uv can tell it more about the client system?
[1]: https://packaging.python.org/en/latest/specifications/simple...
Many ideas are being added to recent versions of pip that are at least inspired by what uv has done — and many things are possible in uv specifically because of community-wide standards development that also benefits pip. However, pip has some really gnarly internal infrastructure that prevents it from taking advantage of a lot of uv's good ideas (which in turn are not all original). That has a lot to do with why I'm making PAPER.
For just one example: uv can quickly install previously installed packages by hard-linking a bunch of files from the cache. For pip to follow suit, it would have to completely redo its caching strategy from the ground up, because right now its cache is designed to save only download effort and not anything else about the installation process. It remembers entire wheels, but finding them in that cache requires knowing the URL from which they were downloaded. Because PyPI organizes the packages in its own database with its own custom URL scheme, pip would have to reach out to PyPI across the Internet in order to figure out where it put its own downloads!
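A rough sketch of the hard-link idea (the cache layout, paths, and function name here are made up for illustration and are not uv's actual cache format):

    # "Install by hard-linking": instead of copying each file out of a local
    # cache, create hard links into the target environment, so no data is copied.
    import os
    from pathlib import Path

    def link_install(cached_pkg: Path, site_packages: Path) -> None:
        for src in cached_pkg.rglob("*"):
            if src.is_file():
                dst = site_packages / src.relative_to(cached_pkg)
                dst.parent.mkdir(parents=True, exist_ok=True)
                if dst.exists():
                    dst.unlink()
                os.link(src, dst)  # same inode as the cached file

    # Example (hypothetical paths):
    # link_install(Path("~/.cache/pkgs/requests").expanduser(),
    #              Path(".venv/lib/python3.12/site-packages"))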
FWIW, as a pip maintainer, I don't strongly agree with this statement, I think if pip had the same full time employee resources that uv has enjoyed over the last year that a lot of these issues could be solved.
I'm not saying here that pip doesn't have some gnarly internal details, just that the bigger thing holding it back is the lack of maintainer resources.
> For just one example: uv can quickly install previously installed packages by hard-linking a bunch of files from the cache. For pip to follow suit, it would have to completely redo its caching strategy from the ground up, because right now its cache is designed to save only download effort and not anything else about the installation process.
I actually think this isn't a great example, as evidenced by uv's lack of a download or wheel command, because those features don't align with uv's caching strategy.
That said, I do think there are other good examples to your point, like uv's ability to prefetch package metadata. I don't think we're going to be able to implement that in pip any time soon, probably because it would require a complete overhaul of the resolver.
imo, if pip's preference is to ship broken functionality, then what is/is not shipped with pip is not meaningful.
(And for the record: it isn't their fault at all. `pip search` doesn't work because PyPI removed the search API. PyPI removed that API for very good reasons[1].)
> Private registry
ouch.
This is a joke, obviously. We've had more than 14 for years.
This itself is IMO a product of Python having a relatively healthy consensus-driven standardization process for packaging, rather than an authoritative one. If Python had more of an authoritative approach, I don't think the language would have done as well as it has.
(Source: I've written at least 5 PEPs.)
Since this is a private, paid-for registry aimed at corporate clients, will there be an option to expose those registries externally as a public instance, but paid for by the company? That is, can I as a vendor pay for a Pyx registry for my own set of packages, and then provide that registry as an entrypoint for my customers?
We actually support this basic idea today, even without pyx. You can run (e.g.) `uv pip install --torch-backend=auto torch` to automatically install a version of PyTorch based on your machine's GPU from the PyTorch index.
pyx takes that idea and pushes it further. Instead of "just" supporting PyTorch, the registry has a curated index for each supported hardware accelerator, and we populate that index with pre-built artifacts across a wide range of packages, versions, Python versions, PyTorch versions, etc., all with consistent and coherent metadata.
So there are two parts to it: (1) when you point to pyx, it becomes much easier to get the right, pre-built, mutually compatible versions of these things (and faster to install them); and (2) the uv client can point you to the "right" pyx index automatically (that part works regardless of whether you're using pyx, it's just more limited).
> Since this is a private, paid-for registry aimed at corporate clients, will there be an option to expose those registries externally as a public instance, but paid for by the company? That is, can I as a vendor pay for a Pyx registry for my own set of packages, and then provide that registry as an entrypoint for my customers?
We don't support this yet but it's come up a few times with users. If you're interested in it concretely feel free to email me (charlie@).
I don't understand how a private company like Astral is leading here. Why is it so hard for the Python community to come up with a single tool to rule them all? (I know https://xkcd.com/927/). Like, you could even copy what Go or Node are doing and make it Python-aware; no shame in that. Instead we have these who-knows-how-long-they-will-last tools every now and then.
They should remove the "There should be one-- and preferably only one --obvious way to do it." from the Python Zen.
Python packaging is (largely) solving problems that Go and Node packaging are not even trying to address.
imagine a world without: failed to build native gem extension
I don't know much about go, and I've only scratched the surface with node, but as far as node goes I think it just distributes JS? So that would be one answer to what Python packaging is trying to solve that node isn't trying to address.
I don't know what guides you're reading, but I haven't touched easy_install in at least a decade. Its successor, pip, had effectively replaced all use cases for it by around 2010.
Some of those are package tools, some are dependency managers, some are runtime environments, some are package formats...
Some are obsolete at this point, and others by necessity cover different portions of programming language technologies.
I guess what I'm saying is, for the average software engineer, there's not too many more choices in Python for programming facilities than in Javascript.
Whatever I would say at this point about PyPA would be so uncharitable that dang would descend on me with the holy hammer of banishment, but you can get my drift. I just don't trust them to come out with good tooling. The plethora they have produced so far is quite telling.
That said, pip covers 99% of my needs when I need to do anything with Python. There are ecosystems that have it way worse, so I count my blessings. But apparently, since Poetry and uv exist, my 99% are not many other people's 99%.
If I wanted to package my Python stuff, though, I'm getting confused. Is it now setup.py or pyproject.toml? Or maybe both? What if I need to support an older Python version as seen in some old-but-still-supported Linux distributions?
> They should remove the "There should be one-- and preferably only one --obvious way to do it." from the Python Zen.
Granted, tooling is different from the language itself. Although PyPA could benefit from a decade having a BDFL.
uv addressed these flaws with speed, solid dependency resolution, and a simple interface that builds on what people are already used to. It unifies virtual environment and package management, supports reproducible builds, and integrates easily with modern workflows.
Anyway, if you're familiar with Node then I think you can view pip and venv as the npm of Python. Things like Poetry are Yarn, made as replacements because pip sort of sucks. UV on the other hand is a drop-in replacement for pip and venv (and other things) similar to how pnpm is basically npm. I can't answer your question on why there isn't a "good" tool to rule them all, but the single tool has been pip since 2014, and since UV is a drop-in, it's very easy to use UV in development and pip in production.
I think it's reasonable to worry about what happens when Astral needs to make money for their investors, but that's the beauty of UV compared to a lot of other Python tools. It's extremely easy to replace because it's essentially just smarter pip. I do hope Astral succeeds with their pyx, private registries and so on by the way.
It’s entirely volunteer based so I don’t blame them, but the reality is that it’s holding back the ecosystem.
I suspect it’s also a misalignment of interests. No one there really invests in improving UX.
They had time to force "--break-system-packages" on us though, something no one asked for.
> It’s entirely volunteer based so I don’t blame them
It's not just that they're volunteers; it's the legacy codebase they're stuck with, and the use cases that people will expect them to continue supporting.
> I suspect it’s also a misalignment of interests. No one there really invests in improving UX.
"Invest" is the operative word here. When I read discussions in the community around tools like pip, a common theme is that the developers don't consider themselves competent to redesign the UX, and there is no money from anywhere to hire someone who would be. The PSF operates on an annual budget on the order of $4 million, and a big chunk of that is taken up by PyCon, supporting programs like PyLadies, generic marketing efforts, etc. Meanwhile, total bandwidth use at PyPI has crossed into the exabyte range (it was ~600 petabytes in 2023 and growing rapidly). They would be completely screwed without Fastly's incredible in-kind donation.
Now we have pyproject.toml and uv to learn. This is another hour or so of learning, but well worth it.
Astral is stepping up because no one else did. Guido never cared about packaging and that's why it has been the wild west until now.
0. A lot of those "tools and concepts" are actually completely irrelevant or redundant. easy_install has for all practical purposes been dead for many years. virtualenv was the original third party project that formed the basis for the standard library venv, which has been separately maintained for people who want particular additional features; it doesn't count as a separate concept. The setup.py file is a configuration file for Setuptools that also happens to be Python code. You only need to understand it if you use Setuptools, and the entire point is that you can use other things now (specifically because configuring metadata with Python code is a terrible idea that we tolerated for far too long). Wheels are just the distribution format and you don't need to know anything about how they're structured as an end user or as a developer of ordinary Python code — only as someone who makes packaging tools. And "pypy" is an alternate implementation of Python — maybe you meant PyPI? But that's just the place that hosts your packages; no relevant "concept" there.
Imagine if I wanted to make the same argument about JavaScript and I said that it's complicated because you have to understand ".tar.gz (I think, from previous discussion here? I can't even find documentation for how the package is actually stored as a package on disk), Node.js, NPM, TypeScript, www.npmjs.com, package.json..." That's basically what you're doing here.
But even besides that, you don't have to know about all the competing alternatives. If you know how to use pip, and your only goal is to install packages, you can completely ignore all the other tools that install packages (including poetry and uv). You only have any reason to care about pipenv if you want to use pip and care about the specific things that pipenv does and haven't chosen a different way to address the problem. Many pip users won't have to care about it.
1. A lot of people actively do not want it that way. The Unix philosophy actually does have some upsides, and there are tons of Python users out there who have zero interest in participating in an "ecosystem" where they share their code publicly even on GitHub, never mind PyPI — so no matter what you say should be the files that give project metadata or what they should contain or how they should be formatted, you aren't going to get any buy-in. But beyond that, different people have different needs and a tool that tries to make everyone happy is going to require tons of irrelevant cruft for almost everyone.
2. Reverse compatibility. The Python world — both the packaging system and the language itself — has been trying to get people to do things in better, saner ways for many years now; but people will scream bloody murder if their ancient stuff breaks in any way, even when they are advised years in advance of future plans to drop support. Keep in mind here that Python is more than twice as old as Go.
3. Things are simple for Go/JS/TS users because they normally only have to worry about that one programming language. Python packages (especially the best-known, "serious" ones used for heavyweight tasks) very commonly must interface with code written in many other programming languages (C and C++ are very common, but you can also find Rust, Fortran and many more; and Numpy must work with both C and Fortran), and there are many different ways to interface (and nothing that Python could possibly do at a language level to prevent that): by using an explicit FFI, by dlopen() etc. hooks, by shelling out to a subprocess, and more. And users expect that they can just install the Python package and have all of that stuff just work. Often that means that compiled-language code has to be rebuilt locally; and the install tools are expected to be able to download and set up a build system, build code in an isolated environment, etc. etc. All of this is way beyond the expectations placed on something like NPM.
4. The competition is deliberate. Standards — the clearest example being the PEPs 517/518/621 that define the pyproject.toml schema — were created specifically to enable both competition and interoperation. Uv is gaining market share because a lot of people like its paradigms. Imagine if, when people in the Python community first started thinking about the problems and limitations of tools from the early days, they had decided to try to come up with all the paradigms themselves. Imagine if they got them wrong, and then set it in stone for everyone else. When you imagine this, keep in mind that projects like pip and setuptools date to the 2000s. People were simply not thinking about open-source ecosystems in the same way in that era.
> They should remove the "There should be one-- and preferably only one --obvious way to do it." from the Python Zen.
First, define "it". The task is orders of magnitude greater than you might naively imagine. I know, because I've been an active participant in the surrounding discussion for a couple of years, aside from working on developing my own tooling.
Second, see https://news.ycombinator.com/item?id=44763692 . It doesn't mean what you appear to think it does.
because they're obsessed with fixing non-issues (switching out pgp signing for something you can only get from microsoft, sorry, "trusted providers", arguing about mission statements, etc.)
whilst ignoring the multiple elephants in the room (namespacing, crap slow packaging tool that has to download everything because the index sucks, mix of 4 badly documented tools to build anything, index that operates on filenames, etc.)
This is so true! On Windows (and WSL) it is also exacerbated by some packages requiring the use of compilers bundled with outdated Visual Studio versions, some of which are only available by manually crafting download paths. I can't wait for a better dev experience.
I went with Python because I never had this issue. Now with any AI / CUDA stuff its a bit of a nightmare to the point where you use someone's setup shell script instead of trying to use pip at all.
Also, now you have two problems.
I use Go a lot, the journey has been
- No dependency management
- Glide
- Depmod
- I forget the name of the precursor - I just remembered, VGo
- Modules
We still have proxying, vendoring, versioning problems
Python: VirtualEnv
Rust: Cargo
Java: Maven and Gradle
Ruby: Gems
Even OS dependency management is painful - yum, apt (which was a major positive when I switched to Debian based systems), pkg (BSD people), homebrew (semi-official?)
Dependency management in the wild is a major headache. Go (I only mention it because I am most familiar with it) did away with some compilation dependency issues by shipping binaries with no dependencies (meaning it doesn't matter which version of Linux you built your binary on; it will run on any Linux of the same arch - none of that "wrong libc" 'fun'), but you still have issues when two different people build the same binary and need extra dependency management (vendoring brings with it caching problems - is the version in the cache up to date, will updating one version of one dependency break everything - what fun).
Meanwhile, I didn't feel like Python had reached the bare minimum for package management until Pipenv came on the scene. It wasn't until Poetry (in 2019? 2020?) that I felt like the ecosystem had reached what Ruby had back in 2010 or 2011 when bundler had become mostly stable.
I think based on the quality of their work, there's also an important component which is trust. I'd trust and pay for a product from them much more readily than an open source solution with flaky maintainers.
- there's no way to do an installation dry run without pre-downloading all the packages (to get their dep info)
- there's no way to get hashes of the archives
- there's no way to do things like reverse-search (show me everything that depends on x)
I'm assuming that a big part of pyx is introducing a dynamically served (or maybe even queryable) endpoint that can return package metadata and let uv plan ahead better, identify problems and conflicts before they happen, install packages in parallel, etc.
Astral has an excellent track record on the engineering and design side, so I expect that whatever they do in this space will basically make sense, it will eventually be codified in a PEP, and PyPI will implement the same endpoint so that other tools like pip and poetry can adopt it.
[1]: Top-level: https://pypi.org/simple/ Individual package: https://pypi.org/simple/pyyaml/
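For reference, this is roughly what the simple index looks like to a client today: an HTML page of links per project that a tool has to fetch and parse. A minimal sketch against the URLs in [1] (this just lists filenames; it is not how any particular installer is implemented):

    # Fetch PyPI's "simple" page for one project and print a few of the
    # distribution filenames it links to.
    from html.parser import HTMLParser
    from urllib.request import urlopen

    class LinkCollector(HTMLParser):
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                self.links.append(dict(attrs).get("href") or "")

    with urlopen("https://pypi.org/simple/pyyaml/") as resp:
        parser = LinkCollector()
        parser.feed(resp.read().decode())

    for href in parser.links[:5]:
        # Strip the URL and any hash fragment, keeping just the filename.
        print(href.split("#", 1)[0].rsplit("/", 1)[-1])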
But I don’t get it. How does it work? Why is it able to solve the Python runtime dependency problem? I thought uv had kinda already solved that? Why is a new thingy majig needed?
uv becomes a client that can install those packages from your private registry.
I imagine pip will be able to install them from a PYX registry too.
The dependencies in question are compiled C code that Python interfaces with. Handling dependencies for a graph of packages that are all implemented in pure Python, is trivial.
C never really solved all the ABI issues and especially for GPU stuff you end up having to link against very specific details of the local architecture. Not all of these can be adequately expressed in the current Python package metadata system.
Aside from that, a lot of people would like to have packages that use pre-installed dependencies that came with the system, but the package metadata isn't designed to express those dependencies, and you're also on your own for actually figuring out where they are at runtime, even if you take it on blind faith that the user separately installed them.
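A small sketch of that runtime-probing problem (assuming the system has OpenSSL installed; "ssl" is just an example library name, not something package metadata can express):

    # Package metadata can't declare "needs the system's libssl", so code often
    # probes for it at runtime and hopes the user installed it separately.
    import ctypes
    import ctypes.util

    name = ctypes.util.find_library("ssl")  # returns None if not found
    if name is None:
        raise RuntimeError("system OpenSSL not found; install it via your OS package manager")
    libssl = ctypes.CDLL(name)
    print("loaded", name)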
This is not really my wheelhouse, though; you'll find a much better write-up at https://pypackaging-native.github.io/ .
Much ad language.
They do not explain what an installation of their software does to my system.
They use the word "platform".
I'll pass. I'd rather have the battle-tested old thing, thanks.
The compiled languages now have better "packaging" than the interpreted ones. "go build" and "cargo build" (for Rust), which do real work, are easier to use than the packaging systems for Python and Javascript.
We've come a long way in the compiled world since "make depend; make".
deno run http://uri.of.the/dang.program.ts