
Uv is the best thing to happen to the Python ecosystem in a decade

https://emily.space/posts/251023-uv
538•todsacerdoti•2h ago

Comments

NewJazz•2h ago
Idk, for me ruff was more of a game changer. No more explaining why we need both flake8 and pylint (and isort), no more flake8 plugins... Just one command that does it all.

UV is great but I use it as a more convenient pip+venv. Maybe I'm not using it to its full potential.

zahlman•2h ago
> Maybe I'm not using it to its full potential.

You aren't, but that's fine. Everyone has their own idea about how tooling should work and come together, and I happen to be in your camp (from what I can tell). I actively don't want an all-in-one tool to do "project management".

hirako2000•1h ago
The dependency descriptor is more structured; a requirements.txt is pretty raw in comparison.

But where it isn't a matter of opinion is speed. I've never met anyone who, given the same interface, would prefer a process taking 10x longer to execute.

collinmanderson•4m ago
I agree flake8 -> ruff was more of a game changer for me than pip+venv -> uv. I use flake8/ruff far more often than pip/venv.

uv is probably much more of a game changer for beginner python users who just need to install stuff and don't need to lint. So it's a bigger deal for the broader python ecosystem.

languagehacker•2h ago
A very accessible and gentle introduction for the scientific set who may still be largely stuck on Conda. I liked it!
Animats•2h ago
Another Python package manager? How many are there now?
zahlman•2h ago
> Another

No, the same uv that people have been regularly (https://hn.algolia.com/?q=uv) posting about on HN since its first public releases in February of 2024 (see e.g. https://news.ycombinator.com/item?id=39387641).

> How many are there now?

Why is this a problem? The ecosystem has developed usable interoperable standards (for example, fundamentally uv manages isolated environments by using the same kind of virtual environment created by the standard library — because that's the only kind that Python cares about; the key component is the `pyvenv.cfg` file, and Python is hard-coded to look for and use that); and you don't have to learn or use more than one.

There are competing options because people have different ideas about what a "package manager" should or shouldn't be responsible for, and about the expectations for those tasks.

andy99•1h ago
It’s definitely an issue for learning the language. Obviously after working with python a bit that doesn’t matter, but fragmentation still makes it more of a hassle to get open source projects up and running if they don’t use something close to your usual package management approach.
andrewstuart•2h ago
Venv seems pretty straightforward once you've learned the one activate command.

I don't really get it; uv solves all these problems I've never encountered. Just making a venv and using it seems to work fine.

bigstrat2003•2h ago
Yeah I've never remotely had problems with venv and pip.
nicce•2h ago
There have actually been many cases in my experience where venv simply worked but uv failed to install dependencies. uv is really fast, but usually you only need to install dependencies once.
athorax•2h ago
For me the biggest value of uv was replacing pyenv for managing multiple versions of python. So uv replaced pyenv+pyenv-virtualenv+pip
Hasz•1h ago
This is it. Later versions of python .11/.12/.13 have significant improvements and differences. Being able to seamlessly test/switch between them is a big QOL improvement.

I don't love that UV is basically tied to a for-profit company, Astral. I think such core tooling should be tied to the PSF, but that's a minor point. It's partially the issue I have with Conda too.

rkomorn•1h ago
Didn't Astral get created out of uv (and other tools), though? Isn't it fair for the creators to try and turn it into a sustainable job?

Edit: or was it ruff? Either way. I thought they created the tools first, then the company.

zahlman•1h ago
> Later versions of python .11/.12/.13 have significant improvements and differences. Being able to seamlessly test/switch between them is a big QOL improvement.

I just... build from source and make virtual environments based off them as necessary. Although I don't really understand why you'd want to keep older patch versions around. (The Windows installers don't even accommodate that, IIRC.) And I can't say I've noticed any of those "significant improvements and differences" between patch versions ever mattering to my own projects.

> I don't love that UV is basically tied to a for profit company, Astral. I think such core tooling should be tied to the PSF, but that's a minor point. It's partially the issue I have with Conda too.

In my book, the less under the PSF's control, the better. The meager funding they do receive now is mostly directed towards making PyCon happen (the main one; others like PyCon Africa get a pittance) and to certain grants, and to a short list of paid staff who are generally speaking board members and other decision makers and not the people actually developing Python. Even without considering "politics" (cf. the latest news turning down a grant for ideological reasons) I consider this gross mismanagement.

philipallstar•1h ago
> I think such core tooling should be tied to the PSF, but that's a minor point.

The PSF is busy with social issues and doesn't concern itself with trivia like this.

gegtik•1h ago
Yes. poetry & pyenv was already a big improvement, but now uv wraps everything up, and additionally makes "temporary environments" possible (eg. `uv run --with notebook jupyter-notebook` to run a notebook with my project dependencies)

Wonderful project

philipallstar•1h ago
With uvx it also replaces pipx.
projektfu•2h ago
One thing that annoys me about Claude is that it doesn't seem to create a venv by default when it creates a python project. (But who knows, maybe 1/3 of the time it does or something.) But you have to ask each time to be sure.
cdmckay•1h ago
Occasionally I have to build Python projects and coming from other languages and package managers, having to deal with a venv is super weird and annoying.
nilamo•1h ago
If that works for you, then that's cool. Personally, I don't want to think about environments, and it's weird that python is the only language that has venvs. Having a tool that handles it completely transparently to me is ideal, to me.
mgh95•2h ago
As someone who generally prefers not to use python in a production context (I think it's excellent for one-off scripts or cron jobs that require more features than what bash provides), I agree with this sentiment. I recently wrote some python (using uv) and found it to be pleasant and well-integrated with a variety of LSPs.
curiousgal•2h ago
The best thing to happen to the Python ecosystem would be something that unites pip and conda. Conda is not going anywhere given how many packages depend on non-python binaries, especially in enterprise settings.
zahlman•1h ago
The standard approach nowadays is to vendor the binaries, as e.g. Numpy does. This works just fine with pip.

I'm interested if you have any technical documentation about how conda environments are structured. It would be nice to be able to interact with them. But I suspect the main problem is that if you use a non-conda tool to put something into a conda environment, there needs to be a way to make conda properly aware of the change. Fundamentally it's the same issue as with trying to use pip in the system environment on Linux, which will interfere with the system package manager (leading to the PEP 668 protections).

karlding•1h ago
I'm not sure if you're aware, but there's the Wheel Variants proposal [0] that the WheelNext initiative is working through that was presented at PyCon 2025 [1][2], which hopes to solve some of those problems.

uv has implemented experimental support, which they announced here [3].

[0] https://wheelnext.dev/proposals/pepxxx_wheel_variant_support...

[1] https://us.pycon.org/2025/schedule/presentation/100/

[2] https://www.youtube.com/watch?v=1Oki8vAWb1Q

[3] https://astral.sh/blog/wheel-variants

dugidugout•1h ago
I had this discussion briefly with a buddy who uses python exclusively for his career in astronomy. He was lamenting the pains of collaborating around Conda and seemed convinced it was irreplaceable. Since I'm not familiar with the exact limitations Conda is providing for, I'm curious if you could shed some insight here. Does Nix not technically solve the issue? I understand this isn't solely a technical problem and Nix adoption in this space isn't likely, but I'm curious nonetheless!
Carbonhell•1h ago
You might be interested in Pixi: https://prefix.dev/ It uses uv under the hood for Python dependencies, while allowing you to also manage Conda dependencies in the same manifest (pixi.toml). The ergonomics are really nice and intuitive imo, and we're on our way to replace our Poetry and Conda usage with only Pixi for Python/C++ astrodynamics projects. The workspace-centric approach along with native lockfiles made most of our package management issues go away. I highly recommend it! (Not affiliated anyhow, other than contributing with a simple PR for fun)
verdverm•2h ago
I'd put type annotations and GIL removal above UV without a second thought. UV is still young and I hit some of those growing pains. While it is very nice, I'm not going to put it up there with sliced bread, it's just another package manager among many
WD-42•1h ago
As far as impact on the ecosystem I’d say uv is up there. For the language itself you are right. Curious if you’ve come across any real use cases for Gil-less python. I haven’t yet. Seems like everything that would benefit from it is already written in highly optimized native modules.
seabrookmx•1h ago
> Seems like everything that would benefit from it is already written in highly optimized native modules

Or by asyncio.

WD-42•1h ago
I'm pretty ignorant about this stuff, but I think asyncio is for exactly that, asynchronous I/O, whereas GIL-less Python would be beneficial for CPU-bound programs. My day job is boring so I'm never CPU bound, always IO bound on the database or network. If there is CPU-heavy code, it's in Numpy. So I'm not sure GIL-less actually helps there.
nomel•23m ago
asyncio is unrelated to the parallelism prevented by the GIL.
rustystump•1h ago
I second and third this. I HATE python but uv was what made it usable to me. No other language had such a confusing obnoxious setup to do anything with outside of js land. uv made it sane for me.
giancarlostoro•1h ago
Node definitely needs its own "uv" basically.
jampekka•1h ago
Why? uv is very good compared to other Python package managers, but even plain npm is still better than uv, and pnpm is a lot better.
verdverm•1h ago
pnpm
monkpit•1h ago
How is npm not exactly that?
jampekka•1h ago
Type annotations were introduced in 2008, and even type hints arrived over a decade ago, in Sept 2015.
zacmps•1h ago
But there has been continual improvement over that time, both in the ecosystem, and in the language (like a syntax for generics).
brcmthrowaway•1h ago
What happened with GIL removal?
verdverm•1h ago
You can disable it, here's the PEP, search has more digestible options

https://peps.python.org/pep-0703/

zahlman•1h ago
For that matter, IMX much of what people praise uv for is simply stuff that pip (and venv) can now do that it couldn't back when they gave up on pip. Which in turn has become possible because of several ecosystem standards (defined across many PEPs) and increasing awareness and adoption of those standards.

The "install things that have complex non-Python dependencies using pip" story is much better than several years ago, because of things like pip gaining a new resolver in 2020, but in large part simply because it's now much more likely that the package you want offers a pre-built wheel (and that its dependencies also do). A decade ago, it was common enough that you'd be stuck with source packages even for pure-Python projects, which forced pip to build a wheel locally first (https://pradyunsg.me/blog/2022/12/31/wheels-are-faster-pure-...).

Another important change is that for wheels on PyPI the installer can now obtain separate .metadata files, so it can learn what the transitive dependencies are for a given version of a given project from a small plain-text file rather than having to speculatively download the entire wheel and unpack the METADATA file from it. (This is also possible for source distributions that include PKG-INFO, but they aren't forced to do so, and a source distribution's metadata is allowed to have "dynamic" dependencies that aren't known until the wheel is built (worst case) or a special metadata-only build hook is run (requires additional effort for the build system to support and the developer to implement)).
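The METADATA file described above is also what ends up on disk for every installed distribution, and the standard library can read it directly. A small illustration (using `pip` purely as a convenient example of an installed distribution):

```python
from importlib.metadata import metadata, requires

# Read the METADATA file of an installed distribution. This is the same
# metadata that PyPI now serves as a standalone .metadata file, letting
# resolvers learn a wheel's dependencies without downloading the wheel.
meta = metadata("pip")  # "pip" is just an example; any installed dist works
print(meta["Name"], meta["Version"])
print(requires("pip"))  # dependency specifiers, or None if there are none
```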

verdverm•1h ago
For sure, we see the same thing in the JS ecosystem. New tooling adds some feature, other options implement feature, convergence to a larger common set.

I'm still mostly on poetry

KaiserPro•1h ago
Type annotations that are useful.

Currently they are a bit pointless. Sure, they aid documentation, but they take effort and cause you pain when making modifications (mind you, with half-arsed agentic coding it's probably less of a problem).

What would be better is a strict mode where, instead of duck typing, types are pre-declared. It would also make a bunch of things faster (along with breaking everything and the spirit of the language).

I still don't get the appeal of UV, but that's possibly because I'm old and have been using pyenv and venv for many many years. This means that anything new is an attack on my very being.

However, if it means that conda fucks off and dies, then I'm willing to move to UV.

KK7NIL•24m ago
You can get pretty darn close to static typing by using ty (from the same team as uv).

I've been using it professionally and it's been a big improvement for code quality.

hollow-moe•2h ago
curl|sh and iwr|iex chill my spine; no one should recommend these methods of installation in 2025. I'm against closed computers, but I'm also against reckless installs. Even without the security concerns, these ways of installing tend to put files in random places, making them hard to manage and clean up.
01HNNWZ0MV43FF•1h ago
Maybe there will be a .deb one day
mystifyingpoi•1h ago
That doesn't fix the core issue. You can put anything inside a .deb file, even preinstall script can send your ~/.aws/credentials to China. The core concern is getting a package that's verified by a volunteer human to not contain anything malicious, and then getting that package into Debian repository or equivalent.
chasd00•1h ago
can't you just do curl|more and then view what it's going to do? Then, once you're convinced, go back to curl|sh.

/just guessing, haven't tried it

threeducks•1h ago
A malicious server could detect whether the user is actually running "curl | sh" instead of just "curl" and only serve a malicious shell script when the code is executed blindly. See this thread for reference: https://news.ycombinator.com/item?id=17636032
chasd00•1h ago
well you still have to execute the shell script at some point. You could do curl > install.sh, open it up to inspect, and then run the install script which would still trigger the callback to the server mentioned in the link you posted. I guess it's really up to the user to decide what programs to run and not run.
mystifyingpoi•1h ago
While I do share the sentiment, I firmly believe that for opensource, no one should require the author to distribute their software, or even ask them to provide os-specific installation methods. They wrote it for free, use it or don't. They provide a handy install script - don't like it? sure, grab the source and build it yourself. Oops, you don't know what the software does? Gotta read every line of it, right?

Maybe if you trust the software, then trusting the install script isn't that big of a stretch?

WorldMaker•1h ago
For small project open source with a CLI audience, why bother with an install script at all and not just provide tarballs/ZIP files and assume that the CLI audience is smart enough to untarball/unzip it to somewhere on their PATH?

Also, many of the "distribution" tools like brew, scoop, winget, and more are just "PR a YAML file with your zip file URL, the name of your EXE to add to PATH, and a checksum hash of the zip to this git repository". We're at about the minimum-effort point in software history for producing a "distribution", so it seems interesting that shell scripts to install things have picked up instead.

rieogoigr•8m ago
Part of writing software is writing a way to deploy that software to a computer. Piping a web URL into a bash interpreter is not good enough. If that's the best installer you can do, the rest of your code is probably trash.
jampekka•1h ago
Installing an out-of-distro deb/rpm/msi/dmg/etc. package is just as unsafe as curl|sh. Or even less safe, as packages tend to require root/admin.
procaryote•1h ago
A package is at least a signable, checksummable artefact. The curl | sh thing could have been anything and after running it you have no record of what it was you did.

There have also been PoCs on serving malicious content only when piped to sh rather than saved to file.

If you want to execute shell code from the internet, at the very least store it in a file first and store that file somewhere persistent before executing it. It will make forensics easier

nikisweeting•58m ago
Security and auditability is not the core problem, it's versioning and uninstalling. https://docs.sweeting.me/s/against-curl-sh
jampekka•44m ago
Uninstalling can be a problem.

Versioning OTOH is often more problematic with distro package managers that can't support multiple versions of the same package.

Also inability to do user install is a big problem with distro managers.

WorldMaker•1h ago
That iwr|iex example is especially egregious because it hardcodes the PowerShell <7.0 EXE name to include `-ExecutionPolicy Bypass`. So it'll fail on Linux or macOS, but more importantly iwr|iex is already an execution bypass, so including a second one seems a red flag to me. (What else is it downloading?)

Also, most reasonable developers should already be running with the ExecutionPolicy RemoteSigned; it would be nice if code-signing these install scripts were a little more common, too. (There was even a proposal for icm [Invoke-Command] to take signed script URLs directly, for a much safer code-golfed alternative to iwr|iex. Maybe that proposal should be picked back up.)

rieogoigr•9m ago
For real. You want to pipe a random URL into my bash interpreter to install?

No. That's how you get malware. Make a package. Add it to a distro. Then we will talk.

LeoPanthera•1h ago
For single-file Python scripts, which 99% of mine seem to be, you can simplify your life immensely by just putting this at the top of the script:

  #!/usr/bin/env -S uv run --script
  # /// script
  # requires-python = ">=3.11"
  # dependencies = [ "modules", "here" ]
  # ///
The script now works like a standalone executable, and uv will magically install and use the specified modules.
d4mi3n•1h ago
If I were to put on my security hat, things like this give me shivers. It's one thing if you control the script and specified the dependencies. For any other use-case, you're trusting the script author to not install python dependencies that could be hiding all manner of defects or malicious intent.

This isn't a knock against UV, but more a criticism of dynamic dependency resolution. I'd feel much better about this if UV had a way to whitelist specific dependencies/dependency versions.

chatmasta•1h ago
If you’re executing a script from an untrusted source, you should be examining it anyway. If it fails to execute because you haven’t installed the correct dependencies, that’s an inconvenience, not a lucky security benefit. You can write a reverse shell in Python with no dependencies and just a few lines of code.
maccard•1h ago
If that’s your concern you should be auditing the script and the dependencies anyway, whether they’re in a lock file or in the script. It’s just as easy to put malicious stuff in a requirements.txt
p_l•1h ago
uv can still be redirected to private PyPi mirror, which should be mandatory from security and reliability perspective anyway.
theamk•53m ago
Is there anything new that uv gives you here though?

If you don't care about being ecosystem-compliant (and I am sure malware does not), it's only a few lines of Python to download the code and eval it.

renewiltord•8m ago
This is true. In fact, if the shebang reads `#!/usr/bin/env python3` I can be absolutely sure that the lines:

    import shutil
    shutil.rmtree('/')
aren't in the file so I don't need to read the code. I only read the code when there are dependencies. This is because I have my security hat on that sorted me into the retard house.
kardos•1h ago
> uv will magically install and use the specified modules.

As long as you have internet access, and whatever repository it's drawing from is online, and you may get a different version of Python each time, ...

maccard•1h ago
If I download python project from someone on the same network as me and they have it written in a different python version to me and a requirements.txt I need all those things anyway.
85392_school•1h ago
You can constrain Python version: https://peps.python.org/pep-0723/#:~:text=requires-python
dragonwriter•1h ago
I mean, if you use == constraints instead of >= you can avoid getting different versions, and if you’ve used it (or other things which combined have a superset of the requirements) you might have everything locally in your uv cache, too.

But, yes, python scripts with in-script dependencies plus uv to run them doesn't change dependency distribution, just streamlines use compared to manual setup of a venv per script.

tclancy•58m ago
And electricity and running water and oh the inconvenience. How is this worse than getting a script file that expects you to install modules?
moleperson•1h ago
Why is the ‘-S’ argument to ‘env’ needed? Based on the man page it doesn’t appear to be doing anything useful here, and in practice it doesn’t either.
zahlman•1h ago
> Based on the man page it doesn’t appear to be doing anything useful here

The man page tells me:

  -S, --split-string=S
         process and split S into separate arguments; used to pass multi‐
         ple arguments on shebang lines
Without that, the system may try to treat the entirety of "uv run --script" as the program name, and fail to find it. Depending on your env implementation and/or your shell, this may not be needed.

See also: https://unix.stackexchange.com/questions/361794
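A quick way to see the splitting in action, using a throwaway script at a hypothetical path and `python3 -B` as a stand-in for `uv run --script` so nothing extra needs to be installed:

```shell
# Without -S, env would look for a program literally named "python3 -B"
# and fail; with -S the string is split into "python3" plus "-B".
cat > /tmp/env_split_demo.py <<'EOF'
#!/usr/bin/env -S python3 -B
print("split ok")
EOF
chmod +x /tmp/env_split_demo.py
/tmp/env_split_demo.py
```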

moleperson•1h ago
Right, I didn’t think about the shebang case being different. Thanks!
Rogach•1h ago
Without -S, `uv run --script` would be treated as a binary name (including spaces) and you will get an error like "env: ‘uv run --script’: No such file or directory".

-S causes the string to be split on spaces and so the arguments are passed correctly.

zahlman•1h ago
As long as your `/usr/bin/env` supports `-S`, yes.

It will install and use distribution packages, to use PyPA's terminology; the term "module" generally refers to a component of an import package. Which is to say: the names you write here must be the names that you would use in a `uv pip install` command, not the names you `import` in the code, although they may align.

This is an ecosystem standard (https://peps.python.org/pep-0723/) and pipx (https://pipx.pypa.io) also supports it.

hugmynutus•34m ago
> As long as your

Linux coreutils has supported this since 2018 (coreutils 8.30); amusingly, it is the same release that added `cp --reflink`. AFAIK you have to opt out by having `POSIXLY_CORRECT=1` or `POSIX_ME_HARDER=1` or `--pedantic` set in your environment. [1]

FreeBSD has supported this since 2008.

macOS has basically always supported this.

---

1. Amusingly, despite `POSIX_ME_HARDER` not being official, a large swath of coreutils supports it. https://www.gnu.org/prep/standards/html_node/Non_002dGNU-Sta...

XorNot•41m ago
I use this but I hate it.

I want to be able to ship a bundle which needs zero network access to run, but will run.

It is still frustratingly difficult to make portable Python programs.

globular-toast•39m ago
You can get uv to generate this and add dependencies to it, rather than writing it yourself.
runningmike•1h ago
Seems like a commercial blog. And IMHO hatch is better from a FOSS perspective.

uv means getting more strings attached to VC-funded companies and leaning on their infrastructure. This is a high risk for any FOSS community, and history tells us how this ends….

maccard•1h ago
You say this on a message board run by a VC about a programming language that is primarily developed by meta, google and co.

uv is MIT licensed so if they rug pull, you can fork.

kyt•1h ago
I must be the odd man out but I am not a fan of uv.

1. It tries to do too many things. Please just do one thing and do it well. It's simultaneously trying to replace pip, pyenv, virtualenv, and ruff in one command.

2. You end up needing to use `uv pip` so it's not even a full replacement for pip.

3. It does not play well with Docker.

4. It adds more complexity. You end up needing to understand all of these new environmental variables: `UV_TOOL_BIN_DIR`, `UV_SYSTEM_PYTHON`, `UV_LINK_MODE`, etc.

defraudbah•1h ago
Yeah, I've moved away from it too, but it's a great tool. The rush of Rust tools is the best thing that happened to Python in the last decade.
leblancfg•1h ago
uv's pip interface is like dipping one toe in the bathtub. Take a minute and try on the full managed interface instead: https://docs.astral.sh/uv/concepts/projects/dependencies. Your commands then become:

- uv add <package_name>

- uv sync

- uv run <command>

Feels very ergonomic, I don't need to think much, and it's so much faster.

daedrdev•1h ago
I mean I’ve had quite awful bugs from using pip pyenv and venv at the same time
chatmasta•1h ago
What problems do you encounter using it with Docker?
xmprt•1h ago
Your implication is that pyenv, virtualenv, and pip should be 3 different tools. But for the average developer, these tools are all related to managing the python environment and versions which in my head sounds like one thing. Other languages don't have 3 different tools for this.

pip and virtualenv also add a ton of complexity and when they break (which happens quite often) debugging it is even harder despite them being "battle tested" tools.

nicce•1h ago
Python versions and environments can be solved at a more reliable abstraction level as well, e.g. if you are a heavy Nix user.
throwaway894345•1h ago
On the other hand, Nix and Bazel and friends are a lot of pain. I'm sure the tradeoff makes sense in a lot of situations, but not needing to bring in Nix or Bazel just to manage dependencies is a pretty big boon. It would be great to see some of the all-in-one build tools become more usable though. Maybe one day it will seem insane that every language ecosystem has its own build tool because there's some all-in-one tool that is just as easy to use as `(car)go build`!
eisbaw•21m ago
> Maybe one day it will seem insane that every language ecosystem has its own build tool because there's some all-in-one tool that is just as easy to use as `(car)go build`!

Yep: Nix

331c8c71•16m ago
Well Nix is the only sane way I know to manage fully reproducible envs that incorporate programs/scripts spanning multiple ecosystems. Very common situation in applied data analysis.
throwaway894345•1h ago
Yeah, I agree. In particular it seems insane to me that virtualenv should have to exist. I can't see any valid use case for a machine-global pool of dependencies. Why would anyone think it should be a separate tool rather than just the obvious thing that a dependency manager does? I say this as someone with nearly 20 years of Python experience.

It's the same sort of deal with pyenv--the Python version is itself a dependency of most libraries, so it's a little silly to have a dependency manager that only manages some dependencies.

zahlman•59m ago
I, too, have ~20 years of Python experience.

`virtualenv` is a heavy-duty third-party library that adds functionality to the standard library venv. Or rather, venv was created as a subset of virtualenv in Python 3.3, and the projects have diverged since.

The standard library `venv` provides "obvious thing that a dependency manager does" functionality, so that every dependency manager has the opportunity to use it, and so that developers can also choose to work at a lower level. And the virtual-environment standard needs to exist so that Python can know about the pool of dependencies thus stored. Otherwise you would be forced to... depend on the dependency manager to start Python and tell it where its dependency pool is.

Fundamentally, the only things a venv needs are the `pyvenv.cfg` config file, the appropriate folder hierarchy, and some symlinks to Python (stub executables on Windows). All it's doing is providing a place for that "pool of dependencies" to exist, and providing configuration info so that Python can understand the dependency path at startup. The venvs created by the standard library module — and by uv — also provide "activation" scripts to manipulate some environment variables for ease of use; but these are completely unnecessary to making the system work.

Fundamentally, tools like uv create the same kind of virtual environment that the standard library does — because there is only one kind. Uv doesn't bootstrap pip into its environments (since that's slow and would be pointless), but you can equally well disable that with the standard library: `python -m venv --without-pip`.
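To make the "how little a venv actually is" point concrete, here is a sketch using only the standard library (temporary paths, no pip, just like uv's environments):

```python
# Create a bare venv with the standard library (no pip bootstrapped,
# as uv does) and inspect the handful of pieces that make it work.
import sys
import tempfile
import venv
from pathlib import Path

root = Path(tempfile.mkdtemp()) / "demo-venv"
venv.create(root, with_pip=False)  # equivalent to: python -m venv --without-pip

# pyvenv.cfg tells Python at startup where its base interpreter lives.
cfg = (root / "pyvenv.cfg").read_text()
print(cfg)

# The interpreter itself is just a symlink (or a stub .exe on Windows).
bindir = root / ("Scripts" if sys.platform == "win32" else "bin")
print(sorted(p.name for p in bindir.iterdir()))
```

That really is the whole mechanism: a config file, a folder hierarchy, and a pointer back to Python.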

> the Python version is itself a dependency of most libraries

This is a strange way of thinking about it IMO. If you're trying to obtain Python libraries, it's normally because you already have Python, and want to obtain libraries that are compatible with the Python you already have, so that you can write Python code that uses the libraries and works under that Python.

If you're trying to solve the problem of deploying an application to people who don't have Python (or to people who don't understand what Python is), you need another layer of wrapping anyway. You aren't going to get end users to install uv first.

knowitnone3•1h ago
"other languages don't have 3 different tools for this." But other languages DO have 3 different tools so we should do that too!
j2kun•30m ago
I think OP's complaint is rather that using `uv` is leaky: now you need to learn all the underlying stuff AND uv as well.

The alternative, of course, is having Python natively support a combined tool. Which you can support while also not liking `uv` for the above reason.

collinmanderson•1h ago
> 1. It tries to do too many things. Please just do one thing and do it well. It's simultaneously trying to replace pip, pyenv, virtualenv, and ruff in one command.

In my experience it generally does all of those well. Are you running into issues with the uv replacements?

> 2. You end up needing to use `uv pip` so it's not even a full replacement for pip.

What do you end up needing to use `uv pip` for?

eatonphil•1h ago
> 2. You end up needing to use `uv pip` so it's not even a full replacement for pip.

Needing pip and virtualenvs was enough to make me realize uv wasn't what I was looking for. If I still need to manage virtualenvs and call pip I'm just going to do so with both of these directly.

I had been hoping someone would introduce the non-virtualenv package management solution that every single other language has where there's a dependency list and version requirements (including of the language itself) in a manifest file (go.mod, package.json, etc) and everything happens in the context of that directory alone without shell shenanigans.

ellg•1h ago
What are you needing to use `uv pip` for? I don't think I ever call into pip from uv for anything nowadays. I typically just need to do `uv sync` and `uv run`, maybe sometimes `uvx` if I want to run some random 3rd party python script
ivell•1h ago
Pixi is an alternative that you may want to try.
notatallshaw•1h ago
> I had been hoping someone would introduce the non-virtualenv package management solution that every single other language has where there's a dependency list and version requirements (including of the language itself) in a manifest file (go.mod, package.json, etc) and everything happens in the context of that directory alone without shell shenanigans.

Isn't that exactly a pyproject.toml via the uv add/sync/run interface? What is it missing that you need?
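For anyone who hasn't seen one, a minimal manifest of that kind looks roughly like this (the project name and dependency are placeholders, not from this thread):

```toml
[project]
name = "example-app"
version = "0.1.0"
requires-python = ">=3.11"   # the language version is declared here too
dependencies = [
    "requests>=2.31",
]
```

`uv sync` materializes the environment from this file, and `uv run python script.py` executes in its context with no shell activation.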

eatonphil•1h ago
> pyproject.toml

Ah ok I was missing this and this does sound like what I was expecting. Thank you!

og_kalu•1h ago
In most cases, you don't really need to manage virtual envs though? uv commands that need a venv will just create one for you or install to the existing one automatically.
dragonwriter•1h ago
> I had been hoping someone would introduce the non-virtualenv package management solution that every single other language has where there's a dependency list and version requirements (including of the language itself) in a manifest file (go.mod, package.json, etc) and everything happens in the context of that directory alone without shell shenanigans.

If you are using uv, you don’t need to do shell shenanigans, you just use uv run. So I'm not sure how uv with pyproject.toml doesn't meet this description (yes, the venv is still there, it is used exactly as you describe.)

yoavm•37m ago
Really sounds like you're using it wrong, no? I completely forgot about virtualenvs, pip, and requirements.txt once I started using uv.
vindex10•1h ago
I would also add UV_NO_SYNC as something I had to learn. It comes up in combination with `uv pip`.
wtallis•1h ago
What's your use case for UV_NO_SYNC? I assume the option exists for a reason, but aside from maybe a modest performance improvement when working with a massive complex package environment, I'm not sure what problem it solves.
tpl•1h ago
What do you mean it doesn't play well with docker?
nicoco•1h ago
uv pip is a full reimplementation of pip. Way faster, better caching, less disk usage. What's not to like about it?
brikym•1h ago
> It tries to do too many things. Please just do one thing and do it well.

I disagree with this principle. Sometimes what I need is a kitset. I don't want to go shopping for things, or browse multiple docs. I just want it taken care of for me. I don't use uv so I don't know if the pieces fit together well but the kitset can work well and so can a la carte.

a_bored_husky•1h ago
> 1. It tries to do too many things. Please just do one thing and do it well. It's simultaneously trying to replace pip, pyenv, virtualenv, and ruff in one command.

I think there are more cases where pip, pyenv, and virtualenv are used together than not. It makes sense to bundle the features of the three into one. uv does not replace ruff.

> 2. You end up needing to use `uv pip` so it's not even a full replacement for pip.

uv pip is there for compatibility and to facilitate migration, but once you are fully on the uv workflow you rarely, if ever, need `uv pip`

> 3. It does not play well with Docker.

In what sense?

> 4. It adds more complexity. You end up needing to understand all of these new environment variables: `UV_TOOL_BIN_DIR`, `UV_SYSTEM_PYTHON`, `UV_LINK_MODE`, etc.

You don't need to touch them at all

dragonwriter•1h ago
> It tries to do too many things. Please just do one thing and do it well. It's simultaneously trying to replace pip, pyenv, virtualenv, and ruff in one command.

uv doesn’t try to replace ruff.

> You end up needing to use `uv pip` so it's not even a full replacement for pip.

"uv pip" doesn't use pip, it provides a low-level pip-compatible interface for uv, so it is, in fact, still uv replacing pip, with the speed and other advantages of uv when using that interface.

Also, while I’ve used uv pip and uv venv as part of familiarizing myself with the tool, I’ve never run into a situation where I need either of those low-level interfaces rather than the normal high-level interface.

> It does not play well with Docker.

How so?

pityJuke•1h ago
There is an optional & experimental code formatting tool within uv (that just downloads ruff), which is what OP may be referring to: https://pydevtools.com/blog/uv-format-code-formatting-comes-...
j45•1h ago
It's still one tool to orchestrate and run everything, which is preferable to many.
TYPE_FASTER•1h ago
Yeah, I'm with you. I'm forcing myself to learn it because it looks like that's the way PyWorld is going. I don't dislike uv as much as poetry. But I guess I never really ran into issues using pyenv and pip. shrug Maybe I wasn't working on complex enough projects.
groby_b•1h ago
> You end up needing to use `uv pip` so it's not even a full replacement for pip.

No you don't. That's just a set of compatibility approaches for people who can't let go of pip/venv. Move to uv/PEP723, world's your oyster.
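For reference, the PEP 723 route means a single self-describing file. A sketch (the script body is deliberately trivial, and the version bound is illustrative):

```python
# /// script
# requires-python = ">=3.9"
# dependencies = []
# ///
# `uv run thisfile.py` reads the commented metadata block above,
# provisions a matching interpreter and environment, and then runs the
# script -- no requirements.txt and no manually managed venv.
import sys

print(f"running under Python {sys.version_info.major}.{sys.version_info.minor}")
```

Third-party packages would go in the `dependencies` list and get resolved on the fly at run time.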

> It does not play well with Docker.

Huh? I use uv both during container build and container runtime, and it works just fine?

> You end up needing to understand all of these new environment variables

Not encountered the need for any of these yet. Your comments on uv are so far out of line of all the uses I've seen, I'd love to hear what you're specifically doing that these become breaking points.

dsnr•1h ago
This. I was researching uv to replace my pipenv+pyenv setup, but after reading up a bit I decided to just give up. Pipenv is straightforward and “just works”. Aside from being slow, not much is wrong with it. I’m not in the mood to start configuring uv, a tool that should take me 2 minutes and a `uv --help` to learn.
robertfw•50m ago
Slow doesn't really begin to do it justice: I'd have to wait >5 minutes for pipenv to finish figuring out our lock file. uv does it in less than a second.
scuff3d•1h ago
If your pyproject.toml is setup properly you shouldn't need to use `uv pip` at all.

I'm using uv in two dozen containers with no issues at all. So not sure what you mean that it doesn't play well with Docker.

tclancy•1h ago
So I have been doing Python for far too long and have all sort of tooling I've accreted to make Python work well for me across projects and computers and I never quite made the leap to Poetry and was suspicious of uv.

Happened to buy a new machine and decided to jump in the deep end, and it's been glorious. I think the difference between your comment (and others in this chain) and my experience is that you're trying to make uv fit how you have done things. Jumping all the way in, I just . . . never needed virtualenvs. Don't really think about them once I sorted out a mistake I was making. uv init and you're pretty much there.

>You end up needing to use `uv pip` so it's not even a full replacement for pip

The only time I've used uv pip is on a project at work that isn't a uv-powered project. uv add should be doing what you need and it really fights you if you're trying to add something to global because it assumes that's an accident, which it probably is (but you can drop back to uv pip for that).

>`UV_TOOL_BIN_DIR`, `UV_SYSTEM_PYTHON`, `UV_LINK_MODE`, etc.

I've been using it for six months and didn't know those existed. I would suggest this is a symptom of trying to make it be what you're used to. I would also gently suggest those of us who have decades of Python experience may have a bit of Stockholm Syndrome around package management, packaging, etc.

Narushia•1h ago
uv has played well with Docker in my experience, from dev containers to CI/CD to production image builds. Would be interested to hear what is not working for you.

The uv docs even have a whole page dedicated to Docker; you should definitely check that out if you haven't already: https://docs.astral.sh/uv/guides/integration/docker/

aerhardt•1h ago
I'm surprised by how much I prefer prepending "uv" to everything instead of activating environments - which is still naturally an option if that's what floats your boat.

I also like how you can manage Python versions very easily with it. Everything feels very "batteries-included" and yet local to the project.

I still haven't used it long enough to tell whether it avoids the inevitable bi-yearly "debug a Python environment day" but it's shown enough promise to adopt it as a standard in all my new projects.

bobsomers•1h ago
Personally, I prefer prepending `uv` to my commands because they're more stateless that way. I don't need to remember which terminal my environment is sourced in, and when copying and pasting commands to people I don't need to worry about what state their terminal is in. It just works.
zahlman•1h ago
> how much I prefer prepending "uv" to everything instead of activating environments

You can also prepend the path to the virtual environment's bin/ (or Scripts/ on Windows). Literally all that "activating an environment" does is to manipulate a few environment variables. Generally, it puts the aforementioned directory on the path, sets $VIRTUAL_ENV to the venv root, configures the prompt (on my system that means modifying $PS1) as a reminder, and sets up whatever's necessary to undo the changes (on my system that means defining a "deactivate" function; others may have a separate explicit script for that).
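As a concrete demonstration (a sketch; standard library only, temporary paths): a venv's interpreter can be invoked by path with no activation at all, and it still resolves its own `sys.prefix`:

```python
import subprocess
import sys
import tempfile
import venv
from pathlib import Path

root = Path(tempfile.mkdtemp()) / "venv"
venv.create(root, with_pip=False)
py = root / ("Scripts/python.exe" if sys.platform == "win32" else "bin/python")

# No PATH manipulation, no $VIRTUAL_ENV: at startup the interpreter
# finds pyvenv.cfg relative to its own executable path.
result = subprocess.run(
    [str(py), "-c", "import sys; print(sys.prefix)"],
    capture_output=True, text=True, check=True,
)
print(result.stdout.strip())
```

The printed prefix is the venv root, exactly as if the environment had been "activated" first.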

I personally don't like the automatic detection of venvs, or the pressure to put them in a specific place relative to the project root.

> I also like how you can manage Python versions very easily with it.

I still don't understand why people value this so highly, but so it goes.

> the inevitable bi-yearly "debug a Python environment day"

If you're getting this because you have venvs based off the system Python and you upgrade the system Python, then no, uv can't do anything about that. Venvs aren't really designed to be relocated or to have their underlying Python modified. But uv will make it much faster to re-create the environment, and most likely that will be the practical solution for you.

lelandbatey•1h ago
I agree; since learning (early in my programming journey) what the PATH is as a concept, I've never had an environment problem.

However, I also think many people, even many programmers, basically consider such external state "too confusing" and also don't know how they'd debug such a thing. Which I think is a shame since once you see that it's pretty simple it becomes a tool you can use everywhere. But given that people DON'T want to debug such, I can understand them liking a tool like uv.

I do think automatic compiler/interpreter version management is a pretty killer feature though; that's really annoying otherwise, afaict, mostly because getting non-system-wide installs typically seems to require compiling yourself.

biimugan•48m ago
Yup. I never even use activate, even though that's what you find in docs all over the place. Something about modifying my environment rubs me the wrong way. I just call ``./venv/bin/python driver.py`` (or ``./venv/bin/driver`` if you install it as a script) which is fairly self-evident, doesn't mess with your environment, and you can call into as many virtualenvs as you need to independently from one another.

``uv`` accomplishes the same thing, but it is another dependency you need to install. In some envs it's nice that you can do everything with the built-in Python tooling.

j45•1h ago
This isn't a comment just about Python... but it should just work. There shouldn't be constant ceremony for getting and keeping environments running.
oblio•1h ago
There are basically 0 other programming languages that use the "directory/shell integration activated virtual environment", outside of Python.

How does the rest of the world manage to survive without venvs? Config files in the directory. Shocking, really :-)))

roflyear•34m ago
what happens when you have two projects using different versions of node, etc? isn't that a massive headache?

not that it's great to start with, but it does happen, no?

whywhywhywhy•31m ago
The only word in the `source .venv/bin/activate` command that isn't a complete red flag that this was the wrong approach is probably bin. Everything else is so obviously wrong.

source - why are we using an OS level command to activate a programming language's environment

.venv - why is this hidden anyway, doesn't that just make it more confusing for people coming to the language

activate - why is this the most generic name possible as if no other element in a system might need to be called the activate command over something as far down the chain as a python environment

Feels dirty every time I've had to type it out and find it particularly annoying when Python is pushed so much as a good first language and I see people paid at a senior level not understand this command.

zahlman•14m ago
> why are we using an OS level command to activate a programming language's environment

Because "activating an environment" means setting environment variables in the parent process (the shell that you use to run the command), which is otherwise impossible on Linux (see for example https://stackoverflow.com/questions/6943208).

> why is this hidden anyway, doesn't that just make it more confusing for people coming to the language

It doesn't have to be. You can call it anything you want, hidden or not, and you can put it anywhere in the filesystem. It so happens that many people adopted this convention because they liked having the venv in that location and hidden; and uv gives such venvs special handling (discovering and using them by default).

> why is this the most generic name possible as if no other element in a system might need to be called the activate command over something as far down the chain as a python environment

Because the entire point is that, when you need to activate the environment, the folder in question is not on the path (the purpose of the script is to put it on the path!).

If activating virtual environments shadows e.g. /usr/bin/activate on your system (because the added path will be earlier in $PATH), you can still access that with a full absolute path; or you can forgo activation and do things like `.venv/bin/python -m foo`, `.venv/bin/my-program-wrapper`, etc.

> Feels dirty every time I've had to type it out

I use this:

  $ type activate-local 
  activate-local is aliased to `source .local/.venv/bin/activate'
Notice that, again, you don't have to put it at `.venv`. I use a .local folder to store notes that I don't want to publish in my repo nor mention in my project's .gitignore; it in turn has

  $ cat .local/.gitignore 
  # Anything found in this subdirectory will be ignored by Git.
  # This is a convenient place to put unversioned files relevant to your
  # working copy, without leaving any trace in the commit history.
  *
> and I see people paid at a senior level not understand this command.

If you know anyone who's hiring....

zahlman•26m ago
> Config files in the directory.

The problem is, that would require support from the Python runtime itself (so that `sys.path` can be properly configured at startup) and it would have to be done in a way that doesn't degrade the experience for people who aren't using a proper "project" setup.

One of the big selling points of Python is that you can just create a .py file anywhere, willy-nilly, and execute the code with a Python interpreter, just as you would with e.g. a Bash script. And that you can incrementally build up from there, as you start out learning programming, to get a sense of importing files, and then creating meaningful "projects", and then thinking about packaging and distribution.

sirfz•1h ago
I use mise with uv to automatically activate a project's venv but prefixing is still useful sometimes since it would trigger a sync in case you forgot to do it.
globular-toast•37m ago
One of the key tenets of uv is that virtualenvs should be disposable. So, barring any bugs in uv, there should never be any environment debugging. Worst case, just delete .venv and continue as normal.
atonse•1h ago
These Rust-based tools really change the idea of what's possible (when you can get feedback in milliseconds). But I'm trying to figure out what Astral as a company does for revenue. I don't see any paid products on their website. They even have investors.

So far it seems like they have a bunch of these high performance tools. Is this part of an upcoming product suite for python or something? Just curious. I'm not a full-time python developer.

tabletcorry•1h ago
Take a look at their upcoming product Pyx to see where revenue can start to come in for paid/hosted services.

https://astral.sh/pyx

bruckie•1h ago
From "So how does Astral plan to make money? " (https://news.ycombinator.com/item?id=44358216):

"What I want to do is build software that vertically integrates with our open source tools, and sell that software to companies that are already using Ruff, uv, etc. Alternatives to things that companies already pay for today. An example of what this might look like [...] would be something like an enterprise-focused private package registry."

There's also this interview with Charlie Marsh (Astral founder): https://timclicks.dev/podcast/supercharging-python-tooling-a... (specifically the "Building a commercial company with venture capital" section)

throwway120385•1h ago
That doesn't really seem like a way to avoid getting "Broadcommed." Vertically integrated tooling is kind of a commodity.
dark__paladin•1h ago
Genuinely trying to learn here - what's the major advantage of using uv over conda?

(Transparently, I'm posting this before I've completed the article.)

collinmanderson•1h ago
uv is unbelievably fast.
ethmarks•1h ago
They have different use cases. uv is meant to be the singular tool for managing Python packages and dependencies, replacing pip, virtualenv, and pip-tools. Conda is for more general-purpose environment management, not just Python. If you're doing something with Node or R, uv won't work at all because it's only for Python.

uv's biggest advantage is speed. It claims a 10-100x performance speedup over pip and Conda [1]. uv can also manage python versions and supports using Python scripts as executables via inline dependencies [2].

But Conda is better for non-Python usage and is more mature, especially for data science related uses.

[1]: https://github.com/astral-sh/uv/blob/main/BENCHMARKS.md [2]: https://docs.astral.sh/uv/#scripts

seabrookmx•1h ago
Can't agree more. We were using pyenv+poetry before and regularly had to pin our poetry version to a specific one, because new poetry releases would stall trying to resolve dependencies.

pyenv was problematic because you needed the right concoction of system packages to ensure it compiled python with the right features, and we have a mix of MacOS and Linux devs so this was often non-trivial.

uv is much faster than both of these tools, has a more ergonomic CLI, and solves both of the issues I just mentioned.

I'm hoping astral's type checker is suitably good once released, because we're on mypy right now and it's a constant source of frustration (slow and buggy).

kardos•1h ago
> because new poetry releases would stall trying to resolve dependencies.

> uv is much faster than both of these tools

conda is also (in)famous for being slow at this, although the new mamba solver is much faster. What does uv do in order to resolve dependencies much faster?

collinmanderson•1h ago
> What does uv do in order to resolve dependencies much faster?

- Representing version numbers as a single integer for fast comparison.

- Being implemented in Rust rather than Python (compared to Poetry).

- Parallel downloads.

- Caching individual files rather than zipped wheels, so installation is just hard-linking files, zero copy (on Unix at least). This also makes it very storage-efficient.
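The hard-link point above can be sketched with the standard library (temporary paths; this only works when the cache and the target directory sit on the same filesystem):

```python
import os
import tempfile
from pathlib import Path

base = Path(tempfile.mkdtemp())
cache = base / "cache"
site = base / "site-packages"
cache.mkdir()
site.mkdir()

# One copy of the file lives in the cache...
payload = b"# cached module contents\n" * 100
(cache / "module.py").write_bytes(payload)

# ..."installing" it is just a new directory entry for the same inode:
os.link(cache / "module.py", site / "module.py")

same_inode = os.stat(cache / "module.py").st_ino == os.stat(site / "module.py").st_ino
print(same_inode)  # no file bytes were copied
```

Install N projects that share a dependency and you still store its files once.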

asaddhamani•1h ago
I find the python tooling so confusing now. There’s pip, virtualenv, pipx, uv, probably half a dozen others I’m missing. I like node, npm isolates by default, npx is easy to understand, and the ecosystem is much less fragmented. I see a python app on GitHub and they’re all listing different package management tools. Reminds me of that competing standards xkcd.
tabletcorry•1h ago
Node has at least bun, and probably other tools, that attempt to speed things up in similar ways. New tooling is always coming for our languages of choice, even if we aren't paying attention.
theultdev•1h ago
well there's npm, pnpm, yarn, bun package managers

not a Python developer, so not sure it's equivalent, as the npm registry is shared between all of them.

collinmanderson•1h ago
> There’s pip, virtualenv, pipx, uv, probably half a dozen others I’m missing...

> Reminds me of that competing standards xkcd.

Yes, for years I've sat on the sidelines avoiding the fragmented Poetry, pyenv, pipenv, pipx, pip-tools/pip-compile, rye, etc., but uv does now finally seem to be the all-in-one solution, succeeding where other tools have failed.

sdairs•1h ago
Everything from the astral team has been superb, I don't want to use Python without ruff & uv. Yet to try "ty", anyone used it?
tabletcorry•1h ago
Ty is still under very active development, so it either works or very much doesn't. I run it occasionally to see if it works on my codebases, and while it is getting closer, it isn't quite there yet.

Definitely lightyears faster than mypy though.

sph•1h ago
There is something hilarious about using a project/package manager written in another language.
philipallstar•1h ago
Wait til you find out what CPython is written in.
srameshc•1h ago
I am still learning, and I have the same feeling as someone who doesn't consider himself good with Python. At least I can keep my venv under control now, which is all I can say for the uv approach.
dec0dedab0de•1h ago
I don't like that it defaults to putting the virtual environment right there, I much prefer how pipenv does it with a shared one in the users home directory, but it's a small price to pay for how fast it is.
dekhn•1h ago
I hadn't paid any attention to rust before uv, but since starting to use uv, I've switched a lot of my performance-sensitive code dev to rust (with interfaces to python). These sorts of improvements really do improve my quality of life significantly.

My hope is that conda goes away completely. I run an ML cluster and we have multi-gigabyte conda directories and researchers who can't reproduce anything because just touching an env breaks the world.

gostsamo•1h ago
As far as I get it, conda is still around because uv is focused on python while conda handles things written in other languages. Unless uv gets much more universal than expected, conda is here to stay.
tempay•1h ago
There is also pixi (which uses uv for the python side of things) which feels like uv for conda.
embe42•1h ago
You might be interested in pixi, which is roughly to conda as uv is to pip (also written in Rust, it reuses the uv solver for PyPI packages)
th0ma5•1h ago
This is something that uv advocates should pay attention to; there are always contexts that need different assumptions, especially with our ever-growing and complex pile of libraries and systems.
kardos•1h ago
It would be nice indeed if there was a good solution to multi-gigabyte conda directories. Conda has been reproducible in my experience with pinned dependencies in the environment YAML... slow to build, sure, but reproducible.
PaulHoule•1h ago
I'd argue bzip compression was a mistake for Conda. There was a time when I had Conda packages made for the CUDA libraries so conda could locally install the right version of CUDA for every project, but boy it took forever for Conda to unpack 100MB+ packages.
kardos•59m ago
It seems they are using zstd now for .conda packages, i.e., bzip2 is obsoleted, so that should be faster.
whimsicalism•1h ago
I work professionally in ML and have not had to touch conda in the last 7 years. In an ML cluster, it is hopefully containerized and there is no need for that?
BoredPositron•1h ago
It's still used in edu and research. Haven't seen it in working environments in quite some time as well.
savin-goyal•1h ago
the topic of managing large dependency chains for ML/AI workloads in a reproducible manner has been a deep rabbit hole for us. if you are curious, here is some of the work in the open domain:

https://docs.metaflow.org/scaling/dependencies

https://outerbounds.com/blog/containerize-with-fast-bakery

jvanderbot•1h ago
Obligatory: it's not only Rust that would be faster than Python, though Rust definitely makes it easy with Cargo. Go, C, and C++ would all exhibit the performance you are seeing in uv, had it been written in one of those languages.

The curmudgeon in me feels the need to point out that fast, lightweight software has always been possible, it's just becoming easier now with package managers.

zem•1h ago
I think ruff is the best thing to happen to the python ecosystem in a decade, it really sold the entire community on the difference fast native tooling could make.
pixelpoet•1h ago
Wait until they fully embrace the benefits of strong typing :)
psunavy03•1h ago
I have one problem with uv as of now, and it's more of an annoyance. It doesn't seem to understand the concept of >= when it's trying to resolve a local wheel I built and use. If I have 6.4.1 published on GitLab and the pyproject says $WHEEL_NAME>=6.2.0, it still goes to look for 6.2.0 (which I deleted) and errors out.
hglaser•1h ago
Am I the only one who feels like this is obviated by Docker?

uv is a clear improvement over pip and venv, for sure.

But I do everything in dev containers these days. Very few things get to install on my laptop itself outside a container. I've gotten so used to this that tools that uninstall/install packages on my box on the fly give me the heebie-jeebies.

collinmanderson•1h ago
uv can be used to speed up building containers.
czbond•1h ago
> I do everything in dev containers these days. Very few things get to install on my laptop itself outside a container.

Yes, it was the NPM supply chain issues that really forced this on me. Now I install, fetch, and build in an interactive Docker container

zahlman•44m ago
Lots of people are doing things where they would prefer not to invoke the weight of an entire container.
dev_l1x_be•1h ago
And Rust is the best thing to happen to CS in a decade
semiinfinitely•1h ago
I had a recent period in my programming career where I started to actually believe that the "worse is better" philosophy is true in practice. It was a dark period and thankfully the existence of tools like uv save me from that abyss.
cyrialize•1h ago
I haven't tried uv yet, but I did use its precursor, rye.

I had to update some messy python code and I was looking for a tool that could handle python versions, package updates, etc. with the least amount of documentation needing be read and troubleshooting.

Rye was that for me! Next time I write python I'm definitely going to use uv.

sirfz•1h ago
Indeed, rye is great, and switching to uv is pretty straightforward. I still think rye's use of shims was pretty cool, but uv's approach is probably more sane
captain_coffee•1h ago
Yes, uv is probably the best thing to happen to the Py ecosystem in the last decade. That is mainly because the rest of the ecosystem is somewhere between garbage fire and mediocre at best. uv in itself is a great tool, I have no complaints about it whatsoever! But we have to remember just how bad the rest of things are and never forget that everything's still in a pretty bad state even after more than 3 ** DECADES ** of constant evolution.
These335•34m ago
Got a specific example in mind for garbage fire and mediocre?
nothrowaways•1h ago
Does speed really matter during python installation?
sunshowers•1h ago
Yes. Technical excellence is a virtue in and of itself.
collinmanderson•1h ago
It's fast enough that sometimes dependencies can be checked and resolved and installed at program runtime rather than it needing to be a separate step.

You can go from no virtual environment, and just "uv run myfile.py" and it does everything that's needed, nearly instantly.

maccard•1h ago
Speed matters everywhere. How much compute is spent on things that could easily be 100x faster than they are? Compare using VMware with pip to run a battery of unit tests with firecracker plus uv. It’s orders of magnitude quicker, and avoids a whole suite of issues related to persistent state on the machine
zahlman•45m ago
On my system, Pip takes noticeable time just to start up without ultimately doing anything of importance:

  $ time pip install
  ERROR: You must give at least one requirement to install (see "pip help install")

  real 0m0.356s
  user 0m0.322s
  sys 0m0.036s
(Huh, that's a slight improvement from before; I guess pip 25.3 is a bit better streamlined.)
magdyks•1h ago
Huge fan of uv and ruff, and starting to play around with ty. Hats off to Astral!
pjmlp•1h ago
Using Python on and off for OS scripting since version 1.6.

It has always been enough to place installations in separate directories, and use the same bash scripts for environment variables configuration for all these years.

j45•1h ago
uv has definitely helped make python a first class citizen in more ways.
isodev•1h ago
Or is it a corporate grab to gain more influence in the ecosystem? I like the idea, but for profit backing is out of the question. This lesson has been learned countless times.
dcgudeman•1h ago
no, it's a python library, get a grip. Also "This lesson has been learned countless times"? No it hasn't, since when has a package manager developed by a for-profit company hurt the ecosystem?
tootie•1h ago
I've been using uv and am pleased that it is about as useful as Maven was the last time I used it 12 years ago. I'm not really sure why we still need venv.
an_guy•1h ago
All these comments look like advertisement. "uv is better than python!!", "8/10 programmers recommend uv", "I was a terrible programmer before but uv changed my life!!", "uv is fast!!!"
collinmanderson•1h ago
> All these comments look like advertisement. "uv is better than python!!", "8/10 programmers recommend uv", "I was a terrible programmer before but uv changed my life!!", "uv is fast!!!"

Have you tried uv?

an_guy•1h ago
Why would I? Does it offer something that standard python tools don't? Why uv over, let's say, conda?
andy99•57m ago
FWIW I asked the same question last time a uv thread was posted (two weeks ago) - got some legit answers, none that swayed me personally but I can see why people use it. Also lots of inexplicable love for it https://news.ycombinator.com/item?id=45574550
dragonwriter•54m ago
> Does it offer something that standard python tools doesn't?

Other than speed and consolidation, pip, pipx, hatch, virtualenv, and pyenv together roughly do the job (though pyenv itself isn’t a standard python tool.)

> Why uv over, lets say, conda?

Support for Python standard packaging specifications and consequently also easier integration with other tools that leverage them, whether standard or third party.

andy99•1h ago
First time reading one of these threads? It’s a cult, and don’t dare criticize it. I think the same thing used to be true with rust though nobody really talks about it much anymore.

I don’t think people would think twice about the legitimacy (if you want to call it that) of uv except for all the weird fawning over it that happens, as you noticed. It makes it seem more like a religion or something.

taeric•1h ago
I still feel bitten by diving into poetry when starting some projects. Has the ecosystem fully moved on to uv, now? Do they have good influence on what python's main ecosystem is moving to?
collinmanderson•33m ago
> Has the ecosystem fully moved on to uv, now?

It's moving pretty quick.

> Do they have good influence on what python's main ecosystem is moving to?

Yes, they're an early adopter/implementer of the recent pyproject.toml standards.

hardwaregeek•1h ago
I gotta say, I feel pretty vindicated after hearing for years how Python’s tooling was just fine and you should just use virtualenv with pip and how JS must be worse, that when Python devs finally get a taste of npm/cargo/bundler in their ecosystem, they freaking love it. Because yes, npm has its issues but lock files and consistent installs are amazing
WesolyKubeczek•1h ago
I somehow had quite enough problems going from bundler 1.13 to 1.16 to 2.x some years ago. I’m glad we have killed that codebase with fire.
gigatexal•1h ago
the thing is I never had issues with virtual environments -- uv just allows me to easily determine what version of python that venv uses.
j2kun•36m ago
you mean you can't just do `venv/bin/python --version`?
shlomo_z•32m ago
he means "choose", not "check"
icedchai•1h ago
poetry gave us lock files and consistent installs for years. uv is much, much faster however.
beeb•58m ago
I used poetry professionally for a couple of years and hit so many bugs, it was definitely not a smooth experience. Granted that was probably 3-4 years ago.
icedchai•47m ago
I've occasionally run into performance issues and bugs with dependency resolution / updates. Not so much recently, but at a previous company we had a huge monorepo and I've seen it take forever.
rcleveng•51m ago
and pip-compile before that.

Agree that uv is way, way, way faster than any of that, and really just a joy to use in its simplicity.

ShakataGaNai•49m ago
I have to agree that there were a lot of good options, but uv's speed is what sets it apart.

Also the ability to have a single script declare its deps in a TOML header, super easily.

Also Also the ability to use a random python tool in effectively seconds with no faffing about.
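That single-script pattern is PEP 723 inline script metadata. A minimal sketch, assuming uv as the runner (the `rich` dependency and filename are just illustrative; you'd run it as `uv run hello.py`):

```python
# /// script
# requires-python = ">=3.10"
# dependencies = [
#     "rich",  # illustrative third-party dep; uv installs it into a throwaway env
# ]
# ///
"""Self-contained script: `uv run hello.py` reads the comment header above,
provisions a matching Python plus the listed dependencies, then runs the file."""


def greeting(name: str) -> str:
    """Build the message separately so the logic works with or without rich."""
    return f"Hello, {name}!"


if __name__ == "__main__":
    try:
        from rich import print as rich_print  # available when run via uv
        rich_print(f"[bold]{greeting('uv')}[/bold]")
    except ImportError:
        # plain fallback when the script is run without its declared deps
        print(greeting("uv"))
```

No venv to create or activate by hand; the environment is derived from the file itself.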

pydry•1h ago
>finally get a taste of npm

good god no thank you.

>cargo

more like it.

internetter•30m ago
cargo is better than npm, yes, but npm is better than pip (in my experience)
anp•51m ago
Might be worth noting that npm didn’t have lock files for quite a long time, which is the era during which I formed my mental model of npm hell. The popularity of yarn (again importing bundler/cargo-isms) seems like maybe the main reason npm isn’t as bad as it used to be.
kevin_thibedeau•51m ago
> you should just use virtualenv with pip

This is the most insulting take in the ongoing ruination of Python. You used to be able to avoid virtualenvs and install scripts and dependencies directly runnable from any shell. Now you get endlessly chastised for trying to use Python as a general purpose utility. Debian was a bastion of sanity with the split between dist_packages and site_packages but that's ruined now too.

whalesalad•44m ago
it's because so many essential system tools now rely on python, and if you install arbitrary code outside of a venv it can clobber the global namespace and break the core OS' guarantees.

I do agree it is annoying, and what they need to do is just provide an automatic "userspace" virtualenv for anything a user installs themselves... but that is a Pandora's box, tbh. (Do you do it per user? How does the user become aware of this?)

dragonwriter•41m ago
What they needed to do is allow side-by-side installs of different versions of the same distribution package and allow specifying or constraining versions at import time, then you wouldn't have the problem at all.

But that's probably not practical to retrofit given the ecosystem as it is now.

aunderscored•39m ago
pipx solves this perfectly.
zahlman•31m ago
For "applications" (which are distributed on PyPI but include specified entry points for command-line use), yes. For development — installing libraries that your own code will use — you'll still generally need something else (although the restriction is really quite arbitrary).
ElectricalUnion•39m ago
Unless all python dependencies you ever used were available in your distro (and then at that point, you're no longer using pip, you're using dpkg...), this never worked well. What solves this well is PEP 723 and tooling around it.

With PEP 723 and comfortable tooling (like uv), you now get scripts that are "actually directly runnable", not just "fake directly runnable, oops, forgot to apt-get install something, sorta runnable", and that work reliably even when stuff around you is updated.

whywhywhywhy•36m ago
> Python as a general purpose utility

This ideology is what caused all the problems to begin with: base python is built as if it's the only thing in the entire operating system's environment, while its packaging system is built in a way that makes that impossible without manually juggling package conflicts and incompatibilities.

zahlman•35m ago
> You used to be able to avoid virtualenvs and install scripts and dependencies directly runnable from any shell.

This wasn't really the case; in principle anything you installed in the system Python environment, even "at user level", had the potential to pollute that environment and thus interfere with system tools written in Python. And if you did install it at system level, that became files within the environment your system package manager is managing, that it doesn't know how to deal with, because they didn't come from a system package.

But it's worse now because of how many system tools are written in Python — i.e., a mark of Python's success.

Notably, these tools commonly include the system package manager itself. Since you mentioned Debian (actually this is Mint, but ya know):

  $ file `which apt`
  /usr/local/bin/apt: Python script, ASCII text executable
> Now you get endlessly chastised for trying to use Python as a general purpose utility.

No, you don't. Nothing prevents you from running scripts with the system Python that make use of system-provided libraries (including ones that you install later with the system package manager).

If you need something that isn't packaged by your distro, then of course you shouldn't expect your distro to be able to help with it, and of course you should expect to use an environment isolated from the distro's environment. In Python, virtual environments are the method of isolation. All reasonable tooling uses them, including uv.

> Debian was a bastion of sanity with the split between dist_packages and site_packages but that's ruined now too.

It's not "ruined". If you choose to install the system package for pip and to use it with --break-system-packages, the consequences are on you, but you get the legacy behaviour back. And the system packages still put files separately in dist-packages. It's just that... doing this doesn't actually solve all the problems, fundamentally because of how the Python import system works.

noirscape•19m ago
Nowadays pip also defaults to installing to the user's home folder if you don't run it as root.

Basically the only thing missing from pip install being a smooth experience is something like npx to cleanly run modules/binary files that were installed to that directory. It's still futzing with the PATH variable to run those scripts correctly.

chrisweekly•48m ago
Webdev since 1998 here. Tabling the python-vs-JS debate to comment on npm per se: pnpm is better than npm in every way. Strongest possible recommendation to use it instead of npm; it's faster, more efficient, safer, and more deterministic. See https://pnpm.io/motivation
Ant59•38m ago
I've gone all-in on Bun for many of the same reasons. Blazingly fast installs too.

https://bun.sh/

ifwinterco•26m ago
I think at this point everyone on hacker news with even a passing interest in JS has heard of bun, it's promoted relentlessly
nullbyte•29m ago
I find pnpm annoying to type, that's why I don't use it
bdangubic•23m ago
alias it to “p”
globular-toast•42m ago
I've been using pip-tools for the best part of a decade. uv isn't the first time we got lock files. The main difference with uv is how it abstracts away the virtualenv and you run everything using `uv run` instead, like cargo. But you can still activate the virtualenv if you want. At that point the only difference is it's faster.
caconym_•37m ago
There is nothing I dread more within the general context of software development, broadly, than trying to run other people's Python projects. Nothing. It's shocking that it has been so bad for so long.
hardwaregeek•28m ago
Never underestimate cultural momentum I guess. NBA players shot long 2 pointers for decades before people realized 3 > 2. Doctors refused to wash their hands before doing procedures. There’s so many things that seem obvious in retrospect but took a long time to become accepted
RobertoG•10m ago
[delayed]
lynndotpy•6m ago
I was into Python enough that I put it into my username but this is also my experience. I have had quasi-nightmares about just the bog of installing a Python project.
Multicomp•5m ago
I agree with you wholeheartedly. Besides not preferring dynamic programming languages, I would in the past have given python more of a look because of its low barrier to entry... but I have been repulsed by how horrific the development UX story has been and how incredibly painful it is to then distribute the code in a portable-ish way.

uv is making me give python a chance for the first time since the renpy project I did for fun in 2015.

acomjean•3m ago
You aren’t kidding. Especially if it’s some bioinformatics software that is just hanging out there on GitHub older than a year…
mk89•32m ago
I have used

pip freeze > requirements.txt

pip install -r requirements.txt

Way before "official" lockfile existed.

Your requirements.txt becomes a lockfile, as long as you accept not using ranges.

Having this in a single tool, etc., why not, but I don't understand the hype when it was basically already there.

icedchai•28m ago
That works for simple cases. Now, update a transitive dependency used by more than one dependency. You might get lucky and it'll just work.
bdangubic•24m ago
it won’t work of course, no one is that lucky :)
mk89•8m ago
Not sure how uv helps here, because I am not very familiar with it.

With pip you update a dependency: it won't work if the versions aren't compatible, and it will if they are. Not sure where the issue is?

ifwinterco•27m ago
It is indeed fairly simple to implement it, which is why it's so weird that it's never been implemented at a language level
rtpg•25m ago
As a “pip is mostly fine” person: we would direct the result to a new lock file, so you could still have your direct deps, and then pin transitives and update.

Pip's solver could still cause problems in general on changes.

uv having a better solver is nice. Being fast is also nice. Mainly, though, it feeling like a tool that is maintained and can be improved upon without ripping one's hair out is a godsend.

epage•19m ago
Good luck if you need cross-platform `requirements.txt` files.
mk89•4m ago
This is a good use case. Not sure how this is typically solved, I guess "requirements-os-version.txt"? A bit redundant and repetitive.

I would probably use something like this: https://stackoverflow.com/questions/17803829/how-to-customiz...

2wrist•4m ago
It also manages the runtime, so you can pin a specific runtime to a project. It is very useful and worth investigating.
devlovstad•1h ago
uv has made working with different python versions and environments much, much nicer for me. Most of my colleagues in computational genomics use conda, but I've yet to encounter a scenario where I've been unable to just use uv instead.
bfkwlfkjf•1h ago
The best thing about uv is it's not conda.

Pip is also not conda, but uv is way faster than pip.

samgranieri•1h ago
I'm not a pythonista, and the most recent time I've been playing with python has been using octodns. Originally I was using a pip setup, and honestly, wow, uv was so much faster.

I'm very happy the python community has better tooling.

hirako2000•1h ago
A problem remains: many popular repositories still don't use uv to manage their dependencies.

So you are back to having to use conda and the rest. Now you have yet another package manager to handle.

I don't want to be harsh on the engineers at Astral, who developed amazing tooling, but the issue with the python ecosystem isn't lack of tooling; it's proliferation and fragmentation. To solve dependency management fully would be to incorporate other package descriptors, or convert them.

Rsbuild, another Rust tool, did just that for the node ecosystem, for building and bundling. They came up with rspack, which is largely compatible with the webpack config.

You find a webpack repo? Just add rsbuild and rspack, and you are pretty much ready to go, without the slow (node-native) webpack.

oblio•1h ago
Don't they publish to PyPi? What do you care what they use behind the scenes?
hirako2000•8m ago
It isn't about what they use behind the scenes.

I was referring to the interfaces of other packaging tools. I use uv and it's excellent on its own.

You get a repo, it's using playwright; what do you do now? You install all the dependencies found in the dependency descriptor and then sync to create a uv descriptor, or you compose a descriptor that uv understands.

It's repetitive and rather systematic, so it could be automated. I should volunteer for a PR, but my point is that introducing yet another tool to an ecosystem suffering a proliferation of build and dependency-management tooling compounds the issue. It would have been helpful from the get-go to support existing and prolific formats.

pnpm understands package.json. It didn't reinvent the wheel, because we have millions of wheels out there. It created its own pnpm lockfile, but those are files a user isn't meant to touch, so the transition from npm to pnpm is seamless. Almost the same when migrating from webpack to rsbuild.

pama•1h ago
I love uv. But the post starts with a simple install using a one-liner curl piped to sh, which is such a big attack surface… I would much rather have a much longer one-liner that increases safety.
oblio•1h ago
Isn't uv like... a Rust binary? If that sh has any sense it just copies the binary and adds it to PATH.
rieogoigr•7m ago
But since you are curling a web URL straight to sh, you will never know, which is the problem.
hirako2000•1h ago
It seems to be a trend in the rust community. I guess because rustup is suggested to be installed that way.

But you don't have to. Brew and other package managers hold uv in their registries.

mhogers•1h ago
Seeing a `pip install -r requirements.txt` in a very recently created python project is almost a red flag now...
nomel•5m ago
requirements.txt allows pip arguments to be included, so it can do much more than just list package names.

For example, installing on an air gapped system, where uv barely has support.

FattiMei•1h ago
But what was wrong with pip, venv, and pyproject.toml in the first place? I just keep a system installation of python for my personal things and an environment for every project I'm working on. I'd get suspicious if a developer is picky about python versions or library versions; like, what crazy programs are you writing?
wrs•1h ago
The pytorch ecosystem, for one, is notorious for very specific version dependencies between libraries.
jvanderbot•1h ago
What was wrong was that you needed to do that.

How many commands are required to build up a locally consistent workspace?

Modern package managers do that for you.

oblio•1h ago
How do pip and venv integrate with pyproject.toml? At least pip doesn't even use it.
zahlman•52m ago
As of half a year ago with pip 25.1, it can install from "dependency groups" listed in pyproject.toml: https://ichard26.github.io/blog/2025/04/whats-new-in-pip-25....

Pip also generates PEP 751 lockfiles, and installing from those is on the roadmap still (https://github.com/pypa/pip/issues/13334).

venv is lower-level tooling. Literally all it does is create a virtual environment — the same kind that uv creates and manages. There's nothing to "integrate".
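For reference, a dependency group is just a table in pyproject.toml (PEP 735); the group names below are illustrative:

```toml
[dependency-groups]
# installable with `pip install --group dev` (pip 25.1+) or `uv sync --group dev`
dev = ["pytest>=8", "ruff"]
docs = ["sphinx"]
```

Both pip and uv read this same standard table, which is the kind of cross-tool interoperability the older requirements.txt convention never had.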

johnfn•1h ago
As mostly a Python outsider, in the infrequent times that I do use python package management, uv just works. When I use pip I’d get all sorts of obscure error messages that I’d have to go track down, probably because I got some obscure environment detail wrong. With uv I never run into that nonsense.
the8472•57m ago
What's wrong? Having to modify the shell environment, no lockfile, slow download/installation, lack of a standard dependency dir, ...

> I'd get suspicious if a developer is picky about python versions or library versions

Certain library versions only support certain python versions. And they also break API. So moving up/down the python versions also means moving library versions which means stuff no longer works.

wrs•57m ago
Every time I see one of these comment threads it seems like uv desperately needs a better home page that doesn’t start with a long list of technical stuff. It’s really simple to use, in fact so simple that it confuses people!

The home page should be a simplified version of this page buried way down in the docs: https://docs.astral.sh/uv/guides/projects/

CalChris•54m ago
Mojo?
zahlman•49m ago
As far as I can tell, Mojo doesn't have very broad adoption. It also isn't actually Python, it just looks like it.
ModernMech•15m ago
Mojo stopped saying out loud they are trying to be a Python superset. Maybe they can do it one day but they're keeping that on the DL now because it's a really big ask.
kristopolous•53m ago
Hype is dangerous
docsaintly•51m ago
Python venvs are the #1 reason I've avoided working with it more. It used to be #2 behind strong typing, but now that Linux OSes lock up the default python install and block it from being used for quick scripts, it jumped to #1.

I've always wondered why Linux OSes that rely on python scripts don't make their own default venv instead of clobbering the user's default python environment...

mosselman•48m ago
uv is great. I am a Ruby developer and I always loathed having to work with Python libraries because of how bad the tooling was. It was too complex to learn for the one-off times that I needed it and nothing worked properly.

Now with uv everything just works and I can play around easily with all the great Python projects that exist.

j2kun•37m ago
This article appears to be NOT about someone who discovered uv after using venv/pip, but rather an article about someone who discovered uv after not using virtual environments at all, and is mostly excited about the cleanliness of virtual environments.
collinmanderson•30m ago
The article shows some advantages compared to plain virtual environments:

In principle, you can ‘activate’ this new virtual environment like any typical virtual environment that you may have seen in other tools, but the most ‘uv-onic’ way to use uv is simply to prepend any command with uv run. This command automatically picks up the correct virtual environment for you and runs your command with it. For instance, to run a script — instead of

   source .venv/bin/activate
   python myscript.py
you can just do

   uv run myscript.py
mannicken•30m ago
God yes. I got dragged into uv when I started using copyparty and I have been a fanatical admirer ever since. I also often use pipx to install tools. I really don't understand why you can't just pip install something globally. I want this package to be available to me EVERYWHERE; why can't I do it? I only use python recreationally, because everyone uses python everywhere and you can't escape it. So there is a massive possibility I am simply wrong and pip-installing something globally is a huge risk. I'm just not understanding it.
collinmanderson•18m ago
> I really don't understand why you can't just pip install something globally. I want this package to be available to me EVERYWHERE, why can't I do it? I only use python recreationally because everyone uses python everywhere and you can't escape it. So there is a massive possibility I am simply wrong and pip-installing something globally is a huge risk. I'm just not understanding it.

You may have a library that's been globally installed, and multiple projects that rely on it. One day you may need to upgrade the library for one project, but there are backward-incompatible changes in the upgrade, so now all of your other projects break when you upgrade the global library.

In general, when projects are used by multiple people across multiple computers, it's best to have the specific dependencies and versions specified in the project itself so that everyone using that project is using the exact same version of each dependency.

For recreational projects it's not as big of a deal; it just makes it harder to recreate your environment.

zahlman•3m ago
> I want this package to be available to me EVERYWHERE, why can't I do it?

Because it being available in the system environment could cause problems for system tools, which are expecting to find something else with the same name.

And because those tools could include your system's package manager (like Apt).

> So there is a massive possibility I am simply wrong and pip-installing something globally is a huge risk. I'm just not understanding it.

I assume you're referring to the new protections created by the EXTERNALLY-MANAGED marker file, which will throw up a large boilerplate warning if you try to use pip to install packages in the system environment (even with --user, where they can still cause problems when you run the system tools without sudo).

You should read one or more of:

* the PEP where this protection was introduced (https://peps.python.org/pep-0668/);

* the Python forum discussion explaining the need for the PEP (https://discuss.python.org/t/_/10302);

* my blog post (https://zahlman.github.io/posts/2024/12/24/python-packaging-...) where I describe in a bit more detail (along with explaining a few other common grumblings about how Python packaging works);

* my Q&A on Codidact (https://software.codidact.com/posts/291839/) where I explain more comprehensively;

* the original motivating Stack Overflow Q&A (https://stackoverflow.com/questions/75608323/);

* the Python forum discussion (https://discuss.python.org/t/_/56900) where it was originally noticed that the Stack Overflow Q&A was advising people to circumvent the protection without understanding it, and a coordinated attempt was made to remedy that problem.

Or you can watch Brodie Robertson's video about the implementation of the PEP in Arch: https://www.youtube.com/watch?v=35PQrzG0rG4.

aurintex•28m ago
I can only agree. I'm not a python expert, but I always struggled when installing a new package and getting the warning that it could break the system packages, or when cloning an existing repo on a newly installed system. I always wondered why it became so "complicated" over the years.
peter-m80•27m ago
So basically a node-like thing for python
talsperre•26m ago
uv is the best tool out there as long as you have python-only dependencies. It's really fast, and you can avoid using poetry, pipenv, etc. The only reason for conda to still exist is non-Python dependencies, but that's another beast to tackle in itself.
samuel2•25m ago
Reminds me of Julia's Pkg manager and the way Julia packages are managed (also with a .toml file). That's the way to go!
eisbaw•24m ago
nix-shell is the OG
zmmmmm•23m ago

    > Instead of
    >
    > source .venv/bin/activate
    > python myscript.py
    >
    > you can just do
    >
    > uv run myscript
This is by far the biggest turn off for me. The whole point of an environment manager is set the environment so that the commands I run work. They need to run natively how they are supposed to when the environment is set, not put through a translation layer.

Side rant: yes I get triggered whenever someone tells me "you can just" do this thing that is actually longer and worse than the original.

dragonwriter•16m ago
> They need to run natively how they are supposed to when the environment is set, not put through a translation layer.

There is a new standard mechanism for specifying, in the header of a single-file script, the same things you would specify when setting up a venv with a python version and dependencies, so that tooling can set up the environment and run the script using only the script file itself as a spec.

uv (and PyPA’s own pipx) support this standard.

> yes I get triggered whenever someone tells me "you can just" do this thing that is actually longer and worse than the original.

"uv run myscript" is neither longer nor worse than separately manually building a venv, activating it, installing dependencies into it, and then running the script.

mborsuk•15m ago
From what I can tell (just started using uv) it doesn't break the original workflow with the venv, just adds the uv run option as well.
collinmanderson•10m ago
> The whole point of an environment manager is set the environment so that the commands I run work. They need to run natively how they are supposed to when the environment is set, not put through a translation layer.

The `uv run` command is an optional shortcut for avoiding needing to activate the virtual environment. I personally don't like the whole "needing to activate an environment" before I can run commands "natively", so I like `uv run`. (Actually for the last 10 years I've had my `./manage.py` auto-set up the virtual environment for me.)

The `uv add` / `uv lock` / `uv sync` commands are still useful without `uv run`.

hkt•23m ago
Am I the only one who thought poetry was still the greatest available whizbang?
ModernMech•18m ago
Honestly though it's a pretty rough indictment of Python that the best thing to happen in a decade is that people started writing Python tools in Rust. Not even a little Rust, uv is 98% Rust. I mean, they just released 3.14 and that was supposed to be a pretty big deal.
quantum_state•13m ago
Running pytest with `uv run --active pytest ...` is very slow to get started. Anyone have some tips on this?
tonymet•12m ago
Can someone steelman the python tooling ecosystem for me? Having a new packaging / dependency manager every few years seems excessive.
rieogoigr•11m ago
Is there a way to install this that doesn't involve piping a random URL to my shell interpreter?
pshirshov•8m ago
And still there are some annoying issues:

  dependencies = [
      "torch==2.8.0+rocm6.4",
      "torchvision==0.23.0+rocm6.4",
      "pytorch-triton-rocm==3.4.0",
  ...
  ]
There is literally no easy way to also have a configuration for CUDA; you have to have a second config and, worse, manually copy/symlink it into the hardcoded pyproject.toml file.
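For what it's worth, uv's documentation describes an extras-plus-index pattern that can keep both accelerator builds in one pyproject.toml. A hedged sketch; the index names, URLs, and versions here are illustrative and should be checked against uv's PyTorch guide:

```toml
[project.optional-dependencies]
rocm = ["torch==2.8.0"]
cu129 = ["torch==2.8.0"]

[tool.uv]
# the two accelerator extras are mutually exclusive
conflicts = [[{ extra = "rocm" }, { extra = "cu129" }]]

[tool.uv.sources]
torch = [
    { index = "pytorch-rocm", extra = "rocm" },
    { index = "pytorch-cu129", extra = "cu129" },
]

[[tool.uv.index]]
name = "pytorch-rocm"
url = "https://download.pytorch.org/whl/rocm6.4"
explicit = true

[[tool.uv.index]]
name = "pytorch-cu129"
url = "https://download.pytorch.org/whl/cu129"
explicit = true
```

If this works for your setup, `uv sync --extra rocm` or `uv sync --extra cu129` would select the variant without copying or symlinking configs.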
jillesvangurp•5m ago
Python is not my first language but I've always liked it. But project and dependency management was always a bit meh and an afterthought.

Over the years, I've tried venv, conda, pipenv, poetry, and plain pip with requirements.txt. I've played with uv on some recent projects and it's a definite step up. I like it.

Uv actually fixes most of the issues with what came before and builds on existing things, which is not a small compliment because the state of the art before uv was pretty bad. Venv, pip, etc. are fine; they are just not enough by themselves. Uv embraces both. Without that, all we had was a lot of puzzle pieces that barely worked together and didn't really fit that well. I tried making conda + pipenv work at some point; pipenv shell makes your shell stateful, which just adds a lot of complexity, and none of the IDEs I tried figured that out properly. I had high hopes for poetry, but it ended up a bit underwhelming and still left a lot of stuff to solve. Uv succeeds in providing more of an end-to-end solution: everything from project-specific python installations, to venvs by default without hassle, to dependency management.

My basic needs are simple. I don't want to pollute my system python with random crap I need for some project. So, like uv, I need to have whatever solution deal with installing the right python version. Besides, the system python is usually out of date and behind the current stable version of python which is what I would use for new projects.