Uv's killer feature is making ad-hoc environments easy (valatka.dev)
498 points by astronautas, 12 January 2025 | 414 comments

Comments

I really like uv, and it's the first package manager in a while where I haven't felt like it's a minor improvement on what I'm using, with something better ultimately coming out a year or two later. I'd love it if we standardized on it as a community as the de facto default, especially for new folks coming in. I personally now recommend it to nearly everyone, instead of the "welllll, I use poetry, but pyenv works, or you could use conda too".
As a NodeJS developer, it's kind of shocking to me that Python still hasn't resolved this mess. Node isn't perfect, and dealing with different versions of Node is annoying, but at least there's none of this "worry about modifying the global environment" stuff.
I usually stay far, FAR away from shiny new tools, but I've been experimenting with uv and I really like it. I'm a bit bummed that it's not written in Python, but other than that, it does what it says on the tin.
I never liked pyenv because I really don't see the point/benefit of building every new version of Python you want to use. There's a reason I don't run Gentoo or Arch anymore. I'm very happy that uv grabs pre-compiled binaries and just uses those.
So far I have used it to replace poetry (which is great btw) in one of my projects. It was pretty straightforward, but the project was also fairly trivial/typical.
I can't fully replace pipx with it because 'uv tool' currently assumes every Python package has only one executable. Lots of things I work with have multiple, such as Ansible and JupyterLab. There's a bug open about it and the workarounds aren't terrible, but it'd be nice if they could fix that soon.
1. `uvx --from git+https://github.com/httpie/cli httpie` 2. https://simonwillison.net/2024/Aug/21/usrbinenv-uv-run/ (uv in a shebang)
uvx --from 'huggingface_hub[cli]' huggingface-cli
Activating the virtualenv is unnecessary (you can execute pip/python directly from it), and configuring your local pyenv interpreter is also unnecessary; uv can create a virtual environment with one directly:
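A minimal sketch of both points (assuming uv's documented `venv` and `pip` subcommands; the pandas dep is just for illustration):

    uv venv --python 3.12    # fetches a pre-built CPython 3.12 and creates .venv, no pyenv build step
    uv pip install pandas    # installs into ./.venv without activating it
    .venv/bin/python -c "import pandas; print(pandas.__version__)"   # runs directly, also unactivated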
Ok, this must be a dumb question answered by the manual, but I still haven't got my hands on uv, so: does it solve the opposite problem? I pretty much never want any "ad-hoc" environments, but I always end up with my .venv becoming an ad-hoc environment, because I install stuff while experimenting without bothering to patch requirements.txt, pyproject.toml, or anything of the sort. In fact, now I usually don't even bother typing pip install; PyCharm does it for me.
This is of course bad practice. What I would like instead is what PHP's composer does: installing stuff automatically changes pyproject.toml (or whatever the standard will be with uv) and automatically freezes the versions, and then it's up to git diff to tell me what I did last night; I'll remove a couple of lines from that file, run composer install, and it will remove packages not explicitly added to my config from the environment. Does this finally get easy to achieve with uv?
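From what I can tell, yes: this is essentially uv's project workflow (a sketch, using the flag names in recent uv releases):

    uv add requests    # writes the dep into pyproject.toml and pins the resolution in uv.lock
    # ...next morning, after deleting lines from pyproject.toml per git diff:
    uv sync            # makes .venv match the project exactly, removing anything no longer declared

`uv sync` is exact by default, so undeclared experiment leftovers get pruned the same way composer install prunes them.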
I have used virtualenvwrapper before and it was very convenient to have all virtual environments stored in one place, like ~/.cache/virtualenvs.
The .venv in the project directory is annoying because when you copy the folder somewhere, you start copying gigabytes of junk. Some tools like rsync can't handle CACHEDIR.TAG (but you can use --exclude .venv).
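For rsync, that looks like this (a sketch; the destination path is just a placeholder):

    rsync -a --exclude='.venv/' myproject/ /backups/myproject/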
Why can't Python just adopt something like yarn/pnpm and effing stop patch-copying its binaries into a specific path? And pick up site-packages from where it left it last time? Wtf. How hard is it to just pick a python-modules directory and a python-project.json and sync it into correctness by symlink/mklink-ing missing folders into it from a package cache in a few seconds?
Every time I have to reorganize or upgrade my AI repos, it's yet another 50GB of writes to my poor SSD. Half of it is torch, the other half auto-downloaded models that I cannot stop, because they become "downloaded" and you never know how to resume them or even find where they are, because Python logging culture is just barbaric.
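If the models are coming via huggingface_hub (an assumption; other loaders keep their own caches), they usually land in ~/.cache/huggingface/hub, and the whole cache can be relocated with one environment variable:

    ls ~/.cache/huggingface/hub               # where the "downloaded" models typically hide
    export HF_HOME=/mnt/bigdisk/huggingface   # relocate the cache off the small ssd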
The author says that a normal route would be to take the proper route:
- Create a virtual environment
- pip install pandas
- Activate the virtual environment
- Run python
Basically, out of the box, when you create a virtualenv it is immediately activated. And you would obviously need to have it activated before doing a pip install anyway...
In addition, in my opinion, this is the thing that would suck about uv: having different functions tied to a single tool execution.
It is a breeze to be able to activate a venv and be done with it: being able to run your program multiple times in one go, even with crashes, being able to install more dependencies, test it in the REPL, ...
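For contrast, the ad-hoc route the article praises collapses the whole list above into a single line (a sketch using uv's --with flag):

    uv run --with pandas python    # ephemeral cached env with pandas, straight into the REPL

And nothing stops you from keeping the classic activate-once workflow: `uv venv` followed by `source .venv/bin/activate` gives you a persistent venv with the usual ergonomics.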
Python package management has always seemed like crazyland to me. I've settled on Anaconda as I've experimented with all the ML packages over the years, so I'd be interested to learn why uv, and also what/when are good times to use venv/pip/conda/uv/poetry/whatever else has come up.
NeutralCrane has a really helpful comment below [0], would love to have a more thorough post on everything!
[0] https://news.ycombinator.com/item?id=42677048
been using conda for years with multiple projects, each of which has numerous environments (for different versions). fairly large, complex environments with cuda, tf, jax, etc. it has always worked well, and my biggest complaint - the sluggish resolver - was largely addressed with the mamba resolver. packages not available on conda-forge can be installed into the conda env with pip. maybe I'm missing something but it's not clear to me what advantage uv would provide over conda.
When using the Maven build tool with Java, the downloaded artifacts always have the version number in the file name (artifact-version.jar), which means multiple versions can be stored in parallel, cached globally, and cherry-picked without any ambiguity. The first time I used Node and Python, I was shocked that the version number is not part of any downloaded artifact. Versioning the dependencies is such a fundamental need, and having it be part of the artifact file itself seems like common sense to me. Can anyone please explain why the Python/Node build tools don't follow that?
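For what it's worth, Python's downloaded artifacts do carry the version: wheel file names embed the version, interpreter, and platform, and they are cached globally. It's the installed site-packages layout that is flat, because `import pandas` has no syntax for requesting a version. A quick way to see the versioned artifacts (a sketch; the pandas pin is arbitrary):

    pip download pandas==2.2.2 -d /tmp/wheels
    ls /tmp/wheels    # e.g. pandas-2.2.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl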
What would be interesting is if you could do something similar for IPython/Jupyter Notebooks: while front-ends like JupyterLab and VS Code Notebooks do let you select a .venv if present in the workspace, it's annoying to have to set one up and build one for every project.
Edit: might be possible now? https://simonwillison.net/2024/Dec/19/one-shot-python-tools/
Can uv work with that?
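uv can already do the ad-hoc version of this from the shell, which at least avoids hand-building a venv per notebook project (a sketch; `--with` behaves here as it does for any other tool):

    uv run --with jupyterlab --with pandas jupyter lab    # throwaway env, pandas importable in notebooks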
For anyone who used Rye, it's worth noting that the creator of Rye recommends using uv. Also, Rye is going to be continually updated to just interface with uv until it can be entirely replaced by uv.
I want to like uv, but unfortunately there's a technical distinction between a Python "package manager" and a "build system". uv doesn't include a build system; it encourages you to use some other one. The net result is that external dependencies don't build the same way they do under Poetry, don't work, and uv points the finger at some other tool.
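Concretely, the split lives in pyproject.toml: uv resolves and installs, while the `[build-system]` table names whoever actually builds your sdists/wheels. Something like the following (hatchling is just one common backend, and to my knowledge the one uv's packaged-project template picks; setuptools or flit slot in the same way):

    [build-system]
    requires = ["hatchling"]
    build-backend = "hatchling.build"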
I do hope the situation changes one day. Python packaging is such a mess, but Poetry is good enough and actually works, so I'll stick with it for now.
I really thought this would mention uv script deps (standardized by a PEP) together with a `#!/usr/bin/env -S uv run` shebang line, which automatically installs the deps on execution.
Has been super useful to write single-file tools/scripts which LLMs can understand and run easily.
https://packaging.python.org/en/latest/specifications/inline...
Especially useful if the script has dependencies on packages in private repos.
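A minimal example of the format (the requests dep is purely illustrative):

    #!/usr/bin/env -S uv run
    # /// script
    # requires-python = ">=3.12"
    # dependencies = ["requests"]
    # ///
    import requests
    print(requests.get("https://example.com").status_code)

chmod +x it, and ./script.py resolves and caches its own dependencies on first run.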
I do like uv and hope to try it soon, but I don't get the point of the article.
Pyenv + poetry already gives you the ability to "pull in local dependencies". Yes, you have to create a virtual environment, and it's not "ad-hoc".
But if you're going to pull in a bunch of libraries, WHY would you want to invoke python and all your work dependencies in a one-liner? Isn't it much better and easier to just spell out the dependencies in a pyproject.toml? How "ad-hoc" are we talking here?
I honestly really hate the venv ergonomics, but uv does still depend on it as the golden path if you don't use the --with flags (in my understanding). Is there a way to do a clean break with just the new --script inline dependencies, or is that wrong/suboptimal?
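If I'm reading the question right, the inline route can skip .venv entirely, and uv will even manage the inline metadata for you (a sketch; I believe these are the current flag spellings, but check `uv add --help`):

    uv init --script tool.py            # stamps the inline metadata block into tool.py
    uv add --script tool.py requests    # records requests in that block
    uv run tool.py                      # executes in an ephemeral cached env, no project .venv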
one useful uv alias I use is:

    alias uvsys='uv pip install --system'
So I can just do uvsys {package} for a quick and dirty global install. I'm so used to pip install being global by default that just making this shorthand makes things a bit easier.
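Usage then looks like this (with the caveat that, like bare pip install, it mutates the system environment):

    uvsys httpie    # expands to: uv pip install --system httpie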
Use Nix for Python version as well as other bin deps, and virtualenv + pip-tools for correct package dependency resolution.
Waiting 4s for pip-tools instead of 1ms for uv doesn't change much if you only run it once a month.
I wish there was a way to either shebang something like this or build a wheel that has the full venv inside.