A very well-written article! I admire the author's analysis of the difficulties of Python packaging.
With the advent of uv, I'm finally feeling like Python packaging is solved. As mentioned in the article, being able to have inline dependencies in a single-file Python script and running it naturally is just beautiful.
#!/usr/bin/env -S uv run
# /// script
# dependencies = ['requests', 'beautifulsoup4']
# ///
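# The block above is PEP 723 inline script metadata: `uv run` reads it
# and supplies the dependencies in a cached, on-the-fly environment.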
import requests
from bs4 import BeautifulSoup
After getting used to this workflow, I have been thinking that a dedicated syntax for inline dependencies would be great, similar to JavaScript's `import ObjectName from 'module-name';` syntax. Python promoted type hints from comment-based to syntax-based, so a similar approach seems feasible.
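To make the analogy concrete, here is the type-hint precedent in Python (the comment-based form from PEP 484 and the first-class syntax from PEP 526); the dedicated dependency syntax itself remains pure speculation:

# Comment-based type hints (PEP 484 type comments):
x = []  # type: list[int]

# Promoted to first-class syntax (PEP 526 variable annotations):
y: list[int] = []

# Inline dependencies (PEP 723) are at the comment stage today; the
# suggestion above is that they could be promoted the same way.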
> It used to be that either you avoided dependencies in small Python scripts, or you had some cumbersome workaround to make them work for you. Personally, I used to manage a gigantic venv just for my local scripts, which I had to kill and clean every year.
I had the same fear of adding dependencies, and did exactly the same thing.
> This is the kind of thing that changes completely how you work. I used to have one big test venv that I destroyed regularly. I used to avoid testing some stuff because it would be too cumbersome. I used to avoid some tooling or pay the price for using them because they were so big or not useful enough to justify the setup. And so on, and so on.
I 100% sympathize with this. But... the 86 GB Python dependency download cache on my primary SSD, most of which can be attributed to the 50 different versions of torch, is a testament to the fact that even uv cannot salvage the mess that is pip.
Never felt this much rage at the state of a language/build system in the 25 years that I have been programming. And I had to deal with Scala's SBT ("Simple Build Tool") in another life.
Like so many other articles that make some offhand remarks about conda, this article raves about a bunch of "new" features that conda has had for years.
> Being independent from Python bootstrapping
Yep, conda.
> Being capable of installing and running Python in one unified congruent way across all situations and platforms.
Yep, conda.
> Having a very strong dependency resolver.
Yep, conda (or mamba).
The main thing conda doesn't seem to have which uv has is all the "project management" stuff. Which is fine, it's clear people want that. But it's weird to me to see these articles that are so excited about being able to install Python easily when that's been doable with conda for ages. (And conda has additional features not present in uv or other tools.)
The pro and con of tools like uv is that they layer over the base-level tools like pip. The pro of that is that they interoperate well with pip. The con is that they inherit the limitations of that packaging model (notably the inability to distribute non-Python dependencies separately).
That's not to say uv is bad. It seems like a cool tool and I'm intrigued to see where it goes.
Here's just one example, nemo2riva, the first of several steps in taking a trained NeMo model and making it deployable: https://github.com/nvidia-riva/nemo2riva?tab=readme-ov-file#... Before you can install the package, you first have to install some other package whose only purpose is to break pip so that it uses Nvidia's package registry. This does not work with uv, even with the `uv pip` interface, because uv rightly doesn't put up with that shit.
This is of course not Astral's fault, I don't expect them to handle this, but uv has spoiled me so much it makes anything else even more painful than it was before uv.
I think at this point, the only question that remains is how Astral will make money. But if they can package some sort of enterprise package index with some security bells and whistles, it seems like an easy sell into a ton of orgs.
I think the biggest praise I can give uv is that, as a non-Python dev, it makes Python a lot more accessible. The ecosystem can be really confusing to approach as an outsider. There's like 5 different ways to create virtual environments. With uv, you don't have to care about any of that. The venv and your Python install are just handled for you by `uv run`, which is magic.
Can someone explain a non-project-based workflow/configuration for uv? I get creating a bespoke folder, repo, and uv venv for certain long-lived projects (like creating different apps?).
But most of my work, since I adopted conda 7ish years ago, involves using the same ML environment across any number of folders or even throw-away notebooks on the desktop, for instance. I’ll create the environment and sometimes add new packages, but rarely update it, unless I feel like a spring cleaning. And I like knowing that I have the same environment across all my machines, so I don’t have to think about if I’m running the same script or notebook on a different machine today.
The idea of a new environment for each of my related “projects” just doesn’t make sense to me. But, I’m open to learning a new workflow.
Addition: I don't run others' code, like pretrained models built with specific package requirements.
Since this seems to be a love fest, let me offer a contrarian view. I use conda for environment management and pip for package management. This neatly separates the concerns into two tools that are good at what they do. I'm afraid that uv is another round of "let's fix everything" just to create another soon-to-be-dead set of patterns. I find nothing innovative or pleasing in its design, nor do I feel that it is particularly intuitive or usable.
You don't have to love uv, and there are plenty of reasons not to.
A familiar tale: Joe is hesitant about switching to UV and isn't particularly excited about it. Eventually, he gives it a try and becomes a fan. Soon, Joe is recommending UV to everyone he knows.
uv is so much better than everything else, I'm just afraid they can't keep the team going. Time will tell, but I just use uv and ruff in every project now tbh.
I know good naming is hard, and there are an awful lot of project names that clash, but naming a project uv is unfortunate given the ubiquitous nature of libuv.
> You don't even need to know there is a venv, or what activation means.
> All those commands update the lock file automatically and transparently.... It's all taken care of.
When is the python community going to realize that simple is the opposite of easy? I don't see how hiding these aspects is desirable at all; I want to know how my programming tools work!
With all due respect to the author, I don't like the assumption that all programmers want magic tools that hide everything under the rug. Some programmers still prefer simplicity, i.e. understanding exactly what every part of the system does.
Nothing against uv, it seems like a fine tool. And I'm sure one could make a case for it on other technical merits. But choosing it specifically to avoid critical thinking is self-defeating.
The problem I have with uv is that it is not opinionated enough, or complete enough. It still needs a backend for building the package, and you still have a choice of backends.
In other words, it is a nice frontend that hides the mess that is the Python packaging ecosystem, but the mess of an ecosystem is still there, and you still have to deal with it. You'll still have to go through hatchling's docs to figure out how to do x/y/z. You'll still have to switch from hatchling to flit/pdm/setuptools/... if you run into a limitation of hatchling. As a package author, you're never using just uv, you're using uv+hatchling (or uv+something), and a big part of your pyproject.toml is not uv configuration, it is hatchling configuration.
I'm sticking with Poetry for now, which has a more streamlined workflow. Things work together. Every Poetry project uses the same configuration syntax (there are no Poetry+X and Poetry+Y projects). Issues in Poetry can be fixed by Poetry rather than having to work with the backend.
I understand that uv is still young and I am sure this will improve. Maybe they'll even pick a specific backend and put a halt to this. But of course Poetry might catch up before then.
> I had a friend who decided not to use uv, because the first time he used it, it was on a 15-year-old codebase that had just been migrated to Python 3. It was standing on a pile of never-cleaned-up pip freeze exports, and uv could not make it work.
This is my only gripe with uv: despite how the author decided to depict it, this turns into a headache fast as soon as you have ~4-5 in-house packages.
I don't think it's that bad that uv is so unforgiving in those cases, because it leads to better overall project quality/cohesion, but I wish there was a way to onboard more progressively and downgrade minor version mismatches to warnings.
uv may be an improvement, but Python packaging hell is a cultural problem that will not be solved without changing the culture. The main cultural issues are: 1. depending on small and huge packages for trivial things, and 2. a culture of breaking API compatibility. The two combined create the mess we see.
I am a casual python user, and for that I love uv. Something I haven't quite figured out yet is integration with the pyright lsp - when I edit random projects in neovim, any imports have red squiggles. Does anyone know of a good way to resolve imports for the lsp via uv?
Is there a conda to uv migration tutorial written by anyone?
I have installed miniconda system-wide. For any Python package that I use a lot, like ipython, I install it in the base environment and in other environments too.
For every new project, I create a conda environment, and install everything in it. Upon finishing/writing my patch, I remove that environment and clean the caches. For my own projects, I create an environment.yaml and move on.
Everything works just fine. Now, the solving with mamba is fast. I can just hand someone the code and environment.yaml, and it runs on other platforms.
Can someone say why using uv is a good idea? Has anyone written a migration guide for such use cases?
I am mightily impressed by the one-line dependency declaration in a file. But I don't know (yet) where the caches are stored, how to get rid of them later, etc.
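For what it's worth, uv exposes this directly: `uv cache dir` prints the cache location and `uv cache clean` wipes it. A small sketch that totals the cache size on disk (assumes uv is on PATH):

import pathlib
import subprocess

# Ask uv where its cache lives, then sum the size of everything in it.
cache = pathlib.Path(subprocess.check_output(["uv", "cache", "dir"], text=True).strip())
size = sum(f.stat().st_size for f in cache.rglob("*") if f.is_file())
print(f"uv cache at {cache}: {size / 1e9:.1f} GB")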
I really like uv, and I have successfully gotten rid of miniconda, but:
- I wish there was a global virtual environment that could be referenced and activated from the terminal. Not every new script needs its own .venv in its respective folder. uv takes the route of being project-centered and based on the file system; this works for me most of the time, but sometimes it doesn't.
- I wish we could avoid the .python-version file and bundle it in the pyproject.toml file.
I've been using Hermit to install uv, then pointing scripts at $REPO_ROOT/bin/uv. That gives you a repo where the scripts can be run directly after cloning (Hermit is smart enough to install itself if necessary).
Unfortunately, Hermit doesn't do Windows, although I'm pretty sure that's because the devs don't have Windows machines: PRs welcome. https://github.com/cashapp/hermit
Honest question: is uv more reproducible/portable than cramming your Python project into a Docker container? I've used pyenv, pip, venv, and a couple of other things, and they all work fine, at first, in simple scenarios. Seems like a lot of people have tried their hand at various tooling, so there must be more to it than I am aware of.
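For pure-Python dependencies, much of that reproducibility can live in the file itself: PEP 723 metadata can pin both the interpreter range and exact package versions (unlike Docker, system libraries are still not covered). A minimal sketch; the specific pins are just illustrative:

#!/usr/bin/env -S uv run
# /// script
# requires-python = ">=3.12"
# dependencies = ['requests==2.32.3']
# ///
import requests

print(requests.get("https://example.org").status_code)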
I can't speak to uv from an app dev perspective (I use Go for that), but as someone who dips into the Python world for data science reasons only, uv is great and I'm thankful for it.
Adding to the article (which I agree with): Lack of `uv pip install --user` has made transitioning our existing python environment a bit more challenging than I'd like, but not a deal breaker.
Just going to plug https://mise.jdx.dev as a perfect accompaniment to uv. It simplifies installing tooling across languages and projects. I even install uv via mise, and it uses uv under the hood for Python related things.
Now we just have to figure out why we should stop there. Why not write everything in Rust? Recently I moved all my projects from Python to Rust and never looked back. Of course we need projects like Torch and we are not there yet, but for simpler projects that do not require GPU libraries, Rust is great.
I tried out uv a bit ago and dropped it. But about two weeks ago, I switched to it and migrated two projects with no issues.
Things like per-dependency PyPI sources are finally there.
I still find rough points that are problematic (as many others pointed out, especially with non-sandboxed installs), but on the whole it's better than Mamba for my use.
Is uv better than micromamba? I tried using uv once and got some big ugly error I don't remember, and that was the end of that, whereas mm just worked (perhaps due to my familiarity). It was a project with the usual situation, i.e., torch, numpy, cuda support, nvcc, all had to play nicely together and satisfy requirements.txt.
Really like uv too, but surprised he doesn't mention the lack of conda compatibility. Some scientific packages being available only on conda is the only reason I can't use uv for some projects (I use micromamba instead).
uv has been fantastic for most of my personal projects. It feels so much smoother than any other python tooling combo I tried in past. That said, it just does not work well behind corporate proxies. This is single most annoying thing that has stopped me from recommending it at work.
How are they handling the returns VCs expect through this free software? If it's so easy to deploy, surely we should expect a Docker-like licensing model in the near future?
All else equal, I prefer julia to python, but uv makes the python experience so much nicer. I'd love it if julia copied uv and replaced the Pkg system with it.
For instance, from a personal project that uses a src layout, without being a package, I have this in my pyproject.toml:

[tool.poetry]
...
packages = [{ include = "*", from = "src", format = "sdist" }]
...

[tool.poetry.scripts]
botch = "launcher:run_bot('botch')"
beat = "launcher:run_bot('beat')"

I can't find any way to get that working in uv without some pretty major refactoring of my internal structure and import declarations. Maybe I've accidentally cornered myself in a terrible and ill-advised structure?
Can anyone explain to a non-Python developer why Python infrastructure is so broken around version management?
It looks to me like every new minor Python release is a separate additional install, because realistically you cannot replace Python 3.11 with Python 3.12 and expect things to work. How did they put themselves in such a mess?
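Part of the answer is structural: each minor version has its own ABI tag for compiled wheels (cp311 vs cp312) and its own site-packages directory, so an environment is bound to exactly one interpreter version. A quick way to see the versioned path from Python itself:

import sys
import sysconfig

# e.g. (3, 12) and .../lib/python3.12/site-packages: packages installed
# under 3.11 live in a different directory and are invisible to 3.12.
print(sys.version_info[:2])
print(sysconfig.get_path("purelib"))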
I have seen references to using uv for Python package management before and been thoroughly confused. I never realized it was not the same thing as the very nice asynchronous cross-platform library libuv (https://libuv.org/) and I could never figure out what that library had to do with Python package management (answer: nothing).
Maybe we need a Geographic Names Board to deconflict open source project names, or at least the ones that are only two or three characters long.
Astral is a great team; they built the ruff linter and are currently working on a static type checker called red-knot: https://x.com/charliermarsh/status/1884651482009477368
I'm extremely excited about the ruff type checker too.
I have yet to learn uv, but I intend to. Still, having to `source .venv/bin/activate` to activate the virtualenv is a lot less ergonomic than `pipenv shell`.
The latest release is 0.6.1; what is missing (roadmap/timeline-wise) for uv to exist as a 1.0 release?
With uv, there is now one more.
https://xkcd.com/927/