I really like uv, and it's the first package manager for a while where I haven't felt like it's a minor improvement on what I'm using but ultimately something better will come out a year or two later. I'd love if we standardized on it as a community as the de facto default, especially for new folks coming in. I personally now recommend it to nearly everyone, instead of the "welllll I use poetry but pyenv works or you could use conda too"
I never used anything other than pip. I never felt the need to use anything other than pip (with virtualenv). Am I missing anything?
Couple of things.
- pip doesn't handle your Python executable, just your Python dependencies. So if you want/need to swap between Python versions (3.11 to 3.12 for example), it doesn't give you anything. Generally people use an additional tool such as pyenv to manage this. Tools like uv and Poetry do this as well as handling dependencies
- pip doesn't resolve dependencies of dependencies. pip will only respect version pinning for dependencies you explicitly specify. So for example, say I am using pandas and I pin it to version X. If a dependency of pandas (say, numpy) isn't pinned as well, the underlying version of numpy can still change when I reinstall dependencies. I've had many issues where my environment stopped working despite none of my specified dependencies changing, because underlying dependencies introduced breaking changes. To get around this with pip you would need an additional tool like pip-tools, which allows you to pin all dependencies, explicit and nested, to a lock file for true reproducibility. uv and poetry do this out of the box.
- Tool usage. Say there is a python package you want to use across many environments without installing in the environments themselves (such as a linting tool like ruff). With pip, you need to install another tool like pipx to install something that can be used across environments. uv can do this out of the box.
Plus there is a whole host of jobs that tools like uv and poetry aim to assist with that pip doesn't, namely project creation and management. You can use uv to create a new Python project scaffolding for applications or python modules in a way that conforms with PEP standards with a single command. It also supports workspaces of multiple projects that have separate functionality but require dependencies to be in sync.
You can accomplish a lot/all of this using pip with additional tooling, but it's a lot more work, and not every use case will require these features. (Rough sketch of the uv equivalents below.)
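For concreteness, a minimal sketch of what those pieces look like with uv (commands are from the uv docs; the project and package names are just placeholders):

# Python versions: download and pin an interpreter instead of reaching for pyenv
uv python install 3.12

# Project + dependencies: pyproject.toml plus a lock file, resolved transitively
uv init myproject && cd myproject
uv python pin 3.12
uv add pandas          # records the dependency and updates uv.lock
uv sync                # reproduces the exact environment elsewhere

# Tools: run or install CLI tools in isolated environments (pipx-style)
uvx ruff check .
uv tool install ruff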
Yes, generally people already use an additional tool for managing their Python executables, like their operating system's package manager:
$> sudo apt-get install python3.10 python3.11 python3.12
And then it's simple to create and use version-specific virtual environments:
$> python3.11 -m venv .venv3.11
$> source .venv3.11/bin/activate
$> pip install -r requirements.txt
You are incorrect about needing to use an additional tool to install a "global" tool like `ruff`; `pip` does this by default when you're not using a virtual environment. In fact, this behavior is made more difficult by tools like `uv` or `pipx` if they're trying to manage Python executables as well as dependencies.
Sometimes I feel like my upvote doesn't adequately express my gratitude.
I appreciate how thorough this was.
Oh wow, it actually can handle the Python executable? I didn't know that, that's great! Although it's in the article as well, it didn't click until you said it, thanks!
I would avoid using this feature! It downloads a compiled portable Python binary from some random GitHub project, not from the PSF. That very same GitHub project recommends against using their binary, as the compilation flags are set for portability over performance. See https://gregoryszorc.com/docs/python-build-standalone/main/
https://github.com/astral-sh/python-build-standalone is by the same people as uv, so it's hardly random. The releases there include ones with profile-guided optimisation and link time optimisation [1], which are used by default for some platforms and Python versions (and work seems underway to make them usable for all [2]). I don't see any recommendation against using their binaries or mention of optimising for portability at the cost of performance on the page you link or the pages linked from it that I've looked at.
[1] https://github.com/astral-sh/uv/blob/main/crates/uv-python/d... (search for pgo)
This must have moved recently! I looked at this around the end of December and it was hosted on https://github.com/indygreg/python-build-standalone/releases which had nothing to do with UV. If you read through the docs now it still references indygreg and still shows this https://github.com/indygreg/python-build-standalone so I guess the move has not completed yet, but yes, it's a positive change to see UV taking ownership of the builds.
It's not from some random GitHub project; it's from a trusted member of the open source community. Same as other libraries you use and install.
It was used by rye before rye and uv sort of merged and is used by pipx and hatch and mise (and bazel rules_python) https://x.com/charliermarsh/status/1864042688279908459
My understanding is that the problem is that the PSF doesn't publish portable Python binaries (I don't think they even publish any binaries for Linux). Luckily there's some work being done on a PEP for similar functionality from an official source, but that will likely take several years. Gregory has praised the attempt and made suggestions based on his experience. https://discuss.python.org/t/pep-711-pybi-a-standard-format-...
Apparently he had less spare time for open source, and since Astral had been helping with a lot of the maintenance work on the project, he happily transferred ownership over to them in December.
https://gregoryszorc.com/blog/2024/12/03/transferring-python... https://astral.sh/blog/python-build-standalone
That makes sense, thanks for sharing these details.
No problem.
That's not to say there aren't downsides. https://gregoryszorc.com/docs/python-build-standalone/main/q... documents them. As an example, I had to add the https://pypi.org/project/gnureadline/ package to a work project that had its own auto-completing shell, because by default the builds replace the GNU readline library with libedit/editline, and they're far from a drop-in replacement.
I still don't understand why people want separate tooling to "handle the Python executable". All you need to do is have one base installation of each version you want, and then make your venv by running the standard library venv for that Python (e.g. `python3.x -m venv .venv`).
> All you need to do is have one base installation of each version you want
Because of this ^
But any tool you use for the task would do that anyway (or set them up temporarily and throw them away). Python on Windows has a standard Windows-friendly installer, and compiling from source on Linux is the standard few calls to `./configure` and `make` that you'd have with anything else; it runs quite smoothly and you only have to do it once.
I need to tell you a secret... I'm a long-time Linux user (since Mandrake!)
Also, I don't have a c compiler installed.
Really? I was told Mint was supposed to be the kiddie-pool version of Linux, but it gave me GCC and a bunch of common dependencies anyway.
(By my understanding, `pyenv install` will expect to be able to run a compiler to build a downloaded Python source tarball. Uv uses prebuilt versions from https://github.com/astral-sh/python-build-standalone ; there is work being done in the Python community on a standard for packaging such builds, similarly to wheels, so that you can just use that instead of compiling it yourself. But Python comes out of an old culture where users expect to do that sort of thing.)
In Debian, the build-essential package is only a recommended dependency of pip. Pyenv obviously wouldn't work without it.
Having to manually install python versions and create venvs is pretty painful compared to say the Rust tooling where you install rustup once, and then it will automatically choose the correct Rust version for each project based on what that project has configured.
UV seems like it provides a lot of that convenience for python.
I'm glad to let uv handle that for me. It does a pretty good job at it!
Lots of reasons. You may want many people to have the same point release. They have early builds without needing to compile from source, and free-threaded (nogil) builds. I think they might even have PGO builds. Not to mention that not all distro releases will have the right Python release. Also people want the same tool to handle both python version and venv creation and requirement installation
>Also people want the same tool to handle both python version and venv creation and requirement installation
This is the part I don't understand. Why should it be the same tool? What advantage does that give over having separate tools?
Because it's easier. Because it fits together nicer and more consistently. Also because UV is well written and written in rust so all the parts are fast. You can recreate a venv from scratch for every run.
Also as silly as it is I actually have a hard time remembering the venv syntax each time.
uv run, after a checkout with a lock file and a .python-version file, downloads the right Python version, creates a venv, and then installs the packages. No more needing throwaway venvs to get a clean pip freeze for requirements. And I don't want to compile Python; even with something like pyenv helping me compile and keep track of builds, a lot can go wrong.
And that assumes an individual project run by someone who understands Python packaging. uv run, possibly in a wrapper script, will do those things for my team, who don't get packaging as well as I do. Just check in changes, and the next time they uv run it, it updates stuff for them.
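A rough sketch of that flow (the repo URL and script name are made up):

git clone https://example.com/team/project.git && cd project
cat .python-version     # e.g. "3.12", checked into the repo
uv run main.py          # fetches that interpreter if needed, syncs .venv from uv.lock, then runs the script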
I guess I will never really understand the aesthetic preferences of the majority. But.
>Because it's easier. Because it fits together nicer and more consistently. Also because UV is well written and written in rust so all the parts are fast. You can recreate a venv from scratch for every run.
This is the biggest thing I try to push back on whenever uv comes up. There is good evidence that "written in Rust" has quite little to do with the performance, at least when it comes to creating a venv.
On my 10-year-old machine, creating a venv directly with the standard library venv module takes about 0.05 seconds. What takes 3.2 more seconds on top of that is bootstrapping Pip into it.
Which is strange, in that using Pip to install Pip into an empty venv only takes about 1.7 seconds.
Which is still strange, in that using Pip's internal package-installation logic (which one of the devs factored out as a separate project) to unpack and copy the files to the right places, make the script wrappers etc. takes only about 0.2 seconds, and pre-compiling the Python code to .pyc with the standard library `compileall` module takes only about 0.9 seconds more.
The bottleneck for `compileall`, as far as I can tell, is still the actual bytecode compilation - which is implemented in C. I don't know if uv implemented its own bytecode compilation or just skips it, but it's not going to beat that.
Of course, well thought-out caching would mean it can just copy the .pyc files (or hard-link etc.) from cache when repeatedly using a package in multiple environments.
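If you want to see the split yourself, something along these lines reproduces the comparison (timings will obviously vary by machine):

time python3 -m venv --without-pip /tmp/venv-bare    # just the venv layout, no pip bootstrap
time python3 -m venv /tmp/venv-full                  # venv plus the ensurepip bootstrap
time python3 -m compileall -q /tmp/venv-full/lib     # roughly the bytecode-compilation portion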
pip's resolving algorithm is not sound. If your Python projects are really simple it seems to work but as your projects get more complex the failure rate creeps up over time. You might
pip install
something and have it fail and then go back to zero and restart and have it work but at some point that will fail. conda has a correct resolving algorithm but the packages are out of date and add about as many quality problems as they fix.
I worked at a place where the engineering manager was absolutely exasperated with the problems we were having with building and deploying AI/ML software in Python. I had figured out pretty much all the problems after about nine months and had developed a 'wheelhouse' procedure for building our system reliably, but it was too late.
Not long after I sketched out a system that was a lot like uv but it was written in Python and thus had problems with maintaining its own stable Python environment (e.g. poetry seems to trash itself every six months or so.)
Writing uv in Rust was genius because it eliminates that problem: the system has a stable surface to stand on instead of pipping itself into oblivion. Never mind that it is much faster than my system would have been. (My system had the extra feature that it used HTTP range requests to extract the metadata from wheel files before PyPI started letting you download the metadata directly.)
I didn't go forward with developing it because I argued with a lot of people who, like you, thought it was "the perfect being the enemy of the good" when it was really "the incorrect being the enemy of the correct." I'd worked on plenty of projects where I was right about the technology and wrong about the politics and I am so happy that uv has saved the Python community from itself.
Respectively, yes. The ability to create venvs so fast that it becomes a silent operation the end user never thinks about anymore. The dependency management and installation is lightning quick. It deals with all of the Python versioning,
and I think a killer feature is the ability to inline dependencies in your Python source code, then use: uv run <scriptname>
Your script code would look like:
#!/usr/bin/env -S uv run --script
# /// script
# requires-python = ">=3.12"
# dependencies = [
#   "...",
#   "..."
# ]
# ///
Then uv will make a new venv, install the dependencies, and execute the script faster than you think. The first run is a bit slower due to downloads and etc, but the second and subsequent runs are a bunch of internal symlink shuffling.
It is really interesting. You should at least take a look at a YT or something. I think you will be impressed.
Good luck!
If you switch to uv, you’ll have fewer excuses to take coffee breaks while waiting for pip to do its thing. :)
Pip only has requirements.txt and doesn't have lockfiles, so you can't guarantee that the bugs you're seeing on your system are the same as the bugs on your production system.
Pip is sort of broken here because it encourages confusion between requirements and lock files. In other languages with package managers, you generally specify your requirements with ranges and get a lock file with exact versions of those and any transitive dependencies, letting you easily recreate a known working environment. The only way to do that with pip is to make a *new* venv, install, then pip freeze. I think the pip-tools package is supposed to help, but it's a separate tool (one whose functionality uv also includes). Also, putting stuff in pyproject.toml feels more solid than requirements files: it allows options to be set on requirements (like installing one package only from your company's private Python package index mirror while installing the others from the global index), and it allows dev dependencies and other optional dependency groups without multiple requirements files and having to update locks on all of them.
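Roughly, the two approaches described above look like this (file names are the conventional ones):

# the pip way: pin everything by freezing a throwaway venv
python3 -m venv /tmp/fresh && /tmp/fresh/bin/pip install -r requirements.txt
/tmp/fresh/bin/pip freeze > requirements.lock.txt

# the uv way: ranges live in pyproject.toml, exact pins in uv.lock
uv add "pandas>=2.2"    # records the range, resolves, updates uv.lock
uv sync                 # recreates the exact locked environment anywhere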
It also automatically creates venvs if you delete them. And it automatically updates packages when you run something with uv run file.py (useful when somebody may have updated the requirements in git). It also lets you install self-contained Python tools (installed in a virtualenv and linked to ~/.local/bin, which is added to your path), replacing pipx. It installs self-contained Python builds, letting you more easily pick a Python version and specify it in a .python-version file for your project (replacing pyenv, and usually much nicer because pyenv compiles them locally).
uv also makes it easier to explore and, say, start an ipython shell with 2 extra libraries:
uv run --with ipython --with colorful --with https ipython
It caches downloads. Of course the http itself isn't faster but they're exploring things to speed that part up and since it's written in rust local stuff (like deleting and recreating a venv with cached packages) tends to be blazing fast
I am not a python developer, but sometimes I use python projects. This puts me in a position where I need to get stuff working while knowing almost nothing about how python package management works.
Also I don’t recognise errors and I don’t know which python versions generally work well with what.
I’ve had it happen so often with pip that I’d have something setup just fine. Let’s say some stable diffusion ui. Then some other month I want to experiment with something like airbyte. Can’t get it working at all. Then some days later I think, let’s generate an image. Only to find out that with pip installing all sorts of stuff for airbyte, I’ve messed up my stable diffusion install somehow.
Uv clicked right away for me and I don’t have any of these issues.
Was I using pip and asdf incorrectly before? Probably. Was it worth learning how to do it properly in the previous way? Nope. So uv is really great for me.
I'm fairly minimalist when it comes to tooling: venv, pip, and pip-tools. I've started to use uv recently because it resolves packages significantly faster than pip/pip-tools. It will generate a "requirements.txt" with 30 packages in a few seconds rather than a minute or two.
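For reference, the swap I mean is roughly this (uv's pip-compatible interface; file names are whatever your project already uses):

# before
pip-compile requirements.in -o requirements.txt

# after
uv pip compile requirements.in -o requirements.txt
uv pip sync requirements.txt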
Well, for one you can't actually package or add a local requirement (for example, a vendored package) to the usual pip requirements.txt (or with pyproject.toml, or any other standard way) afaik.
I saw a discourse reply that cited some sort of possible security issue but that was basically it and that means that the only way to get that functionality is to not use pip. It's really not a lot of major stuff, just a lot of little paper cuts that makes it a lot easier to just use something else once your project gets to a certain size.
Yeah, it unifies the whole env experience with the package installation experience. No more forgetting to activate the virtualenv first. No more pip installing into the wrong virtual env or accidentally borrowing from the system packages. It's way easier to specify which version of Python to use. Everything is version controlled, including the Python version and variant (CPython, PyPy, etc.). It's also REALLY REALLY fast.
Performance and correctness mostly.
I was in your boat too. Been using Python since 2000 and pretty satisfied with venv and pip.
However, the speed alone is reason enough to switch. Try it once and you will be sold.
Also you can set the python version for that project. It will download whatever version you need and just use it.
in my view, depending on your workflow you might have been missing out on pyenv in the past but not really if you feel comfortable self-managing your venvs.
now though, yes unequivocally you are missing out.
Yeah, I switched from pip to uv. uv seems like it's almost the perfect solution for me.
it does virtualenv, it does pyenv, it does pip, so all that's managed in one place.
it's much faster than pip.
it's like 80% of my workflow now.
Much of the Python ecosystem blatantly violates semantic versioning. Most new tooling is designed to work around the bugs introduced by this.
Cool story bro.
I've used pip, pyenv, poetry, all are broken in one way or another, and have blind spots they don't serve.
If your needs are simple (not mixing Python versions, simple dependencies, not packaging, etc) you can do it with pip, or even with tarballs and make install.
Pip doesn't resolve dependencies for you. On small projects that can be ok, but if you're working on something medium to large, or you're working on it with other people you can quickly get yourself into a sticky situation where your environment isn't easily reproducible.
Using uv means your project will have well defined dependencies.
What is the deal with uv's ownership policy? I heard it might be VC backed. To my mind, that means killing pip and finding some kind of subscription revenue source which makes me uneasy.
The only way to justify VC money is a plot to take over the ecosystem and then profit off of a dominant position. (e.g. the Uber model)
I've heard a little bit about UV's technical achievements, which are impressive, but technical progress isn't the only metric.
It’s dual MIT and Apache licensed. Worst case, if there’s a rug pull, fork it.
This:
> I haven't felt like it's a minor improvement on what I'm using
means that this:
> I'd love if we standardized on it as a community as the de facto default
…probably shouldn’t happen. The default and de facto standard should be something that doesn’t get put on a pedestal but stays out of the way.
It would be like replacing the python repl with the current version of ipython. I’d say the same thing, that it isn’t a minor improvement. While I almost always use ipython now, I’m glad it’s a separate thing.
> The default and de facto standard should be something that doesn’t get put on a pedestal but stays out of the way.
The problem is that in the python ecosystem there really isn't a default de facto standard yet at all. It's supposed to be pip, but enough people dislike pip that it's hard as a newcomer to know if it's actually the standard or not.
The nice thing about putting something like this on a pedestal is that maybe it could actually become a standard, even if the standard should be simple and get out of the way. Better to have a standard that's a bit over the top than no standard at all.
As it happens, the Python REPL was just replaced a few months ago!
…Not with IPython. But with an implementation written in Python instead of C, originating from the PyPy project, that supports fancier features like multi-line editing and syntax highlighting. See PEP 762.
I was apprehensive when I heard about it, but then I had the chance to use it and it was a very nice experience.
> The default and de facto standard should be something that doesn’t get put on a pedestal but stays out of the way.
This to me is unachievable. Perfection is impossible. On the way there, if the community and developers coalesced around a single tool, then maybe we could start heading down the road toward it.
+1 uv now also supports system installation of python with the --default --preview flags. This probably allows me to replace mise (rtx) and go uv full time for python development. With other languages, I go back to mise.
I use mise with uv for automatic activation of venvs when I cd into a directory containing one (alongside a mise.toml). Do you tackle this in some other manner?
(I will likely base a blog post in my packaging series off this comment later.)
What people seem to miss about Pip is that it's by design, not a package manager. It's a package installer, only. Of course it doesn't handle the environment setup for you; it's not intended for that. And of course it doesn't keep track of what you've installed, or make lock files, or update your `pyproject.toml`, or...
What it does do is offer a hideously complex set of options for installing everything under the sun, from everywhere under the sun. (And that complexity has led to long-standing, seemingly unfixable issues, and there are a lot of design decisions made that I think are questionable at best.)
Ideas like "welllll I use poetry but pyenv works or you could use conda too" are incoherent. They're for different purposes and different users, with varying bits of overlap. The reason people are unsatisfied is because any given tool might be missing one of the specific things they want, unless it's really all-in-one like Uv seems like it intends to be eventually.
But once you have a truly all-in-one tool, you notice how little of it you're using, and how big it is, and how useless it feels to have to put the tool name at the start of every command, and all the specific little things you don't like about its implementation of whatever individual parts. Not to mention the feeling of "vendor lock-in". Never mind that I didn't pay money for it; I still don't want to feel stuck with, say, your build back-end just because I'm using your lock-file updater.
In short, I don't want a "package manager".
I want a solid foundation (better than Pip) that handles installing (not managing) applications and packages, into either a specified virtual environment or a new one created (not managed, except to make it easy to determine the location, so other tools can manage it) for the purpose. In other words, something that fully covers the needs of users (making it possible to run the code), while providing only the bare minimum on top of that for developers - so that other developer tools can cooperate with that. And then I want specialized tools for all the individual things developers need to do.
The specialized tools I want for my own work all exist, and the tools others want mostly exist too. Twine uploads stuff to PyPI; `build` is a fine build front-end; Setuptools would do everything I need on the back-end (despite my annoyances with it). I don't need a lockfile-driven workflow and don't readily think in those terms. I use pytest from the command line for testing and I don't want a "package manager" nor "workflow tool" to wrap that for me. If I needed any wrapping there I could do a few lines of shell script myself. If anything, the problem with these tools is doing too much, rather than too little.
The base I want doesn't exist yet, so I've started making it myself. Pipx is a big step in the right direction, but it has some arbitrary limitations (I discuss these and some workarounds in my recent blog post https://zahlman.github.io/posts/2025/01/07/python-packaging-... ) and it's built on Pip so it inherits those faults and is that much bigger. Uv is even bigger still for the compiled binary, and I would only be using the installation parts.
Virtualenv should have never existed in the first place. So you claiming that UV or whatever tool is doing too much, sounds to me like you're arguing based on "traditionalist" or "conservative" reasons rather than doing any technical thinking here.
Node.js's replacement for virtualenv is literally just a folder named "node_modules". Meanwhile Python has an entire tool with strange idiosyncrasies that you have to pay attention to, otherwise pip does the wrong thing by default.
It is as if python is pretending to be a special snowflake where installing libraries into a folder is this super hyper mega overcomplicated thing that necessitates a whole dedicated tool just to manage, when in reality in other programming languages nobody is really thinking about that the fact that the libraries end up in their build folders. It just works.
So again you're pretending that this is such a big deal that it needs a whole other tool, when the problem in question is so trivial that another tool is adding mental overhead with regard to the microscopic problem at hand.
Do you feel that Npm, mix, cargo went the wrong way, doing too much? It seems like their respective communities _love_ the standard tooling and all that it does. Or is Python fundamentally different?
Heck, you can get even cleaner than that by using uv’s support for PEP 723’s inline script dependencies:
# /// script
# requires-python = ">=3.12"
# dependencies = [
# "pandas",
# ]
# ///
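You don't even have to write that block by hand; uv can manage it for you (a small sketch, the file and package names are arbitrary):

uv add --script example.py pandas    # writes/updates the /// script /// block in example.py
uv run example.py                    # resolves and runs it in an ephemeral environment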
h/t https://simonwillison.net/2024/Dec/19/one-shot-python-tools/
I've started a repo with some of these scripts, the most recent one being my favorite: a wrapper for Microsoft AutoGen's very recent Magentic-1, a generalist LLM-Multi-Agent-System. It can use python code, the CLI, a browser (Playwright) and the file system to complete tasks.
A simple example I came across is having to rename some files:
1. you just open the shell in the location you want
2. and run this command:
uv run https://raw.githubusercontent.com/SimonB97/MOS/main/AITaskRu... "check the contents of the .md files in the working dir and structure them in folders"
There's a link to Magentic-1 docs and further info in the repo: https://github.com/SimonB97/MOS/tree/main/AITaskRunner (plus two other simple scripts).
I don't understand how things like this get approved into PEPs.
Seems like a great way to write self documenting code which can be optionally used by your python runtime.
As in, you think this shouldn't be possible or you think it should be written differently?
The PEP page is really good at explaining the status of the proposal, a summary of the discussion to date, and then links to the actual detailed discussion (in Discourse) about it:
It's helpful as a way to publish minimal reproductions of bugs and issues in bug reports (compared to "please clone my repo" which has so many layers of friction involved).
I would want distributed projects to do things properly, but as a way to shorthand a lot of futzing about? It's excellent
And people were laughing at PHP comments configuring framework, right?
With that expected use case of uv's script run command, it effectively makes those comments executable.
python's wheels are falling off at an ever faster and faster rate
I don't think this IS a PEP, I believe it is simply something the uv tool supports and as far as Python is concerned it is just a comment.
Is it possible for my IDE (vscode) to support this? Currently my IDE screams at me for using unknown packages and I have no type hinting, intellisense, etc.
With your python plugin you should be able choose .venv/bin/python as your interpreter after you've run `uv sync` and everything should resolve
And for the Jupyter setting, check out Trevor Manz's juv:
So it's like a shebang for dependencies. Cool.
As a NodeJS developer it's still kind of shocking to me that Python still hasn't resolved this mess. Node isn't perfect, and dealing with different versions of Node is annoying, but at least there's none of this "worry about modifying global environment" stuff.
Caveat: I'm a node outsider, only forced to interact with it
But there are a shocking number of install instructions that offer $(npm i -g) and if one is using Homebrew or nvm or a similar "user writable" node distribution, it won't prompt for sudo password and will cheerfully mangle the "origin" node_modules
So, it's the same story as with python: yes, but only if the user is disciplined
Now ruby drives me fucking bananas because it doesn't seem to have either concept: virtualenvs nor ./ruby_modules
It's worth noting that Node allows two packages to have the same dependency at different versions, which means that `npm i -g` is typically a lot safer than a global `pip install`, because each package will essentially create its own dependency tree, isolated from other packages. In practice, NPM has a deduplication process that makes this more complicated, and so you can run into issues (although I believe other package managers can handle this better), but I rarely run into issues with this.
That said, I agree that `npm i -g` is a poor system package manager, and you should typically be using Homebrew or whatever package manager makes the most sense on your system. That said, `npx` is a good alternative if you just want to run a command quickly to try it out or something like that.
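(For the try-it-out case, that looks something like this; cowsay is just a stand-in package:)

npx cowsay "hello"       # fetches the package into npm's cache and runs its CLI once
npm install -D cowsay    # or add it to the project so its binary lands in node_modules/.bin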
Because you don’t need virtualenvs or ruby_modules. You can have however many versions of the same gem installed it’s simply referenced by a gemfile, so for Ruby version X you are guaranteed one copy of gem version Y and no duplicates.
This whole installing the same dependencies a million times across different projects in Python and Node land is completely insane to me. Ruby has had the only sane package manager for years. Cargo too, but only because they copied Ruby.
Node has littered my computer with useless files. Python’s venv eat up a lot of space unnecessarily too.
Ruby has a number of solutions for this - rvm (the oldest, but less popular these days), rbenv (probably the most popular), chruby/gem_home (lightweight) or asdf (my personal choice as I can use the same tool for lots of languages). All of those tools install to locations that shouldn't need root.
The mainline ruby doesn't but tools to support virtualenvs are around. They're pretty trivial to write: https://github.com/regularfry/gemsh/blob/master/bin/gemsh
As long as you're in the ruby-install/chruby ecosystem and managed to avoid the RVM mess then the tooling is so simple that it doesn't really get any attention. I've worked exclusively with virtualenvs in ruby for years.
FWIW, you can usually just drop the `-g` and it'll install into `node_modules/.bin` instead, so it stays local to your project. You can run it straight out of there (by typing the path) or do `npm run <pkg>` which I think temporarily modifies $PATH to make it work.
Python has been cleaning up a number of really lethal problems like:
(i) wrongly configured character encodings (suppose you incorporated somebody else's library that does a "print" and the input data contains some invalid characters that wind up getting printed; that "print" could crash a model trainer script that runs for three days if error handling is set wrong and you couldn't change it when the script was running, at most you could make the script start another python with different command line arguments)
(ii) site-packages; all your data scientist has to do is
pip install --user
the wrong package and they've trashed all of their virtualenvs, all of their condas, etc. Over time the defaults have changed so Pythons aren't looking into the site-packages directories, but I wasted a long time figuring out why a team of data scientists couldn't get anything to work reliably.
(iii) "python" built into Linux by default. People expected Python to "just work" but it doesn't "just work" when people start installing stuff with pip, because you might be working on one thing that needs one package and another thing that needs another package, and you could trash everything you're doing with Python in the process of trying to fix it.
Unfortunately python has attracted a lot of sloppy programmers who think virtualenv is too much work and that it's totally normal for everything to be broken all the time. The average data scientist doesn't get excited when it crumbles and breaks, but you can't just call up some flakes to fix it. [1]
> Python has been cleaning up a number of really lethal problems like
I wish they would stick to semantic versioning tho.
I have used two projects that got stuck on incompatible changes within Python 3.x.
That is a fatal problem for Python. If a change in a minor version makes things stop working, it is very hard to recommend the system. A lot of work has gone down the drain, by this Python user, trying to work around that.
I don’t exactly remember the situation but a user created a python module named error.py.
Then in their main code they imported the said error.py but unfortunately numpy library also has an error.py. So the user was getting very funky behavior.
Half the time something breaks in a javascript repo or project, every single damn javascript expert in the team/company tells me to troubleshoot using the below sequence, as if throwing spaghetti on a wall with no idea what's wrong.
Run npm install
Delete node_modules and wait 30 minutes because it takes forever to delete 500MB worth of 2 million files.
Do an npm install again (or yarn install or that third one that popped up recently?)
Uninstall/Upgrade npm (or is it Node? No wait, npx I think. Oh well, used to be node + npm, now it's something different.)
Then do steps 1 to 3 again, just in case.
Hmm, maybe it's the lockfile? Delete it, one of the juniors pushed their version to the repo without compiling maybe. (Someone forgot to add it to the gitignore file?)
Okay, do steps 1 to 3 again, that might have fixed it.
If you've gotten here, you are royally screwed and should try the next javascript expert, he might have seen your error before.
So no, I'm a bit snarky here, but the JS ecosystem is a clustermess of chaos and should rather fix it's own stuff first. I have none of the above issues with python, a proper IDE and out of the box pip.
So you’re not experiencing exactly this with pip/etc? I hit this “just rebuild this 10GB venv” scenario like twice a day while learning ML. Maybe it’s just ML, but then regular node projects don’t have complex build-step / version-clash deps either.
The pain is real. Most of the issues are navigable, but often take careful thought versus some canned recipe. npm or yarn in large projects can be a nightmare. starting with pnpm makes it a dream. Sometimes migrating to pnpm can be rough, because projects that work may rely on incorrect, transitive, undeclared deps actually resolving. Anyway, starting from pnpm generally resolves this sort of chaos.
Most package managers are developed.
Pnpm is engineered.
It’s one of the few projects I donate to on GitHub
What kind of amateurs are you working with? I’m not a Node.js dev and even I know about npm ci command.
Sounds like a tale from a decade ago, people now use things like pnpm and tsx.
Environment and dependency management in JS-land is even worse.
Similar problems with runtime version management (need to use nvm for sanity, using built-in OS package managers seems to consistently result in tears).
More package managers and interactions (corepack, npm, pnpm, yarn, bun).
Bad package interop (ESM vs CJS vs UMD).
More runtimes (Node, Deno, Bun, Edge).
Then compound this all with the fact that JS doesn't have a comprehensive stdlib so your average project has literally 1000s of dependencies.
Valid criticisms, but the "standard" choices all work well. Nvm is the de facto standard for node version management, npm is a totally satisfactory package manager, node is the standard runtime that those other runtimes try to be compatible with, etc.
Will also note that in my years of js experience I've hardly ever run into module incompatibilities. It's definitely gnarly when it happens, but wouldn't consider this to be the same category of problem as the confusion of setting up python.
Hopefully uv can convince me that python's environment/dependency management can be easier than JavaScript's. Currently they both feel bad in their own way, and I likely prefer js out of familiarity.
>Similar problems with runtime version management (need to use nvm for sanity, using built-in OS package managers seems to consistently result in tears).
In practice I find this a nuisance but a small one. I wish there had been a convention that lets the correct version of Node run without me manually having to switch between them.
> More package managers and interactions (corepack, npm, pnpm, yarn, bun).
But they all work on the same package.json and node_modules/ principle, afaik. In funky situations, incompatibilities might emerge, but they are interchangeable for the average user. (Well, I don't know about corepack.)
> Bad package interop (ESM vs CJS vs UMD).
That is a whole separate disaster, which doesn't really impact consuming packages. But it does make packaging them pretty nasty.
> More runtimes (Node, Deno, Bun, Edge).
I don't know what Edge is. Deno is different enough to not really be in the same game. I find it hard to see the existence of Bun as problematic: it has been a bit of a godsend for me, it has an amazing ability to "just work" and punch through Typescript configuration issues that choke TypeScript. And it's fast.
> Then compound this all with the fact that JS doesn't have a comprehensive stdlib so your average project has literally 1000s of dependencies.
I guess I don't have a lot of reference points for this one. The 1000s of dependencies is certainly true though.
I've only recently started with uv, but this is one thing it seems to solve nicely. I've tried to get into the mindset of only using uv for python stuff - and hence I haven't installed python using homebrew, only uv.
You basically need to just remember to never call python directly. Instead use uv run and uv pip install. That ensures you're always using the uv installed python and/or a venv.
Python based tools where you may want a global install (say ruff) can be installed using uv tool
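Concretely, the habit looks something like this (a sketch; the script and package names are arbitrary):

uv venv                     # create .venv using the uv-managed interpreter
uv pip install requests     # install into that venv, no activation needed
uv run script.py            # always runs with the project's interpreter and venv
uv tool install ruff        # "global" tools get their own isolated environment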
> Python based tools where you may want a global install (say ruff) can be installed using uv tool
uv itself is the only Python tool I install globally now, and it's a self-contained binary that doesn't rely on Python. ruff is also self-contained, but I install tools like ruff (and Python itself) into each project's virtual environment using uv. This has nice benefits. For example, automated tests that include linting with ruff do not suddenly fail because the system-wide ruff was updated to a version that changes rules (or different versions are on different machines). Version pinning gets applied to tooling just as it does to packages. I can then upgrade tools when I know it's a good time to deal with potentially breaking changes. And one project doesn't hold back the rest. Once things are working, they work on all machines that use the same project repo.
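For the per-project tooling part, a minimal sketch (exact flags per the uv docs):

uv add --dev ruff                  # pin the linter in the project like any other dependency
uv run ruff check .                # CI and teammates get exactly the pinned ruff version
uv lock --upgrade-package ruff     # bump it deliberately, when you're ready for rule changes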
If I want to use Python based tools outside of projects, I now do little shell scripts. For example, my /usr/local/bin/wormhole looks like this:
#!/bin/sh
uvx \
--quiet \
--prerelease disallow \
--python-preference only-managed \
--from magic-wormhole \
wormhole "$@"
>You basically need to just remember to never call python directly. Instead use uv run and uv pip install.
I don't understand why people would rather do this part specfically, rather than activate a venv.
Because node.js isn't a dependency of the Operating system.
Also we don't have a left pad scale dependency ecosystem that makes version conflicts such a pressing issue.
Oh, tell us the OS can't venv itself a separate Python root and keep itself away from whatever the user invents to manage deps. This is a non-explanation appealing to authority, while it's clearly just a mess lacking any thought. It just works like this.
> we don't have a left pad scale dependency ecosystem that makes version conflicts such a pressing issue
TensorFlow.
This. IME, JS devs rarely have much experience with an OS, let alone Linux, and forget that Python literally runs parts of the OS. You can’t just break it, because people might have critical scripts that depend on the current behavior.
I think it makes sense given that people using python to write applications are a minority of python users. It's mostly students, scientists, people with the word "analyst" in their title, etc. Perhaps this goes poorly in practice, but these users ostensibly have somebody else to lean on re: setting up their environments, and those people aren't developers either.
I have to imagine that the python maintainers listen for what the community needs and hear a thousand voices asking for a hundred different packaging strategies, and a million voices asking for the same language features. I can forgive them for prioritizing things the way they have.
I'm not sure I understand your point. Managing dependencies is easy in node. It seems to be harder in Python. What priority is being supported here?
Hot take: pnpm is the best dx, of all p/l dep toolchains, for devs who are operating regularly in many projects.
Get me the deps this project needs, get them fast, get them correctly, all with minimum hoops.
Cargo and deno toolchains are pretty good too.
Opam, gleam, mvn/gradle, stack, npm/yarn, nix even, pip/poetry/whatever-python-malarkey, go, composer, …what other stuff have I used in the past 12 months… C/C++ doesn't really have a first-class standard other than global sys deps (so I'll refer back to nix or OS package managers).
Getting the stuff you need where you need it is always doable. Some toolchains are just above and beyond, batteries included, ready for productivity.
Have you used bun? It's also great. Super fast
pnpm is the best for monorepos. I've tried yarn workspaces and npm's idea of it and nothing comes close to the DX of pnpm
I swear I'm not trolling: what do you not like about modern golang's dep management (e.g. go.mod and go.sum)?
I agree that the old days of "there are 15 dep managers, good luck" was high chaos. And those who do cutesy shit like using "replace" in their go.mod[1] is sus but as far as dx $(go get) that caches by default in $XDG_CACHE_DIR and uses $GOPROXY I think is great
1: https://github.com/opentofu/opentofu/blob/v1.9.0/go.mod#L271
When was the last time you saw a NodeJS package that expects to be able to compile Fortran code at installation time?
If you want Numpy (one of the most popular Python packages) on a system that doesn't have a pre-built wheel, you'll need to do that. Which is why there are, by my count, 54 different pre-built wheels for Numpy 2.2.1.
And that's just the actual installation process. Package management isn't solved because people don't even agree on what that entails.
The only way you avoid "worry about modifying the global environment" is to have non-global environments. But the Python world is full of people who refuse to understand that concept. People would rather type `pip install suspicious-package --break-system-packages` than learn what a venv is. (And they'll sometimes do it with `sudo`, too, because of a cargo-cult belief that this somehow magically fixes things - spoilers: it's typically because the root user has different environment variables.)
Which is why this thread happened on the Python forums https://discuss.python.org/t/the-most-popular-advice-on-the-... , and part of why the corresponding Stack Overflow question https://stackoverflow.com/questions/75608323 has 1.4 million views. Even though it's about an error message that was carefully crafted by the Debian team to tell you what to do instead.
It is kind of solved, but not default.
This makes a big difference. There is also the social problem of Python community with too loud opinions for making a good robust default solution.
But same has now happened for Node with npm, yarn and pnpm.
I wouldn't really say it's that black and white. It was only recently that many large libraries and tools recommended starting with "npm i -g ...". Of course you could avoid it if you knew better, but the same is true for Python.
How has NodeJS solved it? There are tons of version managers for Node.
Node hasn't solved this mess because it doesn't have the same mess.
It has a super limited compiled extensions ecosystem, plugin ecosystem and is not used as a system language in mac and linux.
And of course node is much more recent and the community less diverse.
tldr: node is playing in easy mode.
I usually stay away far FAR from shiny new tools but I've been experimenting with uv and I really like it. I'm a bit bummed that it's not written in Python but other than that, it does what it says on the tin.
I never liked pyenv because I really don't see the point/benefit building every new version of Python you want to use. There's a reason I don't run Gentoo or Arch anymore. I'm very happy that uv grabs pre-compiled binaries and just uses those.
So far I have used it to replace poetry (which is great btw) in one of my projects. It was pretty straightforward, but the project was also fairly trivial/typical.
I can't fully replace pipx with it because 'uv tool' currently assumes every Python package only has one executable. Lots of things I work with have multiple, such as Ansible and Jupyterlab. There's a bug open about it and the workarounds are not terrible, but it'd be nice if they are able to fix that soon.
uv is great, but downloading and installing the base Python interpreter is not a good feature, as it doesn't fetch that from the PSF but from a project on GitHub. That very same project says it is compiled for portability over performance; see https://gregoryszorc.com/docs/python-build-standalone/main/
>that very same project says this is compiled for portability over performance
Realistically, the options on Linux are the uv way, the pyenv way (download and compile on demand, making sure users have compile-time dependencies installed as part of installing your tool), and letting users download and compile it themself (which is actually very easy for Python, at least on my distro). Compiling Python is not especially fast (around a full minute on my 4-core, 10-year-old machine), although I've experienced much worse in my lifetime. Maybe you can get alternate python versions directly from your distro or a PPA, but not in a way that a cross-distro tool can feasibly automate.
On Windows the only realistic option is the official installer.
But PSF doesn't distribute binary builds, so what's the alternative?
that it does it automatically is weird
There's so many more!
1. `uvx --from git+https://github.com/httpie/cli httpie`
2. https://simonwillison.net/2024/Aug/21/usrbinenv-uv-run/ uv in a shebang
Yes! since that Simon Willison article, I've slowly been easing all my scripts into just using a uv shebang, and it rocks! I've deleted all sorts of .venvs and whatnot. really useful
The uv shebang is definitely the killer feature for me, especially with so much of the AI ecosystem tied up in Python. Before, writing Python scripts was a lot more painful requiring either a global scripts venv and shell scripts to bootstrap them, or a venv per script.
I’m sure it was already possible with shebangs and venv before, but uv really brings the whole experience together for me so I can write python scripts as freely as bash ones.
Super neat re Willison article.. would something like this work under powershell though?!
The activation of the virtualenv is unnecessary (one can execute pip/python directly from it), and the configuring of your local pyenv interpreter is also unnecessary, it can create a virtual environment with one directly:
pyenv virtualenv python3.12 .venv
.venv/bin/python -m pip install pandas
.venv/bin/python
Not quite one command, but a bit more streamlined, I guess.
Note that in general calling the venv python directly vs activating the venv are not equivalent.
E.g. if the thing you run invokes python itself, it will use the system python, not the venv one in the first case.
Surely if you want to invoke python you call sys.executable otherwise if your subprocess doesn’t inherit PATH nothing will work with uv or without uv
Indeed, you're right ;).
This is super cool, personally:
uv run --python 3.12 --with label-studio label-studio
Made my life so much easier
Uv also bundles uvx command so you can run Python scripts without installing them manually:
uvx --from 'huggingface_hub[cli]' huggingface-cli
And there's also the `uv run script.py` where you can have dependencies indicated as comments in the script, see eg https://simonwillison.net/2024/Dec/19/one-shot-python-tools/
Neat!
Ok, this must be a dumb question answered by the manual, but I still haven't got my hands on uv, so: but does it solve the opposite? I mean, I pretty much never want any "ad-hoc" environments, but I always end up with my .venv becoming an ad-hoc environment, because I install stuff while experimenting, not bothering to patch requirements.txt, pyproject.toml or anything of the sort. In fact, now I usually don't even bother typing pip install, PyCharm does it for me.
This is of course bad practice. What I would like instead is what PHP's composer does: installing stuff automatically changes pyproject.toml (or whatever the standard will be with uv), automatically freezes the versions, and then it is on git diff to tell me what I did last night; I'll remove a couple of lines from that file, run composer install, and it will remove packages not explicitly added to my config from the environment. Does this finally get easy to achieve with uv?
I think it does! uv add [0] adds a dependency to your pyproject.toml, as well as your environment.
If you change your pyproject.toml file manually, uv sync [1] will update your environment accordingly.
[0]: https://docs.astral.sh/uv/guides/projects/#managing-dependen... [1]: https://docs.astral.sh/uv/reference/cli/#uv-sync
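In other words, roughly this, as I understand it (package name is just an example):

uv add requests    # writes the dependency to pyproject.toml, updates uv.lock and the venv
# ...later, delete that line from pyproject.toml (or use `uv remove requests`), then:
uv sync            # re-locks and drops anything no longer required by pyproject.toml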
If I read [1] correctly, it seems it checks against lockfile, not pyproject.toml. So it seems like it won't help if I change pyproject.toml manually. Which is a big inconveniece, if so.
Whatever, I think I'll try it for myself later today. It's long overdue.
>installing stuff automatically changes pyproject.toml (or whatever the standard will be with uv)
pyproject.toml represents an inter-project standard and Charlie Marsh has committed to sticking with it, along with cooperating with future Python packaging PEPs. But while you can list transitive dependencies, specify exact versions etc. in pyproject.toml, it's not specifically designed as a lockfile - i.e., pyproject.toml is meant for abstract dependencies, where an installer figures out transitively what's needed to support them and decides on exact versions to install.
The current work for specifying a lockfile standard is https://peps.python.org/pep-0751/ . As someone else pointed out, uv currently already uses a proprietary lockfile, but there has been community interest in trying to standardize this - it just has been hard to find agreement on exactly what it needs to contain. (In the past there have been proposals to expand the `pyproject.toml` spec to include other information that lockfiles often contain for other languages, such as hashes and supply-chain information. Some people are extremely against this, however.)
As far as I know, uv isn't going to do things like analyzing your codebase to determine that you no longer need a certain dependency that's currently in your environment and remove it (from the environment, lock file or `pyproject.toml`). You'll still be on the hook for figuring out abstractly what your project needs, and this is important if you want to share your code with others.
> uv isn't going to do things like analyzing your codebase
Sure, that's not what I meant (unless we call pyproject.toml a part of your codebase, which it kinda is, but that's probably not what you meant).
In fact, as far as I can tell from your answer, Python does move in the direction I'd like it to move, but it's unclear by how far it will miss and if how uv handles it is ergonomical.
As I've said, I think PHP's composer does a very good job here, and to clarify, this is how it works. There are 2 files: composer.json (≈pyproject.toml) and composer.lock (≈ PEP751) (also json). The former is kinda editable by hand, the latter you ideally never really touch. However, for the most part composer is smart enough that it edits both files for you (with some exceptions, of course), so every time I run `composer require your/awesomelib` it
1) checks the constraints in these files
2) finds latest appropriate version of your/awesomelib (5.0.14) and all its dependencies
3) writes "your/awesomelib": "^5.0"
4) writes "your/awesomelib": "5.0.14" and all its dependencies to composer.lock (with hashsums, commit ids and such)
It is a good practice to keep both inside of version control, so when I say "git diff tells me what I did last night" it means that I'll also see what I installed. If (as usual) most of it is some useless trash, I'll manually remove "your/awesomelib" from composer.json, run `composer install` and it will remove it and all its (now unneeded) dependencies. As the result, I never need to worry about bookkeeping, since composer does it for me, I just run `composer require <stuff>` and it does the rest (except for cases when <stuff> is a proprietary repo on company's gitlab and such, then I'll need slightly more manual work).
That is, what I hope to see in Python one day (10 years later than every other lang did it) is declarative package management, except I don't want to have to modify pyproject.toml manually, I want my package manager do it for me, because it saves me 30 seconds of my life every time I install something. Which accumulates to a lot.
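For what it's worth, the uv analogue of that composer flow seems to be roughly this, as far as I can tell (package name is an example):

uv add awesomelib                 # picks a compatible version, writes the range to pyproject.toml
                                  # and the exact pins (with hashes) to uv.lock
git diff pyproject.toml uv.lock   # the morning-after review
uv remove awesomelib              # drops it and its now-unneeded dependencies
uv sync                           # makes the venv match the files again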
I'm not an expert, but as far as I can tell UV allows you to do this without feeling so guilty (it handles multiple versions of Python and libraries AFAIK quite well).
I haven't tried it, but I think so? https://docs.astral.sh/uv/concepts/projects/layout/#the-lock...
Why can't python just adopt something like yarn/pnpm, and effing stop patch-copying its binaries into a specific path? And pick up site_packages from where it left it last time? Wtf. How hard is it to just pick a python-modules directory and a python-project.json and sync it into correctness by symlink/mklink-ing missing folders from a package cache in there in a few seconds?
Every time when I have to reorganize or upgrade my AI repos, it’s yet another 50GB writes to my poor ssd. Half of it is torch, another half auto-downloaded models that I cannot stop because they become “downloaded” and you never know how to resume it back or even find where they are cause python logging culture is just barbaric.
I'm waiting for this issue to be done: Add an option to store virtual environments in a centralized location outside projects https://github.com/astral-sh/uv/issues/1495
I have used virtualenvwrapper before and it was very convenient to have all virtual environments stored in one place, like ~/.cache/virtualenvs.
The .venv in the project directory is annoying because when you copy the folder somewhere you start copying gigabytes of junk. Some tools like rsync don't handle CACHEDIR.TAG (but you can use --exclude .venv).
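In the meantime, excluding it when copying works well enough (a one-liner sketch; the paths are illustrative):

  rsync -a --exclude .venv myproject/ backup/myproject/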
Python package management has always seemed like crazyland to me. I've settled on Anaconda as I've experimented with all the ML packages over the years, so I'd be interested to learn why uv, and also what/when are good times to use venv/pip/conda/uv/poetry/whatever else has come up.
NeutralCrane has a really helpful comment below[0], would love to have a more thorough post on everything!
If you use conda, and can use conda for what you need to do, use conda w/ conda-forge. It has a much better story for libraries with binary dependencies, whereas PyPI (which `uv` uses) is basically full of static libraries that someone else compiled and promises to work.
Note, I use PyPI for most of my day-to-day work, so I say this with love!
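If you do go the conda route, pinning the channel up front avoids a lot of mixed-channel pain (a minimal sketch; the env name and packages are illustrative):

  conda create -n sci -c conda-forge python=3.12 numpy scipy
  conda activate sci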
Ridiculous post:
The author says that a normal route would be:
- Take the proper route:
  - Create a virtual environment
  - pip install pandas
  - Activate the virtual environment
  - Run python
Basically, out of the box, when you create a virtual environment it is immediately activated. And you would obviously need to have it activated before doing a pip install... In addition, in my opinion this is the thing that would suck about uv: having different functions tied to a single tool invocation.
It is a breeze to activate a venv and be done with it: you can run your program multiple times in one go, even with crashes, install more dependencies, test it in the REPL, ...
Hey, I actually made a silly mistake in my post, indeed you first activate the environment and then install stuff in it. Fixed!
I disagree that it is activated immediately, though; at least for me, with venv I always have to activate it explicitly.
You can still use traditional venvs with UV though, if you want.
Uh, but then you don't really need uv, right?
Been using conda for years with multiple projects, each of which has numerous environments (for different versions). Fairly large, complex environments with cuda, tf, jax, etc. It has always worked well, and my biggest complaint - the sluggish resolver - was largely addressed with the mamba resolver. Packages not available on conda-forge can be installed into the conda env with pip. Maybe I'm missing something, but it's not clear to me what advantage uv would provide over conda.
It is very difficult for most conda users to maintain conda environments. They use the same env for nearly all their work, don’t understand the hierarchical nature of conda envs, don’t know which one they’re installing into, install stuff with pip without recording it in their env file, etc. The worst local environment messes I’ve ever seen always involve conda.
It can be used effectively, but does not make it easy to do so.
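For instance, the pip installs people forget about can at least be recorded in the env file; a sketch of an environment.yml (names are illustrative):

  name: analysis
  channels:
    - conda-forge
  dependencies:
    - python=3.12
    - numpy
    - pip
    - pip:
      - some-pypi-only-package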
uv makes itself harder to misuse
I believe pixi, by a different group (prefix.dev, not Astral), is the better comparison to conda.
What would be interesting is if you could do something similar for IPython/Jupyter Notebooks: while front-ends like JupyterLab and VS Code Notebooks do let you select a .venv if present in the workspace, it's annoying to have to set one up and build one for every project.
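As far as I can tell, uv can already get you most of the way there by pulling Jupyter into a throwaway environment for the run (a hedged sketch; add any project deps with extra --with flags):

  uv run --with jupyter jupyter lab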
When using the Maven build tool with Java, the downloaded artifacts always have the version number in the filename (artifact-version.jar), which means there can be multiple versions stored in parallel, cached globally, and cherry-picked without any ambiguity. The first time I used Node and Python, I was shocked that the version number is not part of any downloaded artifact. Versioning the dependencies is such a fundamental need that having it be part of the artifact file itself seems like common sense to me. Can anyone please explain why the Python/Node build tools do not follow that?
Version number is part of every wheel and sdist filename.
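For example, a wheel filename follows the pattern {distribution}-{version}-{python tag}-{abi tag}-{platform tag}.whl, so a pandas release ships files like this (illustrative):

  pandas-2.1.4-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl

and the cache (pip's or uv's) can hold many versions side by side.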
For anyone that used rye, it's worth noting that the creator of rye recommends using uv. Also, rye is going to be continually updated to just interface with uv until rye can be entirely replaced by uv.
I believe they are from the same author, Charlie Marsh / Astral
No, Armin created rye and then gave it to Astral.
I want to like uv, but unfortunately there's some kind of technical distinction between a Python "package manager" and a "build system". Uv doesn't include a "build system", but encourages you to use some other one. The net result is that external dependencies don't build the same as on Poetry, don't work, and uv points the finger at some other dependency.
I do hope the situation changes one day. Python packaging is such a mess, but Poetry is good enough and actually works, so I'll stick with it for now.
It's not "some sort of technical distinction". Package managers are for keeping track of which pieces of code you need in your project's environment. Build systems are for... building the code, so that it can actually be used in an environment.
Usually, you can directly provide a pre-built wheel, and then an installer like pip or uv can just unpack it into the environment. If it needs to be built on the user's machine, then you offer an sdist, which specifies its build backend. The installer acts as a build frontend: it downloads and sets up the specified backend, asks it to make a wheel from the sdist, then installs the wheel.
Poetry's build backend (`poetry.masonry`) doesn't build your external dependencies unless a) you obtain an sdist and b) the sdist says to use that backend. And in these cases, it doesn't matter what tools you're using. Your installer (which could be Pip, which is not a package manager in any meaningful sense) can work with `poetry.masonry` just fine.
If you can give a much more specific, simple, reproducible example of a problem you encountered with external dependencies and uv, I'll be happy to try to help.
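For reference, "picking a build system" here is just a couple of lines in pyproject.toml; a minimal sketch, using hatchling as an arbitrary example backend:

  [build-system]
  requires = ["hatchling"]
  build-backend = "hatchling.build"

uv (or pip) will then invoke that backend whenever the project needs to be built.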
Maybe the docs are misleading? It seems that if I want my package to be installed, I need to pick a build system, regardless of whether I am using any native code. https://docs.astral.sh/uv/concepts/projects/init/#packaged-a...
> Package managers are for keeping track of which pieces of code you need in your project's environment. Build systems are for... building the code, so that it can actually be used in an environment.
This probably means something to the developers of the package managers and build systems, but to me, as a Python developer who wants to be able to publish a pure Python CLI program to PyPI, it seems like a distinction without a difference.
> Uv doesn't include a "build system", but encourages you to use some other one.
Personally I consider this one of uv's greatest strengths. The inflexibility and brittleness of Poetry's build system is what made me give up on poetry entirely. Had poetry made it easy to plug in a different build system I might never have tried uv.
OK, I'm convinced. I just installed uv. Thanks for sharing!
Ditto. This is pretty cool!
Sometimes, only a specific wheel is available (e.g. on Nvidia's Jetson platform where versions are dictated by the vendor).
Can uv work with that?
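I'd expect so, since uv's pip interface accepts direct wheel URLs/paths the way pip does (a sketch; this URL is made up):

  uv pip install https://example.com/jetson/torch-2.1.0-cp310-cp310-linux_aarch64.whl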
Even better would be if you could specify the dependencies inside of the script.
Edit: might be possible now? https://simonwillison.net/2024/Dec/19/one-shot-python-tools/
Small ordering mistake in the "right route" - you should first activate the virtual environment you just created and then install pandas.
uv implements PEP 723.
https://packaging.python.org/en/latest/specifications/inline...
Especially useful if the script has dependencies on packages in private repos.
I really thought this would mention uv script deps (standardized by some PEP) together with a `#!/usr/bin/env -S uv run` shebang line, which automatically installs the deps on execution.
Has been super useful to write single-file tools/scripts which LLMs can understand and run easily.
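A minimal sketch of such a script (the dependency and URL are just illustrative):

  #!/usr/bin/env -S uv run
  # /// script
  # requires-python = ">=3.12"
  # dependencies = ["requests"]
  # ///
  import requests
  print(requests.get("https://example.com").status_code)

Mark it executable, and running ./script.py resolves and installs requests into an ephemeral environment before executing.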
What's the point if you have other binary dependencies?
Use Nix for Python version as well as other bin deps, and virtualenv + pip-tools for correct package dependency resolution.
Waiting 4s for pip-tools instead of 1ms for uv doesn't change much if you only run it once a month.
I love this. The biggest problem I have right now with Python scripts is distributing my single-file utility scripts (random ops scripts).
I wish there was a way to either shebang something like this or build a wheel that has the full venv inside.
There’s a shebang now. As of PEP 722 you can declare dependencies in a comment at the top of a single-file script that a package manager can choose to read and resolve.
uv has support for it: https://docs.astral.sh/uv/guides/scripts/#running-a-script-w... (which only helps if your team is all in on uv, but maybe they are)
PEP 722 was rejected. You are thinking of PEP 723, which was very similar to 722 in goals.
https://discuss.python.org/t/pep-722-723-decision/36763 contains the reasoning for accepting 723 and rejecting 722.
How does that work with the shebang?
Do other package managers support this yet?
https://peps.python.org/pep-0723/ is at the very least related. It's a way of specifying the metadata in the script, allowing other tools to do the right thing. One of the use cases is:
> A user facing CLI that is capable of executing scripts. If we take Hatch as an example, the interface would be simply hatch run /path/to/script.py [args] and Hatch will manage the environment for that script. Such tools could be used as shebang lines on non-Windows systems e.g. #!/usr/bin/env hatch run
https://micro.webology.dev/2024/08/21/uv-updates-and.html shows an example with uv:
> With this new feature, I can now instruct users to run uv run main.py without explaining what a venv or virtualenv is, plus a long list of requirements that need to be passed to pip install.
That ends:
> PEP 723 also opens the door to turning a one-file Python script into a runnable Docker image that doesn’t even need Python on the machine or opens the door for Beeware and Briefcase to build standalone apps.
You are in luck https://docs.astral.sh/uv/guides/scripts/#declaring-script-d...
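If I remember the docs right, uv can even write that metadata block for you (a hedged sketch; the filename is illustrative):

  uv add --script tool.py requests   # embeds the dependency in tool.py's inline metadata
  uv run tool.py                     # runs it in an ephemeral environment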
You mean like pyinstaller https://pyinstaller.org that takes your Python and makes a standalone, self-extracting or onedir archive, to convert your ops script plus dependencies into something you can just distribute like a binary?
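The basic flow, for anyone who hasn't tried it (a sketch; the script name is illustrative):

  pip install pyinstaller
  pyinstaller --onefile my_ops_script.py
  # the self-contained executable ends up in dist/my_ops_script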
This makes sense when you need to provision Python itself to the end user, not just third-party libraries.
I used to have pyenv, asdf or mise to manage Python versions (never used conda unless I needed a DL lib like pytorch). Now just uv is enough.
I do like uv and hope to try it soon but I don't get the point of the article.
Pyenv + poetry already gives you the ability to "pull in local dependencies". Yes, you have to create a virtual environment and it's not "ad-hoc".
But if you're going to pull in a bunch of libraries, WHY would you want to invoke Python and all your work dependencies in a one-liner? Isn't it much better and easier to just spell out the dependencies in a pyproject.toml? How "ad-hoc" are we talking here?
Yes! I love verbosity. It gives me job security. I’m tired of these tools making my job easier.
Previously I could allocate a whole week to setup initial scaffold for the project. Also more tools - more failure points, so I can flex on stupid juniors how smart I am. Now I can’t even go to pee with how fast and easy this freaking uv is. WTF.
Well I guess I am not smart enough to dump multiple dependencies + python, densely, on one line to spin everything up so I can do “ad-hoc” computing without just spelling them out in a file. Sorry, but that just isn’t a killer feature for me and it doesn’t seem like a big deal anyway.
I do like the idea of getting rid of pyenv though. And since poetry has failed to become as widespread as I hoped, maybe uv has a better shot?
That’s like a killer-app-type feature. However, it says "ad-hoc", so you probably can’t get back to that setup easily.
I honestly really hate the venv ergonomics, but uv does still depend on it as the golden path if you don’t use the --with flags (in my understanding). Is there a way to do a clean break with just the new --script inline dependencies, or is that wrong/suboptimal?
You can definitely do that — it's just sub-optimal when you have multiple files that share dependencies.
But I still need pip to install uv, right? Or download it using a one-liner alternatively.
You can install it in several ways without pip; the easiest is the standalone installer (which can upgrade itself).
cargo install
One useful uv alias I use is uvsys='uv pip install --system'
So I can just do uvsys {package} for a quick and dirty global install. I'm so used to pip install being global by default that just making this shorthand makes things a bit easier.
this sounds like it’s asking for trouble! very easy to mess up your whole system
I would highly recommend only using --system in a Docker container or similar.
Can you also specify which version of pandas to use?
Of course!
uv run -q --with pandas==2.1.4 python -c "import pandas; print(pandas.__version__)"
2.1.4
uv does not have (nor do they plan to add) support for conda, and that is a deal-breaker.
I can't see why anyone is using Conda in 2025. In 2018, yeah, pip (now uv) was hard and you could get a "just works" experience installing Tensorflow + NVIDIA on Conda. In 2023 it was the other way around and it still is.
Well, when you're building Python packages that have non-Python dependencies and a big chunk of your users are on Windows, conda is the only option, even in 2025 :)
Examples include quant libraries, in-house APIs/tools, etc.
Why would it be a deal breaker? uv would replace conda. And I hope it does. Conda has been such a headache for me when I've used it in the past. If the Python community (particularly the ML/academic crowd) could move on from conda, it would be a great thing.
uv can’t replace conda, any more than it can replace apt or nix.
Conda packages general binary software, not just Python packages; uv handles only Python packages.
Pixi might be something worth looking at, if you want a uv-style conda equivalent.
Fun fact: pixi uses uv as a library to install pypi packages!
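The workflow is conda-ish but project-scoped, roughly like this (a sketch, assuming pixi's current CLI, with illustrative names):

  pixi init demo && cd demo
  pixi add python=3.12 numpy   # conda-forge packages
  pixi add --pypi requests     # PyPI packages, resolved via uv under the hood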
Interesting, will check it out.
pixi seems fine, but it also is just using mamba on the backend so you might as well continue to use miniforge
That doesn't make sense, respectfully.
I've replaced the linkbait title with an attempt at saying what the feature is. If there's a more accurate wording, we can change it again.
I don't feel strongly, but as a uv author, I found "local dependencies" misleading. It's more like "uv's killer feature is making ad-hoc environments easy".
When we talk about local dependencies in the Python packaging ecosystem, it's usually adding some package on your file system to your environment. The existing title made me think this would be about the `[tool.uv.sources]` feature.
Really, it's about how we create environments on-demand and make it trivial to add packages to your environment or try other Python versions without mutating state.
Sorry dang, didn't know the practice + got a bit emotional haha. I agree with the remark above; my message is rather about how easy it is to run Python scripts with dependencies (without mutating the state).
Happy to take correction from an author! I've switched the wording above.
uh, thanks I guess.
It's just standard practice here. See https://news.ycombinator.com/newsguidelines.html.
How about "A UV feature that intrigues me most"?
[flagged]
>you need a virtual environment for some reason
You have always needed one, practically speaking. Python isn't designed to have multiple versions of the same library in the same runtime environment. A virtual environment is just a separate place to put the packages you need for the current project, so that they're isolated from other packages, and thus you don't get version conflicts. This includes the system packages. If you want to play with, say, the latest version of Requests, and you try sudo installing that in a system environment, and it happens that the latest version of Requests breaks Apt (which is written in Python), you're in for a bad time.
The new warning is because even user-level installations can mess with system scripts, when those scripts are run without sudo. Also, Apt has no real way to know about or understand anything you do with Pip, so that interferes with Apt's actual package management.
>installing packages [with] sudo doesn't make them available to other users
If you use sudo to install packages for the system Python, then yes they absolutely are available to all users. But you don't see them in virtual environments by default (you can change this) because the default is to ignore the system installation's `site-packages` completely (including user-level installations).
> on ubuntu it seems pip has been replaced with 'python-*' debian packages
None of this is new, and it doesn't even remotely "replace" Pip. You're just facing a little more pressure to actually use the system package manager when installing packages for your system, since that can actually manage packages, and integrate them with the rest of your system (the non-Python parts). The Debian packages are specifically vetted and tested for this purpose and may include Canonical's own patches that you won't get from PyPI. On the other hand, PyPI provides vastly more different packages.
When you install in a virtual environment, you'll generally use Pip to do it (unless you use uv etc.). Because the environment is specifically created to be isolated from your system, so that Apt doesn't have to care.
Please see https://stackoverflow.com/questions/75608323 for details. It wasn't a snap decision; see https://discuss.python.org/t/pep-668-marking-python-base-env... for context. Arch implements analogous protections, too, for the same reasons (https://www.youtube.com/watch?v=35PQrzG0rG4). I recall Fedora having similar plans but I didn't hear about it being implemented yet.
wow, they've re-invented a tiny bit of Nix, purely legend!
A few months ago I saw someone hacking the linker to get mundane package management working in Nix. It was bubbling up to the top of my "to try" list and that bumped it back down. It'll be good eventually, I'm sure.
> It'll be good eventually, I'm sure.
not with this attitude of getting scared of things by watching someone do something, for sure
That you can use without having 4 PhDs. It's pretty good. You should try it sometime when you're done fully ingesting algebraic topology theory or whatever the fuck Nix requires you to know just to install figlet.
Try flox [0]. It's an imperative frontend for Nix that I've been using. I don't know how to use nix-shell/flakes or whatever it is they do now, but flox makes it easy to just install stuff.
[0]: https://flox.dev/
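The basic flow is something like this, if I remember it right (a sketch, assuming flox's current CLI):

  flox init            # create an environment tied to this directory
  flox install figlet  # add a package from the catalog
  flox activate        # drop into a shell with it on PATH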
> You should try it sometime when you're done fully ingesting algebraic topology theory or whatever the fuck Nix requires you to know
aka how to say that you've never really tried learning Nix without saying it directly.
Oh come on, it's not that hard even for packaging stuff (let alone usage). Quite trivial compared to leetcode grind I'd say.
Nix people are more annoying than Rust Defense Force.
I use Windows, and not WSL. Nix does literally nothing for me.
[flagged]