Oh wow, my package on the front page again. Glad that it's still being used.
This was written 10 years ago, when I was struggling with pulling and installing projects that didn't have a requirements.txt. It was frustrating and time-consuming to get everything up and running, so I decided to fix it; apparently many other developers had the same issue.
[Update]: Though I do think the package is already at a level where it does one thing and does it well, I'm still looking for maintainers to improve it and move it forward.
Please consider consolidating python dependency management instead of fragmenting it:
https://docs.astral.sh/uv/pip/compile/
In other words, bring the thinking here. Whether it's new thinking or decade-old thinking, it's well past time to converge. We've had decades of trying one variety after another, like Heinz catsup. Worth putting more wood behind fewer arrows.
This is not fragmenting it. The requirements.txt file has been a steady and well-used API for well over a decade. This tool just helps you produce one for a project that is missing one.
One can start non-disruptively with uv by using it only as a pipx replacement at first (uv tool). Nice and fast.
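If I'm reading the uv docs right, the pipx-style usage is just this (pipreqs here is only an example tool):

    uv tool install pipreqs   # like pipx install: isolated env, binary on PATH
    uvx pipreqs --help        # like pipx run: one-off execution without installing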
How about you stop trying to create package managers for every single language/ecosystem in existence and instead converge on trying to solve the whole problem once and for all with Nix.
Asking honestly since I've seen Nix and NixOS show up on the front page here a bunch over the years, but never used it: my impression of it is that it fills the same kind of niche as the Zig or Nim languages: conceptually pretty cool, but not widely adopted outside of a dedicated core user group.
Is this really the case for Nix, or is it actually widely adopted, and this adoption is underreported?
If it's not actually widely adopted, what do you think are the biggest obstacles in Nix's way?
> Please consider consolidating python dependency management instead of fragmenting it
You realize that this project predates uv by roughly a decade, right?
> You realize that this project predates uv by roughly a decade, right?
"Whether it's new thinking, or decade old thinking, it's well time to converge."
Can you expand on what you mean, exactly?
You're just picking a winner like every other Python dependency project. If it's not in a PEP do whatever the hell you want. Good ideas will get turned into future PEPs for tooling to standardize on. uv itself has two separate locking formats.
How is uv, another flavor-of-the-month Python packaging tool, not contributing to fragmentation?
uv, unlike all the other options, has the best shot at becoming the one.
Please consider consolidating python dependency management instead of fragmenting it:
https://github.com/mamba-org/mamba
In other words, bring the thinking here. Whether it's new thinking or decade-old thinking, it's well past time to converge. We've had decades of trying one variety after another, like Heinz catsup. Worth putting more wood behind fewer arrows.
> Please consider consolidating python dependency management instead of fragmenting it: https://github.com/mamba-org/mamba
Mamba doesn't even interact with the official Python packaging ecosystem... It is purely a conda replacement, and conda packaging is not Python packaging (it's a more fundamental fragmentation than choosing a particular tool). So it's a weird choice for not fragmenting Python dependency management.
If you depend on both conda and standard Python packaging (e.g. PyPI.org) you can use Pixi: https://github.com/prefix-dev/pixi?tab=readme-ov-file#pixi-p.... Pixi attempts to bridge the conda world and the Python package world, so it's possible to rely more deeply on standards based packaging but still use conda tooling when you need it.
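For the curious, a minimal sketch of what that bridging looks like in a pixi.toml (field names as I understand them from the pixi docs; the packages are just examples):

    [project]
    name = "demo"
    channels = ["conda-forge"]
    platforms = ["linux-64"]

    [dependencies]          # resolved from conda channels
    python = "3.12.*"

    [pypi-dependencies]     # resolved from PyPI, installed into the same environment
    pipreqs = "*"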
While I don't know which (if any) Python package manager will "win", I'm confident it won't be Conda or similar. I hope and pray not, anyway.
Yes, the entire workflow of conda/mamba is a liability. Having environments detached from directories encourages bad practices (reusing environments for unrelated things, installing everything into the same environment and getting very broken dependency resolutions, that stuff) and it doesn't even have a lock file format, so environments can't be saved short of dumping its entire directory.
If it must be the conda ecosystem, pixi (https://pixi.sh/latest/) is a much better pick.
But conda-style packages (or anything with install time dependency resolution really) also have the issue of being completely untestable. That makes them unsuitable at least for user-facing package management, if you care about your packages not randomly breaking without warning.
I'd rather see every language converge on a single package manager that implements functional package management, i.e. guix or nix. One can dream...
I've been using a conda+poetry flow for years and it's worked very well. Particularly because envs aren't tied to projects. I tried pixi for a few days in a project months ago and it was just breakage and untenable limitations all around. I prefer the freedom of using per project envs when I want, and shared envs other times.
What are you using each of them for in that workflow?
I've found that there really only are two kinds of packages I want to install: those I want globally (e.g. CLI tools like git, git-annex, DataLad, whatever shell enhancements you want, etc.) and project-specific dependencies (anything required to get going with it on a freshly installed system with the smallest amount of assumptions possible).
The former is sufficiently addressed by `pixi global` and other package managers, the latter by pixi projects. Notably, conda environments are a bad fit for both (not global, not really updatable, not automatically tied to a project, not rebuildable due to missing lock files, ...).
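Concretely, that split looks something like this (commands from the pixi docs as I remember them; package names are just examples):

    pixi global install git-annex     # CLI tool, available everywhere
    pixi init myproject && cd myproject
    pixi add python numpy             # project-scoped, recorded in pixi.toml and the lock file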
I used this last year, with relative success. I was asked to fix Python code written by an intern who was no longer there. The code used external libraries but did not declare any. I had only a zip of the source code, without any libraries. pipreqs was able to identify the 22 required libraries. Unfortunately, there was a runtime crash because a library was at the wrong version, so I had to let a real Python developer handle the next steps.
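For reference, the basic invocation is about as simple as it gets; this is roughly what I ran (paths made up):

    pipreqs /path/to/unzipped/project    # scans the imports, writes requirements.txt there
    pip install -r /path/to/unzipped/project/requirements.txt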
BTW, this tool is not a dependency manager. Many sibling comments seem to misunderstand the use case and promote unrelated tools.
When I first started with Python, long ago, I looked into these kinds of solutions, which didn't work so well, and wondered why the concept was not better developed. Later, with experience, I realized it is not a great idea, and more hassle than the benefits it brings.
I don't think it is a good idea to merrily write tens of import statements and end up with loads of dependencies.
Can I have it find the requirements as of a specific date?
If I find a script/notebook/project with no requirements.txt, I usually know when it was created. Being able to get the versions that were selected back then on the author's machine would be great for reproducibility.
For the future, please pick a package manager that can give you a lock file alongside your code, so that you have a definitive record of the dependencies.
Even if you have all versions as of the time of the last modification to the code, you don't know if the dependency resolution happened at that point in time, or if the environment was set up years prior and never updated.
Nevertheless, this is what you are looking for: https://pypi.org/project/pypi-timemachine/
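Usage is roughly this, going from the project README from memory (it prints the actual port at startup):

    pypi-timemachine 2020-01-01 &        # local PyPI proxy that hides releases after that date
    pip install --index-url http://localhost:<port>/ somepackage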
See whether the “exclude-newer” uv option might help if you’ve inherited a script without a dependency lock file: https://docs.astral.sh/uv/guides/scripts/#improving-reproduc...
uv pip compile requirements.in --exclude-newer=2019-12-02   # requirements.in holds the unpinned deps
https://docs.astral.sh/uv/pip/compile/ • https://docs.astral.sh/uv/reference/cli/#uv-pip-compile

I never realized until this moment that I want such a tool. Ideally, you would never find yourself in this position, but alas.
I thought import names and PyPI names are not always the same, so this can't work reliably, right?
It uses this file, which maps import names to package names, but it only covers 1152 packages and I'm not sure how it was generated: https://github.com/bndr/pipreqs/blob/master/pipreqs/mapping
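If it's the simple colon-separated format it looks like, consuming it is trivial, and coverage of the file is the real limit. A sketch, with the format assumed from eyeballing the repo:

    # map import names to PyPI names using pipreqs's mapping file,
    # assuming one "import_name:package_name" pair per line
    def load_mapping(path="mapping"):
        table = {}
        with open(path) as f:
            for line in f:
                imp, _, pkg = line.strip().partition(":")
                if imp:
                    table[imp] = pkg or imp   # fall back to the import name
        return table

    table = load_mapping()
    print(table.get("bs4", "bs4"))   # -> "beautifulsoup4", if the file has that entry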
Wrap your python program in an LLM that just keeps installing things when it fails until it works :)
Would this be useful for identifying packages in a requirements.txt that aren’t used? Or is there another tool for that?
mv and diff?
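i.e. something like:

    mv requirements.txt requirements.old.txt
    pipreqs .                                    # regenerate from the actual imports
    diff requirements.old.txt requirements.txt   # old-only lines are candidates for removal
                                                 # (ignore pure version-pin differences)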
Anything that keeps users from fully migrating to pyproject really needs to die.
But pyproject itself isn't even taking a stance on the underlying question of dependency management, which can still be flit, poetry, uv, pipx, pipenv, and who knows what else.
(I'm a poetry user myself, but I can see the writing on the wall that poetry is probably not going to be the winner of who ultimately ends up in the PEP.)
and why is that?
looks great. similarly this is why i love ‘go mod tidy’. just use the thing and let the package manager sort it out after the fact.
I was looking for this and thought I was doing something wrong when I couldn't find anything... Great job! I do think, though, that a "clean" development mode that wouldn't need this is to start a new project in a virgin virtual environment and run pip freeze on that env.
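Something like:

    python -m venv .venv
    source .venv/bin/activate
    pip install requests flask        # add dependencies only as you actually need them
    pip freeze > requirements.txt     # the env contains nothing else, so the dump is exact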
How is it different from pip-chill[1]?
I believe pip-chill still operates on packages installed into the environment. This project seems to derive requirements from the code itself, even if no packages are installed in the current environment.
From the GH repo description:
> Looking for maintainers to move this project forward.
So not sure how maintained it is.
let's not fuck with supply chain vulnerabilities
Surprised I've never seen this, despite it existing for at least 9 years??
Please consider using PDM instead. https://pdm-project.org/en/latest/
could you explain why?
Can you have it work with pipfiles too ?
Alternatively, can the pipenv gang support the same pyproject.toml files as everyone else?
I had forgotten about Pipfiles. Like their name implies (/s), they are not for pip but for pipenv, a separate tool.
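For reference, a minimal Pipfile looks like this (it's TOML; pipenv reads it, pip does not):

    [[source]]
    url = "https://pypi.org/simple"
    verify_ssl = true
    name = "pypi"

    [packages]
    requests = "*"

    [dev-packages]
    pytest = "*"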
https://github.com/pypa/pipfile "This format is still under active design and development", last commit "2 years ago". I think this is dead.
That's the repo for the Pipfiles themselves. The (only?) thing using them is pipenv, which lives in a separate repo.
https://github.com/pypa/pipenv
Pipenv was last updated 10 hours ago. Looks like it's still an active project to me.
I don't spend a lot of time in Python, but my current understanding, having read Python documentation and seen some projects online, is that the standard dependency management toolchain is pip and requirements.txt with --require-hashes, plus venv and pyproject.toml.
Is that correct?
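Concretely, I understand the flow to be something like this (using pip-tools to generate the hashed pins is my assumption; there are other generators):

    python -m venv .venv && source .venv/bin/activate
    pip install pip-tools
    pip-compile --generate-hashes -o requirements.txt pyproject.toml
    pip install --require-hashes -r requirements.txt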
I don’t know if this is how pipreqs works, but I’d be concerned about a typo in an import that inadvertently installs a library.
I’ve found pip-tools [1] to be a nice middle ground between `pip freeze` and something heavier like poetry. It’s especially helpful in showing you where a library came from, when a library installs other libraries it depends upon.
One could still have a typo in their requirements.in file with pip-tools, but changes there are much less frequent than imports in any Python file in your project, which could be a daily occurrence in some codebases.
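A minimal sketch of that workflow (conventional file names; the contents are just examples):

    # requirements.in lists only your direct dependencies, e.g.:
    #   requests
    pip install pip-tools
    pip-compile requirements.in    # writes requirements.txt, fully pinned, with "via" comments
    pip-sync requirements.txt      # makes the environment match it exactly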
god, please no
Nicely done
...it would also be nice if Debian (Ubuntu and other derivatives) stopped shitting on pip in some ridiculous paranoid attempt to stop breaking apt packages (that don't break apt packages anyway), or because of "muh security".