Package managers for JavaScript, like `npm` and `yarn`, use a `package.json` to specify 'top-level' dependencies, and create a lock file to keep track of the specific versions of all packages (i.e. top-level and sub-level dependencies) that are installed as a result. In addition, `package.json` allows us to make a distinction between types of top-level dependencies, such as production and development.
For Python, on the other hand, we have `pip`. I suppose the `pip` equivalent of a lock file would be the result of `pip freeze > requirements.txt`.

However, if you maintain only this single `requirements.txt` file, it is difficult to distinguish between top-level and sub-level dependencies (you would need e.g. `pipdeptree -r` to figure those out). This can be a real pain if you want to remove or change top-level dependencies, as it is easy to be left with orphaned packages (as far as I know, `pip` does not remove sub-dependencies when you `pip uninstall` a package).
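To illustrate, suppose `requests` is the only package I installed myself (the pinned versions below are just whatever happened to be current at the time); the frozen file flattens everything into one undifferentiated list:

```
$ pip install requests
$ pip freeze > requirements.txt
$ cat requirements.txt
certifi==2024.2.2
charset-normalizer==3.3.2
idna==3.6
requests==2.31.0
urllib3==2.2.1
```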
Now, I wonder: is there some convention for dealing with different types of these requirements files and distinguishing between top-level and sub-level dependencies with `pip`?
For example, I can imagine having a `requirements-prod.txt`, which contains only the top-level requirements for the production environment, as the (simplified) equivalent of `package.json`, and a `requirements-prod.lock`, which contains the output of `pip freeze` and acts as my lock file. In addition, I could have a `requirements-dev.txt` for development dependencies, and so on and so forth.
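As a sketch of what I have in mind (package names are only illustrative):

```
# requirements-prod.txt -- hand-maintained, top-level production dependencies only
flask>=2.0
gunicorn

# requirements-dev.txt -- top-level development dependencies, pulling in the prod ones
-r requirements-prod.txt
pytest
```

The lock file would then simply be the frozen result of installing the top-level file:

```
$ pip install -r requirements-prod.txt
$ pip freeze > requirements-prod.lock
```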
I would like to know if this is the way to go, or if there is a better approach.
P.S. The same question could be asked for `conda`'s `environment.yml`.
There are at least three good options available today:
Poetry uses `pyproject.toml` and `poetry.lock` files, much in the same way that `package.json` and lock files work in the JavaScript world. This is now my preferred solution.
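A minimal sketch of what that can look like (the project metadata and version constraints are placeholders, and the exact section name for the dev group depends on your Poetry version):

```
# pyproject.toml
[tool.poetry]
name = "my-project"
version = "0.1.0"
description = ""
authors = ["Me <me@example.com>"]

# top-level production dependencies
[tool.poetry.dependencies]
python = "^3.10"
requests = "^2.31"

# top-level development dependencies
[tool.poetry.group.dev.dependencies]
pytest = "^8.0"

[build-system]
requires = ["poetry-core"]
build-backend = "poetry.core.masonry.api"
```

`poetry add <package>` (or `poetry add --group dev <package>`) updates `pyproject.toml`, `poetry lock` resolves everything into `poetry.lock`, and `poetry install` reproduces that exact environment.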
Pipenv uses `Pipfile` and `Pipfile.lock`, also much like you describe the JavaScript files.
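Again, a rough sketch (Pipenv generates and edits this file for you; the package names are placeholders):

```
# Pipfile
[[source]]
url = "https://pypi.org/simple"
verify_ssl = true
name = "pypi"

# production dependencies
[packages]
requests = "*"

# development dependencies
[dev-packages]
pytest = "*"

[requires]
python_version = "3.10"
```

`pipenv install <package>` and `pipenv install --dev <package>` edit the `Pipfile`, `pipenv lock` writes `Pipfile.lock`, and `pipenv sync` installs exactly what the lock file specifies.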
Both Poetry and Pipenv do more than just dependency management. Out of the box, they also create and maintain virtual environments for your projects.
pip-tools provides `pip-compile` and `pip-sync` commands. Here, `requirements.in` lists your direct dependencies, often with loose version constraints, and `pip-compile` generates locked-down `requirements.txt` files from your `.in` files.
This used to be my preferred solution. It's backwards-compatible (the generated `requirements.txt` can be processed by `pip`), and the `pip-sync` tool ensures that the virtualenv exactly matches the locked versions, removing things that aren't in your "lock" file.
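A rough sketch of that workflow (the package name is only illustrative):

```
$ cat requirements.in           # direct dependencies only, loosely pinned
flask>=2.0

$ pip-compile requirements.in   # resolves and writes a fully pinned requirements.txt
$ pip-sync requirements.txt     # add/remove packages until the virtualenv matches exactly
```

A separate `requirements-dev.in` can be compiled the same way (e.g. `pip-compile requirements-dev.in --output-file requirements-dev.txt`), and `pip-sync` accepts multiple files if you want the dev environment to include both.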