Best practice for Python is to use a venv so that a project sees only the imports it really needs. I create mine with python -m venv.
For development, it's very convenient to use IPython and notebooks to explore code interactively. However, those tools have to be installed into the venv to be usable there, which defeats the purpose of keeping the venv minimal.
I could make two venvs, one for normal usage and one (say venv-ipy) for interactive exploration, but that's hard to manage, and the two need to be kept synchronized.
Is there a solution or established practice for this?
I work on multiple projects, most of which run in production, each with its own dependencies. The practice that works for me is to create separate requirements.txt files: one for the production environment, the other containing the additions used for development, plus a small script that creates/updates both venvs in the same run: <ProjectName>_prod and <ProjectName>_dev.
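A minimal sketch of such a script, assuming a Unix shell and a placeholder project name MyProject (adjust names and paths to your setup):

    #!/usr/bin/env bash
    # Create/update both venvs in one run.
    # MyProject is a placeholder; pip paths assume Linux/macOS.
    set -e
    for env in MyProject_prod MyProject_dev; do
        [ -d "$env" ] || python -m venv "$env"   # create the venv if missing
    done
    MyProject_prod/bin/pip install -r requirements.txt
    MyProject_dev/bin/pip install -r requirements.txt -r requirements_dev.txt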
That way requirements_dev.txt contains only the extras such as jupyter, matplotlib and other packages used purely in development, and there is no need to sync the files until you reach the production stage.
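For example, the split might look like this (the package lists are purely illustrative):

    # requirements.txt - what production actually needs
    requests
    numpy

    # requirements_dev.txt - dev-only additions
    jupyter
    ipython
    matplotlib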
To install the dependencies in the dev environment, you pass both files:
pip install -r requirements.txt -r requirements_dev.txt
I also use conda to create and manage environments, as I find it more convenient than venv.
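If you go the conda route, the same two-file setup works; a rough equivalent for the dev environment (the environment name and Python version are again placeholders) is:

    conda create -n MyProject_dev python=3.12
    conda activate MyProject_dev
    pip install -r requirements.txt -r requirements_dev.txt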