I am trying to containerize my program. Everything works fine locally but then the imports don't work as I'd like when I build and run a container.
Here is my Dockerfile:
FROM python:3.8
WORKDIR /src
COPY requirements.txt ./
RUN pip install -r requirements.txt
COPY . .
CMD ["python3", "./app/run.py"]
Here is a general file structure I have going.
app
├── module
│ ├── file.py
│ └── __init__.py
├── __init__.py
└── run.py
run.py looks something like this:
from app.module import Class
instance = Class(args)
instance.do_stuff
and my app/module/__init__.py has:
from .file import Class
Again, this works fine when I run it locally. However I end up getting an import error when I run this in docker.
Traceback (most recent call last):
File "./app/run.py", line 3, in <module>
from app.module import Class
ModuleNotFoundError: No module named 'app'
I am able to get my code running in the container by changing the import in run.py to from module import Class instead of from app.module import Class. However, I'd like to understand this better. I'd honestly prefer my imports to be app.module, since that makes local imports a little more obvious.
From https://docs.python.org/3/using/cmdline.html :
If the script name refers directly to a Python file, the directory containing that file is added to the start of sys.path, and the file is executed as the main module.
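You can reproduce this behavior outside Docker with a throwaway script (the layout below just mirrors the question's app/run.py):

```python
import os
import subprocess
import sys
import tempfile

# Recreate the situation: a script at app/run.py, launched by file path
with tempfile.TemporaryDirectory() as tmp:
    app_dir = os.path.join(tmp, "app")
    os.makedirs(app_dir)
    script = os.path.join(app_dir, "run.py")
    with open(script, "w") as f:
        f.write("import sys; print(sys.path[0])\n")

    out = subprocess.run(
        [sys.executable, script],
        capture_output=True, text=True, cwd=tmp,
    ).stdout.strip()

    # sys.path[0] is the directory containing run.py (app/), not the
    # working directory, so the package 'app' itself is not importable
    print(os.path.basename(out))  # app
```

Even though the working directory is the parent of app/, only app/ itself ends up at the front of sys.path, which is exactly why `import app` fails in the container.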
The directory app is added to the search path, because that is the directory containing run.py. If you want the directory containing app to be added to the search path instead, you can set PYTHONPATH=$(pwd) (run from the parent of app) or similar.
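Applied to the Dockerfile in the question, either of these should work (sketches, not tested here; /src is the WORKDIR from the question):

```dockerfile
# Option 1: put /src (the directory containing app/) on the module search path
ENV PYTHONPATH=/src
CMD ["python3", "./app/run.py"]

# Option 2: run the code as a module instead of a script, which puts the
# current working directory on sys.path rather than app/
CMD ["python3", "-m", "app.run"]
```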
Read up on PYTHONPATH and the Python import system documentation. Consider structuring your code as a Python package so you can use relative imports. https://docs.python.org/3/reference/import.html https://docs.python.org/3/using/cmdline.html#envvar-PYTHONPATH