I have an issue where Python can't find any of my dependencies when it runs in a container. I have a FastAPI-based application that runs perfectly on my local machine, but when I start the Docker image it complains about every single module unless I add a separate "pip install xyz" inside the image.
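For context, the entry point that the CMD below refers to as main:app is an ordinary FastAPI app, roughly like this (a minimal, hypothetical sketch; the real main.py imports many more modules):
# main.py (hypothetical minimal sketch)
import requests  # one of the third-party imports that later fails inside the container
from fastapi import FastAPI

app = FastAPI()

@app.get("/health")
def health() -> dict:
    return {"status": "ok", "requests_version": requests.__version__}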
I have the following Dockerfile:
# Use the official Python image from the Docker Hub
FROM python:3.12-slim as builder
# Set environment variables
ENV PYTHONDONTWRITEBYTECODE=1
ENV PYTHONUNBUFFERED=1
# Install build dependencies
RUN apt-get update \
&& apt-get install -y --no-install-recommends gcc libpq-dev
# Create a working directory
WORKDIR /app
# Install pipenv and python dependencies in a virtual environment
COPY ./requirements.txt /app/
RUN python3 -m venv venv && bash -c "source venv/bin/activate"
RUN venv/bin/pip3 install -r requirements.txt
# Use the official Python image again for the final stage
FROM python:3.12-slim
# Set environment variables
ENV PYTHONDONTWRITEBYTECODE=1
ENV PYTHONUNBUFFERED=1
# Create a working directory
WORKDIR /app
# Copy installed dependencies from the builder stage
COPY --from=builder /usr/local/lib/python3.12/site-packages /usr/local/lib/python3.12/site-packages
COPY --from=builder /usr/local/bin/pip /usr/local/bin/pip
# Copy the application code
COPY . /app
# Install runtime dependencies (if any)
RUN pip install uvicorn gunicorn
# Expose the port the app runs on
EXPOSE 8000
# Command to run the application
CMD ["gunicorn", "-k", "uvicorn.workers.UvicornWorker", "main:app", "--bind", "0.0.0.0:8000", "--workers", "4"]
My requirements.txt contains all the necessary modules, for example:
...
fastapi==0.111.0
fastapi-cli==0.0.4
filelock==3.14.0
...
pydub==0.25.1
Pygments==2.18.0
python-dotenv==1.0.1
python-multipart==0.0.9
PyYAML==6.0.1
requests==2.32.3
rich==13.7.1
...
I built the image with:
docker build -t my-fastapi-app .
I run the container with:
docker run -p 8000:8000 my-fastapi-app
It prints a long traceback ending with:
ModuleNotFoundError: No module named 'requests'
So I added the missing modules with a separate "pip install" in the Dockerfile:
RUN pip install pydub requests
Now it complains that fastapi is missing:
ModuleNotFoundError: No module named 'fastapi'
So I would have to add yet another pip install for FastAPI, and so on and so forth.
Then I tried using pipenv:
COPY ./Pipfile /app
COPY ./Pipfile.lock /app
RUN pip install --upgrade pip \
&& pip install pipenv \
&& pipenv install --deploy --ignore-pipfile
The Pipfile contains all the required modules, but they are still missing after installation.
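For illustration, the Pipfile looks roughly like this (abridged and hypothetical; the pins mirror the requirements.txt excerpt above):
[[source]]
url = "https://pypi.org/simple"
verify_ssl = true
name = "pypi"

[packages]
fastapi = "==0.111.0"
requests = "==2.32.3"
python-dotenv = "==1.0.1"
python-multipart = "==0.0.9"

[requires]
python_version = "3.12"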
I thought that pip install -r requirements.txt would handle all of this automatically, but that's not the case here. Where is my mistake?
Do I really have to bloat my Dockerfile with separate "pip install" commands for every module listed in my requirements.txt?
Or is the virtual environment somehow broken, so that Python can't find the modules even though they are installed?
You're installing your dependencies into a virtualenv in the builder stage, but then copying the system site-packages directory (which doesn't contain those dependencies) into the runtime image.
Copy the virtualenv instead, and then use it:
# In the builder stage: create the venv at a fixed path and install into it
RUN python3 -m venv /venv
RUN /venv/bin/pip3 install -r requirements.txt
# Use the official Python image again for the final stage
FROM python:3.12-slim
# ...
COPY --from=builder /venv /venv
# ...
RUN /venv/bin/python -m pip install uvicorn gunicorn
# ...
CMD ["/venv/bin/gunicorn", "-k", "uvicorn.workers.UvicornWorker", "main:app", "--bind", "0.0.0.0:8000", "--workers", "4"]
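Putting it together with the original Dockerfile, the complete file would look roughly like this (a sketch that keeps your structure; only the venv path and what gets copied between stages change):
# Builder stage: install all Python dependencies into a venv at a fixed path
FROM python:3.12-slim as builder
ENV PYTHONDONTWRITEBYTECODE=1
ENV PYTHONUNBUFFERED=1
RUN apt-get update \
&& apt-get install -y --no-install-recommends gcc libpq-dev
WORKDIR /app
COPY ./requirements.txt /app/
RUN python3 -m venv /venv
RUN /venv/bin/pip3 install -r requirements.txt
# Final stage: copy only the venv and the application code
FROM python:3.12-slim
ENV PYTHONDONTWRITEBYTECODE=1
ENV PYTHONUNBUFFERED=1
WORKDIR /app
# Bring over the entire virtualenv, not the system site-packages
COPY --from=builder /venv /venv
COPY . /app
RUN /venv/bin/python -m pip install uvicorn gunicorn
EXPOSE 8000
CMD ["/venv/bin/gunicorn", "-k", "uvicorn.workers.UvicornWorker", "main:app", "--bind", "0.0.0.0:8000", "--workers", "4"]
You can confirm the dependencies made it into the final image with docker run --rm my-fastapi-app /venv/bin/pip list. Alternatively, add uvicorn and gunicorn to requirements.txt and drop the extra pip install in the final stage.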