python, docker, flask, miniconda

What is the correct way to run a Python Flask API in Docker with Miniconda and access it on localhost?


How do I access a Flask API running in a Docker container on localhost?

I created a Miniconda Docker image with a Flask API on it.

The Dockerfile is:

FROM continuumio/miniconda3

# Install base utilities
RUN apt-get update \
&& apt-get install -y wget \
&& rm -rf /var/lib/apt/lists/*

COPY api /root/api
RUN echo "Running $(conda --version)" 
RUN conda update conda 
RUN conda create -n api python=3.9 
RUN echo "conda activate api" >> ~/.bashrc
SHELL ["/bin/bash", "--login", "-c"]
RUN conda activate api
RUN conda install flask requests 
ENTRYPOINT ["conda", "run", "-n", "api", "python", "/root/api/main.py"]

The Flask API uses port 5000. I've tried adding EXPOSE 5000 to the Dockerfile, but it didn't make any difference.

It builds without errors, but I'm still not sure everything is correct, so I ran it locally on my PC to test it. However, I can't access it: I've tried http://172.17.0.2/api and http://localhost/api, but neither responded. I've also tried running main.py from the container's terminal, but it says "Port 5000 is in use by another program".

So here is what I would like to ask:

  1. How do I properly run a Flask API on Linux? Am I doing it right?
  2. How do I properly run a Flask API in Docker? Am I doing it right?
  3. How do I access a Flask API running in a Docker container on the same PC (localhost)? I just can't understand what IP/address my API gets.

Solution

  • Here is the best solution I found.

    Dockerfile:

    FROM tensorflow/tensorflow:2.12.0-gpu
    
    # Install dependencies
    RUN pip install tensorflow-hub <whatever else you need>
    
    # Copy application code
    COPY /my-app-code /app
    WORKDIR /app
    
    # Expose port 5000 for Flask
    EXPOSE 5000
    
    # Set entrypoint
    ENTRYPOINT ["python", "main.py"]
    

    In the .py file containing your Flask app, you should run the application this way:

    app.run(host="0.0.0.0", port=5000)
    

    I use the TensorFlow image because otherwise the NVIDIA drivers would be missing and the container wouldn't see the GPU. I don't use the NVIDIA container because it could have a version conflict with the TensorFlow version I'm using.
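
    As a fuller illustration of the app.run call above, a minimal main.py could look like the sketch below; the /api route and its response are assumptions for illustration, not part of the original code.

    from flask import Flask, jsonify

    app = Flask(__name__)

    # Hypothetical example route; replace with your real endpoints
    @app.route("/api")
    def api_root():
        return jsonify({"status": "ok"})

    if __name__ == "__main__":
        # Bind to 0.0.0.0 so the app is reachable from outside the container,
        # not only from 127.0.0.1 inside it
        app.run(host="0.0.0.0", port=5000)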

    I had to move away from Miniconda because it adds a lot of complexity. However, it is possible to run a Flask API with it. Here is a working Dockerfile:

    FROM continuumio/miniconda3
    
    # Install base utilities
    RUN apt-get update \
    && apt-get install -y wget \
    && rm -rf /var/lib/apt/lists/*
    
    # Flask app code is in the "my-app-code" folder next to the Dockerfile
    COPY /my-app-code /app
    RUN echo "Running $(conda --version)" 
    RUN conda update conda 
    RUN conda create -n app-env python=3.9 
    RUN echo "conda activate app-env" >> ~/.bashrc
    SHELL ["/bin/bash", "--login", "-c"]
    RUN conda activate app-env
    RUN conda install flask requests 
    EXPOSE 5000
    ENTRYPOINT ["conda", "run", "-n", "app-env", "python", "/app/main.py"]
    
    

    Don't forget to add the host and port to your app.run as shown above.
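
    Note that to reach the API at http://localhost:5000 from the host, the container port also has to be published when the container is started (for example, docker run -p 5000:5000 <image>). A quick sanity check from the host, assuming the hypothetical /api route used above, could be:

    import requests

    # Query the Flask API published on the host's port 5000
    response = requests.get("http://localhost:5000/api", timeout=5)
    print(response.status_code, response.json())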