I'm following this tutorial to integrate Ollama (http://localhost:11434/) with AutoGPT (http://localhost:3000) using Docker Compose. However, I encounter the following error when trying to connect:
```
Error calling LLM: [Errno 111] Connection refused
```
I verified that Ollama itself is running on the host:

```
$ curl http://localhost:11434/
Ollama is running
```
Error Details:
Connection Refused: make sure Ollama is running and that the host address and port in the connection string are correct (Ollama's default port is 11434).
Solution:
Instead of `localhost`, use `host.docker.internal` in the connection string:

```
http://host.docker.internal:11434/
```
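On Linux, `host.docker.internal` is not added automatically, but Docker 20.10+ lets you map it to the host gateway via `extra_hosts` in the compose file. A minimal sketch, assuming a service named `autogpt` and an `OLLAMA_HOST` environment variable (both hypothetical — match the names in your own `docker-compose.yml`):

```yaml
services:
  autogpt:                      # hypothetical service name; use yours
    extra_hosts:
      # Resolves host.docker.internal to the host's gateway IP inside
      # the container (needed on Linux, where Docker doesn't add it).
      - "host.docker.internal:host-gateway"
    environment:
      # Hypothetical variable name; point your app's Ollama URL here.
      - OLLAMA_HOST=http://host.docker.internal:11434
```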
Background:
Docker containers have their own network namespace, so `localhost` inside a container refers to the container itself, not the host machine. As a result, services listening on the host are not reachable via `localhost` from within the container.
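If you want application code to work both on the host and inside a container, one option is to pick the host name at runtime. A minimal Python sketch — the function name and the `/.dockerenv` check are illustrative, not part of AutoGPT's code (Docker creates `/.dockerenv` in every container, which makes it a simple "am I in Docker?" heuristic):

```python
import os

OLLAMA_PORT = 11434  # Ollama's default port


def ollama_base_url(in_docker: bool = os.path.exists("/.dockerenv")) -> str:
    """Return the Ollama base URL, switching hosts when inside a container.

    Inside Docker, localhost is the container itself, so we target
    host.docker.internal to reach the service running on the host.
    """
    host = "host.docker.internal" if in_docker else "localhost"
    return f"http://{host}:{OLLAMA_PORT}/"


print(ollama_base_url(in_docker=True))   # http://host.docker.internal:11434/
print(ollama_base_url(in_docker=False))  # http://localhost:11434/
```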
Reference:
This workaround is discussed in this GitHub issue: Ollama Issue #703.