I'm following this tutorial to integrate Ollama (http://localhost:11434/) with AutoGPT (http://localhost:3000) using Docker Compose. However, I encounter the following error when trying to connect:
Error calling LLM: [Errno 111] Connection refused
I verified that Ollama itself is running on the host:

$ curl http://localhost:11434/
Ollama is running
Error Details:
Connection Refused: Make sure Ollama is running and the host address is correct (also check the port; the default is 11434).
After some research, I discovered that this is a networking issue when AutoGPT runs in a Docker container. Docker isolates container networking by default, so localhost inside the container refers to the container itself, not to the host where Ollama is listening.
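To see where [Errno 111] comes from: connecting to an address where nothing is listening raises exactly this error. A minimal Python sketch (a hypothetical helper, not part of AutoGPT) that reproduces the failure mode:

```python
import socket

def try_connect(host: str, port: int, timeout: float = 1.0) -> str:
    """Attempt a TCP connection and report the outcome."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return "connected"
    except ConnectionRefusedError as exc:
        # Same failure AutoGPT reports: [Errno 111] Connection refused
        return f"refused (errno {exc.errno})"
```

Inside the container, try_connect("localhost", 11434) hits the container's own loopback interface, where no Ollama server is listening, so the connection is refused.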
Solution:
Instead of using localhost, use host.docker.internal in the connection string:

http://host.docker.internal:11434/
This workaround is discussed in this GitHub issue: Ollama Issue #703.
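Note that host.docker.internal resolves out of the box on Docker Desktop (Windows/macOS), but on Linux you may need to map it yourself with extra_hosts in the Compose file. A minimal sketch, assuming the AutoGPT service is named autogpt (the service name is hypothetical):

```yaml
services:
  autogpt:
    # ... image, ports, environment, etc.
    extra_hosts:
      # Map host.docker.internal to the host's gateway IP (Linux)
      - "host.docker.internal:host-gateway"
```

With this mapping in place, http://host.docker.internal:11434/ reaches the Ollama server running on the host from inside the container.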