Tags: python, kaggle, ngrok, ollama, webui

Hosting Ollama on a Kaggle Notebook with Ngrok: OpenWebUI Not Connecting to Ollama Client


I'm hosting Ollama on a Kaggle notebook and using Ngrok to create a tunnel, allowing me to connect to the Ollama client on my local machine via the command line. The Ngrok tunnel works as expected, and I can interact with the Ollama client through it.
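For context, the Kaggle-side setup boils down to something like the following (a rough sketch assuming the pyngrok package; the names here are illustrative, and the actual cells are in the notebook linked below):

    # Sketch of the Kaggle notebook side: start Ollama bound to all interfaces
    # and expose port 11434 through an ngrok tunnel. Assumes pyngrok is installed
    # and the ngrok auth token is available as an environment variable / Kaggle secret.
    import os
    import subprocess
    import time

    from pyngrok import ngrok

    ngrok.set_auth_token(os.environ["NGROK_AUTHTOKEN"])

    env = os.environ.copy()
    env["OLLAMA_HOST"] = "0.0.0.0"        # listen on all interfaces, not just localhost
    subprocess.Popen(["ollama", "serve"], env=env)
    time.sleep(5)                         # give the server a moment to start

    tunnel = ngrok.connect(11434, "http") # tunnel to Ollama's default port
    print("Public Ollama URL:", tunnel.public_url)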

To expand the setup, I created a new environment and installed OpenWebUI for Ollama using pip (not Docker). However, when I start OpenWebUI, it fails to connect to the Ollama client, even though the Ngrok tunnel is active and functional.

I followed this tutorial to set up the system, adjusting LD_LIBRARY_PATH for Kaggle: https://youtu.be/Qa1h7ygwQq8?si=wh1U3TODeSb6O_0c.

Here’s what I’ve tried so far:

• Verified the Ngrok tunnel URL is correct and accessible.
• Ensured the Kaggle notebook is running and Ollama is operational.

Despite this, OpenWebUI doesn't seem to recognize or connect to Ollama.

This is the notebook I'm using: https://www.kaggle.com/code/singh008/ollama-server

NOTE: the same thing happens on Colab as well.

What might be causing this issue? Do I need specific configurations in OpenWebUI to work with Kaggle-hosted Ollama via Ngrok?


Solution

  • I was not able to reproduce your error. Double-check your ngrok logs; it may be the same issue I ran into, where I got a 403 Forbidden the first time I connected my Ollama endpoint through ngrok and needed to set OLLAMA_HOST.

    1. Ollama endpoint
     OLLAMA_HOST=0.0.0.0 ollama serve
    

    Make sure the host is set. Without it, Ollama's default settings only listen on localhost, so the ngrok endpoint cannot reach it. (You can verify the tunnel end to end with the connectivity check sketched after these steps.)

    2. Ngrok
    ngrok http http://localhost:11434
    
    3. OpenWebUI
    OLLAMA_BASE_URL=your_ngrok_endpoint open-webui serve
    

    where your_ngrok_endpoint is the public URL of your tunnel, in the form https://<subdomain>.ngrok-free.app
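    As a sanity check before launching OpenWebUI, it can help to confirm that the ngrok URL itself reaches Ollama. A minimal sketch in Python (the base URL below is a placeholder for your own endpoint):

        # Verify the tunneled Ollama server responds before pointing OpenWebUI at it.
        import requests

        base_url = "https://your-subdomain.ngrok-free.app"  # placeholder: your ngrok URL

        # /api/tags lists the models the Ollama server has available.
        resp = requests.get(f"{base_url}/api/tags", timeout=10)
        print(resp.status_code)  # a 403 here usually means OLLAMA_HOST was not set
        print(resp.json())       # should list your pulled models if everything works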