Tags: python, pip, torch, python-venv, ollama

How to avoid a Torch error with Open WebUI/Ollama


I'd like to get Open WebUI working with Ollama on Ubuntu 24.10, but installing it using pip and venv leads me to a torch error.

Firstly, Ollama (0.6.2) is working: I can type /path/to/ollama list and see the 3 models I've been working with.
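
(Ollama listens on port 11434 by default, so the server can also be checked directly; the curl line below assumes the default port and uses Ollama's /api/tags endpoint, which returns the locally installed models as JSON:)

/path/to/ollama list
curl http://localhost:11434/api/tags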

Next, pip install open-webui fails with pip's externally-managed-environment error, whose guidance (pointing me away from mixing pip with the APT package manager) suggests a virtual environment, so I use venv:

sudo apt-get install python3-full
python3 -m venv /path/to/venv
source /path/to/venv/bin/activate
pip install open-webui

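As a quick sanity check before installing (not part of the error message's guidance), you can confirm that python and pip now resolve inside the venv:

which python     # should print /path/to/venv/bin/python
pip --version    # should report a site-packages path under /path/to/venv
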
I then try python open-webui serve, but this complains that there is no file or directory called open-webui in my $HOME directory. I do see an executable file called open-webui in /path/to/venv/bin, so I try:

python /path/to/venv/bin/open-webui serve

...and I see the large OPEN WEBUI banner text, followed by an error. Opening http://localhost:8080 in my browser seems to work, but how can I avoid the following error message?

ERROR [open_webui.main] Error updating models: cannot import name 'Tensor' from 'torch' (unknown location)
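
(As an aside: open-webui is an entry-point script that pip installs into the venv's bin directory, so it does not need a python prefix at all; with the venv activated it should already be on $PATH:)

source /path/to/venv/bin/activate
open-webui serve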

Solution

  • Reinstalling torch using the following command (from here) worked for me:

    (venv) $ pip install --upgrade --no-deps --force-reinstall torch
    

    I issued the above command from within my venv, activated via source /path/to/venv/bin/activate. (--force-reinstall reinstalls torch even though pip already considers it installed, and --no-deps stops pip from also reinstalling torch's dependencies.) The error message regarding torch then disappeared.
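
    To verify the fix from within the same venv (assuming the same venv path as above), a quick import check is:

    (venv) $ python -c "import torch; print(torch.__version__)"

    If that prints a version without an import error, restarting open-webui serve should no longer log the Tensor message.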