python, large-language-model, llama-index, ollama

ImportError: cannot import name 'Ollama' from 'llama_index.llms' (unknown location) - installing dependencies does not solve the problem


I want to learn about LLMs. I run Ollama with the following Docker Compose file, and it's running:

services:
  ollama:
    image: ollama/ollama:latest
    ports:
      - 11434:11434
    volumes:
      - ollama_data:/root/.ollama
    healthcheck:
      test: ollama list || exit 1
      interval: 10s
      timeout: 30s
      retries: 5
      start_period: 10s
  ollama-models-pull:
    image: curlimages/curl:8.6.0
    command: >-
      http://ollama:11434/api/pull -d '{"name": "mistral"}'
    depends_on:
      ollama:
        condition: service_healthy
volumes:
  ollama_data:
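Before touching llama-index at all, it can help to confirm from the host that the published port actually answers. A minimal stdlib-only sketch (the URL and the `ollama_is_up` helper are mine, not part of any package; the Ollama server replies on its root path when healthy):

```python
import urllib.request

# Host-side port published by the compose file above
OLLAMA_URL = "http://127.0.0.1:11434"

def ollama_is_up(base_url: str, timeout: float = 2.0) -> bool:
    """Return True if an HTTP server answers 200 at base_url."""
    try:
        with urllib.request.urlopen(base_url, timeout=timeout) as resp:
            return resp.status == 200
    except OSError:  # covers URLError, HTTPError, connection refused, timeouts
        return False

print("Ollama reachable:", ollama_is_up(OLLAMA_URL))
```

If this prints `False`, the import error below is not your only problem.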

I would like to write a Python app that uses Ollama, and I found this piece of code:

from llama_index.llms import Ollama, ChatMessage

llm = Ollama(model="mistral", base_url="http://127.0.0.1:11434")

messages = [
    ChatMessage(
        role="system", content="You are a multilingual assistant used for translation; your job is to translate, nothing more."
    ),
    ChatMessage(
        role="user", content="Please translate the message in triple backticks to French: ```What is standard deviation?```"
    )
]
resp = llm.chat(messages=messages)
print(resp)

I installed all dependencies:

python3 -m venv venv
source venv/bin/activate
pip install llama-index  
pip install llama-index-llms-ollama
pip install ollama-python
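A quick way to check whether those installs actually made the relevant modules importable inside the venv, without guessing at import paths. This is a stdlib-only sketch; the `has_module` helper is mine, not part of any of these packages:

```python
import importlib.util

def has_module(name: str) -> bool:
    """True if `name` can be imported; a missing parent package counts as missing."""
    try:
        return importlib.util.find_spec(name) is not None
    except ModuleNotFoundError:  # raised when a parent package is absent
        return False

# Module paths provided by the llama-index packages installed above
for mod in ("llama_index.llms.ollama", "llama_index.core.llms"):
    print(mod, "->", "importable" if has_module(mod) else "missing")
```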

However, when I run the app, I get:

Traceback (most recent call last):
  File "/home/user/test.py", line 1, in <module>
    from llama_index.llms import Ollama, ChatMessage
ImportError: cannot import name 'Ollama' from 'llama_index.llms' (unknown location)

Where could the problem be?


Solution

  • Since llama-index 0.10, the library is split into namespace packages (`llama-index-core` plus per-integration packages such as `llama-index-llms-ollama`), so the old monolithic import paths no longer exist. The correct way to import Ollama is:

    from llama_index.llms.ollama import Ollama
    

    For ChatMessage it is:

    from llama_index.core.llms import ChatMessage
    
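Putting both fixes together, the question's script with the updated import paths would look like this (a sketch, assuming the compose stack is up and the `mistral` model has finished pulling):

```python
# Corrected import paths for llama-index >= 0.10
from llama_index.llms.ollama import Ollama
from llama_index.core.llms import ChatMessage

llm = Ollama(model="mistral", base_url="http://127.0.0.1:11434")

messages = [
    ChatMessage(
        role="system",
        content="You are a multilingual assistant used for translation; your job is to translate, nothing more.",
    ),
    ChatMessage(
        role="user",
        content="Please translate the message in triple backticks to French: ```What is standard deviation?```",
    ),
]
resp = llm.chat(messages=messages)
print(resp)
```

Everything else from the original script can stay as it was; only the two import lines needed to change.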