python, large-language-model, ollama, llama3

Not able to access llama3 using Python


I am testing llama3 using the simple code below:

import ollama

message = "What is football"  

# connect to Llama3 model
try:
  response_stream = ollama.chat(
      model="llama3",
      messages=[{'role': 'assistant', 'content': message}],
      stream=True
  )
  print("Connected to Llama3")

  for response in response_stream:
    print(f"Llama3: {response['content']}")

except Exception as e:
  print(f"Error connecting to Llama3: {e}")

I ran it, but I am getting this error (llama3 is installed correctly):

Connected to Llama3
Error connecting to Llama3: 'content'

[Done] exited with code=0 in 1.104 seconds

Solution

  • The line

    print(f"Llama3: {response['content']}")
    

    should be

    print(f"Llama3: {response['message']['content']}")
    

    When stream=True, each chunk yielded by ollama.chat() is a dict with a 'message' key, and the generated text lives in its 'content' field; there is no top-level 'content' key, which is why your except block prints the KeyError as 'content'. Check the relevant part of the docs on GitHub, https://github.com/ollama/ollama-python#streaming-responses, for reference.
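
    For completeness, a corrected version of the snippet might look like the sketch below. The switch to the 'user' role and the end='' printing are suggestions on top of the original code, not part of the fix itself.

    import ollama

    message = "What is football"

    try:
        # stream=True makes ollama.chat() yield chunks as they are generated;
        # each chunk carries the text under chunk['message']['content']
        response_stream = ollama.chat(
            model="llama3",
            messages=[{'role': 'user', 'content': message}],  # 'user' is the usual role for a prompt
            stream=True
        )
        print("Connected to Llama3")

        for chunk in response_stream:
            print(chunk['message']['content'], end='', flush=True)
        print()

    except Exception as e:
        print(f"Error connecting to Llama3: {e}")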