Tags: python, langchain, ollama

LangChain ChatOllama always produces an 'invalid format: expected "json"' error


I'm just trying to follow a basic LangChain tutorial: https://python.langchain.com/v0.2/docs/tutorials/local_rag/

Super simple code:

from langchain_ollama import ChatOllama
import logging

logging.basicConfig(
    level=logging.DEBUG,
    format='%(asctime)s.%(msecs)03d [%(levelname)s]: %(message)s',
    datefmt='%H:%M:%S'
)

logging.info("### Starting up")

# Plain default ChatOllama, exactly as in the tutorial
llm = ChatOllama(
    model="llama3.1",
)

# A simple string prompt; message lists fail the same way (see below)
response_message = llm.invoke(
    "What are you?"
)

print(response_message.content)

No matter what I try, I always get this error:

Exception has occurred: ResponseError
invalid format: expected "json" or a JSON schema
  File "C:\XXXX\local_rag\main.py", line 16, in <module>
    response_message = llm.invoke(
                       ^^^^^^^^^^^
ollama._types.ResponseError: invalid format: expected "json" or a JSON schema

I've tried a few different approaches, including messages[], PromptTemplate, and streaming, from https://python.langchain.com/docs/integrations/chat/ollama/, but I always get the same error.
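
The traceback points at the underlying ollama client (ollama._types.ResponseError), so here is a minimal repro against that client directly. The format="" argument is an assumption on my part about what langchain-ollama sends by default:

import ollama

# On the affected Ollama server versions, an empty "format" value triggers the
# same 'invalid format: expected "json" or a JSON schema' rejection.
ollama.chat(
    model="llama3.1",
    messages=[{"role": "user", "content": "What are you?"}],
    format="",
)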

There are no issues going via the REST API directly, e.g.

curl http://localhost:11434/api/chat -d '{
  "model": "llama3.1",
  "messages": [
    { "role": "user", "content": "why is the sky blue?" }
  ]
}'

Any help would be appreciated. I really hope I'm just doing something daft here.

Edit: The version of Ollama makes a difference, but setting llm.format = None also works, as suggested in the comments.
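
For reference, a minimal sketch of that workaround (this assumes, per the issue linked in the answer below, that langchain-ollama otherwise sends a "format" value that newer Ollama servers reject):

from langchain_ollama import ChatOllama

llm = ChatOllama(
    model="llama3.1",
)
# Clearing format stops the client from sending a "format" field the server
# rejects (assumption based on the linked GitHub issue).
llm.format = None

print(llm.invoke("What are you?").content)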


Solution

  • I am seeing the same thing after upgrading Ollama. I just opened an issue here: https://github.com/langchain-ai/langchain/issues/28753. For now, it should work if you downgrade your version of Ollama.

    Edit: This was fixed as of https://github.com/ollama/ollama/releases/tag/v0.5.3
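
    If you're not sure which Ollama server version you're running, you can query it directly (a quick sketch; the /api/version endpoint and the port below assume a default local install):

    import json
    import urllib.request

    # Ask the local Ollama server for its version; v0.5.3 or later includes the fix.
    with urllib.request.urlopen("http://localhost:11434/api/version") as response:
        print(json.load(response)["version"])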