
How to save history and restart a chat from the last point in LangChain with Ollama?


I have an Ollama-backed LangChain chat system. When a chat ends, I save its history to a database, but I am unable to load that history to restart the chat later. The code, the error, and a saved history record are below.

history = {'input': 'What is life?', 'history': 'Human: What is life?\nAI: {}', 'response': '{ "Life" : {\n  "Definition" : "A complex and multifaceted phenomenon characterized by the presence of organization, metabolism, homeostasis, and reproduction.",\n  "Context" : ["Biology", "Philosophy", "Psychology"],\n  "Subtopics" : [\n    {"Self-awareness": "The capacity to have subjective experiences, such as sensations, emotions, and thoughts."},\n    {"Evolutionary perspective": "A process driven by natural selection, genetic drift, and other mechanisms that shape the diversity of life on Earth."},\n    {"Quantum perspective": "A realm where quantum mechanics and general relativity intersect, potentially influencing the emergence of consciousness."}\n  ]\n} }'}

from langchain.chains import ConversationChain
from langchain.memory import ConversationBufferMemory
from langchain.prompts import PromptTemplate

PROMPT_TEMPLATE = """
{history}
"""

custom_prompt = PromptTemplate(
    input_variables=["history"], template=PROMPT_TEMPLATE
)

chain = ConversationChain(
    prompt=custom_prompt,
    llm=llm,
    memory=ConversationBufferMemory()
)
prompt = "How to live it properly?"

answer = chain.invoke(input=prompt)

Error:

    miniconda3/lib/python3.11/site-packages/pydantic/v1/main.py", line 341, in __init__
        raise validation_error
    pydantic.v1.error_wrappers.ValidationError: 1 validation error for ConversationChain
    __root__
      Got unexpected prompt input variables. The prompt expects ['history'], but got ['history'] as inputs from memory, and input as the normal input key. (type=value_error)

What is the right way to load the history?

TIA


Solution

  • I found an answer here: https://github.com/langchain-ai/langchain/discussions/3224

    One of the easiest ways to restore history is to add the saved messages back to the memory object before building the chain, e.g. memory.chat_memory.add_user_message(...) for the human turns and memory.chat_memory.add_ai_message(message) for the AI turns.
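    Two things have to happen, then: the saved transcript must be replayed into the memory object, and the validation error itself has to be fixed, since `ConversationChain` passes both `history` (from memory) and `input` (the user turn) to the prompt, while the custom template declares only `history`. A minimal sketch of the replay step, assuming the `'Human: ...\nAI: ...'` transcript shape shown in the saved history above (the `parse_turns` helper is my own illustration, not a LangChain API; the commented lines show the LangChain calls to make once the turns are parsed):

    ```python
    def parse_turns(history_text):
        """Split a 'Human: ...' / 'AI: ...' transcript into (human, ai) pairs.

        Assumes the transcript format that ConversationBufferMemory renders,
        as in the saved history shown in the question; multi-line AI answers
        are kept together with their turn.
        """
        turns, human, ai_lines = [], None, []
        for line in history_text.splitlines():
            if line.startswith("Human: "):
                if human is not None:
                    turns.append((human, "\n".join(ai_lines)))
                human, ai_lines = line[len("Human: "):], []
            elif line.startswith("AI: "):
                ai_lines.append(line[len("AI: "):])
            else:
                ai_lines.append(line)  # continuation of a multi-line AI answer
        if human is not None:
            turns.append((human, "\n".join(ai_lines)))
        return turns


    saved = {"input": "What is life?",
             "history": "Human: What is life?\nAI: {}"}

    for human_msg, ai_msg in parse_turns(saved["history"]):
        # With LangChain installed, replay each turn into a fresh memory:
        #   memory = ConversationBufferMemory()
        #   memory.chat_memory.add_user_message(human_msg)
        #   memory.chat_memory.add_ai_message(ai_msg)
        # and declare both variables the chain supplies, so the
        # ValidationError goes away:
        #   PromptTemplate(input_variables=["history", "input"],
        #                  template="{history}\nHuman: {input}\nAI:")
        print(human_msg, "->", ai_msg)
    ```

    Passing the rebuilt memory to `ConversationChain(..., memory=memory)` then restarts the conversation from where it left off.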