
How to remove items from LangChain memory


LangChain provides several memory classes, e.g. ConversationBufferMemory or ConversationBufferWindowMemory.

Regardless of which one I use, once the conversation gets long I eventually hit the following error:

openai.error.InvalidRequestError: This model's maximum context length is 4097 tokens. However, your messages resulted in 4113 tokens (4063 in the messages, 50 in the functions). Please reduce the length of the messages or functions.

because the context window fills up. How can I remove the last messages, or apply some moving-window strategy?


Solution

  • If memory is an instance of ConversationBufferMemory, for example, you can access past messages with memory.chat_memory.messages.

    The method memory.clear() sets memory.chat_memory.messages to an empty list, so it deletes all memory. See here and here for the respective code parts. To delete only some messages, you can slice the list instead:

    memory.chat_memory.messages = memory.chat_memory.messages[:-2]
    

    This, for example, deletes the last two messages.

    To add a new message pair in their place, you could then use

    memory.save_context({"input": "Servus"}, {"output": "Grüezi!"})
    

    Update (Oct 2024)

    I could delete the last 8 messages with

    memory.messages = memory.messages[:-8]

    where memory is created at the beginning using

    memory = ChatMessageHistory()
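
    The same slicing idea extends to a moving-window strategy: before each model call, keep only the most recent N messages so the prompt stays under the token limit. A minimal sketch in plain Python (the `Message` class and `trim_to_window` helper here are illustrative stand-ins, not LangChain APIs; with a real `ChatMessageHistory` you would slice `memory.messages` the same way):

    ```python
    from dataclasses import dataclass

    @dataclass
    class Message:
        role: str      # "human" or "ai"
        content: str

    def trim_to_window(messages, max_messages=8):
        """Keep only the most recent max_messages entries (a moving window)."""
        return messages[-max_messages:] if max_messages > 0 else []

    # Simulate a growing chat history of alternating human/AI turns.
    history = []
    for i in range(10):
        history.append(Message("human", f"question {i}"))
        history.append(Message("ai", f"answer {i}"))

    # Apply the window before each model call.
    history = trim_to_window(history, max_messages=8)
    print(len(history))        # 8
    print(history[0].content)  # question 6
    ```

    This is essentially what ConversationBufferWindowMemory does for you via its `k` parameter (it keeps the last `k` interactions), so if you only need a fixed window, using that class directly is the simpler option.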