Here's my code:
import pickle, os
from langchain_openai.chat_models import ChatOpenAI
from langchain.schema import (
    AIMessage,
    HumanMessage,
    SystemMessage
)

def execute_prompt(text, history, jarvis_setup):
    print(f"You said: {text}")
    history.append(HumanMessage(content=text))
    response = jarvis_setup(history)
    history.append(AIMessage(content=response.content))
    with open('JarvisMemory.txt', 'wb') as file:
        pickle.dump(history, file)
    print(response.content)

def main():
    jarvis_setup = ChatOpenAI(openai_api_key="API_KEY", model="gpt-3.5-turbo", temperature=0.7, max_tokens=400)
    #history = [SystemMessage(content="You are a human-like virtual assistant named Jarvis.", additional_kwargs={})]
    if os.path.exists("JarvisMemory.txt"):
        with open("JarvisMemory.txt", "rb") as file:
            history = pickle.load(file)
    else:
        with open("JarvisMemory.txt", "wb") as file:
            history = [SystemMessage(content="You are a human-like virtual assistant named Jarvis. Answer all questions as shortly as possible, unless a longer, more detailed response is requested.", additional_kwargs={})]
            pickle.dump(history, file)
    while True:
        print("\n")
        print("Enter prompt.")
        text = input().lower()
        print("Prompt sent.")
        if text:
            execute_prompt(text, history, jarvis_setup)
        else:
            print("No prompt given.")
            continue

if __name__ == "__main__":
    main()
And I get this error:
LangChainDeprecationWarning: The method BaseChatModel.__call__
was deprecated in langchain-core 0.1.7 and will be removed in 0.3.0. Use invoke instead.
warn_deprecated(
Traceback (most recent call last):
File "C:\Users\maste\Documents\Coding\Python\Jarvis\JarvisTextInpuhjhjghyjvjt.py", line 44, in <module>
main()
File "C:\Users\maste\Documents\Coding\Python\Jarvis\JarvisTextInpuhjhjghyjvjt.py", line 37, in main
execute_prompt(text, history, jarvis_setup)
File "C:\Users\maste\Documents\Coding\Python\Jarvis\JarvisTextInpuhjhjghyjvjt.py", line 12, in execute_prompt
response = jarvis_setup(history)
File "C:\Users\maste\AppData\Roaming\Python\Python310\site-packages\langchain_core\_api\deprecation.py", line 148, in warning_emitting_wrapper
return wrapped(*args, **kwargs)
File "C:\Users\maste\AppData\Roaming\Python\Python310\site-packages\langchain_core\language_models\chat_models.py", line 847, in __call__
generation = self.generate(
File "C:\Users\maste\AppData\Roaming\Python\Python310\site-packages\langchain_core\language_models\chat_models.py", line 456, in generate
raise e
File "C:\Users\maste\AppData\Roaming\Python\Python310\site-packages\langchain_core\language_models\chat_models.py", line 446, in generate
self._generate_with_cache(
File "C:\Users\maste\AppData\Roaming\Python\Python310\site-packages\langchain_core\language_models\chat_models.py", line 671, in _generate_with_cache
result = self._generate(
File "C:\Users\maste\AppData\Roaming\Python\Python310\site-packages\langchain_openai\chat_models\base.py", line 520, in _generate
message_dicts, params = self._create_message_dicts(messages, stop)
File "C:\Users\maste\AppData\Roaming\Python\Python310\site-packages\langchain_openai\chat_models\base.py", line 533, in _create_message_dicts
message_dicts = [_convert_message_to_dict(m) for m in messages]
File "C:\Users\maste\AppData\Roaming\Python\Python310\site-packages\langchain_openai\chat_models\base.py", line 533, in <listcomp>
message_dicts = [_convert_message_to_dict(m) for m in messages]
File "C:\Users\maste\AppData\Roaming\Python\Python310\site-packages\langchain_openai\chat_models\base.py", line 182, in _convert_message_to_dict
if (name := message.name or message.additional_kwargs.get("name")) is not None:
AttributeError: 'SystemMessage' object has no attribute 'name'
Based on some research I did on the issue, I'm guessing I need to add ".invoke" somewhere in the code, but I'm a beginner.
I found this page showcasing a very similar error and how to fix it: https://wikidocs.net/235780 (you can translate the page to English with Google Translate; the translation is good enough to follow). It shows where to add ".invoke", but I'm not sure how to apply that to my code, and it might not even be the right solution.
I also looked at the LangChain website, which likewise says to use "invoke", but I couldn't find an example of it being used in a full line of code.
Here's the solution! I just figured it out, and it was a very simple mistake: when changing langchain_community to langchain_openai, remove the ".chat_models" from the import.
So this line: from langchain_community.chat_models import ChatOpenAI
Should be this: from langchain_openai import ChatOpenAI
This is how I figured it out: https://python.langchain.com/v0.2/docs/versions/v0_2/#upgrade-to-new-imports
Also, at least in my code, I had to replace the deprecated call style with ".invoke", i.e. change response = jarvis_setup(history) to response = jarvis_setup.invoke(history).
With those two changes, I get no warnings and no errors!
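For anyone landing here later, this is roughly what the two corrected lines look like in context. This is a sketch only: it assumes the langchain-openai package is installed, and "API_KEY" is a placeholder you must replace with a real OpenAI API key before the call will succeed.

```python
from langchain_openai import ChatOpenAI  # note: no ".chat_models" with the new package
from langchain.schema import AIMessage, HumanMessage

# Placeholder key — substitute your own before running.
jarvis_setup = ChatOpenAI(openai_api_key="API_KEY", model="gpt-3.5-turbo",
                          temperature=0.7, max_tokens=400)

history = [HumanMessage(content="Hello!")]

# Deprecated style (raises the warning above): response = jarvis_setup(history)
# Current style: call .invoke() on the model instead of calling it directly.
response = jarvis_setup.invoke(history)
history.append(AIMessage(content=response.content))
print(response.content)
```

The same one-line change applies inside execute_prompt: swap jarvis_setup(history) for jarvis_setup.invoke(history) and everything else can stay as it is.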