I copied the chat example from the Gemini documentation directly into my Node.js server, and I ran into an error related to the history parameter:
{"error": "Cannot use 'in' operator to search for 'text' in H"}
However, the operation succeeds when I set history to null.
This is my code:
  history: [
    {
      role: "user",
      parts: "Hello, I have 2 dogs in my house.",
    },
    {
      role: "model",
      parts: "Great to meet you. What would you like to know?",
    },
  ],
  generationConfig: {
    maxOutputTokens: 100,
  },
});
And this is the error I got in the console:
TypeError: Cannot use 'in' operator to search for 'text' in H
at validateChatHistory (C:\myprojects\chat_app_generativeai_dartfrog\server\node_modules\@google\generative-ai\dist\index.js:760:25)
at new ChatSession (C:\myprojects\chat_app_generativeai_dartfrog\server\node_modules\@google\generative-ai\dist\index.js:816:13)
at GenerativeModel.startChat (C:\myprojects\chat_app_generativeai_dartfrog\server\node_modules\@google\generative-ai\dist\index.js:1035:16)
at C:\myprojects\chat_app_generativeai_dartfrog\server\routes\chat.js:21:22
at Layer.handle [as handle_request] (C:\myprojects\chat_app_generativeai_dartfrog\server\node_modules\express\lib\router\layer.js:95:5)
at next (C:\myprojects\chat_app_generativeai_dartfrog\server\node_modules\express\lib\router\route.js:149:13)
at Route.dispatch (C:\myprojects\chat_app_generativeai_dartfrog\server\node_modules\express\lib\router\route.js:119:3)
at Layer.handle [as handle_request] (C:\myprojects\chat_app_generativeai_dartfrog\server\node_modules\express\lib\router\layer.js:95:5)
at C:\myprojects\chat_app_generativeai_dartfrog\server\node_modules\express\lib\router\index.js:284:15
at Function.process_params (C:\myprojects\chat_app_generativeai_dartfrog\server\node_modules\express\lib\router\index.js:346:12)
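From the stack trace, the failure happens inside the SDK's validateChatHistory. My understanding (an assumption based on the error message, not on a careful reading of the SDK source) is that it checks for a text property on each entry in parts; because I passed parts as a plain string, JavaScript's in operator throws, since in cannot be used on a string primitive. A minimal snippet that reproduces the same kind of TypeError:

// Hypothetical reproduction of the TypeError, unrelated to the SDK itself:
const part = "Hello, I have 2 dogs in my house.";
console.log("text" in part);
// TypeError: Cannot use 'in' operator to search for 'text' in Hello, I have 2 dogs in my house.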
The chat history structure shown in the Google documentation example does not match the format the SDK actually validates for multi-turn conversations, so startChat fails before the conversation even begins. To fix this, I propose the following adjustment to the chat history, wrapping each turn's parts in an array of objects with a text field:
const chatHistory = [
  {
    role: "user",
    parts: [{ text: "Hello, I have 2 dogs in my house." }]
  },
  {
    role: "model",
    parts: [{ text: "Great to meet you." }]
  }
];
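For reference, here is a minimal sketch of how I plan to pass the corrected history to startChat, assuming the @google/generative-ai SDK; the gemini-pro model name and the GEMINI_API_KEY environment variable are placeholders from my setup:

const { GoogleGenerativeAI } = require("@google/generative-ai");

const genAI = new GoogleGenerativeAI(process.env.GEMINI_API_KEY);
const model = genAI.getGenerativeModel({ model: "gemini-pro" });

async function run() {
  const chat = model.startChat({
    // Each turn wraps its text in a parts array of objects, as proposed above.
    history: chatHistory,
    generationConfig: {
      maxOutputTokens: 100,
    },
  });

  // Send the next user turn and print the model's reply.
  const result = await chat.sendMessage("What are good names for them?");
  console.log(result.response.text());
}

run();

With this shape, startChat should no longer throw in validateChatHistory.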