python, py-langchain

How to deal with dict input type in LangChain


I tried to make the input type of the PromptTemplate a dict and access its values inside the f-string-style template:

from langchain.prompts.prompt import PromptTemplate
from langchain.chains.llm import LLMChain
from langchain.llms import GooglePalm

prompt_template_advice = PromptTemplate(
    input_types={"data": dict},
    input_variables=["data"],
    template=" ...{data['Gender']} ...{data['Age']}... {data['output']} (etc)",
)

advisor = GooglePalm(temperature=0.9)
advice_chain = LLMChain(verbose=True, llm=advisor, prompt=prompt_template_advice, output_key="main_advice")
advice_chain.run(data)

which returns this error:

Missing some input keys: {"data['BMI']", "data['TUE']", "data['FAF']", "data['NCP']", "data['CAEC']", "data['Gender']", "data['FAVC']", "data['MTRANS']", "data['SCC']", "data['CALC']", "data['Age']", "data['family_history_with_overweight']", "data['output']"}

Solution

  • Instead of referencing data['Gender'] inside the template, pass the dictionary's keys as the prompt's input variables. PromptTemplate treats everything between braces as a single flat variable name, so {data['Gender']} is read as a variable literally named "data['Gender']", which is why the chain reports all of those as missing input keys. The prompt should look something like this:

    from langchain.prompts.prompt import PromptTemplate
    from langchain.chains.llm import LLMChain
    from langchain_openai import ChatOpenAI

    prompt_template_advice = PromptTemplate(
        input_variables=["Gender", "Age", "Name"],
        template="Make a joke using the following data: ...{Gender} ...{Age} ...{Name}...",
    )

    advisor = ChatOpenAI(model="gpt-3.5-turbo", temperature=0)
    advice_chain = LLMChain(verbose=True, llm=advisor, prompt=prompt_template_advice, output_key="main_advice")
    data = {"Gender": "male", "Age": "25", "Name": "John"}
    advice_chain.invoke(data)

    This would correctly insert the values into the template.
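
    If the dictionary has many fields (BMI, FAF, CAEC, and so on, as in the question), the same idea works without typing every key by hand: derive input_variables from the dictionary itself and pass the whole dict to invoke. The sketch below is a minimal, hypothetical example; it assumes an OpenAI API key is configured, and the field names are just placeholders taken from the question's data:

    from langchain.prompts.prompt import PromptTemplate
    from langchain.chains.llm import LLMChain
    from langchain_openai import ChatOpenAI

    # Hypothetical record; in the real case this would be the full data dict.
    data = {"Gender": "male", "Age": "25", "BMI": "23.1"}

    # Declare one flat input variable per dict key instead of a single "data" variable.
    prompt = PromptTemplate(
        input_variables=list(data.keys()),
        template="Give short advice for a {Gender} aged {Age} with a BMI of {BMI}.",
    )

    advisor = ChatOpenAI(model="gpt-3.5-turbo", temperature=0)
    advice_chain = LLMChain(llm=advisor, prompt=prompt, output_key="main_advice")

    # The dict's keys line up with the prompt's variables, so it can be passed as-is.
    result = advice_chain.invoke(data)
    print(result["main_advice"])

    Building input_variables from data.keys() keeps the prompt declaration in sync with whatever fields the record actually carries.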