python, py-langchain, llamacpp

Could not load Llama model from path: ./Models/llama-7b.ggmlv3.q2_K.bin. Received error Llama.__init__() got an unexpected keyword argument 'input'


from langchain.llms import LlamaCpp
from langchain import PromptTemplate, LLMChain
from langchain.callbacks.manager import CallbackManager
from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler

template = """Question: {question}

Answer: Let's work this out in a step by step way to be sure we have the right answer."""

prompt = PromptTemplate(template=template, input_variables=["question"])

callback_manager = CallbackManager([StreamingStdOutCallbackHandler()])

llm = LlamaCpp(
    model_path="./Models/llama-7b.ggmlv3.q2_K.bin",
    input={"temperature": 0.75, "max_length": 2000, "top_p": 1},
    callback_manager=callback_manager,
    verbose=True,
)

llm_chain = LLMChain(prompt=prompt, llm=llm)

Current folder structure: (screenshot not included). Full output when running the script:

(llm) C:\llm>python app1.py
C:\llm\lib\site-packages\langchain\utils\utils.py:155: UserWarning: WARNING! input is not default parameter.
                input was transferred to model_kwargs.
                Please confirm that input is what you intended.
  warnings.warn(
Exception ignored in: <function Llama.__del__ at 0x000001923B3AE680>
Traceback (most recent call last):
  File "C:\llm\lib\site-packages\llama_cpp\llama.py", line 1507, in __del__
    if self.model is not None:
AttributeError: 'Llama' object has no attribute 'model'
Traceback (most recent call last):
  File "C:\llm\app1.py", line 14, in <module>
    llm = LlamaCpp(
  File "C:\llm\lib\site-packages\langchain\load\serializable.py", line 74, in __init__
    super().__init__(**kwargs)
  File "pydantic\main.py", line 341, in pydantic.main.BaseModel.__init__
pydantic.error_wrappers.ValidationError: 1 validation error for LlamaCpp
__root__
  Could not load Llama model from path: ./Models/llama-7b.ggmlv3.q2_K.bin. Received error Llama.__init__() got an unexpected keyword argument 'input' (type=value_error)

Solution

  • You could try installing an older version of llama-cpp-python:

    pip install llama-cpp-python==0.1.65 --force-reinstall --upgrade --no-cache-dir
    

    This worked for me.
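
  • Alternatively, the root cause of the error is that `LlamaCpp` has no `input` parameter: LangChain moves the unrecognized keyword into `model_kwargs` (that's the UserWarning you see) and forwards it to `Llama.__init__`, which rejects it. Passing the sampling settings as top-level arguments avoids the problem. A minimal sketch, assuming a langchain version where `temperature`, `max_tokens`, and `top_p` are direct `LlamaCpp` fields (note there is no `max_length` field; `max_tokens` is the closest equivalent of what the original code intended):

    ```python
    from langchain.llms import LlamaCpp
    from langchain.callbacks.manager import CallbackManager
    from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler

    callback_manager = CallbackManager([StreamingStdOutCallbackHandler()])

    # Pass the sampling settings directly instead of wrapping them in
    # an `input` dict; unknown keywords are forwarded to Llama.__init__,
    # which raises "unexpected keyword argument 'input'".
    llm = LlamaCpp(
        model_path="./Models/llama-7b.ggmlv3.q2_K.bin",
        temperature=0.75,
        max_tokens=2000,  # replaces the intended "max_length"
        top_p=1,
        callback_manager=callback_manager,
        verbose=True,
    )
    ```

    This keeps your current llama-cpp-python version and only changes how the parameters are passed.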