fastapi huggingface-transformers pydantic peft

FastAPI custom validator error: FastAPI/Pydantic not recognizing custom validator functions (RuntimeError: no validator found for <class>)


I'm working on a FastAPI project on an Amazon EC2 instance running Ubuntu 20.04.5. The project requires several custom types (written by me) and third-party types (from HuggingFace transformers, HF peft, LangChain) as fields in my model schema. When I try to run the FastAPI application, though, I consistently get this error for each custom/third-party type I use as a field:

RuntimeError: no validator found for <class 'peft.peft_model.PeftModelForCausalLM'>, see `arbitrary_types_allowed` in Config
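For reference, the Config the error message points at is spelled like this in Pydantic v1 (a minimal sketch, not my actual code):

from pydantic import BaseModel

class ExampleModel(BaseModel):
    class Config:
        # let Pydantic accept field types it has no built-in validator for
        arbitrary_types_allowed = True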

I've done everything I can find online and in the docs to fix this error. Here are some snippets of my code showing what I've tried so far.

My custom model schema looks like this:

from pydantic import BaseModel, ConfigDict, validator
from transformers import Conversation

# BASE_PROMPT_TEXTGEN, load_text_generation, and validate are helpers defined elsewhere in my project
class SaicModelForTextGen(BaseModel):
    """
    Loads a model for text generation. Will require some time to load checkpoint shards
    once called.
    """

    model_config = ConfigDict(arbitrary_types_allowed=True)

    prompt: str = BASE_PROMPT_TEXTGEN

    # runs once at class-definition time, not per request
    model, tokenizer = load_text_generation()

    conversation: Conversation = Conversation()

    @validator('model', check_fields=False)
    def validate_model(cls, value):
        return validate(value)

The field 'model' is of type PeftModelForCausalLM (the class named in the error). The function validate(value) redirects to a simple validation function that immediately returns the object it's given.
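For completeness, that validation function boils down to a pass-through; simplified, it's essentially this:

def validate(value):
    # trusted input, so accept the object unchanged
    return value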

In the API itself, a GET request that uses this class looks like this:

@app.get("/", response_model=None)
def init_text_gen(base_prompt: str = BASE_PROMPT_TEXTGEN) -> SaicModelForTextGen:
    return SaicModelForTextGen(prompt=base_prompt)

Setting response_model=None was a workaround suggested in the docs and in several posts I found online, but so far it hasn't changed anything.

I've also experimented with older and newer versions of FastAPI and Pydantic, without success. I haven't been able to find anyone online who still hits this issue after implementing custom validator functions and setting response_model=None.

Does anyone know of a workaround or solution to this issue? Thanks in advance for your help :) I'll list the versions/libraries I'm working with below.

I'm working with LLMs on a GPU, so the project involves a lot of other libraries, but I've listed the ones that seem directly involved in this issue. Thanks again!

EDIT: Forgot to include the actual library versions. Worried this is a dependency issue somehow.

fastapi 0.95.0
uvicorn 0.23.2
pydantic 1.10.12
python 3.11
transformers 4.31.0
peft 0.4.0


Solution

  • I eventually reached a solution to this error. Since I was working with custom types I trusted, I bypassed validation entirely with the BaseModel.construct() method. For others with a similar issue: only do this if you can already trust the data going into your model, because construct() performs no validation at all (safe in my case, as mentioned).
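Roughly, the change in the endpoint looks like this (a sketch adapted from the GET route in my question; construct() is the Pydantic v1 spelling, v2 renames it model_construct()):

@app.get("/", response_model=None)
def init_text_gen(base_prompt: str = BASE_PROMPT_TEXTGEN) -> SaicModelForTextGen:
    # construct() builds the instance without running any validation,
    # so Pydantic never goes looking for a validator for PeftModelForCausalLM
    return SaicModelForTextGen.construct(prompt=base_prompt)

Defaults for the fields you don't pass are still filled in; they're just never validated.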