I have created an Azure AI Search service called ***********. After this, I created a skillset called text split skill. Then I created an index called azureblob-index1 using Azure Blob Storage, in which there is a container called rag where all my important documents are stored.
Then I created a chat playground session using the Azure AI Search index, and also created a web application from the same setup.
When I ask a question in the chat playground and in the web application, the responses are different and the references are also different.
The question I asked:
How does C******* plan to ensure the accessibility and inclusivity of the Digital Curriculum Website for diverse user groups?
I have used the following fields in the index:
- content (String)
- metadata_storage_content_type (String)
- metadata_storage_size (Int64)
- metadata_storage_last_modified (DateTimeOffset)
- metadata_storage_path (String)
- metadata_author (String)
- metadata_title (String)
- metadata_creation_date (DateTimeOffset)
- language (String)
- split_text (StringCollection)
- keywords (String)
- summary (String)
- section_titles (String)
- metadata_file_type (String)
- merged_content (String)
- text (StringCollection)
- layoutText (StringCollection)
- metadata_storage_file_extension (String)
- metadata_content_type (String)
- metadata_language (String)
- metadata_storage_name (String)
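For completeness, this field list can be checked directly against the index definition. Below is a minimal sketch using the azure-search-documents Python SDK; the service endpoint, admin key, and index name are placeholders for this setup:

```python
# Minimal sketch: list the fields of an existing Azure AI Search index.
# Endpoint, admin key, and index name are placeholders.
from azure.core.credentials import AzureKeyCredential
from azure.search.documents.indexes import SearchIndexClient

index_client = SearchIndexClient(
    endpoint="https://<your-search-service>.search.windows.net",
    credential=AzureKeyCredential("<search-admin-key>"),
)

index = index_client.get_index("azureblob-index1")
for field in index.fields:
    # Prints e.g. "content Edm.String" or "split_text Collection(Edm.String)"
    print(field.name, field.type)
```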
There are only 5 documents in my container right now.
I am not able to understand why the responses are different: I asked the same question and got different responses. Also, the reference document that is given sometimes doesn't match the relevant document.
What could be the exact reason?
First of all, LLMs are decoder-only transformers, which do not necessarily generate the exact same answer every time; the phrasing and length of the answer can differ between runs.
However, if both of your answers are totally off from each other, you can troubleshoot via the following path:
Webapp Resource > Settings > Environment Variables
and check whether all the variables, especially the model name, model parameters, system message, API version, and search index parameters, are exactly the same as in the playground. Set top_p=1 and temperature=0 for both; this configures the model for deterministic behavior.
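To rule out UI differences entirely, you can also reproduce the call outside both the playground and the web app. Below is a minimal Python sketch using the openai SDK with the Azure AI Search data source; the endpoints, keys, deployment name, API version, and system message are placeholders and should be set to exactly what the web app is configured with:

```python
# Minimal sketch: call the same Azure OpenAI deployment grounded on the same
# Azure AI Search index, pinning temperature and top_p for repeatability.
# All endpoints, keys, and names below are placeholders.
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://<your-aoai-resource>.openai.azure.com",
    api_key="<aoai-api-key>",
    api_version="2024-06-01",  # match the API version configured on the web app
)

response = client.chat.completions.create(
    model="<chat-deployment-name>",  # must match the web app's deployment
    temperature=0,                   # deterministic decoding settings
    top_p=1,
    messages=[
        {"role": "system", "content": "<same system message as the playground>"},
        {"role": "user", "content": "<your question>"},
    ],
    extra_body={
        "data_sources": [  # ground the answer on the same search index
            {
                "type": "azure_search",
                "parameters": {
                    "endpoint": "https://<your-search-service>.search.windows.net",
                    "index_name": "azureblob-index1",
                    "authentication": {"type": "api_key", "key": "<search-query-key>"},
                },
            }
        ]
    },
)

print(response.choices[0].message.content)
```

Note that temperature=0 makes the output much more repeatable, but it does not strictly guarantee identical wording on every run.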