My code works for a call to Azure OpenAI when no datasource is added. However, when I add my datasource with the following parameters, I get an error:
response = client.chat.completions.create(
    messages=[
        {
            "role": "system",
            "content": "when the user provides a project name as input you should do the steps mentioned below: Step 1: Get the project band of the project from the file."
        },
        {
            "role": "user",
            "content": 'Project Name: "Test project" '
        }
    ],
    model="GPT-3.5 Turbo",
    seed=42,
    temperature=0,
    max_tokens=800,
    extra_body={
        "dataSources": [
            {
                "type": "AzureCognitiveSearch",
                "parameters": {
                    "endpoint": os.environ["SEARCH_ENDPOINT"],
                    "key": os.environ["SEARCH_KEY"],
                    "indexName": "test-index"
                }
            }
        ]
    }
)
This gives the following error:
Exception has occurred: BadRequestError
Error code: 400 - {'error': {'message': 'Unrecognized request argument supplied: dataSources', 'type': 'invalid_request_error', 'param': None, 'code': None}}
httpx.HTTPStatusError: Client error '400 model_error' for url 'https://openai-ngap-genai-poc.openai.azure.com//openai/deployments/NTAPOC/chat/completions?api-version=2023-09-01-preview'
For more information check: https://httpstatuses.com/400
During handling of the above exception, another exception occurred:
File "C:\Users\choran\OneDrive - Open Sky Data Systems\Documents\NTA\NTA Chatbot code\Attempting to add datasource.py", line 13, in <module>
response = client.chat.completions.create(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
openai.BadRequestError: Error code: 400 - {'error': {'message': 'Unrecognized request argument supplied: dataSources', 'type': 'invalid_request_error', 'param': None, 'code': None}}
I have verified that the datasource details are correct.
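For reference, the check was roughly along these lines. This is only a minimal sketch against the Azure Cognitive Search REST API using the same SEARCH_ENDPOINT, SEARCH_KEY, and test-index values; the api-version value is an assumption, and any recent GA version should behave the same:
import os
import requests

# Sanity check: confirm the index exists and the key works by asking for the
# document count (returns a plain number on success, an error status otherwise).
search_endpoint = os.environ["SEARCH_ENDPOINT"]
search_key = os.environ["SEARCH_KEY"]
index_name = "test-index"

resp = requests.get(
    f"{search_endpoint}/indexes/{index_name}/docs/$count",
    params={"api-version": "2023-11-01"},
    headers={"api-key": search_key},
    timeout=30,
)
resp.raise_for_status()
print(f"Documents in {index_name}: {resp.text}")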
In my environment, when I tried the same code, I got the same error:
Error:
openai.BadRequestError: Error code: 400 - {'error': {'message':'Unrecognized request argument supplied: dataSources', 'type': 'invalid_request_error', 'param': None, 'code': None}}
You can use this MS-DOCS guide to set up chat completions on your own data.
You can use the code below to create a chat completion with a data source on openai version 1.9.0. The key change is pointing the client's base_url at the deployment's /extensions path, which is the route that accepts the dataSources parameter with api-version 2023-09-01-preview; the plain /chat/completions route rejects it with the "Unrecognized request argument" error you are seeing.
Code:
import os
from openai import AzureOpenAI

endpoint = os.environ["AZURE_ENDPOINT"]
deployment = "gpt-35-turbo"
apikey = os.environ["API_KEY"]

# Point the client at the deployment's /extensions route so that the
# dataSources extension is accepted.
client = AzureOpenAI(
    base_url=f"{endpoint}/openai/deployments/{deployment}/extensions",
    api_key=apikey,
    api_version="2023-09-01-preview",
)

for i in range(3):
    print(f"Answer Version {i + 1}\n---")
    completion = client.chat.completions.create(
        model=deployment,
        messages=[
            {
                "role": "system",
                "content": "When the user provides a project name as input, you should do the steps mentioned below: Step 1: Get the project band of the project from the file."
            },
            {
                "role": "user",
                "content": "Where do I go for Azure OpenAI customer support?"
            }
        ],
        seed=42,
        temperature=0,
        max_tokens=800,
        extra_body={
            "dataSources": [
                {
                    "type": "AzureCognitiveSearch",
                    "parameters": {
                        "endpoint": os.environ["SEARCH_ENDPOINT"],
                        "key": os.environ["SEARCH_KEY"],
                        "indexName": "test-index"
                    }
                }
            ]
        }
    )
    print(completion.choices[0].message.content)
    print("---\n")
Output:
Answer Version 1
---
Answer Version 2
---
Answer Version 3
---
You can check the Cognitive Services support options guide for help with Azure OpenAI [doc1].
---
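The [doc1] marker in the answer is a citation to a retrieved document. If you need the underlying citations, they come back in an extra context field attached to the assistant message; the exact shape varies between the preview api-versions, so this sketch (an assumption on my part, not a documented contract) navigates it defensively from the last completion in the loop:
import json

# Sketch: the /extensions route may attach retrieval context (citation tool
# messages) to the assistant message; walk it defensively since the schema
# differs between preview api-versions.
message = completion.choices[0].message
context = message.model_dump().get("context") or {}
for tool_msg in context.get("messages", []):
    try:
        payload = json.loads(tool_msg.get("content", "{}"))
    except json.JSONDecodeError:
        continue
    for n, citation in enumerate(payload.get("citations", []), start=1):
        print(f"[doc{n}] {citation.get('title')} -> {citation.get('filepath')}")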