Invoking a request using AzureChatOpenAI returns a response as expected:
import os
from dotenv import load_dotenv
from langchain_openai import AzureChatOpenAI
load_dotenv()
llm = AzureChatOpenAI(
    azure_endpoint=os.getenv("AZURE_ENDPOINT"),
    api_key=os.getenv("TOOL_KEY"),
    api_version=os.getenv("API_VERSION"),
    model="gpt-4o-2024-08-06",
)
response = llm.invoke("what is 2+3?")
print(response.content)
Returns:
2 + 3 equals 5.
But using the same config doesn't work with AzureOpenAI:
import os
from dotenv import load_dotenv
from langchain_openai import AzureOpenAI
load_dotenv()
llm = AzureOpenAI(
    azure_endpoint=os.getenv("AZURE_ENDPOINT"),
    api_key=os.getenv("TOOL_KEY"),
    api_version=os.getenv("API_VERSION"),
    model="gpt-4o-2024-08-06",
)
response = llm.invoke("what is 2+3?")
print(response.content)
Returns an error:
openai.NotFoundError: Error code: 404 - {'detail': 'Not Found'}
What am I missing here?
AzureChatOpenAI supports chat completion models, including newer models like gpt-4o-2024-08-06. AzureOpenAI supports text completion models, but NOT chat completion models (such as the gpt-4 family): it calls the legacy completions endpoint, which a chat-only deployment like gpt-4o does not expose, which is why the request comes back with a 404. If you want to use gpt-4 models, stick with AzureChatOpenAI.
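For completeness, here is a minimal sketch of how AzureOpenAI would be used against a text completion deployment. The deployment name gpt-35-turbo-instruct is an assumption for illustration; substitute whatever completion model you actually have deployed. Note that a completion LLM returns a plain string, so there is no .content attribute to read.

import os
from dotenv import load_dotenv
from langchain_openai import AzureOpenAI

load_dotenv()

# AzureOpenAI targets the legacy text-completions API, so it needs a
# text completion deployment. "gpt-35-turbo-instruct" is an assumed
# deployment name used for illustration only.
llm = AzureOpenAI(
    azure_endpoint=os.getenv("AZURE_ENDPOINT"),
    api_key=os.getenv("TOOL_KEY"),
    api_version=os.getenv("API_VERSION"),
    model="gpt-35-turbo-instruct",
)

# Completion LLMs return a plain string, not a message object.
response = llm.invoke("what is 2+3?")
print(response)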
More info here, and in the top banner.