I'm trying to use GPT-3.5 in my Flutter app. I do get answers, but it takes 30-60 seconds to receive a response. The code is the following:
Future<String> getResponse(String message) async {
  OpenAI.apiKey = openApiKey;
  try {
    final chatCompletion = await OpenAI.instance.chat.create(
      model: 'gpt-3.5-turbo',
      messages: [
        OpenAIChatCompletionChoiceMessageModel(
          content: message,
          role: OpenAIChatMessageRole.user,
        ),
      ],
    );
    print(chatCompletion);
    return chatCompletion.choices.first.message.content;
  } catch (e) {
    return "Something went wrong. Please try again later.";
  }
}
Right now I have a personal account without a paid subscription on the OpenAI site. Is something wrong with my code, or would selecting a paid plan solve the issue and make responses faster?
This is most likely not a bug in your code but a result of OpenAI's servers being overloaded.
As explained on the official OpenAI forum by @rob.wheatley:
The last few days have been really quite bad. Even with streaming, a response could take a long time to start. But last night, as I was testing my new streaming interface, I noticed some odd, but promising, behavior. Randomly, I would get very quick responses. They were rare at first. /.../ This morning, all responses have been quick so far.
So, the whole thing looks like a capacity issue to me. Not great if you are building a commercial app.
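If you cannot do anything about server-side latency, streaming at least improves perceived responsiveness: the first tokens appear within a few seconds and the UI can render text as it arrives. Below is a minimal sketch of how your `getResponse` could be adapted using the `createStream` method from the same `dart_openai` package your code appears to use. It assumes the same `openApiKey` variable and the string-`content` message model from your snippet (newer package versions changed `content` to a list of content items, so adjust accordingly):

```dart
import 'package:dart_openai/dart_openai.dart';

// Streams the model's reply in small chunks so the UI can show
// partial text instead of blocking for the full completion.
Stream<String> getResponseStream(String message) {
  OpenAI.apiKey = openApiKey; // assumed defined elsewhere, as in the question
  final stream = OpenAI.instance.chat.createStream(
    model: 'gpt-3.5-turbo',
    messages: [
      OpenAIChatCompletionChoiceMessageModel(
        content: message,
        role: OpenAIChatMessageRole.user,
      ),
    ],
  );
  // Each event carries a delta; forward only non-null text chunks.
  return stream
      .map((event) => event.choices.first.delta.content)
      .where((chunk) => chunk != null)
      .cast<String>();
}
```

In a widget you could then feed this stream into a `StreamBuilder` and append each chunk to the displayed text. This does not make the total generation faster, but it removes the 30-60 second wait before anything is shown.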