I'm integrating my own MCP server with OAuth authentication through the ChatGPT Connector.
During the authorization code exchange, the connector does not include the client_id parameter in the POST request to the /token endpoint.
My .well-known/openid-configuration looks like this:
{
  "token_endpoint_auth_methods_supported": [
    "client_secret_post"
  ]
}
According to RFC 6749 §2.3.1 and §4.1.3, a client authenticating with client_secret_post must send the client_id and client_secret parameters in the body of the token request.
However, the ChatGPT Connector omits client_id from the request.
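For reference, this is roughly the token request I would expect the connector to send for client_secret_post (a minimal sketch using Python's requests library; the token endpoint URL, redirect URI, and credentials below are placeholders, not my real values):

import requests

# Expected authorization code exchange for client_secret_post:
# client_id and client_secret travel in the POST body (RFC 6749 §2.3.1).
response = requests.post(
    "https://auth.example.com/token",                    # placeholder token endpoint
    data={
        "grant_type": "authorization_code",
        "code": "AUTHORIZATION_CODE",                    # code received from the callback
        "redirect_uri": "https://example.com/callback",  # placeholder redirect URI
        "client_id": "my-client-id",
        "client_secret": "my-client-secret",
    },
    timeout=10,
)
print(response.status_code, response.json())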
Question
Is this a known bug in the ChatGPT Connector or an intended behavior (e.g. when the client is registered as a public client)? If it’s a bug, are there any known workarounds?
The issue is that the current ChatGPT MCP Connector ignores the
token_endpoint_auth_methods_supported value advertised by the
Authorization Server.
Even if the server advertises:
{
  "token_endpoint_auth_methods_supported": [
    "client_secret_post"
  ]
}
the connector always uses client_secret_basic.
As a result:
- it does not send client_id or client_secret in the POST body,
- it ignores the server's requirement for client_secret_post,
- and it violates the server's advertised configuration and the behavior described in RFC 6749 §2.3.1 and §4.1.3, which require a client_secret_post client to send client_id and client_secret in the request body (see the sketch below).
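Concretely, with client_secret_basic the credentials travel in the Authorization header rather than in the POST body, which is why client_id never shows up in the form data. A rough sketch of that request shape (placeholder URL and credentials; the connector's exact payload is not documented):

import requests

# client_secret_basic: client_id and client_secret are Base64-encoded into
# the Authorization header, so the POST body carries only grant parameters.
requests.post(
    "https://auth.example.com/token",                    # placeholder token endpoint
    data={
        "grant_type": "authorization_code",
        "code": "AUTHORIZATION_CODE",                    # placeholder
        "redirect_uri": "https://example.com/callback",  # placeholder
    },
    auth=("my-client-id", "my-client-secret"),           # sent as HTTP Basic auth
    timeout=10,
)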
This behavior is not related to client registration or whether the
client is public.
It is a limitation (or a bug) of the current connector implementation.
At the moment, the connector supports only the basic method
(client_secret_basic), regardless of the Authorization Server's
discovery configuration.
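Given that, one workaround is to accept client_secret_basic on the Authorization Server side (and to advertise it in token_endpoint_auth_methods_supported). Below is a minimal sketch of a token endpoint that reads client credentials from either the Authorization header or the POST body; it uses Flask, and the client store, paths, and token issuance are placeholders rather than a production implementation:

import secrets

from flask import Flask, jsonify, request

app = Flask(__name__)

# Placeholder client registry; replace with real storage and secret hashing.
CLIENTS = {"my-client-id": "my-client-secret"}


def extract_client_credentials(req):
    """Return (client_id, client_secret) from Basic auth or the form body."""
    if req.authorization is not None:                      # client_secret_basic
        return req.authorization.username, req.authorization.password
    return req.form.get("client_id"), req.form.get("client_secret")  # client_secret_post


@app.post("/token")
def token():
    client_id, client_secret = extract_client_credentials(request)
    if client_id is None or CLIENTS.get(client_id) != client_secret:
        return jsonify(error="invalid_client"), 401

    # Authorization-code validation and real token issuance are omitted here.
    return jsonify(
        access_token=secrets.token_urlsafe(32),
        token_type="Bearer",
        expires_in=3600,
    )


if __name__ == "__main__":
    app.run(port=8000)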