According to the JWS specification (RFC 7515), the "x5t#S256" thumbprint is defined as follows:
The "x5t#S256" (X.509 certificate SHA-256 thumbprint) Header Parameter is a base64url-encoded SHA-256 thumbprint (a.k.a. digest) of the DER encoding of the X.509 certificate [RFC5280] corresponding to the key used to digitally sign the JWS. Note that certificate thumbprints are also sometimes known as certificate fingerprints. Use of this Header Parameter is OPTIONAL.
To calculate the thumbprint value, we first compute the SHA-256 digest of the certificate's DER encoding, which produces a 32-byte array (the digest). How should we then compute the thumbprint value? Should we base64url-encode the byte array directly, or should we first convert it into a hex string and then base64url-encode that? Is this defined clearly in any RFC specification?
From my findings, some implementations take the hex string and then base64url-encode it, while others directly base64url-encode the digest byte array.
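To make the difference concrete, here is a minimal Python sketch of the two approaches applied to a made-up 32-byte digest (the digest value is purely illustrative). Note that the two approaches produce outputs of different lengths, so they can never agree:

```python
import base64

# Hypothetical 32-byte SHA-256 digest, used only for illustration.
digest = bytes(range(32))

# Approach 1: convert to a hex string first, then base64url-encode the
# ASCII hex characters (64 input bytes -> 86 output characters).
hex_based = base64.urlsafe_b64encode(digest.hex().encode("ascii")).rstrip(b"=").decode()

# Approach 2: base64url-encode the raw digest bytes directly
# (32 input bytes -> 43 output characters).
direct = base64.urlsafe_b64encode(digest).rstrip(b"=").decode()

print(len(hex_based), len(direct))  # 86 43
```

Because the hex string doubles the input length before encoding, approach 1 always yields an 86-character value and approach 2 a 43-character value.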
For example, the MTLS token binding spec follows the second approach, i.e. directly base64url-encoding the digest byte array.

But the following thumbprint validation procedure follows the first approach, i.e. taking the hex string of the digest byte array and then base64url-encoding it:
1. Extract the SHA-256 thumbprint from a certificate:
   36:12:AD:8F:01:B4:EC:F4:71:4F:0B:C8:E0:71:B6:40:3D:D3:4C:4D:DE:62:D8:1D:D4:B9:1D:1A:A3:56:DE:E6
2. Remove the ':' separators and convert to lowercase:
   3612ad8f01b4ecf4714f0bc8e071b6403dd34c4dde62d81dd4b91d1aa356dee6
3. Base64url-encode the value:
   MzYxMmFkOGYwMWI0ZWNmNDcxNGYwYmM4ZTA3MWI2NDAzZGQzNGM0ZGRlNjJkODFkZDRiOTFkMWFhMzU2ZGVlNg

The final output should be equal to the "x5t#S256" value.
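The validation steps above can be reproduced with a short Python snippet, using the exact fingerprint from the example:

```python
import base64

# Fingerprint exactly as shown in the procedure above.
fingerprint = ("36:12:AD:8F:01:B4:EC:F4:71:4F:0B:C8:E0:71:B6:40:"
               "3D:D3:4C:4D:DE:62:D8:1D:D4:B9:1D:1A:A3:56:DE:E6")

# Step 2: remove ':' separators and convert to lowercase.
hex_value = fingerprint.replace(":", "").lower()

# Step 3: base64url-encode the hex string (padding stripped).
encoded = base64.urlsafe_b64encode(hex_value.encode("ascii")).rstrip(b"=").decode()

print(encoded)
# MzYxMmFkOGYwMWI0ZWNmNDcxNGYwYmM4ZTA3MWI2NDAzZGQzNGM0ZGRlNjJkODFkZDRiOTFkMWFhMzU2ZGVlNg
```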
Posting the answer for anyone who may be looking into the same issue in the future.
The correct approach does not involve converting the byte array into a human-readable hex string. We should directly base64url-encode the 32-byte digest array.
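The correct computation can be sketched in Python as follows. The function name `x5t_s256` is my own; the input would be the DER-encoded certificate bytes (a placeholder byte string is used here since no real certificate is at hand):

```python
import base64
import hashlib

def x5t_s256(der_bytes: bytes) -> str:
    """Compute the "x5t#S256" value: base64url encoding (without padding)
    of the raw SHA-256 digest of the certificate's DER encoding."""
    digest = hashlib.sha256(der_bytes).digest()  # 32 raw bytes
    return base64.urlsafe_b64encode(digest).rstrip(b"=").decode("ascii")

# Placeholder standing in for real DER-encoded certificate bytes.
thumbprint = x5t_s256(b"dummy-der-bytes")
print(len(thumbprint))  # 43
```

A correctly computed "x5t#S256" value is therefore always 43 characters long; an 86-character value is a sign that the hex-string approach was used.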
Refer to the working group response for more details.