Tags: python, sockets, ssl, tls1.2, pyopenssl

Python ssl get ciphers supported


I was wondering why there isn't a simple way to find the ciphers supported by a particular SSL context. (Or did I miss it?)

I know we can use socket.cipher() to get the one being used for the connection for that particular socket, but why not something to get all supported ciphers?

Another thing:

I have a server running the OpenSSL library with the cipher string "HIGH+TLSv1.2:!MD5:!SHA1". The client is a Python script using the ssl library with default options, and after the connection is established socket.cipher() returns the tuple below:

('DHE-RSA-AES256-GCM-SHA384', 'TLSv1/SSLv3', 256)

How is the connection established with TLSv1, when I explicitly specified TLSv1.2, and how is it using SHA384, when TLSv1 doesn't support anything higher than SHA-1?


Solution

  • I was wondering why there isn't a simple way to find the ciphers supported by a particular SSL context. (Or did I miss it?)

    Only OpenSSL 1.1.0 added a function, SSL_CTX_get_ciphers, to access this list, and this functionality is not yet available in Python.
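(On Python 3.6 or newer, SSLContext.get_ciphers() does expose this list. A minimal sketch on such a version, reusing the server's cipher string from the question:)

```python
import ssl

# Python 3.6+ exposes the enabled cipher list via SSLContext.get_ciphers().
ctx = ssl.create_default_context()
ctx.set_ciphers("HIGH+TLSv1.2:!MD5:!SHA1")  # the server's string from the question

for cipher in ctx.get_ciphers():
    # Each entry is a dict with keys such as 'name' and 'protocol'.
    print(cipher["name"], cipher["protocol"])
```

Note that with OpenSSL 1.1.1+ the TLS 1.3 suites are always listed as well, since set_ciphers() only controls the TLS 1.2-and-below cipher list.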

    ('DHE-RSA-AES256-GCM-SHA384', 'TLSv1/SSLv3', 256)
    

    How is the connection established with TLSv1, when I explicitly specified TLSv1.2, and how is it using SHA384, when TLSv1 doesn't support anything higher than SHA-1?

    According to the source code, Python uses SSL_CIPHER_get_version to find out the version string. The matching OpenSSL documentation for OpenSSL 1.0.2 says:

    SSL_CIPHER_get_version() returns string which indicates the SSL/TLS protocol version that first defined the cipher. This is currently SSLv2 or TLSv1/SSLv3. In some cases it should possibly return "TLSv1.2" but does not; use SSL_CIPHER_description()

    Thus it is a bug in OpenSSL which, according to the documentation of the same function in OpenSSL 1.1.0, has been fixed in that release.
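As a quick way to check whether your build is affected, the ssl module reports which OpenSSL it is linked against; on Python 3.6+ with OpenSSL 1.1.0 or newer, the per-cipher protocol strings come out accurately (e.g. "TLSv1.2" rather than "TLSv1/SSLv3"). A small sketch; the version strings in the comments are illustrative:

```python
import ssl

# The linked OpenSSL build determines what SSL_CIPHER_get_version reports.
print(ssl.OPENSSL_VERSION)       # e.g. 'OpenSSL 1.0.2g  1 Mar 2016'
print(ssl.OPENSSL_VERSION_INFO)  # a 5-tuple, e.g. (1, 0, 2, 7, 15)

# With OpenSSL >= 1.1.0 (and Python 3.6+ for get_ciphers), each cipher's
# protocol field is reported accurately.
if ssl.OPENSSL_VERSION_INFO >= (1, 1, 0):
    ctx = ssl.create_default_context()
    for cipher in ctx.get_ciphers()[:3]:
        print(cipher["name"], cipher["protocol"])
```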