Differences in defaults between 1.0.2 and 1.1.1
graeme.perrow at sap.com
Tue Mar 19 13:40:19 UTC 2019
I have an LDAP server that accepts TLS connections, and I can connect to it using "openssl s_client -showcerts -host <host>:<port> -debug". The output shows a TLSv1.2 connection using ECDHE-RSA-AES128-SHA. This client is built with OpenSSL 1.0.2j.
If I run exactly the same command using the openssl executable built with 1.1.1, I get errors:
write to 0x2917b30 [0x2928090] (326 bytes => 326 (0x146))
0000 - 16 03 01 01 41 01 00 01-3d 03 03 5a e6 ad 03 79 ....A...=..Z...y
0140 - cb bb 7f 9c 78 24 ....x$
read from 0x2917b30 [0x291edf3] (5 bytes => 0 (0x0))
no peer certificate available
No client certificate CA names sent
SSL handshake has read 0 bytes and written 326 bytes
New, (NONE), Cipher is (NONE)
Secure Renegotiation IS NOT supported
No ALPN negotiated
Early data was not sent
Verify return code: 0 (ok)
read from 0x2917b30 [0x290e960] (8192 bytes => 0 (0x0))
The server closes the connection and reports this error:
TLS: error: accept - force handshake failure: errno 11 - moznss error -12162
TLS: can't accept: TLS error -12162:Unsupported hash algorithm used by TLS peer..
If I add the -no_tls1_2 switch, the openssl 1.1.1 command succeeds. Since neither the server nor the client command line changed, it would seem that some default behaviour changed within OpenSSL 1.1.1. I know that some ciphersuites were removed or disabled, but the one negotiated by OpenSSL 1.0.2j (ECDHE-RSA-AES128-SHA) does not seem to be one of them; it is still listed in the output of "openssl ciphers". Does anyone know what might be happening here?
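For anyone who wants to reproduce the working configuration programmatically rather than through s_client, here is a minimal sketch using Python's standard ssl module. Capping the client's protocol at TLS 1.1 is the programmatic equivalent of passing -no_tls1_2 on the command line. The function name is my own, and disabling certificate verification is only for mirroring a debugging session against an internal server, not a recommendation.

```python
import ssl

def make_legacy_client_context():
    """Build a client context roughly equivalent to
    `openssl s_client -no_tls1_2`: cap the protocol at TLS 1.1 so the
    server never sees a TLS 1.2 ClientHello (and thus never sees the
    newer signature_algorithms defaults that come with it)."""
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    # s_client does not verify by default either; this mirrors a debug
    # session only. check_hostname must be disabled before verify_mode.
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE
    # The actual workaround from the post: refuse to offer TLS 1.2.
    ctx.maximum_version = ssl.TLSVersion.TLSv1_1
    return ctx
```

A context built this way can then be used with `ctx.wrap_socket(sock, server_hostname=host)` against the LDAP server to confirm that the handshake succeeds once TLS 1.2 is off the table, isolating the failure to a TLS 1.2-specific default.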
More information about the openssl-users mailing list