[openssl-users] d2i_TYPE() BCP question, distinguish malformed input from malloc error?
Viktor Dukhovni
openssl-users at dukhovni.org
Mon Jul 13 18:25:52 UTC 2020
I am curious whether anyone has BCP recommendations for distinguishing
between (presumably rare) out-of-memory or similar internal resource
issues resulting in a NULL return value from d2i_TYPE() (e.g.
d2i_X509()) and (presumably more common) problems with the input
encoding.
Does anyone have experience (good or bad), and/or working code
examples, with using the error stack to distinguish between these
conditions? Would it even be a good idea to try?
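The obvious approach (a rough, untested sketch; it assumes that every
allocation failure inside the ASN.1 code queues an error whose reason
code is ERR_R_MALLOC_FAILURE, which is exactly the part I am not sure
of) would be something like:

#include <openssl/x509.h>
#include <openssl/err.h>

typedef enum { PARSE_OK, PARSE_BAD_INPUT, PARSE_TRANSIENT } parse_status;

static parse_status
parse_cert(const unsigned char *buf, long len, X509 **out)
{
    const unsigned char *p = buf;   /* d2i advances the pointer */
    X509 *x;

    ERR_clear_error();              /* start from a clean error stack */
    x = d2i_X509(NULL, &p, len);
    if (x != NULL) {
        *out = x;
        return PARSE_OK;
    }

    /* Drain the queued errors, looking for a malloc failure. */
    for (unsigned long e; (e = ERR_get_error()) != 0; ) {
        if (ERR_GET_REASON(e) == ERR_R_MALLOC_FAILURE)
            return PARSE_TRANSIENT;
    }
    return PARSE_BAD_INPUT;
}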
Should OpenSSL have an alternative d2i interface:
ssize_t d2i_ex_TYPE(TYPE **out, const unsigned char *buf, size_t len)
where:
* The out parameter is never "reused", avoiding the d2i() object-reuse
pitfall.
* The buffer pointer is not modified.
* Success is a return value > 0.
* Malformed input is a zero return value.
* Internal errors (malloc, but also malformed ASN.1 type templates)
yield a negative return value.
This sort of interface would allow an application to distinguish
permanent errors (malformed input) from transient errors (temporarily
out of memory) and degrade gracefully as appropriate, possibly avoiding
downgrade attacks driven by resource exhaustion.
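For concreteness, here is a minimal sketch of such a wrapper for X509,
layered over the existing d2i_X509() and leaning on the same unverified
error-stack heuristic as above:

#include <sys/types.h>              /* ssize_t */
#include <openssl/x509.h>
#include <openssl/err.h>

ssize_t
d2i_ex_X509(X509 **out, const unsigned char *buf, size_t len)
{
    const unsigned char *p = buf;   /* local copy, caller's pointer untouched */
    X509 *x;

    *out = NULL;                    /* never reuse a caller-supplied object */
    ERR_clear_error();
    x = d2i_X509(NULL, &p, (long)len);
    if (x != NULL) {
        *out = x;
        return (ssize_t)(p - buf);  /* bytes consumed, > 0 on success */
    }
    for (unsigned long e; (e = ERR_get_error()) != 0; ) {
        if (ERR_GET_REASON(e) == ERR_R_MALLOC_FAILURE)
            return -1;              /* transient internal error */
    }
    return 0;                       /* malformed input */
}

The caller's buffer pointer is never advanced, and the output object is
written only on success, so the classic d2i reuse hazard disappears;
the remaining weak spot is the error-stack classification itself.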
Of course a new API like that is a non-trivial project, and I am not
in a position to sign up for it at present...
So I'm left wondering whether anyone has managed to work around these
issues by inspecting the error stack in some reasonably reliable
manner.
--
Viktor.