[openssl-dev] [openssl.org #4227] openssl rand 10000000000 does not produce 10000000000 random bytes

Ole Tange via RT rt at openssl.org
Tue Jan 12 03:56:58 UTC 2016


On Tue, Jan 12, 2016 at 4:36 AM, Kaduk, Ben via RT <rt at openssl.org> wrote:
> On 01/11/2016 06:01 PM, Salz, Rich via RT wrote:
>>> I am a bit worried when I see C-beginner mistakes like this in a security suite:
>>> When using sscanf on data you have not produced yourself, you should
>>> always assume it will be bigger than your largest buffer/variable and deal
>>> correctly with that.

>> That's a bit of an exaggeration here.  It's not network data coming in from somewhere else, it's a number typed on the command line in a local program.

The worry is not about this particular case (which does not seem to be
exploitable), but about the general observation: if the rest of the
code is of the same quality, then we will be screwed.

> There's also the part where asking 'openssl rand' for gigabytes of data
> is not necessarily a good idea -- I believe in the default configuration
> on unix, it ends up reading 32 bytes from /dev/random and using that to
> seed EAY's md_rand.c scheme, which is not exactly a state-of-the-art
> CSPRNG these days...

We do not know what these data will be used for (in my case a user
wanted them for overwriting a hard drive, so the quality mattered
less than the speed). But if it is unsafe to generate gigabytes of
data from a 32-byte seed, then by all means read another 32-byte seed
now and then (but please do it from /dev/urandom:
http://www.2uo.de/myths-about-urandom/).


/Ole

More information about the openssl-dev mailing list