[openssl-dev] DRBG entropy

John Denker ssx at av8n.com
Wed Jul 27 13:39:52 UTC 2016


On 07/27/2016 05:13 AM, Leon Brits wrote:
> 
> I have a chip (FDK RPG100) that generates randomness, but the
> SP800-90B python test suite indicated that the chip only provides
> 2.35 bits/byte of entropy. According to the FIPS test lab, the lowest
> value from all the tests is used as the entropy, and 2 is too low. I
> must however make use of this chip.

That's a problem on several levels.

For starters, keep in mind the following maxim:
     Testing can certainly show the absence of entropy.
     Testing can never show the presence of entropy.

That is to say, you have ascertained that 2.35 bits/byte is an
/upper bound/ on the entropy density coming from the chip.  If
you care about security, you need a lower bound.  Despite what
FIPS might lead you to believe, you cannot obtain this from testing.
The only way to obtain it is by understanding how the chip works.
This might require a tremendous amount of effort and expertise.
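
For a feel of why a test result is only an upper bound, consider a
stripped-down python rendering of the SP800-90B "most common value"
estimate (this is my own rough sketch, not the NIST suite; the 2.576
is a ~99% confidence factor).  Feed it the output of a plain counter
(which contains no entropy at all) and it will cheerfully report
nearly 8 bits/byte:

    import math
    from collections import Counter

    def mcv_min_entropy_per_byte(samples):
        # Estimate min-entropy from the frequency of the most common byte.
        n = len(samples)
        p_hat = Counter(samples).most_common(1)[0][1] / n
        p_u = min(1.0, p_hat + 2.576 * math.sqrt(p_hat * (1.0 - p_hat) / n))
        return -math.log2(p_u)

    # A counter is perfectly predictable, yet its byte histogram is flat,
    # so the statistic alone cannot tell it apart from a good noise source.
    print(mcv_min_entropy_per_byte(bytes(i % 256 for i in range(100000))))

The test sees only the distribution of one output sample; it cannot
see whether the generator was deterministic all along.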

================

Secondly, entropy is probably not even the correct concept.  For any
given probability distribution P, i.e. for any given ensemble, there
are many measurable properties (i.e. functionals) you might look at.
Entropy is just one of them.  It measures a certain /average/ property.
For cryptologic security, depending on your threat model, it is quite
possible that you ought to be looking at something else.  It may help
to look at this in terms of the Rényi functionals:
  H_0[P] = multiplicity      = Hartley functional
  H_1[P] = plain old entropy = Boltzmann functional
  H_∞[P] = adamance          = what 90B calls min-entropy

The entropy H_1 may be appropriate if the attacker needs to break
all messages, or a "typical" subset of messages.  The adamance H_∞
may be more appropriate if there are many messages and the attacker
can win by breaking any one of them.

To say the same thing in other words:
 -- A small multiplicity (H_0) guarantees the problem is easy for the attacker.
 -- A large adamance (H_∞) guarantees the problem is hard for the attacker.
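
For concreteness, here is a toy python calculation of those three
functionals (the function names are mine, chosen to match the list
above):

    import math

    def hartley(p):      # H_0: log of how many outcomes are possible at all
        return math.log2(sum(1 for x in p if x > 0))

    def entropy(p):      # H_1: the plain old average-case entropy
        return -sum(x * math.log2(x) for x in p if x > 0)

    def adamance(p):     # H_inf: governed by the single most probable outcome
        return -math.log2(max(p))

    # Four possible outcomes, but one of them turns up half the time:
    p = [0.5, 0.25, 0.125, 0.125]
    print(hartley(p), entropy(p), adamance(p))   # -> 2.0 1.75 1.0

The ordering H_0 >= H_1 >= H_∞ always holds, so quoting "the entropy"
(H_1) can flatter the source: the attacker who only needs to hit the
most probable outputs is up against the 1.0, not the 1.75.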

================

Now let us fast-forward and suppose, hypothetically, that you
have obtained a lower bound on what the chip produces.

One way to proceed is to use a hash function.  For clarity, let's
pick SHA-256.  Obtain from the chip not just 256 bits of adamance,
but 24 bits more than that, namely 280 bits.  This arrives in the
form of a string of bytes, possibly hundreds of bytes.  Run this
through the hash function.  The output word is 32 bytes, i.e. 256
bits of high-quality randomness.  The key properties are:
 a) There will be 255.99 bits of randomness per word, guaranteed
  with high probability, more than high enough for all practical
  purposes.
 b) It will be computationally infeasible to locate or exploit
  the missing 0.01 bit.

Note that it is not possible to obtain the full 256 bits of
randomness in a 256-bit word.  Downstream applications must be
designed so that 255.99 is good enough.
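
A rough python sketch of that recipe (read_raw is a placeholder for
whatever actually pulls bytes off the chip, and the 2.0 is a made-up
lower bound, standing in for whatever figure you eventually justify):

    import hashlib, math

    ADAMANCE_PER_RAW_BYTE = 2.0   # assumed lower bound, NOT the 2.35 upper bound
    TARGET_BITS = 256 + 24        # 24 bits of headroom, as above

    def conditioned_word(read_raw):
        # Collect enough raw chip output to cover 280 bits of adamance,
        # then condition it through SHA-256 into one 32-byte output word.
        n_raw = math.ceil(TARGET_BITS / ADAMANCE_PER_RAW_BYTE)   # 140 bytes here
        return hashlib.sha256(read_raw(n_raw)).digest()

Note that the division is by the lower bound you have justified, not
by whatever number the test suite printed.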

========

As with all of crypto, this requires attention to detail.  You
need to protect the hash inputs, outputs, and all intermediate
calculations.  For example, you don't want such things to get
swapped out.
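
On Linux, for instance, one way to keep such a buffer from being
paged out is mlock(2).  The ctypes sketch below illustrates only that
one point; a real implementation would do this in C, inside whatever
module owns the buffer, since python is free to copy byte strings
behind your back:

    import ctypes

    libc = ctypes.CDLL(None, use_errno=True)          # Linux/glibc
    libc.mlock.argtypes = (ctypes.c_void_p, ctypes.c_size_t)
    libc.mlock.restype = ctypes.c_int

    def locked_buffer(n):
        # Allocate an n-byte buffer and pin it in RAM so it cannot be
        # swapped out (needs CAP_IPC_LOCK or an adequate memlock rlimit).
        buf = ctypes.create_string_buffer(n)
        if libc.mlock(ctypes.addressof(buf), n) != 0:
            raise OSError(ctypes.get_errno(), "mlock failed")
        return buf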

