[openssl-project] Entropy seeding the DRBG
Oracle
paul.dale at oracle.com
Tue May 8 16:26:59 UTC 2018
I can confirm that it is measured in bits of entropy per sample (in this case, per byte). The estimate is very low, so this is not a great source.
We can explore other options, and I should be able to spare some time over ICMC to assist. I’m not well versed in VMS, though.
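To put that in context, here is a rough back-of-the-envelope (a sketch only; the 0.0818772 figure is the min-entropy from your run below, and the 256-bit target is just an illustrative seed size):

#include <stdio.h>

/* Compare the two readings of the estimate: bits of entropy per 8-bit
 * sample (the correct reading) versus bits of entropy per data bit. */
int main(void)
{
    const double h = 0.0818772;  /* min-entropy reported by the non-IID run */
    const double target = 256.0; /* illustrative seed size in bits */

    /* Per 8-bit sample: ~98 data bits per entropy bit, ~3.1 KB per seed. */
    printf("per byte: %.0f data bits per entropy bit, %.0f bytes for %g bits\n",
           8.0 / h, target / h, target);

    /* Per data bit: ~12 data bits per entropy bit, ~391 bytes per seed. */
    printf("per bit : %.0f data bits per entropy bit, %.0f bytes for %g bits\n",
           1.0 / h, target / (8.0 * h), target);
    return 0;
}

So with the per-byte reading we are looking at roughly 100 raw data bits, or about 3 KB of samples, for every 256-bit seed.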
Pauli
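PS: on Kurt's point below about passing hashed data to the DRBG with a known entropy figure, a minimal sketch of how that accounting could look from the application side (untested; it assumes the public RAND_add()/EVP digest interfaces, reuses the 0.0818772 bits/byte estimate above, and the file name and 4096-byte chunk are made up for the example):

#include <stdio.h>
#include <openssl/evp.h>
#include <openssl/rand.h>

int main(void)
{
    const double h_per_byte = 0.0818772;  /* measured min-entropy, bits per raw byte */
    unsigned char raw[4096], md[EVP_MAX_MD_SIZE];
    unsigned int md_len = 0;
    size_t n;
    double credit;
    EVP_MD_CTX *ctx;
    /* Hypothetical raw sample file from the gathering tool. */
    FILE *f = fopen("entropy-stats.bin", "rb");

    if (f == NULL)
        return 1;
    n = fread(raw, 1, sizeof(raw), f);
    fclose(f);

    /* Condition the raw samples through SHA-256 before feeding them in. */
    ctx = EVP_MD_CTX_new();
    EVP_DigestInit_ex(ctx, EVP_sha256(), NULL);
    EVP_DigestUpdate(ctx, raw, n);
    EVP_DigestFinal_ex(ctx, md, &md_len);
    EVP_MD_CTX_free(ctx);

    /* Credit only what the raw samples were measured to contain (in bytes),
     * and never more than the digest itself can carry. */
    credit = n * h_per_byte / 8.0;
    if (credit > md_len)
        credit = md_len;

    RAND_add(md, (int)md_len, credit);
    printf("fed %u hashed bytes, credited %.2f bytes of entropy from %zu raw bytes\n",
           md_len, credit, n);
    return 0;
}

The point being that the hash only whitens the data; the entropy credit still has to come from the measurement of the raw samples.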
> On 30 Apr 2018, at 12:00 pm, Richard Levitte <levitte at openssl.org> wrote:
>
> In message <20180430.164908.1424770216194967097.levitte at openssl.org> on Mon, 30 Apr 2018 16:49:08 +0200 (CEST), Richard Levitte <levitte at openssl.org> said:
>
> levitte> In message <20180430.152609.587396153749337701.levitte at openssl.org> on Mon, 30 Apr 2018 15:26:09 +0200 (CEST), Richard Levitte <levitte at openssl.org> said:
> levitte>
> levitte> levitte> In message <20180430131000.GA25216 at roeckx.be> on Mon, 30 Apr 2018 15:10:01 +0200, Kurt Roeckx <kurt at roeckx.be> said:
> levitte> levitte>
> levitte> levitte> kurt> The comment about not hashing it applies if you want to use the tool
> levitte> levitte> kurt> to do entropy estimation. Hashing will not increase the entropy,
> levitte> levitte> kurt> but the estimate will be totally wrong.
> levitte> levitte> kurt>
> levitte> levitte> kurt> Passing the hashed data to the DRBG as entropy input is fine if
> levitte> levitte> kurt> you already know how much entropy it contains.
> levitte> levitte>
> levitte> levitte> Thanks, that's what I suspected. Ok, on to the next step
> levitte>
> levitte> Not done running yet, but it does show some promise...
> levitte>
> levitte> : ; ./a.out ../../../levitte/vms-experiments/entropy-gathering/entropy-stats.bin 8 -v
> levitte> Opening file: ../../../levitte/vms-experiments/entropy-gathering/entropy-stats.bin
> levitte>
> levitte> Running non-IID tests...
> levitte>
> levitte> Entropic statistic estimates:
> levitte> Most Common Value Estimate = 0.975224
> levitte> Collision Test Estimate = 0.902997
> levitte> Markov Test Estimate = 0.410808
> levitte> Compression Test Estimate = 0.811274
> levitte>
> levitte> I assume that estimate is per "word" (i.e. per 8 bits of data in this
> levitte> case).
>
> Ok, done running... suffice it to say, the first tests left me ever so
> hopeful...
>
> : ; ./a.out ../../../levitte/vms-experiments/entropy-gathering/entropy-stats.bin 8 -v
> Opening file: ../../../levitte/vms-experiments/entropy-gathering/entropy-stats.bin
>
> Running non-IID tests...
>
> Entropic statistic estimates:
> Most Common Value Estimate = 0.975224
> Collision Test Estimate = 0.902997
> Markov Test Estimate = 0.410808
> Compression Test Estimate = 0.811274
> t-Tuple Test Estimate = 0.0818796
> Longest Repeated Substring Test Estimate = 0.0818772
>
> Predictor estimates:
> Multi Most Common in Window (MultiMCW) Test: 100% complete
> Correct: 507351
> P_avg (global): 0.508671
> P_run (local): 0.587891
> Multi Most Common in Window (Multi MCW) Test = 0.76638
> Lag Test: 100% complete
> Correct: 269907
> P_avg (global): 0.271051
> P_run (local): 0.347168
> Lag Prediction Test = 1.52629
> MultiMMC Test: 100% complete
> Correct: 11700
> P_avg (global): 0.011977
> P_run (local): 0.444824
> Multi Markov Model with Counting (MultiMMC) Prediction Test = 1.16869
> LZ78Y Test: 99% complete
> Correct: 572107
> P_avg (global): 0.573391
> P_run (local): 0.615723
> LZ78Y Prediction Test = 0.699647
> Min Entropy: 0.0818772
>
> So I'd like to have it confirmed that I'm reading this right: is that
> about 0.08 entropy bits per 8 data bits, or per data bit? Depending on
> the interpretation, we either have 1 bit of entropy per 12 data bits...
> or per 100 data bits... The latter has my heart sinking...
>
> --
> Richard Levitte levitte at openssl.org
> OpenSSL Project http://www.openssl.org/~levitte/