Tags: encryption, random, pic, entropy, analog-digital-converter

How can I estimate the entropy content of this input?


I have a 1 kHz triangle wave generator that I am measuring with the analog input of a PIC microcontroller. The triangle wave generator and the analog capture run from separate frequency sources. The ADC captures at 100 ksps with 12 [edit: 10] usable bits of precision.

I want to estimate the entropy contained in the analog samples for the purpose of generating true random numbers. The two sources of entropy I have identified are thermal (kelvin) noise and the offset between the two frequency sources.

From the captured waveform I can continuously distinguish about two frequencies per second, and I capture on average one kelvin-noise threshold upset event per second. So my estimate is about two bits of entropy per second.
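For concreteness, here is a minimal sketch of how that figure comes out, assuming the observation rates stated above and treating each distinguishable state as roughly equiprobable (both assumptions for illustration, not measurements):

```python
import math

# Assumed observation rates from the description above:
distinguishable_freqs_per_sec = 2   # distinguishable frequency-offset states per second
threshold_upsets_per_sec = 1        # kelvin-noise threshold upset events per second

# Treating the frequency states as equiprobable gives log2(2) = 1 bit/s,
# and one roughly unbiased upset event contributes at most 1 bit/s.
bits_from_freq = math.log2(distinguishable_freqs_per_sec)
bits_from_upsets = threshold_upsets_per_sec * 1.0   # <= 1 bit per event

print(f"Estimated entropy rate: {bits_from_freq + bits_from_upsets:.1f} bits/s")
```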

Can anyone think of a way to justify a larger entropy estimate?

Based on answers to similar questions already posted on S.O., I'll add the following clarifications:

I am not particularly interested in other ideas for entropy sources, as I would still have to answer this same question for those alternate sources.

Analysis of the data itself for autocorrelation or other measures of randomness is not the correct answer, as those measures will be wildly optimistic.


Solution

  • NIST Special Publication SP 800-90B recommends min-entropy as the entropy measure. However, testing the min-entropy of an entropy source is non-trivial; SP 800-90B itself describes one way such testing can be done.
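Min-entropy is −log2(p_max), where p_max is the probability of the most likely output. As a rough illustration only, the sketch below applies a most-common-value style estimator in the spirit of SP 800-90B to a buffer of ADC samples. It is one estimator, not the full test suite, and on its own it suffers from exactly the optimism the question warns about. The function name, the `bits_kept` parameter, and `read_adc_buffer()` are assumptions made for the example:

```python
import math
from collections import Counter

def mcv_min_entropy_per_sample(samples, bits_kept=4):
    """Most-common-value min-entropy estimate, in the spirit of NIST SP 800-90B.

    Only the low-order `bits_kept` bits of each ADC reading are kept, since the
    upper bits of a slow triangle wave are almost entirely predictable.
    """
    reduced = [s & ((1 << bits_kept) - 1) for s in samples]
    n = len(reduced)
    p_hat = max(Counter(reduced).values()) / n
    # SP 800-90B uses an upper confidence bound on the most common value's
    # probability rather than the raw proportion:
    p_u = min(1.0, p_hat + 2.576 * math.sqrt(p_hat * (1 - p_hat) / (n - 1)))
    return -math.log2(p_u)

# Hypothetical usage with a buffer of captured 10-bit ADC samples:
# samples = read_adc_buffer()   # placeholder, not defined here
# print(mcv_min_entropy_per_sample(samples), "bits per reduced sample")
```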