[Pkg-sysvinit-devel] fundamental properties of entropy

Henrique de Moraes Holschuh hmh at debian.org
Thu Sep 16 15:05:21 UTC 2010


On Wed, 15 Sep 2010, John Denker wrote:
> a) Suppose I shuffle a deck of cards, and ask you to figure
> out the order of the cards.  The entropy in this situation
> is the logarithm of 52 factorial, which is just under 226
> bits.  You can figure out the order by asking 226 yes/no
> questions.
> 
> b) Suppose I prepare two decks of cards by shuffling one
> and then stacking the other into the exact same configuration.
> If we throw away deck #1, the entropy of deck #2 is 226 bits.
> If we throw away deck #2, the entropy of deck #1 is 226 bits.
> The situation is symmetrical with respect to which deck is
> which.  Last but not least, if we keep both decks, the entropy 
> of both of them together is 226 bits.

Does the 226 bits of entropy in scenario (b), when we keep both decks,
depend on our knowledge that both decks are in exactly the same
configuration?

I'd say it doesn't.  It is always going to be just 226 bits of entropy,
because the correlation between the decks exists whether or not anyone
can know about it, even in a scenario where information about one deck
never reaches observers of the other deck.
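As a quick sanity check on the 226-bit figure (my own sketch, not from the
thread): the entropy of a uniformly shuffled 52-card deck is log2(52!),
which can be computed as a sum of logs to avoid the huge intermediate
factorial.

```python
import math

# Entropy of a uniformly random permutation of 52 cards, in bits.
# Summing log2(k) for k = 1..52 equals log2(52!) without overflow.
bits = sum(math.log2(k) for k in range(1, 53))
print(f"log2(52!) = {bits:.2f} bits")  # just under 226 bits
```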

> This proves that entropy is not an extensive quantity.  It is
> context-dependent.

We can, at some time, find a correlation between two RNGs, and that
knowledge will force us to re-evaluate the real entropy available from
that set of (now known to be correlated) RNGs.
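To make the correlation point concrete, here is a toy sketch (my own
construction, the seed value is hypothetical): two PRNGs seeded
identically are perfectly correlated, so observing both output streams
yields no more information than observing one of them.

```python
import random

SEED = 1234  # hypothetical shared seed: the hidden source of correlation

rng_a = random.Random(SEED)
rng_b = random.Random(SEED)

stream_a = [rng_a.getrandbits(32) for _ in range(5)]
stream_b = [rng_b.getrandbits(32) for _ in range(5)]

# The streams are identical: the joint output carries the entropy of
# one stream, not two, whether or not any observer knows the seeds match.
assert stream_a == stream_b
```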

The question which will make a lot of people uneasy is: did we have the
wrong idea about the real entropy available from the set until we found
out about the correlation?  Suppose that no observer knew of that
correlation, either.

> Some people find this confusing or even disturbing, but it is
> a well-established fact of nature.  For details on all this,
> see
>  http://www.av8n.com/physics/thermo/entropy.html#sec-card-game

I'll read that text; I am very sure I will learn something from it :)

> There is a fundamental principle in the cryptography / security
> business says that you cannot make something secure by throwing
> together a whole bunch of insecure elements.  You can make it
> complicated, but you cannot make it secure.  This has been
> discussed and documented, in connection with RNGs and otherwise,
> in various places including Knuth _TAoCP_ 

Yes.  However, it is not impossible to increase the effort required of
an attacker with a bunch of insecure elements.  In that case you did
not make the system secure, but you did raise the bar.  If it is
unknown by how much you raised the bar, it is going to be considered of
dubious value security-wise, and caution would speak against including
such insecure elements in that case.

One example is using a clock to shuffle an entropy pool.  By itself, it
is never going to make that entropy pool secure.  Even if there are no
external observers of that clock, there will be external observers of
its indirect effects on the output of the RNGs driven by the pool.  The
information leak is there, so it is not secure.
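A minimal sketch of what "using a clock to shuffle an entropy pool"
could look like (hash-based mixing; the function name and the all-zero
initial state are my own assumptions, not how any particular kernel
pool works):

```python
import hashlib
import time

def mix_into_pool(pool: bytes, sample: bytes) -> bytes:
    """Fold a sample into the pool state by hashing pool || sample."""
    return hashlib.sha256(pool + sample).digest()

pool = bytes(32)  # hypothetical all-zero initial pool state

# Mixing in a clock reading changes the pool state deterministically;
# an observer who learns the clock value can reproduce the mix, so
# this adds work for the attacker but no entropy the observer lacks.
pool = mix_into_pool(pool, time.time_ns().to_bytes(8, "big"))
```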

But it does make the attacker's life a bit more difficult: if his
attack was based on a known initial state of the pool, he now also
needs something about the state of the clock, and he will spend
resources trying to determine it.  I suppose (my math is NOT good
enough for this) that some attacks are NOT going to be made any more
difficult by an input that has no entropy.  But if it makes a set of
cheap attacks harder, it did have some value.

-- 
  "One disk to rule them all, One disk to find them. One disk to bring
  them all and in the darkness grind them. In the Land of Redmond
  where the shadows lie." -- The Silicon Valley Tarot
  Henrique Holschuh


