Score:3

Entropy of a counter


I wondered whether my understanding of entropy is correct: that a 256-bit counter which starts at 0 and counts up to 2^256 - 1 in +1 increments has 256 bits of entropy. I am asking because it struck me as odd that, for RNGs, developers often specify only an entropy figure.

So is it correct that a non-repeating 256-bit counter has 256 bits of entropy, because the numbers are equally distributed in their appearance?
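For concreteness, the counter process being asked about can be sketched like this (a minimal illustration, not production code):

```python
def counter_rng(bits=256):
    """A naive 'RNG': a counter that runs from 0 to 2**bits - 1, never repeating."""
    for n in range(2**bits):
        yield n

# Over a full period every value appears exactly once, so the outputs are
# uniformly distributed -- but each output is fully determined by the previous one.
```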

Morrolan: Entropy is first and foremost a property of a **process**, not of a variable or value. E.g. throwing a perfect 16-sided die once 'produces' 4 bits of entropy. As such, you might want to clarify what you mean, as the 'entropy of a counter' is ill-defined. See e.g. the first half of [this answer](https://crypto.stackexchange.com/questions/26853/trng-vs-prng-entropy) for a description of entropy which might help you refine your question.
kaiya: I mean that I have a process in which I count up from 0 to 2^256 - 1, and the process does not repeat itself. I am not quite sure what more information would be needed to answer the question. Let's say this was my first-grader implementation of an RNG.
kaiya: Or, to turn the question around: what other properties of an RNG determine whether it is good?
Score:11

> So is it correct that a non-repeating 256-bit counter has 256 bits of entropy, because the numbers are equally distributed in their appearance?

Not necessarily. Entropy can be viewed as a measure of "how much information the attacker does not know about the system". To take a simple example, a fair coin flipped in secret generates one bit of entropy; that same coin flipped in the open (where everyone can see it) generates no entropy, even though it is the same coin: whether it comes up heads or tails, the attacker knows which, and can take that into account.

In your example, if we assume that the attacker knows when the counter was started and knows the approximate count rate, then he knows the approximate value of the counter, and hence the amount of entropy in that counter is much less than 256 bits.
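As a rough illustration (my sketch, not part of the original answer), applying the Shannon formula to the *attacker's* probability distribution reproduces the coin-flip example exactly:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy, in bits, of a probability distribution."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# Fair coin flipped in secret: the attacker assigns probability 1/2 to each side.
print(shannon_entropy([0.5, 0.5]))   # 1.0 bit

# Same coin flipped in the open: the attacker saw the outcome, so from
# their perspective one side has probability 1 and the other 0.
print(shannon_entropy([1.0, 0.0]))   # 0.0 bits
```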

kaiya: Thanks! So the entropy would be 256 bits only if the observer were unable to recognize the pattern? But how can we distinguish non-predictable patterns from predictable ones? (I guess now we are talking about PRNGs.)
kaiya: Anyhow, thanks for pointing out the relation to predictability, which was not obvious when looking only at the formulas for (Shannon) entropy as they are usually taught.
poncho: @kaiya: actually, it is implicit in the standard formula for Shannon entropy; however, we need to remember that the probability distribution used in that formula is from the attacker's perspective.
Brian: @kaiya: Anything you wrongly treat as unpredictable represents a hole in your algorithm. So the standard approach in crypto is to assume the observer has access to the algorithm's source code, but not to the key.
Score:2

Actually, not at all. It is only a few bits, equivalent to the semantic content of the descriptive paraphrase "integers 0 through 2^256 - 1", i.e. the set $\{\, n \in \mathbb{Z} : 0 \le n \le 2^{256}-1 \,\}$. That is only 14 symbols, and thus not many bits. It is a rough (upper-bound) approximation to the Kolmogorov complexity of your counter.

Given that repeatedly incrementing anything by one is completely autocorrelated, that process has an effective Kolmogorov complexity of zero and can be ignored. The only way to inject entropy into your system would be to initialize the counter to a secret value. That would supply the 256 bits of entropy.
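A sketch of that idea (hypothetical code, using Python's `secrets` module for the secret start value):

```python
import secrets

MASK = 2**256 - 1  # counter wraps modulo 2^256

# All the entropy enters here: a secret, uniformly random 256-bit start value.
secret_start = secrets.randbits(256)

def counter_state(i, start):
    """The i-th state of a 256-bit counter initialized to `start`."""
    return (start + i) & MASK

# The increment step itself is deterministic: given any one state, an
# attacker can predict all later states. Secrecy of `start` is everything.
states = [counter_state(i, secret_start) for i in range(4)]
```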

And yes, there are problems and nuances with the way the $\log_2$ Shannon formula is commonly used here. It applies only to independent data samples that are not predictable from prior values (unlike a fixed delta such as incrementing by one), and only when the state of the generator is unknown at the point the sequence is read.

kaiya: Thanks, I haven't yet looked into Kolmogorov complexity. Do I understand correctly that, when using a random initialization value, only the process of creating that initial random value has 256 bits of entropy, or did you mean the entire process in this case?
Paul Uszak: @kaiya Err, sorry. I'm not quite sure I understand your comment...