Score:5

Conditional entropy between ciphertext, plaintext and the key


If we have a (possibly imperfect) cryptosystem that generates ciphertext $C$ from plaintext $P$ and key $K$, we have:

$H(C, P, K) = H(C | P, K) + H(P, K)$

where $H$ is entropy.

My question is why the following is true:

$H(C | P, K) = 0$

It seems to hold because each key-plaintext pair uniquely determines a ciphertext, but I want to prove this mathematically (using results from information theory).
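
For concreteness, here is a minimal numeric sketch of the setup I have in mind, assuming a toy one-bit XOR cipher $C = P \oplus K$ with independent, uniform plaintext and key bits (my own toy example): it enumerates the joint distribution and checks that $H(C | P, K) = 0$ and that the chain rule above holds.

```python
# Toy example (assumption: one-bit XOR cipher, uniform independent P and K).
# Enumerate the joint distribution of (c, p, k) and check that
# H(C | P, K) = 0 and H(C, P, K) = H(C | P, K) + H(P, K).
from itertools import product
from math import log2

def entropy(dist):
    """Shannon entropy (in bits) of a dict mapping outcomes to probabilities."""
    return -sum(pr * log2(pr) for pr in dist.values() if pr > 0)

# Joint distribution of (c, p, k) for the deterministic cipher c = p ^ k.
joint = {}
for p_bit, k_bit in product([0, 1], repeat=2):
    c_bit = p_bit ^ k_bit                   # c is a deterministic function of (p, k)
    joint[(c_bit, p_bit, k_bit)] = 1 / 4    # p and k uniform and independent

# Marginal distribution of (p, k).
pk = {}
for (c_bit, p_bit, k_bit), pr in joint.items():
    pk[(p_bit, k_bit)] = pk.get((p_bit, k_bit), 0) + pr

h_cpk = entropy(joint)        # H(C, P, K) = 2 bits
h_pk = entropy(pk)            # H(P, K)    = 2 bits
print(h_cpk - h_pk)           # H(C | P, K) = 0.0
```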

Your question might be more useful to other readers if you add an explanation for the symbols used.

ctwardy: Edited to explain symbols. Two partial answers below could be merged.
Score:4

It is not true for all cryptosystems. Most modern cryptographic systems use probabilistic encryption, where many ciphertexts can correspond to a single key-plaintext pair. This is particularly important when we want schemes to have indistinguishability under chosen-plaintext attacks (IND-CPA), for example.
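
A small sketch of this point (my own toy construction, not a real scheme): append a fresh random bit $r$ to a one-bit XOR cipher, so the ciphertext is $(r, p \oplus k \oplus r)$. Each key-plaintext pair now has two equally likely ciphertexts, and $H(C | P, K)$ becomes $1$ bit instead of $0$.

```python
# Toy probabilistic scheme (illustrative assumption): the ciphertext is
# (r, p ^ k ^ r) for a fresh uniform random bit r, so each (plaintext, key)
# pair has two equally likely ciphertexts.
from itertools import product
from math import log2

def entropy(dist):
    return -sum(pr * log2(pr) for pr in dist.values() if pr > 0)

joint, pk = {}, {}
for p_bit, k_bit, r_bit in product([0, 1], repeat=3):
    c = (r_bit, p_bit ^ k_bit ^ r_bit)        # ciphertext depends on the randomness r
    pr = 1 / 8                                # p, k, r uniform and independent
    joint[(c, p_bit, k_bit)] = joint.get((c, p_bit, k_bit), 0) + pr
    pk[(p_bit, k_bit)] = pk.get((p_bit, k_bit), 0) + pr

print(entropy(joint) - entropy(pk))           # H(C | P, K) = 1.0 bit, not 0
```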

Score:3

Take the definition of the conditional entropy,

$H(C|P,K) = - \sum_{c \in C,\; m \in P,\; k \in K} p(c, m, k) \log\left(\frac{p(c,m,k)}{p(m,k)}\right)$

(the Wikipedia article on conditional entropy gives the same definition) then notice that, if $c$ is fully determined by $(m, k)$ - because, as you said, each key-plaintext pair produces exactly one ciphertext - then for every term with $p(c, m, k) > 0$ we must have $p(c, m, k) = p(m, k)$. Consequently, the fraction inside the logarithm evaluates to $1$ and the logarithm evaluates to $0$; the remaining terms have $p(c, m, k) = 0$ and contribute nothing by the convention $0 \log 0 = 0$. So every term in the sum is $0$ and the overall conditional entropy is $0$, as you expect for a distribution where $(m, k)$ totally defines $c$.
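
The same cancellation can be seen by grouping the sum over $(m, k)$ first (a restatement of the calculation above, again using $0 \log 0 = 0$):

$H(C|P,K) = - \sum_{m \in P,\; k \in K} p(m, k) \sum_{c \in C} p(c|m,k) \log p(c|m,k)$

For a deterministic cipher $p(c|m,k)$ is either $1$ or $0$ for every $c$, so each inner sum vanishes term by term.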

As mentioned above, this wouldn't be the case for probabilistic encryption, where a nonce or IV is used: the ciphertext then depends on additional random inputs beyond $(m, k)$, so the entropy of those inputs 'filters through' to $c$, giving non-zero conditional entropy.
