Score:2

What does $(a+b) \bmod{256}$ and $a$ XOR $b$ reveal about $a, b$?


Say $a$ and $b$ are uniform random $8$-bit values, so that the entropy of $a$ and of $b$ is 8 bits each.

If I show you $(a+b) \bmod{256}$ and $a$ XOR $b$, then what can you tell about $a$ and $b$? Or how much of their entropy is reduced?

Score:5

fgrieu analyzes the average case; we can also consider the worst case: how much entropy are we guaranteed to have left?

One worst case happens if $a+b \equiv 0 \pmod {256}$ and $a \oplus b = 0$; in that case, the only possible solutions are $a=b=0$ and $a=b=128$; hence the entropy has been reduced to 1 bit.

More generally, this worst case happens whenever $a \oplus b = 0$ or $a \oplus b = 128$; in either event there are only two possible solutions. In the case $a \oplus b = 0$, we have either $a = b = s/2$ (where $s$ is the published sum, which is always even here) or $a = b = s/2 + 128$; in the case $a \oplus b = 128$, we have $a, b = s/2, s/2 + 128$ in some order.

We note that, for any $a, b$, the alternative values $a \oplus 128, b \oplus 128$ always give the same xor and sum; hence there are always at least two solutions, so this bad case is indeed the worst case.
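The claims above can be checked exhaustively for $k=8$. A minimal sketch (variable names are mine, not from the answer): every reachable $(s, x)$ pair has at least two preimages, and exactly two precisely when $x \in \{0, 128\}$.

```python
# Exhaustively count solutions (a, b) for each revealed pair (s, x), k = 8.
from collections import Counter

M = 256  # 2^8

counts = Counter(((a + b) % M, a ^ b) for a in range(M) for b in range(M))

# (a, b) and (a ^ 128, b ^ 128) always give the same sum and xor,
# so every reachable (s, x) has at least two solutions.
assert min(counts.values()) == 2

# The two-solution worst case occurs exactly when x is 0 or 128.
worst = {(s, x) for (s, x), n in counts.items() if n == 2}
assert all(x in (0, 128) for (s, x) in worst)
print(len(worst), "worst-case observations")
```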

Score:4

I'll assume bitstrings are identified with integers in big-endian notation, that $a$ and $b$ are $k$-bit with $k=8$ in the question, and that we are given the two $k$-bit quantities $s:=(a+b)\bmod{2^k}$ and $x:=a\oplus b$.

$s$ and $x$ are not independent: their low-order bits are equal. Therefore revealing $(s,x)$ reveals at most $2k-1$ bits of information, and thus causes at most a $(2k-1)$-bit reduction of entropy.

Since given $a$ and $x$ we can compute $b=a\oplus x$, revealing $(s,x)$ causes at least a $k$-bit reduction of entropy.

The actual reduction of entropy varies between these bounds:

  • With $x=0$ and $s$ even, the solutions are $(a,b)\in\{(s/2,s/2),(s/2+2^{k-1},s/2+2^{k-1})\}$, thus there remains $\log_2(2)=1$ bit of entropy out of the initial $2k$, a loss of $2k-1$ bits of entropy.
  • With $x=s=1$ there are 4 solutions: $(a,b)\in\{(0,1),(1,0),(2^{k-1},2^{k-1}+1),(2^{k-1}+1,2^{k-1})\}$, thus there remain $\log_2(4)=2$ bits of entropy out of the initial $2k$, a loss of $2k-2$ bits of entropy.
  • With $x=s=2^k-1$ there are $2^k$ solutions of the form $(a,2^k-1-a)$, thus there remain $k$ bits of entropy out of the initial $2k$, a loss of $k$ bits of entropy.

I assert without proof that for $i\in[0,k)$ the entropy loss is $2k-1-i$ bits with probability ${k-1\choose i}/2^{k-1}$, and that it follows that the expected entropy loss is $(3k-1)/2$ bits.
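The asserted distribution can be verified by brute force for $k=8$ (a sketch; variable names are mine). Grouping the $2^{2k}$ equally likely pairs by what is revealed, the loss $2k-1-i$ should occur with probability ${k-1\choose i}/2^{k-1}$, and the expected loss should come out to $(3k-1)/2 = 11.5$ bits:

```python
# Check the claimed entropy-loss distribution exhaustively for k = 8.
from collections import Counter
from math import comb, log2

k = 8
M = 1 << k

# Group the 2^(2k) equally likely pairs (a, b) by the revealed (s, x).
counts = Counter(((a + b) % M, a ^ b) for a in range(M) for b in range(M))

# An observation with n solutions leaves log2(n) bits of entropy,
# i.e. a loss of 2k - log2(n) bits; tally the probability of each loss.
loss_prob = Counter()
for n in counts.values():
    loss_prob[2 * k - log2(n)] += n / M**2

# Claimed: loss 2k-1-i occurs with probability C(k-1, i) / 2^(k-1).
for i in range(k):
    assert abs(loss_prob[2 * k - 1 - i] - comb(k - 1, i) / 2 ** (k - 1)) < 1e-12

# Claimed: expected loss is (3k - 1) / 2 bits.
expected = sum(l * p for l, p in loss_prob.items())
assert abs(expected - (3 * k - 1) / 2) < 1e-9
```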

Score:4

As I didn't see it mentioned yet: $a + b \equiv (a\oplus b) + ((a\&b)<<1) \pmod{256}$ (where $<<$ denotes left shift), so the information you are given is equivalent to knowing $a\oplus b$ and, except for its highest bit, $a\&b$.

All of the functions $a+b$, $a\oplus b$ and $a\&b$ are symmetric in $a$ and $b$. For a fixed bit position $i$ you can therefore learn at most how many of the two bits $a_i$ and $b_i$ are 1; but if exactly one of them is, you don't know which one.

For all but the highest bit position you know $a_i\&b_i$ and $a_i\oplus b_i$, which are just the binary digits of $a_i+b_i$, so for every bit position but the highest you have exactly the 0/1 count. For the highest bit position you only have $a_7\oplus b_7$.
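Both the identity and the bitwise consequence can be confirmed exhaustively for $k=8$. A minimal sketch (names are mine): from $s$ and $x$ one recovers $a\&b$ modulo $2^{k-1}$, i.e. all of its bits except the highest.

```python
# Verify a + b == (a XOR b) + 2*(a AND b) mod 2^k, and the consequence
# that (s, x) determines a & b except for its highest bit, for k = 8.
k = 8
M = 1 << k

for a in range(M):
    for b in range(M):
        s = (a + b) % M
        x = a ^ b
        # The identity itself.
        assert s == (x + ((a & b) << 1)) % M
        # (s - x) mod 2^k equals 2*(a & b) mod 2^k, so halving it
        # recovers a & b modulo 2^(k-1) (everything but the top bit).
        assert ((s - x) % M) >> 1 == (a & b) % (1 << (k - 1))
```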

From this you should be able to derive the results of poncho and fgrieu.
