Score:2

For any numbers $a, b$, what are the operators $X, Y$ such that revealing $a\ X\ b$ and $a\ Y\ b$ doesn't reveal info about $a,b$?


Previously I thought about a pair of uniformly distributed random 8-bit numbers $a, b \in \{0,1\}^8$, with $X$ being bitwise XOR and $Y$ being addition modulo $2^8$. But it turned out that revealing $a \text{ XOR } b$ and $a+b \bmod{2^8}$ does reveal many bits of information about $a,b$.
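
To make that concrete, here is a minimal brute-force sketch (Python; the setup and variable names are mine) that measures how much the pair $(a \text{ XOR } b,\ a+b \bmod 2^8)$ discloses about a uniformly random 8-bit pair $(a,b)$, both on average and in the worst case:

```python
# Brute-force measurement of how much (a XOR b, (a + b) mod 256) reveals
# about a uniformly random 8-bit pair (a, b).
from collections import Counter
from math import log2

counts = Counter()
for a in range(256):
    for b in range(256):
        counts[(a ^ b, (a + b) % 256)] += 1

total = 256 * 256
# The outputs are a deterministic function of (a, b), so the information
# disclosed about (a, b) equals the Shannon entropy of the output pair.
avg_disclosed = sum(-(c / total) * log2(c / total) for c in counts.values())
print(f"average disclosure: {avg_disclosed:.2f} of 16 secret bits")

# Worst case: the observation consistent with the fewest (a, b) candidates.
fewest = min(counts.values())
print(f"worst case: only {fewest} candidate pairs remain out of {total}")
```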

A smart dude here mentioned "dependence" as a property. So I guess I am looking for independent operators? Or, at least, operators whose outputs are independent when their inputs are uniformly random numbers?

My questions are:

  • How far can we go in minimising the amount of information that $a\ X\ b$ and $a\ Y\ b$ give about $a,b$?
  • Can we mathematically prove any bounds? E.g. proving that if $a,b$ are uniform random numbers, and $X$ is ... and $Y$ is ..., then $a\ X\ b$ and $a\ Y\ b$ cannot give more than $x$ bits of information about $a,b$?
fgrieu:
If $X$ (resp. $Y$) can take $u$ (resp. $v$) values over the full set of its two arguments, then $a\ X\ b$ and $a\ Y\ b$ together can't take more than $u\,v$ values, thus can't reveal more than $\log_2(u\,v)$ bits of information. A better bound is possible for some choices of $X$ and $Y$ making the values "dependent", but I don't know how to characterize that better than defining the dependency as the difference between that bound and the actual amount of information disclosed.
caveman:
@fgrieu - Trying to understand: if we only deal with 8-bit variables, then will that mean that $u=v=8$? Also, any idea how to prove the $\log_2(uv)$ bound? Thanks a lot.
fgrieu:
$u$ and $v$ depend on $X$ and $Y$. If we define $a\ X\ b$ and $a\ Y\ b$ to be 42 regardless of $a$ and $b$, then $u=v=1$, $\log_2(u\,v)$ is $0$, and thus this upper bound tells us that no information is revealed about $a$ and $b$. Notice that when $a\ X\ b$ and $a\ Y\ b$ are non-constant but equal regardless of $a$ and $b$, we have $u=v>1$, and the bound $\log_2(u\,v)$ holds but is loose. Proof of the $\log_2(u\,v)$ bound: a variable taking $w$ values can't reveal more than $\log_2(w)$ bits of information, and the pair $(a\ X\ b,\ a\ Y\ b)$ takes at most $w=u\,v$ values.
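
A small numeric illustration of this bound and of how loose it can be (a sketch in Python over 4-bit inputs so the full enumeration stays tiny; the choice of toy operators is mine, mirroring the examples above):

```python
# Compare the log2(u*v) bound with the information actually disclosed, for a
# few choices of X and Y over 4-bit inputs (small enough to enumerate fully).
from collections import Counter
from math import log2

N = 16  # 4-bit toy domain; the reasoning is identical for 8-bit inputs

def disclosed_and_bound(X, Y):
    outs = Counter((X(a, b), Y(a, b)) for a in range(N) for b in range(N))
    u = len({X(a, b) for a in range(N) for b in range(N)})
    v = len({Y(a, b) for a in range(N) for b in range(N)})
    total = N * N
    # Deterministic outputs: information disclosed about (a, b) equals the
    # Shannon entropy of the pair (a X b, a Y b).
    disclosed = sum(-(c / total) * log2(c / total) for c in outs.values())
    return disclosed, log2(u * v)

examples = {
    "constant 42":        (lambda a, b: 42,     lambda a, b: 42),
    "equal operators":    (lambda a, b: a ^ b,  lambda a, b: a ^ b),
    "XOR and add mod 16": (lambda a, b: a ^ b,  lambda a, b: (a + b) % N),
}
for name, (X, Y) in examples.items():
    disclosed, bound = disclosed_and_bound(X, Y)
    print(f"{name:20s} discloses {disclosed:5.2f} bits; bound log2(uv) = {bound:5.2f}")
```
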
fgrieu:
I'm wondering: are you considering "the amount of information that $a\ X\ b$ and $a\ Y\ b$ give about $a,b$" (which can reach the full 16 bits, e.g. when $a\ X\ b$ returns $a$ and $a\ Y\ b$ returns $b$), or the amount of information that $a\ X\ b$ and $a\ Y\ b$ give about one of $a$ or $b$, the other being regarded as random and unknown (obviously that can't exceed 8 bits)?
caveman:
@fgrieu The former. Both $a,b$ are supposed to stay as secret as possible. I'm trying to find how far one can go with operators $X,Y$ in terms of minimising information leakage about $a,b$. I see why $\log_2(uv)$ is a loose upper bound (it is the maximum information that $uv$ distinct values can carry).
Score:2

It is possible to leak zero information. Assume uniformly distributed $a$ and $b$ and let $a$ vary along the rows and $b$ along the columns of the operation tables below:

$$ \begin{array}{ccc} \begin{array}{c|cccc} X & 0&1&2&3\\ \hline 0 & 0&1&2&3 \\ 1 & 1&2&3&0 \\ 2 & 2&3&0&1 \\ 3 & 3&0&1&2 \end{array} & \quad & \begin{array}{l|cccc} Y & 0&1&2&3\\ \hline 0 & 3&0&1&2 \\ 1 & 0&1&2&3 \\ 2 & 1&2&3&0 \\ 3 & 2&3&0&1 \end{array} \end{array} $$

Note that for each operation, knowing the output ($aXb$ or $aYb$) gives no information at all about $a$. The same is true of $b$. But if you know one of $a$ or $b$ as well as an output, you then know the other one uniquely.

Furthermore, let us say $aXb=0.$ The possible pairs $(a,b)$ are now in the set $$ S=\{(0,0),(1,3),(2,2),(3,1)\}. $$

Assuming no errors in computing the operation, the only possibility for $aYb$ is $aYb=3$ and this gives no further information about the possible pairs in $S$.

You may say this is a strange example, but it demonstrates that the minimum can be zero for each individual input variable.
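
These properties are easy to verify mechanically. A short sketch (Python; my reading of the tables is $X(a,b)=(a+b)\bmod 4$ and $Y(a,b)=(a+b+3)\bmod 4$, with rows indexed by $a$ and columns by $b$):

```python
# Check the claimed properties of the two 4x4 operation tables.
from collections import Counter

N = 4
X = lambda a, b: (a + b) % N      # first table (rows indexed by a, columns by b)
Y = lambda a, b: (a + b + 3) % N  # second table

pairs = [(a, b) for a in range(N) for b in range(N)]

# 1. Knowing a X b alone gives no information about a, and none about b:
#    for every output value, every a (and every b) remains equally likely.
for x in range(N):
    consistent = [(a, b) for a, b in pairs if X(a, b) == x]
    assert Counter(a for a, _ in consistent) == Counter(range(N))
    assert Counter(b for _, b in consistent) == Counter(range(N))

# 2. Revealing a Y b on top of a X b adds nothing: a Y b is a fixed
#    function of a X b (here simply a X b shifted by 3 modulo 4).
assert all(Y(a, b) == (X(a, b) + 3) % N for a, b in pairs)

# 3. But knowing one input together with an output pins down the other:
#    every row and every column of the X table is a permutation of {0,1,2,3}.
assert all(sorted(X(a, b) for b in range(N)) == list(range(N)) for a in range(N))
assert all(sorted(X(a, b) for a in range(N)) == list(range(N)) for b in range(N))

print("all three claims verified")
```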

One last point, since I don't know your requirements exactly. It is possible to double the bitlength of the output while ensuring that even knowing one of $a$ or $b$ leaks no information about the other. The output $2X3=12$ would correspond to the output bit pattern $0110$, with $01=1$ and $10=2.$ Here is an example:

$$ \begin{array}{c|cccc} X & 0&1&2&3\\ \hline 0 & 00&11&22&33 \\ 1 & 13&02&31&20 \\ 2 & 21&30&03&12 \\ 3 & 32&23&10&01 \end{array} $$ Now let us say you know that $a=1.$ This restricts you to the second row of the operation table, but $b$ is still completely undetermined: you know nothing about the value of $b.$

This example uses two MOLS (Mutually Orthogonal Latin Squares).
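
For completeness, a small sketch (Python) that checks the MOLS property of the table above by splitting each entry into its two digits:

```python
# Verify that the doubled-output table is built from two mutually
# orthogonal Latin squares (MOLS), as stated.
table = [
    ["00", "11", "22", "33"],
    ["13", "02", "31", "20"],
    ["21", "30", "03", "12"],
    ["32", "23", "10", "01"],
]

first  = [[int(cell[0]) for cell in row] for row in table]  # first output digit
second = [[int(cell[1]) for cell in row] for row in table]  # second output digit

def is_latin(sq):
    """Every row and every column contains each of 0..3 exactly once."""
    rows_ok = all(sorted(row) == [0, 1, 2, 3] for row in sq)
    cols_ok = all(sorted(sq[a][b] for a in range(4)) == [0, 1, 2, 3] for b in range(4))
    return rows_ok and cols_ok

assert is_latin(first) and is_latin(second)

# Orthogonality: superimposing the two squares, every ordered pair of digits
# occurs exactly once across the 16 cells.
assert len({(first[a][b], second[a][b]) for a in range(4) for b in range(4)}) == 16

print("both digit squares are Latin and mutually orthogonal")
```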

caveman:
_Astounding-total-amaze-stupendousness_. This is pure glory. Thanks a lot. However, I don't get the double bit-length one: if $aXb = 12$, wouldn't it be a trivial lookup to see that it must be $a=2, b=3$? I.e. both $a,b$ are exposed? Perhaps I am missing something fundamental?
kodlu:
In the information-theoretic model of information leakage, we observe certain variables and ask about the remaining uncertainty in the other variables once the observed values are known. So we observe, e.g., $aXb$ and ask about $a$, or $b$, or both $a,b$. Alternatively, we observe $a$ and $aXb$ and ask about $b$. Also, if we apply the model to a larger set in a cryptographic context, brute force is not feasible.
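
As a concrete instance of that model, here is a sketch (Python; it reuses my reading of the first $4\times 4$ example, $X(a,b)=(a+b)\bmod 4$ and $Y(a,b)=(a+b+3)\bmod 4$, with uniform $a,b$) that computes the remaining uncertainty in each case:

```python
# Remaining uncertainty (conditional entropy) under the model above, using
# the first 4x4 example with a and b uniform over {0, 1, 2, 3}.
from collections import Counter, defaultdict
from math import log2

N = 4
X = lambda a, b: (a + b) % N      # my reading of the first table
Y = lambda a, b: (a + b + 3) % N  # my reading of the second table

def remaining_uncertainty(observed, hidden):
    """H(hidden | observed); both arguments are functions of the pair (a, b)."""
    groups = defaultdict(Counter)
    for a in range(N):
        for b in range(N):
            groups[observed(a, b)][hidden(a, b)] += 1
    total = N * N
    h = 0.0
    for counts in groups.values():
        n = sum(counts.values())
        h += (n / total) * sum(-(c / n) * log2(c / n) for c in counts.values())
    return h

print("observe aXb,         ask about a:     ", remaining_uncertainty(X, lambda a, b: a))
print("observe aXb and aYb, ask about (a,b): ",
      remaining_uncertainty(lambda a, b: (X(a, b), Y(a, b)), lambda a, b: (a, b)))
print("observe a and aXb,   ask about b:     ",
      remaining_uncertainty(lambda a, b: (a, X(a, b)), lambda a, b: b))
```
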
caveman:
I'm just not able to read the last table. $2X3 = 12$, right? But your text says $1X2=12$?
kodlu:
see edit for a clarification