NOTE: This question is based on my assumption that $X$ is a "truly random number" if and only if its length measured in bits equals its entropy measured in bits. In other words, when every bit of $X$ has been generated by an independent fair coin toss.
Suppose I have a truly random number $R$ of size 256 bits (256 bits of entropy), and a truly random number $S$ of length $n * 256$ bits, where $n$ is some natural number, so it has $n * 256$ bits of entropy.
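For concreteness, this is how I picture the setup in Python, using `os.urandom` as a stand-in for a perfect source of random bits (the value of $n$ here is just an example):

```python
import os

n = 4                          # example value; any natural number works
R = os.urandom(256 // 8)       # 32 bytes = 256 bits, assumed "truly random"
S = os.urandom(n * 256 // 8)   # n * 32 bytes = n * 256 bits, also assumed "truly random"
```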
I now derive four keys $T_1$ to $T_4$ from $R$ (a Python sketch of all four follows the list):
- $T_1 = \text{concat}(R, \text{... n times ...}, R)$
- Calculate $t_1 = \text{sha256}(R)$, $t_2 = \text{sha256}(t_1)$, ..., $t_n = \text{sha256}(t_{n-1})$, and set $T_2 = \text{concat}(t_1, ..., t_n)$.
- $T_3$ is calculated the same way as above, but using HMAC instead of $\text{sha256}$.
- $T_4 = \text{hkdf\_expand}(R, \text{null}, n * 256 / 8)$.
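Here is a sketch of how I would compute the four derivations in Python, assuming SHA-256, HMAC-SHA256 keyed with $R$ for $T_3$ (one possible reading of my description), and the `HKDFExpand` class from the `cryptography` package for $T_4$:

```python
import hashlib
import hmac
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.hkdf import HKDFExpand

def t1(R: bytes, n: int) -> bytes:
    # T_1: R concatenated with itself n times
    return R * n

def t2(R: bytes, n: int) -> bytes:
    # T_2: t_1 = sha256(R), t_i = sha256(t_{i-1}), concatenated
    blocks, t = [], R
    for _ in range(n):
        t = hashlib.sha256(t).digest()
        blocks.append(t)
    return b"".join(blocks)

def t3(R: bytes, n: int) -> bytes:
    # T_3: same chain, but HMAC-SHA256 keyed with R (my assumption about the key)
    blocks, t = [], R
    for _ in range(n):
        t = hmac.new(R, t, hashlib.sha256).digest()
        blocks.append(t)
    return b"".join(blocks)

def t4(R: bytes, n: int) -> bytes:
    # T_4: HKDF-Expand with R as the pseudorandom key and empty info/context
    return HKDFExpand(algorithm=hashes.SHA256(),
                      length=n * 256 // 8,
                      info=None).derive(R)
```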
Finally, I calculate $K_i = T_i\text{ xor }S$.
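In code, that step would just be a bytewise XOR of two equal-length byte strings:

```python
def xor_bytes(a: bytes, b: bytes) -> bytes:
    # Bytewise XOR of two equal-length byte strings
    assert len(a) == len(b)
    return bytes(x ^ y for x, y in zip(a, b))

# e.g. K_1 = xor_bytes(t1(R, n), S), using the definitions sketched above
```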
How many bits of entropy do $K_1$, $K_2$, $K_3$ and $K_4$ have?
My best guesses:
- $T_1$ will have at most as much entropy as $R$, since concatenating repeated copies doesn't increase the entropy of the output, though I suspect it doesn't decrease it either.
- $\text{sha256}$ and $\text{HMAC}$ are believed to roughly preserve the entropy of their input, but since $T_2$ and $T_3$ are computed deterministically from $R$, their entropy will be roughly equivalent to that of $T_1$.
- No idea about $T_4$. I guess the benefits of $\text{hkdf\_expand}$ kick in when its input is not a truly random number.
About the $K_i$, I'm not sure. I recently learned that XORing two independent truly random numbers gives a truly random number, so the entropy of the output still equals its length, but since the $T_i$ aren't truly random numbers anymore, I don't know what happens here.
My intuition tells me that the entropy of $S$ will be preserved ($n * 256$ bits), because computing $K_i$ is equivalent to encrypting $T_i$ with $S$ as a one-time-pad key, which is information-theoretically unbreakable, so $K_i$ is still a truly random number.
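In symbols, the one-time-pad argument I have in mind would go like this, assuming $S$ is uniform on $\{0,1\}^{n \cdot 256}$ and independent of $T_i$:

$$\Pr[K_i = k] = \sum_{t} \Pr[T_i = t] \cdot \Pr[S = k \oplus t] = \sum_{t} \Pr[T_i = t] \cdot 2^{-n \cdot 256} = 2^{-n \cdot 256},$$

which would make $K_i$ uniform, i.e. give it $n * 256$ bits of entropy. I'm not sure whether the independence assumption, and therefore this calculation, actually holds in each of the four cases.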