Score:1

Unconditionally Secure Signature Key Generation


I'm reading through a paper called Unconditionally Secure Signatures (https://eprint.iacr.org/2016/739), and to generate keys the authors select $\epsilon$-ASU$_2$ functions such that:

  1. For any $m \in M$ and $t \in T$: $\vert \lbrace f \in F : f(m)=t \rbrace \vert = \vert F \vert / \vert T \vert$
  2. For any $m_1, m_2 \in M$ and $t_1, t_2 \in T$ with $m_1 \neq m_2$: $\vert \lbrace f \in F : f(m_1) = t_1 \land f(m_2) = t_2 \rbrace \vert \leq \epsilon \frac{\vert F \vert}{\vert T \vert}$

The paper also later mentions that $\epsilon = \frac{2}{\vert T \vert}$
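As a sanity check on the two conditions, they can be verified by brute force on a toy family: all affine maps $f(x) = a_1 x + a_0 \bmod p$ over a small prime field, with $M = T = \mathbb{Z}_p$. This is an illustrative example, not the paper's construction; this particular family even satisfies the stronger bound $\epsilon = 1/\vert T \vert$.

```python
from itertools import product

# Toy check of the epsilon-ASU_2 conditions for the family of all affine
# maps f(x) = (a1*x + a0) mod p, with M = T = Z_p. Illustrative only.
p = 5
M = T = range(p)
F = list(product(range(p), range(p)))  # (a1, a0) pairs, |F| = p^2

def f(key, m):
    a1, a0 = key
    return (a1 * m + a0) % p

# Condition 1: for every (m, t), exactly |F|/|T| functions map m to t.
for m in M:
    for t in T:
        count = sum(1 for k in F if f(k, m) == t)
        assert count == len(F) // p  # 25 / 5 = 5

# Condition 2: for m1 != m2, at most eps * |F| / |T| functions hit (t1, t2).
# This family achieves eps = 1/|T| (strongly universal), which is even
# stronger than the paper's eps = 2/|T|.
for m1, m2 in product(M, M):
    if m1 == m2:
        continue
    for t1, t2 in product(T, T):
        count = sum(1 for k in F if f(k, m1) == t1 and f(k, m2) == t2)
        assert count <= len(F) / p**2  # = 1

print("both ASU_2 conditions hold for the toy affine family")
```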

It seems like it's asking for a 2-independent hash function, so one would need to send 2 uniformly random independent coefficients and a modulus for a Carter-Wegman polynomial. However, unless I'm misunderstanding, the domain and range of every hash function are supposed to be $M$ and $T$ respectively, and the amount of data that Proposition 1 says is needed to specify one such function (I've listed their equations below) seems too high to be just the 2 coefficients, so I'm obviously missing something.

$a := \log_2(\vert M \vert)$

$b := \log_2(\vert T \vert)$

$a = (b + s)(1 + 2^s)$

$y = 3b + 2s$

So if $|T|=|M|=2^{64}$, we'd need just over 184 bits for each function. However, that's nearly a full third coefficient's worth of bits more than I was expecting if the coefficients are drawn from a set of size $2^{64}$, so I'm missing something about how the functions are selected in the first place. Any hints? I'm assuming it has to do with the choice of $\epsilon$, but I'm confused about how it relates.
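For reference, the 2-independent Carter-Wegman hash I had in mind can be sketched as follows (the modulus and message values are illustrative, not anything fixed by the paper):

```python
import secrets

# Sketch of a 2-independent Carter-Wegman hash:
# h(m) = (a1*m + a0) mod p, with a1, a0 uniform over Z_p.
p = (1 << 61) - 1  # a Mersenne prime, a common illustrative modulus choice

def keygen():
    """Draw the two uniformly random coefficients."""
    return secrets.randbelow(p), secrets.randbelow(p)

def h(key, m):
    a1, a0 = key
    return (a1 * m + a0) % p

key = keygen()
print(h(key, 42))  # some value in [0, p)
```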

Lev Knoblock
Revisiting this a day later: I mixed up ASU functions with independent ones, but I'm still largely confused about the details.
Score:0

Coming back to this a few days later, it's obvious to me that this question isn't exactly coherent, given that I had some fundamental misunderstandings when writing it.

Given some output range $T$, we choose a prime $p$ such that $p > |T|$ and $(p - |T|) / p \leq \epsilon$, pick 2 coefficients from $GF(p)$ for the Carter-Wegman polynomial, and take the results mod $|T|$. So we still need to send the modulus along with the coefficients. However, since it's the modulus (not the coefficients) that is strictly greater than $|T|$ for the family to be $\epsilon$-ASU$_2$ rather than 2-independent, we can difference-code it against $|T|$ (or do something similar) and avoid representing a full third term.
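A minimal sketch of that construction, with illustrative sizes (a 16-bit range rather than the $2^{64}$ from the question, so the naive prime search stays fast):

```python
import secrets

# Sketch of the construction described above: pick a prime p just above |T|
# so that (p - |T|)/p <= eps, then hash with
#   h(m) = ((a1*m + a0) mod p) mod |T|.
# Sizes here are illustrative, not the paper's parameters.
T = 1 << 16        # |T| = 2^16
eps = 2 / T

def is_prime(n):
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

# Smallest prime p > |T| satisfying the closeness condition.
p = T + 1
while not (is_prime(p) and (p - T) / p <= eps):
    p += 1
# For T = 2^16 this finds p = 65537, so difference-coding the modulus
# against |T| costs almost nothing on top of the two coefficients.

a1, a0 = secrets.randbelow(p), secrets.randbelow(p)

def h(m):
    return ((a1 * m + a0) % p) % T

print(p, h(123456789) < T)
```

Note the message is implicitly reduced into $\mathbb{Z}_p$ by the mod-$p$ arithmetic; handling the full $|M| = 2^{64}$ domain is a separate concern.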

I'm probably still missing something, but that at least explains why the representation sizes weren't matching the numbers I was expecting.
