Score:2

How small is the negligible advantage for DDH?


The well-known Decisional Diffie-Hellman (DDH) assumption asserts that, for a cyclic group of prime order $q$ with generator $g$ and $n = \log q$, if $A, B, C \sim U(\mathbb{Z}_q)$ are i.i.d. uniform, then the tuples $(g^A, g^B, g^C)$ and $(g^A, g^B, g^{AB})$ are indistinguishable to any PPT $M$, up to negligible advantage: $$\epsilon = \left| \Pr[M(g^A, g^B, g^C) = 1] - \Pr[M(g^A, g^B, g^{AB}) = 1] \right| \leq n^{-\omega(1)}.$$ I'm interested, however, in how small this advantage can actually be taken. Is there any distinguisher achieving the extremely small $\epsilon = 2^{-O(n)}$? Is it known how "negligible" $\epsilon$ really is?
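To make the advantage $\epsilon$ concrete, here is a rough sketch (purely illustrative, on a toy group far too small for real security) of how one could estimate it empirically for a candidate distinguisher; the parameters $q = 1013$, $p = 2027$, $g = 4$ and the helper names are my own choices, not standard ones:

```python
import random

# Toy parameters: p = 2q + 1 is a (tiny) safe prime and g = 4 generates the
# subgroup of squares mod p, which has prime order q.
q, p, g = 1013, 2027, 4

def sample(real: bool):
    """Return (g^A, g^B, g^{AB}) if real, else (g^A, g^B, g^C) with C uniform."""
    a, b, c = (random.randrange(q) for _ in range(3))
    return pow(g, a, p), pow(g, b, p), pow(g, a * b if real else c, p)

def advantage(M, trials=20_000):
    """Monte Carlo estimate of |Pr[M(real tuple) = 1] - Pr[M(random tuple) = 1]|."""
    real = sum(M(*sample(True)) for _ in range(trials)) / trials
    rand = sum(M(*sample(False)) for _ in range(trials)) / trials
    return abs(real - rand)

# A distinguisher that ignores its input gains essentially no advantage.
print(advantage(lambda x, y, z: random.randrange(2)))
```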

Mark:
I think $\epsilon$ should be a function of $q$, not $n$, and therefore you need to specify a connection between $q$ and $n$ for this question to have an answer.
Napoleon:
Thanks, I meant that $n = \log q$
Score:1

"Negligible" has a precise meaning in cryptography. It is defined in terms of asymptotic growth (or rather, decay), typically with respect to the security parameter.

A function $\mu$ is negligible if it decays faster than the inverse of any polynomial. Specifically, for every polynomial $\mathsf{poly}$ there exists a constant $N$ such that for all $x \geq N$ we have: $$|\mu(x)| < \frac{1}{\mathsf{poly}(x)}.$$

An example of a negligible function is $\mu(x) = 2^{-x}$. Because the decay is exponential, for any polynomial we can always find an $N$ beyond which the inequality above holds. For example, using the polynomial $x^3$, the inequality fails at $x = 2$ (since $1/4 > 1/8$), at $x = 3$ (since $1/8 > 1/27$), and so on up to $x = 9$. But for all $x \geq 10$ it holds (e.g. $2^{-10} < 1/10^3$), so in this specific example we'd set $N = 10$.
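As a quick numeric sanity check of that crossover (just an illustration, not part of the argument; the helper name `threshold` and the search limit are arbitrary choices of mine):

```python
def threshold(poly, limit=10_000):
    """Smallest N such that 2**-x < 1/poly(x) holds for every x in [N, limit]."""
    N = None
    for x in range(1, limit + 1):
        if 2.0 ** -x < 1.0 / poly(x):
            if N is None:
                N = x          # the inequality started holding here
        else:
            N = None           # it failed again, so discard the candidate
    return N

print(threshold(lambda x: x ** 3))   # prints 10, matching the example above
```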

In the specific example of DDH, suppose $M$ spends a polynomial amount of time computing random DDH triples $(g^a, g^b, g^{ab})$ itself. Then there is some tiny probability that the DDH challenge it is given matches one of the tuples it computed, so it would win slightly more than $1/2$ the time (a uniformly random guess already wins half the time). However, this advantage is negligible in the technical sense: since $M$ is PPT it can only compute polynomially many tuples, while the number of possible tuples grows exponentially with the security parameter. Therefore the advantage looks something like $\textsf{poly}(\kappa)/2^{\kappa}$, which is negligible in the formal sense above.
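To make that counting argument concrete, here is a toy version of such a brute-force distinguisher (my own construction, reusing the same illustrative toy group as in the question). With $k$ precomputed tuples its advantage is roughly $k/q^2$: polynomially many guesses against exponentially many possibilities.

```python
import random

q, p, g = 1013, 2027, 4      # same toy prime-order subgroup as in the question

def sample(real: bool):
    a, b, c = (random.randrange(q) for _ in range(3))
    return pow(g, a, p), pow(g, b, p), pow(g, a * b if real else c, p)

def brute_force_M(k):
    """Distinguisher that precomputes k random DDH tuples and hopes for a collision."""
    table = {}
    for _ in range(k):
        a, b = random.randrange(q), random.randrange(q)
        table[(pow(g, a, p), pow(g, b, p))] = pow(g, a * b, p)
    def M(x, y, z):
        if (x, y) in table:              # lucky collision: decide with certainty
            return 1 if z == table[(x, y)] else 0
        return random.randrange(2)       # otherwise, a fair coin flip
    return M

M, trials = brute_force_M(200_000), 20_000
real = sum(M(*sample(True)) for _ in range(trials)) / trials
rand = sum(M(*sample(False)) for _ in range(trials)) / trials
print(abs(real - rand))   # roughly k/q^2 (about 0.18 with these toy numbers)
```

With a cryptographically sized group the same construction is stuck at $\textsf{poly}(\kappa)/2^{\kappa}$, which is exactly the negligible bound above.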

Napoleon:
I agree that the specific machine $M$ you mention, which simply guesses tuples, has advantage $1/2^{O(n)}$. Nevertheless, there might be other PPTs that do something different and are able to gain a better advantage, say $1/n^{\log n}$, which is still negligible. My question is really about how negligible the advantage can be: $n^{-\log \log n}$, $n^{-\log n}$, $2^{-O(n)}$?
meshcollider:
Theoretically, any of those could exist; the security assumption says nothing about it. We just draw the line at "negligible" and leave it there.
Napoleon:
I agree that all of those errors are theoretically possible, but practically -- are you aware, for example, of a distinguisher with advantage $n^{-\log \log n}$?
meshcollider:
No, I'm not aware of one off the top of my head. It sounds like it would be a weird algorithm.