Score:24

What does the work "An Efficient Quantum Algorithm for Lattice Problems Achieving Subexponential Approximation Factor" mean?


In An Efficient Quantum Algorithm for Lattice Problems Achieving Subexponential Approximation Factor, the authors claim to give a polynomial-time quantum algorithm for solving the Bounded Distance Decoding (BDD) problem with a subexponential approximation factor on a class of integer lattices. What does this result mean? Will it imply the insecurity of lattice cryptography? Is it as important as Peter Shor's quantum algorithm for factoring?

fgrieu:
Before this work implies the insecurity of lattice cryptography, we'll need [Cryptographically Relevant Quantum Computers](https://media.defense.gov/2021/Aug/04/2002821837/-1/-1/1/Quantum_FAQs_20210804.PDF). [Late addition: this states the obvious: the question's "Will (this result) _imply_ the insecurity of lattice cryptography?" could only be answered yes given a CRQC.]
Yehuda Lindell:
@fgrieu One of the major selling points of lattice cryptography is quantum resistance. So the question is whether this new result invalidates that claim. Clearly there is no threat until a quantum computer at that stage is built, if such a machine is ever built. However, the question remains as to whether lattice-based cryptography should continue to be a candidate for a post-quantum world.
kelalaka:
@YehudaLindell Isn't it a threat if such a machine is possible? An eavesdropper can store the communication now and break it all later. This is why we use AES-256 and not AES-128.
Mark:
@kelalaka It really depends on the application. Some (like authentication) are only mildly impacted in the near term, with minor exceptions (perhaps OS updates or other *highly* security-critical things).
yyyyyyy:
There is now a [note](https://github.com/lducas/BDD-note/blob/main/note.pdf) by Léo Ducas & Wessel van Woerden claiming that classical LLL suffices to get pretty much the same result.
Yehuda Lindell:
@kelalaka I agree that someone who says they need privacy for 20 years may worry now. Personally, I doubt it will happen within 20 years, but I can see the concern. In any case, in the meantime I strongly recommend using double encryption with both RSA and lattices, or ECC and lattices.
Score:30

There is no public paper available yet, so this answer is preliminary and based on what was presented in the talk and the follow-up discussion. A full understanding cannot be reached until there is a paper to verify, evaluate, and compare to prior work and known results. However, a good understanding of the situation already seems to be emerging.

The tl;dr is: the special problem the authors address is classically easy to solve using standard lattice algorithms (no quantum needed), as shown in this note. Moreover, the core new quantum step can instead be implemented classically (and much more simply and effectively) as well. So, the work doesn’t show any quantum advantage versus what we already knew how to do classically, nor anything new about what we can do classically. Details follow.

The clause “on a class of integer lattices” is a very important qualifier. The BDD problem the authors address is one where the lattice is “$q$-ary” and generated by a single $n$-dimensional mod-$q$ vector (or a small number of them), the modulus $q \gg 2^{\sqrt{n}}$, and the target vector is within a $\ll 2^{-\sqrt{n}}$ factor of the minimum distance of the lattice. This setting is far from anything that has ever been used in lattice cryptography (to my knowledge), so the result would not have any direct effect on proposed lattice systems. Of course the broader question is whether the techniques could lead to stronger results that do affect lattice crypto.
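
For concreteness, in my notation (reconstructed from the talk's description, since there is no paper yet to check against), the lattice is generated by a single $\mathbf{a} \in \mathbb{Z}_q^n$ as

$$\mathcal{L}_q(\mathbf{a}) \;=\; \{\, \mathbf{z} \in \mathbb{Z}^n \;:\; \mathbf{z} \equiv c\,\mathbf{a} \pmod q \ \text{for some } c \in \mathbb{Z} \,\} \;=\; \mathbf{a}\,\mathbb{Z} + q\,\mathbb{Z}^n ,$$

and a BDD instance is a target $\mathbf{t} = \mathbf{v} + \mathbf{e}$ with $\mathbf{v} \in \mathcal{L}_q(\mathbf{a})$ and $\|\mathbf{e}\| \leq \alpha \cdot \lambda_1(\mathcal{L}_q(\mathbf{a}))$ for relative distance $\alpha \ll 2^{-\sqrt{n}}$, where $q \gg 2^{\sqrt{n}}$; the goal is to recover $\mathbf{v}$.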

Based on the description given in the talk, several expert attendees believe it’s very likely that the special lattice problem the authors address is already easily solvable using known classical techniques (no quantum needed). UPDATE: this has turned out to be the case, and is substantiated in this note. In other words, the particular form of the BDD problem makes it easy to solve in known and unsurprising ways. The algorithm is merely the standard sequence of LLL basis reduction followed by Babai nearest-plane decoding, but showing that this actually works relies on some deeper (but previously known) properties of LLL than the ones that are usually invoked.
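
To make the shape of that pipeline concrete, here is a minimal sketch of Babai nearest-plane decoding in plain NumPy. The LLL step is assumed to be done beforehand by an off-the-shelf implementation (e.g., fpylll's `LLL.reduction` or Sage's `.LLL()`); this only illustrates the mechanics and does not reproduce the analysis in the note.

```python
import numpy as np

def gram_schmidt(B):
    """Gram-Schmidt orthogonalization (without normalization) of the rows of B."""
    B = np.array(B, dtype=float)
    Bstar = np.zeros_like(B)
    for i in range(B.shape[0]):
        Bstar[i] = B[i]
        for j in range(i):
            Bstar[i] -= (B[i] @ Bstar[j]) / (Bstar[j] @ Bstar[j]) * Bstar[j]
    return Bstar

def babai_nearest_plane(B, t):
    """Babai nearest-plane decoding of target t against the rows of B.
    B should already be LLL-reduced for the usual decoding guarantees to apply."""
    B = np.array(B, dtype=float)
    Bstar = gram_schmidt(B)
    residual = np.array(t, dtype=float)
    v = np.zeros_like(residual)
    for j in reversed(range(B.shape[0])):
        c = np.rint((residual @ Bstar[j]) / (Bstar[j] @ Bstar[j]))
        residual -= c * B[j]
        v += c * B[j]
    return v  # a lattice vector close to t; equals the BDD solution within the decoding radius
```

For the single-vector $q$-ary lattice above, one would run this on an LLL-reduced basis of $\mathbf{a}\mathbb{Z} + q\mathbb{Z}^n$, obtained for instance from the generating set $\{\mathbf{a}, q\mathbf{e}_1, \dots, q\mathbf{e}_n\}$ after turning it into a basis.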

What about the broader question: could the main techniques lead to stronger results that we can't currently obtain classically? It turns out that what the core quantum step accomplishes, the "worst-case to average-case" transformation, can be done classically (and more simply and efficiently) using a well-known randomization technique, namely the "LWE self-reduction" or "($q$-ary) BDD to LWE reduction." See Section 5 and Theorem 5.3 of this paper and the earlier works cited therein for details.

More precisely, $n$-dimensional $q$-ary BDD for relative distance $\alpha$ (the problem considered by the authors) classically reduces to LWE with error rate $\alpha \cdot O(\sqrt{n})$. While this reduction is not needed to solve the original BDD problem in question, it shows that the core new quantum step can be replaced by something classical that performs at least as well (and likely even better in terms of parameters). This indicates that the main quantum technique probably does not hold any novel or surprising power in this context.
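
To give the flavor in the single-vector case (this is a rough sketch of the kind of randomization meant, not the precise statement or proof of the cited theorem): the target can be written coordinate-wise as $t_i = c\,a_i + e_i \bmod q$, which are already "LWE-like" samples with a one-dimensional secret $c$ (a worst-case hidden number problem). To rerandomize them into properly distributed samples, one can pick $\mathbf{r}$ from a suitable discrete Gaussian over $\mathbb{Z}^n$ and output

$$\big(\langle \mathbf{r}, \mathbf{a} \rangle,\ \langle \mathbf{r}, \mathbf{t} \rangle \big) \;=\; \big(a',\ c \cdot a' + \langle \mathbf{r}, \mathbf{e} \rangle\big) \bmod q ;$$

for an appropriate Gaussian width the first component is nearly uniform modulo $q$, and the new error $\langle \mathbf{r}, \mathbf{e} \rangle$ is only an $O(\sqrt{n})$ factor larger than $\|\mathbf{e}\|$, which is where the $\alpha \cdot O(\sqrt{n})$ error rate comes from.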

Sam Jaques:
The use of approximate eigenvectors, whose eigenvalue encodes the "secret", was new to me. Is that a well-known technique in quantum lattice cryptanalysis, or is it possible it can find a more powerful application than an $n\rightarrow\sqrt{n}$ reduction?
Chris Peikert:
I didn’t really follow what they were getting at with that. But “approximate eigenvectors” (in a different, non-quantum sense) are a common tool in the latest generation of LWE-based FHE schemes, a la GSW.
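Roughly, in that (GSW) setting a ciphertext encrypting $\mu$ is a matrix $\mathbf{C}$ for which the secret key $\mathbf{s}$ is an approximate eigenvector with eigenvalue $\mu$:

$$\mathbf{C}\,\mathbf{s} \;=\; \mu\,\mathbf{s} + \mathbf{e} \quad \text{for some small noise } \mathbf{e},$$

so sums and products of ciphertexts act (approximately) on the eigenvalues, with noise growth kept under control via bit decomposition; the details vary across variants of the scheme.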
Interesting... I suppose my only concern then is that this might inspire someone else to discover something more genuinely novel. The general concern with current PQ candidates is that, before QCs even exist, we certainly have not explored anything close to the entire space of possible quantum algorithms.
Sam Jaques:
In your second-last paragraph, I don't see the chain of reductions. Theorem 5.3 of the paper you linked doesn't seem to improve the approximation factor or the dimension, which is what the quantum algorithm does. Could you explain how it works? Once we reduce to LWE, can we reduce back to $q$-ary BDD with relative distance less than $\alpha$?
Score:9

I created a website to crowdsource what is known about algorithms for lattice problems in NP intersect CoNP:

https://latticealgorithms.xyz

Our paper is up:

https://arxiv.org/abs/2201.13450

For the record, here was the timeline we followed:

Until 8/17/21, we went through the classical literature pretty thoroughly. A classical algorithm would also have been fine, so that we could feed it back into quantum algorithms. But since the base case is a worst-case variant of the well-studied hidden number problem, it seemed reasonable that nothing was known.

8/17/21-9/21/21: We asked a sequence of 5 experts what was known about the problem. In one case we indicated the best classical approach we could find. We received no responses with new information.

9/21/21: The decision was made to go with the special base case with one vector in the talk because (1) it would aid people in solving it, and (2) it's a colloquium talk in a quantum seminar and therefore needs to be accessible to a broad audience. A talk with only lattices or only quantum is already a challenge, and to combine both, well, try it!

9/21/21: Lively discussion about possibilities during the panel, but no algorithms.

9/22/21: We were contacted with a new classical algorithm for the special case and, after we made it clear what our algorithms can do, a revision outlining how to get a more general case by analyzing LLL.

1/31/22: No classical answer for our Schnorr bound has been received yet.

Sean

Score:8

One could give a much longer answer to this question (and I would be quite interested in seeing someone like Chris's perspective), but the following two points probably suffice for a non-specialist.

Approximation Factors: The main way this attack should be seen as (potentially) threatening to lattice-based cryptography is via the possibility of future improvements. The approximation factor it targets (which I believe is $2^{n^{1/2}}$) is large enough that, even if LWE were classically weak in this range, one could still construct things like:

  • PKE
  • Digital Signatures
  • FHE

I.e., most of what you might care about would still exist. So the current attack itself does not really impact the main body of "practical" lattice-based cryptography, although future improvements to the attack are of course possible.

The Possibility of Dequantization: The attack has two parts:

  1. A (quantum) dimensionality-reduction step (from $n\to \sqrt{n}$), and
  2. standard classical techniques to solve the resulting $\sqrt{n}$-dimensional instance.

A number of people suggested the possibility of dequantizing the first step, via things like taking random linear combinations (say with Gaussian coefficients) of the basis vectors, as in the sketch below. If this dimension reduction works, one gets a classical polynomial-time algorithm for BDD with approximation factor $2^{n^{1/2}}$ (I believe).
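
Just to make that suggestion concrete, here is the literal operation being proposed (a sketch only; whether this actually yields a usable $\sqrt{n}$-dimensional BDD instance is exactly the unverified part):

```python
import numpy as np

def random_gaussian_combinations(B, k, sigma=3.0, seed=0):
    """Form k random integer combinations of the rows of the basis B, with
    coefficients drawn from a rounded Gaussian of width sigma.  This is just
    the literal operation suggested in the discussion, not a verified
    dimension-reduction step for BDD."""
    rng = np.random.default_rng(seed)
    n = B.shape[0]
    R = np.rint(rng.normal(0.0, sigma, size=(k, n))).astype(int)
    return R @ B  # k vectors spanning a (generically) rank-k sublattice of L(B)

# e.g. drop from dimension n to about sqrt(n):
# B_small = random_gaussian_combinations(B, k=int(np.ceil(np.sqrt(B.shape[0]))))
```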

While this would make the algorithm itself stronger, it would also make it less concerning in a certain sense. This is because we (classically) have a fairly decent idea of how hard lattice problems are. By this, I am in particular thinking of things like:

  1. The various "fine-grained" hardness results that exist (say under the Exponential Time Hypothesis, or variants of it),
  2. the recent lattice sieving lower bounds, and
  3. hardness results in the presence of pre-processing (for example CVPP hardness results).

Of course these do not rule out all possible attacks, but they should be mentioned as a growing body of formal evidence for the classical hardness of lattice problems. This is to say that the main concerning thing about the linked talk is the existence of a non-trivial quantum speedup --- if this is dequantized, we are back in the setting of classical computing, where our understanding of the hardness of lattice problems is better.


