Score:0

NTL: Solve the closest vector problem for a non-square matrix using LLL/Babai's nearest plane algorithm

cn flag

Assume I have a matrix $A \in \mathbb{Z}^{m \times n}$, $m > n$, whose columns form a basis of a lattice. Given a target vector $t = Ax + e$, with $t, e \in \mathbb{Z}^m$ and $x \in \mathbb{Z}^n$, I want to find the (approximate) closest vector in the lattice $\mathcal{L}(A)$ to $t$.

I wanted to use Babai's nearest plane algorithm, in particular the NTL implementation NTL::NearVector, to solve this problem (approximately) using LLL. However, it seems to me that in the literature, and certainly in the software package, Babai's nearest plane algorithm requires a full-rank lattice?

What other techniques/embeddings can I use to solve the closest vector problem on a lattice whose ambient dimension is higher than its rank? Could I just extend the matrix with zero columns?

Mark
ng flag
In the $\ell_2$ norm, it should suffice to project your target vector onto the (real) span of your lattice (which is a rank-$n$ subspace), then orthogonally rotate this subspace to be isomorphic to $\mathbb{R}^n \times \{0\}^{m-n}$. I don't know how to do this in NTL though, so I will only leave this as a comment.
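A minimal NumPy sketch of that reduction, assuming the columns of $A$ are the basis vectors: a QR decomposition $A = QR$ performs the projection and the rotation in one step, leaving a full-rank $n \times n$ instance $(R, Q^\top t)$ for any standard CVP routine. The function names are hypothetical, and since $R$ has real entries, you would have to scale and round it before handing it to an integer-only routine such as NTL::NearVector.

```python
import numpy as np

def reduce_to_full_rank(A, t):
    """Project t onto span(A) and rotate span(A) onto R^n x {0}^(m-n).

    A: (m, n) basis matrix (columns are basis vectors), m > n.
    t: target vector in Z^m.
    Returns an n x n full-rank basis R and the rotated, projected
    target Q^T t, turning the original CVP instance into a full-rank
    one. The component of t orthogonal to span(A) is common to all
    lattice points, so dropping it does not change the minimizer.
    """
    Q, R = np.linalg.qr(np.asarray(A, dtype=float))  # A = Q @ R, Q orthonormal
    t_proj = Q.T @ np.asarray(t, dtype=float)        # target in rotated coordinates
    return R, t_proj

def lift_solution(A, R, y):
    """Map a lattice vector y = R @ x, found in the rotated coordinates,
    back to the original lattice: recover the integer coefficients x
    and return A @ x."""
    x = np.rint(np.linalg.solve(R, y)).astype(int)
    return np.asarray(A) @ x
```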
Score:1
sz flag

No, Babai's nearest plane algorithm doesn't necessarily need a full-rank lattice; have a look at this paper here.
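To illustrate the point, here is a minimal sketch (not from the paper) of the nearest plane algorithm written directly for an $m \times n$ basis: the Gram-Schmidt orthogonalization it relies on works in any ambient dimension, so nothing forces $m = n$. For the usual approximation guarantee the basis should be LLL-reduced first (e.g. with NTL or fpylll), a step omitted here.

```python
import numpy as np

def nearest_plane(A, t):
    """Babai's nearest plane on the columns of A (shape m x n, m >= n).

    Assumes the columns are linearly independent; m == n is not
    required. For the usual approximation guarantee, LLL-reduce A
    first (omitted here).
    """
    A = np.asarray(A, dtype=float)
    t = np.asarray(t, dtype=float)
    m, n = A.shape

    # Gram-Schmidt orthogonalization of the basis columns.
    Bstar = np.zeros((m, n))
    for j in range(n):
        v = A[:, j].copy()
        for i in range(j):
            mu = (A[:, j] @ Bstar[:, i]) / (Bstar[:, i] @ Bstar[:, i])
            v -= mu * Bstar[:, i]
        Bstar[:, j] = v

    # Peel off one hyperplane at a time, from the last basis vector down.
    b = t.copy()
    for j in range(n - 1, -1, -1):
        c = np.rint((b @ Bstar[:, j]) / (Bstar[:, j] @ Bstar[:, j]))
        b -= c * A[:, j]

    # t - b is the (approximate) closest lattice vector; for an integer
    # basis it is integral up to floating-point error.
    return np.rint(t - b).astype(int)

if __name__ == "__main__":
    A = np.array([[2, 0], [1, 3], [0, 1]])   # rank-2 lattice in Z^3
    x = np.array([4, -1])
    e = np.array([1, 0, -1])                 # small error
    t = A @ x + e
    print(nearest_plane(A, t))               # recovers A @ x = [8, 1, -1] here
```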
