Score:2

LLL on a Knapsack-esque problem

gm flag

Given integers $s_1, \dots , s_n$ and a target integer $t$, I'm trying to find small integer coefficients $x_1, \dots , x_n$ such that: $$ t \approx x_1 s_1 + \dots +x_ns_n $$ Taking inspiration from the Knapsack problem, I was trying to use LLL on the matrix: $$ B = \begin{pmatrix} 1 & & & & \\ & 1 & & & \\ & & \ddots & & \\ & & & 1 & \\ s_1 & s_2 & \cdots & s_n & -t \end{pmatrix} $$
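For concreteness, the basis above can be assembled like this (a plain-Python sketch; the lattice is generated by the columns of the returned matrix, and the helper name `knapsack_basis` is chosen here purely for illustration):

```python
def knapsack_basis(s, t):
    """Columns of the (n+1) x (n+1) basis B: identity block on top,
    (s_1, ..., s_n, -t) in the last row."""
    n = len(s)
    cols = []
    for j in range(n):
        col = [0] * (n + 1)
        col[j] = 1       # unit vector tracking the coefficient x_j
        col[n] = s[j]    # s_j contributes to the bottom row
        cols.append(col)
    cols.append([0] * n + [-t])  # final column carries the target -t
    return cols

# Toy instance s = (3, 5), t = 4:
B = knapsack_basis([3, 5], 4)
# columns: (1, 0, 3), (0, 1, 5), (0, 0, -4)
```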

A problem I foresee is that lattice points generated by this basis are of the form: $$ \begin{pmatrix} x_1 \\ \vdots \\ x_n \\ x_1 s_1 + \dots +x_ns_n - x_{n+1} t \end{pmatrix} $$ which means LLL could just return small coefficients for $$ k \, t = x_1 s_1 + \dots +x_ns_n $$ where $k$ is not $1$.
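To make the concern concrete, here is a toy instance with $s=(3,5)$ and $t=4$ (numbers chosen purely for illustration): the spurious combination $3+5=2t$ yields a strictly shorter lattice vector than the genuine representation $3\cdot 3-5=t$, so a reduction algorithm would prefer the $k=2$ vector.

```python
def lattice_vector(s, t, xs, k):
    """Integer combination of the basis columns:
    (x_1, ..., x_n, sum(x_i * s_i) - k*t)."""
    last = sum(x * si for x, si in zip(xs, s)) - k * t
    return list(xs) + [last]

def sq_norm(v):
    return sum(c * c for c in v)

s, t = [3, 5], 4
genuine = lattice_vector(s, t, [3, -1], 1)   # 3*3 - 5 = 4 = t   -> (3, -1, 0)
spurious = lattice_vector(s, t, [1, 1], 2)   # 3 + 5 = 8 = 2t    -> (1, 1, 0)
# sq_norm(spurious) == 2 < sq_norm(genuine) == 10
```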

How do I modify my setup to prevent this issue? Also, wouldn't this issue also affect Knapsack instances (i.e. the weights add up to some multiple of the target), and if so, how is this not a problem when using LLL on Knapsack?

Score:4
ru flag

If we have a target upper bound $X$ for the $|x_i|$, then we expect the contribution of the $x_i$ to the square of the $\mathcal L_2$ norm of the short vector to be less than $nX^2$. If we therefore introduce a scaling factor $S\approx\sqrt n X$ and instead work with the lattice generated by the columns of $$B'=\begin{pmatrix} 1 & & & & \\ & 1 & & & \\ & & \ddots & & \\ & & & 1 & \\ Ss_1 & Ss_2 & \cdots & Ss_n & -St \end{pmatrix},$$ we will generate vectors $(x_1,\ldots,x_n,x_{n+1})^T$ where $$Sx_{n+1}t=x_1s_1+\cdots+x_ns_n.$$ Now the solution that we want with $|x_{n+1}|=1$ will have a squared $\mathcal L_2$ norm less than $S^2+nX^2\approx 2nX^2$ whereas any other solution will have $|x_{n+1}|>1$ and so squared $\mathcal L_2$ norm at least $4S^2\approx 4nX^2$. Thus the solution that we want is always shorter than any spurious solution, and our SVP algorithm should be able to detect it.

One can view this as a use of Kannan's embedding to solve a closest vector problem, and one can go to greater lengths in optimising $S$.
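For reference, the bound of $4S^2$ for $|x_{n+1}|>1$ becomes explicit if one more coordinate recording $x_{n+1}$ itself is appended, as in the usual presentation of Kannan's embedding (a sketch; this extra row is an assumption not spelled out above): $$B''=\begin{pmatrix} 1 & & & & \\ & 1 & & & \\ & & \ddots & & \\ & & & 1 & \\ Ss_1 & Ss_2 & \cdots & Ss_n & -St \\ 0 & 0 & \cdots & 0 & S \end{pmatrix},$$ whose lattice points are $(x_1,\ldots,x_n,S(x_1s_1+\cdots+x_ns_n-x_{n+1}t),Sx_{n+1})^T$. Any vector with $|x_{n+1}|\ge 2$ then has squared $\mathcal L_2$ norm at least $4S^2$, even when the residue $x_1s_1+\cdots+x_ns_n-x_{n+1}t$ vanishes.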

user348382
gm flag
Can you elaborate on how you got $4S^2$ when $|x_{n+1}|>1$? From the basis $B'$ you gave, lattice points would be of the form $(x_1, \dots , x_n, S(x_1s_1 + \cdots + x_ns_n - x_{n+1}t))^T$, but wouldn't the norm be $S^2 + nX^2$ or less as long as $x_1s_1 + \cdots + x_ns_n - x_{n+1}t \approx 0$?