Score:2

The second moment and fourth moment of $\mathcal{P}(V)$?

br flag
zbo

Background: I am reading the paper "Learning a Parallelepiped: Cryptanalysis of GGH and NTRU Signatures" (here is the link), and I got stuck on the computation of the moments.

Question statement: In Section 4.3 of the paper, the following is defined: for any $V=[\mathbf{v}_1,\cdots,\mathbf{v}_n] \in GL_n(\mathbb{R})$ and any integer $k \ge 1$, the $k$-th moment of $\mathcal{P}(V)$ over a vector $\mathbf{w} \in \mathbb{R}^n$ is $$\text{mom}_{V,k}(\mathbf{w}) = \mathbb{E}[\langle\mathbf{u},\mathbf{w}\rangle ^k]$$ where $\mathbf{u}$ is uniformly distributed over the parallelepiped $\mathcal{P}(V)$.
The authors then say that a straightforward calculation shows that for any $\mathbf{w} \in \mathbb{R}^n$, the second and fourth moments are $$\text{mom}_{V,2}(\mathbf{w}) = \frac{1}{3}\sum_{i=1}^{n} \langle \mathbf{v}_i,\mathbf{w}\rangle^2 =\frac{1}{3}\mathbf{w}V^tV\mathbf{w}^t$$

$$\text{mom}_{V,4}(\mathbf{w}) = \frac{1}{5}\sum_{i=1}^{n} \langle \mathbf{v}_i,\mathbf{w}\rangle^4 + \frac{1}{3}\sum_{i \neq j} \langle \mathbf{v}_i,\mathbf{w}\rangle^2 \langle \mathbf{v}_j,\mathbf{w}\rangle^2$$

But it does not seem so straightforward to me... I have trouble understanding how these second and fourth moments are calculated.

My effort: For the second moment, by definition we have $$\text{mom}_{V,2}(\mathbf{w}) = \mathbb{E}[\langle\mathbf{u},\mathbf{w}\rangle ^2] = \int \langle\mathbf{u},\mathbf{w}\rangle ^2 f(\mathbf{u}) \text{d}\mathbf{u} = \frac{1}{|\mathcal{P}(V)|} \int \langle\mathbf{u},\mathbf{w}\rangle ^2 \text{d}\mathbf{u}$$

where the last equality holds because $\mathbf{u}$ is uniformly distributed over the parallelepiped $\mathcal{P}(V)$. From here I can't move forward, and the same goes for the fourth moment.
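To make sure I am at least reading the formulas correctly, I checked them numerically with a quick Monte Carlo sketch, sampling $\mathbf{u}$ uniformly from $\mathcal{P}(V)$ as $\mathbf{u}=\mathbf{x}V$ with $\mathbf{x}$ uniform over $[-1,1]^n$ (the specific $V$, $\mathbf{w}$ and sample size below are arbitrary). Both claimed formulas match the empirical moments:

```python
import numpy as np

rng = np.random.default_rng(0)

n = 3
V = rng.standard_normal((n, n))        # rows of V are v_1, ..., v_n (arbitrary invertible V)
w = rng.standard_normal(n)             # arbitrary w

# sample u uniformly from P(V): u = x V with x uniform on [-1, 1]^n
x = rng.uniform(-1.0, 1.0, size=(1_000_000, n))
dots = (x @ V) @ w                     # <u, w> for each sample

a = V @ w                              # a_i = <v_i, w>

print(np.mean(dots**2), np.sum(a**2) / 3)                       # empirical vs claimed 2nd moment
print(np.mean(dots**4),
      np.sum(a**4) / 5 + (np.sum(a**2)**2 - np.sum(a**4)) / 3)  # empirical vs claimed 4th moment
```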

Any hints would be helpful.

kodlu avatar
sa flag
The notation looked annoyingly like the exponential function, but I believe it is the expectation operator. Feel free to undo my edit if I misinterpreted it.
zbo avatar
br flag
zbo
@kodlu, yes, it is the expectation operator; the original paper uses the Exp notation, so I copied it here. The notation you edited makes it clearer.
Score:2
ru flag

It's instructive to look at the proof of Lemma 1, where $\mathbf u$ is rewritten as $\mathbf u=\mathbf xV$ with $\mathbf x$ a uniform sample from $[-1,1]^n$. Thus $$\mathbb E(\langle \mathbf u,\mathbf w\rangle^2)=\mathbb E_{\mathbf u}(\mathbf w\mathbf u^T\mathbf u\mathbf w^T)=\mathbf wV^T\mathbb E_{\mathbf x}(\mathbf x^T\mathbf x)V\mathbf w^T.$$

The expectation of $\mathbf x^T\mathbf x$ is the covariance matrix of $n$ i.i.d. uniform random variables on $[-1,1]$. Independence means that the off-diagonal terms are 0 and that the diagonal terms are $$\mathbb E(X^2)=\frac12\int_{-1}^1t^2dt=\frac 13$$ where $X$ is a random variable uniformly distributed on $[-1,1]$. So that $$\mathbb E_{\mathbf x}(\mathbf x^T\mathbf x)=\frac13 I_n$$ and so $$\mathbb E(\langle \mathbf u,\mathbf w\rangle^2)=\frac13\mathbf wV^TV\mathbf w^T$$ as claimed.
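Equivalently, the same computation can be written without matrices: since $\langle\mathbf u,\mathbf w\rangle=\sum_i X_i\langle\mathbf v_i,\mathbf w\rangle$, where the $X_i$ are the i.i.d. coordinates of $\mathbf x$, $$\mathbb E(\langle\mathbf u,\mathbf w\rangle^2)=\sum_{i,j}\langle\mathbf v_i,\mathbf w\rangle\langle\mathbf v_j,\mathbf w\rangle\,\mathbb E(X_iX_j)=\frac13\sum_{i=1}^n\langle\mathbf v_i,\mathbf w\rangle^2,$$ using $\mathbb E(X_iX_j)=0$ for $i\neq j$ and $\mathbb E(X_i^2)=\frac13$; this gives the sum form of the second moment directly.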

I haven't fully expanded out the 4th moment, but again the terms will be expectations of products $X_iX_jX_kX_\ell$ of i.i.d. random variables uniform on $[-1,1]$. Terms where any variable appears to an odd power will be zero, which leaves only the terms $X_i^4$ and $X_i^2X_j^2$ with $i\neq j$. These correspond to the two sums.
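For those who want the details of that expansion, write $a_i:=\langle\mathbf v_i,\mathbf w\rangle$, so that $\langle\mathbf u,\mathbf w\rangle=\sum_i a_iX_i$. Dropping all terms containing an odd power, $$\mathbb E\Big[\Big(\sum_i a_iX_i\Big)^4\Big]=\sum_i a_i^4\,\mathbb E(X_i^4)+3\sum_{i\neq j}a_i^2a_j^2\,\mathbb E(X_i^2X_j^2),$$ where the factor $3$ comes from the $\binom{4}{2}=6$ orderings of $X_i^2X_j^2$, shared between the two ordered pairs $(i,j)$ and $(j,i)$. Since $$\mathbb E(X_i^4)=\frac12\int_{-1}^1t^4\,dt=\frac15\qquad\text{and}\qquad\mathbb E(X_i^2X_j^2)=\mathbb E(X_i^2)\,\mathbb E(X_j^2)=\frac19\quad(i\neq j),$$ this gives $$\mathbb E(\langle\mathbf u,\mathbf w\rangle^4)=\frac15\sum_i a_i^4+\frac13\sum_{i\neq j}a_i^2a_j^2,$$ which is exactly the claimed fourth moment.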

zbo avatar
br flag
zbo
Got it! I am trying to find the relation between $\text{mom}_{V,2}(\mathbf{w})$ and $\frac{1}{3}\sum_{i=1}^{n} \langle \mathbf{v}_i,\mathbf{w}\rangle^2$ first, and then the relation between $\frac{1}{3}\sum_{i=1}^{n} \langle \mathbf{v}_i,\mathbf{w}\rangle^2$ and $\frac{1}{3}\mathbf{w}V^tV\mathbf{w}^t$.
zbo avatar
br flag
zbo
But it seems the relation should be $\mathbf{w}V^tV\mathbf{w}^t = (\sum_{i=1}^{n} \langle \mathbf{v}_i,\mathbf{w}\rangle)^2$. That is, $\mathbf{w}V^tV\mathbf{w}^t = (w_1,\cdots, w_n)(\mathbf{v}_1^t,\cdots,\mathbf{v}_n^t)^t(\mathbf{v}_1,\cdots,\mathbf{v}_n)(w_1,\cdots,w_n)^t = (w_1\mathbf{v}_1^t+\cdots+w_n\mathbf{v}_n^t) (\mathbf{v}_1w_1+\cdots+\mathbf{v}_nw_n) =(\sum_{i=1}^{n} \langle \mathbf{v}_i,\mathbf{w}\rangle)^2$.
zbo avatar
br flag
zbo
Please correct me if I am wrong.
zbo avatar
br flag
zbo
This seems like another question. Should I put it in the question body?
Daniel S avatar
ru flag
@zbo Your identity $\sum\langle\mathbf v_i,\mathbf w\rangle^2=\mathbf wV^TV\mathbf w^T$ is indeed correct by the definition of matrix multiplication. Hence $\mathrm{mom}_{V,2}(\mathbf w)=\frac13\mathbf wV^TV\mathbf w^T$ as in my answer also proves that $\mathrm{mom}_{V,2}(\mathbf w)=\frac13\sum\langle\mathbf v_i,\mathbf w\rangle^2$.
zbo avatar
br flag
zbo
Hi, this is what I am confused about: the matrix multiplication. I think it should be $\sum_{i=1}^n \langle\mathbf{v}_i,\mathbf{w} \rangle ^2 = \mathbf{w} VV^T \mathbf{w}^T$, not $\mathbf{w} V^TV \mathbf{w}^T$, since $VV^T = \mathbf{v}_1\mathbf{v}_1^T + \mathbf{v}_2\mathbf{v}_2^T + \cdots + \mathbf{v}_n\mathbf{v}_n^T$, whereas $V^T V$ is a genuine matrix. I was treating $V$ as a $1\times n$ block matrix of column vectors.
zbo avatar
br flag
zbo
My mistake. The authors say that vectors of $\mathbb{R}^n$ will be row vectors denoted by bold lowercase letters, so $V$ is the matrix whose rows are $\mathbf{v}_1,\cdots, \mathbf{v}_n$. Hence $V^TV = \mathbf{v}_1^T\mathbf{v}_1+\cdots+\mathbf{v}_n^T\mathbf{v}_n$.
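For completeness, this row-vector identity is easy to confirm numerically; here is a tiny numpy check (the $V$ and $\mathbf{w}$ below are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
V = rng.standard_normal((n, n))   # row i of V is the row vector v_i
w = rng.standard_normal(n)

lhs = sum(np.dot(V[i], w) ** 2 for i in range(n))   # sum_i <v_i, w>^2
rhs = w @ V.T @ V @ w                               # w V^T V w^T in the row-vector convention
print(np.isclose(lhs, rhs))                         # True
```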