Score:2

What is the 'rounding constant' in Round5? (NIST PQC Round 2 algorithm)


I am reading the Round5 paper.

This public key encryption scheme is based on Ring-LWR, but I found that it differs slightly from the typical LWR-based PKE scheme.

In the key generation algorithm of Round5 (Algorithm 1, Line 3 in the paper), they compute

$$ b= \left< \Bigl \lfloor \frac{p}{q}\left(\left< as \right>_{\Phi_{n+1\ }(x)} +h_1\right) \Bigr\rceil \right>_p$$

where $\left< \cdot \right>_f$ means reduction modulo $f$.

As far as I know,

$$ b= \left< \Bigl \lfloor \frac{p}{q}\left(\left< as \right>_{\Phi_{n+1\ }(x)} \right) \Bigr\rceil \right>_p$$

is the "typical" LWR-based public key, but in the Round5, there is an additional $h_1$ term.

What is the role of $h_1$?

Thank you in advance.

Score:1

What Algorithm 1 Line 3 actually has is $$b=\left\langle\left\lfloor\frac pq\left(\langle as\rangle_{\Phi_{n+1}(x)}+h_1\right)\right\rfloor\right\rangle_p$$ (note the use of the floor function rather than the rounding function).

Note that $h_1=\frac{q}{2p}$ (see the second complete paragraph on page 5) and $\lfloor x\rceil=\lfloor x+\frac12\rfloor$. Adding $h_1$ before multiplying by $\frac pq$ is equivalent to adding $\frac12$ afterwards, since $\frac pq\cdot h_1=\frac12$. So the $h_1$ term simply lets the rounding function be implemented as an integer-part (floor) operation, which is easier in code.
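To see the identity concretely, here is a minimal sketch in integer arithmetic. The parameters $q$ and $p$ below are illustrative powers of two, not an actual Round5 parameter set; the point is only that adding $h_1=q/(2p)$ and flooring reproduces rounding-to-nearest:

```python
# Illustrative power-of-two moduli (NOT the real Round5 parameters).
q = 2**13           # large modulus
p = 2**9            # rounding modulus, p divides q
h1 = q // (2 * p)   # rounding constant h_1 = q/(2p)

def round_scale(x, p, q):
    """Round-to-nearest of (p/q)*x, reduced mod p:
    floor((p/q)*x + 1/2) computed exactly in integers."""
    return ((p * x + q // 2) // q) % p

def floor_with_h1(x, p, q, h1):
    """Floor of (p/q)*(x + h_1), reduced mod p, as in Algorithm 1 Line 3."""
    return (p * (x + h1) // q) % p

# The two computations agree for every residue mod q,
# because p*(x + h1) = p*x + q/2 exactly.
assert all(round_scale(x, p, q) == floor_with_h1(x, p, q, h1)
           for x in range(q))
```

The floor-based form is preferred in implementations because it is a single integer add and shift when $p$ and $q$ are powers of two.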

Lee Seungwoo: Thank you so much.
