How to map a message to a vector of weight t in the Niederreiter cryptosystem?

In the Niederreiter cryptosystem, encryption requires the message to be a vector of weight $t$ in $F_q^n$, where $t$ is the error-correction capability of the code. But what is the mapping? One possible way is to map a message of length $k$ to a codeword of a constant-weight-$t$ linear code, e.g., an $[n,k]_q$ code. In this way, the message space has size $q^k$. Is there a better way to do this, e.g., one with a larger message space?

Fun question! We can, in fact, efficiently realise the maximum message space of size $(q-1)^t({n\atop t})$.

Let us begin with the case $q=2$. We want to generate a bit string of length $n$ and Hamming weight exactly $t$. There are $C:=({n\atop t})$ such strings, and we would like to map an integer message from the interval $[0,C-1]$ to this set bijectively. Combinatorially speaking, the set of bit strings corresponds to combinations (specifically, $t$-combinations of the integers 0 through $n-1$). We can bijectively rewrite our string as a strictly decreasing sequence of $t$ values $n-1\ge c_t>c_{t-1}>\ldots>c_1\ge 0$ by writing down the indices of the locations of the 1s.

Now, an elegant theorem of mathematics that Knuth attributes to the 19th-century mathematician Ernesto Pascal says that every nonnegative integer $N$ can be represented uniquely in the combinatorial number system $$N=\left({c_t\atop t}\right)+\cdots+\left({c_1\atop 1}\right)$$ with $c_t>c_{t-1}>\ldots>c_1\ge 0$, and this representation steps through combinations in lexicographic order (for our purposes, the first $({n\atop t})$ entries list the weight-$t$ strings of length $n$). Therefore, if we have an integer $N\in[0,C-1]$, we can use a greedy algorithm, repeatedly taking the largest binomial coefficient that still fits, to recover the $N$th weight-$t$ binary string given by the combinatorial number system. Here's some pseudo-code:

Initialise i := n-1, j := t, remainder := N
(with the convention Binomial(i, j) = 0 whenever i < j)
while i >= 0
   if Binomial(i, j) > remainder
      set bit i of the string to 0
   else
      set bit i of the string to 1
      remainder -= Binomial(i, j)
      j--
   i--

The reverse map is straightforward.
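
To make this concrete, here is a minimal Python sketch of both directions; the function names are my own, and it relies on Python's math.comb, which returns 0 whenever the lower index exceeds the upper one, matching the convention in the pseudo-code.

from math import comb

def rank_to_positions(N, n, t):
    # Greedy decoding of the combinatorial number system:
    # returns the 1-positions c_t > ... > c_1 of the N-th weight-t string of length n.
    positions, j, remainder = [], t, N
    for i in range(n - 1, -1, -1):
        if j > 0 and comb(i, j) <= remainder:
            positions.append(i)
            remainder -= comb(i, j)
            j -= 1
    assert remainder == 0 and j == 0, "N out of range"
    return positions

def positions_to_rank(positions):
    # Reverse map: N = C(c_t, t) + ... + C(c_1, 1).
    cs = sorted(positions, reverse=True)
    return sum(comb(c, j) for c, j in zip(cs, range(len(cs), 0, -1)))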

For example, with $n=10$, $t=3$ there are 120 possible combinations; let us find the 17th (counting from 0). We first find the largest tetrahedral number not greater than 17; this is $({5\atop 3})=10$; we mark the 5th spot and have 7 left over. We next find the largest triangular number not greater than 7; this is $({4\atop 2})=6$; we mark the 4th spot and have 1 left over. We then find the largest number $({c_1\atop 1})$ not greater than 1; this is $({1\atop 1})=1$; we mark the 1st spot and have nothing left over. Our string is 0000110010. If we receive this string, we compute $({5\atop 3})+({4\atop 2})+({1\atop 1})=10+6+1=17$.
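
With the helpers sketched above, this walk-through can be checked directly:

>>> rank_to_positions(17, n=10, t=3)
[5, 4, 1]
>>> ''.join('1' if i in {5, 4, 1} else '0' for i in range(9, -1, -1))
'0000110010'
>>> positions_to_rank([5, 4, 1])
17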

For more general alphabets, we can use the above process to generate the error locations and then choose from the $q-1$ possible non-zero error values for each location. Concretely, let $M=C(q-1)^t$ be the size of our message space. Given a message $m\in[0,M-1]$, we let $N=\lfloor m/(q-1)^t\rfloor$ so that $N\in[0,C-1]$ and generate a list of locations as above. We then let $B=m\bmod{(q-1)^t}$, write $B$ as a $t$-digit number in base $q-1$ (allowing leading zeroes), and assign to each location its digit plus one, pairing the most significant digit with the largest location. Again, the reverse map is straightforward.
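
Continuing the sketch above (and reusing the hypothetical rank_to_positions and positions_to_rank helpers from it), the general map might look as follows; vectors are lists indexed so that index 0 is the rightmost symbol, matching the examples in this answer.

def message_to_error(m, n, t, q):
    # Split m into a location part N and a value part B.
    N, B = divmod(m, (q - 1) ** t)
    locations = rank_to_positions(N, n, t)   # decreasing: c_t, ..., c_1
    # Write B as t digits in base q-1, most significant digit first.
    digits = []
    for _ in range(t):
        B, d = divmod(B, q - 1)
        digits.append(d)
    digits.reverse()
    vector = [0] * n
    for loc, d in zip(locations, digits):    # leading digit goes with the largest location
        vector[loc] = d + 1                  # non-zero values lie in 1..q-1
    return vector

def error_to_message(vector, t, q):
    # Read off the locations (largest first) and their digits, then recombine.
    locations = [i for i in range(len(vector) - 1, -1, -1) if vector[i] != 0]
    B = 0
    for loc in locations:
        B = B * (q - 1) + (vector[loc] - 1)
    return positions_to_rank(locations) * (q - 1) ** t + B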

For example, suppose we have a decimal alphabet ($q=10$) and consider strings of length 10 with 3 errors. There are $120\times 729=87480$ possible messages; let us find the 12707th. We find $N=\lfloor 12707/729\rfloor=17$ and so generate the same bit string as above. We find $B=12707\bmod{729}=314$, which is 378 in base 9. Our message converts to the error string 0000480090. Again, if we receive this string, we pull out the non-zero digits in order and subtract one from each to give the base-9 number 378, so that $B=314$; likewise, we know that $N=17$ from the error locations and can compute $m=729N+B=12707$.
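
Under the same assumptions as the sketch above, this example round-trips as expected:

>>> v = message_to_error(12707, n=10, t=3, q=10)
>>> ''.join(str(x) for x in reversed(v))   # position n-1 printed on the left
'0000480090'
>>> error_to_message(v, t=3, q=10)
12707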

Section 7.2.1.3, "Generating All Combinations", of Donald Knuth's definitive "The Art of Computer Programming" is an excellent, comprehensive (albeit distracting) account of other algorithms for generating combinations.
