Score:9

Any advantage to a block cipher which is not efficiently invertible?

mk flag

The classic definition of a PRP includes efficient invertibility.

Given that many modern cipher modes (CTR-based ones, e.g. GCM) use only the forward direction of the block cipher, it seems that the efficient-invertibility part of the definition is not actually necessary for practical purposes.

Would such a relaxation gain us anything? I.e., are there practical PRP constructions which are efficiently computable in the forward direction but not the reverse direction, and which are more efficient in the forward direction than current block ciphers with equivalent security?

fgrieu avatar
ng flag
The title and body of the question ask for different things. In this [related question](https://crypto.stackexchange.com/q/14338/555) I asked the title's question. It's difficult, and among what's proposed nothing is very fast in the forward direction. Regarding the question in the body (which I read as: can we make faster block ciphers by not asking that they be efficiently invertible?): notice that AES in software is appreciably faster in the forward direction, and that speedup is deliberate. But the cipher is still efficiently invertible.
eddydee123 avatar
mk flag
I corrected the title (so that Francois's comment makes sense, the original was "Block cipher which is not efficiently invertible?")
Score:9
ru flag

I would argue that for many cryptographers the argument goes even further. Given that block cipher streaming modes are favoured for bulk encryption, is invertibility needed at all? If one follows this line of reasoning, one can see why there has been a resurgence in the popularity of stream ciphers, with ChaCha20 being an obvious example. Although ChaCha20 produces a 512-bit output from a 512-bit state and updates the state with a simple counter (much like CTR and GCM modes), the process is not (we believe) invertible. It is also a very efficient design (assuming that the hardware supports efficient addition of 32-bit words).
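To make the "cheap forward, believed non-invertible" point concrete, here is a rough Python sketch of the ChaCha20 block function (state handling and key/nonce layout omitted, so this is illustrative rather than a complete implementation). The 20-round core is a pure ARX permutation of the sixteen 32-bit state words; the final addition of the initial state (a feed-forward) is what is believed to destroy invertibility.

```python
MASK32 = 0xFFFFFFFF

def rotl32(x, n):
    return ((x << n) | (x >> (32 - n))) & MASK32

def quarter_round(s, a, b, c, d):
    # One ChaCha quarter-round: only 32-bit additions, XORs and rotations (ARX).
    s[a] = (s[a] + s[b]) & MASK32; s[d] = rotl32(s[d] ^ s[a], 16)
    s[c] = (s[c] + s[d]) & MASK32; s[b] = rotl32(s[b] ^ s[c], 12)
    s[a] = (s[a] + s[b]) & MASK32; s[d] = rotl32(s[d] ^ s[a], 8)
    s[c] = (s[c] + s[d]) & MASK32; s[b] = rotl32(s[b] ^ s[c], 7)

def chacha20_block(initial_state):
    # initial_state: 16 x 32-bit words (constants, key, counter, nonce).
    working = list(initial_state)
    for _ in range(10):  # 20 rounds = 10 double-rounds
        # column rounds
        quarter_round(working, 0, 4, 8, 12)
        quarter_round(working, 1, 5, 9, 13)
        quarter_round(working, 2, 6, 10, 14)
        quarter_round(working, 3, 7, 11, 15)
        # diagonal rounds
        quarter_round(working, 0, 5, 10, 15)
        quarter_round(working, 1, 6, 11, 12)
        quarter_round(working, 2, 7, 8, 13)
        quarter_round(working, 3, 4, 9, 14)
    # Feed-forward: adding the initial state makes the block function non-invertible,
    # even though the 20-round core on its own is an (invertible) permutation.
    return [(w + s) & MASK32 for w, s in zip(working, initial_state)]
```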

Note that for many implementations of AES the decryption round function is less efficient than the encryption round function, as inverting the MixColumns step is more computationally involved.
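To make that asymmetry concrete, here is a minimal Python sketch of a single column (not an optimised implementation): the forward MixColumns matrix contains only 1, 2 and 3, so each product costs at most one `xtime`, while the inverse matrix entries 9, 11, 13 and 14 each need a full GF(2^8) multiplication.

```python
def xtime(a):
    # Multiply by x (i.e. by 2) in GF(2^8) with the AES polynomial x^8+x^4+x^3+x+1.
    a <<= 1
    return (a ^ 0x1B) & 0xFF if a & 0x100 else a

def gmul(a, b):
    # Generic GF(2^8) multiplication by repeated xtime (needed for the inverse matrix).
    result = 0
    for _ in range(8):
        if b & 1:
            result ^= a
        a = xtime(a)
        b >>= 1
    return result

def mix_column(col):
    # Forward MixColumns: matrix rows are rotations of (2, 3, 1, 1).
    a0, a1, a2, a3 = col
    return [
        xtime(a0) ^ (xtime(a1) ^ a1) ^ a2 ^ a3,
        a0 ^ xtime(a1) ^ (xtime(a2) ^ a2) ^ a3,
        a0 ^ a1 ^ xtime(a2) ^ (xtime(a3) ^ a3),
        (xtime(a0) ^ a0) ^ a1 ^ a2 ^ xtime(a3),
    ]

def inv_mix_column(col):
    # InvMixColumns: matrix rows are rotations of (14, 11, 13, 9).
    a0, a1, a2, a3 = col
    return [
        gmul(a0, 14) ^ gmul(a1, 11) ^ gmul(a2, 13) ^ gmul(a3, 9),
        gmul(a0, 9) ^ gmul(a1, 14) ^ gmul(a2, 11) ^ gmul(a3, 13),
        gmul(a0, 13) ^ gmul(a1, 9) ^ gmul(a2, 14) ^ gmul(a3, 11),
        gmul(a0, 11) ^ gmul(a1, 13) ^ gmul(a2, 9) ^ gmul(a3, 14),
    ]
```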

Gilles 'SO- stop being evil' avatar
“the decryption round function is less efficient than the decryption round function” I think the second *decryption* should be *encryption*? Is that really common? In a naive implementation, MixCol is pretty symmetric.
Daniel S avatar
ru flag
Thanks for the correction; I've now edited. Unless people are using T-table implementations, the most common approach is to multiply out the corresponding matrix and the [inverse matrix](https://en.wikipedia.org/wiki/Rijndael_MixColumns#InverseMixColumns) has more awkward entries than the `MixCol` matrix where the entries are all 1, $x$ or $1+x$.
Score:6
in flag

A PRP is a Pseudo-Random Permutation, and we want them to be indistinguishable from random permutations. AES and all block ciphers are supposed to be PRPs. Being a permutation means there is an inverse, and block ciphers are designed to have one, indeed an efficient one.

We need a mode of operation for block ciphers, and we left CBC behind due to the many attacks that occurred, even though it is IND-CPA secure. Currently, all of the TLS 1.3 ciphers internally use the IND-CPA secure CTR mode (TLS 1.3 cipher suites are more than that: they are all Authenticated Encryption with Associated Data modes).

Would such a relaxation gain us anything?

It gives us lots of opportunities. We don't need to restrict ourselves to PRPs in CTR mode; the mode was already designed for Pseudo-Random Functions (PRFs), since it never needs the inverse of the function. With a PRF we can use a much wider range of functions that don't need to have inverses (there are $2^n!$ permutations but $(2^n)^{2^n}$ functions on $n$-bit blocks). We can even take a hash function and convert it into CTR-style encryption, as in Salsa20, and we can design a key schedule at almost zero cost, too.
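As a toy illustration of that point (not a vetted or standardised construction), here is CTR-style encryption built from keyed BLAKE2b in Python's `hashlib`, standing in for an arbitrary non-invertible PRF; no inverse exists and none is needed.

```python
import hashlib

def prf(key, counter, block_size=64):
    # A keyed hash used as a PRF: only the forward direction exists.
    return hashlib.blake2b(counter.to_bytes(16, "big"), key=key,
                           digest_size=block_size).digest()

def ctr_encrypt(key, nonce, data):
    # CTR mode: keystream = PRF(key, nonce || counter), XORed with the message.
    # Decryption is the very same function, since only the keystream is recomputed.
    out = bytearray()
    for i in range(0, len(data), 64):
        counter = (nonce << 64) | (i // 64)
        keystream = prf(key, counter)
        out.extend(b ^ k for b, k in zip(data[i:i + 64], keystream))
    return bytes(out)

# Example: encrypting twice with the same key/nonce recovers the plaintext.
key = b"\x00" * 32
ct = ctr_encrypt(key, nonce=1, data=b"attack at dawn")
assert ctr_encrypt(key, nonce=1, data=ct) == b"attack at dawn"
```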

Using a PRP in CTR mode gives rise to a long-message distinguisher (the keystream blocks never repeat), and we can eliminate this by using a PRF. If we use a PRP in CTR mode, then we need to restrict the number of encrypted blocks because of the PRP-PRF switching lemma.
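Here is a toy demonstration of that distinguisher with an unrealistically small 8-bit block: keystream blocks produced by a permutation never collide, while blocks produced by a random function almost certainly do once roughly $2^{n/2}$ of them have been observed.

```python
import random

n = 8                      # toy block size in bits (real ciphers use 128)
domain = list(range(2 ** n))

# "Ideal" PRP: a random permutation; "ideal" PRF: a random function.
perm = domain[:]
random.shuffle(perm)
func = [random.randrange(2 ** n) for _ in domain]

def has_collision(outputs):
    return len(set(outputs)) < len(outputs)

# CTR keystream = primitive applied to counters 0, 1, 2, ...
perm_stream = [perm[ctr] for ctr in domain]
func_stream = [func[ctr] for ctr in domain]

print(has_collision(perm_stream))  # always False: a permutation never repeats
print(has_collision(func_stream))  # almost certainly True well before 2^n blocks
```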

CTR mode also doesn't require padding, so it is immune to padding oracle attacks.

ChaCha20 and Salsa20 are well-known examples with zero key-schedule cost and a CPU-friendly ARX design. They have CTR mode built in and are very fast in software.

Score:5
in flag

Indeed, a PRF is better suited than a PRP for various modes such as CTR. The problem is that we don't know how to construct good PRFs other than from a PRP.

  1. One way is simply to pretend that our PRP is a PRF: this is sound up to a certain amount of data being given out (the birthday bound; see the PRP/PRF switching lemma).

  2. Another often used way is to compute the permutation and add the input to the output (a feed-forward). It does not improve the birthday bound, but this trick makes the function non-invertible, which is crucial in some usages (such as ChaCha, mentioned by @Daniel S). It is also used in Merkle-Damgård-style hash functions to construct the compression function (Davies-Meyer). Toy sketches of options 2-4 appear after this list.

  3. Adding/xoring two permutations is a good PRF, but it is costly.

  4. Sufficiently truncating the output is also a good PRF, but again is costly.
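Here is a toy sketch of options 2-4, with a key-seeded random permutation table standing in for a real block cipher $E_k$; the parameters and helper names are purely illustrative.

```python
import random

N_BITS = 16  # toy block size, just to keep the sketch runnable

def keyed_permutation(key):
    # Stand-in for a real block cipher E_k: a key-seeded random permutation table.
    rng = random.Random(key)
    table = list(range(2 ** N_BITS))
    rng.shuffle(table)
    return table

E1, E2 = keyed_permutation(1), keyed_permutation(2)

def feed_forward_prf(x):
    # Option 2 (feed-forward, Davies-Meyer style): F(x) = E_k(x) XOR x.
    # One permutation call, but the result is no longer invertible.
    return E1[x] ^ x

def xor_of_permutations_prf(x):
    # Option 3: F(x) = E_{k1}(x) XOR E_{k2}(x); a better PRF, at twice the cost.
    return E1[x] ^ E2[x]

def truncated_prf(x, out_bits=8):
    # Option 4: keep only part of E_k(x); a good PRF, but output is thrown away.
    return E1[x] >> (N_BITS - out_bits)
```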

Also worth mentioning is sponge-based cryptography, which is built on public permutations that are evaluated only in the forward direction. In other words, the permutation is not required to be efficiently invertible.
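For illustration, here is a toy sponge with made-up parameters and a toy permutation (invertible in principle, but never inverted); absorbing and squeezing both call the permutation only in the forward direction. This is not a real design, just a sketch of the data flow.

```python
RATE_BITS, CAPACITY_BITS = 64, 64          # toy sizes; real sponges are much wider
WIDTH_BITS = RATE_BITS + CAPACITY_BITS
RATE_BYTES = RATE_BITS // 8

def toy_permutation(state):
    # Stand-in for a public permutation such as Keccak-f, only ever used forward.
    # A few invertible xorshift/multiply/rotate rounds on a 128-bit word.
    mask = (1 << WIDTH_BITS) - 1
    for rc in (0x9E3779B97F4A7C15, 0xC2B2AE3D27D4EB4F, 0x165667B19E3779F9):
        state ^= state >> 31
        state = (state * 0x2545F4914F6CDD1D) & mask   # odd multiplier => invertible
        state ^= rc
        state = ((state << 11) | (state >> (WIDTH_BITS - 11))) & mask
    return state

def toy_sponge_hash(message: bytes, out_len: int = 16) -> bytes:
    state = 0
    # Absorb: XOR each message block into the rate part, then permute.
    padded = message + b"\x01" + b"\x00" * (-(len(message) + 1) % RATE_BYTES)
    for i in range(0, len(padded), RATE_BYTES):
        block = int.from_bytes(padded[i:i + RATE_BYTES], "big")
        state ^= block << CAPACITY_BITS
        state = toy_permutation(state)
    # Squeeze: only the rate part is ever output; the capacity stays hidden.
    out = b""
    while len(out) < out_len:
        out += (state >> CAPACITY_BITS).to_bytes(RATE_BYTES, "big")
        state = toy_permutation(state)
    return out[:out_len]
```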

poncho avatar
my flag
Note that sponge-based cryptography effectively uses the 'truncating the output' option (by not outputting the 'capacity')
fgrieu avatar
ng flag
_"A PRF is suited better than a PRP"_ drifts away from the question as currently worded, which unambiguously asks for an efficiently computable _permutation_, and what we gain when we remove the constraint that the reverse permutation is efficiently computable. As exemplified by AES, it's plausible we gain a little. I agree we gain more by dropping the requirement that the function is a permutation, and that with large enough block size we can get away with it in practice. But I still find an interest in the question.