Score:2

Existence of algorithm predicting next bit in output sequence

lu flag

Let $X = [0, 1]\cap \mathbb{Q}$, and let $f:X \rightarrow X$ be a chaotic map (e.g. the logistic map with a rational parameter). My question is as follows, and is purely theoretical in nature. Pick some value $x_0$ from $X$ (note that $X$ is infinite here, so pick a value using the axiom of choice), and then consider the sequence of bits generated by iterating $f$ on $x_0$, returning a $0$ if $f^n(x_0)\leq 1/2$ and a $1$ if $f^n(x_0)>1/2$.

Does there exist an algorithm that, given the parameter of the map $f$ and a bit sequence $s \in \{0, 1\}^n$ obtained by iterating $f$ on $x_0$ and extracting bits in the manner specified above $n$ times, can predict the next bit, obtained from $f^{n+1}(x_0)$, with probability non-negligibly better than $1/2$, for any $n$, even when $n >|x_0|$, where $|x_0|$ is the length of the bit string representing $x_0$?
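For concreteness, here is a minimal sketch of the generator just described, using exact rational arithmetic; the parameter $39/10$ and the seed $1/3$ are arbitrary illustrative choices, not part of the question:

```python
from fractions import Fraction

def logistic(r):
    """The logistic map x -> r*x*(1-x) with a rational parameter r."""
    return lambda x: r * x * (1 - x)

def bit_sequence(f, x0, n):
    """Emit 0 if f^k(x0) <= 1/2 and 1 otherwise, for k = 1..n."""
    bits, x = [], x0
    for _ in range(n):
        x = f(x)                                  # exact rational arithmetic, no round-off
        bits.append(0 if x <= Fraction(1, 2) else 1)
    return bits

# Illustrative values only: parameter r = 39/10, seed x0 = 1/3.
f = logistic(Fraction(39, 10))
print(bit_sequence(f, Fraction(1, 3), 10))
```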

The reason I am asking is that this seems to be different from asking whether a PRG exists (but I could be wrong): I assumed the "secret" initial condition $x_0$ was not drawn at random from a finite set, but rather chosen from an infinite set $X$ (even though $X$ is countable, so every element can be represented by a finite bit string). Hence, I am wondering whether this assumption about how the initial condition is drawn changes things.

lu flag
@fgrieu that is correct, I meant to iterate $f$, thanks for pointing that out! I have changed the question accordingly.
Score:2
ng flag

The first part of this answer is for an earlier version of the question, with $x_0$ a rational represented by an arbitrarily large bitstring.

There exist functions $f$ such that no algorithm can predict the next bit of the output sequence.

A simple example is $f(x)=\begin{cases}2x&\text{if }x<1/2\\2x-1&\text{otherwise}\end{cases}$.
With this function, the binary sequence produced is the¹ binary representation of $x_0$ (starting at the first bit after decimal point), which can't be predicted. Is that $f$ "a chaotic map"? I can't tell.
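A quick check of this (a sketch; treating $x_0$ itself as the source of the first bit is my assumption about the indexing, and a non-dyadic $x_0$ is used to stay clear of the footnoted boundary cases):

```python
from fractions import Fraction

def doubling(x):
    """f(x) = 2x if x < 1/2, else 2x - 1."""
    return 2 * x if x < Fraction(1, 2) else 2 * x - 1

def emitted_bits(x0, n):
    """Threshold x0, f(x0), f^2(x0), ... at 1/2 (indexing assumption: start at x0)."""
    out, x = [], x0
    for _ in range(n):
        out.append(0 if x <= Fraction(1, 2) else 1)
        x = doubling(x)
    return out

def binary_digits(x0, n):
    """Binary digits of x0 after the point, by repeated doubling."""
    out, x = [], x0
    for _ in range(n):
        x *= 2
        out.append(1 if x >= 1 else 0)
        x -= out[-1]
    return out

x0 = Fraction(3, 7)          # non-dyadic, so its binary expansion is unique
print(emitted_bits(x0, 12))  # both lines print 0, 1, 1, 0, 1, 1, ...:
print(binary_digits(x0, 12)) # the output sequence is the binary expansion of x0
```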

We can make $f$ continuous, e.g. $f(x)=\begin{cases}2x&\text{if }x<1/2\\2-2x&\text{otherwise}\end{cases}$.
The relation between a binary representation of $x_0$ and the sequence remains such that changing the $i^\text{th}$ bit of that binary representation of $x_0$ changes the $i^\text{th}$ bit of the sequence. I think I have seen this function, or a close cousin, dubbed "a chaotic map".
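A small numerical illustration of that relation (again a sketch under my own indexing assumption, with the bit of $x_0$ itself taken as the first output bit): flip the $i$-th binary digit of $x_0$ and look for the first output bit that changes.

```python
from fractions import Fraction

def tent(x):
    """f(x) = 2x if x < 1/2, else 2 - 2x."""
    return 2 * x if x < Fraction(1, 2) else 2 - 2 * x

def tent_bits(x0, n):
    """Threshold x0, f(x0), ..., f^(n-1)(x0) at 1/2 (same indexing assumption)."""
    out, x = [], x0
    for _ in range(n):
        out.append(0 if x <= Fraction(1, 2) else 1)
        x = tent(x)
    return out

def digit(x, i):
    """i-th binary digit (i >= 1) after the point of x in [0, 1)."""
    return int(x * 2**i) % 2

x0 = Fraction(3, 7)                                       # non-dyadic, unique expansion
for i in range(1, 9):
    flipped = x0 + Fraction((-1) ** digit(x0, i), 2**i)   # flip the i-th digit of x0
    a, b = tent_bits(x0, 12), tent_bits(flipped, 12)
    first_change = next(k + 1 for k in range(12) if a[k] != b[k])
    print(i, first_change)                                # the first output bit to change is the i-th
```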

We can make $f$ infinitely differentiable with $f(x)=\frac{43}{11}\,x\,(1-x)$ (a case of "the logistic map with rational parameter", and "a chaotic map" by most accounts). Without proof: for any bitstring in $\{0,1\}^{n+1}$ there exists an $x_0$ such that the first $n+1$ bits output are that bitstring, thus the next bit is not predictable with certainty.


Now for the revised question with

for any $n$, even when $n >|x_0|$ where $|x_0|$ is the length of the bit-string representing $x_0$.

Without proof: with $f(x)=\frac{43}{11}\,x\,(1-x)$ and most $x_0$, the question's generator requires work exponential in the number of bits produced (argument: $f^n(x)$ is a polynomial of degree $2^n$, thus it seems that evaluating it to a given accuracy requires knowing $x$ to an exponential number of bits). Thus the question's generator does not meet the standard criterion of being a polynomial-time algorithm, and therefore does not meet the standard definition of a PRG, regardless of predictability. At the very least, the cost and memory requirements grow so fast that it is not useful in practice.
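The blow-up is easy to observe with exact rational arithmetic: the number of bits needed to represent $f^n(x_0)$ roughly doubles at each iteration (a sketch; the seed $x_0=1/3$ is an arbitrary choice):

```python
from fractions import Fraction

r, x = Fraction(43, 11), Fraction(1, 3)    # x0 = 1/3 is an arbitrary example seed
for n in range(1, 13):
    x = r * x * (1 - x)                    # exact value of f^n(x0)
    # The denominator's bit length roughly doubles every iteration, so producing
    # n exact bits this way costs time and memory exponential in n.
    print(n, x.denominator.bit_length(), 0 if x <= Fraction(1, 2) else 1)
```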

On the other hand, for most fixed $x_0$ (perhaps, all those for which the generated bit sequence is not ultimately periodic), it's possible to build a partial predictor. In particular, the output sequence is sizably biased towards $1$. Thus a distinguisher is much easier than computing the sequence. I think that simple fixes, like changing the threshold $1/2$ to the expected mean, will still allow a polynomial-time distinguisher, simply by computing the frequencies of short subsequences of bits.
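As an illustration of the kind of statistical test meant here, a floating-point sketch that estimates the frequency of $1$s over many seeds (floating-point iteration does not follow the exact orbits, and the sample sizes are arbitrary, so the figure is only indicative):

```python
import random

R = 43 / 11   # floating-point approximation of the parameter

def bits_float(x, n, skip=100):
    """Iterate the logistic map in floating point and threshold at 1/2.
    The first `skip` iterations are discarded so the orbit settles on the attractor."""
    for _ in range(skip):
        x = R * x * (1 - x)
    out = []
    for _ in range(n):
        x = R * x * (1 - x)
        out.append(0 if x <= 0.5 else 1)
    return out

random.seed(0)
ones = total = 0
for _ in range(200):                                 # 200 random seeds (arbitrary)
    seq = bits_float(random.uniform(0.01, 0.99), 500)
    ones += sum(seq)
    total += len(seq)
print("observed frequency of 1s:", ones / total)     # far from 1/2 => easy distinguisher
```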


¹ For numbers with two binary representations, that is, those of the form $a/2^k$, take the lexicographically first: e.g. $3/4$ is $.1011111111111111111…$

ph flag
This is just what I was thinking. I think you can make the "unguessable" part more rigorous by arguing that the sets of values of $x_0$ that produce each possible next output have equal measure.
ph flag
And even though this answer is much more about the general case than any particular iterative function, if a PRNG is not better than just returning digits of the seed that's not a good sign.
lu flag
Thanks for your response, although what I had in mind was a function whose binary output is not predictable given ANY length of output, even when the output is longer than the binary representation of the initial condition $x_0$. I amended the question to reflect this.
ph flag
Yes, but your conclusion is "not predictable with certainty" and I'm saying you can adapt this to get to the stronger "not predictable with prob > 1/2"
lu flag
Thanks for the answer, I marked the question as resolved, although I will remark that the choice of parameter $43/11$ is odd, as it is greater than four and hence the function no longer satisfies the requirement that it maps $X$ to itself; in fact, it is no longer chaotic with such a parameter. Also, if the generator did not require exponential work, would the definition of a PRNG be satisfied, even though the initial condition was taken from an infinite set?
fgrieu
ng flag
@GEG: $43/11$ is just _below_ $4$. It's chosen to be in the chaotic region. The definition of a PRG requires it to be expanding, and that allows a seed whose size grows with $n$ (e.g. representable as $s=x_0\,2^{n/2}$ over $n/2$ bits); we still get about twice as many bits out as in. The main issue is that with the logistic map, I see no polynomial-time algorithm to evaluate $n$ bits, and the bits are distinguishable from random, including by their bias.