Nice question!
This seems to have been addressed by Schrift and Shamir in a 1990 conference paper:

A. W. Schrift, A. Shamir, On the universality of the next bit test, Advances in Cryptology – CRYPTO '90, 1990.

There is also a later journal version in the Journal of Cryptology.
To summarize, they consider a source of biased but independent bits and how to devise a distinguisher for it. Without loss of generality, they assume the probability of a $1$ bit is $b\in (1/2,1)$. Note that they call the quantity $b$ itself the bias, which is a bit counterintuitive compared to the traditional use of "bias" for the quantity $b-\frac{1}{2}$.
In particular, they define a weighted success rate for any non-constant probabilistic polynomial-time (PPT) algorithm $$A:\{0,1\}^n\rightarrow \{0,1\}$$ in predicting the $i^{\text{th}}$ bit of a biased source.
Note: The notation $f<O(\nu(n))$ is used for any function that vanishes faster than the reciprocal of any polynomial, i.e., a negligible function.
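To see why a weighted notion of success is needed at all, here is a small sketch (my own illustration, not code from the paper): for a $b$-biased source, the trivial predictor that always outputs $1$ already guesses the next bit correctly with probability about $b > 1/2$, even though the bits are perfectly independent. So the usual next-bit threshold of $1/2$ is the wrong baseline, and the success rate must be reweighted to discount this trivial advantage.

```python
import random

def biased_bits(n, b, rng):
    """Generate n independent bits, each equal to 1 with probability b."""
    return [1 if rng.random() < b else 0 for _ in range(n)]

def constant_predictor_success(bits):
    """Success rate of the trivial predictor that always guesses 1:
    it is correct exactly when the bit is 1."""
    return sum(bits) / len(bits)

rng = random.Random(0)  # fixed seed for reproducibility
b = 0.75                # illustrative bias; any b in (1/2, 1) works
bits = biased_bits(10**5, b, rng)
rate = constant_predictor_success(bits)
# rate is close to b = 0.75, well above 1/2, even though the source
# is "perfect" apart from its bias -- hence the weighted success rate.
print(rate)
```

With $10^5$ samples the empirical rate concentrates tightly around $b$, so the constant predictor would "fail" an unweighted next-bit test despite the source being ideal for its bias class.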
The paper also proposes some alternative tests. It is cited quite a lot, though mostly by papers that apply it to various sources. In fact, it seems that the same authors had already used this technique to prove hardness results for every bit, including the (zero-biased) most significant bit, of discrete logarithms modulo a composite number.