However, available physical sources of randomness are imperfect and are biased and correlated.
No they're not. Sources don't actually produce any entropy whatsoever. None. Imagine a Zener diode or ring oscillator circuit in front of you. They just sit there looking circuit-y.
The observer generates the entropy when sampling the source; the source doesn't. This leads to the concept of $(\epsilon, \tau)$-entropy per unit time, where "$(\epsilon, \tau)$-entropy is the amount of information generated per unit time, at different scales $\tau$ of time and $\epsilon$ of the observables." Effectively, the sample rate and the resolution. That means the observer can vary the entropy rate of the source essentially without limit, at their discretion, by changing either $\tau$ or $\epsilon$.
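To see what that means in practice, here is a minimal Python sketch. The AR(1) source model and every parameter in it are my own illustrative assumptions (nothing from the Gaspard and Wang paper): the same simulated correlated signal reports quite different entropy figures depending on which $\tau$ and $\epsilon$ the observer picks.

```python
# A toy (epsilon, tau) observation of one simulated correlated source.
# Everything here is an illustrative assumption (the AR(1) model, the
# parameter values); it is not the Gaspard/Wang formalism, just the idea
# that the reported entropy rate depends on how the observer samples.
import numpy as np

rng = np.random.default_rng(1)

# Correlated "physical" source: an AR(1) process standing in for a
# band-limited Zener / ring-oscillator voltage.
n = 100_000
x = np.empty(n)
x[0] = 0.0
for i in range(1, n):
    x[i] = 0.95 * x[i - 1] + rng.normal()

def observed_entropy(sig, tau, eps):
    """Crude per-sample Shannon entropy after (eps, tau) observation:
    decimate by tau, quantise into bins of width eps, count symbols.
    (First-order only, so it ignores any residual correlation.)"""
    symbols = np.floor(sig[::tau] / eps).astype(int)
    _, counts = np.unique(symbols, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

for tau in (1, 10, 100):
    for eps in (0.5, 2.0, 8.0):
        print(f"tau={tau:3d}  eps={eps:4.1f}  "
              f"H ≈ {observed_entropy(x, tau, eps):.2f} bits/sample")
```

The printed figure moves with both knobs, which is the whole point: the number belongs to the observation regime, not to the circuit.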
That observer-dependence leads to a further, two-sided problem: how do we measure $H_{\infty}$ for correlated sources? There are two asymmetric solutions:
1. Try to determine $H_{\infty}$ for a correlated source, with very low certainty.
2. Adjust your $(\epsilon, \tau)$ sampling regime to produce uncorrelated data, with high certainty.
Virtually nobody does #1, and even NIST have said it's nigh-on impossible (Kerry McKay's unguarded comments). I can't imagine what $H_{\infty}$ means practically in a correlated scenario. So do #2, as the overwhelming majority do: obtain $H_{\infty}$ as $-\log_2(p_{\text{max}})$ and extract.
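For completeness, a minimal sketch of the #2 arithmetic, assuming the $(\epsilon, \tau)$ regime has already given you effectively IID symbols. The function name and test data are hypothetical, and this only mirrors the spirit of a most-common-value estimate, not the full NIST SP 800-90B procedure:

```python
# Sketch of option #2's bookkeeping: once the (epsilon, tau) regime gives
# you effectively IID symbols, H_inf is just -log2(p_max) of the most
# common symbol. This is NOT the full NIST SP 800-90B estimator suite
# (no confidence bound, no IID or restart testing).
import math
import random
from collections import Counter

def min_entropy_per_symbol(symbols):
    p_max = max(Counter(symbols).values()) / len(symbols)
    return -math.log2(p_max)

# Toy usage with 8-bit samples from an (assumed) decorrelated source.
random.seed(0)
samples = [random.getrandbits(8) for _ in range(100_000)]
print(f"H_inf ≈ {min_entropy_per_symbol(samples):.3f} bits/symbol (max 8)")
```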
Therefore it is possible to create a good TRNG from a biased and correlated source.
The Santha-Vazirani paper apparently demonstrates that it is impossible to (deterministically) extract an almost-uniform random bit from an SV-source.
Not quite. It actually says "by contrast, we prove that there is no algorithm that can extract even a single unbiased bit from a semi-random source (in fact, no better than a $1 - \delta$ biased bit)." This is established knowledge and appears in all manner of 'extractor' papers. It simply means that any extracted randomness will always have a $1 - \delta$ bias. NIST just recommends that you keep it below $2^{-64}$, which is easy.
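As a back-of-the-envelope check of what that target means (my arithmetic, not anything from the paper or from NIST's text): an extracted bit with residual bias $\delta'$, i.e. $\Pr[1] = \tfrac{1}{2} + \delta'$, has min-entropy

$$H_\infty = -\log_2\!\left(\tfrac{1}{2} + \delta'\right) = 1 - \log_2(1 + 2\delta') \approx 1 - \frac{2\delta'}{\ln 2},$$

so keeping $\delta' \le 2^{-64}$ leaves the per-bit min-entropy within roughly $1.6 \times 10^{-19}$ of the ideal 1 bit.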
Ref.: Pierre Gaspard and Xiao-Jing Wang, "Noise, chaos, and $(\epsilon, \tau)$-entropy per unit time", Physics Reports (Review Section of Physics Letters) 235, No. 6 (1993) 291–343, North-Holland.