Can anyone elaborate on this topic? What is a weak physical random number generator/source, and why do we consider it weak?
Some physicists say that the randomness obtained from radioactive decay or thermal noise, and especially from quantum phenomena, is perfect and unquestionable (experiments with Bell's inequalities also point in this direction). Meanwhile, in practical applications such sources are referred to as weak.

Anyone with practical experience in electronic circuit design will immediately realize that it's easy to find many unpredictable sources of randomness, but building an unbiased random bit generator on top of these sources is not a trivial task. In fact, it's fairly difficult to build a perfect electronic "coin flipper". Most cryptography textbooks take it for granted that there exists a true random number generator producing unbiased binary output, but this abstraction hides an enormous amount of implementation detail and complexity.

For example, radioactive decay is in principle a purely quantum mechanical process on a *microscopic scale*. However, it's impractical to generate a random bit by watching a single atom and choosing a time window in which it has a 50% chance of decaying. Instead, practical devices measure radioactivity only on a *macroscopic scale*, for example by counting the pulses from a Geiger counter over 60 seconds. In this case, obtaining a straightforward binary output is no longer possible. The detector's output is certainly unpredictable due to its physical nature, but its randomness comes only in a "weak", highly biased, low-entropy form. You might see 20 ± 5 pulses per minute, and it's not even clear how to map these counts to a binary sequence. Even if the probability distribution of the radioactive source could be modeled with infinite precision, any background noise would still bias the output.
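To make that mapping problem concrete, here is a hedged simulation (not a real detector): Poisson-distributed pulse counts averaging 20 per minute, naively converted to bits by comparing each count against the mean. The λ = 20 rate and the "count > 20" rule are assumptions invented for this sketch; the point is that a reasonable-looking rule still yields visibly biased bits.

```python
import math
import random

random.seed(1)  # reproducible demo

LAMBDA = 20  # mean pulses per minute; illustrative value from the text

def poisson(lam):
    """Poisson sampler (Knuth's algorithm)."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1

# Simulate 100,000 one-minute Geiger readings, then naively turn each
# count into a bit: 1 if the count exceeds the mean, else 0.
counts = [poisson(LAMBDA) for _ in range(100_000)]
bits = [1 if c > LAMBDA else 0 for c in counts]
ones = sum(bits) / len(bits)
print(f"fraction of 1-bits: {ones:.4f}")  # noticeably below 0.5
```

The bias comes from the probability mass sitting exactly at the threshold: for a Poisson(20) source, "count > 20" is simply not a 50/50 event.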

Zener diodes and avalanche diodes are another standard noise source in electronics engineering; the simplest circuit can be built by reverse biasing the base-emitter junction of a generic bipolar transistor (*note: as pointed out by Paul Uszak, Zener-connected BJTs are subject to degradation and have questionable long-term reliability; use a real diode if you need a serious TRNG*). The physical origin of the diode noise is the unpredictable movement of electrons during avalanche breakdown, which is ultimately also quantum mechanical. Unfortunately, this noise can only be measured on a macroscopic scale, and all we can see is a fuzzy analog signal on an oscilloscope, not unlike an analog radio or TV tuned to a dead channel. Worse, it's not a perfect white noise source: at low frequency, semiconductor devices have a 1/f noise spectrum, and most practical electronic components also generate "excess noise" above the theoretical level due to physical imperfections.

How to convert these raw voltages into an unbiased binary sequence is not obvious. The simplest solution is comparing the voltage against an arbitrary threshold. Because the underlying physical process is unpredictable, we know the output bitstream contains some randomness; but again, it's a weak form of randomness: highly biased and low-entropy.
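A hedged sketch of why thresholding is biased, with Gaussian noise standing in for the real diode signal and a made-up 0.05σ DC offset standing in for amplifier drift: even a tiny mismatch between the comparator threshold and the signal's true median skews the bitstream.

```python
import random

random.seed(2)  # reproducible demo

OFFSET = 0.05  # hypothetical DC drift, in units of the noise's std dev

# 200,000 "analog" samples: Gaussian noise around a slightly drifted mean.
samples = [random.gauss(OFFSET, 1.0) for _ in range(200_000)]

# Compare each sample against a fixed threshold at zero.
bits = [1 if v > 0.0 else 0 for v in samples]
ones = sum(bits) / len(bits)
print(f"fraction of 1-bits: {ones:.4f}")  # about 0.52, not 0.50
```

A real circuit has it worse: the offset drifts with temperature and aging, so even a carefully calibrated threshold won't stay at the median.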

Other sources of randomness include:

- Thermal noise from resistors, also known as Johnson-Nyquist noise. It's produced by the random thermal motion of electrons within a resistor at any temperature above absolute zero.

- Radio noise in the shortwave band from the atmosphere, mainly a result of lightning strikes, but also influenced by solar activity, the seasons, and man-made interference. This noise source is famously used by random.org.

- The jitter (phase noise) of an oscillator. For example, relaxation oscillators are notoriously unstable, with significant cycle-to-cycle jitter. Another idea is to start two different crystal oscillators at the same nominal frequency and see whether the first oscillator is leading or lagging the second due to jitter. The "truerand" algorithm from some old IBM PC programs was based on this idea: counting the number of CPU clock cycles within a fixed RTC timer interval.

- The output waveform of a chaotic circuit, such as Chua's circuit, which is essentially an unstable oscillator.

- The metastable resolution time of a flip-flop. A digital logic circuit is just a nonlinear analog amplifier that pushes a signal strongly towards the "1" or "0" level, but one can always construct a special input that makes it output "1/2", like balancing a knife on its edge. This metastable state eventually collapses to a binary output, and the time it takes is known only statistically, and is thus unpredictable. This is allegedly how Intel CPUs implement the RDRAND instruction.

- Your idea here.
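The truerand idea from the oscillator-jitter item above can be sketched in software. This is only a model, not the original algorithm: it busy-counts against Python's `time.perf_counter()` instead of counting real CPU cycles against an RTC, and keeps the least-significant bit of each count, which is where the drift between the two clocks shows up.

```python
import time

def truerand_bit(interval_s=0.001):
    # Busy-count until a fixed "slow clock" interval elapses, then keep
    # the least-significant bit of the count. Scheduling noise and clock
    # drift make the low bit wobble from one interval to the next.
    count = 0
    deadline = time.perf_counter() + interval_s
    while time.perf_counter() < deadline:
        count += 1  # stand-in for counting CPU clock cycles
    return count & 1

bits = [truerand_bit() for _ in range(64)]
print("".join(map(str, bits)))
```

As with all the sources above, the raw bits from this scheme are biased and correlated; they'd still need the whitening step described below before use.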

But all of these methods share the same problem: not only is the exact probability distribution unknown, it's not even clear how to convert these signals into an unbiased binary sequence. In some special cases it may be possible to create a physical model of the system; in others it's impractical.

Therefore, in practice, instead of trying to obtain perfect random numbers from first principles, we simply collect a large amount of the heavily biased raw output, often generated by a somewhat arbitrary method. When enough raw output has been collected, it's fed into a hash function to obtain an unbiased binary sequence; this process is usually called "whitening" or "randomness extraction". Even if the raw output is biased, as long as it contains sufficient entropy (which can be estimated statistically during the design process), the system is secure.
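A minimal sketch of that whitening step, with a simulated biased source standing in for real hardware. The 80% bias and the 4:1 input-to-output ratio are illustrative assumptions, not design rules; a real design sizes the ratio from a measured entropy estimate of its source.

```python
import hashlib
import random

random.seed(3)  # stands in for a real physical source

def biased_bits(n, p=0.8):
    # Simulated raw source: emits a 1 with probability p, one bit per byte.
    return bytes(1 if random.random() < p else 0 for _ in range(n))

raw = biased_bits(32 * 8 * 4)          # 1024 raw bits: 4 per output bit
digest = hashlib.sha256(raw).digest()  # 256 "whitened" output bits
print(digest.hex())
```

The hash doesn't create entropy; it only condenses whatever entropy the 1024 raw bits contain into 256 output bits, which is why collecting a comfortable excess of raw input matters.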