The updated question, containing the quote, is much less polemical than the original. Indeed, it's just an instance of the general impossibility of proving a negative (in this case: "can certain PQC problems be solved efficiently?"). It's easy to prove that yes, they can be solved efficiently: just exhibit an algorithm that solves them efficiently. Proving that no such algorithm can exist is much harder. In some cases, not even a proof that $P \neq NP$ would suffice, since it only establishes hardness in the worst case; there may still be special cases (including the average-case instances that cryptography actually relies on) for which efficient algorithms exist.
So, since we can't prove that efficient solutions don't exist, the best we can do is show how hard we have tried to find them without success.
It is a fact that PQC schemes, including those based on lattice and code problems, have been less scrutinized than RSA. The Rainbow and SIKE breaks (both PQC schemes, though based on neither of the two problems just mentioned: the first relies on multivariate polynomials, the second on supersingular isogenies of elliptic curves) do not exactly inspire further confidence in PQC.
On the other hand, it bears mentioning that lattice and code problems are not exactly new: McEliece (code-based) was proposed in 1978, NTRU (lattice-based) in 1996, and the learning-with-errors framework, used by Kyber and Dilithium, in 2005. For reference, Diffie-Hellman was published in 1976, RSA in 1977 and elliptic curve cryptography in 1985. Of course, because DH, RSA and ECC were actually adopted in the real world, they received far more scrutiny than their PQC counterparts.
The bright side is that PQC has been an active area of research for over 15 years, and it has received considerably more attention since NIST launched its PQC contest in 2016. So it's not as if cryptographers just came up with new schemes and are throwing them out there untested (as much as Rainbow and SIKE make it look exactly that way). Unfortunately, there is no substitute for the decades of scrutiny that the classical algorithms have received by now. In 10 years we'll probably be more confident in the security of Kyber, Dilithium, NTRU, McEliece and other schemes; in 20 years, much more so.
Thus, if you are choosing between classical and PQC schemes today, you should ask yourself: which threat worries me most? Is it that my protocol will certainly be attacked within a few decades, once quantum computers become practical (and that includes ciphertexts generated today and saved for later decryption)? Or that my protocol is subject to an overnight break, however less likely that becomes with every further day of cryptanalysis, and I don't care too much if my data is decrypted decades from now? If the former, go with PQC; if the latter, with classical cryptosystems.