This is already answered here for one iteration, and here for many iterations. Since the latter answer presents a heuristic argument, I'll quote Lemma 4 of *Functional Graphs and Their Applications in Generic Attacks on Iterated Hash Constructions*, which gives a good approximation based on Theorem 2 of *Random Mapping Statistics* and (3.10) of *On the height of trees*:
Lemma 4. Let $f$ be a random mapping in $\mathcal{F}_N$, where $N = 2^n$. For $k \le 2^{n/2}$, the expected number of $k$-th iterate image nodes in the functional graph of $f$ is
$$
(1 - \tau_k)\cdot N \approx \left(\frac{2}{k} - \frac{2}{3}\frac{\log k}{k^2} - \frac{c}{k^2} - \dots \right) \cdot N\,.
$$
This suggests that $\lim_{k \to \infty} k\cdot (1 - \tau_k) = 2$. Thus, for large $N$ and $k$ with $k \le \sqrt{N}$,
$$
(1 - \tau_k)\cdot N \approx \frac{2N}{k} = 2^{n - \log_2 k + 1}\,,
$$
where $\tau_k$ satisfies the recurrence $\tau_0 = 0, \tau_{k+1} = e^{-1+\tau_k}$, and $c$ is a certain constant.
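As a quick sanity check (my own sketch, not taken from the paper), the recurrence for $\tau_k$ is easy to evaluate numerically, and $k \cdot (1 - \tau_k)$ does indeed creep up toward $2$:

```python
import math

def tau(k):
    """tau_k from the recurrence tau_0 = 0, tau_{k+1} = exp(-1 + tau_k)."""
    t = 0.0
    for _ in range(k):
        t = math.exp(-1.0 + t)
    return t

# k * (1 - tau_k) approaches 2 slowly from below,
# consistent with the negative log(k)/k^2 correction term.
for k in [10, 100, 1000, 10000]:
    print(k, k * (1 - tau(k)))
```

The convergence is slow, which is exactly what the $\frac{\log k}{k^2}$ and $\frac{1}{k^2}$ correction terms in Lemma 4 predict.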
So while yes, some entropy is lost by repeatedly iterating a random function, this loss grows only logarithmically in the number of iterations: after $k$ iterations the image contains about $2N/k$ points, i.e., the output entropy drops from $n$ bits to roughly $n - \log_2 k + 1$ bits. To lose, say, $32$ bits of entropy, you need to iterate the hash around $2^{32}$ times. For large output lengths such as $256$ bits, this kind of loss is negligible for almost all practical purposes.
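The $2N/k$ image-size estimate is also easy to observe empirically at small scale (again my own sketch, not from the answer's references): pick a uniformly random mapping on $N = 2^{16}$ points, iterate it $k = 64$ times, and count the $k$-th iterate image.

```python
import random

random.seed(2024)          # fixed seed for reproducibility; any seed behaves similarly
n = 16
N = 1 << n                 # N = 2^16 points
f = [random.randrange(N) for _ in range(N)]   # a uniformly random mapping on [N]

k = 64
image = set(range(N))
for _ in range(k):         # image of f^k: apply f k times to the whole domain
    image = {f[x] for x in image}

print(len(image))          # empirical k-th iterate image size
print(2 * N // k)          # leading-order prediction (2/k) * N = 2048
```

With these parameters the observed count typically lands within a few percent of $2048$; the small deficit relative to $2N/k$ matches the sign of the correction terms in Lemma 4.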