For the following answer I'll assume that the hashing is performed twice over the data (a 2-pass system) and that the final output is the concatenation of the two hash outputs, i.e.:
$$\text{H}'(m) = \text{H}_1(m) \| \text{H}_2(m)$$
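As a rough sketch of what this could look like in Python (the choice of `hashlib` with SHA-256 and SHA3-256 standing in for $\text{H}_1$ and $\text{H}_2$, and the function name, are just illustrative assumptions, not part of the question):

```python
import hashlib

def combined_hash(data: bytes) -> bytes:
    """Sketch of H'(m) = H1(m) || H2(m); SHA-256 and SHA3-256 are
    assumed stand-ins for H1 and H2."""
    h1 = hashlib.sha256(data).digest()
    h2 = hashlib.sha3_256(data).digest()
    return h1 + h2  # concatenate the two 32-byte digests -> 64 bytes

# Example: 64-byte combined digest of a small message
print(combined_hash(b"example file contents").hex())
```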
> Along with the metadata, I want to capture the hash of each file. I am aware of the collision problem and the possibility of substituting a file with a different one that has the same hash.
Any hash function that is unbroken at the time of evaluation does not have this problem. In other words, if this problem is present, then the hash function's collision resistance has been broken and the function is considered broken.
The older hash functions MD5 and SHA-1 have had their collision resistance broken: MD5 collisions can be generated almost instantly, while SHA-1 collisions still require a lot of computation - although having colliding data files is a problem all in itself.
> Does using multiple hashes increase the complexity of implementing such an attack? Or does it not make sense?
It does make some sense. There is precedent for this kind of scheme in a well-known protocol: older versions of SSL/TLS (as implemented by e.g. OpenSSL) used signatures over both MD5 and SHA-1, which were already considered problematic at the time.
> If this is the case, how can I try to prove the usefulness/uselessness of using multiple hashes for my purpose mathematically?
I don't think you can create a mathematical proof of this to any useful degree. The premise would be that one of the hash algorithms gets broken while the other is not affected by this unknown attack. Of course you can start by proving that an attacker would need a collision on both, but that takes hardly any effort and can be understood without any math.
Beware that concatenating two hash outputs may not be as secure as you would expect. In short, it is not as strong as the combined output size may indicate: for iterated (Merkle-Damgård) hashes, Joux's multicollision attack shows that finding a collision on the concatenation costs little more than finding one on the stronger of the two functions. So the construction may be useful as insurance against one of the algorithms being broken, but not so much for providing higher security in bits.
For very high, long-term assurance you could combine a SHA-2 hash with a SHA-3 / SHAKE hash. Whether the double-pass mechanism and doubled output size are worth it is highly questionable, though. We have pretty high confidence in the current hash functions, especially since SHA-2 held up fine during the SHA-3 competition. If SHA-3 is too slow, you might want to consider BLAKE3 as well.
For normal use, hashing with SHA-256 is good enough; it is also fast, as it is accelerated in hardware by many modern processors, which is perfect for file hashing.
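A minimal sketch of such file hashing in Python, assuming `hashlib`'s SHA-256 and chunked reads (the function name and chunk size are arbitrary choices for illustration):

```python
import hashlib

def sha256_file(path: str, chunk_size: int = 1 << 20) -> str:
    """Hash a file with SHA-256, reading it in 1 MiB chunks so large
    files never need to be loaded into memory at once."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# e.g. sha256_file("document.pdf")
```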