To preserve information, you need to make the output larger.
The maximal amount of information we can preserve is all of it.
This can be achieved by, e.g., the identity function $h(m) = m$.
Obviously this provides no compression at all.
Information theory teaches us that we cannot compress general data: some data is compressible by some functions, but no function can compress all data.
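As a sketch of the standard counting (pigeonhole) argument: suppose some function mapped every $n$-bit message to an output strictly shorter than $n$ bits. Then

$$\left|\{0,1\}^n\right| = 2^n \;>\; \sum_{k=0}^{n-1} 2^k = 2^n - 1 = \left|\bigcup_{k=0}^{n-1}\{0,1\}^k\right|,$$

so two distinct messages must share an output, and at least one of them cannot be recovered.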
If you are looking for lossy compression, you need to decide which information is less important. This is fairly well understood for images, video, and sound, but not for general data.
For general data, if you don't care what you lose, you can truncate the message and preserve some of its information; for information-theoretic reasons you won't do better than that, since a hash output of $n$ bits can carry no more than $n$ bits of information.
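A minimal sketch of that idea (the function name and the 8-byte output length are arbitrary choices for illustration, not anything standard):

```python
def truncated_hash(message: bytes, n_bytes: int = 8) -> bytes:
    """Keep only the first n_bytes bytes of the message.

    The output is at most 8 * n_bytes bits long, so it can never carry
    more than 8 * n_bytes bits of information about the message.
    """
    return message[:n_bytes]


# Example: a 64-bit "hash" that preserves the first 8 bytes exactly
print(truncated_hash(b"hello world, this is a long message"))  # b'hello wo'
```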
If you are hoping to keep a lot of information about non-random messages of unknown structure, cryptographic hash functions are very good. Apply, e.g., SHA3-256 to an arbitrary compressible message and you are likely to get very close to 256 bits of information about it.
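For instance, using Python's standard hashlib (a sketch; the repeated-byte message is just a stand-in for any highly compressible input):

```python
import hashlib

# A highly compressible (very non-random) one-megabyte message
message = b"a" * 1_000_000

digest = hashlib.sha3_256(message).digest()
print(len(digest) * 8)   # 256 bits of output
print(digest.hex())
# For inputs like this, the 256-bit digest is expected to behave like
# roughly 256 bits of information about the message: it shows no obvious
# structure and cannot be predicted without hashing the message itself.
```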
Sadly (or luckily), finding the set of possible messages producing such a hash, or saying anything interesting about them, is beyond us.