Is there a lossless compression algorithm with hashing-like properties, in the sense that a given input has multiple valid outputs?
For example: when a 1000-bit data sequence is compressed into a 500-bit data sequence, there would be multiple possible 500-bit sequences that are valid outputs. Each of these 500-bit sequences, once decompressed, would yield the original 1000-bit sequence. Additionally, upon decompression, any one of the 500-bit sequences would itself have multiple valid 1000-bit solutions (around 2^1000/2^500 = 2^500 of them). Collisions are fine; indeed, without collisions it would be impossible for multiple 500-bit sequences to be valid compression outputs of the same 1000-bit sequence.
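To make the kind of many-to-many relation I mean concrete, here is a toy sketch (my own hypothetical construction, on tiny bit-sizes, not a real codec): a relation that only preserves the low K bits, so that both compression and decompression have many valid outputs and decompression is stochastic without extra help.

```python
import random

# Toy many-to-many "compression" relation: R(x, y) holds iff x and y
# agree on their low K bits. Everything above bit K is free, so both
# directions have multiple valid outputs. (Hypothetical sketch only.)

N, M, K = 10, 6, 4  # original bits, compressed bits, preserved bits

def compress(x):
    # Any M-bit y whose low K bits match x is valid: 2**(M-K) choices.
    high = random.randrange(2 ** (M - K))
    return (high << K) | (x % (2 ** K))

def decompress(y):
    # Any N-bit x whose low K bits match y is valid: 2**(N-K) choices,
    # so decompression is stochastic without extra help.
    high = random.randrange(2 ** (N - K))
    return (high << K) | (y % (2 ** K))

x = random.randrange(2 ** N)
y = compress(x)
x2 = decompress(y)
assert x % (2 ** K) == y % (2 ** K) == x2 % (2 ** K)
```

Of course, this toy only carries K bits of real information through the round trip, which is exactly the pigeonhole tension I am asking about: I want to know whether something with this shape can exist while keeping all of the original data recoverable.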
The number of "pigeonholes" available for compression and decompression doesn't need to be symmetric. Also, it may not be appropriate to call an algorithm "lossless" when it produces multiple valid outputs on compression and multiple valid same-sized outputs on decompression. But all the original data would still be there; the algorithm would just generate additional data on top of it, and that additional data can be dealt with (decompression would be stochastic without extra help).
One more thing: the compression degree should be customizable (e.g., from 1000 bits to 900 bits, or from 1000 bits to 950 bits), and likewise for decompression. Otherwise, if such an algorithm requires fixed bit-size-change steps, those steps need to be symmetric and fine-grained enough to allow repeated application, e.g., 1000-bit to 900-bit to 800-bit to 700-bit, and so on, with the decompression process backtracking through those exact sizes.
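The fixed-step variant I have in mind could be sketched like this (again a hypothetical toy: each step drops the top bit on compression and pads a random top bit back on decompression, so the steps are symmetric, fine-grained, and repeatable):

```python
import random

# Toy fixed-step scheme: each compression step is a 1-bit size change,
# and decompression backtracks through the exact same sizes, padding a
# random top bit each time (stochastic). Hypothetical sketch only.

def compress_step(x, n):
    # n-bit -> (n-1)-bit: keep the low n-1 bits.
    return x % (2 ** (n - 1))

def decompress_step(y, n):
    # (n-1)-bit -> n-bit: either value of the new top bit is valid.
    return (random.randrange(2) << (n - 1)) | y

x = 0b1011001110  # a 10-bit example
y = x
for n in range(10, 7, -1):   # 10-bit -> 9-bit -> 8-bit -> 7-bit
    y = compress_step(y, n)
for n in range(8, 11):       # 7-bit -> 8-bit -> 9-bit -> 10-bit
    y = decompress_step(y, n)
assert y % (2 ** 7) == x % (2 ** 7)  # the low 7 bits survive the round trip
```

Here only the low 7 bits are guaranteed after the round trip; what I am asking is whether some algorithm can give this step structure without discarding information the way this toy does.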