Let $(X, Y, Z)$ be a triple of binary strings of length $n$. Let $X$ be a uniformly random private key used to encode (or decode) a uniformly random message $Y$ as $Z$. Let the encryption algorithm $m$ be a matching function: for $m(X, Y) = Z$, $z_i = 1$ if $x_i = y_i$, and $z_i = 0$ otherwise (i.e., bitwise XNOR, which makes $m$ its own inverse: $m(X, m(X, Y)) = Y$). Now, $m$ is clearly not a one-way function: given an output $Z$, it is trivial to exhibit some pair $(X', Y')$ such that $m(X', Y') = Z$. However, obtaining the particular message from the encrypted message without the private key is provably hard.
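For concreteness, here is a minimal sketch of $m$ in Python; representing the bit strings as integers, and the names `m`, `key`, `msg`, and `ct`, are my own choices for illustration, not part of the scheme:

```python
import secrets

def m(x: int, y: int, n: int) -> int:
    """Matching function on n-bit strings: bit i of the result is 1
    exactly when x and y agree at bit i (bitwise XNOR)."""
    mask = (1 << n) - 1
    return ~(x ^ y) & mask

n = 8
key = secrets.randbits(n)    # random private key X
msg = secrets.randbits(n)    # random message Y
ct = m(key, msg, n)          # ciphertext Z = m(X, Y)
assert m(key, ct, n) == msg  # m is its own inverse, so the key decodes Z
```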
Reusing the key repeatedly does not compromise it, given that $Z$ provides no information about $(X, Y)$. To wit, suppose an eavesdropper collects a set of $k < 2^n$ distinct encrypted messages $Z$, all encrypted under the same $X$. Because each message $Y$ is uniformly random and $m(X, \cdot)$ is a bijection for any fixed $X$, each $Z$ is uniformly random as well; the collection is just a random subset of all $2^n$ possible $n$-bit strings, so it tells the eavesdropper nothing about $X$, and therefore nothing about the messages $Y$. Thus, if my reasoning is correct, the strength of the encryption is limited only by $n$.
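A quick empirical check of the uniformity claim, under the same integer representation as in the sketch above (an illustration, not a proof):

```python
from collections import Counter
import secrets

def m(x: int, y: int, n: int) -> int:
    # Bitwise XNOR of two n-bit strings, as defined above.
    return ~(x ^ y) & ((1 << n) - 1)

n = 4
key = secrets.randbits(n)  # one fixed key X, reused for every message

# Encrypt many independent random messages under the same key and
# tally the resulting ciphertexts.
counts = Counter(m(key, secrets.randbits(n), n) for _ in range(160_000))

# Each of the 2^n possible ciphertexts appears about equally often
# (~10,000 times here), whatever the fixed key is: the observed Z
# values are uniform and so carry no information about X.
for z, c in sorted(counts.items()):
    print(f"Z={z:0{n}b}: {c}")
```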
Given the existence of such provably hard problems, why do we rely on supposedly one-way functions that might turn out not to be hard, such as factoring the product of two large primes? Or is an equivalent of this scheme already used in some form I do not recognize?
Note: Corrected to clarify that the original message is random, and that breaking it is not impossible but provably hard.