GPUs or dedicated hardware can calculate most things much, much faster than regular computers ever could. Password hashing algorithms such as scrypt and Argon2 narrow that gap, but a powerful dedicated Argon2 machine or cluster is still far more efficient than an old budget smartphone.
My thought was that a request over the internet is one thing that is not much faster for a powerful system than for a weak one. So could one store all required salts on a completely separate (trusted) server and request them from it (without storing or caching the responses) each time a new hash needs to be calculated? The idea is that an attacker would be forced to make a number of requests to that server, because the random salts are too large to brute-force.
Reference implementation:
import hashlib
import requests

username = "us3rn@m3"
password = "p@ssw0rd"
n = 1000  # number of round trips to the salt server

tmp = hashlib.sha256(username.encode()).hexdigest()
for i in range(n):
    # fetch the salt for the current chain value; the response is never cached
    salt = requests.get("https://other.server/" + tmp).text
    tmp = hashlib.sha256((salt + username + tmp).encode()).hexdigest()
password_hash = hashlib.sha256((salt + username + password).encode()).hexdigest()
The other server keeps a database: when a request value has not been seen before, it generates and stores a new random salt; for a value it has seen before, it returns the stored salt.
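For illustration, a minimal sketch of such a salt server, assuming Python's standard library and an in-memory dict standing in for the database (the port and the 32-byte salt length are arbitrary choices, not part of the scheme):

import secrets
from http.server import BaseHTTPRequestHandler, HTTPServer

salts = {}  # in a real deployment this would be a persistent database

class SaltServer(BaseHTTPRequestHandler):
    def do_GET(self):
        key = self.path.lstrip("/")
        # generate a fresh random salt the first time a key is seen,
        # return the stored one on every later request
        if key not in salts:
            salts[key] = secrets.token_hex(32)
        body = salts[key].encode()
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

HTTPServer(("", 8000), SaltServer).serve_forever()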
Could this technique (given a proper specification and implementation) be relied upon to slow down hash evaluation, as an alternative to memory-hard functions?
Edit:
I can't see any obvious issues with the approach as long as the two servers are not compromised together. If an attacker got access to both databases, however, they could reconstruct the salts locally and attack the password hashes offline. That is a big drawback of this approach.
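To make that drawback concrete, here is a hypothetical offline guess check an attacker with both databases could run (the names check_guess, salt_db and stored_hash are mine, not part of the scheme):

import hashlib

def check_guess(username, guess, salt_db, stored_hash, n):
    # walk the salt chain using the leaked salt database
    tmp = hashlib.sha256(username.encode()).hexdigest()
    for _ in range(n):
        salt = salt_db[tmp]  # local lookup replaces the network round trip
        tmp = hashlib.sha256((salt + username + tmp).encode()).hexdigest()
    candidate = hashlib.sha256((salt + username + guess).encode()).hexdigest()
    return candidate == stored_hash

With the salt database in hand, every round trip becomes a cheap local lookup, so the intended slowdown disappears.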