> When a string like "Hello World" is encrypted, do the algorithms convert this string into binary or ASCII?
Well, the answer depends on whether you're talking about historical or modern cryptography.
Historically, encryption methods typically did take the string as a series of characters, and transformed them that way. Sometimes they used a mapping, say, "A" -> 0, "B" -> 1, etc.; however, that was generally internal to the algorithm. They generally didn't convert the characters to binary (which back then would have been seen as an odd choice), and certainly not to ASCII (which didn't exist back then).
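To make that internal mapping concrete, here's a minimal sketch using a simple Caesar shift; the "A" -> 0, "B" -> 1 mapping exists only inside the function and never appears in the ciphertext:

```python
# Classical approach: map letters to numbers internally, transform,
# and map back to letters. The mapping never leaves the algorithm.
def caesar_encrypt(plaintext: str, shift: int) -> str:
    result = []
    for ch in plaintext.upper():
        if ch.isalpha():
            n = ord(ch) - ord("A")          # "A" -> 0, "B" -> 1, ...
            result.append(chr((n + shift) % 26 + ord("A")))
        else:
            result.append(ch)               # leave spaces/punctuation alone
    return "".join(result)

print(caesar_encrypt("Hello World", 3))     # KHOOR ZRUOG
```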
On the other hand, modern encryption algorithms generally [1] take strings of bits or bytes as input, and generate a string of bits or bytes as output. To encrypt a string, you first convert it into such a byte string (whether via ASCII, Unicode, or EBCDIC, the algorithm doesn't care), and the algorithm works on that.
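Here's a short sketch of that flow, assuming the Python `cryptography` package (`pip install cryptography`): the cipher only ever sees bytes, so the string is encoded first (UTF-8 here; any encoding the two parties agree on would do):

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # random 256-bit key
nonce = os.urandom(12)                      # 96-bit nonce; never reuse with the same key
aesgcm = AESGCM(key)

plaintext_bytes = "Hello World".encode("utf-8")   # string -> bytes
ciphertext = aesgcm.encrypt(nonce, plaintext_bytes, None)

decrypted = aesgcm.decrypt(nonce, ciphertext, None)
print(decrypted.decode("utf-8"))            # Hello World
```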
> Also, if this is a file, can ASCII still be used to encrypt data?
Well, a file of ASCII characters can be viewed as a string of bytes, and so modern encryption algorithms would handle it just fine.
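For example (same package as above, with a hypothetical file name `message.txt`), reading the file in binary mode yields exactly the byte string the cipher expects, ASCII contents or not:

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)

# The file's contents, ASCII or otherwise, are just a byte string.
with open("message.txt", "rb") as f:        # "message.txt" is a placeholder
    file_bytes = f.read()

ciphertext = AESGCM(key).encrypt(nonce, file_bytes, None)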
[1]: One exception: Format Preserving Encryption schemes, which generally look at the input as a base-$b$ string, where $b$ might not be a power of two.
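A toy illustration of that base-$b$ view (here $b = 10$), showing only how such a scheme sees its input; the actual encryption step (e.g., the Feistel rounds in FF1) is omitted:

```python
# Not a real FPE scheme: this only shows the input representation.
DIGITS = "0123456789"                       # the alphabet, so b = 10

def to_symbols(s: str) -> list[int]:
    return [DIGITS.index(ch) for ch in s]   # "4111..." -> [4, 1, 1, 1, ...]

def from_symbols(symbols: list[int]) -> str:
    return "".join(DIGITS[n] for n in symbols)

symbols = to_symbols("4111111111111111")    # a base-10 string of length 16
assert from_symbols(symbols) == "4111111111111111"
```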