Late to the party, and I can see the other relatively complex answers, but I’d like to comment on the OP’s potentially misleading use of the term “character”.
“I know that a 128-bit hash contains 32 characters since each represents a hexadecimal.”
No.
You can display an arbitrary 128-bit binary value however you want: as 128 binary digits (each 0 or 1); as 43 octal digits 0-7 (the leading digit only carries two bits, since 128 isn’t a multiple of 3); as 32 hex digits 0-9/A-F; as a sequence of random hieroglyphics; or whatever. But none of those define a “character” in my opinion.
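For instance, here’s a quick Python sketch of the same value rendered in each of those bases (the 128-bit value itself is just an arbitrary example I made up):

```python
# An arbitrary 128-bit value, e.g. as produced by a 128-bit hash.
value = 0x0123456789ABCDEF0123456789ABCDEF

print(format(value, "0128b"))  # 128 binary digits
print(format(value, "043o"))   # 43 octal digits (ceil(128 / 3))
print(format(value, "032x"))   # 32 hex digits (128 / 4)
```

Same value, three different digit counts, and still no “characters” in sight.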
Traditionally, “character” meant a 7-bit ASCII character. These were generally stored with an extra parity bit, for 8 bits per character. So a 128-bit value could store 128/8 = 16 such characters.
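As a rough Python illustration of that arithmetic (the sample string is my own, and I’m assuming one byte per ASCII character):

```python
text = "sixteen chars ok"       # exactly 16 ASCII characters
raw = text.encode("ascii")      # one byte (8 bits) per character
assert len(raw) * 8 == 128      # 16 such characters fill exactly 128 bits
```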
The modern Unicode scheme defines an enormous number of characters, which can be “encoded” (i.e. stored in binary) in various different ways. For example, in UTF-32 encoding, each Unicode character takes exactly 32 bits, so a 128-bit value can only store 128/32 = 4 such characters. But in UTF-8 encoding, a character can take anywhere from 8 to 32 bits (1 to 4 bytes), depending on the character!
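A small Python sketch makes the difference concrete (the sample strings are my own; I’m using utf-32-be so Python doesn’t prepend a byte-order mark):

```python
# Every 4-character string takes exactly 128 bits in UTF-32,
# but anywhere from 32 to 128 bits in UTF-8.
for s in ["abcd", "café", "\N{GRINNING FACE}" * 4]:
    utf32 = s.encode("utf-32-be")  # fixed 32 bits per character
    utf8 = s.encode("utf-8")       # 8 to 32 bits per character
    print(f"{s!r}: UTF-32 = {len(utf32) * 8} bits, UTF-8 = {len(utf8) * 8} bits")
```

Four ASCII characters fit in 32 bits of UTF-8, while four emoji need the full 128 bits, yet both are “4 characters”.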
Alternatively, someone might have an alphabet of 2^128 unique characters, in which case a 128-bit value could only store a single character!
Thus, you just can’t say how many characters can be stored in ‘x’ bits: it depends entirely on what you mean by a character, what encoding you’re using to represent those characters, and, for certain encodings, exactly which characters you’re actually trying to store!
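To put some numbers on that spread, here’s a minimal Python sketch of how many whole characters fit in 128 bits under a few of the schemes discussed above (the bits-per-character figures are the ones assumed in this answer):

```python
BITS = 128

# Bits per character under each scheme; the 2**128-symbol alphabet
# is the hypothetical one from above.
schemes = {
    "binary digits": 1,
    "ASCII (7 bits + parity)": 8,
    "UTF-32": 32,
    "2**128-symbol alphabet": 128,
}
for name, bits_per_char in schemes.items():
    print(f"{name}: {BITS // bits_per_char} character(s)")
```

That prints 128, 16, 4, and 1 character(s) respectively, for the very same 128 bits.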
So a 128-bit value, which you say “contains 32 characters since each represents a hexadecimal [digit]”, might in fact contain anywhere from one to 128 characters, using what I believe to be a reasonable modern definition of the term “character”.