Hex encoding of the plaintext doubles its size while preserving its entropy; hence the entropy per bit is halved.
For weak encryption algorithms (only), this typically makes attacks easier. For example, the original context was XOR with a repeated key. With a uniformly random key the size of a single plaintext, that is perfectly secure; but it no longer is after hex encoding, because the key is then only half as long as the (encoded) plaintext and must repeat.
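A minimal Python sketch of that setting (the helper name and the example key value are mine, chosen to match the 5-byte example below; this is an illustration, not code from the question):

    from itertools import cycle

    def repeating_key_xor(data: bytes, key: bytes) -> bytes:
        """XOR each byte of data with the key, repeating the key as needed."""
        return bytes(d ^ k for d, k in zip(data, cycle(key)))

    plaintext = b"ALICE"
    hex_plaintext = plaintext.hex().upper().encode()   # b'414C494345': twice as long
    print(len(plaintext), len(hex_plaintext))          # 5 10

    key = bytes.fromhex("1337A55ED1")                  # arbitrary 5-byte example key

    # Key as long as the raw plaintext: equivalent to a one-time pad, perfectly secret.
    c_raw = repeating_key_xor(plaintext, key)
    # Same 5-byte key on the 10-byte hex-encoded plaintext: the key must repeat.
    c_hex = repeating_key_xor(hex_plaintext, key)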
Detailed example, following a comment: the cipher is XOR with a 5-byte uniformly random key, repeated as necessary to match the plaintext length. Without hex encoding, given the ciphertext for a single 5-byte plaintext, we can't deduce anything about the plaintext, e.g. decide whether it is ALICE or OSCAR (in ASCII). But with (uppercase) hex encoding these become 414C494345 or 4F53434152 (in ASCII). These are 10 bytes each, so the same key byte must be reused for the 1st and 6th characters. Those characters are 4 and 9 (34h and 39h) for ALICE, versus 4 and 3 (34h and 33h) for OSCAR. Thus we need only XOR the 1st and 6th bytes of the ciphertext: we get 0Dh (34h XOR 39h) for the plaintext ALICE, versus 07h (34h XOR 33h) for OSCAR. More generally, the XOR of the j-th and (j+5)-th ciphertext bytes is independent of the key, and characteristic enough of the plaintext that this is a serious issue.
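A toy demonstration of this distinguisher (again a hedged sketch with helper names of my choosing): ciphertext[0] XOR ciphertext[5] takes the same value for every key, so it separates the two candidate plaintexts.

    import os
    from itertools import cycle

    def repeating_key_xor(data: bytes, key: bytes) -> bytes:
        """XOR data with the key, repeating the key as needed (the cipher assumed above)."""
        return bytes(d ^ k for d, k in zip(data, cycle(key)))

    def key_independent_value(ciphertext: bytes) -> int:
        # Bytes 0 and 5 were XORed with the same key byte, so the key cancels out.
        return ciphertext[0] ^ ciphertext[5]

    for name in (b"ALICE", b"OSCAR"):
        hexed = name.hex().upper().encode()          # b'414C494345' or b'4F53434152'
        seen = set()
        for _ in range(1000):                        # many independent random 5-byte keys
            ct = repeating_key_xor(hexed, os.urandom(5))
            seen.add(key_independent_value(ct))
        print(name, [hex(v) for v in sorted(seen)])  # ALICE -> ['0xd'], OSCAR -> ['0x7']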
Rebuttal of this comment: in the above example, the addition of hex encoding does NOT change the size of the effective plaintext, which remains the same 5 characters. The example does illustrate that, all other things being equal, encoding the plaintext in hex tends to weaken an already weak cipher. At the expense of a much more complex example, it would be possible to come up with a cipher with a fixed-size key/password that is practically secure for arbitrarily large plaintext with the usual level of redundancy of English, but not for the increased redundancy of hex-encoded text.
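A crude way to see that increased redundancy (a zeroth-order illustration only; the sample sentence and function name are mine): hex output is drawn from 16 symbols, so its per-character entropy is capped at log2(16) = 4 bits, while the same information is spread over twice as many characters.

    import math
    from collections import Counter

    def entropy_per_char(s: str) -> float:
        """Empirical Shannon entropy, in bits per character, of the symbol distribution of s."""
        counts, n = Counter(s), len(s)
        return -sum(c / n * math.log2(c / n) for c in counts.values())

    sample = "the quick brown fox jumps over the lazy dog"   # arbitrary English sample
    hexed = sample.encode().hex()

    print(entropy_per_char(sample))  # around 4.4 bits/char for this sample
    print(entropy_per_char(hexed))   # at most log2(16) = 4 bits/char, and lower here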