The question's title (now) asks "What are the first bits of a bitstring". That's unambiguous if the bits are presented individually in chronological order, or written in a context where there is a conventional reading order, like left to right.
But the body of the question is about "a string that generates from sha256". That's to be read as SHA-256, defined by FIPS 180-4. It's an algorithm that outputs a "bit string" of $256$ bits, and those bits are not immediately identifiable.
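To make that concrete, here is a minimal sketch in Python (using the standard hashlib module; the message hashed is an arbitrary assumption of mine) showing that the output of SHA-256 is $32$ bytes, that is $256$ bits:

```python
import hashlib

# Hash an arbitrary example message (chosen for illustration only).
digest = hashlib.sha256(b"example message").digest()

print(len(digest))      # 32 bytes
print(len(digest) * 8)  # 256 bits
```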
The body of the question has `x =` followed by $64$ characters, each either a digit or a letter from `a` to `f` (in uniform case, here lowercase); that's an alphabet of $16=2^4$ characters. This hints that the result of SHA-256 is encoded in hexadecimal, with each character encoding $4$ bits (notice $256=64\times4$). This is one of several common representations of bitstrings as characters.
There are several different and incompatible ways to convert hexadecimal to bits, but fortunately, in the case of SHA-256, one is specified in section 3.1, subparagraph 2, on the page marked 7 of FIPS 180-4 (a must-read). In summary, the most significant bits come first, be it at the nibble (4-bit), byte (8-bit), or word (32-bit, in the case of SHA-256) level.
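As a quick illustration of that convention (a Python one-liner; the hexadecimal digit is chosen arbitrarily), the character `b` stands for the four bits `1011`, most significant bit first:

```python
# One hexadecimal character encodes 4 bits, most significant bit first.
print(format(int("b", 16), "04b"))  # prints 1011
```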
Thus, to find the first $i$ bits of the given SHA-256 hash, we can take its first $\lceil i/4\rceil$ hexadecimal characters (reading left to right), convert each to its $4$-bit binary value with the most significant bit first, concatenate those groups, and keep the first $i$ bits.
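Here is a sketch of that procedure in Python (the message hashed below is an arbitrary assumption of mine, so the digest is not the one in the question):

```python
import hashlib

def first_bits(hex_digest: str, i: int) -> str:
    """Return the first i bits of a hash given in hexadecimal,
    reading each character as 4 bits, most significant bit first."""
    bits = "".join(format(int(c, 16), "04b") for c in hex_digest)
    return bits[:i]

# Arbitrary example message; hexdigest() gives the 64 lowercase hex characters.
x = hashlib.sha256(b"example message").hexdigest()
print(x)
print(first_bits(x, 10))  # the first 10 bits of the 256-bit hash
```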
This applies to all hashes of the SHA family (replacing $256$ with their output width). Absent other specification, it's reasonable to apply it to other standard hashes whose output width is a multiple of $32$ (or even $8$ or $4$) bits. That's debatable for MD5, since it uses a little-endian convention for byte order within a $32$-bit word. And I would not blindly extend it to other quantities used in cryptography that are also represented in hexadecimal, such as the integers used in RSA.