TL;DR:
The entropy of a cryptographic key $K$ cannot be computed for a single key in isolation; it is a property of the mechanism that generated the key.
Explanation:
Entropy is a function of a probability distribution. Assuming you mean the most common measure, Shannon entropy: given a key $K\in \{0,1\}^b$ randomly generated from the set of $b$-bit strings according to some probability distribution
$$
P(x)=\Pr\{K=x\},\quad x\in \{0,1\}^b,
$$
then the entropy of a key drawn from this distribution is
$$
H(K)=-\sum_{x \in \{0,1\}^b} P(x) \log_2 P(x)\quad\textrm{bits}.
$$
If the distribution of $K$ is uniform, this entropy attains its maximum value of $b$ bits.
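A minimal sketch of this computation, using a hypothetical toy example with $b=4$: the uniform distribution over all $2^b$ strings attains the maximum of $b$ bits, while any biased distribution falls short of it.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy, in bits, of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

b = 4  # toy key length in bits

# Uniform distribution over all 2**b keys: entropy is exactly b bits.
uniform = [1 / 2**b] * 2**b
print(shannon_entropy(uniform))  # 4.0

# A biased generator (one key chosen half the time): strictly less entropy.
biased = [0.5] + [0.5 / (2**b - 1)] * (2**b - 1)
print(shannon_entropy(biased))   # < 4.0
```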
If the key is generated by applying a deterministic algorithm or function to a randomly chosen SEED, the entropy of the output key is at most the entropy of the SEED; it equals the seed's entropy exactly when the mapping from seeds to keys is injective (no two seeds produce the same key).
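This can be checked exactly on a deliberately tiny, hypothetical setup: an 8-bit seed expanded by SHA-256 into a 256-bit key. Enumerating every seed gives the exact output distribution, whose entropy is bounded by the 8 bits of the seed regardless of the 256-bit output length.

```python
import hashlib
import math
from collections import Counter

SEED_BITS = 8  # toy seed size, chosen so we can enumerate all seeds

# Deterministic expansion: each seed maps to one 256-bit key.
keys = Counter(hashlib.sha256(bytes([s])).digest()
               for s in range(2**SEED_BITS))

# Each seed is equally likely, so P(key) = (# seeds mapping to it) / 2**SEED_BITS.
n = 2**SEED_BITS
entropy = -sum((c / n) * math.log2(c / n) for c in keys.values())
print(entropy)  # at most SEED_BITS = 8 bits, despite the 256-bit output
```

With 256 distinct inputs to SHA-256 a collision is overwhelmingly unlikely, so the mapping is injective in practice and the entropy comes out as exactly 8 bits.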
PS: One can define the uncertainty of a single object, such as a key, by means of Kolmogorov complexity, a theoretical measure: the length of the shortest program that makes a universal Turing machine (UTM) output that key and then halt. This complexity is uncomputable, but it can be approximated.
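One common crude approximation, sketched here with `zlib` on hypothetical keys: the compressed length of a string is (up to a constant) an upper bound on its Kolmogorov complexity. A random key is essentially incompressible, while a highly patterned "key" compresses drastically.

```python
import os
import zlib

random_key = os.urandom(1024)      # 1024 random bytes: no exploitable pattern
patterned_key = b"\x00" * 1024     # 1024 identical bytes: trivially describable

# Compressed length upper-bounds Kolmogorov complexity (plus a constant).
print(len(zlib.compress(random_key, 9)))     # close to 1024: incompressible
print(len(zlib.compress(patterned_key, 9)))  # tiny: low complexity
```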