No, it's not possible to generate a private key (or key pair) from biometric data acquired by a sensor, and regenerate the same one with a different device that has no communication whatsoever with the original one.
That's true:
- For fingerprints, blood vessel maps, retina scans; for any biometric, even if we have lots of biometric data (fingerprints of many fingers…).
- By extension, for any computer acquisition of naturally diverse/noisy data: the position of fibers in a sheet of paper, defects in a crystal, even the power-up state of static RAM (which is inherently digital).
- Even if we use the same acquisition device after a zeroization: the problem is not sensor-to-sensor variability, it's noise in the acquisition.
- Even if we want only a stable identifier with no absolute requirement for uniqueness (e.g. a 40-bit bitstring).
The problem is: we won't get something reproducible unless we have some error-correction data to counter the noise, and that data needs to be stored.
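To illustrate, here is a toy Python sketch (the "biometric" is simulated and the bit-flip noise model is purely made up for the example): hashing two noisy readings of the same biometric directly yields unrelated keys, because any flipped bit changes the hash completely.

```python
# Toy illustration (not a real biometric pipeline): two readings of the
# same simulated "biometric" differ by a few noisy bits, so hashing them
# directly yields unrelated keys.
import hashlib, secrets

def noisy_reading(template: bytes, flipped_bits: int = 3) -> bytes:
    """Simulate a sensor reading: the true template with a few random bit flips."""
    data = bytearray(template)
    for _ in range(flipped_bits):
        pos = secrets.randbelow(len(data) * 8)
        data[pos // 8] ^= 1 << (pos % 8)
    return bytes(data)

template = secrets.token_bytes(32)   # stand-in for the "true" biometric template
reading1 = noisy_reading(template)
reading2 = noisy_reading(template)

key1 = hashlib.sha256(reading1).digest()
key2 = hashlib.sha256(reading2).digest()
print(key1 == key2)   # almost certainly False: the noise propagates into the key
```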
What we can do, with enough biometric data and some care, is arrange things such that (see the sketch after this list):
- With error-correction data stored and made available independently of the biometric, we get the desired reproducible private key (or identifier).
- The combination of said error-correction data and private key is non-revealing of the biometric.
- Knowledge of both the biometric data and the error-correction data is needed to regenerate the private key, or to match either the biometric data or the error-correction data with the private key.
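Here is a minimal Python sketch of one way this can work, the code-offset construction, using a trivial 5-bit repetition code. Everything in it (the 200-bit simulated biometric, the repetition code, the noise level, the function names) is invented for illustration; a real scheme would use a much stronger error-correcting code and a careful analysis of the entropy remaining once the helper data is public.

```python
# Minimal sketch of the code-offset construction (fuzzy-extractor flavour),
# with a 5-bit repetition code purely for illustration.
import hashlib, secrets

R = 5  # repetition factor: each secret bit is stored 5 times

def encode(bits):
    """Repetition-encode a list of 0/1 bits."""
    return [b for b in bits for _ in range(R)]

def decode(bits):
    """Majority-vote decode; corrects up to 2 flipped bits per group of 5."""
    return [1 if sum(bits[i:i + R]) > R // 2 else 0
            for i in range(0, len(bits), R)]

def xor(a, b):
    return [x ^ y for x, y in zip(a, b)]

def enroll(biometric_bits):
    """Pick a random codeword, publish helper = biometric XOR codeword, derive the key."""
    secret = [secrets.randbelow(2) for _ in range(len(biometric_bits) // R)]
    codeword = encode(secret)
    helper = xor(biometric_bits, codeword)      # error-correction data, may be stored publicly
    key = hashlib.sha256(bytes(secret)).digest()
    return helper, key

def reproduce(noisy_bits, helper):
    """Recover the same key from a noisy reading plus the helper data."""
    codeword_with_noise = xor(noisy_bits, helper)   # = codeword XOR noise
    secret = decode(codeword_with_noise)            # noise removed if sparse enough
    return hashlib.sha256(bytes(secret)).digest()

# Demo with a simulated 200-bit biometric and a few bit flips at reproduction time.
w = [secrets.randbelow(2) for _ in range(200)]
helper, key_enrolled = enroll(w)
w_noisy = list(w)
for pos in secrets.SystemRandom().sample(range(200), 8):   # flip 8 random bits
    w_noisy[pos] ^= 1
print(reproduce(w_noisy, helper) == key_enrolled)   # True if the noise stayed within bounds
```

Without the helper data, the noisy reading alone cannot reproduce the key, which is the point of the list above.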
Note: I don't know how much useful entropy a single fingerprint holds when the error-correction data is assumed public. I suspect it's not much compared to the standard ≈120-bit security level targeted for modern key pairs. However, at least in principle, we can use several fingerprints. And standard key-stretching techniques can add at least 20 bits on top of that.
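On the key-stretching remark, a minimal sketch assuming PBKDF2 is acceptable: running the KDF with about 2^20 iterations makes each guess roughly a million times more expensive, i.e. adds on the order of 20 bits of attack cost on top of whatever entropy the input actually has (the salt handling and parameters below are illustrative choices, not a recommendation).

```python
import hashlib

def stretch(biometric_derived_input: bytes, salt: bytes) -> bytes:
    # PBKDF2-HMAC-SHA256 with 2^20 iterations: ≈20 extra bits of work per guess.
    # The salt could be stored alongside the (public) error-correction data.
    return hashlib.pbkdf2_hmac("sha256", biometric_derived_input, salt, 1 << 20, dklen=32)
```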
In commercial devices, error-correction data is often stored in the acquisition device if there's a single one; or in a central database; or shared among a network of acquisition devices.
My main reference on this is a slightly dated (2007) book by Pim Tuyls, Boris Škorić, and Tom Kevenaar: Security with Noisy Data (paywalled), subtitled On Private Biometrics, Secure Key Storage and Anti-Counterfeiting.
I welcome recommendations for more recent reference material (there are some papers on Boris Škorić's page, but I have yet to explore them).
Independently: even if what's asked were possible, or when the error-correction data is public, a biometric is not sound as the sole secret used to generate a private key. A fingerprint is not secret: it's on one's passport or other ID, which is routinely handed to untrusted individuals; it's in the ID issuer's database; and it can be captured in a variety of ways, including from traces left on a glass of water. Also, it can't be changed once compromised.