Brute-Forcing a Fingerprint Reader

It’s neither hard nor expensive:

Unlike password authentication, which requires an exact match between what is entered and what's stored in a database, fingerprint authentication determines a match using a reference threshold. As a result, a successful fingerprint brute-force attack requires only that an inputted image provide an acceptable approximation of an image in the fingerprint database. BrutePrint manipulates the false acceptance rate (FAR) to loosen that threshold, so more approximate images are accepted.
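The difference can be sketched in a few lines of Python. This is purely illustrative — the function names and the similarity metric are invented, and real matchers compare minutiae features, not raw vectors — but it shows why threshold matching is brute-forceable in a way exact matching is not:

```python
def password_check(supplied: str, stored: str) -> bool:
    # Passwords must match exactly (real systems compare salted hashes).
    return supplied == stored

def similarity(a, b) -> float:
    # Stand-in metric: fraction of matching positions between two
    # equal-length feature vectors.
    return sum(x == y for x, y in zip(a, b)) / len(a)

def fingerprint_check(candidate, templates, threshold=0.85) -> bool:
    # A fingerprint is accepted if it is merely *similar enough* to any
    # stored template; loosening `threshold` raises the false acceptance
    # rate, which is exactly the knob BrutePrint manipulates.
    return any(similarity(candidate, t) >= threshold for t in templates)
```

With a loose enough threshold, a dictionary of approximate images only has to land *near* a stored template, not reproduce it.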

BrutePrint acts as an adversary in the middle between the fingerprint sensor and the trusted execution environment and exploits vulnerabilities that allow for unlimited guesses.

In a BrutePrint attack, the adversary removes the back cover of the device and attaches a $15 circuit board with the fingerprint database loaded in flash storage. The adversary then converts the database into a fingerprint dictionary formatted to work with the specific sensor used by the targeted phone. The conversion uses neural style transfer, which increases the chances of a match.

With the fingerprint dictionary in place, the adversary device is in a position to input each entry into the targeted phone. Normally, a protection known as attempt limiting locks a phone after a set number of failed login attempts is reached. BrutePrint can fully bypass this limit on the eight tested Android models, meaning the adversary device can try an unlimited number of guesses. (On the two iPhones, the attack can expand the number of guesses to 15, three times the five normally permitted.)

The bypasses result from exploiting what the researchers said are two zero-day vulnerabilities in the smartphone fingerprint authentication framework of virtually all smartphones. The vulnerabilities—­one known as CAMF (cancel-after-match fail) and the other MAL (match-after-lock)—result from logic bugs in the authentication framework. CAMF exploits invalidate the checksum of transmitted fingerprint data, and MAL exploits infer matching results through side-channel attacks.
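The CAMF flaw can be sketched as a logic bug in the attempt-counting path. The sketch below is a simplification (hypothetical names, not vendor code): the point is that a checksum error aborts the attempt *before* the failure counter is updated, so an attacker who deliberately corrupts the checksum of a failing frame never burns an attempt:

```python
MAX_ATTEMPTS = 5

class FingerprintAuth:
    """Simplified model of the flawed authentication flow."""

    def __init__(self):
        self.failures = 0

    def try_unlock(self, image, checksum_ok: bool, matches: bool) -> str:
        if self.failures >= MAX_ATTEMPTS:
            return "locked"
        if not checksum_ok:
            # The bug: a bad checksum cancels the attempt without
            # incrementing `failures`, so deliberately corrupted frames
            # give the attacker effectively unlimited guesses.
            return "cancelled"
        if matches:
            return "unlocked"
        self.failures += 1
        return "rejected"
```

A correct implementation would count (or rate-limit) cancelled attempts too, so that the transport-error path could not be abused as an infinite retry loop.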

Depending on the model, the attack takes between 40 minutes and 14 hours.

Also:

The ability of BrutePrint to successfully hijack fingerprints stored on Android devices but not iPhones is the result of one simple design difference: iOS encrypts the data, and Android does not.

Other news articles. Research paper.

Posted on May 30, 2023 at 7:16 AM • 15 Comments

Comments

J. C. Checco May 30, 2023 9:22 AM

The issue with the implementation of fingerprint readers on smartphones is one of multiplicity: because the capture area is so small, the reading device stores multiple partial measurements. Thus, one fingerprint signature may actually have multiple acceptable templates stored. This usability feature was exploited by the New York University Tandon School of Engineering (https://engineering.nyu.edu/news/machine-learning-masters-fingerprint-fool-biometric-systems). I personally am a fan of the fingerprint slide scanner, since it reads a larger print and leaves no latent residue to capture.
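The effect of storing multiple partial templates is easy to quantify with a back-of-the-envelope calculation (assuming, for illustration, that each template accepts an impostor print independently):

```python
def false_accept_prob(p: float, m: int) -> float:
    # Probability that at least one of m independent stored templates
    # falsely accepts, when each accepts with probability p.
    return 1 - (1 - p) ** m

# With a nominal per-template rate of 1 in 50,000 and 10 partial
# templates on file, the effective false-accept rate is roughly
# 10x the single-template rate.
```

The independence assumption is generous to the defender — partial templates from the same finger overlap — but it makes the direction of the effect clear: more stored partials means a looser effective match.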

Wannabe techguy May 30, 2023 10:45 AM

“iOS encrypts the data, and Android does not.”
Imagine that. Even a dummy like me knows data should be encrypted.

Roger A. Grimes May 30, 2023 12:50 PM

Some minor corrections on the Apple iTouch technologies: 1) It wasn’t only that the iTouch data was “encrypted”, but the Secure Enclave chip that prevented many of the attacks from working. The latter probably had more to do with the attacks being defeated than the “encryption”. 2) iTouch fingerprints aren’t simply “encrypted”. They are transformed into a mathematical relationship that cannot be rendered back to the original fingerprint image. Much like a hash rendering. This is better than encrypted, although I’m unsure if it would stop these particular types of attacks. It is sad that any authentication attribute is stored in any authentication solution database without first being transformed in some way that an attacker cannot (easily) figure out the original fingerprint. I discuss this idea here: https://www.linkedin.com/pulse/protecting-mfa-shared-secrets-roger-grimes.

lurker May 30, 2023 2:38 PM

Never mind @Clive’s Edam cheese wax, in my current incarnation as a peasant my real live fingerprint varies from day to day due to scratching and abrasion, so I couldn’t use one of these “locks” even if I wanted to.

Ted May 30, 2023 3:08 PM

Sorry to get distracted, but I was just reading @Mr. Grimes’ LinkedIn article. I was not aware that Google Authenticator QR codes might be non-expiring. This is something to think through.

vas pup May 30, 2023 6:18 PM

“In a BrutePrint attack, the adversary removes the back cover of the device and attaches the $15 circuit board that has the fingerprint database loaded in the flash storage.”

Could hardware protection prevent tampering with the back cover?

Every security chain, hardware and software, may have a weakest link, which should be made stronger. Just an opinion. I will appreciate any logical input/objection.

Clive Robinson May 30, 2023 7:19 PM

@ lurker,

Re : Nature did decree, there would be prints everywhere, from toe nail through bended knee, to the upmost follicle of hair…

“my real live fingerprint varies from day to day”

So… Why not pick another body part you do not labour so hard with…

But more seriously you are in practice, not just theory, better secured against “Might is Right” types if you don’t use biometrics.

As I’ve indicated in the past, I’m of the view that of the three basic authentication factors of,

1, Something you are.
2, Something you have.
3, Something you know.

The first two are irredeemably broken in the face of “might is right” types, including judges and less patient types.

Thus you need to augment all three authentication factors with anti “might is right” measures of,

4, Time / temporal.
5, Place / spatial.

So that the soft lockout of the “something you know” for ordinary use becomes a hard lockout at say 2 hours or so during the day. Which can only be lifted at a certain time at a certain place; otherwise it becomes a full lockout that can not be lifted by you no matter how much pressure the “Might is right” types apply.

As long as you keep the time&place in your head as “something you know” and change them frequently, then it makes life for attackers difficult if not impossible.

It’s something that needs to be considered along with “deadman’s switches”, “no goes”, and “duress codes” etc. when designing a security system for this coming decade…
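The time/place gate @Clive describes could be sketched roughly as below. Everything here is hypothetical (the function name, the window, the place check) — it just illustrates adding temporal and spatial factors on top of “something you know”:

```python
from datetime import datetime, time

def can_lift_hard_lockout(now: datetime, location: str,
                          allowed_window=(time(14, 0), time(15, 0)),
                          allowed_place="home") -> bool:
    # A hard lockout can only be lifted inside a secret time window at a
    # secret place. Both the window and the place live only in the
    # user's head ("something you know") and are changed frequently.
    start, end = allowed_window
    return start <= now.time() <= end and location == allowed_place
```

Outside the window, or at the wrong place, the lockout stands no matter what pressure is applied — the holder genuinely cannot comply until the conditions are met.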

Clive Robinson May 30, 2023 8:18 PM

@ vas pup, ALL,

Re : All chains age.

“All security chain hardware and software may have the weakest link which should make stronger.”

All technology has temporal limitations, as well as resource limitations.

Konrad Zuse made a state machine out of electromechanical relays; Charles Babbage would have made a programmable state machine / computer out of hand-cranked mechanics a lifetime before.

Each was limited by not just the technology of the time, but also all the other resources.

So what is “strong today” history tells us will be “weak tomorrow” almost as surely as day follows night. But more importantly we can not afford to make “strong today” also be “strong tomorrow”.

Thus as engineers we have no choice but to accept we have to continuously improve our designs. But we should also realise we should “embrace change” in our designs. That is, we should design systems in a way that makes “making improvements not just possible but without limiting the future”.

When I’m asked to provide an example of this, I point out the trade-off between “C strings” and “Pascal strings”, and why C strings are quite annoyingly still with us, while Pascal strings are effectively niche at best.
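The trade-off Clive mentions can be simulated in a few lines (Python standing in for the byte layouts): a C string is terminated by a NUL byte, so its length is found by scanning but can be anything; a classic Pascal string stores its length in a one-byte prefix, giving O(1) length but a hard 255-byte cap — a design that left no room to grow:

```python
def c_strlen(buf: bytes) -> int:
    # O(n) scan for the NUL terminator; a missing terminator would
    # read past the intended data (the classic C-string hazard).
    return buf.index(b"\x00")

def pascal_make(s: bytes) -> bytes:
    # One length byte caps the string at 255 bytes.
    assert len(s) <= 255, "classic Pascal strings max out at 255 bytes"
    return bytes([len(s)]) + s

def pascal_strlen(buf: bytes) -> int:
    return buf[0]  # O(1): just read the length prefix
```

The C layout is error-prone but unbounded; the Pascal layout is safe but self-limiting — which is one way to read why the former survived and the latter stayed niche.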

Erdem Memisyazici May 30, 2023 10:18 PM

Something you have rather than something you are would work better here.

I personally don’t live in an environment where I can find any privacy (too many IT security people), so I wouldn’t bother with it, but if I did, something like a USB-C key on your keychain would probably work better than a fingerprint.

As long as you don’t know the key and it isn’t transmitted unless you physically provide it power and datalines (no nearfield attacks, nobody can get it out of you with a crowbar) it’s much safer than external biometrics.

That being said, a determined team will steal your keys and fingerprints anyway. The only way to beat a team of hackers is to hire one of your own and work closely with the laws in your state. They tend to be costly, however, and few are equipped properly (usually ones who are, or are working with, military or government contractors; the rest get old CVEs).

There is no taxpayer-funded alternative to my knowledge. It would be nice. Still waiting for a public solution to the issue myself. It certainly is not going to happen in this period of pretending that people have objective privacy and demonizing anyone who says anything against the hierarchy of broken software.

Maybe the next generation will be smarter than mine has been. 😄

Dave May 31, 2023 1:48 AM

@Wannabe techguy: Android has always been like that, a veneer of security so you can claim magic pixie dust in the press release but nothing is done properly until something is published pointing out just how insecure it is, and even then the “fix” is often just a band-aid patch so you can claim something was done. For example a few years ago Google implemented a signing oracle in Android phones where your phone would use a stored crypto key to sign absolutely anything someone fed to it, no questions asked. When I first saw this I thought “surely this isn’t what they’re actually doing”, but I checked it on my phone and it was. Pointed it out to some Google folks and they made a token change that required an extra step in using the signing oracle, but AFAIK it’s still there.

iAPX May 31, 2023 12:29 PM

Putting on gloves everywhere, glasses that protect from remote iris scanning, a plastic-formed mask to hide your facial features, and naturally cleaning everything your mouth touched to destroy the DNA you left on it?
Adopting Monty Python’s Ministry of Silly Walks?

What a wonderful world of biometric signatures!

I am not against using them for convenience, but none could be dubbed secure in any way, and moreover, none can be easily transferred, cancelled, or changed.
Convenience trumps all…

We have to really think about what we are doing with each device, and any setting.

Torsten Gang June 1, 2023 11:41 AM

One thing I am musing about is, with sufficient knowledge of the fingerprint matching algorithm, would it be possible to generate fingerprints that have a higher acceptance rate? According to Apple, the expected rate is 1 in 50,000 false positives; I wonder if it is possible to beat that…
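Taking Apple's claimed figure at face value and assuming independent attempts (both assumptions, for a quick sanity check), the chance of at least one false accept in k tries works out as:

```python
def hit_prob(k: int, far: float = 1 / 50000) -> float:
    # Probability of at least one false accept in k independent
    # attempts, each with false-accept rate `far`.
    return 1 - (1 - far) ** k
```

Within iOS's normal 5-attempt limit the odds stay around 1 in 10,000, and even BrutePrint's expanded 15 attempts only reach roughly 3 in 10,000 — which is why the attack's real leverage on Android comes from removing the attempt limit entirely, not from beating the per-attempt rate.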

Kern Howell June 26, 2023 8:14 AM

The article you cited—and thus your post too—contains a critical mistake: it’s not an OS-related issue. They wrote that because they severely misinterpreted the paper. It’s a totally hardware-related issue. It has nothing to do with “Android vs. iOS”. The Android devices tested by the authors are well known to be highly insecure in general, mainly because they lack several security-related features or implement some incorrectly. Several people, including myself, are wondering why the researchers didn’t conduct their experiments against Google Pixels, which currently represent the state of the art of mobile security. It would be interesting to see those tests on security-focused devices such as Pixels—and maybe non-smartphone devices. Security-conscious people usually tend to stay away from brands such as Xiaomi, Samsung, and OnePlus—as well as Apple, at least in the last ~5 years.
