Using Machine Learning to Create Fake Fingerprints

Researchers are able to create fake fingerprints that result in a 20% false-positive rate.

The problem is that these sensors obtain only partial images of users’ fingerprints—at the points where they make contact with the scanner. The paper noted that since partial prints are not as distinctive as complete prints, the chances of one partial print being matched with another are high.

The artificially generated prints, dubbed DeepMasterPrints by the researchers, capitalize on this vulnerability to imitate one in five fingerprints in a database whose matcher was supposed to have an error rate of only one in a thousand.

Another vulnerability the researchers exploited was the high prevalence of some natural fingerprint features, such as loops and whorls, compared to others. With this understanding, the team generated prints containing several of these common features, and found that these artificial prints matched other prints far more often than would normally be possible.
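The two tricks described above (partial prints being less distinctive than full ones, and some feature types being far more common than others) can be illustrated with a toy simulation. Everything here is an assumption for illustration: the feature universe, the popularity weights, and the overlap threshold are invented, not numbers from the paper.

```python
import random

random.seed(0)

# Hypothetical feature universe: a handful of feature types (think loops,
# whorls) are assumed far more common than the rest.
FEATURES = list(range(100))
WEIGHTS = [10 if f < 10 else 1 for f in FEATURES]  # features 0-9 are "common"

def random_partial_print(n_features=12):
    """A partial print modeled as a small set of minutiae-like features."""
    return set(random.choices(FEATURES, weights=WEIGHTS, k=n_features))

def matches(print_a, print_b, threshold=6):
    """Toy matcher: accept if enough features overlap."""
    return len(print_a & print_b) >= threshold

database = [random_partial_print() for _ in range(1000)]

# A "master print" built purely from the most common feature types.
master = set(range(10))

hits = sum(matches(master, p) for p in database)
print(f"master print matched {hits} of {len(database)} enrolled partial prints")
```

Even though the matcher demands a substantial overlap, a probe stuffed with only the most common features matches a large fraction of the enrolled partials, which is the essence of a masterprint dictionary attack.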

If this result is robust—and I assume it will be improved upon over the coming years—it will make the current generation of fingerprint readers obsolete as secure biometrics. It also opens a new chapter in the arms race between biometric authentication systems and fake biometrics that can fool them.

More interestingly, I wonder if similar techniques can be brought to bear against other biometrics as well.

Research paper.

Slashdot thread

Posted on November 23, 2018 at 6:11 AM • 16 Comments

Comments

Tatütata November 23, 2018 8:43 AM

This item has been around for a couple of weeks, and I envisaged mentioning it in a squiddy thread, but when I checked it out I realized that something rather similar had already been published here back in May 2017.

However, the prior work was done at Michigan State, whereas the one mentioned above comes from New York University.

Duplication or original research?

Tatütata November 23, 2018 8:54 AM

I just noticed this in the Wired article which answers my question:

Anil Jain, a biometrics researcher at Michigan State University who did not participate in the project, sees that as a real shortcoming; it’s hard to extrapolate the research out to an actual use case. But he says the strength of the work is in the machine learning techniques it developed. “The proposed method works much better than the earlier work,” Jain says.

Clive Robinson November 23, 2018 12:12 PM

@ Bruce,

If this result is robust — and I assume it will be improved upon over the coming years — it will make the current generation of fingerprint readers obsolete as secure biometrics.

As described, the attack methodology is sound, but what it is exploiting is a significant failing of more modern fingerprint readers.

That is, as a simple approximation, it’s like saying that the string “LLLWLWWLWW” is matched by any other string of the same length containing five “W”s and five “L”s in any combination, only with much shorter strings…
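Clive’s string analogy can be made concrete. The sketch below is a toy, not how any real scanner is implemented: it contrasts a matcher that compares the actual pattern with one that only tallies feature counts.

```python
from collections import Counter

def pattern_match(a: str, b: str) -> bool:
    """Strict matcher: the sequences must be identical."""
    return a == b

def count_match(a: str, b: str) -> bool:
    """Weak matcher: only the per-symbol counts must agree,
    like a system that tallies loops and whorls but ignores their layout."""
    return Counter(a) == Counter(b)

reference = "LLLWLWWLWW"
probe = "WWWWWLLLLL"  # same five Ws and five Ls, different arrangement

print(pattern_match(reference, probe))  # False
print(count_match(reference, probe))    # True
```

Any of the 252 arrangements of five Ws and five Ls would fool the count-based matcher, which is exactly the collapse in distinctiveness the analogy is pointing at.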

It’s a major failing by those who wrote the software in the scanners: it is tuned to cause users the least amount of problems, not the least amount of security breaches.

Thus improving the way scanners work should “in theory” work against this attack…

Oddly, perhaps, the software developers are not alone in this way of doing things. Conventionally, fingerprint information was stored as the number of each feature, not the patterns the features form. Rarely, if ever, is actual fingerprint matching demonstrated in court, apparently because juries might get confused if shown it… Though from what I’ve been told, when it has been seriously questioned by the defence, the prosecution have tried all sorts of tricks to stop entire images being produced… Because when the full image has been produced, cases tend to stop at that point (which does not sound like a confused jury to me)…

The reason, apparently, is that the traditional matching process is the wrong way around. That is, there is an image from an object and a suspect’s fingerprint on a card. What the analyst tries to do is imagine how the suspect’s print can be distorted to match the one from the object… Thus it is like reading a polygraph: the result depends on how good your imagination is, rather than on something a little more repeatable, as science generally asks for…

Ninja November 23, 2018 1:16 PM

Biometrics should not be used as the key; at best as the ID. It’s not too hard to implement easy multi-factor processes nowadays. Want to use biometrics? OK, combine it with some authentication on the cell phone (e.g. NFC with on-screen confirmation and randomly generated codes on the terminal and/or the phone) and a somewhat simple passkey, but drop this bad security practice of using biometrics as the key. I don’t expect companies to start listening to security experts any time soon, though.

Jonathan Wilson November 23, 2018 2:46 PM

Yet another thing to add to the long list of reasons I don’t want to own a device (phone, laptop, door lock or anything else) that uses fingerprints for security (or, if I somehow end up with such a thing, not to use the fingerprints and to use proper security instead).

echo November 23, 2018 6:49 PM

@Bruce

More interestingly, I wonder if similar techniques can be brought to bear against other biometrics as well.

Yes, I use these tricks every day. It’s called the magic of make up. Make up plays people’s neuro-psycho-social stack like a fiddle. From time to time I have played Google reverse image search just to see what their algorithm outputs.

Why November 24, 2018 4:47 PM

We leave fingerprints on everything we touch. It’s easy enough to make a 3D mold with those ancient home-electronics circuit-board etching kits. It’s easy to map the new print onto your finger. Just press hard. Or use a gummy bear.

This idea of using fingerprints as keys has ALWAYS sucked. Now it sucks even more.

Although I do wonder… If people touch buttons the same way all the time, perhaps you should be required to press a set of random numbers while your prints are read. That might allow for greater accuracy without a lot of false negatives.

VinnyG November 25, 2018 4:04 PM

Also interesting is whether and how much effect these studies debunking the myth of fingerprints as unique identification might have on criminal law. How many persons have been imprisoned largely or solely on the basis of partial fingerprints, or expert testimony that ignored the prevalence of loops and whorls?

Men in Black November 25, 2018 4:31 PM

@VinnyG

Fingerprints are good for the jury.

Show an image, magnified, contrast enhanced, on a transparency of the fingerprint at the scene, if you will.

The same of the suspect’s fingerprints on file or from arrest booking. Same magnification.

Show the jury how they match up on an overhead projector. Let them see for themselves. You can’t do that with DNA and still retain your credibility in court.

Clive Robinson November 26, 2018 12:05 AM

@ Men in Black, VinnyG,

Fingerprints are good for the jury.

Err, no they are not; in the UK you will not see fingerprint evidence presented that way in court…

The argument given for not doing so is along the lines of:

    The human skin is very pliable and is distorted by both the shape of the object and the resultant directional forces involved with gripping it.

It’s why I said what I did in the last paragraph of my comment of ’12:12′ above in this thread.

Put simply, you have two 3D fingertips that have been stretched and distorted in the process of “projecting” into two 2D images: one on the fingerprint card, and one from the scene-of-crime object. Each projection has its own stretches and distortions, which means the odds of the two images matching by overlay are quite minimal. Worse, the projection from the object is likely not only “a partial”, it’s also likely to be contaminated in many ways and broken up by the physical properties of the object, and thus quite indistinct.

Thus the matching process, if you can call it that, is primarily to find “certain feature types” that can be counted, and to match by counting them…

Because the projection distortion is primarily based on distance, over a very small area around each prominent feature the images will kind of match to the human eye, with the match quickly getting worse the further you go from the center of the feature…

But taken over such a small-scale area of a given 2D projection, the chance that two people have a prominent feature that looks the same after processing/categorisation is very, very high (which is why this “dictionary attack” against certain types of fingerprint readers works).

What a human fingerprint matcher is supposed to do after identifying prominent features is take them and make a constellation of them, then try to overlay the constellations by “best judgment”. A big problem with this is that many fingerprint images are in effect “partials”, so you are trying to map only a small piece of a constellation onto another, larger constellation (the one on the file card). This is a highly subjective process and very, very susceptible to “confirmation bias”, as independent testing has shown.
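The constellation-overlay idea can be sketched in a few lines. This is a toy: it only tries translations (real matching must also cope with rotation and the skin distortion described above), and the minutiae coordinates are invented for illustration.

```python
import itertools

def align_score(partial, full, tol=1.0):
    """Try translating the partial constellation so it overlays the full one;
    return the best count of partial points landing within `tol` of a full point."""
    best = 0
    # Candidate translations: map each partial point onto each full point.
    for p, f in itertools.product(partial, full):
        dx, dy = f[0] - p[0], f[1] - p[1]
        shifted = [(x + dx, y + dy) for x, y in partial]
        hits = sum(
            any(abs(sx - fx) <= tol and abs(sy - fy) <= tol for fx, fy in full)
            for sx, sy in shifted
        )
        best = max(best, hits)
    return best

full_card = [(0, 0), (4, 1), (2, 5), (7, 3), (5, 6)]      # print on file
partial = [(10, 10), (14, 11), (12, 15)]                   # offset copy of 3 points

print(align_score(partial, full_card))  # 3: the whole partial overlays the card
```

A small partial (three points here) can overlay many larger constellations equally well, which is why the process leaves so much room for “best judgment” and confirmation bias.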

As part of the process, those matching fingerprints are also supposed to note any missing or other features to rule out matches. But that can get “accidentally” forgotten, or put down to “other factors”…

However, forensics labs “get paid by results”, and it’s a competitive business these days where “market forces” and “throat slitting” are kind of synonymous…

As the police have to pay for it out of their budget, you can see why “paid by results” quickly gets a sinister connotation. It’s made worse by the fact that for many crimes the police actually have the right criminal via the “Means, Motive and Opportunity” pre-filter, followed by the “Modus Operandi” or “whose style” filter, followed by the “who’s flapping their gums” filter. But quite often it is run the other way, starting with “who’s heard a whisper on the streets” via their pet Confidential Informant (CI).

The simple fact is over four fifths of crimes solve themselves, because the criminals “big it up” to friends and associates for “respect” or “reputation”, word goes around, and thus people “snitch / grass / inform”. There are two reasons why criminals “big it up”. The obvious one is “stupidity”, which is what we are seeing with these “Proud Boy rappers” and similar smart-device apps. The less obvious one is “building your C.V.”; that is, it’s not quite “you are only as good as your last job”, but getting your name known in the right circles to get more profitable jobs, or to be trusted by others to bring needed skills to their jobs, means you need a reputation or legend, and that has to be built.

Part of that “building process” gives rise to your M.O. which is where experienced investigators get their “gut feeling” or “who looks good in the picture” for a particular job.

Thus the forensic labs getting fingerprints to test know that, with very high probability, they match in some way or another. They also know that fingerprints rarely get called into question in court, because judges do not like it happening in their court. Thus the chances they will get caught out doing a shoddy job on “meal ticket jobs” are very, very low. So the lab needs a way to sort out the “99 percenters” from the ones where they might just get challenged.

It’s actually not as hard as it might first appear. Evidence has a “chain of custody”, thus it’s not overly difficult to work out where a particular job has come from. Secondly, a seemingly innocent enquiry about “priority” will give much more information. Oh, and just like criminals, the police tend to talk too much “to their own”, or those they see as “their own”. This latter problem is not helped by the “canteen mentality” in the lower police ranks that fosters “for us or against us” black-and-white thinking. If you look back at just how easily journalists have got hold of leads on the quiet from low-rank police officers, it does not take much imagination to work out that the forensic labs would have similar “leads” about the jobs that come in…

Thus fingerprint matching has kind of dropped to the point of “reading entrails” in many cases but nobody tends to realise it.

Moving the whole shoddy process to A.I. is not going to improve things any, because of the way we build such systems. That is, the training data they will use will be a subset of the existing “criminal records”. As we are finding out when we do this with other criminal databases, what you actually do is train the existing failures and biases into the A.I. model, so it “carries on doing the same old same old as the human system did”… Great for cutting “head count”, but about as much use as a “chocolate fire guard” when it comes to improving a process on which some people’s very lives depend…

In short, fingerprint matching, like lie detectors and many other “forensic processes”, is at best “junk science” and at worst a deliberate ploy by some to get recognition, promotion, better pay and new job opportunities…

Oh, but don’t “rock the boat” or “break rice bowls” by saying this, because you will get “push back” that you will not believe, as an honest policewoman in Scotland found to her cost:

https://www.bbc.co.uk/news/uk-scotland-glasgow-west-16181875

The simple fact is nothing has really changed in the 21 years since then, other than the introduction of computers… So the same shoddy practices are now hidden behind the front of a bunch of 1U, 19″ racks that house the computers that now do the “entrail reading”.

The advantage of computers, of course, is that “nobody is to blame”; it’s spread so thin that, well, the usual applies: you get an enquiry that, surprise surprise, concludes,

    All involved must share some measure of responsibility, lessons must be learned, new procedures must be put in place, yada yada, quack, quack, give us a load of money and resources to do this…

Thus it’s “profitably” booted into the long grass, trebles all round for the profit sharers, case closed, till the next profitable failing… All is as it’s supposed to be, except for all the victims whose lives have been ruined.

Dan November 27, 2018 7:33 AM

I, for one, would be very glad if fingerprints were no longer used as a biometric as I also leave them everywhere I go. I am very uncomfortable crossing a border and giving my fingerprints as I have not committed any crime.

vas pup November 29, 2018 8:51 AM

@all:
Flexible electronic skin aids human-machine interactions:
https://www.sciencedaily.com/releases/2018/11/181128114927.htm

“Human skin contains sensitive nerve cells that detect pressure, temperature and other sensations that allow tactile interactions with the environment. To help robots and prosthetic devices attain these abilities, scientists are trying to develop electronic skins. Now researchers report a new method that creates an ultrathin, stretchable electronic skin, which could be used for a variety of human-machine interactions.”

@Clive, is there a possible future application for (1) digital fingerprints/palm prints: say you could load any fingerprint out of the database onto your electronic skin based on the task (IC covert ops in particular), fooling biometrics; (2) opening electronic locks just by touching them (same loading of codes out of the DB)?
Your super expertise in electronics has opened eyes many times for our respected bloggers on many difficult technical issues (and not just for me). Thank you.

Clive Robinson November 29, 2018 10:48 AM

@ Vas Pup,

say you could load any fingerprint out of the database onto your electronic skin based on the task

Both yes and no.

At a first-order view, a fingerprint is a 3D surface on what is in effect a 2D plane, not unlike the surface of an old vinyl record (45 RPM / 33⅓ RPM).

It is not difficult to make such a surface. The two basic ways both require a mold: in the first you use a liquid, not unlike making “slipware” pottery; the second uses a soft surface that is distorted to fit in some way, then fixed in some manner such as mild heat treatment or ultraviolet light.

The problem with a mold is actually making it. The best method is still to use a thin liquid in direct contact with the finger that dries without shrinkage or other distortion. However, this is problematic, as the subject would normally be aware of you doing this procedure.

Thus other methods are often used; in each case the precision is a lot less, so in theory at least the fakes should be detectable by that alone.

But how to do it without a mold is a tad more tricky, especially if you want to do it covertly, any time, any place. In theory you could use a surface that contracts or expands under the influence of electricity; it’s fairly common to find such a device, we generally call it “muscle”, and to a certain extent we already do something similar when the gripping surfaces of our hands and feet are immersed in water. So, as in many things, nature solved that eons ago. However, human tech is still a long way from doing such a thing. It could be done with either “nano-tech” actuators or by re-engineering biology, but both would require a continuous supply of energy, as work is being done, and since there is no such thing as 100% efficiency there would be heat generated, which again in theory could be detected in some way. I have been thinking on and off about whether e-inks could be used, but that is a whole other conversation for another day.

But human skin is not just a surface; you need to start thinking about looking at it as a multipurpose active device.

Thus you need to think about second order effects such as sweat pores and what comes out of them.

Because we can already detect pores in various ways, and I suspect it will not be long before we see the use of silicon micro-sensor chemical agent measurement (CAM) devices. The technology is already in prototype, and one aim is to progress it to the point of “direct reading DNA scanners”, though I feel that is quite a ways off currently. So “intermediate” uses such as checking “if animal, mineral or vegetable” will be a first stage, followed by what sort of animal, etc. Whilst we might share 99% of our DNA with other creatures, the chemicals we emit are much more widely spaced; so much so that even the very limited capability of the human nose can tell the difference between sheep, goats, donkeys and horses, where their emitted “smell” has been collected on otherwise sterile lint cloth.

But then there are third-order effects to be considered. As I’ve mentioned on the odd occasion, all materials have an EM spectrum when it comes to transmissibility, reflection and absorption. Human skin looks radically different at different EM wavelengths, and trying to match that is not going to be easily done. But transmissibility brings up another issue: human skin is close to being transparent at certain frequencies, which most of us have seen in hospitals when they stick on a little clip with an LED and a near-IR photo sensor in it, to read not just pulse rate but oxygen saturation and other things such as temperature, with even glucose and INR on the way.

So, to recap, you are still quite a way away from solving the first-order effects in the way you envisage, while second- and third-order sensors could be included in fingerprint readers fairly easily, and thus soon, putting them quite a way ahead of what you could fake up… So at best you would have a narrow time window with higher-security systems.

Which brings us to another issue: cheap sensors, and externalising risk for the sensor owner.

Banks try wherever possible to lay the blame for fraud on others, be they clients, merchants or suppliers. Thus if they don’t carry the risk, they don’t need to do any more than “pay lip service” to it. Which means they will use the lowest-cost, probably most insecure, fingerprint readers out there, just as the US border did when it started collecting passenger fingerprints. Such behaviour develops a “legacy effect” the more widely deployed it is, where all of a sudden the cost of upgrading is used as an argument against doing an upgrade…

Thus the window on shoddy fingerprint readers in some areas may remain open until such shoddy devices are no longer available… However, as long as a product is selling profitably there is little incentive to stop producing it, even in the notoriously short-life-cycle tech industry. Some things (AC97 audio chips) just stay the same for good old “legacy issues”.

Such legacy issues are one of the best places to not just find but use attack vectors. So maybe your fingerprint idea will come into existence simply because of “externalised risk”, or as we say in the UK, for “Not my job, mate” or “jobsworth” reasons…

Clive Robinson November 29, 2018 11:17 AM

@ Vas Pup, All

I forgot to mention that in reality the only thing new in the link you posted is really:

with a conductive “glue” made of vertically aligned magnetic particles embedded in a polyvinyl alcohol gel

Such glue is kind of the second generation after the “zebra strip” developed last century for using small LCD panels in the likes of what we would now call “non-smart phones”, such as the Nokia XX10 range.

It just so happens, if you look such glues, pastes, resins, and plastics up, that they have been used for some time now to connect very, very small RFID-type chips to thin foil antennas and the like in bank cards, passports and similar.

The rest of what they have done, we were trying to do in the 1980s in the London home-computer robot scene that used to meet at North London Polytechnic. For touch, we used the “hundred-ohm foam” used for transporting chips to protect them from static.

Hundred-ohm foam was essentially a plastic foam with a lot of carbon powder/particles in it to make it conductive. Its actual resistance depended on how hard you pressed down on it: the more the carbon came into contact, the lower the resistance. So what we did was use conductive glue we made to stick aluminium foil on one side, then put that up against an etched PCB, not very dissimilar to what you would find under the “poke dead flesh” keyboards of the likes of the Sinclair Spectrum and other low-cost home computers.

We even used them as “force feedback” sensors for home-made electronic-music piano-style keyboards, such that you could measure the player’s force etc., and for some prototype games joysticks and the like.
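The pressure-to-resistance behaviour of the foam is easy to sketch as the voltage-divider readout you would typically use with such a sensor. This is a toy sketch: the 5 V supply, 100 Ω fixed resistor, and sample ADC voltages are assumed values for illustration, not measurements.

```python
def foam_resistance(v_supply: float, v_out: float, r_fixed: float) -> float:
    """Infer the foam's resistance from a simple voltage divider:

        v_supply --- [foam] ---+--- [r_fixed] --- GND
                               |
                             v_out (to ADC)

    The same current flows through both resistors, so by Ohm's law
    R_foam = r_fixed * (v_supply - v_out) / v_out.
    """
    if v_out <= 0 or v_out >= v_supply:
        raise ValueError("v_out must lie strictly between 0 and v_supply")
    return r_fixed * (v_supply - v_out) / v_out

# Harder press -> more carbon contact -> lower resistance -> higher v_out.
for v in (1.0, 2.5, 4.0):  # simulated ADC readings at a 5 V supply
    print(f"v_out={v:.1f} V -> R_foam={foam_resistance(5.0, v, 100.0):.0f} ohm")
```

Reading force is then just mapping the inferred resistance through a calibration curve for the particular piece of foam, which is essentially what the keyboard and joystick prototypes did.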

When I look back on it, the alternative uses we found for hundred-ohm foam were seemingly boundless. For instance, it made an effective shield you could use to get close enough to those combined IR and microwave burglar alarm sensors that they were disabled 😉 It also made cheap anechoic-chamber lining for doing EM testing in, and with care you could make microwave waveguide attenuators with it.

