In our increasingly computer-facilitated lives, we are constantly confronted by new threats to our personal privacy. We have learned that our credit cards, electronic home assistants and smartphones are all capable of sharing our personal information with their corporate sponsors. Yet carried with us every day is another thing that risks exposing our personal information: our faces.
Use of biometric identifiers is a growing challenge in the privacy space. People can now be automatically identified by their faces, their fingerprints, their eyes and even their voices. Cameras in public places can scan crowds, and then both private companies and the government are able to use databases of facial recognition information to identify individuals. On one level, this is nothing new. Whenever you are in public, there is a chance that a person might see and recognize you. Famous people are recognized by strangers all the time, and the rest of us may still be known to those we see regularly: baristas, salespeople, secretaries and, if we are unlucky, the police. But automated identification operates on a completely different scale. The proliferation of cameras, and of long-term storage, vastly increases the chances that people will be seen as they go about their lives. And automated facial recognition may turn a slim possibility of being recognized into a virtual certainty.
Biometric identification can be incredibly useful. Imagine that a transit camera observes a mugging and gets a shot of the offender’s face. Or that a doorbell camera sees a jogger going by a murder scene — a potential witness. Wouldn’t it be great to be able to put names to the faces? Biometrics also allow police to scan crowds for known bad actors, people with outstanding warrants and celebrity stalkers. On the private side, stores may use facial recognition to track known shoplifters, casinos to ban card counters and airlines to check in customers.
Facial recognition makes all these tasks far easier than they were. But that ease comes at a real privacy cost. Suddenly a face in a picture of a crowd may be almost as good as a name. What do we lose? The ability to protest without everyone knowing that we did, the ability to enter an Alcoholics Anonymous meeting or doctor’s office without being noticed by the camera across the street.
Biometric identification is not flawless. We know that facial recognition tends to be less reliable at identifying nonwhite people, and it is often hard to find out how accurate a particular vendor’s software is. The more we come to rely on biometric identification, the more modest we must be about our level of certainty.
When something is incredibly useful but also incredibly dangerous, the answer is to set rules for it. Communities have begun to do that. Use of facial recognition by law enforcement has been banned in some municipalities, and private use of biometric information is tightly regulated in states like Illinois, Texas and California.
This is a balancing exercise. My research shows that people respond very differently to uses of biometric technology depending on who is using it and what they are using it for. One study showed, for example, that 59% of people were comfortable with a store using facial recognition to track shoplifters, but only 26% were comfortable with the same store using it to track customers for advertising. If a bank uses a voiceprint as extra verification over the phone, that is probably an unalloyed good. But widespread use of biometric monitoring in public places turns science-fiction-level Big Brother into a real possibility.
Matthew Kugler is an associate professor at Northwestern Pritzker School of Law.
People’s comfort level with uses of biometrics (scenarios surveyed):

- Fingerprint to unlock a smartphone
- Smart doorbell with facial recognition to identify visitors
- Store using facial recognition to detect known shoplifters
- A homeowner’s association using facial recognition to track the movements of people on streets and sidewalks
- Store using facial recognition to track shoppers around the store and serve targeted ads
Matthew B. Kugler, From Identification to Identity Theft: Public Perceptions of Biometric Privacy Harms, 10 U.C. Irvine L. Rev. 107 (2019).