Virtual lineup: Your face is already on file

A new report pulls back the curtain and reveals the ‘Wild West’ of unrestricted facial recognition monitoring by American law enforcement

If local police showed up at your door requesting fingerprints and DNA samples, would you passively and unquestioningly comply? Or would you ask what crime you're suspected of committing and demand probable cause or a search warrant?

The fact is, there's a 50 percent chance your photo is already part of a biometric database. And law enforcement agencies across the country are using facial recognition software to regularly search this "virtual lineup" with little to no regulation or limits, according to an eye-opening 150-page report, "The Perpetual Line-Up: Unregulated Police Face Recognition in America," published this week by the Georgetown Center on Privacy & Technology.

"Unless you've been arrested, the chances are you're not in a criminal fingerprint database or a criminal DNA database either. Yet by standing for a driver's license photo, at least 117 million adults have been enrolled in a face recognition network searched by the police or the FBI," said Alvaro Bedoya, the center's executive director and co-author of the report.

According to "The Perpetual Line-Up," only 8 percent of the photos that appear in the FBI's facial recognition system are of known criminals. This is an unprecedented privacy violation, Bedoya said. It's "a national biometric database that is populated primarily by law-abiding people."

With great power comes … no accountability?

Georgetown researchers sent 106 public records requests to police agencies and found that of the 52 agencies that acknowledged using facial recognition, only one had obtained legislative approval before doing so.

No state in the country has passed laws that define how facial recognition can be used in police investigations. Police departments don't need a warrant to search facial recognition databases, nor do they limit use of the technology to investigating serious crimes, the report said.

Only a handful of departments have imposed voluntary limits on their searches -- for instance, to require reasonable suspicion. And only one agency -- the Ohio Bureau of Criminal Investigation -- explicitly prohibited "using face recognition to track individuals engaging in political, religious, or other protected free speech."

Most police departments don't even audit their facial recognition systems for accuracy or teach their staff how to visually confirm facial matches. (That skill may seem innate, but it actually requires specialized training.)

"With only a few exceptions, there are no laws governing police use of the technology, no standards ensuring its accuracy, and no systems checking for bias," said Clare Garvie, a co-author of the report "It's a Wild West."

The fallibility of technology

Law enforcement agencies like the FBI argue that using biometric tools reduces the likelihood of racially biased policing because an algorithm is not biased. The report disputes that claim, citing research showing facial recognition is significantly less accurate at identifying African Americans, women, and young people. TV tropes about a magical "enhance" button aside, the reality is the facial recognition software used to search photo databases is far from perfect.

"The algorithms make mistakes," Garvie told PCWorld by email. "These mistakes happen at a higher rate when the systems are used to try and identify people in lower-quality images," including surveillance camera images, smartphone photos, and social media pictures. In addition, search systems are set up to return results, "regardless of whether the suspect being searched for is in the database," she added. "This means that a system may return a list of 10 or 40 completely innocent people." 

Think that doesn't have real-world consequences? Read The Intercept's chilling story of how one man's life was ruined by a facial recognition mismatch, and see whether you still think that unrestricted, unaudited use of facial recognition by law enforcement is a good idea.

Big Brother is watching

"Perhaps the most dystopian aspect of the report is its findings that real-time facial recognition -- identifying people in public as they pass a live-feed video camera -- is increasing in popularity among police departments," Wired writes.  

The report says at least five major police departments have "run real-time face recognition off of street cameras, bought technology that can do so, or expressed a written interest in buying it." That's counting only the departments that responded to the study. The New York Police Department is known to have a facial recognition program, but it denied Georgetown's records request -- as did the Los Angeles Police Department, which also claims to use real-time facial recognition.

This kind of surveillance tracking has serious privacy implications. "This is the ability to conduct a real-time digital manhunt on the street by putting people on a watch list. Now suddenly everyone is a suspect," said Bedoya. "It turns the premise of the Fourth Amendment on its head."

It also could fly in the face of last year's Supreme Court decision on privacy, in which the justices unanimously agreed that "putting a GPS tracker on you, your car, or any of your personal effects counts as a search" and is therefore governed by the Fourth Amendment. People have a reasonable expectation of privacy in their location data, the court concluded.

Facial recognition is "an extraordinarily powerful tool," said Bedoya. "It doesn't just track our phones or computers. It tracks our flesh and our bones. This is a tracking technology unlike anything our society has ever seen."

Who's watching the watchers?

It may be too late to keep your face out of a biometric database, but privacy advocates hope to limit the ways in which the system can be abused. A coalition of civil liberties groups is calling for the Department of Justice to investigate police facial recognition databases, starting with police departments that are already under investigation for biased policing.

The aim is not to ban the use of facial recognition software, but to pass strict legislation on its use. "Face recognition can and should be used to respond to serious crimes and public emergencies. It should not be used to scan the face of any person, at any time, for any crime," the report argues.

The report proposes that states pass laws to protect civil liberties -- including requiring a "reasonable suspicion" of criminal conduct before searching databases -- limiting the amount and types of data stored, and requiring independent oversight with regular audits of performance.

"As technology advances," The Verge writes, "drawing a line between policing and invasive surveillance will be an unavoidable part of the debate over facial recognition." But Bedoya points out that state legislatures have already passed laws that limit not only geolocation trackers but automatic license plate readers, drones, wiretaps, and other surveillance tools.

"It's not about protecting criminals. It's about protecting our values."

Copyright © 2016 IDG Communications, Inc.
