Throughout most major cities, everyday citizens are under near-constant police surveillance when they walk city streets. For many, it is a welcome comfort to know that law enforcement is on the lookout for trouble. However, two groundbreaking reports released on Thursday, May 16, maintain that police are abusing face recognition technology and that the civil liberties of the public are in danger.
The reports, titled “Garbage In, Garbage Out: Face Recognition on Flawed Data” and “America Under Watch: Face Surveillance in the United States,” should be cause for alarm for the average citizen. The authors are now set to appear before a House Oversight Committee hearing on face recognition next Wednesday, May 22.
Here are the key findings in the reports:
- On multiple occasions, when a suspect’s photo didn’t turn up good leads, NYPD picked a celebrity who looked like that suspect, downloaded their photo from the Internet, ran the celebrity’s photo through its system—and arrested people based on those leads. The actor Woody Harrelson’s photo was used in this way.
- Analysts using face recognition systems at NYPD and other agencies routinely edit photos before submitting them for identification, including by adding in facial features copied from different photos.
- Analysts at NYPD and other departments sometimes feed forensic sketches into their face recognition system to get a list of suspects. This practice has been shown to fail 95 percent of the time.
- Chicago and Detroit have both acquired massive real-time face recognition systems with little to no public scrutiny.
- Detroit’s system was designed to be able to operate on the city’s “Project Green Light” network of over 500 cameras citywide. Project Green Light cameras are present outside places of worship, women’s reproductive clinics, and youth centers.
- Illinois has the country’s strongest legal privacy protections in the commercial biometrics space, yet Chicago has quietly acquired and paid to maintain a face surveillance system capable of operating across the city.
The reports were issued by the Center on Privacy & Technology, and the authors state that the situation is getting worse as police purchase more advanced facial recognition software and acquire more imaging data. Some of this data comes from social media platforms, as users upload innocent pictures of themselves, friends, or family.
“Face recognition is a powerful tool, and it is only getting more powerful. Our research demonstrates, definitively, that use of this technology is out of control. This helps explain why so many communities are clamoring for change, and even going so far as to ban the use of face recognition technology by government entities—perhaps a wise choice in light of what we’ve learned,” says Clare Garvie, senior associate at the Center on Privacy & Technology.
“Communities, not the police, should decide whether and how to use surveillance tools with the ability to identify people based on their faces, and those decisions should be made in public forums, with full transparency into how these systems are used and what they can do.”