If you have ever uploaded a selfie to social media or hold a federal- or state-issued ID, chances are your face is on file and you have little to no control over its use. Clare Garvie, a technology expert and author of The Perpetual Line-Up: Unregulated Police Face Recognition in America, stated to rolling out, “Over half of all American adults are now enrolled in a face recognition database accessible to law enforcement—likely without their knowledge—because they applied for a driver’s license or state ID card.”
This software is connected to various databases and allows images and biometric data to be sifted quickly for matches against the requested information. In Israel, the UK, and other European countries, it is widely used in anti-terror efforts. Within the past decade, US law enforcement agencies have also begun using facial recognition software, which in some major cities is linked to real-time CCTV video. For some, it offers a feeling of security that a dangerous criminal can be caught with greater ease. For others, such as Black Americans, it raises the odds of hearing the well-worn police phrase “You fit the description” and of a negative encounter with police.
Earlier in July, Amazon rolled out its own software geared to law enforcement, called Amazon Rekognition. The software is a low-cost option that can be used by anyone who wants to analyze video for facial recognition matching. According to the website, pricing is based on the amount of video that needs to be analyzed against the Amazon Rekognition database.
Last week, the ACLU used Amazon Rekognition to scan photos of members of Congress, and 28 of them, including six members of the Congressional Black Caucus, were misidentified as suspects charged with a crime. Blacks and Latinos were disproportionately misidentified in the test, including Civil Rights legend Rep. John Lewis. The test showed a 5 percent error rate with the software, a margin that could make a routine police encounter deadly. The ACLU wrote an open letter to Amazon CEO Jeff Bezos that states in part:
“An identification — whether accurate or not — could cost people their freedom or even their lives. People of color are already disproportionately harmed by police practices, and it’s easy to see how Rekognition could exacerbate that. A recent incident in San Francisco provides a disturbing illustration of that risk. Police stopped a car, handcuffed an elderly Black woman and forced her to kneel at gunpoint — all because an automatic license plate reader improperly identified her car as a stolen vehicle.”
The ACLU has asked Amazon to release a list of the government agencies, including law enforcement, state, and US intelligence branches, that are using Amazon Rekognition. In response to the ACLU test, Amazon General Manager Matt Wood wrote in a technology blog that the ACLU test was flawed. According to Wood, the ACLU used the default confidence threshold of 80 percent, and a higher setting should have been used to produce more accurate results. Wood claims that Amazon got much better results when it ran the test itself, and he recommends that law enforcement set the threshold at 99 percent.
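The dispute turns on that threshold setting: face recognition software reports a "match" only when its similarity score clears a cutoff chosen by the user. A minimal sketch, using invented scores rather than real Rekognition output, of how raising the cutoff from 80 to 99 percent shrinks the set of reported matches:

```python
# Hypothetical illustration of threshold filtering. The names and
# similarity scores below are invented; they are NOT real Rekognition data.

def filter_matches(matches, threshold):
    """Keep only candidate matches whose similarity score meets the cutoff."""
    return [m for m in matches if m["similarity"] >= threshold]

# Invented candidate matches returned by a face search
matches = [
    {"name": "Person A", "similarity": 99.2},
    {"name": "Person B", "similarity": 91.5},
    {"name": "Person C", "similarity": 83.0},
]

# At the 80 percent default, all three candidates count as "matches"
print(len(filter_matches(matches, 80.0)))  # 3

# At the recommended 99 percent, only the strongest candidate remains
print(len(filter_matches(matches, 99.0)))  # 1
```

The lower the cutoff, the more borderline candidates get reported as matches, which is why a default of 80 percent produces more false identifications than a 99 percent setting would.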