
Amazon’s Rekognition misidentified 28 members of Congress as criminals

Above: Amazon's facial recognition service, Amazon Rekognition.
Image Credit: Amazon



Facial recognition algorithms are improving by leaps and bounds each year, but they’re far from perfect. Case in point: The American Civil Liberties Union said that in a test of Amazon’s Rekognition, the service erroneously identified 28 members of Congress as criminals.

The ACLU supplied Rekognition with 25,000 mugshots from a “public source” and had Amazon’s service compare them to official photos of members of Congress. Among the representatives misidentified were six members of the Congressional Black Caucus, including civil rights activist Rep. John Lewis (D-Georgia). In all, 11 of the 28 false matches (roughly 39 percent) were people of color, who make up only 20 percent of current members of Congress.

A trio of Democratic Congress members responded to the test in an open letter to Amazon CEO Jeff Bezos.

“While facial recognition services might provide a valuable law enforcement tool, the efficacy and impact of the technology are not yet fully understood,” the letter read. “In particular, serious concerns have been raised about the dangers facial recognition can pose to privacy and civil rights, especially when it is used as a tool of government surveillance, as well as the accuracy of the technology and its disproportionate impact on communities of color.”




Jacob Snow, a civil liberties attorney for the ACLU, told members of the media that the test was conducted for under $13.

“One of the things that is dangerous about presenting this information in a law enforcement context is that there can be differences — in lighting, in angles, in age — so it can be genuinely difficult to say just based on the photos that they are the same person,” Snow told Mashable. “Facial recognition has the possibility of suggesting to a law enforcement user that there is a match. And then there is a high probability or a reasonable probability that the law enforcement user will trust the system and not apply the same level of skepticism.”

The ACLU’s findings aren’t entirely surprising. Research has shown that facial recognition technologies are susceptible to racial bias: a 2011 study found that systems developed in China, Japan, and South Korea had more trouble distinguishing between Caucasian faces than between East Asian faces. And in a separate study conducted in 2012, facial recognition algorithms from vendor Cognitec performed 5 to 10 percent worse on African Americans than on Caucasians.

But an Amazon spokesperson told VentureBeat that the ACLU’s test was likely skewed by poor calibration: it used a confidence threshold (the minimum likelihood that a given prediction is correct) of 80 percent, lower than the 95 percent Amazon recommends for law enforcement applications.

“[W]e think that the results could probably be improved by following best practices around setting the confidence thresholds … used in the test,” an Amazon spokesperson told VentureBeat in an email. “While 80 percent confidence is an acceptable threshold for photos of hot dogs, chairs, animals, or other social media use cases, it wouldn’t be appropriate for identifying individuals with a reasonable level of certainty.”
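To see why the threshold matters, here is a minimal sketch of the filtering behavior Amazon is describing. The similarity scores below are invented for illustration, not taken from the ACLU's test; in Rekognition's API (e.g., boto3's `compare_faces` call), the `SimilarityThreshold` parameter plays this role, suppressing any candidate match that scores below it.

```python
# Hypothetical similarity scores (0-100) that a face-matching service
# might return for five candidate matches against a probe photo:
scores = [97.2, 91.5, 83.4, 80.1, 62.0]

def matches_at(threshold, scores):
    """Return only the candidates that clear the confidence threshold."""
    return [s for s in scores if s >= threshold]

# At the 80 percent default the ACLU used, four candidates count as matches:
print(len(matches_at(80, scores)))  # 4
# At the 95 percent Amazon recommends for law enforcement, only one does:
print(len(matches_at(95, scores)))  # 1

# The real call (requires AWS credentials and image bytes) would look like:
# client = boto3.client("rekognition")
# resp = client.compare_faces(SourceImage={...}, TargetImage={...},
#                             SimilarityThreshold=95)
```

Raising the threshold trades recall for precision: fewer candidates are surfaced, but each surfaced match carries a higher similarity score.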

There’s no guarantee, however, that Amazon’s customers are following its guidelines. And historically, the accuracy of facial recognition algorithms used by law enforcement has left a lot to be desired. A recent House oversight committee hearing on facial recognition technologies revealed that the algorithms used to identify matches are wrong about 15 percent of the time. Meanwhile, the system used by London’s Metropolitan Police produces as many as 49 false matches for every hit.

In May, the ACLU revealed that Amazon worked with the city of Orlando, Florida and the Washington County Sheriff’s Office in Oregon to deploy Rekognition, reportedly charging around $400 for installation and as little as $12 a month.

Orlando is leveraging the facial recognition technology to target suspected criminals in footage from the city’s surveillance systems. And Washington County built a smartphone app that allows deputies to scan faces against a database of 300,000 mugshots for matches.

In June, in a letter addressed to Bezos, 19 groups of Amazon shareholders expressed reservations over sales of Rekognition to law enforcement, joining the ACLU and nearly 70 other groups in protest.

“While Rekognition may be intended to enhance some law enforcement activities, we are deeply concerned it may ultimately violate civil and human rights,” the shareholders wrote. “We are concerned the technology would be used to unfairly and disproportionately target and surveil people of color, immigrants, and civil society organizations … We are concerned sales may be expanded to foreign governments, including authoritarian regimes.”

Update at 10:27 p.m. Eastern: Added reference to a letter sent by members of Congress in response to the ACLU’s report.