A bipartisan group in the U.S. Congress today said it plans to draft legislation to address the use of facial recognition software by law enforcement. Calls for action came from members of the House Oversight and Reform Committee, including Rep. Alexandria Ocasio-Cortez (D-NY) and ranking member Rep. Jim Jordan (R-OH), a Trump supporter who said "seems to me it's time for a timeout."
“You’ve hit the sweet spot that brings progressives and conservatives together,” Rep. Mark Meadows (R-NC) told committee chair Rep. Elijah Cummings (D-MD). “When you have a diverse group on this committee, as diverse as you might see on the polar ends, I’m here to tell you we’re serious about this, and let’s get together and work on legislation. The time is now before it gets out of control.”
The Congressional committee hearing took place at the same time as an Amazon meeting in Seattle, where shareholders voted to reject proposals to halt the sale of Rekognition to governments.
“That just means that it’s more important that Congress acts,” said Rep. Jimmy Gomez (D-CA) in response to the shareholder vote.
The ACLU first revealed law enforcement use of Amazon's facial recognition software roughly one year ago. Since then, audits of Rekognition have found that it works best for white men and is less likely to correctly identify people with dark skin, particularly women of color.
“I do expect that we are going to be able to get some legislation out on this. I talked to the ranking member, and there is a lot of agreement,” Rep. Cummings said. “The question is ‘Do you have an all-out moratorium and at the same time, try to see how this process can be perfected?’ But there’s a lot of agreement here, thank God.”
Trump tech advisor Michael Kratsios today warned against regulation that stifles innovation while in Paris to sign on to the Organisation for Economic Co-operation and Development (OECD) AI principles and recommendations. In recent months, a number of bills have been proposed to regulate AI, including legislation introduced Tuesday in the U.S. Senate to fund a national AI strategy.
Facial recognition of the kind used by state and local law enforcement is accessible to at least one in four law enforcement agencies in the United States today, according to analysis by the Georgetown University Law Center. Because driver's license photo databases are used for facial recognition, half of U.S. adults are currently included in such searches.
A panel that includes facial recognition software experts was nearly unanimous in urging that a national moratorium be put in place until regulation can be enacted or the technology matures. And state lawmakers in Washington and Massachusetts are currently considering moratoriums on the use of facial recognition software. Bans on facial recognition use by police and city departments are also being considered in Oakland and Berkeley, following the first ban in the nation in San Francisco last week.
In addition to warning against the use of real-time facial recognition on cameras like those deployed by police in Detroit and Chicago, some panelists supported a ban on the use of facial recognition at protests. Such surveillance could have a chilling effect on the First Amendment-protected right to protest and could lead people to avoid demonstrations out of fear of being monitored, followed, or tracked by the government.
“It is fundamentally American to protest and frankly un-American to chill that kind of protest,” University of the District of Columbia law professor Andrew Ferguson said at today’s committee meeting, pointing to a 2015 use of facial recognition at protests following the killing of Freddie Gray by police in Baltimore.
MIT Media Lab researcher and Algorithmic Justice League founder Joy Buolamwini coauthored 2018 and 2019 audits of facial recognition systems, including those made by Amazon, IBM, Face++, and Microsoft, and found that they performed poorly on young people, women, and people with dark skin. The results of her work were criticized by AI leaders but defended by dozens of prominent AI researchers last month.
“At a minimum, Congress should pass a moratorium on the police use of facial recognition, as the capacity for abuse, lack of oversight, and technical immaturity poses great risk, especially for marginalized communities,” she said at the meeting.
Buolamwini also urged improvements to the Department of Commerce's NIST data sets used to evaluate the trustworthiness of facial recognition systems, which she says currently oversample communities of color.
Clare Garvie began monitoring police use of facial recognition in the United States in 2016 and last week for the first time called for a national moratorium. Additional research by Garvie and the Georgetown University Center for Privacy and Technology found that police have altered images or used photos of celebrities who look like suspects to make arrests, while a number of major cities have acquired real-time facial recognition software systems.
"Imagine if we had a fingerprint lab drawing fingerprints or drawing a latent print's finger ridges with a pen and submitting that to search. That would be a scandal, that would be a reason for a mistrial or convictions being overturned, and it's hugely problematic," she said to the committee.
Dr. Cedric Alexander, former president of the National Organization of Black Law Enforcement Executives, also spoke today, but he does not favor a moratorium or ban.
“I kind of cringe in some ways when I hear my colleagues here [say] maybe there should be a complete halt or a moratorium on facial recognition. I’m not sure if that’s the answer,” he said.
Instead, he favors creating standards that delineate lawful and inappropriate uses of the technology, and he worries that a lack of proper police training or understanding of the technology could damage police-community relations.
“God knows that’s one thing the police don’t need, considering the environment we’re already in trying to build relationships between police and community,” said Alexander.