It's still too early to celebrate Amazon's Rekognition moratorium

Amazon Rekognition shouldn’t have been a product in the first place -- and it shouldn’t come back.

By Aimee Pearcy, written on June 11, 2020

From The Surveillance Valley Desk

A couple of days ago I wrote a post about Jeff Bezos’ virtue signalling. Now, just two days after IBM left the facial recognition market, it looks like Bezos has finally logged out of his Instagram account.

Yesterday, Amazon announced that it will be “implementing a one-year moratorium on police use of Rekognition,” its facial recognition software.

The statement came as a huge surprise -- especially given that back in February, Amazon seemed to care so little about the issue that it admitted it had no idea how many police departments were actually using Rekognition. 

Amazon’s short statement, as posted on its blog last night, is reproduced below:

“We’re implementing a one-year moratorium on police use of Amazon’s facial recognition technology. We will continue to allow organizations like Thorn, the International Center for Missing and Exploited Children, and Marinus Analytics to use Amazon Rekognition to help rescue human trafficking victims and reunite missing children with their families.

We’ve advocated that governments should put in place stronger regulations to govern the ethical use of facial recognition technology, and in recent days, Congress appears ready to take on this challenge. We hope this one-year moratorium might give Congress enough time to implement appropriate rules, and we stand ready to help if requested.”

We’ve been waiting a long time for Amazon to even admit that its Rekognition technology is problematic.

"It took two years for Amazon to get to this point, but we’re glad the company is finally recognizing the dangers face recognition poses to Black and Brown communities and civil rights more broadly,” Nicole Ozer, technology and civil liberties director with the American Civil Liberties Union (ACLU) of Northern California, said in a statement yesterday.

Back in 2018, the ACLU revealed that facial-recognition technology is racially biased after Rekognition falsely matched 28 members of Congress with mugshots in a test. A 2018 MIT study further confirmed such bias, showing “an error rate of 0.8 percent for light-skinned men, 34.7 percent for dark-skinned women.”

The moratorium is a huge turning point for the many organizations that have been pushing Amazon to stop selling this product to governments for years. Some cities, including Orlando and San Francisco, banned police departments from using it last year.

But a mere one-year pause leaves a lot to be desired on Amazon's part.

Amazon’s announcement doesn’t state that companies using Rekognition can’t sell their services to law enforcement. It doesn’t state whether Amazon Web Services (AWS) will stop taking contracts from the police. And there has been no commitment from Amazon to end the Ring doorbell camera partnerships used for surveillance by 400 police forces.

“This surveillance technology’s threat to our civil rights and civil liberties will not disappear in a year. Amazon must fully commit to a blanket moratorium on law enforcement use of face recognition until the dangers can be fully addressed, and it must press Congress and legislatures across the country to do the same,” said Ozer.

One year isn’t a lot of time to push policy, especially if Amazon lobbyists manage to wangle their way into writing the law.

Amazon Rekognition shouldn’t have been a product in the first place -- and it shouldn’t come back.