Amazon has made the wholly selfless and completely original decision to stop supplying police forces with its facial recognition software, Rekognition, for, wait for it, a whole year.
The anti-racism protests currently tearing through America have prompted the company to consider that, actually, militarised cops might not be that great, and supplying them with facial recognition technology – which civil rights activists say poses a threat to humanity commensurate with nuclear weapons – wasn’t a good look. For now, anyway.
Of course some, like Phil Booth of advocacy group medConfidential, have speculated that it might actually be because, if people are going to be wearing face masks for the foreseeable future, the company’s current recognition models will be rendered temporarily useless.
The move was also categorically not inspired by IBM’s opportunistic crowing earlier in the week that it would abandon ‘general use’ facial recognition in vague solidarity with the Black Lives Matter movement. That firm’s announcement came with big, glaring caveats: IBM will continue to develop ‘specific use’ facial recognition software, and it was never anywhere close to being a market leader in the technology anyway.
Curiously, ethics didn’t come into it when IBM showed interest in a contract to assist Trump’s “extreme vetting” programme for Immigration and Customs Enforcement (ICE), the agency notorious for grievous human rights violations like separating children from their families and keeping people in cages. The programme would’ve automatically mined social media and the internet to find evidence of whether a visitor was a “positively contributing member of society”, or had the potential to commit criminal acts. However, it was eventually assigned to a human team instead. Ethics didn’t come into it in Hitler’s Germany either, when IBM supplied the Nazis with tabulating machines that helped ‘categorise’ the population.
With Amazon, the caveats are also egregious. The company has deep ties to law enforcement that it has been determinedly cultivating for years. Andy Jassy, head of Amazon Web Services, has said the company doesn’t even know how many police forces are currently deploying its Rekognition technology. The company’s home security arm, Ring, has partnered with more than 600 police departments nationwide, some of which give out free Ring cameras to households. Ring has been linked to the pervasive, racist policing methods currently being protested across the country. Amazon claims Rekognition can identify eight different emotions, including, most chillingly, fear.
A 2018 American Civil Liberties Union (ACLU) test demonstrated that Amazon’s Rekognition software was racially biased: it disproportionately misidentified black members of Congress as people whose mugshots were held in a police database. This means black people are more likely to be wrongly classified by the technology and potentially suffer police action based on misidentification.
But even if the software were completely accurate, it would still help facilitate the over-policing of black communities in America. Technology put in the hands of a systemically racist institution will be wielded in a discriminatory manner.
Rights advocates and Amazon’s own employees have repeatedly called on the company to stop marketing and selling Rekognition to government agencies, including police and immigration authorities, and to stop hosting the controversial data analysis company Palantir (which contracts with ICE, the FBI and the CIA) on the Amazon Web Services cloud.
Amazon had already drawn rebukes over its feeble show of support for protesters in the wake of George Floyd’s murder at the hands of police. “The inequitable and brutal treatment of Black people in our country must stop,” read the company’s statement. “Together we stand in solidarity with the Black community — our employees, customers, and partners — in the fight against systemic racism and injustice.”
But while Amazon claimed to stand in solidarity with the black community, many pointed out its flagrant lack of solidarity with its own employees. Over the past few months, Amazon has ignored its warehouse workers’ repeated cries for safer working conditions during the coronavirus pandemic, penalised those who spoke up, and fired employees attempting to unionise the workforce. Even before the pandemic, warehouse employees regularly complained about their wretched working conditions, with ambulances frequently called out to UK facilities and staff stifled by strict surveillance technology. Given all of this, a year’s moratorium on its facial recognition technology is an insultingly underwhelming concession.