Shareholders tell Amazon to stop selling Rekognition facial recognition tech to govt
Amazon shareholders have filed a resolution demanding that Amazon stop selling its Rekognition facial recognition software to government and law enforcement agencies, citing concerns over potential civil and human rights abuses.
“Shareholders request that the Board of Directors prohibit sales of facial recognition technology to government agencies”
“Civil liberties organizations, academics, and shareholders have demanded Amazon halt sales of Rekognition to government, concerned that our Company is enabling a surveillance system ‘readily available to violate rights and target communities of color.’ Four hundred fifty Amazon employees echoed this demand, posing a talent and retention risk,” the group of Amazon shareholders stated in the resolution.
Among the many concerns raised by private citizens, Amazon shareholders, employees, and users are a lack of ethical oversight, accountability, and auditing on the part of the tech giant and the organizations that use its software.
“Shareholders have little evidence our Company is effectively restricting the use of Rekognition to protect privacy and civil rights. In July 2018, a reporter asked Amazon executive Teresa Carlson whether Amazon has ‘drawn any red lines, any standards, guidelines, on what you will and you will not do in terms of defense work,’” the resolution reads.
“We have not drawn any lines there […] We are unwaveringly in support of our law enforcement, defense, and intelligence community,” responded Carlson.
A study by the ACLU found that Rekognition incorrectly matched 28 members of Congress, identifying them as other people who have been arrested for a crime.
The members of Congress who were falsely matched with the mugshot database used in the test included Republicans and Democrats, men and women, and legislators of all ages, from all across the country.
Nearly 40% of Rekognition’s false matches in the test were of people of color, even though they made up only 20% of Congress.
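The ACLU reportedly ran its test at Rekognition's default similarity threshold of 80%; Amazon responded that law enforcement uses should apply a 99% threshold. As a minimal sketch of why that setting matters (the data and the `filter_matches` helper below are invented for illustration, not ACLU or Rekognition output), the threshold is what separates a loose candidate from a confident match:

```python
# Illustrative sketch only: how a confidence threshold filters face-match
# candidates. The candidate data below is invented, not real Rekognition output.

def filter_matches(candidates, threshold):
    """Keep only candidate matches at or above the confidence threshold (percent)."""
    return [c for c in candidates if c["confidence"] >= threshold]

# Hypothetical candidate matches returned by a face-search call.
candidates = [
    {"name": "person_a", "confidence": 81.5},
    {"name": "person_b", "confidence": 92.0},
    {"name": "person_c", "confidence": 99.4},
]

# At an 80% threshold, all three candidates count as "matches".
print(len(filter_matches(candidates, 80)))  # 3
# At a 99% threshold, only the strongest candidate remains.
print(len(filter_matches(candidates, 99)))  # 1
```

A lower threshold returns more candidates, and with it more false matches of the kind the ACLU test surfaced.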
“Shareholders have little evidence our Company is effectively restricting the use of Rekognition to protect privacy and civil rights”
“Rekognition, which is marketed by Amazon Web Services, performs image and video analysis of faces, including identifying and tracking people and their emotions. Tests of the technology have raised concerns that it is biased, inaccurate and dangerous,” according to a statement by Open MIC, which organized the resolution filed by the Sisters of St. Joseph of Brentwood, a member of the Tri-State Coalition for Responsible Investment.
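Rekognition is exposed through the AWS SDK; a call such as boto3's `detect_faces` returns `FaceDetails` entries that include an `Emotions` list of labels with confidence scores. As a sketch of the emotion analysis Open MIC describes (the sample response below is invented, though shaped like the documented API output; a real call needs AWS credentials), extracting each face's dominant emotion might look like:

```python
# Sketch of parsing a Rekognition detect_faces-style response. The sample
# response is invented. A real call would be something like:
#   boto3.client("rekognition").detect_faces(Image={"Bytes": img}, Attributes=["ALL"])
# and requires AWS credentials.

sample_response = {
    "FaceDetails": [
        {
            "Emotions": [
                {"Type": "CALM", "Confidence": 90.1},
                {"Type": "ANGRY", "Confidence": 4.2},
            ],
        }
    ]
}

def dominant_emotions(response):
    """Return the highest-confidence emotion label for each detected face."""
    return [
        max(face["Emotions"], key=lambda e: e["Confidence"])["Type"]
        for face in response.get("FaceDetails", [])
    ]

print(dominant_emotions(sample_response))  # ['CALM']
```

It is this kind of per-face inference, applied at scale to video, that critics argue amounts to tracking people and their emotions.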
Amazon’s reportedly flawed and biased facial recognition software, and the lack of oversight over its use at the potential cost of civil liberties and basic human rights, led Amazon shareholders to file the resolution demanding that sales of Rekognition to government agencies stop until the company becomes more transparent.
“Shareholders request that the Board of Directors prohibit sales of facial recognition technology to government agencies unless the Board concludes, after an evaluation using independent evidence, that the technology does not cause or contribute to actual or potential violations of civil and human rights,” the resolution reads.
“Face surveillance gives the government new power to target and single out immigrants, religious minorities, and people of color in our communities”
On Tuesday the ACLU published a coalition letter to Amazon execs Jeff Bezos and David Zapolsky stating:
“The dangers of face surveillance can only be fully addressed by stopping its use by governments. Face surveillance provides government agencies with an unprecedented ability to track who we are, where we go, what we do, and who we know.
“Face surveillance gives the government new power to target and single out immigrants, religious minorities, and people of color in our communities. Systems built on face surveillance will amplify and exacerbate historical and existing bias that harms these and other over-policed and over-surveilled communities.
“In a world with face surveillance, people will have to fear being watched and targeted by the government for attending a protest, congregating outside a place of worship, or simply living their lives.”
Last year the ACLU filed a Freedom of Information Act (FOIA) request demanding that the Department of Homeland Security (DHS) disclose how it and ICE use Amazon’s Rekognition facial recognition software for law and immigration enforcement.
“ICE should not be using face recognition for immigration or law enforcement”
“ICE [Immigration and Customs Enforcement] should not be using face recognition for immigration or law enforcement. Congress has never authorized such use and should immediately take steps to ensure that federal agencies put the brakes on the use of face recognition for immigration or law enforcement purposes,” wrote ACLU Senior Legislative Counsel Neema Singh Guliani in an open letter at the time.
The ACLU, however, may have a long wait for its FOIA request, as the US Intelligence Community (IC), including the CIA, FBI, and NSA, cannot keep up with the volume of FOIA requests it receives due to a lack of organization and inadequate technology.
The ACLU’s FOIA request arrived the same week as a report by Mijente, the National Immigration Project, and the Immigrant Defense Project revealing that Silicon Valley giants like Amazon Web Services (AWS) and Palantir are providing ICE with the data to incarcerate and deport government “undesirables” en masse.
“Law enforcement has already started using facial recognition with virtually no public oversight or debate or restrictions on use from Amazon”
In that report, the analysts concluded, “A handful of huge corporations, like Amazon Web Services and Palantir, have built a ‘revolving door’ to develop and entrench Silicon Valley’s role in fueling the incarceration and deportation regime,” and if left unchecked, “these tech companies will continue to do the government’s bidding in developing the systems that target and punish en masse those it deems ‘undesirable’ — immigrants, people of color, the incarcerated and formerly incarcerated, activists, and others.”
The report was published a week after an anonymous Amazon employee, whose identity was verified by Medium, argued that Amazon should not be selling its Rekognition facial recognition software to law enforcement, as it was being used by police departments and ICE without ethical oversight.
The anonymous Amazon employee’s words were echoed in the shareholders’ resolution: “Amazon’s website brags of the system’s ability to store and search tens of millions of faces at a time. Law enforcement has already started using facial recognition with virtually no public oversight or debate or restrictions on use from Amazon.”