Prominent AI researchers call on Amazon to stop selling Rekognition facial analysis to law enforcement

In a letter published today, a cohort of about two dozen AI researchers working in tech and academia are calling on Amazon’s AWS to stop selling facial recognition software Rekognition to law enforcement agencies.
Among those who object to Rekognition being used by law enforcement are deep learning luminary and recent Turing Award winner Yoshua Bengio, Caltech professor and former Amazon principal scientist Anima Anandkumar, and researchers in the fields of computer vision and machine learning at Google AI, Microsoft Research, and Facebook AI Research.
Rekognition has been used by police departments in Florida and Washington, and has reportedly been offered to the Department of Homeland Security to identify immigrants.
“We call on Amazon to stop selling Rekognition to law enforcement as legislation and safeguards to prevent misuse are not in place,” reads the letter. “There are no laws or required standards to ensure that Rekognition is used in a manner that does not infringe on civil liberties.”
The researchers cite the work of privacy advocates who are concerned that law enforcement agencies with little understanding of the technical workings of computer vision systems could make serious errors, such as sending an innocent person to jail, or place too much trust in automated systems.
“Decisions from such automated tools may also seem more correct than they actually are, a phenomenon known as ‘automation bias’, or may prematurely limit human-driven critical analyses,” the letter reads.
The letter also criticizes Rekognition for its binary classification of gender as male or female, an approach that can lead to misclassification, and cites the work of researchers like Os Keyes, whose analysis of gender recognition research found few examples of work that incorporates transgender people.
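For context, the sketch below shows roughly how a client queries Rekognition’s facial-analysis endpoint through the publicly documented boto3 detect_faces call; the region, credentials, and image file are illustrative assumptions. The response reports gender only as a binary Male/Female value with a confidence score, which is the design choice the letter objects to.

```python
# Minimal sketch (illustrative): querying Rekognition's facial analysis with boto3.
# Assumes AWS credentials are configured and a local file "face.jpg" exists.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("face.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request the full set of facial-analysis attributes
)

for face in response["FaceDetails"]:
    gender = face["Gender"]  # returned only as a binary value plus a confidence score
    print(f"Predicted gender: {gender['Value']} ({gender['Confidence']:.1f}% confidence)")
```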
The letter takes issue with arguments made by Amazon’s deep learning and AI general manager Matthew Wood and global head of public policy Michael Punke, who reject the results of a recent audit that found Rekognition misidentifies women with dark skin tones as men 31% of the time.
The audit, which examined the performance of commercially available facial analysis tools like Rekognition, was published in January at the AAAI/ACM Conference on Artificial Intelligence, Ethics, and Society by Inioluwa Deborah Raji and Joy Buolamwini.
The report follows the release a year ago of Gender Shades, an analysis that found facial recognition software from companies like Face++ and Microsoft had limited ability to recognize people with dark skin tones, especially women of color.
Timnit Gebru, a Google researcher who coauthored Gender Shades, also signed the letter published today.
A study the American Civil Liberties Union (ACLU) released last summer found that Rekognition inaccurately labeled members of the 115th U.S. Congress as criminals, a label Rekognition was twice as likely to bestow on members of Congress who are people of color than their white counterparts.
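The ACLU reportedly ran that test by searching photos of lawmakers against a database of arrest photos using Rekognition’s face-matching API at its default similarity threshold of 80 percent; Amazon has said it recommends a 99 percent threshold for law enforcement use. A rough sketch of that kind of query follows; the collection name and image file are hypothetical.

```python
# Rough sketch (hypothetical collection "arrest-photos" and probe image "lawmaker.jpg");
# illustrates a Rekognition face search at the default 80% similarity threshold.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("lawmaker.jpg", "rb") as f:
    probe_bytes = f.read()

response = client.search_faces_by_image(
    CollectionId="arrest-photos",   # hypothetical pre-indexed collection of mugshots
    Image={"Bytes": probe_bytes},
    FaceMatchThreshold=80,          # Rekognition's default similarity cutoff
    MaxFaces=5,
)

for match in response["FaceMatches"]:
    face_id = match["Face"]["FaceId"]
    print(f"Possible match {face_id} with similarity {match['Similarity']:.1f}%")
```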
Following the release of the paper and an accompanying New York Times article, Wood claimed the research “draws misleading and false conclusions.”
In response, the letter published today says that in multiple blog posts Punke and Wood “misrepresented the technical details for the work and the state-of-the-art in facial analysis and face recognition.” The letter also disputes specific claims made by Wood and Punke, such as the assertion that facial recognition and facial analysis have completely different underlying technology.
Instead, the letter asserts that many machine learning researchers view the two as closely related and that facial recognition data sets can be used to train models for facial analysis.
“So in contrast to Dr. Wood’s claims, bias found in one system is cause for concern in the other, particularly in use cases that could severely impact people’s lives, such as law enforcement applications.”
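The technical point behind that argument is that both tasks commonly build on the same learned face representations: a network trained for face recognition can be reused as a feature extractor for facial-analysis attributes, so bias in the shared representation carries over. The sketch below illustrates that kind of transfer in PyTorch; the backbone, weights, and attribute head are stand-ins, since Amazon has not disclosed Rekognition’s internals.

```python
# Illustrative transfer-learning sketch (PyTorch). The backbone, data, and head
# are placeholders, not Rekognition's actual architecture.
import torch
import torch.nn as nn
from torchvision import models

# Backbone pretrained for a recognition-style task (generic ImageNet weights
# stand in here for a face-recognition embedding network).
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
embedding_dim = backbone.fc.in_features
backbone.fc = nn.Identity()  # expose the embedding instead of the original classifier

# Facial-analysis head reusing the same embedding, e.g. a binary attribute classifier.
attribute_head = nn.Linear(embedding_dim, 2)

def predict_attribute(image_batch: torch.Tensor) -> torch.Tensor:
    """Map face images -> shared embedding -> attribute logits."""
    with torch.no_grad():
        features = backbone(image_batch)  # shared face representation
    return attribute_head(features)       # analysis task rides on recognition features

# Any systematic bias learned by the shared backbone is inherited by the attribute
# head, which is the letter's point about the two systems being closely related.
```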
The letter opposing law enforcement use of Rekognition comes weeks after members of the U.S. Senate proposed legislation to regulate the use of facial recognition software.
For its part, Amazon said it welcomes some form of regulation or “legislative framework,” while Microsoft urged the federal government to regulate facial recognition software before law enforcement agencies abuse it.
