Prominent AI researchers call on Amazon to stop selling Rekognition facial analysis to law enforcement

April 4, 2019   Big Data

In a letter published today, a cohort of about two dozen AI researchers working in tech and academia are calling on Amazon’s AWS to stop selling facial recognition software Rekognition to law enforcement agencies.

Among those who object to Rekognition being used by law enforcement are deep learning luminary and recent Turing Award winner Yoshua Bengio, Caltech professor and former Amazon principal scientist Anima Anandkumar, and researchers in computer vision and machine learning at Google AI, Microsoft Research, and Facebook AI Research.

Rekognition has been used by police departments in Florida and Washington, and has reportedly been offered to the Department of Homeland Security to identify immigrants.

“We call on Amazon to stop selling Rekognition to law enforcement as legislation and safeguards to prevent misuse are not in place,” reads the letter. “There are no laws or required standards to ensure that Rekognition is used in a manner that does not infringe on civil liberties.”

The researchers cite the work of privacy advocates who are concerned that law enforcement agencies with little understanding of the technical aspects of computer vision systems could make serious errors, such as jailing an innocent person, or could place too much trust in automated systems.

“Decisions from such automated tools may also seem more correct than they actually are, a phenomenon known as ‘automation bias’, or may prematurely limit human-driven critical analyses,” the letter reads.

The letter also criticizes Rekognition for its binary classification of gender as male or female, an approach that can lead to misclassification, and cites the work of researchers like Os Keyes, whose analysis of gender recognition research found few examples of work that incorporates transgender people.

The letter takes issue with arguments made by Amazon's general manager of deep learning and AI Matt Wood and global head of public policy Michael Punke, who rejected the results of a recent audit that found Rekognition misidentifies women with dark skin tones as men 31% of the time.

The analysis, which examined the performance of commercially available facial analysis tools like Rekognition, was published in January at the AAAI/ACM Conference on AI, Ethics, and Society by Inioluwa Deborah Raji and Joy Buolamwini.

The report follows the release a year earlier of Gender Shades, an analysis that found facial recognition software from companies like Face++ and Microsoft had limited ability to recognize people with dark skin tones, especially women of color.

Timnit Gebru, a Google researcher who coauthored Gender Shades, also signed the letter published today.

A study the American Civil Liberties Union (ACLU) released last summer found that Rekognition inaccurately labeled members of the 115th U.S. Congress as criminals, a label it was twice as likely to apply to members of Congress who are people of color than to their white counterparts.

Following the release of the paper and an accompanying New York Times article, Wood claimed the research “draws misleading and false conclusions.”

In response, the letter published today says that in multiple blog posts Punke and Wood "misrepresented the technical details for the work and the state-of-the-art in facial analysis and face recognition." The letter also disputes specific claims made by Wood and Punke, such as the assertion that facial recognition and facial analysis rely on completely different underlying technology.

Instead, the letter asserts that many machine learning researchers view the two as closely related and that facial recognition data sets can be used to train models for facial analysis.

“So in contrast to Dr. Wood’s claims, bias found in one system is cause for concern in the other, particularly in use cases that could severely impact people’s lives, such as law enforcement applications.”

The letter opposing law enforcement use of Rekognition comes weeks after members of the U.S. Senate proposed legislation to regulate the use of facial recognition software.

For its part, Amazon said it welcomes some form of regulation or “legislative framework,” while Microsoft urged the federal government to regulate facial recognition software before law enforcement agencies abuse it.


Big Data – VentureBeat
