Too Long; Didn't Read
In our <a href="http://www.aies-conference.com/wp-content/uploads/2019/01/AIES-19_paper_223.pdf">recent study</a> of bias in commercial facial analysis systems, Deborah Raji and I show that Amazon Rekognition, an AI service the company sells to law enforcement, exhibits gender and racial bias in gender classification. The <a href="https://www.nytimes.com/2019/01/24/technology/amazon-facial-technology-study.html">New York Times</a> broke the story. Unlike its peers, Amazon did not submit its AI systems to the National Institute of Standards and Technology (NIST) for the latest rounds of facial recognition evaluations, so its claims of being bias-free rest on internal evaluations alone. We conducted an external evaluation to provide an outside perspective. Despite receiving preliminary reports of gender and racial bias in a <a href="https://uploads.strikinglycdn.com/files/e286dfe0-763b-4433-9a4b-7ae610e2dba1/RekognitionGenderandSkinTypeDisparities-June25-Mr.%20Bezos.pdf?id=125030">June 25, 2018 letter</a>, Amazon’s approach thus far has been one of denial, deflection, and delay. We cannot rely on Amazon to police itself, nor should it provide unregulated and unproven technology to police or government agencies.