
Amazon Rekognition Gives Sexist and Racist Results


New research shows that Amazon's facial recognition algorithms struggle with gender and racial bias.

A study from the Media Lab at MIT (Massachusetts Institute of Technology) found that Amazon's facial recognition software had no trouble identifying the gender of lighter-skinned men, but mistook women for men roughly one-fifth of the time and misidentified darker-skinned women as men nearly one third of the time. Software from IBM and Microsoft fared far better on darker-skinned subjects, with Microsoft's error rate as low as 1.5 percent.

Amazon disputed the results of the testing, which was carried out during 2018. The company said the research team did not use the latest version of Rekognition and that it tested facial analysis rather than facial recognition. Facial analysis, Amazon argued, is a simpler, more basic capability, while facial recognition offers far more comprehensive features, and it stressed that the two are separate packages.
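For readers unfamiliar with the distinction Amazon is drawing, a minimal sketch using the AWS boto3 SDK for Python is below. It contrasts DetectFaces, the facial analysis call that estimates attributes such as gender (the capability the MIT study measured), with CompareFaces, a facial recognition call that matches one face against another. The image file names are hypothetical, and the sketch assumes AWS credentials are already configured.

```python
import boto3

rekognition = boto3.client("rekognition")

with open("portrait.jpg", "rb") as f:  # hypothetical input image
    image_bytes = f.read()

# Facial analysis: estimate attributes of each detected face,
# including the gender label whose error rates the study reported.
analysis = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)
for face in analysis["FaceDetails"]:
    gender = face["Gender"]
    print(f"Predicted gender: {gender['Value']} "
          f"({gender['Confidence']:.1f}% confidence)")

# Facial recognition: match the face against a second, reference image.
with open("reference.jpg", "rb") as f:  # hypothetical reference image
    reference_bytes = f.read()

comparison = rekognition.compare_faces(
    SourceImage={"Bytes": image_bytes},
    TargetImage={"Bytes": reference_bytes},
    SimilarityThreshold=90,
)
print(f"Matches found: {len(comparison['FaceMatches'])}")
```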


This is not the first time such software has come under fire. In February 2018, MIT and Stanford researchers reported that three different facial analysis programs produced sexist and racist results and were frequently inaccurate. Since then, IBM has worked to improve the accuracy of its facial analysis tools, and Microsoft has called for regulation to enforce high standards.

Critics, meanwhile, are still waiting for answers from Amazon. While the company has pitched its software for use by customs authorities, some shareholders want the system shut down, worried that it could violate people's civil rights.

Source: https://www.engadget.com/2019/01/25/amazon-rekognition-facial-analysis-gender-race-bias-mit/