In June 2018, amid the other letters demanding that the company stop police use of Rekognition, Raji and Buolamwini expanded the Gender Shades audit to encompass Rekognition's performance. The results, published half a year later in a peer-reviewed paper, once again found huge technical inaccuracies: Rekognition classified the gender of dark-skinned women 31.4 percentage points less accurately than that of light-skinned men.
WHY IT MATTERS: This article presents detailed numbers and references showing that facial recognition is markedly less accurate for dark-skinned people, and that Amazon's Rekognition technology is in use by law enforcement via Ring doorbells.