While some facial recognition software can identify gender, the accuracy of its results varies by skin tone and gender: it is accurate 99 percent of the time for photos of white men, but erroneous up to 35 percent of the time “for images of darker skinned women.” Steve Lohr’s article in the New York Times (February 9, 2018) describes the research results of computer scientist Joy Buolamwini of the M.I.T. Media Lab, founder of the Algorithmic Justice League. Click here to read the entire article. To hear Dr. Buolamwini’s TED Talk on coded bias, click here.
This is reminiscent of a problem uncovered in early color photography: click here to watch a brief video explaining how “early film stocks in photograph[y] were designed with light skin as the ideal skin standard, and therefore sometimes had problems rendering darker skins.”