facial analysis software

2019-02-07
Buolamwini is a computer scientist, founder of the Algorithmic Justice League, and a poet of code.

Machines can discriminate in harmful ways. I experienced this firsthand when, as a graduate student at MIT in 2015, I discovered that some facial analysis software couldn't detect my dark-skinned face until I put on a white mask. These systems are often trained on images of predominantly light-skinned men. And so I decided to share my experience of the coded gaze: the bias in artificial intelligence that can lead to discriminatory or exclusionary practices.