News

Facial recognition software is biased towards white men, researcher finds

New research out of MIT’s Media Lab underscores what other experts have reported or at least suspected before: facial recognition technology is subject to biases that stem from the data sets it is trained on and the conditions under which its algorithms are created.

Joy Buolamwini, a researcher at the MIT Media Lab, recently built a dataset of 1,270 faces, using the faces of politicians selected based on their countries’ rankings for gender parity (in other words, countries with more women in office). Buolamwini then tested the accuracy of three facial recognition systems: those made...


Published By: theverge.com - Sunday, 11 February