2062 by Toby Walsh

Author: Toby Walsh
Language: English
Format: epub
Publisher: Schwartz Publishing Pty. Ltd


One of the largest benchmarks used in face recognition is the ‘Labelled Faces in the Wild’ dataset. This was released in 2007 and contains over 13,000 images of faces collected from news stories on the web. Reflecting the time of its release, the most common face is that of George W. Bush. The dataset is 77.5 per cent male and 83.5 per cent white. Very obviously, people in the news are not representative of the wider population.

However, there are image sets in use within the computer vision community that are more diverse. For instance, the ‘10k US Adult Faces Database’, released in 2013, contains 10,168 faces designed to match precisely the demographic distribution of the United States (according to variables such as age, race and gender). And Facebook has billions of photos at its disposal for its DeepFace research, since nearly everyone who signs up to Facebook uploads a photograph. Facebook really is a very large Face Book. So it is not at all clear that face recognition is being held back by an absence of diverse training sets.

There’s another simple factor that may be causing this bias to continue, which is perhaps a little more challenging for many well-meaning liberals. In humans, there is evidence that people are significantly better at recognising people from within their own ethnic group than those from outside their ethnic group. This is called the cross-race effect. There are similar effects within and between different age groups. So it’s possible that face-recognition software is replicating this. One solution might be to train separate face-recognition algorithms for different racial groups, and for different age groups.

There is a related phenomenon in speech recognition. To get good accuracy with both male and female voices, you need different software. So, likewise, the racial bias in face-recognition software might not be due to biased data, but simply a sign that we need different programs to recognise different races.

GORILLA WARFARE

Given that face recognition is all about recognising faces, it’s perhaps not surprising that software for face recognition has been especially prone to charges of racism. In 2015 Jacky Alciné found that Google Photos was tagging pictures of him and his girlfriend as gorillas. His tweet succinctly described the failure:

Google Photos, y’all f*cked up. My friend’s not a gorilla.
