Facebook AI Mistakenly Labels Men as Primates

by John Lister

Facebook has apologized after its recommendation system labeled two black men as "primates." It's the latest in a series of blunders by automated technology.

The "recommendation" was shown at the end of a video posted by a British newspaper. It featured a white man in the United States phoning 911 to report "being harassed by a bunch of black men." The video did not include any content of references involving non-human primates.

One user who watched the video shared a screenshot of a message created by Facebook that appeared at the end of the video, asking "Keep seeing videos about Primates?" While technically all humans are primates, that's certainly not how the user interpreted the label.

Such recommendations are designed to capture the viewer's interest and keep them watching videos for longer, in turn increasing the time during which they are exposed to Facebook's money-making ads. The system is also designed to take note of whether users follow the recommendation, which could help Facebook decide what to show them in the future.
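In broad strokes, that feedback loop amounts to logging each prompt and whether the viewer acted on it, then using the follow-through rate as a signal for future recommendations. The short Python sketch below illustrates the idea only; the class and method names are hypothetical and are not drawn from Facebook's actual system.

# Minimal, hypothetical sketch of an engagement-feedback loop of the kind
# described above -- illustrative only, not Facebook's real code.
from collections import defaultdict

class RecommendationLog:
    """Tracks whether viewers follow topic prompts shown after videos."""

    def __init__(self):
        self.shown = defaultdict(int)      # times each topic prompt was displayed
        self.followed = defaultdict(int)   # times a viewer clicked through

    def record_impression(self, topic, clicked):
        self.shown[topic] += 1
        if clicked:
            self.followed[topic] += 1

    def follow_rate(self, topic):
        """Fraction of prompts for a topic that viewers acted on -- the kind of
        signal a ranking system might use to decide what to show next."""
        if self.shown[topic] == 0:
            return 0.0
        return self.followed[topic] / self.shown[topic]

log = RecommendationLog()
log.record_impression("primates", clicked=False)
log.record_impression("news", clicked=True)
print(log.follow_rate("news"))  # 1.0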

Google Photos Had Similar Problem

A Facebook spokesperson told the New York Times that "while we have made improvements to our A.I., we know it's not perfect, and we have more progress to make. We apologize to anyone who may have seen these offensive recommendations." (Source: nytimes.com)

It's not the first time a big tech company has blundered with automatic recognition technology and matters of race. In 2015, users noticed Google Photos had automatically labeled pictures of black people as "gorillas."

Although Google vowed to fix the problem, it was later accused of not solving the fundamental issue. Instead, it simply removed "gorilla" and other primate and ape terms from its labeling database.
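That kind of workaround is easy to picture: rather than retraining the classifier, its output is simply filtered against a list of banned terms before it reaches the user. The brief Python sketch below is purely illustrative, using an invented blocklist and helper function rather than Google's actual implementation.

# Hedged sketch of a "remove the term" workaround: filter predicted labels
# against a blocklist instead of fixing the underlying model. The names and
# label strings here are illustrative assumptions.
BLOCKED_LABELS = {"gorilla", "chimpanzee", "monkey", "primate", "ape"}

def filter_labels(predicted_labels):
    """Drop any blocked label from the classifier's output."""
    return [label for label in predicted_labels if label.lower() not in BLOCKED_LABELS]

print(filter_labels(["Person", "Outdoors", "Gorilla"]))  # ['Person', 'Outdoors']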

Facial Recognition Not Equally Reliable

While in these cases the main risk is causing deep offense, other uses of the technology have had a more damaging effect. Several high-profile facial recognition systems have proven far less reliable with black faces than with faces of people from other ethnic backgrounds, leading to more cases of mistaken identity. (Source: wired.com)

Exactly why this is the case is uncertain. One theory is that many algorithms are trained using databases in which black subjects are a minority, meaning the rules the AI develops are less accurate for those faces. Another is simply that pictures of black faces are darker, making it harder to use contrast accurately as part of the recognition.
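One way engineers can surface this kind of disparity is to report accuracy per demographic group rather than a single overall number. The Python sketch below uses entirely invented data to show how a respectable-looking overall score can hide a large gap between groups.

# Illustrative sketch of per-group accuracy reporting. The records below are
# fabricated for demonstration and do not come from any real system.
from collections import defaultdict

def accuracy_by_group(records):
    """records: list of (group, predicted_id, true_id) tuples."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for group, predicted, actual in records:
        total[group] += 1
        if predicted == actual:
            correct[group] += 1
    return {group: correct[group] / total[group] for group in total}

records = (
    [("group_a", "id1", "id1")] * 95 + [("group_a", "id2", "id1")] * 5 +   # 95% correct
    [("group_b", "id1", "id1")] * 70 + [("group_b", "id2", "id1")] * 30    # 70% correct
)

overall = sum(1 for _, p, a in records if p == a) / len(records)
print(round(overall, 3))           # 0.825 -- looks fine overall
print(accuracy_by_group(records))  # {'group_a': 0.95, 'group_b': 0.7} -- one group fares far worse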

What's Your Opinion?

Are you surprised Facebook's system made this error? Could Facebook do better in avoiding such blunders? Do facial recognition and other automated processes need to take more account of different skin tones?


Comments

Gurugabe:

Honestly, I am not surprised by this. Although it is very unfortunate, AI is still so new and growing so fast that the programmers on these projects have not had the time to fully develop the algorithms for facial recognition.

At the moment, what does AI look at when recognizing faces? It breaks the face down into a simple wireframe based on the position of the facial features, then looks at the skin color around them. It's easy to get that wrong when you are working from such a simplified view; it's like looking at an object through frosted glass and expecting to instantly know who or what you are looking at. Game designers slowly moved away from basic wireframes when creating people for their games because the human face is so complex, and of course the devs wanted their characters to look more human and be easier to connect with.

Remember the dot-com boom? Every business had to have a website right then, and once the "newness" wore off a little, the Internet as a place for commerce and business presence improved because people had the time to develop their sites properly. Likewise, it will take a few more years before this rush to get AI facial recognition out the door slows down and developers have the time to break a face down into much more than eye, ear, nose, and mouth placement plus the color of the surrounding skin. Until then, please work out the bugs before making it public.