Facebook users who recently watched a video from a British tabloid featuring Black men saw an automated prompt from the social network asking if they would like to “keep seeing videos about Primates,” causing the company to investigate and disable the artificial intelligence-powered feature that pushed the message.
On Friday, Facebook apologized for what it called “an unacceptable error” and said it was examining the recommendation feature to “prevent this from happening again.”
The video, dated June 27, 2020, was from The Daily Mail and showed clips of Black men in altercations with white civilians and police officers. It had no connection to monkeys or primates.
Darci Groves, a former content design manager at Facebook, said a friend recently sent her a screenshot of the prompt. She then posted it to a product feedback forum for current and former Facebook employees. In response, a product manager for Facebook Watch, the company’s video service, called it “unacceptable” and said the company was “looking into the root cause.”
Ms. Groves said the prompt was “horrifying and egregious.”
Dani Lever, a Facebook spokeswoman, said in a statement: “As we have said, while we have made improvements to our A.I., we know it’s not perfect, and we have more progress to make. We apologize to anyone who may have seen these offensive recommendations.”
Google, Amazon, and other tech companies have been under scrutiny for years over biases in their artificial intelligence systems, particularly around issues of race. Studies have shown that facial recognition technology is biased against people of color and has more trouble identifying them, leading to incidents in which Black people have been discriminated against or arrested because of computer error.
In one example from 2015, Google Photos mistakenly labeled pictures of Black people as “gorillas,” for which Google said it was “genuinely sorry” and would fix the problem immediately. More than two years later, Wired found that Google’s solution was to censor the word “gorilla” from searches while also blocking “chimp,” “chimpanzee” and “monkey.”
Facebook has one of the world’s largest repositories of user-uploaded images, which it uses to train its facial and object recognition algorithms. The company, which tailors content for users based on their past browsing and viewing habits, sometimes asks people if they would like to keep seeing posts in related categories. It was unclear whether messages like the “primates” one were widespread.
Facebook and its photo-sharing app, Instagram, have struggled with other race-related problems. After the European Football Championship in July, for example, three Black members of England’s national soccer team were racially abused on the social network for missing penalty kicks in the championship game.
Racial problems have also caused internal strife at Facebook. In 2016, Mark Zuckerberg, the chief executive, asked employees to stop crossing out the phrase “Black Lives Matter” and replacing it with “All Lives Matter” in a communal space at the company’s headquarters in Menlo Park, California. Hundreds of employees also staged a virtual walkout last year to protest the company’s handling of a post by President Donald J. Trump about the killing of George Floyd in Minneapolis.
The company later hired a vice president of civil rights and released a civil rights audit. In an annual diversity report in July, Facebook said 4.4 percent of its U.S.-based employees were Black, up from 3.9 percent the year before.
Ms. Groves, who left Facebook after four years, said in an interview that a series of missteps at the company suggested that dealing with racial problems was not a priority for its executives.
“Facebook can’t keep making these mistakes and then saying, ‘I’m sorry,'” she said.