New Delhi: Facebook came under fire yet again on Friday after its topic recommendation feature mistook Black men for “primates” in a video. In the past as well, facial recognition software has been criticized by civil rights advocates due to its inaccuracy when it comes to people who aren’t white.
Because police departments use the technology in investigations, this inaccuracy has led to several people of colour being wrongfully arrested.
“We apologize to anyone who may have seen these offensive recommendations,” Facebook told AFP.
According to a New York Times report, Facebook users who watched a British tabloid video featuring Black men received an auto-generated prompt asking if they would like to “keep seeing videos about Primates”.
“We disabled the entire topic recommendation feature as soon as we realized this was happening so we could investigate the cause and prevent this from happening again,” Facebook further told AFP.
Humans are among the primates, but this particular video had nothing to do with monkeys, chimpanzees or gorillas.
Darci Groves, a former design manager at Facebook, took to Twitter to point out the prompt.
Um. This “keep seeing” prompt is unacceptable, @Facebook. And despite the video being more than a year old, a friend got this prompt yesterday. Friends at FB, please escalate. This is egregious. pic.twitter.com/vEHdnvF8ui
— Darci Groves (@tweetsbydarci) September 2, 2021
“This is egregious,” she wrote.