In a video showing people of color, the algorithm suggested watching "more primate videos". A Facebook spokesperson described the error as "unacceptable". The program was taken offline.
The culprit is a video from the British tabloid Daily Mail, dated 27 June 2020 and titled "White man calls the police on black men at the marina". The footage shows a group of Black men arguing with white men and police officers. There is no reference to monkeys or gorillas. Yet Facebook's algorithm decided to suggest primate videos to those who viewed the post, with a prompt that read: "Keep seeing videos about primates?". Controversy inevitably erupted over an AI apparently not trained well enough to distinguish humans from apes.
Facebook immediately launched an internal investigation and disabled the algorithm. Then came the apology: a spokesperson called the error "unacceptable" and assured that everything possible is being done so that "it does not happen again". And further: "While we have made improvements to our artificial intelligence, we know it's not perfect and we have more progress to make."
The British tabloid's video is not an isolated case, however. For years, the AI systems of big tech companies, from Facebook to Google to Amazon, have been under particular scrutiny for racial bias. Numerous studies show, for example, that facial recognition systems have a much harder time recognizing people with dark skin. This leads to serious incidents such as the one that happened to 31-year-old Nijeer Parks: he resembled a wanted man, the AI identified him as the culprit, and he ended up in jail. Back in 2015, Google Photos tagged a photo of two Black people as "gorillas".
Sep 5, 2021 (updated Sep 5, 2021, 12:44)
© Reproduction reserved