Facebook and YouTube should have learned from Microsoft’s racist chatbot
Facebook and YouTube have recently come under fire for offensive search suggestions
By Ingrid Angulo
Microsoft showed us in 2016 that it only takes hours for internet users to turn an innocent chatbot into a racist. Two years later, Facebook and YouTube haven’t learned from that mistake.
Facebook came under fire after users noticed search suggestions alluding to child abuse, along with other vulgar and upsetting predictions, when they began typing “video of...” Facebook promptly apologised and removed the predictions.
YouTube has also been the subject of investigations into how it surfaces extreme content. On Monday, YouTube users highlighted the prevalence of conspiracy theories and extremist material in the site’s autocomplete search box.
Both companies blamed users for their search suggestion issues. Facebook told The Guardian, “Facebook search predictions are representative of what people may be searching for on Facebook and are not necessarily reflective of actual content on the platform.”
Alphabet’s Google, which owns YouTube, says its search results take into account “popularity” and “freshness,” both of which are determined by user behavior.