<?xml version="1.0"?>
<rss version="2.0" xml:base="link" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:fb="http://www.facebook.com/2008/fbml" xmlns:foaf="http://xmlns.com/foaf/0.1/" xmlns:media="http://www.rssboard.org/media-rss" xmlns:og="http://ogp.me/ns#" xmlns:rdfs="http://www.w3.org/2000/01/rdf-schema#" xmlns:schema="http://schema.org/" xmlns:sioc="http://rdfs.org/sioc/ns#" xmlns:sioct="http://rdfs.org/sioc/types#" xmlns:skos="http://www.w3.org/2004/02/skos/core#" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
  <channel>
    <title>Jerry Kaplan - South China Morning Post</title>
    <link>https://www.scmp.com/rss/329907/feed</link>
    <description/>
    <language>en</language>
    <image>
      <url>https://assets.i-scmp.com/static/img/icons/scmp-meta-1200x630.png</url>
      <title>Jerry Kaplan - South China Morning Post</title>
      <link>https://www.scmp.com</link>
    </image>
    <atom:link href="https://www.scmp.com/rss/329907/feed" rel="self" type="application/rss+xml"/>
    <item>
      <description>Online search results, like so many other algorithms that rely on external sources of information, naturally expose whatever leanings or affinities that data reflects.
This effect – called “algorithmic bias” – is fast becoming a common feature of our digital world, and in far more insidious ways than search results.
Whether Google is partisan is a matter of opinion. But consider the subtle ways that it reinforces racial...</description>
      <guid isPermaLink="true">https://www.scmp.com/lifestyle/article/2179122/how-google-search-and-other-algorithms-could-be-racist-and-support?utm_source=rss_feed</guid>
      <link>https://www.scmp.com/lifestyle/article/2179122/how-google-search-and-other-algorithms-could-be-racist-and-support?utm_source=rss_feed</link>
      <pubDate>Mon, 24 Dec 2018 12:33:55 +0000</pubDate>
      <title>How Google Search and other algorithms could be racist and support stereotyping</title>
      <enclosure length="5196" type="image/jpeg" url="https://cdn.i-scmp.com/sites/default/files/styles/1280x720/public/images/methode/2018/12/24/02a86226-0500-11e9-b0d2-cf4a0f50367e_image_hires_113239.JPG?itok=s9u0Adnm&amp;v=1545622363"/>
      <media:content height="3296" medium="image" type="image/jpeg" url="https://cdn.i-scmp.com/sites/default/files/styles/1280x720/public/images/methode/2018/12/24/02a86226-0500-11e9-b0d2-cf4a0f50367e_image_hires_113239.JPG?itok=s9u0Adnm&amp;v=1545622363" width="5196"/>
    </item>
  </channel>
</rss>