How Google Search and other algorithms could be racist and support stereotyping

  • Search engines rank results based on the previous behaviour of their users, which can produce ‘algorithmic bias’
  • People need to understand that search engines and their results are a reflection of society
PUBLISHED : Monday, 24 December, 2018, 8:33pm
UPDATED : Monday, 24 December, 2018, 8:33pm

Online search results, like so many other algorithms that rely on external sources of information, naturally expose whatever leanings or affinities that data reflects.

This effect – called “algorithmic bias” – is fast becoming a common feature of our digital world, and in far more insidious ways than search results.

Whether Google is partisan is a matter of opinion. But consider the subtle ways that it reinforces racial stereotypes.

Try typing “Asian girls” into the Google search bar. If your results are like mine, you’ll see a selection of come-ons to view sexy pictures, advice for “white men” on how to pick up Asian girls and some items unprintable in this publication. Try “Caucasian girls” and you’ll get wonky definitions of “Caucasian,” pointers to stock photos of wholesome women and kids, and some anodyne dating advice. Does Google really think so little of me?

Of course not. Google has no a priori desire to pander to baser instincts. Its search results, like it or not, reflect the actual behaviour of its audience. And if that’s what folks like me click on most frequently, that’s what Google assumes I want to see. While I might take offence at being lumped in with people whose values I deplore, it’s hard to argue that Google is at fault. Yet it’s clear that such racially tinged results are demeaning to all parties involved.
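
To see how that feedback loop works, here is a minimal sketch in Python. It is not Google's actual ranking algorithm, which is proprietary and far more elaborate; the result labels, starting counts and position-bias weights are all invented for illustration.

```python
import random

# Toy model of click-driven ranking. All names and numbers are invented;
# this illustrates the feedback loop, not Google's actual algorithm.

# Click counts seeded by earlier users' behaviour.
clicks = {"sexualised pictures": 120, "dating advice": 40, "news and culture": 15}

def rank(clicks):
    """Order results by how often past users clicked them."""
    return sorted(clicks, key=clicks.get, reverse=True)

random.seed(1)
for _ in range(10_000):
    shown = rank(clicks)
    # Position bias: the top slot draws far more clicks than the rest,
    # so whatever leads early keeps pulling further ahead.
    chosen = random.choices(shown, weights=[6, 3, 1])[0]
    clicks[chosen] += 1

print(rank(clicks))  # the early leader stays entrenched for every user
```

Because each user is shown the ranking that everyone else's clicks produced, the majority's tastes end up deciding what every individual sees.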

One particularly difficult-to-detect source of algorithmic bias stems not from the data itself, but from sampling errors that may over- or under-represent certain portions of the target population.

For instance, a face-recognition system trained mainly on images of light-skinned people is likely to perform better on white faces than on black ones.
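
A similar sketch shows how a skewed training sample plays out, assuming a one-dimensional stand-in for face features and a hypothetical threshold rule; real face-recognition systems are far more complex, but an unrepresentative sample biases them in the same direction.

```python
import random

# Toy demonstration of sampling bias in training data. The 1-D "feature",
# the group centres and the threshold rule are hypothetical stand-ins.
random.seed(42)

def faces(group, n):
    """Synthetic feature values; each group clusters around its own centre."""
    centre = {"light": 0.0, "dark": 4.0}[group]
    return [random.gauss(centre, 1.0) for _ in range(n)]

# Biased training sample: 95% light-skinned faces, 5% dark-skinned.
train = faces("light", 950) + faces("dark", 50)

# "Training": learn an average face template, then set a tolerance wide
# enough to recognise 95% of the training faces.
template = sum(train) / len(train)
distances = sorted(abs(x - template) for x in train)
tolerance = distances[int(0.95 * len(distances))]

def recognised(x):
    return abs(x - template) <= tolerance

# A balanced test set exposes the gap the skewed sample created: the
# majority group is recognised far more reliably than the minority.
for group in ("light", "dark"):
    test = faces(group, 1000)
    rate = sum(recognised(x) for x in test) / len(test)
    print(f"{group}: detection rate {rate:.2f}")
```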

In a famous case of such bias, Google Photos’ image-labelling algorithm tagged photos of African Americans as “gorillas”, to the company’s embarrassment.

As we delegate more of our decision-making to machines, we run the risk of enshrining all sorts of injustices in computer programs, where they could fester undetected in perpetuity.

Addressing this critical risk should be an urgent social priority. We need to educate the public to understand that computers are not infallible mechanical sages incapable of malice and bias. Rather, in our increasingly data-driven world, they are mirrors of ourselves – reflecting both our best and worst tendencies.

Like the evil queen in Snow White, how we react to this new mirror on the wall might say more about us than any computer program ever can.

The Washington Post

Kaplan is a research affiliate at Stanford University’s Center on Democracy, Development and the Rule of Law at the Freeman Spogli Institute for International Studies.