Algorithms are crucial to our digital lives – but they can stick us in a cognitive bubble
From Google searches and Amazon purchases to Facebook news feeds and satellite surveillance – the brains behind computer programs can be problematic when they try to predict our choices
Algorithms are a crucial cog in the mechanics of our digital world, but also a nosy minder of our personal lives and a subtle, even insidious influence on our behaviour.
They have also come to symbolise the risks of a computerised world conditioned by commercial factors.
Long before they were associated with Google searches, Facebook pages and Amazon suggestions, algorithms were the brainchild of a Persian scientist.
The word is a combination of medieval Latin and the name of a ninth-century mathematician and astronomer, al-Khwarizmi, considered the father of algebra.
A bit like a kitchen recipe, an algorithm is a series of instructions that allows you to obtain a desired result, according to sociologist Dominique Cardon, who wrote À quoi rêvent les algorithmes (What Are Algorithms Dreaming of?). Initially a term known mainly to mathematicians, it has spread as computers have developed.
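Cardon’s kitchen-recipe analogy can be made concrete with one of the oldest algorithms of all, Euclid’s method for finding the greatest common divisor of two numbers – a minimal illustration of ours, not an example from Cardon’s book:

```python
def gcd(a, b):
    """Euclid's algorithm: a fixed series of instructions
    that always yields the desired result."""
    while b:
        # Step: replace the pair with the smaller number and
        # the remainder of dividing the larger by the smaller.
        a, b = b, a % b
    return a

print(gcd(48, 18))  # 6
```

Like a recipe, the steps are unambiguous and mechanical – which is exactly what lets a computer carry them out.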
Algorithms are the brains of computer programs, and thus a central cog in the internet machine.
“We are surrounded by algorithms,” says Olivier Ertzscheid, a French professor of information technology and communication.
“Every time you consult Facebook, Google or Twitter you are exposed to choices” that algorithms calculate for us, and we are also sometimes influenced by them, he says.
They reign in the finance sector, one example being high frequency trading programs, which can execute trades in milliseconds driven by algorithms that analyse a range of market and economic factors. Their speed and rule-based nature mean they can make markets volatile, and they have triggered so-called flash crashes in the foreign exchange and stock markets.
Police forces increasingly use algorithms to predict where and when crimes are most likely to be committed. Predpol, a software program, claims to have contributed to double-digit drops in burglaries, robberies and vehicle theft in several US states and is also used in Kent, southern England.
Satellite tracking and surveillance would not have reached the point they are at today without sophisticated algorithms.
In the 1990s, PageRank (PR) was created at Stanford University in California by Larry Page and Sergey Brin, Google’s co-founders. PR made it possible to rank web pages by popularity, and it became the heart of the Google search engine. In addition to PR, Google uses “a dozen algorithms … to deal with spam, detect copyright infractions” and handle other crucial tasks, Ertzscheid says.
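The core idea behind PageRank is that a page is popular if popular pages link to it. A toy sketch of that idea – the damping factor and the tiny three-page “web” are illustrative, and this is of course nothing like Google’s production system:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank: repeatedly let each page pass a share of its
    score to the pages it links to, until the scores settle.

    links maps each page to the list of pages it links to.
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with equal scores
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new[target] += share
            else:
                # A page with no outgoing links spreads its score evenly.
                for p in pages:
                    new[p] += damping * rank[page] / n
        rank = new
    return rank

web = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(web)
print(max(ranks, key=ranks.get))  # c – it receives links from both a and b
```

The same principle – importance flows along links – is what lets the algorithm rank billions of pages without any human judging their content.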
Facebook uses sophisticated algorithms to offer its more than 1.8 billion users worldwide personalised content, in particular on its News Feed, which compiles posts from “friends” and surfaces articles selected according to each user’s social media contacts.
One risk posed by such a system is that of “The Filter Bubble”, according to Eli Pariser, who developed the concept in a book of the same name. Being surrounded by information filtered by algorithms based on one’s friends, tastes and previous digital searches and choices, someone surfing the internet can be plunged unwittingly into a “cognitive bubble” that just reinforces their convictions and perspective on the world.
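The filtering mechanism Pariser describes can be sketched in a few lines – a deliberately crude illustration of ours (the affinity scores and posts are made up), not Facebook’s actual ranking system:

```python
def rank_feed(posts, affinity):
    """Toy feed ranker: score each post by the user's past engagement
    with its author, so familiar voices rise to the top of the feed."""
    return sorted(posts, key=lambda p: affinity.get(p["author"], 0), reverse=True)

# The user has interacted far more with alice than with bob in the past.
affinity = {"alice": 9, "bob": 1}
feed = rank_feed(
    [{"author": "bob", "text": "a dissenting view"},
     {"author": "alice", "text": "something you already agree with"}],
    affinity,
)
print(feed[0]["author"])  # alice
```

Because past engagement drives future ranking, agreeable content keeps winning – the feedback loop that produces the “cognitive bubble”.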
Another risk was exposed during the last US presidential election – the prevalence of so-called fake news or hoaxes on Facebook and other social media. Facebook’s algorithms were not designed to distinguish true from false but instead to assess the popularity of information.
Facebook chief Mark Zuckerberg has sought to deflect criticism that the platform was used to fuel the spread of misinformation that may have influenced the presidential race, but the company responded to the growing pressure by saying new tools would let users call attention to controversial content.
Cardon says four main “families” of web algorithms exist. One calculates the popularity of web pages, another assesses their authority within the digital community, and a third evaluates the reputation of social network users. The fourth attempts to predict the future.
This last one is “problematic” for the sociologist, because it tries to anticipate our future behaviour based on clues we have left on the internet in the past. It shows up on Amazon, for example, as book recommendations based on past purchases. “We build the calculators, but in return they build us,” Cardon says.
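The Amazon-style recommendation Cardon mentions can be sketched as a “customers who bought this also bought” counter – a toy illustration of ours (the book titles and baskets are invented), not Amazon’s real recommender:

```python
from collections import Counter

def recommend(history, all_baskets, top=3):
    """Toy recommender: suggest items that other customers bought
    together with items in this user's purchase history."""
    scores = Counter()
    for basket in all_baskets:
        if set(basket) & set(history):          # basket overlaps the user's past
            for item in basket:
                if item not in history:          # don't re-suggest owned items
                    scores[item] += 1
    return [item for item, _ in scores.most_common(top)]

baskets = [
    ["dune", "foundation"],
    ["dune", "hyperion"],
    ["foundation", "hyperion"],
]
print(recommend(["dune"], baskets))
```

Even this crude version shows Cardon’s point: the suggestions are built entirely from traces of past behaviour, the “clues we have left” that the algorithm projects into the future.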