How algorithms rule the modern world
Computer analysis makes our lives easier in countless ways, and - arguably - safer, but many worry about its threat to privacy.
On August 4, 2005, the police department of Memphis, Tennessee, made so many arrests over a three-hour period that it ran out of vehicles to transport the detainees to jail. Three days later, 1,200 people had been arrested across the city - a new police department record. Operation Blue Crush was hailed as a huge success.
Larry Godwin, the city's new police director, quickly rolled out the "Blue Crush" scheme and by 2011 crime had fallen by 24 per cent. When it was revealed Blue Crush faced budget cuts this year, there was public outcry.
"Crush" policing is now perceived to be so successful that it has reportedly been mimicked across the globe.
Crush stands for "Criminal Reduction Utilising Statistical History". Translated, it means predictive policing. Or, more accurately, police officers guided by algorithms.
A team of criminologists and data scientists at the University of Memphis first developed the technique using IBM predictive analytics software. They compiled crime statistics from across the city and overlaid them with other datasets - social-housing maps, outside temperatures etc - then instructed algorithms to search for correlations in the data to identify crime "hot spots". Police then flooded those areas with highly targeted patrols.
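The article does not publish the Memphis model itself, but the basic hot-spot idea it describes can be sketched in a few lines: count incidents per map cell and flag the cells whose counts cross a threshold. The grid cells, counts and threshold below are invented for illustration, not Blue Crush's actual data or method.

```python
from collections import Counter

# Toy incident log: each record is the (x, y) grid cell where a crime
# was reported. The real system also folded in housing maps, weather etc.
incidents = [(1, 2), (1, 2), (1, 2), (3, 4), (1, 2), (5, 5), (3, 4)]

def hot_spots(records, threshold=3):
    """Return grid cells whose incident count meets the threshold."""
    counts = Counter(records)
    return [cell for cell, n in counts.items() if n >= threshold]

# Cells flagged for targeted patrols - here only cell (1, 2) qualifies.
print(hot_spots(incidents))
```

In practice the correlation search is far richer than a raw count, but the output is the same in kind: a short list of places where patrols are concentrated.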
"It's putting the right people in the right places on the right day at the right time," Dr Richard Janikowski, an associate professor in the department of criminology and criminal justice at the University of Memphis, said when the scheme launched.
But not everyone is comfortable with the idea. Some critics have dubbed it "Minority Report" policing, in reference to the sci-fi film in which psychics are used to guide a "PreCrime" police unit.
The use of algorithms in policing is one example of their increasing influence on our lives. And, as their ubiquity spreads, so too does the debate around whether we should allow ourselves to become so reliant on them - and who, if anyone, is policing their use.
Such concerns were sharpened further by the continuing revelations about how the US National Security Agency (NSA) has been using algorithms to help it interpret the colossal amounts of data it has collected from its covert dragnet of international telecommunications.
"For datasets the size of those the NSA collect, using algorithms is the only way to operate for certain tasks," says James Ball, The Guardian's data editor.
"The problem is how the rules are set: it's impossible to do this perfectly. If you're, say, looking for terrorists, you're looking for something very rare. Set your rules too tight and you'll miss lots of, probably most, potential terror suspects. But set them more broadly and you'll drag lots of entirely blameless people into your dragnet, who will then face further intrusion or even formal investigation.
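Ball's trade-off can be made concrete: a detection rule is, at bottom, a threshold applied to some risk score. The scores and labels below are entirely invented for illustration; they simply show how a "tight" rule misses genuine targets while a "broad" rule sweeps up the innocent.

```python
# Invented risk scores for a rare-target search, per Ball's example.
target_scores = [0.9, 0.7, 0.4]              # genuine targets (rare)
innocent_scores = [0.6, 0.5, 0.3, 0.2, 0.1]  # everyone else

def outcomes(threshold):
    """Count (targets missed, innocent people flagged) at a given threshold."""
    missed = sum(s < threshold for s in target_scores)
    flagged_innocent = sum(s >= threshold for s in innocent_scores)
    return missed, flagged_innocent

print(outcomes(0.8))   # tight rule: misses 2 of 3 targets, flags nobody innocent
print(outcomes(0.25))  # broad rule: misses none, but flags 3 blameless people
```

No threshold in this toy data catches every target without also flagging someone innocent, which is exactly the bind Ball describes.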
"We don't know exactly how the NSA or GCHQ [Government Communications Headquarters, Britain's spy agency] use algorithms - or how extensively they're applied. But we do know they use them, including on the huge data trawls revealed in The Guardian."
From dating websites and trading floors, through to online retailing and internet searches, algorithms are increasingly determining our collective futures.
"Bank approvals, store cards, job matches and more all run on similar principles," Ball says. "The algorithm is the god from the machine powering them all, for good or ill."
But what is an algorithm? Dr Panos Parpas, a lecturer in the quantitative analysis and decision science ("quads") section of the department of computing at Imperial College London, says that wherever we use computers, we rely on algorithms.
"There are lots of types, but algorithms, explained simply, follow a series of instructions to solve a problem. It's a bit like how a recipe helps you to bake a cake. Instead of having generic flour or a generic oven temperature, the algorithm will try a range of variations to produce the best cake possible from the options and permutations available."
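Parpas's cake analogy - trying a range of variations and keeping the best - is, in effect, a search over options. A minimal sketch, with an invented "quality" scoring function standing in for whatever the real objective would be:

```python
from itertools import product

# Toy version of the cake analogy: enumerate every combination of
# options and keep the one a (made-up) quality function scores highest.
flours = ["plain", "self-raising"]
temps = [160, 180, 200]  # oven temperature in Celsius

def quality(flour, temp):
    # Invented scoring: self-raising flour baked at 180C is "best".
    return (2 if flour == "self-raising" else 1) - abs(temp - 180) / 20

best = max(product(flours, temps), key=lambda opt: quality(*opt))
print(best)  # -> ('self-raising', 180)
```

Real algorithms rarely brute-force every permutation like this, but the principle Parpas describes - instructions that search the available options for the best outcome - is the same.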
Parpas stresses that algorithms are not a new phenomenon: "They've been used for decades … but the current interest in them is due to the vast amounts of data now being generated and the need to process and understand it. They are now integrated into our lives.
"On the one hand, they are good because they free up our time and do mundane processes on our behalf. The questions being raised about algorithms at the moment are not about algorithms per se, but about the way society is structured with regard to data use and data privacy. It's also about how models are being used to predict the future.
"There is currently an awkward marriage between data and algorithms. As technology evolves, there will be mistakes, but it is important to remember they are just a tool. We shouldn't blame our tools."
The "mistakes" Parpas refers to are events such as the "flash crash" of May 6, 2010, when the Dow Jones Industrial Average fell 1,000 points in just a few minutes, only for the market to recover 20 minutes later. The reasons for the sudden plummet have never been fully explained, but most financial observers blame a "race to the bottom" by the competing quantitative trading (quants) algorithms widely used to perform high-frequency trading.
Parpas says: "By far the most complicated algorithms are to be found in science, where they are used to design new drugs or model the climate. But they are done within a controlled environment with clean data. It is easy to see if there is a bug in the algorithm.
"The difficulties come when they are used in the social sciences and financial trading, where there is less understanding of what the model and output should be, and where they are operating in a more dynamic environment. Scientists will take years to validate their algorithm, whereas a trader has just days to do so in a volatile environment."
But as the NSA revelations exposed, the bigger questions centre on governance and privacy. How are algorithms being used to access and interpret "our" data? And by whom?
Christopher Steiner, author of Automate This: How Algorithms Came to Rule Our World, argues that we should not automatically see algorithms as a malign influence on our lives, but "as their power intensifies, wealth will concentrate towards them. They will ensure the 1 per cent-99 per cent divide gets larger."