Crime
An IT company is using an AI to send warning messages to the social media accounts of people who search for child pornography. Photo: Shutterstock

IT firm’s AI program deters paedophiles from searching for child porn by sending warnings to their social media accounts

  • Internet searches for child pornography surged with so many people at home because of Covid-19 lockdowns this year; sharing of harmful content more than doubled
  • A Singapore IT firm is using an AI program to send warnings to people’s social media accounts if they search the internet for child porn

Just before the Covid-19 pandemic began to sweep the world, Singapore-based anthropologist Angad Chowdhry was shocked when he stumbled upon Europol’s Stop Child Abuse – Trace an Object page.

Random images of objects were arranged on the website, including a comb, a tiny pair of yellow shorts, and parts of a bedcover and a T-shirt: all screenshots the European police network had culled from child sexual abuse videos circulating on the internet.

Europol wanted people to make contact if they had seen the items anywhere: any information could help police officers track down the abusers.

“We had been reading articles about the scale of the problem, but this page humanised it like nothing else had,” says Chowdhry, the 40-year-old co-founder of internet firm Quilt.AI, a technology platform that “converts big data signals into human insights”. He decided to work on a way to use the power of the internet to deter consumers of child porn.

Angad Chowdhry is an anthropologist and co-founder of internet firm Quilt.AI.

Worldwide office closures, curfews and lockdowns this year mean millions of people have been online for significantly more time than they once were. The child pornography problem was vast before the pandemic hit, but the growth in internet use has seen it surge.

The Britain-based independent Internet Watch Foundation, which works to remove child sexual abuse imagery from the internet, reported that 8.8 million attempts were made to access underage pornography in April 2020 in the United Kingdom alone.


It soon became apparent that the child pornography trend was global. The US-based National Center for Missing & Exploited Children has estimated that sharing of harmful content increased by 106 per cent around the world in June 2020.

In India, on the first day of lockdown in March, demand for child pornography rose by 100 per cent – from 3.5 million searches a day to over 7 million, Chowdhry’s firm discovered when it began researching the phenomenon.

“But that’s just the number where the keywords were unambiguous. The real number was closer to 25 million; all searches essentially related to child abuse plus adjacent stuff,” says Chowdhry. “More repugnant was the doubling, sometimes tripling, of very specific and extreme keywords.”

Every once in a while, I would stumble upon actual child sexual abuse material and it would make my blood boil, and I would have sleepless nights. Numbers and statistics are fine, but if you even see one tiny video clip, you never want to see anything ever again
Angad Chowdhry, anthropologist and co-founder of internet firm Quilt.AI

Quilt.AI began investigating the problem and potential remedies in December last year, after Chowdhry was shocked by the pathetic images on the Europol site and worried by the shortage of deterrent action on the web. “There was a lot of conversation in the cause-based world but not much movement and action,” he says of the work then under way to tackle the problem.

His company’s moon shot work, aimed at finding a way to deter internet child pornography seekers, was partly funded by third parties. “We got a few projects for tackling this issue,” Chowdhry says. “So it was a mixture of frustration and opportunity.”

The problem skewed young even before the pandemic: of the child sexual abuse material the Internet Watch Foundation found online in 2019, about 94 per cent contained images of children under 13, and 39 per cent featured children under 10.

For his doctoral thesis, Chowdhry pounded the streets of Mumbai for nearly 18 months. Photo: Getty Images

Chowdhry has a doctorate in the anthropology of media from the School of Oriental and African Studies at the University of London. For his doctoral thesis, he pounded the streets of Mumbai, India, for nearly 18 months from 2005, visiting crime scenes and asking locals about their memories of them.

He was interested in what happened to society after a murder. He considered himself sufficiently hardened by this experience to cope with the trauma of investigating child sex abuse. “The crime scenes were real; it was human,” he says. “Each was a tragedy but you could put it away.”

His doctoral fieldwork was no match for the nightmares the child sexual abuse project unleashed. “I did not know the scale of this problem,” he says. “Every once in a while, I would stumble upon actual child sexual abuse material and it would make my blood boil, and I would have sleepless nights.

“Numbers and statistics are fine, but if you even see one tiny video clip, you never want to see anything ever again. It was emotionally very difficult.”

Searches for child porn increased during the months of lockdown this year. Photo: Getty Images

He put aside his horror and got to work. From December 2019 until midway through 2020, Chowdhry and a couple of his Singapore-based colleagues tracked nearly 7,000 keywords in 11 languages in India, across several platforms – search engines, social media, news, video sharing sites and chat platforms.

“We learned some shocking things,” he says. “One, it was astonishing how easy it was to get such stuff, with just a little mental effort. Two, before the pandemic, searches for child sexual abuse material were restricted to late night and early morning; during the pandemic it became all-day consumption, peaking during lunchtime.

“Three, there was no urban-rural divide; in fact, some semi-urban centres had higher per capita consumption than urban centres. Four … the sheer ingenuity of how keywords were used was mind-boggling.”
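The kind of keyword tracking Chowdhry describes – matching search activity against a watchlist and looking at when hits occur – can be sketched in miniature. This is an illustrative reconstruction, not Quilt.AI’s code: the watchlist terms, function name and log format below are all hypothetical stand-ins (the real project tracked nearly 7,000 keywords in 11 languages).

```python
from collections import Counter
from datetime import datetime

# Hypothetical stand-in watchlist; the real project tracked ~7,000
# keywords across 11 languages and several platforms.
WATCHLIST = {"term_a", "term_b", "term_c"}

def flag_queries(log_entries):
    """Count watchlist hits by hour of day.

    log_entries: iterable of (iso_timestamp, query_string) tuples.
    Returns a Counter mapping hour (0-23) to number of flagged queries.
    """
    hourly = Counter()
    for ts, query in log_entries:
        tokens = set(query.lower().split())
        if tokens & WATCHLIST:                    # any watchlist term present
            hourly[datetime.fromisoformat(ts).hour] += 1
    return hourly

sample = [
    ("2020-04-01T12:10:00", "term_a something"),  # flagged, lunchtime
    ("2020-04-01T12:45:00", "harmless query"),    # not flagged
    ("2020-04-01T23:05:00", "term_b"),            # flagged, late night
]
print(flag_queries(sample))  # Counter({12: 1, 23: 1})
```

Aggregating hits by hour like this is what would surface the shift Chowdhry mentions, from late-night-only searching to all-day consumption peaking at lunchtime.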

So Chowdhry and his colleagues decided to leave messages for those seeking porn on the internet. “People search for this material because they think there are no consequences,” he says. “No one searches for terrorist stuff because there are repercussions.”


Search engines might issue automatic warnings that searching for child pornography is illegal, but these warnings have little deterrent effect, he adds.

The Quilt.AI team built a system that is triggered by searches for child pornography and then sends messages to the user on his, or much more rarely her, social media platforms.

“We built a system where, if a person has searched for such material on a search engine today, ads would target him on his social media platforms a day or two later, saying something to the effect that ‘We know what you did, so watch it’,” Chowdhry explains. The messages left, he explains, are generated by machine learning and tested in advance.
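The retargeting flow described here – a flagged search leading to a deterrence message on the same browser’s social feeds a day or two later – can be sketched as below. Everything in this snippet is an assumption for illustration: the field names, message text and queue structure are invented, and real ad platforms would handle the delivery step.

```python
import datetime

# Hypothetical flagged-term list standing in for the real keyword set.
FLAGGED_TERMS = {"flagged_term"}

def schedule_warnings(search_events, delay_days=1):
    """Queue deterrence ads for browsers that made flagged searches.

    search_events: list of dicts with 'cookie_id', 'query' and 'date'.
    Only an anonymous ad-targeting cookie is kept - no identity is
    collected, mirroring the privacy approach the article describes.
    """
    queue = []
    for event in search_events:
        if set(event["query"].lower().split()) & FLAGGED_TERMS:
            queue.append({
                "cookie_id": event["cookie_id"],  # anonymous browser ID
                "show_on": event["date"] + datetime.timedelta(days=delay_days),
                "message": "Warning: searching for this material is illegal.",
            })
    return queue

events = [{"cookie_id": "c1", "query": "flagged_term",
           "date": datetime.date(2020, 6, 1)}]
print(schedule_warnings(events)[0]["show_on"])  # 2020-06-02
```

The one- to two-day delay is the point of the design: the warning arrives on a different platform, signalling that the search and the user’s social media life are not as separate as they assumed.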

We saw immediate results. In the three months since then, we managed to scare off 600,000 people on average, month on month. So that’s an almost 15 per cent reduction.
Angad Chowdhry, Singapore-based anthropologist and co-founder of Quilt.AI

Concerns about the right to privacy, and doubts about the role of citizen vigilantes, made the decision to press the go button on the project a difficult one for the Quilt.AI team. Legal ramifications seem unlikely, though, because no data is collected and individuals are not specifically targeted; similar technology is already used to comb through internet searches and advertise products to consumers.

“We had many discussions about whether it is the right way to go about it,” Chowdhry says. “In the end we decided that this was the best way to protect people’s privacy, while trying to make it a no-go space for everyone. We wanted to show that there is no breach between what you search and your other life on social media. It is non-conventional, but if this logic is used to sell washing machines, it can also be used to make the space better for everyone.”

The company started rolling out the programme in June in India, just as the nation was emerging from lockdowns in various parts of the country. “We saw immediate results,” Chowdhry says. “In the three months since then, we managed to scare off 600,000 people on average, month on month. So that’s an almost 15 per cent reduction. I think it’s a big impact. I don’t know how it will translate worldwide, but it has shown results.”

Nivedita Ahuja, of the India Child Protection Fund, says internet deterrence might only be a partial remedy and repeat offenders are the norm. Photo: Sajjad Hussain/AFP via Getty Images

Nivedita Ahuja, of the India Child Protection Fund, says this sort of internet deterrence might only be a partial remedy. “The statistics tell a really grim story,” she says. With data from a pilot study using this sort of internet warning, the organisation found sustained and continuous deterrence protocols were likely to be necessary.

In a report based on statistics gathered in December 2019, the Protection Fund said an analysis of trends indicated that repeat offenders were the norm. “Seventy-six per cent of people who returned to the public web explicitly re-exhibited a demand for child pornography,” the report said. “On average the impact of the campaign lasted for 38 days, [with] individuals exhibiting recidivism after this period.”

The child pornography scourge can seem almost invisible; in reality, while it is difficult to find child sex abuse material on the surface Web, it takes just half a second to gain access on the deep Web.

“Nothing can be done on the deep Web; it’s encrypted, secure,” says Chowdhry. “That’s why we chose the surface, because something can be done. For all the bleakness surrounding the issue, I am optimistic.”
