Singapore’s Changi prison, one of two in the city state where AI-assisted facial recognition technologies are being used and piloted. Photo: Google

‘Like being in a fishbowl’: spotlight on Singapore’s prisons over facial recognition technology

  • Advocates say the use of facial recognition technology violates prisoners’ privacy and increases risk of bias against minority communities
  • India and Australia also employ AI-based surveillance systems to keep tabs on prisoners – despite warnings that it’s ‘riddled with errors’
Singapore
In Singapore’s prisons, CCTV cameras in the cells watch over inmates, facial recognition is used for headcount checks, and an artificial intelligence-based behaviour detection system monitors for fights and other suspicious activities.

“Sometimes, the facial recognition cameras would turn on at odd times, without warning. Or the behaviour detection would alert the guards if people were just exercising in the cell,” said Tan, 26, a former inmate, who asked to go by his last name.

“I was arrested for a non-violent crime, yet made to feel like a dangerous terrorist who had to be watched all the time,” said Tan, who served two prison terms of up to a year each for smoking cannabis, a banned substance.

Officials say the technologies being used and piloted in the city state’s Selarang and Changi prisons improve “effectiveness and efficiency”, and free up guards to focus on prisoner rehabilitation and other “more value-added work”.

Advocates say constant surveillance “alienates” prisoners and “makes them feel like they aren’t treated with dignity”. Photo: Shutterstock

Former inmates like Tan and human rights groups, however, say the constant surveillance violates prisoners’ privacy, that AI-based systems can be inaccurate and biased against minorities in particular, and that there is little clarity about data use.

“It’s not clear what is done with the data, how long it’s kept, and what recourse prisoners or former prisoners have if there is abuse or leaks of the data,” said Kirsten Han at Transformative Justice Collective, a rights group in Singapore.

“Prisoners already feel dehumanised and disrespected in prison, and the constant surveillance and lack of privacy … alienates them and makes them feel like they aren’t treated with dignity,” she said.

The Singapore Prison Service said in a statement that data protection rules apply, and that the technologies are assessed for accuracy and reliability, and are “appropriately calibrated” for race and sex.


Worldwide, facial recognition – which uses AI to match a live image of a person against a database of photographs for verification – is increasingly used for everything from unlocking mobile phones to checking in for flights to making payments.

But its use by police and in prisons is problematic as inmates have limited rights, and because of the risk of bias against minority communities who are often over-represented in the prison system, rights groups say.

“Surveillance systems in prisons can be abused, especially in the case of political prisoners and other vulnerable people,” said Phil Robertson, deputy director of Human Rights Watch in Asia.

“Even considering the need to prevent violence, facial recognition in prisons is overly intrusive and unnecessary.”

Asian cities – including Singapore, New Delhi and others in India – have among the highest concentrations of surveillance cameras in the world, according to tech website Comparitech.

In Delhi, authorities are rolling out facial recognition systems in the city’s three prisons for greater safety and security, said Sanjay Beniwal, director general of Tihar jail, India’s largest prison complex.

The prisons, with a combined inmate population of about 20,000, already have a big network of CCTV cameras, so facial recognition technology is needed to analyse the feeds, he said.

“It will enable us to monitor suspicious activity, and also alert us to fights and falls,” he said.

Police keep watch in a CCTV control room in New Delhi. The Indian capital has one of the highest concentrations of surveillance cameras in the world. Photo: AFP

The system will have “adequate checks and balances” to ensure data is secured, and that the rights of inmates are upheld, he added.

But India’s criminal justice system disproportionately targets marginalised communities, and the lack of a data protection law raises the risk of misuse, said Anushka Jain, policy counsel at the Internet Freedom Foundation, a digital rights group.

“There is a high proportion of minority communities in prison, and if the data collected is used to train algorithms, people with similar facial features could be profiled as criminals or suspects, thus reinforcing the bias,” she said.

“Prisoners also risk being misidentified and being held responsible for acts they did not commit, which would reduce their chances of early release or parole, or lead to a further curbing of their rights.”


Surveillance technologies are often tested on vulnerable populations such as refugees and prisoners before being rolled out more widely, and are increasingly also used to target dissidents and activists.

The Australian Human Rights Commission in 2021 called for a ban on the use of facial recognition in the country until “stronger, clearer and more targeted” human rights protections are in place.

Yet the technology has been rolled out in several prisons in New South Wales state, despite concerns about potential biases against Aboriginal people who are over-represented in prisons.

The technology will ensure greater security and enable “faster, more accurate processing at all stages of the enrolment and identification process” for everyone who enters or exits a facility, Corrective Services NSW, the state agency that oversees prisons, said in a statement.

AI security cameras using facial recognition technology are displayed at a product exhibition in China. Photo: AFP/Getty Images/TNS

All biometric data is “encrypted on a secure system, with data stored on encrypted secure servers,” and the system complies with the state’s privacy safeguards, the agency added.

But the technology undermines the right to privacy, said James Clark, executive director of Digital Rights Watch, an advocacy group.

“Facial surveillance only benefits [private prison service providers], while the risks will be carried by some of the most vulnerable people in our society,” he said.

The technology is also “riddled with errors” when it comes to identifying darker-skinned people and women, he added, noting that prison inmates are unaware of these risks.


In Singapore, Tan and other former inmates said they received a short briefing on the technologies, but had no knowledge of how they worked, or what became of their data.

“At first I thought it was quite cool. But you can’t opt out,” he said.

“It’s like being in a fishbowl all the time. It was very dehumanising and just unnecessary.”
