Protesters shout slogans during a 2018 rally against non-consensual “spy-cam porn” in Seoul, South Korea. Photo: AFP

A Chinese woman found a video of herself on Pornhub. Her new app aims to help survivors of image-based abuse

  • The woman came close to taking her own life after learning about the non-consensual footage. Now she is fighting for justice
  • She hopes her app, Alecto AI, will help others find their images online – the first step to getting them removed

It was a sunny spring afternoon last year when Tisiphone, 25, received a phone call from a male friend that turned her life upside down.

“He told me that I was on Pornhub. Of course, my first reaction was that it must be a mistake. How could it be me?” said the Chinese woman, who asked for her details to be kept private and to be identified as Tisiphone – one of the Furies, the Greek goddesses of vengeance.

When she opened the link sent by her friend, her heart sank in disbelief as she saw herself in an intimate video filmed non-consensually while she was living in the United States.

“The incident happened maybe seven years ago. I was really young, a teenager … I had no idea that monster had secretly filmed me until I saw my video on Pornhub,” Tisiphone said. “It was really devastating. I consider myself a very strong person and well educated, but that was the moment when I literally stopped and thought, ‘I can’t live any more. I don’t want to live any more’.”


She then went to the roof of her building and climbed over the fence.

“At that time, it was for me the only way to get out of it, because I was so ashamed, and I was so scared. I felt like I was betrayed by the whole world,” Tisiphone said, adding that she was only deterred by the thought of how much pain her family would endure if they lost her.

“I survived, barely,” she said, gathering herself.

Tisiphone, who worked for the content policy team of a prominent US tech company, is now fighting back. She is developing a facial recognition app called Alecto AI – named after another of the Furies, Tisiphone’s sister – which she is hoping to launch before the end of the year to help thousands of survivors such as herself find their images online.
The landing page of the Alecto AI site. Photo: Screen capture

According to a three-country study published last year, one in three people have experienced some form of image-based sexual abuse, and advocates warn that cases are rising amid the Covid-19 pandemic. The phenomenon is widely known as “revenge porn”, though experts say that term should be avoided because it fails to capture the full scope of the issue.

“Damage control is particularly painful [for survivors]. The infringing content is sometimes hosted on various platforms … It is difficult to search for this content scattered all over the internet, while being forced to relive our trauma over and over again,” said Tisiphone, who contacted This Week in Asia after reading our series about image-based abuse. “We can’t defend ourselves unless we have access to technology that can help us do so.”

In her opinion, the most efficient way to tackle image-based abuse is through facial recognition software. But, she added, the existing tools have a lower accuracy rate when it comes to women and people of colour.


Tisiphone, who is part of an accelerator programme at a top American university, put together a team of five women of colour with the goal of introducing an app that is “powerful, unbiased and compassionate”.

She is aiming at “decentralising the process” by giving more power to individuals, noting that major platforms – such as Facebook – often relied on users to report abusive content, and their response could be delayed as they had to deal with a wide range of such issues.

Alecto AI works by scanning users’ faces and then searching for their images online. “We are disrupting not only content policy and revenge porn, but also disrupting facial recognition, making artificial intelligence more human-centred,” she said, adding that currently available tools were either not easily accessible or vulnerable to abuse.


Tisiphone said her team was implementing a number of security measures, including biometric verification, to ensure users’ sensitive information could be accessed only by the users themselves. To add an extra layer of protection, Alecto AI will also use end-to-end encryption, and no data will be saved on its servers.

Major platforms such as Facebook often rely on users to report abusive content. Photo: Reuters

“We are doing everything we can to protect users’ privacy and make sure that it’s not abused by some random third party. We will continue to enhance our algorithm to make sure it can help people,” she said.


While Tisiphone is working with tech experts to refine her idea, an initial version is already being tested on Apple’s App Store.

“Because it’s a very sensitive thing we are working on, we want to make sure that everything runs smoothly,” she said. “But I am feeling very hopeful.”

Once users are able to detect their images online, Tisiphone hopes her app will also help survivors find non-profits and law firms willing to take pro bono cases. She said she had already received positive feedback from some of the largest organisations working in this field, adding that the Alecto AI website provided a resources page to link survivors with support groups and other professionals.


“We also want to work with [online] platforms, so we have a more efficient process for [abusive content] to be taken down, but that will happen after we have more users,” she said.


Alecto AI will at first only be open to users who pay a monthly fee, which will be revealed once work on the business model is complete.

“Server costs are extremely expensive, especially when you are trying to scrape the internet for millions of images. I cannot stay in business if we don’t charge people. But it’s going to be an affordable price. Much cheaper than other solutions out there,” Tisiphone said.

Once the app reaches a certain number of subscribers, the idea is to approach major platforms such as Facebook, which could then pay for the technology rather than the financial burden falling on individuals.

“We need to bring more awareness to this issue. A lot of people don’t know they are victims. I am in the tech industry, I am an activist, and it happened to me,” she said.

Existing facial-recognition software has a lower accuracy rate when it comes to women and people of colour, according to Tisiphone. Photo: AFP

While working on Alecto AI, Tisiphone is still fighting her own case.

After learning about the video of her, she asked a friend in the US to approach the authorities on her behalf.

“You have no idea how hard it was for me to even file a police report,” she said. “I am lucky because I speak English, but a lot of Chinese girls don’t speak English and when they go to the Chinese police they will tell them that this is not under their jurisdiction … Even within the US, different states have different laws, some states choose to criminalise revenge porn while others don’t.”

But despite being able to file a police report there, Tisiphone quickly realised cases such as hers were not a priority for the authorities. “Even when you show evidence, most of the time, justice is not served. It’s really frustrating, and it hurts so much.”

She said the police should get sensitivity training, noting that she was asked “really insulting questions” at a time when she was still trying to recover from suicidal thoughts.

A few months after filing the complaint, Tisiphone was told her case would be closed as the perpetrator had fled to Mexico. “So there was nothing they could do about it. It’s ridiculous,” she said, adding that she believed the man had other victims.

After several non-profit organisations did not respond to her requests for help, Tisiphone had to try on her own to remove the video from about 10 websites, including Pornhub. But she eventually found a law firm willing to take her case on a pro bono basis.

“The legal service I am using now, it’s because I got in touch with a friend of mine who went to Harvard Law School. She found within her network someone who cared about it,” Tisiphone said. “You have to have connections … I am well educated, I speak three languages, and I am in this situation. Imagine how hard it is for some teenagers or for someone who is not as educated as me, imagine some helpless young girl.

“I am lucky. And, even for me, the situation is not looking that good.”

People attend a protest as a part of the #MeToo movement on International Women’s Day in Seoul. Photo: Reuters

Most of Tisiphone’s intimate content has been removed, with the exception of one video hosted on a Turkish website. “My lawyers are doing everything they can … That’s a leaking website, you can’t really find an admin, and you can’t enforce US law on them. It’s really frustrating.”

Her legal team is trying to regain copyright of the video from the perpetrator. “We need a change in legislation. If the content was stolen from you or taken without consent, you should automatically own the intellectual property,” she said. “A lot of things need to change – and that’s what I am trying to work on [with] Alecto AI, because this [situation] has been enabled by technology and it can only be solved through technology.”

Tisiphone is now focusing all her efforts on the app.

“The reason I am doing this start-up is to help other victims, because I see this as salvation. By helping others, I am doing something that helps me to heal myself as well,” she said.

There was a time when she couldn’t talk to her lawyer without shaking and crying – she couldn’t even speak in complete sentences.

“It was so traumatic,” Tisiphone said. “But now I have to get a lot stronger. I feel like this is my destiny … No one deserves to die from this.”

If you are having suicidal thoughts, or you know someone who is, help is available. In Hong Kong, dial +852 2896 0000 for The Samaritans or +852 2382 0000 for Suicide Prevention Services. In the US, call the National Suicide Prevention Lifeline on +1 800 273 8255. For a list of other countries’ helplines, see this page.

Alecto AI’s resources page is here.

This article appeared in the South China Morning Post print edition as: Woman in porn video is fighting back with AI app