A Chinese programmer has apologised and shut down an artificial intelligence project that used facial recognition technology to detect whether women had appeared in porn videos, after it set off a storm of criticism.

The programmer, who identified himself as Li Xu, said the program used the technology to cross-reference videos and images on porn sites with user information on social media.

Reporting progress on the project on Monday, Li said it had already collected more than 100 terabytes of data from several pornography sites and popular social media platforms including Facebook, Instagram, TikTok and Weibo.

More than 100,000 young women working in the porn industry had been “successfully identified on a global scale”, Li wrote on microblogging site Weibo.

That prompted a flood of angry messages condemning the project and criticising him for slut-shaming.

Four days later, Li cancelled a live-streaming session during which he had planned to take questions from the media and explain his program. He later apologised and said he had deleted all data relevant to the project.

Li did not respond to inquiries from the South China Morning Post.

Earlier, he had said his goal was to help male programmers identify “promiscuous” women who had appeared in any sex video.

Programmers are mocked on Chinese social media as jiepanxia – men willing to enter into a relationship with a woman who has a messy sexual history. The term is derogatory to women, implying that those who have had sex with a number of partners are less desirable.

Li worked with a small team to help his programmer friends “filter out” such women by matching videos and images across a variety of websites, he said in a Weibo post in August last year.

Feng Yuan, co-founder of Equality, a Beijing-based NGO focused on women’s rights and gender, slammed the project as discriminatory.

“It is clearly a double standard. Why wasn’t the program aimed at identifying men as well?” Feng said.

Speaking to Sohu News earlier, Li said it was because “we didn’t collect enough social media data from men”.

He added that after starting the project, he realised the program could also be used to help a woman check whether she was in a sex video that had been uploaded online without her consent.

But Feng doubted whether using the program for that purpose would be helpful.

“Even if the technology works and a woman figures out that her sex video has been put online, what can she do then? She still won’t find a way to take down the video,” Feng said.

Li told Sohu News he was sorry for “announcing the news in a rush”, saying he should have made “a more comprehensive plan” explaining what the team wanted to do.

The project has also raised privacy concerns.

In response to a Weibo user’s query as to whether he was aware of any legal issues with the project, Li said there was no problem because he had not shared any data or opened up the database, and that sex work was legal in Germany, where he lives.

But Marcelo Thompson, an assistant professor specialising in privacy at the University of Hong Kong, said the case was “a clear violation” of data protection regimes in Europe and China.

Under the European Union’s General Data Protection Regulation, the purposes for which personal data is collected need to be legitimate.
“Victimisation of women because of their previous sexual lives strikes me as one of the most illegitimate purposes I can think of,” Thompson said.

Data protection regimes in both Europe and China prohibit the processing of biometric data without explicit consent from the subject – consent that Li’s project did not obtain, Thompson said.

Gender issues have increasingly been in the spotlight in China in recent years, and Feng said discrimination was a problem across society.

“It’s not as simple as the fact that some programmers discriminate against women. The double standard on gender exists widely in Chinese society,” Feng said.