Taylor Swift at the recent Golden Globe Awards. Photo: EPA-EFE

X lifts ban on Taylor Swift searches after spread of fake porn images

  • Sexually explicit and abusive fake images of Taylor Swift began circulating widely last week on X
  • In response, X briefly disabled queries for the pop singer’s name but lifted the search ban on Monday
Social media company X lifted its ban on searches for Taylor Swift on Monday evening, after blocking users from looking up the pop singer following the spread of fake sexually explicit images of her on the platform last week.

The search has been reactivated and the social media platform “will continue to be vigilant for any attempt to spread this content and will remove it if we find it,” Joe Benarroch, head of business operations at X, said in a statement.

Searches for Taylor Swift’s name on Sunday afternoon on the social media platform formerly known as Twitter yielded the error message: “Something went wrong. Try reloading”.

X had called the measure a temporary action taken out of an “abundance of caution”.

One image of Swift, who was named Time Magazine’s “Person of the Year” in 2023, shared on X was viewed 47 million times before the account was suspended, according to a New York Times report.

The ban on searches came after the White House weighed in on Friday, calling the fake images “alarming” and highlighting that social media companies have a responsibility to prevent the spread of such content.

“Sadly we know that lack of enforcement (by the tech platforms) disproportionately impacts women and they also impact girls who are the overwhelming targets of online harassment,” White House Press Secretary Karine Jean-Pierre said.

Swift, who was seen Sunday in Baltimore celebrating with her boyfriend’s NFL team, the Kansas City Chiefs, has made no public comment on the issue.

Taylor Swift kissing her boyfriend, Kansas City Chiefs tight end Travis Kelce. Photo: AP

Deepfake porn images of celebrities are not new, but activists and regulators are worried that easy-to-use tools employing generative artificial intelligence (AI) will create an uncontrollable flood of toxic or harmful content.

In 2019, a report released by the AI firm DeepTrace Labs showed these images were overwhelmingly weaponised against women. Most of the victims, it said, were Hollywood actors and South Korean K-pop singers.

Since billionaire Elon Musk acquired Twitter in 2022, he has faced criticism for his own controversial posts, prompting many advertisers on the platform to pull back spending out of fear of being associated with harmful content.

Additional reporting by Agence France-Presse and Bloomberg
