Reddit bans hi-tech ‘deepfake’ pornography that face-swaps celebrities into hardcore sex scenes
Advances in face-swapping technology have allowed amateur pornographers to create disturbingly convincing fake videos

Reddit on Wednesday banned “deepfake” pornography – fake celebrity porn videos created with face-swapping technologies – and shut down the community of the same name, becoming the third major internet platform this week to crack down on the increasingly popular clips.
“This community has been banned,” the former deepfakes subreddit page reads. “This subreddit was banned due to a violation of our content policy, specifically our policy against involuntary pornography.”
The site also updated its rules to prohibit sexually explicit images “that have been faked.”

It may be hi-tech, but it doesn’t take a sophisticated production studio to produce face-swap porn, which is a major reason deepfakes have proliferated online in recent months.
Deepfake makers create their videos using a patchwork of readily available technologies, as Vice’s Motherboard has explained in detail. Often, they use open-source social media tools to download photos of victims en masse. Once they have enough pictures of the targeted celebrity’s face to work with – it typically takes a few hundred – they look for a suitable porn performer’s body to graft them onto. In this sense, the performers are victimised, too.
Some deepfake makers have experimented with browser-based applications that supposedly use facial recognition software to find the best face-body matches, according to Motherboard. When the data is fed into a machine-learning algorithm that is freely available online, the resulting fake porn videos can be disturbingly convincing.