Deepfakes: People are now swapping their friends' faces into porn
Deepfakes use facial recognition technology to superimpose faces on to porn stars.
An online community is using new technology to take photos of friends, ex-partners and acquaintances from social media and superimpose them on to the faces of porn film actors.
Know as "deepfakes" after a Reddit user who pioneered the technique, artificial intelligence and facial recognition technology is harnessed to replace the faces of actors in porn movies. Celebrities such as Star Wars actress Daisy Ridley and singer Taylor Swift are among high-profile figures to be targeted.
It has emerged that users are creating deepfakes using images of people they know without their permission. One person wrote on a deepfake chat room that they had used easily downloadable software to scrape hundreds of images from a classmate's Instagram and Facebook pages.
Such data can then be used to create a deepfake using software such as the FakeApp application, Motherboard reported.
"Hi, I want to make a pron video with my ex-girlfriend. But I don't have high-quality videos with er, but I have a lot of good photos," wrote one user on a Reddit forum.
The technology is in its early stages and it is difficult to create seamless footage unless the dimensions of the faces match up. One user wrote on the chat app Discord: "It really only works well for simple vids without too much head movement."
Deepfakes are a subset of a genre of porn that has long existed online, in which community members find lookalike actors who resemble friends, former partners and celebrities, or swap faces in pornographic photographs. Experts fear, however, that programs which seamlessly superimpose faces on to porn stars will usher in a new era of hyper-realistic doppelganger porn that could be used for blackmail.
"The influx of fast-paced developments in technology is making it very difficult for the law to keep pace and adequately support victims," Luke Patel, a solicitor partner at Blacks Solicitors in Leeds, told IBTimes UK.
He explained that if a person could prove that the use of their image had caused serious harm to their reputation they could launch a libel case. That could be costly. Data protection laws could also be used.
However, he added: "Unfortunately, the long arm of the law is just simply not long enough or equipped to deal quickly with such fast changes in the digital world.
"The pressure and responsibility to curb such activity can only lie with the internet platforms and more needs to be done to clamp down on them and for them to ensure tighter controls are in place to ensure that their platforms are not used for such elicit activities."
The actions of deepfake creators are not going entirely unchecked. Discord recently closed down a room dedicated to the practice.
It told Business Insider: "Non-consensual pornography warrants an instant shutdown on the servers whenever we identify it, as well as a permanent ban on the users. We have investigated these servers and shut them down immediately."
A moderator on Reddit's "doppelbanger" forum dedicated to lookalike porn urged users: "Please be aware of how someone might feel about finding their picture here."