Just last December, Reddit users were found to be creating fake pornographic content by pasting celebrities' faces onto the bodies of adult-film performers in pre-existing videos.
The feat, credited to Reddit user 'deepfakes', was achieved with a sophisticated machine learning algorithm that used photographs to build digital masks of human faces, which were then overlaid on top of adult-film footage to make it look strikingly real.
Websites like CelebJihaad, known for celebrity nude content, follow a similar procedure to churn out topless and sometimes fully nude photos of celebrities: fake images with the actresses' faces pasted onto the bodies of adult performers.
And now this AI-assisted porn is spreading across Reddit via an easy-to-use desktop app called FakeApp, which packages user deepfakes' algorithm so that no programming knowledge is required to run it, as per Motherboard.
Widely popular celebrities like Emma Watson, Daisy Ridley, Katy Perry and Cara Delevingne have all been morphed into what looks like credible adult-film footage using this technology.
As worrisome as that sounds, Reddit user deepfakes, who is behind the application, has also told Motherboard that he hopes to build a library of such public data that can be pasted into any video at any time, meaning the faces are no longer restricted to celebrities.
"Eventually, I want to improve it to the point where prospective users can simply select a video on their computer, download a neural network correlated to a certain face from a publicly available library, and swap the video with a different face at the press of one button," the user told Motherboard.
Clips of Emma Watson taking a shower, which have circulated widely across fake-content websites, shed light on how alarming the issue could be. Another clip of Gal Gadot performing in a short adult film, and a separate one of Daisy Ridley, have made the rounds too.
These were possibly made by training a machine learning algorithm on stock photos, Google image search results, and YouTube videos of the stars. And experts believe that the technology behind it is 'no longer rocket science.'
While the existing videos could fool or even convince anyone, the possibility of using the technology to create fake porn of ordinary people without their consent or even knowledge, along with other malicious content, is not lost on observers.
User deepfakes shared, "I just found a clever way to do face-swap. With hundreds of face images, I can easily generate millions of distorted images to train the network. After that if I feed the network someone else's face, the network will think it's just another distorted image and try to make it look like the training face."
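The quote describes the core of a denoising autoencoder: train a network to map distorted copies of one face back to the clean original, then feed it a different face and watch it get pulled toward the training face. The sketch below is a deliberately tiny stand-in for that idea, assuming a linear least-squares "network" on random toy vectors rather than the deep convolutional autoencoder the real software uses; every dimension and noise level here is made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 16            # toy "image": a flat vector of 16 pixel values
N = 500           # "hundreds of face images"

# Stand-in for the training face: one fixed vector of pixel intensities.
training_face = rng.uniform(size=D)

# Generate distorted copies of the training face (simple additive noise;
# the real approach warps and degrades photos in far richer ways).
distorted = training_face + 0.3 * rng.normal(size=(N, D))
clean = np.tile(training_face, (N, 1))

# "Network": a single linear map, fit by least squares, that learns to
# undo the distortion. A deep autoencoder plays this role in practice;
# a linear model is the smallest thing that shows the same effect.
W, *_ = np.linalg.lstsq(distorted, clean, rcond=None)

# Feed it "someone else's face": the model treats it as just another
# distorted copy and pulls the output toward the training face.
other_face = rng.uniform(size=D)
swapped = other_face @ W
print(np.linalg.norm(swapped - training_face),
      np.linalg.norm(other_face - training_face))
```

Because the model has only ever been taught to reconstruct one face, any input tends to come out resembling that face, which is exactly the mechanism the quote relies on.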
And all it takes is a few days to a week to create this content. With today's generation's incessant social media sharing, it's rather easy to get access to almost anybody's photos.
"Everyone needs to know just how easy it is to fake images and videos, to the point where we won't be able to distinguish forgeries in a few months from now," AI researcher Alex Champandard told Motherboard.
"Of course, this was possible for a long time but it would have taken a lot of resources and professionals in visual effects to pull this off. Now it can be done by a single programmer with recent computer hardware."