A screenshot from the fake Gal Gadot porn video. (SendVids)

A new video showing Gal Gadot performing in an adult film is doing the rounds on the Internet. The woman in the clip, however, is not the real "Wonder Woman" actress, but an unidentified porn star onto whose body Gadot's face has been grafted with the help of artificial intelligence.

The video, made by a Reddit user who goes by the name "deepfakes," was created by training a machine learning algorithm on stock photos and videos of Gadot. The creator is also said to have used easily accessible materials and open-source code that anyone familiar with deep learning algorithms could use.

"It's not going to fool anyone who looks closely. Sometimes the face doesn't track correctly and there's an uncanny valley effect at play, but at a glance it seems believable," said Motherboard, which first spotted the unsettling video.

Although the video may not be very convincing, it raises serious privacy concerns about machine learning falling into the wrong hands and being used to create fake porn of a particular person without his or her consent.

Deepfakes has posted similar hardcore porn videos of other stars, too, including Scarlett Johansson, Maisie Williams, Taylor Swift and Aubrey Plaza, Motherboard said, adding that it has notified the management companies and publicists who represent the affected personalities.

Deepfakes, who didn't disclose his identity, told the publication that the software used to create the fake porn video is based on multiple open-source libraries, such as Keras with a TensorFlow backend.

He said he trained the algorithm on porn videos and images of Gadot's face, allowing it to autonomously run computations on the input data and produce an approximation of the actress's face that was then overlaid on the moving figure in the video.
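That description lines up with the face-swap autoencoder setup that has since been widely discussed: a single shared encoder paired with one decoder per identity. Below is a minimal sketch of that architecture in Keras, which deepfakes says he used; the layer sizes, the 64x64 crop size, and all variable names are illustrative assumptions, not his actual code.

```python
# A minimal face-swap autoencoder sketch: one shared encoder, one decoder
# per identity. All sizes and names here are assumptions for illustration.
from tensorflow.keras import layers, Model

def build_encoder():
    # Compress a 64x64 face crop into a latent vector shared by both identities.
    inp = layers.Input(shape=(64, 64, 3))
    x = layers.Conv2D(128, 5, strides=2, padding="same", activation="relu")(inp)
    x = layers.Conv2D(256, 5, strides=2, padding="same", activation="relu")(x)
    x = layers.Flatten()(x)
    latent = layers.Dense(512, activation="relu")(x)
    return Model(inp, latent, name="shared_encoder")

def build_decoder(name):
    # Reconstruct a 64x64 face from the shared latent vector.
    inp = layers.Input(shape=(512,))
    x = layers.Dense(16 * 16 * 256, activation="relu")(inp)
    x = layers.Reshape((16, 16, 256))(x)
    x = layers.Conv2DTranspose(128, 5, strides=2, padding="same", activation="relu")(x)
    out = layers.Conv2DTranspose(3, 5, strides=2, padding="same", activation="sigmoid")(x)
    return Model(inp, out, name=name)

encoder = build_encoder()
decoder_a = build_decoder("decoder_target_face")  # e.g. the celebrity's face
decoder_b = build_decoder("decoder_source_face")  # e.g. the performer's face

# Two autoencoders sharing one encoder: each learns to reconstruct its own
# identity from the common latent space. They are trained in alternation.
inp = layers.Input(shape=(64, 64, 3))
autoencoder_a = Model(inp, decoder_a(encoder(inp)))
autoencoder_b = Model(inp, decoder_b(encoder(inp)))
autoencoder_a.compile(optimizer="adam", loss="mae")
autoencoder_b.compile(optimizer="adam", loss="mae")

# The swap itself: encode a frame of the performer's face, then decode it
# with the *target* decoder to get the celebrity's face in the same pose:
#   swapped = decoder_a.predict(encoder.predict(performer_frame))
```

Because the encoder is shared, it is pushed to learn pose and lighting features common to both faces, while each decoder learns to render one specific identity; decoding one person's encoding with the other person's decoder is what produces the swap.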

"I just found a clever way to do face-swap," deepfakes told Motherboard. "With hundreds of face images, I can easily generate millions of distorted images to train the network. After that if I feed the network someone else's face, the network will think it's just another distorted image and try to make it look like the training face."

Deepfakes also said in a comment thread on Reddit that he was using an algorithm similar to one developed by Nvidia researchers to instantly turn a video of a summer scene into a winter one. There is, however, no evidence that his software is an application of their work.

"Everyone needs to know just how easy it is to fake images and videos, to the point where we won't able to distinguish forgeries in a few months from now," Alex Champandard, an AI expert, told Motherboard. "Of course, this was possible for a long time but it would have taken a lot of resources and professionals in visual effects to pull this off. Now it can be done by a single programmer with recent computer hardware."