
Why I think there are upsides to the story about fake porn videos with celebrities

The pros and cons of the scandalous development that superimposed the faces of Gal Gadot and Scarlett Johansson onto the faces of actresses in adult films.

Screenshot by Samantha Cole via SendVids

In early autumn 2017, a Reddit user began publishing excerpts from porn videos in which the actresses' faces were replaced with those of Gal Gadot, Scarlett Johansson, Maisie Williams, or Aubrey Plaza. He later released his own development, built on self-learning algorithms, into open access, which made it possible to create such content.

On December 12, 2017, Motherboard journalist Samantha Cole published an article under the headline "AI-Assisted Fake Porn Is Here and We're All Fucked." In it, she repeatedly made clear that she views the development with suspicion.

The editor agrees: there is nothing good in such content. But the story should be treated as a valuable lesson rather than drowned in condemnation.

How the algorithm works

Only the video with Gal Gadot spread widely online, since it turned out to be the best quality. In it, the heroine has sex with her stepbrother. An attentive viewer will quickly notice the noise and artifacts that appear on the actress's face and realize it is a fake. That is because the algorithm still makes mistakes, but it is self-learning, so its effectiveness will only grow.

After the development was published on Reddit (it immediately spread among enthusiasts), the author of DeepFakes explained that the principles behind it are "embarrassingly simple." To create the program, he used open-source libraries such as TensorFlow, which Google distributes free of charge to specialists, students, and anyone interested in machine learning.

To train the artificial intelligence, the author used Google image search results, stock photos, YouTube videos, and porn clips. From this material the computer learned to recognize images of Gal Gadot and the erotic footage, and then to superimpose the celebrity's face onto the faces of the porn actresses. The more the computer practiced, the better the results became.
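To make the "embarrassingly simple" principle more concrete, here is a minimal sketch of the approach widely attributed to DeepFakes: two autoencoders that share one encoder but keep separate decoders, one per identity. This is not the author's actual code; the layer sizes, the toy training loop, and the placeholder data are assumptions made purely for illustration.

```python
# Sketch of a shared-encoder / two-decoder face swap (illustrative, not the original code).
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, Model

IMG = 64  # assumed size of the aligned face crops

def build_encoder():
    inp = layers.Input((IMG, IMG, 3))
    x = layers.Conv2D(64, 5, strides=2, padding="same", activation="relu")(inp)
    x = layers.Conv2D(128, 5, strides=2, padding="same", activation="relu")(x)
    latent = layers.Dense(256, activation="relu")(layers.Flatten()(x))
    return Model(inp, latent, name="shared_encoder")

def build_decoder(name):
    inp = layers.Input((256,))
    x = layers.Dense(16 * 16 * 128, activation="relu")(inp)
    x = layers.Reshape((16, 16, 128))(x)
    x = layers.Conv2DTranspose(64, 5, strides=2, padding="same", activation="relu")(x)
    out = layers.Conv2DTranspose(3, 5, strides=2, padding="same", activation="sigmoid")(x)
    return Model(inp, out, name=f"decoder_{name}")

encoder = build_encoder()
decoder_a, decoder_b = build_decoder("a"), build_decoder("b")

# Each autoencoder learns to reconstruct faces of one person; the encoder is shared.
auto_a = Model(encoder.input, decoder_a(encoder.output))
auto_b = Model(encoder.input, decoder_b(encoder.output))
auto_a.compile(optimizer="adam", loss="mae")
auto_b.compile(optimizer="adam", loss="mae")

# faces_a / faces_b would be thousands of aligned face crops in [0, 1];
# random placeholders keep the sketch self-contained and runnable.
faces_a = np.random.rand(32, IMG, IMG, 3).astype("float32")
faces_b = np.random.rand(32, IMG, IMG, 3).astype("float32")
auto_a.fit(faces_a, faces_a, epochs=1, verbose=0)
auto_b.fit(faces_b, faces_b, epochs=1, verbose=0)

# The swap: encode a face of person A, then decode it with person B's decoder.
swapped = decoder_b.predict(encoder.predict(faces_a[:1]))
```

The trick is in the last line: because the shared encoder has seen both people, passing a face of person A through person B's decoder "repaints" it with B's features. Real results require large sets of aligned face crops and far longer, alternating training than this toy run.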

Why the development deserves condemnation

Samantha Cole argues that the algorithm can be used by malicious actors to create videos of things their victims never did or said. In other words, a programmer of average skill could turn the algorithm against a victim and undermine her reputation. "The ease with which someone could do this is frightening," Cole writes.

The journalist cites statistics: in 2015-2016 people uploaded about 24 billion selfies to Google Photos, so an attacker will have no trouble finding a picture of you to splice into porn.

"It's very scary. In part it shows how some men see women only as objects to manipulate and force into doing anything. It also demonstrates a complete lack of respect for the people who work in porn," former porn actress Alia Janine told Cole.

The position of the journalist and the actress is understandable. From an ethical point of view, the DeepFakes development is offensive: nobody deserves to be treated this way. But what frightens me is not so much the development itself (the Internet offers plenty of ways to ruin a person's life) as the reaction to this story.

Why the development deserves the right to exist

There is a risk that the technology will fall into the wrong hands and be used for selfish ends. I have no doubt that many unsophisticated Internet users will not immediately realize that the video with Gal Gadot is a fake. And right now the development is at a beta stage at best, while its self-learning algorithms will only keep improving.

But the DeepFakes development, like any technology, can also be used for good, just as digital copies of deceased actors were used in "Furious 7" and "Rogue One." In the latter, a digital version of the late Peter Cushing appeared on screen, and it cost the creators a great deal of money and effort. What if these algorithms make such re-creation cheaper and easier?

NVIDIA's artificial intelligence can quickly turn video of snow-covered scenery into a summer day. DeepFakes drew on this work when developing his algorithm

According to artificial intelligence researcher Alex Champandard, digital face-replacement technology has existed for decades. But the DeepFakes story shows that implementing it no longer requires dozens of professionals, just one programmer with a modern computer. And not all of them will spend their time creating fake porn videos with celebrities.

DeepFakes claims his development could increase users' interest in machine learning. That is a debatable thesis, but it has a right to exist. And the more such fake videos appear, the faster people will learn to spot them.

Can the DeepFakes development harm people? Absolutely, it already has. Can it help advance machine learning? I think so, but for that the story needs to be discussed and analyzed from different angles, not treated as pure sleaze or served up as a catastrophe of the "era of hackers."

Some media outlets, including Russian ones, cited Motherboard when reporting the news about the algorithm but for some reason left out the positive aspects of the technology (the original article devotes several paragraphs to them). Other publications did not write about the incident at all, including . Probably their editors considered the story too odious (or maybe they simply missed it, I do not know).

But I believe it would be a mistake to look at this story only from an ethical point of view, that is, to see only the negative. It is disgusting that someone makes porn videos with famous actresses without their consent. But this is an old problem that can hardly be defeated by ignoring it or assigning blame. So it makes sense to at least try to look at the situation from the positive side.
