Photo: Nude portraits created by a neural network

“The way machines see us.”

Robbie Barrat, an 18-year-old American artificial intelligence researcher, showed on Twitter the results of his own neural network, which was supposed to “draw” portraits of nude people. Instead, according to the developer, the AI created “surreal blobs of flesh with limbs” that only vaguely resemble people.

Barrat told CNET that he used a generative adversarial network (GAN). This method pits two neural networks against each other: one, trained on hundreds of nude portraits, generates new images, while the second tries to distinguish the generated images from the real ones.

Typically, this competition drives up the quality of the output, but here the generating half of the algorithm learned to fool the “verifier” with its “fleshy blobs.” Because of this, the neural network stopped improving, and the portraits came out far from ideal: the “blobs” only vaguely resemble people in poses typical of nude paintings.
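The adversarial setup described above can be sketched in a toy example. This is an illustration only, not Barrat's actual model: a minimal GAN in plain numpy on 1-D "data" (samples from a Gaussian standing in for the portrait dataset), with both networks reduced to single linear layers. All names and hyperparameters here are assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# "Real" data: samples from N(4, 1), standing in for the portrait dataset.
def real_batch(n):
    return rng.normal(4.0, 1.0, size=(n, 1))

# Generator: one linear layer mapping noise z -> sample.
g_w, g_b = rng.normal(size=(1, 1)), np.zeros((1,))
# Discriminator ("verifier"): one linear layer mapping sample -> P(real).
d_w, d_b = rng.normal(size=(1, 1)), np.zeros((1,))

lr, n = 0.05, 64
for step in range(2000):
    # --- discriminator update: push D(real) -> 1, D(fake) -> 0 ---
    z = rng.normal(size=(n, 1))
    fake = z @ g_w + g_b
    real = real_batch(n)
    d_real = sigmoid(real @ d_w + d_b)
    d_fake = sigmoid(fake @ d_w + d_b)
    grad_real = d_real - 1.0   # dBCE/d(logit) for the real batch (label 1)
    grad_fake = d_fake         # dBCE/d(logit) for the fake batch (label 0)
    d_w -= lr * (real.T @ grad_real + fake.T @ grad_fake) / n
    d_b -= lr * (grad_real + grad_fake).mean(axis=0)

    # --- generator update: push D(fake) -> 1, i.e. fool the verifier ---
    z = rng.normal(size=(n, 1))
    fake = z @ g_w + g_b
    d_fake = sigmoid(fake @ d_w + d_b)
    g_grad = (d_fake - 1.0) * d_w.T   # chain rule back through D
    g_w -= lr * (z.T @ g_grad) / n
    g_b -= lr * g_grad.mean(axis=0)

# Since the noise has mean 0, g_b approximates the mean of the fakes;
# after training it should have drifted toward the real mean (4).
print(round(float(g_b[0]), 2))
```

Barrat's failure mode corresponds to the generator finding a narrow family of outputs (the "fleshy blobs") that the discriminator scores as real, after which neither network has a gradient pushing quality further; in the GAN literature this is known as mode collapse.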

Barrat, however, said he liked this result even more.

Barrat noted that the neural network almost always depicts heads as a “yellow-purple texture.” According to the developer, this is how machines “see” people.

On social networks, Barrat was asked why the neural network did not “draw” black people. The American replied that he had trained the algorithm on paintings from eras when portraits of naked black people were not “too popular.”

In March 2017, while still in high school, Barrat created a neural network that trained on Kanye West tracks and recorded its own rap song. After graduating, he began working at a Stanford research lab.
