“The way machines see us.”
Robbie Barrat, an 18-year-old American artificial-intelligence researcher, posted on Twitter the output of his own neural network, which was supposed to “draw” portraits of nudes. Instead, according to the developer, the AI produced “surreal blobs of flesh with limbs” that only vaguely resemble people.
Barrat told CNET that he used a generative adversarial network (GAN). This method pits two neural networks against each other: one, the generator, was trained on hundreds of nude portraits, while the second tried to distinguish real paintings from generated ones.
Typically, such a competition drives up the quality of the output, but here the generator half of the algorithm learned to deceive the “verifier” with its “fleshy blobs.” Because of this, the network stopped improving, and the portraits turned out far from ideal: the “blobs” only vaguely resemble people in poses typical of nude paintings.
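The adversarial setup described above can be sketched in a few dozen lines. This is a minimal toy illustration, not Barrat's code: instead of painting portraits, a one-parameter-pair generator learns to mimic a 1-D Gaussian, while a logistic-regression discriminator tries to tell real samples from generated ones. All names and hyperparameters here are assumptions chosen for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    # Clip the logit to avoid overflow in exp().
    return 1.0 / (1.0 + np.exp(-np.clip(x, -30, 30)))

# "Real" data: samples from N(4, 1).
# Generator: x = a*z + b with noise z ~ N(0, 1); starts far from the target.
a, b = 1.0, 0.0          # generator parameters
w, c = 0.0, 0.0          # discriminator (logistic regression) parameters
lr_g, lr_d, batch = 0.05, 0.05, 64

for step in range(3000):
    real = rng.normal(4.0, 1.0, batch)
    z = rng.normal(0.0, 1.0, batch)
    fake = a * z + b

    # Discriminator step: maximize log D(real) + log(1 - D(fake)).
    d_real = sigmoid(w * real + c)
    d_fake = sigmoid(w * fake + c)
    grad_w = np.mean((d_real - 1.0) * real) + np.mean(d_fake * fake)
    grad_c = np.mean(d_real - 1.0) + np.mean(d_fake)
    w -= lr_d * grad_w
    c -= lr_d * grad_c

    # Generator step: non-saturating loss, minimize -log D(fake).
    d_fake = sigmoid(w * fake + c)
    dx = -(1.0 - d_fake) * w          # dL/dx_fake
    a -= lr_g * np.mean(dx * z)       # chain rule through x = a*z + b
    b -= lr_g * np.mean(dx)

samples = a * rng.normal(0.0, 1.0, 1000) + b
print(round(float(samples.mean()), 2))  # drifts toward the real mean of 4.0
```

The failure mode the article describes corresponds to this loop stalling: if the generator finds outputs that the discriminator consistently misclassifies, the gradients no longer push it toward the real data, and quality stops improving.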
But Barrat said he liked this result even more.
— cepcam (@cepcam) March 21, 2018
On social media, users began asking Barrat why the neural network did not “draw” Black people. The American replied that he had trained the algorithm on paintings from an era when portraits of Black nudes were not “too popular.”
Unfortunately not too many;
please keep in mind that a lot of these paintings were from a few hundred years ago at least, and dark skinned nude portraits weren’t very popular then.
— Robbie Barrat (@DrBeef_) March 27, 2018
In March 2017, Barrat, while still a high-school student, created a neural network that trained on Kanye West tracks and wrote its own rap song. After graduating, he began working at a Stanford research lab.