
NYT: YouTube hasn’t solved its problem with videos that resemble child porn. Users and companies have been complaining about them since February.

The company has not changed its recommendation algorithm, which still suggests videos featuring minors.


In February, an American blogger found videos on YouTube that resembled child porn. A scandal then unfolded around the platform: users accused the service of fostering a community of pedophiles, and large companies like Disney, Nestle and Epic Games pulled their advertising budgets.

After the accusations, YouTube promised to hide inappropriate comments and remove policy-violating videos of children under the age of 13. According to The New York Times, nothing has changed since February: the recommendation algorithm continues to suggest revealing videos of children.

  • Jonas Kaiser, a researcher at the Harvard Center for Internet and Society, told the publication that the recommendation algorithm links children’s channels together. Taken individually, the videos look “innocent,” and any explicit frames in them seem incidental. But gathered together, the videos are troubling;
  • Kaiser and a team of researchers tested YouTube’s algorithms and concluded that the site often begins suggesting strange content focused on children. Videos of women discussing sex led to videos of girls in underwear, sometimes with a note about their age: 19, 18, 16 years old. After a few clicks, the researchers stumbled upon partially undressed children from Latin America and Eastern Europe;
  • Experts note that the recommendation system gathers dozens of revealing videos of children in one place and promotes them to a wide audience;
  • When the publication contacted YouTube, several clips were removed from the site, but many others were left up. The recommendations also stopped surfacing explicit videos of children; the company said this was the result of minor changes to the algorithm;
  • Experts who spoke with the publication believe the recommendation algorithm should be turned off for problematic videos. The company responded that these videos receive most of their traffic through recommendations, and removing them would hurt content creators. At the same time, YouTube promised to limit recommendations for videos that, in its view, put children at risk;
  • Some accounts with children’s videos contain links to the channel authors’ social networks. Psychologist Marcus Rogers, who has studied the spread of child pornography, noted that pedophiles can use these to contact children and solicit more sexualized content;
  • The publication was able to track down the parents of children featured in the videos and put them in touch with local organizations that can help them;
  • Journalists spoke with the mother of a 10-year-old girl who had uploaded a video of herself and her friends playing in an inflatable pool. Within a few days, the video had gained 400 thousand views. “I watched the video again and was scared,” the woman said. “It’s terrible that the video ended up in this category. The only thing I can do is forbid my daughter from posting anything on YouTube.”