Companies have withdrawn advertising, the site has banned several channels, and video bloggers are afraid of another wave of demonetization.
Since February 17, 2019, a scandal has been unfolding around YouTube: users accuse the service of harboring a community of pedophiles, Disney, Nestle and Epic Games have withdrawn their advertising budgets, and the video hosting's management is banning and then restoring seemingly random channels.
The story began with a video by blogger Matt Watson. In a video titled "YouTube makes it easier for children to be sexually exploited and monetize it," he showed how pedophiles find videos of children on the site. Here is a reconstruction of how the conflict unfolded.
A "wormhole" of children's videos on YouTube
Using examples, Watson showed how the service's algorithms help pedophiles find videos of minors. He created a new account and searched for "bikini haul", a genre of video in which authors show off their purchases, in this case bikinis. In just a few clicks, he fell into a "wormhole" of children's videos: the recommendations filled with clips of young girls undressing, doing gymnastics, eating lollipops or playing Twister.
These videos are not pornographic; in them, children talk about their lives or do challenges with friends. But dozens of users in the comments leave timecodes pointing to moments where the children are in explicit poses. They write compliments and send emoji with hearts and water drops. Watson also claimed that users share links to child pornography. Many of the videos carry embedded advertising, meaning Google monetizes them. At the end of the video, Watson said he was leaving YouTube because the platform had become "disgusting."
A check of the video hosting's algorithms confirmed that the videos Watson described can be reached in three clicks.

Under the videos one can indeed find the timecodes Watson mentioned. In some cases the channel owner had even liked the pedophiles' comments. However, no links to child pornography or other erotic content were found in the comments.
YouTube is struggling with the problem
The YouTube community guidelines prohibit the "representation of minors in a sexual context". Users under the age of 13 cannot register on the site. The service also promises to contact law enforcement agencies and the National Center for Missing and Exploited Children if it finds such videos.
The company explained that it removes erotic videos from the platform and bans channels that spread them. Google claims to remove ads from these videos.
A YouTube spokesperson said the company is reviewing its policies as well as the specific videos and comments shown in Watson's video. After the review, some materials were removed.
YouTube stressed that most of the videos Watson highlighted are "innocent videos of children doing their daily activities." The service has disabled comments under such content. YouTube did not confirm the claim that innocent videos are being exploited by pedophiles. According to the company, not all such videos are deleted because it is difficult for artificial intelligence to determine the context of the material. About 10 thousand moderators work on the platform, and about 400 hours of video are uploaded to the service every minute.
On February 17, the channels of two bloggers who uploaded all-ages videos about Pokémon Go were removed from the service. They were banned for the abbreviation "CP" in their video titles: in the game it stands for a Pokémon's combat power, but in pedophile circles the abbreviation means child pornography. The channels were restored the same day. YouTube also sent a warning to the author of a Club Penguin playthrough and deleted another channel for videos of the same game. Bloggers noted that the service's artificial intelligence cannot recognize context.
Reaction of bloggers and advertisers
Content creators have criticized Watson's video. In the video description, he attached the contact numbers of companies whose ads appeared in the controversial videos and asked viewers to complain to the advertisers. Bloggers believe he is trying to provoke another "adpocalypse".
- In November 2016, violent videos spread on YouTube in which cartoon characters died or committed suicide. The service decided to promote all-ages content and allowed advertisers to choose for themselves which videos to advertise on. After the changes, content creators began to earn significantly less than before. That is when the term "adpocalypse" (advertising apocalypse) arose;
- In February 2017, the media found that the blogger PewDiePie was posting videos with anti-Semitic symbols. Disney refused to cooperate with the blogger, and his YouTube show was canceled;
- In July 2017, extremist and terrorist videos with embedded advertising appeared on YouTube. The service disabled monetization for channels with fewer than 10 thousand views;
- In December 2017, comments from pedophiles were noticed under videos of children on the site. HP, Mars and Deutsche Bank declined to advertise on YouTube. The service's management promised to apply stricter criteria to monetized videos;
- In January 2018, Logan Paul posted a video with a corpse from the "suicide forest". YouTube announced that to monetize content, creators now need 4,000 hours of watch time over the past year and a thousand subscribers. Previously, the threshold was 10 thousand lifetime views.
I’m not reporting the story because it negatively affects the whole YouTube community. We don’t need another ad apocalypse. What I have done behind the scenes though is reached out to my YouTube contacts showing them the video & my team is showing them content to take down. https://t.co/IsMmfXwACK
— KEEM 🍿 (@KEEMSTAR) 18 February 2019
This is not just about me. This is about all my friends big & small creators. I’m not reporting something that’s going to affect there livelihoods. Instead I’m going to work privately behind the scenes with YouTube to get this content taken down. #TeamYouTube https://t.co/VinaTJYl2K
— KEEM 🍿 (@KEEMSTAR) 18 February 2019
He’s purposely trying to usher in another Adpocalypse…
YouTube has problems, fixing them doesn’t mean you go telling people to call up advertisers…
This isn’t some heroic crusade, it’s a vendetta…
— Roberto Blake #AWESOMESQUAD (@robertoblake) 20 February 2019
The Verge noted that if large companies refuse to advertise, it will have serious consequences for ordinary video bloggers on YouTube. On February 20, Disney, Nestle and Epic Games did pull their advertising budgets. The service promised to reimburse the money they had spent on ads placed in the controversial videos. The companies expect further action from Google.
But YouTube does not give up
After the large companies pulled their advertising from the platform, YouTube told advertisers that it was taking the problem seriously: it is cleaning up monetization, strengthening moderation, and addressing the issue with the recommendation algorithm.
According to YouTube, the service will ask channel creators to moderate their own content. Management stated that the service is responsible for the comments on the site and can therefore place higher demands on channel owners.
To tackle the issue, Google also brought in social workers, child development specialists, former prosecutors, and former FBI and CIA employees.
YouTube began deleting thousands of channels a week that are run by children under the age of 13.
YouTube acknowledges that the main problem remains the recommendation algorithm. A company representative said that recommendation panels had been removed from the controversial videos, and on some of them the panels have indeed disappeared.
As Newsweek noted, algorithms that generate recommendations based on user preferences are still a new problem for the industry. Malicious users continue to exploit YouTube for illegal purposes, despite the platform's active efforts against them.