YouTube said it will disable comments on almost all videos featuring children – potentially affecting millions of posts on the site – after reports last week that pedophiles were leaving inappropriate comments on innocuous videos of children.
The change comes as YouTube struggles to moderate content on its platform, which continues to be plagued by concerns about hate speech, violence and conspiracy theories.
It will take several months for YouTube to disable comments on all videos featuring minors, the company said. The process began last week, when it turned off comments on tens of millions of videos.
Advertisers including Nestle, AT&T and Fortnite creator Epic Games pulled ads from YouTube last week after the inappropriate comments about children were surfaced by a popular YouTuber and media reports.
At least one company, Nestle, said it was satisfied with YouTube's response and resumed its ads at the end of last week.
A small number of channels featuring children will be allowed to keep comments enabled. But they must be in good standing with YouTube and must actively moderate their comments, going beyond the standard monitoring tools the site provides.
Turning off comments on such a large number of videos seems like an "extreme reaction," says eMarketer analyst Paul Verna.
But it's about the safety of children, so it's only natural that YouTube wants to act quickly, he said.
Comments are not the main draw of the video-sharing site, but disabling them will likely diminish the experience for many users and creators, he said.
YouTube CEO Susan Wojcicki acknowledged the concerns Thursday, tweeting: "Nothing is more important to us than ensuring the safety of young people on the platform."
The company said it has also rolled out an updated version of its automated moderation system, which it expects to identify and remove twice as many inappropriate comments.
YouTube, like Facebook, Twitter and other sites that host user-generated content, has faced growing pressure to police what appears on its platform and to remove inappropriate material. The companies all say they have taken steps to protect users, but problems keep emerging.
Concern about YouTube comments was barely on the radar of advertisers and viewers just a few weeks ago, Verna said.
"You just wonder: what's the next thing that's going to happen?"