YouTube says it has had enough of channels that exploit children with their content. According to the video-streaming website, it is rolling out stricter policies to deal with the issue and make its platform safer for minors.
The Google subsidiary said in a blog post on Wednesday that it has noticed a growing trend in recent months of content on its platform that attempts to “pass as family-friendly, but clearly is not.”
“While some of these videos may be suitable for adults, others are completely unacceptable, so we are working to remove them from YouTube,” YouTube said.
To rid its website of such content and provide a safer environment where kids can watch their favorite videos, the company has rolled out stricter rules:
- Finding violating content through machine learning: The company will use machine learning to find and escalate content that violates its policies more effectively and consistently. The decision on what happens to any flagged content will still rest with human reviewers, who will assess it and act appropriately. YouTube added that it had terminated over 50 channels in the last week for violating its recently expanded guidelines.
- Removing ads from inappropriate videos that target families: According to YouTube, ads have been removed from 3 million videos under its advertiser-friendly policy since it was updated in June, and the policy has now been further strengthened to remove ads from another 500,000 videos.
- Blocking inappropriate comments on videos that feature minors: In addition to working with NCMEC to report “illegal behavior to law enforcement”, the video-streaming company said it is rolling out stricter rules to deal with inappropriate comments. “Starting this week we will begin taking an even more aggressive stance by turning off all comments on videos of minors where we see these types of comments.”
- Providing guidelines to help creators: YouTube will support creators of family-friendly content by providing them with guidelines, especially for its YouTube Kids app.
Some of these rules are already in place, while others will take shape gradually in the weeks to come. The process is not yet mature, and there is still more to be done to protect kids from adult content.
“These latest enforcement changes will take shape over the weeks and months ahead as we work to tackle this evolving challenge. We’re wholly committed to addressing these issues and will continue to invest the engineering and human resources needed to get it right. As a parent and as a leader in this organization, I’m determined that we do,” said Johanna Wright, Vice President of Product Management at YouTube.
Last year, Google gave parents the power to block unsuitable content in YouTube Kids. The company built on that earlier this month when it added more parental control features to the app.
Google said its decision was informed by feedback from parents who felt they needed more control over the kind of content their children have access to. Parents, according to YouTube, can now block or flag any content they are uncomfortable with.