A complaint has been filed against YouTube at the US Federal Trade Commission by child advocacy groups alleging that the Google-owned video-streaming site is illegally collecting the personal data of children without parental consent. According to The Guardian, the complaint accuses the company of unlawfully collecting that data and targeting advertising at users under the age of 13.
The coalition comprises the Campaign for a Commercial-Free Childhood (CCFC), the Center for Digital Democracy and 21 other organizations. The groups claim that although Google says its video-streaming site is intended for viewers aged 13 and above, YouTube is fully aware that children below that age use it.
According to the groups, personal information such as location, phone numbers, and device identifiers is being collected in order to track the children across different websites and services without parental consent. This, the groups claim, violates the US Children’s Online Privacy Protection Act (Coppa).
The groups have therefore urged the FTC to investigate and sanction Google for the alleged violation:
“For years, Google has abdicated its responsibility to kids and families by disingenuously claiming YouTube — a site rife with popular cartoons, nursery rhymes, and toy ads — is not for children under 13,” Josh Golin, executive director of the CCFC, said, per The Guardian. “Google profits immensely by delivering ads to kids and must comply with Coppa. It’s time for the FTC to hold Google accountable for its illegal data collection and advertising practices.”
James P Steyer, chief executive of Common Sense, called on Google to be “transparent”, particularly about the facts, and to institute new policies to protect children’s privacy:
“It is time for Google to be completely transparent with all the facts and institute fundamentally responsible new policies moving forward to protect the privacy of kids. We fully expect Google to work closely with advocates and reach out to parents with information about parental controls, content and collection practices on YouTube so parents can make informed choices about what content they allow their kids to access and how to protect their privacy.”
In 2016, YouTube took steps to put parents in the driving seat when it updated the YouTube Kids app to let parents block unsuitable content.
The feature is less about filtering out inappropriate videos than about giving parents the power to decide what their kids see in the app. The YouTube Kids curation process already ensures that only videos tagged as appropriate for children appear on the app, but parents may still disagree with those choices — objecting, for instance, to videos their kids watch on endless repeat.
Blocking an unsuitable video is quite easy too: tap the three dots next to any video or channel you feel uncomfortable about, and you will be asked whether you would like to “block this video”. The app will then prompt you to “send yourself a parental consent email”, and a verification code will be sent to verify the account. Blocked videos stay blocked across all your devices — and you can unblock a video or channel at any time if you change your mind.