After years of YouTube Kids being available only via a mobile app, the video-streaming platform is now offering kid-friendly videos on its own website, YouTubeKids.com, which will provide similar experiences, features and safeguards to its app counterpart.
Details and Implications:
Until now, YouTube Kids was available only as an app on mobile and tablet devices; the service has more than 14 million monthly active viewers in over 40 countries.
Along with the new website for YouTube Kids, Google also announced that the platform will be getting new privacy filters for parents, allowing them to manually control which content is deemed age-appropriate by adding videos to an ‘Approved Content Only’ list. This ensures that children can only watch videos chosen and vetted by their parents. In addition, the new site filters content into three age-defined sections: Preschool, Younger (ages 5-7), and Older (ages 8-12). Even with these filters in place, the platform warns that adult content could still find its way onto YouTube Kids. Parents can block or flag inappropriate videos on YouTube Kids for fast review, Google said.
With the rollout of the desktop-friendly version of YouTube Kids, the video-sharing platform also announced plans to invest $100m over three years in the creation of ‘thoughtful’ original content. The move comes in response to recent controversies over YouTube collecting children’s data and using it to target ads, all without parental consent.
The Federal Trade Commission (FTC) recently issued YouTube a $170m fine for those same privacy infringements in the US. The fine, while small relative to Google’s revenue, has large implications for the site. Under new rules set out by the FTC, YouTube must stop collecting data on videos targeted towards children (classified as users under the age of 13), and content designed and intended for children must be clearly labelled. YouTube will also stop serving personalised ads on such content, and machine learning will be used to identify videos aimed at young audiences. Features like comments and notifications will no longer be available on thousands of videos featuring kids, to dissuade predators from lurking in the comments section. For creators, however, the absence of these features could be damaging: fewer notifications mean fewer recommendations and therefore less money earned. “Starting in four months, we will treat data from anyone watching children’s content on YouTube as coming from a child, regardless of the age of the user,” YouTube said. “This means that we will limit data collection and use on videos made for kids only to what is needed to support the operation of the service.”
“Responsible growth” was only recently made one of YouTube’s core metrics, after advertisers had, over the years, pulled their ad dollars from YouTube out of fear that their ads would appear alongside harmful videos. Each time, however, the advertisers returned to the platform because of its reach and scale.
Data privacy concerns continue to be at the forefront for advertisers and parents alike, and YouTube hasn’t been the only platform facing scrutiny from the FTC over children’s privacy. Earlier this year, TikTok (formerly known as Musical.ly) paid a $5.7m fine to settle charges of illegally collecting information from children under 13. As a result, TikTok also launched a separate, limited app experience intended for children under 13, with additional safety features and privacy protections. The settlement was described as the “largest civil penalty” in a children’s privacy case.