YouTube has announced a range of new measures to better protect children using its platform, reducing both data collection on content aimed at kids and advertisers' capacity to target young viewers through ads on such content.
As explained by YouTube:
"Starting in about four months, we will treat data from anyone watching children’s content on YouTube as coming from a child, regardless of the age of the user. This means that we will limit data collection and use on videos made for kids only to what is needed to support the operation of the service. We will also stop serving personalized ads on this content entirely, and some features will no longer be available on this type of content, like comments and notifications."
These are significant changes, both for advertisers and content creators. Brands looking to reach younger users will no longer have access to the same levels of data they've had up until now, while for creators, losing notifications will likely be a blow to their awareness efforts.
YouTube actually removed comments on such content back in March, following reports of misuse, but these additional measures expand upon that change significantly, and will have broader impacts.
"In order to identify content made for kids, creators will be required to tell us when their content falls in this category, and we’ll also use machine learning to find videos that clearly target young audiences, for example those that have an emphasis on kids characters, themes, toys, or games."
The noted four-month threshold is designed to give creators and marketers time to adjust their strategies accordingly, but even with that lead time, the changes will be substantial. That said, they are likely necessary, given reports of how YouTube's systems have been misused in this respect.
The New York Times published an in-depth report on such cases back in 2017, with various examples of disturbing, offensive content that had been slipping past YouTube's filtering systems, and making its way to young viewers. Add to this the instances of pedophile rings operating within YouTube's systems, and the need for action is very clear.
As per YouTube:
"Responsibility is our number one priority at YouTube, and nothing is more important than protecting kids and their privacy. We’ve been significantly investing in the policies, products and practices to help us do this. From its earliest days, YouTube has been a site for people over 13, but with a boom in family content and the rise of shared devices, the likelihood of children watching without supervision has increased. We’ve been taking a hard look at areas where we can do more to address this, informed by feedback from parents, experts, and regulators, including COPPA concerns raised by the U.S. Federal Trade Commission and the New York Attorney General that we are addressing with a settlement announced today."
These new measures significantly expand on YouTube's existing efforts - but as noted, they will necessitate a shift in focus for marketers who have been using ads to reach younger users, and for creators who've come to rely upon that ad revenue to fuel their own businesses.
On the latter element, YouTube is also establishing a $100 million fund "dedicated to the creation of thoughtful, original children’s content on YouTube and YouTube Kids globally". The funding will be disbursed over three years, and will help YouTube maintain its popular kids content push, while also negating at least some of the potential impacts of these changes for creators.
It's a significant, and important, update for YouTube. These days, kids don't watch traditional TV at the same rates, and they're far more likely to be familiar with YouTube celebrities than with more widely recognized entertainment figures.
Case in point: a recent study found that today's kids are three times more likely to aspire to be a YouTuber than an astronaut.
YouTube's influence among younger age groups is massive, and given the scale of that impact, it's good to see YouTube taking stronger action in this respect.