YouTube Stops Serving Personalized Ads On Children’s Videos


by @wendyndavis, January 6, 2020


Google’s YouTube said Monday it has begun rolling out a new policy banning personalized ads on videos aimed at children.

“YouTube now treats personal information from anyone watching children’s content on the platform as coming from a child, regardless of the age of the user,” the company said in a blog post. “This means that on videos made for kids, we limit data collection and use. We no longer serve personalized ads on this content or support features such as comments, live chat, notification bell, stories, save to playlist, and others.”

Google alerted people in September that those changes were coming, when the company agreed to pay $170 million to settle allegations that it violated the Children’s Online Privacy Protection Act by knowingly collecting data from children younger than 13.

The FTC alleged in its complaint against Google that the company marketed YouTube to Mattel, Hasbro and other companies as a “top destination for kids.”

In a presentation to Mattel, Google allegedly described YouTube as “today’s leader in reaching children age 6-11 against top TV channels.”

Google did not admit to violating the children’s privacy law when it settled with the FTC.

Even though the company is changing its policies, it has also asked the Federal Trade Commission to revise children’s privacy rules to make it easier for YouTube to collect data from people over age 12 who watch videos aimed at young children.

The FTC currently presumes that people watching child-oriented videos are themselves children. Last year, the FTC solicited comments from the public about whether to allow companies like Google to “rebut the presumption” that all viewers of child-oriented videos are under age 13.

Google recently argued in favor of that change.

“On YouTube … there is a strong community of adults who watch child-directed content on their own, for example ‘nostalgia watching,’” Google wrote in comments filed with the FTC last month. “Teachers and parents may also access and engage with this content for the benefit of their students or children — for example, to evaluate appropriate content or recommend it to others.”

Andrew Smith, the head of the FTC’s consumer protection bureau, said in September that he anticipated the agency would revise at least some regulations. He suggested the agency was considering making it easier for platforms to serve behaviorally targeted ads to users over age 12 who view videos that appear to be directed at children.

At the time, Smith elaborated to MediaPost on possible ways companies like YouTube could distinguish between users likely to be younger than 13 and older ones. Among other options, YouTube could turn off commenting for users who don’t have Gmail accounts, and then collect data only from users who leave comments. (Google requires Gmail users to be at least 13 years old.)

Four U.S. Senators recently asked the FTC not to weaken the regulations. “Now is not the time to pull back,” Sens. Ed Markey (D-Mass.), Richard Blumenthal (D-Conn.), Josh Hawley (R-Mo.), and Marsha Blackburn (R-Tenn.) wrote. “As children’s use of technology continues to increase, so too does the appetite by tech giants for children’s personal information.”

