San Jose, Calif. (KTVU) - YouTube announced on Monday that it will update its live streaming policy to bar minors from live streaming video unless they are clearly accompanied by an adult. The video sharing platform will also automatically disable comments on videos featuring minors and reduce recommendations of videos that feature minors in risky situations.
The announcement comes in the aftermath of a New York Times investigation that found YouTube’s automated recommendation algorithm had begun showing videos that sexualize children to users who watched other videos of “prepubescent, partially clothed children.”
Earlier, in February 2019, a report by Wired Magazine had found many instances of YouTube videos with tens of millions of views that were inundated with comments by pedophiles. A number of brands, including Disney, Nestle, and Epic Games, pulled their ad spending following the report.
In the blog post published Monday, YouTube said that it terminates accounts belonging to people under 13 when they are discovered, amounting to thousands of account terminations per week.
“In the first quarter of 2019 alone, we removed more than 800,000 videos for violations of our child safety policies, the majority of these before they had ten views,” reads a statement on the YouTube blog. The company has also disabled comments on tens of millions of videos featuring minors on its platform. A machine learning classifier, updated in June, is better at identifying videos that may put minors at risk, YouTube says.
YouTube also said that it has shared tens of thousands of reports with the National Center for Missing and Exploited Children (NCMEC), leading to numerous law enforcement investigations.