
YouTube is now taking further measures to moderate the content that reaches its kids app

YouTube is finally taking bigger steps to combat inappropriate videos targeted toward children.

In October, Mashable first reported that weird, creepy, and downright inappropriate videos were slipping through filters on YouTube Kids, an app geared toward children that allows virtually anyone with a YouTube account to create content that could be seen by millions of children. The issue flared up again this week after The New York Times reported on the story.

Back in August, the company rolled out a new policy barring creators from earning advertising dollars for the inappropriate use of family-friendly characters, such as Elsa and Spider-Man. Now YouTube has decided to take an additional measure: age-restricting this type of content on its main app when it's flagged, which will automatically block it from slipping into the kids app, as first reported by The Verge.


"Earlier this year, we updated our policies to make content featuring inappropriate use of family entertainment characters ineligible for monetization," Juniper Downs, YouTube director of policy, said in a statement from the company. "We’re in the process of implementing a new policy that age restricts this content in the YouTube main app when flagged. Age-restricted content is automatically not allowed in YouTube Kids. The YouTube team is made up of parents who are committed to improving our apps and getting this right."

That means if a kid-friendly character like Elsa from Frozen is doing something inappropriate, like shooting a machine gun, YouTube is hoping users will flag it, which will age-restrict it and therefore block it from hitting the kids app. Content from YouTube main may take several days to filter into the kids app, and content flagged within the kids app has its own reviewers, who monitor flagged content 24/7.


YouTube stressed to Mashable that this is an added layer of protection, not the only process keeping videos from migrating from YouTube main into the kids app. The company says it uses machine learning and algorithms to select content appropriate for children, and that the system is constantly evolving to block inappropriate content.


YouTube will be using its team of moderators to help sift through content and take action on any videos that may be inappropriate. This new practice should be rolling out in the coming weeks.

YouTube says it has been working on the policy for a while, and that its practices were not revised in response to scrutiny in the media. YouTube made no mention of a new policy to Mashable during the reporting of our original piece in October.

While the policy is a welcome change for parents worried about the content their kids may see on a user-generated platform such as YouTube, the new policy will still rely heavily on algorithms, and on someone spotting the problem content first. So it's not necessarily a sure fix: Some of these bizarre clips on YouTube run 30 minutes or longer, and they often start out completely normal, only to take sudden, dark turns.

And, as we all know, algorithms are far from perfect.

UPDATE: Nov. 9, 2017, 4:45 p.m. PST This article has been updated to include a statement from YouTube.



