YouTube is hiring moderators to bring its content-review team to 10,000 strong in an effort to remove “problematic content” from the platform.
In a blog post, YouTube CEO Susan Wojcicki said the company aims to increase its content reviewers to more than 10,000 people who will “address content that might violate our policies.” She didn’t say how many people already do that job.
YouTube’s trust and safety teams have reviewed “almost 2 million videos for violent extremist content” since June, Wojcicki wrote, and the company is adjusting its algorithms to identify these types of videos and further automate the process.
Machine learning systems now remove nearly five times as many videos as they did previously, the post said. Half of all violent extremist content is removed by machine learning within two hours, and 70 percent is removed within eight hours.
The kid-friendly YouTube Kids platform has come under greater scrutiny after many of its videos were found to contain profanity and violence. Some advertisers also pulled their ads from the main YouTube platform in June after the ads appeared next to videos made by a hate preacher.
“We want advertisers to have peace of mind that their ads are running alongside content that reflects their brand values,” Wojcicki wrote. “Equally, we want to give creators confidence that their revenue won’t be hurt by the actions of bad actors.”
YouTube plans more transparency in 2018 in the form of reports detailing the flags it receives and the actions it takes to remove videos and comments that violate its policies.