YouTube Employing Ten Thousand People Just To Watch Videos

YouTube is looking to have a team of about 10,000 people watching videos to find and remove anything inappropriate or illegal.

By nowproducerdave on December 6, 2017
(Photo by Ethan Miller/Getty Images)

Ok, so maybe their job isn’t JUST to watch videos, and it’s not just WATCHING the videos, but that is a job requirement.

YouTube is trying to improve the system that automatically detects "hateful and exploitative" videos. To do that, they have to train the system, and that's where the roughly 10,000 people come into play. Let's back up a little bit first. Earlier this year, YouTube implemented an algorithm that automatically detects videos that may violate their terms of service, or that may contain content advertisers might find "not suitable." The system would flag such a video, and the uploader would then not receive any ad revenue from it. This angered a lot of "YouTubers," because some of them make content as their full-time job, and the flagging put a significant dent in their monthly income. The system was also wrongfully flagging a lot of those videos, which is why improvement is necessary.

But that's not the only problem. Some of those videos were genuine content violations. CEO Susan Wojcicki even says in a blog post that there has been "a significant increase in bad actors seeking to exploit our platform." That basically means there are people out there creating content that is "fake news," offensive, hateful, and so on — content made specifically to rack up views quickly and generate a lot of income, regardless of its accuracy or moral integrity. There has also been an increase in child predators posting inappropriate comments on videos of children and creating content that appeals to children.

YouTube is trying to train the new system to recognize that bad content, flag it for removal, and investigate further. They're looking to grow their "trust and safety" team to over 10,000 people, which will not only help train the machines that aid in flagging videos, but will also provide faster human review of flagged videos, getting genuine content back online and keeping some money in the pockets of content creators. There's no word on when hiring will start, but we're sure there will be a mass-hiring event soon, since they're planning to start the "training" process in 2018.

