If you are one of those social media users who spend hours on YouTube watching videos, can quickly tell authentic content from fake, and can judge what counts as inappropriate for a broad audience, why not make money out of it? The video-sharing website has said it plans to hire 10,000 people to review content on YouTube and ensure it complies with the company's policies.
The Google-owned site made this decision after several inappropriate videos surfaced on the platform in the recent past, costing YouTube a number of advertisers. Parents have also lashed out at the brand for carrying violent and inappropriate content unfit for children.
YouTube chief executive officer Susan Wojcicki said that while the site's openness has helped many, it also has a flipside that causes problems for both the site and its viewers. She said the site was now taking action "because it is the right thing to do."
"I've seen how our open platform has been a force for creativity, learning and access to information. I've seen how activists have used it to advocate for social change, mobilize protests, and document war crimes," she said in a blogpost. "But I've also seen up-close that there can be another, more troubling, side of YouTube's openness. I've seen how some bad actors are exploiting our openness to mislead, manipulate, harass or even harm."
After complaints from advertisers and viewers, YouTube is said to have pulled down 150,000 videos and disabled comments on more than 625,000 videos. The California-based firm also said it had disabled the accounts of hundreds of users for posting "predatory comments on videos featuring minors."
Apart from bringing 10,000 moderators onboard, the video-sharing site has also deployed machine-learning technology that will flag inappropriate content for the human moderators to review.
Here's what Wojcicki had to say about YouTube's content and how the firm plans to tackle online violence and extremism.
As the CEO of YouTube, I've seen how our open platform has been a force for creativity, learning and access to information. I've seen how activists have used it to advocate for social change, mobilize protests, and document war crimes. I've seen how it serves as both an entertainment destination and a video library for the world. I've seen how it has expanded economic opportunity, allowing small businesses to market and sell their goods across borders. And I've seen how it has helped enlighten my children, giving them a bigger, broader understanding of our world and the billions who inhabit it.
But I've also seen up-close that there can be another, more troubling, side of YouTube's openness. I've seen how some bad actors are exploiting our openness to mislead, manipulate, harass or even harm.
In the last year, we took actions to protect our community against violent or extremist content, testing new systems to combat emerging and evolving threats. We tightened our policies on what content can appear on our platform, or earn revenue for creators. We increased our enforcement teams. And we invested in powerful new machine learning technology to scale the efforts of our human moderators to take down videos and comments that violate our policies.
Now, we are applying the lessons we've learned from our work fighting violent extremism content over the last year in order to tackle other problematic content. Our goal is to stay one step ahead of bad actors, making it harder for policy-violating content to surface or remain on YouTube.
More people reviewing more content
Human reviewers remain essential to both removing content and training machine learning systems because human judgment is critical to making contextualized decisions on content. Since June, our trust and safety teams have manually reviewed nearly 2 million videos for violent extremist content, helping train our machine-learning technology to identify similar videos in the future. We are also taking aggressive action on comments, launching new comment moderation tools and in some cases shutting down comments altogether. In the last few weeks we've used machine learning to help human reviewers find and terminate hundreds of accounts and shut down hundreds of thousands of comments. Our teams also work closely with NCMEC, the IWF, and other child safety organizations around the world to report predatory behavior and accounts to the correct law enforcement agencies.
We will continue the significant growth of our teams into next year, with the goal of bringing the total number of people across Google working to address content that might violate our policies to over 10,000 in 2018.
Apart from this, the YouTube CEO also explained that the firm understands advertisers' concerns about inappropriate content on the site and has formulated new policies to tackle the issue. "We want advertisers to have peace of mind that their ads are running alongside content that reflects their brand's values. Equally, we want to give creators confidence that their revenue won't be hurt by the actions of bad actors," she said.
"We believe this requires a new approach to advertising on YouTube, carefully considering which channels and videos are eligible for advertising. We are planning to apply stricter criteria, conduct more manual curation, while also significantly ramping up our team of ad reviewers to ensure ads are only running where they should. This will also help vetted creators see more stability around their revenue. It's important we get this right for both advertisers and creators, and over the next few weeks, we'll be speaking with both to hone this approach.
"We are taking these actions because it's the right thing to do. Creators make incredible content that builds global fan bases. Fans come to YouTube to watch, share, and engage with this content. Advertisers, who want to reach those people, fund this creator economy. Each of these groups is essential to YouTube's creative ecosystem—none can thrive on YouTube without the other—and all three deserve our best efforts.
"As challenges to our platform evolve and change, our enforcement methods must and will evolve to respond to them. But no matter what challenges emerge, our commitment to combat them will be sustained and unwavering. We will take the steps necessary to protect our community and ensure that YouTube continues to be a place where creators, advertisers, and viewers can thrive."