Google outlines four steps to tackle online extremism on YouTube
A 3D plastic representation of the Twitter and YouTube logos is seen in front of a displayed ISIS flag in this photo illustration in Zenica, Bosnia and Herzegovina, February 3, 2016. REUTERS/Dado Ruvic

Google, which is under tremendous pressure to eliminate online extremism from its various services, responded on Sunday. The search giant outlined four steps to identify and remove terrorist or violent extremist content from its platforms, especially YouTube.

Google said it had already invested in systems that use content-based signals to identify videos for removal, and had built partnerships with experts, counter-extremism agencies, and other technology companies to strengthen those efforts. The company also pledged several additional steps, the first of which focuses on reinforcing its machine learning research.

"We will now devote more engineering resources to apply our most advanced machine learning research to train new 'content classifiers' to help us more quickly identify and remove extremist and terrorism-related content," Google's general counsel Kent Walker said in a blog post.

The second step brings more human oversight to YouTube's Trusted Flagger programme. While machines will help identify offensive videos, human experts will play a vital role in differentiating between violent content and religious or newsworthy speech, according to Google.

The company also said that it would collaborate with more NGOs and specialised organisations working on issues such as hate speech, self-harm, and terrorism.

People are silhouetted as they pose with mobile devices in front of a screen projected with a YouTube logo, in this picture illustration taken in Zenica October 29, 2014. REUTERS/Dado Ruvic

As part of the third step, Google said it would begin placing warnings on videos that contain inflammatory religious or supremacist content but do not clearly violate its policies. Such videos will not be monetised, recommended, or eligible for user comments and endorsements.

Finally, the company plans to use its targeted online advertising to reach potential Islamic State recruits and redirect them towards anti-terrorist videos. Google hopes this particular move can change their minds about joining ISIS.

"In previous deployments of this system, potential recruits have clicked through on the ads at an unusually high rate, and watched over half a million minutes of video content that debunks terrorist recruiting messages," Walker said in the blog post.

Last week, Facebook outlined its own methods to fight terrorism on its social media platform, and Twitter has also stepped up its efforts against terrorist content.