YouTube Updates Policies With New Approach To Curb Extremist Videos

A report reveals one of the London Bridge attackers was radicalized after watching extremist videos by an American Islamic preacher on YouTube.

YouTube has updated its policies to curb such videos on its platform.

Google announced on Sunday that the company is taking a new approach to issuing warnings on, and removing, videos that violate its community guidelines.

The search engine giant added that YouTube will identify and remove such offensive videos. Videos that promote the subjugation of religions or races without inciting violence will carry a warning, be excluded from monetization, and have comments disabled, which will result in less engagement.

Until now the company has relied on computer-based video analysis, but it will now devote more engineering resources to identifying and removing terrorism-related videos.

Google also said it is enlisting experts from nongovernmental organizations to help determine whether a video's content is religious or newsworthy speech.

More than 400 hours of video are uploaded to YouTube every minute, making it difficult to police all of it in real time. Offensive videos are flagged by users, and algorithms then scan the site for potential problems. Copyrighted material, graphic violence, and nudity are usually taken down immediately.

Other tech companies are working on similar measures to keep their platforms free of offensive material and prevent them from becoming dens of extremism.

In 2016, Twitter suspended 377,000 accounts for violating its community guidelines.

British Prime Minister Theresa May criticized tech companies after the terror attacks in Manchester and London for providing a safe space to extremist Islamic preachers.