YouTube outlines four new ways to combat terrorism

[Image: YouTube logo. Credit: https://en.wikipedia.org/wiki/YouTube]

Google has outlined four new measures to combat terrorism on YouTube. The video platform has come under intense pressure from several quarters to take stronger action against terrorist content.

In an op-ed published in the Financial Times on Sunday, Google said YouTube has been working with governments and law enforcement agencies to rid its platform of content posted by terrorists. Kent Walker, Google's senior vice-president and general counsel, said the company has invested in systems that help identify such posts before removing them from the platform. Walker, however, admitted that more still has to be done, and quickly.

“Our engineers have developed technology to prevent re-uploads of known terrorist content using image-matching technology. We have invested in systems that use content-based signals to help identify new videos for removal. And we have developed partnerships with expert groups, counter-extremism agencies, and the other technology companies to help inform and strengthen our efforts,” Walker said on Sunday.

The first step, according to Walker, will be to increase Google’s use of technology to identify content posted by extremists and terrorists. This will be achieved by using machine learning to “train new ‘content classifiers’ to help us more quickly identify and remove such content.”
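
For illustration only, here is a minimal, hypothetical sketch of what training a simple "content classifier" over video metadata can look like, using off-the-shelf scikit-learn components. The data, labels and features below are invented for this example and say nothing about how YouTube's actual systems work.

```python
# Illustrative toy "content classifier" trained on video metadata text.
# NOT Google's system; all data and labels here are hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training examples: (metadata text, label)
texts = [
    "join our cause and fight the unbelievers",    # violating example
    "news report on recent counter-terror raids",  # benign example
    "cooking tutorial: how to make flatbread",     # benign example
]
labels = [1, 0, 0]  # 1 = likely violating content, 0 = benign

# TF-IDF text features feeding a simple logistic-regression classifier
classifier = make_pipeline(TfidfVectorizer(), LogisticRegression())
classifier.fit(texts, labels)

# Score a new upload's metadata; a high score would route it to human review
score = classifier.predict_proba(["recruitment video for new fighters"])[0][1]
print(f"review-priority score: {score:.2f}")
```

In practice such a score would only prioritize content for human reviewers rather than trigger automatic removal, which is consistent with Walker's point below about the continuing role of human experts.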

The second step will be to greatly increase the number of "independent experts in YouTube's Trusted Flagger program". Walker said machines can help identify videos posted by extremists, but human experts still have a major role to play in nuanced decisions about drawing the line "between violent propaganda and religious or newsworthy speech." Trusted Flaggers, according to Google, provide far more accurate reports than ordinary flaggers and help the company scale its efforts to identify emerging areas of concern.

Google pledged to expand the program by adding 50 expert NGOs to the organizations already taking part, and to provide operational grants to support their work.

The third step will see Google taking a tougher stance on content that does not clearly violate its policies, such as videos containing inflammatory or supremacist content. "In future these will appear behind an interstitial warning and they will not be monetized, recommended or eligible for comments or user endorsements," Walker said. In other words, YouTube will make such videos harder for viewers to find.

The fourth and final step will see YouTube expanding its "counter-radicalization efforts." The company hopes to achieve this by working with Jigsaw to implement the "Redirect Method" more broadly across Europe. "This promising approach harnesses the power of targeted online advertising to reach potential Isis recruits, and redirects them towards anti-terrorist videos that can change their minds about joining," Walker said.

Banks and several other blue-chip companies pulled their ads from YouTube last March in protest at Google's failure to act against extremist content.

The companies were angered that their ads were appearing alongside videos posted by extremist groups such as the Islamic State (ISIS) on YouTube. Three of the UK's biggest banks, HSBC, Lloyds and the Royal Bank of Scotland (RBS), expressed fears that part of their ad budgets was being used to fund banned hate preachers, extremists, terrorist organizations and racists, among others.


Author: Ola Ric

Ola Ric is a professional tech writer. He has written numerous published articles for professional and private clients. He is also a social commentator and analyst with relevant experience in the use of social media services.
