Google has vowed to prevent extremist material from spreading across its YouTube video service. In June, Google's general counsel stated that the company is “committed to being part of the solution” by tackling extremist content online.
“We are working with government, law enforcement, and civil society groups to tackle the problem of violent extremism online. There should be no place for terrorist content on our services.” – Kent Walker, General Counsel at Google
Google is investing more engineering resources in developing AI software that can identify and eliminate extremist content. It is also expanding its pool of independent experts from non-governmental organizations through its Trusted Flagger program; the experts' reports are accurate about ninety percent of the time.
Videos that don’t violate YouTube’s rules but contain inflammatory or supremacist content will carry a warning. The goal is to make these videos harder to find.
Furthermore, YouTube is working with Jigsaw, the company behind the Redirect Method, which uses ad targeting to redirect potential ISIS recruits to anti-terrorist videos.
The UK Prime Minister, Theresa May, called on tech companies to tackle online extremism after the Manchester Arena bombing. She and France’s President, Emmanuel Macron, said they would evaluate proposals to penalize Internet companies that fail to remove extremist content.
Fighting terrorist content online is challenging, according to Google, but the company is committed to doing more to fight it.
With the latest anti-terrorism feature, users will have a hard time finding extremist content. When they search for violent extremist content on YouTube, they’ll see videos that discredit extremist messaging.
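YouTube has not published how the Redirect Method is implemented; the sketch below is only an illustrative model of the idea described above, with a hypothetical curated term list and placeholder video IDs. It shows the core logic: when a search query matches a flagged term, counter-messaging content is surfaced instead of ordinary results.

```python
# Illustrative sketch only -- YouTube's real system is not public.
# FLAGGED_TERMS and COUNTER_PLAYLIST are hypothetical placeholders.

FLAGGED_TERMS = {"join isis", "isis recruitment"}       # hypothetical curated list
COUNTER_PLAYLIST = ["defector_testimony", "imam_debunks_propaganda"]  # placeholder IDs

def ordinary_search(query: str) -> list[str]:
    """Stand-in for the regular search backend."""
    return []

def search_results(query: str) -> list[str]:
    normalized = query.lower().strip()
    # On a match, redirect the user toward counter-messaging videos.
    if any(term in normalized for term in FLAGGED_TERMS):
        return COUNTER_PLAYLIST
    return ordinary_search(normalized)
```

In practice, a production system would rank and blend such results rather than replace them outright, and the term list would be maintained by the NGO partners the article mentions.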
“This early product integration of the Redirect Method on YouTube is our latest effort to provide more resources and more content that can help change minds of people at risk of being radicalized.”
Over the coming weeks, the company plans to expand the feature to “a wider set of search queries in other languages beyond English.” It will use machine learning to update the search terms dynamically. YouTube is also working with NGOs to help develop anti-terrorism video content.
The Redirect Method is still unrefined, but Google promised to measure its success by tracking engagement with the alternative content. Machine learning could also counter attempts to sidestep redirection through disguised keywords.
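To make the disguised-keyword problem concrete, here is a minimal sketch of one way a query could be normalized before matching, assuming evasion via character substitution (leetspeak) or inserted punctuation. This is not Google's approach; a real system would rely on learned models rather than a fixed substitution table.

```python
# Illustrative only: a fixed normalization table, not a learned model.
import re

# Common character substitutions used to disguise keywords (assumed examples).
LEET_MAP = str.maketrans({"0": "o", "1": "i", "3": "e", "4": "a", "5": "s", "$": "s"})

def normalize(query: str) -> str:
    """Lowercase, undo simple character substitutions, and strip punctuation."""
    q = query.lower().translate(LEET_MAP)
    return re.sub(r"[^a-z ]", "", q)  # drop symbols used to break up terms
```

A normalized query can then be matched against the flagged-term list, so that trivially obfuscated spellings no longer evade redirection.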
Although this method helps fight online terrorist content, it doesn’t eliminate such content. Rather, it only reduces the number of videos on the subject. Sooner or later, other platforms will arise, and terrorists will use them to spread their propaganda.
The Redirect Method raises some issues. Should YouTube drive extremism underground, or monitor the people who access that content through its platform?
The changes came after big advertisers pulled their ads from the platform because those ads appeared alongside videos containing extremist, homophobic, and racist content.
“This work is made possible by our partnerships with NGOs that are experts in this field, and we will continue to collaborate closely with them to help support their research through our technological tools. We hope our work together will also help open and broaden a dialogue about other work that can be done to counter radicalization of potential recruits.”