The shooter in one of the two New Zealand mosque attacks streamed his deadly rampage live on social media on Friday.
Social networks and video-streaming firms responded by removing his accounts. Yet versions of the video remained on their sites hours after the shootings, which reportedly killed at least 49 people.
YouTube, Facebook and Twitter said they removed the original video after the attack. But hours later, people reported that they could still find copies on those platforms.
Twitter removed the original video, suspended the account that posted it and is still working to remove copies posted by other accounts. The firm said the account and video violated its community guidelines.
“We are deeply saddened by the shootings in Christchurch today,” said Twitter. “Twitter has rigorous processes and a dedicated team in place for managing exigent and emergency situations such as this. We also cooperate with law enforcement to facilitate their investigations as required.”
Facebook removed the video and is scrambling to stop content praising the attack from spreading.
“Police alerted us to a video on Facebook shortly after the livestream commenced and we quickly removed both the shooter’s Facebook and Instagram accounts and the video,” said Mia Garlick of Facebook New Zealand.
“We’re also removing any praise or support for the crime and the shooter or shooters as soon as we’re aware. We will continue working directly with New Zealand police as their response and investigation continues.”
Garlick said on Friday afternoon that Facebook has been adding videos that violate its policies to a database, which will allow it to detect and remove copies automatically when they are re-uploaded.
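The database Garlick describes works by fingerprinting known banned videos and checking each new upload against those fingerprints. Here is a minimal sketch of the idea, using exact byte hashing and an in-memory set for simplicity (real systems use perceptual hashes that tolerate re-encoding and cropping; `UploadFilter` and `fingerprint` are hypothetical names, not Facebook's actual API):

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Return a hex digest identifying this exact file."""
    return hashlib.sha256(data).hexdigest()

class UploadFilter:
    """Toy blocklist: stores fingerprints of banned videos and
    checks new uploads against them."""

    def __init__(self):
        self.blocked = set()  # the "database" of known-bad fingerprints

    def block(self, data: bytes) -> None:
        self.blocked.add(fingerprint(data))

    def is_blocked(self, data: bytes) -> bool:
        return fingerprint(data) in self.blocked

f = UploadFilter()
original = b"...video bytes..."
f.block(original)
print(f.is_blocked(original))       # True: an exact re-upload is caught
print(f.is_blocked(b"re-encoded"))  # False: any byte change evades exact hashing
```

The last line illustrates the weakness of exact hashing and why production systems rely on content-based (perceptual) fingerprints instead: a re-encoded or trimmed copy produces entirely different bytes.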
Facebook Live has been abused before. Since then, the firm has taken steps to detect questionable livestreams in real time.
In 2017, the social network added measures to detect livestreams where people talk about suicide. It included using AI to streamline reports and adding live chat with crisis support groups.
The new policies followed suicides reportedly livestreamed on its platform.
People tweeted that they found reposts of the attack video on YouTube 12 hours after the shootings.
A quick search on YouTube surfaces reliable reports from news organizations. Yet the graphic videos still appear if you filter results by upload date.
YouTube prioritizes reports from authoritative news sources in search results for trending events, which helps filter out videos that may spread misinformation.
YouTube explained in a blog post last July that its Top News section highlights videos from reliable news organizations and links to news articles within minutes or a few hours of a breaking news event.
These steps keep such videos out of top search results and the Trending section. Yet uploaders can still post them to YouTube.
“Shocking, violent and graphic content has no place on our platforms, and is removed as soon as we become aware of it. As with any major tragedy, we will work cooperatively with the authorities,” said YouTube.
We’ll see whether these firms ramp up their efforts against videos of the New Zealand mosque attacks in the coming days.