Facebook is currently not in the good books of the United Nations—the global body has accused the social media giant of being responsible for the spread of hatred in Myanmar.
Formerly known as Burma, the Southeast Asian country of more than 100 ethnic nationalities is in the middle of a crisis in which its leaders stand accused of committing human rights abuses.
United Nations human rights experts investigating a possible genocide in Rakhine state have warned that Facebook is being used by ultra-nationalist Buddhists to incite violence and hatred against minority ethnic groups in the country, reports Reuters.
Marzuki Darusman, chairman of the UN Independent International Fact-Finding Mission on Myanmar, told reporters, per Reuters:
“It has … substantively contributed to the level of acrimony and dissension and conflict, if you will, within the public. Hate speech is certainly of course a part of that. As far as the Myanmar situation is concerned, social media is Facebook, and Facebook is social media,”
Speaking in support of Darusman, the Special Rapporteur on the human rights situation in Myanmar, Yanghee Lee, said the government used Facebook as a medium for disseminating information to the public:
“Everything is done through Facebook in Myanmar,” Lee said, noting that although Facebook has been of help to the people of Myanmar, the social media giant has also allowed its platform to be used for spreading hate speech:
“It was used to convey public messages but we know that the ultra-nationalist Buddhists have their own Facebooks and are really inciting a lot of violence and a lot of hatred against the Rohingya or other ethnic minorities.
“I’m afraid that Facebook has now turned into a beast, and not what it originally intended.”
A couple of days ago, the European Union ordered Facebook, Twitter and Google to remove terrorist content within an hour of being notified of its presence.
“While several platforms have been removing more illegal content than ever before – showing that self-regulation can work – we still need to react faster against terrorist propaganda and other illegal content which is a serious threat to our citizens’ security, safety and fundamental rights,” said Andrus Ansip, vice president for the digital single market, in a statement, per Mercury News.
Facebook says it removes 83 percent of terrorism-related content within an hour of upload. With a little more effort, the social media giant could improve on that figure; but what happens to the other 17 percent of content left unremoved within that window?
The good news is that progress is being made, and there is room for improvement. A Facebook spokesperson said, per Mercury News:
“As the latest figures show, we have already made good progress removing various forms of illegal content. We continue to work hard to remove hate speech and terrorist content while making sure that Facebook remains a platform for all ideas.”
Last summer, about 650,000 Rohingya Muslims fled into neighboring Bangladesh following a security crackdown. The situation has not improved since then, with multiple reports of state-led violence against the refugees.