From Online Hatred to Offline Violence: Facebook’s Impact in Myanmar

Over the past year, fake news has become an easy tool to incite violence between conflicting communities through hate-mongering posts on social media. As the Indian government and WhatsApp grapple with the platform's role in mob violence in India, Myanmar has found itself embroiled in a debate about Facebook's impact on its social fabric. Reports have revealed that Facebook posts targeting minorities in Myanmar have led to extrajudicial killings and other brutalities over the past few years.

Fake news triggers hatred in society because of our tendency to accept any piece of information as the absolute truth. "People easily believe content that is usually unbelievable or too good to be true," said Ponnurangam Kumaraguru, associate professor at the Indraprastha Institute of Information Technology, Delhi.

Online hate speech

In the past two years, several studies have analysed social media data to explain how fake news spreads via Facebook and other platforms.

In March 2018, Soroush Vosoughi, Deb Roy and Sinan Aral of the Massachusetts Institute of Technology (MIT) studied over 126,000 stories tweeted by about three million people more than 4.5 million times to understand the rise of fake news on social media. The veracity of the collected data was cross-checked against information from six independent fact-checking organisations. The study revealed that falsehoods diffused significantly farther, faster, deeper and more broadly than the truth in all categories of information.

“Gaining access to technology and to (social media) platforms which ensures virality of content is not that hard”, adds Kumaraguru.


Easy access to thousands of people, combined with our susceptibility to false information, can be toxic in a politically divisive social set-up.

In Myanmar, about 80% of the population is Buddhist and about 4% is Muslim. Among the Muslim population, reports show that the Rohingya are the most abused community, targeted by extremists and even the state. This has forced many to flee the region; as per the latest reports, there are about 900,000 Rohingya refugees.

Over the last few months, Facebook has been used to heighten the sense of intolerance against minorities in the region.

“Myanmar was a closed society until about 2010, thanks to the (military) Junta government. So, social media came pretty late to the Myanmar society. But since 2014, Facebook and WhatsApp have been used quite frequently for radicalisation,” says Aparupa Bhattacharjee, PhD scholar, school of conflict and security studies at National Institute of Advanced Studies (NIAS).

Violence against minorities 

Reports of the mass persecution of the Rohingya by Myanmar’s military personnel in August 2017 were followed by reports about misinformation being deliberately spread on social media to stoke pre-existing tensions and further escalate the violence.

On September 28, 2018, a UN-commissioned independent ‘Fact Finding Mission’ released a full account detailing instances of genocide, crimes against humanity and war crimes committed by the military in Myanmar. The report also highlighted how hate speech was disseminated through public pronouncements and social media tools like Facebook. It called for content moderation based on “international human rights law” and recommended more transparency from social media platforms. Finally, it concluded that before entering any new domain, especially one rife with ethnic and religious tensions, social media platforms should do in-depth human-rights impact assessments to take appropriate mitigating measures.



Similarly, an investigative report published by the New York Times on October 15, 2018, detailed how the Myanmar military used Facebook as a tool to amplify tensions between Myanmar's Buddhists and the Rohingya.

“The campaign included hundreds of military personnel who created troll accounts and news and celebrity pages on Facebook and then flooded them with incendiary comments and posts timed for peak viewership,” said the report.

Fake news is being used not just to target certain communities in Myanmar but also to further radicalise the Buddhists within the state.

“In a sense, the Buddhist Burmese community is coming together to counter the international community, which they perceive is targeting them using the Rohingya as an excuse,” says D. Suba Chandran, dean, school of conflict and security studies at NIAS. Referring to the facts brought to light by the New York Times report, Chandran says the use of troll accounts to spread fake news against the Rohingya could be seen as a strategic move by the state to counter the international narrative that portrays the Rohingya only as victims. “So, the intended audience for the fake news wasn’t necessarily the local population.”

Way ahead

After receiving flak from researchers across the world, Facebook announced that it had commissioned Business for Social Responsibility (BSR), an independent organisation with expertise in human rights practices and policies, to assess the website’s impact on human rights in Myanmar.

The BSR report examined the socio-cultural set-up in Myanmar, where a large population is quickly moving online amid a legal framework that hasn’t evolved enough to safeguard human rights and rein in ethnic tensions. It also highlighted that people’s rights in Myanmar with respect to security, privacy, freedom of expression, child rights, etc. are highly interrelated and interdependent, “with the improvement or deprivation of one right significantly affecting the others.”


The report cited instances where Facebook took action to curb further inflammations of ethnic and religious tensions saying, “In August 2018, Facebook removed 18 Facebook accounts, one Instagram account, and 52 Facebook Pages in Myanmar, and banned 20 individuals and organisations from Facebook, including Senior General Min Aung Hlaing, commander-in-chief of the armed forces, and the military’s Myawady television network.”

But it concluded that although Facebook has had “a powerful democratizing effect in Myanmar” by spreading awareness of concepts like democracy, human rights, accountability and civil society, it has also become “a useful platform for those seeking to incite violence and cause offline harm”. The report recommended a ‘systemwide change’ that Facebook could adopt in light of the political and economic set-up in Myanmar, which poses risks to human rights.

Politically complex scenarios exist almost everywhere today, and so the question of how much responsibility can be placed on Facebook needs to be assessed.

“It is a multi-party problem in the sense that it is a shared responsibility between the user, the platform provider like Facebook or Twitter, the internet service provider and in some cases governmental parties are also involved,” said Kumaraguru.

Rishika Pardikar is a freelance journalist in Bengaluru.

Feature image credit: Reuters