How Sri Lanka’s Laws Are Enabling Hate Speech and Disinformation Instead of Curbing It

(Photo: Nimal Skandhakumar via Flickr)

Dr. Sanjana Hattotuwa is the Research Director at The Disinformation Project and the founder and former editor of Groundviews, Sri Lanka’s first citizen journalism website. He has written about digital security, web activism, new media literacy, and online advocacy in Sri Lanka and beyond. 

In this interview, Dr. Hattotuwa shares his insights on the state of misinformation and disinformation in Sri Lanka, the role of social media platforms in the fight against disinformation and online violence, and how legal mechanisms created to counter hate speech have instead emboldened hate in Sri Lanka.

This conversation has been edited for length and clarity. 

Deeksha Udupa (DU): How would you describe the nature of misinformation and disinformation in Sri Lanka today? Who would you describe as the target group and how does it contribute to communal tensions in the country?

Sanjana Hattotuwa (SH): The question is an interesting one, because a clear definition of misinformation is not easy to pin down, in Sri Lanka or around the world. What we have seen after 2012 is the immeasurable growth of violent extremism — racism, xenophobia, Islamophobia in particular — using, adapting, and adopting social media.

In Sri Lanka, it’s inextricably entwined with the media architecture and what we call an information environment defined by implicit and explicit bigotry. So it’s not just what is promoted and published; it’s also what is silenced. For context, Sri Lanka also has some of the lowest data costs — subscription rates — in the world, and high-speed connectivity is highly saturated, so social media essentially reaches the vast majority of Sri Lankans. In that environment, during electoral moments and particular moments of unrest, like the 2019 Easter Sunday attacks, there was a clear weaponization of social media against Muslims. At times of offline violence, social media has been a defining feature. It’s been a defining feature for particular political parties, including the Rajapaksas, whose brand identity is racism and Sinhala Buddhist majoritarianism.

The final point is that Sri Lanka is also a test bed, what Maria Ressa, the Nobel Laureate, calls a “Petri dish for influence operations,” which brings us to the domain of Foreign Malign Influence Operations (FMIO). This has been very clearly present in Sri Lanka since the invasion of Crimea in 2014, and I’ve been studying it since then. Very few know about it.


DU: How do laws like the Online Safety Act and the International Covenant on Civil and Political Rights (ICCPR) Act — which was created to protect minorities from hate speech — embolden these purveyors of hate? 

SH: With hate speech, it’s complicated, because the laws that are in place have been severely manipulated. If you read the International Covenant on Civil and Political Rights (ICCPR) Act of 2007, it’s rather progressive. It’s not designed or written with the intent of incarcerating dissenters. Yet rather than being used against people who promote hateful content, the ICCPR Act has been abused for over 17 years to put poets, authors, writers, and activists in prison.

The Online Safety Act (OSA), which was passed this year, is one of the worst laws to have been passed in any parliament after 1948. In theory, these are laws created to address online hate, but in reality, they enable autocracy and authoritarian control. The OSA was presented as a law to protect women and children, but it really is a chaotic and disturbing assault on the freedom of expression and all fundamental rights. It’s also extraterritorial, so it applies to everyone on the planet, not just people in Sri Lanka. It has devastating consequences for Tamils and for the Tamil diaspora as well. Even in 2024, Tamils are heavily surveilled. Social media was the sole space that they had to talk about accountability for war crimes and dreams for a more peaceful, rights-based future — the Tamil national question. Yet now they cannot do that because of the OSA. There’s an asymmetric impact on minority communities and rights.

DU: Whose interests then are prioritized when the law is used as a way to censor dissent?

SH: It’s always been the interest of the ruling class and the ruling power, whether it’s a party or a political family. After 2005, we essentially had one family, the Rajapaksas, who have controlled Sri Lanka as their fiefdom. They have retained significant political authority, without actually being elected to power, through networks of nepotism and servitude. It’s always been in the service of Sinhala Buddhism. It’s always been in the service of structural racism. It’s always been in the service of some cisgender male. It’s been in the service of a particular family since 2005. It’s always been in the service of hegemonic control and political authority that became indistinguishable from autocracy and authoritarianism. It has always been in the service of gaining political power — absolute political power — and retaining it behind the facade of regular elections and liberal democracy and socialism. It has always been in the service of absolute power.

The executive presidency is an office that was problematic even at its inception and has only been strengthened multiple times since. In Sri Lanka, the executive president is immune from everything and can do whatever they want. They are, essentially, a God-like figure. The entire architecture of the state has been designed to capture and retain power. So the interpretation of laws like the ICCPR Act has resulted in the shutting down of anyone who is a threat to that absolute power.

DU: You wrote about how the Sri Lankan government blocked social media following the 2019 Easter Sunday terror attacks in an attempt to counter the spread of rumors and misinformation. What is the impact of these government-imposed social media blocks? How do you think they affect the spread of misinformation? 

SH: It exacerbated the problem. When the block happened, all of the pro-peace messaging and counter-speech addressing the violence was caught up in it as well. And at the end of the day, the block just didn’t work. This has consistently been the case in Sri Lanka: media blackouts often most affect the content that is attempting to hinder hate-fuelled harm and violence. That does not discount the fact that social media vectors were also carrying a lot of violent rhetoric that instigated harm and violence.

For example, after the Easter Sunday attacks, there was a lot of content going around about how members of the Sinhalese community had opened up their homes to Muslims, or how Christians had opened up their homes to Buddhists, and so on. All of that was organic, and it attempted to create social cohesion at a time when it was in tatters. Yet this, too, was severely impacted by the blocks. I have yet to find evidence that these social media blocks contributed to a discernible and sustainable reduction in offline violence. And of course, this holds beyond the Sri Lankan context; it applies to media blackouts around the world.

DU: You recently wrote about how the Sri Lankan Elections Commission (EC) imposed media guidelines, including guidelines for social media administrators, to counter misinformation before the 2024 elections. How would you describe the effectiveness of these guidelines? 

SH: It is clear that they are driven by the need to grapple with how integral social media is to elections and electoral moments. But in Sri Lanka, there is zero capacity within the commission for any semblance of understanding of what we are discussing right now. They’re talking about things that are outdated and outmoded. The problem in Sri Lanka is an inability, even within civil society, to understand the problem as it exists.

This presidential election was the first time that Sri Lanka technically had a law imposing a limit on election spending and requiring transparency around campaign finance. Political parties and politicians then had to declare what they had spent. The problem was that the cap was 2 billion rupees, which is in the millions of dollars. Just last week, it was revealed that not a single person has given the Election Commission an account of how that money was spent. So we have laws, but we don’t have a commission with teeth.

The Election Commission is part of the problem. It enables disinformation and influence operations. It enables the misappropriation of the Online Safety Act. There is nothing I have seen from the commission that addresses what we are actually dealing with today.

DU: Do you think that government intervention could solve the problem of misinformation, disinformation, and hate speech in the country? If so, what do you think are some of the most effective measures that governments can take?

SH: In countries like Canada, the United Kingdom, and Australia, legislation around hate speech and the instigation of online violence is emerging. None of them are perfect, but all of them are trying to grapple with a problem that is rapidly evolving. They are attempting to engage with a conversation that is quite important to have. India and Sri Lanka, on the other hand, are going in the opposite direction. There, legislation is helping no one beyond the respective leaders of those countries, allowing them to retain absolute control over their countries and their peoples.

I am always hugely suspicious and immediately skeptical of any government intervention. Yet at the same time, we are faced with the truth that oversight by social media platforms is non-existent at best and problematic at worst. It is in that context that I see a role for governments to play. The only issue is in countries with democratic deficits, a problem that goes beyond the Global South and includes the United States. In countries defined by growing democratic deficits, what do you do when the platforms are unfit for use? And when you have domestic regulatory and legal frameworks that are also unfit, you are left with a civil society under threat.

DU: Where does the role of social media platforms lie? Have they made any serious effort in Sri Lanka to address these problems? What other measures do you think they can take?

SH: It is quite evident that social media platforms don’t want to do anything about it. CrowdTangle — a tool used by researchers and journalists to monitor social media posts, notably to track the spread of misinformation — was shut down. Meta’s content libraries are a mess. Musk has eviscerated everything, so there is zero insight into what is happening on Twitter/X. TikTok seems to be struggling. All of the major platforms have severely clamped down on or completely decimated independent access by third-party researchers, especially Global South researchers. It is very clear that the platforms don’t want to give out the data. They don’t want the sunlight of scrutiny.

In just the last year or two, our ability to study what is going on has been decimated. The conversation would’ve been vastly different had we been talking in 2022 or 2023. We now have to rely on what we are told the companies are doing through their own annual reporting. But we don’t really know. The only way that we can find out is through research that is funded by these companies, but how do we check that data? They can’t even share the data sets, so we have to trust the researchers who have signed NDAs with the companies and thus are given data that nobody else has access to. 

What has become a broad global call is the need for access to data. I think the greatest possible thing that they can do is open up that data, which is not an original demand; it is something that almost every researcher has asked for. We just no longer know what is going on, which benefits those who abuse, instrumentalize, and weaponize platforms.

DU: What do you think it will take for social media platforms to give access to that data?

SH: Honestly, I don’t know. The leaders of the social media industry, whether it be Zuckerberg or Musk, don’t care. It will have to be a situation in which white people are affected on a mass scale. We begged and pleaded with Meta to keep CrowdTangle and they just killed it. 

On the other hand, it may be the Europeans and the Digital Services Act (DSA), which brings new requirements on transparency and accountability. I’m not saying it’s a desirable model, but domestic legislation that forces companies to articulate what they’ve done about disinformation and harm can be a way forward. Countries like New Zealand are trying different things to force platforms to reveal and release their information. Yet this is largely based on the assumption that domestic powers want to limit the spread of extremism, disinformation, and online violence, an assumption we know we cannot make.
