Centre for Internet & Society

Blocking websites on the Internet should be proportionate to the harm it intends to prevent. However, the government of India's approach violates the principles of natural justice.


Published in Tehelka on August 23, 2012.


When: Speech should be regulated when there is harm, or when there is clear and imminent harm. The extent of regulation must be in proportion to the harm.

The mass exodus of people from the Northeast from certain Indian cities is a clear indication of a ‘public order’ crisis. The government of India, for the very first time, has legitimate reasons for cracking down on intermediaries such as Google and Facebook and their users, unlike in the past, when only the egos of politicians, bureaucrats and others in public office or public life were at stake. In most cases, temporary restrictions on speech are sufficient to mitigate harm. When the potential for harm has dissipated, the restrictions should be lifted. While videos and images related to human rights violations against the Rohingya community might be sensitive material today, there is no reason why such content should be blocked forever, unlike, for example, in the case of child pornography.

How: Does this mean that the Internet rules that were notified in April last year were forward-looking policies justified in retrospect? No. When a block is implemented, or a takedown is complied with, three types of notices are required, either immediately or after the imminent harm has been prevented. First, the censored individuals or groups should be informed, so that they can seek redressal and reinstatement; second, those trying to consume the censored material must be warned; and third, the general public has a right to know, either immediately or in due course.

Even in authoritarian states like Saudi Arabia, visitors to blocked websites are given clear reasons why the website was blocked, along with contact details to seek redressal. There are also safe harbour provisions for intermediaries, meaning that intermediaries are absolved of liability in exchange for acting upon takedown orders sent by non-state actors. Suitable safeguards are required to prevent over-compliance by intermediaries, and the resulting chilling effect on free speech, as demonstrated by CIS's research. The intermediary liability rules under the Indian IT Act, 2008, have no such safeguards and therefore do not comply with the principles of natural justice.

Who: Block and takedown orders need to be very specific. The advisory note issued to Internet intermediaries by the Department of Electronics and Information Technology, Ministry of Communications & Information Technology, on 17 August did not mention details such as URLs, user accounts, group names and content identifiers. Most of the censored material, at first glance, appears to be communal in nature. Unfortunately, the list also contains several URLs from mainstream media publications, a few Wikipedia pages and at least two blog entries debunking rumours, perhaps because of oversight. Images of unrelated human rights violations featuring people with similar racial features are being used to fuel the current rumours. However, blocking all websites featuring such images will not stop such rumour mongering. Censorship must be targeted and proportionate to the potential harm.

Why: Speaking aloud just once in the analog world can result in either harm or good. Imagine shouting “bomb” in a crowded airport. The network effect of technologies such as SMS, social media and micro-blogging amplifies the impact of speech. Article 19(2) of the Constitution of India lists eight grounds on which reasonable restrictions may be applied to the right to free speech. This applies both to analog speech and to speech mediated via networked technologies. Some of these restrictions, such as 'public order' and 'incitement to discrimination, hostility or violence', are part of international treaties such as the International Covenant on Civil and Political Rights. Fringe phenomena and exceptional circumstances should not be the basis for formulating policy. For example, the use of knives as murder weapons does not necessitate regulations on cutlery. Similarly, criminalising rumour mongering will not prevent false information from going viral online and disrupting public order. Videos and photos are doctored and manipulated for a wide variety of legitimate reasons. The existing laws regulating speech in the interests of public order are sufficient to deal with the circulation of falsehoods on social media.

Sunil Abraham is the Executive Director of the Bangalore-based research organisation, the Centre for Internet and Society.