The Centre for Internet and Society
https://cis-india.org
Chilling Effects and Frozen Words
https://cis-india.org/internet-governance/chilling-effects-frozen-words
<b>What if the real danger is not that we lose our freedom of speech and expression but our sense of humour as a nation? Lawrence Liang's op-ed was published in the Hindu on April 30, 2012. </b>
<p>While freedom of speech and expression is an individual right, its actualisation often relies on a vast infrastructure of intermediaries.</p>
<p>In the offline world, this includes newspapers, television channels, public auditoriums, etc. It is often assumed that the internet has created a more robust public sphere of speech by doing away with many structural barriers to free speech. But the fact of the matter is that even if the internet enables a shift from a ‘few to many’ to a ‘many to many’ model of communication, intermediaries continue to remain important players in facilitating free speech. Can one imagine free speech on the internet being the same without Twitter, social networks or YouTube?</p>
<p>One way of thinking of the infrastructure of communication is in terms of ecology, and in the ecology of speech — as in the environment — an adverse impact on any component threatens the well-being of all. The idea of cyberspace as a commons is a much cherished myth and in the early days of the internet we were perhaps given a glimpse into its utopian possibility. But we would be deluding ourselves if we believed that the problems that plague free speech in the offline world (including ownership of the avenues of speech) are absent in cyberspace. Recall in recent times that one of the most effective ways in which various governments retaliated to the leaking of official secrets on WikiLeaks was by freezing Julian Assange's PayPal account.</p>
<h3>Direct & Indirect Controls</h3>
<p>It may be useful to distinguish between direct controls on free speech and indirect or structural controls on free speech. India has had a long history of battling direct and indirect controls on free speech and with a few exceptions the interests of the press have often coincided with the interests of a robust public sphere of debate and criticism.</p>
<p>In the late 1950s and early 1960s, a number of large media houses battled restrictions imposed on the press by way of control of the number of pages of a newspaper, regulation of the size of advertisements and the price of imported newsprint. On the face of it, some of these restrictions may have seemed like commercial disputes but the Supreme Court rightly recognised that indirect controls could adversely impact the individual's right to express himself or herself as well as to receive information freely.</p>
<p>In the online context, there has also been a similar recognition of the role of intermediaries in providing platforms of speech and it is with this view in mind that a number of countries have incorporated safe harbour provisions in their information technology laws.</p>
<p>Section 79 of the Information Technology Act is one such safe harbour provision in India which provides that intermediaries shall not be liable for any third party action if they are able to prove that the offence or contravention was committed without their knowledge or that they had exercised due diligence to prevent the commission of such offence or contravention. But this safe harbour has effectively been undone with the passing of the Information Technology (Intermediaries guidelines) Rules, 2011.</p>
<p>The rules clarify what standard of due diligence has to be met by intermediaries and Sec. 3(2) of the rules obliges intermediaries to have rules and conditions of usage which ensure that users do not host, display, upload, modify, publish, transmit, update or share any information that is in contravention of the Section. This includes the all too familiar ones (defamatory, obscene, pornographic content) but also a whole host of new categories which could be invoked to restrict speech (“grossly harmful,” “blasphemous,” “harassing,” “hateful”).</p>
<p>As is well known, any restriction on speech in India has to comply with both the test of reasonableness under Article 19(2) of the Constitution and the requirement that the grounds of censorship be located within 19(2). Even though there are laws regulating hate speech in India, blasphemy is not a category under Art. 19(2) and has hitherto not been a part of Indian law. Some of the other categories, such as “grossly harmful”, suggest that the people who drafted the rules took a constitutional nap at the drafting board.</p>
<p>Sec. 3(4) of the rules provides that any intermediary who receives a notice from an aggrieved person about any violation of sub-rule (2) will have to act within 36 hours and, where applicable, ensure that the information is disabled. In the event that it fails to act or to respond, the intermediary cannot claim exemption from liability under Sec. 79 of the IT Act. It is worth noting that most intermediaries receive hundreds to thousands of requests from individuals on a daily basis asking for the removal of objectionable material. The Centre for Internet and Society conducted a “sting operation” to determine whether the criteria, procedure and safeguards for the administration of takedowns as prescribed by the Rules lead to a chilling effect on free expression.</p>
<p>In the course of the study, frivolous takedown notices were sent to seven intermediaries and their responses to the notices were documented. Different policy factors were permuted in the takedown notices in order to understand at what points in the takedown process free expression is chilled. The takedown notices sent by the researcher were intentionally defective: they did not establish how the sender was an interested party, did not specifically identify and discuss any individual URL on the websites, and did not present any cause of action or suggest any legal injury. Of the seven intermediaries to which takedown notices were sent, six over-complied with the notices, despite the apparent flaws in them.</p>
<h3>Caution</h3>
<p>Even in cases where the intermediaries challenged the validity of the takedowns, they erred on the side of caution and took down the material. While a number of intermediaries would see themselves as allies in the fight against censorship, more often than not intermediaries are also large commercial organisations whose primary concern is the protection of their business interests. In the face of any potential legal threat, especially from the government, they prefer to err on the side of caution. The people whose content was removed were not told, nor was the general public informed that the content was removed.</p>
<p>The procedural flaws (subjective determination, absence of the right to be heard, the short response time), coupled with the vague grounds on which such takedowns can be claimed, clearly point to a highly flawed situation in which we will see many more trigger-happy demands for offending materials to be taken down.</p>
<p>We have already slipped into being a republic of oversensitivity, where any politician, religious group or individual can claim that their sentiments have been hurt or that they have been portrayed disparagingly, as evidenced by the recent attack on and subsequent arrest of Professor Ambikesh Mahapatra of Jadavpur University for posting cartoons lampooning Mamata Banerjee.</p>
<h3>Nervous State</h3>
<p>In the era of global outsourcing it was inevitable that the state censorship machinery would also learn a lesson or two from the global trends and what better way of ensuring censorship than outsourcing it to individuals and to corporations. The renowned anthropologist, Michael Taussig, once compared the state to a nervous system and it seems that the Intermediary rules live up to the expectations of a nervous state ever ready to respond to criticism and disparaging cartoons.</p>
<p>What if the real danger is not even that we lose our freedom of speech and expression but we lose our sense of humour as a nation?</p>
<p>The evident flaws of the rules have been acknowledged even by lawmakers, with P. Rajeeve, the CPI(M) M.P., introducing a motion for the annulment of the rules. The annulment motion is going to be debated in the coming weeks and one hopes that the parliamentarians will seriously reconsider the rules in their current form.</p>
<p>When faced with conundrums of the present it is always useful to turn to history and there is reason to believe that while censorship has a very respectable genealogy in Indian thought, it has also been accompanied in equal measure by a tradition of the right to offend.</p>
<p>In his delightful reading of the <em>Arthashastra</em>, Sibaji Bandyopadhyay alerts us to the myriad restrictions that existed to control Kusilavas (the term for entertainers, which included actors, dancers, singers, storytellers, minstrels and clowns). These regulations ranged from the regulation of their movement during the monsoon to prohibitions ensuring that they shall not “praise anyone excessively nor receive excessive presents”. While some of the regulations appear harsh and unwarranted, Bandyopadhyay says that in contrast to Plato's <em>Republic</em>, which banished poets altogether from the ideal republic, the <em>Arthashastra</em> goes so far as to grant to Kusilavas what we could now call the right to offend. Verse 4.1.61 of the <em>Arthashastra</em> says, “In their performances, [the entertainers] may, if they so wish, make fun of the customs of regions, castes or families and the practices or love affairs (of individuals)”. One hopes that our lawmakers, even if they are averse to reading the Indian Constitution, will be slightly more open to the poetic licence granted by Kautilya.</p>
<p><a class="external-link" href="http://www.thehindu.com/opinion/lead/article3367917.ece?homepage=true">Click</a> for the original published in the Hindu on April 30, 2012. Lawrence Liang is a lawyer and researcher based at Alternative Law Forum, Bangalore. He can be contacted at <a class="external-link" href="mailto:lawrence@altlawforum.org">lawrence@altlawforum.org</a></p>
Lawrence Liang | Freedom of Speech and Expression | Public Accountability | Internet Governance | Intermediary Liability | Censorship | Blog Entry | 2012-04-30

You Have the Right to Remain Silent
https://cis-india.org/internet-governance/blog/down-to-earth-july-17-2013-nishant-shah-you-have-the-right-to-remain-silent
<b>Reflecting upon the state of freedom of speech and expression in India, in the wake of the shut-down of the political satire website narendramodiplans.com.</b>
<hr />
<p style="text-align: justify; ">Nishant Shah's <a class="external-link" href="http://www.downtoearth.org.in/content/you-have-right-remain-silent">column was published in Down to Earth</a> on July 17, 2013.</p>
<hr />
<p style="text-align: justify; ">It took less than a day for narendramodiplans.com, a political satire website that drew more than 60,000 hits in the 20 hours of its existence, to be taken down. It was a simple webpage that showed a smiling picture of Narendra Modi, the touted candidate in India’s next prime ministerial campaign, flashing his now trademark ‘V’ for <span><s>Vengeance</s> </span> Victory sign. At first glimpse it looked like another smart media campaign by the net-savvy minister, who has already made use of the social web quite effectively to connect with his constituencies and influence the younger voting population in the country. Below the image of Mr. Modi was text that said, "For a detailed explanation of how Mr. Narendra Modi plans to run the nation if elected to the house as a Prime Minister and also for his view/perspective on 2002 riots please click the link below." The button, reminiscent of 'sale' signs on shops that offer permanent discounts, promised to reveal, once and for all, the puppy plight of Mr. Modi's politics and his plans for the country that he seeks to lead.</p>
<p style="text-align: justify; ">However, when one tried to click on the button, hoping at least for a manifesto that combined the powers of Machiavelli with the sinister beauty of Kafka, it proved an impossible task. The button wiggled, jiggled, and slithered all over the page, running away from the mouse that followed it. Referencing the layers of evasive answers and the engineered public relations campaigns that try to obfuscate responses to some of the most pointed questions posed to the Modi government through judicial and public forums, the button never stayed still long enough to actually reveal the promised answers. People familiar with the history of such political satire and protest online would immediately recognise that this wasn’t the most original of ideas. In fact, it was borrowed from another website, <a href="http://www.thepmlnvision.com/" title="http://www.thepmlnvision.com/">http://www.thepmlnvision.com/</a>, which levelled similar accusations of a lack of transparency and accountability on the part of Nawaz Sharif of Pakistan. Another instance, which has since also been shut down, had a similar deployment: a webpage that claimed to give a comprehensive view of Rahul Gandhi’s achievements, in order to question his proclaimed intentions of being the next prime minister. In short, this is an internet meme, in which a simple web page and a little JavaScript allow for critical commentary on the coming elections and the strengthening battle between #feku and #pappu that has already taken on epic proportions on Twitter.</p>
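The evasive button is a trivially simple trick to build. As an illustration only (hypothetical code, not the actual site's script), the core is a handler that, on every mouse move, pushes the button directly away from the pointer; the positioning logic is factored into a pure function here so it can be checked without a browser:

```javascript
// Pure positioning logic for the "evasive button" meme: push the button
// directly away from the cursor by a fixed step, clamped to the page bounds.
// (All names here are illustrative; this is a sketch, not the site's code.)
function fleeFrom(cursor, button, bounds, step = 120) {
  const dx = button.x - cursor.x;
  const dy = button.y - cursor.y;
  const dist = Math.hypot(dx, dy) || 1; // avoid division by zero when overlapping
  const clamp = (v, lo, hi) => Math.min(Math.max(v, lo), hi);
  return {
    x: clamp(button.x + (dx / dist) * step, 0, bounds.width),
    y: clamp(button.y + (dy / dist) * step, 0, bounds.height),
  };
}

// In a browser this would be wired up roughly like so (element id hypothetical):
//
//   const btn = document.getElementById('reveal-plans');
//   document.addEventListener('mousemove', (e) => {
//     const next = fleeFrom(
//       { x: e.clientX, y: e.clientY },
//       { x: btn.offsetLeft, y: btn.offsetTop },
//       { width: window.innerWidth - 100, height: window.innerHeight - 40 }
//     );
//     btn.style.position = 'absolute';
//     btn.style.left = next.x + 'px';
//     btn.style.top = next.y + 'px';
//   });
```

Because the displacement always points away from the cursor and is clamped to the viewport, the button stays on screen yet can never be caught by the pointer, which is the whole joke.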
<p style="text-align: justify; ">The early demise of these websites (please do note, when you click on the links, that the Nawaz Sharif website is still working) warns us of the noose that politicos in India are tightening around freedom of speech and expression. It has been a dreary couple of years already, with the passing of the <a href="http://www.downtoearth.org.in/content/cis-india.org/internet-governance/intermediary-liability-in-india" target="_blank">Intermediaries Liabilities Rules</a> as an amendment to the IT Act of India, <a href="http://www.indianexpress.com/news/spy-in-the-web/888509/1" target="_blank">Dr. Sibal proposing to pre-censor the social web</a> in a quest to save the face of erring political figures, <a href="http://www.indianexpress.com/news/two-girls-arrested-for-facebook-post-questioning-bal-thackeray-shutdown-of-mumbai-get-bail/1033177/" target="_blank">teenagers being arrested for voicing political dissent</a>, and <a href="http://en.wikipedia.org/wiki/Aseem_Trivedi" target="_blank">artists being prosecuted</a> for exercising their right to question the state of governance in our country. Despite battles to keep the web an open space that embodies the democratic potential and the constitutional rights of freedom of speech and expression in the country, it has been a losing fight to keep up with the ad hoc and dictatorial mandates that seem to govern the web.</p>
<table class="invisible">
<tbody>
<tr>
<th><img src="https://cis-india.org/home-images/Namo.png" alt="Narendra Modi Plans" class="image-inline" title="Narendra Modi Plans" /></th>
</tr>
<tr>
<td>Above is a screenshot from the narendramodiplans.com website</td>
</tr>
</tbody>
</table>
<p style="text-align: justify; ">We have no indication of why this latest piece of satirical expression, which should be granted immunity as a work of art, if not as an individual’s right to free speech, was suddenly taken down. The website now has a message that says, “I quit. In a country with freedom of speech, I assumed that I was allowed to make decent satire on any politician more particularly if it is constructive. Clearly, I was wrong.” The web is already abuzz with conspiracy theories, each sounding scarier than the last because they seem so plausible and possible in a country that has easily sacrificed our right to free speech and expression at the altar of political egos. And whether you subscribe to any of the theories or not, whether your sympathies lie with the BJP or with the UPA, whether or not you approve of the political direction the country seems to be headed in, there is no doubt that you should be as agitated as I am about the fact that we are in a fast car to blanket censorship, and we are going there in style.</p>
<p style="text-align: justify; ">What happens online is not just about this one website or the one person or the one political party – it is a reflection on the rising surveillance and bully state that presumes that making voices (and sometimes people) invisible, is enough to resolve the problems that they create. And what happens on the web is soon going to also affect the ways in which we live our everyday lives. So the next time, you call some friends over for dinner, and then sit arguing about the state of politics in the country, make sure your windows are all shut, you are wearing tin-foil hats and if possible, direct all conversations to the task of finally <a href="http://bollywoodjournalist.com/2013/07/08/desperately-seeking-mamta-kulkarni/" target="_blank">finding Mamta Kulkarni</a>. Because anything else that you say might either be censored or land you in a soup, and the only recourse you might have would be a website that shows the glorious political figures of the country, with a sign that says “To defend your right to free speech and expression, please click here”. And you know that you are never going to be able to click on that sign. Ever.</p>
Nishant Shah | Freedom of Speech and Expression | Social Media | Internet Governance | Intermediary Liability | Blog Entry | 2013-07-22

Role of Intermediaries in Countering Online Abuse
https://cis-india.org/internet-governance/blog/role-of-intermediaries-in-counting-online-abuse
<b>The Internet can be a hostile space and protecting users from abuse without curtailing freedom of expression requires a balancing act on the part of online intermediaries.</b>
<p style="text-align: justify; ">This was published as two blog entries on the NALSAR Law Tech Blog. Part 1 can be accessed <a class="external-link" href="https://techlawforum.wordpress.com/2015/06/30/role-of-intermediaries-in-countering-online-abuse-still-a-work-in-progress-part-i/">here</a> and Part 2 <a class="external-link" href="https://techlawforum.wordpress.com/2015/06/30/role-of-intermediaries-in-countering-online-abuse-still-a-work-in-progress-part-ii/">here</a>.</p>
<hr />
<p style="text-align: justify; ">As platforms and services coalesce around user-generated content (UGC) and entrench themselves in the digital publishing universe, they are increasingly taking on the duties and responsibilities of protecting rights, including taking reasonable measures to restrict unlawful speech. Arguments around the role of intermediaries in tackling unlawful content usually centre on the issue of regulation—when is it feasible to regulate speech and how best should this regulation be enforced?</p>
<p class="Standard" style="text-align: justify; ">Recently, Twitter found itself at the periphery of such questions when an anonymous user of the platform, @LutyensInsider, began posting slanderous and sexually explicit comments about Swati Chaturvedi, a Delhi-based journalist. The online spat, which began in February last year, culminated in <a href="http://www.dailyo.in/politics/twitter-trolls-swati-chaturvedi-lutyensinsider-presstitutes-bazaru-media-delhi-police/story/1/4300.html">Swati filing an FIR</a> against the anonymous user last week. Within hours of the FIR, the anonymous user deleted the tweets and went silent. Predictably, Twitter users <a href="https://twitter.com/bainjal/status/609343547796426752">hailed this</a> as a much-needed deterrent to online harassment. Swati’s personal victory is worth celebrating; it is an encouragement for the many women bullied daily on the Internet, where harassment is rampant. However, while Swati might be well within her legal rights to counter slander, the rights and liabilities of private companies in such circumstances are often not as clear cut.</p>
<p class="Standard" style="text-align: justify; ">Should platforms like Twitter take on the mantle of deciding what speech is permissible? When and how should the limits on speech be drawn? Does this amount to private censorship? The answers are not easy and, as the recent Grand Chamber of the European Court of Human Rights (ECtHR) <a href="http://hudoc.echr.coe.int/sites/eng/pages/search.aspx?i=001-126635">judgment in the case of</a> Delfi AS v. Estonia confirms, the role of UGC platforms in balancing user rights is an issue far from settled. In its ruling, the ECtHR reasoned that, because of their role in facilitating expression, requiring online platforms “<i>to take effective measures to limit the dissemination of hate speech and speech inciting violence</i>” was not ‘private censorship’.</p>
<p class="Standard" style="text-align: justify; ">This is problematic because the decision moves the regime away from a framework that grants immunity from liability as long as platforms meet certain criteria and procedures. In <a href="http://www.jipitec.eu/issues/jipitec-5-3-2014/4091">other words</a>, the ruling establishes strict liability for intermediaries in relation to manifestly illegal content, even if they have no knowledge of it. The 'obligation' placed on the intermediary does not grant safe harbour and is not proportionate to the monitoring and blocking capacity thus necessitated. Consequently, platforms might be incentivised to err on the side of caution and restrict comments or confine speech, resulting in censorship. The ruling is especially worrying as the standard of care placed on the intermediary does not recognise the different roles played by intermediaries in the detection and removal of unlawful content. Further, intermediary liability is its own legal regime and, at the same time, a subset of various legal issues that need an understanding of variations in scenarios, mediums and technology, both globally and in India.</p>
<h3 class="Standard">Law and Short of IT</h3>
<p class="Standard" style="text-align: justify; ">Earlier this year, in a <a href="http://www.theverge.com/2015/2/4/7982099/twitter-ceo-sent-memo-taking-personal-responsibility-for-the">leaked memo</a>, the Twitter CEO Dick Costolo took personal responsibility for his platform's chronic problem and failure to deal with harassment and abuse. In Swati's case, Twitter did not intervene or take steps to address the harassment. If it had, Twitter (India), like all online intermediaries, would be bound by the provisions established under Section 79 and the accompanying Rules of the Information Technology Act. These provisions outline the obligations and conditions that intermediaries must fulfil to claim immunity from liability for third-party content. Under the regime, upon receiving actual knowledge of unlawful information on their platform, the intermediary must comply with the notice and takedown (NTD) procedure for blocking and removal of content.</p>
<p class="Standard" style="text-align: justify; ">Private complainants could invoke the NTD procedure, forcing intermediaries to act as adjudicators of an unlawful act—a role they are clearly ill-equipped to perform, especially when the content relates to political speech or alleged defamation or obscenity. The SC judgment in Shreya Singhal, addressing this issue, read down the provision (Section 79(3)(b)) by holding that a takedown notice can only be effected if the complainant secures a court order to support her allegation. Further, it was held that the scope of restrictions under the mechanism is limited to the specific categories identified under Article 19(2). Effectively, this means Twitter need not take down content in the absence of a court order.</p>
<h3 class="Standard">Content Policy as Due Diligence</h3>
<p class="Standard" style="text-align: justify; ">Another provision, Rule 3(2), prescribes a content policy which, prior to the Shreya Singhal judgment, was a criterion for administering takedowns. This content policy includes an exhaustive list of types of restricted expression, though worryingly, the terms included in it are not clearly defined and go beyond the reasonable restrictions envisioned under Article 19(2). Terms such as “grossly harmful”, “objectionable”, “harassing”, “disparaging” and “hateful” are not defined anywhere in the Rules, and are subjective and contestable as alternate interpretations and standards could be offered for the same term. Further, this content policy is not applicable to content created by the intermediary.</p>
<p class="Standard" style="text-align: justify; ">Prior to the SC verdict in Shreya Singhal, <a href="http://cis-india.org/internet-governance/blog/sc-judgment-in-shreya-singhal-what-it-means-for-intermediary-liability">actual knowledge could have been interpreted</a> to mean that the intermediary is called upon to exercise its own judgement under sub-rule (4) to restrict impugned content in order to seek exemption from liability. While the liability that accrued from not complying with takedown requests under the content policy was clear, this is not the case anymore. By reading down S. 79(3)(b), the court has placed limits on the private censorship of intermediaries and the invisible censorship of opaque government takedown requests, both of which must now adhere to the boundaries set by Article 19(2). Following the SC judgment, intermediaries do not have to administer takedowns without a court order, thereby rendering this content policy redundant. As it stands, the content policy is an obligation that intermediaries must fulfil in order to be exempted from liability for UGC, and this due diligence is limited to publishing rules and regulations, terms and conditions or a user agreement informing users of the restrictions on content. The penalties for not publishing this content policy remain to be clarified.</p>
<p class="Standard" style="text-align: justify; ">Further, having been informed of what is permissible, users agree to comply with the policy outlined by signing up to and using these platforms and services. The requirement of publishing a content policy as due diligence is unnecessary, given that mandating such ‘standard’ terms of use negates the difference between types of intermediaries, which accrue different kinds of liability. It also places an extraordinary power of censorship in the hands of the intermediary, which could easily stifle freedom of speech online. Such heavy-handed regulation could make it impossible to publish critical views about anything without the risk of being summarily censored.</p>
<p class="Standard">Twitter may have complied with its duties by publishing the content policy, though the obligation does not seem to be an effective deterrent. Strong safe harbour provisions for intermediaries are a crucial element in the promotion and protection of the right to freedom of expression online. Absolving platforms of responsibility for UGC as long as they publish a content policy that is vague and subjective is the very reason why India’s IT Rules are, in fact, in urgent need of improvement.</p>
<h3 class="Standard">Size Matters</h3>
<p class="Standard" style="text-align: justify; ">The standards for blocking, reporting and responding to abuse vary across different categories of platforms. For example, it may be easier to counter trolls and abuse on blogs or forums where the owner or an administrator is monitoring comments and UGC. Usually platforms outline monitoring and reporting policies and procedures, including the recourse available to victims and the action to be taken against violators. However, these measures are not always effective in curbing abuse, as it is possible for users to create new accounts under different usernames. For example, in Swati’s case the anonymous user behind the @LutyensInsider account changed <a href="http://www.hindustantimes.com/newdelhi/twitter-troll-lutyensinsider-changes-handle-after-delhi-journo-files-fir/article1-1357281.aspx">their handle</a> to @gregoryzackim and @gzackim before deleting all tweets. In this case, perhaps the fear of criminal charges ahead was enough to silence the anonymous user, which may not always be the case.</p>
<h3 class="Standard">Tackling the Trolls</h3>
<p class="Standard" style="text-align: justify; ">Most large intermediaries have privacy settings which restrict the audience for user posts and prevent strangers from contacting them, as a general measure against online harassment. Platforms also publish <a href="http://www.slate.com/articles/technology/bitwise/2015/04/twitter_s_new_abuse_policy_if_it_can_t_stop_it_hide_it.html">monitoring policies</a> outlining the procedures and mechanisms for users to <a href="http://www.slate.com/articles/technology/users/2015/04/twitter_s_new_harassment_policy_not_transparent_not_engaged_with_users.html">register their complaints</a> or <a href="https://blog.twitter.com/2015/update-on-user-safety-features">report abuse</a>. Often reporting and blocking mechanisms <a href="https://blog.twitter.com/2015/update-on-user-safety-features">rely on community standards</a> and on users reporting unlawful content. Last week Twitter <a href="https://twittercommunity.com/t/removing-the-140-character-limit-from-direct-messages/41348">announced a new feature</a> allowing lists of blocked users to be shared between users. The feature, an improvement on the existing blocking mechanism, is aimed at making the service safer for people facing similar issues; still, like standard policies defining permissible limits on content, such efforts have their limitations.</p>
<p class="Standard" style="text-align: justify; ">These mechanisms follow a one-size-fits-all policy, and such community-driven efforts do not address concerns of differences in opinion and subjectivity. Swati, in defending her actions, stressed the “<i>coarse discourse</i>” prevalent on social media, though as <a href="http://www.opindia.com/2015/06/foul-mouthed-twitter-user-files-fir-against-loud-mouthed-slanderer/">this article points out</a> she might herself be accused of using offensive and abusive language. Subjectivity and the many interpretations of the same opinion can pave the way for many taking offence online. Earlier this month, Nikhil Wagle’s tweets criticising Prime Minister Narendra Modi as a “pervert” were interpreted as “abusive”, “offensive” and “spreading religious disharmony”. While platforms are within their rights to establish policies for dealing with issues faced by users, there is a real danger of them doing so for <a href="http://www.slate.com/articles/technology/users/2015/05/chuck_c_johnson_suspended_from_twitter_why.2.html">“political reasons” and based on “popularity” measures</a>, which may chill free speech. When many get behind a particular interpretation of an opinion, lawful speech may also be stifled, as Sreemoyee Kundu <a href="http://www.dailyo.in/user/124/sreemoyeekundu">found out</a>: a victim of online abuse, she had her account blocked by Facebook owing to multiple reports from a “<i>faceless fanatical mob</i>”. Allowing users to set standards of permissible speech is an improvement, though it runs the risk of mob justice, and platforms need to be vigilant in applying such standards.</p>
<p class="Standard" style="text-align: justify; ">While it may be in the interest of platforms to keep a hands-off approach to community policies, certain kinds of content may necessitate intervention by the intermediary. Private companies have increasingly modified their content policies to place reasonable restrictions on certain hateful behaviour in order to protect vulnerable or marginalised voices. <a href="http://www.theguardian.com/technology/2015/mar/12/twitter-bans-revenge-porn-in-user-policy-sharpening">Twitter's</a> and <a href="http://www.redditblog.com/2015/05/promote-ideas-protect-people.html">Reddit's</a> policy changes addressing revenge porn reflect a growing understanding amongst stakeholders that in order to promote the free expression of ideas, recognition and protection of certain rights on the Internet may be necessary. However, any approach to regulating user content must assess the effect of policy decisions on user rights. Google's <a href="http://www.theguardian.com/technology/2015/jun/22/revenge-porn-women-free-speech-abuse">stand on tackling revenge porn</a> may be laudable, though its <a href="https://www.techdirt.com/articles/20141109/06211929087/googles-efforts-to-push-down-piracy-sites-may-lead-more-people-to-malware.shtml">decision to push down</a> 'piracy' sites in its search results could be seen as adversely impacting user choice. Terms of service implemented with subjectivity and a lack of transparency can and do lead to private censorship.</p>
<h3 class="Standard">The Way Forward</h3>
<p class="Standard" style="text-align: justify; ">Harassment is damaging because of the feeling of powerlessness it invokes in victims, and online intermediaries represent new forms of power through which users negotiate and manage their online identities. Content restriction policies and practices must address this power imbalance by adopting baseline safeguards and best practices. It is only fair, on principles of equality and justice, that intermediaries be held responsible for damage caused to users by the wrongdoing of other users, or when intermediaries fail to carry out their operations and services as prescribed by law. However, in its present state, the intermediary liability regime in India is not sufficient to deal with online harassment and needs to evolve into a more nuanced form of governance.</p>
<p class="Standard" style="text-align: justify; ">Any liability framework must evolve bearing in mind the slippery slope of overbroad regulation and differing standards of community responsibility. A balanced framework would therefore need to include elements of both targeted regulation and softer forms of governance, since liability regimes must balance fundamental human rights against the interests of private companies. Achieving this balance is often problematic, given that these companies are expected to act as adjudicators and may themselves be the target of the breach of rights, as in <i>Delfi v. Estonia</i>. Global frameworks such as the Manila Principles can be a way forward in developing effective mechanisms. Content restriction practices should always adopt the least restrictive means available, distinguishing between classes of intermediary. They must evolve considering the proportionality of the harm, the nature of the content and the impact on affected users, including the proximity of the affected party to the content uploader.</p>
<p class="Standard" style="text-align: justify; ">Further, intermediaries and governments should communicate a clear mechanism for the review and appeal of restriction decisions. Content restriction policies should incorporate an effective right to be heard. In exceptional circumstances where this is not possible, a post facto review of the restriction order and its implementation must take place as soon as practicable. Unlawful content restricted for a limited duration or within a specific geography must not be restricted beyond those limits, and a periodic review should ensure that the restriction remains valid. Regular, systematic review of the rules and guidelines governing intermediary liability will go a long way towards ensuring that such frameworks are not overly burdensome and remain effective.</p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/role-of-intermediaries-in-counting-online-abuse'>https://cis-india.org/internet-governance/blog/role-of-intermediaries-in-counting-online-abuse</a>
</p>
By jyoti. Published 2015-08-02.
UN Special Rapporteur Report on Freedom of Expression and the Private Sector: A Significant Step Forward
https://cis-india.org/internet-governance/un-special-rapporteur-report-on-freedom-of-expression-and-the-private-sector-a-significant-step-forward
<b>On 6 June 2016, the UN Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, David Kaye, released a report on the Information and Communications Technology (“ICT”) sector and freedom of expression in the digital age. Vidushi Marda and Pranesh Prakash highlight the most important aspects of the report.</b>
<h2 dir="ltr">Background</h2>
<p dir="ltr">Today, the private sector is more closely linked to freedom of expression than ever before. The ability to speak to a mass audience was at one time a privilege restricted to those who had access to mass media. With digital technologies, however, that privilege is available to far more people than was ever possible in the pre-digital era. As private content created on these digital networks becomes increasingly subject to state regulation, it is crucial to examine the role of the private sector in respect of freedom of speech and expression.</p>
<p dir="ltr">The first foray by the Special Rapporteur into this broad area has resulted in a sweeping report that covers almost every aspect of freedom of expression within the ICT sector, except competition, which we elaborate on later in this post.</p>
<h2 dir="ltr">Introduction</h2>
<p dir="ltr">The report aims to “provide guidance on how private actors should protect and promote freedom of expression in a digital age”. It identifies the relevant international legal framework as Article 19 of the <a href="https://treaties.un.org/doc/Publication/UNTS/Volume%20999/volume-999-I-14668-English.pdf">International Covenant on Civil and Political Rights</a>, and Article 19 of the <a href="http://www.un.org/en/udhrbook/pdf/udhr_booklet_en_web.pdf">Universal Declaration of Human Rights</a>. The UN “Protect, Respect and Remedy” Framework and Guiding Principles, also known as the <a href="http://business-humanrights.org/sites/default/files/reports-and-materials/Ruggie-report-7-Apr-2008.pdf">Ruggie Principles</a> provide the framework for private sector responsibilities on business and human rights.</p>
<p dir="ltr">The report categorises different roles of the private sector in organising, accessing, regulating and populating the internet. This is important because the manner in which the ICT sector affects the freedom of expression is far more complicated than that of traditional communication industries. The report identifies the distinct impact of internet service providers, hardware and software companies, domain name registries and registrars, search engines, platforms, web hosting services, data brokers and e-commerce facilities on the freedom of expression.</p>
<h2>Legal and Policy Issues</h2>
<div>The Special Rapporteur discusses four distinct legal and policy issues relevant to this problem statement: Content Regulation, Surveillance and Digital Security, Transparency, and Remedies.</div>
<div> </div>
<h3>Content Regulation</h3>
<p dir="ltr">The report identifies two main channels through which content regulation takes place: the state, and internal processes.</p>
<p>Noting that digital content created on private networks is increasingly subject to State regulation, the report highlights the competing interests of intermediaries who manage platforms and of States which demand regulation of this content on grounds of defamation, blasphemy, protection of national security, etc. This tension is demonstrated through vague laws that compel individuals and private corporations to over-comply and err on the side of caution “in order to avoid onerous penalties, filtering content of uncertain legal status and engaging in other modes of censorship and self-censorship.” Excessive intermediary liability forces intermediaries to over-comply with requests in order to ensure that local access to their platforms is not blocked. States also attempt to regulate content outside the law through extra-legal restrictions, pushing private actors to take down content on their own initiative. Filtering is another method, wherein States block and filter content through the private sector; government blacklists, designations of illegal content and account suspensions are among the methods employed, and these have at times raised concerns of necessity and proportionality. <a href="http://scroll.in/article/807277/whatsapp-in-kashmir-when-big-brother-wants-to-go-beyond-watching-you">Network or service shutdowns</a> are classified as a “particularly pernicious” method of content regulation. Non-neutral networks are a further method of content regulation, with internet service providers able to throttle traffic. Zero rating is a potential issue, although the report acknowledges that “it remains a subject of debate whether they may be permissible in areas genuinely lacking Internet access”.</p>
<p>The other node of content regulation is the internal policies and practices of the private sector. <a href="https://consentofthenetworked.com/author/rebeccamackinnon/">Terms of service</a> restrictions are often tailored to a jurisdiction’s laws and policies and do not always address the needs and interests of vulnerable groups. Further, the report notes, the <a href="http://www.catchnews.com/tech-news/facebook-free-basics-gatekeeping-powers-extend-to-manipulating-public-discourse-1452077063.html">design and engineering choices</a> through which private players curate content are algorithmically determined and increasingly control the information that we consume.</p>
<h3>Transparency</h3>
<div>The report notes that transparency enables entities subject to internet regulation to take informed decisions about their responsibilities and liabilities in the digital sphere, and points out that there is a severe lack of transparency about government requests to restrict or remove content. Some states, India among them, even prohibit the publication of such information. As for the private sector, content hosting platforms sometimes reveal the circumstances under which content is removed pursuant to a government request, although such disclosure is erratic. The report recognises the need to balance transparency against competing concerns such as security and trade secrecy, a matter of continued debate.</div>
<div> </div>
<h3 dir="ltr">Surveillance and Digital Security</h3>
<p>Freedom of expression concerns arise as data transmitted on private networks is increasingly subjected to surveillance and interference by the State and private actors. The report finds that several internet companies have reported an increase in government requests for customer data and user information. According to the Special Rapporteur, effective resistance strategies include incorporating human rights guarantees into negotiations and interpreting government requests restrictively. Private players also make surveillance and censorship equipment that enables States to intercept communications. Covert surveillance has been reported previously, with States tapping into communications as and when they see fit. When private entities become aware of interception and covert surveillance, their human rights responsibilities arise. As private entities work towards enhancing encryption, anonymity and user security, states respond by <a href="http://www.cnbc.com/2016/03/29/apple-vs-fbi-all-you-need-to-know.html">compelling companies</a> to create loopholes that circumvent such privacy- and security-enhancing technology.</p>
<h3 dir="ltr">Remedies</h3>
<p>Unlawful content removals, opaque suspensions and data security breaches are commonplace in the digital sphere. The ICCPR guarantees that all persons whose rights have been violated must have an effective remedy, and the Ruggie Principles similarly require corporations to provide remedial and grievance mechanisms. There is some ambiguity about how these complaint or appeal mechanisms should be designed and implemented, and their nature and structure also remain unclear. The report states that it is necessary to investigate the role of the state in supplementing or regulating corporate mechanisms, in ensuring that a mechanism for remedies exists, and in making sure that more easily and financially accessible alternatives for remedial measures are available.</p>
<h2>Special Rapporteur’s Priorities for Future Work and Thematic Developments</h2>
<ol><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">Investigating laws, policies and extralegal measures that equip governments to impose restrictions on the provision of telecommunications and internet services. Examining the responsibility of companies to respond in a way that respects human rights, mitigates harm, and provides avenues for redress.</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">Evaluating content restrictions under terms of service and community standards. Private actors face substantial pressure from governments and individuals to restrict expression, and a priority is to evaluate the interplay of private and state actions on freedom of expression in light of human rights obligations and responsibilities.</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">Focusing on the legitimacy of rationales for intermediary liability for content hosting, restrictions, conditions for removing third party content.</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">Exploring censorship and surveillance within the human rights framework, and encouraging greater scrutiny before using these technologies for purposes that undermine the freedom of expression.</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">Identifying ways to balance an increasing scope of freedom of expression with the need to address governmental interests in national security and public order.</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">Internet access - Future work will explore issues around access and private sector engagement and investment in ensuring affordability and accessibility, particularly considering marginalized groups.</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">Internet governance - Ensuring that internet governance frameworks and reform efforts are sensitive to the needs of women, sexual minorities and other vulnerable communities. Throughout this future work, the Special Rapporteur will pay particular attention to legal developments (legislative, regulatory, and judicial) at national and regional levels.</p>
</li></ol>
<div> </div>
<h2>Conclusions and Recommendations</h2>
<ol><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">States: The report recommends that states should not pressurise the private sector into interfering with the freedom of speech and expression in a manner that fails the tests of necessity and proportionality. Any request to take down content or access customer information must be based on validly enacted law, be subject to oversight, and be demonstrated to be a necessary and proportionate means of achieving the aims laid down in Article 19(3) of the ICCPR.</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">Private Actors: The Special Rapporteur recommends that private actors develop and implement transparent human rights assessment procedures, and develop policies keeping in mind their human rights impact. Apart from this, private entities should integrate commitments to the freedom of expression into internal processes and ensure the “greatest possible transparency”.</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">International Organisations: The report recommends that organisations make resources and educational material on internet governance publicly accessible. The Special Rapporteur also recommends encouraging meaningful civil society participation in multi-stakeholder policy making and standard setting processes, with an increased focus on sensitivity to human rights.</p>
</li></ol>
<div> </div>
<h2>CIS Comments</h2>
<ol><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">CIS strongly welcomes the expansion of the Special Rapporteur’s scope that this report represents: he is no longer looking solely at states, but at the private sector too.</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">CIS also notes that competition is an important aspect of the freedom of expression, but it has not been discussed in this report. Viable alternatives to platforms, networks, internet service providers, etc., would ensure a healthy, competitive marketplace and have a positive impact on resolving the issues identified above.</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">Our <a href="http://cis-india.org/internet-governance/intermediary-liability-in-india.pdf/view">work</a> has called for maintaining a balanced approach to liability of intermediaries for their users’ actions, since excessive liability or strict liability would lead to over-caution and removal of legitimate speech, while having no liability at all would make it difficult to act effectively against harmful speech, e.g., revenge porn.</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr"><a href="http://cis-india.org/internet-governance/blog/cis-position-on-net-neutrality">CIS’ work</a> on network neutrality has highlighted the importance of neutrality for freedom of speech, and has advocated for an evidence-based approach that ensures there is neither under-regulation nor over-regulation. The Special Rapporteur suggests that ‘Zero-Rating’ practices always violate Net Neutrality, but most definitions of Net Neutrality proposed by academics and followed by regulators across the world do not include Zero-Rating. Similarly, he suggests that the main exception for Zero-Rating is in areas genuinely lacking access to the Internet, whereas the potential of some forms of Zero-Rating to further freedom of expression, especially of minorities, even in areas with Internet access, provides sufficient reason for the issue to merit greater debate.</p>
</li></ol>
<div> </div>
<div>(Pranesh Prakash was invited by the Special Rapporteur to provide his views, and took part in a meeting that contributed to this report.)</div>
<div> </div>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/un-special-rapporteur-report-on-freedom-of-expression-and-the-private-sector-a-significant-step-forward'>https://cis-india.org/internet-governance/un-special-rapporteur-report-on-freedom-of-expression-and-the-private-sector-a-significant-step-forward</a>
</p>
By vidushi. Published 2016-06-08.