The Centre for Internet and Society
https://cis-india.org
These are the search results for the query, showing results 71 to 84.
IT (Amendment) Act, 2008, 69 Rules: Draft and Final Version Comparison
https://cis-india.org/internet-governance/blog/it-amendment-act-69-rules-draft-and-final-version-comparison
<b>Jadine Lannon has performed a clause-by-clause comparison of the Draft 69 Rules and the official 69 Rules under Section 69 in order to better understand how the two are similar and how they differ. Very brief notes have been included on some changes we deemed to be important.</b>
<table class="plain">
<tbody>
<tr>
<td><img src="https://cis-india.org/home-images/copy_of_pc1.png" alt="c1" class="image-inline" title="c1" /></td>
</tr>
<tr>
<td><img src="https://cis-india.org/home-images/pc2.png" alt="c2" class="image-inline" title="c2" /></td>
</tr>
<tr>
<td><img src="https://cis-india.org/home-images/pc3.png" alt="c3" class="image-inline" title="c3" /></td>
</tr>
<tr>
<td><img src="https://cis-india.org/home-images/pc4.png" alt="c4" class="image-inline" title="c4" /></td>
</tr>
<tr>
<td><img src="https://cis-india.org/home-images/pc5.png" alt="c5" class="image-inline" title="c5" /></td>
</tr>
<tr>
<td><img src="https://cis-india.org/home-images/copy_of_pc6.png" alt="c6" class="image-inline" title="c6" /></td>
</tr>
<tr>
<td><img src="https://cis-india.org/home-images/pc7.png" alt="c7" class="image-inline" title="c7" /></td>
</tr>
<tr>
<td><img src="https://cis-india.org/home-images/pc8.png" alt="c8" class="image-inline" title="c8" /></td>
</tr>
<tr>
<td><img src="https://cis-india.org/home-images/pc9.png" alt="c9" class="image-inline" title="c9" /></td>
</tr>
</tbody>
</table>
<p style="text-align: justify; ">As with the other comparisons that I have done on the 69A and 69B Draft and official Rules, the majority of the changes between these two sets of rules serve to restructure and clarify various clauses in the Draft 69 Rules.</p>
<p style="text-align: justify; ">Three new definitions appear in Clause (2) of the 69 Rules, including a definition for “communication”, a term which appears in the Draft Rules but has no associated definition under Clause (2) of the Draft Rules.</p>
<p style="text-align: justify; ">Clause (31) of the Draft Rules, which required security agencies of the States and Union territories to share any information gathered through interception, monitoring and/or decryption with federal agencies, does not make an appearance in the official Rules. Nor does this requirement seem to be implied anywhere in the official 69 Rules.</p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/it-amendment-act-69-rules-draft-and-final-version-comparison'>https://cis-india.org/internet-governance/blog/it-amendment-act-69-rules-draft-and-final-version-comparison</a>
</p>
No publisher | jdine | Internet Governance, Intermediary Liability, Information Technology | 2013-04-30T09:56:07Z | Blog Entry
GNI and IAMAI Launch Interactive Slideshow Exploring Impact of India's Internet Laws
https://cis-india.org/internet-governance/blog/gni-and-iamai-launch-interactive-slideshow-exploring-impact-of-indias-internet-laws
<b>The Global Network Initiative and the Internet and Mobile Association of India have come together to explain how India’s Internet and technology laws impact economic innovation and freedom of expression. </b>
<p>The <a class="external-link" href="http://www.globalnetworkinitiative.org/">Global Network Initiative (GNI)</a> and the <a class="external-link" href="http://www.iamai.in/">Internet and Mobile Association of India (IAMAI)</a> have launched an interactive slide show exploring the impact of existing Internet laws on users and businesses in India. The slide show, created by Newsbound with comments contributed by the Centre for Internet and Society (CIS), explains the existing legislative mechanisms prevalent in India, maps the challenges of the regulatory environment and highlights areas where such mechanisms can be strengthened.</p>
<p>Foregrounding the difficulties of content regulation, the slides aim to inform users and the public of the constraints of the current legal mechanisms in place, including safe harbour and notice-and-takedown provisions. Highlighting Section 79(3) and the Intermediary Liability Rules issued in 2011, the slide show identifies some of the challenges faced by Internet platforms, such as the broad interpretation of the legislation by the executive branch.</p>
<p>Challenges governing Internet platforms highlighted in the slide show include uniform Terms of Service that do not consider the type of service being provided by the platform, uncertain requirements for taking down content, and compliance obligations related to information disclosure. Further challenges include over-compliance with, and misuse of, the legal notice-and-takedown system introduced under Section 79 of the Information Technology Act and the Information Technology (Intermediaries Guidelines) Rules, 2011.</p>
<p>The Rules were created with the purpose of providing guidelines for the ‘post-publication redressal mechanism as envisioned in the Constitution of India’. However, since their introduction, the Rules have been criticised extensively by both the national and the international media for not conforming to principles of natural justice and freedom of expression. Critics have pointed out that by not recognising the different functions performed by different intermediaries, and by not providing safeguards against misuse of such a mechanism for suppressing legitimate expression, the Rules have a chilling effect on freedom of expression.</p>
<p>Under the current Rules, the third-party provider/creator of information is not given a chance to be heard by the intermediary, nor is the intermediary required to give a reasoned decision to the creator whose content has been taken down. The takedown procedure also lacks any provision for restoring the removed information, such as a counter-notice filing mechanism or an appeal to a higher authority. Further, the criteria for removal of content include terms like 'disparaging' and 'objectionable', which are not defined and prima facie seem to be beyond the reasonable restrictions envisioned by the Constitution of India. With uncertainty in the content criteria and no safeguards to prevent abuse, complainants may send frivolous complaints and suppress legitimate expression without any fear of repercussions.</p>
<p>Most importantly, the redressal mechanism under the Rules shifts the burden of censorship, previously the exclusive domain of the judiciary or the executive, onto private intermediaries. Often, private intermediaries do not have sufficient legal resources to determine the legitimacy of a legal claim, resulting in over-compliance to limit liability. The slide show cites the <a href="https://cis-india.org/internet-governance/chilling-effects-on-free-expression-on-internet">2011 CIS research carried out by Rishabh Dara</a>, which examined whether the Rules lead to a chilling effect on online free expression, to highlight the issue of over-compliance and self-censorship.</p>
<p>The initiative is timely, given the change of guard in India, and stresses not only the economic impact of fixing the Internet legal framework, but also the larger impact on users' rights and freedom of expression. The initiative calls for a legal environment for the Internet that enables innovation, protects the rights of users, and provides clear rules and regulations for businesses large and small.</p>
<p>See the slideshow here: <a href="http://globalnetworkinitiative.org/india">How India’s Internet Laws Can Help Propel the Country Forward</a></p>
<p><strong>Other GNI reports and resources: </strong></p>
<p><a href="http://www.globalnetworkinitiative.org/sites/default/files/Closing%20the%20Gap%20-%20Copenhagen%20Economics_March%202014_0.pdf">Closing the Gap: Indian Online Intermediaries and a Liability System Not Yet Fit for Purpose</a></p>
<p><a href="http://www.globalnetworkinitiative.org/sites/default/files/Closing%20the%20Gap%20-%20Copenhagen%20Economics_March%202014_0.pdf">Strengthening Protections for Online Platforms Could Add Billions to India’s GDP</a></p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/gni-and-iamai-launch-interactive-slideshow-exploring-impact-of-indias-internet-laws'>https://cis-india.org/internet-governance/blog/gni-and-iamai-launch-interactive-slideshow-exploring-impact-of-indias-internet-laws</a>
</p>
No publisher | jyoti | Censorship, Freedom of Speech and Expression, Internet Governance, Intermediary Liability, Chilling Effect, Information Technology | 2014-07-17T12:01:01Z | Blog Entry
Intermediary Liability in India: Chilling Effects on Free Expression on the Internet
https://cis-india.org/internet-governance/chilling-effects-on-free-expression-on-internet
<b>The Centre for Internet & Society in partnership with Google India conducted the Google Policy Fellowship 2011, offered for the first time in Asia Pacific as well as in India. Rishabh Dara was selected as a Fellow and researched issues relating to freedom of expression. The results of the paper demonstrate that the ‘Information Technology (Intermediaries Guidelines) Rules 2011’ notified by the Government of India on April 11, 2011 have a chilling effect on free expression.</b>
<p style="text-align: justify; ">Intermediaries are widely recognised as essential cogs in the wheel of exercising the right to freedom of expression on the Internet. Most major jurisdictions around the world have introduced legislation for limiting intermediary liability in order to ensure that this wheel does not stop spinning. With the 2008 amendment of the Information Technology Act 2000, India joined the bandwagon and established a ‘notice and takedown’ regime for limiting intermediary liability.<br /><br />On the 11th of April 2011, the Government of India notified the ‘Information Technology (Intermediaries Guidelines) Rules 2011’, which prescribe, amongst other things, guidelines for the administration of takedowns by intermediaries. The Rules have been criticised extensively by both the national and the international media. The media has projected that the Rules, contrary to the objective of promoting free expression, seem to encourage privately administered injunctions to censor and chill free expression. On the other hand, the Government has responded through press releases and assured that the Rules in their current form do not violate the principle of freedom of expression or allow the government to regulate content.<br /><br />This study was conducted with the objective of determining whether the criteria, procedure and safeguards for administration of takedowns as prescribed by the Rules lead to a chilling effect on online free expression. In the course of the study, takedown notices were sent to a sample comprising 7 prominent intermediaries and their responses to the notices were documented. Different policy factors were permuted in the takedown notices in order to understand at what points in the takedown process free expression is being chilled.<br /><br />The results of the paper clearly demonstrate that the Rules indeed have a chilling effect on free expression. Specifically, the Rules create uncertainty in the criteria and procedure for administering takedowns, thereby inducing intermediaries to err on the side of caution and over-comply with takedown notices in order to limit their liability, and as a result suppress legitimate expression. Additionally, the Rules do not establish sufficient safeguards to prevent misuse and abuse of the takedown process to suppress legitimate expression.<br /><br />Of the 7 intermediaries to which takedown notices were sent, 6 over-complied with the notices, despite the apparent flaws in them. From the responses to the takedown notices, it can reasonably be presumed that not all intermediaries have sufficient legal competence or resources to deliberate on the legality of an expression. Even if an intermediary has sufficient legal competence, it tends to prioritise the allocation of its legal resources according to the commercial importance of the impugned expressions. Further, if such a subjective determination must be made within a limited timeframe and in the absence of adequate facts and circumstances, the intermediary mechanically (without application of mind or proper judgement) complies with the takedown notice.<br /><br />The results also demonstrate that the Rules are procedurally flawed as they ignore all elements of natural justice. The third party provider of information whose expression is censored is not informed about the takedown, let alone given an opportunity to be heard before or after the takedown. There is also no recourse to have the removed information put back or restored. The intermediary is under no obligation to provide a reasoned decision for rejecting or accepting a takedown notice.</p>
<p>The Rules in their current form clearly tilt the takedown mechanism in favour of the complainant and adversely against the creator of expression.</p>
<table class="plain">
<tbody>
<tr>
<td>The research highlights the need to:<br />
<ul>
<li>increase the safeguards against misuse of the privately administered takedown regime</li>
<li>reduce the uncertainty in the criteria for administering the takedown</li>
<li>reduce the uncertainty in the procedure for administering the takedown</li>
<li>include various elements of natural justice in the procedure for administering the takedown</li>
<li>replace the requirement for subjective legal determination by intermediaries with an objective test</li>
</ul>
</td>
</td>
</tr>
</tbody>
</table>
<p><a href="https://cis-india.org/internet-governance/intermediary-liability-in-india.pdf" class="internal-link" title="Intermediary Liability in India">Click</a> to download the report [PDF, 406 Kb]</p>
<hr />
<h3>Appendix 2</h3>
<ul>
<li><a href="https://cis-india.org/internet-governance/intermediary-liability-and-foe-executive-summary.pdf" class="internal-link">Intermediary Liability and Freedom of Expression — Executive Summary</a> (PDF, 263 Kb)</li>
<li><a href="https://cis-india.org/internet-governance/counter-proposal-by-cis-draft-it-intermediary-due-diligence-and-information-removal-rules-2012.odt" class="internal-link">Counter-proposal by the Centre for Internet and Society: Draft Information Technology (Intermediary Due Diligence and Information Removal) Rules, 2012</a> (Open Office Document, 231 Kb)</li>
<li><a href="https://cis-india.org/internet-governance/counter-proposal-by-cis-draft-it-intermediary-due-diligence-and-information-removal-rules-2012.pdf" class="internal-link">Counter-proposal by the Centre for Internet and Society: Draft Information Technology (Intermediary Due Diligence and Information Removal) Rules, 2012</a> (PDF, 422 Kb)</li>
</ul>
<hr />
<p>The above documents have been sent to:</p>
<ol>
<li>Shri Kapil Sibal, Minister of Human Resource Development and Minister of Communications and Information Technology</li>
<li>Shri Milind Murli Deora, Minister of State of Communications and Information Technology</li>
<li>Shri Sachin Pilot, Minister of State, Ministry of Communications and Information Technology</li>
<li>Dr. Anita Bhatnagar, Joint Secretary, Department of Electronics & Information Technology, Ministry of Communications & Information Technology</li>
<li>Dr. Ajay Kumar, Joint Secretary, Department of Electronics & Information Technology, Ministry of Communications & Information Technology</li>
<li>Dr. Gulshan Rai, Scientist G & Group Coordinator, Director General, ICERT, Controller Of Certifying, Authorities and Head of Division, Cyber Appellate Tribunal </li>
</ol>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/chilling-effects-on-free-expression-on-internet'>https://cis-india.org/internet-governance/chilling-effects-on-free-expression-on-internet</a>
</p>
No publisher | Rishabh Dara | Freedom of Speech and Expression, Public Accountability, Internet Governance, Research, Featured, Intermediary Liability, Censorship | 2012-12-14T10:22:24Z | Blog Entry
India limits social media after civil unrest
https://cis-india.org/news/articles-latimes-com-mark-magnier-aug-23-2012-india-limits-social-media-after-civil-unrest
<b>Indian officials have gone too far in limiting text messages and pressuring local Internet firms as well as Twitter and others to block accounts, critics say.</b>
<hr />
<p style="text-align: justify; ">This article by Mark Magnier was published in <a class="external-link" href="http://articles.latimes.com/2012/aug/23/world/la-fg-india-twitter-20120824">Los Angeles Times</a> on August 23, 2012 and re-posted in <a class="external-link" href="http://www.channel6newsonline.com/2012/08/after-civil-unrest-indian-government-places-limits-social-media/">Channel 6 News</a> on August 24, 2012. Sunil Abraham is quoted.</p>
<hr />
<p style="text-align: justify; ">Has the Indian government lost its sense of humor?</p>
<p style="text-align: justify; ">That's what some in India were asking as word spread that authorities had pressured Twitter into blocking several accounts parodying the prime minister after civil unrest that saw dozens of people from northeastern India killed and thousands flee in panic.</p>
<p style="text-align: justify; ">This week, the government also imposed a two-week limit of five text messages a day — raised Thursday to 20 — potentially affecting hundreds of millions of people, and pressured local Internet companies as well as Facebook, Twitter and Google to block hundreds of websites and user accounts.</p>
<p style="text-align: justify; ">Although journalists, free speech advocates and bloggers said the effort to squelch rumors may be justified, several criticized the actions as excessive.</p>
<p style="text-align: justify; ">"You cannot burn the entire house to kill one mischievous mouse," said Gyana Ranjan Swain, a senior editor at Voice & Data, a networking trade magazine. "You're in the 21st century. Their thinking is still 50 years old. It's just 'kill the messenger.'"</p>
<p style="text-align: justify; ">Comedians said Indian political humor is evolving and there's more leeway to make fun of politicians than a decade ago, but the nation's mores still call for greater respect than in the West.</p>
<p style="text-align: justify; ">"If I tried something like South Park, I'd be put behind bars tomorrow," said Rahul Roushan, founder of Faking News website, which satirizes Indian current events.</p>
<p style="text-align: justify; ">Faking News has lampooned the recent corruption scandals, including specious stories about theme restaurants (where customers must bribe waiters or go hungry); and a tongue-in-cheek report that India has banned the zero because too many of them appear nowadays in auditors' reports, after recent coal and telecommunications scandals each allegedly involving more than $30 billion.</p>
<p style="text-align: justify; ">Roushan, whose site isn't blocked, said he hopes low-level officials misinterpreted government directives.</p>
<p style="text-align: justify; ">"I'm still in a state of disbelief," he said. "I don't think the government is so stupid that it can ask that parody accounts get taken down. If they did, God help this country."</p>
<p style="text-align: justify; ">A spokesman for the prime minister's office said the blocking of six fake Twitter accounts attributed to the prime minister has been in the works for months and wasn't related to the recent crisis. He said the move was in response to tweets containing hate language and caste insults that readers could easily mistake as the Indian leader's. A dozen Twitter accounts and about 300 websites were blocked, according to news reports.</p>
<p style="text-align: justify; ">"We have not lost our sense of humor," said Pankaj Pachauri, the prime minister's spokesman. "We started a procedure to take action against people misrepresenting themselves."</p>
<p style="text-align: justify; ">But some Twitter users whose accounts are frozen, including media consultant Kanchan Gupta, counter that the government may be using the crisis to muzzle critics.</p>
<p style="text-align: justify; ">"I'm very clear in my mind this is a political decision," said Gupta, who has been critical of corruption and the government's policy drift. "If they were openly confrontational of me, they'd go nowhere, so they're trying this."</p>
<p style="text-align: justify; ">Attempts to access his Twitter page Thursday were met with the message: "This website/URL has been blocked until further notice either pursuant to Court orders or on the Directions issued by the Department of Telecommunications."</p>
<p style="text-align: justify; ">Even Britain's Queen Elizabeth II has numerous parody accounts so India needs to lighten up, consultant Gupta said.</p>
<p style="text-align: justify; ">He's received several messages from worried Pakistani friends since the news broke. "They ask if I'm all right, say they hope they haven't frog-marched you to jail," he said. "What irony."</p>
<p style="text-align: justify; ">The restrictions are the latest chapter of a crisis that started in July when Muslims and members of the Bodo tribal community in northeastern India clashed over land, jobs and politics. The result: 75 people killed and 300,000 displaced.</p>
<p style="text-align: justify; ">Muslims in Mumbai, formerly Bombay, staged a sympathy demonstration last week; two more people were killed and dozens injured.</p>
<p style="text-align: justify; ">Rumors, hate messages and altered photos of supposed atrocities against Muslims soon spread on social media sites, and several people from northeastern India were beaten in Bangalore and other cities, prompting the crackdown.</p>
<p style="text-align: justify; ">New Delhi has accused Pakistani websites of fanning the online rumors. (Islamabad said it would investigate if there's any proof.) But Indian news media also reported that 20% of the websites blocked contained inflammatory material uploaded by Hindu nationalist groups in India that were apparently trying to stir up sectarian trouble.</p>
<p style="text-align: justify; ">The Twitter community has responded with derision and humor to limits on text messages on prepaid cellphones.</p>
<p style="text-align: justify; ">"Feeling deeply insulted that I still have not been blocked," tweeted user @abhijitmajumder. "Victim of govt apathy."</p>
<p style="text-align: justify; ">Sunil Abraham, head of the Bangalore civic group Center for Internet and Society, said this week's restrictions are the latest in a series of regulations and recommendations aimed at tightening Internet control.</p>
<p>
For more details visit <a href='https://cis-india.org/news/articles-latimes-com-mark-magnier-aug-23-2012-india-limits-social-media-after-civil-unrest'>https://cis-india.org/news/articles-latimes-com-mark-magnier-aug-23-2012-india-limits-social-media-after-civil-unrest</a>
</p>
No publisher | praskrishna | Social media, Freedom of Speech and Expression, Public Accountability, Internet Governance, Intermediary Liability, Censorship | 2012-09-04T11:59:01Z | News Item
Rebuttal of DIT's Misleading Statements on New Internet Rules
https://cis-india.org/internet-governance/blog/rebuttal-dit-press-release-intermediaries
<b>The press statement issued on May 11 by the Department of Information Technology (DIT) on the furore over the newly-issued rules on 'intermediary due diligence' is misleading and is, in places, plainly false. We are presenting a point-by-point rebuttal of the DIT's claims.</b>
<p>In its <a class="external-link" href="http://pib.nic.in/newsite/erelease.aspx?relid=72066">press release on Wednesday, May 11, 2011</a>, the DIT stated:</p>
<blockquote>The attention of Government has been drawn to news items in a section of media on certain aspects of the Rules notified under Section 79 pertaining to liability of intermediaries under the Information Technology Act, 2000. These items have raised two broad issues. One is that words used in Rules for objectionable content are broad and could be interpreted subjectively. Secondly, there is an apprehension that the Rules enable the Government to regulate content in a highly subjective and possibly arbitrary manner.</blockquote>
<p>There are actually more issues than merely "subjective interpretation" and "arbitrary governmental regulation".</p>
<ul><li style="list-style-type: disc;">The Indian Constitution limits how much the government can regulate citizens’ fundamental right to freedom of speech and expression. Any measure that runs afoul of the Constitution is invalid.</li><li style="list-style-type: disc;">Several portions of the Rules are beyond the limited powers that Parliament granted the Department of IT to create interpretive rules under the Information Technology Act. Parliament directed the Government merely to define what “due diligence” requirements an intermediary would have to follow in order to claim the qualified protection against liability that Section 79 of the Information Technology Act provides; these current rules have gone dangerously far beyond that, by insisting that intermediaries, without investigation, have to remove content within 36 hours of receipt of a complaint, keep records of users' details and provide them to law enforcement officials.</li></ul>
<blockquote>The Department of Information Technology (DIT), Ministry of Communications & IT has clarified that the Intermediaries Guidelines Rules, 2011 prescribe that due diligence need to be observed by the Intermediaries to enjoy exemption from liability for hosting any third party information under Section 79 of the Information Technology Act, 2000. These due diligence practices are the best practices followed internationally by well-known mega corporations operating on the Internet. The terms specified in the Rules are in accordance with the terms used by most of the Intermediaries as part of their existing practices, policies and terms of service which they have published on their website.</blockquote>
<ol><li>We are not aware of any country that actually goes to the extent of deciding what Internet-wide ‘best practices’ are and actually converting those ‘best practices’ into law by prescribing universal terms of service that all Internet services, websites, and products should enforce.</li><li>The Rules require all intermediaries to include the government-prescribed terms in an agreement, no matter what services they provide. It is one thing for a company to choose the terms of its terms of service agreement, and completely another for the government to dictate those terms of service. As long as the terms of service of an intermediary are not unlawful and do not bring up issues of users’ rights (such as the right to privacy), there is no reason for the government to jump in and dictate what the terms of service should or should not be.</li><li>The DIT has not offered any proof to back up its assertion that 'most' intermediaries already have such terms. Google, a ‘mega corporation’ which is an intermediary, <a class="external-link" href="http://www.google.com/accounts/TOS?hl=en">does not have such an overarching policy</a>. Indiatimes, another ‘mega corporation’ intermediary, <a class="external-link" href="http://www.indiatimes.com/policyterms/1555176.cms">does not either</a>. Just because <a class="external-link" href="http://www.rediff.com/termsofuse.html">a company like Rediff</a> and <a class="external-link" href="http://us.blizzard.com/en-us/company/legal/wow_tou.html">Blizzard's World of Warcraft</a> have some of those terms does not mean a) that they should have all of those terms, nor b) that everyone else should as well.<br /><br />In attempting to take different terms of service from different Internet services and products—the very fact of which indicates the differing needs felt across varying online communities—the Department has put in place a one-size-fits-all approach. How can this be possible on the Internet, when we wouldn't regulate the post office and a book publisher under the same rules of liability for, say, defamatory speech?</li><li>There is also a significant difference between the effect of those terms of service and that of these Rules. An intermediary-framed terms of service suggests that the intermediary <em>may</em> investigate and boot someone off a service for a violation, while the Rules insist that the intermediary simply has to mandatorily remove content, keep records of users' details and provide them to law enforcement officials, or else be subject to crippling legal liability.</li></ol>
<p>So to equate the effect of these Rules to merely following ‘existing practices’ is plainly wrong. An intermediary—like the CIS website—should have the freedom to choose not to have terms of service agreements. We now don’t.</p>
<blockquote>“In case any issue arises concerning the interpretation of the terms used by the Intermediary, which is not agreed to by the user or affected person, the same can only be adjudicated by a Court of Law. The Government or any of its agencies have no power to intervene or even interpret. DIT has reiterated that there is no intention of the Government to acquire regulatory jurisdiction over content under these Rules. It has categorically said that these rules do not provide for any regulation or control of content by the Government.”</blockquote>
<p>The
Rules are based on the presumption that all complaints (and resultant
mandatory taking down of the content) are correct, and that the
incorrectness of the take-downs can be disputed in court. Why not just
invert that, and presume that all complaints need to be proven first, and the correctness of the complaints (instead of the take-downs) be disputed in court? </p>
<p>Indeed,
the courts have insisted that presumption of validity is the only
constitutional way of dealing with speech. (See, for instance, <em>Karthikeyan R. v. Union
of India</em>, a 2010 Madras High Court judgment.)</p>
<p>Further, only constitutional courts (namely High Courts and the Supreme Court) can go into the question of the validity of a law. Other courts have to apply the law, even if the judge believes it is constitutionally invalid. So, most courts will be forced to apply this law of highly questionable constitutionality until a High Court or the Supreme Court strikes it down.</p>
<p>What the Department has in fact done is to explicitly open up the floodgates for increased liability claims and litigation, which runs exactly counter to the purpose behind the amendment of Section 79 by Parliament in 2008.</p>
<blockquote>“The
Government adopted a very transparent process for formulation of the
Rules under the Information Technology Act. The draft Rules were
published on the Department of Information Technology website for
comments and were widely covered by the media. None of the Industry
Associations and other stakeholders objected to the formulation which is
now being cited in some section of media.”<br /></blockquote>
<p>This is a blatant lie.</p>
<p>Civil
society voices, including <a href="https://cis-india.org/internet-governance/blog/2011/02/25/intermediary-due-diligence" class="external-link">CIS</a>, <a class="external-link" href="http://www.softwarefreedom.in/index.php?option=com_idoblog&task=viewpost&id=86&Itemid=70">Software Freedom Law Centre</a>, and
individual experts (such as the lawyer and published author <a class="external-link" href="http://www.iltb.net/2011/02/draft-rules-on-intermediary-liability-released-by-the-ministry-of-it/">Apar Gupta</a>)
sent in comments. Companies <a class="external-link" href="http://online.wsj.com/article/SB10001424052748704681904576314652996232860.html?mod=WSJINDIA_hps_LEFTTopWhatNews">such as Google</a>, <a class="external-link" href="http://e2enetworks.com/2011/05/13/e2e-networks-response-to-draft-rules-for-intermediary-guidelines/">E2E Networks</a>, and others had apparently
raised concerns as well. The press has published many a cautionary note, including editorials, op-eds, and articles in <a class="external-link" href="http://www.thehindu.com/opinion/lead/article1487299.ece">the</a> <a class="external-link" href="http://www.thehindu.com/opinion/editorial/article1515144.ece">Hindu</a>, <a class="external-link" href="http://www.thehoot.org/web/home/story.php?sectionId=6&mod=1&pg=1&valid=true&storyid=5163">the Hoot</a>, Medianama.com, and Kafila.com, well before the new rules were notified. We at CIS even received a 'read notification'
from the email account of the Group Coordinator of the DIT’s Cyber Laws
Division—Dr. Gulshan Rai—on Thursday, March 3, 2011 at 12:04 PM (we had
sent the mail to Dr. Rai on Monday, February 28, 2011). We never
received any acknowledgement, though, not even after we made an express
request for acknowledgement (and an offer to meet them in person to
explain our concerns) on Tuesday, April 5, 2011 in an e-mail sent to Mr.
Prafulla Kumar and Dr. Gulshan Rai of DIT.</p>
<p>The
process can hardly be called 'transparent' when the replies received
from 'industry associations and other stakeholders' have not been made
public by the DIT. Those comments which are public all indicate that
serious concerns were raised as to the constitutionality of the Rules.</p>
<blockquote>“The Government has been forward looking to create a conducive
environment for the Internet medium to catapult itself onto a different
plane with the evolution of the Internet. The Government remains fully
committed to freedom of speech and expression and the citizen’s rights
in this regard.”</blockquote>
<p>The DIT has limited this statement to the rules on intermediary due
diligence, and has not spoken about the controversial new rules that
stifle cybercafes and restrict users' privacy and freedom to receive
information.</p>
<p>If
the government were serious about creating a conducive environment for
innovation, privacy and free expression on the Internet, it would not be
passing Rules that curb them, and it certainly would not be
doing so in such a non-transparent fashion.</p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/rebuttal-dit-press-release-intermediaries'>https://cis-india.org/internet-governance/blog/rebuttal-dit-press-release-intermediaries</a>
</p>
No publisher · pranesh · Freedom of Speech and Expression · IT Act · Featured · Intermediary Liability · 2012-07-11T13:18:04Z · Blog Entry
Killing the Internet Softly with Its Rules
https://cis-india.org/internet-governance/blog/killing-the-internet-oped
<b>While regulation of the Internet is a necessity, the Department of IT, through recent Rules under the IT Act, is guilty of over-regulation. This over-regulation is not only a bad idea, but is unconstitutional, and gravely endangers freedom of speech and privacy online.</b>
<div class="visualClear"><br /><span class="Apple-style-span">A slightly modified version of this blog entry was published as </span><a class="external-link" href="http://www.indianexpress.com/story-print/787789/">an op-ed in the Indian Express on May 9, 2011</a><span class="Apple-style-span">.</span></div>
<h2>Over-regulation of the Internet<br /></h2>
<p>Regulation of the Internet, as with
regulation of any medium of speech and commerce, is a balancing act.
Too little regulation, and criminal activities are carried out with
impunity; too much, and you curb the utility of the medium. This is
especially so with the Internet, which has become the impressively
vibrant space it is because most countries have carefully eschewed
over-regulation.
India, however, seems to be taking a different turn with three sets
of new rules under the Information Technology Act.</p>
<p>These rules deal with the liability of
intermediaries (i.e., a large, inclusive group of entities and
individuals that transmit and allow access to third-party content),
the safeguards that cybercafes need to follow if they are not to be
held liable for their users' activities, and the practices that
intermediaries need to follow to ensure security and privacy of
customer data.</p>
<h3>Effect of not following the rules</h3>
<p>If it fails to observe these Rules, an intermediary opens itself up
to liability for the actions of its users. Thus, if a third party
defames someone, the intermediary can be held liable if it does not
follow the stringent requirements of the Rules.</p>
<p>The problem, however, is that many of the provisions of the Rules
have no rational nexus with the due diligence to be observed by the
intermediary to absolve itself from liability.</p>
<h3>What does the Act require?</h3>
<p>Section 79 of the IT Act states that
intermediaries are generally not liable for third party information,
data, or communication link made available or hosted. It qualifies
that by stating that they are not liable if they follow certain
precautions (basically, to show that they are <em>real</em>
intermediaries): they observe 'due diligence' and don't exercise an
editorial role; they don't help or induce commission of the unlawful
act; and upon receiving 'actual knowledge', or on being duly notified
by the appropriate authority, the intermediary takes steps towards
some kind of action.</p>
<p>So, rules were needed to clarify what 'due diligence' involves
(i.e., to state that no active monitoring is required of ISPs), what
'actual knowledge' means, and what happens in case of conflicts
between this provision and other parts of the IT Act and other
Acts.</p>
<h3>Impact on freedom of speech and privacy</h3>
<p>However, that is not what the rules do.
The rules instead propose standard terms of service to be notified
by all intermediaries. This means everyone from Airtel to Hotmail to
Facebook to Rediff Blogs to Youtube to organizations and people that
allow others to post comments on their website. What kinds of terms
of service? They will require intermediaries to bar users from
engaging in speech that is 'disparaging'. The rules don't cover only
intermediaries that are public-facing, so even forwarding a joke via
e-mail, which "belongs to another person and to which the user does
not have any right", will be deemed to be in violation of the new
rules. And while gambling (such as betting on horses) isn't banned in
India and casino gambling is legal in Goa, all speech 'promoting
gambling' is prohibited under these Rules.</p>
<p>The rules are very onerous on
intermediaries, since they require them to act within 36 hours to
disable access to any information that they receive a complaint
about. Any 'affected person' can complain. Intermediaries will now
play the role that judges have traditionally played. Any affected
person can bring forth a complaint about issues as diverse as
defamation, blasphemy, trademark infringement, threatening of
integrity of India, 'disparaging speech', or the blanket 'in
violation of any law'. It is not made mandatory to give the actual
violator an opportunity to be heard, thus violating the cardinal
principle of natural justice of 'hearing the other party' before
denying them a fundamental right. Many parts of the Internet are in
fact public spaces and constitute an online public sphere. A law
requiring private parties to curb speech in such a public sphere is
unconstitutional insofar as it doesn't fall within Art.19(2) of the
Constitution.</p>
<p>Since intermediaries would lose the protection of the law if they
don't take down content, they have no incentive to uphold the freedom
of speech of their users. Instead, they have every incentive to take
down all content about which they receive complaints, without applying
their minds and coming to an actual conclusion that the content
violates the rules.</p>
<h3>Cybercafe rules</h3>
<p>The cybercafe rules require that all
cybercafe customers be identified with supporting documents, their
photographs taken, all their website visit history logged, and these
logs maintained for a year. Compare this to the usage of public
pay-phones. Anyone can use a pay-phone without their details being
logged. Indeed, such logging allows for cybercafe owners to
blackmail their users if they find some embarrassing websites in the
history logs—which could be anything from medical diseases to
sexual orientation to the fact that you're a whistleblower.</p>
<p>The cybercafe rules also require that
all cybercafes install "commercially available safety or filtering
software" to prevent access to pornography. In two cases along
these lines in the Madras High Court (<em>Karthikeyan R.</em> v. <em>Union
of India</em>) and the Bombay High Court (<em>Janhit Manch </em>v.
<em>Union of India</em>), the High Courts refused to direct the
government to take proactive steps to curb access to Internet
pornography stating that such matters require case-by-case analysis
to be constitutionally valid under Art.19(1)(a) [Right to freedom of
speech and expression].</p>
<p>Such software tends to be very
ineffective—non-pornographic websites also get wrongly filtered,
and not all pornographic websites get filtered—and the High Courts
were right in being wary of any blanket ban. They preferred for
individual cases to be registered. If the worry is that our children
are getting corrupted, it is up to parents to provide supervision,
and not for the government to insist that software do the parenting
instead.</p>
<p>Given that all of these problems were pointed
out by civil society organizations, news media, and industry
bodies when the draft rules were released, it smacks of governmental
high-handedness that almost none of the changes suggested by the
public have been incorporated in the final rules.</p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/killing-the-internet-oped'>https://cis-india.org/internet-governance/blog/killing-the-internet-oped</a>
</p>
No publisher · pranesh · IT Act · Internet Governance · Intermediary Liability · 2011-08-20T12:51:42Z · Blog Entry
CIS Para-wise Comments on Intermediary Due Diligence Rules, 2011
https://cis-india.org/internet-governance/blog/intermediary-due-diligence
<b>On February 7th 2011, the Department of Information Technology, MCIT published draft rules on its website (The Information Technology (Due diligence observed by intermediaries guidelines) Rules, 2011) in exercise of the powers conferred by Section 87(2)(zg), read with Section 79(2) of the Information Technology Act, 2000. Comments were invited from the public before February 25th 2011. Accordingly, Privacy India and Centre for Internet and Society, Bangalore have prepared the following para-wise comments for the Ministry’s consideration.</b>
<h2>A. General Objections</h2>
<p>A number of the provisions under these Rules have no nexus with their parent provision, namely s.79(2). Section 79(1) provides for exemption from liability for intermediaries. Section 79(2) thereupon states:</p>
<blockquote>
<p>79. Intermediaries not to be liable in certain cases—</p>
<blockquote>
<p>(2) The provisions of sub-section (1) shall apply if—</p>
<blockquote>
<p>(a) the function of the intermediary is limited to providing access to a communication system over which information made available by third parties is transmitted or temporarily stored or hosted; or</p>
<p>(b) the intermediary does not—</p>
<blockquote>
<p>(i) initiate the transmission,</p>
<p>(ii) select the receiver of the transmission, and</p>
<p>(iii) select or modify the information contained in the transmission;</p>
</blockquote>
<p>(c) the intermediary observes due diligence while discharging his duties under this Act and also observes such other guidelines as the Central Government may prescribe in this behalf.</p>
</blockquote>
</blockquote>
</blockquote>
<p>Therefore, if it fails to observe the provisions of the Rules, the intermediary opens itself up to liability for the actions of its users. However, many of the provisions of the Rules have no rational nexus with the due diligence to be observed by the intermediary to absolve itself from liability.</p>
<h2>B. Specific Objections</h2>
<h3>Rule 2(b), (c), and (k)</h3>
<blockquote>
<p>(b) “Blog” means a type of website, usually maintained by an individual with regular entries of commentary, descriptions of events, or other material such as graphics or video. Usually blog is a shared on-line journal where users can post diary entries about their personal experiences and hobbies;</p>
</blockquote>
<blockquote>
<p>(c) “Blogger” means a person who keeps and updates a blog;</p>
</blockquote>
<blockquote>
<p>(k) “User” means any person including blogger who uses any computer resource for the purpose of sharing information, views or otherwise and includes other persons jointly participating in using the computer resource of intermediary</p>
</blockquote>
<h3><strong>Comments</strong></h3>
<p>It is unclear why it is necessary to specifically single out bloggers as users, leaving out other users such as blog commenters, social network users, microbloggers, podcasters, etc. This makes the rules technologically non-neutral.</p>
<h3><strong>Recommendation</strong></h3>
<p>We recommend that these 3 sub-rules be deleted.</p>
<h3> Rule 3(2)</h3>
<blockquote>
<p>3. <strong>Due Diligence observed by intermediary</strong>.— The intermediary shall observe following due diligence while discharging its duties.</p>
<blockquote>
<p>(2) The intermediary shall notify users of computer resource not to use, display, upload, modify, publish, transmit, update, share or store any information that : —</p>
<blockquote>
<p>(a) belongs to another person;</p>
<p>(b) is harmful, threatening, abusive, harassing, blasphemous, objectionable, defamatory, vulgar, obscene, pornographic, paedophilic, libellous, invasive of another’s privacy, hateful, or racially, ethnically or otherwise objectionable, disparaging, relating or encouraging money laundering or gambling, or otherwise unlawful in any manner whatever;</p>
<p>(c) harm minors in any way;</p>
<p>(d) infringes any patent, trademark, copyright or other proprietary rights;</p>
<p>(e) violates any law for the time being in force;</p>
<p>(f) discloses sensitive personal information of other person or to which the user does not have any right to;</p>
<p>(g) causes annoyance or inconvenience or deceives or misleads the addressee about the origin of such messages or communicates any information which is grossly offensive or menacing in nature;</p>
<p>(h) impersonate another person;</p>
<p>(i) contains software viruses or any other computer code, files or programs designed to interrupt, destroy or limit the functionality of any computer resource;</p>
<p>(j) threatens the unity, integrity, defence, security or sovereignty of India, friendly relations with foreign states, or public order or causes incitement to the commission of any cognizable offence or prevents investigation of any offence or is insulting any other nation.</p>
</blockquote>
</blockquote>
</blockquote>
<h3><strong>Comments</strong></h3>
<p>Firstly, such ‘standard’ terms of use [1] might make sense for one intermediary, but not for all. For instance, an intermediary such as a site with user-generated content (e.g., Wikipedia) would need different terms of use from an intermediary such as an e-mail provider (e.g., Hotmail), because the kinds of liability they accrue are different. This is similar to how the liability that a newspaper publisher accrues is different from that accrued by the post office. However, forcing standard terms of use negates this difference. Thus, these are impractical.</p>
<p>Secondly, read with the legal obligation of the intermediary to remove such information (contained in rule 3(3)), they vest an extraordinary power of censorship in the hands of the intermediary, which could easily lead to the stifling of the constitutionally guaranteed freedom of speech online. Analogous restrictions do not exist in other fields, e.g., against the press in India or against courier companies, and there is no justification to impose them on content posted online. Taken together, these provisions make it impossible to publish critical views about anything without the risk of being summarily censored.</p>
<p>Thirdly, while it is possible to apply Indian law to intermediaries, it is impracticable to require all intermediaries (whether in India or not) to have in their terms of use India-specific clauses such as rule 3(2)(j). Instead, it is better to merely require them to ask their users to follow all relevant laws.</p>
<p>Individual instances of how these rules are overly broad are contained in an appendix to this submission.</p>
<h3><strong>Recommendation</strong></h3>
<p>We strongly recommend the deletion of this sub-rule, except clause (e).</p>
<h3>Rule 3(3)</h3>
<blockquote>
<p>(3) The intermediary shall not itself host or publish or edit or store any information or shall not initiate the transmission, select the receiver of transmission, and select or modify the information contained in the transmission as specified in sub-rule (2).</p>
</blockquote>
<h3><strong>Comments</strong></h3>
<p>This sub-rule is ultra vires s.79 of the IT Act, which does not require intermediaries not to “host or publish or edit or store any information”. In fact, s.79(2) merely states that by violating the provisions of s.79(2), the intermediary loses the protection of s.79(1). It does not however make it unlawful to violate s.79(2), as rule 3(3) does. This makes rule 3(3) ultra vires the Act.</p>
<h3><strong>Recommendation</strong></h3>
<p>This sub-rule should be deleted.</p>
<h3><strong>Rule 3(4)</strong></h3>
<blockquote>
<p>(4) The intermediary upon obtaining actual knowledge by itself or been brought to actual knowledge by an authority mandated under the law for the time being in force in writing or through email signed with electronic signature about any such information as mentioned in sub-rule (2) above, shall act expeditiously to work with user or owner of such information to remove access to such information that is claimed to be infringing or to be the subject of infringing activity. Further the intermediary shall inform the police about such information and preserve the records for 90 days</p>
</blockquote>
<h3><strong>Comments</strong></h3>
<p>This rule is also ultra vires s.69A of the IT Act as well as the Constitution of India. Section 69A states all the grounds on which an intermediary may be required to restrict access to information [2]. It does not allow for expansion of those grounds, because it has been carefully worded to maintain its constitutional validity vis-a-vis Articles 19(1)(a) and 19(2) of the Constitution of India. The rules framed under s.69A prescribe an elaborate procedure before such censorship may be ordered. The rules under s.69A will be rendered nugatory if any person could get content removed or blocked under s.79(2).</p>
<p>This rule requires an intermediary to immediately take steps to remove access to information merely upon receiving a written request from “any authority mandated under the law”. Thus, for example, any authority can easily immunize itself from criticism on the internet by simply sending a written notice to the intermediary concerned. This is directly contrary to, and completely subverts, the legislative intent expressed in Section 69A, which lays down an elaborate procedure to be followed before any information can be lawfully blocked.</p>
<p>If any person is aggrieved by information posted online, they may seek their remedies—including the relief of injunction—from courts of law, under generally applicable civil and criminal law. Inserting a rule such as this one would take away the powers of the judiciary in India to define the line dividing permissible and impermissible speech, and vest it instead in the whims of each intermediary. This can only have a chilling effect on debates in the public domain (of which the Internet is a part) which is the foundation of any democracy.</p>
<h3><strong>Recommendation</strong></h3>
<p>This rule should be modified so that an intermediary is obliged to take steps towards removal of content only when backed by (a) an order from a court, or (b) a direction issued following the procedure prescribed by the rules framed under Section 69A.</p>
<h3>Rule 3(5) & (7) & (8) & (10)</h3>
<blockquote>
<p>(5) The Intermediary shall inform its users that in case of non-compliance with terms of use of the services and privacy policy provided by the Intermediary, the Intermediary has the right to immediately terminate the access rights of the users to the site of Intermediary;</p>
<p>(7) The intermediary shall not disclose sensitive personal information;</p>
<p>(8) Disclosure of information by intermediary to any third party shall require prior permission or consent from the provider of such information, who has provided such information under lawful contract or otherwise;</p>
<p>(10) The information collected by the intermediary shall be used for the purpose for which it has been collected.</p>
</blockquote>
<h3><strong>Comments</strong></h3>
<p>These sub-rules have no nexus with intermediary liability or non-liability under s.79(2). For instance, it is unreasonable to say that an intermediary may be held liable for the actions of its users if it does not inform its users about its right to terminate access by the user to its services. Furthermore, not all intermediaries need be websites, as sub-rule 5 assumes. An intermediary can even be an “internet service provider” or a “cyber cafe” or a “telecom service provider”, as per rule 2(j) read with s.2(1)(w) of the IT Act.</p>
<p>The requirements under sub-rules (7), (8), and (10) are rightfully the domain of s.43A and the rules made thereunder, and not s.79(2) nor these rules.</p>
<h3><strong>Recommendation</strong></h3>
<p>These sub-rules should be deleted, and sub-rules (7), (8), and (10) may instead be placed in the rules made under s.43A.</p>
<h3>Rule 3(9)</h3>
<blockquote>
<p>(9) Intermediary shall provide information to government agencies who are lawfully authorised for investigative, protective, cyber security or intelligence activity. The information shall be provided for the purpose of verification of identity, or for prevention, detection, investigation, prosecution, cyber security incidents and punishment of offences under any law for the time being in force, on a written request stating clearly the purpose of seeking such information.</p>
</blockquote>
<h3><strong>Comments</strong></h3>
<p>This provision is ultra vires ss.69 and 69B. Rules have already been issued under ss.69 and 69B which stipulate the mechanism and procedure to be followed by the government for interception, monitoring or decrypting information in the hands of intermediaries. Thus under the Interception Rules 2009 framed under Section 69, permission must first be obtained from a “competent authority” before an intermediary can be directed to provide access to its records and facilities. The current rule completely removes the safeguards contained in s.69 and its rules, and would make intermediaries answerable to virtually any request from any government agency. This is contrary to the legislative intent expressed in Section 69.</p>
<h3><strong>Recommendation</strong></h3>
<p>We recommend this sub-rule be deleted.</p>
<h3><strong>Rule 3(12)</strong></h3>
<blockquote>
<p>(12) The intermediary shall report cyber security incidents and also share cyber security incidents related information with the Indian Computer Emergency Response Team.</p>
</blockquote>
<h3><strong>Comments</strong></h3>
<p>The rules relating to how and when the Indian Computer Emergency Response Team may request information from intermediaries are rightfully the subject matter of s.70B(5) [3] and the rules made thereunder by virtue of the rule-making power granted by s.87(2)(yd). The subject matter of rule 3(12) is not the liability of intermediaries for third-party actions, hence there is no nexus between the rule-making power and the rule.</p>
<h3><strong>Recommendations</strong></h3>
<p>We recommend that this sub-rule be deleted.</p>
<h3>Rule 3(14)</h3>
<blockquote>
<p>(14) The intermediary shall publish on its website the designated agent to receive notification of claimed infringements.</p>
</blockquote>
<h3><strong>Comments</strong></h3>
<p>It is unclear what “infringements” are being referred to in this sub-rule. Neither s.79 nor these rules provide for “infringements”. The same reasoning applied for rule 3(4) would also apply here. It would be better to require the intermediary to publish on its website a method of providing judicial notice.</p>
<h3><strong>Recommendations</strong></h3>
<p>Delete, and replace with a requirement for the intermediary to publish on its website a method of providing judicial notice.</p>
<h2>Footnotes <br /></h2>
<ol><li>
<p>For instance, Section B(1) of the World of Warcraft Code of Conduct states: “When engaging in Chat, you may not: (i) Transmit or post any content or language which, in the sole and absolute discretion of Blizzard, is deemed to be offensive, including without limitation content or language that is unlawful, harmful, threatening, abusive, harassing, defamatory, vulgar, obscene, hateful, sexually explicit, or racially, ethnically or otherwise objectionable.”</p>
</li><li>
<p>It is only “in the interest of sovereignty and integrity of India, defence of India, security of the State, friendly relations with foreign States or public order or for preventing incitement to the commission of any cognizable offence relating to above” that intermediaries may be issued directions to block access to information.</p>
</li><li>
<p>Section 70B(5) states that “[t]he manner of performing functions and duties of the agency referred to in sub-section (1) shall be such as may be prescribed”.</p>
</li></ol>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/intermediary-due-diligence'>https://cis-india.org/internet-governance/blog/intermediary-due-diligence</a>
</p>
No publisher · pranesh · Freedom of Speech and Expression · IT Act · Intermediary Liability · 2012-07-11T10:27:26Z · Blog Entry
Comments on the Draft Rules under the Information Technology Act
https://cis-india.org/internet-governance/blog/comments-draft-rules
<b>The Centre for Internet and Society commissioned an advocate, Ananth Padmanabhan, to produce a comment on the Draft Rules that have been published by the government under the Information Technology Act. In his comments, Mr. Padmanabhan highlights the problems with each of the rules and presents specific recommendations on how they can be improved. These comments were sent to the Department of Information and Technology.</b>
<h2><em>Comments on the Draft Rules under the Information Technology Act as Amended by the Information Technology (Amendment) Act, 2008</em></h2>
<p><em><strong>Submitted by the Centre for Internet and Society, Bangalore</strong></em></p>
<p><em><strong>Prepared by Ananth Padmanabhan, Advocate in the Madras High Court</strong></em></p>
<h2>Interception, Monitoring and Decryption</h2>
<h3>Section 69</h3>
<p>The section says:</p>
<ol><li>Where the Central Government or a State Government or any of its officer specially authorised by the Central Government or the State Government, as the case may be, in this behalf may, if satisfied that it is necessary or expedient so to do in the interest of the sovereignty or integrity of India, defence of India, security of the State, friendly relations with foreign States or public order or for preventing incitement to the commission of any cognizable offence relating to above or for investigation of any offence, it may subject to the provisions of sub-section (2), for reasons to be recorded in writing, by order, direct any agency of the appropriate Government to intercept, monitor or decrypt or cause to be intercepted or monitored or decrypted any information generated, transmitted, received or stored in any computer resource. </li><li>The procedure and safeguards subject to which such interception or monitoring or decryption may be carried out, shall be such as may be prescribed.</li><li>The subscriber or intermediary or any person in-charge of the computer resource shall, when called upon by any agency referred to in sub-section (1), extend all facilities and technical assistance to-</li></ol>
<p> (a) provide access to or secure access to the computer resource
generating transmitting, receiving or storing such information; or</p>
<p>
(b) intercept, monitor, or decrypt the information, as the case may be; or</p>
<p>(c) provide information stored in computer resource.</p>
<ol start="4"><li>The subscriber or intermediary or any person who fails to assist the agency referred to in sub-section (3) shall be punished with imprisonment for a term which may extend to seven years and shall also be liable to fine.</li></ol>
<p><strong>Recommendation #1</strong><br />Section 69(3) should be amended and the following proviso be inserted:</p>
<p class="callout">Provided that only those intermediaries with respect to any information or computer resource that is sought to be monitored, intercepted or decrypted, shall be subject to the obligations contained in this sub-section, who are, in the opinion of the appropriate authority, prima facie in control of such transmission of the information or computer resource. The nexus between the intermediary and the information or the computer resource that is sought to be intercepted, monitored or decrypted should be clearly indicated in the direction referred to in sub-section (1) of this section.</p>
<p><br /><strong>Reasons for the Recommendation </strong><br />In the case of any information or computer resource, there may be more than one intermediary who is associated with such information. This is because “intermediary” is defined in section 2(w) of the amended Act as,</p>
<p class="callout">“with respect to any electronic record means any person who on behalf of another person receives, stores or transmits that record or provides any service with respect to that record, including telecom service providers, network service providers, internet service providers, webhosting service providers, search engines, online payment sites, online-auction sites, online-market places and cyber cafes”. </p>
<p><br />The State or Central Government should not be given wide-ranging powers to enforce cooperation on the part of any such intermediary without there being a clear nexus between the information that is sought to be decrypted or monitored by the competent authority, and the control that any particular intermediary may have over such information.</p>
<p>To give an illustration, merely because some information may have been posted on an online portal, the computer resources in the office of the portal should not be monitored unless the portal has some concrete control over the nature of information posted in it. This has to be stipulated in the order of the Central or State Government which authorizes interception of the intermediary. </p>
<p><br /><strong>Recommendation #2</strong><br />Section 69(4) should be repealed.</p>
<p><br /><strong>Reasons for the Recommendation</strong><br />The closest parallels to Section 69 of the Act are the provisions in the Telegraph Rules which were brought in after the decision in PUCL v. Union of India, (1997) 1 SCC 301, famously known as the telephone tapping case.</p>
<p>Section 69(4) fixes tremendous liability on the intermediary for non-cooperation. This is violative of Article 14. Similar provisions in the Indian Penal Code and Code of Criminal Procedure, which demand cooperation from members of the public as regards production of documents, letters, etc., impose a maximum punishment of one month for non-cooperation. It is bewildering that the punishment is seven years' imprisonment for an intermediary, when the only point of distinction between an intermediary under the IT Act and a member of the public under the IPC and CrPC is the difference in the media which contain the information.</p>
<p>Section 69(3) is akin to the duty cast upon members of the public to extend cooperation under Section 39 of the Code of Criminal Procedure by way of providing information as to the commission of any offence, or the duty, when a summons is issued by the Court or the police, to produce documents under Sections 91 and 92 of the Code of Criminal Procedure. The maximum punishment prescribed by the Indian Penal Code for omission to cooperate or wilful breach of summons is only a month, under Sections 175 and 176 of the Indian Penal Code. Even the maximum punishment for furnishing false information to the police is only six months, under Section 177 of the IPC. When this is the case with the production of documents required for the purpose of trial or inquiry, it is wholly arbitrary to impose a punishment of seven years in the case of intermediaries who do not extend cooperation in providing access to a computer resource which is merely apprehended as being a threat to national security etc. A mere apprehension, however reasonable it may be, should not be used to pin down a liability of such extreme nature on the intermediary.</p>
<p>This would also amount to a violation of Articles 19(1)(a) and 19(1)(g) of the Constitution, not to mention Article 20(3). To give an example, much of the information received from confidential sources by members of the press would be stored in computer resources. By coercing them, through the threat of seven years' imprisonment, to allow access to this computer resource and thereby part with this information, the State is directly infringing on their right under Article 19(1)(a). Furthermore, if the “subscriber” is the accused, then Section 69(4) goes against Article 20(3) by forcing the accused to bear witness against himself.</p>
<p> </p>
<h3>Draft Rules under Section 69 <br /></h3>
<p><strong>Rule 3</strong><br />Directions for interception or monitoring or decryption of any information generated, transmitted, received or stored in any computer resource under sub-section (2) of section 69 of the Information Technology (Amendment) Act, 2008 (hereinafter referred to as the said Act) shall not be issued except by an order made by the concerned competent authority who is Union Home Secretary in case of Government of India; the Secretary in-charge of Home Department in a State Government or Union Territory as the case may be. In unavoidable circumstances, such order may be made by an officer, not below the rank of a Joint Secretary to the Government of India, who has been duly authorised by the Union Home Secretary or by an officer equivalent to rank of Joint Secretary to Government of India duly authorised by the Secretary in-charge of Home Department in the State Government or Union Territory, as the case may be:</p>
<p>Provided that in emergency cases – <br />(i) in remote areas, where obtaining of prior directions for interception or monitoring or decryption of information is not feasible; or <br />(ii) for operational reasons, where obtaining of prior directions for interception or monitoring or decryption of any information generated, transmitted, received or stored in any computer resource is not feasible;</p>
<p>the required interception or monitoring or decryption of any information generated, transmitted, received or stored in any computer resource shall be carried out with the prior approval of the Head or the second senior most officer of the Security and Law Enforcement Agencies (hereinafter referred to as the said Security Agencies) at the Central Level and the officers authorised in this behalf, not below the rank of Inspector General of Police or an officer of equivalent rank, at the State and Union Territory level. The concerned competent authority, however, shall be informed of such interceptions or monitoring or decryption by the approving authority within three working days and that such interceptions or monitoring or decryption shall be got confirmed by the concerned competent authority within a period of seven working days. If the confirmation from the concerned competent authority is not received within the stipulated seven working days, such interception or monitoring or decryption shall cease and the same information shall not be intercepted or monitored or decrypted thereafter without the prior approval of the concerned competent authority, as the case may be. </p>
<p><br /><strong>Recommendation #3</strong><br />In Rule 3, the following proviso may be inserted:</p>
<p class="callout">“Provided that in the event of cooperation by any intermediary being required for the purpose of interception, monitoring or decryption of such information as is referred to in this Rule, prior permission from a Supervisory Committee headed by a retired Judge of the Supreme Court or the High Courts shall be obtained before seeking to enforce the Order mentioned in this Rule against such intermediary.”</p>
<p><strong><br /></strong></p>
<p><strong>Reasons for the Recommendation </strong><br />Section 69 and the draft rules suffer from the absence of essential procedural safeguards. This has crept in due to the blanket emulation of the Telegraph Rules. Additional safeguards should have been prescribed to ensure that the intermediary is put to minimum hardship when the monitoring is carried out or access to a computer resource is granted. Such operations are akin to a raid, in the sense that they can stop an online e-commerce portal from carrying out operations for a day or even more, thus affecting its revenue. It is therefore recommended that in any situation where cooperation from the intermediary is sought, prior judicial approval has to be obtained. The Central or State Government cannot be the sole authority in such cases.</p>
<p>Furthermore, since access to the computer resource is required, an executive order should not suffice, and a search warrant or an equivalent which results from a judicial application of the mind (by the Supervisory Committee, for instance) should be required.</p>
<p><br /><strong>Recommendation #4</strong><br />The following should be inserted after the last line in Rule 22:</p>
<p class="callout">The Review Committee shall also have the power to award compensation to the intermediary in cases where the intermediary has suffered loss or damage due to the actions of the competent authority while implementing the order issued under Rule 3.</p>
<p><strong><br /></strong></p>
<p><strong>Reasons for the Recommendation</strong><br />The Review Committee should be given the power to award compensation to the loss suffered by the intermediary in cases where the police use equipment or software for monitoring/decryption that causes damage to the intermediary’s computer resources / networks. The Review Committee should also be given the power to award compensation in the case of monitoring directions which are later found to be frivolous or even worse, borne out of mala fide considerations. These provisions will act as a disincentive against the abuse of power contained in Section 69. </p>
<p> </p>
<h2>Blocking of Access to Information</h2>
<h3>Section 69A</h3>
<p>The section provides for the blocking of websites if the government is satisfied that it is in the interests of the purposes enlisted in the section. It also provides for a penalty of up to seven years for intermediaries who fail to comply with directions under this section. <br />The rules under this section describe the procedure which has to be followed, failing which the review committee may, after due examination of the procedural defects, order an unblocking of the website.</p>
<p> </p>
<p><strong>Section 69A(3)</strong><br />The intermediary who fails to comply with the direction issued under sub-section (1) shall be punished with an imprisonment for a term which may extend to seven years and also be liable to fine.</p>
<p> </p>
<p><strong>Recommendation #5</strong><br />The penalty for intermediaries must be lessened.</p>
<p> </p>
<p><strong>Reasons for Recommendations </strong><br />The penal provision in this section, which prescribes up to seven years' imprisonment and a fine for an intermediary who fails to comply with the directions so issued, is also excessively harsh. Considering the fact that various mechanisms are available to escape the blocking of websites, intermediaries must be given enough time and space to administer the block effectively, and strict application of the penal provisions must be avoided in bona fide cases.</p>
<p>The criticism of Section 69 and the draft rules in so far as intermediary liability is concerned will apply mutatis mutandis to Section 69A and these rules as well.</p>
<p> </p>
<h3>Draft Rules under Section 69A</h3>
<p><strong>Rule 22: Review Committee</strong><br />The Review Committee shall meet at least once in two months and record its findings whether the directions issued under Rule (16) are in accordance with the provisions of sub-section (2) of section 69A of the Act. When the Review Committee is of the opinion that the directions are not in accordance with the provisions referred to above, it may set aside the directions and order for unblocking of said information generated, transmitted, received, stored or hosted in a computer resource for public access.</p>
<p><br /><strong>Recommendation #6</strong><br />A permanent Review Committee should be set up specially for the purpose of examining procedural lapses. </p>
<p><br /><strong>Reasons for Recommendation </strong><br />Rule 22 provides for a review committee which shall meet a minimum of once every two months and order the unblocking of a site if due procedures have not been followed. This would mean that if a site is blocked, it could take up to two months for a procedural lapse to be corrected and the site to be unblocked. Even a writ filed against the policing agencies for unfair blocking would probably take around the same time. Also, it could well be the case that the review committee will be overburdened with cases and may fall short of time to inquire into each. Therefore, it is recommended that a permanent Review Committee be set up which will monitor procedural lapses and ensure that there is no blocking in the first place before all the due procedural requirements are met. <br /><br /></p>
<h2>Monitoring and Collection of Traffic Data</h2>
<h3>Draft Rules under Section 69B</h3>
<p>The section provides for the monitoring of computer networks or resources if the Central Government is satisfied that the conditions mentioned therein are met.</p>
<p>The rules provide for the manner in which the monitoring will be done, the process by which the directions for the same will be issued and the liabilities of the intermediaries and monitoring officers with respect to confidentiality of the information so monitored.</p>
<p><br /><strong>Grounds for Monitoring </strong><br /><strong>Rule 4</strong><br />The competent authority may issue directions for monitoring and collection of traffic data or information generated, transmitted, received or stored in any computer resource for any or all of the following purposes related to cyber security:<br />(a) forecasting of imminent cyber incidents;<br />(b) monitoring network application with traffic data or information on computer resource;<br />(c) identification and determination of viruses/computer contaminant;<br />(d) tracking cyber security breaches or cyber security incidents;<br />(e) tracking computer resource breaching cyber security or spreading virus/computer contaminants;<br />(f) identifying or tracking of any person who has contravened, or is suspected of having contravened or being likely to contravene cyber security;<br />(g) undertaking forensic of the concerned computer resource as a part of investigation or internal audit of information security practices in the computer resource;<br />(h) accessing a stored information for enforcement of any provisions of the laws relating to cyber security for the time being in force;<br />(i) any other matter relating to cyber security.</p>
<p><br /><strong>Rule 6</strong><br />No direction for monitoring and collection of traffic data or information generated, transmitted, received or stored in any computer resource shall be given for purposes other than those specified in Rule (4).</p>
<p><br /><strong>Recommendation #7</strong><br />Clauses (a), (b), (c), and (i) of Rule 4 must be repealed.</p>
<p><br /><strong>Reasons for Recommendations </strong><br />The term “cyber incident” has not been defined, and “cyber security” has been given a circular definition. Rule 6 clearly states that no direction for monitoring and collection of traffic data or information generated, transmitted, received or stored in any computer resource shall be given for purposes other than those specified in Rule 4. Therefore, it may prima facie appear that the government is trying to lay down clear and strict safeguards for monitoring that comes at the expense of citizens' privacy. However, Rule 4(i) allows the government to monitor if it is satisfied that the matter is “any other matter relating to cyber security”. This may well serve as a ‘catch-all’ clause to legalise any kind of monitoring and collection, and therefore defeats the purported intention of Rule 6 of safeguarding citizens' interests against arbitrary and groundless intrusions of privacy. Also, the question of the degree of liability of the intermediaries or persons in charge of the computer resources for the leak of secret and confidential information remains unanswered. <br /><br /><strong>Rule 24: Disclosure of monitored data </strong><br />Any monitoring or collection of traffic data or information in computer resource by the employee of an intermediary or person in-charge of computer resource or a person duly authorised by the intermediary, undertaken in course of his duty relating to the services provided by that intermediary, shall not be unlawful, if such activities are reasonably necessary for the discharge of his duties as per the prevailing industry practices, in connection with:<br />(vi) Accessing or analysing information from a computer resource for the purpose of tracing a computer resource or any person who has contravened, or is suspected of having contravened or being likely to contravene, any provision of the Act that is likely to have an adverse impact on the services provided by the intermediary.</p>
<p><br /><strong>Recommendation #8</strong><br />Safeguards must be introduced with respect to exercise of powers conferred by Rule 24(vi). </p>
<p><br /><strong>Reasons for Recommendations </strong><br />Rule 24(vi) provides for the access, collection and monitoring of information from a computer resource for the purpose of tracing another computer resource which has contravened, or is likely to contravene, provisions of the Act in a manner likely to have an adverse impact on the services provided by the intermediary. Analysis of a computer resource may reveal extremely confidential and important data, the compromise of which may cause losses worth millions. Therefore, the burden of proof for such an intrusion into the privacy of the computer resource, which is first used to track another computer resource that is likely to contravene the Act, should be heavy. Also, this violation of privacy should be weighed against the benefits accruing to the intermediary. The framing of sub-rules under this clause clearly specifying the same is recommended. </p>
<p><br />The disclosure of sensitive information by a monitoring agency for purposes of ‘general trends’ and ‘general analysis of cyber information’ is uncalled for, as it dissipates information among lesser bodies that are not governed by sufficient safeguards, and this could result in an outright violation of citizens' privacy.</p>
<p> </p>
<h2>Manner of Functioning of CERT-In</h2>
<h3>Draft Rules under Section 70B(5)</h3>
<p>Section 70B provides for an Indian Computer Emergency Response Team (CERT-In) which shall serve as a national agency for performing the duties prescribed by clause (4) of this section in accordance with the rules so prescribed.<br />The rules provide for CERT-In’s authority, composition of advisory committee, constituency, functions and responsibilities, services, stakeholders, policies and procedures, modus operandi, disclosure of information and measures to deal with non-compliance with orders so issued. However, there are a few issues which need to be addressed, as under:</p>
<p><br /><strong>Definitions</strong><br />In these Rules, unless the context otherwise requires, “Cyber security incident” means any real or suspected adverse event in relation to cyber security that violates an explicit or implied security policy resulting in unauthorized access, denial of service/ disruption, unauthorized use of a computer resource for processing or storage of information or changes to data, information without authorization.</p>
<p><br /><strong>Recommendation #9</strong><br />The words ‘or implied’ must be excluded from Rule 2(g), which defines ‘cyber security incident’, and the term ‘security policy’ must be qualified to state which security policy is being referred to.</p>
<p><br /><strong>Reasons for Recommendation</strong><br />“Cyber security incident” means any real or suspected adverse event in relation to cyber security that violates an explicit or implied security policy resulting in unauthorized access, denial of service/disruption, unauthorized use of a computer resource for processing or storage of information or changes to data, information without authorization. </p>
<p><br />Thus, the section defines any circumstance where an explicit or implied security policy is contravened as a ‘cyber security incident’. Without clearly stating what the security policy is, an inquiry into its contravention is against an individual’s civil rights. If an individual’s actions are to be restricted for reasons of security, then the restrictions must be expressly defined and such restrictions cannot be said to be implied.</p>
<p><br /><strong>Rule 13(4): Disclosure of Information </strong><br />Save as provided in sub-rules (1), (2), (3) of rule 13, it may be necessary or expedient so to do, for CERT-In to disclose all relevant information to the stakeholders, in the interest of sovereignty or integrity of India, defence of India, security of the State, friendly relations with foreign States or public order or for preventing incitement to the commission of an offence relating to cognizable offence or enhancing cyber security in the country.</p>
<p><br /><strong>Recommendation #10</strong><br />Burden of necessity for disclosure of information should be made heavier. </p>
<p><br /><strong>Reasons for the Recommendation</strong><br />Rule 13(4) allows the disclosure of information by CERT-In in the interests of ‘enhancing cyber security’. This enhancement, however, needs to be weighed against the detriment caused to the individual, and the burden of proof must be on CERT-In to show that this was the only way of achieving the required objective. </p>
<p><br /><strong>Rule 19: Protection for actions taken in Good Faith </strong><br />All actions of CERT-In and its staff acting on behalf of CERT-In are taken in good faith in fulfillment of its mandated roles and functions, in pursuance of the provisions of the Act or any rule, regulations or orders made thereunder. CERT-In and its staff acting on behalf of CERT-In shall not be held responsible for any unintended fallout of their actions.</p>
<p><br /><strong>Recommendation #11</strong><br />CERT-In should be made liable for its negligent actions, and no blanket presumption of good faith should be provided for. </p>
<p><br /><strong>Reasons for the Recommendation </strong><br />Rule 19 provides for the protection of CERT-In members for actions taken in ‘good faith’. It characterises the consequences of such actions as ‘unintended fallouts’. Clearly, if information has been called for and the same is highly confidential, then this rule bars any remedy for a leak of the same due to the negligence of CERT-In members. This is clearly not permissible, as an agency that calls for delicate information should also be held responsible for mishandling the same, intentionally or negligently. Good faith can be established if the need arises, and no presumption as to good faith needs to be provided.</p>
<p> </p>
<h3>Draft Rules under Section 52</h3>
<p>These rules, entitled the “Cyber Appellate Tribunal (Salary, Allowances and Other Terms and Conditions of Service of Chairperson and Members) Rules, 2009” are meant to prescribe the framework for the independent and smooth functioning of the Cyber Appellate Tribunal. This is so because of the specific functions entrusted to this Appellate Tribunal. Under the IT Act, 2000 as amended by the IT (Amendment) Act, 2008, this Tribunal has the power to entertain appeals against orders passed by the adjudicating officer under Section 47.</p>
<p><br /><strong>Recommendation #12</strong><br />Amend the qualifications prescribed in the Information Technology (Qualification and Experience of Adjudicating Officers and Manner of Holding Enquiry) Rules, 2003, to require judicial training and experience.</p>
<p><br /><strong>Reasons for the Recommendation</strong><br />It is submitted that an examination of these rules governing the Appellate Tribunal cannot be made independent of the powers and qualifications of Adjudicating Officers who are the original authority to decide on contravention of provisions in the IT Act dealing with damage to computer system and failure to furnish information. Even as per the Information Technology (Qualification and Experience of Adjudicating Officers and Manner of Holding Enquiry) Rules, 2003, persons who did not possess judicial experience and training, such as those holding the post of Director in the Central Government, were qualified to perform functions under Section 46 and decide whether there has been unauthorized access to a computer system. This involves appreciation of evidence and is not a merely administrative function that could be carried on by any person who has basic knowledge of information technology.</p>
<p>Viewed from this angle, the qualifications of the Cyber Appellate Tribunal members should have been made much tighter in the new draft rules. The above rules, when read with Section 50 of the IT Act, as amended in 2008, say nothing about the qualifications of the technical members apart from the fact that such a person shall not be appointed as a Member unless he is, or has been, in the service of the Central Government or a State Government, and has held the post of Additional Secretary or Joint Secretary or any equivalent post, though special knowledge of, and professional experience in, information technology, telecommunication, industry, management or consumer affairs has been prescribed in the Act as a requirement for any technical member.</p>
<p> </p>
<h3>Draft Rules under Section 54</h3>
<p>These Rules do not suffer from any defect and provide for a fair and reasonable enquiry in so far as allegations made against the Chairperson or the members of the Cyber Appellate Tribunal are concerned.</p>
<p> </p>
<h2>Penal Provisions</h2>
<h3>Section 66A</h3>
<p>Any person who sends, by means of a computer resource or a communication device,<br /> (a) any information that is grossly offensive or has menacing character; or<br /> (b) any information which he knows to be false, but for the purpose of causing annoyance, inconvenience, danger, obstruction, insult, injury, criminal intimidation, enmity, hatred or ill will, persistently by making use of such computer resource or a communication device,<br /> (c) any electronic mail or electronic mail message for the purpose of causing annoyance or inconvenience or to deceive or to mislead the addressee or recipient about the origin of such messages,<br />shall be punishable with imprisonment for a term which may extend to three years and with fine.<br />Sec. 32 of the 2008 Act inserts Sec. 66A which provides for penal measures for mala fide use of electronic resources to send information detrimental to the receiver. For the section to be attracted, the ‘information’ needs to be grossly offensive or menacing, or, in the case of clause (b), known by the sender to be false.</p>
<p>While the intention of the section (to prevent activities such as spam-sending) might be sound and even desirable, there is still a strong argument to be made that the use of words such as ‘annoyance’ and ‘inconvenience’ (in s.66A(c)) is highly problematic. Further, something can be grossly offensive without touching upon any of the conditions laid down in Article 19(2). Without satisfying the conditions of Article 19(2), this provision would be ultra vires the Constitution.</p>
<p><br /><strong>Recommendation #13</strong><br />The section should be amended and words which lead to ambiguity must be excluded.</p>
<p><br /><strong>Reasons for the Recommendation </strong><br />What exactly could convey ‘ill will’ or cause annoyance in electronic form needs to be clarified. In some electronic forms, it is possible for the receiver to already know the content of the information. In such circumstances, if that possibility is ignored and annoyance does occur, is the sender still liable? Keeping in mind the complexity of electronic modes of transmitting information, it can be said that several such conditions arise which the section covers only vaguely. Therefore, a stricter and more clinical approach is necessary. </p>
<p><br /><strong>Recommendation #14</strong><br />A proviso should be inserted to this section providing for specific exceptions to the offence contained in this section for reasons such as fair comment, truth, criticism of actions of public officials etc. </p>
<p> </p>
<p><strong>Reasons for the Recommendation </strong><br />The major problem with Section 66A lies in clause (c) as per which any electronic mail or electronic mail message sent with the purpose of causing annoyance or inconvenience is covered within the ambit of offensive messages. This does not pay heed to the fact that even a valid and true criticism of the actions of an individual, when brought to his notice, can amount to annoyance. Indeed, it may be brought to his attention with the sole purpose of causing annoyance to him. When interpreting the Information Technology Act, it is to be kept in mind that the offences created under this Act should not go beyond those prescribed in the Indian Penal Code except where there is a wholly new activity or conduct, such as hacking for instance, which is sought to be criminalized.</p>
<p>Offensive messages have been criminalized in the Indian Penal Code subject to the conditions specified in Chapter XXII being present. It is not an offence to verbally insult or annoy someone without anything more being done such as a threat to commit an offence, etc. When this is the case with verbal communications, there is no reason to make an exception for those made through the electronic medium and bring any electronic mail or message sent with the purpose of causing annoyance or inconvenience within the purview of an offensive message.</p>
<p> </p>
<h3>Section 66F</h3>
<p>The definition of cyber-terrorism under this provision is too wide and can cover several activities which are not actually of a “terrorist” character. <br />Section 66F(1)(B) is particularly harsh and goes much beyond acts of “terrorism” to include various other activities within its purview. As per this provision, <br />“[w]hoever knowingly or intentionally penetrates or accesses a computer resource without authorisation or exceeding authorised access, and by means of such conduct obtains access to information, data or computer database that is restricted for reasons of the security of the State or foreign relations, or any restricted information, data or computer database, with reasons to believe that such information, data or computer database so obtained may be used to cause or is likely to cause injury to the interests of the sovereignty and integrity of India, the security of the State, friendly relations with foreign States, public order, decency or morality, or in relation to contempt of court, defamation or incitement to an offence, or to the advantage of any foreign nation, group of individuals or otherwise, commits the offence of cyber terrorism.”</p>
<p>This provision suffers from several defects and hence ought to be repealed. </p>
<p><br /><strong>Recommendation #15</strong><br />Section 66F(1)(B) has to be repealed or suitably amended to water down the excessively harsh operation of this provision. The restrictive nature of the information that is unauthorisedly accessed must be confined to those that are restricted on grounds of security of the State or foreign relations. The use to which such information may be put should again be confined to injury to the interests of the sovereignty and integrity of India, the security of the State, friendly relations with foreign States, or public order. A mere advantage to a foreign nation cannot render the act of unauthorized access one of cyber-terrorism as long as such advantage is not injurious or harmful in any manner to the interests of the sovereignty and integrity of India, the security of the State, friendly relations with foreign States, or public order. A mens rea requirement should also be introduced whereby mere knowledge that the information which is unauthorisedly accessed can be put to such uses as given in this provision should not suffice for the unauthorised access to amount to cyber-terrorism. The unauthorised access should be with the intention to put such information to this use. The amended provision would read as follows:</p>
<p class="callout">“[w]hoever knowingly or intentionally penetrates or accesses a computer resource without authorisation or exceeding authorised access, and by means of such conduct obtains access to information, data or computer database that is restricted for reasons of the security of the State or foreign relations, with the intention that such information, data or computer database so obtained may be used to cause injury to the interests of the sovereignty and integrity of India, the security of the State, friendly relations with foreign States, or public order, commits the offence of cyber terrorism.”</p>
<p class="callout"> </p>
<p><strong>Reasons for the Recommendation </strong><br />The ambit of this provision goes much beyond information, data or computer database which is restricted only on grounds of security of the State or foreign relations and extends to “any restricted information, data or computer database”. This expression covers any government file which is marked as confidential or saved in a computer used exclusively by the government. It also covers any file saved in a computer exclusively used by a private corporation or enterprise. Even the use to which such information can be put need not be confined to those that cause or are likely to cause injury to the interests of the sovereignty and integrity of India, the security of the State, or friendly relations with foreign States. Information or data which is defamatory, amounting to contempt of court, or against decency / morality, are all covered within the scope of this provision. This goes way beyond the idea of a terrorist activity and poses serious questions. While there is no one globally accepted definition of cyberterrorism, it is tough to conceive of slander as a terrorist activity.</p>
<p>To give an illustration, if a journalist managed to unauthorisedly break into a restricted database, even one owned by a private corporation, and stumbled upon information that is defamatory in character, he would have committed an act of “cyber-terrorism.” Various kinds of information pertaining to corruption in the judiciary may be precluded from being unauthorisedly accessed on the ground that such information may be put to use for committing contempt of court. Any person who gains such access would again qualify as a cyber-terrorist. There are numerous factual situations in which this provision can be put to gross misuse with the ulterior motive of muzzling dissent or freezing access to information that may be restricted in nature but nonetheless has a bearing on probity in public life. It is therefore imperative that this provision be toned down as recommended above. <br /><br /></p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/comments-draft-rules'>https://cis-india.org/internet-governance/blog/comments-draft-rules</a>
</p>
pranesh | IT Act, Encryption, Intellectual Property Rights, Intermediary Liability, Publications, Censorship | 2011-09-21 | Blog Entry

Primer on the New IT Act
https://cis-india.org/internet-governance/blog/primer-it-act
<b>With this draft information bulletin, we briefly discuss some of the problems with the Information Technology Act, and invite your comments.</b>
<p align="justify">The latest amendments to
the Information Technology Act 2000, passed in December 2008 by the
Lok Sabha, and the draft rules framed under it contain several provisions
that can be abused and misused to infringe seriously on citizens'
fundamental rights and basic civil liberties. We have already <a href="https://cis-india.org/internet-governance/it-act/short-note-on-amendment-act-2008" class="internal-link" title="Short note on IT Amendment Act, 2008">written about some of the problems</a> with this Act earlier. With this information bulletin, drafted by Chennai-based advocate Ananth Padmanabhan, we wish to extend that analysis into the form of a citizens' dialogue highlighting ways in which the Act and the rules under it fail. Thus, we invite your comments, suggestions, and queries, as this is very much a work in progress. We will eventually consolidate this dialogue and follow up with the government on the concerns of its citizens.</p>
<h3 align="justify">Intermediaries
beware</h3>
<p align="justify">Internet service
providers, webhosting service providers, search engines, online
payment sites, online auction sites, online market places, and cyber
cafes are all examples of “intermediaries” under this Act. The
Government can force any of these intermediaries to cooperate with
any interception, monitoring or decryption of data by stating broad
and ambiguous reasons such as the “interest of the sovereignty or
integrity of India”, “defence of India”, “security of the
State”, “friendly relations with foreign States”, “public
order” or for “preventing incitement to” or “investigating”
the commission of offences related to those. This power can be abused
to infringe on the privacy of intermediaries as well as to hamper
their constitutional right to conduct their business without interference.</p>
<p align="justify">If a Google search on
“Osama Bin Laden” throws up an article that claims to have
discovered his place of hiding, the Government of India can issue a
direction authorizing the police to monitor Google’s servers to
find the source of this information. While Google can, of course,
establish that this information cannot be attributed directly to the
organization, making the search unwarranted, that would not help it
much. While section 69 grants the government these wide-ranging
powers, it does not provide for adequate safeguards in the form of having to show due cause or having an in-built right of appeal against a decision by the government. If Google refused
to cooperate under such circumstances, its directors would be liable
to imprisonment of up to seven years.</p>
<h3 align="justify">Pre-censorship<br /></h3>
<p align="justify">The State has been given
unbridled power to block access to websites as long as such blocking
is deemed to be in the interest of sovereignty and integrity of
India, defence of India, security of the State, friendly relations
with foreign States, and other such matters.</p>
<p align="justify">Thus, if a web portal or
blog carries or expresses views critical of the Indo-US nuclear deal,
the government can block access to the website and thus muzzle criticism
of its policies. While some may find that suggestion outlandish, it is very much possible under the Act. Since there is no right to be heard before your website is taken down nor is there an in-built mechanism for the website owner to appeal, the decisions made by the government cannot be questioned unless you are prepared to undertake a costly legal battle. </p>
<p align="justify">Again, if an intermediary (like Blogspot or an ISP like Airtel) refuses to cooperate, its directors may be personally liable to imprisonment for up to a period of seven years. Thus, being personally liable, intermediaries have no incentive to stand up for the freedom of speech and expression.</p>
<h3 align="justify">We need to monitor your computer: you have a virus<br /></h3>
<p align="justify">The government has been
vested with the power to authorize the monitoring and collection of
traffic data and information generated, transmitted, received or
stored in any computer resource. This provision is much too
widely-worded. </p>
<p align="justify">For instance, if the
government feels that there is a virus on your computer that can
spread to another computer, it can demand access to monitor your
e-mails on the ground that such monitoring enhances “cyber
security” and prevents “the spread of computer contaminants”.</p>
<h3 align="justify">Think before you click "Send"<br /></h3>
<p align="justify">If out of anger you send
an e-mail for the purpose of causing “annoyance” or
“inconvenience”, you may be liable for imprisonment up to three
years along with a fine. While that provision (section 66A(c)) was
meant to combat spam and phishing attacks, it criminalizes much more
than it should.</p>
<h3 align="justify">A new brand of "cyber terrorists" <br /></h3>
<p align="justify">The new offence of “cyber
terrorism” has been introduced, which is so badly worded that it
borders on the ludicrous. If a journalist gains
unauthorized access to a computer where information regarding
corruption by certain members of the judiciary is stored, she becomes
a “cyber terrorist” as the information may be used to cause
contempt of court. There is no precedent for any such definition of cyberterrorism. It is unclear what definition of terrorism the government is going by when even unauthorized access to defamatory material is considered cyberterrorism.</p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/primer-it-act'>https://cis-india.org/internet-governance/blog/primer-it-act</a>
</p>
pranesh | IT Act, Digital Governance, Public Accountability, Intermediary Liability, Censorship | 2011-08-02 | Blog Entry

Why should we care about takedown timeframes?
https://cis-india.org/internet-governance/blog/why-should-we-care-about-takedown-timeframes
<b>The issue of content takedown timeframes - the time period an intermediary is allotted to respond to a legal takedown order - has received considerably less attention in conversations about intermediary liability. This article examines the importance of framing an appropriate timeframe to ensure that speech online is not over-censored, and offers recommendations to that end.</b>
<p><em>This article first <a class="external-link" href="https://cyberbrics.info/why-should-we-care-about-takedown-timeframes/">appeared</a> on the CyberBRICS website. It has since been <a class="external-link" href="https://www.medianama.com/2020/04/223-content-takedown-timeframes-cyberbrics/">cross-posted</a> to Medianama.</em></p>
<p><em>The findings and opinions expressed in this article are derived from the larger research report 'A deep dive into content takedown timeframes', which can be accessed <a class="external-link" href="https://cis-india.org/internet-governance/files/a-deep-dive-into-content-takedown-frames">here</a>.</em></p>
<p><strong>Introduction</strong></p>
<p>Since the Ministry of Electronics and Information Technology (MeitY) proposed the draft amendments to the intermediary liability guidelines in December 2018, speculation regarding their potential effects has been abundant. This has included <a class="external-link" href="http://www.medianama.com/2020/01/223-traceability-accountability-necessary-intermediary-liability/">mapping</a> the requirement of traceability of originators vis-a-vis its chilling effect on free speech online, and <a class="external-link" href="http://cyberbrics.info/rethinking-the-intermediary-liability-regime-in-india/">critiquing</a> the proactive filtering requirement as potentially leading to censorship.</p>
<p>One aspect, however, that has received less attention is encoded within Rule 3(8) of the draft amendments. By virtue of that rule, the time limit given to intermediaries to respond to a legal content takedown request (“turnaround time”) has been reduced from 36 hours (as it was in the older version of the rules) to 24 hours. In essence, intermediaries, when faced with a takedown order from the government or a court, would now have to remove the content concerned within 24 hours of receipt of the notice.</p>
<p>Why is this important? Consider this: the <a class="external-link" href="http://indiacode.nic.in/bitstream/123456789/1999/3/A2000-21.pdf">definition</a> of an ‘intermediary’ under Indian law encompasses a vast range of entities – cyber cafes, online marketplaces, internet service providers and more. Governing any intermediary liability norms would accordingly require varying levels of regulation, each of which recognizes the different composition of these entities. In light of that, the content takedown requirement, and specifically the turnaround time, becomes problematic. Not only would many of the entities under the definition of intermediaries probably find it impossible to implement this obligation owing to their technical architecture; the obligation also erases the nuances that exist among the entities which would actually fall within its scope. </p>
<p>Each category of online content and, more importantly, each category of intermediary is different, and any content takedown requirement must appreciate these differences. A smaller intermediary may find it more difficult than an incumbent to adhere to a stricter, shorter timeframe. A piece of ‘terrorist’ content may need to be treated with more urgency than something that is defamatory. These contextual cues are critical, and must accordingly be incorporated in any law on content takedown.</p>
<p>While making our submissions on the draft amendments, we found that there was a lack of research from the government’s side justifying the shortened turnaround time, nor was there any literature that focused on turnaround timeframes as a critical point of regulation of intermediary liability. Accordingly, I share some findings from our research in the subsequent sections, which throw light on certain nuances that must be considered before proposing any content takedown timeframe. It is important to note that our research has not yet determined what an appropriate turnaround time would be in a given situation. However, the following findings will hopefully start a preliminary conversation that may ultimately lead us to the right answer.</p>
<p><strong>What to consider when regulating takedown time-frames?</strong></p>
<p>I classify the findings from our research into a chronological sequence: a) broad legal reforms, b) correct identification of scope and extent of the law, c) institution of proper procedural safeguards, and d) post-facto review of the time-frame for evidence based policy-making.</p>
<p><em>1. Broad legal reforms: Harmonize the law on content takedown.</em></p>
<p>The Indian law on content takedown is administered through two different provisions under the Information Technology (IT) Act, each with its own legal procedure and scope. While the 24-hour turnaround time would be applicable to the procedure under one of them, there would continue to <a class="external-link" href="http://cis-india.org/internet-governance/resources/information-technology-procedure-and-safeguards-for-blocking-for-access-of-information-by-public-rules-2009">exist</a> a completely different legal procedure under which the government could still effectuate content takedown. Under the latter, intermediaries are given a 48-hour timeframe to respond to a government request with clarifications (if any).</p>
<p>Such differing procedures contribute to a confusing legal ecosystem surrounding content takedown, leading to arbitrary ways in which Indian users experience internet censorship. Accordingly, it is important to harmonize the existing law so that procedures and safeguards are seamless, and the regulatory process of content takedown is streamlined.</p>
<p><em>2. Correct identification of scope and extent of the law: Design a liability framework on the basis of the differences in the intermediaries, and the content in question.</em></p>
<p>As I have highlighted before, regulation of illegal content online cannot be <a class="external-link" href="https://blog.mozilla.org/netpolicy/2018/07/11/sustainable-policy-solutions-for-illegal-content/">one-size-fits-all</a>. Accordingly, a good law on content takedown must account for the nuances existing in the way intermediaries operate and the diversity of speech online. More specifically, there are two levels of classification that are critical.</p>
<p><em>One</em>, the law must make a fundamental classification between the intermediaries within the scope of the law. An obligation to remove illegal content can be implemented only by those entities whose technical architecture allows them to. While a search engine would be able to delink websites that are declared ‘illegal’, it would be absurd to expect a cyber cafe to follow a similar route of responding to a legal takedown order within a specified timeframe.</p>
<p>Therefore, one basis of classification must incorporate this difference in the technical architecture of these intermediaries. Apart from this, the law must also design liability for intermediaries on the basis of their user-base, annual revenue generated, and the reach, scope and potential impact of the intermediary’s actions.</p>
<p><em>Two,</em> it is important that the law recognizes that certain types of content require more urgent treatment than others. Several regulations across jurisdictions, including the NetzDG and the EU Regulation on Preventing the Dissemination of Terrorist Content Online, while problematic on their own counts, attempt either to limit their scope of application or to frame liability based on the nature of the content targeted.</p>
<p>The Indian law, on the other hand, encompasses within its scope a vast and varied array of ‘illegal’ content, which includes, on one hand, critical items like threats to ‘the sovereignty and integrity of India’ and, on the other, more subjective speech elements like ‘decency or morality’. While an expedited timeframe may be permissible for the former category of speech, it is difficult to justify the same for the latter. More contextual judgment may be needed to assess the legality of content that is alleged to be defamatory or obscene, which makes a shorter timeframe problematic for such content.</p>
<p><em>3. Institution of proper procedural safeguards: Make notices mandatory and make sanctions gradated</em>.</p>
<p>Apart from the correct identification of scope and extent, it is important that there are sufficient procedural safeguards to ensure that the interests of the intermediaries and the users are not curtailed. While these may seem ancillary to the main point, how the law chooses to legislate on these issues (or does not), nevertheless has a direct bearing on the issue of content takedown and time-frames.</p>
<p>Firstly, while the Indian law mandates content takedown, it does not mandate a process through which a user is notified of such an action being taken. The mere fact that an incumbent intermediary is able to respond to removal notifications within a specified timeframe does not imply that its actions would not have ramifications for free speech. The ability to take down content does not translate into the accuracy of the action taken, and the Indian law fails to take this into account.</p>
<p>Therefore, an additional obligation to inform users when their content has been taken down institutes due process in the procedure. In the context of legal takedowns, such notice mechanisms also <a class="external-link" href="http://www.eff.org/wp/who-has-your-back-2019">empower</a> users to draw attention to government censorship and targeting.</p>
<p>Secondly, a uniform compliance timeframe, coupled with severe sanctions, skews competition against smaller intermediaries. While the current law does not clearly elaborate upon the nature of sanctions that would be imposed, general principles of the doctrine of safe harbour dictate that upon failure to remove the content, the intermediary would be subject to the same level of liability as the person uploading the content. This threat of sanctions may have adverse effects on free speech online, resulting in potential <a class="external-link" href="http://cis-india.org/internet-governance/intermediary-liability-in-india.pdf">over-censorship</a> of legitimate speech.</p>
<p>Accordingly, sanctions should be restricted to instances of systematic violations. For critical content, the contours of what constitutes systematic violation may differ. The regulator must accordingly take into account the nature of content which the intermediary failed to remove, while assessing their liability.</p>
<p><em>4. Post-facto review of the time-frame for evidence based policy-making: Mandate transparency reporting.</em></p>
<p>Transparency reporting, apart from ensuring accountability of intermediary action, is also a useful tool for understanding the impact of the law, specifically in relation to the time period of response. The NetzDG, for all its criticism, has received <a class="external-link" href="https://www.article19.org/wp-content/uploads/2017/09/170901-Legal-Analysis-German-NetzDG-Act.pdf">support</a> for requiring intermediaries to produce bi-annual transparency reports. These reports provide important insight into the efficacy of any proposed turnaround time, which in turn helps us propose more nuanced reforms to the law.</p>
<p>However, to draw the most useful information from these reports, it is important that reporting practices are standardized. There exists an international body of work proposing methodologies for standardizing transparency reports, including the Santa Clara Principles and the Electronic Frontier Foundation’s (EFF) ‘Who has your back?’ reports. We have also previously proposed a methodology that utilizes some of these pointers.</p>
<p>Additionally, given the experimental nature of the provision, including a review provision in the law would ensure that the efficacy of the exercise can be periodically assessed. If the discussion in the preceding section is any indication, the issue of an appropriate turnaround time is currently in a regulatory flux, with no correct answer. In such a scenario, periodic assessments compel policymakers and stakeholders to discuss the effectiveness of solutions and the nature of the problems faced, leading to <a class="external-link" href="http://www.livemint.com/Opinion/svjUfdqWwbbeeVzRjFNkUK/Making-laws-with-sunset-clauses.html">evidence-based</a> policymaking.</p>
<p><strong>Why should we care?</strong></p>
<p>There is a lot at stake in regulating any aspect of intermediary liability, and a lack of smart policy-making may harm the interests of any of the stakeholder groups involved. As the submissions on the draft amendments by various civil society and industry groups show, the updated turnaround time suffers from issues which, if not addressed, may lead to over-removal and a lack of due process in the content removal procedure.</p>
<p>Among others, these submissions pointed out that the shortened timeframe does not allow intermediaries sufficient time to scrutinize a takedown request to ensure that all technical and legal requirements are adhered to. This, in turn, may also prompt third-party action against users. Additionally, the significantly short timeframe raised several implementation challenges. For smaller companies with fewer employees, such a timeframe can be burdensome from both a financial and a capability point of view. This, in turn, may result in over-censorship of speech online.</p>
<p>Failing to recognize and incorporate contextual nuances into any law on intermediary liability therefore, may critically alter the way we interact with online intermediaries, and in a larger scheme, with the internet.</p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/why-should-we-care-about-takedown-timeframes'>https://cis-india.org/internet-governance/blog/why-should-we-care-about-takedown-timeframes</a>
</p>
TorShark | Content takedown, Intermediary Liability, Chilling Effect | 2020-04-10 | Blog Entry

Chilling Effects and Frozen Words
https://cis-india.org/internet-governance/chilling-effects-frozen-words
<b>What if the real danger is not that we lose our freedom of speech and expression but our sense of humour as a nation? Lawrence Liang's op-ed was published in the Hindu on April 30, 2012. </b>
<p>While freedom of speech and expression is an individual right, its actualisation often relies on a vast infrastructure of intermediaries.</p>
<p>In the offline world, this includes newspapers, television channels, public auditoriums, etc. It is often assumed that the internet has created a more robust public sphere of speech by doing away with many structural barriers to free speech. But the fact of the matter is that even if the internet enables a shift from a ‘few to many' to a ‘many to many' model of communication, intermediaries continue to remain important players in facilitating free speech. Can one imagine free speech on the internet being the same without Twitter, social networks or YouTube?</p>
<p>One way of thinking of the infrastructure of communication is in terms of ecology, and in the ecology of speech — as in the environment — an adverse impact on any component threatens the well-being of all. The idea of cyberspace as a commons is a much cherished myth, and in the early days of the internet we were perhaps given a glimpse into its utopian possibility. But we would be deluding ourselves if we believed that the problems that plague free speech in the offline world (including ownership of the avenues of speech) are absent in cyberspace. Recall that in recent times one of the most effective ways in which various governments retaliated against the leaking of official secrets on WikiLeaks was by freezing Julian Assange's PayPal account.</p>
<h3>Direct & Indirect Controls</h3>
<p>It may be useful to distinguish between direct controls on free speech and indirect or structural controls on free speech. India has had a long history of battling direct and indirect controls on free speech and with a few exceptions the interests of the press have often coincided with the interests of a robust public sphere of debate and criticism.</p>
<p>In the late 1950s and early 1960s, a number of large media houses battled restrictions imposed on the press by way of control of the number of pages of a newspaper, regulation of the size of advertisements and the price of imported newsprint. On the face of it, some of these restrictions may have seemed like commercial disputes but the Supreme Court rightly recognised that indirect controls could adversely impact the individual's right to express himself or herself as well as to receive information freely.</p>
<p>In the online context, there has also been a similar recognition of the role of intermediaries in providing platforms of speech and it is with this view in mind that a number of countries have incorporated safe harbour provisions in their information technology laws.</p>
<p>Section 79 of the Information Technology Act is one such safe harbour provision in India which provides that intermediaries shall not be liable for any third party action if they are able to prove that the offence or contravention was committed without their knowledge or that they had exercised due diligence to prevent the commission of such offence or contravention. But this safe harbour has effectively been undone with the passing of the Information Technology (Intermediaries guidelines) Rules, 2011.</p>
<p>The rules clarify what standard of due diligence has to be met by intermediaries and Sec. 3(2) of the rules obliges intermediaries to have rules and conditions of usage which ensure that users do not host, display, upload, modify, publish, transmit, update or share any information that is in contravention of the Section. This includes the all too familiar ones (defamatory, obscene, pornographic content) but also a whole host of new categories which could be invoked to restrict speech (“grossly harmful,” “blasphemous,” “harassing,” “hateful”).</p>
<p>As is well known, any restriction on speech in India has to comply with the test of reasonableness under Article 19(2) of the Constitution, as well as ensure that the grounds of censorship are located within 19(2). Even though there are laws regulating hate speech in India, blasphemy is not a category under Art. 19(2) and has hitherto not been a part of Indian law. Some of the other categories, such as “grossly harmful,” suggest that the people who drafted the rules took a constitutional nap at the drafting board.</p>
<p>Sec. 3(4) of the rules provides that any intermediary who receives a notice from an aggrieved person about any violation of sub-rule (2) must act within 36 hours and, where applicable, ensure that the information is disabled. In the event that it fails to act or respond, the intermediary cannot claim exemption from liability under Sec. 79 of the IT Act. It is worth noting that most intermediaries receive hundreds to thousands of requests from individuals on a daily basis asking for the removal of objectionable material. The Centre for Internet and Society conducted a “sting operation” to determine whether the criteria, procedure and safeguards for administration of takedowns as prescribed by the Rules lead to a chilling effect on free expression.</p>
<p>In the course of the study, frivolous takedown notices were sent to seven intermediaries and their responses to the notices were documented. Different policy factors were permuted in the takedown notices in order to understand at what points in the takedown process free expression is being chilled. The takedown notices sent by the researcher were intentionally defective: they did not establish that the sender was an interested party, did not specifically identify and discuss any individual URL on the websites, did not present any cause of action, and did not suggest any legal injury. Of the seven intermediaries to which takedown notices were sent, six over-complied with the notices, despite the apparent flaws in them.</p>
<h3>Caution</h3>
<p>Even in cases where the intermediaries challenged the validity of the takedowns, they erred on the side of caution and took down the material. While a number of intermediaries would see themselves as allies in the fight against censorship, more often than not intermediaries are also large commercial organisations whose primary concern is the protection of their business interests. In the face of any potential legal threat, especially from the government, they prefer to err on the side of caution. The people whose content was removed were not told, nor was the general public informed that the content was removed.</p>
<p>The procedural flaws (subjective determination, absence of the right to be heard, the short response time) coupled with the vague grounds on which such takedowns can be claimed, clearly point to a highly flawed situation in which we will see many more trigger happy demands for offending materials to be taken down.</p>
<p>We have already slipped into being a republic of over-sensitivity where any politician, religious group or individual can claim their sentiments have been hurt or that they have been portrayed disparagingly, as evidenced by the recent attack on and subsequent arrest of Professor Ambikesh Mahapatra of Jadavpur University for posting cartoons lampooning Mamata Banerjee.</p>
<h3>Nervous State</h3>
<p>In the era of global outsourcing it was inevitable that the state censorship machinery would also learn a lesson or two from the global trends and what better way of ensuring censorship than outsourcing it to individuals and to corporations. The renowned anthropologist, Michael Taussig, once compared the state to a nervous system and it seems that the Intermediary rules live up to the expectations of a nervous state ever ready to respond to criticism and disparaging cartoons.</p>
<p>What if the real danger is not even that we lose our freedom of speech and expression but we lose our sense of humour as a nation?</p>
<p>The evident flaws of the rules have been acknowledged even by lawmakers, with P. Rajeeve, the CPI(M) M.P., introducing a motion for the annulment of the rules. The annulment motion is going to be debated in the coming weeks and one hopes that the parliamentarians will seriously reconsider the rules in their current form.</p>
<p>When faced with conundrums of the present it is always useful to turn to history and there is reason to believe that while censorship has a very respectable genealogy in Indian thought, it has also been accompanied in equal measure by a tradition of the right to offend.</p>
<p>In his delightful reading of the <em>Arthashastra</em>, Sibaji Bandyopadhay alerts us to the myriad restrictions that existed to control Kusilavas (the term for entertainers which included actors, dancers, singers, storytellers, minstrels and clowns). These regulations ranged from the regulation of their movement during monsoon to prohibitions placed on them, ensuring that they shall not “praise anyone excessively nor receive excessive presents”. While some of the regulations appear harsh and unwarranted, Bandyopadhay says that in contrast to Plato's <em>Republic</em>, which banished poets altogether from the ideal republic, the <em>Arthashastra</em> goes so far as to grant to Kusilavas what we could now call the right to offend. Verse 4.1.61 of the <em>Arthashastra</em> says, “In their performances, [the entertainers] may, if they so wish, make fun of the customs of regions, castes or families and the practices or love affairs (of individuals)”. One hopes that our lawmakers, even if they are averse to reading the Indian Constitution, will be slightly more open to the poetic licence granted by Kautilya.</p>
<p><a class="external-link" href="http://www.thehindu.com/opinion/lead/article3367917.ece?homepage=true">Click</a> for the original published in the Hindu on April 30, 2012. Lawrence Liang is a lawyer and researcher based at Alternative Law Forum, Bangalore. He can be contacted at <a class="external-link" href="mailto:lawrence@altlawforum.org">lawrence@altlawforum.org</a></p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/chilling-effects-frozen-words'>https://cis-india.org/internet-governance/chilling-effects-frozen-words</a>
</p>
No publisher · Lawrence Liang · Freedom of Speech and Expression · Public Accountability · Internet Governance · Intermediary Liability · Censorship · 2012-04-30T07:32:17Z · Blog Entry
You Have the Right to Remain Silent
https://cis-india.org/internet-governance/blog/down-to-earth-july-17-2013-nishant-shah-you-have-the-right-to-remain-silent
<b>Reflecting upon the state of freedom of speech and expression in India, in the wake of the shutdown of the political satire website narendramodiplans.com.</b>
<hr />
<p style="text-align: justify; ">Nishant Shah's <a class="external-link" href="http://www.downtoearth.org.in/content/you-have-right-remain-silent">column was published in Down to Earth</a> on July 17, 2013.</p>
<hr />
<p style="text-align: justify; ">It took less than a day for narendramodiplans.com, a political satire website that drew more than 60,000 hits in the 20 hours of its existence, to be taken down. It was a simple webpage that showed a smiling picture of Narendra Modi, the touted candidate for India’s next prime ministerial campaign, flashing his now-trademark ‘V’ for <span><s>Vengeance</s> </span> Victory sign. At first glimpse it looked like another smart media campaign by the net-savvy minister, who has already made use of the social web quite effectively to connect with his constituencies and influence the younger voting population in the country. Below the image of Mr. Modi was a text that said, "For a detailed explanation of how Mr. Narendra Modi plans to run the nation if elected to the house as a Prime Minister and also for his view/perspective on 2002 riots please click the link below." The button, reminiscent of 'sale' signs on shops that offer permanent discounts, promised to reveal, once and for all, the puppy plight of Mr. Modi's politics and his plans for the country that he seeks to lead.</p>
<p style="text-align: justify; ">However, when one tried to click on the button, hoping at least for a manifesto that combined the powers of Machiavelli with the sinister beauty of Kafka, it proved an impossible task. The button wiggled, and jiggled, and slithered all over the page, running away from the mouse that followed it. Referencing the layers of evasive answers and engineered public relations campaigns that try to obfuscate the answers to some of the most pointed questions posed to the Modi government through judicial and public forums, the button never stayed still long enough to actually reveal the promised answers. People familiar with the history of such political satire and protest online would immediately recognise that this wasn’t the most original of ideas. In fact, it was borrowed from another website - <a href="http://www.thepmlnvision.com/" title="http://www.thepmlnvision.com/">http://www.thepmlnvision.com/</a> - that levelled similar accusations of lack of transparency and accountability against Nawaz Sharif of Pakistan. Another instance, which has also since been shut down, used a similar deployment: a webpage that claimed to give a comprehensive view of Rahul Gandhi’s achievements, to question his proclaimed intention of being the next prime minister. In short, this is an internet meme, where a simple web page and a bit of JavaScript allow for a critical commentary on the coming elections and the strengthening battle between #feku and #pappu that has already assumed epic proportions on Twitter.</p>
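As a technical aside, the runaway-button trick this meme relies on takes only a few lines of JavaScript. The sketch below is illustrative, not the site's actual code: the function name <code>evade</code> and the 120-pixel distance are invented for the example. The core of it is a small function that, on every mouse move, repositions the button away from the cursor.

```javascript
// Illustrative sketch of the "runaway button" logic (not the original site's code).
// Given the cursor and the button's current top-left position, return a new
// position at least `minDist` pixels from the cursor, clamped to the viewport.
function evade(cursor, button, viewport, minDist) {
  const dx = button.x - cursor.x;
  const dy = button.y - cursor.y;
  const dist = Math.hypot(dx, dy);
  if (dist >= minDist) return button; // far enough away: stay put
  // Push the button directly away from the cursor to the minimum distance.
  const scale = dist === 0 ? 1 : minDist / dist;
  const nx = dist === 0 ? cursor.x + minDist : cursor.x + dx * scale;
  const ny = dist === 0 ? cursor.y : cursor.y + dy * scale;
  return {
    x: Math.min(Math.max(nx, 0), viewport.w),
    y: Math.min(Math.max(ny, 0), viewport.h),
  };
}

// In a browser, this would be wired to the mousemove event, roughly:
// let pos = { x: 200, y: 200 };
// document.addEventListener('mousemove', (e) => {
//   pos = evade({ x: e.clientX, y: e.clientY }, pos,
//               { w: innerWidth, h: innerHeight }, 120);
//   btn.style.left = pos.x + 'px';
//   btn.style.top = pos.y + 'px';
// });
```

Because the button always jumps away before the cursor can reach it, the promised "detailed explanation" can never be clicked — which is the whole satirical point.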
<p style="text-align: justify; ">The early demise of these two websites (please do note, when you click on the links, that the Nawaz Sharif website is still working) warns us of the tightening noose around freedom of speech and expression that politicos in India are responsible for. It has been a dreary couple of years already, with the passing of the <a href="http://www.downtoearth.org.in/content/cis-india.org/internet-governance/intermediary-liability-in-india" target="_blank">Intermediaries Liabilities Rules</a> under the IT Act of India, <a href="http://www.indianexpress.com/news/spy-in-the-web/888509/1" target="_blank">Dr. Sibal proposing to pre-censor the social web</a> in a quest to save the face of erring political figures, <a href="http://www.indianexpress.com/news/two-girls-arrested-for-facebook-post-questioning-bal-thackeray-shutdown-of-mumbai-get-bail/1033177/" target="_blank">teenagers being arrested for voicing political dissent</a>, and <a href="http://en.wikipedia.org/wiki/Aseem_Trivedi" target="_blank">artists being prosecuted</a> for exercising their rights to question the state of governance in our country. Despite battles to keep the web an open space that embodies the democratic potential and the constitutional rights of freedom of speech and expression in the country, it has been a losing fight against the ad hoc and dictatorial mandates that seem to govern the web.</p>
<table class="invisible">
<tbody>
<tr>
<th><img src="https://cis-india.org/home-images/Namo.png" alt="Narendra Modi Plans" class="image-inline" title="Narendra Modi Plans" /></th>
</tr>
<tr>
<td>Above is a screenshot from the narendramodiplans.com website</td>
</tr>
</tbody>
</table>
<p style="text-align: justify; ">We have no indication of why this latest piece of satirical expression, which should be granted immunity as a work of art, if not as an individual’s right to free speech, was suddenly taken down. The website now carries a message that says, “I quit. In a country with freedom of speech, I assumed that I was allowed to make decent satire on any politician more particularly if it is constructive. Clearly, I was wrong.” The web is already abuzz with conspiracy theories, each sounding scarier than the last because they seem so plausible and possible in a country that has easily sacrificed our right to free speech and expression at the altar of political egos. And whether you subscribe to any of the theories or not, whether your sympathies lie with the BJP or with the UPA, whether or not you approve of the political direction the country seems to be headed in, there is no doubt that you should be as agitated as I am about the fact that we are in a fast car to blanket censorship, and we are going there in style.</p>
<p style="text-align: justify; ">What happens online is not just about this one website or one person or one political party – it is a reflection of the rising surveillance and bully state that presumes that making voices (and sometimes people) invisible is enough to resolve the problems they create. And what happens on the web will soon also affect the ways in which we live our everyday lives. So the next time you call some friends over for dinner and sit arguing about the state of politics in the country, make sure your windows are all shut, you are wearing tin-foil hats and, if possible, direct all conversations to the task of finally <a href="http://bollywoodjournalist.com/2013/07/08/desperately-seeking-mamta-kulkarni/" target="_blank">finding Mamta Kulkarni</a>. Because anything else that you say might either be censored or land you in a soup, and the only recourse you might have would be a website that shows the glorious political figures of the country, with a sign that says “To defend your right to free speech and expression, please click here”. And you know that you are never going to be able to click on that sign. Ever.</p>
No publisher · nishant · Freedom of Speech and Expression · Social Media · Internet Governance · Intermediary Liability · 2013-07-22T06:59:53Z · Blog Entry
Role of Intermediaries in Countering Online Abuse
https://cis-india.org/internet-governance/blog/role-of-intermediaries-in-counting-online-abuse
<b>The Internet can be a hostile space and protecting users from abuse without curtailing freedom of expression requires a balancing act on the part of online intermediaries.</b>
<p style="text-align: justify; ">This got published as two blog entries in the NALSAR Law Tech Blog. Part 1 can be accessed <a class="external-link" href="https://techlawforum.wordpress.com/2015/06/30/role-of-intermediaries-in-countering-online-abuse-still-a-work-in-progress-part-i/">here</a> and Part 2 <a class="external-link" href="https://techlawforum.wordpress.com/2015/06/30/role-of-intermediaries-in-countering-online-abuse-still-a-work-in-progress-part-ii/">here</a>.</p>
<hr />
<p style="text-align: justify; ">As platforms and services coalesce around user-generated content (UGC) and entrench themselves in the digital publishing universe, they are increasingly taking on the duties and responsibilities of protecting rights, including taking reasonable measures to restrict unlawful speech. Arguments about the role of intermediaries in tackling unlawful content usually centre on the issue of regulation—when is it feasible to regulate speech, and how best should this regulation be enforced?</p>
<p class="Standard" style="text-align: justify; ">Recently, Twitter found itself at the periphery of such questions when an anonymous user of the platform, @LutyensInsider, began posting slanderous and sexually explicit comments about Swati Chaturvedi, a Delhi-based journalist. The online spat, which began in February last year, culminated in <a href="http://www.dailyo.in/politics/twitter-trolls-swati-chaturvedi-lutyensinsider-presstitutes-bazaru-media-delhi-police/story/1/4300.html">Swati filing an FIR</a> against the anonymous user last week. Within hours of the FIR, the anonymous user deleted the tweets and went silent. Predictably, Twitter users <a href="https://twitter.com/bainjal/status/609343547796426752">hailed this</a> as a much-needed deterrent to online harassment. Swati’s personal victory is worth celebrating; it is an encouragement to the many women bullied daily on the Internet, where harassment is rampant. However, while Swati might be well within her legal rights to counter slander, the rights and liabilities of private companies in such circumstances are often not as clear-cut.</p>
<p class="Standard" style="text-align: justify; ">Should platforms like Twitter take on the mantle of deciding what speech is permissible? When and how should the limits on speech be drawn? Does this amount to private censorship? The answers are not easy and, as the recent Grand Chamber of the European Court of Human Rights (ECtHR) <a href="http://hudoc.echr.coe.int/sites/eng/pages/search.aspx?i=001-126635">judgment in the case of</a> Delfi AS v. Estonia confirms, the role of UGC platforms in balancing user rights is an issue far from settled. In its ruling, the ECtHR reasoned that, given their role in facilitating expression, requiring online platforms <i>“to take effective measures to limit the dissemination of hate speech and speech inciting violence”</i> was not “private censorship”.</p>
<p class="Standard" style="text-align: justify; ">This is problematic because the decision moves the regime away from a framework that grants immunity from liability as long as platforms meet certain criteria and procedures. In <a href="http://www.jipitec.eu/issues/jipitec-5-3-2014/4091">other words</a>, the ruling establishes strict liability for intermediaries in relation to manifestly illegal content, even where they have no knowledge of it. The 'obligation' placed on the intermediary does not grant them safe harbour and is not proportionate to the monitoring and blocking capacity it necessitates. Consequently, platforms might be incentivized to err on the side of caution and restrict comments or confine speech, resulting in censorship. The ruling is especially worrying, as the standard of care placed on the intermediary does not recognize the different roles intermediaries play in the detection and removal of unlawful content. Further, intermediary liability is its own legal regime and, at the same time, a subset of various legal issues that require an understanding of variations in scenarios, mediums and technology, both globally and in India.</p>
<h3 class="Standard">Law and Short of IT</h3>
<p class="Standard" style="text-align: justify; ">Earlier this year, in a <a href="http://www.theverge.com/2015/2/4/7982099/twitter-ceo-sent-memo-taking-personal-responsibility-for-the">leaked memo</a>, the Twitter CEO Dick Costolo took personal responsibility for his platform's chronic failure to deal with harassment and abuse. In Swati's case, Twitter did not intervene or take steps to address the harassment. If it had, Twitter (India), like all online intermediaries, would have been bound by the provisions established under Section 79 of the Information Technology Act and its accompanying Rules. These provisions outline the obligations and conditions that intermediaries must fulfill to claim immunity from liability for third-party content. Under the regime, upon receiving actual knowledge of unlawful information on its platform, the intermediary must comply with the notice and takedown (NTD) procedure for blocking and removal of content.</p>
<p class="Standard" style="text-align: justify; ">Private complainants could invoke the NTD procedure, forcing intermediaries to act as adjudicators of an unlawful act—a role they are clearly ill-equipped to perform, especially when the content relates to political speech or alleged defamation or obscenity. The SC judgment in Shreya Singhal, addressing this issue, read down the provision (Section 79) by holding that a takedown notice can only be effected if the complainant secures a court order to support her allegation. Further, it was held that the scope of restrictions under the mechanism is limited to the specific categories identified under Article 19(2). Effectively, this means Twitter need not take down content in the absence of a court order.</p>
<h3 class="Standard">Content Policy as Due Diligence</h3>
<p class="Standard" style="text-align: justify; ">Another provision, Rule 3(2), prescribes a content policy which, prior to the Shreya Singhal judgment, was a criterion for administering takedowns. This content policy includes an exhaustive list of types of restricted expression, though worryingly, the terms included in it are not clearly defined and go beyond the reasonable restrictions envisioned under Article 19(2). Terms such as “grossly harmful”, “objectionable”, “harassing”, “disparaging” and “hateful” are not defined anywhere in the Rules; they are subjective and contestable, as alternative interpretations and standards could be offered for the same term. Further, this content policy is not applicable to content created by the intermediary itself.</p>
<p class="Standard" style="text-align: justify; ">Prior to the SC verdict in Shreya Singhal, <a href="http://cis-india.org/internet-governance/blog/sc-judgment-in-shreya-singhal-what-it-means-for-intermediary-liability">actual knowledge could have been interpreted</a> to mean that the intermediary is called upon to exercise its own judgement under sub-rule (4) to restrict impugned content in order to seek exemption from liability. While the liability accruing from non-compliance with takedown requests under the content policy was once clear, this is no longer the case. By reading down S. 79(3)(b), the court has placed limits both on the private censorship of intermediaries and on the invisible censorship of opaque government takedown requests, as both must now adhere to the boundaries set by Article 19(2). Following the SC judgment, intermediaries do not have to administer takedowns without a court order, thereby rendering this content policy redundant. As it stands, the content policy is an obligation that intermediaries must fulfill in order to be exempted from liability for UGC, and this due diligence is limited to publishing rules and regulations, terms and conditions or a user agreement informing users of the restrictions on content. The penalties for not publishing this content policy should be clarified.</p>
<p class="Standard" style="text-align: justify; ">Further, having been informed of what is permissible, users agree to comply with the policy outlined by signing up to and using these platforms and services. The requirement of publishing a content policy as due diligence is unnecessary, given that mandating such ‘standard’ terms of use negates the differences between types of intermediaries, which accrue different kinds of liability. It also places an extraordinary power of censorship in the hands of the intermediary, which could easily stifle freedom of speech online. Such heavy-handed regulation could make it impossible to publish critical views about anything without the risk of being summarily censored.</p>
<p class="Standard">Twitter may have complied with its duties by publishing the content policy, though the obligation does not seem to be an effective deterrent. Strong safe harbour provisions for intermediaries are a crucial element in the promotion and protection of the right to freedom of expression online. Absolving platforms of responsibility for UGC as long as they publish a content policy that is vague and subjective is the very reason why India’s IT Rules are, in fact, in urgent need of improvement.</p>
<h3 class="Standard">Size Matters</h3>
<p class="Standard" style="text-align: justify; ">The standards for blocking, reporting and responding to abuse vary across different categories of platforms. For example, it may be easier to counter trolls and abuse on blogs or forums where the owner or an administrator is monitoring comments and UGC. Usually, platforms outline monitoring and reporting policies and procedures, including the recourse available to victims and the action to be taken against violators. However, these measures are not always effective in curbing abuse, as it is possible for users to create new accounts under different usernames. For example, in Swati’s case the anonymous user behind the @LutyensInsider account changed <a href="http://www.hindustantimes.com/newdelhi/twitter-troll-lutyensinsider-changes-handle-after-delhi-journo-files-fir/article1-1357281.aspx">their handle</a> to @gregoryzackim and @gzackim before deleting all tweets. In this case, perhaps the fear of criminal charges was enough to silence the anonymous user, which may not always be the case.</p>
<h3 class="Standard">Tackling the Trolls</h3>
<p class="Standard" style="text-align: justify; ">Most large intermediaries have privacy settings which restrict the audience for user posts and prevent strangers from contacting them, as a general measure against online harassment. Platforms also publish a <a href="http://www.slate.com/articles/technology/bitwise/2015/04/twitter_s_new_abuse_policy_if_it_can_t_stop_it_hide_it.html">monitoring policy</a> outlining the procedures and mechanisms for users to <a href="http://www.slate.com/articles/technology/users/2015/04/twitter_s_new_harassment_policy_not_transparent_not_engaged_with_users.html">register their complaints</a> or <a href="https://blog.twitter.com/2015/update-on-user-safety-features">report abuse</a>. Often, reporting and blocking mechanisms <a href="https://blog.twitter.com/2015/update-on-user-safety-features">rely on community standards</a> and on users reporting unlawful content. Last week Twitter <a href="https://twittercommunity.com/t/removing-the-140-character-limit-from-direct-messages/41348">announced a new feature</a> allowing lists of blocked users to be shared between users. The feature is aimed at making the service safer for people facing similar abuse, and while it improves on both the existing blocking mechanism and standard policies defining permissible limits on content, such efforts have their limitations.</p>
<p class="Standard" style="text-align: justify; ">These mechanisms follow a one-size-fits-all policy, and such community-driven efforts do not address differences in opinion and subjectivity. Swati, in defending her actions, stressed the <i>“coarse discourse”</i> prevalent on social media, though as <a href="http://www.opindia.com/2015/06/foul-mouthed-twitter-user-files-fir-against-loud-mouthed-slanderer/">this article points out</a>, she might herself be accused of using offensive and abusive language. Subjectivity and multiple interpretations of the same opinion can pave the way for many taking offense online. Earlier this month, Nikhil Wagle’s tweets criticising Prime Minister Narendra Modi as a “pervert” were interpreted as “abusive”, “offensive” and “spreading religious disharmony”. While platforms are within their rights to establish policies for dealing with issues faced by users, there is a real danger of them doing so for <a href="http://www.slate.com/articles/technology/users/2015/05/chuck_c_johnson_suspended_from_twitter_why.2.html">“political reasons” and based on “popularity” measures</a>, which may chill free speech. When many get behind a particular interpretation of an opinion, lawful speech may also be stifled, as Sreemoyee Kundu <a href="http://www.dailyo.in/user/124/sreemoyeekundu">found out</a>: a victim of online abuse, she had her account blocked by Facebook owing to multiple reports from a <i>“faceless fanatical mob”</i>. Allowing users to set standards of permissible speech is an improvement, though it runs the risk of mob justice, and platforms need to be vigilant in applying such standards.</p>
<p class="Standard" style="text-align: justify; ">While it may be in the interest of platforms to keep a hands-off approach to community policies, certain kinds of content may necessitate intervention by the intermediary. There has been an increase in private companies modifying their content policies to place reasonable restrictions on certain hateful behaviour in order to protect vulnerable or marginalised voices. <a href="http://www.theguardian.com/technology/2015/mar/12/twitter-bans-revenge-porn-in-user-policy-sharpening">Twitter</a> and <a href="http://www.redditblog.com/2015/05/promote-ideas-protect-people.html">Reddit's</a> policy changes in addressing revenge porn reflect a growing understanding amongst stakeholders that, in order to promote the free expression of ideas, recognition and protection of certain rights on the Internet may be necessary. However, any approach to regulating user content must assess the effect of policy decisions on user rights. Google's <a href="http://www.theguardian.com/technology/2015/jun/22/revenge-porn-women-free-speech-abuse">stand on tackling revenge porn</a> may be laudable, though its <a href="https://www.techdirt.com/articles/20141109/06211929087/googles-efforts-to-push-down-piracy-sites-may-lead-more-people-to-malware.shtml">decision to push down</a> 'piracy' sites in its search results could be seen to adversely impact the choice that users have. Terms of service implemented with subjectivity and a lack of transparency can and do lead to private censorship.</p>
<h3 class="Standard">The Way Forward</h3>
<p class="Standard" style="text-align: justify; ">Harassment is damaging because of the feeling of powerlessness it invokes in victims, and online intermediaries represent new forms of power through which users negotiate and manage their online identities. Content restriction policies and practices must address this power imbalance by adopting baseline safeguards and best practices. It is only fair that, based on principles of equality and justice, intermediaries be held responsible for the damage caused to users by the wrongdoings of other users, or when they fail to carry out their operations and services as prescribed by law. However, in its present state, the intermediary liability regime in India is not sufficient to deal with online harassment and needs to evolve into a more nuanced form of governance.</p>
<p class="Standard" style="text-align: justify; ">Any liability framework must evolve bearing in mind the slippery slope of overbroad regulation and differing standards of community responsibility. A balanced framework would therefore need to include elements of both targeted regulation and softer forms of governance, as liability regimes need to balance fundamental human rights against the interests of private companies. Often, achieving this balance is problematic given that these companies are expected to be adjudicators and may also be the target of the breach of rights, as was the case in Delfi v Estonia. Global frameworks such as the Manila Principles can be a way forward in developing effective mechanisms. The determination of content restriction practices should always adopt the least restrictive means available, distinguishing between classes of intermediary. Such practices must evolve considering the proportionality of the harm, the nature of the content and the impact on affected users, including the proximity of the affected party to the content uploader.</p>
<p class="Standard" style="text-align: justify; ">Further, intermediaries and governments should communicate a clear mechanism for review and appeal of restriction decisions. Content restriction policies should incorporate an effective right to be heard. In exceptional circumstances when this is not possible, a post facto review of the restriction order and its implementation must take place as soon as practicable. Moreover, unlawful content restricted for a limited duration or within a specific geography must not extend beyond these limits, and a periodic review should take place to ensure the continued validity of the restriction. Regular, systematic review of the rules and guidelines governing intermediary liability will go a long way in ensuring that such frameworks are not overly burdensome and remain effective.</p>
No publisher · jyoti · Online Harassment · Internet Governance · Intermediary Liability · Chilling Effect · Online Abuse · 2015-08-02T16:38:36Z · Blog Entry
UN Special Rapporteur Report on Freedom of Expression and the Private Sector: A Significant Step Forward
https://cis-india.org/internet-governance/un-special-rapporteur-report-on-freedom-of-expression-and-the-private-sector-a-significant-step-forward
<b>On 6 June 2016, the UN Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, David Kaye, released a report on the Information and Communications Technology (“ICT”) sector and freedom of expression in the digital age. Vidushi Marda and Pranesh Prakash highlight the most important aspects of the report.</b>
<h2 dir="ltr">Background</h2>
<p dir="ltr">Today, the private sector is more closely linked to the freedom of expression than it has ever been before. The ability to speak to a mass audience was at one time a privilege restricted to those who had access to mass media. However, with digital technologies, that privilege is available to far more people than was ever possible in the pre-digital era. As private content created on these digital networks is becoming increasingly subject to state regulation, it is crucial to examine the role of the private sector in respect of the freedom of speech and expression.</p>
<p dir="ltr">The first foray by the Special Rapporteur into this broad area has resulted in a sweeping report that covers almost every aspect of freedom of expression within the ICT sector except competition, which we will elaborate on later in this post.</p>
<h2 dir="ltr">Introduction</h2>
<p dir="ltr">The report aims to “provide guidance on how private actors should protect and promote freedom of expression in a digital age”. It identifies the relevant international legal framework as Article 19 of the <a href="https://treaties.un.org/doc/Publication/UNTS/Volume%20999/volume-999-I-14668-English.pdf">International Covenant on Civil and Political Rights</a>, and Article 19 of the <a href="http://www.un.org/en/udhrbook/pdf/udhr_booklet_en_web.pdf">Universal Declaration of Human Rights</a>. The UN “Protect, Respect and Remedy” Framework and Guiding Principles, also known as the <a href="http://business-humanrights.org/sites/default/files/reports-and-materials/Ruggie-report-7-Apr-2008.pdf">Ruggie Principles</a> provide the framework for private sector responsibilities on business and human rights.</p>
<p dir="ltr">The report categorises the different roles of the private sector in organising, accessing, regulating and populating the internet. This is important because the manner in which the ICT sector affects the freedom of expression is far more complicated than in traditional communication industries. The report identifies the distinct impact on the freedom of expression of internet service providers, hardware and software companies, domain name registries and registrars, search engines, platforms, web hosting services, data brokers and e-commerce facilities.</p>
<h2>Legal and Policy Issues</h2>
<div>The Special Rapporteur discusses four distinct legal and policy issues relevant to this problem statement: Content Regulation, Surveillance and Digital Security, Transparency, and Remedies.</div>
<div> </div>
<h3>Content Regulation</h3>
<p dir="ltr">The report identifies two main channels through which content regulation takes place: the state, and internal processes.</p>
<p>Noting that digital content created on private networks is increasingly subject to State regulation, the report highlights the competing interests of intermediaries who manage platforms and of States which demand regulation of this content on grounds of defamation, blasphemy, protection of national security, and so on. This tension is demonstrated through vague laws that compel individuals and private corporations to over-comply and err on the side of caution “in order to avoid onerous penalties, filtering content of uncertain legal status and engaging in other modes of censorship and self-censorship.” Excessive intermediary liability forces intermediaries to over-comply with requests in order to ensure that local access to their platforms is not blocked. States also attempt to regulate content outside the law through extra-legal restrictions, pushing private actors to take down content on their own initiative. Filtering is another method, wherein States block and filter content through the private sector; government blacklists, illegal-content designations and suspended accounts are among the methods employed, and these have sometimes raised concerns of necessity and proportionality. <a href="http://scroll.in/article/807277/whatsapp-in-kashmir-when-big-brother-wants-to-go-beyond-watching-you">Network or service shutdowns</a> are classified as a “particularly pernicious” method of content regulation. Non-neutral networks are also a method of content regulation, with the possibility of internet service providers throttling traffic. Zero rating is a potential issue, although the report acknowledges that “it remains a subject of debate whether they may be permissible in areas genuinely lacking Internet access”.</p>
<p>The other channel of content regulation is the private sector’s own internal policies and practices. <a href="https://consentofthenetworked.com/author/rebeccamackinnon/">Terms of service</a> restrictions are often tailored to a jurisdiction’s laws and policies and do not always address the needs and interests of vulnerable groups. Further, the report notes, <a href="http://www.catchnews.com/tech-news/facebook-free-basics-gatekeeping-powers-extend-to-manipulating-public-discourse-1452077063.html">design and engineering choices</a> about how private players curate content are algorithmically determined and increasingly control the information we consume.</p>
<h3>Transparency</h3>
<div>The report notes that transparency enables entities subject to internet regulation to make informed decisions about their responsibilities and liabilities in the digital sphere, and points out that there is a severe lack of transparency around government requests to restrict or remove content. Some States even prohibit the publication of such information, India being one example. In the private sector, content hosting platforms sometimes reveal the circumstances under which content is removed pursuant to a government request, although such disclosure is erratic. The report recognises the need to balance transparency against competing concerns like security and trade secrecy, a matter of continued debate.</div>
<div> </div>
<h3 dir="ltr">Surveillance and Digital Security</h3>
<p>Freedom of expression concerns arise as data transmitted on private networks is increasingly subjected to surveillance and interference by the State and private actors. The report finds that several internet companies have reported an increase in government requests for customer data and user information. According to the Special Rapporteur, effective resistance strategies include incorporating human rights guarantees, interpreting government requests restrictively, and negotiating with governments. Private players also make surveillance and censorship equipment that enables States to intercept communications. Covert surveillance has been reported previously, with States tapping into communications as and when they see fit; when private entities become aware of interception and covert surveillance, their human rights responsibilities are engaged. As private entities work towards enhancing encryption, anonymity and user security, States respond by <a href="http://www.cnbc.com/2016/03/29/apple-vs-fbi-all-you-need-to-know.html">compelling companies</a> to create loopholes that circumvent such privacy- and security-enhancing technology.</p>
<h3 dir="ltr">Remedies</h3>
<p>Unlawful content removal, opaque account suspensions and data security breaches are commonplace in the digital sphere. The ICCPR guarantees that all persons whose rights have been violated must have an effective remedy, and the Ruggie Principles similarly require corporations to provide remedial and grievance mechanisms. There is some ambiguity about how these complaint or appeal mechanisms should be designed and implemented, and their nature and structure also remain unclear. The report states that it is necessary to investigate the role of the State in supplementing or regulating corporate mechanisms, its role in ensuring that a mechanism for remedies exists, and its responsibility to ensure that more easily and financially accessible remedial alternatives are available.<br /><br /></p>
<h2> Special Rapporteur’s priorities for future work and thematic developments</h2>
<ol><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">Investigating laws, policies and extralegal measures that equip governments to impose restrictions on the provision of telecommunications and internet services. Examining the responsibility of companies to respond in a way that respects human rights, mitigates harm, and provides avenues for redress.</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">Evaluating content restrictions under terms of service and community standards. Private actors face substantial pressure from governments and individuals to restrict expression, and a priority is to evaluate the interplay of private and state actions on freedom of expression in light of human rights obligations and responsibilities.</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">Focusing on the legitimacy of rationales for intermediary liability for content hosting, restrictions, conditions for removing third party content.</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">Exploring censorship and surveillance within the human rights framework, and encouraging greater scrutiny before using these technologies for purposes that undermine the freedom of expression.</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">Identifying ways to balance an increasing scope of freedom of expression with the need to address governmental interests in national security and public order.</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">Internet access - Future work will explore issues around access and private sector engagement and investment in ensuring affordability and accessibility, particularly considering marginalized groups.</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">Internet governance - Ensuring that internet governance frameworks and reform efforts are sensitive to the needs of women, sexual minorities and other vulnerable communities. Throughout this future work, the Special Rapporteur will pay particular attention to legal developments (legislative, regulatory, and judicial) at national and regional levels.</p>
</li></ol>
<div> </div>
<h2>Conclusions and Recommendations</h2>
<ol><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">States: The report recommends that States not pressurise the private sector into interfering with the freedom of speech and expression in a manner that fails the principles of necessity and proportionality. Any request to take down content or access customer information must be based on validly enacted law, be subject to oversight, and be demonstrated to be a necessary and proportionate means of achieving the aims laid down in Article 19(3) of the ICCPR.</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">Private Actors: The Special Rapporteur recommends that private actors develop and implement transparent human rights assessment procedures, and develop policies keeping in mind their human rights impact. Apart from this, private entities should integrate commitments to the freedom of expression into internal processes and ensure the “greatest possible transparency”.</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">International Organisations: The report recommends that organisations make resources and educational material on internet governance publicly accessible. The Special Rapporteur also recommends encouraging meaningful civil society participation in multi-stakeholder policy making and standard setting processes, with an increased focus on sensitivity to human rights.</p>
</li></ol>
<div> </div>
<h2>CIS Comments</h2>
<ol><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">CIS strongly agrees with the expansion of the Special Rapporteur’s scope that this report represents: he is no longer looking solely at States, but at the private sector too.</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">CIS also notes that competition is an important aspect of the freedom of expression, but is not discussed in this report. Viable alternatives to platforms, networks, internet service providers, etc. would ensure a healthy, competitive marketplace, and would have a positive impact in resolving the issues identified above.</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">Our <a href="http://cis-india.org/internet-governance/intermediary-liability-in-india.pdf/view">work</a> has called for maintaining a balanced approach to liability of intermediaries for their users’ actions, since excessive liability or strict liability would lead to over-caution and removal of legitimate speech, while having no liability at all would make it difficult to act effectively against harmful speech, e.g., revenge porn.</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr"><a href="http://cis-india.org/internet-governance/blog/cis-position-on-net-neutrality">CIS’ work</a> on network neutrality has highlighted the importance of neutrality for freedom of speech, and has advocated for an evidence-based approach that ensures there is neither under-regulation nor over-regulation. The Special Rapporteur suggests that ‘Zero-Rating’ practices always violate Net Neutrality, but most definitions of Net Neutrality proposed by academics and followed by regulators across the world do not cover Zero-Rating. Similarly, he suggests that the main exception for Zero-Rating is areas genuinely lacking access to the Internet, whereas the potential for some forms of Zero-Rating to further the freedom of expression, especially of minorities, even in areas with Internet access, provides sufficient reason for the issue to merit greater debate.</p>
</li></ol>
<div> </div>
<div> </div>
<div>(Pranesh Prakash was invited by the Special Rapporteur to provide his views, and took part in a meeting that contributed to this report.)</div>
<div> </div>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/un-special-rapporteur-report-on-freedom-of-expression-and-the-private-sector-a-significant-step-forward'>https://cis-india.org/internet-governance/un-special-rapporteur-report-on-freedom-of-expression-and-the-private-sector-a-significant-step-forward</a>
</p>
<div>Published: 2016-06-08 | Author: vidushi | Tags: Freedom of Speech and Expression, Internet Governance, UNHRC, Digital Media, Intermediary Liability, ICT | Blog Entry</div>