The Centre for Internet and Society
https://cis-india.org
Right to Exclusion, Government Spaces, and Speech
https://cis-india.org/internet-governance/blog/right-to-exclusion-government-spaces-and-speech
<b>The conclusion of the litigation surrounding Trump blocking his critics on Twitter brings to the forefront two less-discussed aspects of intermediary liability: a) whether social media platforms could be compelled to ‘carry’ speech under any established legal principles, thereby limiting their right to exclude users or speech, and b) whether users have a constitutional right to access the social media spaces of elected officials. This essay analyzes these issues under American law, and draws parallels for India in light of the ongoing litigation around the suspension of advocate Sanjay Hegde’s Twitter account.</b>
<p> </p>
<p>This article first appeared on the Indian Journal of Law and Technology (IJLT) blog, and can be accessed <a class="external-link" href="https://www.ijlt.in/post/right-to-exclusion-government-controlled-spaces-and-speech">here</a>. Cross-posted with permission. </p>
<p>---</p>
<h2><span class="s1">Introduction</span></h2>
<p class="p2"><span class="s1">On April 8, the Supreme Court of the United States (SCOTUS) vacated the judgment of the US Court of Appeals for the Second Circuit in <a href="https://int.nyt.com/data/documenthelper/1365-trump-twitter-second-circuit-r/c0f4e0701b087dab9b43/optimized/full.pdf%23page=1"><span class="s2"><em>Knight First Amendment Institute v Trump</em></span></a>. In that case, the Court of Appeals had precluded Donald Trump, then-POTUS, from blocking his critics from his Twitter account, on the ground that such action amounted to an erosion of the constitutional rights of his critics. The Court of Appeals had held that his use of @realDonaldTrump in his official capacity had transformed the nature of the account from private to public, and that, therefore, blocking users he disagreed with amounted to viewpoint discrimination, something that was incompatible with the First Amendment.<span class="Apple-converted-space"> </span></span></p>
<p class="p3"><span class="s1"></span></p>
<p class="p2"><span class="s1">The SCOTUS <a href="https://www.supremecourt.gov/opinions/20pdf/20-197_5ie6.pdf"><span class="s2">ordered</span></a> the case to be dismissed as moot, on account of Trump no longer being in office. Justice Clarence Thomas issued a ten-page concurrence that went into additional depth regarding the nature of social media platforms and user rights. It must be noted that the concurrence does not carry any direct precedential weight, since Justice Thomas was not joined by any of his colleagues on the bench for the opinion. However, given that similar questions of public import are currently being deliberated in the ongoing <em>Sanjay Hegde</em> <a href="https://www.barandbench.com/news/litigation/delhi-high-court-sanjay-hegde-challenge-suspension-twitter-account-hearing-july-8"><span class="s2">litigation</span></a> in the Delhi High Court, Justice Thomas’ concurrence might hold some persuasive weight in India. While the facts of these litigations might be starkly different, both are nevertheless characterized by important questions of applying constitutional doctrines to private parties like Twitter and the supposedly ‘public’ nature of social media platforms.<span class="Apple-converted-space"> </span></span></p>
<p class="p3"><span class="s1"></span></p>
<p class="p4"><span class="s1">In this essay, we consider the legal questions raised in the opinion as possible learnings for India. In the first part, we analyze the key points raised by Justice Thomas, vis-a-vis the American legal position on intermediary liability and freedom of speech. In the second part, we apply these deliberations to the <em>Sanjay Hegde </em>litigation, as a case-study and a roadmap for future legal jurisprudence to be developed.<span class="Apple-converted-space"> </span></span></p>
<h2><span class="s1">A flawed analogy</span></h2>
<p class="p2"><span class="s1">At the outset, let us briefly refresh the timeline of Trump’s tryst with Twitter, and the history of this litigation: the Court of Appeals decision was <a href="https://int.nyt.com/data/documenthelper/1365-trump-twitter-second-circuit-r/c0f4e0701b087dab9b43/optimized/full.pdf%23page=1"><span class="s2">issued</span></a> in 2019, when Trump was still in office. After the November 2020 Presidential Election, in which he was voted out, his supporters <a href="https://indianexpress.com/article/explained/us-capitol-hill-siege-explained-7136632/"><span class="s2">broke</span></a> into Capitol Hill. Much of the blame for the attack was pinned on Trump’s use of social media channels (including Twitter) to instigate the violence, and following this, Twitter <a href="https://blog.twitter.com/en_us/topics/company/2020/suspension"><span class="s2">suspended</span></a> his account permanently.<span class="Apple-converted-space"> </span></span></p>
<p class="p3"><span class="s1"></span></p>
<p class="p2"><span class="s1">It is this final fact that Justice Thomas seized upon in his reasoning. He noted that the power of a private party like Twitter to do away with Trump’s account altogether was at odds with the Court of Appeals’ earlier finding about the public nature of the account. He deployed a hotel analogy to justify this: government officials renting a hotel room for a public hearing on regulation could not kick out a dissenter, but if the same officials gather informally in the hotel lounge, then they would be within their rights to ask the hotel to kick out a heckler. The difference between the two situations would be that, <em>“the government controls the space in the first scenario, the hotel, in the latter.” </em>He noted that Twitter’s conduct was similar to the second situation, where it “<em>control(s) the avenues for speech</em>”. Accordingly, he dismissed the idea that the original respondents (the users whose accounts were blocked) had any First Amendment claims against Trump’s initial blocking action, since the ultimate control of the ‘avenue’ was with Twitter, and not Trump.<span class="Apple-converted-space"> </span></span></p>
<p class="p3"><span class="s1"></span></p>
<p class="p4"><span class="s1">On the facts of the case, however, this analogy was not justified. The Court of Appeals had not concerned itself with the question of private ‘control’ of entire social media spaces, and given the timeline of the litigation, it was impossible for it to pre-empt such considerations within the judgment. In fact, the only takeaway from the original decision had been that an elected representative’s utilization of his social media account for official purposes transformed </span><span class="s3">only that particular space</span><span class="s1"><em> </em>into a public forum where constitutional rights would find applicability. In delving into questions of ‘control’ and ‘avenues of speech’, issues that had previously been unexplored, Justice Thomas inflates a rather specific point into a much larger, general conundrum. Further deliberations in the concurrence are accordingly built upon this flawed premise.<span class="Apple-converted-space"> </span></span></p>
<h2><span class="s1">Right to exclusion (and must carry claims)</span></h2>
<p class="p2"><span class="s1">From here, Justice Thomas identified the problem to be “<em>private, concentrated control over online content and platforms available to the public</em>”, and brought forth two alternate regulatory systems — common carrier and public accommodation — to argue for ‘equal access’ to social media spaces. He posited that successful application of either of the two analogies would effectively restrict a social media platform’s right to exclude its users, and “<em>an answer may arise for dissatisfied platform users who would appreciate not being blocked</em>”. Essentially, this would mean that platforms would be obligated to carry <em>all </em>forms of (presumably) legal speech, and users would be entitled to sue platforms when they feel their content has been unfairly taken down, a phenomenon Daphne Keller <a href="http://cyberlaw.stanford.edu/blog/2018/09/why-dc-pundits-must-carry-claims-are-relevant-global-censorship"><span class="s2">describes</span></a> as ‘must carry claims’.<span class="Apple-converted-space"> </span></span></p>
<p class="p3"><span class="s1"></span></p>
<p class="p2"><span class="s1">Again, this is a strange direction for the argument to take, since the original facts of the case were not about ‘<em>dissatisfied platform users’,</em> but about an elected representative’s account being used in the dissemination of official information. Beyond the initial ‘private’ control deliberation, Justice Thomas did not seem interested in exploring this original legal position, and instead emphasized analogizing social media platforms in order to enforce ‘equal access’, finally arriving at a position that would be legally untenable in the USA.<span class="Apple-converted-space"> </span></span></p>
<p class="p3"><span class="s1"></span></p>
<p class="p4"><span class="s1">American law on intermediary liability, as embodied in Section 230 of the Communications Decency Act (CDA), has two key components: first, intermediaries are <a href="https://www.eff.org/issues/cda230"><span class="s2">protected</span></a> against liability for content posted by their users, under a legal model <a href="https://www.article19.org/wp-content/uploads/2018/02/Intermediaries_ENGLISH.pdf"><span class="s2">termed</span></a> ‘broad immunity’, and second, an intermediary does not stand to lose its immunity if it chooses to moderate and remove speech it finds objectionable, popularly <a href="https://intpolicydigest.org/section-230-how-it-actually-works-what-might-change-and-how-that-could-affect-you/"><span class="s2">known</span></a> as the Good Samaritan protection. It is the effect of these two components, combined, that allows platforms to take calls on what to remove and what to keep, translating into a ‘right to exclusion’. Legally compelling them to carry speech under the garb of ‘access’ would therefore strike at the heart of the protection granted by the CDA.<span class="Apple-converted-space"> </span></span></p>
<h2><span class="s1">Learnings for India</span></h2>
<p class="p2"><span class="s1">In his petition to the Delhi High Court, Senior Advocate Sanjay Hegde contended that the suspension of his Twitter account, on the grounds of his sharing anti-authoritarian imagery, was arbitrary and that:<span class="Apple-converted-space"> </span></span></p>
<ol style="list-style-type: lower-alpha;" class="ol1"><li class="li2"><span class="s1">Twitter was carrying out a public function and would be therefore amenable to writ jurisdiction under Article 226 of the Indian Constitution; and</span></li><li class="li2"><span class="s1">The suspension of his account had amounted to a violation of his right to freedom of speech and expression under Article 19(1)(a) and his rights to assembly and association under Article 19(1)(b) and 19(1)(c); and</span></li><li class="li2"><span class="s1">The government has a positive obligation to ensure that any censorship on social media platforms is done in accordance with Article 19(2).<span class="Apple-converted-space"> </span></span></li></ol>
<p class="p3"><span class="s1"></span></p>
<p class="p5"><span class="s1">The first two prongs of the original petition are perhaps easily disputed: as previous <a href="https://indconlawphil.wordpress.com/2020/01/28/guest-post-social-media-public-forums-and-the-freedom-of-speech-ii/"><span class="s2">commentary</span></a> has pointed out, existing Indian constitutional jurisprudence on ‘public function’ does not implicate Twitter, and accordingly, it would be difficult to make out a case that account suspensions, no matter how arbitrary, would amount to a violation of the user’s fundamental rights. It is the third contention that requires some additional insight in the context of our previous discussion.<span class="Apple-converted-space"> </span></span></p>
<h3><span class="s1">Does the Indian legal system support a right to exclusion?<span class="Apple-converted-space"> </span></span></h3>
<p class="p2"><span class="s1">Suing Twitter to reinstate a suspended account, on the ground that such suspension was arbitrary and illegal, is in its essence a request to limit Twitter’s right to exclude its users. The petition serves as an example of a must-carry claim in the Indian context and vindicates Justice Thomas’ (misplaced) defence of ‘<em>dissatisfied platform users</em>’. Legally, such claims perhaps have a better chance of succeeding here, since the expansive protection granted to intermediaries via Section 230 of the CDA is noticeably absent in India. Instead, intermediaries are bound by conditional immunity, where availment of a ‘safe harbour’, i.e., exemption from liability, is contingent on fulfilment of statutory conditions under <a href="https://indiankanoon.org/doc/844026/"><span class="s2">section 79</span></a> of the Information Technology (IT) Act and the rules made thereunder. Interestingly, in his opinion, Justice Thomas had briefly visited a situation where the immunity under Section 230 was made conditional: to gain Good Samaritan protection, platforms might be induced to fulfil specific conditions, including ‘nondiscrimination’. This is controversial (and as commentators have noted, <a href="https://www.lawfareblog.com/justice-thomas-gives-congress-advice-social-media-regulation"><span class="s2">wrong</span></a>), since it has the potential to whittle down the US’ ‘broad immunity’ model of intermediary liability to a system that would resemble the Indian one.<span class="Apple-converted-space"> </span></span></p>
<p class="p3"><span class="s1"></span></p>
<p class="p2"><span class="s1">It is worth noting that in the newly issued Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, the proviso to Rule 3(1)(d) allows for “<em>the removal or disabling of access to any information, data or communication link [...] under clause (b) on a voluntary basis, or on the basis of grievances received under sub-rule (2) [...]</em>” without dilution of statutory immunity. This does provide intermediaries a right to exclude, albeit a limited one, since its scope is restricted to content removed under the operation of specific sub-clauses within the rules, as opposed to Section 230, which is couched in more general terms. Of course, none of this precludes the government from further prescribing obligations similar to those prayed for in the petition.<span class="Apple-converted-space"> </span></span></p>
<p class="p3"><span class="s1"></span></p>
<p class="p2"><span class="s1">On the other hand, it is a difficult proposition to support that Twitter’s right to exclusion should be circumscribed by the Constitution, as prayed. In the petition, this argument is built on the judgment in <a href="https://indiankanoon.org/doc/110813550/"><span class="s2"><em>Shreya Singhal v Union of India</em></span></a>, where it was held that takedowns under section 79 are to be done only on receipt of a court order or a government notification, and that the scope of the order would be restricted to Article 19(2). This, according to the petitioner, meant that “<em>any suo-motu takedown of material by intermediaries must conform to Article 19(2)</em>”.</span></p>
<p class="p3"><span class="s1"></span></p>
<p class="p2"><span class="s1">To understand why this argument does not work, it is important to consider the context in which the <em>Shreya Singhal </em>judgment was issued. Previously, intermediary liability was governed by the Information Technology (Intermediaries Guidelines) Rules, 2011, issued under section 79 of the IT Act. Rule 3(4) made provisions for sending takedown orders to the intermediary, and the prerogative to send such orders lay with ‘<em>an affected person</em>’. On receipt of these orders, the intermediary was bound to remove content, and neither the intermediary nor the user whose content was being censored had the opportunity to dispute the takedown.</span></p>
<p class="p3"><span class="s1"></span></p>
<p class="p2"><span class="s1">As a result, the potential for misuse was wide open. Rishabh Dara’s <a href="https://cis-india.org/internet-governance/intermediary-liability-in-india.pdf"><span class="s2">research</span></a> provided empirical evidence for this: intermediaries were found to act on flawed takedown orders out of apprehension of being sanctioned under the law, essentially chilling free expression online. The <em>Shreya Singhal</em> judgment, in essence, reined in this misuse by stating that an intermediary is legally obliged to act <em>only when </em>a takedown order is sent by the government or the court. The intent of this was, in the court’s words: “<em>it would be very difficult for intermediaries [...] to act when millions of requests are made and the intermediary is then to judge as to which of such requests are legitimate and which are not.</em>”<span class="Apple-converted-space"> </span></span></p>
<p class="p3"><span class="s1"></span></p>
<p class="p5"><span class="s1">In light of this, if Hegde’s petition succeeds, it would mean that intermediaries would now be obligated to subsume the entirety of Article 19(2) jurisprudence in their decision-making, interpret and apply it perfectly, and be open to petitions from users when they fail to do so. This would be a startling undoing of the court’s original intent in <em>Shreya Singhal</em>. Such a reading also means limiting an intermediary’s prerogative to remove speech that may not necessarily fall within the scope of Article 19(2), but is still systemically problematic, including unsolicited commercial communications. Further, most platforms today are dealing with an unprecedented spread and consumption of harmful, misleading information. By limiting their right to exclude speech in this manner, we might be <a href="https://www.hoover.org/sites/default/files/research/docs/who-do-you-sue-state-and-platform-hybrid-power-over-online-speech_0.pdf"><span class="s2">exacerbating</span></a> this problem. <span class="Apple-converted-space"> </span></span></p>
<h3><span class="s1">Government-controlled spaces on social media platforms</span></h3>
<p class="p2"><span class="s1">On the other hand, the original finding of the Court of Appeals, regarding the public nature of an elected representative’s social media account and the First Amendment rights of the people to access such an account, might yet prove instructive for India. While the primary SCOTUS order erases the precedential weight of the original case, similar judgments have been issued by other courts in the USA, including by the <a href="https://globalfreedomofexpression.columbia.edu/cases/davison-v-randall/"><span class="s2">Fourth Circuit</span></a> court and as a result of a <a href="https://knightcolumbia.org/content/texas-attorney-general-unblocks-twitter-critics-in-knight-institute-v-paxton"><span class="s2">lawsuit</span></a> against the Texas Attorney General.<span class="Apple-converted-space"> </span></span></p>
<p class="p3"><span class="s1"></span></p>
<p class="p4"><span class="s1">A similar situation can be envisaged in India as well. The Supreme Court has <a href="https://indiankanoon.org/doc/591481/"><span class="s2">repeatedly</span></a> <a href="https://indiankanoon.org/doc/27775458/"><span class="s2">held</span></a> that Article 19(1)(a) encompasses not just the right to disseminate information, but also the right to <em>receive </em>information, including <a href="https://indiankanoon.org/doc/438670/"><span class="s2">receiving</span></a> information on matters of public concern. Additionally, in <a href="https://indiankanoon.org/doc/539407/"><span class="s2"><em>Secretary, Ministry of Information and Broadcasting v Cricket Association of Bengal</em></span></a>, the Court had held that the right of dissemination included the right of communication through any media: print, electronic or audio-visual. If, then, we assume that government-controlled spaces on social media platforms, used in the dissemination of official functions, are ‘public spaces’, the government’s denial of public access to such spaces can be construed as a violation of Article 19(1)(a).<span class="Apple-converted-space"> </span></span></p>
<h2><span class="s1">Conclusion</span></h2>
<p class="p2"><span class="s1">As indicated earlier, despite the facts of the two litigations being different, the legal questions embodied within converge startlingly, inasmuch as both are examples of the growing discontent around the power wielded by social media platforms, and of the flawed attempts at fixing it.<span class="Apple-converted-space"> </span></span></p>
<p class="p3"><span class="s1"></span></p>
<p class="p2"><span class="s1">While the above discussion might throw some light on the relationship between an individual, the state and social media platforms, many questions continue to remain unanswered. For instance, once we establish that users have a fundamental right to access certain spaces within a social media platform, does the platform have a right to remove that space altogether? If it does so, can a constitutional remedy be sought against the platform? Initial <a href="https://indconlawphil.wordpress.com/2018/07/01/guest-post-social-media-public-forums-and-the-freedom-of-speech/"><span class="s2">commentary</span></a> on the Court of Appeals’ decision had contended that the takeaway from that judgment was that constitutional norms had primacy over the platform’s own norms of governance. In that light, would the platform be constitutionally obligated to <em>not </em>suspend a government account, even if the content on such an account continues to be harmful, in violation of its own moderation standards?<span class="Apple-converted-space"> </span></span></p>
<p class="p3"><span class="s1"></span></p>
<p class="p2"><span class="s1">This is an incredibly tricky dimension of the law, made trickier still by the dynamic nature of the platforms, the intense political interests permeating the need for governance, and the impacts on users in the instance of a flawed solution. Continuous engagement, scholarship and emphasis on having a human rights-respecting framework underpinning the regulatory system, are the only ways forward.<span class="Apple-converted-space"> </span></span></p>
<p class="p2"><span class="s1"><span class="Apple-converted-space"><br /></span></span></p>
<p class="p2"><span class="s1"><span class="Apple-converted-space">---</span></span></p>
<p class="p2"><span class="s1"><span class="Apple-converted-space"><br /></span></span></p>
<p class="p2"><span class="s1"><span class="Apple-converted-space"></span></span></p>
<p>The author would like to thank Gurshabad Grover and Arindrajit Basu for reviewing this piece. </p>
<div> </div>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/right-to-exclusion-government-spaces-and-speech'>https://cis-india.org/internet-governance/blog/right-to-exclusion-government-spaces-and-speech</a>
</p>
Blog Entry | 2021-07-02 | Topics: Freedom of Speech and Expression, Intermediary Liability, Information Technology

Intermediary Liability in India: Chilling Effects on Free Expression on the Internet 2011
https://cis-india.org/internet-governance/intermediary-liability-in-india
<b>Intermediaries are widely recognised as essential cogs in the wheel of exercising the right to freedom of expression on the Internet. Most major jurisdictions around the world have introduced legislations for limiting intermediary liability in order to ensure that this wheel does not stop spinning. With the 2008 amendment of the Information Technology Act 2000, India joined the bandwagon and established a ‘notice and takedown’ regime for limiting intermediary liability.</b>
<p>On the 11th of April 2011, the Government of India notified the Information Technology (Intermediaries Guidelines) Rules 2011 that prescribe, amongst other things, guidelines for administration of takedowns by intermediaries. The Rules have been criticised extensively by both national and international media. The media has projected that the Rules, contrary to the objective of promoting free expression, seem to encourage privately administered injunctions to censor and chill free expression. On the other hand, the Government has responded through press releases and assured that the Rules in their current form do not violate the principle of freedom of expression or allow the government to regulate content.</p>
<p>This study has been conducted with the objective of determining whether the criteria, procedure and safeguards for administration of takedowns as prescribed by the Rules lead to a chilling effect on online free expression. In the course of the study, takedown notices were sent to a sample comprising 7 prominent intermediaries, and their responses to the notices were documented. Different policy factors were permuted in the takedown notices in order to understand at what points in the takedown process free expression is being chilled.</p>
<p>The results of the paper clearly demonstrate that the Rules indeed have a chilling effect on free expression. Specifically, the Rules create uncertainty in the criteria and procedure for administering the takedown thereby inducing the intermediaries to err on the side of caution and over-comply with takedown notices in order to limit their liability and as a result suppress legitimate expressions. Additionally, the Rules do not establish sufficient safeguards to prevent misuse and abuse of the takedown process to suppress legitimate expressions.</p>
<p>Of the 7 intermediaries to which takedown notices were sent, 6 intermediaries over-complied with the notices, despite the apparent flaws in them. From the responses to the takedown notices, it can be reasonably presumed that not all intermediaries have sufficient legal competence or resources to deliberate on the legality of an expression. Even if such an intermediary has sufficient legal competence, it tends to prioritise the allocation of its legal resources according to the commercial importance of the impugned expressions. Further, when such subjective determination is required to be made within a limited timeframe and in the absence of adequate facts and circumstances, the intermediary mechanically (without application of mind or proper judgement) complies with the takedown notice.</p>
<p>The results also demonstrate that the Rules are procedurally flawed, as they ignore all elements of natural justice. The third party provider of information whose expression is censored is not informed about the takedown, let alone given an opportunity to be heard before or after the takedown. There is also no recourse to have the removed information put back or restored. The intermediary is under no obligation to provide a reasoned decision for rejecting or accepting a takedown notice. The Rules in their current form clearly tilt the takedown mechanism in favour of the complainant and adversely against the creator of the expression.</p>
<table class="plain">
<tbody>
<tr>
<td>The research highlights the need to:<br />
<ul><li>increase the safeguards against misuse of the privately administered takedown regime;</li>
<li>reduce the uncertainty in the criteria for administering the takedown;</li>
<li>reduce the uncertainty in the procedure for administering the takedown;</li>
<li>include various elements of natural justice in the procedure for administering the takedown; and</li>
<li>replace the requirement for subjective legal determination by intermediaries with an objective test.</li></ul>
</td>
</tr>
</tbody>
</table>
<hr />
This executive summary is a research output of the Google Policy Fellowship 2011. The Centre for Internet & Society was the host organization. For the entire paper along with references, please write to <a class="external-link" href="mailto:rishabhdara@gmail.com">rishabhdara@gmail.com</a> or <a class="external-link" href="mailto:sunil@cis-india.org">sunil@cis-india.org</a>.
<p>
For more details visit <a href='https://cis-india.org/internet-governance/intermediary-liability-in-india'>https://cis-india.org/internet-governance/intermediary-liability-in-india</a>
</p>
Rishabh Dara | Blog Entry | 2012-04-21 | Topics: Freedom of Speech and Expression, Internet Governance, Intermediary Liability, Censorship

An Evidence based Intermediary Liability Policy Framework: Workshop at IGF
https://cis-india.org/internet-governance/blog/igf-workshop-an-evidence-based-intermediary-liability-policy-framework
<b>CIS is organising a workshop at the Internet Governance Forum 2014. The workshop will be an opportunity to present and discuss ongoing research on the changing definition of intermediaries and their responsibilities across jurisdictions and technologies and contribute to a comprehensible framework for liability that is consistent with the capacity of the intermediary and with international human-rights standards.</b>
<p style="text-align: justify; ">The Centre for Internet and Society, India and the Centre for Internet and Society, Stanford Law School, USA, will be organising a workshop to analyse the role of intermediary platforms in relation to freedom of expression, freedom of information and freedom of association at the Internet Governance Forum 2014. <span>The aim of the workshop is to highlight the increasing importance of digital rights and broad legal protections of stakeholders in an increasingly knowledge-based economy. The workshop will discuss public policy issues associated with Internet intermediaries, in particular their roles, legal responsibilities and related liability limitations in the context of the evolving nature and role of intermediaries in the Internet ecosystem.</span></p>
<p style="text-align: justify; "><b>Online Intermediaries: Setting the context</b></p>
<p style="text-align: justify; ">The Internet has facilitated unprecedented access to information and amplified avenues for expression and engagement by removing the limits of geographic boundaries and enabling diverse sources of information and online communities to coexist. Against the backdrop of a broadening base of users, the role of intermediaries that enable economic, social and political interactions between users in a globally networked communication system is ubiquitous. Intermediaries are essential to the functioning of the Internet, as many producers and consumers of content on the internet rely on the actions of some third party — the so-called intermediary. Such intermediation ranges from the mere provision of connectivity to more advanced services, such as providing online storage spaces for data, acting as platforms for the storage and sharing of user generated content (UGC), or platforms that provide links to other internet content.</p>
<p style="text-align: justify; ">Online intermediaries enhance economic activity by reducing costs, inducing competition by lowering the barriers for participation in the knowledge economy and fuelling innovation through their contribution to the wider ICT sector as well as through their key role in operating and maintaining Internet infrastructure to meet the network capacity demands of new applications and of an expanding base of users.</p>
<p style="text-align: justify; ">Intermediary platforms also provide social benefits, by empowering users and improving choice through social and participative networks, or web services that enable creativity and collaboration amongst individuals. By enabling platforms for self-expression and cooperation, intermediaries also play a critical role in establishing digital trust, protection of human rights such as freedom of speech and expression, privacy and upholding fundamental values such as freedom and democracy.</p>
<p style="text-align: justify; ">However, the economic and social benefits of online intermediaries are conditional on a framework that protects intermediaries from legal liability for the communication and distribution of the content they enable.</p>
<p style="text-align: justify; "><b>Intermediary Liability</b></p>
<p style="text-align: justify; ">Over the last decade, rights holders, service providers and Internet users have been locked in a debate over the potential liability of online intermediaries. The debate has raised global concerns over the extent to which Internet intermediaries should be held responsible for content produced by third parties using their Internet infrastructure, and over how the resultant liability would affect online innovation and the free flow of knowledge in the information economy.</p>
<p style="text-align: justify; ">Given the impact of their services on communications, intermediaries find themselves either directly liable for their own actions or indirectly (or “secondarily”) liable for the actions of their users. Requiring intermediaries to monitor the legality of online content poses an insurmountable task. Even if monitoring content against all applicable legislation were possible, the costs of doing so would be prohibitively high. Placing liability on intermediaries can therefore deter their willingness and ability to provide services, hindering the development of the internet itself.</p>
<p style="text-align: justify; ">The economics of intermediaries depend on scale: the cost of evaluating the legality of an individual post exceeds the profit from hosting that speech, and in the absence of judicial oversight this can lead to a regime of private censorship. Intermediaries that are liable for content, or that face legal exposure, have powerful incentives to police content and limit user activity to protect themselves. The result is the curtailment of legitimate expression, especially where the obligations relating to, and the definition of, illegal content are vague. Content-policing mandates also impose significant compliance costs, limiting the innovation and competitiveness of such platforms.</p>
<p style="text-align: justify; ">More importantly, placing liability on intermediaries has a chilling effect on freedom of expression online. Gatekeeping obligations on service providers threaten democratic participation and the expression of views online, limiting the potential of individuals and restricting freedoms. Imposing liability can also indirectly lead to the death of anonymity and pseudonymity, pervasive surveillance of users' activities and extensive collection of users' data, and would ultimately undermine digital trust between stakeholders.</p>
<p style="text-align: justify; ">In effect, imposing liability on intermediaries chills Internet activity and speech, creates new barriers to innovation and stifles the Internet's potential to promote broader economic and social gains. To avoid these outcomes, legislators have defined 'safe harbours' that limit the liability of intermediaries under specific circumstances.</p>
<p style="text-align: justify; ">Online intermediaries do not have direct control over what information is exchanged via their platforms, and may not be aware of illegal content at all. Limited liability regimes, a key framework for online intermediaries, provide exceptions from liability rules for third-party content in order to address this asymmetry of information between content producers and intermediaries.</p>
<p style="text-align: justify; ">It is important to note, however, that significant differences exist concerning the subjects of these limitations, the scope of their provisions, and their procedures and modes of operation. 'Notice and takedown' procedures are at the heart of the safe harbour model and can be subdivided into two approaches:</p>
<p style="text-align: justify; ">a. A vertical approach, where the liability regime applies to specific types of content, exemplified by the US Digital Millennium Copyright Act (DMCA)</p>
<p style="text-align: justify; ">b. A horizontal approach, based on the EU E-Commerce Directive (ECD), where different levels of immunity are granted depending on the type of activity at issue</p>
<p style="text-align: justify; "><b>Current framework </b></p>
<p style="text-align: justify; ">Globally, three broad but distinct models of liability for intermediaries have emerged within the Internet ecosystem:</p>
<p style="text-align: justify; ">1. The strict liability model, under which intermediaries are liable for third-party content, used in countries such as China and Thailand</p>
<p style="text-align: justify; ">2. The safe harbour model, which grants intermediaries immunity provided they comply with certain requirements</p>
<p style="text-align: justify; ">3. The broad immunity model, which grants intermediaries broad or conditional immunity from liability for third-party content and exempts them from any general requirement to monitor content.</p>
<p style="text-align: justify; ">While the models described above can provide useful guidance for drafting or improving current legislation, they are limited in scope and application because they fail to account for the different roles and functions of intermediaries. Legislators and courts face increasing difficulty in interpreting these regulations and adapting them to a new economic and technical landscape that involves unprecedented levels of user-generated content and new kinds of online intermediaries.</p>
<p style="text-align: justify; ">The nature and role of intermediaries vary considerably across jurisdictions and in relation to social, economic and technical contexts. In addition to this dynamic nature, the different categories of Internet intermediaries are frequently not clear-cut, with actors often playing more than one intermediation role. Several intermediaries offer a variety of products and services and may perform a number of roles; conversely, several different intermediaries may perform the same function. For example, blogs, video services and social media platforms are all considered 'hosts', while search engine providers have been treated as both 'hosts' and 'technical providers'.</p>
<p style="text-align: justify; ">The failure of existing models to recognise that different types of intermediaries perform different functions or roles, and should therefore bear different liability, poses an interesting area for research and global deliberation. Establishing a classification of intermediaries will also help in analysing existing patterns of influence over content, for example where the removal of content by upstream intermediaries results in undue over-blocking.</p>
<p style="text-align: justify; ">Distinguishing intermediaries on the basis of their roles and functions in the Internet ecosystem is critical to ensuring a balanced system of liability and addressing concerns about freedom of expression. Rather than the highly abstracted view of intermediaries as providing a single, unified service of connecting third parties, the definition of intermediaries must expand to reflect the specific role and function they play in relation to users' rights. A successful intermediary liability regime must balance the needs of producers, consumers, affected parties and law enforcement; address the risk of abuse for political or commercial purposes; safeguard human rights; and contribute to the evolution of uniform principles and safeguards.</p>
<p style="text-align: justify; "><b>Towards an evidence based intermediary liability policy framework</b></p>
<p style="text-align: justify; ">This workshop aims to bring together leading representatives from a broad spectrum of stakeholder groups to discuss liability-related issues and ways to enhance Internet users’ trust.</p>
<p style="text-align: justify; ">Questions to address at the panel include:</p>
<p style="text-align: justify; ">1. What are the varying definitions of intermediaries across jurisdictions?</p>
<p style="text-align: justify; ">2. What are the specific roles and functions that allow for classification of intermediaries?</p>
<p style="text-align: justify; ">3. How can we ensure the legal framework keeps pace with technological advances and the changing roles of intermediaries?</p>
<p style="text-align: justify; ">4. What are the gaps in existing models in balancing innovation, economic growth and human rights?</p>
<p style="text-align: justify; ">5. What could be the respective role of law and industry self-regulation in enhancing trust?</p>
<p style="text-align: justify; ">6. How can we enhance multi-stakeholder cooperation in this space?</p>
<p style="text-align: justify; ">Confirmed Panel:</p>
<p style="text-align: justify; ">Technical Community: Malcolm Hutty: Internet Service Providers Association (ISPA)<br />Civil Society: Gabrielle Guillemin: ARTICLE 19<br />Academic: Nicolo Zingales: Assistant Professor of Law at Tilburg University<br />Intergovernmental: Rebecca MacKinnon: Consent of the Networked, UNESCO project<br />Civil Society: Anriette Esterhuysen: Association for Progressive Communications (APC)<br />Civil Society: Francisco Vera: Advocacy Director: Derechos Digitales<br />Private Sector: Titi Akinsanmi: Policy and Government Relations Manager, Google Sub-Saharan Africa<br />Legal: Martin Husovec: Max Planck Institute</p>
<p style="text-align: justify; "><span>Moderator(s): </span><span>Giancarlo Frosio, Centre for Internet and Society (CIS) and </span><span>Jeremy Malcolm, Electronic Frontier Foundation </span></p>
<p style="text-align: justify; "><span><span>Remote Moderator: </span><span>Anubha Sinha, New Delhi</span></span></p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/igf-workshop-an-evidence-based-intermediary-liability-policy-framework'>https://cis-india.org/internet-governance/blog/igf-workshop-an-evidence-based-intermediary-liability-policy-framework</a>
</p>
No publisher | jyoti | human rights | Digital Governance | internet governance | Freedom of Speech and Expression | Internet Governance Forum | Human Rights Online | Intermediary Liability | Policies | Multi-stakeholder | 2014-07-04T06:41:10Z | Blog Entry
On the legality and constitutionality of the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021
https://cis-india.org/internet-governance/blog/on-the-legality-and-constitutionality-of-the-information-technology-intermediary-guidelines-and-digital-media-ethics-code-rules-2021
<b>This note examines the legality and constitutionality of the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021. The analysis is consistent with previous work carried out by CIS on issues of intermediary liability and freedom of expression. </b>
<p dir="ltr">On 25 February 2021, the Ministry of Electronics and Information Technology (MeitY) notified the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (hereinafter, ‘the rules’). In this note, we examine whether the rules meet the tests of constitutionality under Indian jurisprudence and whether they are consistent with the parent Act, and discuss potential benefits and harms that may arise from the rules as currently framed. Further, we make some recommendations to amend the rules so that they stay within constitutional bounds and are consistent with a human rights-based approach to content regulation. Please note that we cover some of the issues that CIS has already highlighted in comments on previous versions of the rules.</p>
<p dir="ltr"> </p>
<p dir="ltr">The note can be downloaded <a class="external-link" href="https://cis-india.org/internet-governance/legality-constitutionality-il-rules-digital-media-2021">here</a>.</p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/on-the-legality-and-constitutionality-of-the-information-technology-intermediary-guidelines-and-digital-media-ethics-code-rules-2021'>https://cis-india.org/internet-governance/blog/on-the-legality-and-constitutionality-of-the-information-technology-intermediary-guidelines-and-digital-media-ethics-code-rules-2021</a>
</p>
No publisher | Torsha Sarkar, Gurshabad Grover, Raghav Ahooja, Pallavi Bedi and Divyank Katira | Freedom of Speech and Expression | Internet Governance | Intermediary Liability | Internet Freedom | Information Technology | 2021-06-21T11:52:39Z | Blog Entry
Intermediary Liability Resources
https://cis-india.org/internet-governance/blog/intermediary-liability-resources
<b>We bring you a list of intermediary resources as part of research on internet governance. This blog post will be updated on an ongoing basis.</b>
<ol>
<li style="text-align: justify; "><b>Shielding the Messengers: Protecting Platforms for Expression and Innovation. </b>The Centre for Democracy and Technology. December 2012, available at: <a href="https://www.cdt.org/files/pdfs/CDT-Intermediary-Liability-2012.pdf">https://www.cdt.org/files/pdfs/CDT-Intermediary-Liability-2012.pdf</a>: This paper analyses the impact that intermediary liability regimes have on freedom of expression, privacy, and innovation. In doing so, the paper highlights different models of intermediary liability regimes, reviews different technological means of restricting access to content, and provides recommendations for intermediary liability regimes as well as alternative ways of addressing illegal content online.</li>
<li style="text-align: justify; "><b>Internet Intermediaries: Dilemma of Liability:</b> Article 19. 2013, available at: <a href="http://www.article19.org/data/files/Intermediaries_ENGLISH.pdf">http://www.article19.org/data/files/Intermediaries_ENGLISH.pdf</a>: This Policy Document reviews different components of intermediary liability and highlights the challenges and risks that current models of liability pose to online freedom of expression. Relying on international standards for freedom of expression and comparative law, the document includes recommendations and alternative models that provide stronger protection for freedom of expression. The key recommendations in the document include: web hosting providers or hosts should be immune from liability for third-party content they have not modified; enforcement should not be privatised, with removal orders coming only from courts or adjudicatory bodies; a notice-to-notice model should replace notice-and-takedown regimes; and in cases of alleged serious criminality, clear and well-defined conditions should be in place.</li>
<li style="text-align: justify; "><b>Comparative Analysis of the National Approaches to the Liability of Internet Intermediaries:</b> Prepared by Daniel Seng for WIPO, available at: <a href="http://www.wipo.int/export/sites/www/copyright/en/doc/liability_of_internet_intermediaries.pdf">http://www.wipo.int/export/sites/www/copyright/en/doc/liability_of_internet_intermediaries.pdf</a>: This Report reviews the intermediary liability regimes and associated laws in place across fifteen different contexts, with a focus on civil copyright liability for internet intermediaries. The Report seeks to find similarities and differences across the regimes studied and to highlight principles and components that can be used in international treaties and instruments, upcoming policies, and court decisions.</li>
<li style="text-align: justify; "><b>Freedom of Expression, Indirect Censorship, & Liability for Internet Intermediaries.</b> The Electronic Frontier Foundation. February 2011, available at: <a href="http://infojustice.org/download/tpp/tpp-civil-society/EFF%20presentation%20ISPs%20and%20Freedom%20of%20Expression.pdf">http://infojustice.org/download/tpp/tpp-civil-society/EFF%20presentation%20ISPs%20and%20Freedom%20of%20Expression.pdf</a>: This presentation was created for the Trans-Pacific Partnership Stakeholder Forum in Chile. It highlights that clear legal protections for internet intermediaries are needed for freedom of expression to be protected, and advocates a regime that provides blanket immunity to intermediaries or is based on judicial takedown notices.</li>
<li style="text-align: justify; "><b>Study on the Liability of Internet Intermediaries. Contracted by the European Commission.</b> 2007, available at: <a href="http://ec.europa.eu/internal_market/e-commerce/docs/study/liability/final_report_en.pdf">http://ec.europa.eu/internal_market/e-commerce/docs/study/liability/final_report_en.pdf</a>: This Report provides insight into the application of the intermediary liability sections of the EU e-commerce directive and studies the impact of the regulations under the Directive on the functioning of intermediary information society services. To achieve this objective, the study identifies relevant case law across Member States, calls out and evaluates developing trends across Member States, and draws conclusions.</li>
<li style="text-align: justify; "><b>Internet Intermediary Liability: Identifying Best Practices for Africa.</b> Nicolo Zingales for the Association for Progressive Communications, available at: <a href="https://www.apc.org/en/system/files/APCInternetIntermediaryLiability_BestPracticesAfrica_20131125.pdf">https://www.apc.org/en/system/files/APCInternetIntermediaryLiability_BestPracticesAfrica_20131125.pdf</a>: This background paper seeks to identify challenges and opportunities in addressing intermediary liability for countries in the African Union and recommend safeguards that can be included in emerging intermediary liability regimes in the context of human rights. The paper also reviews different models of intermediary liability and discusses the limitations, scope, and modes of operation of each model. </li>
<li style="text-align: justify; "><b>The Liability of Internet Intermediaries in Nigeria, Kenya, South Africa, and Uganda</b>: An uncertain terrain. Association for Progressive Communications. October 2012, available at: <a href="http://www.academia.edu/2484536/The_liability_of_internet_intermediaries_in_Nigeria_Kenya_South_Africa_and_Uganda_An_uncertain_terrain">http://www.academia.edu/2484536/The_liability_of_internet_intermediaries_in_Nigeria_Kenya_South_Africa_and_Uganda_An_uncertain_terrain</a>: This Report reviews intermediary liability in Nigeria, Kenya, South Africa and Uganda, providing background on the political context, relevant legislation, and present challenges. In doing so, the Report provides insight into how intermediary liability has changed in recent years in these contexts and explores past and present debates on intermediary liability. The Report concludes with recommendations for stakeholders affected by intermediary liability.</li>
<li style="text-align: justify; "><b>The Fragmentation of intermediary liability in the UK</b>. Daithi Mac Sithigh. 2013, available at: <a href="http://jiplp.oxfordjournals.org/content/8/7/521.full.pdf?keytype=ref&ijkey=zuL8aFSzKJqkozT">http://jiplp.oxfordjournals.org/content/8/7/521.full.pdf?keytype=ref&ijkey=zuL8aFSzKJqkozT</a>: This article looks at the application of the Electronic Commerce Directive across Europe and argues that it is being intermixed with, and subsequently replaced by, provisions from national legislation and area-specific laws. The article thus argues that intermediary liability is fragmenting into multiple systems: for copyright-related content, intermediaries are being given new responsibilities, while for defamation-related content there is a reduction in the liability to which intermediaries are held.</li>
<li><b>Regimes of Legal Liability for Online Intermediaries: an Overview</b>. OECD, available at: <a href="http://www.oecd.org/sti/ieconomy/45509050.pdf">http://www.oecd.org/sti/ieconomy/45509050.pdf</a>. This article provides an overview of different intermediary liability regimes including EU and US. </li>
<li style="text-align: justify; "><b>Closing the Gap: Indian Online Intermediaries and a Liability System Not Yet Fit for Purpose</b>. GNI. 2014, available at: <a href="http://www.globalnetworkinitiative.org/sites/default/files/Closing%20the%20Gap%20-%20Copenhagen%20Economics_March%202014_0.pdf">http://www.globalnetworkinitiative.org/sites/default/files/Closing%20the%20Gap%20-%20Copenhagen%20Economics_March%202014_0.pdf</a>: This Report argues that the provisions of the Information Technology Act 2000 are not adequate to deal with ICT innovations, and that the current liability regime in India is hurting the Indian internet economy.</li>
<li style="text-align: justify; "><b>Intermediary Liability in India</b>. Centre for Internet and Society. 2011, available at: <a href="https://cis-india.org/internet-governance/intermediary-liability-in-india.pdf">http://cis-india.org/internet-governance/intermediary-liability-in-india.pdf</a>. This report reviews and ‘tests’ the effect of the Indian intermediary liability on freedom of expression. The report concludes that the present regime in India has a chilling effect on free expression and offers recommendations on how the Indian regime can be amended to protect this right. </li>
<li style="text-align: justify; "><b>The Liability of Internet Service Providers and the Exercise of Freedom of Expression in Latin America</b>. Claudio Ruiz Gallardo and J. Carlos Lara Galvez, available at: <a class="external-link" href="http://www.palermo.edu/cele/pdf/english/Internet-Free-of-Censorship/02-Liability_Internet_Service_Providers_exercise_freedom_expression_Latin_America_Ruiz_Gallardo_Lara_Galvez.pdf">http://www.palermo.edu/cele/pdf/english/Internet-Free-of-Censorship/02-Liability_Internet_Service_Providers_exercise_freedom_expression_Latin_America_Ruiz_Gallardo_Lara_Galvez.pdf</a>: This paper explores the efficacy and implementation of proposals to place digital communication channels, in varying degrees, under the oversight of State-sponsored institutions, and addresses the potential consequences of legal intervention in media and digital platforms for the development of individual rights and freedoms. It draws conclusions on the enforcement of penalties addressing the liability of communication intermediaries and on mechanisms for overseeing the balance between the interests at stake, taking comparative experiences into account. The paper also analyses the liability of technical facilitators of communications, attempting to define a threshold beyond which interference in the working of these intermediaries may constitute an infringement of users' privacy. Ultimately, it aims to strike a balance between the necessity for intervention, the rights of users who communicate via the internet, and the interests of the economic actors who may be responsible for the service.</li>
</ol>
<hr />
<p><a class="external-link" href="https://crm.apc.org/civicrm/mailing/view?reset=1&id=191">Click to read the newsletter</a> from the Association for Progressive Communications. The summaries for the reports can be found below:</p>
<p style="text-align: justify; ">Internet Intermediaries: The Dilemma of Liability in Africa. APC News, May 2014, available at: <a href="http://www.apc.org/en/node/19279/">http://www.apc.org/en/node/19279/</a>. This report summarizes the challenges facing internet content regulators in Africa and the effects of these regulations on the state of the internet in Africa. Many African countries do not protect intermediaries from potential liability, so some intermediaries are too afraid to transmit or host content on the internet in those countries. The report calls for universal rights protections for internet intermediaries.</p>
<p style="text-align: justify; ">APC’s Frequently Asked Questions on Internet Intermediary Liability: APC, May 2014, available at: <a href="http://www.apc.org/en/node/19291/">http://www.apc.org/en/node/19291/</a>. This report addresses common questions pertaining to internet intermediaries, the entities that provide services enabling people to use the internet, from network providers to search engines to comment sections on blogs. Specifically, the report outlines different models of intermediary liability, defining two main models: under the “Generalist” model, intermediary liability is judged according to the general rules of civil and criminal law, while the “Safe Harbour” model protects intermediaries within a legal safe zone.</p>
<p style="text-align: justify; ">New Developments in South Africa: APC News, May 2014, available at: <a href="http://www.apc.org/en/news/intermediary-liability-new-developments-south-afri">http://www.apc.org/en/news/intermediary-liability-new-developments-south-afri</a>. This interview with researchers Alex Comninos and Andrew Rens goes into detail about the challenges of intermediary liability in South Africa. The researchers discuss the balance that needs to be struck between insulating intermediaries from a fear of liability and protecting women’s rights in an environment that is having trouble dealing with violence against women. They also discuss South Africa’s three-strikes policy for those who pirate material.</p>
<p style="text-align: justify; ">Preventing Hate Speech Online In Kenya: APCNews, May 2014, available at: <a href="http://www.apc.org/en/news/intermediary-liability-preventing-hate-speech-onli">http://www.apc.org/en/news/intermediary-liability-preventing-hate-speech-onli</a>. This interview with Grace Githaiga investigates the uncertain fate of internet intermediaries under Kenya’s new regime. The new government has mandated everyone to register their SIM cards, and indicated that it was monitoring text messages and flagging those that were deemed risky. This has led to a reduction in the amount of hate speech via text messages. Many intermediaries, such as newspaper comments sections, have established rules on how readers should post on their platforms. Githaiga goes on to discuss the issue of surveillance and the lack of a data protection law in Kenya, which she sees as the most pressing internet issue in Kenya.</p>
<p style="text-align: justify; ">New Laws in Uganda Make Internet Providers More Vulnerable to Liability and State Intervention: APCNews, May 2014, available at: <a href="http://www.apc.org/en/news/new-laws-uganda-make-internet-providers-more-vulne">http://www.apc.org/en/news/new-laws-uganda-make-internet-providers-more-vulne</a>. In an interview, Lilian Nalwoga discusses Uganda’s recent anti-pornography law, which can send intermediaries to prison. The Anti-Pornography Act of 2014 criminalizes any sort of association with any form of pornography, and targets ISPs, content providers, and developers, making them liable for content that goes through their systems. This makes being an intermediary extremely risky in Uganda. The other issue with the law is its vague definition of pornography. Nalwoga also explains that the Anti-Homosexuality Act of 2014 bans any promotion or recognition of homosexual relations, and describes the monitoring technology the government is using to enforce these laws.</p>
<p style="text-align: justify; ">New Laws Affecting Intermediary Liability in Nigeria: APCNews, May 2014, available at: <a href="http://www.apc.org/en/news/new-laws-affecting-intermediary-liability-nigeria">http://www.apc.org/en/news/new-laws-affecting-intermediary-liability-nigeria</a>. Gbenga Sesan, executive director of Paradigm Initiative Nigeria, expounds on the latest trends in Nigerian intermediary liability. The Nigerian Communications Commission has a new law that mandates that ISPs store users’ data for a minimum retention period, and wants to make content hosts responsible for what users do on their networks. Additionally, internet users in Nigeria must register with their real names and prove their identity when registering. Sesan goes on to discuss the lack of safe harbour provisions for intermediaries and the remaining freedom of anonymity on social networks in Nigeria.</p>
<p style="text-align: justify; ">Internet Policies That Affect Africans: APC News, May 2014, available at: <a href="http://www.apc.org/en/news/intermediary-liability-internet-policies-affect-af">http://www.apc.org/en/news/intermediary-liability-internet-policies-affect-af</a>. The Association for Progressive Communications interviews researcher Nicolo Zingales about the trend of African governments establishing further regulations to control the flow of information on the internet and hold intermediaries liable for the content they circulate. Zingales criticizes intermediary liability for “creating a system of adverse incentives for free speech.” He goes on to offer examples of intermediaries and to explain the concept of “safe harbor” legislative frameworks. Asked to identify best and worst practices in Africa, he highlights South Africa’s safe harbor as a good practice, and mentions the registration of users via ID cards as a worst practice.</p>
<p style="text-align: justify; ">Towards Internet Intermediary Responsibility: Carly Nyst, November 2013, available at: <a href="http://www.genderit.org/feminist-talk/towards-internet-intermediary-responsibility">http://www.genderit.org/feminist-talk/towards-internet-intermediary-responsibility</a>. Nyst argues for a middle ground between competing goals in internet regulation in Africa. Achieving one goal, of protecting free speech through internet intermediaries seems at odds with the goal of protecting women’s rights and limiting hate speech, because one demands intermediaries be protected in a legal safe harbor and the other requires intermediaries be vigilant and police their content. Nyst’s solution is not intermediary liability but <i>responsibility</i>, a role defined by empowerment, and establishing an intermediary responsibility to promote positive gender attitudes.</p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/intermediary-liability-resources'>https://cis-india.org/internet-governance/blog/intermediary-liability-resources</a>
</p>
No publisher | elonnai | Freedom of Speech and Expression | Internet Governance | Intermediary Liability | Privacy | 2014-07-03T06:45:48Z | Blog Entry
European Court of Justice rules Internet Search Engine Operator responsible for Processing Personal Data Published by Third Parties
https://cis-india.org/internet-governance/blog/ecj-rules-internet-search-engine-operator-responsible-for-processing-personal-data-published-by-third-parties
<b>The Court of Justice of the European Union has ruled that "an internet search engine operator is responsible for the processing that it carries out of personal data which appear on web pages published by third parties.” The decision adds to the conundrum of maintaining a balance between freedom of expression, protecting personal data and intermediary liability.</b>
<p style="text-align: justify; ">The ruling is expected to have considerable impact on reputation- and privacy-related takedown requests, as under the decision data subjects may approach the operator directly to seek removal of links to web pages containing their personal data. Previously, users had to prove that data should not be kept online; the new rules reverse the burden of proof, placing the obligation for content regulation on companies rather than users.</p>
<h3>A win for privacy?</h3>
<p style="text-align: justify; ">The ECJ ruling addresses Mario Costeja González's complaint, filed in 2010 against Google Spain and Google Inc., requesting that personal data relating to him appearing in search results be protected and that data which was no longer relevant be removed. Referring to <a href="http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:31995L0046:en:HTML">Directive 95/46/EC</a> of the European Parliament, the court said that Google and other search engine operators should be considered 'controllers' of personal data. Following the decision, Google will be required to consider takedown requests concerning personal data, regardless of the fact that it processes such data without distinguishing it from other information.</p>
<p style="text-align: justify; ">The decision—which cannot be appealed—raises important questions about how the ruling will be applied in practice and its impact on the information available online in countries outside the European Union. The decree forces search engine operators such as Google, Yahoo and Microsoft's Bing to make judgement calls on the fairness of the information published through their services, which reach over 500 million people across the twenty-eight-nation EU bloc.</p>
<p style="text-align: justify; ">The ECJ ruled that search engines, 'as a general rule,' should place the right to privacy above the public's right to information. Under the verdict, links to irrelevant and out-of-date data must be erased upon request, placing search engines in the role of controllers of information—beyond the role of an arbitrator that merely linked to data already existing in the public domain. The verdict highlights the power of search engines to retrieve controversial information while limiting their capacity to do so in the future.</p>
<p style="text-align: justify; ">The ruling calls for maintaining a balance in addressing the legitimate interest of internet users in accessing personal information and upholding the data subject’s fundamental rights, but does not directly address either issue. The court also recognised that the data subject's rights override the interests of internet users, with exceptions pertaining to the nature of the information, its sensitivity for the data subject's private life, and the role of the data subject in public life. Acknowledging that data belongs to the individual and is not the right of the company, European Commissioner Viviane Reding <a href="https://www.facebook.com/permalink.php?story_fbid=304206613078842&id=291423897690447&_ga=1.233872279.883261846.1397148393">hailed the verdict</a> as "a clear victory for the protection of personal data of Europeans".</p>
<p style="text-align: justify; ">The Court stated that if data is deemed irrelevant at the time of the case, even if it was lawfully processed initially, it must be removed, and that the data subject has the right to approach the operator directly for the removal of such content. The liability issue is further complicated by the fact that search engines such as Google do not publish content; rather, they point to information that already exists in the public domain—raising questions about the degree of liability for third-party content displayed on their services.</p>
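<p style="text-align: justify; ">The balancing exercise the Court describes—weighing relevance, sensitivity, and the subject's public role before delisting—can be sketched as a toy decision rule. Everything below (the class, the factor names, the boolean outcome) is a deliberate oversimplification for illustration only; the actual legal test is qualitative and decided case by case:</p>

```python
# Toy model of the ECJ's delisting balancing test, for illustration only.
# Factor names are invented labels for the considerations named in the ruling.
from dataclasses import dataclass

@dataclass
class RemovalRequest:
    data_is_relevant: bool         # is the information still relevant?
    sensitive_private_life: bool   # sensitivity for the subject's private life
    subject_in_public_life: bool   # does the subject play a role in public life?

def should_delist(req: RemovalRequest) -> bool:
    # The data subject's rights generally override the public's interest,
    # except where the subject plays a role in public life.
    if req.subject_in_public_life:
        return False
    return (not req.data_is_relevant) or req.sensitive_private_life
```

<p style="text-align: justify; ">On this toy model, outdated data about a private individual is delisted, while the same request from a public figure is refused—mirroring, crudely, the exceptions the Court carved out.</p>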
<p style="text-align: justify; ">The ECJ ruling is based on the case originally filed against Google Spain, and it is important to note that González argued that searches for his name linked to two pages originally published in 1998 on the website of the Spanish newspaper La Vanguardia. The Spanish Data Protection Agency did not require La Vanguardia to take down the pages; however, it did order Google to remove links to them. Google appealed this decision, following which the National High Court of Spain sought advice from the European court. The definition of Google as a controller of information raises important questions about the distinction between the liability of publishers and the liability of processors of information such as search engines.</p>
<h3>The 'right to be forgotten'</h3>
<p style="text-align: justify; ">The decision also brings to the fore the ongoing debate and <a href="http://www.theguardian.com/technology/2013/apr/04/britain-opt-out-right-to-be-forgotten-law">fragmented opinions within the EU</a> on the right of the individual to be forgotten. The <a href="http://www.bbc.com/news/technology-16677370">'right to be forgotten</a>' has evolved from the European Commission's wide-ranging plans for an overhaul of the commission's 1995 Data Protection Directive. The plans for the law included allowing people to request removal of personal data, with an obligation of compliance for service providers unless there were 'legitimate' reasons to do otherwise. Technology firms, rallying around issues of freedom of expression and censorship, have expressed concerns about the reach of the bill. Privacy-rights activists and European officials have upheld the notion of the right to be forgotten, highlighting the right of the individual to protect their honour and reputation.</p>
<p style="text-align: justify; ">These issues have been controversial among EU member states, with the UK's Ministry of Justice claiming the law 'raises unrealistic and unfair expectations' and having <a href="http://www.theguardian.com/technology/2013/apr/04/britain-opt-out-right-to-be-forgotten-law">sought to opt out</a> of the privacy laws. The opinion of the Advocate General of the European Court, <a href="http://curia.europa.eu/juris/document/document.jsf?text=&docid=138782&pageIndex=0&doclang=EN&mode=req&dir=&occ=first&part=1&cid=362663#Footref91">Niilo Jääskinen</a>—that the individual's right to seek removal of content should not be upheld if the information was published legally—contradicts the verdict of the ECJ ruling. The European Court of Justice's move is surprising for many and, as Richard Cumbley, information-management and data protection partner at the law firm Linklaters, <a href="http://turnstylenews.com/2014/05/13/europe-union-high-court-establishes-the-right-to-be-forgotten/">puts it</a>, “Given that the E.U. has spent two years debating this right as part of the reform of E.U. privacy legislation, it is ironic that the E.C.J. has found it already exists in such a striking manner."</p>
<p style="text-align: justify; ">The economic implications of enforcing a liability regime where search engine operators censor legal content in their results aside, the decision might also have a chilling effect on freedom of expression and access to information. Google <a href="http://www.theguardian.com/technology/2014/may/13/right-to-be-forgotten-eu-court-google-search-results">called the decision</a> “a disappointing ruling for search engines and online publishers in general,” and said the company would take time to analyze the implications. While the implications of the decision are yet to be determined, it is important to bear in mind that while decisions like these are public, the refinements that Google and other search engines will have to make to their technology, and the judgement calls on the fairness of the information available online, are not.</p>
<p style="text-align: justify; ">The ECJ press release is available <a href="http://curia.europa.eu/jcms/upload/docs/application/pdf/2014-05/cp140070en.pdf">here</a> and the actual judgement is available <a href="http://curia.europa.eu/juris/documents.jsf?pro=&lgrec=en&nat=or&oqp=&lg=&dates=&language=en&jur=C%2CT%2CF&cit=none%252CC%252CCJ%252CR%252C2008E%252C%252C%252C%252C%252C%252C%252C%252C%252C%252Ctrue%252Cfalse%252Cfalse&num=C-131%252F12&td=%3BALL&pcs=Oor&avg">here</a>.</p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/ecj-rules-internet-search-engine-operator-responsible-for-processing-personal-data-published-by-third-parties'>https://cis-india.org/internet-governance/blog/ecj-rules-internet-search-engine-operator-responsible-for-processing-personal-data-published-by-third-parties</a>
</p>
jyoti · Freedom of Speech and Expression · Social Media · Internet Governance · Intermediary Liability · 2014-05-14 · Blog Entry
Centre for Internet and Society joins the Dynamic Coalition for Platform Responsibility
https://cis-india.org/internet-governance/blog/cis-joins-dynamic-coalition-for-platform-responsibility
<b>The Centre for Internet and Society (CIS) has joined the multistakeholder cooperative engagement towards creating Due Diligence Recommendations for online platforms and Model Contractual Provisions to be enshrined in ToS. This blog provides a brief background of the role of dynamic coalitions within the IGF structure, establishes the need for the coalition, and provides an update on the action plan and next steps for interested stakeholders.</b>
<p class="callout" style="text-align: justify; ">"Identify emerging issues, bring them to the attention of the relevant bodies and the general public, and, where appropriate, make recommendations."<br />Tunis Agenda (Para 72.g)</p>
<p style="text-align: justify; ">The first United Nations Internet Governance Forum (IGF), in 2006, saw the emergence of the concept of Dynamic Coalitions, and a number of coalitions have been established over the years. The IGF is structured to bring together multistakeholder groups to,</p>
<p class="callout" style="text-align: justify; ">"Discuss public policy issues related to key elements of Internet governance in order to foster the sustainability, robustness, security, stability and development of the Internet."<br />Tunis Agenda (Para 72.a)</p>
<p style="text-align: justify; ">While IGF workshops allow various stakeholders to jointly analyse "hot topics" or to examine the progress such issues have made since the previous IGF, dynamic coalitions are informal, issue-specific groups comprising members of various stakeholder groups. With no strictures upon the objects, structure or processes of dynamic coalitions claiming association with the IGF, no formal institutional affiliation, and no access to the resources of the IGF Secretariat, IGF Dynamic Coalitions are open to collaboration from anyone interested in contributing to their discussions. Currently, there are eleven active dynamic coalitions at the IGF, which can be divided into three distinct types—networks, working groups and Birds of a Feather (BoFs).</p>
<p style="text-align: justify; ">Workshops at the IGF are content-specific events that, though valuable in informing participants, are limited in their impact by being confined to the launch of a report or to the issues raised within the conference room. The coalitions, on the other hand, are expected to have a broader function, acting as a coalescing point for interested stakeholders to gather, analyse progress around identified issues and plan next steps. The coalitions can also make recommendations around issues; however, no mechanism has been developed so far by which the recommendations can be considered by the plenary body. The long-term nature of coalitions is perhaps most suited to engaging stakeholders in heterogeneous groups, towards understanding and cooperating around emerging issues and making recommendations to inform policy making.</p>
<h3 style="text-align: justify; ">Platform Responsibility</h3>
<p style="text-align: justify; ">Social networks and other interactive online services give rise to 'cyber-spaces' where individuals gather, express their personalities and exchange information and ideas. The transnational and private nature of such platforms means that they are regulated through contractual provisions enshrined in the platforms' Terms of Service (ToS). The provisions delineated in the ToS extend to users regardless of their geographical location, and the private decisions undertaken by platform providers in implementing the ToS are not subject to the constitutional guarantees framed under national jurisdictions.</p>
<p style="text-align: justify; ">While ToS serve as binding agreements online, the absence of binding international rules in this area, despite the universal nature of the human rights at stake, poses a real challenge and makes it necessary to engage in a multistakeholder effort to produce model contractual provisions that can be incorporated in ToS. The concept of 'platform responsibility' aims to stimulate platform providers to offer intelligible and solid mechanisms, in line with the principles laid out by the UN Guiding Principles on Business and Human Rights, and to equip platform users with common and easy-to-grasp tools to guarantee the full enjoyment of their human rights online. The utilisation of model contractual provisions in ToS may prove instrumental in fostering trust in online services for content production, use and dissemination; by increasing demand for such services, consumer demand may ultimately drive the market towards human-rights-compliant solutions.</p>
<h3 style="text-align: justify; ">The Dynamic Coalition on Platform Responsibility</h3>
<p style="text-align: justify; ">To nurture a multi-stakeholder endeavour aimed at the elaboration of model contractual provisions, Mr. Luca Belli, Council of Europe / Université Paris II, Ms Primavera De Filippi, CNRS / Berkman Center for Internet and Society, and Mr Nicolo Zingales, Tilburg University / Center for Technology and Society Rio, initiated and facilitated the creation of the Dynamic Coalition on Platform Responsibility (DCPR). DCPR has over fifty individual and organisational members from civil society organisations, academia, private sector organisations and intergovernmental organisations, and held its first meeting at the IGF in Istanbul. The meeting began with an overview of the concept of platform responsibility, highlighting relevant initiatives that the Council of Europe, Global Network Initiative, Ranking Digital Rights and the Center for Democracy and Technology have undertaken in this regard. Existing issues, such as the difficulty of comprehension and the lack of standardisation of redress across rights, were raised, along with the fundamental lack of due process and transparency across existing mechanisms.</p>
<p style="text-align: justify; ">Online platforms' compliance with human rights is often framed around the duty of States to protect human rights, and Internet companies often do not give sufficient consideration to the effects of their business practices on users' fundamental rights, undermining trust.</p>
<p style="text-align: justify; ">The meeting focused its efforts with a call to identify issues of process and substance, and the specific rights and challenges to be addressed by the DCPR. The procedural issues raised concerned 'responsibility' in decision-making, e.g., giving users the right to be heard and an effective remedy before an impartial decision-making body, and obtaining their consent for changes in the contractual terms. Concerns were also raised around substantive rights such as privacy and freedom of expression, e.g., disclosure of personal information and content removal, and the need to promote 'responsibility' by establishing concrete mechanisms to deal with such issues.</p>
<p style="text-align: justify; ">It was suggested that the concept of responsibility, including in cases of conflict between different rights, could be grounded in human rights case law, e.g., European Court of Human Rights jurisprudence. It was also established that any framework evolving from this coalition would consider the distinctions between users (e.g., adults, children, and people with or without continuous access to the Internet) and between platforms (e.g., in terms of size and functionality).</p>
<h3 style="text-align: justify; ">Action Plan</h3>
<p style="text-align: justify; ">The participants at the DCPR meeting agreed to establish a cooperative multistakeholder engagement that will go beyond dialogue and produce concrete proposals. In particular, participants suggested developing:</p>
<ol>
<li style="text-align: justify; ">Due Diligence Recommendations: Recommendations to online platforms with regard to processes of compliance with internationally agreed human rights standards.</li>
<li style="text-align: justify; ">Model Contractual Provisions: Elaboration of a set of principles and provisions protecting platform users’ rights and guaranteeing transparent mechanisms to seek redress in case of violations.</li>
</ol>
<p style="text-align: justify; ">DCPR will ground the development of these frameworks in a preliminary compilation of existing projects and initiatives dealing with the analysis of ToS compatibility with human rights standards. Members, participants and interested stakeholders are invited to highlight and share, by 10th October, relevant initiatives regarding:</p>
<ol>
<li>Processes of due diligence for human rights compliance;</li>
<li>The evaluation of ToS compliance with human rights standards;</li>
</ol>
<p style="text-align: justify; ">Further to this compilation, a first recommendation draft regarding online platforms' due diligence will be circulated on the mailing list by 30th October 2014. CIS will be contributing to the drafting which will be led and elaborated by the DCPR coordinators. This draft will be open for comments via the DCPR mailing list until 30th November 2014 and we encourage you to sign up to the mailing list (<a class="external-link" href="http://lists.platformresponsibility.info/listinfo/dcpr">http://lists.platformresponsibility.info/listinfo/dcpr</a>).<br /><br />A second draft will be developed compiling the comments expressed via the mailing-list and shared for comments by 10 December 2014. The final version of the recommendation will be drafted by 30 December. Subsequently, the first set of model contractual provisions will be elaborated building upon such recommendation. A call for inputs will be issued in order to gather suggestions on the content of these provisions.</p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/cis-joins-dynamic-coalition-for-platform-responsibility'>https://cis-india.org/internet-governance/blog/cis-joins-dynamic-coalition-for-platform-responsibility</a>
</p>
jyoti · Human Rights · Privacy · Internet Governance Forum · Data Protection · Terms of Service · Internet Governance · Platform Responsibility · Intermediary Liability · 2014-10-07 · Blog Entry
MPs oppose curbs on internet; Sibal promises discussions
https://cis-india.org/news/mps-oppose-curbs-on-internet
<b>With MPs raising concerns over open-ended interpretations of restrictive terms in the rules seeking to regulate social media and internet, the government promised to evolve a consensus on points of contention.</b>
<p><a class="external-link" href="http://goo.gl/MCXLB">Pranesh Prakash is quoted in this article published by the Times of India on May 18, 2012</a></p>
<p>Telecom minister Kapil Sibal's assurance came at the end of an engrossing debate in Rajya Sabha on a motion moved by CPM MP P Rajeeve who said the rules violated freedom of expression and free speech.</p>
<p>He found support from leader of opposition <a class="external-link" href="http://timesofindia.indiatimes.com/topic/Arun-Jaitley">Arun Jaitley</a> who picked several examples to point out that terms or descriptions like "harmful", "blasphemous" and "defamatory" did not lend themselves to precise legal definitions.</p>
<p>Jaitley said what the government may find defamatory may not be seen in similar light by its critics. He also pointed to the difficulties of controlling technology and asked if it was desirable to do so.</p>
<p>Assuring MPs who sought the annulment of 'rules' which are aimed at regulating internet content, <a class="external-link" href="http://timesofindia.indiatimes.com/topic/United-Company-RUSAL">Sibal</a> said, "My assurance to the House is that I will request the MPs to write letters to me objecting to any specific words. I will then call a meeting of the members as well as the industry and all stakeholders. We will have a discussion and whatever consensus emerges, we will implement it."</p>
<p>The move to put rules in place flows from the government's annoyance with what it sees as scurrilous and disrespectful comments about senior Congress leaders. It had suggested pre-screening of content which service providers were reluctant to consider.</p>
<p>The motion for annulling the Information Technology (Intermediaries Guidelines) Rules notified in April 2011 was, however, defeated by a voice vote. Justifying the rules, the minister said "these are sensitive issues" as most internet companies were registered abroad and not subjected to Indian laws.</p>
<p>TOI was first to report about the new rules that put a lot of the onus on intermediaries like internet service providers, Facebook and Twitter, to manage and monitor content produced by their users. Web activists believe the IT rules are open to arbitrary interpretation and can be misused to silence freedom of speech.</p>
<p>Google, which participated in the public consultative process before the rules were framed, had told TOI, "If Internet platforms are held liable for third party content, it would lead to self-censorship and reduce the free flow of information."</p>
<p>Moving the motion, Rajeeve said, "I am not against any regulation on internet but I am against any control on internet... In control, there is no freedom... These rules attempt to control internet and curtail the freedom of expression."</p>
<p>Complimenting the CPM member, Jaitley said, "I think he (Rajeeve) deserves a compliment for educating us on this rule that Parliament has a supervisory control as far as subordinate legislations are concerned, and, if need be, we can express our vote of disapproval to the subordinate legislations."</p>
<p>MPs felt the government should consider a regime where offensive content can be removed immediately after being posted rather than trying to sieve it out.</p>
<p>Noting that it is extremely difficult, if not impossible, to defy technology and that the days of withholding information have gone, Jaitley urged the minister to "reconsider the language of restraints" to prevent its misuse. He pointed to certain words - harmful, harassing, blasphemous, defamatory - used in the rules, explaining how these could be interpreted/misinterpreted at any stage.</p>
<p>The MPs did note that the internet carried a risk of inciting hate speech and frenzy in society and therefore needed to be restrained, but suggested the mechanism should be swift identification of objectionable content.</p>
<p>Pranesh Prakash of Centre for Internet and Society, an organization that has been advocating withdrawal of the rules, said he was sad with the outcome in Rajya Sabha. "The IT minister has promised to hold consultations but the ideal way to do so would have been to scrap the rules and start from scratch," he said.</p>
<p>"It's not only about language in these rules. There is a problem with provisions like the one that empowers intermediaries to remove content without notifying the user who had uploaded the content or giving users a chance to explain themselves."</p>
<p> </p>
<p>
For more details visit <a href='https://cis-india.org/news/mps-oppose-curbs-on-internet'>https://cis-india.org/news/mps-oppose-curbs-on-internet</a>
</p>
praskrishna · Internet Governance · Intermediary Liability · Censorship · 2012-05-24 · News Item
Donald Trump is attacking the social media giants; here’s what India should do differently
https://cis-india.org/internet-governance/blog/donald-trump-is-attacking-the-social-media-giants-here2019s-what-india-should-do-differently
<b>For a robust and rights-respecting public sphere, India needs to ensure that large social media platforms receive adequate protections, and are made more responsible to its users.</b>
<p>This piece was first published at <a class="external-link" href="https://scroll.in/article/965151/donald-trump-is-attacking-the-social-media-giants-heres-what-india-should-do-differently">Scroll</a>. The authors would like to thank Torsha Sarkar for reviewing and editing the piece, and to Divij Joshi for his feedback.</p>
<hr />
<div id="article-contents" class="article-body">
<p>In retaliation to Twitter <a class="link-external" href="https://www.nytimes.com/2020/05/26/technology/twitter-trump-mail-in-ballots.html" rel="nofollow noopener" target="_blank">labelling</a> one of US President Donald Trump’s tweets as misleading, Trump signed an <a class="link-external" href="https://www.whitehouse.gov/presidential-actions/executive-order-preventing-online-censorship/" rel="nofollow noopener" target="_blank">executive order</a>
on May 28 that seeks to dilute the protections that social media companies
in the US have with respect to third-party content on their platforms.</p>
<p>The
order argues that social media companies that engage in censorship stop
functioning as ‘passive bulletin boards’: they must consequently be
treated as ‘content creators’, and be held liable for content on their
platforms as such. The shockwaves of the decision soon reached India,
with news coverage of the event <a class="link-external" href="https://www.business-standard.com/article/companies/trump-twitter-spat-debate-rages-on-role-of-social-media-companies-120053100055_1.html" rel="nofollow noopener" target="_blank">starting</a> to <a class="link-external" href="https://economictimes.indiatimes.com/tech/internet/feud-between-donald-trump-and-jack-dorsey-can-have-long-lasting-effects-on-how-we-consume-media-in-india/articleshow/76111556.cms" rel="nofollow noopener" target="_blank">debate</a> the <a class="link-external" href="https://economictimes.indiatimes.com/tech/internet/trumps-move-against-social-media-cos-unlikely-to-change-indias-stand/articleshow/76094586.cms?from=mdr" rel="nofollow noopener" target="_blank">consequences</a> of Trump’s order on how India regulates internet services and social media companies.</p>
<p>The
debate on the responsibilities of online platforms is not new to India,
and recently took centre stage in December 2018 when the Ministry of
Electronics and Information Technology (MeitY) published a draft set of
guidelines that most online services – ‘intermediaries’ – must follow.
The draft rules, which haven’t been notified yet, propose to
significantly expand the obligations on intermediaries.</p>
<p>Trump’s
executive order, however, comes in the context of content moderation
practices by social media platforms, i.e. when platforms censor speech
of their own volition, and not because of legal requirements. In India,
the legal position of content moderation remains relatively
under-discussed.</p>
<p>In contrast to
commentators who have implicitly assumed that Indian law permits content
moderation by social media companies, we believe Indian law fails to
adequately account for content moderation and curation practices
performed by social media companies. There may be adverse consequences
for the exercise of freedom of expression in India if this lacuna is not
filled soon.</p>
<h3 class="cms-block cms-block-heading">India vs US<br /></h3>
<p>A
useful starting point for the analysis is to compare how the US and
India regulate liability for online services. In the US, Section 230 of
the Communications Decency Act provides online services with broad
immunity from liability for third party content that they host or
transmit.</p>
<p>There are two critical components to what is generally referred to as Section 230.</p>
<p>First,
providers of an ‘interactive computer service’, like your internet
service provider or a company like Facebook, will not be treated as
publishers or speakers of third-party content. This system has allowed
internet speech and the internet economy to <a class="link-external" href="https://law.emory.edu/elj/content/volume-63/issue-3/articles/how-law-made-silicon-valley.html" rel="nofollow noopener" target="_blank">flourish</a>,
since it allows companies to focus on their service without constant
paranoia about what users are transmitting through it.</p>
<p>The
second part of Section 230 states that services are allowed to moderate
and remove, in ‘good faith’, such third-party content that they may
deem offensive or obscene. This allows for online services to instate
their own community guidelines or content policies.</p>
<p>In India,
section 79 of the Information Technology Act is the analogous provision:
it grants intermediaries conditional ‘safe harbour’. This means
intermediaries, again like Facebook or your internet provider, are
exempt from liability for third-party content – like messages or videos
posted by ordinary people – provided their functioning meets certain
requirements, and they comply with the allied rules, known as
Intermediary Guidelines.</p>
<p>The notable and stark difference between
Indian law and Section 230 is that India’s IT Act is largely silent on
content moderation practices. As Rahul Matthan <a class="link-external" href="https://www.livemint.com/opinion/columns/shield-online-platforms-for-content-moderation-to-work-11591116270685.html" rel="nofollow noopener" target="_blank">points out</a>,
there is no explicit allowance in Indian law for platforms to take down
content based on their own policies, even if such actions are done in
good faith.</p>
<h3 class="cms-block cms-block-heading">Safe harbour</h3>
<div> </div>
<p>One
may argue that the absence of an explicit permission does not
necessarily mean that any platform engaging in content moderation
practices will lose its safe harbour. However, the language of Section
79 and the allied rules may even create room for divesting social media
platforms of their safe harbour.</p>
<p>The first such indication is
that, among the conditions to qualify for safe harbour, intermediaries must not
modify said content, must not select the recipients of particular content,
and must take information down when it is brought to their notice by
governments or courts.</p>
<p>Most of the conditions are almost a
verbatim copy of the definition of a ‘mere conduit’ in the EU Directive on
E-Commerce, 2000. This definition was meant to encapsulate the
functioning of services like infrastructure providers, which transmit
content without exerting any real control. Thus, by adopting this
definition for all intermediaries, Indian law mostly treats internet
services, even social media platforms, as passive plumbing through
which information flows.</p>
<p>It is easy to see how this narrow conception of online services is severely <a class="link-external" href="https://georgetownlawtechreview.org/wp-content/uploads/2018/07/2.2-Gilespie-pp-198-216.pdf" rel="nofollow noopener" target="_blank">lacking</a>.</p>
<p>Most prominent social media platforms <a class="link-external" href="http://guidelines." rel="nofollow noopener" target="_blank">remove</a> or <a class="link-external" href="https://techcrunch.com/2019/12/16/instagram-fact-checking/" rel="nofollow noopener" target="_blank">hide</a> content, <a class="link-external" href="https://about.fb.com/news/2016/06/building-a-better-news-feed-for-you/" rel="nofollow noopener" target="_blank">algorithmically curate</a> news-feeds to make users keep coming back for more, and increasingly add <a class="link-external" href="https://blog.twitter.com/en_us/topics/product/2020/updating-our-approach-to-misleading-information.html" rel="nofollow noopener" target="_blank">labels</a>
to content. If the law is interpreted strictly, these practices may be
adjudged to run afoul of the aforementioned conditions that
intermediaries need to satisfy in order to qualify for safe harbour.</p>
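<p>For illustration only, the moderation practices mentioned above — removing, hiding and labelling content under a platform's own policies rather than under a legal order — can be sketched as a toy rule-based function. The categories, thresholds and field names here are invented, not drawn from any platform's actual guidelines:</p>

```python
# Toy sketch of the three moderation actions discussed above: remove, hide,
# or label third-party content under a platform's own (hypothetical) policy.

def moderate(post: dict, banned_terms: list[str], report_threshold: int = 10) -> str:
    """Return the moderation action for a post under an invented policy."""
    text = post["text"].lower()
    if any(term in text for term in banned_terms):
        return "remove"   # taken down entirely under the platform's own policy
    if post.get("reports", 0) >= report_threshold:
        return "hide"     # hidden pending human review
    if post.get("disputed", False):
        return "label"    # shown, but with a warning label attached
    return "allow"
```

<p>Each branch is a private editorial-style decision made without any government or court order — exactly the kind of action whose status under Section 79's conditions, as discussed above, is uncertain.</p>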
<h3 class="cms-block cms-block-heading">Platforms or editors?<br /></h3>
<p>For
instance, it can be argued that social media platforms initiate
transmission in some form when they pick and ‘suggest’ relevant
third-party content to users. When it comes to newsfeeds, neither the
content creator nor the consumer has as much control over how the
content is disseminated or curated as the platform does. By
curating newsfeeds, social media platforms can be said to be essentially
‘selecting the receiver’ of transmissions.</p>
<p>The Intermediary
Guidelines further complicate matters by specifically laying out what is
not to be construed as ‘editing’ under the law. Under rule 3(3), the
act of taking down content pursuant to orders under the Act will not be
considered as ‘editing’ of said content.</p>
<p>Since the term ‘editing’
has been left undefined beyond the negative qualification, several
social media intermediaries may well qualify as editors. They use
algorithms that curate content for their users; like traditional news
editors, these algorithms use certain <a class="link-external" href="https://www.researchgate.net/profile/Michael_Devito/publication/302979999_From_Editors_to_Algorithms_A_values-based_approach_to_understanding_story_selection_in_the_Facebook_news_feed/links/5a19cc3d4585155c26ac56d4/From-Editors-to-Algorithms-A-values-based-approach-to-understanding-story-selection-in-the-Facebook-news-feed.pdf" rel="nofollow noopener" target="_blank">‘values’</a>
to determine what is relevant to their audiences. In other words, one
can argue that it is difficult to draw a bright line between editorial
and algorithmic acts.</p>
<p>To retain their safe harbour, social media platforms can rely on the counter-argument that Rule 3(5) of the Intermediary Guidelines requires intermediaries to inform users that they reserve the right to take down user content relating to a wide variety of acts, including content that threatens national security, or is “[...] grossly harmful, harassing, blasphemous, [etc.]”.</p>
<p>In practice, however, the
content moderation practices of some social media companies may go
beyond these categories. Additionally, the rule does not address the
legal questions created by these platforms’ curation of news-feeds.</p>
<p>The purpose of highlighting how Section 79 treats the practices of social media platforms is not to argue that these platforms should be held liable for user-generated content. Online spaces created by social media platforms have allowed individuals to express themselves and participate in political organisation and <a class="link-external" href="https://www.pewresearch.org/internet/2018/07/11/public-attitudes-toward-political-engagement-on-social-media/" rel="nofollow noopener" target="_blank">debate</a>.</p>
<p>A level of protection of intermediaries from liability is therefore critical for the protection of several human rights, especially the right to freedom of speech. This piece only serves to highlight that Section 79 is antiquated and unfit to deal with modern online services. The interpretative dangers that exist in the provision create regulatory uncertainty for organisations operating in India.</p>
<h3 class="cms-block cms-block-heading">Dangers to speech<br /></h3>
<p>These dangers may not just be theoretical.</p>
<p>Only last year, Twitter CEO Jack Dorsey was <a class="link-external" href="https://www.hindustantimes.com/india-news/twitter-ceo-jack-dorsey-summoned-by-parliamentary-panel-on-feb-25-panel-refuses-to-hear-other-officials/story-8x9OUbNBo36uvp92L5nOKI.html" rel="nofollow noopener" target="_blank">summoned</a>
by the Parliamentary Committee on Information Technology to answer
accusations of the platform having a bias against ‘right-wing’ accounts.
More recently, BJP politician Vinit Goenka <a class="link-external" href="https://www.medianama.com/2020/06/223-vinit-goenka-twitter-khalistan/" rel="nofollow noopener" target="_blank">encouraged people to file cases against Twitter</a> for promoting separatist content.</p>
<p>Recent <a class="link-external" href="https://sflc.in/sites/default/files/reports/Intermediary_Liability_2_0_-_A_Shifting_Paradigm.pdf" rel="nofollow noopener" target="_blank">interventions</a>
from the Supreme Court have imposed proactive filtration and blocking
requirements on intermediaries, but these have been limited to
reasonable restrictions that may be imposed on free speech under Article
19 of India’s Constitution. Content moderation policies of
intermediaries like Twitter and Facebook go well beyond the scope of
Article 19 restrictions, and the apex court has not yet addressed this.</p>
<p>The Delhi High Court, in <em>Christian Louboutin v. Nakul Bajaj</em>, has already highlighted criteria for when e-commerce intermediaries can stake a claim to Section 79 safe harbour protections based on the active (or passive) nature of their services. While the order came in the context of intellectual property violations, nothing keeps a court from similarly finding that Facebook and Twitter play an ‘active’ role when it comes to content moderation and curation.</p>
<p>These companies may one day
find the ‘safe harbour’ rug pulled from under their feet if a court
reads section 79 more strictly. In fact, judicial intervention may not
even be required. The threat of such an interpretation may simply be
exploited by the government, and used as leverage to get social media
platforms to toe the government line.</p>
<h3 class="cms-block cms-block-heading">Protection and responsibility<br /></h3>
<p>Unfortunately, the amendments to the intermediary guidelines proposed in 2018 do not address the legal position of content moderation either. More recent developments <a class="link-external" href="https://www.medianama.com/2020/04/223-meity-information-technology-act-amendments/" rel="nofollow noopener" target="_blank">suggest</a> that MeitY may be contemplating amending the IT Act. This presents an opportunity for a more comprehensive reworking of the Indian intermediary liability regime than what is possible through delegated legislation like the intermediary rules.</p>
<p>Intermediaries, rather than being treated uniformly, should be classified based on their function and the level of control they exercise over the content they process. For instance, providers of network infrastructure should continue to be treated as ‘mere conduits’ and enjoy broad immunity from liability for user-generated content.</p>
<p>More complex services like search engines and online social media platforms can have differentiated responsibilities based on the extent to which they can contextualise and change content. The law should carve out an explicit permission for platforms to moderate content in good faith. Such an allowance should be accompanied by an outline of best practices that these platforms can follow to ensure <a class="link-external" href="https://santaclaraprinciples.org/" rel="nofollow noopener" target="_blank">transparency and accountability</a> to their users.</p>
<p>For a robust and rights-respecting public sphere, India needs to ensure that large social media platforms receive adequate protections, and are made more responsible to their users.</p>
<p><em>Anna Liz Thomas is a law
graduate and a policy researcher, currently working with the Centre for
Internet and Society. Gurshabad Grover manages research in the freedom
of expression and internet governance team at CIS</em>.</p>
</div>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/donald-trump-is-attacking-the-social-media-giants-here2019s-what-india-should-do-differently'>https://cis-india.org/internet-governance/blog/donald-trump-is-attacking-the-social-media-giants-here2019s-what-india-should-do-differently</a>
</p>
No publisher · Anna Liz Thomas and Gurshabad Grover · Content takedown · Freedom of Speech and Expression · Intermediary Liability · 2020-06-25T09:07:52Z · Blog Entry

Analyzing the Latest List of Blocked Sites (Communalism and Rioting Edition) Part II
https://cis-india.org/internet-governance/analyzing-the-latest-list-of-blocked-sites-communalism-and-rioting-edition-part-ii
<b>Snehashish Ghosh does a further analysis of the leaked list of the websites blocked by the Indian Government from August 18, 2012 till August 21, 2012 (“leaked list”). </b>
<p style="text-align: justify; "><b>Unnecessary Blocks and Mistakes:</b></p>
<ol>
<li style="text-align: justify; ">http://hinduexistance.files.wordpress.com/..., which appears on the leaked list, does not exist because the URL is incorrect. However, the correct URL does contain an image which, in my opinion, can be considered to be capable of inciting violence. It has not been blocked due to a spelling error in the order. Instead of blocking hinduexist<b><i>e</i></b>nce.wordpress.com/... the DoT has ordered the blocking of hinduexist<b><i>a</i></b>nce.wordpress.com/..., which does not exist.</li>
<li style="text-align: justify; ">Two URLs in the block order are from the website of the High Council for Human Rights, Judiciary of the Islamic Republic of Iran. The reason for blocking these two links from this particular website is unclear.</li>
<li style="text-align: justify; ">The website of the Union of NGOs of the Islamic World was blocked. Again, the reason for blocking this website remains unclear.</li>
<li style="text-align: justify; ">URLs such as http://farazahmed.com/... and mumblingminion.blogspot.com were blocked. The content on these URLs was in fact debunking the fake photographs.</li>
<li style="text-align: justify; ">Certain blocked Facebook pages did not have any bearing on the North East exodus, which was the main reason behind the blocks. For example, a Facebook link leading to the United States Institute of Peace page was blocked.</li>
</ol>
<p style="text-align: justify; "><b>Duration of the Block</b></p>
<p style="text-align: justify; ">The Department of Telecommunications (DoT) did not specify in its orders the period for which the block was to be implemented. As a result, certain URLs still remain blocked while a majority of the links in the leaked list can be accessed. The lack of clear directions from the DoT has resulted in haphazard blocking: some internet service providers (ISPs) have lifted the block on certain links whereas others have continued with a complete block.</p>
<p style="text-align: justify; "><b>How have the intermediaries reacted to the block orders?</b></p>
<p style="text-align: justify; ">Going by the leaked list of blocked websites, the DoT issued the block orders to ‘all internet service licensees’. Intermediaries that do not fall in the category of ‘internet service licensees’ were also sent a separate set of requests for taking down third-party content. However, it is unclear under which provision of the law such requests were made by the Government.</p>
<p style="text-align: justify; "><b>Internet Service Licensees</b></p>
<p style="text-align: justify; "><b><img src="https://cis-india.org/home-images/chart_1.png" alt="Implementation of the order at the ISP level" class="image-inline" title="Implementation of the order at the ISP level" /><br /></b></p>
<p style="text-align: justify; ">The internet service licensees, or ISPs, have not followed any uniform system to notify that a particular URL or website in the leaked list is blocked according to DoT’s orders. This lack of transparency in the implementation of the block orders has a chilling effect on free speech.</p>
<p style="text-align: justify; ">For instance, BSNL returns the following messages:</p>
<p style="text-align: justify; ">"This website/URL has been blocked until further notice either pursuant to Court orders or on the Directions issued by the Department of Telecommunications" or “This site has been blocked as per instructions from Department of Telecom (DOT).”</p>
<p style="text-align: justify; ">However, these messages are not uniform across all the URLs/websites in the leaked list. BSNL does not generate any response for the majority of the URLs in the leaked list. This results in ‘invisible censorship’: the person trying to access a blocked URL has no means of knowing whether the URL is simply unavailable or has been blocked by government order.</p>
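<p style="text-align: justify; ">The distinction between a notified block and invisible censorship can be made concrete. The sketch below is hypothetical tooling, not any ISP's or researcher's actual system; the marker phrases are taken from the BSNL notices quoted above, and a real measurement study would fetch each URL from the leaked list through the ISP under test:</p>

```python
# Hypothetical sketch: classifying an ISP's response to a URL from the
# leaked list. Marker phrases come from the BSNL notices quoted above.

BLOCK_MARKERS = (
    "blocked until further notice",
    "instructions from Department of Telecom",
)

def classify_response(status_code, body):
    """Return 'notified_block', 'accessible', or 'silent_failure'."""
    if body is None:
        # Connection reset or timeout: the user sees an error and cannot
        # tell censorship apart from an ordinary outage.
        return "silent_failure"
    if any(marker in body for marker in BLOCK_MARKERS):
        return "notified_block"   # ISP serves an explicit block notice
    if 200 <= status_code < 400:
        return "accessible"
    return "silent_failure"       # opaque error, no notice given
```

<p style="text-align: justify; ">Under this classification, every silent failure for a URL known to be on the block list is a candidate instance of the invisible censorship described above.</p>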
<p style="text-align: justify; ">The lack of notification not only infringes upon the fundamental right to freedom of speech and expression but also violates the fundamental right to a constitutional remedy guaranteed under Article 32 of our Constitution. A person aggrieved by such block orders cannot approach the Court for a remedy because there is no means to figure out:</p>
<p style="text-align: justify; ">(a) what content has been blocked;</p>
<p style="text-align: justify; ">(b) who issued the block order/request;</p>
<p style="text-align: justify; ">(c) under which provision of the law the block order/request was issued;</p>
<p style="text-align: justify; ">(d) who implemented the block order/request; and</p>
<p style="text-align: justify; ">(e) what the reason for the block was.</p>
<p style="text-align: justify; ">Intermediaries should provide the above details in a notification while implementing a block order issued by the Government.</p>
<p style="text-align: justify; "><b>Intermediaries hosting third party content: </b></p>
<p align="right" style="text-align: justify; ">More than 100 of the 309 blocks are Facebook (http and https) URLs. Facebook has not informed its users about the reasons behind the unavailability of certain pages or content. This is another instance of invisible censorship. However, YouTube, a Google service, has maintained a certain level of transparency, and informs the user that the content has been blocked as per a ‘government removal request’. It is interesting to note that certain YouTube user accounts were terminated as well. It is unclear whether this was a result of the block order. Furthermore, links associated with blogger.com, another service provided by Google, have been removed.</p>
<hr />
<p align="right" style="text-align: justify; ">This was <a class="external-link" href="http://www.medianama.com/2012/09/223-analyzing-the-latest-list-of-blocked-sites-communalism-rioting-edition-part-ii/">re-posted</a> by Medianama on September 26, 2012.</p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/analyzing-the-latest-list-of-blocked-sites-communalism-and-rioting-edition-part-ii'>https://cis-india.org/internet-governance/analyzing-the-latest-list-of-blocked-sites-communalism-and-rioting-edition-part-ii</a>
</p>
No publisher · snehashish · IT Act · Social media · Freedom of Speech and Expression · Public Accountability · Internet Governance · Intermediary Liability · Social Networking · 2012-09-27T10:42:30Z · Blog Entry

Online Pre-Censorship is Harmful and Impractical
https://cis-india.org/internet-governance/online-pre-censorship-harmful-impractical
<b>The Union Minister for Communications and Information Technology, Mr. Kapil Sibal wants Internet intermediaries to pre-censor content uploaded by their users. Pranesh Prakash takes issue with this and explains why this is a problem, even if the government's heart is in the right place. Further, he points out that now is the time to take action on the draconian IT Rules which are before the Parliament.</b>
<p>Mr. Sibal is a knowledgeable lawyer, and according to a senior lawyer friend of his with whom I spoke yesterday, greatly committed to the ideals of freedom of speech. He would not lightly propose regulations that contravene Article 19(1)(a) [freedom of speech and expression] of our Constitution. Yet his recent proposals regarding controlling online speech seem unreasonable. My conclusion is that the minister has not properly grasped the way the Web works, and is frustrated by the arrogance of companies like Facebook, Google, Yahoo and Microsoft. While he has his heart in the right place, his lack of knowledge of the Internet is leading him astray. The more important concern is the <a class="external-link" href="http://www.mit.gov.in/sites/upload_files/dit/files/RNUS_CyberLaw_15411.pdf">IT Rules</a> that have been in force since April 2011.</p>
<h3>Background <br /></h3>
<p>The New York Times scooped a story on Monday revealing that Mr. Sibal and the <a class="external-link" href="http://www.mit.gov.in/">MCIT</a> had been <a class="external-link" href="http://india.blogs.nytimes.com/2011/12/05/india-asks-google-facebook-others-to-screen-user-content/?scp=2&sq=kapil%20sibal&st=cse">in touch with Facebook, Google, Yahoo, and Microsoft</a>, asking them to set up a system whereby they would manually filter user-generated content before it is published, to ensure that objectionable speech does not get published. Specifically, he mentioned content that hurt people's religious sentiments and content that Member of Parliament Shashi Tharoor described as <a class="external-link" href="http://zeenews.india.com/news/nation/i-am-against-web-censorship-shashi-tharoor_745587.html">'vile' and capable of inciting riots as being problems</a>. Lastly, Mr. Sibal defended this as not being "censorship" by the government, but "supervision" of user-generated content by the companies themselves.</p>
<h3>Concerns <br /></h3>
<p>One need not give lectures on the benefits of free speech, and Mr. Sibal is clear that he does not wish to impinge upon it. So one need not point out that freedom of speech means nothing if not the freedom to offend (as long as no harm is caused). There can, of course, be reasonable limitations on freedom of speech as provided in Article 19 of the <a class="external-link" href="http://www2.ohchr.org/english/law/ccpr.htm">ICCPR</a> and in Article 19(2) of our Constitution. My problem lies elsewhere.</p>
<h3>Secrecy <br /></h3>
<p>It is unfortunate that the New York Times has to be given credit for Mr. Sibal addressing a press conference on this issue (and he admitted as much). What he is proposing is not enforcement of existing rules and regulations, but a new restriction on online speech. This should have, in a democracy, been put out for wide-ranging public consultations first.</p>
<h3>Making intermediaries responsible <br /></h3>
<p>The more fundamental disagreement is over how the question of what should not be published should be decided, how that decision should be carried out, and who can be held liable for unlawful speech. I believe, as Mr. Sibal said in May this year <a class="external-link" href="http://online.wsj.com/article/SB10001424052702304563104576355223687825048.html">in an interview with the Wall Street Journal</a>, that "to make the intermediary liable for the user violating that code would, I think, not serve the larger interests of the market." The intermediaries (that is, all persons and companies who transmit or host content on behalf of a third party) are but messengers, just like a post office, and do not exercise editorial control, unlike a newspaper. (By all means prosecute Facebook, Google, Yahoo, and Microsoft whenever they have created unlawful content, have exercised editorial control over unlawful content, have incited and encouraged unlawful activities, or know after a court order or the like that they are hosting illegal content and still do not remove it.)</p>
<p>Newspapers have editors who can take responsibility for content published in the newspaper. They can afford to, because the number of articles in a newspaper is limited. YouTube, which has 48 hours of video uploaded every minute, cannot. One wag suggested that Mr. Sibal was not suggesting a means of censorship, but of employment generation and social welfare for censors and editors. To try and extend editorial duties to these 'intermediaries' by executive order or through 'forceful suggestions' to these companies cannot happen without amending s.79 of the Information Technology Act, which ensures they are not to be held liable for their users' content: the users are.</p>
<p>Internet speech has, to my knowledge and to date, never caused a riot in India. It is when it is translated into inflammatory speeches on the ground with megaphones that offensive speech, whether in books or on the Internet, actually becomes harmful, and those speeches should be targeted instead. And the same laws that apply to offline speech already apply online. If such speech is inciting violence then the police can be contacted and a magistrate can take action. Indeed, Internet companies like Facebook, Google, etc., exercise self-regulation already (excessively and wrongly, I sometimes feel). Any person can flag any content on YouTube or Facebook as violating the site's terms of use. Indeed, even images of breast-feeding mothers have been removed from Facebook on the basis of such complaints. So it is mistaken to think that there is no self-regulation. In two recent cases, the High Courts of Bombay (<a href="https://cis-india.org/internet-governance/janhit-manch-v-union-of-india" class="internal-link" title="Janhit Manch & Ors. v. The Union of India"><em>Janhit Manch v. Union of India</em></a>) and Madras (<em>R. Karthikeyan v. Union of India</em>) refused to direct the government and intermediaries to police online content, saying that doing so places an excessive burden on freedom of speech.</p>
<h3>IT Rules, 2011 <br /></h3>
<p>In this regard, the IT Rules published in April 2011 are great offenders. While speech that is 'disparaging' (while not being defamatory) is not prohibited by any statute, intermediaries are required not to carry 'disparaging' speech, or speech to which the user has no right (how is this to be judged? do you have rights to the last joke that you forwarded?), or speech that promotes gambling (as the government of Sikkim does through the PlayWin lottery), and a myriad other kinds of speech that are not prohibited in print or on TV. Who is to judge whether something is 'disparaging'? The intermediary itself, on pain of being liable for prosecution if it is found to have made the wrong decision. And any person may send a notice to an intermediary to 'disable' content, which has to be done within 36 hours if the intermediary doesn't want to be held liable. Worst of all, there is no requirement to inform the user whose content it is, nor to inform the public that the content is being removed. It just disappears, into a memory hole. It does not require a paranoid conspiracy theorist to see this as a grave threat to freedom of speech.</p>
<p>Many human rights activists and lawyers have made a very strong case that the IT Rules on Intermediary Due Diligence are unconstitutional. Parliament still has an opportunity to reject these rules until the end of the 2012 budget session. Parliamentarians must act now to uphold their oaths to the Constitution.</p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/online-pre-censorship-harmful-impractical'>https://cis-india.org/internet-governance/online-pre-censorship-harmful-impractical</a>
</p>
No publisher · pranesh · IT Act · Obscenity · Freedom of Speech and Expression · Public Accountability · YouTube · Social media · Internet Governance · Featured · Intermediary Liability · Censorship · Social Networking · 2011-12-12T17:00:50Z · Blog Entry

Intermediary liability law needs updating
https://cis-india.org/internet-governance/blog/business-standard-february-9-2019-sunil-abraham-intermediary-liability-law-needs-updating
<b>The time has come for India to exert its foreign policy muscle. There is a less charitable name for intermediary liability regimes like Sec 79 of the IT Act — private censorship regimes. </b>
<p style="text-align: justify; ">The article was published in <a class="external-link" href="https://www.business-standard.com/article/opinion/intermediary-liability-law-needs-updating-119020900705_1.html">Business Standard</a> on February 9, 2019.</p>
<hr />
<p style="text-align: justify; ">Intermediaries get immunity from liability emerging from user-generated and third-party content because they have no “actual knowledge” until it is brought to their notice using “take down” requests or orders.</p>
<p style="text-align: justify; ">Since some of the harm caused is immediate, irreparable and irreversible, the take-down mechanism is the preferred alternative to approaching courts in each case. When intermediary liability regimes were first enacted, most intermediaries were acting as common carriers, i.e. they did not curate the experience of users in a substantial fashion. While some intermediaries like Wikipedia continue this common carrier tradition, others driven by advertising revenue no longer treat all parties and all pieces of content neutrally. Facebook, Google and Twitter do everything they can to raise advertising revenues. They make you depressed. And if they like you, they get you to go out and vote. There is an urgent need to update intermediary liability law.</p>
<p style="text-align: justify; ">In response to being summoned by multiple governments, Facebook has announced the establishment of an independent oversight board. A global free speech court for the world’s biggest online country. The time has come for India to exert its foreign policy muscle. The amendments to our intermediary liability regime can have global repercussions, and shape the structure and functioning of this and other global courts.</p>
<p style="text-align: justify; ">While with one hand Facebook dealt the oversight board, with the other hand it took down APIs that would enable press and civil society to monitor political advertising in real time. How could they do that with no legal consequences? The answer is simple — those APIs were provided on a voluntary basis. There was no law requiring them to do so.</p>
<p style="text-align: justify; ">There are two approaches that could be followed. One, as scholar of regulatory theory Amba Kak puts it, is to “disincentivise the black box”. Most transparency reports produced by intermediaries today are voluntary; there is no requirement for them under law. Our new law could require extensive transparency, with appropriate privacy safeguards, towards the government, affected parties and the general public in terms of revenues, content production and consumption, policy development, contracts, service-level agreements, enforcement, adjudication and appeal. User empowerment measures in the user interface and algorithm explainability could also be required. The key word in this approach is transparency.</p>
<p style="text-align: justify; ">The alternative is to incentivise the black box. Here, faith is placed in technological solutions like artificial intelligence. To be fair, technological solutions may be desirable for battling child pornography, where pre-censorship (or deletion before content is published) is required. Fingerprinting technology is used to determine if the content exists in a global database maintained by organisations like the Internet Watch Foundation. A similar technology called Content ID is used to pre-censor copyright infringement. Unfortunately, this is done by ignoring the flexibilities that exist in Indian copyright law to promote education, protect access to knowledge by the disabled, etc. Even within such narrow application of technologies, there have been false positives. Recently, a video of a blogger testing his microphone was identified as a pre-existing copyrighted work.</p>
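<p style="text-align: justify; ">In its simplest form, fingerprint-based pre-censorship works by checking an upload against a database of known fingerprints before publication. The sketch below is illustrative only, not the Internet Watch Foundation's or YouTube's actual systems; it uses an exact cryptographic hash, whereas production systems use perceptual hashes that survive re-encoding, which is precisely where false positives like the microphone-test video creep in:</p>

```python
# Illustrative sketch of fingerprint matching for pre-publication
# filtering. Exact hashing is shown for simplicity; real systems such
# as Content ID perform fuzzy perceptual matching.
import hashlib

known_fingerprints = set()

def fingerprint(content: bytes) -> str:
    return hashlib.sha256(content).hexdigest()

def register(content: bytes) -> None:
    """Add a known item (e.g. a copyrighted work) to the database."""
    known_fingerprints.add(fingerprint(content))

def should_block(upload: bytes) -> bool:
    """Pre-censorship check run before an upload is published."""
    return fingerprint(upload) in known_fingerprints
```

<p style="text-align: justify; ">Note that an exact-hash scheme like this cannot apply copyright-law flexibilities such as fair dealing: the match is purely mechanical, with no assessment of context.</p>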
<p style="text-align: justify; ">The goal of a policy-maker working on this amendment should be to prevent repeats of the Shreya Singhal judgment, where sections of the IT Act were read down or struck down. To avoid similar constitutional challenges in the future, the rules should not specify any new categories of illegal content, because that would be outside the scope of the parent clause. The fifth ground in the list, “violates any law for the time being in force”, is sufficient. Additional grounds, such as “harms minors in any way”, are vague and cannot apply to all categories of intermediaries (for example, a dating site for sexual minorities). The rights of children need to be protected, but that is best done within the ongoing amendment to the POCSO Act.</p>
<p style="text-align: justify; ">As an engineer, I vote to eliminate redundancy. If there are specific offences that cannot fit in other parts of the law, those offences can be added as separate sections in the IT Act. For example, even though voyeurism is criminalised in the IT Act, the non-consensual distribution of intimate content could be criminalised, as it has been done in the Philippines.</p>
<p style="text-align: justify; ">Provisions that have to do with data retention and government access to that data for the purposes of national security and law enforcement, as well as anonymised datasets for the public interest, should be in the upcoming Data Protection law. The rules for intermediary liability are not the correct place to deal with this, because data retention may also be required of intermediaries that don’t handle any third-party information or user-generated content. Finally, there have to be clear procedures in place for the reinstatement of content that has been taken down.</p>
<hr />
<p style="text-align: justify; "><i>Disclosure: The Centre for Internet and Society receives grants from Facebook, Google and Wikimedia Foundation</i></p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/business-standard-february-9-2019-sunil-abraham-intermediary-liability-law-needs-updating'>https://cis-india.org/internet-governance/blog/business-standard-february-9-2019-sunil-abraham-intermediary-liability-law-needs-updating</a>
</p>
No publisher · sunil · Internet Governance · Intermediary Liability · 2019-02-13T00:05:30Z · Blog Entry

Webinar on counter-comments to the draft Intermediary Guidelines
https://cis-india.org/internet-governance/news/webinar-on-counter-comments-to-the-draft-intermediary-guidelines
<b>CCAOI and the ISOC Delhi Chapter organised a webinar on February 11 to discuss the comments submitted to the Information Technology [Intermediary Guidelines (Amendment) Rules] 2018, and counter-comments that were due by February 14. </b>
<p>The agenda of the discussion was:</p>
<ul>
<li>A brief introduction to the counter comment process [Shashank Mishra]</li>
<li>Invited stakeholders comment on key issues and perspectives on the submissions and the points to be countered.</li>
</ul>
<p>The following people participated:</p>
<ul>
<li>Amba Kak, Mozilla</li>
<li>Rajesh Chharia, ISPAI</li>
<li>Gurshabad Grover, CIS</li>
<li>Priyanka Chaudhari, SFLC</li>
<li>Divij Joshi, Vidhi Centre for Legal Policy</li>
</ul>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/news/webinar-on-counter-comments-to-the-draft-intermediary-guidelines'>https://cis-india.org/internet-governance/news/webinar-on-counter-comments-to-the-draft-intermediary-guidelines</a>
</p>
No publisher · Admin · Internet Governance · Intermediary Liability · Information Technology · 2019-02-22T01:51:19Z · News Item

Reading the Fine Script: Service Providers, Terms and Conditions and Consumer Rights
https://cis-india.org/internet-governance/blog/reading-between-the-lines-service-providers-terms-and-conditions-and-consumer-rights
<b>This year, an increasing number of incidents, related to consumer rights and service providers, have come to light. This blog illustrates the facts of the cases, and discusses the main issues at stake, namely, the role and responsibilities of providers of platforms for user-created content with regard to consumer rights.</b>
<p style="text-align: justify; ">On 1st July, 2014, the Federal Trade Commission (FTC) filed a complaint against T-Mobile USA,[1] accusing the service provider of 'cramming' customers' bills with millions of dollars of unauthorized charges. Recently, another service provider received flak from regulators and users worldwide after it published a paper, 'Experimental evidence of massive-scale emotional contagion through social networks'.[2] The paper described Facebook's experiment on more than 600,000 users to determine whether manipulating user-generated content would affect the emotions of its users.</p>
<p style="text-align: justify; ">In both incidents, terms that should ensure the protection of users' legal rights were used to gain consent for actions by the service providers that consumers had not anticipated when agreeing to the terms and conditions (T&Cs). More precisely, both cases point to the underlying issue of how users are bound by T&Cs and, in a mediated online landscape, highlight the need to pay attention to the regulations that govern the online engagement of users.</p>
<p style="text-align: justify; "><b>I have read and agree to the terms</b></p>
<p style="text-align: justify; ">In his statement, Chief Executive Officer, John Legere might have referred to T-Mobile as "the most pro-consumer company in the industry",<a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftn3">[3]</a> however the FTC investigation revelations, that many customers never authorized the charges, suggest otherwise. The FTC investigation also found that, T-Mobile received 35-40 per cent of the amount charged for subscriptions, that were made largely through innocuous services, that customers had been signed up to, without their knowledge or consent. Last month news broke, that just under 700,000 users 'unknowingly' participated in the Facebook study, and while the legality and ethics of the experiment are being debated, what is clear is that Facebook violated consumer rights by not providing the choice to opt in or out, or even the knowledge of such social or psychological experiments to its users.</p>
<p style="text-align: justify; ">Both incidents boil down to the sensitive question of consent. While binding agreements around the world work on the condition of consent, how do we define it and what are the implications of agreeing to the terms?</p>
<p style="text-align: justify; "><b>Terms of Service: Conditions are subject to change </b></p>
<p style="text-align: justify; ">A legal necessity, the existing terms of service (TOS)—as they are also known—as an acceptance mechanism are deeply broken. The policies of online service providers are often, too long, and with no shorter or multilingual versions, require substantial effort on part of the user to go through in detail. A 2008 Carnegie Mellon study estimated it would take an average user 244 hours every year to go through the policies they agree to online.<a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftn4">[4]</a> Based on the study, Atlantic's Alexis C. Madrigal derived that reading all of the privacy policies an average Internet user encounters in a year, would take 76 working days.<a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftn5">[5]</a></p>
<p style="text-align: justify; ">The costs of time are multiplied by the fact that terms of services change with technology, making it very hard for a user to keep track of all of the changes over time. Moreover, many services providers do not even commit to the obligation of notifying the users of any changes in the TOS. Microsoft, Skype, Amazon, YouTube are examples of some of the service providers that have not committed to any obligations of notification of changes and often, there are no mechanisms in place to ensure that service providers are keeping users updated.</p>
<p style="text-align: justify; ">Facebook has said that the recent social experiment is perfectly legal under its TOS,<a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftn6">[6]</a> the question of fairness of the conditions of users consent remain debatable. Facebook has a broad copyright license that goes beyond its operating requirements, such as the right to 'sublicense'. The copyright also does not end when users stop using the service, unless the content has been deleted by everyone else.</p>
<p style="text-align: justify; ">More importantly, since 2007, Facebook has brought major changes to their lengthy TOS about every year.<a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftn7">[7]</a> And while many point that Facebook is transparent, as it solicits feedback preceding changes to their terms, the accountability remains questionable, as the results are not binding unless 30% of the actual users vote. Facebook can and does, track users and shares their data across websites, and has no obligation or mechanism to inform users of the takedown requests.</p>
<p style="text-align: justify; ">Courts in different jurisdictions under different laws may come to different conclusions regarding these practices, especially about whether changing terms without notifying users is acceptable or not. Living in a society more protective of consumer rights is however, no safeguard, as TOS often include a clause of choice of law which allow companies to select jurisdictions whose laws govern the terms.</p>
<p style="text-align: justify; ">The recent experiment bypassed the need for informed user consent due to Facebook's Data Use Policy<a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftn8">[8]</a>, which states that once an account has been created, user data can be used for 'internal operations, including troubleshooting, data analysis, testing, research and service improvement.' While the users worldwide may be outraged, legally, Facebook acted within its rights as the decision fell within the scope of T&Cs that users consented to. The incident's most positive impact might be in taking the questions of Facebook responsibilities towards protecting users, including informing them of the usage of their data and changes in data privacy terms, to a worldwide audience.</p>
<p style="text-align: justify; "><b>My right is bigger than yours</b></p>
<p style="text-align: justify; ">Most TOS agreements, written by lawyers to protect the interests of the companies add to the complexities of privacy, in an increasingly user-generated digital world. Often, intentionally complicated agreements, conflict with existing data and user rights across jurisdictions and chip away at rights like ownership, privacy and even the ability to sue. With conditions that that allow for change in terms at anytime, existing users do not have ownership or control over their data.</p>
<p style="text-align: justify; ">In April New York Times, reported of updates to the legal policy of General Mills (GM), the multibillion-dollar food company.<a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftn9">[9]</a> The update broadly asserted that consumers interacting with the company in a variety of ways and venues no longer can sue GM, but must instead, submit any complaint to “informal negotiation” or arbitration. Since then, GM has backtracked and clarified that “online communities” mentioned in the policy referred only to those online communities hosted by the company on its own websites.<a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftn10">[10]</a> Clarification aside, as Julia Duncan, Director of Federal programs at American Association for Justice points out, the update in the terms were so broad, that they were open to wide interpretation and anything that consumers purchase from the company could have been held to this clause. <a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftn11">[11]</a></p>
<p style="text-align: justify; "><b>Data and whose rights?</b></p>
<p style="text-align: justify; ">Following Snowden revelations, data privacy has become a contentious issue in the EU, and TOS, that allow the service providers to unilaterally alter terms of the contract, will face many challenges in the future. In March Edward Snowden sent his testimony to the European Parliament calling for greater accountability and highlighted that in "a global, interconnected world where, when national laws fail like this, our international laws provide for another level of accountability."<a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftn12">[12]</a> Following the testimony came the European Parliament's vote in favor of new safeguards on the personal data of EU citizens, when it’s transferred to non-EU.<a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftn13">[13]</a> The new regulations seek to give users more control over their personal data including the right to ask for data from companies that control it and seek to place the burden of proof on the service providers.</p>
<p style="text-align: justify; ">The regulation places responsibility on companies, including third-parties involved in data collection, transfer and storing and greater transparency on concerned requests for information. The amendment reinforces data subject right to seek erasure of data and obliges concerned parties to communicate data rectification. Also, earlier this year, the European Court of Justice (ECJ) ruled in favor of the 'right to be forgotten'<a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftn14">[14]</a>. The ECJ ruling recognised data subject's rights override the interest of internet users, however, with exceptions pertaining to nature of information, its sensitivity for the data subject's private life and the role of the data subject in public life.</p>
<p style="text-align: justify; ">In May, the Norwegian Consumer Council filed a complaint with the Norwegian Consumer Ombudsman, “… based on the discrepancies between Norwegian Law and the standard terms and conditions applicable to the Apple iCloud service...”, and, “...in breach of the law regarding control of marketing and standard agreements.”<a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftn15">[15]</a> The council based its complaint on the results of a study, published earlier this year, that found terms were hazy and varied across services including iCloud, Drop Box, Google Drive, Jotta Cloud, and Microsoft OneDrive. The Norwegian Council study found that Google TOS, allow for users content to be used for other purposes than storage, including by partners and that it has rights of usage even after the service is cancelled. None of the providers provide a guarantee that data is safe from loss, while many, have the ability to terminate an account without notice. All of the service providers can change the terms of service but only Google and Microsoft give an advance notice.</p>
<p style="text-align: justify; ">The study also found service providers lacking with respect to European privacy standards, with many allowing for browsing of user content. Tellingly, Google had received a fine in January by the French Data Protection Authority, that stated regarding Google's TOS, "permits itself to combine all the data it collects about its users across all of its services without any legal basis."</p>
<p style="text-align: justify; "><b>To blame or not to blame</b></p>
<p style="text-align: justify; ">Facebook is facing a probe by the UK Information Commissioner's Office, to assess if the experiment conducted in 2012 was a violation of data privacy laws.<a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftn16">[16]</a> The FTC asked the court to order T-Mobile USA, to stop mobile cramming, provide refunds and give up any revenues from the practice. The existing mechanisms of online consent, do not simplify the task of agreeing to multiple documents and services at once, a complexity which manifolds, with the involvement of third parties.</p>
<p style="text-align: justify; ">Unsurprisingly, T-Mobile's Legere termed the FTC lawsuit misdirected and blamed the companies providing the text services for the cramming.<a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftn17">[17]</a> He felt those providers should be held accountable, despite allegations that T-Mobile's billing practices made it difficult for consumers to detect that they were being charged for unauthorized services and having shared revenues with third-party providers. Interestingly, this is the first action against a wireless carrier for cramming and the FTC has a precedent of going after smaller companies that provide the services.</p>
<p style="text-align: justify; ">The FTC charged T-Mobile USA with deceptive billing practices in putting the crammed charges under a total for 'use charges' and 'premium services' and failure to highlight that portion of the charge was towards third-party charges. Further, the company urged customers to take complaints to vendors and was not forthcoming with refunds. For now, T-Mobile may be able to share the blame, the incident brings to question its accountability, especially as going forward it has entered a pact along with other carriers in USA including Verizon and AT&T, agreeing to stop billing customers for third-party services. Even when practices such as cramming are deemed illegal, it does not necessarily mean that harm has been prevented. Often users bear the burden of claiming refunds and litigation comes at a cost while even after being fined companies could have succeeded in profiting from their actions.</p>
<p style="text-align: justify; "><b>Conclusion </b></p>
<p style="text-align: justify; ">Unfair terms and conditions may arise when service providers include terms that are difficult to understand or vague in their scope. TOS that prevent users from taking legal action, negate liability for service providers actions despite the companies actions that may have a direct bearing on users, are also considered unfair. More importantly, any term that is hidden till after signing the contract, or a term giving the provider the right to change the contract to their benefit including wider rights for service provider wide in comparison to users such as a term that that makes it very difficult for users to end a contract create an imbalance. These issues get further complicated when the companies control and profiting from data are doing so with user generated data provided free to the platform.</p>
<p style="text-align: justify; ">In the knowledge economy, web companies play a decisive role as even though they work for profit, the profit is derived out of the knowledge held by individuals and groups. In their function of aggregating human knowledge, they collect and provide opportunities for feedback of the outcomes of individual choices. The significance of consent becomes a critical part of the equation when harnessing individual information. In France, consent is part of the four conditions necessary to be forming a valid contract (article 1108 of the Code Civil).</p>
<p style="text-align: justify; ">The cases highlight the complexities that are inherent in the existing mechanisms of online consent. The question of consent has many underlying layers such as reasonable notice and contractual obligations related to consent such as those explored in the case in Canada, which looked at whether clauses of TOS were communicated reasonably to the user, a topic for another blog. For now, we must remember that by creating and organising social knowledge that further human activity, service providers, serve a powerful function. And as the saying goes, with great power comes great responsibility.</p>
<hr size="1" style="text-align: justify; " width="33%" />
<p style="text-align: justify; "><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftnref1">[1]</a> 'FTC Alleges T-Mobile Crammed Bogus Charges onto Customers’ Phone Bills', published 1 July, 2014. See: http://www.ftc.gov/news-events/press-releases/2014/07/ftc-alleges-t-mobile-crammed-bogus-charges-customers-phone-bills</p>
<p style="text-align: justify; "><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftnref2">[2]</a> 'Experimental evidence of massive-scale emotional contagion through social networks', Adam D. I. Kramera,1, Jamie E. Guilloryb, and Jeffrey T. Hancock, published March 25, 2014. See:http://www.pnas.org/content/111/24/8788.full.pdf+html?sid=2610b655-db67-453d-bcb6-da4efeebf534</p>
<p style="text-align: justify; "><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftnref3">[3]</a> 'U.S. sues T-Mobile USA, alleges bogus charges on phone bills, Reuters published 1st July, 2014 See: http://www.reuters.com/article/2014/07/01/us-tmobile-ftc-idUSKBN0F656E20140701</p>
<p style="text-align: justify; "><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftnref4">[4]</a> 'The Cost of Reading Privacy Policies', Aleecia M. McDonald and Lorrie Faith Cranor, published I/S: A Journal of Law and Policy for the Information Society 2008 Privacy Year in Review issue. See: http://lorrie.cranor.org/pubs/readingPolicyCost-authorDraft.pdf</p>
<p style="text-align: justify; "><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftnref5">[5]</a> 'Reading the Privacy Policies You Encounter in a Year Would Take 76 Work Days', Alexis C. Madrigal, published The Atlantic, March 2012 See: http://www.theatlantic.com/technology/archive/2012/03/reading-the-privacy-policies-you-encounter-in-a-year-would-take-76-work-days/253851/</p>
<p style="text-align: justify; "><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftnref6">[6]</a> Facebook Legal Terms. See: https://www.facebook.com/legal/terms</p>
<p style="text-align: justify; "><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftnref7">[7]</a> 'Facebook's Eroding Privacy Policy: A Timeline', Kurt Opsahl, Published Electronic Frontier Foundation , April 28, 2010 See:https://www.eff.org/deeplinks/2010/04/facebook-timeline</p>
<p style="text-align: justify; "><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftnref8">[8]</a> Facebook Data Use Policy. See: https://www.facebook.com/about/privacy/</p>
<p style="text-align: justify; "><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftnref9">[9]</a> 'When ‘Liking’ a Brand Online Voids the Right to Sue', Stephanie Strom, published in New York Times on April 16, 2014 See: http://www.nytimes.com/2014/04/17/business/when-liking-a-brand-online-voids-the-right-to-sue.html?ref=business</p>
<p style="text-align: justify; "><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftnref10">[10]</a> Explaining our website privacy policy and legal terms, published April 17, 2014 See:http://www.blog.generalmills.com/2014/04/explaining-our-website-privacy-policy-and-legal-terms/#sthash.B5URM3et.dpufhttp://www.blog.generalmills.com/2014/04/explaining-our-website-privacy-policy-and-legal-terms/</p>
<p style="text-align: justify; "><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftnref11">[11]</a> General Mills Amends New Legal Policies, Stephanie Strom, published in New York Times on 1http://www.nytimes.com/2014/04/18/business/general-mills-amends-new-legal-policies.html?_r=0</p>
<p style="text-align: justify; "><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftnref12">[12]</a> Edward Snowden Statement to European Parliament published March 7, 2014. See: http://www.europarl.europa.eu/document/activities/cont/201403/20140307ATT80674/20140307ATT80674EN.pdf</p>
<p style="text-align: justify; "><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftnref13">[13]</a> Progress on EU data protection reform now irreversible following European Parliament vote, published 12 March 201 See: http://europa.eu/rapid/press-release_MEMO-14-186_en.htm</p>
<p style="text-align: justify; "><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftnref14">[14]</a> European Court of Justice rules Internet Search Engine Operator responsible for Processing Personal Data Published by Third Parties, Jyoti Panday, published on CIS blog on May 14, 2014. See: http://cis-india.org/internet-governance/blog/ecj-rules-internet-search-engine-operator-responsible-for-processing-personal-data-published-by-third-parties</p>
<p style="text-align: justify; "><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftnref15">[15]</a> Complaint regarding Apple iCloud’s terms and conditions , published on 13 May 2014 See:http://www.forbrukerradet.no/_attachment/1175090/binary/29927</p>
<p style="text-align: justify; "><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftnref16">[16]</a> 'Facebook faces UK probe over emotion study' See: http://www.bbc.co.uk/news/technology-28102550</p>
<p style="text-align: justify; "><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftnref17">[17]</a> Our Reaction to the FTC Lawsuit See: http://newsroom.t-mobile.com/news/our-reaction-to-the-ftc-lawsuit.htm</p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/reading-between-the-lines-service-providers-terms-and-conditions-and-consumer-rights'>https://cis-india.org/internet-governance/blog/reading-between-the-lines-service-providers-terms-and-conditions-and-consumer-rights</a>
</p>
No publisher · jyoti · Social Media · Consumer Rights · Google · internet and society · Privacy · Transparency and Accountability · Intermediary Liability · Accountability · Facebook · Data Protection · Policies · Safety · 2014-07-04T06:31:37Z · Blog Entry
Minimising Legal Risks of Online Intermediaries while Protecting User Rights
https://cis-india.org/internet-governance/events/minimising-legal-risks-of-online-intermediaries-while-protecting-user-rights
<b>The Centre for Internet and Society (CIS) in partnership with Software Freedom Law Centre (SFLC.in) is organizing a workshop during the APrIGF event to be held at Crown Plaza, Greater Noida on August 5, 2014, 3.30 p.m. to 5.00 p.m. Jyoti Panday will be a panelist.</b>
<h3>Thematic Area of Interest</h3>
<ul>
<li>Internet business in the Asia Pacific region</li>
<li>Consumer protection for users of global Internet services</li>
<li>Internet for socio-economic development</li>
</ul>
<h3>Specific Issues of Discussions & Description</h3>
<p style="text-align: justify; ">Internet usage in the Asia Pacific region has been growing at a phenomenal rate and online service providers have benefited enormously from this growth. However, the region poses challenges for online service providers in terms of legal risks involved with respect to user generated content. Across the world from Europe to the US, it has been an accepted policy that service providers on the Internet cannot be held liable for user-generated content and this principle has found place in legislations enacted in this field in most countries. However, the Asian region has often seen blocking of services and websites due to user-generated content that is deemed to be illegal. There needs to be a debate on safe harbour provisions for intermediaries and the take-down provisions in legislations to ensure that the right to freedom of expression of citizens are protected while maintaining an environment that permits innovation in this space.</p>
<p style="text-align: justify; ">The workshop will also consider the different classes of intermediaries, how they differ functionally and if their differing roles should bear an impact on their responsibility with regards to protection of rights of users. Traditional models of consumer protection are based on distinguishing the roles and responsibilities of suppliers, facilitators and consumers. While developing consumer protection models for online intermediary platforms, their evolving roles and responsibilities as a supplier and a facilitator need to be considered. Intermediary platforms have also created and highlighted new consumer relations and issues that call for robust and fluid reddressal mechanisms.</p>
<p style="text-align: justify; ">The need to reflect on reddressal mechanisms for consumer issues pertaining to online intermediaries is also necessary, given the economic implications associated with intermediary liability. Failure to protect intermediaries stems innovation and restricts growth of start-ups and small to medium enterprises in the digital economy and has negative financial implications. Moreover, intermediaries are crucial in connecting developing countries to global markets and a failure to protect them, creates a barrier to information exchange and capacity building.</p>
<p style="text-align: justify; ">The panel will discuss the following issues:</p>
<ul>
<li>Take-down procedures and Put-back provisions used in various countries in the region</li>
<li>Safe-harbour provisions for intermediaries</li>
<li>Need for classification of Intermediaries for the purpose of a take-down regime and user rights</li>
<li>Rights of users of services provided by online intermediaries </li>
<li>Recommendations for a balanced intermediary liability regime</li>
</ul>
<h3 style="text-align: justify; "></h3>
<h3 style="text-align: justify; ">Expected Format and Confirmed Panel Members</h3>
<p>The workshop will be a ninety-minute panel divided into two sessions of forty-five minutes each. The proposed panel includes:</p>
<p style="text-align: justify; "><b>Mishi Choudhary</b> (Moderator) SFLC.IN Civil Society India<br />Mishi Choudhary is the founding director of SFLC India. She started working with SFLC in New York following the completion of her fellowship during which she earned her LLM from Columbia Law School and was a Stone Scholar. In addition to her LLM, she has an LLB and a bachelors degree in political science from the University of Delhi, India.</p>
<p style="text-align: justify; "><b>Jyoti Panday</b>, Center for Internet and Society, Civil Society, India <br />Jyoti Panday is Programme Officer at the Centre for Internet and Society working on Internet governance and on issues related to the role and responsibility of intermediaries in protecting user rights and freedom of expression. She has experience in strategy, campaign management and research on issues and processes related to the development agenda, sustainability and democracy. She has completed her MSc in Public Policy from Queen Mary, University of London.</p>
<p style="text-align: justify; "><b>Shahzad Ahmed</b>, Bytes for All Pakistan, Civil Society, Pakistan<br />Shahzad Ahmad is the Country Coordinator of Bytes for All, Pakistan and founder of the Digital Rights Institute (DRI). He is currently working on issues of ICT policy advocacy, internet rights and freedom of expression. He is a development communications expert and is at the forefront of the Internet Rights movement in Pakistan.</p>
<p style="text-align: justify; ">Mr. Ahmad is a Diplo Fellow, Executive Board Member of the Association for Progressive Communications, Advisory Board Member of .PK ccTLD and a member of the International Advisory Board of Privacy International, UK. He regularly contributes to various publications and research studies on ICTs for development, freedom of expression and gender related issues. Widely travelled, he regularly participates in various forums at local, regional and global level. Mr. Ahmad maintains a strong engagement with broader civil society networks and strongly believes in participation and openness.</p>
<p style="text-align: justify; "><b>Professor KS Park</b>, Korea University Law School Professor <br />One of the founders of Open Net Korea, Professor Park has written and is active in internet, free speech, privacy, defamation, copyright, international business contracting, etc. He has given expert testimonies in high-profile free speech cases including the /Minerva /case, the internet real name verification case, the military’s subversive book blacklisting case, the newspaper consumers’ boycott case, and the Park Jung-Geun Retweet case. As a result, the “false news” crime and the internet real name verification laws were struck down as unconstitutional, Park Jung-Geun and Minerva acquitted, the soldiers challenging book blacklisting reinstated, the newspaper boycotters acquitted partially as to the “secondary boycotting” charge (2010-2013).</p>
<p style="text-align: justify; ">Since 2006, he serves as the Executive Director of the PSPD Law Center, a non-profit entity that has organized several impact litigations in the areas of free speech, privacy, and copyright. There, the Law Center won the world’s first damage lawsuit against a copyright holder for “bad faith” takedown (2009) and the first damage lawsuit against a portal for warrantless disclosure of the user identity data to the police (2012).</p>
<p style="text-align: justify; "><b>Arvind Gupta</b>, National Head-Information and Technology, Government/ BJP Political party, India<br />National Head, BJP Information Technology Cell</p>
<p style="text-align: justify; "><b>Faisal Farooqui</b>, CEO, MouthShut.com, Private Sector, India<br />Faisal Farooqui is a highly recognized entrepreneur who is among the trailblazers of his generation. Faisal has founded and managed two successful Internet and technology companies -MouthShut.com, India's largest consumer review and social media portal and Zarca Interactive, a Virginia based enterprise survey and feedback company.</p>
<p style="text-align: justify; "><b>Ramanjit Singh Chima</b>, Google, Private Sector, India<br />Raman Jit Singh Chima serves as Policy Counsel and Government Affairs Manager for Google, based in New Delhi. He currently helps lead Google'spublic policy and government affairs work in India. He is a graduate of the Bachelors in Arts and Law (Honours) programme of the National Law School of India University, Bangalore. While at the National Law School, he was Chief Editor of the Indian Journal of Law and Technology. He has studied Internet regulation as an independent research fellow with the Sarai programme of the Centre for the Study of Developing Societies and contributed to Freedom House's 2009 Freedom on the Internet report.</p>
<p style="text-align: justify; "><b>Apar Gupta</b>, Legal, India <br />Apar Gupta is a practicing lawyer in Delhi working as a Partner at the law firm of Advani & Co. His practice areas include, commercial litigation and arbitration with a focus on technology and media. Apar as a retained counsel, represents an internet industry organisation in government affairs, including consultations on draft laws and policies which effect the sector. These issues include legal risks of intermediaries, media freedom and consumer rights. He has completed his masters in law from Columbia Law School, New York and has written columns for the Business Standard, Indian Express and the Pioneer on legal issues. Apar also is a visiting faculty at National Law University, Delhi.</p>
<h3 style="text-align: justify; ">Full Name, Affiliation and Contact Details of the Workshop Organizer</h3>
<p>The workshop will be jointly organised by SFLC.IN and the Centre for Internet & Society, India. The details of the contact persons for the workshop are given below:</p>
<ol>
<li>Ms. Mishi Choudhary, Executive Director, SFLC.IN<br />E: mishi@softwarefreedom.org</li>
<li>Jyoti Panday, Centre for Internet & Society, India<br />E: jyoti@cis-india.org</li>
</ol>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/events/minimising-legal-risks-of-online-intermediaries-while-protecting-user-rights'>https://cis-india.org/internet-governance/events/minimising-legal-risks-of-online-intermediaries-while-protecting-user-rights</a>
</p>
No publisher · praskrishna · Freedom of Speech and Expression · Internet Governance · Event · Intermediary Liability · 2014-07-29T07:50:51Z · Event