The Centre for Internet and Society
https://cis-india.org
Notes From a Foreign Field: The European Court of Human Rights on Russia’s Website Blocking
https://cis-india.org/internet-governance/blog/notes-from-a-foreign-field-the-european-court-of-human-rights-on-russia2019s-website-blocking
<b>This blogpost summarises the human rights principles applied by the Court to website blocking, and discusses how they can be instructive to petitions in the Delhi High Court that challenge arbitrary censorship in India.</b>
<p class="has-text-align-justify">This blogpost was authored by Gurshabad Grover and Anna Liz Thomas. It was first published at the <a class="external-link" href="https://indconlawphil.wordpress.com/2021/02/05/notes-from-a-foreign-fieldthe-european-court-of-human-rights-on-russias-website-blocking-guest-post/">Indian Constitutional Law and Philosophy Blog</a> on February 5, 2021, and has been reproduced here with permission.</p>
<hr />
<p class="has-text-align-justify">From PUBG to TikTok, online services
are regularly blocked in India under an opaque censorship regime flowing
from section 69A of the Information Technology (IT) Act. Russia happens
to have a very similar online content blocking regime, parts and
processes of which were recently challenged in the European Court of
Human Rights (‘the Court’). This blogpost summarises the human rights
principles applied by the Court to website blocking, and discusses how
they can be instructive to petitions in the Delhi High Court that
challenge arbitrary censorship in India.</p>
<h3><strong>Challenges to Russia’s Website Blocking Practices</strong></h3>
<p class="has-text-align-justify">On 23 June 2020, the Court delivered <a href="https://strasbourgobservers.com/2020/08/26/the-strasbourg-court-establishes-standards-on-blocking-access-to-websites/">four judgements</a>
on the implementation of Russia’s Information Act, under which content
on the internet can be deemed illegal and taken down or blocked. Under
some of these provisions, a court order is not required, and the
government can send a blocking request directly to Roskomnadzor,
Russia’s telecom service regulator. Roskomnadzor, in turn, requests
internet service providers (ISPs) to block access to the webpage or
websites. Roskomnadzor also notifies the website owner within 24 hours.
Under the law, once the website owner notifies Roskomnadzor that the illegal content has been removed from the website, Roskomnadzor verifies the removal and informs ISPs that access to the website may be restored for users.</p>
<p class="has-text-align-justify">In the case of <a href="https://hudoc.echr.coe.int/eng#%7B%22itemid%22:%5B%22001-203177%22%5D%7D"><em>Vladimir Kharitonov</em></a><em>, </em>the
complainant’s website had been blocked as a result of a blocking order
against another website, which shared the same IP address as that of the
complainant. In <a href="https://hudoc.echr.coe.int/eng#%7B%22itemid%22:%5B%22001-203180%22%5D%7D"><em>Engels</em></a><em>, </em>the
applicant’s website had been ordered by a court to be blocked for
having provided information about online censorship circumvention tools,
despite the fact that such information was not unlawful under any
Russian law. <em><a href="https://hudoc.echr.coe.int/eng#%7B%22itemid%22:%5B%22001-203178%22%5D%7D">OOO Flavius</a></em>
concerned three online media outlets that had their entire websites
blocked on the grounds that some of their webpages may have featured
unlawful content. Similarly, in the case of <a href="https://hudoc.echr.coe.int/eng#%7B%22itemid%22:%5B%22001-203181%22%5D%7D"><em>Bulgakov</em></a><em>, </em>the
implementation of a blocking order targeting extremist content (one
particular pamphlet) had the effect of blocking access to the
applicant’s entire website. In both the cases of <em>Engels </em>and <em>Bulgakov, </em>where court proceedings had taken place, the proceedings had been concluded <em>inter se </em>the
Prosecutor General and server providers, without the involvement of the
website owner. In all four cases, appeals to higher Russian courts had
been summarily dismissed. Even in those cases where website owners had
taken down the offending content, their websites had not been restored.</p>
<p class="has-text-align-justify">The Court assessed the law and its
application on the basis of a three-part test on whether the censorship
is (a) prescribed by law (including foreseeability and accessibility
aspects of the law), (b) necessary (and proportionate) in a democratic
society, and (c) pursuing a legitimate aim.</p>
<p class="has-text-align-justify">Based on the application of these
tests, the Court ruled against the Russian authorities in all four
cases. The Court also held that the wholesale blocking of entire
websites was an extreme measure tantamount to banning a newspaper or a
television station, which has the collateral effect of interfering with
lawful content. According to the Court, blocking entire websites can
thus amount to prior restraint, which is only justified in exceptional
circumstances.</p>
<p class="has-text-align-justify">The Court further held that procedural
safeguards were required under domestic law in the context of online
content blocking, such as the government authorities: (a) conducting an
impact assessment prior to the implementation of blocking measures; (b)
providing advance notice to website owners, and their involvement in
blocking proceedings; (c) providing interested parties with the
opportunity to remove illegal content or apply for judicial review; and
(d) requiring public authorities to justify the necessity and
proportionality of blocking, provide reasons as to why less intrusive
means could not be employed and communicate the blocking request to the
owner of the targeted website.</p>
<p class="has-text-align-justify">The Court also referenced an earlier judgment it had issued in the case of <em>Ahmet Yildirim vs. Turkey, </em> acknowledging
that content creators are not the only ones affected; website blocking
interferes with the public’s right to receive information.</p>
<p class="has-text-align-justify">The Court also held that the
participation of the ISP as a designated defendant was not enough in the
case of court proceedings concerning blocking requests, because the ISP
has no vested interest in the proceedings. Therefore, in the absence of
a targeted website’s owner, blocking proceedings in court would lose
their adversarial nature, and would not provide a forum for interested
parties to be heard.</p>
<h3><strong>Implications for India</strong></h3>
<p class="has-text-align-justify">The online censorship regime in India is similar to Russia’s in terms of legal procedure, but perhaps worse when it comes to the architecture of the law’s implementation. Note that for
this discussion, we will restrict ourselves to government-directed
blocking and not consider court orders for content takedown (the latter
may also include intellectual property infringement and defamatory
content).</p>
<p class="has-text-align-justify"><a href="https://indiankanoon.org/doc/10190353/">Section 69A</a>
of the Information Technology (IT) Act permits the Central Government
to order intermediaries, including ISPs, to block online content on
several grounds when it thinks it is “necessary or expedient” to do so.
Amongst others, these grounds include national security, public order
and prevention of cognisable offences.</p>
<p class="has-text-align-justify">In 2009, the Information Technology (Procedure and Safeguards for Blocking for Access of Information by Public) Rules, 2009 (‘<a href="https://cis-india.org/internet-governance/resources/information-technology-procedure-and-safeguards-for-blocking-for-access-of-information-by-public-rules-2009">blocking rules</a>’)
were issued under the Act. They lay out an entirely executive-driven
process: a committee (consisting entirely of secretaries from various
Ministries) examines blocking requests from various government
departments, and finally orders intermediaries to block such content.</p>
<p class="has-text-align-justify">As per Rule 8, the chairperson of this committee is required to “make all reasonable efforts to identify the person <strong>or</strong> intermediary who has hosted the information” (emphasis ours), send them a notice, and give them an opportunity for a hearing. A plain reading suggests that the content creator may thus be left out of the blocking proceedings entirely. Even this safeguard can be circumvented in “emergency” situations as described in Rule 9, under which blocking orders can be issued immediately. The rules require such orders to be examined by the committee within the next two days, at which point it can decide to continue or rescind the block.</p>
<p class="has-text-align-justify">The rules also task a separate committee, <a href="https://cis-india.org/internet-governance/resources/rule-419-a-indian-telegraph-rules-1951">appointed</a> under the Telegraph Act, with meeting every two months to review all blocking orders. Pertinently, that committee, too, is composed entirely of ministerial secretaries.</p>
<p class="has-text-align-justify">These are the limited safeguards
prescribed in the rules. Public accountability in the law is further
severely limited by a requirement of strict confidentiality (Rule 16) of
blocking orders. With no judicial, parliamentary or public oversight,
it is easy to see how online censorship in India operates in complete
secrecy, making it <a href="https://scroll.in/article/953146/how-india-is-using-its-information-technology-act-to-arbitrarily-take-down-online-content">susceptible</a> to wide abuse.</p>
<p class="has-text-align-justify">When the constitutionality of the provision and the blocking rules was challenged in <a href="https://indiankanoon.org/doc/110813550/"><em>Shreya Singhal v. Union of India</em></a>, the Supreme Court was satisfied with these minimal safeguards. However, it upheld the rules for only two reasons. First, it noted that an opportunity of a hearing is given “to the originator <strong>and</strong> intermediary” (emphasis ours: notice how this differs from the ‘or’ in the blocking rules). Second, it specifically noted that the law required reasoned orders that could be challenged through writ petitions.</p>
<p class="has-text-align-justify">On this blog, Gautam Bhatia has earlier <a href="https://indconlawphil.wordpress.com/2015/03/25/the-supreme-courts-it-act-judgment-and-secret-blocking/">argued</a> that the judgment should therefore be read as obligating the government to mandatorily notify the content creator before issuing blocking orders.
Unfortunately, the reality of the implementation of the law has <a href="https://scroll.in/article/953146/how-india-is-using-its-information-technology-act-to-arbitrarily-take-down-online-content">not lived up</a> to this optimism. While intermediaries (ISPs when it comes to website blocking) <em>may</em> be getting a chance to respond, content creators are almost never given a hearing. As we saw in the European Court’s judgments, ISPs do not have any incentive to challenge the government’s directions.</p>
<p class="has-text-align-justify">Additionally, although the law states that “reasons [for blocking content are] to be recorded in writing”, <a href="https://internetfreedom.in/whistleblower-provides-website-blocking-orders-on-4000-websites/">leaked blocking orders</a> suggest that even ISPs are not given this information. Apart from the opacity around the rationale for blocking, RTI requests to uncover even the <em>list</em> of blocked websites have been <a href="https://www.hindustantimes.com/analysis/to-preserve-freedoms-online-amend-the-it-act/story-aC0jXUId4gpydJyuoBcJdI.html">repeatedly</a> rejected (for comparison, Roskomnadzor at least maintains a <a href="https://blocklist.rkn.gov.ru/">public registry</a> of websites blocked in Russia). This lack of transparency and fair proceedings also means that <em>entire</em> websites may be getting blocked when only specific web pages on them host unlawful content.</p>
<p class="has-text-align-justify">When it comes to the technical methods
of blocking, the rules are silent, leaving this decision to the ISPs.
While a recent study by the Centre for Internet and Society showed that
popular ISPs are <a href="https://arxiv.org/pdf/1912.08590.pdf">using methods</a> that target specific websites, there are some recent reports that <a href="https://theprint.in/judiciary/us-firm-one-signal-moves-delhi-hc-says-ip-address-blocked-in-india-without-intimation/587852/">suggest</a>
ISPs may be blocking IP addresses too. The latter can have the effect
of blocking access to other websites that are hosted on the same
address.</p>
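<p class="has-text-align-justify">The collateral effect of IP-level blocking can be illustrated with a toy sketch (all hostnames, addresses, and the blocklist below are invented placeholders, not any ISP’s actual implementation): on shared hosting, many websites resolve to a single IP address, so a filter that matches on IP blocks all of them at once.</p>

```python
# Hypothetical DNS records: three unrelated sites share one
# shared-hosting IP address, a fourth lives elsewhere.
dns = {
    "target-site.example":   "203.0.113.10",
    "innocent-blog.example": "203.0.113.10",  # same shared host
    "shop.example":          "203.0.113.10",  # same shared host
    "unrelated.example":     "198.51.100.7",
}

# A blocking order aimed only at target-site.example, but
# implemented at the IP level rather than per-hostname.
blocked_ips = {"203.0.113.10"}

def is_reachable(hostname: str) -> bool:
    """A host is unreachable if its IP is on the blocklist."""
    return dns[hostname] not in blocked_ips

# Every co-hosted site is blocked alongside the intended target.
collateral = [h for h in dns
              if h != "target-site.example" and not is_reachable(h)]
print(collateral)  # ['innocent-blog.example', 'shop.example']
```

<p class="has-text-align-justify">Hostname-targeted methods (such as HTTP or SNI-based filtering) avoid this particular failure mode, which is why the shift towards IP blocking reported above is consequential.</p>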
<p class="has-text-align-justify">There are two challenges to the rules
in the Delhi High Court, serving as opportunities for reform of website
blocking and content takedown in India. The first was filed in December
2019 by <a href="https://internetfreedom.in/delhi-hc-issues-notice-to-the-government-for-blocking-satirical-dowry-calculator-website/">Tanul Thakur</a>,
whose website DowryCalculator.com (a satirical take on the practice of
dowry) was blocked without any notice or hearing. The committee responsible for passing blocking orders never contacted Thakur, despite the fact that he has publicly claimed ownership of the website multiple times and has been interviewed by the media about it. When Thakur <a href="https://drive.google.com/file/d/0B2NvpMoZE5HGbGVCOG5TNVF6RDRGXzk5T3VNMlhTQ0E3QUlz/view">filed</a> an RTI request asking why DowryCalculator.com was blocked, the Ministry of Electronics cited the confidentiality rule to refuse to share the information!</p>
<p class="has-text-align-justify">This month, One Signal Inc., an American company providing mobile notification services, <a href="https://theprint.in/judiciary/us-firm-one-signal-moves-delhi-hc-says-ip-address-blocked-in-india-without-intimation/587852/">alleged</a> that ISPs are blocking its IP address, and petitioned the court to set aside any government order to that effect because it did not receive a hearing. Interestingly, the IP address belongs to a popular hosting
service provider, which serves multiple websites. Considering this fact
and the lack of transparency in blocking orders, one may question
whether One Signal was the intended target at all! The European Court’s
judgment in <em>Vladimir Kharitonov</em> is quite relevant here: ISPs
should not be blocking IP addresses that are shared amongst multiple
websites, because such a measure can cause collateral damage, and make
other legitimate expression inaccessible.</p>
<p class="has-text-align-justify">Given the broad similarities between
the Indian and Russian website blocking regimes, the four judgements by
the European Court of Human Rights will be instructive to the Delhi High
Court. Note that section 69A is used for content takedown in general (i.e. censoring posts on Twitter, not just blocking websites): the right to a hearing must extend to all such content creators. The principles applied by the European Court can thus provide a more rights-respecting foundation for content blocking in India, whether upheld by the judiciary or enacted by the legislature.</p>
gurshabad | Content takedown, 69A, Constitutional Law | 2021-02-13T08:42:18Z | Blog Entry
Donald Trump is attacking the social media giants; here’s what India should do differently
https://cis-india.org/internet-governance/blog/donald-trump-is-attacking-the-social-media-giants-here2019s-what-india-should-do-differently
<b>For a robust and rights-respecting public sphere, India needs to ensure that large social media platforms receive adequate protections, and are made more responsible to their users.</b>
<p>This piece was first published at <a class="external-link" href="https://scroll.in/article/965151/donald-trump-is-attacking-the-social-media-giants-heres-what-india-should-do-differently">Scroll</a>. The authors would like to thank Torsha Sarkar for reviewing and editing the piece, and Divij Joshi for his feedback.</p>
<hr />
<div id="article-contents" class="article-body">
<p>In retaliation to Twitter <a class="link-external" href="https://www.nytimes.com/2020/05/26/technology/twitter-trump-mail-in-ballots.html" rel="nofollow noopener" target="_blank">labelling</a> one of US President Donald Trump’s tweets as being misleading, the White House signed an <a class="link-external" href="https://www.whitehouse.gov/presidential-actions/executive-order-preventing-online-censorship/" rel="nofollow noopener" target="_blank">executive order</a>
on May 28 that seeks to dilute protections that social media companies
in the US have with respect to third-party content on their platforms.</p>
<p>The
order argues that social media companies that engage in censorship stop
functioning as ‘passive bulletin boards’: they must consequently be
treated as ‘content creators’, and be held liable for content on their
platforms as such. The shockwaves of the decision soon reached India,
with news coverage of the event <a class="link-external" href="https://www.business-standard.com/article/companies/trump-twitter-spat-debate-rages-on-role-of-social-media-companies-120053100055_1.html" rel="nofollow noopener" target="_blank">starting</a> to <a class="link-external" href="https://economictimes.indiatimes.com/tech/internet/feud-between-donald-trump-and-jack-dorsey-can-have-long-lasting-effects-on-how-we-consume-media-in-india/articleshow/76111556.cms" rel="nofollow noopener" target="_blank">debate</a> the <a class="link-external" href="https://economictimes.indiatimes.com/tech/internet/trumps-move-against-social-media-cos-unlikely-to-change-indias-stand/articleshow/76094586.cms?from=mdr" rel="nofollow noopener" target="_blank">consequences</a> of Trump’s order on how India regulates internet services and social media companies.</p>
<p>The debate on the responsibilities of online platforms is not new to India, and recently took centre stage in December 2018, when the Ministry of Electronics and Information Technology (MeitY) published a draft set of guidelines that most online services – ‘intermediaries’ – must follow. The draft rules, which haven’t been notified yet, propose to significantly expand the obligations on intermediaries.</p>
<p>Trump’s
executive order, however, comes in the context of content moderation
practices by social media platforms, i.e. when platforms censor speech
of their volition, and not because of legal requirements. The legal
position of content moderation is relatively under-discussed, at least
in legal terms, when it comes to India.</p>
<p>In contrast to
commentators who have implicitly assumed that Indian law permits content
moderation by social media companies, we believe Indian law fails to
adequately account for content moderation and curation practices
performed by social media companies. There may be adverse consequences
for the exercise of freedom of expression in India if this lacuna is not
filled soon.</p>
<h3 class="cms-block cms-block-heading">India vs US<br /></h3>
<p>A
useful starting point for the analysis is to compare how the US and
India regulate liability for online services. In the US, Section 230 of
the Communications Decency Act provides online services with broad
immunity from liability for third party content that they host or
transmit.</p>
<p>There are two critical components to what is generally referred to as Section 230.</p>
<p>First,
providers of an ‘interactive computer service’, like your internet
service provider or a company like Facebook, will not be treated as
publishers or speakers of third-party content. This system has allowed
the internet speech and economy to <a class="link-external" href="https://law.emory.edu/elj/content/volume-63/issue-3/articles/how-law-made-silicon-valley.html" rel="nofollow noopener" target="_blank">flourish</a>
since it allows companies to focus on their service without a constant
paranoia for what users are transmitting through their service.</p>
<p>The
second part of Section 230 states that services are allowed to moderate
and remove, in ‘good faith’, such third-party content that they may
deem offensive or obscene. This allows for online services to instate
their own community guidelines or content policies.</p>
<p>In India,
section 79 of the Information Technology Act is the analogous provision:
it grants intermediaries conditional ‘safe harbour’. This means
intermediaries, again like Facebook or your internet provider, are
exempt from liability for third-party content – like messages or videos
posted by ordinary people – provided their functioning meets certain
requirements, and they comply with the allied rules, known as
Intermediary Guidelines.</p>
<p>The notable and stark difference between
Indian law and Section 230 is that India’s IT Act is largely silent on
content moderation practices. As Rahul Matthan <a class="link-external" href="https://www.livemint.com/opinion/columns/shield-online-platforms-for-content-moderation-to-work-11591116270685.html" rel="nofollow noopener" target="_blank">points out</a>,
there is no explicit allowance in Indian law for platforms to take down
content based on their own policies, even if such actions are done in
good faith.</p>
<h3 class="cms-block cms-block-heading">Safe harbour</h3>
<p>One
may argue that the absence of an explicit permission does not
necessarily mean that any platform engaging in content moderation
practices will lose its safe harbour. However, the language of Section
79 and the allied rules may even create room for divesting social media
platforms of their safe harbour.</p>
<p>The first such indication is that, as conditions to qualify for safe harbour, intermediaries must not modify the content in question, must not select the recipients of particular content, and must take information down when it is brought to their notice by governments or courts.</p>
<p>Most of the conditions are an almost verbatim copy of the ‘mere conduit’ definition in the EU Directive on E-Commerce, 2000. This definition was meant to encapsulate the
functioning of services like infrastructure providers, which transmit
content without exerting any real control. Thus, by adopting this
definition for all intermediaries, Indian law mostly considers internet
services, even social media platforms, to be passive plumbing through
which information flows.</p>
<p>It is easy to see how this narrow conception of online services is severely <a class="link-external" href="https://georgetownlawtechreview.org/wp-content/uploads/2018/07/2.2-Gilespie-pp-198-216.pdf" rel="nofollow noopener" target="_blank">lacking</a>.</p>
<p>Most prominent social media platforms <a class="link-external" href="http://guidelines." rel="nofollow noopener" target="_blank">remove</a> or <a class="link-external" href="https://techcrunch.com/2019/12/16/instagram-fact-checking/" rel="nofollow noopener" target="_blank">hide</a> content, <a class="link-external" href="https://about.fb.com/news/2016/06/building-a-better-news-feed-for-you/" rel="nofollow noopener" target="_blank">algorithmically curate</a> news-feeds to make users keep coming back for more, and increasingly add <a class="link-external" href="https://blog.twitter.com/en_us/topics/product/2020/updating-our-approach-to-misleading-information.html" rel="nofollow noopener" target="_blank">labels</a>
to content. If the law is interpreted strictly, these practices may be
adjudged to run afoul of the aforementioned conditions that
intermediaries need to satisfy in order to qualify for safe harbour.</p>
<h3 class="cms-block cms-block-heading">Platforms or editors?<br /></h3>
<p>For
instance, it can be argued that social media platforms initiate
transmission in some form when they pick and ‘suggest’ relevant
third-party content to users. When it comes to newsfeeds, neither the content creator nor the consumer has as much control over how content is disseminated or curated as the platform does. By curating newsfeeds, social media platforms can be said to be essentially ‘selecting the receiver’ of transmissions.</p>
<p>The Intermediary
Guidelines further complicate matters by specifically laying out what is
not to be construed as ‘editing’ under the law. Under rule 3(3), the
act of taking down content pursuant to orders under the Act will not be
considered as ‘editing’ of said content.</p>
<p>Since the term ‘editing’
has been left undefined beyond the negative qualification, several
social media intermediaries may well qualify as editors. They use
algorithms that curate content for their users; like traditional news
editors, these algorithms use certain <a class="link-external" href="https://www.researchgate.net/profile/Michael_Devito/publication/302979999_From_Editors_to_Algorithms_A_values-based_approach_to_understanding_story_selection_in_the_Facebook_news_feed/links/5a19cc3d4585155c26ac56d4/From-Editors-to-Algorithms-A-values-based-approach-to-understanding-story-selection-in-the-Facebook-news-feed.pdf" rel="nofollow noopener" target="_blank">‘values’</a>
to determine what is relevant to their audiences. In other words, one
can argue that it is difficult to draw a bright line between editorial
and algorithmic acts.</p>
<p>To retain their safe harbour, the counter-argument on which social media platforms can rely is that Rule 3(5) of the Intermediary Guidelines requires intermediaries to inform users that they reserve the right to take down user content relating to a wide variety of acts, including content that threatens national security, or is “[...] grossly harmful, harassing, blasphemous, [etc.]”.</p>
<p>In practice, however, the
content moderation practices of some social media companies may go
beyond these categories. Additionally, the rule does not address the
legal questions created by these platforms’ curation of news-feeds.</p>
<p>We highlight how Section 79 treats the practices of social media platforms not to argue that these platforms should be held liable for user-generated content. Online spaces created by social media platforms have allowed individuals to express themselves and participate in political organisation and <a class="link-external" href="https://www.pewresearch.org/internet/2018/07/11/public-attitudes-toward-political-engagement-on-social-media/" rel="nofollow noopener" target="_blank">debate</a>.</p>
<p>A level of immunity for intermediaries is therefore critical for the protection of several human rights, especially the right to freedom of speech. This piece only serves to highlight that section 79 is antiquated and unfit to deal with modern online services. The interpretative dangers in the provision create regulatory uncertainty for organisations operating in India.</p>
<h3 class="cms-block cms-block-heading">Dangers to speech<br /></h3>
<p>These dangers may not just be theoretical.</p>
<p>Only last year, Twitter CEO Jack Dorsey was <a class="link-external" href="https://www.hindustantimes.com/india-news/twitter-ceo-jack-dorsey-summoned-by-parliamentary-panel-on-feb-25-panel-refuses-to-hear-other-officials/story-8x9OUbNBo36uvp92L5nOKI.html" rel="nofollow noopener" target="_blank">summoned</a>
by the Parliamentary Committee on Information Technology to answer
accusations of the platform having a bias against ‘right-wing’ accounts.
More recently, BJP politician Vinit Goenka <a class="link-external" href="https://www.medianama.com/2020/06/223-vinit-goenka-twitter-khalistan/" rel="nofollow noopener" target="_blank">encouraged people to file cases against Twitter</a> for promoting separatist content.</p>
<p>Recent <a class="link-external" href="https://sflc.in/sites/default/files/reports/Intermediary_Liability_2_0_-_A_Shifting_Paradigm.pdf" rel="nofollow noopener" target="_blank">interventions</a>
from the Supreme Court have imposed proactive filtration and blocking
requirements on intermediaries, but these have been limited to
reasonable restrictions that may be imposed on free speech under Article
19 of India’s Constitution. Content moderation policies of
intermediaries like Twitter and Facebook go well beyond the scope of
Article 19 restrictions, and the apex court has not yet addressed this.</p>
<p>The Delhi High Court, in <em>Christian Louboutin v. Nakul Bajaj</em>, has already highlighted criteria for when e-commerce intermediaries can stake a claim to Section 79 safe harbour protections based on the active (or passive) nature of their services. While the order came in the context of intellectual property violations, nothing keeps a court from similarly finding that Facebook and Twitter play an ‘active’ role when it comes to content moderation and curation.</p>
<p>These companies may one day
find the ‘safe harbour’ rug pulled from under their feet if a court
reads section 79 more strictly. In fact, judicial intervention may not
even be required. The threat of such an interpretation may simply be
exploited by the government, and used as leverage to get social media
platforms to toe the government line.</p>
<h3 class="cms-block cms-block-heading">Protection and responsibility<br /></h3>
<p>Unfortunately, the amendments to the intermediary guidelines proposed in 2018 do not address the legal position of content moderation either. More recent developments <a class="link-external" href="https://www.medianama.com/2020/04/223-meity-information-technology-act-amendments/" rel="nofollow noopener" target="_blank">suggest</a> that MeitY may be contemplating amending the IT Act. This presents an opportunity for a more comprehensive reworking of the Indian intermediary liability regime than what is possible through delegated legislation like the intermediary rules.</p>
<p>Intermediaries, rather
than being treated uniformly, should be classified based on their
function and the level of control they exercise over the content they
process. For instance, network infrastructure should continue to be
treated as ‘mere conduits’ and enjoy broad immunity from liability for
user-generated content.</p>
<p>More complex services like search engines and online social media platforms can have differentiated responsibilities based on the extent to which they can contextualise and change content. The law should carve out an explicit permission for platforms to moderate content in good faith. Such an allowance should be accompanied by best practices that these platforms can follow to ensure <a class="link-external" href="https://santaclaraprinciples.org/" rel="nofollow noopener" target="_blank">transparency and accountability</a> to their users.</p>
<p>For a robust and rights-respecting public sphere, India needs to ensure that large social media platforms receive adequate protections, and are made more responsible to their users.</p>
<p><em>Anna Liz Thomas is a law
graduate and a policy researcher, currently working with the Centre for
Internet and Society. Gurshabad Grover manages research in the freedom
of expression and internet governance team at CIS</em>.</p>
</div>
Anna Liz Thomas and Gurshabad Grover · Content takedown · Freedom of Speech and Expression · Intermediary Liability · 2020-06-25 · Blog Entry

Why should we care about takedown timeframes?
https://cis-india.org/internet-governance/blog/why-should-we-care-about-takedown-timeframes
<b>The issue of content takedown timeframes, i.e. the time period an intermediary is allotted to respond to a legal takedown order, has received considerably less attention in conversations about intermediary liability. This article examines the importance of framing an appropriate timeframe to ensure that speech online is not over-censored, and offers recommendations towards the same.</b>
<p><em>This article first <a class="external-link" href="https://cyberbrics.info/why-should-we-care-about-takedown-timeframes/">appeared</a> on the CyberBRICS website. It has since been <a class="external-link" href="https://www.medianama.com/2020/04/223-content-takedown-timeframes-cyberbrics/">cross-posted</a> to Medianama.</em></p>
<p><em>The findings and opinions expressed in this article are derived from the larger research report 'A deep dive into content takedown timeframes', which can be accessed <a class="external-link" href="https://cis-india.org/internet-governance/files/a-deep-dive-into-content-takedown-frames">here</a>.</em></p>
<p><strong>Introduction</strong></p>
<p>Since the Ministry of Electronics and Information Technology (MeitY) proposed the draft amendments to the intermediary liability guidelines in December 2018, speculation about their potential effects has been plentiful. Commentary has included <a class="external-link" href="http://www.medianama.com/2020/01/223-traceability-accountability-necessary-intermediary-liability/">mapping</a> the requirement of traceability of originators against its chilling effect on free speech online, and <a class="external-link" href="http://cyberbrics.info/rethinking-the-intermediary-liability-regime-in-india/">critiquing</a> the proactive filtering requirement as potentially leading to censorship.</p>
<p>One aspect that has received less attention, however, is encoded in Rule 3(8) of the draft amendments. By virtue of that rule, the time limit given to intermediaries to respond to a legal content takedown request (the “turnaround time”) has been reduced from 36 hours (as it was in the older version of the rules) to 24 hours. In essence, intermediaries, when faced with a takedown order from the government or a court, would now have to remove the concerned piece of content within 24 hours of receiving the notice.</p>
<p>Why is this important? Consider this: the <a class="external-link" href="http://indiacode.nic.in/bitstream/123456789/1999/3/A2000-21.pdf">definition</a> of an ‘intermediary’ in Indian law encompasses a vast range of entities: cyber cafes, online marketplaces, internet service providers and more. Any intermediary liability norm would accordingly require varying levels of regulation that recognize the different composition of these entities. In this light, the content takedown requirement, and specifically the turnaround time, becomes problematic. Not only would many entities falling under the definition of intermediaries probably find it impossible to implement this obligation because of their technical architecture; the obligation also erases the nuances among entities that would actually fall within its scope.</p>
<p>Each category of online content, and more importantly, each category of intermediary, is different, and any content takedown requirement must appreciate these differences. A smaller intermediary may find it more difficult than an incumbent to adhere to a stricter, shorter timeframe. A piece of ‘terrorist’ content may need to be treated with more urgency than something that is defamatory. These contextual cues are critical, and must be incorporated into any law on content takedown.</p>
<p>While making our submissions on the draft amendments, we found that there was no research from the government’s side justifying the shortened turnaround time, nor was there any literature that focused on turnaround timeframes as a critical point of intermediary liability regulation. Accordingly, I share some findings from our research in the subsequent sections, which shed light on certain nuances that must be considered before proposing any content takedown timeframe. It is important to note that our research has not yet determined what an appropriate turnaround time should be in a given situation. However, the following findings will hopefully start a preliminary conversation that may ultimately lead us to the right answer.</p>
<p><strong>What to consider when regulating takedown timeframes?</strong></p>
<p>I classify the findings from our research into a chronological sequence: a) broad legal reforms, b) correct identification of scope and extent of the law, c) institution of proper procedural safeguards, and d) post-facto review of the time-frame for evidence based policy-making.</p>
<p><em>1. Broad legal reforms: Harmonize the law on content takedown.</em></p>
<p>Indian law on content takedown is administered through two different provisions of the Information Technology (IT) Act, each with its own legal procedure and scope. While the 24-hour turnaround time would apply to the procedure under one of them, there would continue to <a class="external-link" href="http://cis-india.org/internet-governance/resources/information-technology-procedure-and-safeguards-for-blocking-for-access-of-information-by-public-rules-2009">exist</a> a completely different legal procedure under which the government could still effectuate content takedown. Under the latter, intermediaries would be given a 48-hour timeframe to respond to a government request with clarifications (if any).</p>
<p>Such differing procedures contribute to a confusing legal ecosystem surrounding content takedown, leading to arbitrary ways in which Indian users experience internet censorship. Accordingly, it is important to harmonize the existing law so that the procedures and safeguards are seamless, and the regulatory process of content takedown is streamlined.</p>
<p><em>2. Correct identification of scope and extent of the law: Design a liability framework on the basis of differences among intermediaries and in the content in question.</em></p>
<p>As I have highlighted before, regulation of illegal content online cannot be <a class="external-link" href="https://blog.mozilla.org/netpolicy/2018/07/11/sustainable-policy-solutions-for-illegal-content/">one-size-fits-all</a>. Accordingly, a good law on content takedown must account for the nuances existing in the way intermediaries operate and the diversity of speech online. More specifically, there are two levels of classification that are critical.</p>
<p><em>One</em>, the law must make a fundamental classification between the intermediaries within the scope of the law. An obligation to remove illegal content can be implemented only by those entities whose technical architecture allows them to. While a search engine would be able to delink websites that are declared ‘illegal’, it would be absurd to expect a cyber cafe to follow a similar route of responding to a legal takedown order within a specified timeframe.</p>
<p>Therefore, one basis of classification must incorporate this difference in the technical architecture of intermediaries. Apart from this, the law must also calibrate liability for intermediaries on the basis of their user base, annual revenue, and the reach, scope and potential impact of their actions.</p>
<p><em>Two, </em>it is important that the law recognizes that certain types of content require more urgent treatment than others. Several regulations across jurisdictions, including the NetzDG and the proposed EU Regulation on preventing the dissemination of terrorist content online, while problematic in their own right, attempt either to limit their scope of application or to frame liability based on the nature of the content targeted.</p>
<p>Indian law, on the other hand, encompasses within its scope a vast and varied array of ‘illegal’ content, which includes, on one hand, critical categories like threats to ‘the sovereignty and integrity of India’ and, on the other, more subjective speech elements like ‘decency or morality’. While an expedited timeframe may be permissible for the former category of speech, it is difficult to justify for the latter. More contextual judgment may be needed to assess the legality of content that is alleged to be defamatory or obscene, making a shorter timeframe for such content problematic.</p>
<p><em>3. Institution of proper procedural safeguards: Make notices mandatory and make sanctions graduated</em>.</p>
<p>Apart from correct identification of scope and extent, it is important that there are sufficient procedural safeguards to ensure that the interests of intermediaries and users are not curtailed. While these may seem ancillary to the main point, how the law chooses to legislate on these issues (or does not) has a direct bearing on content takedown and timeframes.</p>
<p>Firstly, while Indian law mandates content takedown, it does not mandate a process through which a user is notified of such an action. The mere fact that an incumbent intermediary is able to respond to removal notifications within a specified timeframe does not imply that its actions have no ramifications for free speech. The ability to take down content does not translate into accuracy of the action taken, and Indian law fails to take this into account.</p>
<p>An additional obligation to inform users when their content has been taken down therefore institutes due process in the procedure. In the context of legal takedown, such notice mechanisms also <a class="external-link" href="http://www.eff.org/wp/who-has-your-back-2019">empower</a> users to draw attention to government censorship and targeting.</p>
<p>Secondly, a uniform compliance timeframe, coupled with severe sanctions, distorts competition against smaller intermediaries. While the current law does not clearly elaborate on the nature of sanctions that would be imposed, general principles of the safe harbour doctrine dictate that upon failure to remove the content, the intermediary would be subject to the same level of liability as the person uploading it. This threat of sanctions may have adverse effects on free speech online, resulting in potential <a class="external-link" href="http://cis-india.org/internet-governance/intermediary-liability-in-india.pdf">over-censorship</a> of legitimate speech.</p>
<p>Accordingly, sanctions should be restricted to instances of systematic violations. For critical content, the contours of what constitutes a systematic violation may differ. The regulator must accordingly take into account the nature of the content that the intermediary failed to remove while assessing its liability.</p>
<p><em>4. Post-facto review of the timeframe for evidence-based policymaking: Mandate transparency reporting.</em></p>
<p>Transparency reporting, apart from ensuring accountability of intermediary action, is also a useful tool for understanding the impact of the law, specifically in relation to response time periods. The NetzDG, for all the criticism it has received, has drawn <a class="external-link" href="https://www.article19.org/wp-content/uploads/2017/09/170901-Legal-Analysis-German-NetzDG-Act.pdf">support</a> for requiring intermediaries to produce bi-annual transparency reports. These reports provide important insight into the efficacy of any proposed turnaround time, which in turn helps us propose more nuanced reforms to the law.</p>
<p>However, to extract the most useful information from these reports, it is important that reporting practices are standardized. There exists an international body of work proposing methodologies for standardizing transparency reports, including the Santa Clara Principles and the Electronic Frontier Foundation’s (EFF) ‘Who has your back?’ reports. We have also previously proposed a methodology that utilizes some of these pointers.</p>
<p>Additionally, given the experimental nature of the provision, including a review clause in the law would ensure that the efficacy of the exercise can be periodically assessed. If the discussion in the preceding section is any indication, the issue of an appropriate turnaround time is currently in regulatory flux, with no correct answer. In such a scenario, periodic assessments compel policymakers and stakeholders to discuss the effectiveness of solutions and the nature of the problems faced, leading to <a class="external-link" href="http://www.livemint.com/Opinion/svjUfdqWwbbeeVzRjFNkUK/Making-laws-with-sunset-clauses.html">evidence-based</a> policymaking.</p>
<p><strong>Why should we care?</strong></p>
<p>There is a lot at stake in regulating any aspect of intermediary liability, and a lack of smart policymaking may end up harming the interests of any one of the stakeholder groups involved. As the submissions on the draft amendments by various civil society and industry groups show, the updated turnaround time suffers from issues which, if not addressed, may lead to over-removal and a lack of due process in the content removal procedure.</p>
<p>Among other things, these submissions pointed out that the shortened timeframe does not allow intermediaries sufficient time to scrutinize a takedown request to ensure that all technical and legal requirements are adhered to. This, in turn, may also prompt third-party action against users. Additionally, the significantly short timeframe raises several implementation challenges. For smaller companies with fewer employees, such a timeframe can be burdensome from both a financial and a capability point of view. This, in turn, may result in over-censorship of speech online.</p>
<p>Failing to recognize and incorporate contextual nuances into any law on intermediary liability may therefore critically alter the way we interact with online intermediaries and, in the larger scheme, with the internet.</p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/why-should-we-care-about-takedown-timeframes'>https://cis-india.org/internet-governance/blog/why-should-we-care-about-takedown-timeframes</a>
</p>
TorShark · Content takedown · Intermediary Liability · Chilling Effect · 2020-04-10 · Blog Entry