<?xml version="1.0" encoding="utf-8" ?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:syn="http://purl.org/rss/1.0/modules/syndication/" xmlns="http://purl.org/rss/1.0/">
<channel rdf:about="https://cis-india.org/search_rss">
  <title>Centre for Internet and Society</title>
  <link>https://cis-india.org</link>
  
  <description>These are the search results for the query, showing results 1 to 3.</description>
  <image rdf:resource="https://cis-india.org/logo.png"/>

  <items>
    <rdf:Seq>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/notes-from-a-foreign-field-the-european-court-of-human-rights-on-russia2019s-website-blocking"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/donald-trump-is-attacking-the-social-media-giants-here2019s-what-india-should-do-differently"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/why-should-we-care-about-takedown-timeframes"/>
    </rdf:Seq>
  </items>

</channel>


    <item rdf:about="https://cis-india.org/internet-governance/blog/notes-from-a-foreign-field-the-european-court-of-human-rights-on-russia2019s-website-blocking">
    <title>Notes From a Foreign Field: The European Court of Human Rights on Russia’s Website Blocking</title>
    <link>https://cis-india.org/internet-governance/blog/notes-from-a-foreign-field-the-european-court-of-human-rights-on-russia2019s-website-blocking</link>
    <description>
        &lt;b&gt;This blogpost summarises the human rights principles applied by the Court to website blocking, and discusses how they can be instructive to petitions in the Delhi High Court that challenge arbitrary censorship in India.&lt;/b&gt;
        
&lt;p class="has-text-align-justify"&gt;&amp;nbsp;&lt;/p&gt;
&lt;p class="has-text-align-justify"&gt;This blogpost was authored by Gurshabad Grover and Anna Liz Thomas. It was first published at the &lt;a class="external-link" href="https://indconlawphil.wordpress.com/2021/02/05/notes-from-a-foreign-fieldthe-european-court-of-human-rights-on-russias-website-blocking-guest-post/"&gt;Indian Constitutional Law and Philosophy Blog&lt;/a&gt; on February 5, 2021, and has been reproduced here with permission.&lt;/p&gt;
&lt;hr /&gt;
&lt;p class="has-text-align-justify"&gt;&amp;nbsp;&lt;/p&gt;
&lt;p class="has-text-align-justify"&gt;From PUBG to TikTok, online services 
are regularly blocked in India under an opaque censorship regime flowing
 from section 69A of the Information Technology (IT) Act. Russia happens
 to have a very similar online content blocking regime, parts and 
processes of which were recently challenged in the European Court of 
Human Rights (‘the Court’). This blogpost summarises the human rights 
principles applied by the Court to website blocking, and discusses how 
they can be instructive to petitions in the Delhi High Court that 
challenge arbitrary censorship in India.&lt;/p&gt;
&lt;h3&gt;&lt;strong&gt;Challenges to Russia’s Website Blocking Practices&lt;/strong&gt;&lt;/h3&gt;
&lt;p class="has-text-align-justify"&gt;On 23 June 2020, the Court delivered &lt;a href="https://strasbourgobservers.com/2020/08/26/the-strasbourg-court-establishes-standards-on-blocking-access-to-websites/"&gt;four judgements&lt;/a&gt;
 on the implementation of Russia’s Information Act, under which content 
on the internet can be deemed illegal and taken down or blocked. Under 
some of these provisions, a court order is not required, and the 
government can send a blocking request directly to Roskomnadzor, 
Russia’s telecom service regulator. Roskomnadzor, in turn, requests 
internet service providers (ISPs) to block access to the webpage or 
websites. Roskomnadzor also notifies the website owner within 24 hours. Under the law, once the website owner notifies Roskomnadzor that the illegal content has been removed from the website, Roskomnadzor verifies the removal and informs ISPs that access to the website may be restored for users.&lt;/p&gt;
&lt;p class="has-text-align-justify"&gt;In the case of &lt;a href="https://hudoc.echr.coe.int/eng#%7B%22itemid%22:%5B%22001-203177%22%5D%7D"&gt;&lt;em&gt;Vladimir Kharitonov&lt;/em&gt;&lt;/a&gt;&lt;em&gt;, &lt;/em&gt;the
 complainant’s website had been blocked as a result of a blocking order 
against another website, which shared the same IP address as that of the
 complainant. In &lt;a href="https://hudoc.echr.coe.int/eng#%7B%22itemid%22:%5B%22001-203180%22%5D%7D"&gt;&lt;em&gt;Engels&lt;/em&gt;&lt;/a&gt;&lt;em&gt;, &lt;/em&gt;the
 applicant’s website had been ordered by a court to be blocked for 
having provided information about online censorship circumvention tools,
 despite the fact that such information was not unlawful under any 
Russian law. &lt;em&gt;&lt;a href="https://hudoc.echr.coe.int/eng#%7B%22itemid%22:%5B%22001-203178%22%5D%7D"&gt;OOO Flavius&lt;/a&gt;&lt;/em&gt;
 concerned three online media outlets that had their entire websites 
blocked on the grounds that some of their webpages may have featured 
unlawful content. Similarly, in the case of &lt;a href="https://hudoc.echr.coe.int/eng#%7B%22itemid%22:%5B%22001-203181%22%5D%7D"&gt;&lt;em&gt;Bulgakov&lt;/em&gt;&lt;/a&gt;&lt;em&gt;, &lt;/em&gt;the
 implementation of a blocking order targeting extremist content (one 
particular pamphlet) had the effect of blocking access to the 
applicant’s entire website. In both &lt;em&gt;Engels&lt;/em&gt; and &lt;em&gt;Bulgakov&lt;/em&gt;, where court proceedings had taken place, the proceedings had been concluded &lt;em&gt;inter se&lt;/em&gt; the Prosecutor General and service providers, without the involvement of the website owner. In all four cases, appeals to higher Russian courts had been summarily dismissed. Even where website owners had taken down the offending content, their websites had not been restored.&lt;/p&gt;
&lt;p class="has-text-align-justify"&gt;The Court assessed the law and its 
application on the basis of a three-part test on whether the censorship 
is (a) prescribed by law (including foreseeability and accessibility 
aspects of the law), (b) necessary (and proportionate) in a democratic 
society, and (c) pursuing a legitimate aim.&lt;/p&gt;
&lt;p class="has-text-align-justify"&gt;Based on the application of these 
tests, the Court ruled against the Russian authorities in all four 
cases. The Court also held that the wholesale blocking of entire 
websites was an extreme measure tantamount to banning a newspaper or a 
television station, which has the collateral effect of interfering with
 lawful content. According to the Court, blocking entire websites can 
thus amount to prior restraint, which is only justified in exceptional 
circumstances.&lt;/p&gt;
&lt;p class="has-text-align-justify"&gt;The Court further held that procedural
 safeguards were required under domestic law in the context of online 
content blocking, such as the government authorities: (a) conducting an 
impact assessment prior to the implementation of blocking measures; (b) 
providing advance notice to website owners, and their involvement in 
blocking proceedings; (c) providing interested parties with the 
opportunity to remove illegal content or apply for judicial review; and 
(d) requiring public authorities to justify the necessity and 
proportionality of blocking, provide reasons as to why less intrusive 
means could not be employed and communicate the blocking request to the 
owner of the targeted website.&lt;/p&gt;
&lt;p class="has-text-align-justify"&gt;The Court also referenced an earlier judgment it had issued in the case of &lt;em&gt;Ahmet Yildirim vs. Turkey&lt;/em&gt;, acknowledging
 that content creators are not the only ones affected; website blocking 
interferes with the public’s right to receive information.&lt;/p&gt;
&lt;p class="has-text-align-justify"&gt;The Court also held that the 
participation of the ISP as a designated defendant was not enough in the
 case of court proceedings concerning blocking requests, because the ISP
 has no vested interest in the proceedings. Therefore, in the absence of
 a targeted website’s owner, blocking proceedings in court would lose 
their adversarial nature, and would not provide a forum for interested 
parties to be heard.&lt;/p&gt;
&lt;h3&gt;&lt;strong&gt;Implications for India&lt;/strong&gt;&lt;/h3&gt;
&lt;p class="has-text-align-justify"&gt;The online censorship regime in India is similar to Russia’s in terms of legal procedure, but perhaps worse when it comes to the architecture of the law’s implementation. Note that for
 this discussion, we will restrict ourselves to government-directed 
blocking and not consider court orders for content takedown (the latter 
may also include intellectual property infringement and defamatory 
content).&lt;/p&gt;
&lt;p class="has-text-align-justify"&gt;&lt;a href="https://indiankanoon.org/doc/10190353/"&gt;Section 69A&lt;/a&gt;
 of the Information Technology (IT) Act permits the Central Government 
to order intermediaries, including ISPs, to block online content on 
several grounds when it thinks it is “necessary or expedient” to do so. 
Amongst others, these grounds include national security, public order 
and prevention of cognisable offences.&lt;/p&gt;
&lt;p class="has-text-align-justify"&gt;In 2009, the Information Technology (Procedure and Safeguards for Blocking for Access of Information by Public) Rules, 2009 (‘&lt;a href="https://cis-india.org/internet-governance/resources/information-technology-procedure-and-safeguards-for-blocking-for-access-of-information-by-public-rules-2009"&gt;blocking rules&lt;/a&gt;’)
 were issued under the Act. They lay out an entirely executive-driven 
process: a committee (consisting entirely of secretaries from various 
Ministries) examines blocking requests from various government 
departments, and finally orders intermediaries to block such content.&lt;/p&gt;
&lt;p class="has-text-align-justify"&gt;As per Rule 8, the chairperson of this committee is required to “make all reasonable efforts to identify the person &lt;strong&gt;or&lt;/strong&gt; intermediary who has hosted the information” (emphasis ours), send them a notice and give them an opportunity for a hearing. A plain reading suggests that the content creator may thus be left out of the blocking proceedings entirely. Even this safeguard can be circumvented in “emergency” situations as described in Rule 9, under which blocking orders can be issued immediately. The rules require such orders to be examined by the committee within the next two days, when it can decide to continue or rescind the block.&lt;/p&gt;
&lt;p class="has-text-align-justify"&gt;The rules also task a separate committee, &lt;a href="https://cis-india.org/internet-governance/resources/rule-419-a-indian-telegraph-rules-1951"&gt;appointed&lt;/a&gt;
 under the Telegraph Act, to meet every two months to review all 
blocking orders. Pertinently, only ministerial secretaries comprise that
 committee as well.&lt;/p&gt;
&lt;p class="has-text-align-justify"&gt;These are the limited safeguards 
prescribed in the rules. Public accountability in the law is further 
severely limited by a requirement of strict confidentiality (Rule 16) of
 blocking orders. With no judicial, parliamentary or public oversight, 
it is easy to see how online censorship in India operates in complete 
secrecy, making it &lt;a href="https://scroll.in/article/953146/how-india-is-using-its-information-technology-act-to-arbitrarily-take-down-online-content"&gt;susceptible&lt;/a&gt; to wide abuse.&lt;/p&gt;
&lt;p class="has-text-align-justify"&gt;When the constitutionality of the provision and the blocking rules was challenged in &lt;a href="https://indiankanoon.org/doc/110813550/"&gt;&lt;em&gt;Shreya Singhal v. Union of India&lt;/em&gt;&lt;/a&gt;, the Supreme Court was satisfied with these minimal safeguards. However, it upheld the rules for only two reasons. First, it noted that an opportunity of a hearing is given “to the originator &lt;strong&gt;and&lt;/strong&gt; intermediary” (emphasis ours: notice how this differs from the ‘or’ in the blocking rules). Second, it specifically noted that the law required reasoned orders that could be challenged through writ petitions.&lt;/p&gt;
&lt;p class="has-text-align-justify"&gt;On this blog, Gautam Bhatia has earlier &lt;a href="https://indconlawphil.wordpress.com/2015/03/25/the-supreme-courts-it-act-judgment-and-secret-blocking/"&gt;argued&lt;/a&gt;
 that the judgment then should be read as obligating the government to 
mandatorily notify the content creator before issuing blocking orders. 
Unfortunately, the reality of the implementation of the law has &lt;a href="https://scroll.in/article/953146/how-india-is-using-its-information-technology-act-to-arbitrarily-take-down-online-content"&gt;not lived up&lt;/a&gt; to this optimism. While intermediaries (ISPs when it comes to website blocking) &lt;em&gt;may&lt;/em&gt;
 be getting a chance to respond, content creators are almost never 
given a hearing. As we saw in the European Court’s judgment, ISPs do not
 have any incentive to challenge the government’s directions.&lt;/p&gt;
&lt;p class="has-text-align-justify"&gt;Additionally, although the law states that “reasons [for blocking content are] to be recorded in writing”, &lt;a href="https://internetfreedom.in/whistleblower-provides-website-blocking-orders-on-4000-websites/"&gt;leaked blocking orders&lt;/a&gt;
 suggest that even ISPs are not given this information. Apart from the 
opacity around the rationale for blocking, RTI requests to uncover even 
the &lt;em&gt;list&lt;/em&gt; of blocked websites have been &lt;a href="https://www.hindustantimes.com/analysis/to-preserve-freedoms-online-amend-the-it-act/story-aC0jXUId4gpydJyuoBcJdI.html"&gt;repeatedly&lt;/a&gt; rejected (for comparison, Roskomnadzor at least maintains a &lt;a href="https://blocklist.rkn.gov.ru/"&gt;public registry&lt;/a&gt; of websites blocked in Russia). This lack of transparency and fair proceedings also means that &lt;em&gt;entire &lt;/em&gt;websites
 may be getting blocked when there are only specific web pages on that 
website that serve content related to unlawful acts.&lt;/p&gt;
&lt;p class="has-text-align-justify"&gt;When it comes to the technical methods
 of blocking, the rules are silent, leaving this decision to the ISPs. 
While a recent study by the Centre for Internet and Society showed that 
popular ISPs are &lt;a href="https://arxiv.org/pdf/1912.08590.pdf"&gt;using methods&lt;/a&gt; that target specific websites, there are some recent reports that &lt;a href="https://theprint.in/judiciary/us-firm-one-signal-moves-delhi-hc-says-ip-address-blocked-in-india-without-intimation/587852/"&gt;suggest&lt;/a&gt;
 ISPs may be blocking IP addresses too. The latter can have the effect 
of blocking access to other websites that are hosted on the same 
address.&lt;/p&gt;
&lt;p class="has-text-align-justify"&gt;There are two challenges to the rules 
in the Delhi High Court, serving as opportunities for reform of website 
blocking and content takedown in India. The first was filed in December 
2019 by &lt;a href="https://internetfreedom.in/delhi-hc-issues-notice-to-the-government-for-blocking-satirical-dowry-calculator-website/"&gt;Tanul Thakur&lt;/a&gt;,
 whose website DowryCalculator.com (a satirical take on the practice of 
dowry) was blocked without any notice or hearing. The committee responsible for passing blocking orders never contacted Thakur, even though he has publicly claimed ownership of the website multiple times and has been interviewed by the media about it. When Thakur &lt;a href="https://drive.google.com/file/d/0B2NvpMoZE5HGbGVCOG5TNVF6RDRGXzk5T3VNMlhTQ0E3QUlz/view"&gt;filed&lt;/a&gt; an RTI asking why DowryCalculator.com was blocked, the Ministry of Electronics and Information Technology cited the confidentiality rule to refuse sharing such information!&lt;/p&gt;
&lt;p class="has-text-align-justify"&gt;This month, an American company providing mobile notification services, One Signal Inc., has &lt;a href="https://theprint.in/judiciary/us-firm-one-signal-moves-delhi-hc-says-ip-address-blocked-in-india-without-intimation/587852/"&gt;alleged&lt;/a&gt;
 that ISPs are blocking its IP address, and petitioned the court to set 
aside any government order to that effect because they did not receive a
 hearing. Interestingly, the IP address belongs to a popular hosting 
service provider, which serves multiple websites. Considering this fact 
and the lack of transparency in blocking orders, one may question 
whether One Signal was the intended target at all! The European Court’s 
judgment in &lt;em&gt;Vladimir Kharitonov&lt;/em&gt; is quite relevant here: ISPs 
should not be blocking IP addresses that are shared amongst multiple 
websites, because such a measure can cause collateral damage, and make 
other legitimate expression inaccessible.&lt;/p&gt;
&lt;p class="has-text-align-justify"&gt;Given the broad similarities between the Indian and Russian website blocking regimes, the four judgements by the European Court of Human Rights will be instructive to the Delhi High Court. Note that section 69A is used for content takedown in general (i.e. censoring posts on Twitter, not just blocking websites): the right to a hearing must extend to all such content creators. The principles applied by the European Court can thus provide a more rights-respecting foundation for content blocking in India for the judiciary to uphold, or for the legislature to amend.&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/notes-from-a-foreign-field-the-european-court-of-human-rights-on-russia2019s-website-blocking'&gt;https://cis-india.org/internet-governance/blog/notes-from-a-foreign-field-the-european-court-of-human-rights-on-russia2019s-website-blocking&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>gurshabad</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Content takedown</dc:subject>
    
    
        <dc:subject>69A</dc:subject>
    
    
        <dc:subject>Constitutional Law</dc:subject>
    

   <dc:date>2021-02-13T08:42:18Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/donald-trump-is-attacking-the-social-media-giants-here2019s-what-india-should-do-differently">
    <title>Donald Trump is attacking the social media giants; here’s what India should do differently</title>
    <link>https://cis-india.org/internet-governance/blog/donald-trump-is-attacking-the-social-media-giants-here2019s-what-india-should-do-differently</link>
    <description>
        &lt;b&gt;For a robust and rights-respecting public sphere, India needs to ensure that large social media platforms receive adequate protections, and are made more responsible to its users.&lt;/b&gt;
        
&lt;p&gt;This piece was first published at &lt;a class="external-link" href="https://scroll.in/article/965151/donald-trump-is-attacking-the-social-media-giants-heres-what-india-should-do-differently"&gt;Scroll&lt;/a&gt;. The authors would like to thank Torsha Sarkar for reviewing and editing the piece, and to Divij Joshi for his feedback.&lt;/p&gt;
&lt;hr /&gt;
&lt;div id="article-contents" class="article-body"&gt;
&lt;p&gt;In retaliation to Twitter &lt;a class="link-external" href="https://www.nytimes.com/2020/05/26/technology/twitter-trump-mail-in-ballots.html" rel="nofollow noopener" target="_blank"&gt;labelling&lt;/a&gt; one of US President Donald Trump’s tweets as being misleading, the White House signed an &lt;a class="link-external" href="https://www.whitehouse.gov/presidential-actions/executive-order-preventing-online-censorship/" rel="nofollow noopener" target="_blank"&gt;executive order&lt;/a&gt;
 on May 28 that seeks to dilute protections that social media companies 
in the US have with respect to third-party content on their platforms.&lt;/p&gt;
&lt;p&gt;The
 order argues that social media companies that engage in censorship stop
 functioning as ‘passive bulletin boards’: they must consequently be 
treated as ‘content creators’, and be held liable for content on their 
platforms as such. The shockwaves of the decision soon reached India, 
with news coverage of the event &lt;a class="link-external" href="https://www.business-standard.com/article/companies/trump-twitter-spat-debate-rages-on-role-of-social-media-companies-120053100055_1.html" rel="nofollow noopener" target="_blank"&gt;starting&lt;/a&gt; to &lt;a class="link-external" href="https://economictimes.indiatimes.com/tech/internet/feud-between-donald-trump-and-jack-dorsey-can-have-long-lasting-effects-on-how-we-consume-media-in-india/articleshow/76111556.cms" rel="nofollow noopener" target="_blank"&gt;debate&lt;/a&gt; the &lt;a class="link-external" href="https://economictimes.indiatimes.com/tech/internet/trumps-move-against-social-media-cos-unlikely-to-change-indias-stand/articleshow/76094586.cms?from=mdr" rel="nofollow noopener" target="_blank"&gt;consequences&lt;/a&gt; of Trump’s order on how India regulates internet services and social media companies.&lt;/p&gt;
&lt;p&gt;The
 debate on the responsibilities of online platforms is not new to India,
 and recently took centre stage in December 2018 when the Ministry of 
Electronics and Information Technology, Meity, published a draft set of 
guidelines that most online services – ‘intermediaries’ – must follow. 
The draft rules, which haven’t been notified yet, propose to 
significantly expand the obligations on intermediaries.&lt;/p&gt;
&lt;p&gt;Trump’s 
executive order, however, comes in the context of content moderation 
practices by social media platforms, i.e. when platforms censor speech 
of their volition, and not because of legal requirements. The legal 
position of content moderation is relatively under-discussed, at least 
in legal terms, when it comes to India.&lt;/p&gt;
&lt;p&gt;In contrast to 
commentators who have implicitly assumed that Indian law permits content
 moderation by social media companies, we believe Indian law fails to 
adequately account for content moderation and curation practices 
performed by social media companies. There may be adverse consequences 
for the exercise of freedom of expression in India if this lacuna is not
 filled soon.&lt;/p&gt;
&lt;h3 class="cms-block cms-block-heading"&gt;India vs US&lt;br /&gt;&lt;/h3&gt;
&lt;p&gt;A
 useful starting point for the analysis is to compare how the US and 
India regulate liability for online services. In the US, Section 230 of 
the Communications Decency Act provides online services with broad 
immunity from liability for third party content that they host or 
transmit.&lt;/p&gt;
&lt;p&gt;There are two critical components to what is generally referred to as Section 230.&lt;/p&gt;
&lt;p&gt;First,
 providers of an ‘interactive computer service’, like your internet 
service provider or a company like Facebook, will not be treated as 
publishers or speakers of third-party content. This system has allowed online speech and the internet economy to &lt;a class="link-external" href="https://law.emory.edu/elj/content/volume-63/issue-3/articles/how-law-made-silicon-valley.html" rel="nofollow noopener" target="_blank"&gt;flourish&lt;/a&gt;
 since it allows companies to focus on their service without a constant 
paranoia for what users are transmitting through their service.&lt;/p&gt;
&lt;p&gt;The
 second part of Section 230 states that services are allowed to moderate
 and remove, in ‘good faith’, such third-party content that they may 
deem offensive or obscene. This allows online services to institute their own community guidelines or content policies.&lt;/p&gt;
&lt;p&gt;In India, 
section 79 of the Information Technology Act is the analogous provision:
 it grants intermediaries conditional ‘safe harbour’. This means 
intermediaries, again like Facebook or your internet provider, are 
exempt from liability for third-party content – like messages or videos 
posted by ordinary people – provided their functioning meets certain 
requirements, and they comply with the allied rules, known as 
Intermediary Guidelines.&lt;/p&gt;
&lt;p&gt;The notable and stark difference between
 Indian law and Section 230 is that India’s IT Act is largely silent on 
content moderation practices. As Rahul Matthan &lt;a class="link-external" href="https://www.livemint.com/opinion/columns/shield-online-platforms-for-content-moderation-to-work-11591116270685.html" rel="nofollow noopener" target="_blank"&gt;points out&lt;/a&gt;,
 there is no explicit allowance in Indian law for platforms to take down
 content based on their own policies, even if such actions are done in 
good faith.&lt;/p&gt;
&lt;h3 class="cms-block cms-block-heading"&gt;Safe harbour&lt;/h3&gt;
&lt;div&gt;&amp;nbsp;&lt;/div&gt;
&lt;p&gt;One
 may argue that the absence of an explicit permission does not 
necessarily mean that any platform engaging in content moderation 
practices will lose its safe harbour. However, the language of Section 
79 and the allied rules may even create room for divesting social media 
platforms of their safe harbour.&lt;/p&gt;
&lt;p&gt;The first such indication lies in the conditions to qualify for safe harbour: intermediaries must not modify said content, must not select the recipients of particular content, and must take information down when it is brought to their notice by governments or courts.&lt;/p&gt;
&lt;p&gt;Most of the conditions are almost a 
verbatim copy of a ‘mere conduit’ as defined by the EU Directive on 
E-Commerce, 2000. This definition was meant to encapsulate the 
functioning of services like infrastructure providers, which transmit 
content without exerting any real control. Thus, by adopting this 
definition for all intermediaries, Indian law mostly considers internet 
services, even social media platforms, to be passive plumbing through 
which information flows.&lt;/p&gt;
&lt;p&gt;It is easy to see how this narrow conception of online services is severely &lt;a class="link-external" href="https://georgetownlawtechreview.org/wp-content/uploads/2018/07/2.2-Gilespie-pp-198-216.pdf" rel="nofollow noopener" target="_blank"&gt;lacking&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Most prominent social media platforms remove or &lt;a class="link-external" href="https://techcrunch.com/2019/12/16/instagram-fact-checking/" rel="nofollow noopener" target="_blank"&gt;hide&lt;/a&gt; content, &lt;a class="link-external" href="https://about.fb.com/news/2016/06/building-a-better-news-feed-for-you/" rel="nofollow noopener" target="_blank"&gt;algorithmically curate&lt;/a&gt; news-feeds to make users keep coming back for more, and increasingly add &lt;a class="link-external" href="https://blog.twitter.com/en_us/topics/product/2020/updating-our-approach-to-misleading-information.html" rel="nofollow noopener" target="_blank"&gt;labels&lt;/a&gt;
 to content. If the law is interpreted strictly, these practices may be 
adjudged to run afoul of the aforementioned conditions that 
intermediaries need to satisfy in order to qualify for safe harbour.&lt;/p&gt;
&lt;h3 class="cms-block cms-block-heading"&gt;Platforms or editors?&lt;br /&gt;&lt;/h3&gt;
&lt;p&gt;For instance, it can be argued that social media platforms initiate transmission in some form when they pick and ‘suggest’ relevant third-party content to users. When it comes to newsfeeds, neither the content creator nor the consumer has as much control over how content is disseminated or curated as the platform does. By curating newsfeeds, social media platforms can be said to essentially be ‘selecting the receiver’ of transmissions.&lt;/p&gt;
&lt;p&gt;The Intermediary 
Guidelines further complicate matters by specifically laying out what is
 not to be construed as ‘editing’ under the law. Under rule 3(3), the 
act of taking down content pursuant to orders under the Act will not be 
considered as ‘editing’ of said content.&lt;/p&gt;
&lt;p&gt;Since the term ‘editing’
 has been left undefined beyond the negative qualification, several 
social media intermediaries may well qualify as editors. They use 
algorithms that curate content for their users; like traditional news 
editors, these algorithms use certain &lt;a class="link-external" href="https://www.researchgate.net/profile/Michael_Devito/publication/302979999_From_Editors_to_Algorithms_A_values-based_approach_to_understanding_story_selection_in_the_Facebook_news_feed/links/5a19cc3d4585155c26ac56d4/From-Editors-to-Algorithms-A-values-based-approach-to-understanding-story-selection-in-the-Facebook-news-feed.pdf" rel="nofollow noopener" target="_blank"&gt;‘values’&lt;/a&gt;
 to determine what is relevant to their audiences. In other words, one 
can argue that it is difficult to draw a bright line between editorial 
and algorithmic acts.&lt;/p&gt;
&lt;p&gt;To retain their safe harbour, the counter-argument that social media platforms can rely on is that Rule 3(5) of the Intermediary Guidelines requires intermediaries to inform users that they reserve the right to take down user content relating to a wide variety of acts, including content that threatens national security, or is “[...] grossly harmful, harassing, blasphemous, [etc.]”.&lt;/p&gt;
&lt;p&gt;In practice, however, the 
content moderation practices of some social media companies may go 
beyond these categories. Additionally, the rule does not address the 
legal questions created by these platforms’ curation of news-feeds.&lt;/p&gt;
&lt;p&gt;In highlighting how Section 79 treats the practices of social media platforms, we do not intend to argue that these platforms should be held liable for user-generated content. Online spaces created by social media platforms have allowed individuals to express themselves and participate in political organisation and &lt;a class="link-external" href="https://www.pewresearch.org/internet/2018/07/11/public-attitudes-toward-political-engagement-on-social-media/" rel="nofollow noopener" target="_blank"&gt;debate&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;A level of protection of intermediaries from liability is therefore critical for the protection of several human rights, especially the
right to freedom of speech. This piece only serves to highlight that 
section 79 is antiquated and unfit to deal with modern online services. 
The interpretative dangers that exist in the provision create regulatory
 uncertainty for organisations operating in India.&lt;/p&gt;
&lt;h3 class="cms-block cms-block-heading"&gt;Dangers to speech&lt;br /&gt;&lt;/h3&gt;
&lt;p&gt;These dangers may not just be theoretical.&lt;/p&gt;
&lt;p&gt;Only last year, Twitter CEO Jack Dorsey was &lt;a class="link-external" href="https://www.hindustantimes.com/india-news/twitter-ceo-jack-dorsey-summoned-by-parliamentary-panel-on-feb-25-panel-refuses-to-hear-other-officials/story-8x9OUbNBo36uvp92L5nOKI.html" rel="nofollow noopener" target="_blank"&gt;summoned&lt;/a&gt;
 by the Parliamentary Committee on Information Technology to answer 
accusations of the platform having a bias against ‘right-wing’ accounts.
 More recently, BJP politician Vinit Goenka &lt;a class="link-external" href="https://www.medianama.com/2020/06/223-vinit-goenka-twitter-khalistan/" rel="nofollow noopener" target="_blank"&gt;encouraged people to file cases against Twitter&lt;/a&gt; for promoting separatist content.&lt;/p&gt;
&lt;p&gt;Recent &lt;a class="link-external" href="https://sflc.in/sites/default/files/reports/Intermediary_Liability_2_0_-_A_Shifting_Paradigm.pdf" rel="nofollow noopener" target="_blank"&gt;interventions&lt;/a&gt;
 from the Supreme Court have imposed proactive filtration and blocking 
requirements on intermediaries, but these have been limited to 
reasonable restrictions that may be imposed on free speech under Article
 19 of India’s Constitution. Content moderation policies of 
intermediaries like Twitter and Facebook go well beyond the scope of 
Article 19 restrictions, and the apex court has not yet addressed this.&lt;/p&gt;
&lt;p&gt;The Delhi High Court, in Christian Louboutin v. Nakul Bajaj, has already laid down criteria for when e-commerce intermediaries can stake a claim to Section 79 safe harbour protections, based on the active (or passive) nature of their services. While the order came in the context of intellectual property violations, nothing keeps a court from similarly finding that Facebook and Twitter play an ‘active’ role when it comes to content moderation and curation.&lt;/p&gt;
&lt;p&gt;These companies may one day find the ‘safe harbour’ rug pulled from under their feet if a court reads Section 79 more strictly. In fact, judicial intervention may not even be required. The threat of such an interpretation may simply be exploited by the government, and used as leverage to get social media platforms to toe the government line.&lt;/p&gt;
&lt;h3 class="cms-block cms-block-heading"&gt;Protection and responsibility&lt;br /&gt;&lt;/h3&gt;
&lt;p&gt;Unfortunately, the amendments to the intermediary guidelines proposed in 2018 do not address the legal position of content moderation either. More recent developments &lt;a class="link-external" href="https://www.medianama.com/2020/04/223-meity-information-technology-act-amendments/" rel="nofollow noopener" target="_blank"&gt;suggest&lt;/a&gt; that MeitY may be contemplating amending the IT Act. This presents an opportunity for a more comprehensive reworking of the Indian intermediary liability regime than what is possible through delegated legislation like the intermediary rules.&lt;/p&gt;
&lt;p&gt;Intermediaries, rather 
than being treated uniformly, should be classified based on their 
function and the level of control they exercise over the content they 
process. For instance, network infrastructure should continue to be 
treated as ‘mere conduits’ and enjoy broad immunity from liability for 
user-generated content.&lt;/p&gt;
&lt;p&gt;More complex services like search engines and online social media platforms can have differentiated responsibilities based on the extent to which they can contextualise and change content. The law should carve out an explicit permission for platforms to moderate content in good faith. Such an allowance should be accompanied by an outline of best practices that these platforms can follow to ensure &lt;a class="link-external" href="https://santaclaraprinciples.org/" rel="nofollow noopener" target="_blank"&gt;transparency and accountability&lt;/a&gt; to their users.&lt;/p&gt;
&lt;p&gt;For a robust and rights-respecting public sphere, India needs to ensure that large social media platforms receive adequate protections, and are made more accountable to their users.&lt;/p&gt;
&lt;p&gt;&lt;em&gt;Anna Liz Thomas is a law 
graduate and a policy researcher, currently working with the Centre for 
Internet and Society. Gurshabad Grover manages research in the freedom 
of expression and internet governance team at CIS&lt;/em&gt;.&lt;/p&gt;
&lt;/div&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/donald-trump-is-attacking-the-social-media-giants-here2019s-what-india-should-do-differently'&gt;https://cis-india.org/internet-governance/blog/donald-trump-is-attacking-the-social-media-giants-here2019s-what-india-should-do-differently&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Anna Liz Thomas and Gurshabad Grover</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Content takedown</dc:subject>
    
    
        <dc:subject>Freedom of Speech and Expression</dc:subject>
    
    
        <dc:subject>Intermediary Liability</dc:subject>
    

   <dc:date>2020-06-25T09:07:52Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/why-should-we-care-about-takedown-timeframes">
    <title>Why should we care about takedown timeframes?</title>
    <link>https://cis-india.org/internet-governance/blog/why-should-we-care-about-takedown-timeframes</link>
    <description>
&lt;b&gt;The issue of the content takedown timeframe - the time period an intermediary is allotted to respond to a legal takedown order - has received considerably less attention in conversations about intermediary liability. This article examines the importance of an appropriate timeframe in ensuring that speech online is not over-censored, and offers recommendations to that end.&lt;/b&gt;
        
&lt;p&gt;&lt;em&gt;This article first &lt;a class="external-link" href="https://cyberbrics.info/why-should-we-care-about-takedown-timeframes/"&gt;appeared&lt;/a&gt; on the CyberBRICS website. It has since been &lt;a class="external-link" href="https://www.medianama.com/2020/04/223-content-takedown-timeframes-cyberbrics/"&gt;cross-posted&lt;/a&gt; to Medianama.&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;&lt;em&gt;The findings and opinions expressed in this article are derived from the larger research report 'A deep dive into content takedown timeframes', which can be accessed &lt;a class="external-link" href="https://cis-india.org/internet-governance/files/a-deep-dive-into-content-takedown-frames"&gt;here&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Introduction&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;Since the Ministry of Electronics and Information Technology (MeitY) proposed the draft amendments to the intermediary liability guidelines in December 2018, speculation about their potential effects has been extensive. This has included &lt;a class="external-link" href="http://www.medianama.com/2020/01/223-traceability-accountability-necessary-intermediary-liability/"&gt;mapping&lt;/a&gt; the requirement of traceability of originators vis-a-vis the chilling effect on free speech online, and &lt;a class="external-link" href="http://cyberbrics.info/rethinking-the-intermediary-liability-regime-in-india/"&gt;critiquing&lt;/a&gt; the proactive filtering requirement as potentially leading to censorship.&lt;/p&gt;
&lt;p&gt;One aspect, however, that has received less attention is encoded within Rule 3(8) of the draft amendments. By virtue of that rule, the time limit given to intermediaries to respond to a legal content takedown request (“turnaround time”) has been reduced from 36 hours (as it was in the older version of the rules) to 24 hours. In essence, intermediaries, when faced with a takedown order from the government or a court, would now have to remove the concerned piece of content within 24 hours of receipt of the notice.&lt;/p&gt;
&lt;p&gt;Why is this important? Consider this: the &lt;a class="external-link" href="http://indiacode.nic.in/bitstream/123456789/1999/3/A2000-21.pdf"&gt;definition&lt;/a&gt; of an ‘intermediary’ within Indian law encompasses a vast array of entities – cyber cafes, online marketplaces, internet service providers and more. Governance of any intermediary liability norms would accordingly require varying levels of regulation, each of which recognizes the different composition of these entities. In light of that, the content takedown requirement, and specifically the turnaround time, becomes problematic. Not only would many of the entities falling under the definition of intermediaries probably find it impossible to implement this obligation due to their technical architecture, the obligation also erases the nuances existing among the entities that would actually fall within its scope.&lt;/p&gt;
&lt;p&gt;Each category of online content, and more importantly, each category of intermediary, is different, and any content takedown requirement must appreciate these differences. A smaller intermediary may find it more difficult to adhere to a stricter, shorter timeframe than an incumbent. A piece of ‘terrorist’ content may need to be treated with more urgency than something that is defamatory. These contextual cues are critical, and must be accordingly incorporated in any law on content takedown.&lt;/p&gt;
&lt;p&gt;While making our submissions on the draft amendments, we found that there was a lack of research from the government’s side justifying the shortened turnaround time, nor was there any literature that focused on turnaround timeframes as a critical point of regulation of intermediary liability. Accordingly, I share some findings from our research in the subsequent sections, which throw light on certain nuances that must be considered before proposing any content takedown timeframe. It is important to note that our research has not yet determined what an appropriate turnaround time in a given situation should be. However, the following findings will hopefully start a preliminary conversation which may ultimately lead us to the right answer.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;What to consider when regulating takedown time-frames?&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;I classify the findings from our research into a chronological sequence: a) broad legal reforms, b) correct identification of scope and extent of the law, c) institution of proper procedural safeguards, and d) post-facto review of the time-frame for evidence based policy-making.&lt;/p&gt;
&lt;p&gt;&lt;em&gt;1. Broad legal reforms: Harmonize the law on content takedown.&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;The Indian law for content takedown is administered through two different provisions under the Information Technology (IT) Act, each with its own legal procedure and scope. While the 24-hour turnaround time would be applicable to the procedure under one of them, there would continue to &lt;a class="external-link" href="http://cis-india.org/internet-governance/resources/information-technology-procedure-and-safeguards-for-blocking-for-access-of-information-by-public-rules-2009"&gt;exist&lt;/a&gt; a completely different legal procedure under which the government could still effectuate content takedown. Under the latter, intermediaries would be given a 48-hour timeframe to respond to a government request with clarifications (if any).&lt;/p&gt;
&lt;p&gt;Such differing procedures contribute to the creation of a confusing legal ecosystem surrounding content takedown, leading to arbitrary ways in which Indian users experience internet censorship. Accordingly, it is important to harmonize the existing law so that the procedures and safeguards are seamless, and the regulatory process of content takedown is streamlined.&lt;/p&gt;
&lt;p&gt;&lt;em&gt;2. Correct identification of scope and extent of the law: Design a liability framework on the basis of the differences in the intermediaries, and the content in question.&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;As I have highlighted before, regulation of illegal content online cannot be &lt;a class="external-link" href="https://blog.mozilla.org/netpolicy/2018/07/11/sustainable-policy-solutions-for-illegal-content/"&gt;one-size-fits-all&lt;/a&gt;. Accordingly, a good law on content takedown must account for the nuances existing in the way intermediaries operate and the diversity of speech online. More specifically, there are two levels of classification that are critical.&lt;/p&gt;
&lt;p&gt;&lt;em&gt;One&lt;/em&gt;, the law must make a fundamental classification among the intermediaries within its scope. An obligation to remove illegal content can be implemented only by those entities whose technical architecture allows them to do so. While a search engine would be able to delink websites that are declared ‘illegal’, it would be absurd to expect a cyber cafe to respond to a legal takedown order within a specified timeframe.&lt;/p&gt;
&lt;p&gt;Therefore, one basis of classification must incorporate this difference in the technical architecture of these intermediaries. Apart from this, the law must also design liability for intermediaries on the basis of their user-base, annual revenue generated, and the reach, scope and potential impact of the intermediary’s actions.&lt;/p&gt;
&lt;p&gt;&lt;em&gt;Two, &lt;/em&gt;it is important that the law recognizes that certain types of content require more urgent treatment than others. Several regulations across jurisdictions, including the NetzDG and the EU Regulation on Preventing the Dissemination of Terrorist Content Online, while problematic in their own right, attempt to either limit their scope of application or frame liability based on the nature of the content targeted.&lt;/p&gt;
&lt;p&gt;The Indian law, on the other hand, encompasses within its scope a vast, varying array of content that is ‘illegal’, which includes, on one hand, critical items like threats to ‘the sovereignty and integrity of India’ and, on the other, more subjective speech elements like ‘decency or morality’. While an expedited timeframe may be permissible for the former category of speech, it is difficult to justify the same for the latter. More contextual judgments may be needed to assess the legality of content that is alleged to be defamatory or obscene, thereby making a shorter timeframe problematic for such content.&lt;/p&gt;
&lt;p&gt;&lt;em&gt;3. Institution of proper procedural safeguards: Make notices mandatory and make sanctions graduated&lt;/em&gt;.&lt;/p&gt;
&lt;p&gt;Apart from the correct identification of scope and extent, it is important that there are sufficient procedural safeguards to ensure that the interests of the intermediaries and the users are not curtailed. While these may seem ancillary to the main point, how the law chooses to legislate on these issues (or does not), nevertheless has a direct bearing on the issue of content takedown and time-frames.&lt;/p&gt;
&lt;p&gt;Firstly, while the Indian law mandates content takedown, it does not mandate a process through which a user is notified of such an action being taken. The mere fact that an incumbent intermediary is able to respond to removal notifications within a specified timeframe does not imply that its actions would not have ramifications for free speech. The ability to take down content does not translate into the accuracy of the action taken, and the Indian law fails to take this into account.&lt;/p&gt;
&lt;p&gt;Therefore, additional obligations of informing users when their content has been taken down institute due process in the procedure. In the context of legal takedowns, such notice mechanisms also &lt;a class="external-link" href="http://www.eff.org/wp/who-has-your-back-2019"&gt;empower&lt;/a&gt; users to draw attention to government censorship and targeting.&lt;/p&gt;
&lt;p&gt;Secondly, a uniform timeframe of compliance, coupled with severe sanctions, disrupts competition by disadvantaging smaller intermediaries. While the current law does not clearly elaborate upon the nature of sanctions that would be imposed, general principles of the doctrine of safe harbour dictate that upon failure to remove the content, the intermediary would be subject to the same level of liability as the person uploading the content. This threat of sanctions may have adverse effects on free speech online, resulting in potential &lt;a class="external-link" href="http://cis-india.org/internet-governance/intermediary-liability-in-india.pdf"&gt;over-censorship&lt;/a&gt; of legitimate speech.&lt;/p&gt;
&lt;p&gt;Accordingly, sanctions should be restricted to instances of systematic violations. For critical content, the contours of what constitutes systematic violation may differ. The regulator must accordingly take into account the nature of content which the intermediary failed to remove, while assessing their liability.&lt;/p&gt;
&lt;p&gt;&lt;em&gt;4. Post-facto review of the time-frame for evidence based policy-making: Mandate transparency reporting.&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;Transparency reporting, apart from ensuring accountability for intermediary action, is also a useful tool for understanding the impact of the law, specifically in relation to the time period of response. The NetzDG, for all its criticism, has received &lt;a class="external-link" href="https://www.article19.org/wp-content/uploads/2017/09/170901-Legal-Analysis-German-NetzDG-Act.pdf"&gt;support&lt;/a&gt; for requiring intermediaries to produce bi-annual transparency reports. These reports provide important insight into the efficacy of any proposed turnaround time, which in turn helps us propose more nuanced reforms to the law.&lt;/p&gt;
&lt;p&gt;However, to extract the most useful information from these reports, it is important that reporting practices are standardized. There exists an international body of work which proposes methodologies for standardizing transparency reports, including the Santa Clara Principles and the Electronic Frontier Foundation’s (EFF) ‘Who has your back?’ reports. We have also previously proposed a methodology that utilizes some of these pointers.&lt;/p&gt;
&lt;p&gt;Additionally, due to the experimental nature of the provision, including a review provision in the law would ensure that the efficacy of the exercise can be periodically assessed. If the discussion in the preceding section is any indication, the issue of an appropriate turnaround time is currently in regulatory flux, with no correct answer. In such a scenario, periodic assessments compel policymakers and stakeholders to discuss the effectiveness of solutions, and the nature of the problems faced, leading to &lt;a class="external-link" href="http://www.livemint.com/Opinion/svjUfdqWwbbeeVzRjFNkUK/Making-laws-with-sunset-clauses.html"&gt;evidence-based&lt;/a&gt; policymaking.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Why should we care?&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;There is a lot at stake in regulating any aspect of intermediary liability, and a lack of careful policy-making may harm the interests of any of the stakeholder groups involved. As the submissions on the draft amendments by various civil society organisations and industry groups show, the updated turnaround time suffers from issues that, if not addressed, may lead to over-removal, and a lack of due process in the content removal procedure.&lt;/p&gt;
&lt;p&gt;Among others, these submissions pointed out that the shortened timeframe did not allow intermediaries sufficient time to scrutinize a takedown request to ensure that all technical and legal requirements are adhered to. This, in turn, may also prompt third-party action against users. Additionally, the significantly short timeframe raised several implementational challenges. For smaller companies with fewer employees, such a timeframe can be burdensome from both a financial and a capability point of view. This, in turn, may result in over-censorship of speech online.&lt;/p&gt;
&lt;p&gt;Failing to recognize and incorporate contextual nuances into any law on intermediary liability therefore, may critically alter the way we interact with online intermediaries, and in a larger scheme, with the internet.&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/why-should-we-care-about-takedown-timeframes'&gt;https://cis-india.org/internet-governance/blog/why-should-we-care-about-takedown-timeframes&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>TorShark</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Content takedown</dc:subject>
    
    
        <dc:subject>Intermediary Liability</dc:subject>
    
    
        <dc:subject>Chilling Effect</dc:subject>
    

   <dc:date>2020-04-10T04:58:56Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>




</rdf:RDF>
