The Centre for Internet and Society
https://cis-india.org
Comments to the draft amendments to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021
https://cis-india.org/internet-governance/blog/comments-to-draft-amendments-to-the-it-rules-2021
<b>The Centre for Internet & Society (CIS) presented its comments on the draft amendments to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (‘the rules’), which were released on 6 June, 2022 for public comments.</b>
<p style="text-align: justify; ">These comments examine whether the proposed amendments adhere to established principles of constitutional law, intermediary liability and other relevant legal doctrines. We thank the Ministry of Electronics and Information Technology (MEITY) for allowing us this opportunity. Our comments are divided into two parts. In the first part, we reiterate some of our comments on the existing version of the rules, which we believe hold relevance for the proposed amendments as well. In the second part, we provide issue-wise comments highlighting concerns that we believe need to be addressed prior to finalising the amendments to the rules.</p>
<hr />
<p style="text-align: justify; ">To access the full text of the Comments to the draft amendments to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, <a href="https://cis-india.org/internet-governance/blog/comments-to-draft-amendments-to-it-rules-2021.pdf" class="internal-link">click here</a></p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/comments-to-draft-amendments-to-the-it-rules-2021'>https://cis-india.org/internet-governance/blog/comments-to-draft-amendments-to-the-it-rules-2021</a>
</p>
Anamika Kundu, Digvijay Chaudhary, Divyansha Sehgal, Isha Suri and Torsha Sarkar · Digital Media · Internet Governance · Intermediary Liability · Information Technology · 2022-07-07 · Blog Entry

DeitY says 143 URLs have been Blocked in 2015; Procedure for Blocking Content Remains Opaque and in Urgent Need of Transparency Measures
https://cis-india.org/internet-governance/blog/deity-says-143-urls-blocked-in-2015
<b>Across India on 30 December 2014, following an order issued by the Department of Telecom (DOT), Internet Service Providers (ISPs) blocked 32 websites including Vimeo, Dailymotion, GitHub and Pastebin.</b>
<p style="text-align: justify;">In February 2015, the Centre for Internet and Society (CIS) requested the Department of Electronics and Information Technology (DeitY) under the Right to Information Act, 2005 (RTI Act) to provide information clarifying the procedures for blocking in India. We have received a response from DeitY which may be <a href="https://cis-india.org/internet-governance/blog/response-deity.clarifying-procedures-for-blocking.pdf" class="external-link">seen here</a>.</p>
<p style="text-align: justify;">In this post, I shall elaborate on this response from DeitY and highlight some of the accountability and transparency measures that the procedure needs. To stress the urgency of reform, I shall also touch upon two recent developments—the response from Ministry of Communication to questions raised in Parliament on the blocking procedures and the Supreme Court (SC) judgment in Shreya Singhal v. Union of India.</p>
<h2 style="text-align: justify;">Section 69A and the Blocking Rules</h2>
<p align="JUSTIFY" class="western">Section 69A of the Information Technology Act, 2000 (S69A hereinafter), inserted by the 2008 amendment to the Act, grants the central government the power to issue directions for blocking access to any information through any computer resource. In other words, it allows the government to block websites on certain grounds. The Government has notified rules laying down the procedure for blocking access online: the Information Technology (Procedure and Safeguards for Blocking for Access of Information by Public) Rules, 2009 (Rules, 2009 hereinafter). CIS has produced a poster explaining the blocking procedure (<a href="http://cis-india.org/internet-governance/blog/blocking-websites.pdf/at_download/file">download PDF</a>, 2.037MB).</p>
<p align="JUSTIFY" class="western">There are <em>three key aspects</em> of the blocking rules that need to be kept under consideration:</p>
<h3 align="JUSTIFY" class="western">Officers and committees handling requests</h3>
<p style="text-align: justify;"><strong>Designated Officer (DO)</strong> – Appointed by the Central Government; an officer not below the rank of Joint Secretary.<br /><strong>Nodal Officer (NO)</strong> – Appointed by organisations including Ministries or Departments of the State Governments and Union Territories, and any agency of the Central Government. <br /><strong>Intermediary contact</strong> – Appointed by every intermediary to receive and handle blocking directions from the DO.<br /><strong>Committee for Examination of Request (CER)</strong> – Examines each blocking request along with a printed sample of the allegedly offending information. The DO chairs the committee, which includes representatives from the Ministry of Law and Justice, the Ministry of Home Affairs, the Ministry of Information and Broadcasting, and the Indian Computer Emergency Response Team (CERT-In). The CER makes recommendations on each request (including recommendations to revoke blocking orders) to the DO, which are taken into consideration for the final approval of the request by the Secretary, DeitY. <br /><strong>Review Committee (RC) </strong>– Constituted under rule 419A of the Indian Telegraph Rules, 1951, the RC comprises the Cabinet Secretary, the Secretary to the Government of India (Legal Affairs) and the Secretary, Department of Telecom. The RC is mandated to meet at least once in two months, record its findings, and validate that the directions issued are in compliance with S69A(1).</p>
<h3 style="text-align: justify;">Provisions outlining the procedure for blocking</h3>
<p>Rules 6, 9 and 10 create three distinct blocking procedures, which must commence within 7 days of the DO receiving the request.</p>
<p style="text-align: justify;">a) Rule 6 lays out the first procedure, under which any person may approach the NO and request blocking; alternatively, the NO may raise a blocking request itself. Once the NO of the approached Ministry, Department of a State Government or Union Territory, or agency of the Central Government is satisfied of the validity of the request, it forwards the request to the DO. Requests not sent through the NO of any organisation must be approved by the Chief Secretary of the State or Union Territory, or the Advisor to the Administrator of the Union Territory, before being sent to the DO.</p>
<p style="text-align: justify;">Upon receiving the request, the DO must acknowledge receipt within 24 hours and place the request, along with a printed copy of the alleged information, before the CER for validation. The DO must also make reasonable efforts to identify the person or intermediary hosting the information and, having identified them, issue a notice asking them to appear before the committee at a specified date and time and submit their reply and clarifications within forty-eight hours of receipt of the notice.</p>
<p style="text-align: justify;">Foreign entities hosting the information are also informed. The CER gives its recommendations after the intermediary or person has clarified their position (or even in the absence of any representation), and after examining whether the request falls within the scope outlined under S69A(1). The DO forwards the request and the CER's recommendations to the Secretary, DeitY, who issues the blocking directions. If approval is granted, the DO directs the relevant intermediary or person to block the alleged information.</p>
<p style="text-align: justify;" class="western">b) Rule 9 outlines a procedure for emergency circumstances: once the DO has established the necessity and expediency of blocking the alleged information, they submit recommendations in writing to the Secretary, DeitY. The Secretary, upon being satisfied of the justification, necessity and expediency of blocking the information, may issue blocking directions as an interim measure and must record the reasons for doing so in writing.</p>
<p style="text-align: justify;" class="western">Under such circumstances, the intermediary and the person hosting the information are not given the opportunity of a hearing. Nevertheless, the DO is required to place the request before the CER within forty-eight hours of issuing the interim blocking directions. Only upon receiving the final recommendations from the committee can the Secretary pass a final order approving the request. If the request for blocking is not approved, the interim order passed earlier is revoked, and the intermediary or identified person is directed to unblock the information for public access.</p>
<p style="text-align: justify;" class="western">c) Rule 10 outlines the process when an order is issued by the courts in India. The DO upon receipt of the court order for blocking of information submits it to the Secretary, DeitY and initiates action as directed by the courts.</p>
<h3 style="text-align: justify;" class="western">Confidentiality clause</h3>
<p style="text-align: justify;">Rule 16 mandates confidentiality regarding all requests and actions taken thereunder, which places any requests received by the NO and the DO, recommendations made by the DO or the CER, and any written reasons for blocking or revoking blocking requests outside the purview of public scrutiny. More detail on the officers and committees that enforce the blocking rules and procedure can be found <a href="http://cis-india.org/internet-governance/blog/is-india2019s-website-blocking-law-constitutional-2013-i-law-procedure">here</a>.</p>
<h2>Response on blocking from the Ministry of Communication and Information Technology</h2>
<p style="text-align: justify;">The response to our RTI from the E-Security and Cyber Law Group is timely, given the recent clarification from the Ministry of Communication and Information Technology in response to a number of questions raised by parliamentarian Shri Avinash Pande in the Rajya Sabha. The questions concerned the emergency blocking order under the IT Act, the current status of the Central Monitoring System, data privacy law and net neutrality. The Centre for Communication Governance (CCG), National Law University, Delhi has extracted a set of six questions, and you can read the full article <a href="https://ccgnludelhi.wordpress.com/2015/04/24/governments-response-to-fundamental-questions-regarding-the-internet-in-india/">here</a>.</p>
<p align="JUSTIFY" class="western">The government's response, as quoted by CCG, clarifies that under rule 9 the Government has issued directions for emergency blocking of <em>a total number of 216 URLs from 1st January, 2014 till date</em>, and that <em>a total of 255 URLs were blocked in 2014 and no URLs has been blocked in 2015 (till 31 March 2015)</em> under S69A through the committee constituted under the rules. Further, a total of 2091 URLs and 143 URLs were blocked in order to comply with the directions of the competent courts of India in 2014 and 2015 (till 31 March 2015) respectively. The government also clarified that the CER had recommended not to block 19 URLs in the meetings held between 1st January 2014 and the date of the response, and that so far two orders have been issued to revoke 251 blocked URLs over the same period. In addition, CERT-In received requests from individuals and organisations for the blocking of objectionable content, which were forwarded to the concerned websites for appropriate action; however, the response did not specify the number of such requests.</p>
<p align="JUSTIFY" class="western">We have prepared a table summarising the information released by the government and highlighting the inconsistencies in their response.</p>
<table class="grid listing">
<colgroup> <col width="331"> <col width="90"> <col width="91"> <col width="119"> </colgroup>
<tbody>
<tr>
<td rowspan="2">
<p align="LEFT"><strong>Applicable rule and procedure outlined under the Blocking Rules</strong></p>
</td>
<td colspan="3">
<p align="CENTER"><strong>Number of websites</strong></p>
</td>
</tr>
<tr>
<td>
<p align="CENTER"><em>2014</em></p>
</td>
<td>
<p align="CENTER"><em>2015</em></p>
</td>
<td>
<p align="CENTER"><em>Total</em></p>
</td>
</tr>
<tr>
<td>
<p align="LEFT">Rule 6 - Blocking requests from NO and others</p>
</td>
<td>
<p align="CENTER">255</p>
</td>
<td>
<p align="CENTER">None</p>
</td>
<td>
<p align="CENTER">255</p>
</td>
</tr>
<tr>
<td>
<p align="LEFT">Rule 9 - Blocking under emergency circumstances</p>
</td>
<td>
<p align="CENTER">-</p>
</td>
<td>
<p align="CENTER">-</p>
</td>
<td>
<p align="CENTER">216</p>
</td>
</tr>
<tr>
<td>
<p align="LEFT">Rule 10 - Blocking orders from Court</p>
</td>
<td>
<p align="CENTER">2091</p>
</td>
<td>
<p align="CENTER">143</p>
</td>
<td>
<p align="CENTER">2234</p>
</td>
</tr>
<tr>
<td>
<p align="LEFT">Requests from individuals and orgs forwarded to CERT-In</p>
</td>
<td>
<p align="CENTER">-</p>
</td>
<td>
<p align="CENTER">-</p>
</td>
<td>
<p align="CENTER">-</p>
</td>
</tr>
<tr>
<td>
<p align="LEFT">Recommendations to not block by CER</p>
</td>
<td>
<p align="CENTER">-</p>
</td>
<td>
<p align="CENTER">-</p>
</td>
<td>
<p align="CENTER">19</p>
</td>
</tr>
<tr>
<td>
<p align="LEFT">Number of blocking requests revoked</p>
</td>
<td>
<p align="CENTER">-</p>
</td>
<td>
<p align="CENTER">-</p>
</td>
<td>
<p align="CENTER">251</p>
</td>
</tr>
</tbody>
</table>
<p>In a <a href="http://sflc.in/deity-says-2341-urls-were-blocked-in-2014-refuses-to-reveal-more/">response </a>to an RTI filed by the Software Freedom Law Centre, DeitY said that 708 URLs were blocked in 2012, 1,349 URLs in 2013, and 2,341 URLs in 2014.</p>
<h2>Shreya Singhal v. Union of India</h2>
<p style="text-align: justify;">In its recent judgment, the SC of India upheld the constitutionality of S69A, stating that it was a narrowly-drawn provision with adequate safeguards. The constitutional challenge on behalf of the People’s Union for Civil Liberties (PUCL) considered the manner in which blocking is done, and the arguments focused on the secrecy surrounding it.</p>
<p style="text-align: justify;">The rules may indicate that there is a requirement to identify and contact the originator of information, though as an expert <a href="http://indianexpress.com/article/opinion/columns/but-what-about-section-69a/">has pointed out</a>, there is no evidence of this in practice. The court has stressed the importance of a written order so that writ petitions may be filed under Article 226 of the Constitution. In doing so, the court seems to have assumed that the originator or intermediary is informed, and therefore held the view that any procedural inconsistencies may be challenged through writ petitions. However, this recourse is rendered ineffective not only by procedural constraints but also by the confidentiality clause. The opaqueness created by rule 16 severely reins in the recourse available to the originator and the intermediary. While the court notes that rule 16, requiring confidentiality, was argued to be unconstitutional, it does not state its opinion on this question in the judgment. One expert holds the <a href="https://indconlawphil.wordpress.com/2015/03/25/the-supreme-courts-it-act-judgment-and-secret-blocking/">view</a> that the judgment, by implication, requires that requests cannot be confidential. However, such a reading down of rule 16 is yet to be tested.</p>
<p style="text-align: justify;">Further, Sunil Abraham has <a href="http://cis-india.org/internet-governance/blog/economic-and-political-weekly-sunil-abraham-april-11-2015-shreya-singhal-and-66a">pointed</a> out, “block orders are unevenly implemented by ISPs making it impossible for anyone to independently monitor and reach a conclusion whether an internet resource is inaccessible as a result of a S69A block order or due to a network anomaly.” As no comprehensive list of blocked websites, or of the legal orders through which they are blocked, exists, the public has to rely on media reports and RTI requests to understand the censorship regime in India. CIS has previously <a href="http://cis-india.org/internet-governance/blog/analysing-blocked-sites-riots-communalism">analysed</a> leaked block lists and lists received in response to RTI requests, which reveal that block orders are full of errors and that entire platforms, not just specific links, have been blocked.</p>
<p style="text-align: justify;">While the state has the power to block content, doing so in secrecy and without judicial scrutiny marks the deficiencies that remain in the procedure outlined under the blocking rules. The Court could have read down rule 16, barring a narrow set of exceptions; in not doing so, it has perhaps overlooked an opportunity to reform the existing system. The blocking of 32 websites is an example of the opaqueness of the system of blocking orders, where the safeguards assumed by the SC are often not observed: there is no access to the recommendations made by the CER, or to the subsequent revocation of the blocking orders. CIS filed the RTI to try and understand the grounds for blocking and related procedures, and the response has thrown up some issues that need urgent attention.</p>
<h2>Response to RTI filed by CIS</h2>
<p align="JUSTIFY" class="western">Our first question sought clarification on the websites blocked on 30th December 2014. The response received from DeitY's E-Security and Cyber Law Group reveals that the websites had been blocked as “they were being used to post information related to ISIS using the resources provided by these websites”. The response also clarifies that the directions to block were issued on <em>18-12-2014</em> and that <em>as of 09-01-2015</em>, after obtaining an undertaking from website owners stating their compliance with the Government and Indian laws, the sites were unblocked.</p>
<p align="JUSTIFY" class="western">It is not clear if the ATS, Mumbai had been intercepting communication or if someone reported these websites. If the ATS was indeed intercepting communication, then as per the rules, the RC should be informed and its recommendations sought. It is unclear if this was the case, and the response invokes the confidentiality clause under rule 16 to avoid divulging further details. Based on our reading of the rules, court orders should be accessible to the public; without copies of the requests and complaints received, and knowledge of which organisation raised them, there can be no appeal or recourse available to the intermediary or even the general public.</p>
<p align="JUSTIFY" class="western">We also asked for a list of all requests for blocking of information received by the DO between January 2013 and January 2015, including copies of all files that had been accepted or rejected. We also specifically asked for a list of requests under rule 9. The response from DeitY stated that between January 1, 2015 and March 31, 2015, directions to block 143 URLs had been issued based on court orders. The response completely overlooks our request for information covering the two-year period. It also does not cover all types of blocking orders under rules 6 and 9, nor the requests forwarded to CERT-In, as we have gauged from the ministry's response to Parliament. Contrary to the SC's assumption that the originator of information is contacted, it is also clear from DeitY's response that only the websites had been contacted, and the letter states that the “websites replied only after blocking of objectionable content”.</p>
<p align="JUSTIFY" class="western">Further, seeking clarification on the functioning of the CER, we asked for the current composition of its members and the dates and copies of the minutes of all meetings, including copies of the recommendations made by the committee. The response merely quotes rule 7 as the reference for the composition and does not provide any names or other details. As per the DeitY website, Shri B.J. Srinath, Scientist-G/GC, appears to be the appointed Designated Officer, though this needs confirmation. While we are already aware of the structure of the CER, which representatives and appointed public officers are guiding the examination of requests remains unclear. Presently, there are 3 Joint Secretaries appointed under the Ministry of Law and Justice, 19 under the Home Ministry, and 3 under the Ministry of Information and Broadcasting. Further, it is not clear which grade of scientist would be appointed to this committee from CERT-In, as the rules do not specify this. While the government has clarified in its answer to Parliament that the committee recommended not to block 19 URLs in the meetings held between 1st January 2014 and the date of the response, it remains unclear who is taking these decisions to block and to revoke blocked URLs. The response from DeitY specifies that the CER met six times between 2014 and March 2015, but stops short of sharing any further information or copies of files on complaints and recommendations of the CER, citing rule 16.</p>
<p align="JUSTIFY" class="western">Finally, answering our question on the composition of the RC, the letter merely highlights the provision providing for the composition under rule 419A of the Indian Telegraph Rules, 1951. The response clarifies that so far the RC has met once, on 7th December, 2013, under the Chairmanship of the Cabinet Secretary, with the Secretary, Department of Legal Affairs and the Secretary, DOT. Our request for minutes of meetings and copies of orders and findings of the RC is denied by simply stating that “minutes are not available”. Under rule 419A, any directions for interception of any message or class of messages under sub-section (2) of Section 5 of the Indian Telegraph Act, 1885 issued by the competent authority shall contain reasons for such direction, and a copy of such order shall be forwarded to the concerned RC within a period of seven working days. Given that the RC has met just once since 2013, it is unclear if the RC is not functioning or if the interception of messages is being guided through other procedures. Further, we do not yet have details or any records of revocation orders or notices sent to intermediary contacts. This restricts citizens’ right to receive information, and DeitY should work to make these records available to the public.</p>
<p align="JUSTIFY" class="western">Given the response to our RTI, the Ministry's response to Parliament and the SC judgment, we recommend that DeitY take the following steps to ensure a procedure that is just, accountable and follows the rule of law.</p>
<p align="JUSTIFY" class="western">The validity of rule 16 needs urgent clarification for two reasons:</p>
<ol>
<li>Under Section 22 of the RTI Act, the provisions of the RTI Act override all conflicting provisions in any other legislation.</li>
<li style="text-align: justify;">In upholding the constitutionality of S69A, the SC cites the requirement that the reasons behind blocking orders be recorded in writing, so that they may be challenged by means of writ petitions filed under <a href="http://indiankanoon.org/doc/1712542/">Article 226</a> of the Constitution of India.</li></ol>
<p style="text-align: justify;">If the blocking orders, and the meetings of the CER and RC that consider the reasons in those orders, are to remain shrouded in secrecy and unavailable through RTI requests, filing writ petitions challenging these decisions will not be possible, rendering this very important safeguard for the protection of online free speech and expression infructuous. In sum, the need for comprehensive legislative reform of the blocking procedures remains, and the government should act to address the pressing need for transparency and accountability. Not only does opacity curtail the strengths of democracy, it also impedes good governance. We have filed an RTI seeking a comprehensive account of the blocking procedure and the functioning of the committees from 2009 to 2015, and we shall publish any information that we receive.</p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/deity-says-143-urls-blocked-in-2015'>https://cis-india.org/internet-governance/blog/deity-says-143-urls-blocked-in-2015</a>
</p>
jyoti · Censorship · Freedom of Speech and Expression · RTI · Intermediary Liability · Accountability · Featured · 69A · Internet Governance · Chilling Effect · Transparency · Homepage · Blocking · 2015-04-30 · Blog Entry

The Ministry And The Trace: Subverting End-To-End Encryption
https://cis-india.org/internet-governance/blog/the-ministry-and-the-trace-subverting-end-to-end-encryption
<b>A legal and technical analysis of the 'traceability' rule and its impact on messaging privacy.</b>
<p> </p>
<p>The paper was published in the <a class="external-link" href="http://nujslawreview.org/2021/07/09/the-ministry-and-the-trace-subverting-end-to-end-encryption/">NUJS Law Review Volume 14 Issue 2 (2021)</a>.</p>
<hr />
<h2>Abstract</h2>
<div class="justify">
<div class="pbs-main-wrapper">
<p>End-to-end encrypted messaging allows individuals to hold confidential conversations free from the interference of states and private corporations. To aid surveillance and prosecution of crimes, the Indian Government has mandated online messaging providers to enable identification of originators of messages that traverse their platforms. This paper establishes how the different ways in which this ‘traceability’ mandate can be implemented (dropping end-to-end encryption, hashing messages, and attaching originator information to messages) come with serious costs to usability, security and privacy. Through a legal and constitutional analysis, we contend that traceability exceeds the scope of delegated legislation under the Information Technology Act, and is at odds with the fundamental right to privacy.</p>
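<p>The abstract above mentions message hashing as one way the traceability mandate could be implemented: a provider stores a hash of each message and matches hashes to identify the first sender. As a minimal illustrative sketch (not drawn from the paper itself; the function and variable names are invented for illustration), the Python snippet below shows both why this works for verbatim forwards and why it is fragile under trivial edits.</p>

```python
import hashlib

def message_hash(text: str) -> str:
    """Content-derived identifier: SHA-256 digest of the message text."""
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

original  = "Meet at the town square at 5pm"
forwarded = "Meet at the town square at 5pm"   # verbatim forward
altered   = "Meet at the town square at 5 pm"  # one-character edit

# A verbatim forward hashes identically, so a provider storing
# (hash, first-sender) pairs could attribute it to an originator.
assert message_hash(original) == message_hash(forwarded)

# Any trivial alteration changes the hash completely,
# breaking the chain of attribution.
assert message_hash(original) != message_hash(altered)
```

<p>The same property also hints at a privacy cost: identical hashes let a provider link every user who sent the same message, even within otherwise end-to-end encrypted conversations.</p>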
<p> </p>
<p>Click here to read the <a class="external-link" href="http://nujslawreview.org/2021/07/09/the-ministry-and-the-trace-subverting-end-to-end-encryption/">full paper</a>.</p>
</div>
</div>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/the-ministry-and-the-trace-subverting-end-to-end-encryption'>https://cis-india.org/internet-governance/blog/the-ministry-and-the-trace-subverting-end-to-end-encryption</a>
</p>
Gurshabad Grover, Tanaya Rajwade and Divyank Katira · Cryptography · Intermediary Liability · Constitutional Law · Internet Governance · Messaging · Encryption Policy · 2021-07-12 · Blog Entry

Panel Discussion on Internet Intermediaries, Law and Innovation
https://cis-india.org/internet-governance/news/panel-discussion-on-internet-intermediaries-law-and-innovation
<b>CII, Google and Centre For Communications Governance, NLU Delhi hosted a panel discussion on June 2 in New Delhi. Jyoti Panday attended.</b>
<p style="text-align: justify; ">The Centre for Internet & Society (CIS) participated in the panel discussion on 'Internet Intermediaries, Law and Innovation' hosted by CII, Google and the Centre for Communications Governance, NLU Delhi. The panel discussed the impact of the existing provisions on intermediary liability and innovation, and sought suggestions on the way forward.<br /><br />The panel was moderated by Dr Subho Ray, President, IAMAI.<br /><br />Other panelists included:</p>
<ul style="text-align: justify; ">
<li> Mr Anupam Chander, Eminent Global Lawyer & Academician</li>
<li> Mr Apar Gupta, Advocate</li>
<li> Ms Mishi Choudhary, Founding Director, Software Freedom Law Centre</li>
<li> Mr J Sai Deepak, Associate Partner, Litigation Team, Saikrishna & Associates</li>
<li> Mr Indranil Choudhury, Founder and CEO, Lexplosion</li>
</ul>
<p><a href="https://cis-india.org/internet-governance/blog/internet-intermediaries-law-and-innovation-panel.odp" class="internal-link">Click to download the presentation.</a></p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/news/panel-discussion-on-internet-intermediaries-law-and-innovation'>https://cis-india.org/internet-governance/news/panel-discussion-on-internet-intermediaries-law-and-innovation</a>
</p>
jyoti · Internet Governance · Intermediary Liability · 2015-06-14 · News Item

New intermediary guidelines: The good and the bad
https://cis-india.org/internet-governance/blog/new-intermediary-guidelines-the-good-and-the-bad
<b>In pursuance of the government releasing the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, this blogpost offers a quick rundown of some of the changes brought about by the Rules, and how they line up with existing principles of best practice in content moderation, among others.</b>
<p> </p>
<p>This article originally appeared in the Down to Earth <a class="external-link" href="https://www.downtoearth.org.in/blog/governance/new-intermediary-guidelines-the-good-and-the-bad-75693">magazine</a>. Reposted with permission.</p>
<p>-------</p>
<p>The Government of India has notified the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021. These rules operate in supersession of the existing intermediary liability rules under the Information Technology (IT) Act, made back in 2011.</p>
<p>These IL rules would have a significant impact on our relationships with internet ‘intermediaries’, i.e. gatekeepers and gateways to the internet, including social media platforms and communication and messaging channels.</p>
<p>The rules also make a bid to include entities that have not traditionally been considered ‘intermediaries’ within the law, including curated-content platforms such as Netflix and Amazon Prime as well as digital news publications.</p>
<p>These rules are a significant step up from the draft version of the amendments floated by the Union government two years ago; in this period, the relationship between governments around the world and major intermediaries changed significantly.</p>
<p>The insistence of these entities in the past that they are not ‘arbiters of truth’, for instance, has not always held water in their own decision-making.</p>
<p>Both Twitter and Facebook, for instance, have locked the former United States president Donald Trump out of their platforms. Twitter has also resisted fully complying with government censorship requests in India, spilling into an interesting policy tussle between the two entities. It is in the context of these changes, therefore, that we must consider the new rules.</p>
<p><strong>What changed for the good?</strong></p>
<p>One of the immediate standouts of these rules is the more granular way in which they approach the problem of intermediary regulation. The previous draft, and in general the entirety of the law, had continued to treat ‘intermediaries’ as a monolithic entity, entirely definable by section 2(w) of the IT Act, which in turn derived much of its legal language from the EU E-Commerce Directive of 2000.</p>
<p>Intermediaries in the directive were treated more like ‘mere conduits’: dumb, passive carriers who did not play any active role in the content. While that might have been the truth of the internet when these laws and rules were first enacted, the internet today looks very different.</p>
<p>Not only is there a diversification of services offered by these intermediaries, there’s also a significant issue of scale, wielded by a few select players, either by centralisation or by the sheer number of user bases. A broad, general mandate would, therefore, miss out on many of these nuances, leading to imperfect regulatory outcomes.</p>
<p>The new rules, therefore, envisage three types of entities:</p>
<ul><li>There are the ‘intermediaries’ within the traditional, section 2(w) meaning of the IT Act. This is the broad umbrella term for all entities that fall within the ambit of the rules.</li><li>There are the ‘social media intermediaries’ (SMIs): entities that enable online interaction between two or more users.</li><li>The rules identify ‘significant social media intermediaries’ (SSMIs), meaning entities with user thresholds as notified by the Central Government.</li></ul>
<p>The levels of obligations vary across these classifications. For instance, an SSMI would be held to a much higher standard of transparency and accountability towards its users. SSMIs would have to publish six-monthly transparency reports, outlining how they dealt with requests for content removal, how they deployed automated tools to filter content, and so on.</p>
<p>I have previously argued that transparency reports, when done well, are an excellent way of understanding the breadth of government and social media censorship. Legally mandating them is therefore perhaps a step in the right direction.</p>
<p>Some other requirements under this transparency principle include giving notice to users whose content has been disabled, allowing them to contest such removal, etc.</p>
<p>One of the other rules from the older draft that had raised significant concern was the proactive filtering mandate, under which intermediaries were essentially required to filter for all unlawful content. This was problematic on two counts:</p>
<ul><li>Developments in machine learning technologies are simply not advanced enough to make this possible, which means there would always be a chance that legitimate and legal content would be censored, leading to a general chilling effect on digital expression.</li><li>The technical and financial burden this would impose on intermediaries would have impacted competition in the market.</li></ul>
<p>The new rules seem to have lessened this burden: first, by reducing the mandate from a mandatory obligation to a best-endeavour basis; and second, by narrowing the ambit of ‘unlawful content’ to include only content depicting sexual abuse, child sexual abuse material (CSAM), and duplicates of content that has already been disabled or removed.</p>
<p>This specificity should aid the deployment of such technologies, since previous research has shown that it is considerably easier to train a machine learning tool on a corpus of CSAM or abuse than on more contextual, subjective matters such as hate speech.</p>
<p><strong>What should go?</strong></p>
<p>That being said, it is concerning that the new rules choose to bring online curated content platforms (OCCPs) within the ambit of the law, through proposals for a three-tiered self-regulatory body and schedules outlining guidelines for the rating system these entities should deploy.</p>
<p>In the last two years, several attempts have been made by the Internet and Mobile Association of India (IAMAI), an industry body consisting of representatives of these OCCPs, to bring about a self-regulatory code that fills in the supposed regulatory gap in the Indian law.</p>
<p>It is not known if these stakeholders were consulted before the enactment of these provisions. Some of this framework would also apply to publishers of digital news portals.</p>
<p>Noticeably, this entire chapter was also missing from the old draft, and introducing it in the final form of the law without due public consultations is problematic.</p>
<p>Part III and onwards of the rules, which broadly deal with the regulation of these entities, therefore, should be put on hold and opened up for a period of public and stakeholder consultations to adhere to the true spirit of democratic participation.</p>
<p><em>The author would like to thank Gurshabad Grover for his editorial suggestions. </em></p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/new-intermediary-guidelines-the-good-and-the-bad'>https://cis-india.org/internet-governance/blog/new-intermediary-guidelines-the-good-and-the-bad</a>
</p>
No publisherTorSharkIT ActIntermediary LiabilityInternet GovernanceCensorshipArtificial Intelligence2021-03-15T13:52:46ZBlog EntryLearning Forum: Transparency and Human Rights in the Digital Age
https://cis-india.org/internet-governance/news/learning-forum-transparency-and-human-rights-in-the-digital-age
<b>Pranesh Prakash spoke at this event organized by Global Network Initiative on November 6, 2014 in California. </b>
<p style="text-align: justify; ">Pranesh Prakash spoke on transparency reports and their use and abuse in India; the Intermediary Liability Rules in India (and its non-provision of any transparency mechanism); and the need for transparency in private speech regulation, not just governmental speech regulation.</p>
<hr />
<p><img alt="GNI logo" src="https://cdn.evbuc.com/eventlogos/21069154/gnilogo.jpg" title="GNI logo" width="600" /></p>
<p><img alt="Telecom Industry Dialogue" src="https://cdn.evbuc.com/eventlogos/21069154/screenshot20141002at11.11.24am.png" title="ID logos" width="600" /></p>
<p style="text-align: justify; "><span>The Global Network Initiative and the Telecommunications Industry Dialogue on Freedom of Expression and Privacy present:</span></p>
<p style="text-align: justify; "><b>2014 Learning Forum - Silicon Valley </b><br /><b><span>Transparency and Human Rights in the Digital Age</span></b></p>
<p style="text-align: justify; "><span><span>Hosted by LinkedIn </span></span></p>
<p style="text-align: justify; "><b><span><span>Agenda</span></span></b></p>
<p style="text-align: justify; "><b><span><span>1:30PM - Registration</span></span></b></p>
<p style="text-align: justify; "><b><span><span>2:00PM - Opening Remarks</span></span></b></p>
<p style="text-align: justify; "><span><span>Mark Stephens, Independent Chair, Global Network Initiative</span></span></p>
<p style="text-align: justify; "><span style="text-align: center; ">Jeffrey Dygert, Executive Director of Public Policy, AT&T</span></p>
<p style="text-align: justify; "><span style="text-align: center; ">Pablo Chavez, Vice President, Global Public Policy and Government Affairs, LinkedIn</span></p>
<p style="text-align: justify; "><b><i><span><span>2:15PM - Why does transparency matter for protecting and respecting rights online?</span></span></i></b></p>
<p style="text-align: justify; ">Arvind Ganesan, Director of Business and Human Rights, Human Rights Watch</p>
<p style="text-align: justify; ">Deirdre Mulligan, Associate Professor, UC Berkeley School of Information</p>
<p style="text-align: justify; ">Michael Samway, School of Foreign Service, Georgetown University</p>
<p style="text-align: justify; "><b><i><span><span>3:00PM - What is the state of transparency reporting by companies and governments, and what's missing?</span></span></i></b></p>
<p style="text-align: justify; "><span><span>Steve Crown, Vice President and Deputy General Counsel, Microsoft</span></span></p>
<p style="text-align: justify; "><span><span>Jeffrey Dygert, Executive Director of Public Policy, AT&T</span></span></p>
<p style="text-align: justify; "><span><span>Jason Pielemeier, Bureau of Democracy, Human Rights, and Labor, U.S. Department of State</span></span></p>
<p style="text-align: justify; "><span><span>Pranesh Prakash, Policy Director, Centre for Internet & Society, Bangalore </span></span></p>
<p style="text-align: justify; "><span><span>Moderated by Bennett Freeman, Senior Vice President, Sustainability Research and Policy, Calvert Investments</span></span></p>
<p style="text-align: justify; "><b><span><span>4:00PM - Break</span></span></b></p>
<p style="text-align: justify; "><b><i><span><span>4:30PM - How do companies communicate with users in response to live events? </span></span></i></b></p>
<p style="text-align: justify; "><span><span>Ben Blink, Senior Policy Analyst, Free Expression and International Relations, Google</span></span></p>
<p style="text-align: justify; "><span><span>Patrik Hiselius, Senior Advisor, Digital Rights, TeliaSonera</span></span></p>
<p style="text-align: justify; "><span><span>Rebecca MacKinnon, Director, Ranking Digital Rights Project, New America Foundation</span></span></p>
<p style="text-align: justify; "><span><span>Hemanshu Nigam, CEO, SSP Blue</span></span></p>
<p style="text-align: justify; "><span><span>Sana Saleem, Director, Bolo Bhi</span></span></p>
<p style="text-align: justify; "><span><span>Moderated by Cynthia Wong, Senior Internet Researcher, Human Rights Watch</span></span></p>
<p style="text-align: justify; "><b><i>The program will be followed by a reception from 5:30 to 6:30pm.</i></b></p>
<p style="text-align: justify; ">By invitation only, non-transferrable.</p>
<hr />
<p class="mceContentBody documentContent">The original was <a class="external-link" href="https://www.eventbrite.com/e/learning-forum-transparency-and-human-rights-in-the-digital-age-tickets-13387240597">published here</a>.</p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/news/learning-forum-transparency-and-human-rights-in-the-digital-age'>https://cis-india.org/internet-governance/news/learning-forum-transparency-and-human-rights-in-the-digital-age</a>
</p>
No publisherpraskrishnaInternet GovernanceIntermediary Liability2014-12-04T16:14:38ZNews ItemDo IT Rules 2011 indirectly leads to Censorship of Internet
https://cis-india.org/news/do-it-rules-indirectly-lead-to-censorship-of-internet
<b>Pranesh Prakash, along with Dr. Arvind Gupta, National Convener, BJP IT Cell, and Ms. Mishi Choudhary, Executive Director, SFLC, participated in a panel discussion on censorship of the Internet on May 8, 2012.</b>
<p>The discussion was broadcast on Yuva iTV. See the video below:</p>
<h2>Video</h2>
<p><iframe src="http://www.youtube.com/embed/KRIJRhpW-Bc" frameborder="0" height="315" width="320"></iframe></p>
<p><a class="external-link" href="http://www.youtube.com/watch?v=KRIJRhpW-Bc">Click for the video on YouTube</a></p>
<p>
For more details visit <a href='https://cis-india.org/news/do-it-rules-indirectly-lead-to-censorship-of-internet'>https://cis-india.org/news/do-it-rules-indirectly-lead-to-censorship-of-internet</a>
</p>
No publisherpraskrishnaIT ActInternet GovernanceVideoIntermediary LiabilityCensorship2012-05-31T09:00:41ZNews ItemSuper Cassettes v. MySpace (Redux)
https://cis-india.org/a2k/blogs/super-cassettes-v-myspace
<b>The latest judgment in the matter of Super Cassettes v. MySpace is a landmark and progressive ruling, which strengthens the safe harbor immunity enjoyed by Internet intermediaries in India. It interprets the provisions of the IT Act, 2000 and the Copyright Act, 1957 to restore safe harbor immunity to intermediaries even in the case of copyright claims. It also relieves MySpace from pre-screening user-uploaded content, endeavouring to strike a balance between free speech and censorship. CIS was one of the intervenors in the case, and has been duly acknowledged in the judgment.</b>
<p>On 23rd December 2016, Justice Ravindra Bhat and Justice Deepa Sharma of the Delhi High Court delivered a decision overturning the 2012 order in the matter of Super Cassettes Industries Limited v. MySpace. The 2012 order was heavily criticized, for it was agnostic to the technological complexities of regulating speech on the Internet and cast unfathomable burdens on MySpace. In the following post I summarise the decision of the Division Bench. Click <a class="external-link" href="http://lobis.nic.in/ddir/dhc/SRB/judgement/24-12-2016/SRB23122016FAOOS5402011.pdf">here</a> to read the judgment.</p>
<h3><strong>Brief Facts</strong></h3>
<p>MySpace, a social networking platform, allowed users to upload and share media files, <em>inter alia</em>, and in 2007 Super Cassettes Industries Limited (SCIL) discovered that users were sharing its copyrighted works without authorisation. SCIL promptly filed a civil suit against MySpace for primary infringement under section 51(a)(i) of the Copyright Act as well as secondary infringement under section 51(a)(ii).</p>
<p>The 2012 order was extremely worrisome, as it turned the clock back several decades on concepts of internet intermediary liability. The court had held MySpace liable for copyright infringement despite MySpace having shown that it had no knowledge of specific instances of infringement, that it removed infringing content upon complaint, and that Super Cassettes had failed to submit songs to MySpace's song ID database. The most impractical duty the court pronounced was that MySpace was required to pre-screen content, rather than relying on post-infringement measures to remove infringing content. This was a result of interpreting due diligence to include pre-screening.</p>
<p>The court injuncted MySpace from permitting any uploads of SCIL's copyrighted content, and directed it to expeditiously execute content removal requests. To read CIS' analysis of the Single Judge's interim order, click <a class="external-link" href="http://cis-india.org/a2k/blogs/super-cassettes-v-my-space">here</a>.</p>
<p>In the instant judgment, the bench limited their examination to MySpace’s liability for secondary infringement, and left the direct infringement determination to the Single Judge at the subsequent trial stage. In doing so, the court answered the following three questions:</p>
<h4>1) Whether MySpace could be said to have knowledge of infringement so as to attract liability for
secondary infringement under Section 51(a)(ii)?</h4>
<p>No. According to the Court, in the case of internet intermediaries, section 51(a)(ii) contemplates actual knowledge and not general awareness.</p>
<p>Elaborating on the circumstances of the case, the Court held that to attract liability for secondary infringement, MySpace should have had actual knowledge and not mere awareness of the infringement. Appreciating the difference between the virtual and physical worlds, the judgment stated “<em>the nature of internet media is such that the interpretation of knowledge cannot be the same as that is used for a physical premise.”</em></p>
<p>As per the court, the following facts only amounted to a general awareness, which was not sufficient to establish secondary liability:</p>
<ol><li>Existence of user agreement terms which prohibited users from unauthorised uploading of content;<br />
</li><li>Operation of post-infringement mechanisms instituted by MySpace to identify and remove content;<br />
</li><li>SCIL sharing a voluminous catalogue of 100,000 copyrighted songs with MySpace, expecting the latter to monitor and quell any infringement;<br />
</li><li>Modifying videos to insert ads in them: SCIL contended that MySpace invited users to share and upload content which it would use to insert ads and earn revenue, and that this amounted to knowledge. The Court found that video modification for ad insertion only changed the format of the video and not the content; further, it was a purely automated process with no human intervention.</li></ol>
<p>Additionally, no constructive knowledge could be attributed to MySpace to demonstrate reasonable ground for believing that infringement had occurred. A reasonable belief could emerge only after MySpace had perused all the content uploaded and shared on its platform – a task that was impossible to perform due to the voluminous catalogue
handed to it and existing technological limitations.</p>
<p>The Court imposed a duty on SCIL to specify the works in which it owned copyright <em>and</em> which were being shared without authorisation on MySpace. It held that merely giving the names of all content it owned, without expressly pointing out the infringing works, was contrary to established principles of copyright law. Further, MySpace contended, and the judge agreed, that in many instances the works were legally shared by distributors and performers, and that users often created remixed works which bore only a semblance to the title of the copyrighted work.</p>
<p class="callout"><strong><em>In such cases it becomes even more important for a plaintiff such as
MySpace to provide specific titles, because while an intermediary may
remove the content fearing liability and damages, an authorized
individual’s license and right to fair use will suffer or stand negated.
(Para 38 in decision)</em></strong></p>
<p>Thus, whereas MySpace undoubtedly provided a place of profit for the communication of infringing works uploaded by users, it had neither specific knowledge nor a reasonable belief of the infringement.</p>
<h4>2) Does the proviso to Section 81 override the "safe harbor" granted to intermediaries under Section 79 of the IT Act, 2000?</h4>
<p>and</p>
<h4>3) Whether it was possible to harmoniously read and interpret Sections 79 and 81 of the IT Act, and Section 51 of the Copyright Act?</h4>
<p>No, the proviso does not override the safe harbor, i.e. the safe harbor defence cannot be denied to the intermediary in the case of copyright actions. Indeed, the three sections have to be read harmoniously.</p>
<p>
The judgment referred to the Parliamentary Standing Committee report as a relevant tool in interpreting the two provisions, declaring that the rights conferred under the IT Act, 2000 are supplementary and not in derogation of the Patents Act or the Copyright Act. The proviso was inserted only to permit copyright owners to demand action
against intermediaries who may themselves post infringing content – the safe harbor only existed for circumstances when content was third party/user generated.</p>
<p class="callout"><strong><em>Given the supplementary nature of the provisions- one where infringement
is defined and traditional copyrights are guaranteed and the other
where digital economy and newer technologies have been kept in mind, the
only logical and harmonious manner to interpret the law would be to read
them together. Not doing so would lead to an undesirable situation
where intermediaries would be held liable irrespective of their due
diligence. (Para 49 in decision)</em></strong></p>
<p>Regarding section 79, the court reiterated that the section only granted a limited immunity to intermediaries, a <em>measured privilege to an intermediary</em>, which was in the nature of an affirmative defence and not a blanket immunity to avoid liability. The very purpose of section 79 was to regulate and limit this liability, whereas the Copyright Act granted and controlled the rights of a copyright owner.</p>
<p>The Court found Judge Whyte’s decision in Religious Technology Centre v. Netcom Online Communication Services (1995) particularly relevant to the instant case, and agreed with its observations. To recall, <em>Netcom</em> was the landmark US ruling which established that where a subscriber was responsible for direct infringement, and the service provider did nothing more than set up and operate the technical systems necessary for the functioning of the Internet, it was illogical to impute liability to the service provider.</p>
<h3><strong>On MySpace Complying with Safe Harbor Requirements under Section 79 of the IT Act, 2000 (and Intermediary Rules, 2011)</strong></h3>
<p>The court held that MySpace's operations were in compliance with section 79(2)(b). The transmission of content was initiated at the behest of users, the recipients were not chosen by MySpace, nor was the content modified. On the issue of modification, the court reasoned that since modification was an automated process (MySpace inserting ads) which changed only the format, without MySpace's tacit or express control or knowledge, it was in compliance with the legislative requirement.</p>
<p class="callout"><strong><em>Despite several safeguard tools and notice and take down regimes,
infringed videos find their way. The remedy here is not to target
intermediaries but to ensure that infringing material is removed in an
orderly and reasonable manner. A further balancing act is required which
is that of freedom of speech and privatized censorship. If an
intermediary is tasked with the responsibility of identifying infringing
content from non-infringing one, it could have a chilling effect on
free speech; an unspecified or incomplete list may do that.
(Para 62 in decision)</em></strong></p>
<p>On the second aspect, due diligence, the court held that MySpace complied with the procedure specified in the Rules: it published rules, regulations, a privacy policy and a user agreement governing access and usage. Reading Rule 3(4) with section 79(2)(c), the court held that due diligence required MySpace to remove content within 36 hours of gaining actual knowledge, or receiving knowledge from another person, of the infringing content. <strong>Only if MySpace failed to take infringing content down accordingly would safe harbour be denied to it.</strong></p>
<p>This liberal interpretation of due diligence is a big win for internet intermediaries in India.</p>
<h3><strong>Additional Issues Considered by the Court</strong></h3>
<p>MySpace also tried to defend its activities by claiming the shield of the fair dealing provisions of the Indian Copyright Act. The Court refused, holding that the fair dealing defence was inapplicable because those provisions protected only transient and incidental storage, whereas in the instant circumstances the content in question was stored and hosted permanently.</p>
<p>MySpace also contended that the Single Judge's injunction order was vague and general and had foisted unimplementable duties on MySpace, disregarding the way the Internet functioned. If MySpace had to strictly comply with the order, it would have to shut its business in India. <strong>The Court said that the Single Judge's order, if enforced, would create a system of unwarranted private censorship, running contrary to the principles of a free speech regime, devoid of considerations of peculiarities of the internet intermediary industry. </strong>Private censorship would also invite upon the ISP the legal risk of wrongfully terminating a user account.</p>
<p>Finally, the Court urged MySpace to explore and innovate techniques to protect the interests of traditional copyright holders in a more efficient manner.</p>
<h3><strong>Relief Granted</strong></h3>
<p>Setting aside the Single Judge's order, the Court directed SCIL to provide a specific catalogue of infringing works, along with the URLs of the files. Upon receiving such specific knowledge, MySpace has been directed to remove the content within 36 hours of the notice. MySpace will also keep an account of the removals, and of the revenues earned from ads placed on them, for calculating damages at the trial stage.</p>
<p>
For more details visit <a href='https://cis-india.org/a2k/blogs/super-cassettes-v-myspace'>https://cis-india.org/a2k/blogs/super-cassettes-v-myspace</a>
</p>
No publishersinhaIntermediary LiabilityCopyrightCensorshipAccess to Knowledge2017-01-18T14:31:25ZBlog EntryTwitter's India troubles show tough path ahead for digital platforms
https://cis-india.org/internet-governance/news/dw-june-21-2021-aditya-sharma-twitter-india-troubles-show-tough-path-ahead-for-digital-platforms
<b>Twitter is in a standoff with Indian authorities over the government's new digital rules. Critics see the rules as an attempt to curb free speech, while others say more action is needed to hold tech giants accountable.
</b>
<p style="text-align: justify; ">The blog by Aditya Sharma <a class="external-link" href="https://www.dw.com/en/twitters-india-troubles-show-tough-path-ahead-for-digital-platforms/a-57980916">was published by DW</a> on 21 June 2021. Torsha Sarkar was quoted.</p>
<hr style="text-align: justify; " />
<p style="text-align: justify; "><img src="https://cis-india.org/home-images/Intermediary.jpg/@@images/08eb8de3-4fd6-408f-94d2-3f202da0e730.jpeg" alt="Intermediary" class="image-right" title="Intermediary" /></p>
<p style="text-align: justify; ">Twitter holds a relatively low share of India's social media market. But, since 2017, the huge nation has emerged as Twitter's fastest-growing market, becoming critical to its global expansion plans.</p>
<p style="text-align: justify; ">In February, the Indian government <a href="https://www.dw.com/en/india-targets-twitter-whatsapp-with-new-regulatory-rules/a-56708566">introduced new guidelines</a> to regulate digital content on rapidly growing social media platforms.</p>
<p style="text-align: justify; ">The so-called Intermediary Guidelines are aimed at regulating content on internet platforms such as Twitter and Facebook, making them more accountable to legal requests for the removal of posts and sharing information about the originators of messages.</p>
<p style="text-align: justify; ">Employees at these companies can be held criminally liable for not complying with the government's requests.</p>
<p style="text-align: justify; ">Large social media firms must also set up mechanisms to address grievances and appoint executives to liaise with law enforcement under the new rules, as well as appoint an India-based compliance officer who would be held criminally liable for the content on their platforms.</p>
<p style="text-align: justify; ">The Indian government says the rules empower "users who become victims of defamation, morphed images, sexual abuse," among other online crimes. It also said that the rules seek to tackle the problem of disinformation.</p>
<p style="text-align: justify; ">But critics fear that the rules could be used to target government opponents and make sure dissidents don't use the platforms.</p>
<p style="text-align: justify; ">Social media companies were expected to comply with the new rules by May 25.</p>
<p style="text-align: justify; ">Some Indian media reports have recently said that Twitter lost its status as an "intermediary" and the legal protection that came with it, due to its failure to comply with the new rules.</p>
<h3 style="text-align: justify; ">Failure to comply and serious implications</h3>
<p style="text-align: justify; ">Apar Gupta, the executive director of the Internet Freedom Foundation, a New Delhi-based digital rights advocacy group, says failure to comply with the rules could threaten Twitter's India operations.</p>
<p style="text-align: justify; ">"Not complying with the rules would pose a real risk to Twitter's operational environment," he told DW.</p>
<p style="text-align: justify; ">"It will need to go to court to defend itself each time criminal prosecutions are launched against it," he added.</p>
<p style="text-align: justify; ">The first case against Twitter was filed last week; it was charged with failing to stop the spread of a video on its platform that allegedly incited "hate and enmity" between two religious groups.</p>
<h3 style="text-align: justify; ">'Heavy censorship'</h3>
<p style="text-align: justify; ">Gupta says adhering to all the government's demands would substantially change Twitter.</p>
<p style="text-align: justify; ">"Absolute compliance would mean heavy censorship of individual tweets, removal of the manipulated media tags, and blocking/suspension of accounts at the government's behest," he said.</p>
<p style="text-align: justify; ">Torsha Sarkar, policy officer at the Bengaluru-based Centre for Internet and Society, fears that Twitter might at times be compelled to overcomply with government demands, threatening user rights.</p>
<p style="text-align: justify; ">"This can be either by over-complying with flawed information requests, thereby selling out its users, or taking down content that offends the majoritarian sensibilities," she told DW.</p>
<p style="text-align: justify; ">Last week, three special rapporteurs appointed by a top UN human rights body expressed "serious concerns" that certain parts of the guidelines "may result in the limiting or infringement of a wide range of human rights."</p>
<p style="text-align: justify; ">They urged New Delhi to review the rules, adding that they did not conform to India's international human rights obligations and could threaten the digital rights of Indians.</p>
<h3 style="text-align: justify; ">Twitter's balancing act</h3>
<p style="text-align: justify; ">It is not the first time that Twitter has been accused of giving in to government pressure to censor content on its platform.</p>
<p style="text-align: justify; ">At the height of the long-running farmer protests, <a href="https://www.dw.com/en/farmer-protests-india-blocks-prominent-twitter-accounts-detains-journalists/a-56411354">Twitter blocked hundreds of tweets</a> and accounts, including the handle of a prominent news magazine. It subsequently unblocked them following public outrage.</p>
<p style="text-align: justify; ">The US company stopped short of complying with demands to block the accounts of activists, politicians and journalists, arguing that such a move would "violate their fundamental right to free expression under Indian law."</p>
<p style="text-align: justify; ">According to local media reports, Twitter's Indian executives were reportedly threatened with fines and imprisonment if the accounts were not taken down.</p>
<h3 style="text-align: justify; ">Special police notify Twitter offices</h3>
<p style="text-align: justify; ">Last month, the labeling of a tweet by a politician from the ruling BJP as "manipulated media" prompted a special unit of the <a href="https://www.dw.com/en/india-police-visit-twitter-offices-over-manipulated-tweet/a-57650193">Delhi police to visit Twitter's offices</a> in the capital and neighboring Gurgaon. Police notified the offices about an investigation into the labeling of the post.</p>
<p style="text-align: justify; ">Twitter India's managing director, Manish Maheswari, was said to have been asked to appear before the police for questioning, according to media reports.</p>
<p style="text-align: justify; ">Some Twitter employees have refused to talk about the ongoing tensions for fear of government reprisals.</p>
<p style="text-align: justify; ">"Such kind of intimidation does not happen every day. (But) Everyone at Twitter India is terrified," people familiar with the matter told DW on the condition of anonymity.</p>
<h3 style="text-align: justify; ">Big Tech vs. sovereign power?</h3>
<p style="text-align: justify; ">Those calling for better regulation of tech giants say transnational <a href="https://www.dw.com/en/india-social-media-conflict/a-57702394">social media companies like Twitter lack accountability</a>, blaming them for the alleged inaction against online abuse and disinformation campaigns.</p>
<p style="text-align: justify; ">"The problem with these rules is that they centralize greater power toward the government without providing for the objective benefit of rights toward users," Gupta said.</p>
<p style="text-align: justify; ">"If Twitter were to comply with these rules, it would make a bad situation worse," he said.</p>
<p style="text-align: justify; ">Twitter is unlikely to ditch a major market such as India.</p>
<p style="text-align: justify; ">Sarkar from the Centre for Internet and Society said, "It might be difficult to say how the powers of big tech are going to collide with sovereign nations, especially in light of flawed legal interventions around the world."</p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/news/dw-june-21-2021-aditya-sharma-twitter-india-troubles-show-tough-path-ahead-for-digital-platforms'>https://cis-india.org/internet-governance/news/dw-june-21-2021-aditya-sharma-twitter-india-troubles-show-tough-path-ahead-for-digital-platforms</a>
</p>
No publisherAditya SharmaSocial MediaInternet GovernanceIntermediary LiabilityInformation Technology2021-06-26T02:54:19ZNews ItemSubmission to the Facebook Oversight Board in Case 2021-008-FB-FBR: Brazil, Health Misinformation and Lockdowns
https://cis-india.org/internet-governance/blog/submission-to-the-facebook-oversight-board-in-case-2021-008-fb-fbr-brazil-health-misinformation-and-lockdowns
<b>In this note, we answer questions set out by the Board, pursuant to case 2021-008-FB-FBR, which concerned a post made by a Brazilian sub-national health official, and raised questions on health misinformation and enforcement of Facebook's community standards. </b>
<h1 style="text-align: justify;" dir="ltr">Background </h1>
<p dir="ltr">The <a href="https://about.fb.com/news/tag/oversight-board/">Oversight Board</a> is an expert body created to exercise oversight over Facebook’s content moderation decisions and enforcement of community guidelines. It is entirely independent from Facebook in its funding and administration and provides decisions on questions of policy as well as individual cases. It can also make recommendations on Facebook’s content policies. Its decisions are binding on Facebook, unless implementing them could violate the law. Accordingly, Facebook <a href="https://transparency.fb.com/oversight/oversight-board-cases/">implements</a> these decisions across identical content with parallel context, when it is technically and operationally possible to do so. </p>
<p dir="ltr">In June 2021, the Board made an <a href="https://oversightboard.com/news/170403765029629-announcement-of-case-2021-008-fb-fbr/">announcement</a> soliciting public comments on case 2021-008-FB-FBR, concerning a Brazilian state level medical council’s post questioning the effectiveness of lockdowns during the COVID-19 pandemic. Specifically, the post noted that lockdowns (i) are ineffective; (ii) lead to an increase in mental disorders, alcohol abuse, drug abuse, economic damage etc.; (iii) are against fundamental rights under the Brazilian Constitution; and (iv) are condemned by the World Health Organisation (“WHO”). These assertions were backed up by two statements (i) an alleged quote by Dr. Nabarro (WHO) stating that “the lockdown does not save lives and makes poor people much poorer”; and (ii) an example of how the Brazilian state of Amazonas had an increase in deaths and hospital admissions after lockdown. Ultimately, the post concluded that effective COVID-19 preventive measures include education campaigns about hygiene measures, use of masks, social distancing, vaccination and extensive monitoring by the government — but never the decision to adopt lockdowns. The post was viewed around 32,000 times and shared over 270 times. It was not reported by anyone. </p>
<p dir="ltr">Facebook did not take any action against the post, since it opined that the post did not violate its community standards. Moreover, the WHO has not advised Facebook to remove claims against lockdowns. In this scenario, Facebook referred the case to the Oversight Board, citing its public importance. </p>
<p dir="ltr">In its announcement, the Board sought answers on the following points: </p>
<ol><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">Whether Facebook’s decision to take no action against the content was consistent with its Community Standards and other policies, including the Misinformation and Harm policy (which sits within the rules on <a href="https://www.facebook.com/communitystandards/credible_violence">Violence and Incitement</a>). </p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">Whether Facebook’s decision to take no action is consistent with the company’s stated values and human rights commitments. </p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">Whether, in this case, Facebook should have considered alternative enforcement measures to removing the content (e.g., the <a href="https://www.facebook.com/communitystandards/false_news">False News</a> Community Standard places an emphasis on “reduce” and “inform,” including: labelling, downranking, providing additional context etc.), and what principles should inform the application of these measures. </p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">How Facebook should treat content posted by the official accounts of national or sub-national level public health authorities, including where it may diverge from official guidance from international health organizations. </p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">Insights on the post’s claims and their potential impact in the context of Brazil, including on national efforts to prevent the spread of COVID-19. </p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">Whether Facebook should create a new Community Standard on health misinformation, as recommended by the Oversight Board in case decision <a href="https://oversightboard.com/decision/FB-XWJQBU9A/">2020-006-FB-FBR</a>.</p>
</li></ol>
<h1 style="text-align: justify;" dir="ltr">Submission to the Board</h1>
<p dir="ltr">Facebook’s decision to take no action against the post is consistent with its (i) <a href="https://www.facebook.com/communitystandards/credible_violence">Violence and Incitement</a> community standard read with the <a href="https://www.facebook.com/help/230764881494641">COVID-19 Policy Updates and Protections</a>; and (ii) <a href="https://www.facebook.com/communitystandards/false_news">False News</a> community standard. Facebook’s <a href="https://about.fb.com/news/2018/08/hard-questions-free-expression/">website</a> as well as all of the Board’s <a href="https://oversightboard.com/decision/FB-6YHRXHZR/">past</a> <a href="https://oversightboard.com/decision/FB-QBJDASCV/">decisions</a> refer to the International Covenant on Civil and Political Rights’ (ICCPR) jurisprudence-based <a href="https://www2.ohchr.org/english/bodies/hrc/docs/gc34.pdf">three-pronged test</a> of legality, legitimate aim, and necessity and proportionality in determining violations of Facebook’s community standards. Facebook must apply the same principles to guide the use of its enforcement actions too, keeping in mind the context, intent, tone and impact of the speech. </p>
<p dir="ltr">First, none of Facebook’s aforementioned rules contains an explicit prohibition on content questioning lockdown effectiveness. There is nothing to indicate that “misinformation”, which is undefined, includes within its scope information about the effectiveness of lockdowns. The World Health Organisation has also not advised against such posts. Applying the principle of legality, no person could reasonably foresee that such content is prohibited. Accordingly, Facebook’s community standards have not been violated. </p>
<p dir="ltr">Second, the post does not meet the threshold of causing “imminent” harm stipulated in the community standards. Case decision <a href="https://oversightboard.com/decision/FB-XWJQBU9A/">2020-006-FB-FBR</a> notes that an assessment of “imminence” is made with reference to factors like context, speaker credibility and language. Presently, the post’s language and tone, including its quoting of experts and case studies, indicate that its intent is to encourage informed, scientific debate on lockdown effectiveness. </p>
<p dir="ltr">Third, Facebook’s false news community standard does not contain any explicit prohibitions. Hence there is no question of its violation. Any decision to the contrary may go against the standard’s stated policy logic of not stifling public discourse, and create a chilling effect on posts questioning lockdown efficacy. This would set a problematic precedent that Facebook will be mandated to implement.</p>
<p dir="ltr">Presently, Facebook cannot remove the post since no community standards have been violated. Facebook must not reduce the post’s circulation either, since this may stifle public discussion around lockdown effectiveness. Further, removal would violate the user’s right to freedom of opinion and expression, as guaranteed by the Universal Declaration of Human Rights (UDHR) and the ICCPR, which are in turn part of Facebook’s Corporate Human Rights Policy. </p>
<p dir="ltr">Instead, Facebook can provide additional context along with the post through its “<a href="https://about.fb.com/news/2018/04/inside-feed-article-context/">related articles</a>” feature, by showing fact-checked articles discussing the benefits of lockdowns. This approach is the most beneficial since (i) it is less restrictive than reducing circulation of the post; and (ii) it balances interests better than taking no action at all, by allowing people to be informed about both sides of the lockdown debate so that they can make their own assessment. </p>
<p dir="ltr">Further, Facebook’s treatment of content posted by official accounts of national or sub-national health authorities should be circumscribed by its updated <a href="https://transparency.fb.com/features/approach-to-newsworthy-content/">Newsworthy Content Policy</a>, and the Board’s decision in case <a href="https://oversightboard.com/decision/FB-691QAMHJ/">2021-001-FB-FBR</a>, which adopted the <a href="https://www.ohchr.org/en/issues/freedomopinion/articles19-20/pages/index.aspx">Rabat Plan of Action</a> to determine whether a restriction on freedom of expression is required to prevent incitement. The Rabat Plan of Action proposes a six-prong test that considers: a) the social and political context, b) status of the speaker, c) intent to incite the audience against a target group, d) content and form of the speech, e) extent of its dissemination and f) likelihood of harm, including imminence. Apart from taking these factors into consideration, Facebook must <a href="https://transparency.fb.com/features/approach-to-newsworthy-content/">perform</a> a balancing test to determine whether the public interest of the information in the post outweighs the risks of harm. </p>
<p dir="ltr">In its decision in case <a href="https://oversightboard.com/decision/FB-XWJQBU9A/">2020-006-FB-FBR</a>, the Board recommended that Facebook: a) set out a clear and accessible Community Standard on health misinformation; b) consolidate and clarify existing rules in one place (including defining key terms such as misinformation); and c) provide "detailed hypotheticals that illustrate the nuances of interpretation and application of [these] rules" to offer further clarity for users. Following this, Facebook has <a href="https://assets.documentcloud.org/documents/20491921/covid-19-response-full.pdf">notified</a> its implementation measures, fully implementing these recommendations and thereby bringing it into compliance.</p>
<p dir="ltr">Finally, Brazil is one of the <a href="https://www.bbc.com/news/world-51235105">worst affected</a> countries in the pandemic. It has also been <a href="https://www.ft.com/content/ea62950e-89c0-4b8b-b458-05c90a55b81f">struggling</a> to combat the spread of fake news during the pandemic. President Bolsonaro has been <a href="https://www.hrw.org/news/2021/01/28/brazil-crackdown-critics-covid-19-response">criticised</a> for <a href="https://www.theguardian.com/commentisfree/2020/feb/07/democracy-and-freedom-of-expression-are-under-threat-in-brazil">curbing free speech</a> by using a dictatorship-era <a href="http://www.iconnectblog.com/2021/02/undemocratic-legislation-to-undermine-freedom-of-speech-in-brazil/">national security law</a>, and questioned on his handling of the pandemic, including his own controversial <a href="https://www.bbc.com/news/world-latin-america-56479614">statements</a> questioning lockdown effectiveness. In such a scenario, the post may be perceived in a political colour rather than as an attempt at scientific discussion. However, it is unlikely that the post will lead to any knee-jerk reactions, since people are already familiar with the lockdown debate, on which much has already been said and done. A post like this, which merely reiterates one side of an ongoing debate, is not likely to cause people to take any action to violate lockdowns.</p>
<p dir="ltr">For detailed explanation on these questions, please see <a class="external-link" href="https://cis-india.org/internet-governance/facebook-oversight-board-submission-brazil">here</a>.</p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/submission-to-the-facebook-oversight-board-in-case-2021-008-fb-fbr-brazil-health-misinformation-and-lockdowns'>https://cis-india.org/internet-governance/blog/submission-to-the-facebook-oversight-board-in-case-2021-008-fb-fbr-brazil-health-misinformation-and-lockdowns</a>
</p>
No publisherTanvi Apte and Torsha SarkarInternet FreedomMisinformationIntermediary LiabilityInformation Technology2021-07-01T07:34:09ZBlog EntryRight to Exclusion, Government Spaces, and Speech
https://cis-india.org/internet-governance/blog/right-to-exclusion-government-spaces-and-speech
<b>The conclusion of the litigation surrounding Trump blocking his critics on Twitter brings to the forefront two less-discussed aspects of intermediary liability: a) whether social media platforms could be compelled to ‘carry’ speech under any established legal principles, thereby limiting their right to exclude users or speech, and b) whether users have a constitutional right to access the social media spaces of elected officials. This essay analyzes these issues under American law, and draws parallels for India in light of the ongoing litigation around the suspension of advocate Sanjay Hegde’s Twitter account.</b>
<p> </p>
<p>This article first appeared on the Indian Journal of Law and Technology (IJLT) blog, and can be accessed <a class="external-link" href="https://www.ijlt.in/post/right-to-exclusion-government-controlled-spaces-and-speech">here</a>. Cross-posted with permission. </p>
<p>---</p>
<h2><span class="s1">Introduction</span></h2>
<p class="p2"><span class="s1">On April 8, the Supreme Court of the United States (SCOTUS) vacated the judgment of the US Court of Appeals for the Second Circuit in <a href="https://int.nyt.com/data/documenthelper/1365-trump-twitter-second-circuit-r/c0f4e0701b087dab9b43/optimized/full.pdf%23page=1"><span class="s2"><em>Knight First Amendment Institute v Trump</em></span></a>. In that case, the Court of Appeals had precluded Donald Trump, then-POTUS, from blocking his critics from his Twitter account on the ground that such action amounted to the erosion of the constitutional rights of his critics. The Court of Appeals had held that his use of @realDonaldTrump in his official capacity had transformed the nature of the account from private to public, and therefore, blocking users he disagreed with amounted to viewpoint discrimination, something that was incompatible with the First Amendment.<span class="Apple-converted-space"> </span></span></p>
<p class="p3"><span class="s1"></span></p>
<p class="p2"><span class="s1">The SCOTUS <a href="https://www.supremecourt.gov/opinions/20pdf/20-197_5ie6.pdf"><span class="s2">ordered</span></a> the case to be dismissed as moot, on account of Trump no longer being in office. Justice Clarence Thomas issued a ten-page concurrence that went into additional depth regarding the nature of social media platforms and user rights. It must be noted that the concurrence does not hold any direct precedential weightage, since Justice Thomas was not joined by any of his colleagues on the bench for the opinion. However, given that similar questions of public import are currently being deliberated in the ongoing <em>Sanjay Hegde</em> <a href="https://www.barandbench.com/news/litigation/delhi-high-court-sanjay-hegde-challenge-suspension-twitter-account-hearing-july-8"><span class="s2">litigation</span></a> in the Delhi High Court, Justice Thomas’ concurrence might hold some persuasive weightage in India. While the facts of these litigations might be starkly different, both of them are nevertheless characterized by important questions of applying constitutional doctrines to private parties like Twitter and the supposedly ‘public’ nature of social media platforms.<span class="Apple-converted-space"> </span></span></p>
<p class="p3"><span class="s1"></span></p>
<p class="p4"><span class="s1">In this essay, we consider the legal questions raised in the opinion as possible learnings for India. In the first part, we analyze the key points raised by Justice Thomas, vis-a-vis the American legal position on intermediary liability and freedom of speech. In the second part, we apply these deliberations to the <em>Sanjay Hegde </em>litigation, as a case-study and a roadmap for future legal jurisprudence to be developed.<span class="Apple-converted-space"> </span></span></p>
<h2><span class="s1">A flawed analogy</span></h2>
<p class="p2"><span class="s1">At the outset, let us briefly refresh the timeline of Trump’s tryst with Twitter, and the history of this litigation: the Court of Appeals decision was <a href="https://int.nyt.com/data/documenthelper/1365-trump-twitter-second-circuit-r/c0f4e0701b087dab9b43/optimized/full.pdf%23page=1"><span class="s2">issued</span></a> in 2019, when Trump was still in office. After the November 2020 presidential election, in which he was voted out, his supporters <a href="https://indianexpress.com/article/explained/us-capitol-hill-siege-explained-7136632/"><span class="s2">broke</span></a> into Capitol Hill. Much of the blame for the attack was pinned on Trump’s use of social media channels (including Twitter) to instigate the violence and following this, Twitter <a href="https://blog.twitter.com/en_us/topics/company/2020/suspension"><span class="s2">suspended</span></a> his account permanently.<span class="Apple-converted-space"> </span></span></p>
<p class="p3"><span class="s1"></span></p>
<p class="p2"><span class="s1">It is this final fact that seized Justice Thomas’ reasoning. He noted that the power of a private party like Twitter to do away with Trump’s account altogether was at odds with the Court of Appeals’ earlier finding about the public nature of the account. He deployed a hotel analogy to justify this: government officials renting a hotel room for a public hearing on regulation could not kick out a dissenter, but if the same officials gather informally in the hotel lounge, then they would be within their rights to ask the hotel to kick out a heckler. The difference between the two situations would be that, <em>“the government controls the space in the first scenario, the hotel, in the latter.” </em>He noted that Twitter’s conduct was similar to the second situation, where it “<em>control(s) the avenues for speech</em>”. Accordingly, he dismissed the idea that the original respondents (the users whose accounts were blocked) had any First Amendment claims against Trump’s initial blocking action, since the ultimate control of the ‘avenue’ was with Twitter, and not Trump.<span class="Apple-converted-space"> </span></span></p>
<p class="p3"><span class="s1"></span></p>
<p class="p4"><span class="s1">In the facts of the case however, this analogy was not justified. The Court of Appeals had not concerned itself with the question of private ‘control’ of entire social media spaces, and given the timeline of the litigation, it was impossible for them to pre-empt such considerations within the judgment. In fact, the only takeaway from the original decision had been that an elected representative’s utilization of his social media account for official purposes transformed </span><span class="s3">only that particular space</span><span class="s1"><em> </em>into a public forum where constitutional rights would find applicability. In delving into questions of ‘control’ and ‘avenues of speech’, issues that had been previously unexplored, Justice Thomas conflates a rather specific point into a much bigger, general conundrum. Further deliberations in the concurrence are accordingly put forward upon this flawed premise.<span class="Apple-converted-space"> </span></span></p>
<h2><span class="s1">Right to exclusion (and must carry claims)</span></h2>
<p class="p2"><span class="s1">From here, Justice Thomas identified the problem to be “<em>private, concentrated control over online content and platforms available to the public</em>”, and brought forth two alternate regulatory systems — common carrier and public accommodation — to argue for ‘equal access’ over social media space. He posited that successful application of either of the two analogies would effectively restrict a social media platform’s right to exclude its users, and “<em>an answer may arise for dissatisfied platform users who would appreciate not being blocked</em>”. Essentially, this would mean that platforms would be obligated to carry <em>all </em>forms of (presumably) legal speech, and users would be entitled to sue platforms in case they feel their content has been unfairly taken down, a phenomenon Daphne Keller <a href="http://cyberlaw.stanford.edu/blog/2018/09/why-dc-pundits-must-carry-claims-are-relevant-global-censorship"><span class="s2">describes</span></a> as ‘must carry claims’.<span class="Apple-converted-space"> </span></span></p>
<p class="p3"><span class="s1"></span></p>
<p class="p2"><span class="s1">Again, this is a strange direction for the argument to take, since the original facts of the case were not about ‘<em>dissatisfied platform users</em>’, but an elected representative’s account being used in the dissemination of official information. Beyond the initial ‘private’ control deliberation, Justice Thomas did not seem interested in exploring this original legal position, and instead emphasized analogizing social media platforms in order to enforce ‘equal access’, finally arriving at a position that would be legally untenable in the USA.<span class="Apple-converted-space"> </span></span></p>
<p class="p3"><span class="s1"></span></p>
<p class="p4"><span class="s1">The American law on intermediary liability, as embodied in Section 230 of the Communications Decency Act (CDA), has two key components: first, intermediaries are <a href="https://www.eff.org/issues/cda230"><span class="s2">protected</span></a> against the contents posted by its users, under a legal model <a href="https://www.article19.org/wp-content/uploads/2018/02/Intermediaries_ENGLISH.pdf"><span class="s2">termed</span></a> as ‘broad immunity’, and second, an intermediary does not stand to lose its immunity if it chooses to moderate and remove speech it finds objectionable, popularly <a href="https://intpolicydigest.org/section-230-how-it-actually-works-what-might-change-and-how-that-could-affect-you/"><span class="s2">known</span></a> as the Good Samaritan protection. It is the effect of these two components, combined, that allows platforms to take calls on what to remove and what to keep, translating into a ‘right to exclusion’. Legally compelling them to carry speech, under the garb of ‘access’ would therefore, strike at the heart of the protection granted by the CDA.<span class="Apple-converted-space"> </span></span></p>
<h2><span class="s1">Learnings for India</span></h2>
<p class="p2"><span class="s1">In his petition to the Delhi High Court, Sanjay Hegde, a Senior Advocate at the Supreme Court, contested that the suspension of his Twitter account, on the grounds of his sharing anti-authoritarian imagery, was arbitrary and that:<span class="Apple-converted-space"> </span></span></p>
<ol style="list-style-type: lower-alpha;" class="ol1"><li class="li2"><span class="s1">Twitter was carrying out a public function and would be therefore amenable to writ jurisdiction under Article 226 of the Indian Constitution; and</span></li><li class="li2"><span class="s1">The suspension of his account had amounted to a violation of his right to freedom of speech and expression under Article 19(1)(a) and his rights to assembly and association under Article 19(1)(b) and 19(1)(c); and</span></li><li class="li2"><span class="s1">The government has a positive obligation to ensure that any censorship on social media platforms is done in accordance with Article 19(2).<span class="Apple-converted-space"> </span></span></li></ol>
<p class="p3"><span class="s1"></span></p>
<p class="p5"><span class="s1">The first two prongs of the original petition are perhaps easily disputed: as previous <a href="https://indconlawphil.wordpress.com/2020/01/28/guest-post-social-media-public-forums-and-the-freedom-of-speech-ii/"><span class="s2">commentary</span></a> has pointed out, existing Indian constitutional jurisprudence on ‘public function’ does not implicate Twitter, and accordingly, it would be difficult to make out a case that account suspensions, no matter how arbitrary, would amount to a violation of the user’s fundamental rights. It is the third contention that requires some additional insight in the context of our previous discussion.<span class="Apple-converted-space"> </span></span></p>
<h3><span class="s1">Does the Indian legal system support a right to exclusion?<span class="Apple-converted-space"> </span></span></h3>
<p class="p2"><span class="s1">Suing Twitter to reinstate a suspended account, on the ground that such suspension was arbitrary and illegal, is in its essence a request to limit Twitter’s right to exclude its users. The petition serves as an example of a must-carry claim in the Indian context and vindicates Justice Thomas’ (misplaced) defence of ‘<em>dissatisfied platform users</em>’. Legally, such claims perhaps have a better chance of succeeding here, since the expansive protection granted to intermediaries via Section 230 of the CDA is noticeably absent in India. Instead, intermediaries are bound by conditional immunity, where availment of a ‘safe harbour’, i.e., exemption from liability, is contingent on fulfilment of statutory conditions made under <a href="https://indiankanoon.org/doc/844026/"><span class="s2">section 79</span></a> of the Information Technology (IT) Act and the rules made thereunder. Interestingly, in his opinion, Justice Thomas briefly considered a situation where the immunity under Section 230 was made conditional: to gain Good Samaritan protection, platforms might be required to fulfil specific conditions, including ‘nondiscrimination’. This is controversial (and, as commentators have noted, <a href="https://www.lawfareblog.com/justice-thomas-gives-congress-advice-social-media-regulation"><span class="s2">wrong</span></a>), since it has the potential to whittle down the US' ‘broad immunity’ model of intermediary liability to a system that would resemble the Indian one.<span class="Apple-converted-space"> </span></span></p>
<p class="p3"><span class="s1"></span></p>
<p class="p2"><span class="s1">It is worth noting that in the newly issued Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, proviso to Rule 3(1)(d) allows for “<em>the removal or disabling of access to any information, data or communication link [...] under clause (b) on a voluntary basis, or on the basis of grievances received under sub-rule (2) [...]</em>” without dilution of statutory immunity. This does provide intermediaries a right to exclude, albeit limited, since its scope is restricted to content removed under the operation of specific sub-clauses within the rules, as opposed to Section 230, which is couched in more general terms. Of course, none of this precludes the government from further prescribing obligations similar to those prayed in the petition.<span class="Apple-converted-space"> </span></span></p>
<p class="p3"><span class="s1"></span></p>
<p class="p2"><span class="s1">On the other hand, it is a difficult proposition to support that Twitter’s right to exclusion should be circumscribed by the Constitution, as prayed. In the petition, this argument is built on the judgment in <a href="https://indiankanoon.org/doc/110813550/"><span class="s2"><em>Shreya Singhal v Union of India</em></span></a>, where it was held that takedowns under section 79 are to be done only on receipt of a court order or a government notification, and that the scope of the order would be restricted to Article 19(2). This, in the petitioner’s reading, meant that “<em>any suo-motu takedown of material by intermediaries must conform to Article 19(2)</em>”.</span></p>
<p class="p3"><span class="s1"></span></p>
<p class="p2"><span class="s1">To understand why this argument does not work, it is important to consider the context in which the <em>Shreya Singhal </em>judgment was issued. Previously, intermediary liability was governed by the Information Technology (Intermediaries Guidelines) Rules, 2011 issued under section 79 of the IT Act. Rule 3(4) made provisions for sending takedown orders to the intermediary, and the prerogative to send such orders was on ‘<em>an affected person</em>’. On receipt of these orders, the intermediary was bound to remove content and neither the intermediary nor the user whose content was being censored, had the opportunity to dispute the takedown.</span></p>
<p class="p3"><span class="s1"></span></p>
<p class="p2"><span class="s1">As a result, the potential for misuse was wide-open. Rishabh Dara’s <a href="https://cis-india.org/internet-governance/intermediary-liability-in-india.pdf"><span class="s2">research</span></a> provided empirical evidence for this; intermediaries were found to act on flawed takedown orders, on the apprehension of being sanctioned under the law, essentially chilling free expression online. The <em>Shreya Singhal</em> judgment, in essence, reined in this misuse by stating that an intermediary is legally obliged to act <em>only when </em>a takedown order is sent by the government or the court. The intent of this was, in the court’s words: “<em>it would be very difficult for intermediaries [...] to act when millions of requests are made and the intermediary is then to judge as to which of such requests are legitimate and which are not.</em>”<span class="Apple-converted-space"> </span></span></p>
<p class="p3"><span class="s1"></span></p>
<p class="p5"><span class="s1">In light of this, if Hegde’s petition succeeds, it would mean that intermediaries would now be obligated to subsume the entirety of Article 19(2) jurisprudence in their decision-making, interpret and apply it perfectly, and be open to petitions from users when they fail to do so. This might be a startling undoing of the court’s original intent in <em>Shreya Singhal</em>. Such a reading also means limiting an intermediary’s prerogative to remove speech that may not necessarily fall within the scope of Article 19(2), but is still systemically problematic, including unsolicited commercial communications. Further, most platforms today are dealing with an unprecedented spread and consumption of harmful, misleading information. By limiting their right to exclude speech in this manner, we might be <a href="https://www.hoover.org/sites/default/files/research/docs/who-do-you-sue-state-and-platform-hybrid-power-over-online-speech_0.pdf"><span class="s2">exacerbating</span></a> this problem. <span class="Apple-converted-space"> </span></span></p>
<h3><span class="s1">Government-controlled spaces on social media platforms</span></h3>
<p class="p2"><span class="s1">On the other hand, the original finding of the Court of Appeals, regarding the public nature of an elected representative’s social media account and First Amendment rights of the people to access such an account, might yet still prove instructive for India. While the primary SCOTUS order erases the precedential weight of the original case, there have been similar judgments issued by other courts in the USA, including by the <a href="https://globalfreedomofexpression.columbia.edu/cases/davison-v-randall/"><span class="s2">Fourth Circuit</span></a> court and as a result of a <a href="https://knightcolumbia.org/content/texas-attorney-general-unblocks-twitter-critics-in-knight-institute-v-paxton"><span class="s2">lawsuit</span></a> against a Texas Attorney General.<span class="Apple-converted-space"> </span></span></p>
<p class="p4"><span class="s1">A similar situation can be envisaged in India as well. The Supreme Court has <a href="https://indiankanoon.org/doc/591481/"><span class="s2">repeatedly</span></a> <a href="https://indiankanoon.org/doc/27775458/"><span class="s2">held</span></a> that Article 19(1)(a) encompasses not just the right to disseminate information, but also the right to <em>receive </em>information, including <a href="https://indiankanoon.org/doc/438670/"><span class="s2">receiving</span></a> information on matters of public concern. Additionally, in <a href="https://indiankanoon.org/doc/539407/"><span class="s2"><em>Secretary, Ministry of Information and Broadcasting v Cricket Association of Bengal</em></span></a>, the Court held that the right of dissemination included the right of communication through any media: print, electronic or audio-visual. If we then assume that government-controlled spaces on social media platforms, used in the dissemination of official functions, are ‘public spaces’, the government’s denial of public access to such spaces can be construed as a violation of Article 19(1)(a).</span></p>
<h2><span class="s1">Conclusion</span></h2>
<p class="p2"><span class="s1">As indicated earlier, despite the facts of the two litigations being different, the legal questions embodied within converge startlingly, inasmuch as both are examples of the growing discontent around the power wielded by social media platforms, and of the flawed attempts at fixing it.</span></p>
<p class="p2"><span class="s1">While the above discussion might throw some light on the relationship between an individual, the state and social media platforms, many questions continue to remain unanswered. For instance, once we establish that users have a fundamental right to access certain spaces within the social media platform, does the platform have a right to remove that space altogether? If it does so, can a constitutional remedy be made against the platform? Initial <a href="https://indconlawphil.wordpress.com/2018/07/01/guest-post-social-media-public-forums-and-the-freedom-of-speech/"><span class="s2">commentary</span></a> on the Court of Appeals’ decision had contended that the takeaway from that judgment was that constitutional norms had primacy over the platform’s own norms of governance. In such light, would the platform be constitutionally obligated to <em>not </em>suspend a government account, even if the content on such an account continues to be harmful, in violation of its own moderation standards?</span></p>
<p class="p2"><span class="s1">This is an incredibly tricky dimension of the law, made trickier still by the dynamic nature of the platforms, the intense political interests permeating the need for governance, and the impact on users in the instance of a flawed solution. Continuous engagement, scholarship and an emphasis on a human rights-respecting framework underpinning the regulatory system are the only ways forward.</span></p>
<p class="p2"><span class="s1"><span class="Apple-converted-space">---</span></span></p>
<p>The author would like to thank Gurshabad Grover and Arindrajit Basu for reviewing this piece. </p>
<div> </div>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/right-to-exclusion-government-spaces-and-speech'>https://cis-india.org/internet-governance/blog/right-to-exclusion-government-spaces-and-speech</a>
</p>
No publisher. Author: TorShark. Tags: Freedom of Speech and Expression, Intermediary Liability, Information Technology. 2021-07-02T12:05:13Z. Blog Entry.
Intermediary Liability in India: Chilling Effects on Free Expression on the Internet 2011
https://cis-india.org/internet-governance/intermediary-liability-in-india
<b>Intermediaries are widely recognised as essential cogs in the wheel of exercising the right to freedom of expression on the Internet. Most major jurisdictions around the world have introduced legislation for limiting intermediary liability in order to ensure that this wheel does not stop spinning. With the 2008 amendment of the Information Technology Act 2000, India joined the bandwagon and established a ‘notice and takedown’ regime for limiting intermediary liability.</b>
<p>On the 11th of April 2011, the Government of India notified the Information Technology (Intermediaries Guidelines) Rules 2011 that prescribe, amongst other things, guidelines for administration of takedowns by intermediaries. The Rules have been criticised extensively by both national and international media. The media has projected that the Rules, contrary to the objective of promoting free expression, seem to encourage privately administered injunctions to censor and chill free expression. On the other hand, the Government has responded through press releases and assured that the Rules in their current form do not violate the principle of freedom of expression or allow the government to regulate content.</p>
<p>This study has been conducted with the objective of determining whether the criteria, procedure and safeguards for administration of the takedowns as prescribed by the Rules lead to a chilling effect on online free expression. In the course of the study, takedown notices were sent to a sample comprising 7 prominent intermediaries and their responses to the notices were documented. Different policy factors were permuted in the takedown notices in order to understand at what points in the takedown process free expression is being chilled.</p>
<p>The results of the paper clearly demonstrate that the Rules indeed have a chilling effect on free expression. Specifically, the Rules create uncertainty in the criteria and procedure for administering the takedown thereby inducing the intermediaries to err on the side of caution and over-comply with takedown notices in order to limit their liability and as a result suppress legitimate expressions. Additionally, the Rules do not establish sufficient safeguards to prevent misuse and abuse of the takedown process to suppress legitimate expressions.</p>
<p>Of the 7 intermediaries to which takedown notices were sent, 6 over-complied with the notices, despite the apparent flaws in them. From the responses to the takedown notices, it can be reasonably presumed that not all intermediaries have sufficient legal competence or resources to deliberate on the legality of an expression. Even if an intermediary has sufficient legal competence, it tends to prioritise the allocation of its legal resources according to the commercial importance of the impugned expressions. Further, if such subjective determination is required to be made in a limited timeframe and in the absence of adequate facts and circumstances, the intermediary mechanically (without application of mind or proper judgement) complies with the takedown notice.</p>
<p>The results also demonstrate that the Rules are procedurally flawed as they ignore all elements of natural justice. The third party provider of information whose expression is censored is not informed about the takedown, let alone given an opportunity to be heard before or after the takedown. There is also no recourse to have the removed information put back or restored. The intermediary is under no obligation to provide a reasoned decision for rejecting or accepting a takedown notice. The Rules in their current form clearly tilt the takedown mechanism in favour of the complainant and against the creator of expression.</p>
<table class="plain">
<tbody>
<tr>
<td>The research highlights the need to:<br />
<ul>
<li>increase the safeguards against misuse of the privately administered takedown regime;</li>
<li>reduce the uncertainty in the criteria for administering the takedown;</li>
<li>reduce the uncertainty in the procedure for administering the takedown;</li>
<li>include various elements of natural justice in the procedure for administering the takedown; and</li>
<li>replace the requirement for subjective legal determination by intermediaries with an objective test.</li>
</ul>
</td>
</tr>
</tbody>
</table>
<hr />
This executive summary is a research output of the Google Policy Fellowship 2011. The Centre for Internet & Society was the host organization. For the entire paper along with references, please write to <a class="external-link" href="mailto:rishabhdara@gmail.com">rishabhdara@gmail.com</a> or <a class="external-link" href="mailto:sunil@cis-india.org">sunil@cis-india.org</a>.
<p>
For more details visit <a href='https://cis-india.org/internet-governance/intermediary-liability-in-india'>https://cis-india.org/internet-governance/intermediary-liability-in-india</a>
</p>
No publisher. Author: Rishabh Dara. Tags: Freedom of Speech and Expression, Internet Governance, Intermediary Liability, Censorship. 2012-04-21T18:05:58Z. Blog Entry.
An Evidence based Intermediary Liability Policy Framework: Workshop at IGF
https://cis-india.org/internet-governance/blog/igf-workshop-an-evidence-based-intermediary-liability-policy-framework
<b>CIS is organising a workshop at the Internet Governance Forum 2014. The workshop will be an opportunity to present and discuss ongoing research on the changing definition of intermediaries and their responsibilities across jurisdictions and technologies and contribute to a comprehensible framework for liability that is consistent with the capacity of the intermediary and with international human-rights standards.</b>
<p style="text-align: justify; ">The Centre for Internet and Society, India and the Centre for Internet and Society, Stanford Law School, USA, will be organising a workshop to analyse the role of intermediary platforms in relation to freedom of expression, freedom of information and freedom of association at the Internet Governance Forum 2014. <span>The aim of the workshop is to highlight the increasing importance of digital rights and broad legal protections of stakeholders in an increasingly knowledge-based economy. The workshop will discuss public policy issues associated with Internet intermediaries, in particular their roles, legal responsibilities and related liability limitations in the context of the evolving nature and role of intermediaries in the Internet ecosystem.</span></p>
<p style="text-align: justify; "><b>Online Intermediaries: Setting the context</b></p>
<p style="text-align: justify; ">The Internet has facilitated unprecedented access to information and amplified avenues for expression and engagement by removing the limits of geographic boundaries and enabling diverse sources of information and online communities to coexist. Against the backdrop of a broadening base of users, the role of intermediaries that enable economic, social and political interactions between users in global networked communication is ubiquitous. Intermediaries are essential to the functioning of the Internet, as many producers and consumers of content on the internet rely on the action of some third party, the so-called intermediary. Such intermediation ranges from the mere provision of connectivity to more advanced services, such as providing online storage spaces for data, acting as platforms for storage and sharing of user generated content (UGC), or providing links to other internet content.</p>
<p style="text-align: justify; ">Online intermediaries enhance economic activity by reducing costs, inducing competition by lowering the barriers for participation in the knowledge economy and fuelling innovation through their contribution to the wider ICT sector as well as through their key role in operating and maintaining Internet infrastructure to meet the network capacity demands of new applications and of an expanding base of users.</p>
<p style="text-align: justify; ">Intermediary platforms also provide social benefits, by empowering users and improving choice through social and participative networks, or web services that enable creativity and collaboration amongst individuals. By enabling platforms for self-expression and cooperation, intermediaries also play a critical role in establishing digital trust, protection of human rights such as freedom of speech and expression, privacy and upholding fundamental values such as freedom and democracy.</p>
<p style="text-align: justify; ">However, the economic and social benefits of online intermediaries are conditional on a framework for the protection of intermediaries against legal liability for the communication and distribution of content which they enable.</p>
<p style="text-align: justify; "><b>Intermediary Liability</b></p>
<p style="text-align: justify; ">Over the last decade, right holders, service providers and Internet users have been locked in a debate on the potential liability of online intermediaries. The debate has raised global concerns on issues such as the extent to which Internet intermediaries should be held responsible for content produced by third parties using their Internet infrastructure, and how the resultant liability would affect online innovation and the free flow of knowledge in the information economy.</p>
<p style="text-align: justify; ">Given the impact of their services on communications, intermediaries find themselves either directly liable for their actions, or indirectly (or “secondarily”) liable for the actions of their users. Requiring intermediaries to monitor the legality of online content poses an insurmountable task. Even if monitoring the legality of content against all applicable legislation were possible, the costs of doing so would be prohibitively high. Therefore, placing liability on intermediaries can deter their willingness and ability to provide services, hindering the development of the internet itself.</p>
<p style="text-align: justify; ">The economics of intermediaries depend on scale; the cost of evaluating the legality of an individual post exceeds the profit from hosting the speech, and in the absence of judicial oversight this can lead to a private censorship regime. Intermediaries that are liable for content or face legal exposure have powerful incentives to police content and limit user activity to protect themselves. The result is the curtailing of legitimate expression, especially where the obligations relating to, and the definition of, illegal content are vague. Content policing mandates impose significant compliance costs, limiting the innovation and competitiveness of such platforms.</p>
<p style="text-align: justify; ">More importantly, placing liability on intermediaries has a chilling effect on freedom of expression online. Gatekeeping obligations on service providers threaten democratic participation and the expression of views online, limiting the potential of individuals and restricting freedoms. Imposing liability can also indirectly lead to the death of anonymity and pseudonymity, pervasive surveillance of users' activities and extensive collection of users' data, and would ultimately undermine digital trust between stakeholders.</p>
<p style="text-align: justify; ">Thus, effectively, imposing liability on intermediaries creates a chilling effect on Internet activity and speech, creates new barriers to innovation and stifles the Internet's potential to promote broader economic and social gains. To avoid these issues, legislators have defined 'safe harbours', limiting the liability of intermediaries under specific circumstances.</p>
<p style="text-align: justify; ">Online intermediaries do not have direct control over what information is exchanged via their platforms and might not be aware of illegal content per se. A key framework for online intermediaries, such limited liability regimes provide exceptions from liability rules for third party intermediaries to address this asymmetry of information between content producers and intermediaries.</p>
<p style="text-align: justify; ">However, it is important to note that significant differences exist concerning the subjects of these limitations, the scope of their provisions, and their procedures and modes of operation. The 'notice and takedown' procedures are at the heart of the safe harbour model and can be subdivided into two approaches:</p>
<p style="text-align: justify; ">a. Vertical approach, where the liability regime applies to specific types of content, exemplified by the US Digital Millennium Copyright Act</p>
<p style="text-align: justify; ">b. Horizontal approach, based on the E-Commerce Directive (ECD), where different levels of immunity are granted depending on the type of activity at issue</p>
<p style="text-align: justify; "><b>Current framework </b></p>
<p style="text-align: justify; ">Globally, three broad but distinct models of liability for intermediaries have emerged within the Internet ecosystem:</p>
<p style="text-align: justify; ">1. Strict liability model, under which intermediaries are liable for third party content, used in countries such as China and Thailand</p>
<p style="text-align: justify; ">2. Safe harbour model, granting intermediaries immunity provided they comply with certain requirements</p>
<p style="text-align: justify; ">3. Broad immunity model, which grants intermediaries broad immunity from liability for third party content and exempts them from any general requirement to monitor content.</p>
<p style="text-align: justify; ">While the models described above can provide useful guidance for the drafting or improvement of current legislation, they are limited in their scope and application, as they fail to account for the different roles and functions of intermediaries. Legislators and courts are facing increasing difficulties in interpreting these regulations and adapting them to a new economic and technical landscape that involves unprecedented levels of user generated content and new kinds of online intermediaries.</p>
<p style="text-align: justify; ">The nature and role of intermediaries change considerably across jurisdictions, and in relation to social, economic and technical contexts. In addition to the dynamic nature of intermediaries, the different categories of Internet intermediaries are frequently not clear-cut, with actors often playing more than one intermediation role. Several of these intermediaries offer a variety of products and services and may have a number of roles; conversely, several of them perform the same function. For example, blogs, video services and social media platforms are considered to be 'hosts', while search engine providers have been treated as both 'hosts' and 'technical providers'.</p>
<p style="text-align: justify; ">The limitations of existing models in recognising that different types of intermediaries perform different functions or roles, and therefore should bear different liability, pose an interesting area for research and global deliberation. Establishing a classification of intermediaries will also help analyse existing patterns of influence in relation to content, for example when the removal of content by upstream intermediaries results in undue over-blocking.</p>
<p style="text-align: justify; ">Distinguishing intermediaries on the basis of their roles and functions in the Internet ecosystem is critical to ensuring a balanced system of liability and addressing concerns for freedom of expression. Rather than the highly abstracted view of intermediaries as providing a single unified service of connecting third parties, the definition of intermediaries must expand to include the specific role and function they have in relation to users' rights. A successful intermediary liability regime must balance the needs of producers, consumers, affected parties and law enforcement, address the risk of abuses for political or commercial purposes, safeguard human rights and contribute to the evolution of uniform principles and safeguards.</p>
<p style="text-align: justify; "><b>Towards an evidence based intermediary liability policy framework</b></p>
<p style="text-align: justify; ">This workshop aims to bring together leading representatives from a broad spectrum of stakeholder groups to discuss liability related issues and ways to enhance Internet users’ trust.</p>
<p style="text-align: justify; ">Questions to address at the panel include:</p>
<p style="text-align: justify; ">1. What are the varying definitions of intermediaries across jurisdictions?</p>
<p style="text-align: justify; ">2. What are the specific roles and functions that allow for classification of intermediaries?</p>
<p style="text-align: justify; ">3. How can we ensure the legal framework keeps pace with technological advances and the changing roles of intermediaries?</p>
<p style="text-align: justify; ">4. What are the gaps in existing models in balancing innovation, economic growth and human rights?</p>
<p style="text-align: justify; ">5. What could be the respective role of law and industry self-regulation in enhancing trust?</p>
<p style="text-align: justify; ">6. How can we enhance multi-stakeholder cooperation in this space?</p>
<p style="text-align: justify; ">Confirmed Panel:</p>
<p style="text-align: justify; ">Technical Community: Malcolm Hutty: Internet Service Providers Association (ISPA)<br />Civil Society: Gabrielle Guillemin: Article19<br />Academic: Nicolo Zingales: Assistant Professor of Law at Tilburg University<br />Intergovernmental: Rebecca MacKinnon: Consent of the Networked, UNESCO project<br />Civil Society: Anriette Esterhuysen: Association for Progressive Communications (APC)<br />Civil Society: Francisco Vera: Advocacy Director: Derechos Digitales<br />Private Sector: Titi Akinsanmi: Policy and Government Relations Manager, Google Sub-Saharan Africa<br />Legal: Martin Husovec: Max Planck Institute</p>
<p style="text-align: justify; "><span>Moderator(s): </span><span>Giancarlo Frosio, Centre for Internet and Society (CIS) and </span><span>Jeremy Malcolm, Electronic Frontier Foundation </span></p>
<p style="text-align: justify; "><span><span>Remote Moderator: </span><span>Anubha Sinha, New Delhi</span></span></p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/igf-workshop-an-evidence-based-intermediary-liability-policy-framework'>https://cis-india.org/internet-governance/blog/igf-workshop-an-evidence-based-intermediary-liability-policy-framework</a>
</p>
No publisher. Author: jyoti. Tags: human rights, Digital Governance, internet governance, Freedom of Speech and Expression, Internet Governance Forum, Human Rights Online, Intermediary Liability, Policies, Multi-stakeholder. 2014-07-04T06:41:10Z. Blog Entry.
On the legality and constitutionality of the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021
https://cis-india.org/internet-governance/blog/on-the-legality-and-constitutionality-of-the-information-technology-intermediary-guidelines-and-digital-media-ethics-code-rules-2021
<b>This note examines the legality and constitutionality of the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021. The analysis is consistent with previous work carried out by CIS on issues of intermediary liability and freedom of expression. </b>
<p dir="ltr">On 25 February 2021, the Ministry of Electronics and Information Technology (Meity) notified the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (hereinafter, ‘the rules’). In this note, we examine whether the rules meet the tests of constitutionality under Indian jurisprudence, whether they are consistent with the parent Act, and discuss potential benefits and harms that may arise from the rules as they are currently framed. Further, we make some recommendations to amend the rules so that they stay within constitutional bounds, and are consistent with a human rights-based approach to content regulation. Please note that we cover some of the issues that CIS has already highlighted in comments on previous versions of the rules.</p>
<p dir="ltr">The note can be downloaded <a class="external-link" href="https://cis-india.org/internet-governance/legality-constitutionality-il-rules-digital-media-2021">here</a>.</p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/on-the-legality-and-constitutionality-of-the-information-technology-intermediary-guidelines-and-digital-media-ethics-code-rules-2021'>https://cis-india.org/internet-governance/blog/on-the-legality-and-constitutionality-of-the-information-technology-intermediary-guidelines-and-digital-media-ethics-code-rules-2021</a>
</p>
No publisher. Authors: Torsha Sarkar, Gurshabad Grover, Raghav Ahooja, Pallavi Bedi and Divyank Katira. Tags: Freedom of Speech and Expression, Internet Governance, Intermediary Liability, Internet Freedom, Information Technology. 2021-06-21T11:52:39Z. Blog Entry.
Intermediary Liability Resources
https://cis-india.org/internet-governance/blog/intermediary-liability-resources
<b>We bring you a list of intermediary resources as part of research on internet governance. This blog post will be updated on an ongoing basis.</b>
<ol>
<li style="text-align: justify; "><b>Shielding the Messengers: Protecting Platforms for Expression and Innovation. </b>The Centre for Democracy and Technology. December 2012, available at: <a href="https://www.cdt.org/files/pdfs/CDT-Intermediary-Liability-2012.pdf">https://www.cdt.org/files/pdfs/CDT-Intermediary-Liability-2012.pdf</a>: This paper analyses the impact that intermediary liability regimes have on freedom of expression, privacy, and innovation. In doing so, the paper highlights different models of intermediary liability regimes, reviews different technological means of restricting access to content, and provides recommendations for intermediary liability regimes as well as alternative ways of addressing illegal content online.</li>
<li style="text-align: justify; "><b>Internet Intermediaries: Dilemma of Liability:</b> Article 19. 2013, available at: <a href="http://www.article19.org/data/files/Intermediaries_ENGLISH.pdf">http://www.article19.org/data/files/Intermediaries_ENGLISH.pdf</a>: This Policy Document reviews different components of intermediary liability and highlights the challenges and risks that current models of liability pose to online freedom of expression. Relying on international standards for freedom of expression and comparative law, the document includes recommendations and alternative models that provide stronger protection for freedom of expression. The key recommendations in the document include: web hosting providers or hosts should be immune from liability for third party content if they have not modified the content; privatised enforcement should not be the model, and removal orders should come only from courts or adjudicatory bodies; a notice-to-notice model should replace notice and takedown regimes; and in cases of alleged serious criminality, clear conditions should be defined and in place.</li>
<li style="text-align: justify; "><b>Comparative Analysis of the National Approaches to the Liability of Internet Intermediaries:</b> Prepared by Daniel Seng for WIPO, available at: <a href="http://www.wipo.int/export/sites/www/copyright/en/doc/liability_of_internet_intermediaries.pdf">http://www.wipo.int/export/sites/www/copyright/en/doc/liability_of_internet_intermediaries.pdf</a>: This Report reviews the intermediary liability regimes and associated laws in place across fifteen different contexts, with a focus on civil copyright liability for internet intermediaries. The Report seeks to find similarities and differences across the regimes studied and to highlight principles and components that can be used in international treaties and instruments, upcoming policies, and court decisions.</li>
<li style="text-align: justify; "><b>Freedom of Expression, Indirect Censorship, & Liability for Internet Intermediaries.</b> The Electronic Frontier Foundation. February 2011, available at: <a href="http://infojustice.org/download/tpp/tpp-civil-society/EFF%20presentation%20ISPs%20and%20Freedom%20of%20Expression.pdf">http://infojustice.org/download/tpp/tpp-civil-society/EFF%20presentation%20ISPs%20and%20Freedom%20of%20Expression.pdf</a>:This presentation was created for the Trans-Pacific Partnership Stakeholder Forum in Chile and highlights that for freedom of expression to be protected, clear legal protections for internet intermediaries are needed and advocates for a regime that provides blanket immunity to intermediaries or is based on judicial takedown notices.</li>
<li style="text-align: justify; "><b>Study on the Liability of Internet Intermediaries. Contracted by the European Commission.</b> 2007, available at: <a href="http://ec.europa.eu/internal_market/e-commerce/docs/study/liability/final_report_en.pdf">http://ec.europa.eu/internal_market/e-commerce/docs/study/liability/final_report_en.pdf</a>. This Report provides insight on the application of the intermediary liability sections of the EU e-commerce directive and studies the impact of the regulations under the Directive on the functioning of intermediary information society services. To achieve this objective, the study identifies relevant case law across Member States, calls out and evaluates developing trends across Member States, and draws conclusions.</li>
<li style="text-align: justify; "><b>Internet Intermediary Liability: Identifying Best Practices for Africa.</b> Nicolo Zingales for the Association for Progressive Communications, available at: <a href="https://www.apc.org/en/system/files/APCInternetIntermediaryLiability_BestPracticesAfrica_20131125.pdf">https://www.apc.org/en/system/files/APCInternetIntermediaryLiability_BestPracticesAfrica_20131125.pdf</a>: This background paper seeks to identify challenges and opportunities in addressing intermediary liability for countries in the African Union and recommend safeguards that can be included in emerging intermediary liability regimes in the context of human rights. The paper also reviews different models of intermediary liability and discusses the limitations, scope, and modes of operation of each model. </li>
<li style="text-align: justify; "><b>The Liability of Internet Intermediaries in Nigeria, Kenya, South Africa, and Uganda</b>: An uncertain terrain. Association for Progressive Communications. October 2012, available at: <a href="http://www.academia.edu/2484536/The_liability_of_internet_intermediaries_in_Nigeria_Kenya_South_Africa_and_Uganda_An_uncertain_terrain">http://www.academia.edu/2484536/The_liability_of_internet_intermediaries_in_Nigeria_Kenya_South_Africa_and_Uganda_An_uncertain_terrain</a>: This Report reviews intermediary liability in Nigeria, Kenya, South Africa and Uganda, providing background on the political context, relevant legislation, and present challenges. In doing so, the Report provides insight into how intermediary liability has changed in recent years in these contexts and explores past and present debates on intermediary liability. The Report concludes with recommendations for stakeholders affected by intermediary liability.</li>
<li style="text-align: justify; "><b>The Fragmentation of intermediary liability in the UK</b>. Daithi Mac Sithigh. 2013, available at: <a href="http://jiplp.oxfordjournals.org/content/8/7/521.full.pdf?keytype=ref&ijkey=zuL8aFSzKJqkozT">http://jiplp.oxfordjournals.org/content/8/7/521.full.pdf?keytype=ref&ijkey=zuL8aFSzKJqkozT</a>. This article looks at the application of the Electronic Commerce Directive across Europe and argues that it is being intermixed with, and subsequently replaced by, provisions from national legislation and from area-specific legislation. The article thus argues that intermediary liability is dividing into multiple systems: for content related to copyright, for example, intermediaries are being given new responsibilities, while for content related to defamation, there is a reduction in the liability to which intermediaries are held.</li>
<li><b>Regimes of Legal Liability for Online Intermediaries: An Overview</b>. OECD, available at: <a href="http://www.oecd.org/sti/ieconomy/45509050.pdf">http://www.oecd.org/sti/ieconomy/45509050.pdf</a>. This article provides an overview of different intermediary liability regimes, including those of the EU and the US. </li>
<li style="text-align: justify; "><b>Closing the Gap: Indian Online Intermediaries and a Liability System Not Yet Fit for Purpose</b>. GNI. 2014, available at: <a href="http://www.globalnetworkinitiative.org/sites/default/files/Closing%20the%20Gap%20-%20Copenhagen%20Economics_March%202014_0.pdf">http://www.globalnetworkinitiative.org/sites/default/files/Closing%20the%20Gap%20-%20Copenhagen%20Economics_March%202014_0.pdf</a>. This report argues that the provisions of the Information Technology Act 2000 are not adequate to deal with ICT innovations, and that the current liability regime in India is hurting the Indian internet economy. </li>
<li style="text-align: justify; "><b>Intermediary Liability in India</b>. Centre for Internet and Society. 2011, available at: <a href="https://cis-india.org/internet-governance/intermediary-liability-in-india.pdf">http://cis-india.org/internet-governance/intermediary-liability-in-india.pdf</a>. This report reviews and ‘tests’ the effect of the Indian intermediary liability regime on freedom of expression. The report concludes that the present regime in India has a chilling effect on free expression and offers recommendations on how the Indian regime can be amended to protect this right. </li>
<li style="text-align: justify; "><b>The Liability of Internet Service Providers and the Exercise of Freedom of Expression in Latin America</b>. Claudio Ruiz Gallardo and J. Carlos Lara Galvez, available at: <a class="external-link" href="http://www.palermo.edu/cele/pdf/english/Internet-Free-of-Censorship/02-Liability_Internet_Service_Providers_exercise_freedom_expression_Latin_America_Ruiz_Gallardo_Lara_Galvez.pdf">http://www.palermo.edu/cele/pdf/english/Internet-Free-of-Censorship/02-Liability_Internet_Service_Providers_exercise_freedom_expression_Latin_America_Ruiz_Gallardo_Lara_Galvez.pdf</a>. This paper examines the efficacy and implementation of proposals to place digital communication channels, in varying degrees, under the oversight of state-sponsored institutions, and addresses the consequences that legal intervention in media and digital platforms may have for the development of individual rights and freedoms. It draws conclusions on the enforcement of penalties aimed at intermediary liability and on mechanisms for balancing the interests at stake, taking comparative experiences into account. The paper also analyses the liability of technical facilitators of communications, attempting to define a threshold beyond which interference in their operations would constitute an infringement of users’ privacy. Ultimately, it seeks a balance between the need for intervention, the rights of users who communicate via the internet, and the interests of the economic actors who provide the service. </li>
</ol>
<hr />
<p><a class="external-link" href="https://crm.apc.org/civicrm/mailing/view?reset=1&id=191">Click to read the newsletter</a> from the Association of Progressive Communications. The summaries for the reports can be found below:</p>
<p style="text-align: justify; ">Internet Intermediaries: The Dilemma of Liability in Africa. APCNews, May 2014, available at: <a href="http://www.apc.org/en/node/19279/">http://www.apc.org/en/node/19279/</a>. This report summarizes the challenges facing internet content regulators in Africa and the effects of these regulations on the state of the internet there. Many African countries do not protect intermediaries from potential liability, so some intermediaries are too afraid to transmit or host content in those countries. The report calls for universal rights protections for internet intermediaries.</p>
<p style="text-align: justify; ">APC’s Frequently Asked Questions on Internet Intermediary Liability: APC, May 2014, available at: <a href="http://www.apc.org/en/node/19291/">http://www.apc.org/en/node/19291/</a>. This report addresses common questions about internet intermediaries – entities that provide services enabling people to use the internet, from network providers to search engines to comments sections on blogs. Specifically, the report outlines two main models of intermediary liability: under the “Generalist” model, intermediary liability is judged according to the general rules of civil and criminal law, while the “Safe Harbour” model shields intermediaries within a legal safe zone.</p>
<p style="text-align: justify; ">New Developments in South Africa: APCNews, May 2014, available at: <a href="http://www.apc.org/en/news/intermediary-liability-new-developments-south-afri">http://www.apc.org/en/news/intermediary-liability-new-developments-south-afri</a>. This interview with researchers Alex Comninos and Andrew Rens goes into detail about the challenges of intermediary liability in South Africa. The researchers discuss the balance that needs to be struck between insulating intermediaries from the fear of liability and protecting women’s rights in a society struggling to deal with violence against women. They also discuss South Africa’s three-strikes policy for those who pirate material.</p>
<p style="text-align: justify; ">Preventing Hate Speech Online in Kenya: APCNews, May 2014, available at: <a href="http://www.apc.org/en/news/intermediary-liability-preventing-hate-speech-onli">http://www.apc.org/en/news/intermediary-liability-preventing-hate-speech-onli</a>. This interview with Grace Githaiga investigates the uncertain fate of internet intermediaries under Kenya’s new regime. The new government has required everyone to register their SIM cards, and has indicated that it is monitoring text messages and flagging those deemed risky. This has led to a reduction in hate speech via text messages. Many intermediaries, such as newspaper comments sections, have established rules on how readers should post on their platforms. Githaiga goes on to discuss surveillance and the lack of a data protection law, which she sees as the most pressing internet issue in Kenya.</p>
<p style="text-align: justify; ">New Laws in Uganda Make Internet Providers More Vulnerable to Liability and State Intervention: APCNews, May 2014, available at: <a href="http://www.apc.org/en/news/new-laws-uganda-make-internet-providers-more-vulne">http://www.apc.org/en/news/new-laws-uganda-make-internet-providers-more-vulne</a>. In this interview, Lilian Nalwoga discusses Uganda’s recent anti-pornography law, under which intermediaries can face prison. The Anti-Pornography Act of 2014 criminalizes any sort of association with any form of pornography, and targets ISPs, content providers, and developers, making them liable for content that passes through their systems. This makes being an intermediary extremely risky in Uganda. A further problem with the law is its vague definition of pornography. Nalwoga also explains that the Anti-Homosexuality Act of 2014 bans any promotion or recognition of homosexual relations, and describes the monitoring technology the government is using to enforce these laws.</p>
<p style="text-align: justify; ">New Laws Affecting Intermediary Liability in Nigeria: APCNews, May 2014, available at: <a href="http://www.apc.org/en/news/new-laws-affecting-intermediary-liability-nigeria">http://www.apc.org/en/news/new-laws-affecting-intermediary-liability-nigeria</a>. Gbenga Sesan, executive director of Paradigm Initiative Nigeria, expounds on the latest trends in Nigerian intermediary liability. The Nigerian Communications Commission has a new rule that mandates ISPs to store users’ data for at least three years, and wants to make content hosts responsible for what users do on their networks. Additionally, internet users in Nigeria must register with their real names and prove their identity when registering. Sesan goes on to discuss the lack of safe harbour provisions for intermediaries and the freedom of anonymity that remains on social networks in Nigeria.</p>
<p style="text-align: justify; ">Internet Policies That Affect Africans: APCNews, May 2014, available at: <a href="http://www.apc.org/en/news/intermediary-liability-internet-policies-affect-af">http://www.apc.org/en/news/intermediary-liability-internet-policies-affect-af</a>. The Association for Progressive Communications interviews researcher Nicolo Zingales about the trend of African governments establishing further regulations to control the flow of information on the internet and to hold intermediaries liable for the content they circulate. Zingales criticizes intermediary liability for “creating a system of adverse incentives for free speech.” He goes on to offer examples of intermediaries and to explain the concept of “safe harbour” legislative frameworks. Asked to identify best and worst practices in Africa, he highlights South Africa’s safe harbour as a good practice, and cites the registration of users via ID cards as a worst practice.</p>
<p style="text-align: justify; ">Towards Internet Intermediary Responsibility: Carly Nyst, November 2013, available at: <a href="http://www.genderit.org/feminist-talk/towards-internet-intermediary-responsibility">http://www.genderit.org/feminist-talk/towards-internet-intermediary-responsibility</a>. Nyst argues for a middle ground between competing goals in internet regulation in Africa. The goal of protecting free speech through internet intermediaries seems at odds with that of protecting women’s rights and limiting hate speech, because one demands that intermediaries be shielded by a legal safe harbour while the other requires that they be vigilant and police their content. Nyst’s solution is not intermediary liability but <i>responsibility</i> – a role defined by empowerment, establishing an intermediary responsibility to promote positive gender attitudes.</p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/intermediary-liability-resources'>https://cis-india.org/internet-governance/blog/intermediary-liability-resources</a>
</p>
No publisher. Author: elonnai. Topics: Freedom of Speech and Expression, Internet Governance, Intermediary Liability, Privacy. 2014-07-03T06:45:48Z. Blog Entry.