The Centre for Internet and Society
https://cis-india.org
These are the search results for the query, showing results 31 to 45.
Panel Discussion on Internet Intermediaries, Law and Innovation
https://cis-india.org/internet-governance/news/panel-discussion-on-internet-intermediaries-law-and-innovation
<b>CII, Google and the Centre for Communication Governance, NLU Delhi hosted a panel discussion on June 2 in New Delhi. Jyoti Panday attended.</b>
<p style="text-align: justify; ">The Centre for Internet & Society (CIS) participated in the panel discussion on 'Internet Intermediaries, Law and Innovation' hosted by CII, Google and Centre For Communications Governance, NLU Delhi. The panel discussed the impact of the existing provisions on intermediary liability and innovation and sought suggestions and the way forward.<br /><br />The panel was moderated by Dr Subho Ray, President, IAMAI<br /><br />Other panelists included:</p>
<ul style="text-align: justify; ">
<li>Mr Anupam Chander, Eminent Global Lawyer & Academician</li>
<li>Mr Apar Gupta, Advocate</li>
<li>Ms Mishi Choudhary, Founding Director, Software Freedom Law Centre</li>
<li>Mr J Sai Deepak, Associate Partner, Litigation Team, Saikrishna & Associates</li>
<li>Mr Indranil Choudhury, Founder and CEO, Lexplosion</li>
</ul>
<p><a href="https://cis-india.org/internet-governance/blog/internet-intermediaries-law-and-innovation-panel.odp" class="internal-link">Click to download the presentation.</a></p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/news/panel-discussion-on-internet-intermediaries-law-and-innovation'>https://cis-india.org/internet-governance/news/panel-discussion-on-internet-intermediaries-law-and-innovation</a>
</p>
jyoti | Internet Governance, Intermediary Liability | 2015-06-14T16:37:56Z | News Item

DeitY says 143 URLs have been Blocked in 2015; Procedure for Blocking Content Remains Opaque and in Urgent Need of Transparency Measures
https://cis-india.org/internet-governance/blog/deity-says-143-urls-blocked-in-2015
<b>Across India on 30 December 2014, following an order issued by the Department of Telecom (DOT), Internet Service Providers (ISPs) blocked 32 websites including Vimeo, Dailymotion, GitHub and Pastebin.</b>
<p style="text-align: justify;">In February 2015, the Centre for Internet and Society (CIS) requested the Department of Electronics and Information Technology (DeitY) under the Right to Information Act, 2005 (RTI Act) to provide information clarifying the procedures for blocking in India. We have received a response from DeitY which may be <a href="https://cis-india.org/internet-governance/blog/response-deity.clarifying-procedures-for-blocking.pdf" class="external-link">seen here</a>.</p>
<p style="text-align: justify;">In this post, I shall elaborate on this response from DeitY and highlight some of the accountability and transparency measures that the procedure needs. To stress the urgency of reform, I shall also touch upon two recent developments—the response from Ministry of Communication to questions raised in Parliament on the blocking procedures and the Supreme Court (SC) judgment in Shreya Singhal v. Union of India.</p>
<h2 style="text-align: justify;">Section 69A and the Blocking Rules</h2>
<p align="JUSTIFY" class="western">Section 69A of the Information Technology Act, 2008 (S69A hereinafter) grants powers to the central government to issue directions for blocking of access to any information through any computer resource. In other words, it allows the government to block any websites under certain grounds. The Government has notified rules laying down the procedure for blocking access online under the Procedure and Safeguards for Blocking for Access of Information by Public Rules, 2009 (Rules, 2009 hereinafter). CIS has produced a poster explaining the blocking procedure (<a href="http://cis-india.org/internet-governance/blog/blocking-websites.pdf/at_download/file">download PDF</a>, 2.037MB).</p>
<p align="JUSTIFY" class="western">There are <em>three key aspects</em> of the blocking rules that need to be kept under consideration:</p>
<h3 align="JUSTIFY" class="western">Officers and committees handling requests</h3>
<p style="text-align: justify;"><strong>Designated Officer (DO)</strong> – Appointed by the Central government, officer not below the rank of Joint Secretary.<br /><strong>Nodal Officer (NO)</strong> – Appointed by organizations including Ministries or Departments of the State governments and Union Territories and any agency of the Central Government. <br /><strong>Intermediary contact</strong>–Appointed by every intermediary to receive and handle blocking directions from the DO.<br /><strong>Committee for Examination of Request (CER)</strong> – The request along with printed sample of alleged offending information is examined by the CER—committee with the DO serving as the Chairperson and representatives from Ministry of Law and Justice; Ministry of Home Affairs; Ministry of Information and Broadcasting and representative from the Indian Computer Emergency Response Team (CERT-In). The CER is responsible for examining each blocking request and makes recommendations including revoking blocking orders to the DO, which are taken into consideration for final approval of request for blocking by the Secretary, DOT. <br /><strong>Review Committee (RC) </strong>– Constituted under rule 419A of the Indian Telegraph Act, 1951, the RC includes the Cabinet Secretary, Secretary to the Government of India (Legal Affairs) and Secretary (Department of Telecom). The RC is mandated to meet at least once in 2 months and record its findings and has to validate that directions issued are in compliance with S69A(1).</p>
<h3 style="text-align: justify;">Provisions outlining the procedure for blocking</h3>
<p>Rules 6, 9 and 10 create three distinct blocking procedures, which must commence within 7 days of the DO receiving the request.</p>
<p style="text-align: justify;">a) Rule 6 lays out the first procedure, under which any person may approach the NO and request blocking, alternatively, the NO may also raise a blocking request. After the NO of the approached Ministry or Department of the State governments and Union Territories and/or any agency of the Central Government, is satisfied of the validity of the request they forward it to the DO. Requests when not sent through the NO of any organization, must be approved by Chief Secretary of the State or Union Territory or the Advisor to the Administrator of the Union Territory, before being sent to the DO.</p>
<p style="text-align: justify;">The DO upon receiving the request places, must acknowledge receipt within 24 four hours and places the request along with printed copy of alleged information for validation by the CER. The DO also, must make reasonable efforts to identify the person or intermediary hosting the information, and having identified them issue a notice asking them to appear and submit their reply and clarifications before the committee at a specified date and time, within forty eight hours of the receipt of notice.</p>
<p style="text-align: justify;">Foreign entities hosting the information are also informed and the CER gives it recommendations after hearing from the intermediary or the person has clarified their position and even if there is no representation by the same and after examining if the request falls within the scope outlined under S69A(1). The blocking directions are issued by the Secretary (DeitY), after the DO forwards the request and the CER recommendations. If approval is granted the DO directs the relevant intermediary or person to block the alleged information.</p>
<p style="text-align: justify;" class="western">b) Rule 9 outlines a procedure wherein, under emergency circumstances, and after the DO has established the necessity and expediency to block alleged information submits recommendations in writing to the Secretary, DeitY. The Secretary, upon being satisfied by the justification for, and necessity of, and expediency to block information may issue an blocking directions as an interim measure and must record the reasons for doing so in writing.</p>
<p style="text-align: justify;" class="western">Under such circumstances, the intermediary and person hosting information is not given the opportunity of a hearing. Nevertheless, the DO is required to place the request before the CER within forty eight hours of issuing of directions for interim blocking. Only upon receiving the final recommendations from the committee can the Secretary pass a final order approving the request. If the request for blocking is not approved then the interim order passed earlier is revoked, and the intermediary or identified person should be directed to unblock the information for public access.</p>
<p style="text-align: justify;" class="western">c) Rule 10 outlines the process when an order is issued by the courts in India. The DO upon receipt of the court order for blocking of information submits it to the Secretary, DeitY and initiates action as directed by the courts.</p>
<h3 style="text-align: justify;" class="western">Confidentiality clause</h3>
<p style="text-align: justify;">Rule 16 mandates confidentiality regarding all requests and actions taken thereof, which renders any requests received by the NO and the DO, recommendations made by the DO or the CER and any written reasons for blocking or revoking blocking requests outside the purview of public scrutiny. More detail on the officers and committees that enforce the blocking rules and procedure can be found <a href="http://cis-india.org/internet-governance/blog/is-india2019s-website-blocking-law-constitutional-2013-i-law-procedure">here</a>.</p>
<h2>Response on blocking from the Ministry of Communications and Information Technology</h2>
<p style="text-align: justify;">The response to our RTI from E-Security and Cyber Law Group is timely, given the recent clarification from the Ministry of Communication and Information Technology to a number of questions, raised by parliamentarian Shri Avinash Pande in the Rajya Sabha. The questions had been raised in reference to the Emergency blocking order under IT Act, the current status of the Central Monitoring System, Data Privacy law and Net Neutrality. The Centre for Communication Governance (CCG), National Law University New Delhi have extracted a set of 6 questions and you can read the full article <a href="https://ccgnludelhi.wordpress.com/2015/04/24/governments-response-to-fundamental-questions-regarding-the-internet-in-india/">here</a>.</p>
<p align="JUSTIFY" class="western">The governments response as quoted by CCG, clarifies under rule 9—the Government has issued directions for emergency blocking of <em>a total number of 216 URLs from 1st January, 2014 till date </em>and that <em>a total of 255 URLs were blocked in 2014 and no URLs has been blocked in 2015 (till 31 March 2015)</em> under S69A through the Committee constituted under the rules therein. Further, a total of 2091 URLs and 143 URLs were blocked in order to comply with the directions of the competent courts of India in 2014 and 2015 (till 31 March 2015) respectively. The government also clarified that the CER, had recommended not to block 19 URLs in the meetings held between 1<sup>st</sup><sup> </sup>January 2014 upto till date and so far, two orders have been issued to revoke 251 blocked URLs from 1st January 2014 till date. Besides, CERT-In received requests for blocking of objectionable content from individuals and organisations, and these were forwarded to the concerned websites for appropriate action, however the response did not specify the number of requests.</p>
<p align="JUSTIFY" class="western">We have prepared a table explaining the information released by the government and to highlight the inconsistency in their response.</p>
<table class="grid listing">
<colgroup> <col width="331"> <col width="90"> <col width="91"> <col width="119"> </colgroup>
<tbody>
<tr>
<td rowspan="2">
<p align="LEFT"><strong>Applicable rule and procedure outlined under the Blocking Rules</strong></p>
</td>
<td colspan="3">
<p align="CENTER"><strong>Number of websites</strong></p>
</td>
</tr>
<tr>
<td>
<p align="CENTER"><em>2014</em></p>
</td>
<td>
<p align="CENTER"><em>2015</em></p>
</td>
<td>
<p align="CENTER"><em>Total</em></p>
</td>
</tr>
<tr>
<td>
<p align="LEFT">Rule 6 - Blocking requests from NO and others</p>
</td>
<td>
<p align="CENTER">255</p>
</td>
<td>
<p align="CENTER">None</p>
</td>
<td>
<p align="CENTER">255</p>
</td>
</tr>
<tr>
<td>
<p align="LEFT">Rule 9 - Blocking under emergency circumstances</p>
</td>
<td>
<p align="CENTER">-</p>
</td>
<td>
<p align="CENTER">-</p>
</td>
<td>
<p align="CENTER">216</p>
</td>
</tr>
<tr>
<td>
<p align="LEFT">Rule 10 - Blocking orders from Court</p>
</td>
<td>
<p align="CENTER">2091</p>
</td>
<td>
<p align="CENTER">143</p>
</td>
<td>
<p align="CENTER">2234</p>
</td>
</tr>
<tr>
<td>
<p align="LEFT">Requests from individuals and orgs forwarded to CERT-In</p>
</td>
<td>
<p align="CENTER">-</p>
</td>
<td>
<p align="CENTER">-</p>
</td>
<td>
<p align="CENTER">-</p>
</td>
</tr>
<tr>
<td>
<p align="LEFT">Recommendations to not block by CER</p>
</td>
<td>
<p align="CENTER">-</p>
</td>
<td>
<p align="CENTER">-</p>
</td>
<td>
<p align="CENTER">19</p>
</td>
</tr>
<tr>
<td>
<p align="LEFT">Number of blocking requests revoked</p>
</td>
<td>
<p align="CENTER">-</p>
</td>
<td>
<p align="CENTER">-</p>
</td>
<td>
<p align="CENTER">251</p>
</td>
</tr>
</tbody>
</table>
<p>In a <a href="http://sflc.in/deity-says-2341-urls-were-blocked-in-2014-refuses-to-reveal-more/">response </a>to an RTI filed by the Software Freedom Law Centre, DeitY said that 708 URLs were blocked in 2012, 1,349 URLs in 2013, and 2,341 URLs in 2014.</p>
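<p style="text-align: justify;">The mismatch is visible from simple arithmetic. The sketch below uses only the figures quoted above—the Rajya Sabha answer and DeitY's reply to SFLC—and the comparison is our own calculation, not an official reconciliation.</p>
<pre>
# Cross-checking the 2014 blocking figures quoted in this post.
rule6_2014 = 255        # blocked on Rule 6 requests in 2014 (Rajya Sabha answer)
rule10_2014 = 2091      # blocked under court orders (Rule 10) in 2014
rule9_total = 216       # Rule 9 emergency blocks, 1 Jan 2014 "till date";
                        # no per-year split was provided
sflc_total_2014 = 2341  # total URLs blocked in 2014, per DeitY's reply to SFLC

known_2014 = rule6_2014 + rule10_2014
print(f"Rule 6 + Rule 10 for 2014: {known_2014}")             # 2346
print(f"Total reported to SFLC for 2014: {sflc_total_2014}")  # 2341
print(f"Gap before counting any Rule 9 blocks: {known_2014 - sflc_total_2014}")  # 5
</pre>
<p style="text-align: justify;">Even before attributing any of the 216 emergency blocks to 2014, the itemised figures already exceed the total reported to SFLC—one concrete illustration of the inconsistencies the table highlights.</p>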
<h2>Shreya Singhal v. Union of India</h2>
<p style="text-align: justify;">In its recent judgment, the SC of India upheld the constitutionality of 69A, stating that it was a narrowly-drawn provision with adequate safeguards. The constitutional challenge on behalf of the People’s Union for Civil Liberties (PUCL) considered the manner in which the blocking is done and the arguments focused on the secrecy present in blocking.</p>
<p style="text-align: justify;">The rules may indicate that there is a requirement to identify and contact the originator of information, though as an expert <a href="http://indianexpress.com/article/opinion/columns/but-what-about-section-69a/">has pointed out</a>, there is no evidence of this in practice. The court has stressed the importance of a written order so that writ petitions may be filed under Article 226 of the Constitution. In doing so, the court seems to have assumed that the originator or intermediary is informed, and therefore held the view that any procedural inconsistencies may be challenged through writ petitions. However, this recourse is rendered ineffective not only due to procedural constraints, but also because of the confidentiality clause. The opaqueness through rule 16 severely reigns in the recourse that may be given to the originator and the intermediary. While the court notes that rule 16 requiring confidentality was argued to be unconstitutional, it does not state its opinion on this question in the judgment. One expert, holds the <a href="https://indconlawphil.wordpress.com/2015/03/25/the-supreme-courts-it-act-judgment-and-secret-blocking/">view</a> that this, by implication, requires that requests cannot be confidential. However, such a reading down of rule 16 is yet to be tested.</p>
<p style="text-align: justify;">Further, Sunil Abraham has <a href="http://cis-india.org/internet-governance/blog/economic-and-political-weekly-sunil-abraham-april-11-2015-shreya-singhal-and-66a">pointed</a> out, “block orders are unevenly implemented by ISPs making it impossible for anyone to independently monitor and reach a conclusion whether an internet resource is inaccessible as a result of a S69A block order or due to a network anomaly.” As there are no comprehensive list of blocked websites or of the legal orders through which they are blocked exists, the public has to rely on media reports and filing RTI requests to understand the censorship regime in India. CIS has previously <a href="http://cis-india.org/internet-governance/blog/analysing-blocked-sites-riots-communalism">analysed</a> the leaked block lists and lists received as responses to RTI requests which have revealed that the block orders are full of errors and blocking of entire platforms and not just specific links has taken place.</p>
<p style="text-align: justify;">While the state has the power of blocking content, doing so in secrecy and without judical scrutiny, mark deficiencies that remain in the procedure outlined under the provisions of the blocking rules . The Court could read down rule 16 except for a really narrow set of exceptions, and in not doing so, perhaps has overlooked the opportunities for reform in the existing system. The blocking of 32 websites, is an example of the opaqueness of the system of blocking orders, and where the safeguards assumed by the SC are often not observed such as there being no access to the recommendations that were made by the CER, or towards the revocation of the blocking orders subsequently. CIS filed the RTI to try and understand the grounds for blocking and related procedures and the response has thrown up some issues that must need urgent attention.</p>
<h2>Response to RTI filed by CIS</h2>
<p align="JUSTIFY" class="western">Our first question sought clarification on the websites blocked on 30<sup>th</sup><sup> </sup>December 2014 and the response received from DeitY, E-Security and Cyber Law Group reveals that the websites had been blocked as “they were being used to post information related to ISIS using the resources provided by these websites”. The response also clarifies that the directions to block were issued on <em>18-12-2014 and as of 09-01-2015</em>, after obtaining an undertaking from website owners, stating their compliance with the Government and Indian laws, the sites were unblocked.</p>
<p align="JUSTIFY" class="western">It is not clear if ATS, Mumbai had been intercepting communication or if someone reported these websites. If the ATS was indeed intercepting communication, then as per the rules, the RC should be informed and their recommendations sought. It is unclear, if this was the case and the response evokes the confidentiality clause under rule 16 for not divulging further details. Based on our reading of the rules, court orders should be accessible to the public and without copies of requests and complaints received and knowledge of which organization raised them, there can be no appeal or recourse available to the intermediary or even the general public.</p>
<p align="JUSTIFY" class="western">We also asked for a list of all requests for blocking of information that had been received by the DO between January 2013 and January 2015, including the copies of all files that had accepted or rejected. We also specifically, asked for a list of requests under rule 9. The response from DeitY stated that since January 1, 2015 to March 31, 2015 directions to block 143 URLs had been issued based on court orders. The response completely overlooks our request for information, covering the 2 year time period. It also does not cover all types of blocking orders under rule 6 and rule 9, nor the requests that are forwarded to CERT-In, as we have gauged from the ministry's response to the Parliament. Contrary to the SC's assumption of contacting the orginator of information, it is also clear from DeitY's response that only the websites had been contacted and the letter states that the “websites replied only after blocking of objectionable content”. </p>
<p align="JUSTIFY" class="western">Further, seeking clarification on the functioning of the CER, we asked for the recent composition of members and the dates and copies of the minutes of all meetings including copies of the recommendations made by them. The response merely quotes rule 7 as the reference for the composition and does not provide any names or other details. We ascertain that as per the DeitY website Shri B.J. Srinath, Scientist-G/GC is the appointed Designated Officer, however this needs confirmation. While we are already aware of the structure of the CER which representatives and appointed public officers are guiding the examination of requests remains unclear. Presently, there are 3 Joint Secretaries appointed under the Ministry of Law and Justice, the Home Ministry has appointed 19, while 3 are appointed under the Ministry of Information and Broadcasting. Further, it is not clear which grade of scientist would be appointed to this committee from CERT-In as the rules do not specify this. While the government has clarified in their answer to Parliament that the committee had recommended not to block 19 URLs in the meetings held between 1st January 2014 to till date, it is remains unclear who is taking these decisions to block and revoke blocked URLs. The response from DeitY specifies that the CER has met six times between 2014 and March 2015, however stops short on sharing any further information or copies of files on complaints and recommendations of the CER, citing rule 16.</p>
<p align="JUSTIFY" class="western">Finally, answering our question on the composition of the RC the letter merely highlights the provision providing for the composition under 419A of the Indian Telegraph Rules, 1951. The response clarifies that so far, the RC has met once on 7th December, 2013 under the Chairmanship of the Cabinet Secretary, Department of Legal Affaits and Secretary, DOT. Our request for minutes of meetings and copies of orders and findings of the RC is denied by simply stating that “minutes are not available”. Under 419A, any directions for interception of any message or class of messages under sub-section (2) of Section 5 of the Indian Telegraph Act, 1885 issued by the competent authority shall contain reasons for such direction and a copy of such order shall be forwarded to the concerned RC within a period of seven working days. Given that the RC has met just once since 2013, it is unclear if the RC is not functioning or if the interception of messages is being guided through other procedures. Further, we do not yet know details or have any records of revocation orders or notices sent to intermediary contacts. This restricts the citizens’ right to receive information and DeitY should work to make these available for the public.</p>
<p align="JUSTIFY" class="western">Given the response to our RTI, the Ministry's response to Parliament and the SC judgment we recommend the following steps be taken by the DeitY to ensure that we create a procedure that is just, accountable and follows the rule of law.</p>
<p align="JUSTIFY" class="western">The revocation of rule 16 needs urgent clarification for two reasons:</p>
<ol>
<li>Under Section 22 of the RTI Act, its provisions override all conflicting provisions in any other legislation.</li>
<li style="text-align: justify;">In upholding the constitutionality of S69A, the SC cites the requirement that the reasons behind blocking orders be recorded in writing, so that they may be challenged by means of writ petitions filed under <a href="http://indiankanoon.org/doc/1712542/">Article 226</a> of the Constitution of India.</li></ol>
<p style="text-align: justify;">If the blocking orders or the meetings of the CER and RC that consider the reasons in the orders are to remain shrouded in secrecy and unavailable through RTI requests, filing writ petitions challenging these decisions will not be possible, rendering this very important safeguard for the protection of online free speech and expression infructuous. In summation, the need for comprehensive legislative reform remains in the blocking procedures and the government should act to address the pressing need for transparency and accountability. Not only does opacity curtial the strengths of democracy it also impedes good governance. We have filed an RTI seeking a comprehensive account of the blocking procedure, functioning of committees from 2009-2015 and we shall publish any information that we may receive.</p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/deity-says-143-urls-blocked-in-2015'>https://cis-india.org/internet-governance/blog/deity-says-143-urls-blocked-in-2015</a>
</p>
jyoti | Censorship, Freedom of Speech and Expression, RTI, Intermediary Liability, Accountability, Featured, 69A, Internet Governance, Chilling Effect, Transparency, Homepage, Blocking | 2015-04-30T07:37:40Z | Blog Entry

The Supreme Court Judgment in Shreya Singhal and What It Does for Intermediary Liability in India
https://cis-india.org/internet-governance/blog/sc-judgment-in-shreya-singhal-what-it-means-for-intermediary-liability
<b>Even as free speech advocates and users celebrate the Supreme Court of India's landmark judgment striking down Section 66A of the Information Technology Act of 2000, news that the Central government has begun work on drafting a new provision to replace the said section of the Act has been trickling in.</b>
<p style="text-align: justify; ">The SC judgement in upholding the constitutionality of Section 69A (procedure for blocking websites) and in reading down Section 79 (exemption from liability of intermediaries) of the IT Act, raises crucial questions regarding transparency, accountability and under what circumstances may reasonable restrictions be placed on free speech on the Internet. While discussions and analysis of S. 66A continue, in this post I will focus on the aspect of the judgment related to intermediary liability that could benefit from further clarification from the apex court and in doing so, will briefly touch upon S. 69A and secret blocking.</p>
<h3 style="text-align: justify; ">Conditions qualifying intermediary for exemption and obligations not related to exemption</h3>
<p align="JUSTIFY">The intermediary liability regime in India is defined under S. 79 and assosciated rules that were introduced to protect intermediaries for liability from user generated content and ensure the Internet continues to evolve as a <i>“marketplace of ideas”</i>. But as intermediaries may not have sufficient legal competence or resources to deliberate on the legality of an expression, they may end up erring on the side of caution and takedown lawful expression. As a study by Centre for Internet and Society (CIS) in 2012 revealed, the criteria, procedure and safeguards for administration of the takedowns as prescribed by the rules lead to a chilling effect on online free expression.</p>
<p align="JUSTIFY"><span><span><span>S. 69A grants powers to the Central Government to </span></span></span><span><i><span>“issue directions for blocking of public access to any information through any computer resource”.</span></i></span><span><span><span> The 2009 </span></span></span><span><span><span>rules </span></span></span><span><span><span>allow the blocking of websites by a court order, </span></span></span><span><span><span>and </span></span></span><span><span><span>sets in place a review committee to review the decision to block websites </span></span></span><span><span><span>a</span></span></span><span><span><span>s also establishes </span></span></span><span><span><span>penalt</span></span></span><span><span><span>ies </span></span></span><span><span><span>for the intermediary </span></span></span><span><span><span>that fails to extend cooperation in this respect. </span></span></span></p>
<p align="JUSTIFY"><span><span><span>There are two key aspects of both these provisions that must be noted:</span></span></span></p>
<p align="JUSTIFY">a) S. 79 is an exemption provision that qualifies the intermediary for conditional immunity, as long as they fulfil the conditions of the section. The judgement notes this distinction, adding that “<i>being an exemption provision, it is closely related to provisions which provide for offences including S. 69A.”</i></p>
<p align="JUSTIFY"><span><span><span>b) S. 69A does not contribute to immunity for the intermediary rather places additional obligations on the intermediary and as the judgement notes </span></span></span><span><i><span>“intermediary who finally fails to comply with the directions issued who is punishable under sub-section (3) of 69A.”</span></i></span><span><span><span> The provision though outside of the conditional immunity liability regime enacted through S. 79 contributes to the restriction of access to, or removing content online by placing liability on intermediaries to block unlawful third party content or information that is being generated, transmitted, received, stored or hosted by them. Therefore restriction requests must fall within the contours outlined in Article 19(2) and include principles of natural justice and elements of due process.</span></span></span></p>
<h3 align="JUSTIFY">Subjective Determination of Knowledge</h3>
<p align="JUSTIFY">The provisions for exemption laid down in S. 79 do not apply when they receive <i>“actual knowledge” </i>of illegal content under section 79(3)(b). Prior to the court's verdict actual knowledge could have been interpreted to mean the intermediary is called upon its own judgement under sub-rule (4) to restrict impugned content in order to seek exemption from liability. Removing the need for intermediaries to take on an adjudicatory role and deciding on which content to restrict or takedown, the SC has read down <i>“actual knowledge”</i> to mean that there has to be a court order directing the intermediary to expeditiously remove or disable access to content online. The court also read down <i>“upon obtaining knowledge by itself”</i> and <i>“brought to actual knowledge”</i> under Rule 3(4) in the same manner as 79(3)(b).</p>
<p align="JUSTIFY"><span><span><span>Under S.79(3)(b) the intermediary must comply with the orders from the executive in order to qualify for immunity. Further, S. 79 (3)(b) goes beyond the specific categories of restriction identified in Article 19(2) by including the term </span></span></span><span><i><span>“unlawful acts”</span></i></span><span><span><span> and places the executive in an adjudicatory role of determining the illegality of content. The government cannot emulate private regulation as it is bound by the Constitution and the court addresses this issue by applying the limitation of 19(2) on unlawful acts, </span></span></span><span><i><span>“the court order and/or the notification by the appropriate government or its agency must strictly conform to the subject matters aid down in Article 19(2).”</span></i></span><span><span><span> </span></span></span></p>
<p align="JUSTIFY"><span><span><span>By reading down of S. 79 (3) (b) the court has addressed the issue of intermediaries </span></span></span><span><span><span>complying with tak</span></span></span><span><span><span>edown requests from non-government entities and </span></span></span><span><span><span>has </span></span></span><span><span><span>made government notifications and court orders to be consistent with reasonable restrictions in Article 19(2). This is an important clarification from the court, because this places limits on the private censorship of intermediaries and the invisible censorship of opaque government takedown requests as they must </span></span></span><span><span><span>and should </span></span></span><span><span><span>adhere, to </span></span></span><span><span><span>the </span></span></span><span><span><span>boundaries set by Article 19(2).</span></span></span></p>
<h3>Procedural Safeguards</h3>
<p style="text-align: justify; "><span><span><span>The SC does not touch upon other parts of the rules and in not doing so, has left significant procedural issues open for debate. It is relevant to bear in mind and as established above, S. 69A blocking and restriction requirements for the intermediary are part of their additional obligations and do not qualify them for immunity. The court ruled in favour of upholding S. 69A as constitutional on the basis that blocking orders are issued when the executive has sufficiently established that it is absolutely necessary to do so, and that the necessity is relatable to only some subjects set out in Article 19(2). Further the court notes that reasons for the blocking orders must be recorded in writing so that they may be challenged through writ petitions. The court also goes on to specify that under S. 69A the intermediary and the 'originator' if identified, have the right to be heard before the committee decides to issue the blocking order. </span></span></span></p>
<p style="text-align: justify; "><span><span><span>Under S. 79 the intermediary must also comply with government restriction orders and the procedure for notice and takedown is not sufficiently transparent and lacks procedural safeguards that have been included in the notice and takedown procedures under S. 69. For example, there is no requirement for committee to evaluate the necessity of issuing the restriction order, though the ruling does clarify that these restriction notices must be within the confines of Article 19(2). The judgement could have gone further to directing the government to state their entire cause of action and provide reasonable level of proof (prima facie). It should have also addressed issues such as the government using extra-judicial measures to restrict content including collateral pressures to force changes in terms of service, to promote or enforce so-called "voluntary" practices. </span></span></span></p>
<h3>Accountability</h3>
<p style="text-align: justify; "><span><span><span>The judgement could also have delved deeper into issues of accountability such as the need to consider 'udi alteram partem' by providing the owner of the information or the intermediary a hearing prior to issuing the restriction or blocking order nor is an post-facto review or appeal mechanism made available except for the recourse of writ petition. Procedural uncertainty around wrongly restricted content remains, including what limitations should be placed on the length, duration and geographical scope of the restriction. The court also does not address the issue of providing a recourse for the third party provider of information to have the removed information restored or put-back remains unclear. Relatedly, the court also does not clarify the concerns related to frivolous requests by establishing penalties nor is there a codified recourse under the rules presently, for the intermediary to claim damages even if it can be established that the takedown process is being abused.</span></span></span></p>
<h3>Transparency</h3>
<p style="text-align: justify; "><span><span><span>The bench in para 113 in addressing S. 79 notes that the intermediary in addition to publishing rules and regulations, privacy policy and user agreement for access or usage of their service has to also inform users of the due diligence requirements including content restriction policy under rule 3(2). However, the court ought to have noted the differentiation between different categories of intermediaries which may require different terms of use. Rather than stressing a standard terms of use as a procedural safeguard, the court should have insisted on establishing terms of use and content restriction obligations that is proportional to the role of the intermediary and based on the liability accrued in providing the service, including the impact of the restriction by the intermediary both on access and free speech. By placing requirement of disclosure or transparency on the intermediary including what has been restricted under the intermediary's own terms of service, the judgment could have gone a step further than merely informing users of their rights in using the service as it stands presently, to ensuring that users can review and have knowledge of what information has been restricted and why. The judgment also does not touch upon broader issues of intermediary liability such as proactive filtering sought by government and private parties, an important consideration given the recent developments around the right to be forgotten in Europe and around issues of defamation and pornography in India. </span></span></span></p>
<p style="text-align: justify; "><span><span><span>The judgment, while a welcome one in the direction of ensuring the Internet remains a democratic space where free speech thrives, could benefit from the application of the recently launched Manila principles developed by CIS and others. The Manila Principles is a framework of baseline safeguards and best practices that should be considered by policymakers and intermediaries when developing, adopting, and reviewing legislation, policies and practices that govern the liability of intermediaries for third-party content. </span></span></span></p>
<p style="text-align: justify; "><span><span><span>The court's ruling is truly worth celebrating, in terms of the tone it sets on how we think of free speech and the contours of censorship that exist in the digital space. But the real impact of this judgment lies in the debates and discussions which it will throw open about content removal practices that involve intermediaries making determinations on requests received, or those which only respond to the interests of the party requesting removal. As the Manila Principles highlight a balance between public and private interests can be obtained through a mechanism where power is distributed between the parties involved, and where an impartial, independent, and accountable oversight mechanism exists. <br /></span></span></span></p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/sc-judgment-in-shreya-singhal-what-it-means-for-intermediary-liability'>https://cis-india.org/internet-governance/blog/sc-judgment-in-shreya-singhal-what-it-means-for-intermediary-liability</a>
</p>
jyoti | IT Act, Censorship, Freedom of Speech and Expression, Internet Governance, Intermediary Liability, Chilling Effect | 2015-04-17T23:59:34Z | Blog Entry

No more 66A!
https://cis-india.org/internet-governance/blog/no-more-66a
<b>In a landmark decision, the Supreme Court has struck down Section 66A. Today was a great day for freedom of speech on the Internet! When Section 66A was in operation, if you made a statement that led to offence, you could be prosecuted. We are an offence-friendly nation, judging by media reports in the last year. It was a year of book-bans, website blocking and takedown requests. Facebook’s Transparency Report showed that next to the US, India made the most requests for information about user accounts. A complaint under Section 66A would be a ground for such requests.</b>
<p style="text-align: justify; ">Section 66A hung like a sword in the middle: Shaheen Dhada was arrested in Maharashtra for observing that Bal Thackeray’s funeral shut down the city, Devu Chodankar in Goa and Syed Waqar in Karnataka were arrested for making posts about Narendra Modi, and a Puducherry man was arrested for criticizing P. Chidambaram’s son. The law was vague and so widely worded that it was prone to misuse, and was in fact being misused.</p>
<p style="text-align: justify; ">Today, the Supreme Court struck down Section 66A in its judgment on a <a class="external-link" href="http://cis-india.org/internet-governance/blog/overview-constitutional-challenges-on-itact">set of petitions</a> heard together last year and earlier this year. Stating that the law is vague, the bench comprising Chelameshwar and Nariman, JJ. held that while restrictions on free speech are constitutional insofar as they are in line with Article 19(2) of the Constitution. Section 66A, they held, does not meet this test: The central protection of free speech is the freedom to make statements that “offend, shock or disturb”, and Section 66A is an unconstitutional curtailment of these freedoms. To cross the threshold of constitutional limitation, the impugned speech must be of such a nature that it incites violence or is an exhortation to violence. Section 66A, by being extremely vague and broad, does not meet this threshold. These are, of course, drawn from news reports of the judgment; the judgment is not available yet.</p>
<p style="text-align: justify; ">Reports also say that Section 79(3)(b) has been read down. Previously, any private individual or entity, and the government and its departments could request intermediaries to take down a website, without a court order. If the intermediaries did not comply, they would lose immunity under Section 79. The Supreme Court judgment states that both in Rule 3(4) of the Intermediaries Guidelines and in Section 79(3)(b), the "actual knowledge of the court order or government notification" is necessary before website takedowns can be effected. In effect, this mean that intermediaries <i>need not</i> act upon private notices under Section 79, while they can act upon them if they choose. This stops intermediaries from standing judge over what constitutes an unlawful act. If they choose not to take down content after receiving a private notice, they will not lose immunity under Section 79.</p>
<p style="text-align: justify; ">Section 69A, the website blocking procedure, has been left intact by the Court, despite infirmities such as a lack of judicial review and non-transparent operation. More updates when the judgment is made available.</p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/no-more-66a'>https://cis-india.org/internet-governance/blog/no-more-66a</a>
</p>
geetha | Censorship, Freedom of Speech and Expression, Homepage, Intermediary Liability, Featured, Chilling Effect, Section 66A, Article 19(1)(a), Blocking | 2015-03-26T02:01:31Z | Blog Entry

Overview of the Constitutional Challenges to the IT Act
https://cis-india.org/internet-governance/blog/overview-constitutional-challenges-on-itact
<b>There are currently ten cases before the Supreme Court challenging various provisions of the Information Technology Act, the rules made under it, and other laws, that are being heard jointly. Advocate Gopal Sankaranarayanan, who's arguing Anoop M.K. v. Union of India, has put together this chart to help you track what's being challenged in each case.</b>
<table class="tg" style="undefined;table-layout: fixed; border=">
<tr>
<th class="tg-s6z2">PENDING MATTERS</th>
<th class="tg-s6z2">CASE NUMBER</th>
<th class="tg-0ord">PROVISIONS CHALLENGED</th>
</tr>
<tr>
<td class="tg-4eph">Shreya Singhal v. Union of India</td>
<td class="tg-spn1">W.P.(CRL.) NO. 167/2012</td>
<td class="tg-zapm">66A</td>
</tr>
<tr>
<td class="tg-031e">Common Cause & Anr. v. Union of India</td>
<td class="tg-s6z2">W.P.(C) NO. 21/2013</td>
<td class="tg-0ord">66A, 69A & 80</td>
</tr>
<tr>
<td class="tg-4eph">Rajeev Chandrasekhar v. Union of India & Anr.</td>
<td class="tg-spn1">W.P.(C) NO. 23/2013</td>
<td class="tg-zapm">66A & Rules 3(2), 3(3), 3(4) & 3(7) of the Intermediaries Rules 2011</td>
</tr>
<tr>
<td class="tg-031e">Dilip Kumar Tulsidas Shah v. Union of India & Anr.</td>
<td class="tg-s6z2">W.P.(C) NO. 97/2013</td>
<td class="tg-0ord">66A</td>
</tr>
<tr>
<td class="tg-4eph">Peoples Union for Civil Liberties v. Union of India & Ors.</td>
<td class="tg-spn1">W.P.(CRL.) NO. 199/2013</td>
<td class="tg-zapm">66A, 69A, Intermediaries Rules 2011 (s.79(2) Rules) & Blocking of Access of Information by Public Rules 2009 (s.69A Rules)</td>
</tr>
<tr>
<td class="tg-031e">Mouthshut.Com (India) Pvt. Ltd. & Anr. v. Union of India & Ors.</td>
<td class="tg-s6z2">W.P.(C) NO. 217/2013</td>
<td class="tg-0ord">66A & Intermediaries Rules 2011</td>
</tr>
<tr>
<td class="tg-4eph">Taslima Nasrin v. State of U.P & Ors.</td>
<td class="tg-spn1">W.P.(CRL.) NO. 222/2013</td>
<td class="tg-zapm">66A</td>
</tr>
<tr>
<td class="tg-031e">Manoj Oswal v. Union of India & Anr.</td>
<td class="tg-s6z2">W.P.(CRL.) NO. 225/2013</td>
<td class="tg-0ord">66A & 499/500 Indian Penal Code</td>
</tr>
<tr>
<td class="tg-4eph">Internet and Mobile Ass'n of India & Anr. v. Union of India & Anr.</td>
<td class="tg-spn1">W.P.(C) NO. 758/2014</td>
<td class="tg-zapm">79(3) & Intermediaries Rules 2011</td>
</tr>
<tr>
<td class="tg-031e">Anoop M.K. v. Union of India & Ors.</td>
<td class="tg-s6z2">W.P.(CRL.) NO. 196/2014</td>
<td class="tg-0ord">66A, 69A, 80 & S.118(d) of the Kerala Police Act, 2011</td>
</tr>
</table>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/overview-constitutional-challenges-on-itact'>https://cis-india.org/internet-governance/blog/overview-constitutional-challenges-on-itact</a>
</p>
pranesh | IT Act, Court Case, Freedom of Speech and Expression, Intermediary Liability, Constitutional Law, Censorship, Section 66A, Article 19(1)(a), Blocking | 2014-12-19T09:01:50Z | Blog Entry

Learning Forum: Transparency and Human Rights in the Digital Age
https://cis-india.org/internet-governance/news/learning-forum-transparency-and-human-rights-in-the-digital-age
<b>Pranesh Prakash spoke at this event organized by Global Network Initiative on November 6, 2014 in California. </b>
<p style="text-align: justify; ">Pranesh Prakash spoke on transparency reports and their use and abuse in India; the Intermediary Liability Rules in India (and its non-provision of any transparency mechanism); and the need for transparency in private speech regulation, not just governmental speech regulation.</p>
<hr />
<p><img alt="GNI logo" src="https://cdn.evbuc.com/eventlogos/21069154/gnilogo.jpg" title="GNI logo" width="600" /></p>
<p><img alt="Telecom Industry Dialogue" src="https://cdn.evbuc.com/eventlogos/21069154/screenshot20141002at11.11.24am.png" title="ID logos" width="600" /></p>
<p style="text-align: justify; "><span>The Global Network Initiative and the Telecommunications Industry Dialogue on Freedom of Expression and Privacy present:</span></p>
<p style="text-align: justify; "><b>2014 Learning Forum - Silicon Valley </b><br /><b><span>Transparency and Human Rights in the Digital Age</span></b></p>
<p style="text-align: justify; "><span><span>Hosted by LinkedIn </span></span></p>
<p style="text-align: justify; "><b><span><span>Agenda</span></span></b></p>
<p style="text-align: justify; "><b><span><span>1:30PM - Registration</span></span></b></p>
<p style="text-align: justify; "><b><span><span>2:00PM - Opening Remarks</span></span></b></p>
<p style="text-align: justify; "><span><span>Mark Stephens, Independent Chair, Global Network Initiative</span></span></p>
<p style="text-align: justify; "><span style="text-align: center; ">Jeffrey Dygert, Executive Director of Public Policy, AT&T</span></p>
<p style="text-align: justify; "><span style="text-align: center; ">Pablo Chavez, Vice President, Global Public Policy and Government Affairs, LinkedIn</span></p>
<p style="text-align: justify; "><b><i><span><span>2:15PM - Why does transparency matter for protecting and respecting rights online?</span></span></i></b></p>
<p style="text-align: justify; ">Arvind Ganesan, Director of Business and Human Rights, Human Rights Watch</p>
<p style="text-align: justify; ">Deirdre Mulligan, Associate Professor, UC Berkeley School of Information</p>
<p style="text-align: justify; ">Michael Samway, School of Foreign Service, Georgetown University</p>
<p style="text-align: justify; "><b><i><span><span>3:00PM - What is the state of transparency reporting by companies and governments, and what's missing?</span></span></i></b></p>
<p style="text-align: justify; "><span><span>Steve Crown, Vice President and Deputy General Counsel, Microsoft</span></span></p>
<p style="text-align: justify; "><span><span>Jeffrey Dygert, Executive Director of Public Policy, AT&T</span></span></p>
<p style="text-align: justify; "><span><span>Jason Pielemeier, Bureau of Democracy, Human Rights, and Labor, U.S. Department of State</span></span></p>
<p style="text-align: justify; "><span><span>Pranesh Prakash, Policy Director, Centre for Internet & Society, Bangalore </span></span></p>
<p style="text-align: justify; "><span><span>Moderated by Bennett Freeman, Senior Vice President, Sustainability Research and Policy, Calvert Investments</span></span></p>
<p style="text-align: justify; "><b><span><span>4:00PM - Break</span></span></b></p>
<p style="text-align: justify; "><b><i><span><span>4:30PM - How do companies communicate with users in response to live events? </span></span></i></b></p>
<p style="text-align: justify; "><span><span>Ben Blink, Senior Policy Analyst, Free Expression and International Relations, Google</span></span></p>
<p style="text-align: justify; "><span><span>Patrik Hiselius, Senior Advisor, Digital Rights, TeliaSonera</span></span></p>
<p style="text-align: justify; "><span><span>Rebecca MacKinnon, Director, Ranking Digital Rights Project, New America Foundation</span></span></p>
<p style="text-align: justify; "><span><span>Hemanshu Nigam, CEO, SSP Blue</span></span></p>
<p style="text-align: justify; "><span><span>Sana Saleem, Director, Bolo Bhi</span></span></p>
<p style="text-align: justify; "><span><span>Moderated by Cynthia Wong, Senior Internet Researcher, Human Rights Watch</span></span></p>
<p style="text-align: justify; "><b><i>The program will be followed by a reception from 5:30 to 6:30pm.</i></b></p>
<p style="text-align: justify; ">By invitation only, non-transferrable.</p>
<p class="mceContentBody documentContent">Have questions about Learning Forum: Transparency and Human Rights in the Digital Age? <a class="contact_organizer_link js-d-modal" href="#lightbox_contact"> Contact Global Network Initiative </a></p>
<hr />
<p class="mceContentBody documentContent">The original was <a class="external-link" href="https://www.eventbrite.com/e/learning-forum-transparency-and-human-rights-in-the-digital-age-tickets-13387240597">published here</a>.</p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/news/learning-forum-transparency-and-human-rights-in-the-digital-age'>https://cis-india.org/internet-governance/news/learning-forum-transparency-and-human-rights-in-the-digital-age</a>
</p>
praskrishna | Internet Governance, Intermediary Liability | 2014-12-04T16:14:38Z | News Item

National Consultation on Media Law
https://cis-india.org/internet-governance/news/national-consultation-on-media-law
<b>The Law Commission of India and National Law University, Delhi have joined hands to organize the National Consultation on Media Law at the India Habitat Centre in New Delhi on September 27 and 28, 2014. Nehaa Chaudhari participated in this event.</b>
<p>Click to view the:</p>
<ol>
<li><a href="https://cis-india.org/internet-governance/blog/national-consultation-on-media-law-schedule.pdf" class="internal-link">Schedule</a></li>
<li><a href="https://cis-india.org/internet-governance/blog/consultation-paper-media-law.pdf" class="internal-link">Consultation Paper on Media Law</a></li>
<li><a href="https://cis-india.org/internet-governance/blog/overview-of-responses.pdf" class="internal-link">Overview of Responses</a></li>
<li><a href="https://cis-india.org/internet-governance/blog/list-of-useful-sources.pdf" class="internal-link">List of Useful Sources</a></li>
</ol>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/news/national-consultation-on-media-law'>https://cis-india.org/internet-governance/news/national-consultation-on-media-law</a>
</p>
praskrishna | Internet Governance, Intermediary Liability, Privacy | 2014-09-30T06:52:50Z | News Item

Centre for Internet and Society joins the Dynamic Coalition for Platform Responsibility
https://cis-india.org/internet-governance/blog/cis-joins-dynamic-coalition-for-platform-responsibility
<b>The Centre for Internet and Society (CIS) has joined this multistakeholder cooperative engagement towards creating Due Diligence Recommendations for online platforms and Model Contractual Provisions to be enshrined in ToS. This blog provides a brief background of the role of dynamic coalitions within the IGF structure, establishes the need for the coalition, and provides an update on the action plan and next steps for interested stakeholders.</b>
<p class="callout" style="text-align: justify; ">"Identify emerging issues, bring them to the attention of the relevant bodies and the general public, and, where appropriate, make recommendations."<br />Tunis Agenda (Para 72.g)</p>
<p style="text-align: justify; ">The first United Nations Internet Governance Forum (IGF), in 2006 saw the emergence of the concept of Dynamic Coalition and a number of coalitions have been established over the years. The IGF is structured to bring together multistakeholder groups to,</p>
<p class="callout" style="text-align: justify; ">"Discuss public policy issues related to key elements of Internet governance in order to foster the sustainability, robustness, security, stability and development of the Internet."<br />Tunis Agenda (Para 72.a)</p>
<p style="text-align: justify; ">While IGF workshops allow various stakeholders to jointly analyse "hot topics" or to examine progress that such issues have undertaken since the previous IGF, dynamic coalitions are informal, issue-specific groups comprising members of various stakeholder groups. With no strictures upon the objects, structure or processes of dynamic coalitions claiming association with the IGF, and no formal institutional affiliation, nor any access to the resources of the IGF Secretariat, IGF Dynamic Coalitions allow collaboration of anyone interested in contributing to their discussions. Currently, there are eleven active dynamic coalitions at the IGF and can be divided into three distinct types—networks, working groups and Birds of Feather (BOFs).</p>
<p style="text-align: justify; ">Workshops at the IGF are content specific events that, though valuable in informing participants, are limited in their impact by being confined to the launch of a report or by the issues raised within the conference room. The coalitions on the other hand are expected to have a broader function, acting as a coalescing point for interested stakeholders to gather and analyse progress around identified issues and plan next steps. The coalitions can also make recommendations around issues, however, no mechanism has been developed so far, by which the recommendations can be considered by the plenary body. The long-term nature of coalition is perhaps, most suited to engage stakeholders in heterogeneous groups, towards understanding and cooperating around emerging issues and to make recommendations to inform policy making.</p>
<h3 style="text-align: justify; ">Platform Responsibility</h3>
<p style="text-align: justify; ">Social networks and other interactive online services, give rise to 'cyber-spaces' where individuals gather, express their personalities and exchange information and ideas. The transnational and private nature of such platforms means that they are regulated through contractual provisions enshrined in the platforms' Terms of Service (ToS). The provisions delineated in the ToS not only extend to users in spite of their geographical location, the private decisions undertaken by platform providers in implementing the ToS are not subject to constitutional guarantees framed under national jurisdictions.</p>
<p style="text-align: justify; ">While ToS serve as binding agreement online, an absence of binding international rules in this area despite the universal nature of human rights represented is a real challenge, and makes it necessary to engage in a multistakeholder effort to produce model contractual provisions that can be incorporated in ToS. The concept of 'platform responsibility' aims to stimulate behaviour in platform providers to provide intelligible and solid mechanisms, in line with the principles laid out by the UN Guiding Principles on Business and Human Rights and equip platform users with common and easy-to-grasp tools to guarantee the full enjoyment of their human rights online. The utilisation of model contractual provisions in ToS may prove instrumental in fostering trust in online services for content production, use and dissemination, increasing demand of services and ultimately consumer demand may drive the market towards human rights compliant solutions.</p>
<h3 style="text-align: justify; ">The Dynamic Coalition on Platform Responsibility</h3>
<p style="text-align: justify; ">To nurture a multi-stakeholder endeavour aimed at the elaboration of model contractual-provisions, Mr. Luca Belli, Council of Europe / Université Paris II, Ms Primavera De Filippi, CNRS / Berkman Center for Internet and Society and Mr Nicolo Zingales, Tilburg University / Center for Technology and Society Rio, initiated and facilitated the creation of the Dynamic Coalition on Platform Responsibility (DCPR). DCPR has over fifty individual and organisational members from civil society organisations, academia, private sector organisations and intergovernmental organisations and held its first meeting at the IGF in Istanbul. The meeting began with an overview of the concept of platform responsibility, highlighting relevant initiatives from Council of Europe, Global Network Initiative, Ranking Digital Rights and the Center for Democracy and Technology have undertaken in this regard. Existing issues such as difficulty in comprehension and lack of standardization of redress across rights were raised along with the fundamental lack of due process in terms of transparency across existing mechanisms.</p>
<p style="text-align: justify; ">Online platforms compliance to human rights is often framed around the duty of States to protect human rights and often, Internet companies do not sufficient consideration of the effects of their business practices on users fundamental rights undermining trust.</p>
<p style="text-align: justify; ">The meeting focused it efforts with a call to identify issues of process and substance and specific rights and challenges to be addressed by the DCPR. The procedural issues raised concerned 'responsibility' in decision-making e.g., giving users the right to be heard and an effective remedy before an impartial decision-making body, and obtaining their consent for changes in the contractual terms. The concerns raised around substantive rights such as privacy and freedom of expression eg., disclosure of personal information and content removal and need to promote 'responsibility' through establishing concrete mechanisms to deal with such issues.</p>
<p style="text-align: justify; ">It was suggested that concept of responsibility including in case of conflict between different rights could be grounded in Human Rights case law eg., from European Court of Human Rights jurisprudence. It was also established that any framework that would evolve from this coalition would consider the distinction between users (eg., adults, children, and people with or without continuous access to the Internet) and platforms (eg., in terms of size and functionality).</p>
<h3 style="text-align: justify; ">Action Plan</h3>
<p style="text-align: justify; ">The participants at the DCPR meeting agreed to establish a multistakeholder cooperative engagement amidst stakeholders that will go beyond dialogue and produce concrete proposals. Particularly, participants suggested developing:</p>
<ol>
<li style="text-align: justify; ">Due Diligence Recommendations: Recommendations to online platforms with regard to processes of compliance with internationally agreed human rights standards.</li>
<li style="text-align: justify; ">Model Contractual Provisions: Elaboration of a set of principles and provisions protecting platform users’ rights and guaranteeing transparent mechanisms to seek redress in case of violations.</li>
</ol>
<p style="text-align: justify; ">DCPR will ground the development of these frameworks in the preliminary step of compilation of existing projects and initiatives dealing with the analysis of ToS compatibility with human rights standards. Members, participants and interested stakeholders are invited to highlight and share relevant initiatives by 10th October regarding:</p>
<ol>
<li>Processes of due diligence for human rights compliance;</li>
<li>The evaluation of ToS compliance with human rights standards.</li>
</ol>
<p style="text-align: justify; ">Further to this compilation, a first recommendation draft regarding online platforms' due diligence will be circulated on the mailing list by 30th October 2014. CIS will be contributing to the drafting which will be led and elaborated by the DCPR coordinators. This draft will be open for comments via the DCPR mailing list until 30th November 2014 and we encourage you to sign up to the mailing list (<a class="external-link" href="http://lists.platformresponsibility.info/listinfo/dcpr">http://lists.platformresponsibility.info/listinfo/dcpr</a>).<br /><br />A second draft will be developed compiling the comments expressed via the mailing-list and shared for comments by 10 December 2014. The final version of the recommendation will be drafted by 30 December. Subsequently, the first set of model contractual provisions will be elaborated building upon such recommendation. A call for inputs will be issued in order to gather suggestions on the content of these provisions.</p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/cis-joins-dynamic-coalition-for-platform-responsibility'>https://cis-india.org/internet-governance/blog/cis-joins-dynamic-coalition-for-platform-responsibility</a>
</p>
No publisher · jyoti · Human Rights · Privacy · Internet Governance Forum · Data Protection · Terms of Service · Internet Governance · Platform Responsibility · Intermediary Liability · 2014-10-07T10:54:03Z · Blog Entry

Report on CIS' Workshop at the IGF: 'An Evidence Based Framework for Intermediary Liability'
https://cis-india.org/internet-governance/report-on-cis-workshop-at-igf
<b>The workshop 'An evidence based framework for intermediary liability' was organised to present evidence and discuss ongoing research on the changing definition, function and responsibilities of intermediaries across jurisdictions.</b>
<p style="text-align: justify; ">The discussion from the workshop will contribute to a comprehensible framework for liability, consistent with the capacity of the intermediary and with international human-rights standards.</p>
<p style="text-align: justify; ">Electronic Frontier Foundation (USA), Article 19 (UK) and Centre for Internet and Society (India) have come together towards the development of best practices and principles related to the regulation of online content through intermediaries. The nine principles are: Transparency, Consistency, Clarity, Mindful Community Policy Making, Necessity and Proportionality in Content Restrictions, Privacy, Access to Remedy, Accountability, and Due Process in both Legal and Private Enforcement. The workshop discussion will contribute to a comprehensible framework for liability that is consistent with the capacity of the intermediary and with international human-rights standards. The session was hosted by Centre for Internet and Society (India) and Centre for Internet and Society, Stanford (USA) and attended by 7 speakers and 40 participants.</p>
<p style="text-align: justify; ">Jeremy Malcolm, Senior Global Policy Analyst EFF kicked off the workshop highlighting the need to develop a liability framework for intermediaries that is derived out of an understanding of their different functions, their role within the economy and their impact on human rights. He went on to structure the discussion which would follow to focus on ongoing projects and examples that highlight central issues related to gathering and presenting evidence to inform the policy space.</p>
<p style="text-align: justify; ">Martin Husovec from the International Max Planck Research School for Competition and Innovation, began his presentation, tracking the development of safe harbour frameworks within social contract theory. Opining that safe harbour was created as a balancing mechanism between a return of investments of the right holders and public interest for Internet as a public space, he introduced emerging claims that technological advancement have altered this equilibrium. Citing injunctions and private lawsuits as instruments, often used against law abiding intermediaries, he pointed to the problem within existing liability frameoworks, where even intermediaries, who diligently deal with illegitimate content on their services, can be still subject to a forced cooperation to the benefit of right holders. He added that for liability frameworks to be effective, they must keep pace with advances in technology and are fair to right holders and the public interest.</p>
<p style="text-align: justify; ">He also pointed that in any liability framework because the ‘law’ that prescribes an interference, must be always sufficiently clear and foreseeable, as to both the meaning and nature of the applicable measures, so it sufficiently outlines the scope and manner of exercise of the power of interference in the exercise of the rights guaranteed. He illustrated this with the example of the German Federal Supreme Court attempts with Wi-Fi policy-making in 2010. He also raised issues of costs of uncertainty in seeking courts as the only means to balance rights as they often, do not have the necessary information. Similarly, society also does not benefit from open ended accountability of intermediaries and called for a balanced approach to regulation.</p>
<p style="text-align: justify; ">The need for consistency in liability regimes across jurisdictions, was raised by Giancarlo Frosio, Intermediary Liability Fellow at Stanford's Centre for Internet and Society. He introduced the World Intermediary Liability Map, a project mapping legislation and case law across 70 countries towards creating a repository of information that informs policymaking and helps create accountability. Highlighting key takeaways from his research, he stressed the necessity of having clear definitions in the field of intermediary liability and the need to develop taxonomy of issues to deepen our understanding of the issues at stake towards an understanding of type of liability appropriate for a particular jurisdiction.</p>
<p style="text-align: justify; ">Nicolo Zingales, Assistant Professor of Law at Tilburg University highlighted the need for due process and safeguards for human rights and called for more user involvement in systems that are in place in different countries to respond to requests of takedown. Presenting his research findings, he pointed to the imbalance in the way notice and takedown regimes are structured, where content is taken down presumptively, but the possibility of restoring user content is provided only at a subsequent stage or not at all in many cases. He cited several examples of enhancing user participation in liability mechanisms including notice and notice, strict litigation sanction inferring the knowledge that the content might have been legal and shifting the presumption in favor of the users and the reverse notice and takedown procedure. He also raised the important question, if multistakeholder cooperation is sufficient or adequate to enable the users to have a say and enter as part of the social construct in this space? Reminding the participants of the failure of the multistakeholder agreement process regarding the cost for the filters in the UK, that would be imposed according to judicial procedure, he called for strengthening our efforts to enable users to get more involved in protecting their rights online.</p>
<p style="text-align: justify; ">Gabrielle Guillemin from Article 19 presented her research on the types of intermediaries and models of liability in place across jurisdictions. Pointing to the problems associated with intermediaries having to monitor content and determine legality of content, she called for procedural safeguards and stressed the need to place the dispute back in the hands of users and content owners and the person who has written the content rather than the intermediary. She goes on to provide some useful and practically-grounded solutions to strengthen existing takedown mechanisms including, adding details to the notices, introducing fees in order to extend the number of claims that are made and defining procedure regards criminal content.</p>
<p style="text-align: justify; ">Elonnai Hickok introduced CIS' research to the UNESCO report Fostering Freedom Online: the Role of Internet Intermediaries, comparing a range of liability models in different stages of development and provisions across jurisdictions. She argued for a liability framework that tackles procedural and regulatory uncertainty, lack of due process, lack of remedy and varying content criteria.</p>
<p style="text-align: justify; ">Francisco Vera, Advocacy Director, Derechos Digitales from Chile raised issues related to mindful community policy-making expounding on Chile's implementation of intermediary liability obligation with the USA, the introduction of judicial oversight under Chilean legislation which led to US objection to Chile on grounds of not fulfilling their standards in terms of Internet property protection. He highlighted the tensions that arise in balancing the needs of the multiple communities and interests engaged over common resources and stressed the need for evidence in policy-making to balance the needs of rights holders and public interest. He stressed the need for evidence to inform policy-making and ensure it keeps pace with technological developments citing the example of the ongoing Transpacific Partnership Agreement negotiations that call for exporting provisions DMCA provisions to 11 countries even though there is no evidence of the success of the system for public interest. He concluded by cautioning against the development of frameworks that are or have the potential to be used as anti-competitive mechanisms that curtail innovation and therby do not serve public interest.</p>
<p style="text-align: justify; ">Malcolm Hutty associated with the European Internet Service Providers Association, Chair of the Intermediary Reliability Committee and London Internet Exchange brought in the intermediaries' perspective into the discussion. He argued for challenging the link between liability and forced cooperation, understated the problems arising from distinction without a difference and incentives built in within existing regimes. He raised issues arising from the expectancy on the part of those engaged in pre-emptive regulation of unwanted or undesirable content for intermediaries to automate content. Pointing to the increasing impact of intermediaries in our lives he underscored how exposing vast areas of people's lives to regulatory enforce, which enhances power of the state to implement public policy in the public interest and expect it to be executed, can have both positive and negative implications on issues such as privacy and freedom of expression.</p>
<p style="text-align: justify; ">He called out practices in regulatory regimes that focus on one size fits all solutions such as seeking automating filters on a massive scale and instead called for context and content specific solutions, that factor the commercial imperatives of intermediaries. He also addressed the economic consequences of liability frameworks to the industry including cost effectiveness of balancing rights, barriers to investments that arise in heavily regulated or new types of online services that are likely to be the targeted for specific enforcement measures and the long term costs of adapting old enforcement mechanisms that apply, while networks need to be updated to extend services to users.</p>
<p style="text-align: justify; ">The workshop presented evidence of a variety of approaches and the issues that arise in applying those approaches to impose liability on intermediaries. Two choices emerged towards developing frameworks for enforcing responsibility on intermediaries. We could either rely on a traditional approach, essentially court-based and off-line mechanisms for regulating behaviour and disputes. The downside of this is it will be slow and costly to the public purse. In particular, we will lose a great deal of the opportunity to extend regulation much more deeply into people's lives so as to implement the public interest.<br /><br />Alternatively, we could rely on intermediaries to develop and automate systems to control our online behaviour. While this approach does not suffer from efficiency problems of the earlier approach it does lack, both in terms of hindering the developments of the Information Society, and potentially yielding up many of the traditionally expected protections under a free and liberal society. The right approach lies somewhere in the middle and development of International Principles for Intermediary Liability, announced at the end of the workshop, is a step closer to the developing a balanced framework for liability.</p>
<hr />
<p>See the <a class="external-link" href="http://www.intgovforum.org/cms/174-igf-2014/transcripts/1968-2014-09-03-ws206-an-evidence-based-liability-policy-framework-room-5">transcript on IGF website</a>.</p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/report-on-cis-workshop-at-igf'>https://cis-india.org/internet-governance/report-on-cis-workshop-at-igf</a>
</p>
No publisher · jyoti · Privacy · Freedom of Speech and Expression · Internet Governance Forum · Internet Governance · Intermediary Liability · 2014-09-24T10:47:30Z · Blog Entry

Zero Draft of Content Removal Best Practices White Paper
https://cis-india.org/internet-governance/blog/zero-draft-of-content-removal-best-practices-white-paper
<b>The EFF and CIS Intermediary Liability Project is aimed at the creation of a set of principles for intermediary liability, in consultation with groups of Internet-focused NGOs and the academic community.</b>
<p style="text-align: justify; ">The draft paper has been created to frame the discussion and will be made available for public comments and feedback. The draft document and the views represented here are not representative of the positions of the organisations involved in the drafting.</p>
<p style="text-align: justify; "><a class="external-link" href="http://tinyurl.com/k2u83ya">http://tinyurl.com/k2u83ya</a></p>
<p style="text-align: justify; ">3 September 2014</p>
<h2 style="text-align: justify; ">Introduction</h2>
<p style="text-align: justify; ">The purpose of this white paper is to frame the discussion at several meetings between groups of Internet-focused NGOs that will lead to the creation of a set of principles for intermediary liability.</p>
<p style="text-align: justify; ">The principles that develop from this white paper are intended as a civil society contribution to help guide companies, regulators and courts, as they continue to build out the legal landscape in which online intermediaries operate. One aim of these principles is to move towards greater consistency with regards to the laws that apply to intermediaries and their application in practice.</p>
<p style="text-align: justify; ">There are three general approaches to intermediary liability that have been discussed in much of the recent work in this area, including CDT’s 2012 report called “Shielding the Messengers: Protecting Platforms for Expression and Innovation.” The CDT’s 2012 report divides approaches to intermediary liability into three models: 1. Expansive Protections Against Liability for Intermediaries, 2. Conditional Safe Harbor from Liability, 3. Blanket or Strict Liability for Intermediaries.<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt1"><sup>[1]</sup></a></p>
<p style="text-align: justify; ">This white paper argues in the alternative that (a) the “expansive protections against liability” model is preferable, but likely not possible given the current state of play in the legal and policy space (b) therefore the white paper supports “conditional safe harbor from liability” operating via a ‘notice-to-notice’ regime if possible, and a ‘notice and action’ regime if ‘notice-to-notice’ is deemed impossible, and finally (c) all of the other principles discussed in this white paper should apply to whatever model for intermediary liability is adopted unless those principles are facially incompatible with the model that is finally adopted.</p>
<p style="text-align: justify; ">As further general background, this white paper works from the position that there are three general types of online intermediaries- Internet Service Providers (ISPs), search engines, and social networks. As outlined in the recent draft UNESCO Report (from which this white paper draws extensively);</p>
<p style="text-align: justify; ">“With many kinds of companies operating many kinds of products and services, it is important to clarify what constitutes an intermediary. In a 2010 report, the Organization for Economic Co-operation and Development (OECD) explains that Internet intermediaries “bring together or facilitate transactions between third parties on the Internet. They give access to, host, transmit and index content, products and services originated by third parties on the Internet or provide Internet-based services to third parties.”</p>
<p style="text-align: justify; ">Most definitions of intermediaries explicitly exclude content producers. The freedom of expression advocacy group Article 19 distinguishes intermediaries from “those individuals or organizations who are responsible for producing information in the first place and posting it online.” Similarly, the Center for Democracy and Technology explains that “these entities facilitate access to content created by others.” The OECD emphasizes “their role as ‘pure’ intermediaries between third parties,” excluding “activities where service providers give access to, host, transmit or index content or services that they themselves originate.” These views are endorsed in some laws and court rulings. In other words, publishers and other media that create and disseminate original content are not intermediaries. Examples of such media entities include a news website that publishes articles written and edited by its staff, or a digital video subscription service that hires people to produce videos and disseminates them to subscribers.</p>
<p style="text-align: justify; ">For the purpose of this case study we will maintain that intermediaries offer services that host, index, or facilitate the transmission and sharing of content created by others. For example, Internet Service Providers (ISPs) connect a user’s device, whether it is a laptop, a mobile phone or something else, to the network of networks known as the Internet. Once a user is connected to the Internet, search engines make a portion of the World Wide Web accessible by allowing individuals to search their database. Search engines are often an essential go-between between websites and Internet users. Social networks connect individual Internet users by allowing them to exchange messages, photos, videos, as well as by allowing them to post content to their network of contacts, or the public at large. Web hosting providers, in turn, make it possible for websites to be published and to be accessed online.”<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt2"><sup>[2]</sup></a></p>
<h2 style="text-align: justify; ">General Principles for ISP Governance - Content Removals</h2>
<p style="text-align: justify; ">The discussion that follows below outlines nine principles to guide companies, government, and civil society in the development of best practices related to the regulation of online content through intermediaries, as norms, policies, and laws develop in the coming years. The nine principles are: Transparency, Consistency, Clarity, Mindful Community Policy Making, Necessity and Proportionality in Content Restrictions, Privacy, Access to Remedy, Accountability, and Due Process in both Legal and Private Enforcement. Each principle contains subsections that expand upon the theme of the principle to cover more specific issues related to the rights and responsibilities of online intermediaries, government, civil society, and users.</p>
<h3 style="text-align: justify; ">Principle I: Transparency</h3>
<p style="text-align: justify; ">“Transparency enables users’ right to privacy and right to freedom of expression. Transparency of laws, policies, practices, decisions, rationale, and outcomes related to privacy and restrictions allow users to make informed choices with respect to their actions and speech online. As such - both governments and companies have a responsibility in ensuring that the public is informed through transparency initiatives.” <a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt3"><sup>[3]</sup></a></p>
<p style="text-align: justify; "><b>Government Transparency</b></p>
<ul style="text-align: justify; ">
<li>In general, governments should publish transparency reports:</li>
</ul>
<p style="text-align: justify; ">As part of the democratic process, the citizens of each country have a right to know how their government is applying its laws, and a right to provide feedback about the government’s legal interpretations of its laws. Thus, all governments should be required to publish online transparency reports that provide information about all requests issued by any branch or agency of government for the removal or restriction of online content. Further, governments should allow for the submission of comments and suggestions by a webform hosted on the same webpage where that government’s transparency report is hosted. There should also be some legal mechanism that requires the government to look at the feedback provided by its citizens, ensure that relevant feedback is passed along to legislative bodies, and provide for action to be taken on the citizen-provided feedback where appropriate. Finally, and where possible, the raw data that constitutes each government’s transparency report should be made available online, for free, in a common file format such as .csv, so that civil society may have easy access to it for research purposes.</p>
<li style="text-align: justify; ">Governments should be more transparent about content orders that they impose on ISPs<br />The legislative process proceeds most effectively when the government knows how the laws that it creates are applied in practice and is able to receive feedback from the public about how those laws should change further, or remain the same. Relatedly, regulation of the Internet is most effective when the legislative and judicial branches are aware of what the other is doing. For all of these reasons, governments should publish information about all of the court orders and executive requests for content removals that they send to online intermediaries. Publishing all of this information in one place necessarily requires that some single entity within the government collects the information, which will have the benefits of giving the government a holistic view of how it is regulating the internet, encouraging dialogue between different branches of government about how best to create and enforce internet content regulation, and encouraging dialogue between the government and its citizens about the laws that govern internet content and their application. </li>
<li style="text-align: justify; ">Governments should make the compliance requirements they impose on ISPs public<br />Each government should maintain a public website that publishes as complete a picture as possible of the content removal requests made by any branch of that government, including the judicial branch. The availability of a public website of this type will further many of the goals and objectives discussed elsewhere in this section. The website should be biased towards high levels of detail about each request and towards disclosure that requests were made, subject only to limited exceptions for compelling public policy reasons, where the disclosure bias conflicts directly with another law, or where disclosure would reveal a user’s PII. The information should be published periodically, ideally more than once a year. The general principle should be: the more information made available, the better. On the same website where a government publishes its ‘Transparency Report,’ that government should attempt to provide a plain-language description of its various laws related to online content, to provide users notice about what content is lawful vs. unlawful, as well as to show how the laws that it enacts in the Internet space fit together. Further, and as discussed in section “b,” infra, government should provide citizens with an online feedback mechanism so that they may participate in the legislative process as it applies to online content.</li>
<li style="text-align: justify; ">Governments should give their citizens a way to provide input on these policies<br />Private citizens should have the right to provide feedback on the balancing between their civil liberties and other public policies such as security that their government engages in on their behalf. If and when these policies and the compliance requirements they impose on online intermediaries are made publicly available online, there should also be a feedback mechanism built into the site where this information is published. This public feedback mechanism could take a number of different forms, like, for example, a webform that allowed users to indicate their level of satisfaction with prevailing policy choices by choosing amongst several radio buttons, while also providing open text fields to allow the user to submit clarifying comments and specific suggestions. In order to be effective, this online feedback mechanism would have to be accompanied by some sort of legal and budgetary apparatus that would ensure that the feedback was monitored and given some minimum level of deference in the discussions and meetings that led to new policies being created.</li>
<p style="text-align: justify; ">Government should meet users concerned about its content policies in the online domain. Internet users, as citizens of both the internet and the country their country of origin, have a natural interest in defining and defending their civil liberties online; government should meet them there to extend the democratic process to the Internet. Denying Internet users a voice in the policymaking processes that determine their rights undermines government credibility and negatively influences users’ ability to freely share information online. As such, content policies should be posted in general terms online and users should have the ability to provide input on those policies online.</p>
<p style="text-align: justify; "><b>ISP Transparency</b><br />“The transparency practices of a company impact users’ freedom expression by providing insight into the scope of restriction that is taking in place in specific jurisdiction. Key areas of transparency for companies include: specific restrictions, aggregate numbers related to restrictions, company imposed regulations on content, and transparency of applicable law and regulation that the service provider must abide by.”<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt4"><sup>[4]</sup></a></p>
<p style="text-align: justify; ">“Disclosure by service providers of notices received and actions taken can provide an important check against abuse. In addition to providing valuable data for assessing the value and effectiveness of a N&A system, creating the expectation that notices will be disclosed may help deter fraudulent or otherwise unjustified notices. In contrast, without transparency, Internet users may remain unaware that content they have posted or searched for has been removed pursuant due to a notice of alleged illegality. Requiring notices to be submitted to a central publication site would provide the most benefit, enabling patterns of poor quality or abusive notices to be readily exposed.”<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt5"><sup>[5]</sup></a> Therefore, ISPs at all levels should publish transparency reports that include:</p>
<ul style="text-align: justify; ">
<li>Government Requests</li>
</ul>
<p style="text-align: justify; ">All requests from government agencies and courts should be published in a periodic transparency report, accessible on the intermediary’s website, that publishes information about the requests the intermediary received and what the intermediary did with them in the highest level of detail that is legally possible. The more information that is provided about each request, the better the understanding that the public will have about how laws that affect their rights online are being applied. That said, steps should be taken to prevent the disclosure of personal information in relation to the publication of transparency reports. Beyond redaction of personal information, however, the maximum amount of information about each request should be published, subject as well to the (ideally minimal) restrictions imposed by applicable law. A thorough Transparency Report published by an ISP or online intermediary should include information about the following categories of requests:</p>
<li style="text-align: justify; ">Police and/or Executive Requests<br />This category includes all requests to the intermediary from an agency that is wholly a part of the national government; from police departments, to intelligence agencies, to school boards from small towns. Surfacing information about all requests from any part of the government helps to avoid corruption and/or inappropriate exercises of governmental power by reminding all government officials, regardless of their rank or seniority, that information about the requests they submit to online intermediaries is subject to public scrutiny. </li>
<li style="text-align: justify; ">Court Orders<br />This category includes all orders issued by courts and signed by a judicial officer. It can include ex-parte orders, default judgments, court orders directed at an online intermediary, or court orders directed at a third party presented to the intermediary as evidence in support of a removal request. To the extent legally possible, detailed information should be published about these court orders detailing the type of court order each request was, its constituent elements, and the actions(s) that the intermediary took in response to it. All personally identifying information should be redacted from any court orders that are published by the intermediary as part of a transparency report before publication.</li>
<li style="text-align: justify; ">First Party<br />Information about court orders should be further broken down into two groups; first party and third party. First party court orders are orders directed at the online intermediary in an adversarial proceeding to which the online intermediary was a party.</li>
<li style="text-align: justify; ">Third Party<br />As mentioned above, ‘third party’ refers to court orders that are not directed at the online intermediary, but rather a third party such as an individual user who posted an allegedly defamatory remark on the intermediary’s platform. If the user who obtains a court order approaches an online intermediary seeking removal of content with a court order directed at the poster of, say, the defamatory content, and the intermediary decides to remove the content in response to the request, the online intermediary that decided to perform the takedown should publish a record of that removal. To be accepted by an intermediary, third party court orders should be issued by a court of appropriate jurisdiction after an adversarial legal proceeding, contain a certified and specific statement that certain content is unlawful, and specifically identify the content that the court has found to be unlawful, by specific, permalinked URL where possible.</li>
<p style="text-align: justify; ">This type of court order should be broken out separately from court orders directed at the applicable online intermediary in companies’ transparency reports because merely providing aggregate numbers that do not distinguish between the two types gives an inaccurate impression to users that a government is attempting to censor more content than it actually is. The idea of including first party court orders to remove content as a subcategory of ‘government requests’ is that a government’s judiciary speaks on behalf of the government, making determinations about what is permitted under the laws of that country. This analogy does not hold for court orders directed at third parties- when the court made its determination of legality on the content in question, it did not contemplate that the intermediary would remove the content. As such, the court likely did not weigh the relevant public interest and policy factors that would include the importance of freedom of expression or the precedential value of its decision. Therefore, the determination does not fairly reflect an attempt by the government to censor content and should not be considered as such.</p>
<p style="text-align: justify; ">Instead, and especially considering that these third party court order may be the basis for a number of content removals, third party court orders should be counted separately and presented with some published explanation in the company’s transparency report as to what they are and why the company has decided it should removed content pursuant to its receipt of one.</p>
<p style="text-align: justify; "><b>Private-Party Requests</b><br />Private-party requests are requests to remove content that are not issued by a government agency or accompanied by a court order. Some examples of private party requests include copyright complaints submitted pursuant to the Digital Millennium Copyright Act or complaints based on the laws of specific countries, such as laws banning holocaust denial in Germany.</p>
<p style="text-align: justify; "><b>Policy/TOS Enforcement</b><br />To give users a complete picture of the content that is being removed from the platforms that they use, corporate transparency reports should also provide information about the content that the intermediary removes pursuant to its own policies or terms of service, though there may not be a legal requirement to do so.</p>
<p style="text-align: justify; "><b>User Data Requests</b><br />While this white paper is squarely focused on liability for content posted online and best practices for deciding when and how content should be removed from online services, corporate transparency reports should also provide information about requests for user data from executive agencies, courts, and others.</p>
<h3 style="text-align: justify; ">Principle II: Consistency</h3>
<li style="text-align: justify; ">Legal requirements for ISPs should be consistent, based on a global legal framework that establishes baseline limitations on legal immunity<br />Broad variation amongst the legal regimes of the countries in which online intermediaries operate increases compliance costs for companies and may discourage them from offering their services in some countries due to the high costs of localized compliance. Reducing the number of speech platforms that citizens have access to limits their ability to express themselves. Therefore, to ensure that citizens of a particular country have access to a robust range of speech platforms, each country should work to harmonize the requirements that it imposes upon online intermediaries with the requirements of other countries. While a certain degree of variation between what is permitted in one country as compared to another is inevitable, all countries should agree on certain limitations to intermediary liability, such as the following: </li>
<li style="text-align: justify; ">Conduits should be immune from claims about content that they neither created nor modified<br />As noted in the 2011 Joint Declaration on Freedom of Expression and the Internet, “[n]o one who simply provides technical Internet services such as providing access, or searching for, or transmission or caching of information, should be liable for content generated by others, which is disseminated using those services, as long as they do not specifically intervene in that content or refuse to obey a court order to remove that content, where they have the capacity to do so (‘mere conduit principle’).”<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt6"><sup>[6]</sup></a></li>
<li style="text-align: justify; ">Court orders should be required for the removal of content that is related to speech, such as defamation removal requests<br />In the Center for Democracy and Technology’s Additional Responses Regarding Notice and Action, CDT outlines the case against allowing notice and action procedures to apply to defamation removal requests. They write: </li>
<p style="text-align: justify; ">“Uniform notice-and-action procedures should not apply horizontally to all types of illegal content. In particular, CDT believes notice-and-takedown is inappropriate for defamation and other areas of law requiring complex legal and factual questions that make private notices especially subject to abuse. Blocking or removing content on the basis of mere allegations of illegality raises serious concerns for free expression and access to information. Hosts are likely to err on the side of caution and comply with most if not all notices they receive, because evaluating notices is burdensome and declining to comply may jeopardize their protection from liability. The risk of legal content being taken down is especially high in cases where assessing the illegality of the content would require detailed factual analysis and careful legal judgments that balance competing fundamental rights and interests. Intermediaries will be extremely reluctant to exercise their own judgment when the legal issues are unclear, and it will be easy for any party submitting a notice to claim a good faith belief that the content in question is unlawful. In short, the murkier the legal analysis, the greater the potential for abuse.</p>
<p style="text-align: justify; ">To reduce this risk, removal of or disablement of access to content based on unadjudicated allegations of illegality (i.e., notices from private parties) should be limited to cases where the content at issue is manifestly illegal – and then only with necessary safeguards against abuse as described above.</p>
<p style="text-align: justify; ">CDT believes that online free expression is best served by narrowing what is considered manifestly illegal and subject to takedown upon private notice. With proper safeguards against abuse, for example, notice-and-action can be an appropriate policy for addressing online copyright infringement. Copyright is an area of law where there is reasonable international consensus regarding what is illegal and where much infringement is straightforward. There can be difficult questions at the margins – for example concerning the applicability of limitations and exceptions such as “fair use” – but much online infringement is not disputable.</p>
<p style="text-align: justify; ">Quite different considerations apply to the extension of notice-and-action procedures to allegations of defamation or other illegal content. Other areas of law, including defamation, routinely require far more difficult factual and legal determinations. There is greater potential for abuse of notice-and-action where illegality is less manifest and more disputable. If private notices are sufficient to have allegedly defamatory content removed, for example, any person unhappy about something that has been written about him or her would have the ability and incentive to make an allegation of defamation, creating a significant potential for unjustified notices that harm free expression. This and other areas where illegality is more disputable require different approaches to notice and action. In the case of defamation, CDT believes “notice” for purposes of removing or disabling access to content should come only from a competent court after full adjudication.</p>
<p style="text-align: justify; ">In cases where it would be inappropriate to remove or disable access to content based on untested allegations of illegality, service providers receiving allegations of illegal content may be able to take alternative actions in response to notices. Forwarding notices to the content provider or preserving data necessary to facilitate the initiation of legal proceedings, for example, can pose less risk to content providers’ free expression rights, provided there is sufficient process to allow the content provider to challenge the allegations and assert his or her rights, including the right to speak anonymously.”<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt7"><sup>[7]</sup></a></p>
<h3 style="text-align: justify; ">Principle III: Clarity</h3>
<li style="text-align: justify; ">All notices that request the removal of content should be clear and meet certain minimum requirements<br />The Center for Democracy and Technology outlined requirements for clear notices in a notice and action system in response a European Commission public comment period on a revised notice and action regime.<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt8"><sup>[8]</sup></a> They write:</li>
<p style="text-align: justify; ">“Notices should include the following features:</p>
<ol style="text-align: justify; ">
<li>Specificity. Notices should be required to specify the exact location of the material – such as a specific URL – in order to be valid. This is perhaps the most important requirement, in that it allows hosts to take targeted action against identified illegal material without having to engage in burdensome search or monitoring. Notices that demand the removal of particular content wherever it appears on a site without specifying any location(s) are not sufficiently precise to enable targeted action. </li>
<li>Description of alleged illegal content. Notices should be required to include a detailed description of the specific content alleged to be illegal and to make specific reference to the law allegedly being violated. In the case of copyright, the notice should identify the specific work or works claimed to be infringed. </li>
<li>Contact details. Notices should be required to contain contact information for the sender. This facilitates assessment of notices’ validity, feedback to senders regarding invalid notices, sanctions for abusive notices, and communication or legal action between the sending party and the poster of the material in question. </li>
<li>Standing: Notices should be issued only by or on behalf of the party harmed by the content. For copyright, this would be the rightsholder or an agent acting on the rightsholderʼs behalf. For child sexual abuse images, a suitable issuer of notice would be a law enforcement agency or a child abuse hotline with expertise in assessing such content. For terrorism content, only government agencies would have standing to submit notice. </li>
<li>Certification: A sender of a notice should be required to attest under legal penalty to a good-faith belief that the content being complained of is in fact illegal; that the information contained in the notice is accurate; and, if applicable, that the sender either is the harmed party or is authorized to act on behalf of the harmed party. This kind of formal certification requirement signals to notice-senders that they should view misrepresentation or inaccuracies on notices as akin to making false or inaccurate statements to a court or administrative body. </li>
<li>Consideration of limitations, exceptions, and defenses: Senders should be required to certify that they have considered in good faith whether any limitations, exceptions, or defenses apply to the material in question. This is particularly relevant for copyright and other areas of law in which exceptions are specifically described in law. </li>
<li>An effective appeal and counter-notice mechanism. A notice-and-action regime should include counter-notice procedures so that content providers can contest mistaken and abusive notices and have their content reinstated if its removal was wrongful. </li>
<li>Penalties for unjustified notices. Senders of erroneous or abusive notices should face possible sanctions. In the US, senders may face penalties for knowingly misrepresenting that content is infringing, but the standard for “knowingly misrepresenting” is quite high and the provision has rarely been invoked. A better approach might be to use a negligence standard, whereby a sender could be held liable for damages or attorneys’ fees for making negligent misrepresentations (or for repeatedly making negligent misrepresentations). In addition, the notice-and-action system should allow content hosts to ignore notices from senders with an established record of sending erroneous or abusive notices or allow them to demand more information or assurances in notices from those who have in the past submitted erroneous notices. (For example, hosts might be deemed within the safe harbor if they require repeat abusers to specifically certify that they have actually examined the alleged infringing content before sending a notice).”<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt9"><sup>[9]</sup></a> </li>
</ol>
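<p style="text-align: justify; ">As a schematic illustration only, the first six elements quoted above could be captured in a simple record that an intermediary's intake system validates before acting on a notice (the appeal mechanism and penalties concern the surrounding regime rather than the notice itself). The field and function names in this Python sketch are hypothetical assumptions, not a statutory or CDT-prescribed format.</p>
<pre>
# Hypothetical sketch: a takedown notice covering the minimum elements above.
from dataclasses import dataclass

@dataclass
class TakedownNotice:
    urls: list                  # 1. specificity: exact location(s) of the material
    description: str            # 2. the specific content alleged to be illegal
    law_violated: str           # 2. the law allegedly being violated
    sender_contact: str         # 3. contact details for the sender
    sender_has_standing: bool   # 4. sent by or on behalf of the harmed party
    certified_good_faith: bool  # 5. attestation under legal penalty
    considered_defences: bool   # 6. limitations/exceptions/defences considered

def is_actionable(notice):
    """Reject notices missing any minimum element, as the quoted text proposes."""
    return (bool(notice.urls)
            and bool(notice.description.strip())
            and bool(notice.law_violated.strip())
            and bool(notice.sender_contact.strip())
            and notice.sender_has_standing
            and notice.certified_good_faith
            and notice.considered_defences)
</pre>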
<li style="text-align: justify; ">All ISPs should publish their content removal policies online and keep them current as they evolve<br />The UNESCO report states, by way of background, that “[c]ontent restriction practices based on Terms of Service are opaque. How companies remove content based on Terms of Service violations is more opaque than their handling of content removals based on requests from authorized authorities. When content is removed from a platform based on company policy, [our] research found that all companies provide a generic notice of this restriction to the user, but do not provide the reason for the restriction. Furthermore, most companies do not provide notice to the public that the content has been removed. In addition, companies are inconsistently open about removal of accounts and their reasons for doing so.”<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt10"><sup>[10]</sup></a></li>
<p style="text-align: justify; ">There are legitimate reasons why an ISP may want to have policies that permit less content, and a narrower range of content, than is technically permitted under the law, such as maintaining a product that appeals to families. However, if a company is going to go beyond the minimal legal requirements in terms of content that it must restrict, the company should have clear policies that are published online and kept up-to-date to provide its users notice of what content is and is not permitted on the company’s platform. Notice to the user about the types of content that are permitted encourages her to speak freely and helps her to understand why content that she posted was taken down if it must be taken down for violating a company policy.</p>
<li style="text-align: justify; ">When content is removed, a clear notice should be provided in the product that explains in simple terms that content has been removed and why<br />This subsection works in conjunction with “ii,” above. If content is removed for any reason, either pursuant to a legal request or because of a violation of company policy, a user should be able to learn that content was removed if they try to access it. Requiring an on-screen message that explains that content has been removed and why is the post-takedown accompaniment to the pre-takedown published online policy of the online intermediary: both work together to show the user what types of content are and are not permitted on each online platform. Explaining to users why content has been removed in sufficient detail may also spark their curiosity as to the laws or policies that caused the content to be removed, resulting in increased civic engagement in the internet law and policy space, and a community of citizens that demands that the companies and governments it interacts with are more responsive to how it thinks content regulation should work in the online context.</li>
<p style="text-align: justify; ">The UNESCO report provides the following example of how Google provides notice to its users when a search result is removed, which includes a link to a page hosted by Chilling Effects:<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt11"><sup>[11]</sup></a></p>
<p style="text-align: justify; ">“When search results are removed in response to government or copyright holder demands, a notice describing the number of results removed and the reasons for their removal is displayed to users (see screenshot below) and a copy of the request to the independent non-proft organization ChillingEffects.org, which archives and publishes the request. When possible the company also contacts the website’s owners.”<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt12"><sup>[12]</sup></a></p>
<p style="text-align: justify; ">This is an example of the message that is displayed when Google removes a search result pursuant to a copyright complaint.<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt13"><sup>[13]</sup></a></p>
<li style="text-align: justify; ">Requirements that governments impose on intermediaries should be as clear and unambiguous as possible<br />Imposing liability on internet intermediaries without providing clear guidance as to the precise type of content that is not lawful and the precise requirements of a legally sufficient notice encourages intermediaries to over-remove content. As Article 19 noted in its 2013 report on intermediary liability:</li>
<p style="text-align: justify; ">“International bodies have also criticized ‘notice and takedown’ procedures as they lack a clear legal basis. For example, the 2011 OSCE report on Freedom of Expression on the internet highlighted that: Liability provisions for service providers are not always clear and complex notice and takedown provisions exist for content removal from the Internet within a number of participating States. Approximately 30 participating States have laws based on the EU E-Commerce Directive. However, the EU Directive provisions rather than aligning state level policies, created differences in interpretation during the national implementation process. These differences emerged once the national courts applied the provisions.</p>
<p style="text-align: justify; ">These procedures have also been criticized for being unfair. Rather than obtaining a court order requiring the host to remove unlawful material (which, in principle at least, would involve an independent judicial determination that the material is indeed unlawful), hosts are required to act merely on the say-so of a private party or public body. This is problematic because hosts tend to err on the side of caution and therefore take down material that may be perfectly legitimate and lawful. For example, in his report, the UN Special Rapporteur on freedom of expression noted:</p>
<p style="text-align: justify; ">[W]hile a notice-and-takedown system is one way to prevent intermediaries from actively engaging in or encouraging unlawful behavior on their services, it is subject to abuse by both State and private actors. Users who are notified by the service provider that their content has been flagged as unlawful often has little recourse or few resources to challenge the takedown. Moreover, given that intermediaries may still be held financially or in some cases criminally liable if they do not remove content upon receipt of notification by users regarding unlawful content, they are inclined to err on the side of safety by overcensoring potentially illegal content. Lack of transparency in the intermediaries’ decision-making process also often obscures discriminatory practices or political pressure affecting the companies’ decisions. Furthermore, intermediaries, as private entities, are not best placed to make the determination of whether a particular content is illegal, which requires careful balancing of competing interests and consideration of defenses.”<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt14"><sup>[14]</sup></a></p>
<p style="text-align: justify; ">Considering the above, if liability is to be imposed on intermediaries for certain types of unlawful content, the legal requirements that outline what is unlawful content and how to report it must be clear. Lack of clarity in this area will result in over-removal of content by rational intermediaries that want to minimize their legal exposure and compliance costs. Over-removal of content is at odds with the goals of freedom of expression.</p>
<p style="text-align: justify; ">The UNESCO Report made a similar recommendation, stating that; “Governments need to ensure that legal frameworks and company policies are in place to address issues arising out of intermediary liability. These legal frameworks and policies should be contextually adapted and be consistent with a human rights framework and a commitment to due process and fair dealing. Legal and regulatory frameworks should also be precise and grounded in a clear understanding of the technology they are meant to address, removing legal uncertainty that would provide opportunity for abuse.”<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt15"><sup>[15]</sup></a></p>
<p style="text-align: justify; ">Similarly, the 2011 Joint Declaration on Freedom of Expression and the Internet states:</p>
<p style="text-align: justify; ">“Consideration should be given to insulating fully other intermediaries, including those mentioned in the preamble, from liability for content generated by others under the same conditions as in paragraph 2(a). At a minimum, intermediaries should not be required to monitor user-generated content and should not be subject to extrajudicial content takedown rules which fail to provide sufficient protection for freedom of expression (which is the case with many of the ‘notice and takedown’ rules currently being applied).”<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt16"><sup>[16]</sup></a></p>
<h3 style="text-align: justify; ">Principle IV: Mindful Community Policy Making</h3>
<p style="text-align: justify; ">“Laws and regulations as well as corporate policies are more likely to be compatible with freedom of expression if they are developed in consultation with all affected stakeholders – particularly those whose free expression rights are known to be at risk.”<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt17"><sup>[17]</sup></a> To be effective, policies should be created through a multi-stakeholder consultation process that gives voice to the communities most at risk of being targeted for the information they share online. Further, both companies and governments should embed an ‘outreach to at-risk communities’ step into both legislative and policymaking processes to be especially sure that their voices are heard. Finally, civil society should work to ensure that all relevant stakeholders have a voice in both the creation and revision of policies that affect online intermediaries. In the context of corporate policymaking, civil society can use strategies from activist investing to encourage investors to make the human rights and freedom of expression policies of Internet companies’ part of the calculus that investors use to decide where to place their money. Considering the above;</p>
<ol style="text-align: justify; ">
<li style="text-align: justify; ">Human rights impact assessments, considering the impact of the proposed law or policy on various communities from the perspectives of gender, sexuality, sexual preference, ethnicity, religion, and freedom of expression, should be required before:</li>
<li>New laws are written that govern content issues affecting ISPs or conduct that occurs primarily online</li>
<li style="text-align: justify; ">“Protection of online freedom of expression will be strengthened if governments carry out human rights impact assessments to determine how proposed laws or regulations will affect Internet users’ freedom of expression domestically and globally.”<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt18"><sup>[18]</sup></a></li>
</ol>
<li style="text-align: justify; ">Intermediaries enact new policies<br />“Protection of online freedom of expression will be strengthened if companies carry out human rights impact assessments to determine how their policies, practices, and business operations affect Internet users’ freedom of expression. This assessment process should be anchored in robust engagement with stakeholders whose freedom of expression rights are at greatest risk online, as well as stakeholders who harbor concerns about other human rights affected by online speech.”<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt19"><sup>[19]</sup></a></li>
<li style="text-align: justify; ">Multi-stakeholder consultation processes should precede any new legislation that will apply to content issues affecting online intermediaries or online conduct<br />“Laws and regulations as well as corporate policies are more likely to be compatible with freedom of expression if they are developed in consultation with all affected stakeholders – particularly those whose free expression rights are known to be at risk.”<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt20"><sup>[20]</sup></a></li>
<li style="text-align: justify; ">Civil society and public interest groups should encourage responsible investment in companies who implement policies that reflect best practices for internet intermediaries<br />“Over the past thirty years, responsible investors have played a powerful role in incentivizing companies to improve environmental sustainability, supply chain labor practices, and respect for human rights of communities where companies physically operate. Responsible investors can also play a powerful role in incentivizing companies to improve their policies and practices affecting freedom of expression and privacy by developing metrics and criteria for evaluating companies on these issues in the same way that they evaluate companies on other “environmental, social, and governance” criteria.”<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt21"><sup>[21]</sup></a></li>
<h3 style="text-align: justify; ">Principle V: Necessity and Proportionality in Content Restriction</h3>
<li style="text-align: justify; ">Content should only be restricted when there is a legal basis for doing so, or the removal is performed in accordance with a clear, published policy of the ISP<br />As CDT outlined in its 2012 intermediary liability report, “[a]ctions required of intermediaries must be narrowly tailored and proportionate, to protect the fundamental rights of Internet users. Any actions that a safe-harbor regime requires intermediaries to take must be evaluated in terms of the principle of proportionality and their impact on Internet users’ fundamental rights, including rights to freedom of expression, access to information, and protection of personal data. Laws that encourage intermediaries to take down or block certain content have the potential to impair online expression or access to information. Such laws must therefore ensure that the actions they call for are proportional to a legitimate aim, no more restrictive than is required for achievement of the aim, and effective for achieving the aim. In particular, intermediary action requirements should be narrowly drawn, targeting specific unlawful content rather than entire websites or other Internet resources that may support both lawful and unlawful uses.”<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt22"><sup>[22]</sup></a></li>
<li style="text-align: justify; ">When content must be restricted, it should be restricted in the most minimal way possible (i.e., prefer domain removals to IP-blocking)<br />There are a number of different ways that access to content can be restricted. Examples include hard deletion of the content from all of a company’s servers, blocking the download of an app or other software program in a particular country, blocking the content on all IP addresses affiliated with a particular country (“IP-Blocking”), removing the content from a particular domain of a product (i.e., removing from a link from the .fr version of a search engine that remains accessible on the .com version), blocking content from a ‘version’ of an online product that is accessible through a ‘country’ or ‘language’ setting on that product, or some combination of the last three options (i.e., an online product that directs the user to a version of the product based on the country that their IP address is coming from, but where the user can alter a URL or manipulate a drop-down menu to show her a different ‘country version’ of the product, providing access to content that may otherwise be inaccessible). </li>
<p style="text-align: justify; ">While almost all of the different types of content restrictions described above can be circumvented by technical means such as the use of proxies, IP-cloaking, or Tor, the average internet user does not know that these techniques exist, much less how to use them. Of the different types of content restrictions described above, a domain removal, for example, is easier for an individual user to circumvent than IP-Blocked content because you only have to change the URL of the product you are using to, i.e. “.com” to see content that has been locally restricted. To get around an IP-block, you would have to be sufficiently savvy to employ a proxy or cloak your true IP address.</p>
<p style="text-align: justify; ">Therefore, the technical means used to restrict access to controversial content has a direct impact on the magnitude of the actual restriction on speech. The more restrictive the technical removal method, the fewer people that will have access to that content. To preserve access to lawful content, online intermediaries should choose the least restrictive means of complying with removal requests, especially when the removal request is based on the law of a particular country that makes certain content unlawful that is not unlawful in other countries. Further, when building new products and services, intermediaries should built in removal capability that minimally restricts access to controversial content.</p>
<li style="text-align: justify; ">If content is restricted due to its illegality in a particular country, the geographical scope of the content restriction should be as minimal as possible<br />Building on the discussion in “ii,” supra, a user should be able to access content that is lawful in her country even if it is not lawful in another country. Different countries have different laws and it is often difficult for intermediaries to determine how to effectively respond to requests and reconcile the inherent conflicts that result. For example, content that denies the holocaust is illegal in certain countries, but not in others. If an intermediary receives a request to remove content based on the laws of a particular country and determines that it will comply because the content is not lawful in that country, it should not restrict access to the content such that it cannot be accessed by users in other countries where the content is lawful. To respond to a request based on the law of a particular country by blocking access to that content for users around the world, or even users of more than one country, essentially allows for extraterritorial application of the laws of the country that the request came from. While it is preferable to standardize and limit the legal requirements imposed on online intermediaries throughout the world, to the extent that this is not possible, the next-best option is to limit the application of laws that are interpreted to declare certain content unlawful to the users that live in that country. Therefore, intermediaries should choose the technical means of content restriction that is most narrowly tailored to limit the geographical scope and impact of the removal.</li>
<li style="text-align: justify; ">The ability of conduits (telecommunications/internet service providers) to filter content should be minimized to the extent technically and legally possible</li>
<p style="text-align: justify; ">The 2011 Joint Declaration on Freedom of Expression and the Internet made the following points about the dangers of allowing filtering technology:</p>
<p style="text-align: justify; ">“Mandatory blocking of entire websites, IP addresses, ports, network protocols or types of uses (such as social networking) is an extreme measure – analogous to banning a newspaper or broadcaster – which can only be justified in accordance with international standards, for example where necessary to protect children against sexual abuse.</p>
<p style="text-align: justify; ">Content filtering systems which are imposed by a government or commercial service provider and which are not end-user controlled are a form of prior censorship and are not justifiable as a restriction on freedom of expression.</p>
<p style="text-align: justify; ">Products designed to facilitate end-user filtering should be required to be accompanied by clear information to end-users about how they work and their potential pitfalls in terms of over-inclusive filtering.”<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt23"><sup>[23]</sup></a></p>
<p style="text-align: justify; ">In short, filtering at the conduit level is a blunt instrument that should be avoided whenever possible. Similar to how conduits should not be legally responsible for content that they neither host nor modify (the ‘mere conduit’ rule discussed supra), conduits should technically restrict their ability to filter content such that it would be inefficient for government agencies to contact them to have content filtered. Mere conduits are not able to assess the context surrounding the controversial content that they are asked to remove and are therefore not the appropriate party to receive takedown requests. Further, when mere conduits have the technical ability to filter content, they open themselves to pressure from government to exercise that capability. Therefore, mere conduits should limit or not build in the capability to filter content.</p>
<li style="text-align: justify; ">Notice and notice, or notice and judicial takedown, should be preferred to notice and takedown, which should be preferred to unilateral removal<br />Mechanisms for content removal that involve intermediaries acting without any oversight or accountability, or those which only respond to the interests of the party requesting removal, are unlikely to do a very good job at balancing public and private interests. A much better balance is likely to be struck through a mechanism where power is distributed between the parties, and/or where an independent and accountable oversight mechanism exists.</li>
<p style="text-align: justify; ">Considered in this way, there is a continuum of content removal mechanisms that ranges from those are the least balanced and accountable, and those that are more so. The least accountable is the unilateral removal of content by the intermediary without legal compulsion in response to a request received, without affording the uploader of the content the right to be heard or access to remedy.</p>
<p style="text-align: justify; ">Notice and takedown mechanisms fit next along the continuum, provided that they incorporate, as the DMCA attempts to do, an effective appeal and counter-notice mechanism. However where notice and takedown falls down is that the cost and incentive structure is weighted towards removal of content in the case of doubt or dispute, resulting in more content being taken down and staying down than would be socially optimal.</p>
<p style="text-align: justify; ">A better balance is likely to be struck by a “notice and notice” regime, which provides strong social incentives for those whose content is reported to be unlawful to remove the content, but does not legally compel them to do so. If legal compulsion is required, a court order must be separately obtained.</p>
<p style="text-align: justify; ">Canada is an example of a jurisdiction with a notice and notice regime, though limited to copyright content disputes. Although this regime is now established in legislation, it formalizes a previous voluntary regime, whereby major ISPs would forward copyright infringement notifications received from rightsholders to subscribers, but without removing any content and without releasing subscriber data to the rightsholders absent a court order. Under the new legislation additional record-keeping requirements are imposed on ISPs, but otherwise the essential features of the regime remain unchanged.</p>
<p style="text-align: justify; ">Analysis of data collected during this voluntary regime indicates that it has been effective in changing the behavior of allegedly infringing subscribers. A 2010 study by the Entertainment Software Association of Canada (ESAC) found that 71% of notice recipients did not infringe again, whereas a similar 2011 study by Canadian ISP Rogers found 68% only received one notice, and 89% received no more than two notices, with only 1 subscriber in 800,000 receiving numerous notices.<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt24"><sup>[24]</sup></a> However, in cases where a subscriber has a strong good faith belief that the notice they received was wrong, there is no risk to them in disregarding the erroneous notice – a feature that does not apply to notice and takedown.</p>
<p style="text-align: justify; ">Another similar way in which public and private interests can be balanced is through a notice and judicial takedown regime, whereby the rightsholder who issues a notice about offending content must have it assessed by an independent judicial (or perhaps administrative) authority before the intermediary will respond by taking the content down.</p>
<p style="text-align: justify; ">An example of this is found in Chile, again limited to the case of copyright.<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt25"><sup>[25]</sup></a> In response to its Free Trade Agreement with the United States, the system introduced in 2010 is broadly similar to the DMCA, with the critical difference that intermediaries are not required to take material down in order to benefit from a liability safe harbor, until such time as a court order for removal of the material is made. Responsibility for evaluating the copyright claims made is therefore shifted from intermediaries onto the courts.</p>
<p style="text-align: justify; ">Although this requirement does impose a burden on the rightsholder, this serves a purpose by disincentivizing the issue of automated or otherwise unjustified notices that are more likely to restrict or chill freedom of expression. In cases where there is no serious dispute about the legality of the content, it is unlikely that the lawsuit would be defended. In any case, the legislation authorizes the court to issue a preliminary injunction on an ex parte basis, on condition of payment of a bond.</p>
<li style="text-align: justify; ">Intermediaries should be allowed to charge for the time and expense associated with processing legal requests<br />As an intermediary, it is time consuming and relatively expensive to understand the obligations that each country’s legal regime imposes on you, and to accurately how each legal request should be handled. Especially for intermediaries without many resources, such as forum operators or owners of home Wifi networks, the costs associated with being an intermediary can be prohibitive. Therefore, it should be within their rights to charge for their compliance costs if they are either below a certain user threshold or can show financial necessity in some way.</li>
<li style="text-align: justify; ">Legal requirements imposed on intermediaries should be a floor, not a ceiling- ISPs can adopt more restrictive policies to more effectively serve their users as long as they have published policies that explain what they are doing<br />The Internet has space for a wide range of platforms and applications directed to different communities, with different needs and desires. A social networking site directed at children, for example, may reasonably want to have policies that are much more restrictive than a political discussion board. Therefore, legal requirements that compel intermediaries to take down content should be seen as a ‘floor,’ but not a ‘ceiling’ on the range and quantity that of content those intermediaries may remove. Intermediaries should retain control over their own policies as long as they are transparent about what those policies are, what type of content the intermediary removes, and why they removed certain pieces of content. </li>
<h3 style="text-align: justify; ">Principle VI: Privacy</h3>
<li style="text-align: justify; ">It is important to protect the ability of Internet users to speak by narrowing and making less ambiguous the range of content that intermediaries can be held liable for, but it is also very important to make users feel comfortable sharing their view by ensuring that their privacy is protected. Protecting the user’s ability to share her views, especially when those views are controversial or have a direct bearing on important political issues, requires that the user can trust the intermediaries that she uses. This concept can be further broken down into three sub-principles:</li>
<li style="text-align: justify; ">The user’s personal information should be protected to the greatest extent possible given the state of the art in encryption, security, and policy<br />Users will be less willing to speak on important topics if they have legitimate concerns that their data may be taken from them. As stated in the UNESCO Report, “[b]ecause of the amount of personal information held by companies and ability to access the same, a company’s practices around collection, access, disclosure, and retention are key. To a large extent a service provider’s privacy practices are influenced by applicable law and operating licenses required by the host government. These can include requirements for service providers to verify subscribers, collect and retain subscriber location data, and cooperate with law enforcement when requested. Outcome: The implications of companies trying to balance a user’s expectation for privacy with a government’s expectation for cooperation can be serious and are inadequately managed in all jurisdictions studied.”<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt26"><sup>[26]</sup></a></li>
<li style="text-align: justify; ">Where possible, ISPs should help to preserve the user’s right to speak anonymously<br />An important aspect of an Internet user’s ability to exercise her right to free expression online is ability to speak anonymously. Anonymous speech is one of the great advances of the Internet as a communications medium and should be preserved to the extent possible. As noted by special rapporteur Frank LaRue, “[i]n order for individuals to exercise their right to privacy in communications, they must be able to ensure that these remain private, secure and, if they choose, anonymous. Privacy of communications infers that individuals are able to exchange information and ideas in a space that is beyond the reach of other members of society, the private sector, and ultimately the State itself. Security of communications means that individuals should be able to verify that only their intended recipients, without interference or alteration, receive their communications and that the communications they receive are equally free from intrusion. Anonymity of communications is one of the most important advances enabled by the Internet, and allows individuals to express themselves freely without fear of retribution or condemnation.”<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt27"><sup>[27]</sup></a></li>
<li style="text-align: justify; ">The user’s PII should never be sold or used without her consent, and she should always know what is being done with it via an easily comprehensible dashboard<br />The user’s trust in the online platform that she uses and relies upon is influenced not only by the relationships the intermediary maintains with the government, but also with other commercial entities. A user, who feels that her data will be constantly shared with third parties, perhaps without her consent and/or for marketing purposes, will never feel like she is able to freely express her opinion. Therefore, it is the intermediary’s responsibility to ensure that its users know exactly what information it retains about them, who it shares that information with and under what circumstances, and how to change the way that her data is shared. All of this information should be available on a dashboard that is comprehensible to the average user, and which gives her the ability to easily modify or withdraw her consent to the way her data is being shared, or the amount of data, or specific data, that the intermediary is retaining about her.</li>
<h3 style="text-align: justify; ">Principle VII: Access to Remedy</h3>
<li style="text-align: justify; ">As noted in the UNESCO Report, “Remedy is the third central pillar of the UN Guiding Principles on Business and Human Rights, placing an obligation both on governments and on companies to provide individuals access to effective remedy. This area is where both governments and companies are almost consistently lacking. Across intermediary types, across jurisdictions and across the types of restriction, individuals whose content is restricted and individuals who wish to access such content are offered little or no effective recourse to appeal restriction decisions, whether in response to government orders, third party requests or in accordance with company policy. There are no private grievance or due process mechanisms that are clearly communicated and readily available to all users, or consistently applied.”<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt28"><sup>[28]</sup></a></li>
<p style="text-align: justify; "><br />Any notice and takedown system is subject to abuse, and any company policy that results in the removal of content is subject to mistaken or inaccurate takedowns, both of which are substantial problems that can only be remedied by the ability for users to let the intermediary know when the intermediary improperly removed a specific piece of content and the technical and procedural ability of the intermediary to put the content back. However, the technical ability to reinstate content that was improperly removed may conflict with data retention laws. This conflict should be explored in more detail. In general, however, every time content is removed, there should be:</p>
<li style="text-align: justify; ">A clear mechanism through which users can request reinstatement of content<br />When an intermediary decides to remove content, it should be immediately clear to the user that content has been removed and why it was removed (see discussion of in-product notice, supra). If the user disagrees with the content removal decision, there should be an obvious, online method for her to request reinstatement of the content.</li>
<li style="text-align: justify; ">Reinstatement of content should be technically possible<br />When intermediaries (who are subject to intermediary liability) are building new products, they should build the capability to remove content into the product with a high degree of specificity so as to allow for narrowly tailored content removals when a removal is legally required. Relatedly, all online intermediaries should build the capability to reinstate content into their products while maintaining compliance with data retention laws.</li>
<li style="text-align: justify; ">Intermediaries should have policies and procedures in place to handle reinstatement requests<br />Between the front end (online mechanism to request reinstatement of content) and the backend (technical ability to reinstate content) is the necessary middle layer, which consists of the intermediary’s internal policies and processes that allow for valid reinstatement requests to be assessed and acted upon. In line with the corporate ‘responsibility to respect’ human rights, and considered along with the human rights principle of ‘access to remedy,’ intermediaries should have a system in place from the time that an online product launches to ensure that reinstatement requests can be made and will be processed quickly and appropriately.</li>
<h3 style="text-align: justify; ">Principle VIII: Accountability</h3>
<li style="text-align: justify; ">Governments must ensure that independent, transparent, and impartial accountability mechanisms exist to verify the practices of government and companies with regards to managing content created online<br />“While it is important that companies make commitments to core principles on freedom of expression and privacy, make efforts to implement those principles through transparency, policy advocacy, and human rights impact assessments, it is also important that companies take these steps in a manner that is accountable to stakeholders. One way of doing this is by committing to external third party assurance to verify that their policies and practices are being implemented to a meaningful standard, with acceptable consistency wherever their service is offered. Such assurance gains further public credibility when carried out with the supervision and affirmation of multiple stakeholders including civil society groups, academics, and responsible investors. The Global Network Initiative provides one such mechanism for public accountability. Companies not currently participating in GNI, or a process of similar rigor and multi-stakeholder involvement, should be urged by users, investors, and regulators to do so.”<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt29"><sup>[29]</sup></a></li>
<li style="text-align: justify; ">Civil society should encourage comparative studies between countries and between ISPs with regards to their content removal practices to identify best practices<br />Civil society has the unique ability to look longitudinally across this issue to determine and compare how different intermediaries and governments are responding to content removal requests. Without information about how other governments and intermediaries are handling these issues, it will be difficult for each government or intermediary to learn how to improve its laws or policies. Therefore, civil society has an important role to play in the process of creating increasingly better human rights outcomes for online platforms by performing and sharing ongoing, comparative research.</li>
<li style="text-align: justify; ">Civil society should establish best practices and benchmarks against which ISPs and government can be measured, and should track governments and ISPs over time in public reports<br />“A number of projects that seek, define and implement indicators and benchmarks for governments or companies are either in development (examples include: UNESCO’s Indicators of Internet Development project examining country performance, Ranking Digital Rights focusing on companies) or already in operation (examples include the Web Foundation’s Web Index, Freedom House’s Internet Freedom Index, etc.). The emergence of credible, widely-used benchmarks and indicators that enable measurement of country and company performance on freedom of expression will help to inform policy, practice, stakeholder engagement processes, and advocacy.”<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt30"><sup>[30]</sup></a></li>
<h3 style="text-align: justify; ">Principle IX: Due Process - In Both Legal and Private Enforcement</h3>
<li style="text-align: justify; ">ISPs should always consider context before removing content and Governments and courts should always consider context before ordering that certain content be removed<br />“Governments need to ensure that legal frameworks and company policies are in place to address issues arising out of intermediary liability. These legal frameworks and policies should be contextually adapted and be consistent with a human rights framework and a commitment to due process and fair dealing. Legal and regulatory frameworks should also be precise and grounded in a clear understanding of the technology they are meant to address, removing legal uncertainty that would provide opportunity for abuse.”<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt31"><sup>[31]</sup></a></li>
<li style="text-align: justify; ">Principles for Courts</li>
<p style="text-align: justify; ">An independent and impartial judiciary exists, at least in part, to preserve the citizen’s due process rights. Many have called for an increased reliance on courts to make determinations about the legality of content posted online in order to both shift the censorship function from unaccountable private actors and to ensure that courts only order the removal of content that is actually unlawful. However, when courts do not have an adequate technical understanding of how content is created and shared on the internet, the rights of the intermediaries that facilitate the posting of the content, and who should be ordered to remove unlawful content, they do not add value to the online ecosystem. Therefore, courts should keep certain principles in mind to preserve the due process rights of the users that post content and the intermediaries that host the content.</p>
<li style="text-align: justify; ">Preserve due process for intermediaries- do not order them to do something before giving them notice and the opportunity to appear before the court</li>
<p style="text-align: justify; ">In a dispute between two private parties over a specific piece of content posted online, it may appear to the court that the easy solution is to order the intermediary who hosts the content to remove it. However, this approach does not extend any due process protections to the intermediary and does not adequately reflect the intermediary's status as something other than the creator of the content. If a court feels that it is necessary for an intermediary to intervene in a legal proceeding between two private parties, the court should provide the intermediary with proper notice and give them the opportunity to appear before the court before issuing any orders.</p>
<li style="text-align: justify; ">Necessity and proportionality of judicial determinations- judicial orders determining the illegality of specific content should be narrowly tailored to avoid over-removal of content </li>
<p style="text-align: justify; ">With regards to government removal requests, the UNESCO Report notes that “[o]ver-broad law and heavy liability regimes cause intermediaries to over-comply with government requests in ways that compromise users’ right to freedom of expression, or broadly restrict content in anticipation of government demands even if demands are never received and if the content could potentially be found legitimate even in a domestic court of law.”<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt32"><sup>[32]</sup></a> Courts should follow the same principle: only order the removal of the bare minimum of content that is necessary to remedy the harm identified and nothing more.</p>
<li style="text-align: justify; ">Courts should clarify whether ISPs have to remove content in response to court orders directed to third parties, or only have to remove content when directly ordered to do so (first party court orders) after an adversarial proceeding to which the ISP was a party</li>
<p style="text-align: justify; ">See discussion of the difference between first party and third party court orders (supra, section a., “Transparency”). Ideally, any decision that courts reach on this issue would be consistent across different countries.</p>
<li style="text-align: justify; ">Questions- related unresolved issues that should be kicked to the larger group</li>
<li style="text-align: justify; ">How should the conflict between access to remedy and data retention laws that say content must be hard deleted after a certain period of time be resolved? I think the access to remedy has to be subordinated to the data protection laws. Let's make that our draft position, but continue to flag it for discussion.</li>
<li style="text-align: justify; ">Should ISPs have to remove content in response to court orders directed to third parties, or only have to remove content when directly ordered to do so (first party court orders) after an adversarial proceeding to which the ISP was a party? I think first party orders. Let's make that our draft position, but continue to flag it for discussion.</li>
<hr style="text-align: justify; " />
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref1">[1]</a> Center for Democracy and Technology, Shielding the Messengers: Protecting Platforms for Expression and Innovation at 4-15 (Version 2, 2012), available at <a href="https://www.google.com/url?q=https%3A%2F%2Fwww.cdt.org%2Ffiles%2Fpdfs%2FCDT-Intermediary-Liability-2012.pdf&sa=D&sntz=1&usg=AFQjCNHNG5ji0HEiYXyelfwwK8qTCgOHiw">https://www.cdt.org/files/pdfs/CDT-Intermediary-Liability-2012.pdf</a> (see pp.4-15 for an explanation of these different models and the pros and cons of each).</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref2">[2]</a> UNESCO, “Fostering Freedom Online: The Roles, Challenges, and Obstacles of Internet Intermediaries” at 6-7 (Draft Version, June 16th, 2014) (Hereinafter “UNESCO Report”).</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref3">[3]</a> UNESCO Report at 56.</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref4">[4]</a> UNESCO Report at 37.</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref5">[5]</a> Center for Democracy and Technology, Additional Responses Regarding Notice and Action, Available at https://www.cdt.org/files/file/CDT%20N&A%20supplement.pdf.</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref6">[6]</a> The United Nations (UN) Special Rapporteur on Freedom of Opinion and Expression, the Organization for Security and Co-operation in Europe (OSCE) Representative on Freedom of the Media, the Organization of American States (OAS) Special Rapporteur on Freedom of Expression and the African Commission on Human and Peoples’ Rights (ACHPR) Special Rapporteur on Freedom of Expression and Access to Information, Article 19, Global Campaign for Free Expression, and the Centre for Law and Democracy, JOINT DECLARATION ON FREEDOM OF EXPRESSION AND THE INTERNET at 2 (2011), available at <a href="http://www.google.com/url?q=http%3A%2F%2Fwww.osce.org%2Ffom%2F78309&sa=D&sntz=1&usg=AFQjCNF8QmlhRMreM_BT0Eyfrw_J7ZdTGg">http://www.osce.org/fom/78309</a> (Hereinafter “Joint Declaration on Freedom of Expression).</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref7">[7]</a> Center for Democracy and Technology, Additional Responses Regarding Notice and Action, Available at https://www.cdt.org/files/file/CDT%20N&A%20supplement.pdf.</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref8">[8]</a> Id.</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref9">[9]</a> Id.</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref10">[10]</a> UNESCO Report at 113-14.</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref11">[11]</a> ‘Chilling Effects’ is a website that allows recipients of ‘cease and desist’ notices to submit the notice to the site and receive information about their legal rights. For more information about ‘Chilling Effects’ see: http://www.chillingeffects.org.</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref12">[12]</a> Id. at 73. You can see an example of a complaint published on Chilling Effects at the following location. “DtecNet DMCA (Copyright) Complaint to Google,” Chilling Effects Clearinghouse, March 12, 2013, www.chillingeffects.org/notice.cgi?sID=841442.</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref13">[13]</a> UNESCO Report at 73.</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref14">[14]</a> Article 19, Internet Intermediaries: Dilemma of Liability (2013), available at http://www.article19.org/data/files/Intermediaries_ENGLISH.pdf.</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref15">[15]</a> UNESCO Report at 120.</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref16">[16]</a> Joint Declaration on Freedom of Expression and the Internet at 2.</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref17">[17]</a> Id.</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref18">[18]</a> Id.</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref19">[19]</a> Id. at 121.</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref20">[20]</a> Id. at 104.</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref21">[21]</a> Id. at 122.</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref22">[22]</a> Center for Democracy and Technology, Shielding the Messengers: Protecting Platforms for Expression and Innovation at 12 (Version 2, 2012), available at https://www.cdt.org/files/pdfs/CDT-Intermediary-Liability-2012.pdf.</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref23">[23]</a> Joint Declaration on Freedom of Expression at 2-3.</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref24">[24]</a> Geist, Michael, Rogers Provides New Evidence on Effectiveness of Notice-and-Notice System (2011), available at http://www.michaelgeist.ca/2011/03/effectiveness-of-notice-and-notice/</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref25">[25]</a> Center for Democracy and Technology, Chile’s Notice-and-Takedown System for Copyright Protection: An Alternative Approach (2012), available at https://www.cdt.org/files/pdfs/Chile-notice-takedown.pdf</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref26">[26]</a> UNESCO Report at 54.</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref27">[27]</a> “Report of the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, Frank La Rue (A/HRC/23/40),” United Nations Human Rights, 17 April 2013, http://www.ohchr.org/Documents/HRBodies/HRCouncil/RegularSession/Session23/A.HRC.23.40_EN.pdf, § 24, p. 7.</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref28">[28]</a> UNESCO Report at 118.</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref29">[29]</a> UNESCO Report at 122.</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref30">[30]</a> Id.</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref31">[31]</a> UNESCO Report at 120.</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref32">[32]</a> Id. at 119.</p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/zero-draft-of-content-removal-best-practices-white-paper'>https://cis-india.org/internet-governance/blog/zero-draft-of-content-removal-best-practices-white-paper</a>
</p>
No publisherjyotiInternet GovernanceIntermediary Liability2014-09-10T07:11:09ZBlog EntryMinimising Legal Risks of Online Intermediaries while Protecting User Rights
https://cis-india.org/internet-governance/events/minimising-legal-risks-of-online-intermediaries-while-protecting-user-rights
<b>The Centre for Internet and Society (CIS) in partnership with Software Freedom Law Centre (SFLC.in) is organizing a workshop during the APrIGF event to be held at Crowne Plaza, Greater Noida on August 5, 2014, 3.30 p.m. to 5.00 p.m. Jyoti Panday will be a panelist.</b>
<h3>Thematic Area of Interest</h3>
<ul>
<li>Internet business in the Asia Pacific region</li>
<li>Consumer protection for users of global Internet services</li>
<li>Internet for socio-economic development</li>
</ul>
<h3>Specific Issues of Discussions & Description</h3>
<p style="text-align: justify; ">Internet usage in the Asia Pacific region has been growing at a phenomenal rate and online service providers have benefited enormously from this growth. However, the region poses challenges for online service providers in terms of legal risks involved with respect to user generated content. Across the world from Europe to the US, it has been an accepted policy that service providers on the Internet cannot be held liable for user-generated content and this principle has found place in legislations enacted in this field in most countries. However, the Asian region has often seen blocking of services and websites due to user-generated content that is deemed to be illegal. There needs to be a debate on safe harbour provisions for intermediaries and the take-down provisions in legislations to ensure that the right to freedom of expression of citizens are protected while maintaining an environment that permits innovation in this space.</p>
<p style="text-align: justify; ">The workshop will also consider the different classes of intermediaries, how they differ functionally and if their differing roles should bear an impact on their responsibility with regards to protection of rights of users. Traditional models of consumer protection are based on distinguishing the roles and responsibilities of suppliers, facilitators and consumers. While developing consumer protection models for online intermediary platforms, their evolving roles and responsibilities as a supplier and a facilitator need to be considered. Intermediary platforms have also created and highlighted new consumer relations and issues that call for robust and fluid reddressal mechanisms.</p>
<p style="text-align: justify; ">The need to reflect on reddressal mechanisms for consumer issues pertaining to online intermediaries is also necessary, given the economic implications associated with intermediary liability. Failure to protect intermediaries stems innovation and restricts growth of start-ups and small to medium enterprises in the digital economy and has negative financial implications. Moreover, intermediaries are crucial in connecting developing countries to global markets and a failure to protect them, creates a barrier to information exchange and capacity building.</p>
<p style="text-align: justify; ">The panel will discuss the following issues:</p>
<ul>
<li>Take-down procedures and Put-back provisions used in various countries in the region</li>
<li>Safe-harbour provisions for intermediaries</li>
<li>Need for classification of Intermediaries for the purpose of a take-down regime and user rights</li>
<li>Rights of users of services provided by online intermediaries </li>
<li>Recommendations for a balanced intermediary liability regime</li>
</ul>
<h3 style="text-align: justify; "></h3>
<h3 style="text-align: justify; ">Expected Format and Confirmed Panel Members</h3>
<p>The workshop will be a ninety-minute panel divided into two sessions of forty-five minutes each. The proposed panel includes:</p>
<p style="text-align: justify; "><b>Mishi Choudhary</b> (Moderator) SFLC.IN Civil Society India<br />Mishi Choudhary is the founding director of SFLC India. She started working with SFLC in New York following the completion of her fellowship during which she earned her LLM from Columbia Law School and was a Stone Scholar. In addition to her LLM, she has an LLB and a bachelors degree in political science from the University of Delhi, India.</p>
<p style="text-align: justify; "><b>Jyoti Panday</b>, Center for Internet and Society, Civil Society, India <br />Jyoti Panday is Programme Officer at the Centre for Internet and Society working on Internet governance and on issues related to the role and responsibility of intermediaries in protecting user rights and freedom of expression. She has experience in strategy, campaign management and research on issues and processes related to the development agenda, sustainability and democracy. She has completed her MSc in Public Policy from Queen Mary, University of London.</p>
<p style="text-align: justify; "><b>Shahzad Ahmed</b>, Bytes for All Pakistan, Civil Society, Pakistan<br />Shahzad Ahmad is the Country Coordinator of Bytes for All, Pakistan and founder of the Digital Rights Institute (DRI). He is currently working on issues of ICT policy advocacy, internet rights and freedom of expression. He is a development communications expert and is at the forefront of the Internet Rights movement in Pakistan.</p>
<p style="text-align: justify; ">Mr. Ahmad is a Diplo Fellow, Executive Board Member of the Association for Progressive Communications, Advisory Board Member of .PK ccTLD and a member of the International Advisory Board of Privacy International, UK. He regularly contributes to various publications and research studies on ICTs for development, freedom of expression and gender related issues. Widely travelled, he regularly participates in various forums at local, regional and global level. Mr. Ahmad maintains a strong engagement with broader civil society networks and strongly believes in participation and openness.</p>
<p style="text-align: justify; "><b>Professor KS Park</b>, Korea University Law School Professor <br />One of the founders of Open Net Korea, Professor Park has written and is active in internet, free speech, privacy, defamation, copyright, international business contracting, etc. He has given expert testimonies in high-profile free speech cases including the /Minerva /case, the internet real name verification case, the military’s subversive book blacklisting case, the newspaper consumers’ boycott case, and the Park Jung-Geun Retweet case. As a result, the “false news” crime and the internet real name verification laws were struck down as unconstitutional, Park Jung-Geun and Minerva acquitted, the soldiers challenging book blacklisting reinstated, the newspaper boycotters acquitted partially as to the “secondary boycotting” charge (2010-2013).</p>
<p style="text-align: justify; ">Since 2006, he serves as the Executive Director of the PSPD Law Center, a non-profit entity that has organized several impact litigations in the areas of free speech, privacy, and copyright. There, the Law Center won the world’s first damage lawsuit against a copyright holder for “bad faith” takedown (2009) and the first damage lawsuit against a portal for warrantless disclosure of the user identity data to the police (2012).</p>
<p style="text-align: justify; "><b>Arvind Gupta</b>, National Head-Information and Technology, Government/ BJP Political party, India<br />National Head, BJP Information Technology Cell</p>
<p style="text-align: justify; "><b>Faisal Farooqui</b>, CEO, MouthShut.com, Private Sector, India<br />Faisal Farooqui is a highly recognized entrepreneur who is among the trailblazers of his generation. Faisal has founded and managed two successful Internet and technology companies -MouthShut.com, India's largest consumer review and social media portal and Zarca Interactive, a Virginia based enterprise survey and feedback company.</p>
<p style="text-align: justify; "><b>Ramanjit Singh Chima</b>, Google, Private Sector, India<br />Raman Jit Singh Chima serves as Policy Counsel and Government Affairs Manager for Google, based in New Delhi. He currently helps lead Google'spublic policy and government affairs work in India. He is a graduate of the Bachelors in Arts and Law (Honours) programme of the National Law School of India University, Bangalore. While at the National Law School, he was Chief Editor of the Indian Journal of Law and Technology. He has studied Internet regulation as an independent research fellow with the Sarai programme of the Centre for the Study of Developing Societies and contributed to Freedom House's 2009 Freedom on the Internet report.</p>
<p style="text-align: justify; "><b>Apar Gupta</b>, Legal, India <br />Apar Gupta is a practicing lawyer in Delhi working as a Partner at the law firm of Advani & Co. His practice areas include, commercial litigation and arbitration with a focus on technology and media. Apar as a retained counsel, represents an internet industry organisation in government affairs, including consultations on draft laws and policies which effect the sector. These issues include legal risks of intermediaries, media freedom and consumer rights. He has completed his masters in law from Columbia Law School, New York and has written columns for the Business Standard, Indian Express and the Pioneer on legal issues. Apar also is a visiting faculty at National Law University, Delhi.</p>
<h3 style="text-align: justify; ">Full Name, Affiliation and Contact Details of the Workshop Organizer</h3>
<p>The workshop will be jointly organised by SFLC.IN and the Centre for Internet & Society, India. The details of the contact persons for the workshop are given below:</p>
<ol>
<li>Ms. Mishi Choudhary, Executive Director, SFLC.IN<br />E: mishi@softwarefreedom.org</li>
<li>Ms. Jyoti Panday, Centre for Internet & Society, India<br />E: jyoti@cis-india.org</li>
</ol>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/events/minimising-legal-risks-of-online-intermediaries-while-protecting-user-rights'>https://cis-india.org/internet-governance/events/minimising-legal-risks-of-online-intermediaries-while-protecting-user-rights</a>
</p>
No publisherpraskrishnaFreedom of Speech and ExpressionInternet GovernanceEventIntermediary Liability2014-07-29T07:50:51ZEventGNI and IAMAI Launch Interactive Slideshow Exploring Impact of India's Internet Laws
https://cis-india.org/internet-governance/blog/gni-and-iamai-launch-interactive-slideshow-exploring-impact-of-indias-internet-laws
<b>The Global Network Initiative and the Internet and Mobile Association of India have come together to explain how India’s Internet and technology laws impact economic innovation and freedom of expression. </b>
<p>The <a class="external-link" href="http://www.globalnetworkinitiative.org/">Global Network Initiative (GNI)</a>, and the <a class="external-link" href="http://www.iamai.in/">Internet and Mobile Association of India (IAMAI)</a> have launched an interactive slide show exploring the impact of existing Internet laws on users and businesses in India. The slide show created by Newsbound, and to which Centre for Internet and Society (CIS) has contributed its comments—explain the existing legislative mechanisms prevalent in India, map the challenges of the regulatory environment and highlight areas where such mechanisms can be strengthened.</p>
<p>Foregrounding the difficulties of content regulation, the slides aim to inform users and the public of the constraints of the current legal mechanisms in place, including safe harbour and notice-and-takedown provisions. Highlighting Section 79(3) and the Intermediary Liability Rules issued in 2011, the slide show identifies some of the challenges faced by Internet platforms, such as the broad interpretation of the legislation by the executive branch.</p>
<p>Challenges highlighted in the slide show include uniform Terms of Service that do not consider the type of service being provided by the platform, uncertain requirements for taking down content, and compliance obligations related to information disclosure. Further challenges include over-compliance with, and misuse of, the legal notice-and-takedown system introduced under Section 79 of the Information Technology Act and the Information Technology (Intermediaries Guidelines) Rules, 2011.</p>
<p>The Rules were created with the purpose of providing guidelines for the 'post-publication redressal mechanism for expression as envisioned in the Constitution of India'. However, since their introduction, the Rules have been criticised extensively by both the national and the international media for not conforming to principles of natural justice and freedom of expression. Critics have pointed out that by not recognising the different functions performed by different intermediaries, and by not providing safeguards against the misuse of such mechanisms for suppressing legitimate expression, the Rules have a chilling effect on freedom of expression.</p>
<p>Under the current Rules, the third-party provider or creator of information is not given a chance to be heard by the intermediary, nor is the intermediary required to give the creator a reasoned decision when content is taken down. The takedown procedure also has no provision for restoring removed information, such as a counter-notice mechanism or an appeal to a higher authority. Further, the criteria for removal of content include terms like 'disparaging' and 'objectionable', which are not defined and prima facie seem to go beyond the reasonable restrictions envisioned by the Constitution of India. With uncertain content criteria and no safeguards to prevent abuse, complainants may send frivolous complaints and suppress legitimate expression without any fear of repercussions.</p>
<p>Most importantly, the redressal mechanism under the Rules shifts the burden of censorship, previously the exclusive domain of the judiciary or the executive, onto private intermediaries. Private intermediaries often do not have sufficient legal resources to determine the legitimacy of a legal claim, resulting in over-compliance to limit liability. The slide show cites the <a href="https://cis-india.org/internet-governance/chilling-effects-on-free-expression-on-internet">2011 CIS research carried out by Rishabh Dara</a>, which examined whether the Rules lead to a chilling effect on online free expression, to highlight the issue of over-compliance and self-censorship.</p>
<p>The initiative is timely, given the change of guard in India, and stresses not only the economic impact of fixing the Internet legal framework but also the larger impact on users' rights and freedom of expression. The initiative calls for a legal environment for the Internet that enables innovation, protects the rights of users, and provides clear rules and regulations for businesses large and small.</p>
<p>See the slideshow here: <a href="http://globalnetworkinitiative.org/india">How India’s Internet Laws Can Help Propel the Country Forward</a></p>
<p><strong>Other GNI reports and resources: </strong></p>
<p><a href="http://www.globalnetworkinitiative.org/sites/default/files/Closing%20the%20Gap%20-%20Copenhagen%20Economics_March%202014_0.pdf">Closing the Gap: Indian Online Intermediaries and a Liability System Not Yet Fit for Purpose</a></p>
<p><a href="http://www.globalnetworkinitiative.org/sites/default/files/Closing%20the%20Gap%20-%20Copenhagen%20Economics_March%202014_0.pdf">Strengthening Protections for Online Platforms Could Add Billions to India’s GDP</a></p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/gni-and-iamai-launch-interactive-slideshow-exploring-impact-of-indias-internet-laws'>https://cis-india.org/internet-governance/blog/gni-and-iamai-launch-interactive-slideshow-exploring-impact-of-indias-internet-laws</a>
</p>
Publisher: none | Author: jyoti | Tags: Censorship, Freedom of Speech and Expression, Internet Governance, Intermediary Liability, Chilling Effect, Information Technology | Published: 2014-07-17T12:01:01Z | Type: Blog Entry

Reading the Fine Script: Service Providers, Terms and Conditions and Consumer Rights
https://cis-india.org/internet-governance/blog/reading-between-the-lines-service-providers-terms-and-conditions-and-consumer-rights
<b>This year, an increasing number of incidents related to consumer rights and service providers have come to light. This blog sets out the facts of the cases and discusses the main issues at stake, namely the roles and responsibilities of providers of platforms for user-created content with regard to consumer rights.</b>
<p style="text-align: justify; "><span>On 1st July, 2014 the Federal Trade Commission (FTC) filed a complaint against T-Mobile USA,</span><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftn1">[1]</a><span> accusing the service provider of 'cramming' customers bills, with millions of dollars of unauthorized charges. Recently, another service provider, received flak from regulators and users worldwide, after it published a paper, 'Experimental evidence of massive-scale emotional contagion through social networks'.</span><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftn2">[2]</a><span> The paper described Facebook's experiment on more than 600,000 users, to determine whether manipulating user-generated content, would affect the emotions of its users.</span></p>
<p style="text-align: justify; ">In both incidents the terms that should ensure the protection of their user's legal rights, were used to gain consent for actions on behalf of the service providers, that were not anticipated at the time of agreeing to the terms and conditions (T&Cs) by the consumer. More precisely, both cases point to the underlying issue of how users are bound by T&Cs, and in a mediated online landscape—highlight, the need to pay attention to the regulations that govern the online engagement of users.</p>
<p style="text-align: justify; "><b>I have read and agree to the terms</b></p>
<p style="text-align: justify; ">In his statement, Chief Executive Officer, John Legere might have referred to T-Mobile as "the most pro-consumer company in the industry",<a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftn3">[3]</a> however the FTC investigation revelations, that many customers never authorized the charges, suggest otherwise. The FTC investigation also found that, T-Mobile received 35-40 per cent of the amount charged for subscriptions, that were made largely through innocuous services, that customers had been signed up to, without their knowledge or consent. Last month news broke, that just under 700,000 users 'unknowingly' participated in the Facebook study, and while the legality and ethics of the experiment are being debated, what is clear is that Facebook violated consumer rights by not providing the choice to opt in or out, or even the knowledge of such social or psychological experiments to its users.</p>
<p style="text-align: justify; ">Both incidents boil down to the sensitive question of consent. While binding agreements around the world work on the condition of consent, how do we define it and what are the implications of agreeing to the terms?</p>
<p style="text-align: justify; "><b>Terms of Service: Conditions are subject to change </b></p>
<p style="text-align: justify; ">A legal necessity, the existing terms of service (TOS)—as they are also known—as an acceptance mechanism are deeply broken. The policies of online service providers are often, too long, and with no shorter or multilingual versions, require substantial effort on part of the user to go through in detail. A 2008 Carnegie Mellon study estimated it would take an average user 244 hours every year to go through the policies they agree to online.<a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftn4">[4]</a> Based on the study, Atlantic's Alexis C. Madrigal derived that reading all of the privacy policies an average Internet user encounters in a year, would take 76 working days.<a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftn5">[5]</a></p>
<p style="text-align: justify; ">The costs of time are multiplied by the fact that terms of services change with technology, making it very hard for a user to keep track of all of the changes over time. Moreover, many services providers do not even commit to the obligation of notifying the users of any changes in the TOS. Microsoft, Skype, Amazon, YouTube are examples of some of the service providers that have not committed to any obligations of notification of changes and often, there are no mechanisms in place to ensure that service providers are keeping users updated.</p>
<p style="text-align: justify; ">Facebook has said that the recent social experiment is perfectly legal under its TOS,<a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftn6">[6]</a> the question of fairness of the conditions of users consent remain debatable. Facebook has a broad copyright license that goes beyond its operating requirements, such as the right to 'sublicense'. The copyright also does not end when users stop using the service, unless the content has been deleted by everyone else.</p>
<p style="text-align: justify; ">More importantly, since 2007, Facebook has brought major changes to their lengthy TOS about every year.<a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftn7">[7]</a> And while many point that Facebook is transparent, as it solicits feedback preceding changes to their terms, the accountability remains questionable, as the results are not binding unless 30% of the actual users vote. Facebook can and does, track users and shares their data across websites, and has no obligation or mechanism to inform users of the takedown requests.</p>
<p style="text-align: justify; ">Courts in different jurisdictions under different laws may come to different conclusions regarding these practices, especially about whether changing terms without notifying users is acceptable or not. Living in a society more protective of consumer rights is however, no safeguard, as TOS often include a clause of choice of law which allow companies to select jurisdictions whose laws govern the terms.</p>
<p style="text-align: justify; ">The recent experiment bypassed the need for informed user consent due to Facebook's Data Use Policy<a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftn8">[8]</a>, which states that once an account has been created, user data can be used for 'internal operations, including troubleshooting, data analysis, testing, research and service improvement.' While the users worldwide may be outraged, legally, Facebook acted within its rights as the decision fell within the scope of T&Cs that users consented to. The incident's most positive impact might be in taking the questions of Facebook responsibilities towards protecting users, including informing them of the usage of their data and changes in data privacy terms, to a worldwide audience.</p>
<p style="text-align: justify; "><b>My right is bigger than yours</b></p>
<p style="text-align: justify; ">Most TOS agreements, written by lawyers to protect the interests of the companies add to the complexities of privacy, in an increasingly user-generated digital world. Often, intentionally complicated agreements, conflict with existing data and user rights across jurisdictions and chip away at rights like ownership, privacy and even the ability to sue. With conditions that that allow for change in terms at anytime, existing users do not have ownership or control over their data.</p>
<p style="text-align: justify; ">In April New York Times, reported of updates to the legal policy of General Mills (GM), the multibillion-dollar food company.<a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftn9">[9]</a> The update broadly asserted that consumers interacting with the company in a variety of ways and venues no longer can sue GM, but must instead, submit any complaint to “informal negotiation” or arbitration. Since then, GM has backtracked and clarified that “online communities” mentioned in the policy referred only to those online communities hosted by the company on its own websites.<a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftn10">[10]</a> Clarification aside, as Julia Duncan, Director of Federal programs at American Association for Justice points out, the update in the terms were so broad, that they were open to wide interpretation and anything that consumers purchase from the company could have been held to this clause. <a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftn11">[11]</a></p>
<p style="text-align: justify; "><b>Data and whose rights?</b></p>
<p style="text-align: justify; ">Following Snowden revelations, data privacy has become a contentious issue in the EU, and TOS, that allow the service providers to unilaterally alter terms of the contract, will face many challenges in the future. In March Edward Snowden sent his testimony to the European Parliament calling for greater accountability and highlighted that in "a global, interconnected world where, when national laws fail like this, our international laws provide for another level of accountability."<a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftn12">[12]</a> Following the testimony came the European Parliament's vote in favor of new safeguards on the personal data of EU citizens, when it’s transferred to non-EU.<a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftn13">[13]</a> The new regulations seek to give users more control over their personal data including the right to ask for data from companies that control it and seek to place the burden of proof on the service providers.</p>
<p style="text-align: justify; ">The regulation places responsibility on companies, including third-parties involved in data collection, transfer and storing and greater transparency on concerned requests for information. The amendment reinforces data subject right to seek erasure of data and obliges concerned parties to communicate data rectification. Also, earlier this year, the European Court of Justice (ECJ) ruled in favor of the 'right to be forgotten'<a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftn14">[14]</a>. The ECJ ruling recognised data subject's rights override the interest of internet users, however, with exceptions pertaining to nature of information, its sensitivity for the data subject's private life and the role of the data subject in public life.</p>
<p style="text-align: justify; ">In May, the Norwegian Consumer Council filed a complaint with the Norwegian Consumer Ombudsman, “… based on the discrepancies between Norwegian Law and the standard terms and conditions applicable to the Apple iCloud service...”, and, “...in breach of the law regarding control of marketing and standard agreements.”<a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftn15">[15]</a> The council based its complaint on the results of a study, published earlier this year, that found terms were hazy and varied across services including iCloud, Drop Box, Google Drive, Jotta Cloud, and Microsoft OneDrive. The Norwegian Council study found that Google TOS, allow for users content to be used for other purposes than storage, including by partners and that it has rights of usage even after the service is cancelled. None of the providers provide a guarantee that data is safe from loss, while many, have the ability to terminate an account without notice. All of the service providers can change the terms of service but only Google and Microsoft give an advance notice.</p>
<p style="text-align: justify; ">The study also found service providers lacking with respect to European privacy standards, with many allowing for browsing of user content. Tellingly, Google had received a fine in January by the French Data Protection Authority, that stated regarding Google's TOS, "permits itself to combine all the data it collects about its users across all of its services without any legal basis."</p>
<p style="text-align: justify; "><b>To blame or not to blame</b></p>
<p style="text-align: justify; ">Facebook is facing a probe by the UK Information Commissioner's Office, to assess if the experiment conducted in 2012 was a violation of data privacy laws.<a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftn16">[16]</a> The FTC asked the court to order T-Mobile USA, to stop mobile cramming, provide refunds and give up any revenues from the practice. The existing mechanisms of online consent, do not simplify the task of agreeing to multiple documents and services at once, a complexity which manifolds, with the involvement of third parties.</p>
<p style="text-align: justify; ">Unsurprisingly, T-Mobile's Legere termed the FTC lawsuit misdirected and blamed the companies providing the text services for the cramming.<a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftn17">[17]</a> He felt those providers should be held accountable, despite allegations that T-Mobile's billing practices made it difficult for consumers to detect that they were being charged for unauthorized services and having shared revenues with third-party providers. Interestingly, this is the first action against a wireless carrier for cramming and the FTC has a precedent of going after smaller companies that provide the services.</p>
<p style="text-align: justify; ">The FTC charged T-Mobile USA with deceptive billing practices in putting the crammed charges under a total for 'use charges' and 'premium services' and failure to highlight that portion of the charge was towards third-party charges. Further, the company urged customers to take complaints to vendors and was not forthcoming with refunds. For now, T-Mobile may be able to share the blame, the incident brings to question its accountability, especially as going forward it has entered a pact along with other carriers in USA including Verizon and AT&T, agreeing to stop billing customers for third-party services. Even when practices such as cramming are deemed illegal, it does not necessarily mean that harm has been prevented. Often users bear the burden of claiming refunds and litigation comes at a cost while even after being fined companies could have succeeded in profiting from their actions.</p>
<p style="text-align: justify; "><b>Conclusion </b></p>
<p style="text-align: justify; ">Unfair terms and conditions may arise when service providers include terms that are difficult to understand or vague in their scope. TOS that prevent users from taking legal action, negate liability for service providers actions despite the companies actions that may have a direct bearing on users, are also considered unfair. More importantly, any term that is hidden till after signing the contract, or a term giving the provider the right to change the contract to their benefit including wider rights for service provider wide in comparison to users such as a term that that makes it very difficult for users to end a contract create an imbalance. These issues get further complicated when the companies control and profiting from data are doing so with user generated data provided free to the platform.</p>
<p style="text-align: justify; ">In the knowledge economy, web companies play a decisive role as even though they work for profit, the profit is derived out of the knowledge held by individuals and groups. In their function of aggregating human knowledge, they collect and provide opportunities for feedback of the outcomes of individual choices. The significance of consent becomes a critical part of the equation when harnessing individual information. In France, consent is part of the four conditions necessary to be forming a valid contract (article 1108 of the Code Civil).</p>
<p style="text-align: justify; ">The cases highlight the complexities that are inherent in the existing mechanisms of online consent. The question of consent has many underlying layers such as reasonable notice and contractual obligations related to consent such as those explored in the case in Canada, which looked at whether clauses of TOS were communicated reasonably to the user, a topic for another blog. For now, we must remember that by creating and organising social knowledge that further human activity, service providers, serve a powerful function. And as the saying goes, with great power comes great responsibility.</p>
<hr size="1" style="text-align: justify; " width="33%" />
<p style="text-align: justify; "><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftnref1">[1]</a> 'FTC Alleges T-Mobile Crammed Bogus Charges onto Customers’ Phone Bills', published 1 July, 2014. See: http://www.ftc.gov/news-events/press-releases/2014/07/ftc-alleges-t-mobile-crammed-bogus-charges-customers-phone-bills</p>
<p style="text-align: justify; "><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftnref2">[2]</a> 'Experimental evidence of massive-scale emotional contagion through social networks', Adam D. I. Kramera,1, Jamie E. Guilloryb, and Jeffrey T. Hancock, published March 25, 2014. See:http://www.pnas.org/content/111/24/8788.full.pdf+html?sid=2610b655-db67-453d-bcb6-da4efeebf534</p>
<p style="text-align: justify; "><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftnref3">[3]</a> 'U.S. sues T-Mobile USA, alleges bogus charges on phone bills, Reuters published 1st July, 2014 See: http://www.reuters.com/article/2014/07/01/us-tmobile-ftc-idUSKBN0F656E20140701</p>
<p style="text-align: justify; "><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftnref4">[4]</a> 'The Cost of Reading Privacy Policies', Aleecia M. McDonald and Lorrie Faith Cranor, published I/S: A Journal of Law and Policy for the Information Society 2008 Privacy Year in Review issue. See: http://lorrie.cranor.org/pubs/readingPolicyCost-authorDraft.pdf</p>
<p style="text-align: justify; "><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftnref5">[5]</a> 'Reading the Privacy Policies You Encounter in a Year Would Take 76 Work Days', Alexis C. Madrigal, published The Atlantic, March 2012 See: http://www.theatlantic.com/technology/archive/2012/03/reading-the-privacy-policies-you-encounter-in-a-year-would-take-76-work-days/253851/</p>
<p style="text-align: justify; "><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftnref6">[6]</a> Facebook Legal Terms. See: https://www.facebook.com/legal/terms</p>
<p style="text-align: justify; "><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftnref7">[7]</a> 'Facebook's Eroding Privacy Policy: A Timeline', Kurt Opsahl, Published Electronic Frontier Foundation , April 28, 2010 See:https://www.eff.org/deeplinks/2010/04/facebook-timeline</p>
<p style="text-align: justify; "><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftnref8">[8]</a> Facebook Data Use Policy. See: https://www.facebook.com/about/privacy/</p>
<p style="text-align: justify; "><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftnref9">[9]</a> 'When ‘Liking’ a Brand Online Voids the Right to Sue', Stephanie Strom, published in New York Times on April 16, 2014 See: http://www.nytimes.com/2014/04/17/business/when-liking-a-brand-online-voids-the-right-to-sue.html?ref=business</p>
<p style="text-align: justify; "><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftnref10">[10]</a> Explaining our website privacy policy and legal terms, published April 17, 2014 See:http://www.blog.generalmills.com/2014/04/explaining-our-website-privacy-policy-and-legal-terms/#sthash.B5URM3et.dpufhttp://www.blog.generalmills.com/2014/04/explaining-our-website-privacy-policy-and-legal-terms/</p>
<p style="text-align: justify; "><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftnref11">[11]</a> General Mills Amends New Legal Policies, Stephanie Strom, published in New York Times on 1http://www.nytimes.com/2014/04/18/business/general-mills-amends-new-legal-policies.html?_r=0</p>
<p style="text-align: justify; "><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftnref12">[12]</a> Edward Snowden Statement to European Parliament published March 7, 2014. See: http://www.europarl.europa.eu/document/activities/cont/201403/20140307ATT80674/20140307ATT80674EN.pdf</p>
<p style="text-align: justify; "><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftnref13">[13]</a> Progress on EU data protection reform now irreversible following European Parliament vote, published 12 March 201 See: http://europa.eu/rapid/press-release_MEMO-14-186_en.htm</p>
<p style="text-align: justify; "><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftnref14">[14]</a> European Court of Justice rules Internet Search Engine Operator responsible for Processing Personal Data Published by Third Parties, Jyoti Panday, published on CIS blog on May 14, 2014. See: http://cis-india.org/internet-governance/blog/ecj-rules-internet-search-engine-operator-responsible-for-processing-personal-data-published-by-third-parties</p>
<p style="text-align: justify; "><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftnref15">[15]</a> Complaint regarding Apple iCloud’s terms and conditions , published on 13 May 2014 See:http://www.forbrukerradet.no/_attachment/1175090/binary/29927</p>
<p style="text-align: justify; "><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftnref16">[16]</a> 'Facebook faces UK probe over emotion study' See: http://www.bbc.co.uk/news/technology-28102550</p>
<p style="text-align: justify; "><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftnref17">[17]</a> Our Reaction to the FTC Lawsuit See: http://newsroom.t-mobile.com/news/our-reaction-to-the-ftc-lawsuit.htm</p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/reading-between-the-lines-service-providers-terms-and-conditions-and-consumer-rights'>https://cis-india.org/internet-governance/blog/reading-between-the-lines-service-providers-terms-and-conditions-and-consumer-rights</a>
</p>
Publisher: none | Author: jyoti | Tags: Social Media, Consumer Rights, Google, internet and society, Privacy, Transparency and Accountability, Intermediary Liability, Accountability, Facebook, Data Protection, Policies, Safety | Published: 2014-07-04T06:31:37Z | Type: Blog Entry

An Evidence-based Intermediary Liability Policy Framework: Workshop at IGF
https://cis-india.org/internet-governance/blog/igf-workshop-an-evidence-based-intermediary-liability-policy-framework
<b>CIS is organising a workshop at the Internet Governance Forum 2014. The workshop will be an opportunity to present and discuss ongoing research on the changing definition of intermediaries and their responsibilities across jurisdictions and technologies and contribute to a comprehensible framework for liability that is consistent with the capacity of the intermediary and with international human-rights standards.</b>
<p style="text-align: justify; ">The Centre for Internet and Society, India and Centre for Internet and Society, Stanford Law School, USA, will be organising a workshop to analyse the role of intermediary platforms in relation to freedom of expression, freedom of information and freedom of association at the Internet Governance Forum 2014. <span>The aim of the workshop is to highlight the increasing importance of digital rights and broad legal protections of stakeholders in an increasingly knowledge-based economy. The workshop will discuss public policy issues associated with Internet intermediaries, in particular their roles, legal responsibilities and related liability limitations in context of the evolving nature and role of intermediaries in the Internet ecosystem. distinct</span></p>
<p style="text-align: justify; "><b>Online Intermediaries: Setting the context</b></p>
<p style="text-align: justify; ">The Internet has facilitated unprecedented access to information and amplified avenues for expression and engagement by removing the limits of geographic boundaries and enabling diverse sources of information and online communities to coexist. Against the backdrop of a broadening base of users, the role of intermediaries that enable economic, social and political interactions between users in a global networked communication is ubiquitous. Intermediaries are essential to the functioning of the Internet as many producers and consumers of content on the internet rely on the action of some third party–the so called intermediary. Such intermediation ranges from the mere provision of connectivity, to more advanced services such as providing online storage spaces for data, acting as platforms for storage and sharing of user generated content (UGC), or platforms that provides links to other internet content.</p>
<p style="text-align: justify; ">Online intermediaries enhance economic activity by reducing costs, inducing competition by lowering the barriers for participation in the knowledge economy and fuelling innovation through their contribution to the wider ICT sector as well as through their key role in operating and maintaining Internet infrastructure to meet the network capacity demands of new applications and of an expanding base of users.</p>
<p style="text-align: justify; ">Intermediary platforms also provide social benefits, by empowering users and improving choice through social and participative networks, or web services that enable creativity and collaboration amongst individuals. By enabling platforms for self-expression and cooperation, intermediaries also play a critical role in establishing digital trust, protection of human rights such as freedom of speech and expression, privacy and upholding fundamental values such as freedom and democracy.</p>
<p style="text-align: justify; ">However, the economic and social benefits of online intermediaries are conditional to a framework for protection of intermediaries against legal liability for the communication and distribution of content which they enable.</p>
<p style="text-align: justify; "><b>Intermediary Liability</b></p>
<p style="text-align: justify; ">Over the last decade, right holders, service providers and Internet users have been locked in a debate on the potential liability of online intermediaries. The debate has raised global concerns on issues such as, the extent to which Internet intermediaries should be held responsible for content produced by third parties using their Internet infrastructure and how the resultant liability would affect online innovation and the free flow of knowledge in the information economy?</p>
<p style="text-align: justify; ">Given the impact of their services on communications, intermediaries find themselves as either directly liable for their actions, or indirectly (or “secondarily”) liable for the actions of their users. Requiring intermediaries to monitor the legality of the online content poses an insurmountable task. Even if monitoring the legality of content by intermediaries against all applicable legislations were possible, the costs of doing so would be prohibitively high. Therefore, placing liability on intermediaries can deter their willingness and ability to provide services, hindering the development of the internet itself.</p>
<p style="text-align: justify; ">Economics of intermediaries are dependent on scale and evaluating the legality of an individual post exceeds the profit from hosting the speech, and in the absence of judicial oversight can lead to a private censorship regime. Intermediaries that are liable for content or face legal exposure, have powerful incentives, to police content and limit user activity to protect themselves. The result is curtailing of legitimate expression especially where obligations related to and definition of illegal content is vague. Content policing mandates impose significant compliance costs limiting the innovation and competiveness of such platforms.</p>
<p style="text-align: justify; ">More importantly, placing liability on intermediaries has a chilling effect on freedom of expression online. Gate keeping obligations by service providers threaten democratic participation and expression of views online, limiting the potential of individuals and restricting freedoms. Imposing liability can also indirectly lead to the death of anonymity and pseudonymity, pervasive surveillance of users' activities, extensive collection of users' data and ultimately would undermine the digital trust between stakeholders.</p>
<p style="text-align: justify; ">Thus effectively, imposing liability for intermediaries creates a chilling effect on Internet activity and speech, create new barriers to innovation and stifles the Internet's potential to promote broader economic and social gains. To avoid these issues, legislators have defined 'safe harbours', limiting the liability of intermediaries under specific circumstances.</p>
<p style="text-align: justify; ">Online intermediaries do not have direct control of what information is or information are exchanged via their platform and might not be aware of illegal content per se. A key framework for online intermediaries, such limited liability regimes provide exceptions for third party intermediaries from liability rules to address this asymmetry of information that exists between content producers and intermediaries.</p>
<p style="text-align: justify; ">However, it is important to note, that significant differences exist concerning the subjects of these limitations, their scope of provisions and procedures and modes of operation. The 'notice and takedown' procedures are at the heart of the safe harbour model and can be subdivided into two approaches:</p>
<p style="text-align: justify; ">a. Vertical approach where liability regime applies to specific types of content exemplified in the US Digital Copyright Millennium Act</p>
<p style="text-align: justify; ">b. Horizontal approach based on the E-Commerce Directive (ECD) where different levels of immunity are granted depending on the type of activity at issue</p>
<p style="text-align: justify; "><b>Current framework </b></p>
<p style="text-align: justify; ">Globally, three broad but distinct models of liability for intermediaries have emerged within the Internet ecosystem:</p>
<p style="text-align: justify; ">1. Strict liability model under which intermediaries are liable for third party content used in countries such as China and Thailand</p>
<p style="text-align: justify; ">2. Safe harbour model granting intermediaries immunity, provided their compliance on certain requirements</p>
<p style="text-align: justify; ">3. Broad immunity model that grants intermediaries broad or conditional immunity from liability for third party content and exempts them from any general requirement to monitor content. <b> </b></p>
<p style="text-align: justify; ">While the models described above can provide useful guidance for the drafting or the improvement of the current legislation, they are limited in their scope and application as they fail to account for the different roles and functions of intermediaries. Legislators and courts are facing increasing difficulties, in interpreting these regulations and adapting them to a new economic and technical landscape that involves unprecedented levels user generated content and new kinds of and online intermediaries.</p>
<p style="text-align: justify; ">The nature and role of intermediaries change considerably across jurisdictions, and in relation to the social, economic and technical contexts. In addition to the dynamic nature of intermediaries the different categories of Internet intermediaries‘ are frequently not clear-cut, with actors often playing more than one intermediation role. Several of these intermediaries offer a variety of products and services and may have number of roles, and conversely, several of these intermediaries perform the same function. For example , blogs, video services and social media platforms are considered to be 'hosts'. Search engine providers have been treated as 'hosts' and 'technical providers'.</p>
<p style="text-align: justify; ">This limitations of existing models in recognising that different types of intermediaries perform different functions or roles and therefore should have different liability, poses an interesting area for research and global deliberation. Establishing classification of intermediaries, will also help analyse existing patterns of influence in relation to content for example when the removal of content by upstream intermediaries results in undue over-blocking.</p>
<p style="text-align: justify; ">Distinguishing intermediaries on the basis of their roles and functions in the Internet ecosystem is critical to ensuring a balanced system of liability and addressing concerns for freedom of expression. Rather than the highly abstracted view of intermediaries as providing a single unified service of connecting third parties, the definition of intermediaries must expand to include the specific role and function they have in relation to users' rights. A successful intermediary liability regime must balance the needs of producers, consumers, affected parties and law enforcement, address the risk of abuses for political or commercial purposes, safeguard human rights and contribute to the evolution of uniform principles and safeguards.</p>
<p style="text-align: justify; "><b>Towards an evidence based intermediary liability policy framework</b></p>
<p style="text-align: justify; ">This workshop aims to bring together leading representatives from a broad spectrum of stakeholder groups to discuss liability related issues and ways to enhance Internet users’ trust.</p>
<p style="text-align: justify; ">Questions to address at the panel include:</p>
<p style="text-align: justify; ">1. What are the varying definitions of intermediaries across jurisdictions?</p>
<p style="text-align: justify; ">2. What are the specific roles and functions that allow for classification of intermediaries?</p>
<p style="text-align: justify; ">3. How can we ensure the legal framework keeps pace with technological advances and the changing roles of intermediaries?</p>
<p style="text-align: justify; ">4. What are the gaps in existing models in balancing innovation, economic growth and human rights?</p>
<p style="text-align: justify; ">5. What could be the respective role of law and industry self-regulation in enhancing trust?</p>
<p style="text-align: justify; ">6. How can we enhance multi-stakeholder cooperation in this space?</p>
<p style="text-align: justify; ">Confirmed Panel:</p>
<p style="text-align: justify; ">Technical Community: Malcolm Hutty: Internet Service Providers Association (ISPA)<br />Civil Society: Gabrielle Guillemin: Article19<br />Academic: Nicolo Zingales: Assistant Professor of Law at Tilburg University<br />Intergovernmental: Rebecca Mackinnon: Consent of the Networked, UNESCO project<br />Civil Society: Anriette Esterhuysen: Association for Progressive Communication (APC)<br />Civil Society: Francisco Vera: Advocacy Director: Derechos Digitale<br />Private Sector: Titi Akinsanmi: Policy and Government Relations Manager, Google Sub-Saharan Africa<br />Legal: Martin Husovec: MaxPlanck Institute</p>
<p style="text-align: justify; "><b> </b></p>
<p style="text-align: justify; "><span>Moderator(s): </span><span>Giancarlo Frosio, Centre for Internet and Society (CIS) and </span><span>Jeremy Malcolm, Electronic Frontier Foundation </span></p>
<p style="text-align: justify; "><span><span>Remote Moderator: </span><span>Anubha Sinha, New Delhi</span></span></p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/igf-workshop-an-evidence-based-intermediary-liability-policy-framework'>https://cis-india.org/internet-governance/blog/igf-workshop-an-evidence-based-intermediary-liability-policy-framework</a>
</p>
Publisher: none | Author: jyoti | Tags: human rights, Digital Governance, internet governance, Freedom of Speech and Expression, Internet Governance Forum, Human Rights Online, Intermediary Liability, Policies, Multi-stakeholder | Published: 2014-07-04T06:41:10Z | Type: Blog Entry

European Court of Justice rules Internet Search Engine Operator responsible for Processing Personal Data Published by Third Parties
https://cis-india.org/internet-governance/blog/ecj-rules-internet-search-engine-operator-responsible-for-processing-personal-data-published-by-third-parties
<b>The Court of Justice of the European Union has ruled that "an internet search engine operator is responsible for the processing that it carries out of personal data which appear on web pages published by third parties.” The decision adds to the conundrum of maintaining a balance between freedom of expression, protecting personal data and intermediary liability.</b>
<p style="text-align: justify; ">The ruling is expected to have considerable impact on reputation and privacy related takedown requests as under the decision, data subjects may approach the operator directly seeking removal of links to web pages containing personal data. Currently, users prove whether data needs to be kept online—the new rules reverse the burden of proof, placing an obligation on companies, rather than users for content regulation.</p>
<h3>A win for privacy?</h3>
<p style="text-align: justify; ">The ECJ ruling addresses Mario Costeja González complaint filed in 2010, against Google Spain and Google Inc., requesting that personal data relating to him appearing in search results be protected and that data which was no longer relevant be removed. Referring to <a href="http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:31995L0046:en:HTML">the Directive 95/46/EC</a> of the European Parliament, the court said, that Google and other search engine operators should be considered 'controllers' of personal data. Following the decision, Google will be required to consider takedown requests of personal data, regardless of the fact that processing of such data is carried out without distinction in respect of information other than the personal data.</p>
<p style="text-align: justify; ">The decision—which cannot be appealed—raises important of questions of how this ruling will be applied in practice and its impact on the information available online in countries outside the European Union. The decree forces search engine operators such as Google, Yahoo and Microsoft's Bing to make judgement calls on the fairness of the information published through their services that reach over 500 million people across the twenty eight nation bloc of EU.</p>
<p style="text-align: justify; ">ECJ rules that search engines 'as a general rule,' should place the right to privacy above the right to information by the public. Under the verdict, links to irrelevant and out of date data need to be erased upon request, placing search engines in the role of controllers of information—beyond the role of being an arbitrator that linked to data that already existed in the public domain. The verdict is directed at highlighting the power of search engines to retrieve controversial information while limiting their capacity to do so in the future.</p>
<p style="text-align: justify; ">The ruling calls for maintaining a balance in addressing the legitimate interest of internet users in accessing personal information and upholding the data subject’s fundamental rights, but does not directly address either issues. The court also recognised, that the data subject's rights override the interest of internet users, however, with exceptions pertaining to nature of information, its sensitivity for the data subject's private life and the role of the data subject in public life. Acknowledging that data belongs to the individual and is not the right of the company, European Commissioner Viviane Reding, <a href="https://www.facebook.com/permalink.php?story_fbid=304206613078842&id=291423897690447&_ga=1.233872279.883261846.1397148393">hailed the verdict</a>, "a clear victory for the protection of personal data of Europeans".</p>
<p style="text-align: justify; ">The Court stated that if data is deemed irrelevant at the time of the case, even if it has been lawfully processed initially, it must be removed and that the data subject has the right to approach the operator directly for the removal of such content. The liability issue is further complicated by the fact, that search engines such as Google do not publish the content rather they point to information that already exists in the public domain—raising questions of the degree of liability on account of third party content displayed on their services.</p>
<p style="text-align: justify; ">The ECJ ruling is based on the case originally filed against Google, Spain and it is important to note that, González argued that searching for his name linked to two pages originally published in 1998, on the website of the Spanish newspaper La Vanguardia. The Spanish Data Protection Agency did not require La Vanguardia to take down the pages, however, it did order Google to remove links to them. Google appealed this decision, following which the National High Court of Spain sought advice from the European court. The definition of Google as the controller of information, raises important questions related to the distinction between liability of publishers and the liability of processors of information such as search engines.</p>
<h3>The 'right to be forgotten'</h3>
<p style="text-align: justify; ">The decision also brings to the fore, the ongoing debate and <a href="http://www.theguardian.com/technology/2013/apr/04/britain-opt-out-right-to-be-forgotten-law">fragmented opinions within the EU</a>, on the right of the individual to be forgotten. The <a href="http://www.bbc.com/news/technology-16677370">'right to be forgotten</a>' has evolved from the European Commission's wide-ranging plans of an overhaul of the commission's 1995 Data Protection Directive. The plans for the law included allowing people to request removal of personal data with an obligation of compliance for service providers, unless there were 'legitimate' reasons to do otherwise. Technology firms rallying around issues of freedom of expression and censorship, have expressed concerns about the reach of the bill. Privacy-rights activist and European officials have upheld the notion of the right to be forgotten, highlighting the right of the individual to protect their honour and reputation.</p>
<p style="text-align: justify; ">These issues have been controversial amidst EU member states with the UK's Ministry of Justice claiming the law 'raises unrealistic and unfair expectations' and has <a href="http://www.theguardian.com/technology/2013/apr/04/britain-opt-out-right-to-be-forgotten-law">sought to opt-out</a> of the privacy laws. The Advocate General of the European Court <a href="http://curia.europa.eu/juris/document/document.jsf?text=&docid=138782&pageIndex=0&doclang=EN&mode=req&dir=&occ=first&part=1&cid=362663#Footref91">Niilo Jääskinen's opinion</a>, that the individual's right to seek removal of content should not be upheld if the information was published legally, contradicts the verdict of the ECJ ruling. The European Court of Justice's move is surprising for many and as Richard Cumbley, information-management and data protection partner at the law firm Linklaters <a href="http://turnstylenews.com/2014/05/13/europe-union-high-court-establishes-the-right-to-be-forgotten/">puts it</a>, “Given that the E.U. has spent two years debating this right as part of the reform of E.U. privacy legislation, it is ironic that the E.C.J. has found it already exists in such a striking manner."</p>
<p style="text-align: justify; ">The economic implications of enforcing a liability regime where search engine operators censor legal content in their results aside, the decision might also have a chilling effect on freedom of expression and access to information. Google <a href="http://www.theguardian.com/technology/2014/may/13/right-to-be-forgotten-eu-court-google-search-results">called the decision</a> “a disappointing ruling for search engines and online publishers in general,” and that the company would take time to analyze the implications. While the implications of the decision are yet to be determined, it is important to bear in mind that while decisions like these are public, the refinements that Google and other search engines will have to make to its technology and the judgement calls on the fairness of the information available online are not public.</p>
<p style="text-align: justify; ">The ECJ press release is available <a href="http://curia.europa.eu/jcms/upload/docs/application/pdf/2014-05/cp140070en.pdf">here</a> and the actual judgement is available <a href="http://curia.europa.eu/juris/documents.jsf?pro=&lgrec=en&nat=or&oqp=&lg=&dates=&language=en&jur=C%2CT%2CF&cit=none%252CC%252CCJ%252CR%252C2008E%252C%252C%252C%252C%252C%252C%252C%252C%252C%252Ctrue%252Cfalse%252Cfalse&num=C-131%252F12&td=%3BALL&pcs=Oor&avg">here</a>.</p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/ecj-rules-internet-search-engine-operator-responsible-for-processing-personal-data-published-by-third-parties'>https://cis-india.org/internet-governance/blog/ecj-rules-internet-search-engine-operator-responsible-for-processing-personal-data-published-by-third-parties</a>
</p>
Publisher: none | Author: jyoti | Tags: Freedom of Speech and Expression, Social Media, Internet Governance, Intermediary Liability | Published: 2014-05-14T14:18:46Z | Type: Blog Entry