<?xml version="1.0" encoding="utf-8" ?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:syn="http://purl.org/rss/1.0/modules/syndication/" xmlns="http://purl.org/rss/1.0/">
<channel rdf:about="https://cis-india.org/search_rss">
  <title>Centre for Internet and Society</title>
  <link>https://cis-india.org</link>
  
  <description>These are the search results for the query, showing results 41 to 55.</description>
  <image rdf:resource="https://cis-india.org/logo.png"/>

  <items>
    <rdf:Seq>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/comments-to-the-personal-data-protection-bill-2019"/>
      <rdf:li rdf:resource="https://cis-india.org/jobs/researchers-welfare-gender-surveillance-call"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/ietf106"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/the-times-of-india-december-12-2019-power-over-privacy"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/huffignton-post-december-13-2019-rachna-khaira-outrage-as-privileged-iit-ians-use-tech-to-spy-on-sweepers"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/deccan-herald-november-6-2019-theres-sudeep-whatsapp-spy-attack-and-after"/>
      <rdf:li rdf:resource="https://cis-india.org/raw/making-voices-heard-project-announcement"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/et-tech-megha-mandavia-november-4-2019-cyber-law-experts-asks-why-cert-in-removed-advisory-warning-about-whatsapp-vulnerability"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/al-jazeera-video-november-8-2019-india-facial-recognition"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/un-special-rapporteur-on-the-right-to-privacy-consultation-on-privacy-and-gender"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/torsha-sarkar-suhan-s-and-gurshabad-grover-october-30-2019-through-the-looking-glass"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/comments-to-the-unhrc-report-on-gender-and-privacy"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/nipfp-seminar-on-exploring-policy-issues-in-the-digital-technology-arena"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/due-diligence-project-fgd-by-un-women"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/bsides-delhi-2019-security-conference"/>
    </rdf:Seq>
  </items>
  </items>

</channel>


    <item rdf:about="https://cis-india.org/internet-governance/blog/comments-to-the-personal-data-protection-bill-2019">
    <title>Comments to the Personal Data Protection Bill 2019</title>
    <link>https://cis-india.org/internet-governance/blog/comments-to-the-personal-data-protection-bill-2019</link>
    <description>
        &lt;b&gt;The Personal Data Protection Bill, 2019 was introduced in the Lok Sabha on December 11, 2019. &lt;/b&gt;
        
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;h4&gt;Please view our general comments below, or download as PDF &lt;a href="https://cis-india.org/accessibility/blog/cis-general-comments-to-the-pdp-bill-2019" class="internal-link" title="CIS' General Comments to the PDP Bill 2019"&gt;here&lt;/a&gt;.&lt;/h4&gt;
&lt;h4&gt;Our comments and recommendations can be downloaded as PDF &lt;a href="https://cis-india.org/accessibility/blog/cis-comments-pdp-bill-2019" class="internal-link" title="CIS Comments PDP Bill 2019"&gt;here&lt;/a&gt;.&lt;/h4&gt;
&lt;h4&gt;We have also prepared an annotated version of the Bill, where our detailed comments and recommendations can be viewed alongside the Bill, available as PDF &lt;a href="https://cis-india.org/accessibility/blog/annotated-ver-pdp-bill-2019" class="internal-link" title="Annotated ver PDP Bill 2019"&gt;here&lt;/a&gt;.&lt;/h4&gt;
&lt;hr /&gt;
&lt;h2&gt;General Comments&lt;/h2&gt;
&lt;h3&gt;1. Executive notification cannot abrogate fundamental rights &lt;br /&gt;&lt;/h3&gt;
&lt;p&gt;In 2017, the Supreme Court in K.S. Puttaswamy v Union of India [1] held the right to privacy to be a fundamental right. While this right is subject to reasonable restrictions, the restrictions have to meet a three-fold requirement, namely (i) existence of a law; (ii) legitimate state aim; and (iii) proportionality. Under the 2018 Bill, the exemption of government agencies processing personal data in the ‘interest of the security of the State’ [2] from the provisions of the Bill was subject to a law being passed by Parliament. However, under Clause 35 of the present Bill, the Central Government is merely required to pass a written order exempting the government agency from the provisions of the Bill. Any restriction on the right to privacy will have to comply with the conditions prescribed in Puttaswamy I. An executive order issued by the Central Government authorising any agency of the government to process personal data does not satisfy the first requirement laid down by the Supreme Court in Puttaswamy I, as it is not a law passed by Parliament. The Supreme Court, while deciding upon the validity of Aadhaar in K.S. Puttaswamy v Union of India [3], noted that “an executive notification does not satisfy the requirement of a valid law contemplated under Puttaswamy. A valid law in this case would mean a law passed by Parliament, which is just, fair and reasonable. Any encroachment upon the fundamental right cannot be sustained by an executive notification.”&lt;/p&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;h3&gt;2. Exemptions under Clause 35 do not comply with the legitimacy and proportionality test&lt;/h3&gt;
&lt;p&gt;The lead judgement in Puttaswamy I, while formulating the three-fold test, held that restraints on privacy emanate from the procedural and content-based mandate of Article 21 [4]. The Supreme Court in Maneka Gandhi v Union of India [5] had clearly established that “mere prescription of some kind of procedure cannot ever meet the mandate of Article 21. The procedure prescribed by law has to be fair, just and reasonable, not fanciful, oppressive and arbitrary” [6]. The existence of a law is the first requirement; the second requirement is that of a ‘legitimate state aim’. As per the lead judgement, this requirement ensures that “the nature and content of the law which imposes the restriction falls within the zone of reasonableness mandated by Article 14, which is a guarantee against arbitrary state action” [7]. It is established that for a provision which confers discretionary powers upon the executive or administrative authority to be regarded as non-arbitrary, the provision should lay down clear and specific guidelines for the executive to exercise the power [8]. The third test is that the restriction should be ‘proportionate’, i.e. the means adopted by the legislature are proportional to the object and needs sought to be fulfilled by the law. The Supreme Court in Modern Dental College &amp;amp; Research Centre v State of Madhya Pradesh [9] specified the components of the proportionality standard —&lt;/p&gt;
&lt;ol&gt;&lt;li&gt;A measure restricting a right must have a legitimate goal;&lt;/li&gt;
&lt;li&gt;It must be a suitable means of furthering this goal;&lt;/li&gt;
&lt;li&gt;There must not be any less restrictive, but equally effective alternative; and&lt;/li&gt;
&lt;li&gt;The measure must not have any disproportionate impact on the right holder.&lt;/li&gt;&lt;/ol&gt;
&lt;p&gt;Clause 35 provides extensive grounds for the Central Government to exempt any agency from the requirements of the Bill but does not specify the procedure to be followed by the agency while processing personal data under this provision. It merely states that the ‘procedure, safeguards and oversight mechanism to be followed’ will be prescribed in the rules. The wide powers conferred on the Central Government without a clearly specified procedure may be contrary to the three-fold test laid down in Puttaswamy I, as it is difficult to ascertain whether a legitimate or proportionate objective is being fulfilled [10].&lt;/p&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;h3&gt;3. Limited powers of Data Protection Authority in comparison with the Central Government&lt;/h3&gt;
&lt;p&gt;Compared with the previous version, the Personal Data Protection Bill, 2018, prepared by the Committee of Experts led by Justice Srikrishna, the present Bill curtails the powers of the Data Protection Authority (Authority) that it creates. Powers and functions originally intended to be performed by the Authority have now been allocated to the Central Government. For example:&lt;/p&gt;
&lt;ol&gt;&lt;li&gt;In the 2018 Bill, the Authority had the power to notify further categories of sensitive personal data. Under the present Bill, the Central Government in consultation with the sectoral regulators has been conferred the power to do so.&lt;/li&gt;
&lt;li&gt;Under the 2018 Bill, the Authority had the sole power to determine and notify significant data fiduciaries. Under the present Bill, however, the Central Government, in consultation with the Authority, has been given the power to notify social media intermediaries as significant data fiduciaries.&lt;/li&gt;&lt;/ol&gt;
&lt;p&gt;In order to govern data protection effectively, there is a need for a responsive market regulator with a strong mandate and resources. The political nature of personal data also requires that the governance of data, particularly the rule-making and adjudicatory functions performed by the Authority, be independent of the Executive.&lt;/p&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;h3&gt;4. No clarity on data sandbox&lt;/h3&gt;
&lt;p&gt;The Bill contemplates a sandbox for “innovation in artificial intelligence, machine-learning or any other emerging technology in public interest.” A data sandbox is a non-operational environment where an analyst can model and manipulate data inside the data management system. Data sandboxes have been envisioned as a secure area where only a copy of the company’s or participant companies’ data is located [11]. In essence, it refers to a scalable platform which can be used to explore an enterprise’s information sets. Regulatory sandboxes, on the other hand, are controlled environments where firms can introduce innovations to a limited customer base within a relaxed regulatory framework, after which they may be allowed entry into the larger market upon meeting certain conditions. This purportedly encourages innovation by lowering entry barriers and protecting newer entrants from unnecessary and burdensome regulation. Regulatory sandboxes can be interpreted as a form of responsive regulation by governments seeking to encourage innovation: selected companies are allowed to experiment with solutions in an environment relatively free of the cumbersome regulations they would ordinarily be subject to, while still subject to appropriate safeguards and regulatory requirements. However, the burdens that sandboxes ordinarily relax relate to high barriers to entry (such as capital requirements for financial and banking companies) or regulatory costs. In this Bill, by contrast, relaxing data protection provisions for data fiduciaries would restrict the privacy of individuals. Limiting a fundamental right on the ground of ‘fostering innovation’ is not a constitutionally tenable position, and contradicts the primary objectives of a data protection law.&lt;/p&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;h3&gt;5. The primacy of ‘harm’ in the Bill ought to be reconsidered&lt;/h3&gt;
&lt;p&gt;While a harms-based approach is necessary for data protection frameworks, such approaches should be restricted to the positive obligations, penal provisions and responsive regulation of the Authority. The Bill does not provide any guidance on either the interpretation of the term ‘harm’ [12] or the various activities covered within the definition of the term. Terms such as ‘loss of reputation or humiliation’ and ‘any discriminatory treatment’ are subjective standards and are open to varied interpretations. This ambiguity in the definition will make it difficult for the data principal to demonstrate harm and for the Authority to take necessary action, as several provisions are based upon harm being caused or likely to be caused. Some of the significant provisions where ‘harm’ is a precondition for the provision to come into effect are —&lt;/p&gt;
&lt;ol&gt;&lt;li&gt;Clause 25: The data fiduciary is required to notify the Authority about a breach of personal data processed by it, if such breach is likely to cause harm to any data principal. The Authority, after taking into account the severity of the harm that may be caused to the data principal, will determine whether the data principal should be notified about the breach.&lt;/li&gt;
&lt;li&gt;Clause 32 (2): A data principal can file a complaint with the data fiduciary for a contravention of any of the provisions of the Act which has caused or is likely to cause ‘harm’ to the data principal.&lt;/li&gt;
&lt;li&gt;Clause 64 (1): A data principal who has suffered harm as a result of any violation of the provisions of the Act by a data fiduciary has the right to seek compensation from the data fiduciary.&lt;/li&gt;
&lt;li&gt;Clause 16 (5): The guardian data fiduciary is barred from profiling, tracking or undertaking targeted advertising directed at children, and from undertaking any other processing of personal data that can cause significant harm to the child.&lt;/li&gt;&lt;/ol&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;h3&gt;6. Non personal data should be outside the scope of this Bill&lt;/h3&gt;
&lt;p&gt;Clause 91 (1) states that the Act does not prevent the Central Government from framing a policy for the digital economy, in so far as such policy does not govern personal data. The Central Government can, in consultation with the Authority, direct any data fiduciary to provide any anonymised personal data or other non-personal data to enable better targeting of delivery of services or formulation of evidence-based policies, in any manner as may be prescribed. It is concerning that the data protection bill has specifically carved out an exception for the Central Government to frame policies for the digital economy, and seems to indicate that the government plans to freely use any and all anonymised and/or non-personal data that rests with any data fiduciary falling under the ambit of the Bill to support the digital economy, including for its growth, security, integrity, and prevention of misuse. It is unclear how the government, in practice, will be able to compel organisations to share this data. Further, there is a lack of clarity on the contours of the definition of non-personal data, and the Bill does not define the term. It is also unclear whether the Central Government can compel the data fiduciary to transfer or share all forms of non-personal data, and what rights and obligations data fiduciaries and data principals have over such forms of data. Anonymised data refers to data which has been ‘irreversibly’ converted into a form in which the data principal cannot be identified. However, as several instances have shown, ‘irreversible’ anonymisation is not possible. In the United States, the home addresses of taxi drivers were uncovered, and in Australia individual health records were mined from anonymised medical bills [13].
In September 2019, the Ministry of Electronics and Information Technology constituted an expert committee under the chairmanship of Kris Gopalakrishnan to study various issues relating to non-personal data and to deliberate over a data governance framework for the regulation of such data. The provision should be deleted, and the scope of the Bill should be limited to the protection of personal data and to providing a framework for the protection of individual privacy. Until the report of the expert committee is published, the Central Government should not frame any law or regulation on the access and monetisation of non-personal or anonymised data, nor create a blanket provision allowing it to request such data from any data fiduciary that falls within the ambit of the Bill. If the government wishes to use data resting with a data fiduciary, it must do so on a case-by-case basis and under formal and legal agreements with each data fiduciary.&lt;/p&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;h3&gt;7. Steps towards greater decentralisation of power&lt;/h3&gt;
&lt;p&gt;We propose the following steps towards greater decentralisation of powers and devolved jurisdiction —&lt;/p&gt;
&lt;ol&gt;&lt;li&gt;Creation of State Data Protection Authorities: A single centralised body may not be the appropriate form of such a regulator. We propose that on the lines of central and state commissions under the Right to Information Act, 2005, state data protection authorities are set up which are in a position to respond to local complaints and exercise jurisdiction over entities within their territorial jurisdictions.&lt;/li&gt;
&lt;li&gt;More involvement of industry bodies and civil society actors: In order to lessen the burden on the data protection authorities, it is necessary that there is active engagement with industry bodies, sectoral regulators and civil society bodies engaged in privacy research. Currently, the Bill provides for the involvement of industry or trade associations, associations representing the interests of data principals, sectoral regulators or statutory authorities, and departments or ministries of the Central or State Government in the formulation of codes of practice. However, it would be useful to also have more active participation of industry associations and civil society bodies in activities such as promoting awareness among data fiduciaries of their obligations under this Act, and promoting measures and undertaking research for innovation in the field of protection of personal data.&lt;/li&gt;&lt;/ol&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;h3&gt;8. The Authority must be empowered to exercise responsive regulation&lt;/h3&gt;
&lt;p&gt;In a country like India, the challenge is to move rapidly from a state of little or no data protection law, and consequently an abysmal state of data privacy practices, to a strong data protection regulation and a powerful regulator capable of enabling robust data privacy practices. This requires a system of supportive mechanisms for the stakeholders in the data ecosystem, as well as systemic measures which enable the proactive detection of breaches. Further, keeping in mind the limited regulatory capacity in India, there is a need for the Authority to make use of different kinds of inexpensive and innovative strategies. We recommend that the following additional powers for the Authority be clearly spelt out in the Bill —&lt;/p&gt;
&lt;ol&gt;&lt;li&gt;Informal Guidance: It would be useful for the Authority to set up a mechanism on the lines of the Securities and Exchange Board of India (SEBI)’s Informal Guidance Scheme, which enables regulated entities to approach the Authority for non-binding advice on the position of law. Given that this is the first omnibus data protection law in India, and there is very little jurisprudence on the subject from India, it would be extremely useful for regulated entities to get guidance from the regulator.&lt;/li&gt;
&lt;li&gt;Power to name and shame: The practice of a DPA making public the names of organisations that have seriously contravened data protection legislation is known as “naming and shaming.” The UK ICO and other DPAs recognise the power of publicity, as evidenced by their willingness to co-operate with the media. The ICO does not simply post monetary penalty notices (MPNs, or fines) on its website for journalists to find, but frequently issues press releases, briefs journalists and uses social media. The ICO’s publicity statement on communicating enforcement activities states that the “ICO aims to get media coverage for enforcement activities.”&lt;/li&gt;
&lt;li&gt;Undertakings: The UK ICO has also leveraged the threat of fines into an alternative enforcement mechanism, seeking contractual undertakings from data controllers to take certain remedial steps. Undertakings have significant advantages for the regulator. Since an undertaking is a more “co-operative” solution, it is less likely that a data controller will challenge it. An undertaking is simpler and easier to put in place. Furthermore, the Authority can put an undertaking in place quickly, as opposed to legal proceedings, which take longer.&lt;/li&gt;&lt;/ol&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;h3&gt;9. No clear roadmap for the implementation of the Bill&lt;/h3&gt;
&lt;p&gt;The 2018 Bill had specified a roadmap for the different provisions of the Bill to come into effect from the date of the Act being notified [14]. It specifically stated the time period within which the Authority had to be established and the subsequent rules and regulations notified. The present Bill does not specify any such blueprint; it does not provide any details on either when the Bill will be notified or the time period within which the Authority shall be established and specific rules and regulations notified. Considering that 25 provisions have been deferred to rules to be framed by the Central Government and a further 19 provisions have been deferred to regulations to be notified by the Authority, the absence and/or delayed notification of such rules and regulations will impact the effective functioning of the Bill. The absence of any sunrise or sunset provision may disincentivise political or industrial will to support or enforce the provisions of the Bill. An example of such a lack of political will was the establishment of the Cyber Appellate Tribunal. The tribunal was established in 2006 to redress cyber fraud. However, it was virtually a defunct body from 2011 onwards, when the last chairperson retired. It was eventually merged with the Telecom Dispute Settlement and Appellate Tribunal in 2017. We recommend that the Bill clearly lay out a time period for the implementation of its different provisions, especially a time frame for the establishment of the Authority. This is important to give full and effective effect to the right to privacy of the individual.
It is also important to ensure that individuals have an effective mechanism to enforce the right and seek recourse in case of any breach of obligations by the data fiduciaries. For offences, we suggest a system of ‘mail boxing’, where provisions and punishments are enforced in a staggered manner for a period until the fiduciaries are aligned with the provisions of the Act. The Authority must ensure that data principals and fiduciaries have sufficient awareness of the provisions of this Bill before the provisions for punishment are brought into force. This will allow the data fiduciaries to align their practices with the provisions of this new legislation, and the Authority will also have time to define and determine certain provisions that the Bill has left to the Authority. Additionally, penalties for offences must initially be enforced in a staggered process, combined with provisions such as warnings, in order to save first-time and mistaken offenders from paying a high price. This will allay the fears of smaller companies and startups that might avoid processing data for fear of paying penalties.&lt;/p&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;h3&gt;10. Lack of interoperability&lt;/h3&gt;
&lt;p&gt;In its current form, a number of the provisions in the Bill will make it difficult for India’s framework to be interoperable with other frameworks globally and in the region. For example, differences between the draft Bill and the GDPR can be found in the grounds for processing, data localisation frameworks, the framework for cross-border transfers, definitions of sensitive personal data, inclusion of the undefined category of ‘critical data’, and the roles of the Authority and the Central Government.&lt;/p&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;h3&gt;11. Legal Uncertainty&lt;/h3&gt;
&lt;p&gt;In its current structure, there are a number of provisions in the Bill that, when implemented, run the risk of creating an environment of legal uncertainty. These include: the lack of a definition of critical data; the lack of clarity in the interpretation of the terms ‘harm’ and ‘significant harm’; the ability of the government to define further categories of sensitive personal data; the inclusion of requirements for ‘social media intermediaries’; the inclusion of ‘non-personal data’; the framing of the requirements for data transfers; the bar on processing of certain forms of biometric data as defined by the Central Government; the functioning between a consent manager and another data fiduciary; the inclusion of an AI sandbox; and the definition of state. To ensure the greatest protection of individual privacy rights and of personal data while also enabling innovation, it is important that any data protection framework is structured and drafted in a way that provides as much legal certainty as possible.&lt;/p&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;h3&gt;Endnotes&lt;/h3&gt;
&lt;p&gt;1. (2017) 10 SCC 641 (“Puttaswamy I”).&lt;/p&gt;
&lt;p&gt;2. Clause 42(1) of the 2018 Bill states that “Processing of personal data in the interests of the security of the State shall not be permitted unless it is authorised pursuant to a law, and is in accordance with the procedure established by such law, made by Parliament and is necessary for, and proportionate to such interests being achieved.”&lt;/p&gt;
&lt;p&gt;3. (2019) 1 SCC 1 (“Puttaswamy II”).&lt;/p&gt;
&lt;p&gt;4. Puttaswamy I, supra, para 180.&lt;/p&gt;
&lt;p&gt;5. (1978) 1 SCC 248.&lt;/p&gt;
&lt;p&gt;6. Ibid para 48.&lt;/p&gt;
&lt;p&gt;7. Puttaswamy I, supra, para 180.&lt;/p&gt;
&lt;p&gt;8. State of W.B. v. Anwar Ali Sarkar, 1952 SCR 284; Satwant Singh Sawhney v A.P.O., AIR 1967 SC 1836.&lt;/p&gt;
&lt;p&gt;9. (2016)7 SCC 353.&lt;/p&gt;
&lt;p&gt;10. Dvara Research “Initial Comments of Dvara Research dated 16 January 2020 on the Personal Data Protection Bill, 2019 introduced in Lok Sabha on 11 December 2019”, January 2020, https://www.dvara.com/blog/2020/01/17/our-initial-comments-on-the-personal-data-protection-bill-2019/ (“Dvara Research”).&lt;/p&gt;
&lt;p&gt;11. “A Data Sandbox for Your Company”, Terrific Data, last accessed on January 31, 2019, http://terrificdata.com/2016/12/02/3221/.&lt;/p&gt;
&lt;p&gt;12. Clause 3(20) — “harm” includes (i) bodily or mental injury; (ii) loss, distortion or theft of identity; (iii) financial loss or loss of property; (iv) loss of reputation or humiliation; (v) loss of employment; (vi) any discriminatory treatment; (vii) any subjection to blackmail or extortion; (viii) any denial or withdrawal of service, benefit or good resulting from an evaluative decision about the data principal; (ix) any restriction placed or suffered directly or indirectly on speech, movement or any other action arising out of a fear of being observed or surveilled; or (x) any observation or surveillance that is not reasonably expected by the data principal.&lt;/p&gt;
&lt;p&gt;13. Alex Hern “Anonymised data can never be totally anonymous, says study”, July 23, 2019 https://www.theguardian.com/technology/2019/jul/23/anonymised-data-never-be-anonymous-enough-study-finds.&lt;/p&gt;
&lt;p&gt;14. Clause 97 of the 2018 Bill states: “(1) For the purposes of this Chapter, the term ‘notified date’ refers to the date notified by the Central Government under sub-section (3) of section 1. (2) The notified date shall be any date within twelve months from the date of enactment of this Act. (3) The following provisions shall come into force on the notified date: (a) Chapter X; (b) Section 107; and (c) Section 108. (4) The Central Government shall, no later than three months from the notified date, establish the Authority. (5) The Authority shall, no later than twelve months from the notified date, notify the grounds of processing of personal data in respect of the activities listed in sub-section (2) of section 17. (6) The Authority shall, no later than twelve months from the notified date, issue codes of practice on the following matters: (a) notice under section 8; (b) data quality under section 9; (c) storage limitation under section 10; (d) processing of personal data under Chapter III; (e) processing of sensitive personal data under Chapter IV; (f) security safeguards under section 31; (g) research purposes under section 45; (h) exercise of data principal rights under Chapter VI; (i) methods of de-identification and anonymisation; (j) transparency and accountability measures under Chapter VII. (7) Section 40 shall come into force on such date as is notified by the Central Government for the purpose of that section. (8) The remaining provisions of the Act shall come into force eighteen months from the notified date.”&lt;/p&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/comments-to-the-personal-data-protection-bill-2019'&gt;https://cis-india.org/internet-governance/blog/comments-to-the-personal-data-protection-bill-2019&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Amber Sinha, Elonnai Hickok, Pallavi Bedi, Shweta Mohandas, Tanaya Rajwade</dc:creator>
    <dc:rights></dc:rights>

    <dc:subject>Internet Governance</dc:subject>
    <dc:subject>Data Protection</dc:subject>
    <dc:subject>Privacy</dc:subject>

   <dc:date>2020-02-21T10:13:35Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/jobs/researchers-welfare-gender-surveillance-call">
    <title>Call for Researchers: Welfare, Gender, and Surveillance</title>
    <link>https://cis-india.org/jobs/researchers-welfare-gender-surveillance-call</link>
    <description>
        &lt;b&gt;We are inviting applications for two researchers. Each researcher is expected to write a narrative essay that interrogates the modes of surveillance that people of LGBTHIAQ+ and gender non-conforming identities and sexual orientations are put under as they seek sexual and reproductive health (SRH) services in India. The researchers are expected to undertake field research in the location they are based in, and reflect on lived experiences gathered through field research as well as their own experiences of doing field research. Please read the sections below for more details about the work involved, the timeline for the same, and the application process for this call.&lt;/b&gt;
        
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;h4&gt;Call for Researchers: &lt;a href="https://github.com/cis-india/website/raw/master/docs/CIS_Researchers_WelfareGenderSurveillance_Call_20200110.pdf" target="_blank"&gt;Download&lt;/a&gt; (PDF)&lt;/h4&gt;
&lt;hr /&gt;
&lt;h3&gt;&lt;strong&gt;Description of the Work&lt;/strong&gt;&lt;/h3&gt;
&lt;p&gt;Each researcher is expected to author a narrative essay that presents and reflects on lived experiences of people of LGBTHIAQ+ and gender non-conforming identities and sexual orientations as they seek sexual and reproductive health (SRH) services in India. We expect the essay to contribute to a larger body of knowledge around the increasing focus on data-driven initiatives for public health provision in the country and elsewhere. Accordingly, the researcher may respond to any one or more than one of the following questions, within the context of the geographical focus as specified by the researcher:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;What are the modes of surveillance, especially in terms of generation and exploitation of digital data, experienced by people of marginalised gender identities and sexual orientations in India, as they avail of sexual and reproductive healthcare?&lt;/li&gt;
&lt;li&gt;How are the lived experiences of underserved populations, such as people of marginalised gender identities and sexual orientations, shaped by gendered surveillance while accessing sexual and reproductive services?&lt;/li&gt;
&lt;li&gt;What are the modes of governance and gender ideologies that have mediated the increasing datafication of such provision?&lt;/li&gt;&lt;/ul&gt;
&lt;p&gt;We expect the researchers to a) draw on the Indian Supreme Court’s framing of privacy as a fundamental right in India, and its implications; and b) apply and/or build on feminist conceptualisations of privacy. Further, we expect the researchers to respond to the uncertain landscape of legal rights accessible to people of LGBTHIAQ+ and gender non-conforming identities and sexual orientations, especially in the current context shaped by The Transgender Persons (Protection of Rights) Act, 2019.&lt;/p&gt;
&lt;p&gt;The researchers will undertake field research in locations of their choice, conduct interviews and discussions with people of LGBTHIAQ+ and gender non-conforming identities and sexual orientations seeking such services, and conduct formal and informal interviews with officials and personnel associated with public and private sector agencies involved in the provision of SRH services.&lt;/p&gt;
&lt;h3&gt;&lt;strong&gt;Eligibility and Application Process&lt;/strong&gt;&lt;/h3&gt;
&lt;h4&gt;We specifically encourage people of LGBTHIAQ+ and gender non-conforming identities and sexual orientations to submit their applications for this call for researchers.&lt;/h4&gt;
&lt;p&gt;We are seeking applications from individuals who:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Are based in the place where field study is to be undertaken, for the duration of the study;&lt;/li&gt;
&lt;li&gt;Are fluent in the main regional language(s) spoken in the city where the study will be conducted, and in English (especially written);&lt;/li&gt;
&lt;li&gt;Preferably have a postgraduate degree (current students should also apply) in social or technical sciences, journalism, or legal studies (undergraduate degree-holders with research or work experience should also apply); and&lt;/li&gt;
&lt;li&gt;Have previous research and writing experiences on issues at the intersection of sexual and reproductive health, gender justice and women’s rights, and health informatics or digital public health.&lt;/li&gt;&lt;/ul&gt;
&lt;p&gt;Please send the following documents (in text or PDF formats) to &lt;strong&gt;raw@cis-india.org by Friday, January 24&lt;/strong&gt; to apply for the researcher positions:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Brief CV with relevant academic and professional information;&lt;/li&gt;
&lt;li&gt;Two samples of academic/professional (published/unpublished) writing by the applicant; and&lt;/li&gt;
&lt;li&gt;A brief research proposal (around 500 words) that should specify the scope (geographical and conceptual), research questions, and motivation of the essay to be authored by the applicant.&lt;/li&gt;&lt;/ul&gt;
&lt;p&gt;All applicants will be informed of the selection decisions by Friday, January 31.&lt;/p&gt;
&lt;h3&gt;&lt;strong&gt;Timeline of the Work&lt;/strong&gt;&lt;/h3&gt;
&lt;p&gt;&lt;strong&gt;February 3-7&lt;/strong&gt; CIS research team will have a call with each researcher to plan out the work to be undertaken by them&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;February - March&lt;/strong&gt; Researchers are to undertake field research, as proposed by the researchers and discussed with the CIS research team&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;March 27&lt;/strong&gt; Researchers are to submit a full draft essay (around 3,000 words)&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;March 30 - April 3&lt;/strong&gt; CIS research team will have a call with each researcher to discuss the shared draft essays and make plans towards their finalisation&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;May 15&lt;/strong&gt; Researchers are to submit the final essay (around 5,000 words, excluding footnotes and references)&lt;/p&gt;
&lt;p&gt;As part of this project, CIS will organise two discussion events, in Bengaluru and New Delhi, during April-June (tentatively). Event dates will be decided in conversation with the researchers, who will be invited to present their work at these events.&lt;/p&gt;
&lt;h3&gt;&lt;strong&gt;Remuneration&lt;/strong&gt;&lt;/h3&gt;
&lt;p&gt;Each researcher will be paid a remuneration of Rs. 1,00,000 (inclusive of taxes) in two equal installments: the first on signing of the agreement in February 2020, and the second on submission of the final essay in May 2020.&lt;/p&gt;
&lt;p&gt;We will also reimburse local travel expenses of each researcher up to Rs. 10,000, and translation and transcription expenses (if any) incurred by each researcher up to Rs. 10,000. These reimbursements will be made on the basis of expense invoices shared by the researcher.&lt;/p&gt;
&lt;h3&gt;&lt;strong&gt;Description of the Project&lt;/strong&gt;&lt;/h3&gt;
&lt;p&gt;Previous research conducted by CIS on the subject of sexual and reproductive health (SRH) services in India observes that there is a complex web of surveillance, or ‘dataveillance’, around each patient as they avail of SRH services from the state. In this current project, we are aiming to map the ecosystem of surveillance around SRH services as their provision becomes increasingly ‘data-driven’, and explore its implications for patients and beneficiaries.&lt;/p&gt;
&lt;p&gt;Through this project, we are interested in documenting the roles played by both the public and the private sector actors in this ecosystem of health surveillance. We understand the role of private sector actors as central to state provision of sexual and reproductive health services, especially through the institutionalisation of data-driven health insurance models, as well as through extensive privatisation of public health services. By studying semi-private, private, and public medical establishments including hospitals, primary/community health centres and clinics, we aim to develop a comparative analysis of surveillance ecosystems across the three establishment types.&lt;/p&gt;
&lt;p&gt;This project is led by Ambika Tandon, Aayush Rathi, and Sumandro Chattapadhyay at the Centre for Internet and Society, and is supported by a grant from Privacy International.&lt;/p&gt;
&lt;h3&gt;&lt;strong&gt;Indicative Reading List&lt;/strong&gt;&lt;/h3&gt;
&lt;p&gt;&lt;em&gt;We are sharing below a short and indicative list of readings that may be useful for potential applicants&lt;/em&gt;.&lt;/p&gt;
&lt;p&gt;Aayush Rathi, &lt;a href="https://www.epw.in/engage/article/indias-digital-health-paradigm-foolproof" target="_blank"&gt;Is India's Digital Health System Foolproof?&lt;/a&gt; (2019)&lt;/p&gt;
&lt;p&gt;Aayush Rathi and Ambika Tandon, &lt;a href="https://www.epw.in/engage/article/data-infrastructures-inequities-why-does-reproductive-health-surveillance-india-need-urgent-attention" target="_blank"&gt;Data Infrastructures and Inequities: Why Does Reproductive Health Surveillance in India Need Our Urgent Attention?&lt;/a&gt; (2019)&lt;/p&gt;
&lt;p&gt;Ambika Tandon, &lt;a href="https://cis-india.org/internet-governance/blog/ambika-tandon-december-23-2018-feminist-methodology-in-technology-research" target="_blank"&gt;Feminist Methodology in Technology Research: A Literature Review&lt;/a&gt; (2018)&lt;/p&gt;
&lt;p&gt;Ambika Tandon, &lt;a href="https://cis-india.org/raw/big-data-reproductive-health-india-mcts" target="_blank"&gt;Big Data and Reproductive Health in India: A Case Study of the Mother and Child Tracking System&lt;/a&gt; (2019)&lt;/p&gt;
&lt;p&gt;Anja Kovacs, &lt;a href="https://genderingsurveillance.internetdemocracy.in/theory/" target="_blank"&gt;Reading Surveillance through a Gendered Lens: Some Theory&lt;/a&gt; (2017)&lt;/p&gt;
&lt;p&gt;Lindsay Weinberg, &lt;a href="https://www.westminsterpapers.org/articles/10.16997/wpcc.258/" target="_blank"&gt;Rethinking Privacy: A Feminist Approach to Privacy Rights after Snowden&lt;/a&gt; (2017)&lt;/p&gt;
&lt;p&gt;Nicole Shephard, &lt;a href="https://www.apc.org/en/pubs/big-data-and-sexual-surveillance" target="_blank"&gt;Big Data and Sexual Surveillance&lt;/a&gt; (2016)&lt;/p&gt;
&lt;p&gt;Sadaf Khan, &lt;a href="https://deepdives.in/data-bleeding-everywhere-a-story-of-period-trackers-8766dc6a1e00" target="_blank"&gt;Data Bleeding Everywhere: A Story of Period Trackers&lt;/a&gt; (2019)&lt;/p&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/jobs/researchers-welfare-gender-surveillance-call'&gt;https://cis-india.org/jobs/researchers-welfare-gender-surveillance-call&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>ambika</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Welfare Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Gender</dc:subject>
    
    
        <dc:subject>Gender, Welfare, and Privacy</dc:subject>
    
    
        <dc:subject>Researchers at Work</dc:subject>
    

   <dc:date>2020-02-13T15:05:37Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/ietf106">
    <title>IETF106</title>
    <link>https://cis-india.org/internet-governance/news/ietf106</link>
    <description>
        &lt;b&gt;Gurshabad Grover participated in IETF 106, held in Singapore from 16 to 22 November 2019.&lt;/b&gt;
        &lt;p class="moz-quote-pre"&gt;In the meeting of the Human Rights Protocol Considerations (hrpc) research group, I presented an update to draft-irtf-hrpc-guidelines-03 (Guidelines for Human Rights Protocol and Architecture Considerations), which is an Internet Draft adopted by the hrpc rg that he is co-editing with Niels ten Oever. &lt;a class="external-link" href="https://datatracker.ietf.org/doc/draft-irtf-hrpc-guidelines/"&gt;More info here&lt;/a&gt;.&lt;/p&gt;
&lt;p class="moz-quote-pre" style="text-align: justify; "&gt;Among other working/research group meetings, I participated theTransport Layer Security (tls) and the Privacy Enhancements and Assessments research group (pearg) sessions. I also participated inseveral side meetings, including the Public Interest Technology Group(pitg) meeting.&lt;/p&gt;
&lt;p class="moz-quote-pre" style="text-align: justify; "&gt;Agenda for the IETF and the different WGs/RG can be found on the &lt;a class="external-link" href="https://datatracker.ietf.org/meeting/106/agenda"&gt;IETF website&lt;/a&gt;.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/ietf106'&gt;https://cis-india.org/internet-governance/news/ietf106&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Admin</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2019-12-15T06:14:02Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/the-times-of-india-december-12-2019-power-over-privacy">
    <title>Power over privacy: New Personal Data Protection Bill fails to really protect the citizen’s right to privacy</title>
    <link>https://cis-india.org/internet-governance/news/the-times-of-india-december-12-2019-power-over-privacy</link>
    <description>
        &lt;b&gt;Nikhil Pahwa throws light on the new personal data protection bill.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article by Nikhil Pahwa was &lt;a class="external-link" href="https://timesofindia.indiatimes.com/blogs/toi-edit-page/power-over-privacy-new-personal-data-protection-bill-fails-to-really-protect-the-citizens-right-to-privacy/"&gt;published in the Times of India&lt;/a&gt; on December 12, 2019. CIS report was mentioned.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;Earlier this year, in April, &lt;a href="https://blog.trendmicro.com/trendlabs-security-intelligence/55m-registered-voters-risk-philippine-commission-elections-hacked/" rel="noopener noreferrer" target="_blank"&gt;a data breach&lt;/a&gt; in the Election Commission of Philippines led to the leakage of personal information of over 55 million eligible voters on a searchable website: including names, addresses and date of birth. This was not the first data breach from the Election Commission. After the first, which took place in March 2016, where  340 GB of voter data was &lt;a href="http://www.rappler.com/newsbreak/in-depth/127870-comelec-leak-identity-theft-scams-experts" rel="noopener noreferrer" target="_blank"&gt;published online by a group of hackers called LulzSec Pilipinas&lt;/a&gt;, the National Privacy Commission of Philippines found that the Election Commission had violated the Data Privacy Act of 2012, and &lt;a href="https://www.privacy.gov.ph/2017/01/privacy-commission-finds-bautista-criminally-liable-for-comeleak-data-breach/" rel="noopener noreferrer" target="_blank"&gt;recommended criminal prosecution of its chairman&lt;/a&gt;, finding him liable when the agency failed to dispense its duty as a “personal information controller”.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;It’s 2019, and that recommendation has still not been acted upon, because the National Privacy Commission of Philippines only has recommendatory powers for criminal prosecution. Meanwhile, data breaches continue at the Election Commission of Philippines.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Between 2017 and 2018, Aadhaar related personally identifiable data of several Indian citizens, including names, addresses, bank account numbers, in some cases pregnancy information and even religion and caste information of individuals, was published online by Indian government departments. The Centre for Internet and Society, in a report, estimated that &lt;a href="https://www.medianama.com/2017/05/223-aadhaar-numbers-data-leak/" rel="noopener noreferrer" target="_blank"&gt;personally identifiable data for 130-135 million Indian citizens had been leaked&lt;/a&gt;, thus putting them at risk. 210 government websites had made Aadhaar related data public, &lt;a href="https://www.thehindu.com/news/national/210-govt-websites-made-aadhaar-details-public-uidai/article20555266.ece" rel="noopener noreferrer" target="_blank"&gt;UIDAI confirmed in response to an RTI in 2017&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;No one was held liable. There was no data protection law, no data protection authority, no criminal prosecution was recommended. Around that time, the Indian government was instead arguing in the Supreme Court that privacy isn’t a fundamental right under the Indian Constitution.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;What we can learn from these two instances is that for the enforcement of a citizen’s right to privacy, and ensuring that no one takes the protection of data lightly, there needs to be a strong privacy law that holds even the government responsible, and above all, a strong data protection authority that is independent and has powers to penalise even government officials. On some of these counts, the Personal Data Protection Bill, 2019, disappoints.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;First, members of the Data Protection Authority will no longer be appointed by independent entities from diverse backgrounds: where they were previously going to be appointed by a committee comprising the Chief Justice of India or a Supreme Court judge, the Cabinet secretary, and an independent expert, the power to appoint members to DPA now rests solely with government officials, including the appointment of adjudicating officers. In addition, the central government, in the interest of “national security, sovereignty, international relations and public order, can issue directions to DPA, which DPA will be bound by. Powers of DPA have also been reduced: while in the previous version of the bill, DPA had the sole power to categorise data as sensitive personal data, in the current version, the power rests with the central government, albeit in consultation with DPA. The central government will also notify any social media company as a significant data fiduciary, and not DPA. Only the central government can determine what critical personal data is, and not DPA.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This dependence on the government for appointments, functions and definitions, will invariably impact the independence of DPA, and even though the 2019 version of the bill gives it the authority to fine the state a maximum of Rs 5-15 crore, depending on the offence, i’d be surprised if this ever happens.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The bill does create significant exceptions for the state to acquire and process data, and an opportunity to create a base for surveillance reform in the country has been lost. The previous version of the bill had brought some sense of safety against mass surveillance, when it included the condition that processing of data by the government must be “necessary and proportionate”, drawing from Supreme Court’s historic right to privacy judgment. This is particularly important given that the bill also gives power to the government to exempt any agency from the provisions of the bill for processing of personal data, which includes acquiring data from any public or private entity.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Effectively, this means that government agencies may be exempt from any scrutiny by DPA, and can even collect data from third parties (for example, fin-tech companies, health-tech startups) without the user even knowing. Forget recommending criminal prosecution for mass surveillance, India’s DPA won’t even be able to fine a government agency for such a violation of the fundamental right to privacy. The government also has vast exceptions for data processing: “for the performance of any function of the state authorised by law”.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This aside, one of the more curious clauses in the bill is around non-personal data. The government, a few months ago, constituted a committee led by Infosys co-founder Kris Gopalakrishnan to look into the governance of non-personal data. Non-personal data, as the term suggests, is any data that is not related to an individual. In the bill, the government has given itself the right to acquire this data, which is essentially a company’s intellectual property, to “promote framing of policies for digital economy”. Why non-personal data finds a mention in a Personal Data Protection Bill is beyond comprehension, and this move will not inspire much confidence in businesses operating in India, when the state claims eminent domain over intellectual property.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;It’s unfortunate minister Ravi Shankar Prasad is sending the bill to a select committee, given the fact that such significant changes to the bill should have led to another public consultation.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/the-times-of-india-december-12-2019-power-over-privacy'&gt;https://cis-india.org/internet-governance/news/the-times-of-india-december-12-2019-power-over-privacy&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Nikhil Pahwa</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2019-12-15T05:57:31Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/huffignton-post-december-13-2019-rachna-khaira-outrage-as-privileged-iit-ians-use-tech-to-spy-on-sweepers">
    <title>Outrage As Privileged IITians Use Tech To Spy On Sweepers </title>
    <link>https://cis-india.org/internet-governance/news/huffignton-post-december-13-2019-rachna-khaira-outrage-as-privileged-iit-ians-use-tech-to-spy-on-sweepers</link>
    <description>
        &lt;b&gt;Some members of the housekeeping staff at IIT Ropar were put under round-the-clock surveillance during working hours for many days in February this year, without their consent. IIT Ropar Director Prof S K Das has ordered a probe into the incident.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article by Rachna Khaira was &lt;a class="external-link" href="https://www.huffingtonpost.in/entry/outrage-as-privileged-iitians-use-tech-to-spy-on-sweepers_in_5df1bbc8e4b06a50a2e9e659"&gt;published in Huffington Post&lt;/a&gt; on December 31, 2019. Aayush Rathi was quoted.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;The Indian Institute of Technology (IIT), Ropar is conducting a probe into the reported tagging and round the clock electronic surveillance of some housekeeping staff members as part of an experiment run by the Technology Business Incubation Foundation (TBIF) located at the IIT campus  in February this year.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;em&gt;HuffPost India &lt;/em&gt;has learnt that the TBIF, a tech incubator run within IIT Ropar, signed off on the “Sweepy” project in which housekeeping staff were given wristbands and brooms secretly embedded with tracking chips, without seeking the consent of the janitorial staff, or informing IIT Ropar management.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;While the housekeeping staff were told the wristbands would record their pulse and heart beat, and that they should wear it while cleaning the campus, the tracking chips were used to track to assess if they were sweeping out hard-to-reach corners of the institute.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Prof. Sarit Kumar Das, Director IIT Ropar  told HuffPost India that a  three member committee comprising of Prof. Bijoy H Barua, Prof. Javed Agrewala and Prof. Deepak Kashyap has been set up to look into the matter.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;“We at the IIT Ropar respect privacy and  condemn any such violation made by any of our student or staff member,” said Prof. Das. “Before conducting any experiment on human beings, an approval has to be sought from the human ethics team constituted  in  our institution and they present a case to me after seeking a written consent from the people who would undergo the experiment. Only, after getting my approval, such an experiment can be conducted at the campus.”&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Sweeping surveillance&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;J K Sharma, the Chief Operating Officer of TBIF, told &lt;em&gt;HuffPost India&lt;/em&gt; that his tech incubator deliberately misled the housekeeping staff about the true purpose of the wristband as they felt the housekeeping staff wouldn’t agree to wear such a device.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;While elaborating more on the ‘Sweepy’ project, Sharma said that the project was based on an idea that came to the hostellers who were upset over the housekeeping staff for not cleaning their rooms.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;“The sweepers were not working properly and despite reporting the matter several times to the authorities, they were not taking any cognisance. Perturbed, the students developed this programme in which the location of the sweeper can be recorded and monitored in a control room by a gadget tied to the sweeper’s wrist,” said Sharma.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;He further added that a beacon records the activity of the sensor pasted to the broom or mop held by the sweeper and can monitor the area  and the time in which it was used. The report was produced digitally on the screen.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Was a consent sought from the sweepers before tagging them?&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;“The testing was done in a secret manner as the housekeeping staff may not have given their consent for the trial. We tried it on three sweepers and while two of them were found working dedicatedly, one was found to have missed  cleaning from few areas assigned to him,” said Sharma.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The findings were shared with the housekeeping supervisor who later directed his staff to do their duty more diligently.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The team working on the project however told &lt;em&gt;HuffPost India&lt;/em&gt; that they secured the privacy of the housekeeping staff by removing the microphone from the gadgets tied to their wrists.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This technology does not have video feature and only monitors location of a moving object and is quite cheap as compared to the radio-frequency identification (RFID) technology that uses electromagnetic fields to automatically identify and track tags attached to objects.&lt;/p&gt;
&lt;blockquote class="pull-quote content-list-component" style="text-align: justify; "&gt;“The testing was done in a secret manner as the housekeeping staff may not have given their consent for the trial. We tried it on three sweepers and while two of them were found working dedicatedly, one was found to have missed cleaning a few areas assigned to him.” - J K Sharma, Chief Operating Officer, Technology Business Incubation Foundation, IIT Ropar&lt;/blockquote&gt;
&lt;p style="text-align: justify; "&gt;Calling this an increasingly commonplace trend of covert spying on domestic workers without their knowledge, Ayush Rathi, Programme Officer, Centre for Internet and Society, said that the housekeeping staff was made to wear the gadget under a false pretense is telling.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;“This is a classic example of how the access to privacy is stratified along the axes of class, caste and gender. And ties in closely with a key purpose of surveillance — that of exerting control over people’s bodies to conform to the surveiller’s ideas of right and wrong,” said Rathi.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;He further added that in many ways, this story captures the zeitgeist of the 21st century. The is the essence of so much of what qualifies as innovation today is that they seek to find technological solutions to problems that are structural in nature.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;“So, in this instance it is very evident that the objective sought to be achieved was not to merely ‘fix’ the problem of the housekeeping staff performing its duties well, but to solely hold them guilty for failing to do so,” said Rathi.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;An alternate, albeit more tedious, approach would have been to speak with the workers and iron out the struggles they were facing at the workplace that were preventing them from performing their job well. Any solution could only have been prepared thereafter — he added.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;As per Prof. Das, a major problem with the engineering students is that unlike medical students, 90 percent of their experiments are based on machines and not human beings.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;“There is  too much deficiency of  the understanding of human psychology amongst engineering students. To curb this, we at the IIT have started a mandatory course on human ethics which is being taught by some of the renowned human psychology experts. Still sometimes, the violations gets reported,” said Prof. Das.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/huffignton-post-december-13-2019-rachna-khaira-outrage-as-privileged-iit-ians-use-tech-to-spy-on-sweepers'&gt;https://cis-india.org/internet-governance/news/huffignton-post-december-13-2019-rachna-khaira-outrage-as-privileged-iit-ians-use-tech-to-spy-on-sweepers&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Rachna Khaira</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2019-12-15T05:33:21Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/deccan-herald-november-6-2019-theres-sudeep-whatsapp-spy-attack-and-after">
    <title>WhatsApp spy attack and after</title>
    <link>https://cis-india.org/internet-governance/news/deccan-herald-november-6-2019-theres-sudeep-whatsapp-spy-attack-and-after</link>
    <description>
        &lt;b&gt;Bengaluru experts analyse the Pegasus snooping scandal, and provide advice on what you can do about the gaping holes in your mobile phone security.&lt;/b&gt;
        &lt;p&gt;The article by Theres Sudeep was published in &lt;a class="external-link" href="https://www.deccanherald.com/metrolife/metrolife-your-bond-with-bengaluru/whatsapp-spy-attack-and-after-773955.html"&gt;Deccan Herald&lt;/a&gt; on November 6, 2019. Aayush Rathi was quoted.&lt;/p&gt;
&lt;hr /&gt;
&lt;p&gt;Last week ended with a sensational piece of news: WhatsApp said spyware Pegasus was being used to hack into the phones of activists and journalists in India.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The software is the brainchild of the NSO Group, an Israeli company. WhatsApp has detected 1,400 instances of Pegasus being used in the latest wave of attacks between April 29 and May 10. WhatsApp has identified 100-plus cases targeting human rights defenders and journalists. About two dozen of these attacks were in India.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Among those whose security was reportedly compromised is Congress leader Priyanka Gandhi.The first question is who ordered this snooping. NSO claims they sell their technology only to government agencies for lawful investigation into crime and terrorism. Speculation is rife that there is government involvement in the snooping.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Vinay Srinivas, lawyer with Alternative Law Forum, Bengaluru, says,“The targets of the attack seem to be those who had critical things to say about the current government.”Referring to a tweet by journalist Arvind Gunasekar, Srinivas says there is clear proof that the government knew of the breach and its severity.The tweet includes a screenshot of a report from the CERT-IN (Indian Computer Emergency Response Team) website dated May 17.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;It shows severity rating as “High”.WhatsApp says the vulnerability has now been patched and urged users to update the app. But a level of paranoia around smartphones and privacy has been created. Apar Gupta, executive director of the Internet Freedom Foundation, based in Delhi works towards internet freedom and privacy, says Pegasus,specially, is too expensive (it can cost up to eight million dollars a year to licence) to be used on ordinary citizens.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;But not all spyware is expensive. “Multiple kinds are now commercially available and easy to procure. These can be used by an estranged lover or even a professional rival to find information about you,” he says. Jija Hari Singh, retired DGP and Karnataka’s first woman IPS officer, says Pegasus is one of the smaller players, and spyware akin to it has been around for three decades. “Monsters bigger than Pegasus are still snooping on us,” she says.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;NOTHING TO HIDE?&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Many people fall back on the narrative of ‘I have nothing to hide, so I’m not worried’. Aayush Rathi, Programme Officer at the Centre for Internet and Society, says that this is a flawed premise: “It is like saying free speech is not important for you because you have nothing useful to say.” Gupta breaks down this rationale: “If a person has ‘nothing to hide’ then they should just unlock their phone and hand it over to any person who asks for it. But the minute such a demand is made they would feel uncomfortable.” This discomfort, he says, doesn’t come because they are doing something illegal but because they fear social judgement. “There is a level of intimacy in their conversations that they’d rather not share with anyone else,” he says. Many people believe only illegal activity leads to surveillance, but that is not the case. “Even the most inconsequential actions are being logged on digital devices, and much of this information can be monetised,” he says. The most tangible risks are financial fraud and identity theft, and spyware is also commonly used for corporate espionage.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;UPDATE SECURITY&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;So what must one do if one’s phone is spied on? In the case of Pegasus, Rathi says, “You would have received a communication from WhatsApp if you were targeted. Irrespective, you should update the application immediately as the latest update fixes the vulnerability.” Srinivas says the legal recourse available is the fundamental right to privacy. “Since the government doesn’t have any regulation in place to deal with this, the National Human Rights Commission will have to take it up,” he says.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Gupta advises precautions against preventable hacks, and recommends reading online guides on surveillance self-defence, especially those by the Electronic Frontier Foundation.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/deccan-herald-november-6-2019-theres-sudeep-whatsapp-spy-attack-and-after'&gt;https://cis-india.org/internet-governance/news/deccan-herald-november-6-2019-theres-sudeep-whatsapp-spy-attack-and-after&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Theres Sudeep</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2019-12-15T05:06:27Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/raw/making-voices-heard-project-announcement">
    <title>Making Voices Heard: Privacy, Inclusivity, and Accessibility of Voice Interfaces in India</title>
    <link>https://cis-india.org/raw/making-voices-heard-project-announcement</link>
    <description>
        &lt;b&gt;We believe that voice interfaces have the potential to democratise the use of the internet by addressing barriers such as accessibility concerns, limited ability to read and write on digital text interfaces, and the lack of options for people to interact with digital devices in their own languages. Through the Making Voices Heard project, supported by Mozilla Corporation, we will examine the current landscape of voice interfaces in India.&lt;/b&gt;
        
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;img src="https://raw.githubusercontent.com/cis-india/website/master/img/CIS_Mozilla_MakingVoicesHeard_ProjectAnnouncement_01.jpg" alt="null" width="30%" /&gt; &lt;img src="https://raw.githubusercontent.com/cis-india/website/master/img/CIS_Mozilla_MakingVoicesHeard_ProjectAnnouncement_02.jpg" alt="null" width="30%" /&gt; &lt;img src="https://raw.githubusercontent.com/cis-india/website/master/img/CIS_Mozilla_MakingVoicesHeard_ProjectAnnouncement_03.jpg" alt="null" width="30%" /&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;h4&gt;Download the project announcement cards (shown above): &lt;a href="https://raw.githubusercontent.com/cis-india/website/master/img/CIS_Mozilla_MakingVoicesHeard_ProjectAnnouncement_01.jpg" target="_blank"&gt;Card 01&lt;/a&gt;, &lt;a href="https://raw.githubusercontent.com/cis-india/website/master/img/CIS_Mozilla_MakingVoicesHeard_ProjectAnnouncement_02.jpg" target="_blank"&gt;Card 02&lt;/a&gt;, and &lt;a href="https://raw.githubusercontent.com/cis-india/website/master/img/CIS_Mozilla_MakingVoicesHeard_ProjectAnnouncement_03.jpg" target="_blank"&gt;Card 03&lt;/a&gt;&lt;/h4&gt;
&lt;hr /&gt;
&lt;h3&gt;Making Voices Heard: Project Announcement&lt;/h3&gt;
&lt;p&gt;Although voice-enabled interfaces are being widely deployed, there is a need to understand how they are beneficial, and what the important knowledge gaps and challenges in their development, adoption, use, and regulation have been. Through the Making Voices Heard project &lt;a href="https://blog.mozilla.org/blog/2019/07/05/mozillas-latest-research-grants-prioritizing-research-for-the-internet/" target="_blank"&gt;supported by Mozilla Corporation&lt;/a&gt;, we will examine the current landscape of voice interfaces in India, and seek to address the following questions:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;What is the broad (sectoral and functional) typology of available voice interfaces in Indian languages? How widely are these voice interfaces (in Indian languages) used, and what barriers prevent their further adoption and use?&lt;br /&gt;&lt;br /&gt;&lt;/li&gt;
&lt;li&gt;What are the concerns related to privacy and data protection that emerge with the growth of voice interfaces? What kind of protocols for data processing may need to be built into the design of these interfaces?&lt;br /&gt;&lt;br /&gt;&lt;/li&gt;
&lt;li&gt;How accessible are these interfaces for persons with disabilities (PWDs)? What kinds of accessibility features, especially for Indian languages, may need to be developed to ensure effective use of voice technologies by PWDs?&lt;br /&gt;&lt;br /&gt;&lt;/li&gt;
&lt;li&gt;Where do challenges in these three areas intersect? For instance, is compromising on users’ privacy, including weak or missing data protection regulations, required to create comprehensive speech datasets that may help develop better accessibility features, and address linguistic barriers?&lt;/li&gt;&lt;/ul&gt;
&lt;p&gt;To approach these questions, we have begun mapping the various developers and users of voice interfaces in India. In the next stage of the process, we will look at these interfaces through the lens of privacy, language, accessibility, and design. To add to this mapping and these questions, we will conduct interviews and workshops with users, developers, designers and researchers of voice interfaces in India, including the &lt;a href="https://voice.mozilla.org/en" target="_blank"&gt;Common Voice&lt;/a&gt; team at Mozilla.&lt;/p&gt;
&lt;p&gt;We invite researchers, developers and designers of voice interfaces to speak to us and help inform the study. You may contact Shweta Mohandas at shweta@cis-india.org.&lt;/p&gt;
&lt;p&gt;&lt;em&gt;- Shweta Mohandas, Saumyaa Naidu, Puthiya Purayil Sneha, and Sumandro Chattapadhyay (project team)&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/raw/making-voices-heard-project-announcement'&gt;https://cis-india.org/raw/making-voices-heard-project-announcement&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>shweta</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Voice User Interface</dc:subject>
    
    
        <dc:subject>Language</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Accessibility</dc:subject>
    
    
        <dc:subject>Research</dc:subject>
    
    
        <dc:subject>Voice Assisted Interface</dc:subject>
    
    
        <dc:subject>Featured</dc:subject>
    
    
        <dc:subject>Researchers at Work</dc:subject>
    
    
        <dc:subject>Making Voices Heard</dc:subject>
    

   <dc:date>2019-12-18T12:10:05Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/et-tech-megha-mandavia-november-4-2019-cyber-law-experts-asks-why-cert-in-removed-advisory-warning-about-whatsapp-vulnerability">
    <title>Cyber law experts asks why CERT-In removed advisory warning about WhatsApp vulnerability</title>
    <link>https://cis-india.org/internet-governance/news/et-tech-megha-mandavia-november-4-2019-cyber-law-experts-asks-why-cert-in-removed-advisory-warning-about-whatsapp-vulnerability</link>
    <description>
        &lt;b&gt;In the note on the now-removed web page, CERT-In had provided a detailed explanation of the vulnerability, which an attacker could exploit by making a decoy voice call to a target.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article by Megha Mandavia was &lt;a class="external-link" href="https://tech.economictimes.indiatimes.com/news/internet/cyber-law-experts-asks-why-cert-in-removed-advisory-warning-about-whatsapp-vulnerability/71881880"&gt;published in ET Tech.com&lt;/a&gt; on November 4, 2019. Pranesh Prakash was quoted.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;Cyber law experts have asked the &lt;a href="https://tech.economictimes.indiatimes.com/tag/government"&gt;government&lt;/a&gt; to explain why the Indian Computer Emergency Response Team (&lt;a href="https://tech.economictimes.indiatimes.com/tag/cert-in"&gt;CERT-In&lt;/a&gt;) removed from its website, two days ago, an advisory it had put out in May warning users of a vulnerability that could be used to exploit &lt;a href="https://tech.economictimes.indiatimes.com/tag/whatsapp"&gt;WhatsApp&lt;/a&gt; on their smartphones.&lt;br /&gt;&lt;br /&gt;“This is merely further evidence that the explanation is to be provided by GoI (Government of India) instead of blame shifting and politicizing the issue,” said Mishi Choudhary, the legal director of the New York-based Software Freedom Law Center. “India is a surveillance state with no judicial oversight.”&lt;br /&gt;&lt;br /&gt;In the note on the now-removed web page, CERT-In had provided a detailed explanation of the vulnerability, which an attacker could exploit by making a decoy voice call to a target.&lt;br /&gt;&lt;br /&gt;It had warned WhatsApp users that the vulnerability could allow an attacker to access information on the system, such as logs, messages and photos, and could further compromise it. CERT-In rated the severity “high” and asked users to upgrade to the latest version of the app.&lt;br /&gt;&lt;br /&gt;It also listed links to hackernews and cyber security firm Check Point Software that pointed to the alleged involvement of Israeli cyber software firm NSO Group in the hacking of WhatsApp messenger.&lt;br /&gt;&lt;br /&gt;CERT-In Director-General Sanjay Bahl did not respond to ET’s mails or calls seeking clarity on why the advisory was pulled from its website.&lt;br /&gt;&lt;br /&gt;The Times of India first reported the development.&lt;br /&gt;&lt;br /&gt;The government had blamed WhatsApp for not informing it about the attack and asked the Facebook-owned company to respond by November 4.&lt;br /&gt;&lt;br /&gt;In response, WhatsApp sources pointed out that it had informed CERT-In in May about the vulnerability and updated it in September that 121 Indian nationals were targeted using the exploit, ET reported on Sunday.&lt;br /&gt;&lt;br /&gt;“We should not read too much into it. It could just be bad website management. The vulnerability was public knowledge. It was reported by the Common Vulnerabilities and Exposures (CVE) organization in May,” said Pranesh Prakash, fellow at the Centre for &lt;a href="https://tech.economictimes.indiatimes.com/news/internet"&gt;Internet&lt;/a&gt; and Society, a non-profit organisation.&lt;br /&gt;&lt;br /&gt;The government has also questioned the timing of the disclosure, as it comes amid a request by it to the Supreme Court seeking three months to frame rules to curb misuse of social media in the country.&lt;br /&gt;&lt;br /&gt;The government has categorically told WhatsApp that it wants the platform to bring in a mechanism that would enable tracing of the origin of messages, a demand that the instant messaging platform has resisted citing privacy concerns.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/et-tech-megha-mandavia-november-4-2019-cyber-law-experts-asks-why-cert-in-removed-advisory-warning-about-whatsapp-vulnerability'&gt;https://cis-india.org/internet-governance/news/et-tech-megha-mandavia-november-4-2019-cyber-law-experts-asks-why-cert-in-removed-advisory-warning-about-whatsapp-vulnerability&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Megha Mandavia</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2019-11-15T00:48:00Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/al-jazeera-video-november-8-2019-india-facial-recognition">
    <title>India facial recognition: How effective will it be?</title>
    <link>https://cis-india.org/internet-governance/news/al-jazeera-video-november-8-2019-india-facial-recognition</link>
    <description>
        &lt;b&gt;India is trying to build what could be the world's largest facial recognition system.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;New Delhi says the system could help fight crime and find missing children.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The technology has already been launched at a few Indian airports.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Police in New Delhi say they identified nearly 3,000 missing children during a trial period last year.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;But not everyone is convinced.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Internet freedom advocates say there is little information about where the system will be deployed, what it will be used for, and how data will be stored.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The use of facial recognition software is already common in places like China.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;But there are questions about how effective it is, with one British study revealing that the technology could be highly inaccurate.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Pranesh Prakash joins Al Jazeera from Bengaluru in India. He is a fellow at the Centre for Internet and Society but is talking to us in a personal capacity.&lt;/p&gt;
&lt;hr /&gt;
&lt;h3&gt;Video&lt;/h3&gt;
&lt;p&gt;&lt;iframe frameborder="0" height="315" src="https://www.youtube.com/embed/YAsMf9qy3cc" width="560"&gt;&lt;/iframe&gt;&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/al-jazeera-video-november-8-2019-india-facial-recognition'&gt;https://cis-india.org/internet-governance/news/al-jazeera-video-november-8-2019-india-facial-recognition&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Admin</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2019-11-15T00:42:35Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/un-special-rapporteur-on-the-right-to-privacy-consultation-on-privacy-and-gender">
    <title>UN Special Rapporteur on the Right to Privacy Consultation on 'Privacy and Gender'</title>
    <link>https://cis-india.org/internet-governance/news/un-special-rapporteur-on-the-right-to-privacy-consultation-on-privacy-and-gender</link>
    <description>
        &lt;b&gt;Ambika Tandon was a speaker at the Consultation on Privacy and Gender organised by the UN Special Rapporteur on the right to privacy held at New York University, New York on October 30 - 31, 2019. &lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The consultation was held to receive feedback on the report on privacy and gender, towards which Pallavi, Aayush, Pranav and Ambika sent comments. Ambika was a speaker in the session 'The Body: as Data, as Identity, as Money Maker', chaired by Eva Blum-Dumontet from Privacy International, with co-panelists Anja Kovacs, Director, Internet Democracy Project, and Joana Varon, Director, Coding Rights.&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/un-special-rapporteur-on-the-right-to-privacy-consultation-on-privacy-and-gender'&gt;https://cis-india.org/internet-governance/news/un-special-rapporteur-on-the-right-to-privacy-consultation-on-privacy-and-gender&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Admin</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2019-11-02T06:39:25Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/torsha-sarkar-suhan-s-and-gurshabad-grover-october-30-2019-through-the-looking-glass">
    <title>Through the looking glass: Analysing transparency reports</title>
    <link>https://cis-india.org/internet-governance/blog/torsha-sarkar-suhan-s-and-gurshabad-grover-october-30-2019-through-the-looking-glass</link>
    <description>
        &lt;b&gt;An analysis of companies' transparency reports for government requests for user data and content removal&lt;/b&gt;
        
&lt;p style="text-align: justify;"&gt;Over the past decade, a few private online intermediaries, through rapid innovation and integration, have turned into regulators of a substantial amount of online speech. Such concentrated power places a high level of responsibility on them to ensure that the rights of users online, including their rights to free speech and privacy, are protected. Such responsibility may include appealing or refusing to entertain government requests that are technically or legally flawed, or resisting gag orders on requests. For the purpose of measuring a company’s practices regarding refusing flawed requests and standing up for user rights, transparency reporting becomes useful and relevant. Making this information public also ensures that researchers can build upon such data and recommend ways to improve accountability, and enables users to understand when and how governments are restricting their rights.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;For some time in the last decade, Google and Twitter were the only major online platforms that published half-yearly transparency reports documenting the number of content takedown and user information requests they received from law enforcement agencies. That changed in 2013, when the Snowden leaks revealed, amongst other things, that these companies were often excessively compliant with requests from US intelligence operations, and allowed them backdoor surveillance access to user information. Subsequently, all the major Silicon Valley internet companies have published some variant of transparency report, in hopes of rebuilding their damaged goodwill and displaying a measure of accountability to their users.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;The number of government requests for user data and content removal has also seen a steady rise. In 2014, for instance, Google noted that in the US alone it observed a 19% rise in the second half of the year, and an overall 250% jump in numbers since it began providing this information. As per a study by Comparitech, India sent the maximum number of government requests for content removal and user data in the period 2009-2018. This highlights the increasing importance of accessible transparency reporting.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;Initiatives analysing the transparency reporting practices of online platforms, like the Electronic Frontier Foundation (EFF)’s Who Has Your Back? reports, have developed a considerable body of work tracing these reporting practices, but have largely looked at them in the context of the United States (US).&amp;nbsp;In our research, we found that the existing methodology and metrics to assess the transparency reports of online platforms, developed by organisations like the EFF, are not adequate in the Indian context. We identify two reasons for developing a new methodology:&lt;/p&gt;
&lt;ol&gt;
&lt;li style="text-align: justify;"&gt;Online platforms make available vastly different information for the US and India. For instance, Facebook breaks up the legal requests it receives for the US into eight different classes (search warrants, subpoenas, etc.). Such a classification is not present for India. These differences are summarised in Annexure &lt;/li&gt;
&lt;li style="text-align: justify;"&gt;The legal regimes and procedural safeguards under which states can compel platforms to share information or take content down also differ. For instance, in India, an order for content takedown can be issued either under section 79 and its allied rules or under section 69A and its rules, each having their own procedures and relevant authorities. A summary of such provisions for Indian agencies is given in Annexure 3.&lt;/li&gt;&lt;/ol&gt;
&lt;p style="text-align: justify;"&gt;These differences may merit differences in the methodology for research into understanding the reporting practices of these platforms, depending on each jurisdiction’s legal context.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;In this report, we analyse the transparency reports of online platforms with a large Indian user-base, specifically focusing on the data they publish about user information and takedown requests received from Indian governments and courts.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;First, we detail our methodology for this report, including how we selected platforms whose transparency reports we analyse, and then specific metrics relating to information available in those reports. For the latter, we collate relevant metrics from existing frameworks, and propose a standard that can be applicable for our research.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;In the second part, we present company-specific reports. We identify general trends in the data published by the company, and then compare the available data to the best practices of transparency reporting that we proposed.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify;"&gt;&lt;a class="external-link" href="http://cis-india.org/internet-governance/files/A%20collation%20and%20analysis%20of%20government%20requests%20for%20user%20data%20%20and%20content%20removal%20from%20non-Indian%20intermediaries%20.pdf"&gt;Download the full report&lt;/a&gt;.&amp;nbsp;The report was edited by Elonnai Hickok. Research assistance by Keying Geng and Anjanaa Aravindan.&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/torsha-sarkar-suhan-s-and-gurshabad-grover-october-30-2019-through-the-looking-glass'&gt;https://cis-india.org/internet-governance/blog/torsha-sarkar-suhan-s-and-gurshabad-grover-october-30-2019-through-the-looking-glass&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Torsha Sarkar, Suhan S and Gurshabad Grover</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Freedom of Speech and Expression</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2019-11-02T05:48:59Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/comments-to-the-unhrc-report-on-gender-and-privacy">
    <title>Comments to the United Nations Human Rights Commission Report on Gender and Privacy</title>
    <link>https://cis-india.org/internet-governance/blog/comments-to-the-unhrc-report-on-gender-and-privacy</link>
    <description>
        &lt;b&gt;This submission to the UNHRC presents a response by researchers at CIS to ‘gender issues arising in the digital era and their impacts on women, men and individuals of diverse sexual orientations, gender identities, gender expressions and sex characteristics’. It was prepared by Aayush Rathi, Ambika Tandon, and Pallavi Bedi in response to a consultation report by a thematic taskforce established by the Special Rapporteur on the Right to Privacy on ‘Privacy and Personality’ (hereafter, HRC Gender Report).&lt;/b&gt;
        
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;h4&gt;HRC Gender Report - Consultation version: &lt;a href="https://www.ohchr.org/Documents/Issues/Privacy/SR_Privacy/2019_HRC_Annex2_GenderReport.pdf" target="_blank"&gt;Read&lt;/a&gt; (PDF)&lt;/h4&gt;
&lt;h4&gt;Submitted comments: &lt;a href="http://cis-india.org/internet-governance/files/comments-to-the-united-nations-human-rights-commission-report-on-gender-and-privacy" target="_blank"&gt;Read&lt;/a&gt; (PDF)&lt;/h4&gt;
&lt;hr /&gt;
&lt;p&gt;The Centre for Internet and Society (CIS), India, is an 11-year-old non-profit organisation that undertakes interdisciplinary research on internet and digital technologies from policy and academic perspectives. Through its diverse initiatives, CIS explores, intervenes in, and advances contemporary discourse and regulatory practices around internet, technology, and society in India and elsewhere. Current focus areas include cybersecurity, privacy, freedom of speech, labour and artificial intelligence. CIS has been taking efforts to mainstream gender across its programmes, as well as to develop specifically gender-focused research using a feminist approach.&lt;/p&gt;
&lt;p&gt;CIS appreciates the efforts of Dr. Elizabeth Coombs, Chair, Thematic Action Stream Taskforce on “A better understanding of privacy”, and those of Professor Joseph Cannataci, Special Rapporteur on the Right to Privacy. We are also grateful for the opportunity to put forth our views and comment on the HRC Gender Report.&lt;/p&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/comments-to-the-unhrc-report-on-gender-and-privacy'&gt;https://cis-india.org/internet-governance/blog/comments-to-the-unhrc-report-on-gender-and-privacy&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Aayush Rathi, Ambika Tandon and Pallavi Bedi</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Gender</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Research</dc:subject>
    
    
        <dc:subject>Gender, Welfare, and Privacy</dc:subject>
    
    
        <dc:subject>Researchers at Work</dc:subject>
    

   <dc:date>2019-12-30T17:40:20Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/nipfp-seminar-on-exploring-policy-issues-in-the-digital-technology-arena">
    <title>NIPFP Seminar on Exploring Policy Issues in the Digital Technology Arena</title>
    <link>https://cis-india.org/internet-governance/news/nipfp-seminar-on-exploring-policy-issues-in-the-digital-technology-arena</link>
    <description>
        &lt;b&gt;Anubha Sinha participated in this seminar as a discussant on the "Regulating emerging technologies" panel. The event was held at Indian Institute of Advanced Study, Shimla on October 10 - 11, 2019.&lt;/b&gt;
        &lt;p&gt;Click to view the &lt;a class="external-link" href="http://cis-india.org/internet-governance/files/exploring-policy-issues-in-the-digital-technology-arena"&gt;agenda here&lt;/a&gt;. The session briefs can be &lt;a class="external-link" href="http://cis-india.org/internet-governance/files/session-briefs"&gt;seen here&lt;/a&gt;.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/nipfp-seminar-on-exploring-policy-issues-in-the-digital-technology-arena'&gt;https://cis-india.org/internet-governance/news/nipfp-seminar-on-exploring-policy-issues-in-the-digital-technology-arena&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Admin</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Digital Knowledge</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Digital Technologies</dc:subject>
    
    
        <dc:subject>Digital India</dc:subject>
    

   <dc:date>2019-10-20T07:40:16Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/due-diligence-project-fgd-by-un-women">
    <title>Due Diligence Project FGD by UN Women</title>
    <link>https://cis-india.org/internet-governance/news/due-diligence-project-fgd-by-un-women</link>
    <description>
        &lt;b&gt;On October 11, 2019, Radhika Radhakrishnan attended a focus group discussion at the UN House, New Delhi, organized by UN Women for their multi-country research study on online violence (Due Diligence Project).&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The purpose of the discussion was to provide a better understanding of the nature and scope of this form of violence against women and girls (VAWG), and to provide recommendations to inform policies, plans, programming and advocacy on the issue.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/due-diligence-project-fgd-by-un-women'&gt;https://cis-india.org/internet-governance/news/due-diligence-project-fgd-by-un-women&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Admin</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Due Diligence</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2019-10-20T07:11:13Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/bsides-delhi-2019-security-conference">
    <title>BSides Delhi 2019 Security Conference</title>
    <link>https://cis-india.org/internet-governance/news/bsides-delhi-2019-security-conference</link>
    <description>
        &lt;b&gt;Karan Saini attended the BSides Delhi security conference on October 11, 2019. The event was organized by BSides Delhi in New Delhi.&lt;/b&gt;
        &lt;p&gt;Click to view the agenda &lt;a class="external-link" href="https://bsidesdelhi.in/program.php"&gt;here&lt;/a&gt;. Videos of the event can be &lt;a class="external-link" href="https://www.youtube.com/channel/UCZidtr5OB-OGQwxWXDDSTBQ"&gt;viewed here&lt;/a&gt;.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/bsides-delhi-2019-security-conference'&gt;https://cis-india.org/internet-governance/news/bsides-delhi-2019-security-conference&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Admin</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2019-10-20T06:47:26Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>




</rdf:RDF>
