<?xml version="1.0" encoding="utf-8" ?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:syn="http://purl.org/rss/1.0/modules/syndication/" xmlns="http://purl.org/rss/1.0/">




    



<channel rdf:about="https://cis-india.org/search_rss">
  <title>Centre for Internet and Society</title>
  <link>https://cis-india.org</link>
  
  <description>These are the search results for the query, showing results 101 to 115.</description>
  
  
  
  
  <image rdf:resource="https://cis-india.org/logo.png"/>

  <items>
    <rdf:Seq>
        
            <rdf:li rdf:resource="https://cis-india.org/raw/announcing-selected-researchers-welfare-gender-and-surveillance"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/comments-to-the-personal-data-protection-bill-2019"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/comments-to-the-pdp-bill-2019"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/gen-comments-to-pdp-bill"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/annotated-version-of-comments-to-the-personal-data-protection-bill-2019"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/comments-to-national-security-council-on-national-cybersecurity-strategy-2020"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/jobs/researchers-welfare-gender-surveillance-call"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/automated-facial-recognition-systems-and-the-mosaic-theory-of-privacy-the-way-forward"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/automated-facial-recognition-systems-afrs-responding-to-related-privacy-concerns"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/decrypting-automated-facial-recognition-systems-afrs-and-delineating-related-privacy-concerns"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/pegasus-snoopgate-an-opportune-moment-to-revisit-legal-framework-governing-state-surveillance-framework"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/raw/anushree-gupta-ladies-log-women-safety-risk-transfer-ridehailing"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/about/newsletters/november-2019-newsletter"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/extraterritorial-algorithmic-surveillance-and-the-incapacitation-of-international-human-rights-law"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/extra-territorial-surveillance-and-the-incapacitation-of-human-rights"/>
        
    </rdf:Seq>
  </items>

</channel>


    <item rdf:about="https://cis-india.org/raw/announcing-selected-researchers-welfare-gender-and-surveillance">
    <title>Announcing Selected Researchers: Welfare, Gender, and Surveillance</title>
    <link>https://cis-india.org/raw/announcing-selected-researchers-welfare-gender-and-surveillance</link>
    <description>
        &lt;b&gt;We published a Call for Researchers on January 10, 2020, to invite applications from researchers interested in writing a narrative essay that interrogates the modes of surveillance that people of LGBTHIAQ+ and gender non-conforming identities and sexual orientations are put under as they seek sexual and reproductive health (SRH) services in India. We received 29 applications from over 10 locations in India in response to the call, and are truly overwhelmed by and grateful for this interest and support. We eventually selected applications by 3 researchers that we felt aligned best with the specific objectives of the project. Please find below brief profile notes of the selected researchers.&lt;/b&gt;
        
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;h4&gt;Call for Researchers: &lt;a href="https://cis-india.org/jobs/researchers-welfare-gender-surveillance-call" target="_blank"&gt;URL&lt;/a&gt;&lt;/h4&gt;
&lt;hr /&gt;
&lt;h2&gt;Kaushal Bodwal&lt;/h2&gt;
&lt;p&gt;Kaushal is pursuing his MPhil in Sociology at the Delhi School of Economics, University of Delhi. He completed his Master's in Sociology at the Centre for the Study of Social Systems, Jawaharlal Nehru University, after earning a BSc honours degree in Biomedical Sciences from Delhi University. He is one of the founding members of Hasratein: a queer collective, New Delhi. He has been an active spokesperson for queer and trans rights in India and has spoken on a number of panel discussions on the Trans Act 2019 at various campuses. He has also delivered a lecture series on Colonialism and Medicine at Ambedkar University, Kashmiri Gate, Delhi. His areas of interest are sociology of medicine, gender and medicine, sexuality, religion and biomedical science, and intersex studies.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://kafila.online/2019/08/27/queerness-as-disease-a-continuing-narrative-in-21st-century-india-kaushal-bodwal/" target="_blank"&gt;Queerness as disease – a continuing narrative in 21st century India&lt;/a&gt;, Kafila, 27 August 2019&lt;/p&gt;
&lt;p&gt;&lt;a href="https://www.firstpost.com/india/what-it-means-to-be-a-queer-and-live-under-regime-bent-on-remaking-india-on-terms-of-their-tradition-writes-queer-scholar-trolled-by-right-wing-7915391.html" target="_blank"&gt;What it means to be queer under a regime bent on remaking India on its own ideological terms&lt;/a&gt;, Firstpost, 17 January 2020&lt;/p&gt;
&lt;h2&gt;Rosamma Thomas&lt;/h2&gt;
&lt;p&gt;Rosamma has worked both as a reporter and as an editor of news reports with newspapers. She currently writes reports for NGOs while also undertaking freelance reporting assignments. She is based in Pune.&lt;/p&gt;
&lt;p&gt;&lt;a href="http://iced.cag.gov.in/wp-content/uploads/2016-17/NTP%2007/article.pdf" target="_blank"&gt;India's mining state steps up fight to rein in killer silicosis&lt;/a&gt;, The Times of India, 29 June 2016&lt;/p&gt;
&lt;p&gt;&lt;a href="https://www.newsclick.in/doctor-may-have-found-early-marker-silicosis-who-will-fund-him" target="_blank"&gt;Doctor may have found early marker for silicosis, but who will fund him?&lt;/a&gt;, Newsclick, 18 July 2019&lt;/p&gt;
&lt;p&gt;&lt;a href="https://www.newsclick.in/Asbestos-Poisoning-Raghunath-Manwar-Fight-Safer-Work-Conditions" target="_blank"&gt;Asbestos poisoning: Raghunath Manwar’s fight for safer work conditions&lt;/a&gt;, Newsclick, 9 January 2020&lt;/p&gt;
&lt;h2&gt;Shreya Ila Anasuya&lt;/h2&gt;
&lt;p&gt;Shreya is a writer, editor, journalist and performance artist currently based in Calcutta. Her fiction explores the places where myth, memory, history and the performing arts meet. As a journalist, her work explores gender, sexuality, politics, culture and history. She has been published in &lt;em&gt;The Wire&lt;/em&gt;, &lt;em&gt;Caravan&lt;/em&gt;, &lt;em&gt;Scroll&lt;/em&gt;, &lt;em&gt;Mint Lounge&lt;/em&gt;, &lt;em&gt;Deep Dives&lt;/em&gt;, &lt;em&gt;GenderIT&lt;/em&gt;, &lt;em&gt;Helter Skelter&lt;/em&gt;, and many more. She is the editor of the digital publication &lt;a href="https://medium.com/skin-stories" target="_blank"&gt;&lt;em&gt;Skin Stories&lt;/em&gt;&lt;/a&gt;, housed at the non-profit Point of View. She is the writer and narrator of ‘Gul - a story in text, song and dance’ which has been performed in several cities in India. She was a Felix Scholar at SOAS, University of London, from where she has an MA in Anthropology. For a full portfolio, please click &lt;a href="http://porterfolio.net/dervishdancing" target="_blank"&gt;here&lt;/a&gt; or visit her &lt;a href="https://www.shreyailaanasuya.com/" target="_blank"&gt;website&lt;/a&gt;.&lt;/p&gt;
&lt;hr /&gt;
&lt;p&gt;This project is led by Ambika Tandon, Aayush Rathi, and Sumandro Chattapadhyay at the Centre for Internet and Society, and is supported by a grant from Privacy International.&lt;/p&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/raw/announcing-selected-researchers-welfare-gender-and-surveillance'&gt;https://cis-india.org/raw/announcing-selected-researchers-welfare-gender-and-surveillance&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>sumandro</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Welfare Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Gender</dc:subject>
    
    
        <dc:subject>Research</dc:subject>
    
    
        <dc:subject>Gender, Welfare, and Privacy</dc:subject>
    
    
        <dc:subject>Researchers at Work</dc:subject>
    

   <dc:date>2020-02-13T15:04:24Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/comments-to-the-personal-data-protection-bill-2019">
    <title>Comments to the Personal Data Protection Bill 2019</title>
    <link>https://cis-india.org/internet-governance/blog/comments-to-the-personal-data-protection-bill-2019</link>
    <description>
        &lt;b&gt;The Personal Data Protection Bill, 2019 was introduced in the Lok Sabha on December 11, 2019. &lt;/b&gt;
        
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;h4&gt;Please view our general comments below, or download as PDF &lt;a href="https://cis-india.org/accessibility/blog/cis-general-comments-to-the-pdp-bill-2019" class="internal-link" title="CIS' General Comments to the PDP Bill 2019"&gt;here&lt;/a&gt;.&lt;/h4&gt;
&lt;h4&gt;Our comments and recommendations can be downloaded as PDF &lt;a href="https://cis-india.org/accessibility/blog/cis-comments-pdp-bill-2019" class="internal-link" title="CIS Comments PDP Bill 2019"&gt;here&lt;/a&gt;.&lt;/h4&gt;
&lt;h4&gt;We have also prepared an annotated version of the Bill, where our detailed comments and recommendations can be viewed alongside the Bill, available as PDF &lt;a href="https://cis-india.org/accessibility/blog/annotated-ver-pdp-bill-2019" class="internal-link" title="Annotated ver PDP Bill 2019"&gt;here&lt;/a&gt;.&lt;/h4&gt;
&lt;hr /&gt;
&lt;h2&gt;General Comments&lt;/h2&gt;
&lt;h3&gt;1. Executive notification cannot abrogate fundamental rights&lt;/h3&gt;
&lt;p&gt;In 2017, the Supreme Court in K.S. Puttaswamy v Union of India [1] held the right to privacy to be a fundamental right. While this right is subject to reasonable restrictions, the restrictions have to meet a threefold requirement, namely (i) existence of a law; (ii) legitimate state aim; and (iii) proportionality. Under the 2018 Bill, the exemption for government agencies processing personal data in the ‘interest of the security of the State’ [2] was subject to a law being passed by Parliament. However, under Clause 35 of the present Bill, the Central Government is merely required to pass a written order exempting the government agency from the provisions of the Bill. Any restriction on the right to privacy will have to comply with the conditions prescribed in Puttaswamy I. An executive order issued by the Central Government authorising any agency of the government to process personal data does not satisfy the first requirement laid down by the Supreme Court in Puttaswamy I, as it is not a law passed by Parliament. The Supreme Court, while deciding upon the validity of Aadhaar in K.S. Puttaswamy v Union of India [3], noted that “an executive notification does not satisfy the requirement of a valid law contemplated under Puttaswamy. A valid law in this case would mean a law passed by Parliament, which is just, fair and reasonable. Any encroachment upon the fundamental right cannot be sustained by an executive notification.”&lt;/p&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;h3&gt;2. Exemptions under Clause 35 do not comply with the legitimacy and proportionality test&lt;/h3&gt;
&lt;p&gt;The lead judgement in Puttaswamy I, while formulating the threefold test, held that the restraint on privacy emanates from the procedural and content based mandate of Article 21 [4]. The Supreme Court in Maneka Gandhi v Union of India [5] had clearly established that “mere prescription of some kind of procedure cannot ever meet the mandate of Article 21. The procedure prescribed by law has to be fair, just and reasonable, not fanciful, oppressive and arbitrary” [6]. The existence of a law is the first requirement; the second requirement is that of ‘legitimate state aim’. As per the lead judgement, this requirement ensures that “the nature and content of the law which imposes the restriction falls within the zone of reasonableness mandated by Article 14, which is a guarantee against arbitrary state action” [7]. It is established that for a provision which confers discretionary powers upon the executive or administrative authority to be regarded as non-arbitrary, the provision should lay down clear and specific guidelines for the exercise of the power [8]. The third test to be complied with is that the restriction should be ‘proportionate,’ i.e. the means adopted by the legislature are proportional to the object and needs sought to be fulfilled by the law. The Supreme Court in Modern Dental College &amp;amp; Research Centre v State of Madhya Pradesh [9] specified the components of the proportionality standard —&lt;/p&gt;
&lt;ol&gt;&lt;li&gt;A measure restricting a right must have a legitimate goal;&lt;/li&gt;
&lt;li&gt;It must be a suitable means of furthering this goal;&lt;/li&gt;
&lt;li&gt;There must not be any less restrictive, but equally effective alternative; and&lt;/li&gt;
&lt;li&gt;The measure must not have any disproportionate impact on the right holder&lt;/li&gt;&lt;/ol&gt;
&lt;p&gt;Clause 35 provides extensive grounds for the Central Government to exempt any agency from the requirements of the Bill but does not specify the procedure to be followed by the agency while processing personal data under this provision. It merely states that the ‘procedure, safeguards and oversight mechanism to be followed’ will be prescribed in the rules. The wide powers conferred on the Central Government without clearly specifying the procedure may be contrary to the threefold test laid down in Puttaswamy I, as it is difficult to ascertain whether a legitimate or proportionate objective is being fulfilled [10].&lt;/p&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;h3&gt;3. Limited powers of Data Protection Authority in comparison with the Central Government&lt;/h3&gt;
&lt;p&gt;In comparison with the previous version, the Personal Data Protection Bill, 2018 prepared by the Committee of Experts led by Justice Srikrishna, this Bill curtails the powers of the Data Protection Authority (Authority) that is to be created. The powers and functions that were originally intended to be performed by the Authority have now been allocated to the Central Government. For example:&lt;/p&gt;
&lt;ol&gt;&lt;li&gt;In the 2018 Bill, the Authority had the power to notify further categories of sensitive personal data. Under the present Bill, the Central Government in consultation with the sectoral regulators has been conferred the power to do so.&lt;/li&gt;
&lt;li&gt;Under the 2018 Bill, the Authority had the sole power to determine and notify significant data fiduciaries, however, under the present Bill, the Central Government has in consultation with the Authority been given the power to notify social media intermediaries as significant data fiduciaries.&lt;/li&gt;&lt;/ol&gt;
&lt;p&gt;In order to govern data protection effectively, there is a need for a responsive market regulator with a strong mandate and resources. The political nature of personal data also requires that the governance of data, particularly the rule-making and adjudicatory functions performed by the Authority, be independent of the Executive.&lt;/p&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;h3&gt;4. No clarity on data sandbox&lt;/h3&gt;
&lt;p&gt;The Bill contemplates a sandbox for “innovation in artificial intelligence, machine-learning or any other emerging technology in public interest.” A data sandbox is a non-operational environment where the analyst can model and manipulate data inside the data management system. Data sandboxes have been envisioned as a secure area where only a copy of the company’s or participant companies’ data is located [11]. In essence, it refers to a scalable exploration and creation platform which can be used to explore an enterprise’s information sets. On the other hand, regulatory sandboxes are controlled environments where firms can introduce innovations to a limited customer base within a relaxed regulatory framework, after which they may be allowed entry into the larger market after meeting certain conditions. This purportedly encourages innovation through the lowering of entry barriers by protecting newer entrants from unnecessary and burdensome regulation. Regulatory sandboxes can be interpreted as a form of responsive regulation by governments that seek to encourage innovation – they allow selected companies to experiment with solutions within an environment that is relatively free of most of the cumbersome regulations that they would ordinarily be subject to, while still subject to some appropriate safeguards and regulatory requirements. Sandboxes are regulatory tools which may be used to permit companies to innovate in the absence of heavy regulatory burdens. However, these ordinarily refer to burdens related to high barriers to entry (such as capital requirements for financial and banking companies), or regulatory costs. In this Bill, however, the relaxing of data protection provisions for data fiduciaries would lead to restrictions on the privacy of individuals. Limiting a fundamental right on the grounds of ‘fostering innovation’ is not a constitutionally tenable position, and contradicts the primary objectives of a data protection law.&lt;/p&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;h3&gt;5. The primacy of ‘harm’ in the Bill ought to be reconsidered&lt;/h3&gt;
&lt;p&gt;While a harms based approach is necessary for data protection frameworks, such approaches should be restricted to the positive obligations, penal provisions and responsive regulation of the Authority. The Bill does not provide any guidance on either the interpretation of the term ‘harm’ [12], or on the various activities covered within the definition of the term. Terms such as ‘loss of reputation or humiliation’ and ‘any discriminatory treatment’ are subjective standards and are open to varied interpretations. This ambiguity in the definition will make it difficult for the data principal to demonstrate harm and for the Authority to take necessary action, as several provisions are based upon harm being caused or likely to be caused. Some of the significant provisions where ‘harm’ is a precondition for the provision to come into effect are —&lt;/p&gt;
&lt;ol&gt;&lt;li&gt;Clause 25: Data Fiduciary is required to notify the Authority about the breach of personal data processed by the data fiduciary, if such breach is likely to cause harm to any data principal. The Authority after taking into account the severity of the harm that may be caused to the data principal will determine whether the data principal should be notified about the breach.&lt;/li&gt;
&lt;li&gt;Clause 32 (2): A data principal can file a complaint with the data fiduciary for a contravention of any of the provisions of the Act, which has caused or is likely to cause ‘harm’ to the data principal.&lt;/li&gt;&lt;li&gt;Clause 64 (1): A data principal who has suffered harm as a result of any violation of the provisions of the Act by a data fiduciary has the right to seek compensation from the data fiduciary.&lt;/li&gt;
&lt;li&gt;Clause 16 (5): The guardian data fiduciary is barred from profiling, tracking or undertaking targeted advertising directed at children, and from undertaking any other processing of personal data that can cause significant harm to the child.&lt;/li&gt;&lt;/ol&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;h3&gt;6. Non personal data should be outside the scope of this Bill&lt;/h3&gt;
&lt;p&gt;Clause 91 (1) states that the Act does not prevent the Central Government from framing a policy for the digital economy, in so far as such policy does not govern personal data. The Central Government can, in consultation with the Authority, direct any data fiduciary to provide any anonymised personal data or other non-personal data to enable better targeting of delivery of services or formulation of evidence based policies, in any manner as may be prescribed. It is concerning that the data protection bill has specifically carved out an exception for the Central Government to frame policies for the digital economy, and seems to indicate that the government plans to freely use any and all anonymised and/or non-personal data that rests with any data fiduciary falling under the ambit of the bill to support the digital economy, including for its growth, security, integrity, and prevention of misuse. It is unclear how the government, in practice, will be able to compel organisations to share this data. Further, there is a lack of clarity on the contours of the definition of non-personal data, and the Bill does not define the term. It is also unclear whether the Central Government can compel the data fiduciary to transfer or share all forms of non-personal data, and what the rights and obligations of the data fiduciaries and data principals over such forms of data are. Anonymised data refers to data which has ‘irreversibly’ been converted into a form in which the data principal cannot be identified. However, as several instances have shown, ‘irreversible’ anonymisation is not possible. In the United States, the home addresses of taxi drivers were uncovered, and in Australia individual health records were mined from anonymised medical bills [13].
In September 2019, the Ministry of Electronics and Information Technology constituted an expert committee under the chairmanship of Kris Gopalakrishnan to study various issues relating to non-personal data and to deliberate over a data governance framework for the regulation of such data. The provision should be deleted, and the scope of the bill should be limited to the protection of personal data and to providing a framework for the protection of individual privacy. Until the report of the expert committee is published, the Central Government should not frame any law or regulation on the access and monetisation of non-personal/anonymised data, nor create a blanket provision allowing it to request such data from any data fiduciary that falls within the ambit of the bill. If the government wishes to use data resting with a data fiduciary, it must do so on a case to case basis and under formal and legal agreements with each data fiduciary.&lt;/p&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;h3&gt;7. Steps towards greater decentralisation of power&lt;/h3&gt;
&lt;p&gt;We propose the following steps towards greater decentralisation of powers and devolved jurisdiction —&lt;/p&gt;
&lt;ol&gt;&lt;li&gt;Creation of State Data Protection Authorities: A single centralised body may not be the appropriate form of such a regulator. We propose that on the lines of central and state commissions under the Right to Information Act, 2005, state data protection authorities are set up which are in a position to respond to local complaints and exercise jurisdiction over entities within their territorial jurisdictions.&lt;/li&gt;
&lt;li&gt;More involvement of industry bodies and civil society actors: In order to lessen the burden on the data protection authorities, it is necessary that there is active engagement with industry bodies, sectoral regulators and civil society bodies engaged in privacy research. Currently, the Bill provides for the involvement of industry or trade associations, associations representing the interests of data principals, sectoral regulators or statutory authorities, or departments or ministries of the Central or State Government in the formulation of codes of practice. However, it would be useful to also have more active participation of industry associations and civil society bodies in activities such as promoting awareness among data fiduciaries of their obligations under this Act, promoting measures and undertaking research for innovation in the field of protection of personal data.&lt;/li&gt;&lt;/ol&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;h3&gt;8. The Authority must be empowered to exercise responsive regulation&lt;/h3&gt;
&lt;p&gt;In a country like India, the challenge is to move rapidly from a state of little or no data protection law, and consequently an abysmal state of data privacy practices, to a strong data protection regulation and a powerful regulator capable of enabling robust data privacy practices. This requires a system of supportive mechanisms for the stakeholders in the data ecosystem, as well as systemic measures which enable the proactive detection of breaches. Further, keeping in mind the limited regulatory capacity in India, there is a need for the Authority to make use of different kinds of inexpensive and innovative strategies. We recommend the following additional powers for the Authority to be clearly spelt out in the Bill —&lt;/p&gt;
&lt;ol&gt;&lt;li&gt;Informal Guidance: It would be useful for the Authority to set up a mechanism on the lines of the Securities and Exchange Board of India (SEBI)’s Informal Guidance Scheme, which enables regulated entities to approach the Authority for non-binding advice on the position of law. Given that this is the first omnibus data protection law in India, and there is very little jurisprudence on the subject from India, it would be extremely useful for regulated entities to get guidance from the regulator.&lt;/li&gt;
&lt;li&gt;Power to name and shame: When a DPA makes public the names of organisations that have seriously contravened data protection legislation, this is a practice known as “naming and shaming.” The UK ICO and other DPAs recognise the power of publicity, as evidenced by their willingness to co-operate with the media. The ICO does not simply post monetary penalty notices (MPNs or fines) on its website for journalists to find, but frequently issues press releases, briefs journalists and uses social media. The ICO’s publicity statement on communicating enforcement activities states that the “ICO aims to get media coverage for enforcement activities.”&lt;/li&gt;
&lt;li&gt;Undertakings: The UK ICO has also leveraged the threat of fines into an alternative enforcement mechanism, seeking contractual undertakings from data controllers to take certain remedial steps. Undertakings have significant advantages for the regulator. Since an undertaking is a more “co-operative” solution, it is less likely that a data controller will challenge it. An undertaking is simpler and easier to put in place. Furthermore, the Authority can put an undertaking in place quickly, as opposed to legal proceedings, which take longer.&lt;/li&gt;&lt;/ol&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;h3&gt;9. No clear roadmap for the implementation of the Bill&lt;/h3&gt;
&lt;p&gt;The 2018 Bill had specified a roadmap for the different provisions of the Bill to come into effect from the date of the Act being notified [14]. It specifically stated the time period within which the Authority had to be established and the subsequent rules and regulations notified. The present Bill does not specify any such blueprint; it provides no details on either when the Bill will be notified or the time period within which the Authority shall be established and specific rules and regulations notified. Considering that 25 provisions have been deferred to rules that have to be framed by the Central Government, and a further 19 provisions have been deferred to regulations to be notified by the Authority, the absence and/or delayed notification of such rules and regulations will impact the effective functioning of the Bill. The absence of any sunrise or sunset provision may disincentivise political or industrial will to support or enforce the provisions of the Bill. An example of such a lack of political will was the establishment of the Cyber Appellate Tribunal. The tribunal was established in 2006 to redress cyber fraud. However, it was virtually a defunct body from 2011 onwards, when the last chairperson retired. It was eventually merged with the Telecom Disputes Settlement and Appellate Tribunal in 2017. We recommend that the Bill clearly lay out a time period for the implementation of its different provisions, especially a time frame for the establishment of the Authority. This is important to give full and effective effect to the right to privacy of the individual. 
It is also important to ensure that individuals have an effective mechanism to enforce the right and seek recourse in case of any breach of obligations by the data fiduciaries. For offences, we suggest a system of mailboxing, where provisions and punishments are enforced in a staggered manner for a period until the fiduciaries are aligned with the provisions of the Act. The Authority must ensure that data principals and fiduciaries have sufficient awareness of the provisions of this Bill before the provisions for punishment are brought into force. This will allow the data fiduciaries to align their practices with the provisions of this new legislation, and the Authority will also have time to define and determine certain provisions that the Bill has left to the Authority to define. Additionally, the enforcement of penalties for offences must initially be staggered, combined with provisions such as warnings, in order to keep first-time and mistaken offenders from paying a high price. This will ease the concerns of smaller companies and startups who might avoid processing data for fear of paying penalties for offences.&lt;/p&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;h3&gt;10. Lack of interoperability&lt;/h3&gt;
&lt;p&gt;In its current form, a number of the provisions in the Bill will make it difficult for India’s framework to be interoperable with other frameworks globally and in the region. For example, differences between the draft Bill and the GDPR can be found in the grounds for processing, data localisation frameworks, the framework for cross border transfers, definitions of sensitive personal data, inclusion of the undefined category of ‘critical data’, and the roles of the Authority and the Central Government.&lt;/p&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;h3&gt;11. Legal Uncertainty&lt;/h3&gt;
&lt;p&gt;In its current structure, there are a number of provisions in the Bill that, when implemented, run the risk of creating an environment of legal uncertainty. These include: the lack of a definition of critical data; the lack of clarity in the interpretation of the terms ‘harm’ and ‘significant harm’; the ability of the government to define further categories of sensitive personal data; the inclusion of requirements for ‘social media intermediaries’; the inclusion of ‘non-personal data’; the framing of the requirements for data transfers; the bar on processing of certain forms of biometric data as defined by the Central Government; the functioning between a consent manager and another data fiduciary; the inclusion of an AI sandbox; and the definition of state. To ensure the greatest amount of protection of individual privacy rights and the protection of personal data while also enabling innovation, it is important that any data protection framework is structured and drafted in a way that provides as much legal certainty as possible.&lt;/p&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;h3&gt;Endnotes&lt;/h3&gt;
&lt;p&gt;1. (2017) 10 SCC 641 (“Puttaswamy I”).&lt;/p&gt;
&lt;p&gt;2. Clause 42(1) of the 2018 Bill states that “Processing of personal data in the interests of the security of the State shall not be permitted unless it is authorised pursuant to a law, and is in accordance with the procedure established by such law, made by Parliament and is necessary for, and proportionate to such interests being achieved.”&lt;/p&gt;
&lt;p&gt;3. (2019) 1 SCC 1 (“Puttaswamy II”)&lt;/p&gt;
&lt;p&gt;4. Puttaswamy I, supra, para 180.&lt;/p&gt;
&lt;p&gt;5. (1978) 1 SCC 248.&lt;/p&gt;
&lt;p&gt;6. Ibid para 48.&lt;/p&gt;
&lt;p&gt;7. Puttaswamy I, supra, para 180.&lt;/p&gt;
&lt;p&gt;8. State of W.B. v. Anwar Ali Sarkar, 1952 SCR 284; Satwant Singh Sawhney v A.P.O, AIR 1967 SC 1836.&lt;/p&gt;
&lt;p&gt;9. (2016) 7 SCC 353.&lt;/p&gt;
&lt;p&gt;10. Dvara Research “Initial Comments of Dvara Research dated 16 January 2020 on the Personal Data Protection Bill, 2019 introduced in Lok Sabha on 11 December 2019”, January 2020, https://www.dvara.com/blog/2020/01/17/our-initial-comments-on-the-personal-data-protection-bill-2019/ (“Dvara Research”).&lt;/p&gt;
&lt;p&gt;11. “A Data Sandbox for Your Company”, Terrific Data, last accessed on January 31, 2019, http://terrificdata.com/2016/12/02/3221/.&lt;/p&gt;
&lt;p&gt;12. Clause 3(20): “harm” includes (i) bodily or mental injury; (ii) loss, distortion or theft of identity; (iii) financial loss or loss of property; (iv) loss of reputation or humiliation; (v) loss of employment; (vi) any discriminatory treatment; (vii) any subjection to blackmail or extortion; (viii) any denial or withdrawal of service, benefit or good resulting from an evaluative decision about the data principal; (ix) any restriction placed or suffered directly or indirectly on speech, movement or any other action arising out of a fear of being observed or surveilled; or (x) any observation or surveillance that is not reasonably expected by the data principal.&lt;/p&gt;
&lt;p&gt;13. Alex Hern, “Anonymised data can never be totally anonymous, says study”, July 23, 2019, https://www.theguardian.com/technology/2019/jul/23/anonymised-data-never-be-anonymous-enough-study-finds.&lt;/p&gt;
&lt;p&gt;14. Clause 97 of the 2018 Bill states: “(1) For the purposes of this Chapter, the term ‘notified date’ refers to the date notified by the Central Government under sub-section (3) of section 1. (2) The notified date shall be any date within twelve months from the date of enactment of this Act. (3) The following provisions shall come into force on the notified date - (a) Chapter X; (b) Section 107; and (c) Section 108. (4) The Central Government shall, no later than three months from the notified date, establish the Authority. (5) The Authority shall, no later than twelve months from the notified date, notify the grounds of processing of personal data in respect of the activities listed in sub-section (2) of section 17. (6) The Authority shall, no later than twelve months from the notified date, issue codes of practice on the following matters - (a) notice under section 8; (b) data quality under section 9; (c) storage limitation under section 10; (d) processing of personal data under Chapter III; (e) processing of sensitive personal data under Chapter IV; (f) security safeguards under section 31; (g) research purposes under section 45; (h) exercise of data principal rights under Chapter VI; (i) methods of de-identification and anonymisation; (j) transparency and accountability measures under Chapter VII. (7) Section 40 shall come into force on such date as is notified by the Central Government for the purpose of that section. (8) The remaining provisions of the Act shall come into force eighteen months from the notified date.”&lt;/p&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/comments-to-the-personal-data-protection-bill-2019'&gt;https://cis-india.org/internet-governance/blog/comments-to-the-personal-data-protection-bill-2019&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Amber Sinha, Elonnai Hickok, Pallavi Bedi, Shweta Mohandas, Tanaya Rajwade</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2020-02-21T10:13:35Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/comments-to-the-pdp-bill-2019">
    <title>Comments to The PDP Bill 2019</title>
    <link>https://cis-india.org/internet-governance/comments-to-the-pdp-bill-2019</link>
    <description>
        &lt;b&gt;&lt;/b&gt;
        
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/comments-to-the-pdp-bill-2019'&gt;https://cis-india.org/internet-governance/comments-to-the-pdp-bill-2019&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>akash</dc:creator>
    <dc:rights></dc:rights>


   <dc:date>2020-02-12T11:52:11Z</dc:date>
   <dc:type>File</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/gen-comments-to-pdp-bill">
    <title>Gen Comments to PDP Bill</title>
    <link>https://cis-india.org/internet-governance/gen-comments-to-pdp-bill</link>
    <description>
        &lt;b&gt;&lt;/b&gt;
        
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/gen-comments-to-pdp-bill'&gt;https://cis-india.org/internet-governance/gen-comments-to-pdp-bill&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>akash</dc:creator>
    <dc:rights></dc:rights>


   <dc:date>2020-02-12T11:50:32Z</dc:date>
   <dc:type>File</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/annotated-version-of-comments-to-the-personal-data-protection-bill-2019">
    <title>Annotated version of Comments to The Personal Data Protection Bill 2019</title>
    <link>https://cis-india.org/internet-governance/annotated-version-of-comments-to-the-personal-data-protection-bill-2019</link>
    <description>
        &lt;b&gt;&lt;/b&gt;
        
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/annotated-version-of-comments-to-the-personal-data-protection-bill-2019'&gt;https://cis-india.org/internet-governance/annotated-version-of-comments-to-the-personal-data-protection-bill-2019&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>akash</dc:creator>
    <dc:rights></dc:rights>


   <dc:date>2020-02-12T11:18:33Z</dc:date>
   <dc:type>File</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/comments-to-national-security-council-on-national-cybersecurity-strategy-2020">
    <title>Comments to National Security Council on National Cybersecurity Strategy 2020</title>
    <link>https://cis-india.org/internet-governance/blog/comments-to-national-security-council-on-national-cybersecurity-strategy-2020</link>
    <description>
&lt;b&gt;CIS submitted brief comments to the National Security Council on the National Cybersecurity Strategy within the 5000 character limit provided. CIS will continue producing outputs building on these ideas.&lt;/b&gt;
        
&lt;h2&gt;Approach and Key Principles:&lt;/h2&gt;
&lt;p style="text-align: justify;" dir="ltr"&gt;India’s 2020 strategy will need to account for key vectors that have come to define cyberspace including:&lt;/p&gt;
&lt;ul&gt;&lt;li&gt;Increased power held by non-state actors, both private corporations and terrorist groups&lt;/li&gt;&lt;li&gt;Augmented capacity of states to use cyberspace as a tool of external power projection, both through asymmetric warfare and alleged interference via the spread of misinformation&lt;/li&gt;&lt;li&gt;The progression of norms formulation processes in cyberspace, which have failed to attain consensus due to disagreement on the application of specific standards of International Law to cyberspace&lt;/li&gt;&lt;/ul&gt;
&lt;div&gt;&amp;nbsp;&lt;/div&gt;
&lt;div&gt;The 2020 framework should be grounded in:&lt;/div&gt;
&lt;ol&gt;&lt;li&gt;&lt;strong&gt;Legality&lt;/strong&gt;: Capabilities, measures, and processes for cyber security must be legally defined and backed.&lt;/li&gt;&lt;li&gt;&lt;strong&gt;Necessity and Proportionality&lt;/strong&gt;: Any measure taken for the purpose of ‘cyber security’ that might have implications for fundamental rights must be necessary, and proportionate to the infringement.&lt;/li&gt;&lt;li&gt;&lt;strong&gt;Transparency&lt;/strong&gt;: Transparency must be a key principle, with clear standards to resolve situations where there is a conflict of interests.&lt;/li&gt;&lt;li&gt;&lt;strong&gt;Accountability and Oversight&lt;/strong&gt;: Capabilities, measures and processes must be held accountable through capable and funded bodies and mechanisms.&lt;/li&gt;&lt;li&gt;&lt;strong&gt;Human Rights&lt;/strong&gt;: Security of the individual, the community, society, and the nation must be achieved through promoting a ‘feeling of being secure’ that stems from a rights-respecting framework.&lt;/li&gt;&lt;li&gt;&lt;strong&gt;Free and fair digital economy&lt;/strong&gt;: Pursue both domestic and geo-strategic policies and actions that enable a free and fair digital economy.&lt;/li&gt;&lt;/ol&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;p&gt;The strategy should be based on the following:&lt;/p&gt;
&lt;ol&gt;&lt;li&gt;&lt;strong&gt;Evidence-based&lt;/strong&gt;: Regular audits of the state of cyber security in India to inform action and policy.&lt;/li&gt;&lt;li&gt;&lt;strong&gt;Appropriate metrics:&lt;/strong&gt; Key metrics are needed to measure, track, and communicate cyber security in India.&lt;/li&gt;&lt;li&gt;&lt;strong&gt;Funding:&lt;/strong&gt; Funding for cyber security needs to be built into the budget.&lt;/li&gt;&lt;/ol&gt;
&lt;div&gt;&amp;nbsp;&lt;/div&gt;
&lt;h2&gt;Pillars of Strategy&lt;/h2&gt;
&lt;h3&gt;Secure &lt;br /&gt;&lt;/h3&gt;
&lt;p&gt;&lt;strong&gt;Key Defensive Measures: &lt;/strong&gt;Technical defense measures such as:&lt;/p&gt;
&lt;ul&gt;&lt;li&gt;Testing and auditing of hardware and software&lt;/li&gt;&lt;li&gt;Identification of threat intelligence vectors and existing vulnerabilities, particularly in systems designated as Critical Information Infrastructure (CII)&lt;/li&gt;&lt;li&gt;Outlining the scenarios in which retaliatory operations may be taken, and their nature, scope and limits&lt;/li&gt;&lt;/ul&gt;
&lt;p&gt;&lt;strong&gt;Designing a credible deterrence strategy, &lt;/strong&gt;which includes:&lt;/p&gt;
&lt;ul&gt;&lt;li&gt;Articulation of the nature, scale and permissible limits of retaliatory or escalatory measures undertaken AND&lt;/li&gt;&lt;li&gt;An exposition of how this matches with&amp;nbsp; the application of key tenets of International Law in cyberspace&lt;/li&gt;&lt;/ul&gt;
&lt;p&gt;&lt;strong&gt;Offensive Measures: &lt;/strong&gt;If India pursues cyber offensive capabilities, this must be done in accordance with the principles articulated above. This includes ensuring that the surveillance regime in India is in line with international human rights norms.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Emerging Technologies&lt;/strong&gt;: Emerging technologies must meet high security standards before they are scaled and deployed.&amp;nbsp; Creation of sandboxes should not be an exception.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Developing attribution capabilities&lt;/strong&gt;: If India pursues attribution capabilities, this must be through multi-stakeholder collaboration, should not risk military escalation, and must demonstrate compliance with evidentiary requirements of Indian criminal law and requirements in International Law on State Responsibility.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Process for response&lt;/strong&gt;: Define clear roles for the response protocol to a cyber attack including detection, mitigation and response.&lt;/p&gt;
&lt;h3&gt;Strengthen&lt;/h3&gt;
&lt;p&gt;&lt;strong&gt;Regulatory Requirements&lt;/strong&gt;&lt;/p&gt;
&lt;ul&gt;&lt;li&gt;Legal and Technical Security Standards: Develop harmonised and robust legal and technical security standards across sectors for crucial issues such as encryption and breach notifications. Promote industry-wide adoption of standards developed by BIS and encourage participation at standard-setting fora.&lt;/li&gt;&lt;li&gt;Cross-border sharing of data: Focus on a solution to the MLAT process, potentially including the negotiation of an executive agreement under the CLOUD Act.&lt;/li&gt;&lt;/ul&gt;
&lt;div&gt;
&lt;p&gt;&lt;strong&gt;Coordinated Vulnerability Disclosure&lt;/strong&gt;: Improve the processes for disclosing security vulnerabilities to the Government by stakeholders outside the government.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Incentives&lt;/strong&gt;: Develop incentives for strong cyber security practices such as cyber insurance programmes, certifications and seals, and tax incentives.&amp;nbsp;&lt;/p&gt;
&lt;/div&gt;
&lt;p&gt;&lt;strong&gt;Education and End User Awareness&lt;/strong&gt;: Develop solutions to aid users to understand and manage their digital security.&lt;/p&gt;
&lt;div&gt;
&lt;p&gt;&lt;strong&gt;Harmonization and interoperability&lt;/strong&gt;: Harmonize legislation, legal provisions, and department mandates and processes related to cyber security.&lt;/p&gt;
&lt;h3&gt;Synergise&lt;/h3&gt;
&lt;p&gt;Engage in processes at the regional and global level to prevent potential misunderstandings, define shared understandings, and identify areas of collaboration. This can take place through:&lt;/p&gt;
&lt;ul&gt;&lt;li&gt;&lt;strong&gt;Norms&lt;/strong&gt;: Clarify India’s understanding of the applicability of international law to cyber space and engage in norms processes and contribute to the articulation of&amp;nbsp; a development dimension for cyber norms.&lt;/li&gt;&lt;li&gt;&lt;strong&gt;CBMs&lt;/strong&gt;: Focus on political and legal&amp;nbsp; measures around transparency, cooperation, and stability in the region and globally. &lt;br /&gt;&lt;/li&gt;&lt;/ul&gt;
&lt;/div&gt;
&lt;div class="pullquote"&gt;&amp;nbsp;&lt;/div&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/comments-to-national-security-council-on-national-cybersecurity-strategy-2020'&gt;https://cis-india.org/internet-governance/blog/comments-to-national-security-council-on-national-cybersecurity-strategy-2020&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Elonnai Hickok and Arindrajit Basu</dc:creator>
    <dc:rights></dc:rights>


   <dc:date>2020-01-13T09:18:17Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/jobs/researchers-welfare-gender-surveillance-call">
    <title>Call for Researchers: Welfare, Gender, and Surveillance</title>
    <link>https://cis-india.org/jobs/researchers-welfare-gender-surveillance-call</link>
    <description>
        &lt;b&gt;We are inviting applications for two researchers. Each researcher is expected to write a narrative essay that interrogates the modes of surveillance that people of LGBTHIAQ+ and gender non-conforming identities and sexual orientations are put under as they seek sexual and reproductive health (SRH) services in India. The researchers are expected to undertake field research in the location they are based in, and reflect on lived experiences gathered through field research as well as their own experiences of doing field research. Please read the sections below for more details about the work involved, the timeline for the same, and the application process for this call.&lt;/b&gt;
        
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;h4&gt;Call for Researchers: &lt;a href="https://github.com/cis-india/website/raw/master/docs/CIS_Researchers_WelfareGenderSurveillance_Call_20200110.pdf" target="_blank"&gt;Download&lt;/a&gt; (PDF)&lt;/h4&gt;
&lt;hr /&gt;
&lt;h3&gt;&lt;strong&gt;Description of the Work&lt;/strong&gt;&lt;/h3&gt;
&lt;p&gt;Each researcher is expected to author a narrative essay that presents and reflects on lived experiences of people of LGBTHIAQ+ and gender non-conforming identities and sexual orientations as they seek sexual and reproductive health (SRH) services in India. We expect the essay to contribute to a larger body of knowledge around the increasing focus on data-driven initiatives for public health provision in the country and elsewhere. Accordingly, the researcher may respond to any one or more than one of the following questions, within the context of the geographical focus as specified by the researcher:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;What are the modes of surveillance, especially in terms of generation and exploitation of digital data, experienced by people of marginalised gender identities and sexual orientations in India, as they avail of sexual and reproductive healthcare?&lt;/li&gt;
&lt;li&gt;How are the lived experiences of underserved populations, such as people of marginalised gender identities and sexual orientations, shaped by gendered surveillance while accessing sexual and reproductive services?&lt;/li&gt;
&lt;li&gt;What are the modes of governance and gender ideologies that have mediated the increasing datafication of such provision?&lt;/li&gt;&lt;/ul&gt;
&lt;p&gt;We expect the researchers to a) draw on the Indian Supreme Court’s framing of privacy in India as a fundamental right, and its implications; and b) apply and/or build on feminist conceptualisations of privacy. Further, we expect the researchers to respond to the uncertain landscape of legal rights accessible to people of LGBTHIAQ+ and gender non-conforming identities and sexual orientations, especially in the current context shaped by The Transgender Persons (Protection of Rights) Act, 2019.&lt;/p&gt;
&lt;p&gt;The researchers will undertake field research in locations of their choice, conduct interviews and discussions with people of LGBTHIAQ+ and gender non-conforming identities and sexual orientations seeking such services, and conduct formal and informal interviews with officials and personnel associated with public and private sector agencies involved in the provision of SRH services.&lt;/p&gt;
&lt;h3&gt;&lt;strong&gt;Eligibility and Application Process&lt;/strong&gt;&lt;/h3&gt;
&lt;h4&gt;We specifically encourage people of LGBTHIAQ+ and gender non-conforming identities and sexual orientations to submit their applications for this call for researchers.&lt;/h4&gt;
&lt;p&gt;We are seeking applications from individuals who:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Are based in the place where field study is to be undertaken, for the duration of the study;&lt;/li&gt;
&lt;li&gt;Are fluent in the main regional language(s) spoken in the city where the study will be conducted, and in English (especially written);&lt;/li&gt;
&lt;li&gt;Preferably have a postgraduate degree (current students should also apply) in social or technical sciences, journalism, or legal studies (undergraduate degree-holders with research or work experience should also apply); and&lt;/li&gt;
&lt;li&gt;Have previous research and writing experiences on issues at the intersection of sexual and reproductive health, gender justice and women’s rights, and health informatics or digital public health.&lt;/li&gt;&lt;/ul&gt;
&lt;p&gt;Please send the following documents (in text or PDF formats) to &lt;strong&gt;raw@cis-india.org by Friday, January 24&lt;/strong&gt; to apply for the researcher positions:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Brief CV with relevant academic and professional information;&lt;/li&gt;
&lt;li&gt;Two samples of academic/professional (published/unpublished) writing by the applicant; and&lt;/li&gt;
&lt;li&gt;A brief research proposal (around 500 words) that should specify the scope (geographical and conceptual), research questions, and motivation of the essay to be authored by the applicant.&lt;/li&gt;&lt;/ul&gt;
&lt;p&gt;All applicants will be informed of the selection decisions by Friday, January 31.&lt;/p&gt;
&lt;h3&gt;&lt;strong&gt;Timeline of the Work&lt;/strong&gt;&lt;/h3&gt;
&lt;p&gt;&lt;strong&gt;February 3-7&lt;/strong&gt; CIS research team will have a call with each researcher to plan out the work to be undertaken by them&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;February - March&lt;/strong&gt; Researchers are to undertake field research, as proposed by the researchers and discussed with the CIS research team&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;March 27&lt;/strong&gt; Researchers are to submit a full draft essay (around 3,000 words)&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;March 30 - April 3&lt;/strong&gt; CIS research team will have a call with each researcher to discuss the shared draft essays and make plans towards their finalisation&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;May 15&lt;/strong&gt; Researchers are to submit the final essay (around 5,000 words, excluding footnotes and references)&lt;/p&gt;
&lt;p&gt;As part of this project, CIS will organise two discussion events, in Bengaluru and New Delhi, during April-June (tentatively). Event dates will be decided in conversation with the researchers, who will be invited to present their work at these events.&lt;/p&gt;
&lt;h3&gt;&lt;strong&gt;Remuneration&lt;/strong&gt;&lt;/h3&gt;
&lt;p&gt;Each researcher will be paid a remuneration of Rs. 1,00,000 (inclusive of taxes) in two equal installments: the first on signing of the agreement in February 2020, and the second on submission of the final essay in May 2020.&lt;/p&gt;
&lt;p&gt;We will also reimburse local travel expenses of each researcher up to Rs. 10,000, and translation and transcription expenses (if any) incurred by each researcher up to Rs. 10,000. These reimbursements will be made on the basis of expense invoices shared by the researcher.&lt;/p&gt;
&lt;h3&gt;&lt;strong&gt;Description of the Project&lt;/strong&gt;&lt;/h3&gt;
&lt;p&gt;Previous research conducted by CIS on the subject of sexual and reproductive health (SRH) services in India observes that there is a complex web of surveillance, or ‘dataveillance’, around each patient as they avail of SRH services from the state. In this current project, we are aiming to map the ecosystem of surveillance around SRH services as their provision becomes increasingly ‘data-driven’, and explore its implications for patients and beneficiaries.&lt;/p&gt;
&lt;p&gt;Through this project, we are interested in documenting the roles played by both the public and the private sector actors in this ecosystem of health surveillance. We understand the role of private sector actors as central to state provision of sexual and reproductive health services, especially through the institutionalisation of data-driven health insurance models, as well as through extensive privatisation of public health services. By studying semi-private, private, and public medical establishments including hospitals, primary/community health centres and clinics, we aim to develop a comparative analysis of surveillance ecosystems across the three establishment types.&lt;/p&gt;
&lt;p&gt;This project is led by Ambika Tandon, Aayush Rathi, and Sumandro Chattapadhyay at the Centre for Internet and Society, and is supported by a grant from Privacy International.&lt;/p&gt;
&lt;h3&gt;&lt;strong&gt;Indicative Reading List&lt;/strong&gt;&lt;/h3&gt;
&lt;p&gt;&lt;em&gt;We are sharing below a short and indicative list of readings that may be useful for potential applicants&lt;/em&gt;.&lt;/p&gt;
&lt;p&gt;Aayush Rathi, &lt;a href="https://www.epw.in/engage/article/indias-digital-health-paradigm-foolproof" target="_blank"&gt;Is India's Digital Health System Foolproof?&lt;/a&gt; (2019)&lt;/p&gt;
&lt;p&gt;Aayush Rathi and Ambika Tandon, &lt;a href="https://www.epw.in/engage/article/data-infrastructures-inequities-why-does-reproductive-health-surveillance-india-need-urgent-attention" target="_blank"&gt;Data Infrastructures and Inequities: Why Does Reproductive Health Surveillance in India Need Our Urgent Attention?&lt;/a&gt; (2019)&lt;/p&gt;
&lt;p&gt;Ambika Tandon, &lt;a href="https://cis-india.org/internet-governance/blog/ambika-tandon-december-23-2018-feminist-methodology-in-technology-research" target="_blank"&gt;Feminist Methodology in Technology Research: A Literature Review&lt;/a&gt; (2018)&lt;/p&gt;
&lt;p&gt;Ambika Tandon, &lt;a href="https://cis-india.org/raw/big-data-reproductive-health-india-mcts" target="_blank"&gt;Big Data and Reproductive Health in India: A Case Study of the Mother and Child Tracking System&lt;/a&gt; (2019)&lt;/p&gt;
&lt;p&gt;Anja Kovacs, &lt;a href="https://genderingsurveillance.internetdemocracy.in/theory/" target="_blank"&gt;Reading Surveillance through a Gendered Lens: Some Theory&lt;/a&gt; (2017)&lt;/p&gt;
&lt;p&gt;Lindsay Weinberg, &lt;a href="https://www.westminsterpapers.org/articles/10.16997/wpcc.258/" target="_blank"&gt;Rethinking Privacy: A Feminist Approach to Privacy Rights after Snowden&lt;/a&gt; (2017)&lt;/p&gt;
&lt;p&gt;Nicole Shephard, &lt;a href="https://www.apc.org/en/pubs/big-data-and-sexual-surveillance" target="_blank"&gt;Big Data and Sexual Surveillance&lt;/a&gt; (2016)&lt;/p&gt;
&lt;p&gt;Sadaf Khan, &lt;a href="https://deepdives.in/data-bleeding-everywhere-a-story-of-period-trackers-8766dc6a1e00" target="_blank"&gt;Data Bleeding Everywhere: A Story of Period Trackers&lt;/a&gt; (2019)&lt;/p&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/jobs/researchers-welfare-gender-surveillance-call'&gt;https://cis-india.org/jobs/researchers-welfare-gender-surveillance-call&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>ambika</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Welfare Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Gender</dc:subject>
    
    
        <dc:subject>Gender, Welfare, and Privacy</dc:subject>
    
    
        <dc:subject>Researchers at Work</dc:subject>
    

   <dc:date>2020-02-13T15:05:37Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/automated-facial-recognition-systems-and-the-mosaic-theory-of-privacy-the-way-forward">
    <title>Automated Facial Recognition Systems and the Mosaic Theory of Privacy: The Way Forward</title>
    <link>https://cis-india.org/internet-governance/automated-facial-recognition-systems-and-the-mosaic-theory-of-privacy-the-way-forward</link>
    <description>
        &lt;b&gt; Arindrajit Basu and Siddharth Sonkar have co-written this blog as the third of their three-part blog series on AI Policy Exchange under the parent title: Is there a Reasonable Expectation of Privacy from Data Aggregation by Automated Facial Recognition Systems? &lt;/b&gt;
        
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;The Mosaic Theory of Privacy&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;Whether the data collected by the AFRS should be treated similarly to face photographs taken for the purposes of ABBA is not clear in the absence of judicial opinion. The AFRS would ordinarily collect significantly more data than facial photographs taken during authentication. This can be explained with the help of the &lt;em&gt;&lt;a href="https://www.lawfareblog.com/defense-mosaic-theory" rel="noreferrer noopener" target="_blank"&gt;mosaic theory of privacy&lt;/a&gt;&lt;/em&gt;.&lt;/p&gt;
&lt;p&gt;The mosaic theory of privacy suggests that data collected about an individual over long durations can be qualitatively different from single instances of observation. It argues that aggregating data from different instances can create a picture of an individual that affects her reasonable expectation of privacy. This is because a mere slice of information reveals far less than the same information contextualised within a broader pattern: a mosaic.&lt;/p&gt;
&lt;p&gt;The mosaic theory of privacy does not find explicit reference in Puttaswamy II. The petitioners had argued that seeding Aadhaar data into existing databases would bridge information across silos so as to make real-time surveillance possible. This is because information, when integrated from different silos, becomes more than the sum of its parts.&lt;/p&gt;
&lt;p&gt;The Court, however, dismissed this argument, accepting UIDAI’s 
submission that the data collected remains in different silos and 
merging is not permitted within the Aadhaar framework. Therefore, the 
Court did not examine whether it is constitutionally permissible to 
integrate data from different silos; it simply rejected the possibility 
of surveillance as a result of Aadhaar authentication.&lt;/p&gt;
&lt;p&gt;Jurisprudence in other jurisdictions is more advanced. In &lt;em&gt;United States v. Jones&lt;/em&gt;, the United States Supreme Court observed that the insertion of a global positioning system device into Antoine Jones’ Jeep, in the absence of a warrant and without his consent, invaded his privacy, entitling him to Fourth Amendment protection. In this case, the movement of Jones’ vehicle was monitored for a period of twenty-eight days. Five concurring opinions in Jones acknowledge that aggregated and extensive surveillance is capable of violating the reasonable expectation of privacy, irrespective of whether the surveillance takes place in public.&lt;/p&gt;
&lt;p&gt;The Court distinguished between prolonged surveillance and short-term surveillance. Surveillance in the short run does not reveal what a person repeatedly does, whereas sustained surveillance can reveal significantly more about a person. The Court took the example of how a sequence of trips to a bar, a bookie, a gym or a church can tell a lot more about a person than any single visit viewed in isolation.&lt;/p&gt;
&lt;p&gt;Most recently, in &lt;a href="https://www.supremecourt.gov/opinions/17pdf/16-402_h315.pdf" rel="noreferrer noopener" target="_blank"&gt;&lt;em&gt;Carpenter v. United States&lt;/em&gt;&lt;/a&gt;, the Supreme Court of the United States held that the government’s collection of historical cell-site information exposes the physical movements of an individual to potential surveillance, and that an individual holds a reasonable expectation of privacy against such collection. The Court acknowledged that historical cell-site information allows the government to go back in time to retrace the exact whereabouts of a person.&lt;/p&gt;
&lt;p&gt;Judicial decisions have not specifically addressed whether facial recognition by law enforcement constitutes a search under the Fourth Amendment or a “mere visual observation”.&lt;/p&gt;
&lt;p&gt;The common thread linking CCTV footage and cellular data is the unique ability to track the movement of an individual from one place to another, enabling extreme forms of surveillance. It is perhaps this crucial link that would make AFRS-enabled CCTVs prejudicial to individual privacy.&lt;/p&gt;
&lt;p&gt;The mosaic theory as understood in &lt;em&gt;Carpenter&lt;/em&gt; helps one understand the extent to which an AFRS can augment the capacities of law enforcement in India. This in turn can help in understanding whether it is constitutionally permissible to install such systems across the country.&lt;/p&gt;
&lt;p&gt;AFRS-enabled CCTV footage from different cameras, if viewed in 
conjunction, could reveal a sequence of movements of an individual, 
enabling long-term surveillance of a nature that is qualitatively 
distinct from isolated observations across unrelated CCTV 
footage.&lt;/p&gt;
&lt;p&gt;Subsequent to &lt;em&gt;Carpenter&lt;/em&gt;, &lt;a href="https://www.lawfareblog.com/four-months-later-how-are-courts-interpreting-carpenter" rel="noreferrer noopener" target="_blank"&gt;federal district courts&lt;/a&gt;
 in the United States have declined to apply Carpenter to video 
surveillance cases since the judgement did not “call into question 
conventional surveillance techniques and tools, such as security 
cameras.”&lt;/p&gt;
&lt;p&gt;The extent of processing that an AFRS-enabled CCTV exposes an 
individual to would be significantly greater. This is because every time
 an individual is in the zone of an AFRS-enabled CCTV, the facial image 
will be compared to a common database. Snippets from different CCTVs 
capturing the individual’s physical presence in two different locations 
may not be meaningful per se. When observed together, however, the AFRS 
will make it possible to identify the individual’s movement from one 
place to another.&lt;/p&gt;
&lt;p&gt;For instance, the AFRS will be able to identify the person when they 
are on Street A at a particular time and when they are on Street B in 
the immediately subsequent hour, as recorded by the respective CCTV 
cameras, indicating the person’s physical movement from A to B. While a 
CCTV camera only records the movement of an individual in video format, 
an AFRS translates that digital information into individualised data by 
comparing facial features with a pre-existing database.&lt;/p&gt;
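The linkage step described above can be sketched in a few lines of Python. This is purely illustrative: the identity names, camera labels, toy face vectors and the 0.95 similarity threshold are all invented for the example and do not describe any actual AFRS; real systems derive face embeddings from a trained recognition model.

```python
import math
from dataclasses import dataclass


@dataclass
class Sighting:
    camera: str      # hypothetical camera label, e.g. "Street A"
    hour: int        # simplified timestamp
    embedding: list  # face feature vector a recognition model would emit


def cosine_similarity(a, b):
    """Standard cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norms = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norms


def link_sightings(database, sightings, threshold=0.95):
    """Match each sighting against the enrolled database and return each
    matched identity's time-ordered trail of camera locations."""
    trails = {}
    for s in sightings:
        for identity, enrolled in database.items():
            if cosine_similarity(s.embedding, enrolled) >= threshold:
                trails.setdefault(identity, []).append((s.hour, s.camera))
    for trail in trails.values():
        trail.sort()  # order the sightings in time
    return trails


# One enrolled (toy) face vector, and two cameras that each capture a
# near-identical vector an hour apart.
db = {"person_42": [0.6, 0.8, 0.0]}
sightings = [
    Sighting("Street B", 10, [0.61, 0.79, 0.02]),
    Sighting("Street A", 9, [0.59, 0.81, 0.01]),
]

trails = link_sightings(db, sightings)
# Each frame alone is just "a face on a street"; the linked, sorted
# trail reveals movement from Street A at hour 9 to Street B at hour 10.
```

The point of the sketch is that the privacy harm arises at the aggregation step, not in any single frame: the matching loop is what converts two unremarkable snippets into a record of movement.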
&lt;p&gt;Through data aggregation, which appears to be the aim of the Indian 
government in its tender linking three databases, it is apparent that 
the right to privacy is in danger. Yet, at present, there exists no case 
law or legislation that can render such efforts illegal.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Conclusions and The Way Forward&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;Despite a lack of judicial recognition of the potential 
unconstitutionality of deploying AFRS, it is clear that the introduction
 of these systems poses a clear and present danger to civil rights and 
human dignity. Algorithmic surveillance alters a human being’s life in 
ways that even the subject of this surveillance cannot fully comprehend.
 As an individual’s data is manipulated and aggregated to derive a 
pattern about that individual’s world, the individual and their data no 
longer exist for themselves but are massaged into various categories.&lt;/p&gt;
&lt;p&gt;Louise Amoore terms this a ‘&lt;a href="https://journals.sagepub.com/doi/abs/10.1177/0263276411417430?journalCode=tcsa" rel="noreferrer noopener" target="_blank"&gt;data-derivative&lt;/a&gt;’:
 an abstract conglomeration of data that continuously shapes our 
futures without us having a say in its framing. Branding an individual 
as a criminal and then aggregating their data causes emotional distress, 
as individuals move about in fear of the state gaze and of their 
association with activities branded as potentially dangerous — thereby 
suppressing the right to dissent — as exemplified by the reported use of 
AFRS during the recent protests in Hong Kong.&lt;/p&gt;
&lt;p&gt;Case law both in India and abroad has clearly suggested that a right 
to privacy is contextual and is not surrendered merely because an 
individual is in a public place. However, the jurisprudence protecting 
public photography or videography under the umbrella of privacy remains 
less clear globally and non-existent in India.&lt;/p&gt;
&lt;p&gt;The mosaic theory of privacy is useful in this regard, as it 
accurately identifies the unique power that the volume, velocity and 
variety of Big Data provide to the state, and offers a basis for 
guarding against mass ‘data-veillance’ of individual behaviour. It is 
therefore imperative that the judiciary recognise safeguards against 
data aggregation as an essential component of a reasonable expectation 
of privacy. At the same time, legislation could also provide the 
required safeguards.&lt;/p&gt;
&lt;p&gt;In the US, Senators Coons and Lee recently introduced a draft Bill titled ‘&lt;a href="https://www.coons.senate.gov/imo/media/doc/ALB19A70.pdf" rel="noreferrer noopener" target="_blank"&gt;The Facial Recognition Technology Warrant Act of 2019’&lt;/a&gt;.
 The Bill aims to impose reasonable restrictions on the use of facial 
recognition technology by law enforcement and creates safeguards 
against sustained tracking of the physical movements of an individual in 
public spaces. The Bill terms such tracking ‘ongoing surveillance’ when 
it occurs over a period of more than 72 hours, whether in real time or 
through the application of technology to historical records. The Bill 
requires that ongoing surveillance only be conducted for law enforcement purposes &lt;em&gt;and&lt;/em&gt; in pursuance of a court order (unless it is impractical to obtain one).&lt;/p&gt;
&lt;p&gt;While the Bill has its textual problems, it is worth considering as a 
model going forward to ensure that AFRS are deployed in line with a 
rights-respecting reading of a reasonable expectation of privacy. &lt;a href="http://datagovernance.org/report/adoption-and-regulation-of-facial-recognition-technologies-in-india" rel="noreferrer noopener" target="_blank"&gt;Parsheera&lt;/a&gt;
 suggests that legislation should provide for narrow tailoring of the 
objects and purposes for which AFRS may be deployed, restrictions on the 
persons whose images may be scanned from the databases, judicial 
approval for its use on a case-by-case basis, and effective mechanisms 
of oversight, analysis and verification.&lt;/p&gt;
&lt;p&gt;Appropriate legal intervention is crucial. A failure to implement 
this effectively jeopardizes the expression of our true selves and the 
core tenets of our democracy.&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/automated-facial-recognition-systems-and-the-mosaic-theory-of-privacy-the-way-forward'&gt;https://cis-india.org/internet-governance/automated-facial-recognition-systems-and-the-mosaic-theory-of-privacy-the-way-forward&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Arindrajit Basu, Siddharth Sonkar</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Cybersecurity</dc:subject>
    
    
        <dc:subject>Cyber Security</dc:subject>
    
    
        <dc:subject>internet governance</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    

   <dc:date>2020-01-02T14:12:38Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/automated-facial-recognition-systems-afrs-responding-to-related-privacy-concerns">
    <title>Automated Facial Recognition Systems (AFRS): Responding to Related Privacy Concerns</title>
    <link>https://cis-india.org/internet-governance/automated-facial-recognition-systems-afrs-responding-to-related-privacy-concerns</link>
    <description>
        &lt;b&gt;Arindrajit Basu and Siddharth Sonkar have co-written this blog as the second of their three-part blog series on AI Policy Exchange under the parent title: Is there a Reasonable Expectation of Privacy from Data Aggregation by Automated Facial Recognition Systems? &lt;/b&gt;
        
&lt;p&gt;The Supreme Court of India, in &lt;a href="https://indiankanoon.org/doc/91938676/"&gt;Puttaswamy I&lt;/a&gt;&lt;em&gt; &lt;/em&gt;recognized&lt;em&gt;&amp;nbsp;&lt;/em&gt;that
 the right to privacy is not surrendered merely because the individual 
is in a public place. Privacy is linked to the individual as it is an 
essential facet of human dignity. Justice Chelameswar further clarified 
that privacy is contextual. Even in a public setting, people trying to 
converse in whispers would signal a claim to the right to privacy. 
Speaking on a loudspeaker would naturally not signal the same claim.&lt;/p&gt;
&lt;p&gt;The Supreme Court of Canada has also affirmed the notion of 
contextual privacy. On 7 March 2019, the Supreme Court 
of Canada, &lt;a href="http://www.thecourt.ca/r-v-jarvis-carving-out-a-contextual-approach-to-privacy/" rel="noreferrer noopener" target="_blank"&gt;in a landmark decision&lt;/a&gt;, defined privacy rights in public areas, implicitly applying &lt;a href="https://crypto.stanford.edu/portia/papers/RevnissenbaumDTP31.pdf"&gt;Helen Nissenbaum’s theory of contextual integrity&lt;/a&gt;.
 Nissenbaum’s theory explains the extent to which the right to privacy 
is eroded in public spaces.&lt;/p&gt;
&lt;p&gt;Nissenbaum suggests that labelling information as exclusively public 
or private fails to take into account the context which rationalises the
 desire of the individual to exercise her privacy in public. To explain 
this with an illustration, there exists a reasonable expectation of 
privacy in the restroom of a restaurant, even though it is in a public 
space.&lt;/p&gt;
&lt;p&gt;In &lt;a href="http://www.thecourt.ca/r-v-jarvis-carving-out-a-contextual-approach-to-privacy/"&gt;&lt;em&gt;R v Jarvis&lt;/em&gt;&lt;/a&gt; (Jarvis), the Court overruled a Court of Appeal for Ontario &lt;a href="https://www.canlii.org/en/on/onca/doc/2017/2017onca778/2017onca778.pdf"&gt;decision&lt;/a&gt;
 to hold that people can have a reasonable expectation of privacy even 
in public spaces. In this case, Jarvis was charged with the offence of 
voyeurism for secretly recording his students. The primary issue that 
the&amp;nbsp; Supreme Court of Canada was concerned with was whether the students
 filmed by Mr. Jarvis enjoyed a reasonable expectation of privacy at 
their school.&lt;/p&gt;
&lt;p&gt;The Court in this case unanimously held that the students did indeed 
have a reasonable expectation of privacy. The Court identified nine 
contextual factors relevant to determining whether a reasonable 
expectation of privacy arises. The listed factors were:&lt;/p&gt;
&lt;p&gt;“1. The location the person was in when he or she was observed or recorded,&lt;/p&gt;
&lt;p&gt;2. The nature of the impugned conduct (whether it consisted of observation or recording),&lt;/p&gt;
&lt;p&gt;3. Awareness of or consent to potential observation or recording,&lt;/p&gt;
&lt;p&gt;4. The manner in which the observation or recording was done,&lt;/p&gt;
&lt;p&gt;5. The subject matter or content of the observation or recording,&lt;/p&gt;
&lt;p&gt;6. Any rules, regulations or policies that governed the observation or recording in question,&lt;/p&gt;
&lt;p&gt;7. The relationship between the person who was observed or recorded and the person who did the observing or recording,&lt;/p&gt;
&lt;p&gt;8. The purpose for which the observation or recording was done, and&lt;/p&gt;
&lt;p&gt;9. The personal attributes of the person who was observed or recorded.” (paragraph 29 of the judgement).&lt;/p&gt;
&lt;p&gt;The Court emphasized that the factors are not an exhaustive list, but
 rather were meant to be a guiding tool in determining whether a 
reasonable expectation of privacy existed in a given context. It is not 
necessary that each of these factors is present in a given situation to 
give rise to an expectation of privacy.&lt;/p&gt;
&lt;p&gt;Compared to the above-mentioned factors in Jarvis, the Indian Supreme Court in &lt;a href="https://indiankanoon.org/doc/127517806/"&gt;Justice K.S Puttaswamy (Retd.) v. Union of India&lt;/a&gt;: Justice Sikri (Puttaswamy II) &lt;strong&gt;—&lt;/strong&gt;
 the case which upheld the constitutionality of the Aadhaar project &lt;strong&gt;—&lt;/strong&gt;
 relied on the following factors to determine a reasonable expectation of
 privacy in a given context:&lt;/p&gt;
&lt;p&gt;“(i) What is the context in which a privacy claim is set up?&lt;/p&gt;
&lt;p&gt;(ii) Does the claim relate to private or family life, or a confidential relationship?&lt;/p&gt;
&lt;p&gt;(iii) Is the claim a serious one or is it trivial?&lt;/p&gt;
&lt;p&gt;(iv) Is the disclosure likely to result in any serious or significant injury and the nature and extent of disclosure?&lt;/p&gt;
&lt;p&gt;(v) Is disclosure relates to personal and sensitive information of an identified person?&lt;/p&gt;
&lt;p&gt;(vi) Does disclosure relate to information already disclosed publicly? If so, its implication?”&lt;/p&gt;
&lt;p&gt;These factors (acknowledged in Puttaswamy II in paragraph 292) are 
very similar to the ones laid down in Jarvis, i.e., there is a strong 
reliance on context in both cases. While there is no explicit mention of 
the personal attributes of the individual claiming a reasonable 
expectation, the holding that children should be given an opt-out 
indicates that the Court implicitly takes personal attributes (e.g. age) 
into account as well.&lt;/p&gt;
&lt;p&gt;The Court in Jarvis further (in paragraph 39) took the example of a 
woman in a communal change room at a public pool. She may expect other 
users to incidentally observe her undress but she would continue to 
expect only other women in the change room to observe her and reserve 
her rights against the general public. She would also expect not to be 
video recorded or photographed while undressing, both from other users 
of the pool and by the general public.&amp;nbsp;&lt;/p&gt;
&lt;p&gt;If it were later found that the change room had a one-way glass 
which allowed the pool staff to view the users change, or that a 
concealed camera had recorded persons while they were changing, she 
could claim a breach of her reasonable expectation of privacy, and such 
recording would constitute an invasion of privacy.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;So, in the context of an AFRS, an individual walking down a 
public road may still signal that they wish to avail of their right to 
privacy. In such contexts, a concerted surveillance mechanism may come 
up against constitutional roadblocks.&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;What is the nature of information being collected?&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;The second big question is the nature of the information being 
collected, which plays a role in determining the extent to which a 
person can exercise their reasonable expectation of privacy. 
Puttaswamy II laid down that the collection of core biometric 
information, such as fingerprints and iris scans, in the context of 
Aadhaar-Based Biometric Authentication (‘ABBA’) is constitutionally 
permissible. The basis of this conclusion is that the Aadhaar Act does 
not deal with the individual’s intimate or private sphere.&lt;/p&gt;
&lt;p&gt;The judgement of the Supreme Court in Puttaswamy II arises in a very 
specific context (i.e. the ABBA). It does not explain or identify the 
contextual factors which determine the extent to which privacy may be 
reasonably expected over biometrics generally. In this judgment, the 
Court observed that demographic information and photographs do not raise
 a reasonable expectation of privacy under Article 21 unless there exist
 special circumstances, such as the disclosure of the identity of a 
juvenile in conflict with the law or of a rape victim.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Most importantly, the Court held that face photographs for 
the purpose of identification are not covered by a reasonable 
expectation of privacy. The Court distinguished face photographs from 
intimate photographs or those photographs which concern confidential 
situations. &lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Face photographs, according to the Court, are shared by 
individuals in the ordinary course of conduct for the purpose of 
obtaining a driving license, voter ID, passport, examination admit 
cards, employment cards, and so on. Face photographs by themselves 
reveal no information.&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;Naturally, this&amp;nbsp;pronouncement of the Apex Court is a huge boost for the introduction of AFRS in India.&lt;/p&gt;
&lt;p&gt;Abroad, however, on 4 September 2019, in &lt;a href="https://www.judiciary.uk/wp-content/uploads/2019/09/bridges-swp-judgment-Final03-09-19-1.pdf"&gt;Edward Bridges v. Chief Constable of South Wales Police&lt;/a&gt;, a Division Bench of the High Court of England and Wales heard a challenge against an AFRS introduced by law enforcement (&lt;em&gt;see&lt;/em&gt;
 Endnote 1). The High Court rejected a claim for judicial review, 
holding that the AFRS in question does not violate, inter alia, the 
right to privacy under Article 8 of the European Convention on Human 
Rights (‘ECHR’).&lt;/p&gt;
&lt;p&gt;According to the Court, the AFRS was used for specific and limited 
purposes, i.e., only when the image of the public matched a person on an
 existing watchlist. The use of the AFRS was therefore considered a 
lawful and fair restriction.&lt;/p&gt;
&lt;p&gt;The Court, however, acknowledged that extracting biometric data 
through AFRS is “well beyond the expected and unsurprising”. This seems 
to be a departure from the Indian Supreme Court’s observation in 
Puttaswamy II that there is no reasonable expectation of privacy over 
biometric data in the context of ABBA, and may be a wiser approach for 
the Indian courts to adopt.&lt;/p&gt;
&lt;h6&gt;&lt;strong&gt;Endnote &lt;/strong&gt;&lt;/h6&gt;
&lt;p&gt;1. The challenge was put forth by Edward Bridges, a civil liberties 
campaigner from Cardiff, after he was caught on camera in two particular 
deployments of the AFRS: a) when he was at Queen Street, a busy shopping 
area in Cardiff, and b) when he was at the Defence Procurement, 
Research, Technology and Exportability Exhibition held at the Motorpoint 
Arena.&lt;/p&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;p&gt;This was published by &lt;a class="external-link" href="https://aipolicyexchange.org/2019/12/28/automated-facial-recognition-systems-afrs-responding-to-related-privacy-concerns/"&gt;AI Policy Exchange&lt;/a&gt;.&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/automated-facial-recognition-systems-afrs-responding-to-related-privacy-concerns'&gt;https://cis-india.org/internet-governance/automated-facial-recognition-systems-afrs-responding-to-related-privacy-concerns&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Arindrajit Basu, Siddharth Sonkar</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Cybersecurity</dc:subject>
    
    
        <dc:subject>Cyber Security</dc:subject>
    
    
        <dc:subject>internet governance</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    

   <dc:date>2020-01-02T14:09:14Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/decrypting-automated-facial-recognition-systems-afrs-and-delineating-related-privacy-concerns">
    <title>Decrypting Automated Facial Recognition Systems (AFRS) and Delineating Related Privacy Concerns</title>
    <link>https://cis-india.org/internet-governance/decrypting-automated-facial-recognition-systems-afrs-and-delineating-related-privacy-concerns</link>
    <description>
        &lt;b&gt;Arindrajit Basu and Siddharth Sonkar have co-written this blog as the first of their three-part blog series on AI Policy Exchange under the parent title: Is there a Reasonable Expectation of Privacy from Data Aggregation by Automated Facial Recognition Systems?&lt;/b&gt;
        
&lt;p&gt;The use of aggregated Big Data by governments has the potential to 
exacerbate power asymmetries and erode civil liberties like few 
technologies of the past. In order to guard against the aggressive&amp;nbsp; 
aggregation&amp;nbsp;and manipulation of&amp;nbsp;the data generated by individuals&amp;nbsp;who 
are branded&amp;nbsp;as suspect, it is critical that our firmly established 
constitutional rights protect human dignity in the face of this 
potential erosion.&lt;/p&gt;
&lt;p&gt;The increasing ubiquity of Automated Facial Recognition Systems 
(AFRS) serves as a prime example of the rising desire of governments to 
push fundamental rights to the brink. With AFRS, the core fundamental 
right in question is privacy, although questions have been posed 
regarding the potential violation of other related rights, such as the 
Right to Equality and the Right to Free Speech and Expression, as well.&lt;/p&gt;
&lt;p&gt;There is a rich corpus of literature (see &lt;a href="https://indianexpress.com/article/opinion/columns/digital-identification-facial-recognition-system-ncrb-5859072/" rel="noreferrer noopener" target="_blank"&gt;here&lt;/a&gt;, &lt;a href="http://www.unswlawjournal.unsw.edu.au/wp-content/uploads/2017/09/40-1-11.pdf" rel="noreferrer noopener" target="_blank"&gt;here&lt;/a&gt; and an excellent recent paper by Smriti Parsheera &lt;a href="http://datagovernance.org/report/adoption-and-regulation-of-facial-recognition-technologies-in-india" rel="noreferrer noopener" target="_blank"&gt;here&lt;/a&gt;)
 from a diverse coterie of scholars that calls out the challenges posed 
by AFRS, particularly with respect to its proportionality as a 
restriction on the right to privacy. Our contribution to this 
discourse focuses on a very specific question around a ‘reasonable 
expectation of privacy’ — the standard identified for the protection of 
privacy in public spaces across jurisdictions, including in India. This 
is because, at this juncture, the precise nature of the AFRS which will 
eventually be used and the regulations it will be subject to are not 
clear.&lt;/p&gt;
&lt;p&gt;In &lt;a href="https://indiankanoon.org/doc/91938676/" rel="noreferrer noopener" target="_blank"&gt;Justice K.S Puttaswamy (Retd.) v. Union of India&lt;/a&gt;:
 Justice Chandrachud (Puttaswamy I), the Indian Supreme Court was 
concerned with the question of whether there exists a fundamental right 
to privacy under the Indian Constitution. A nine-judge bench of the 
Court recognized that the right to privacy is a fundamental right 
implicit, inter alia, in the right to life under Article 21 of the 
Constitution.&lt;/p&gt;
&lt;p&gt;The right to privacy protects people and not places. Every person is 
entitled to a reasonable expectation of privacy. This expectation is 
twofold. First, the person must prove that the alleged act could inflict 
some harm; such harm must be real and not speculative or imaginary. 
Second, society must recognize this expectation as reasonable. The test 
of reasonable expectation is contextual, i.e., the extent to which it 
safeguards privacy depends on where the individual is.&lt;/p&gt;
&lt;p&gt;In order to pass any constitutional test, therefore, AFRS must 
satisfy the ‘reasonable expectation’ test articulated in Puttaswamy. 
However, in this context, the test itself has multiple contours. Do we 
have a right to privacy in a public place? Is AFRS collecting any data 
that specifically violates a right to privacy? Is the aggregation of 
that data a potential violation?&lt;/p&gt;
&lt;p&gt;After providing a brief introduction to the use cases of AFRS in 
India and across the world, we embark upon answering all these 
questions.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Primer on Automated Facial Recognition Systems (AFRS)&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;Facial recognition is a biometric technology that utilises cameras to
 match stored or live footage of individuals (including both stills and 
moving footage) with images or video&amp;nbsp;from an existing database. Some 
systems might also be used to analyze broader demographic trends or 
conduct sentiment analysis through crowd scanning.&lt;/p&gt;
&lt;p&gt;While photographs and video footage have long been core 
components of police investigation, the use of algorithms to process 
vast tracts of Big Data (characterized by ‘Volume, Velocity, and 
Variety’) and to compare disparate and discrete data points allows for 
the derivation of hitherto unfeasible insights on the subjects of Big 
Data.&lt;/p&gt;
&lt;p&gt;The utilisation of AFRS for law enforcement is rapidly spreading around the world. &lt;a href="https://carnegieendowment.org/2019/09/17/global-expansion-of-ai-surveillance-pub-79847" rel="noreferrer noopener" target="_blank"&gt;A Global AI Surveillance Index&lt;/a&gt;
 compiled by the Carnegie Endowment for International Peace found that 
at least sixty-four countries are incorporating facial recognition 
systems into their AI surveillance programs.&lt;/p&gt;
&lt;p&gt;Chinese technology company Yitu has entered into a partnership with 
security forces in Malaysia to equip police officers with facial 
recognition body cameras that, powered by enabling technologies, would 
allow a comparison of images caught by the live body cameras with images
 from several central databases.&lt;/p&gt;
&lt;p&gt;In &lt;a href="https://news.sky.com/story/met-polices-facial-recognition-tech-has-81-error-rate-independent-report-says-11755941" rel="noreferrer noopener" target="_blank"&gt;England and Wales&lt;/a&gt;,
 London Metropolitan Police, South Wales Police, and Leicestershire 
Police are all in the process of developing technologies that allow for 
the identification and comparison of live images with those stored in a 
database.&lt;/p&gt;
&lt;p&gt;The technology is being developed by Japanese firm NEC and the police
 force has limited ability to oversee or modify the software, given its 
proprietary nature. The Deputy Chief of South Wales Police stated that 
“the tech is given to [them] as a sealed box… [and the police force 
themselves] have no input – whatever it does, it does what it does.”&lt;/p&gt;
&lt;p&gt;In the US, &lt;a href="https://www.americanbar.org/groups/criminal_justice/publications/criminal-justice-magazine/2019/spring/facial-recognition-technology/" rel="noreferrer noopener" target="_blank"&gt;Baltimore’s police&lt;/a&gt;
 set up facial recognition cameras to track and arrest protestors — a 
system that reached its zenith during the 2015 riots in the city.&lt;/p&gt;
&lt;p&gt;It is suspected that authorities in &lt;a href="https://www.japantimes.co.jp/news/2019/10/23/asia-pacific/hong-kong-protests-ai-facial-recognition-tech/#.Xf1Fs_zhVPY" rel="noreferrer noopener" target="_blank"&gt;Hong Kong&lt;/a&gt; are also using AFRS to clamp down on the ongoing pro-democracy protests.&lt;/p&gt;
&lt;p&gt;In India, the Ministry of Home Affairs, through the National Crime Records Bureau, put out a &lt;a href="http://ncrb.gov.in/TENDERS/AFRS/RFP_NAFRS.pdf" rel="noreferrer noopener" target="_blank"&gt;tender for a new AFRS&lt;/a&gt;,
 whose stated objective is to “act as a foundation for national level 
searchable platform of facial images.” The AFRS will pull facial image 
data from CCTV feeds and compare these with existing records across 
databases including the Crime and Criminal Tracking Networks and Systems
 (CCTNS), Inter-operable Criminal Justice System (ICJS), Immigration 
Visa Foreigner Registration Tracking (IVFRT), Passport, Prisons and 
state police records.&lt;/p&gt;
&lt;p&gt;Plans are also afoot to integrate this with the yet to be deployed 
National Automated Fingerprint Identification System (NAFIS), thereby 
creating a multi-faceted surveillance system.&lt;/p&gt;
&lt;p&gt;Despite raising eyebrows due to its potentially all-pervasive scope, 
this tender is not the first instance of AFRS being used by Indian 
authorities. Punjab Police, &lt;a href="https://www.livemint.com/AI/DIh6fmR6croUJps6x7JW5K/Meet-Staqu-a-startup-helping-Indian-law-enforcement-agencie.html" rel="noreferrer noopener" target="_blank"&gt;in partnership with Gurugram-based start-up Staqu&lt;/a&gt;,
 has launched and commenced implementation of the Punjab Artificial 
Intelligence System (PAIS), which uses digitised criminal records and 
automated facial recognition to retrieve information on suspected 
criminals and, in effect, track their public whereabouts, raising 
potential constitutional questions.&lt;/p&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;p&gt;This was published by &lt;a class="external-link" href="https://aipolicyexchange.org/2019/12/26/decrypting-automated-facial-recognition-systems-afrs-and-delineating-related-privacy-concerns/"&gt;AI Policy Exchange&lt;/a&gt;.&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/decrypting-automated-facial-recognition-systems-afrs-and-delineating-related-privacy-concerns'&gt;https://cis-india.org/internet-governance/decrypting-automated-facial-recognition-systems-afrs-and-delineating-related-privacy-concerns&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Arindrajit Basu, Siddharth Sonkar</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Cybersecurity</dc:subject>
    
    
        <dc:subject>Cyber Security</dc:subject>
    
    
        <dc:subject>internet governance</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    

   <dc:date>2020-01-02T14:01:48Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/pegasus-snoopgate-an-opportune-moment-to-revisit-legal-framework-governing-state-surveillance-framework">
    <title>Pegasus snoopgate, an opportune moment to revisit legal framework governing state surveillance framework</title>
    <link>https://cis-india.org/internet-governance/blog/pegasus-snoopgate-an-opportune-moment-to-revisit-legal-framework-governing-state-surveillance-framework</link>
    <description>
        &lt;b&gt;Revelations of hacking call for a relook at India’s surveillance regime&lt;/b&gt;
        
&lt;p&gt;This article by Gurshabad Grover and Tanaya Rajwade was published in &lt;a class="external-link" href="https://indianexpress.com/article/opinion/columns/pegasus-whatsapp-surveillance-data-protection-6183355/"&gt;the Indian Express&lt;/a&gt; on December 25, 2019. The authors would like to thank Arindrajit Basu for his comments and suggestions.&lt;/p&gt;
&lt;hr /&gt;
&lt;p&gt;In early November, it became clear that several lawyers and human 
rights activists had been targeted by spyware that allowed attackers 
unfettered access to information stored on victims’ phones. On November 
29, in the Rajya Sabha, the Minister of Electronics and Information 
Technology was repeatedly asked whether any Indian agency had 
commissioned, from the Israeli firm NSO, the attack vector “&lt;a href="https://indianexpress.com/article/explained/whatsapp-spyware-pegasus-india-surveillance-nso-israel-6096910/"&gt;Pegasus&lt;/a&gt;”
 that was used in the attacks. Where a categorical response would have 
sufficed, the minister chose to muddy the waters with vague assertions 
such as “standard operating procedures have been followed”.&lt;/p&gt;
&lt;p&gt;There are cogent reasons pointing towards an Indian law enforcement 
agency’s hand in procuring Pegasus. First, NSO maintains that it only 
sells services and software to state agencies. Second, some of the known
 Indian targets of the vulnerability are human rights activists. These 
individuals work on India-specific issues and hardly qualify as serious 
threats in the eyes of a foreign government.&lt;/p&gt;
&lt;p&gt;The government derives some of its powers to conduct electronic 
surveillance from Section 69 of the Information Technology (IT) Act. The
 procedures for such surveillance are defined in the IT (Procedure and 
Safeguards for Interception, Monitoring and Decryption of Information) 
Rules, 2009. It is these rules, and not the parent Act, that define the 
terms “interception” and “monitoring” as “acquisition of the contents of
 any information through the use of any means” and “to view or to 
inspect or listen to or record information”, respectively. These 
all-encompassing definitions seemingly permit authorised law enforcement
 agencies to use Pegasus-like tools.&lt;/p&gt;
&lt;p&gt;However, the IT Act also penalises unauthorised access to computers 
without the owner’s permission. These provisions, namely sections 43 
and 66, do not carve out an exception for law enforcement agencies. As 
lawyer Raman Chima highlighted recently, any action explicitly 
prohibited under the Act cannot be justified by procedures laid out in 
subordinate legislation. Therefore, no law enforcement agency can “hack”
 devices, though they may “intercept” or “monitor” through other means. 
Additionally, the Supreme Court’s privacy verdict held that any invasion 
of privacy by the state must be based on a law. As some of the agencies 
authorised to conduct surveillance (like the Intelligence Bureau) do not
 have statutory backing, surveillance by them is unconstitutional.&lt;/p&gt;
&lt;p&gt;The use of spyware gives the state access to private conversations, 
including privileged communications with lawyers. Such an infringement 
of rights may be justified for militants suspected of actively planning 
an armed attack. For academicians and human rights activists, the use of
 broad surveillance without any evidence or anticipation of such 
activities is unfathomable in a democracy.&lt;/p&gt;
&lt;p&gt;With the popularity of end-to-end encryption, surveillance may 
require the exploitation of vulnerabilities on end-users’ devices. The 
Pegasus snoopgate is an opportune moment to revisit the legal framework 
governing state surveillance. It is crucial to dismantle 
state agencies that run surveillance operations despite lacking 
statutory authority. For other agencies, there is a need to introduce 
judicial and parliamentary oversight. Depending on the concerns of law 
enforcement, it may be necessary to enact legislation permitting 
“hacking” into devices on extremely limited grounds.&lt;/p&gt;
&lt;p&gt;Unfortunately, the government has taken a massive leap backwards by 
ignoring the standards laid down by the Supreme Court and Justice 
Srikrishna Committee’s recommendations, and introducing unconstitutional
 surveillance enablers in the Data Protection Bill. Now is the time for 
Parliament to guarantee the privacy and security of Indians.&lt;/p&gt;
&lt;p&gt;&lt;em&gt;&lt;strong&gt;Grover and Rajwade are researchers at the Centre for 
Internet and Society (CIS). Views are personal. Disclosure: CIS is a 
recipient of research grants from Facebook.&lt;/strong&gt;&lt;/em&gt;&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/pegasus-snoopgate-an-opportune-moment-to-revisit-legal-framework-governing-state-surveillance-framework'&gt;https://cis-india.org/internet-governance/blog/pegasus-snoopgate-an-opportune-moment-to-revisit-legal-framework-governing-state-surveillance-framework&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Gurshabad Grover and Tanaya Rajwade</dc:creator>
    <dc:rights></dc:rights>


   <dc:date>2020-07-09T01:30:34Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/raw/anushree-gupta-ladies-log-women-safety-risk-transfer-ridehailing">
    <title>Anushree Gupta - Ladies ‘Log’: Women’s Safety and Risk Transfer in Ridehailing</title>
    <link>https://cis-india.org/raw/anushree-gupta-ladies-log-women-safety-risk-transfer-ridehailing</link>
    <description>
        &lt;b&gt;Working in the gig economy has been associated with economic vulnerabilities. However, there are also moral and affective vulnerabilities, as workers find their worth measured every day by their performance of—and at—work and in every interaction and movement. This essay by Anushree Gupta is the third in a series of writings by researchers associated with the 'Mapping Digital Labour in India' project at CIS, supported by the Azim Premji University, that were published on the Platypus blog of the Committee on the Anthropology of Science, Technology, and Computing (CASTAC). The essay is edited by Noopur Raval, who co-led the project concerned.&lt;/b&gt;
        
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;p&gt;&lt;em&gt;Originally published by the &lt;a href="http://blog.castac.org/category/series/indias-gig-work-economy/" target="_blank"&gt;Platypus blog&lt;/a&gt; of CASTAC on August 1, 2019.&lt;/em&gt;&lt;/p&gt;
&lt;h4&gt;Summary of the essay in Hindi: &lt;a href="https://www.youtube.com/watch?v=ty0a_u9lzCE" target="_blank"&gt;Audio&lt;/a&gt; (YouTube) and &lt;a href="http://blog.castac.org/wp-content/uploads/sites/2/2019/07/Blog-Post-Audio-Transcript-Devanigiri.docx" target="_blank"&gt;Transcript&lt;/a&gt; (text)&lt;/h4&gt;
&lt;hr /&gt;
&lt;p&gt;Mumbai, India’s financial capital, is also often considered one of the safest cities for women in India, especially in contrast with New Delhi, which is infamously dubbed the “rape capital” of the country. Sensationalised incidents of harassment, molestation and rape serve as anecdotal references and warnings to other women who dare to venture out alone even during the daytime. The Delhi government recently proposed a policy of free transport for women in public buses and metro trains, with the objective of increasing affordability and access for women and ensuring their safety in public transportation. [1] Despite such measures to increase women’s visibility and claims to public utilities and spaces, women who use public transport have historically suffered groping and stalking on buses and trains, which uphold self-policing and surveillance narratives. The issue of women’s safety in India remains a priority as well as a good rhetorical claim and goal to aspire to, for public and private initiatives. Ironically, the notion of women’s safety is also advanced to increase moral policing and censure women’s access to public spaces, which also perpetuates the exclusion of other marginalised citizens (Phadke 2007). Further, and crucially, whose safety is being imagined, prioritized and designed for (which class of women are central to the imagination of the safety discourse) is often a point of contention.&lt;/p&gt;
&lt;p&gt;In this context, ridehailing services offered by Uber and Ola have come to be frequently cited as safer and more reliable options for women to traverse the cityspace, compared to overcrowded buses and trains. Their mobile applications promise accountability and traceability, enforcing safety standards by way of qualified and well-groomed drivers, SOS buttons and location-sharing features. However, it has increasingly become common knowledge that these alternatives are prone to similar, if not worse, categories of crimes against women. While reports of violence against women in cabs have mostly been outside of Mumbai, due to “platform-effects,” such incidents have widespread ramifications for drivers across the country. Cab drivers who operate via cab aggregator platforms have come under heavy scrutiny not only by the corporate and legal infrastructures of aggregator companies but also in the public eye.  On the other hand, platform companies independently, and in partnership with city and state administrations, continue to launch “social impact” initiatives aimed at women’s safety as well as employment (through taxi-driving training). [2] Incidents of violence against women present jarring narratives of risk not only for female passengers but also for the platform-workers, both of whom are responsible for abiding by the constructed notions of safety for women in urban spaces.&lt;/p&gt;
&lt;p&gt;In this post, I explore women’s presence as workers as well as passengers/customers in the ridehailing platform economy, in the context of women’s safety, situating the analysis with a focus on Mumbai. The related discourses around risk for female commuters give rise to various interventions and women-centric services through female-only cab enterprises and training more women drivers to mitigate this risk. Through these, I will think through the figure of the woman in the ridehailing economy in Mumbai and by extension in India.&lt;/p&gt;
&lt;h3&gt;&lt;strong&gt;Platforms in Gendered Cityscapes&lt;/strong&gt;&lt;/h3&gt;
&lt;p&gt;Mumbai’s public transport comprises the local train network, BEST buses and auto rickshaws, with the metro being the newest addition to the mix. Unlike in most of India, kaali-peelis (black-yellow cabs) have been a permanent feature of Mumbai’s landscape since the 1950s, and taking a cab is not necessarily a luxury. Against this backdrop, platform companies have claimed to democratize public transport and to provide safer travel options to women in the city.&lt;/p&gt;
&lt;p&gt;Cab drivers on ridehailing platforms in Mumbai are usually domestic male migrants or Muslim drivers from within and outside the city, who are more often than not overworked and stressed due to falling incomes and rising debts. It is important to recognise the ‘veiled masculinities’ (Chopra 2006) which labor to service the emergent platform economy and the hierarchies of caste and class which are sustained through their labor. The incongruence between the masculinity of a working class man and the demands of the service economy (Nixon 2009) exacerbates emotional pressures in customer-facing services, which can offer an explanation for angry outbursts and conflicts between drivers and customers.&lt;/p&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;img src="https://cis-india.org/CIS_APU_DigitalLabour_PlatypusEssays_AG_01.jpg/image_preview" alt="CIS_APU_DigitalLabour_PlatypusEssays_AG_01" class="image-left image-inline" title="CIS_APU_DigitalLabour_PlatypusEssays_AG_01" /&gt;
&lt;h5&gt;Uber’s ad on a billboard in Mumbai promises earnings of more than Rs. 1 lakh per month. Using a woman’s image illustrates the extent of their potential for transforming lives and livelihoods. &lt;em&gt;Source: Drivers’ Union Telegram Group&lt;/em&gt;.&lt;/h5&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;p&gt;While Uber and Ola claim that a large number of women drivers work on their platforms, the actual experiences of passengers and the male drivers I spoke to suggested otherwise. Ironically, mass driver-training programs are seen as a quick way to make low-skilled and migrant male workers employable in Indian cities while, despite public-private partnerships to train women, it has been impossible to retain women drivers due to stereotypical perceptions of gender and persistent social stigma. [3] This made the female ridehailing passenger (an upper-middle-class professional who can afford the service) a stakeholder to design for, while female drivers (indeed, all female workers) appeared as a liability for platforms.&lt;/p&gt;
&lt;p&gt;These narratives speak directly to the construction of insecurity and risk for women (Berrington and Jones 2002) on public transport systems as they highlight vulnerabilities due to public exposure of women’s bodies. Pandering to a moral panic standpoint and creating personalised or ‘inside’ safe spaces for women to manage risk (Green and Singleton 2006), these platforms can then be imagined as a boundary-setting exercise. Access to public spaces is encouraged but it is delimited by confining the woman’s body to a singular vehicle in the custody of the cab driver. Autonomy and access afforded by the platform manages to transform women—particularly upper class and upper caste women who can afford these services—into potential customers. Their agency is bounded though by tasking the driver to ferry her across the otherwise hostile cityscape filled with ‘unfriendly bodies’ (Phadke 2013). The production of the city’s gendered space goes hand in hand with the confinement/erasure of female bodies in the public space as they embody patriarchal norms even in a city as ‘progressive’ as Mumbai. As demonstrated by studies mapping the movement of women in the city (Ranade 2007), the spatio-temporal factors lend themselves to creating gendered bodies in order to keep patriarchal norms intact. These norms, as I argue in this post, are detrimental not just to women but also other marginalised sections of the urban population, in this case platform workers.&lt;/p&gt;
&lt;h3&gt;&lt;strong&gt;Terms of Safety&lt;/strong&gt;&lt;/h3&gt;
&lt;p&gt;Male drivers’ social identities as lower class, lower caste individuals do not inspire confidence in the standards of safety boasted by these companies in the eyes of their predominantly upper caste and upper class customer base. Risk to female passengers is further exaggerated due to the closed space in which the service is provided, highlighting the proximity to a potential aggressor by way of these platforms. In specific situations wherein a female passenger is inebriated or is travelling alone at night, drivers report being extra cautious and helpful towards her. Many respondents proudly mention going out of their way to make sure women get home safely, for instance, prolonging waiting time or escorting them to the entrance of their residential buildings or involving the security guard at the gate.&lt;/p&gt;
&lt;p&gt;However, there have also been cases wherein the driver has been under scrutiny either by an overly careful passenger or by the public. One driver reported being surrounded by a crowd at a traffic signal, only to realise that he was being suspected of foul play with the female passenger who had fallen asleep on the backseat of the car. In contrast to their western counterparts, the class differences between drivers and passengers in India exacerbate doubts, fears and insecurities, which tend to take a caste-purity angle as well. The woman’s body undergoes an exchange of custody in these instances wherein she is deemed incapable of taking care of herself and requires external assistance. Imagining a deterrence effect of ridesharing services (Park et al. 2017) reinforces the logic of guardianship and protectionism for the woman. The risk of carrying her in the vehicle in these situations is borne by the cab driver, operating under a framework of overbearing protectiveness which holds him culpable for any misgivings, assumed or otherwise.&lt;/p&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;img src="https://cis-india.org/CIS_APU_DigitalLabour_PlatypusEssays_AG_02.jpg/image_preview" alt="CIS_APU_DigitalLabour_PlatypusEssays_AG_02" class="image-left image-inline" title="CIS_APU_DigitalLabour_PlatypusEssays_AG_02" /&gt;
&lt;h5&gt;Cautionary listicles advise women not to take a cab alone at night, to carry pepper sprays or umbrellas as tools for self-defence, and to refrain from conversations with drivers or to talk continuously on the phone, among other things. The onus of the woman’s safety is either on the individual herself or on the driver who is ferrying her. Moreover, the driver is a likely assailant whom the woman should guard against as well. &lt;em&gt;Source: &lt;a href="https://www.hellotravel.com/stories/10-ways-for-women-to-ensure-safety-when-boarding-cab" target="_blank"&gt;HelloTravel&lt;/a&gt;&lt;/em&gt;.&lt;/h5&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;p&gt;Notions of safety and risk are embodied in everyday interactions in urban spaces and mediated by disparate infrastructures of knowledge across distinctions of caste, class and gender. These distinctions define constraints which govern social interactions between actors of these categories. Interactions between lower caste or Muslim men and upper caste/class women are circumscribed by what Tuan (1979) describes as ‘landscapes of fear’. Be it the apprehensions about sharing a ride with a passenger of the opposite sex (Sarriera et al. 2017) or reports of gang-rapes by cab drivers, the boundaries of social conduct are laid out clearly by constructing narratives of risk and safety. The protection of the female body and her sexual safety is not her responsibility alone but that of society as a whole. The so-called preventive measures for rape and violence against women produce the dichotomies of frailty and strength (Campbell 2005) in so far as they project the woman as always at risk, with the shadow of a potential assault always looming large.&lt;/p&gt;
&lt;p&gt;When asked about interactions with women as customers or fellow drivers, drivers performed exaggerated respectability for women. The catch in these narratives however was that drivers justified and extended respect only to ‘good’ customers, where a ‘good’ woman was a certain kind of a moral actor.&lt;/p&gt;
&lt;p&gt;Given the prevailing discontent with redressal mechanisms for workers on the platforms, it was not surprising to witness a group of drivers at the Uber Seva Kendra (help centre) in Mumbai, debating whether they should be accepting requests from any female customers at all. Drivers also had to attend mandatory training sessions for ‘good conduct’ with customers wherein they underwent behavioral correction and gender sensitisation lessons. [4] The gendering of the platform economy is baked into these instructions and trainings that reproduce male drivers as figures of safety and constant positive affect.&lt;/p&gt;
&lt;h3&gt;&lt;strong&gt;Gender, Safety, and Enterprise&lt;/strong&gt;&lt;/h3&gt;
&lt;p&gt;In my fieldwork, I also came across a slew of ventures run by fleet owners and others that sought to service women passengers and employ women drivers exclusively. Claiming to fill in the gaps of inadequate vetting mechanisms in existing platforms, these alternate ventures purportedly smoothened out some anxieties by eliminating the risk of interacting with a man from different socio-economic strata. The premium charged by these companies was telling of the value of safety and affordability of these services for a large section of their intended audience, namely women with higher disposable incomes residing in metropolitan cities.&lt;/p&gt;
&lt;p&gt;On the flipside, these enterprises encouraged women to break stereotypical perceptions about women drivers, also giving a nod to increasing and diversifying opportunities of employment for women. However, these ideas remained attractive only in principle and fizzled out sooner or later as most of these ventures did not succeed. A severe capital crunch due to unsustainable business models, limited funding options and lack of substantial supportive ecosystems for training and upkeep are possible reasons for failure. [5] Even so, the idea of a women-centric service continues to remain valuable because of the promise of safety which is produced through considerations of class, caste, gender and religion (Phadke 2005). Any alternative to avoid interaction with men from a lower class or caste background or from another religion (especially Hindu/Muslim in Mumbai) is welcome in a society which is deeply stratified and entrenched in caste-class systems of religion and economy alike.&lt;/p&gt;
&lt;h3&gt;&lt;strong&gt;Conclusion&lt;/strong&gt;&lt;/h3&gt;
&lt;p&gt;The pervasiveness of the discourses of safety and risk in the ride hailing space became apparent to me during field research. Respondents indicated a heightened awareness of my gender, referring to me as “madam” and taking measures to ensure my safety. They advised me to use a separate phone to interact with drivers and moderated my interactions with drivers on the Telegram group (run by one of the Unions in Mumbai). Union representatives were also diligent in moderating the group to filter out abusive language as a token of respect for women. My apprehensions in interacting with drivers, most of whom were older men from a lower class/caste community, were also indicative of my social conditioning as an upper class and upper caste woman. Self-policing and boundary setting in both physical and virtual interactions, while necessary to some extent, were often rendered useless as the shifting of risks became apparent to me in my interactions with the drivers.&lt;/p&gt;
&lt;p&gt;In this piece, I have tried to show how gendered norms govern the construction of safety and risk, which in turn regulate social interactions. Limiting exposure in a personal cab as opposed to a public bus/train also heightens considerations of intimacy and proximity to a potential aggressor (often from a marginalised sociocultural background). Women-centric cab services mitigate this by promoting the image of the female driver who breaks social norms. However, these services dwindle till they completely disappear due to a capital crunch or insufficient infrastructural support. Patriarchal contexts reaffirm the woman as a risky object by highlighting narratives of vulnerabilities and insecurities in the ridehailing space. Besides the woman, cab drivers are held accountable for bearing this risk and ensuring her sexual and physical safety. These patriarchal hierarchies of protectionism are sustained by platform workers’ affective labour, which lubricates the wheels of the platform economy.&lt;/p&gt;
&lt;h3&gt;&lt;strong&gt;Endnotes&lt;/strong&gt;&lt;/h3&gt;
&lt;p&gt;[1] &lt;a href="https://www.thehindu.com/news/cities/Delhi/free-rides-for-women-only-the-starting-point-say-activists/article28111938.ece" target="_blank"&gt;https://www.thehindu.com/news/cities/Delhi/free-rides-for-women-only-the-starting-point-say-activists/article28111938.ece&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;[2] &lt;a href="https://www.olacabs.com/media/in/press/ola-foundation-launches-drive-to-enable-sustainable-livelihoods-for-500000-women-by-2025" target="_blank"&gt;https://www.olacabs.com/media/in/press/ola-foundation-launches-drive-to-enable-sustainable-livelihoods-for-500000-women-by-2025&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;[3] &lt;a href="https://www.buzzfeed.com/soniathomas/girl-power" target="_blank"&gt;https://www.buzzfeed.com/soniathomas/girl-power&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;[4] &lt;a href="https://yourstory.com/2018/11/uber-gender-awareness-sensitisation-driver" target="_blank"&gt;https://yourstory.com/2018/11/uber-gender-awareness-sensitisation-driver&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;[5] &lt;a href="https://www.livemint.com/Companies/bo4534H8mOWo0oG6VQ0xbM/As-demand-for-womenonly-cab-services-grow-challenges-loom.html" target="_blank"&gt;https://www.livemint.com/Companies/bo4534H8mOWo0oG6VQ0xbM/As-demand-for-womenonly-cab-services-grow-challenges-loom.html&lt;/a&gt;&lt;/p&gt;
&lt;h3&gt;&lt;strong&gt;References&lt;/strong&gt;&lt;/h3&gt;
&lt;p&gt;Berrington, E. and Jones, H., 2002. Reality vs. myth: Constructions of women’s insecurity. Feminist Media Studies, 2(3), pp.307-323.&lt;/p&gt;
&lt;p&gt;Campbell, A., 2005. Keeping the ‘lady’ safe: The regulation of femininity through crime prevention literature. Critical Criminology, 13(2), pp.119-140.&lt;/p&gt;
&lt;p&gt;Chopra, R., 2006. Invisible men: Masculinity, sexuality, and male domestic Labor. Men and Masculinities, 9(2), pp.152-167.&lt;/p&gt;
&lt;p&gt;Green, E. and Singleton, C., 2006. Risky bodies at leisure: Young women negotiating space and place. Sociology, 40(5), pp.853-871.&lt;/p&gt;
&lt;p&gt;Nixon, D., 2009. ‘I Can’t Put a Smiley Face On’: Working-Class Masculinity, Emotional Labour and Service Work in the ‘New Economy’. Gender, Work &amp;amp; Organization, 16(3), pp.300-322.&lt;/p&gt;
&lt;p&gt;Park, J., Kim, J., Pang, M.S. and Lee, B., 2017. Offender or guardian? An empirical analysis of ride-sharing and sexual assault (April 10, 2017). KAIST College of Business Working Paper Series, (2017-006).&lt;/p&gt;
&lt;p&gt;Phadke, S., 2005. ‘You Can Be Lonely in a Crowd’ The Production of Safety in Mumbai. Indian Journal of Gender Studies, 12(1), pp.41-62.&lt;/p&gt;
&lt;p&gt;Phadke, S., 2007. Dangerous liaisons: Women and men: Risk and reputation in Mumbai. Economic and Political Weekly, pp.1510-1518.&lt;/p&gt;
&lt;p&gt;Phadke, S., 2013. Unfriendly bodies, hostile cities: Reflections on loitering and gendered public space. Economic and Political Weekly, pp.50-59.&lt;/p&gt;
&lt;p&gt;Ranade, S., 2007. The way she moves: Mapping the everyday production of gender-space. Economic and Political Weekly, pp.1519-1526.&lt;/p&gt;
&lt;p&gt;Raval, N. and Dourish, P., 2016, February. Standing out from the crowd: Emotional labor, body labor, and temporal labor in ridesharing. In Proceedings of the 19th ACM Conference on Computer-Supported Cooperative Work &amp;amp; Social Computing (pp. 97-107). ACM.&lt;/p&gt;
&lt;p&gt;Sarriera, J.M., Álvarez, G.E., Blynn, K., Alesbury, A., Scully, T. and Zhao, J., 2017. To share or not to share: Investigating the social aspects of dynamic ridesharing. Transportation Research Record, 2605(1), pp.109-117.&lt;/p&gt;
&lt;p&gt;Tuan, Y.F., 2013. Landscapes of fear. U of Minnesota Press.&lt;/p&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/raw/anushree-gupta-ladies-log-women-safety-risk-transfer-ridehailing'&gt;https://cis-india.org/raw/anushree-gupta-ladies-log-women-safety-risk-transfer-ridehailing&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Anushree Gupta</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Digital Labour</dc:subject>
    
    
        <dc:subject>Research</dc:subject>
    
    
        <dc:subject>Platform-Work</dc:subject>
    
    
        <dc:subject>Network Economies</dc:subject>
    
    
        <dc:subject>Publications</dc:subject>
    
    
        <dc:subject>Researchers at Work</dc:subject>
    
    
        <dc:subject>Mapping Digital Labour in India</dc:subject>
    

   <dc:date>2020-05-19T06:29:12Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/about/newsletters/november-2019-newsletter">
    <title>November 2019 Newsletter</title>
    <link>https://cis-india.org/about/newsletters/november-2019-newsletter</link>
    <description>
        &lt;b&gt;CIS newsletter for November 2019&lt;/b&gt;
        &lt;table class="grid listing"&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;th&gt;Highlights for November 2019&lt;/th&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;
&lt;ul&gt;
&lt;li style="text-align: justify; "&gt;If you think that Indian languages are as important as international languages, like English, then, you are on the same page with this article.  Suswetha Kolluru and Nitesh Gill explains this in &lt;a class="external-link" href="https://cis-india.org/a2k/blogs/suswetha-kolluru-and-nitesh-gill-november-22-2019-project-tiger-2"&gt;their blog posts&lt;/a&gt; published in multiple languages: English, Punjabi, Hindi and Telugu.&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;Gurshabad Grover &lt;a class="external-link" href="https://cis-india.org/openness/news/gurshabad-grover-nominated-to-join-advisory-group-on-open-source-software-for-iso-iec-jtc-1"&gt;was nominated&lt;/a&gt; through the Bureau of Indian Standards (BIS) to be a member of the Advisory Group AG) on Open Source Software for ISO/IEC JTC 1.&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;CIS &lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/project-on-gender-health-communications-and-online-activism-with-city-university"&gt;is a partner on the project 'Gender, Health Communications and Online Activism in the Digital Age&lt;/a&gt;'. The project is lead by Dr. Carolina Matos, Senior Lecturer in Sociology and Media in the Department of Sociology at City University. Ambika Tandon, Policy Officer at CIS, conducted fieldwork for the project in May and June 2019 as a research assistant. Dr. Carolina Matos's presentation can be &lt;a class="external-link" href="https://cis-india.org/internet-governance/presentation-gender-health-communications-and-online-activism-in-the-digital-age-pdf"&gt;accessed here&lt;/a&gt;.&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;The need for intervention in the cybersecurity imagery in media publications was realised during a brainstorming workshop that was conducted by CIS with illustrators, designers, and cybersecurity researchers. Towards this CIS &lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/introducing-the-cybersecurity-visuals-media-handbook"&gt;compiled a handbook introducing Cybersecurity Visuals Media&lt;/a&gt;. CIS, along with Design Beku &lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/cybersecurity-visuals-media-handbook-launch-event"&gt;launched  the Cybersecurity Visuals Media Handbook&lt;/a&gt;&lt;span&gt;. This handbook has been conceived to be a concise guide for media publications to understand the specific concepts within cybersecurity and use it as a reference to create visuals that are more informative, relevant, and look beyond stereotypes.&lt;/span&gt;&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;Anusha Madhusudhan authored a research paper titled "&lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/blockchain-a-primer-for-india"&gt;Blockchain: A primer for India&lt;/a&gt;". The paper aims to provide a comprehensive review of existing research and major debates surrounding Blockchain technology and its developments in select jurisdictions with a specific focus on India.&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;Vipul Kharbanda authored a research paper titled &lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/draft-security-standards-for-the-financial-technology-sector-in-india"&gt;Draft Security Standards for The Financial Technology Sector in India&lt;/a&gt;. This document includes draft information security standards, which seek to ensure that not only the data of users is dealt with in a secure and safe manner but also that the smaller businesses in the fintech industry have a specific standard to look at in order to limit their liabilities for any future breaches.&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;Pukhraj Singh, a cyber threat intelligence analyst who has worked with the Indian government and security response teams of global companies authored a guest blog post titled "Before cyber norms, let’s talk about disanalogy and disintermediation". &lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/guest-post-before-cyber-norms-let2019s-talk-about-disanalogy-and-disintermediation"&gt;Pukhraj looks at the critical fissures&lt;/a&gt; – at the technical and policy levels – in global normative efforts to secure cyberspace. &lt;/li&gt;
&lt;li style="text-align: justify; "&gt;&lt;span&gt;CIS  was acknowledged in the&lt;/span&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/news/advancing-cyberstability-final-report"&gt; final report&lt;/a&gt;&lt;span&gt; of the Global Commission on Stability of Cyberspace.&lt;/span&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;
&lt;h3&gt;CIS and the News&lt;/h3&gt;
&lt;p&gt;The following articles and research papers were authored by the CIS secretariat during the month:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/lawfare-arindrajit-basu-november-7-2019-indias-role-in-global-cyber-policy-formulation"&gt;India’s Role in Global Cyber Policy Formulation&lt;/a&gt; (Arindrajit Basu; Lawfare; November 7, 2019). The article was reviewed and edited by Elonnai Hickok and Justin Sherman.&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://www.business-standard.com/article/opinion/the-telecom-crisis-is-an-npa-problem-119110700062_1.html"&gt;The Telecom Crisis is an NPA Problem&lt;/a&gt; (Shyam Ponappa; Business Standard; November 7, 2019).&lt;/li&gt;
&lt;/ul&gt;
&lt;h3&gt;CIS in the News&lt;/h3&gt;
&lt;p&gt;The CIS secretariat was consulted for the following articles published during the month in various publications:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/news/why-having-more-cctv-cameras-does-not-translate-to-crime-prevention"&gt;Why having more CCTV cameras does not translate to crime prevention&lt;/a&gt; (Manasa Rao; News Minute; September 3, 2019). &lt;i&gt;Published on the CIS website on December 5, 2019&lt;/i&gt;.&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/news/hindu-businessline-november-1-2019-kv-kurmanath-activists-demand-judicial-probe-into-whatsapp-snooping"&gt;Activists demand judicial probe into WhatsApp snooping&lt;/a&gt; (K.V. Kurmanath; Hindu Businessline; November 1, 2019).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/news/et-tech-megha-mandavia-november-4-2019-cyber-law-experts-asks-why-cert-in-removed-advisory-warning-about-whatsapp-vulnerability"&gt;Cyber law experts ask why CERT-In removed advisory warning about WhatsApp vulnerability&lt;/a&gt; (Megha Mandavia; ET Tech.com; November 4, 2019).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/news/deccan-herald-november-6-2019-theres-sudeep-whatsapp-spy-attack-and-after"&gt;WhatsApp spy attack and after&lt;/a&gt; (Theres Sudeep; Deccan Herald; November 6, 2019).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/news/hindu-pj-george-november-8-2019-should-online-political-advertising-be-regulated"&gt;Should online political advertising be regulated?&lt;/a&gt; (P.J. George; The Hindu; November 8, 2019).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/news/al-jazeera-video-november-8-2019-india-facial-recognition"&gt;India facial recognition: How effective will it be?&lt;/a&gt; (Al Jazeera; November 8, 2019).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/news/livemint-november-27-2019-saumya-tewari-and-abhijit-ahaskar-proposals-to-regulate-social-media-run-into-multiple-roadblocks"&gt;Proposals to regulate social media run into multiple roadblocks&lt;/a&gt; (Saumya Tewari and Abhijit Ahaskar; Livemint; November 27, 2019).&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;&lt;a class="external-link" href="https://cis-india.org/a2k"&gt;Access to Knowledge&lt;/a&gt;&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;Access to Knowledge is a campaign to promote the fundamental principles of justice, freedom, and economic development. It deals with issues like copyrights, patents and trademarks, which are an important part of the digital landscape.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Wikipedia&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Under a grant from the Wikimedia Foundation, we are working on a project for the growth of Indic language communities and projects, designing community collaborations and partnerships that recruit and cultivate new editors and explore innovative approaches to building projects.&lt;/p&gt;
&lt;p style="text-align: left; "&gt;&lt;span style="text-align: justify; "&gt;&lt;strong&gt;Blog Entries&lt;/strong&gt;&lt;/span&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/a2k/blogs/suswetha-kolluru-and-nitesh-gill-november-22-2019-project-tiger-2"&gt;Project Tiger 2.0&lt;/a&gt; (Suswetha Kolluru and Nitesh Gill; November 22, 2019; in English).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/a2k/blogs/project-tiger-2-punjabi"&gt;Project Tiger 2.0&lt;/a&gt; (Suswetha Kolluru and Nitesh Gill; November 22, 2019; in Punjabi).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/a2k/blogs/project-tiger-2-hindi"&gt;Project Tiger 2.0&lt;/a&gt; (Suswetha Kolluru and Nitesh Gill; November 22, 2019; in Hindi).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/a2k/blogs/project-tiger-2-telugu"&gt;Project Tiger 2.0&lt;/a&gt; (Suswetha Kolluru and Nitesh Gill; November 22, 2019; in Telugu).&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance"&gt;Internet Governance&lt;/a&gt;&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;The Tunis Agenda of the second World Summit on the Information Society defined internet governance as the development and application by governments, the private sector, and civil society, in their respective roles, of shared principles, norms, rules, decision-making procedures, and programmes that shape the evolution and use of the Internet. As part of our internet governance work, we address policy issues relating to freedom of expression, focusing primarily on the Information Technology Act and the liability of intermediaries for unlawful speech, while also ensuring that the right to privacy is safeguarded.&lt;/p&gt;
&lt;h3&gt;Freedom of Speech &amp;amp; Expression&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Under a grant from the MacArthur Foundation, CIS is researching the restrictions placed on online freedom of expression by the Indian government and contributing studies, reports, and policy briefs to the ongoing debates at the national and international levels. As part of the project we bring you the following outputs:&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Blog Entry&lt;/span&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/reliance-jio-is-using-sni-inspection-to-block-websites"&gt;Reliance Jio is using SNI inspection to block websites&lt;/a&gt; (Gurshabad Grover and Kushagra Singh, and edited by Elonnai Hickok; November 7, 2019).&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;strong&gt;Participation in Events&lt;/strong&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/news/roundtable-discussion-on-intermediary-liability"&gt;Roundtable Discussion on Intermediary Liability&lt;/a&gt; (Organized by SFLC and the Dialogue; New Delhi; October 17, 2019). Tanaya Rajwade participated in a roundtable discussion on intermediary liability.&lt;/li&gt;
&lt;/ul&gt;
&lt;h3&gt;Gender&lt;/h3&gt;
&lt;p&gt;&lt;strong&gt;Blog Entry&lt;/strong&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/project-on-gender-health-communications-and-online-activism-with-city-university"&gt;Project on Gender, Health Communications and Online Activism with City University&lt;/a&gt; (Ambika Tandon; November 28, 2019).&lt;/li&gt;
&lt;/ul&gt;
&lt;h3&gt;Cyber Security&lt;/h3&gt;
&lt;p&gt;&lt;strong&gt;Research Paper&lt;/strong&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li style="text-align: justify; "&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/introducing-the-cybersecurity-visuals-media-handbook"&gt;Introducing the Cybersecurity Visuals Media Handbook&lt;/a&gt; (Handbook concept, content and design by: Padmini Ray Murray and Paulanthony George; Blog post authored by: Saumyaa Naidu and Arindrajit Basu; With inputs from: Karan Saini; Edited by: Shweta Mohandas; November 15, 2019).&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;strong&gt;Blog Entry&lt;/strong&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/guest-post-before-cyber-norms-let2019s-talk-about-disanalogy-and-disintermediation"&gt;Before cyber norms, let’s talk about disanalogy and disintermediation&lt;/a&gt; (Pukhraj Singh; November 15, 2019).&lt;/li&gt;
&lt;/ul&gt;
&lt;h3&gt;Information Technology / Financial Technology&lt;/h3&gt;
&lt;p&gt;&lt;strong&gt;Research Papers&lt;/strong&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/draft-security-standards-for-the-financial-technology-sector-in-india"&gt;Draft Security Standards for The Financial Technology Sector in India&lt;/a&gt;&lt;span&gt; (Vipul Kharbanda with inputs from Prem Sylvester; November 15, 2019).&lt;/span&gt;&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;&lt;span&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/blockchain-a-primer-for-india"&gt;Blockchain: A primer for India&lt;/a&gt; (Anusha Madhusudhan; with inputs from Karan Saini and Mira Swaminathan; edited by Elonnai Hickok, Siddharth Sonkar and Aayush Rathi; November 18, 2019).&lt;/span&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;strong&gt;Blog Entry&lt;/strong&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li style="text-align: justify; "&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/event-report-consultation-on-draft-information-technology-fintech-security-standards-rules"&gt;Event Report: Consultation on Draft Information Technology (Fintech Security Standards) Rules &lt;/a&gt;(Anindya Kanan; reviewed and edited by Vipul Kharbanda,  Elonnai Hickok and Arindrajit Basu; November 12, 2019).&lt;/li&gt;
&lt;/ul&gt;
&lt;h3&gt;&lt;span&gt;Privacy&lt;/span&gt;&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Under a grant from Privacy International and IDRC, we are doing a project on surveillance. CIS is researching the history of privacy in India and how it shapes the contemporary debates around technology-mediated identity projects like Aadhaar. As part of our ongoing research, we bring you the following outputs:&lt;/p&gt;
&lt;p&gt;&lt;span&gt;Participation in Event&lt;/span&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/news/ietf106"&gt;IETF106&lt;/a&gt; (Organized by IETF; Singapore; November 16 - 22, 2019). Gurshabad Grover participated in the meeting.&lt;/li&gt;
&lt;/ul&gt;
&lt;h3&gt;&lt;a class="external-link" href="https://cis-india.org/raw" style="text-align: justify; "&gt;Researchers@Work&lt;/a&gt;&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The researchers@work programme at CIS produces and supports pioneering and sustained trans-disciplinary research on key thematics at the intersections of internet and society; organises and incubates networks of, and fora for, researchers and practitioners studying and making the internet in India; and contributes to the development of critical digital pedagogy, research methodology, and creative practice.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;strong&gt;Event Organized&lt;/strong&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li style="text-align: justify; "&gt;&lt;a class="external-link" href="https://cis-india.org/raw/domestic-work-in-the-gig-economy-20191116"&gt;Domestic Work in the ‘Gig Economy’&lt;/a&gt; (Organized by CIS and Domestic Workers’ Rights Union; November 16, 2019; Student Christian Movement of India, Mission Road, Bangalore).&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;&lt;a class="external-link" href="https://cis-india.org/telecom"&gt;Telecom&lt;/a&gt;&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;The growth in telecommunications in India has been impressive. While the potential for growth and returns exists, a range of issues needs to be addressed for this potential to be realized. One aspect is more extensive rural coverage; the second is countrywide access to broadband, which remains low at about eight million subscriptions. Both require effective and efficient use of networks and resources, including spectrum.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Monthly Blog&lt;/b&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/telecom/blog/shyam-ponappa-business-standard-and-organizing-india-blogspot-november-7-2019-telecom-crisis-is-an-npa-problem"&gt;The Telecom Crisis is an NPA Problem&lt;/a&gt; (Shyam Ponappa; Business Standard and Organizing India Blogspot; November 7, 2019).&lt;/li&gt;
&lt;/ul&gt;
&lt;hr /&gt;
&lt;h3&gt;&lt;a class="external-link" href="http://cis-india.org/"&gt;About CIS&lt;/a&gt;&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;CIS is a non-profit organisation that undertakes interdisciplinary research on internet and digital technologies from policy and academic perspectives. The areas of focus include digital accessibility for persons with disabilities, access to knowledge, intellectual property rights, openness (including open data, free and open source software, open standards, open access, open educational resources, and open video), internet governance, telecommunication reform, digital privacy, and cyber-security. The academic research at CIS seeks to understand the reconfigurations of social and cultural processes and structures as mediated through the internet and digital media technologies.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;strong&gt;Follow CIS on:&lt;/strong&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Twitter: &lt;a href="http://twitter.com/cis_india"&gt;http://twitter.com/cis_india&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;Twitter - Access to Knowledge: &lt;a href="https://twitter.com/CISA2K"&gt;https://twitter.com/CISA2K&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;Twitter - Information Policy: &lt;a href="https://twitter.com/CIS_InfoPolicy"&gt;https://twitter.com/CIS_InfoPolicy&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;Facebook - Access to Knowledge: &lt;a href="https://www.facebook.com/cisa2k"&gt;https://www.facebook.com/cisa2k&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;E-Mail - Access to Knowledge: a2k@cis-india.org&lt;/li&gt;
&lt;li&gt;E-Mail - Researchers at Work: raw@cis-india.org&lt;/li&gt;
&lt;li&gt;List - Researchers at Work: &lt;a href="https://lists.ghserv.net/mailman/listinfo/researchers"&gt;https://lists.ghserv.net/mailman/listinfo/researchers&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;strong&gt;Support CIS:&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;Please help us defend consumer and citizen rights on the Internet! Write a cheque in favour of 'The Centre for Internet and Society' and mail it to us at No. 194, 2nd 'C' Cross, Domlur, 2nd Stage, Bengaluru - 560071.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Collaborate with CIS:&lt;/strong&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;We invite researchers, practitioners, artists, and theoreticians, both organisationally and as individuals, to engage with us on topics related to the internet and society, and to improve our collective understanding of this field. To discuss such possibilities, please write to Sunil Abraham, Executive Director, at sunil@cis-india.org (for policy research), or Sumandro Chattapadhyay, Research Director, at sumandro@cis-india.org (for academic research), with an indication of the form and the content of the collaboration you might be interested in. To discuss collaborations on Indic language Wikipedia projects, write to Tanveer Hasan, Programme Officer, at tanveer@cis-india.org.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;em&gt;CIS is grateful to its primary donor, the Kusuma Trust, founded by Anurag Dikshit and Soma Pujari, philanthropists of Indian origin, for its core funding and support for most of its projects. CIS is also grateful to its other donors, the Wikimedia Foundation, Ford Foundation, Privacy International, UK, Hans Foundation, MacArthur Foundation, and IDRC, for funding its various projects.&lt;/em&gt;&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/about/newsletters/november-2019-newsletter'&gt;https://cis-india.org/about/newsletters/november-2019-newsletter&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Admin</dc:creator>
    <dc:rights></dc:rights>


   <dc:date>2019-12-31T14:53:11Z</dc:date>
   <dc:type>Page</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/extraterritorial-algorithmic-surveillance-and-the-incapacitation-of-international-human-rights-law">
    <title>EXTRATERRITORIAL ALGORITHMIC SURVEILLANCE AND THE INCAPACITATION OF INTERNATIONAL HUMAN RIGHTS LAW</title>
    <link>https://cis-india.org/internet-governance/extraterritorial-algorithmic-surveillance-and-the-incapacitation-of-international-human-rights-law</link>
    <description>
        
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/extraterritorial-algorithmic-surveillance-and-the-incapacitation-of-international-human-rights-law'&gt;https://cis-india.org/internet-governance/extraterritorial-algorithmic-surveillance-and-the-incapacitation-of-international-human-rights-law&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>pranav</dc:creator>
    <dc:rights></dc:rights>


   <dc:date>2019-12-31T10:55:51Z</dc:date>
   <dc:type>File</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/extra-territorial-surveillance-and-the-incapacitation-of-human-rights">
    <title>Extra-Territorial Surveillance and the Incapacitation of Human Rights</title>
    <link>https://cis-india.org/internet-governance/extra-territorial-surveillance-and-the-incapacitation-of-human-rights</link>
    <description>
        &lt;b&gt;This paper was published in Volume 12 (2) of the NUJS Law Review. &lt;/b&gt;
        
&lt;div&gt;&amp;nbsp;&lt;/div&gt;
&lt;div&gt;Our networked data trails dictate, define, and modulate societies in hitherto inconceivable ways. The ability to access and manipulate that data is a product of stark power asymmetry in geo-politics, leading to a dynamic that privileges the interests of a few over the right to privacy and dignity of the many. I argue that the persistent de facto violation of human rights norms through extraterritorial surveillance conducted by western intelligence agencies, compounded by the failure of judicial intervention in the West, has led to the incapacitation of international human rights law. Despite robust jurisprudence, including case law, comments by the United Nations, and widespread state practice on the right to privacy and the application of human rights obligations to extraterritorial stakeholders, extraterritorial surveillance continues with aplomb. Procedural safeguards and proportionality tests regularly sway towards a ‘ritual incantation’ of national security even in scenarios where a less intrusive option is available. The vulnerable citizen abroad is unable to challenge these processes and becomes an unwitting victim of nefarious surveillance practices that further widen global power asymmetry and entrench geo-political fissures.&lt;/div&gt;
&lt;div&gt;&lt;br /&gt;The full article can be found &lt;a href="https://cis-india.org/internet-governance/extraterritorial-algorithmic-surveillance-and-the-incapacitation-of-international-human-rights-law" class="internal-link" title="EXTRATERRITORIAL ALGORITHMIC SURVEILLANCE AND THE INCAPACITATION OF INTERNATIONAL HUMAN RIGHTS LAW"&gt;here&lt;/a&gt;.&lt;/div&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/extra-territorial-surveillance-and-the-incapacitation-of-human-rights'&gt;https://cis-india.org/internet-governance/extra-territorial-surveillance-and-the-incapacitation-of-human-rights&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Arindrajit Basu</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Cybersecurity</dc:subject>
    
    
        <dc:subject>Cyber Security</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    

   <dc:date>2020-01-02T11:02:26Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>




</rdf:RDF>
