<?xml version="1.0" encoding="utf-8" ?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:syn="http://purl.org/rss/1.0/modules/syndication/" xmlns="http://purl.org/rss/1.0/">
<channel rdf:about="https://cis-india.org/search_rss">
  <title>Centre for Internet and Society</title>
  <link>https://cis-india.org</link>
  
  <description>These are the search results for the query, showing results 21 to 35.</description>
  <image rdf:resource="https://cis-india.org/logo.png"/>

  <items>
    <rdf:Seq>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/medianama-february-18-2021-amber-sinha-data-protection-bill-consent-clause-state-function"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/pallavi-bedi-and-shweta-mohandas-cis-comments-on-data-protection-bill"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/facial-recognition-technology-in-india.pdf"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/hrbdt-and-cis-august-31-2021-facial-recognition-technology-in-india"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/economic-and-political-weekly-july-17-2021-amber-sinha-pallavi-bedi-aman-nair-techno-solutionist-responses-to-covid-19"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/do-we-really-need-an-app-for-that-examining-the-utility-and-privacy-implications-of-india2019s-digital-vaccine-certificates"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/state-of-consumer-digital-security-in-india"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/biometric-update-june-26-2021-chris-burt-advanced-biometric-technologies-and-new-market-entries-tackle-fraud-chase-digital-id-billions"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/pdp-bill-is-coming-whatsapp-privacy-policy-analysis"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/response-to-pegasus-questionnaire-issued-by-sc-technical-committee"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/deccan-herald-aman-nair-and-pallavi-bedi-june-13-2021-pandemic-technology-takes-its-toll-on-data-privacy"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/shweta-reddy-september-17-2021-a-guide-to-drafting-privacy-policy-under-personal-data-protection-bill"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/trishi-jindal-and-s-vivek-beyond-the-pdp-bill"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/indian-express-rajat-kathuria-isha-suri-big-tech-consumers-privacy-policy"/>
      <rdf:li rdf:resource="https://cis-india.org/raw/inputs-to-report-on-non-personal-data-governance-framework"/>
    </rdf:Seq>
  </items>

</channel>


    <item rdf:about="https://cis-india.org/internet-governance/blog/medianama-february-18-2021-amber-sinha-data-protection-bill-consent-clause-state-function">
    <title>How Function Of State May Limit Informed Consent: Examining Clause 12 Of The Data Protection Bill</title>
    <link>https://cis-india.org/internet-governance/blog/medianama-february-18-2021-amber-sinha-data-protection-bill-consent-clause-state-function</link>
    <description>
        &lt;b&gt;The collective implication of leaving out ‘proportionality’ from Clause 12 is to provide very wide discretionary powers to the state.&lt;/b&gt;
        &lt;p&gt;The blog post was &lt;a class="external-link" href="https://www.medianama.com/2022/02/223-data-protection-bill-consent-clause-state-function/"&gt;published in Medianama&lt;/a&gt; on February 18, 2022. This is the first of a two-part series by Amber Sinha.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;In 2018, hours after the Committee of Experts led by Justice Srikrishna released their report and draft bill, I wrote &lt;a href="https://www.livemint.com/Opinion/zY8NPWoWWZw8AfI5JQhjmL/Draft-privacy-bill-and-its-loopholes.html"&gt;an opinion piece&lt;/a&gt; providing my quick take on what was good and bad about the bill. A section of my analysis focused on Clause 12 (then Clause 13), which provides for non-consensual processing of personal data for state functions. I called this provision a ‘carte blanche’ that effectively allowed the state to process a citizen’s data for practically all interactions between them without having to deal with the inconvenience of seeking consent. My former colleague Pranesh Prakash &lt;a href="https://twitter.com/pranesh/status/1023116679440621568"&gt;pointed out&lt;/a&gt; that this was not a correct interpretation of the provision, as I had missed the significance of the word ‘necessary’, which was inserted to act as a check on the powers of the state. He also pointed out, correctly, that in its construction this provision is equivalent to the position in the European General Data Protection Regulation (Article 6(1)(e)), and is perhaps even more restrictive.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;While I agree with what Pranesh says above (his claims are largely factual, and there is little basis for disagreement), my view of Clause 12 has not changed. While Clause 35 has been a focus of considerable discourse and analysis, for good reason, I continue to believe that Clause 12 remains among the most dangerous provisions of this bill, and I will try to unpack why.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Data Protection Bill 2021 has a chapter on the grounds for processing personal data, and one of those grounds is consent by the individual. The rest of the grounds deal with various situations in which personal data can be processed without seeking consent from the individual. Clause 12 lays down one of the grounds. It allows the state to process data without the consent of the individual in the following cases —&lt;/p&gt;
&lt;p&gt;a) where it is necessary to respond to a medical emergency&lt;br /&gt;b) where it is necessary for the state to provide a service or benefit to the individual&lt;br /&gt;c) where it is necessary for the state to issue any certification, licence or permit&lt;br /&gt;d) where it is necessary under any central or state legislation, or to comply with a judicial order&lt;br /&gt;e) where it is necessary for any measures during an epidemic, outbreak of disease, or other threat to public health&lt;br /&gt;f) where it is necessary for safety procedures during a disaster or breakdown of public order&lt;/p&gt;
&lt;p&gt;In order to carry out (b) and (c), there is also the added requirement that the state function must be authorised by law.&lt;/p&gt;
&lt;h2&gt;Twin restrictions in Clause 12&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;The use of the words ‘necessary’ and ‘authorised by law’ is intended to pose checks on the powers of the state. The first restriction seeks to limit actions to only those cases where the processing of personal data would be necessary for the exercise of the state function. This should mean that if the state function can be exercised without non-consensual processing of personal data, then it must be exercised that way. Therefore, while acting under this provision, the state should only process my data if it needs to do so to provide me with the service or benefit. The second restriction means that this would apply only to those state functions which are authorised by law, meaning only those functions which are supported by validly enacted legislation.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;What we need to keep in mind regarding Clause 12 is that the requirement of ‘authorised by law’ does not mean that legislation must provide for that specific kind of data processing. It simply means that the larger state function must have legal backing. The danger is how these provisions may be used with broad mandates. If the activity in question is non-consensual collection and processing of, say, demographic data of citizens to create state resident hubs which will assist in the provision of services such as healthcare, housing, and other welfare functions; all that may be required is that the welfare functions are authorised by law.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Scope of privacy under Puttaswamy&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;It would be worthwhile, at this point, to delve into the nature of the restrictions on privacy that the landmark Puttaswamy judgement held the state can impose. The judgement clearly identifies the principles of informed consent and purpose limitation as central to informational privacy. As discussed repeatedly during the course of the hearings and in the judgement, privacy, like any other fundamental right, is not absolute. However, restrictions on the right must be reasonable in nature. In the case of Clause 12, the restrictions on privacy in the form of denial of informed consent need to be tested against a constitutional standard. In Puttaswamy, the bench was not required to provide a legal test to determine the extent and scope of the right to privacy, but they do provide sufficient guidance for us to contemplate how the limits and scope of the constitutional right to privacy could be determined in future cases.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Puttaswamy judgement clearly states that “the right to privacy is protected as an intrinsic part of the right to life and personal liberty under Article 21 and as a part of the freedoms guaranteed by Part III of the Constitution.” By locating the right not just in Article 21 but also in the entirety of Part III, the bench clearly requires that “the drill of various Articles to which the right relates must be scrupulously followed.” This means that where transgressions on privacy relate to different provisions in Part III, the different tests under those provisions will apply along with those in Article 21. For instance, where the restrictions relate to personal freedoms, the tests under both Article 19 (right to freedoms) and Article 21 (right to life and liberty) will apply.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In the case of Clause 12, the three tests laid down by Justice Chandrachud are most operative —&lt;br /&gt;a) the existence of a “law”&lt;br /&gt;b) a “legitimate State interest”&lt;br /&gt;c) the requirement of “proportionality”.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The first test is already reflected in the use of the phrase ‘authorised by law’ in Clause 12. The test under Article 21 would imply that the function of the state should not merely be authorised by law, but that the law, in both its substance and procedure, must be ‘fair, just and reasonable.’ The next test is that of ‘legitimate state interest’. In its report, the Joint Parliamentary Committee places emphasis on Justice Chandrachud’s use of “allocation of resources for human development” in an illustrative list of legitimate state interests. The report claims that the ground of state functions thus satisfies the test of legitimate state interest. I do not dispute this claim.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Proportionality and Clause 12&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;It is the final test of ‘proportionality’ articulated by the Puttaswamy judgement that is most operative in this context. Unlike Clauses 42 and 43, which include the twin tests of necessity and proportionality, the committee has chosen to employ only one of them in Clause 12. Proportionality is a commonly employed ground in European jurisprudence and in common law countries such as Canada and South Africa, and it is also an integral part of Indian jurisprudence. As commonly understood, the proportionality test consists of three parts —&lt;/p&gt;
&lt;p&gt;a)  the limiting measures must be carefully designed, or rationally connected, to the objective&lt;br /&gt;b)  they must impair the right as little as possible&lt;br /&gt;c)  the effects of the limiting measures must not be so severe on individual or group rights that the legitimate state interest, albeit important, is outweighed by the abridgement of rights.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The first test is similar to the test of proximity under Article 19. The test of ‘necessity’ in Clause 12 must be viewed in this context. It must be remembered that the test of necessity is not limited only to situations where it may not be possible to obtain consent while providing benefits. My reservations about the sufficiency of this standard stem from observations made in the report, as well as from the relatively small amount of jurisprudence on this term in Indian law.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Srikrishna Report interestingly mentions three kinds of scenarios where consent should not be required — where it is not appropriate, necessary, or relevant for processing. The report goes on to give an example of inappropriateness: in cases where data is being gathered to provide welfare services, there is an imbalance in power between the citizen and the state. Having made that observation, the committee inexplicably arrives at the conclusion that the response to this problem is to further erode the power available to citizens by removing the need for consent altogether under Clause 12. There is limited jurisprudence on the standard of ‘necessity’ under Indian law. The Supreme Court has articulated this test as ‘having reasonable relation to the object the legislation has in view.’ If we look elsewhere for guidance on how to read ‘necessity’, the European Court of Human Rights in Handyside v United Kingdom held it to be neither “synonymous with indispensable” nor to have the “flexibility of such expressions as admissible, ordinary, useful, reasonable or desirable.” In short, there must be a pressing social need to satisfy this ground.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;However, the other two tests of proportionality do not find a mention in Clause 12 at all. There is no requirement of ‘narrow tailoring’, that is, that the scope of non-consensual processing must impair the right as little as possible. It is doubly unfortunate that this test does not find a place since, unlike necessity, ‘narrow tailoring’ is a test well understood in Indian law. This means that while there is a requirement to show that processing personal data was necessary to provide a service or benefit, there is no requirement to process data in a way that minimises non-consensual processing. The fear is that as long as there is a reasonable relation between processing data and the object of the state function, state authorities, and other bodies authorised by them, do not need to bother with obtaining consent.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Similarly, the third test of proportionality is also not represented in this provision. It requires a balancing between the abridgement of individual rights and the legitimate state interest in question, such that the first does not outweigh the second. The absence of this test leaves Clause 12 devoid of any such consideration. Therefore, as long as the test of necessity is met under this law, the state need not evaluate the denial of consent against the service or benefit that is being provided.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The collective implication of leaving out ‘proportionality’ from Clause 12 is to provide very wide discretionary powers to the state, by setting the threshold to circumvent informed consent extremely low. In the next post, I will demonstrate the ease with which Clause 12 can allow indiscriminate data sharing by focusing on the Indian government’s digital healthcare schemes.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/medianama-february-18-2021-amber-sinha-data-protection-bill-consent-clause-state-function'&gt;https://cis-india.org/internet-governance/blog/medianama-february-18-2021-amber-sinha-data-protection-bill-consent-clause-state-function&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Data Governance</dc:subject>
        <dc:subject>Internet Governance</dc:subject>
        <dc:subject>Data Protection</dc:subject>
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2022-03-01T14:56:49Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/pallavi-bedi-and-shweta-mohandas-cis-comments-on-data-protection-bill">
    <title>CIS Comments and Recommendations on the Data Protection Bill, 2021</title>
    <link>https://cis-india.org/internet-governance/blog/pallavi-bedi-and-shweta-mohandas-cis-comments-on-data-protection-bill</link>
    <description>
        &lt;b&gt;This document is a revised version of the comments we provided on the 2019 Bill on 20 February 2020, with updates based on the amendments in the 2021 Bill.&lt;/b&gt;
&lt;p style="text-align: justify; "&gt;After nearly two years of deliberations and a few changes in its composition, the Joint Parliamentary Committee (JPC), on 17 December 2021, submitted its report on the Personal Data Protection Bill, 2019 (2019 Bill). The report also contains a new version of the law, titled the Data Protection Bill, 2021 (2021 Bill). Apart from bringing all data under the ambit of the bill, there were no major revisions from the previous version, though some provisions were amended.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This document is a revised version of the &lt;a href="https://cis-india.org/accessibility/blog/cis-comments-pdp-bill-2019"&gt;comments&lt;/a&gt; we provided on the 2019 Bill on 20 February 2020, with updates based on the amendments in the 2021 Bill. Through this document we aim to shed light on the issues we highlighted in our previous comments that have not yet been addressed, along with additional comments on sections that have become more relevant since the pandemic began. Where our previous comments have not been addressed, or have been only partially addressed, we reiterate them.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;These general comments should be read in conjunction with our previous recommendations for the reader to get a comprehensive overview of what has changed from the previous version and what has remained the same. This document can also be read while referencing the new Data Protection Bill 2021 and the JPC’s report to understand some of the significant provisions of the bill.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;&lt;strong&gt;&lt;a href="https://cis-india.org/internet-governance/general-comments-data-protection-bill.pdf" class="internal-link"&gt;Read on to access the comments&lt;/a&gt; | &lt;/strong&gt;&lt;span&gt;Review and editing by Arindrajit Basu. Copy editing: The Clean Copy; Shared under Creative Commons Attribution 4.0 International license&lt;/span&gt;&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/pallavi-bedi-and-shweta-mohandas-cis-comments-on-data-protection-bill'&gt;https://cis-india.org/internet-governance/blog/pallavi-bedi-and-shweta-mohandas-cis-comments-on-data-protection-bill&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Pallavi Bedi and Shweta Mohandas</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
        <dc:subject>Data Protection</dc:subject>
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2022-02-14T16:07:44Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/facial-recognition-technology-in-india.pdf">
    <title>Facial Recognition Technology in India</title>
    <link>https://cis-india.org/internet-governance/facial-recognition-technology-in-india.pdf</link>
    <description>
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/facial-recognition-technology-in-india.pdf'&gt;https://cis-india.org/internet-governance/facial-recognition-technology-in-india.pdf&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Elonnai Hickok, Pallavi Bedi, Aman Nair and Amber Sinha</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Privacy</dc:subject>
        <dc:subject>Internet Governance</dc:subject>
        <dc:subject>Facial Recognition</dc:subject>
    

   <dc:date>2021-09-02T16:17:44Z</dc:date>
   <dc:type>File</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/hrbdt-and-cis-august-31-2021-facial-recognition-technology-in-india">
    <title>Facial Recognition Technology in India</title>
    <link>https://cis-india.org/internet-governance/blog/hrbdt-and-cis-august-31-2021-facial-recognition-technology-in-india</link>
    <description>
        &lt;b&gt;The Human Rights, Big Data and Technology Project, University of Essex, UK and the Centre for Internet &amp; Society (CIS) have jointly published a research paper on facial recognition technology. The authors, Elonnai Hickok, Pallavi Bedi, Aman Nair, and Amber Sinha, examine technological tools such as CCTV and FRT, which are increasingly being deployed by the government.&lt;/b&gt;
        &lt;h3&gt;Executive Summary&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Over the past two decades there has been a sustained effort at digitising India’s governance structure in order to foster development and innovation. The field of law enforcement and safety has seen significant change in that direction, with technological tools such as Closed Circuit Television (CCTV) and Facial Recognition Technology (FRT) increasingly being deployed by the government.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Yet for all its increased use, there is still no coherent legal and regulatory framework governing FRT in India. Towards informing such a framework, this paper seeks to document present uses of FRT in India, specifically by law enforcement agencies and central and state governments, understand the applicability of existing legal frameworks to the use of FRT, and define key areas that need to be addressed when using the technology in India. We also briefly look at how the use of FRT has expanded beyond law enforcement: it now extends to educational institutions and employment, and it is being used in the delivery of Covid-19 vaccines.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;We begin by examining use cases of FRT systems by various divisions of central and state governments. In doing so, it becomes apparent that there is a lack of uniform standards or guidelines at either the state or central level, leading to different FRT systems having differing standards of applicability and scope of use. And while the use of such systems seems to be growing at a rapid rate, questions around their legality persist.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;It is unclear whether the use of FRT is compliant with the fundamental right to privacy as affirmed by the Supreme Court in 2017 in &lt;i&gt;Puttaswamy&lt;/i&gt;. While the right to privacy is not an absolute right, for the state to curtail this right, the restrictions will have to comply with a three-fold requirement, the first being the need for an explicit legislative mandate where the government seeks to curtail the right. However, the FRT systems we have analysed do not have such a mandate and are often the result of administrative or executive decisions with no legislative blessing or judicial oversight.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;We further locate the use of FRT within the country’s wider legislative, judicial and constitutional frameworks governing surveillance. We also briefly articulate comparative perspectives on the use of FRT in other jurisdictions. We then analyse the impact of the proposed Personal Data Protection Bill on the deployment of FRT. Finally, we propose a set of recommendations to develop a path forward for the technology’s use, which include the need for a comprehensive legal and regulatory framework that governs the use of FRT. Such a framework must take into consideration the necessity of use, proportionality, consent, security, retention, redressal mechanisms, purpose limitation, and other such principles. Since the use of FRT in India is still at a nascent stage, it is imperative that there is greater public research and dialogue on its development and use, to ensure that any harms that may arise are mitigated.&lt;/p&gt;
&lt;hr /&gt;
&lt;p&gt;Click to download the entire &lt;a href="https://cis-india.org/internet-governance/facial-recognition-technology-in-india.pdf" class="external-link"&gt;research paper here&lt;/a&gt;&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/hrbdt-and-cis-august-31-2021-facial-recognition-technology-in-india'&gt;https://cis-india.org/internet-governance/blog/hrbdt-and-cis-august-31-2021-facial-recognition-technology-in-india&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Elonnai Hickok, Pallavi Bedi, Aman Nair and Amber Sinha</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Privacy</dc:subject>
        <dc:subject>Internet Governance</dc:subject>
        <dc:subject>Facial Recognition</dc:subject>
    

   <dc:date>2021-09-02T16:21:24Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/economic-and-political-weekly-july-17-2021-amber-sinha-pallavi-bedi-aman-nair-techno-solutionist-responses-to-covid-19">
    <title>Techno-solutionist Responses to COVID-19</title>
    <link>https://cis-india.org/internet-governance/blog/economic-and-political-weekly-july-17-2021-amber-sinha-pallavi-bedi-aman-nair-techno-solutionist-responses-to-covid-19</link>
    <description>
        &lt;b&gt;The Indian state has increasingly adopted a digital approach to service delivery over the past decade, with vaccination being the latest area to be subsumed by this strategy. In the context of the need for universal vaccination, the limitations of the government’s vaccination platform Co-WIN need to be analysed.&lt;/b&gt;
        &lt;p&gt;&lt;span style="text-align: justify; "&gt;The article by Amber Sinha, Pallavi Bedi, and Aman Nair was published in the &lt;/span&gt;&lt;a class="external-link" href="https://www.epw.in/journal/2021/29/commentary/techno-solutionist-responses-covid-19.html" style="text-align: justify; "&gt;Economic &amp;amp; Political Weekly&lt;/a&gt;&lt;span style="text-align: justify; "&gt;, Vol. 56, Issue No. 29, 17 Jul, 2021.&lt;/span&gt;&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;Over the last two decades, slowly but steadily, the governance agenda of the Indian state has moved to the digital realm. In 2006, the National e-Governance Plan (NeGP) was approved, under which a massive infrastructure was developed to reach the remotest corners of the country and provide efficient, affordable access to government services. The first set of NeGP projects focused on digitalising governance schemes that dealt with taxation, regulation of corporate entities, issuance of passports, and pensions. Over time, they have come to include most interactions between the state and citizens, from healthcare to education, transportation to employment, and policing to housing. Upon the launch of the Digital India Mission by the union government, the NeGP was subsumed under the e-Gov and e-Kranti components of the project. The original press release by the central government reporting the cabinet’s approval of the Digital India programme speaks of “cradle to grave” digital identity as one of its vision areas. This identity was always intended to be “unique, lifelong, online and authenticable.”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Since the inception of the Digital India campaign by the current government, various concerns have been raised about the privacy issues posed by this project. The initiative includes over 50 “mission mode projects” in various stages of implementation. All of these projects entail the collection of vast quantities of personally identifiable information of citizens. However, most of these initiatives do not have clearly laid down privacy policies. There is also a lack of properly articulated access control mechanisms, and doubts exist over important issues such as data ownership, since most projects involve public–private partnerships in which a private organisation collects, processes, and retains large amounts of data. Most importantly, they have continued to exist and prosper in a regulatory vacuum, with no data protection legislation to govern them. Further, the state of the digital divide and of digital literacy in India should automatically underscore the need not to rely solely on digital solutions.&lt;/p&gt;
&lt;hr /&gt;
&lt;p&gt;&lt;span&gt;Click to &lt;/span&gt;&lt;a class="external-link" href="https://www.epw.in/journal/2021/29/commentary/techno-solutionist-responses-covid-19.html"&gt;read the full article here&lt;/a&gt;&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/economic-and-political-weekly-july-17-2021-amber-sinha-pallavi-bedi-aman-nair-techno-solutionist-responses-to-covid-19'&gt;https://cis-india.org/internet-governance/blog/economic-and-political-weekly-july-17-2021-amber-sinha-pallavi-bedi-aman-nair-techno-solutionist-responses-to-covid-19&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Amber Sinha, Pallavi Bedi and Aman Nair</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Digital Governance</dc:subject>
        <dc:subject>Privacy</dc:subject>
        <dc:subject>Digitalisation</dc:subject>
        <dc:subject>Co-WIN</dc:subject>
        <dc:subject>Covid19</dc:subject>
        <dc:subject>Digital Technologies</dc:subject>
        <dc:subject>Internet Governance</dc:subject>
        <dc:subject>Technology</dc:subject>
        <dc:subject>E-Governance</dc:subject>
    

   <dc:date>2021-08-10T15:34:06Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/do-we-really-need-an-app-for-that-examining-the-utility-and-privacy-implications-of-india2019s-digital-vaccine-certificates">
    <title>Do We Really Need an App for That? Examining the Utility and Privacy Implications of India’s Digital Vaccine Certificates</title>
    <link>https://cis-india.org/internet-governance/blog/do-we-really-need-an-app-for-that-examining-the-utility-and-privacy-implications-of-india2019s-digital-vaccine-certificates</link>
    <description>
        &lt;b&gt;We examine the purported benefits of digital vaccine certificates over regular paper-based ones and analyse the privacy implications of their use.&lt;/b&gt;
        
&lt;p&gt;&lt;em&gt;This blogpost was edited by Gurshabad Grover, Yesha Tshering Paul, and Amber Sinha.&lt;br /&gt;It was originally published on &lt;a href="https://digitalid.design/vaccine-certificates.html"&gt;Digital Identities: Design and Uses&lt;/a&gt; and is cross-posted here.&lt;br /&gt;&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;In an experiment to streamline its COVID-19 immunisation drive, India has adopted a centralised vaccine administration system called CoWIN (or COVID Vaccine Intelligence Network). In addition to facilitating registration for both online and walk-in vaccine appointments, the system also allows for the &lt;a href="https://verify.cowin.gov.in/" target="_blank"&gt;digital verification&lt;/a&gt; of vaccine certificates, which it issues to people who have received a dose. This development aligns with a global trend, as many countries have adopted or are in the process of adopting “vaccine passports” to facilitate safe movement of people while resuming commercial activity.
    &lt;br /&gt;&lt;br /&gt;Some places, such as the &lt;a href="https://www.schengenvisainfo.com/news/all-your-questions-on-eus-covid-19-vaccine-certificate-answered/" target="_blank"&gt;EU&lt;/a&gt;, have constrained the scope of use of their vaccine certificates to international travel. The Indian government, however, has so far &lt;a href="https://www.livemint.com/opinion/columns/vaccination-certificates-need-a-framework-to-govern-their-use-11618160385602.html" target="_blank"&gt;skirted&lt;/a&gt; important questions around where and when this technology should be used. By allowing &lt;a href="https://verify.cowin.gov.in/" target="_blank"&gt;anyone&lt;/a&gt; to use the online CoWIN portal to scan and verify certificates, and even providing a way for the private sector to incorporate this functionality into their applications, the government has opened up the possibility of these digital certificates being used, and even mandated, for everyday domestic purposes such as going to a grocery shop, a crowded venue, or a workplace.
    &lt;br /&gt;&lt;br /&gt;In this blog post, we examine the purported benefits of digital vaccine certificates over regular paper-based ones, analyse the privacy implications of their use, and present recommendations to make them more privacy respecting. We hope that such an analysis can help inform policy on appropriate use of this technology and improve its privacy properties in cases where its use is warranted.
    &lt;br /&gt;&lt;br /&gt;We also note that while this post only examines the merits of a technological solution put out by the government, it is more important to &lt;a href="https://www.accessnow.org/cms/assets/uploads/2021/04/Covid-Vaccine-Passports-Threaten-Human-Rights.pdf" target="_blank"&gt;consider&lt;/a&gt; the effects that placing restrictions on the movement of unvaccinated people has on their civil liberties in the face of a vaccine rollout that is inequitable along many lines, including &lt;a href="https://thewire.in/gender/women-falling-behind-in-indias-covid-19-vaccination-drive" target="_blank"&gt;gender&lt;/a&gt;, &lt;a href="https://www.thehindu.com/sci-tech/science/will-25-covid-19-vaccines-for-private-hospitals-aggravate-inequity/article34799098.ece" target="_blank"&gt;caste-class&lt;/a&gt;, and &lt;a href="https://scroll.in/article/994871/tech-savvy-indians-drive-to-villages-for-covid-19-vaccinations-those-without-smartphones-lose-out" target="_blank"&gt;access to technology&lt;/a&gt;.&lt;/p&gt;
&lt;h4&gt;How do digital vaccine certificates work?&lt;/h4&gt;
&lt;p&gt;Every vaccine recipient in the country is required to be registered on the CoWIN platform using one of &lt;a href="https://www.cowin.gov.in/faq" target="_blank"&gt;seven&lt;/a&gt; existing identity documents. [1] &lt;a name="ref1"&gt;&lt;/a&gt; Once a vaccine is administered, CoWIN generates a vaccine certificate which the recipient can access on the CoWIN website. The certificate is a single page document that contains the recipient’s personal information — their name, age, gender, identity document details, unique health ID, a reference ID — and some details about the vaccine given.&lt;a name="ref2"&gt;&lt;/a&gt; [2] It also includes a “secure QR code” and a link to CoWIN’s verification &lt;a href="https://verify.cowin.gov.in/" target="_blank"&gt;portal&lt;/a&gt;.
  &lt;br /&gt;&lt;br /&gt;The verification portal allows for the verification of a certificate by scanning the attached QR code. Upon completion, the portal displays a success message along with some of the information printed on the certificate.
  &lt;br /&gt;&lt;br /&gt;Verification is done using a cryptographic mechanism known as &lt;a href="https://en.wikipedia.org/wiki/Digital_signature" target="_blank"&gt;digital signatures&lt;/a&gt;, which are encoded into the QR code attached to a vaccine certificate. This mechanism allows “offline verification”, which means that the CoWIN verification portal or any private sector app attempting to verify a certificate does not need to contact the CoWIN servers to establish its authenticity. It instead uses a “public key” issued by CoWIN beforehand to verify the digital signature attached to the certificate.
  &lt;br /&gt;&lt;br /&gt;The benefit of this convoluted design is that it protects user privacy. Performing verification offline, without contacting the CoWIN servers, precludes CoWIN from gleaning sensitive metadata about usage of the vaccine certificate. This means that CoWIN does not learn where and when an individual uses their vaccine certificate, or who is verifying it. This closes off a potential avenue for mass surveillance. [3] However, given how certificate revocation checks are being implemented (detailed in the privacy implications section below), CoWIN ends up learning this information anyway.&lt;/p&gt;
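The public-key mechanism described above can be illustrated with a toy sketch. This is a simplification under stated assumptions: real deployments use 2048-bit RSA or elliptic-curve keys via a proper cryptography library, and the textbook-sized key, payload format and function names below are purely illustrative. The issuer signs a digest of the certificate payload with its private key; a verifier holding only the public key can then check the signature entirely offline:

```python
import hashlib

# Toy RSA keypair (classic textbook example: p=61, q=53, n=3233).
# Illustrative only; never use key sizes like this in practice.
N, E = 3233, 17        # public key, distributed to verifiers beforehand
D = 2753               # private key, held only by the issuer

def digest(payload: str) -> int:
    # Toy step: reduce a SHA-256 digest into the tiny modulus.
    return int(hashlib.sha256(payload.encode()).hexdigest(), 16) % N

def sign(payload: str) -> int:
    # Issuer side: done once, when the certificate is generated.
    return pow(digest(payload), D, N)

def verify(payload: str, signature: int) -> bool:
    # Verifier side: needs only the public key (E, N); no server contact.
    return pow(signature, E, N) == digest(payload)

cert = "name=A. Person;dose=2;vaccine=Covishield"   # hypothetical payload
sig = sign(cert)
assert verify(cert, sig)                 # authentic certificate passes
assert not verify(cert, (sig + 1) % N)   # a forged signature fails
```

Because `verify` depends only on the pre-distributed public key, the issuer never observes individual verification events, which is the privacy property the offline design aims for.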
&lt;h4&gt;Where is digital verification useful?&lt;/h4&gt;
&lt;p&gt;The primary argument for the adoption of digital verification of vaccine certificates over visual examination of regular paper-based ones is security. In the face of vaccine hesitancy, there are concerns that people may forge vaccine certificates to get around any restrictions that may be put in place on the movement of unvaccinated people. The use of digital signatures serves to allay these fears.
&lt;br /&gt;&lt;br /&gt;In its current form, however, digital verification of vaccine certificates is no more secure than visually inspecting paper-based ones. While the “secure QR code” attached to digital certificates can be used to verify the authenticity of the certificate itself, the CoWIN verification portal does not provide any mechanism nor does it instruct verifiers to authenticate the identity of the person presenting the certificate. This means that unless an accompanying identity document is also checked, an individual can simply present someone else’s certificate.
&lt;br /&gt;&lt;br /&gt;There are no simple solutions to this limitation; adding a requirement to inspect identity documents in addition to digital verification of the vaccine certificate would not be a strong enough security measure to prevent the use of duplicate vaccine certificates. People who are motivated enough to forge a vaccine certificate can also duplicate one of the seven ID documents which can be used to register on CoWIN, some of which are simple paper-based documents. [4] Requiring even stronger identity checks, such as the use of Aadhaar-based biometrics, would make digital verification of vaccine certificates more secure. However, this would be a wildly disproportionate incursion on user privacy — allowing for the mass collection of metadata like when and where a certificate is used — something that digital vaccine certificates were explicitly designed to prevent. Additionally, in Russia, people were &lt;a href="https://www.washingtonpost.com/world/europe/moscow-fake-vaccine-coronavirus/2021/06/26/0881e1e4-cf98-11eb-a224-bd59bd22197c_story.html" target="_blank"&gt;found&lt;/a&gt; issuing fake certificates by discarding real vaccine doses instead of administering them. No technological solution can prevent such fraud.
&lt;br /&gt;&lt;br /&gt;As such, the utility of digital certificates is limited to uses such as international travel, where border control agencies already have strong identity checks in place for travellers. Any everyday usage of the digital verification functionality on vaccine certificates would not present any benefit over visually examining a piece of paper or a screen.&lt;/p&gt;
&lt;h4&gt;Privacy implications of digital certificates&lt;/h4&gt;
&lt;p&gt;In addition to providing little security utility over manual inspection of certificates, digital certificates also present privacy issues. These are listed below, along with recommendations to mitigate them:
&lt;br /&gt;&lt;br /&gt;&lt;em&gt;(i) The verification portal leaks sensitive metadata to CoWIN’s servers:&lt;/em&gt; An analysis of network requests made by the CoWIN verification portal reveals that it conducts a ‘revocation check’ each time a certificate is verified. This check was also found in the source &lt;a href="https://github.com/egovernments/DIVOC/blob/e667697b47a50a552b8d0a8c89a950180217b945/interfaces/vaccination-api.yaml#L385" target="_blank"&gt;code&lt;/a&gt;, which is made openly available.&lt;a name="ref5"&gt;&lt;/a&gt; [5]&lt;/p&gt;
&lt;p&gt;Revocation checks are an important security consideration while using digital signatures. They allow the issuing authority (CoWIN, in this case) to revoke a certificate in case the account associated with it is lost or stolen, or if a certificate requires correction. However, the way they have been implemented here presents a significant privacy issue. Sending certificate details to the server on every verification attempt allows it to learn about where and when an individual is using their vaccine certificate.
&lt;br /&gt;&lt;br /&gt;We note that the revocation check performed by the CoWIN portal does not necessarily mean that it is storing this information. Nevertheless, sending certificate information to the server directly contradicts claims of an “offline verification” process, which is the basis of the design of these digital certificates.
&lt;br /&gt;&lt;br /&gt;&lt;strong&gt;Recommendations:&lt;/strong&gt; Implementing privacy-respecting revocation checks such as Certificate Revocation Lists, [6] or Range Queries [7] would mitigate this issue. However, these solutions are either complex or present bandwidth and storage tradeoffs for the verifier.
&lt;br /&gt;&lt;br /&gt;&lt;em&gt;(ii) Oversharing of personally identifiable information:&lt;/em&gt; CoWIN’s vaccine certificates include more personally identifiable information (name, age, gender, identity document details and unique health ID) than is required for the purpose of verifying the certificate. An examination of the vaccine certificates available to us revealed that while the Aadhaar number is appropriately masked, other personal identifiers such as passport number and unique health ID were not. Additionally, the inclusion of demographic details such as age and gender provides only a marginal security benefit (by limiting the pool of duplicate certificates that could be used) and is not required in light of the security analysis above.
&lt;br /&gt;&lt;br /&gt;&lt;strong&gt;Recommendation:&lt;/strong&gt; Personal identifiers (such as passport number and unique health ID) should be appropriately masked and demographic details (age, gender) can be removed.
&lt;br /&gt;&lt;br /&gt;The minimal set of data required for identity-linked digital verification, as described above, is a full name and masked ID document details. All other personally identifying information can be removed. In the case of paper-based certificates, which are suggested for domestic usage, details about vaccine validity alone would suffice; no personal information is required.
&lt;br /&gt;&lt;br /&gt;&lt;em&gt;(iii) Making information available digitally increases the likelihood of collection:&lt;/em&gt; All of the personal information printed on the certificate is also encoded into the QR code. This is &lt;a href="https://www.bbc.com/news/uk-scotland-57208607" target="_blank"&gt;necessary&lt;/a&gt; because the digital signature verification process also verifies the integrity of this information (i.e., that it was not modified). A side effect is that the personal information becomes readily available in digital form to verifiers when it is scanned, making it easy for them to store. This is especially likely in private sector apps, whose makers may be interested in collecting demographic information and personal identifiers to track customer behaviour.
&lt;br /&gt;&lt;br /&gt;&lt;strong&gt;Recommendation:&lt;/strong&gt; Removing extraneous information from the certificate, as suggested above, mitigates this risk as well.&lt;/p&gt;
&lt;h4&gt;Conclusion&lt;/h4&gt;
&lt;p&gt;Our analysis reveals that without incorporating strong, privacy-invasive identity checks, digital verification of vaccine certificates does not provide any security benefit over manually inspecting a piece of paper. The utility of digital verification is limited to settings where strong identity checks are already in place.
&lt;br /&gt;&lt;br /&gt;In addition to their limited applicability, in their current form, these digital certificates also generate a trail of data and metadata, giving both government and industry an opportunity to infringe upon the privacy of the individuals using them.
&lt;br /&gt;&lt;br /&gt;Keeping this in mind, the adoption of this technology should be discouraged for everyday use.&lt;/p&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;h4&gt;References&lt;/h4&gt;
&lt;p&gt;[1] Exceptions &lt;a href="https://web.archive.org/web/20210511045921/https://www.mohfw.gov.in/pdf/SOPforCOVID19VaccinationofPersonswithoutPrescribedIdentityCards.pdf" target="_blank"&gt;exist&lt;/a&gt; for people without state-issued identity documents.&lt;/p&gt;
&lt;p&gt;[2] This information was gathered by inspecting three vaccine certificates linked to the author’s CoWIN account, which they were authorised to view, and may not be fully accurate.&lt;/p&gt;
&lt;p&gt;[3] This design is similar to Aadhaar’s “&lt;a href="https://resident.uidai.gov.in/offline-kyc" target="_blank"&gt;offline KYC&lt;/a&gt;” process.&lt;/p&gt;
&lt;p&gt;[4] “Aadhaar Card: UIDAI says downloaded versions on ordinary paper, mAadhaar perfectly valid”, &lt;em&gt;Zee Business&lt;/em&gt;, April 29 2019, &lt;em&gt;https://www.zeebiz.com/india/news-aadhaar-card-uidai-says-downloaded-versions-on-ordinary-paper-maadhaar-perfectly-valid-96790&lt;/em&gt;.&lt;/p&gt;
&lt;p&gt;[5] This check was also verified to be present in the reference &lt;a href="https://github.com/egovernments/DIVOC/blob/261a61093b89990fe34698f9ba17367d4cb74c34/public_app/src/components/CertificateStatus/index.js#L125" target="_blank"&gt;code&lt;/a&gt; made available for private-sector applications incorporating this functionality, suggesting that private sector apps will also be affected by this.&lt;/p&gt;
&lt;p&gt;[6] &lt;a href="https://en.wikipedia.org/wiki/Certificate_revocation_list" target="_blank"&gt;Certificate Revocation Lists&lt;/a&gt; allow the server to provide a list of revoked certificates to the verifier, instead of the verifier querying the server each time. This, however, can place heavy bandwidth and storage requirements on the verifying app as this list can potentially grow long.&lt;/p&gt;
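The CRL approach in [6] can be sketched as follows. This is a hypothetical simplification: certificate IDs here are plain strings, and a real revocation list would itself be signed and versioned by the issuer. The verifier syncs the list periodically and performs every check locally, so no per-scan query reaches the server:

```python
# Sketch of a Certificate Revocation List (CRL) check.
# The verifier's local copy of the issuer's revocation list.
revoked_crl = set()

def refresh_crl(new_list):
    # Done periodically (e.g. daily), independent of any verification
    # event, so the fetch reveals nothing about individual scans.
    revoked_crl.clear()
    revoked_crl.update(new_list)

def is_revoked(cert_id: str) -> bool:
    # Pure local lookup: the server never learns which certificate
    # was scanned, where, or when.
    return cert_id in revoked_crl

refresh_crl(["CERT-0042", "CERT-0099"])   # hypothetical IDs
assert is_revoked("CERT-0042")
assert not is_revoked("CERT-1234")
```

The tradeoff noted in [6] is visible here: `revoked_crl` must be stored and re-downloaded in full, which can grow costly as revocations accumulate.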
&lt;p&gt;[7] Range Queries are described in this &lt;a href="https://www.ics.uci.edu/~gts/paps/st06.pdf" target="_blank"&gt;paper&lt;/a&gt;. In this method, the verifier requests revocation status from the server by specifying a range of certificate identifiers within which the certificate being verified lies. If there are any revoked certificates within this range, the server will send their identifiers to the verifier, who can then check if the certificate in question is on the list. For this to work, the range selected must be sufficiently large to include enough potential candidates to keep the server from guessing which one is in use.&lt;/p&gt;
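A rough sketch of the range-query idea follows (function names and the range width are hypothetical; the paper linked in [7] describes the full scheme). The verifier asks the server for all revoked identifiers within a randomly positioned range containing the certificate being checked, so the server cannot tell which identifier inside the range is actually in use:

```python
import random

# Server-side set of revoked certificate IDs (hypothetical values).
SERVER_REVOKED = {4205, 9931}

def server_revoked_in_range(lo: int, hi: int) -> set:
    # Server returns every revoked ID inside [lo, hi]; it only
    # ever sees the range, never the specific ID being verified.
    return {i for i in SERVER_REVOKED if lo <= i <= hi}

def check_revocation(cert_id: int, width: int = 10_000) -> bool:
    # Verifier side: place cert_id at a random offset within a range
    # of `width` IDs, so any of the `width` IDs could be the one in use.
    lo = cert_id - random.randrange(width)
    revoked = server_revoked_in_range(lo, lo + width - 1)
    return cert_id in revoked

assert check_revocation(4205)       # revoked ID is caught
assert not check_revocation(1234)   # non-revoked ID passes
```

As the footnote notes, `width` must be large enough that the server cannot narrow down which candidate in the range is the certificate actually being verified.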

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/do-we-really-need-an-app-for-that-examining-the-utility-and-privacy-implications-of-india2019s-digital-vaccine-certificates'&gt;https://cis-india.org/internet-governance/blog/do-we-really-need-an-app-for-that-examining-the-utility-and-privacy-implications-of-india2019s-digital-vaccine-certificates&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>divyank</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Digital ID</dc:subject>
    
    
        <dc:subject>Covid19</dc:subject>
    
    
        <dc:subject>Appropriate Use of Digital ID</dc:subject>
    

   <dc:date>2021-08-03T05:13:28Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/state-of-consumer-digital-security-in-india">
    <title>State of Consumer Digital Security in India</title>
    <link>https://cis-india.org/internet-governance/blog/state-of-consumer-digital-security-in-india</link>
    <description>
        &lt;b&gt;This report attempts to identify the existing state of digital safety in India, with a mapping of digital threats, which will aid stakeholders in identifying and addressing digital security problems in the country. This project was funded by the Asia Foundation.&lt;/b&gt;
        
&lt;p style="text-align: justify;"&gt;&amp;nbsp;&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;Since 2006, successive Union governments in India have shown increased focus on digital governance. The National e-Governance Plan was launched by the UPA government in2006, and several digital projects led by the state such as digitisation of the filing of taxes, appointment process for passports, corporate governance, and the Aadhaar programme(India’s unique digital identity system that utilises biometric and demographic data) arose under it, in the form of mission mode projects (projects that are part of a broader National e-governance initiative, each focusing on specific e-Governance aspects, like banking, land records, or commercial taxes). In 2014, when the NDA government came to power, the National e-Governance Plan was subsumed under the government’s flagship project of Digital India, and several mission mode projects were added. In the meantime, the internet connectivity, first in the form of wire connectivity, and later in the form of mobile connectivity has increased greatly. In the same period, use of digital services, first in new services native to the Internet such as email, social networking, instant messaging, and later the platformization and disruption of traditional business models in transportation, healthcare, finance and virtually every sector, has led to a deluge of digital private service providers in India.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;Currently, India has 500 million internet users — over a third of its total population — making it the country with the second largest number of Internet users after China. The uptake of these technological services has also been accompanied by several kinds of digital threats that an average digital consumer in India must regularly contend with. This report is a mapping of consumer-facing digital threats in India and is intended to aid stakeholders in identifying and addressing digital security problems. The first part of the report categorises digital threats into four kinds, Personal Data Threats, Online Content Related Threats, Financial Threats, and Online Sexual Harassment Threats. Threats under each category are then defined, with detailed consumer-facing consequences, and past instances where harm has been caused because of these threats.&lt;/p&gt;
&lt;hr /&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;p&gt;Read the full report &lt;a href="https://cis-india.org/internet-governance/report-state-of-consumer-digital-security-in-india" class="internal-link" title="Report - State of Consumer Digital Security in India"&gt;here&lt;/a&gt;.&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/state-of-consumer-digital-security-in-india'&gt;https://cis-india.org/internet-governance/blog/state-of-consumer-digital-security-in-india&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>pranav</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Digital Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Digital Knowledge</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Digital Media</dc:subject>
    

   <dc:date>2021-07-05T11:07:24Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/biometric-update-june-26-2021-chris-burt-advanced-biometric-technologies-and-new-market-entries-tackle-fraud-chase-digital-id-billions">
    <title>Advanced biometric technologies and new market entries tackle fraud, chase digital ID billions</title>
    <link>https://cis-india.org/internet-governance/news/biometric-update-june-26-2021-chris-burt-advanced-biometric-technologies-and-new-market-entries-tackle-fraud-chase-digital-id-billions</link>
    <description>
&lt;b&gt;Amid forecasts of rapid growth and huge market potential, digital ID platforms have been launched by Techsign and Ping Identity, alongside new services, features and even an investment fund.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The blog post by Chris Burt was &lt;a class="external-link" href="https://www.biometricupdate.com/202106/advanced-biometric-technologies-and-new-market-entries-tackle-fraud-chase-digital-id-billions"&gt;published by Biometric Update&lt;/a&gt; on June 26, 2021.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;A new camera solution for under-display 3D face biometrics from Infineon and partners, and IPO filings by Clear and SenseTime show parallel investment activity in biometrics, meanwhile, and experts from Veridium and Intellicheck provide insight into the shifting technology and fraud landscapes, among the most widely-read stories this week on Biometric Update.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Top biometrics news of the week&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;Several areas of the digital identity market continued to be very active, with a new investment fund launched to support startups in digital commerce and payments, Yoti joining a regulatory sandbox, Techsign launching a digital ID platform, and Mastercard and b.well reporting positive results from a recent pilot for their biometric healthcare platform. All this activity contributes to explaining Juniper Research’s &lt;a href="https://www.biometricupdate.com/202106/digital-identity-verification-market-forecast-to-reach-16-7b-by-2026"&gt;forecast of rapid growth&lt;/a&gt; in the sector to $16.7 billion in 2026, driven largely by spending on remote onboarding.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Okta CEO Todd McKinnon, meanwhile, told Barron’s that the total addressable market for identity and access management providers like Okta is something like &lt;a href="https://www.biometricupdate.com/202106/okta-ceo-says-total-addressable-identity-and-access-management-market-near-80b"&gt;$80 billion&lt;/a&gt;, as well as that effective integration is the key to solving biometrics challenges in the space. Entrust and Yubico formed an integration partnership, LoginRadius launched a new feature, Jamf launched a biometric tool for enterprises, and a certification program for IAM professionals was launched.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;A list of goods for sale on the dark web includes a listing for &lt;a href="https://www.biometricupdate.com/202106/biometric-selfies-and-forged-passports-identities-for-sale-on-the-dark-web"&gt;selfies holding an American ID credential&lt;/a&gt;, which in theory could be used in a biometric spoofing attack. Cybersecurity researcher Luana Pascu helps guide readers through the report, and shares insights such as on the status of faked vaccination certificates on dark web marketplaces.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Ensuring the validity of the ID document a biometric identity verification process is based on, without adding too much friction, often means adopting &lt;a href="https://www.biometricupdate.com/202106/intellicheck-ceo-on-building-the-foundations-for-biometric-verification-and-fraud-protection"&gt;layered risk profiling&lt;/a&gt;, Intellicheck CEO Bryan Lewis tells &lt;em&gt;Biometric Update&lt;/em&gt; in a sponsored post. The company has deep roots in detecting fraudulent documents and has found that even scanning the barcode on an identity document will not necessarily catch a fake if the unique security elements are not validated as part of the scan.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Fourthline Anti-Financial Crime Head Ro Paddock writes in a Biometric Update guest post about the ever-increasing sophistication of fraud attacks, which reached the level of computer-generated &lt;a href="https://www.biometricupdate.com/202106/the-fraudsters-new-game-face"&gt;3D masks and deepfakes&lt;/a&gt; during the pandemic,. In response, information-sharing between organizations will be necessary to understand the scope of these new threats, and how to defend against them.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Philippines’ election commission has launched an app to allow people to preregister for the &lt;a href="https://www.biometricupdate.com/202106/philippines-launches-app-to-fast-track-biometric-voter-registration"&gt;voter roll online&lt;/a&gt; before enrolling their biometrics in person, as the country continues digitizing its public services. Governments in Pakistan, Haiti and Nigeria are also making moves to improve the accessibility and trustworthiness of their electoral processes.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;A partnership between Research ICT Africa and the Centre for Internet and Society, supported by the Omidyar Network, to explore the development of digital ID systems for the African context is explained in a &lt;a href="https://researchictafrica.net/2021/06/21/why-digital-id-matters/" target="_blank"&gt;blog post&lt;/a&gt;. The project will be based on an adaptation of the Evaluation Framework for Digital Identities which the CIS used to assess India’s Aadhaar system, with rule of law, rights and risk-based tests, and presented in a series of posts.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Details of Clear’s IPO plans emerged, including its intention to raise up to &lt;a href="https://www.biometricupdate.com/202106/clear-ipo-could-raise-up-to-396m-in-hot-biometrics-investment-market"&gt;$396 million&lt;/a&gt; on the NYSE. The $2.2 billion valuation aligns with some comparable companies, by revenue multiple, but the lower voting power of the shares on offer could be a restraining factor.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;An even bigger IPO could be held by SenseTime later this year, with the Chinese AI firm looking to raise up to $2 billion &lt;a href="https://www.biometricupdate.com/202106/not-smarting-from-us-sanctions-sensetime-says-its-ipo-is-on-again"&gt;on the Hong Kong exchange&lt;/a&gt;. The company has been talking about a public stock launch since before the company was hit with restrictions to U.S. trade, which it indicates have had little impact.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The latest major funding round in digital identity is the largest yet, with &lt;a href="https://www.biometricupdate.com/202106/transmit-security-raises-543m-to-grow-biometric-passwordless-authentication"&gt;Transmit Security raising $543 million&lt;/a&gt; at a $2.2 billion valuation to expand the market reach of its passwordless biometric authentication technology. The company claims it is the highest ever Series A funding round in cybersecurity.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Bob Eckel, Aware CEO and International Biometrics + Identity Association (IBIA) Director and Board Member, discusses why people should own their own identity, identifying things and protecting supply chains, and his background in setting up air traffic control systems used all over the world with the Requis &lt;a href="https://requis.com/podcasts/podcast-bob-eckel-biometrics-future-secured-identities/" target="_blank"&gt;Supply Chain Next podcast&lt;/a&gt;. In the longer term Eckel sees biometric replacing passwords, and in the shorter term being used to make processes touchless.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Veridium CTO John Callahan guides Biometric Update through recent NIST guidance on the &lt;a href="https://www.biometricupdate.com/202106/nist-touchless-fingerprint-biometrics-guidance-confirms-interoperability"&gt;interoperable use of contactless fingerprints&lt;/a&gt; with contact-based back-end AFIS systems. The guidance, which changes definitions within the NIST ITL biometric container standard, but advises that the associated image quality metric does not apply to contactless prints, could spark further investment in the modality.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;A new time-of-flight 3D imaging solution that could be used to implement facial authentication from &lt;a href="https://www.biometricupdate.com/202106/under-display-camera-for-3d-face-biometrics-developed-by-infineon-pmd-arcsoft"&gt;under the display of mobile devices&lt;/a&gt; without notches or bezels has been developed by partners Infineon, pmdtechnologies and ArcSoft. Based on the REAL3 sensor and ArcSoft’s computer vision algorithms, the solution is expected to reach availability in Q3 2021.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="https://www.biometricupdate.com/202106/ping-identity-adds-behavioral-biometrics-and-bot-detection-with-securedtouch-acquisition"&gt;Ping Identity has acquired SecuredTouch&lt;/a&gt; in a deal with undisclosed financial details to integrate its behavioral biometrics-based continuous user authentication with the PingOne enterprise cloud platform. Ping also launched a consumer application for reusable credentials and added unified management features to its cloud platform at its Identiverse 2021 event.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Notre Dame-IBM Technology Ethics Lab Founding Director Elizabeth Renieris joins the MIT Sloan Management Review’s &lt;a href="https://sloanreview.mit.edu/audio/starting-now-on-technology-ethics-elizabeth-renieris/" target="_blank"&gt;Me, Myself and AI podcast&lt;/a&gt; to discuss the role of the lab, her path past and through some of the digital identity space’s key ethical developments, and the need to take the long view on technology to understand its ethical implications. Renieris makes a pitch for process-oriented regulations, based on the best understanding we have at the time.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;ProctorU’s announcement that it will no longer sell fully-automated remote proctoring services is seen as a win in the battle against “the AI shell game” by the &lt;a href="https://www.eff.org/deeplinks/2021/06/long-overdue-reckoning-online-proctoring-companies-may-finally-be-here" target="_blank"&gt;Electronic Frontier Foundation&lt;/a&gt;. The descriptions of the balance between the automated and human decision-making by AI proctoring providers amount to doublespeak, the EFF says, before panning their human review processes, accuracy rates, and use of facial recognition.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/biometric-update-june-26-2021-chris-burt-advanced-biometric-technologies-and-new-market-entries-tackle-fraud-chase-digital-id-billions'&gt;https://cis-india.org/internet-governance/news/biometric-update-june-26-2021-chris-burt-advanced-biometric-technologies-and-new-market-entries-tackle-fraud-chase-digital-id-billions&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Chris Burt</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>UIDAI</dc:subject>
    
    
        <dc:subject>Biometrics</dc:subject>
    
    
        <dc:subject>Aadhaar</dc:subject>
    

   <dc:date>2021-06-28T01:13:05Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/pdp-bill-is-coming-whatsapp-privacy-policy-analysis">
    <title>PDP Bill is coming: WhatsApp Privacy Policy analysis</title>
    <link>https://cis-india.org/internet-governance/blog/pdp-bill-is-coming-whatsapp-privacy-policy-analysis</link>
    <description>
        &lt;b&gt;WhatsApp started off the new year with changes to its privacy policy that have several implications for data protection and the digital governance ecosystem at large. This post is the first in a series by CIS unpacking the various implications of the policy.&lt;/b&gt;
        &lt;span id="docs-internal-guid-153739d2-7fff-f133-6a27-53060c29814c"&gt;
&lt;p dir="ltr"&gt;&amp;nbsp;&lt;/p&gt;
&lt;p dir="ltr"&gt;On January 4, 2021, WhatsApp announced a revised privacy policy. The announcement was through an in-app notification. Users were asked to agree to the policy by February 8, else they will lose access to their accounts. The announcement triggered a backlash, globally and in India and it led to &lt;a href="https://economictimes.indiatimes.com/tech/information-tech/messaging-app-signal-faces-global-outage-days-after-adding-millions-of-users/articleshow/80296362.cms"&gt;millions of users in India migrating to other messaging platforms. &lt;/a&gt;In light of the backlash, WhatsApp had on January 15 announced that it will delay rolling out the new policy to May 15, 2021.&amp;nbsp;&lt;/p&gt;
&lt;p dir="ltr"&gt;&amp;nbsp;It is important to note that many users have also commented that the new explicit terms of mandatory data sharing with Facebook and the extent of metadata collection haven’t changed drastically from WhatsApp’s existing operations. In 2016, WhatsApp had revised its privacy policy to enable data sharing with Facebook. Users were provided 30 days to opt out of such data sharing.&amp;nbsp; However, the option to opt out was not provided to users who joined the service after September 25, 2016 or who failed to exercise the opt-out option. The changes in the policy were challenged in the Delhi High Court.&amp;nbsp; The High Court (i) directed WhatsApp to delete the complete information of users who exercised the option to opt out before September 25, 2016; and (ii) with respect to users who did not exercise the opt-out option, WhatsApp was directed to not share the information of users collected until September 25, 2016 with Facebook. The matter is currently pending before the Supreme Court.&amp;nbsp;&amp;nbsp;&lt;/p&gt;
&lt;p dir="ltr"&gt;The change in people’s reactions to the data processing from 2016 can partly be attributed to the change in the users perception of privacy and personal data protection. Conversations around privacy and data protection and harms arising out of unauthorized data collection are much more prevalent. What has also irked a large number of users is the difference between the privacy policy applicable to the European Region and the policy applicable to the rest of the world; There is a disparity in the two policies regarding the rights of the users in relation to sharing of data with Facebook Companies(Facebook payments inc, Facebook Payments International Limited, Onavo, Facebook technologies LLC, Facebook Technologies Ireland limited, WhatsApp inc.&amp;nbsp; WhatsApp Ireland Limited and Crowdtangle) due to the application of the General Data Protection Regulation.&amp;nbsp;&lt;/p&gt;
&lt;p dir="ltr"&gt;Currently, Indian users have a fundamental right to privacy and an overarching data protection framework is set to be tabled in the Parliament soon. The Personal Data Protection Bill, 2019, being deliberated by the Joint Parliamentary Committee, is expected to provide comprehensive requirements for authorized collection and management of personal data. The proposed Bill, despite several shortcomings, does offer significantly more protection than the current framework consisting of S. 43A of Information Technology Act, 2000 and the Information Technology (Reasonable Security practices and procedures and sensitive personal data or Information) Rules, 2011. This blogpost will examine the viability of the revised privacy policy of WhatsApp if the proposed bill is enacted in the currently available public version of the Bill. In the subsequent posts we will analyse the effect of the revised privacy policy on the pending litigation.&amp;nbsp;&lt;/p&gt;
&lt;h3&gt;
Privacy notice&lt;/h3&gt;
&lt;p dir="ltr"&gt;Section 7 of the proposed bill puts an obligation on the data fiduciary to provide a privacy notice, i.e. a document containing granular details of the processing of personal data to the data principals. The details must be provided in a manner that is clear, concise and easily comprehensible to a reasonable person. The notice should also be provided in multiple languages where necessary and practicable. The importance of a clear and concise policy has been highlighted in the Justice Srikrishna Report on Data Protection. However, there is no guidance from the Indian authorities on what it constitutes. Guidance from the &lt;a href="https://ec.europa.eu/newsroom/article29/item-detail.cfm?item_id=622227"&gt;Article 29 working party&lt;/a&gt; in the EU suggests that the policy must be presented in a manner that avoids information fatigue. In the digital context, it has been recommended that presenting a policy in a layered format enhances readability. The guidance also suggests that policy should avoid reliance on complex sentences and abstract terms to convey the details of the processing operations. The revised privacy policy of WhatsApp cannot be termed a clear and concise policy.&amp;nbsp; The purely text-based policy, containing around 3800 words, is not presented in a layered format resulting in shockingly low readability for the amount and type of personal data collection the policy is attempting to convey. In addition to improper design and structure, the policy contains vague language providing an average user a hazy understanding of the extent of data processing and can leave room for different interpretations. 
The earlier version of the policy also uses similar language and structure to convey details regarding the processing and &lt;a href="https://www.irishtimes.com/business/technology/whatsapp-ireland-sets-aside-77-5m-for-possible-data-compliance-fines-1.4412449"&gt;doesn’t provide transparent details regarding its data sharing with Facebook&lt;/a&gt;. Relying on a similar format as its earlier versions without revising it based on global discussions around the best methods seems to be an opportunity lost to remedy the privacy policy. The structure, form and language of the policy will have to be revised if the Bill is enacted in its current form and the policy will also have to be provided in multiple languages.&amp;nbsp;&lt;/p&gt;
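The readability complaint above can be made concrete with a rough sketch. The Flesch Reading Ease formula (206.835 minus 1.015 times words-per-sentence, minus 84.6 times syllables-per-word) penalises exactly what dense policy prose accumulates: long sentences and polysyllabic words. The formula itself is standard, but the crude syllable counter and the two sample snippets below are our own illustrations, not drawn from the policy:

```python
import re

def flesch_reading_ease(text):
    """Rough Flesch Reading Ease score: higher means easier to read.
    Scores below ~30 generally indicate graduate-level prose."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)

    def syllables(word):
        # crude heuristic: count groups of consecutive vowels, minimum one
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    total_syllables = sum(syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (total_syllables / len(words)))

plain = "We collect your name. We share it with partners."
legalese = ("Information concerning the utilisation of ancillary "
            "integrations may be disseminated to affiliated entities.")

print(flesch_reading_ease(plain) > flesch_reading_ease(legalese))  # True
```

A 3,800-word unlayered policy written in the second register would score very poorly on any such metric, which is the kind of outcome the guidance on clear and concise notices aims to prevent.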
&lt;h3&gt;Bundled consent&lt;/h3&gt;
&lt;p dir="ltr"&gt;According to its policy, WhatsApp relies on the consent of the user for the purpose of providing messaging and communication services, sharing information with third party service providers that help WhatsApp “operate, provide, improve, understand, customize, support, and market” their Services, and sharing information with other Facebook companies for “providing integrations with Facebook Company products” to name a few.&amp;nbsp; It is important to verify if the consent being obtained is valid according to the standard set by the proposed framework.&lt;/p&gt;
&lt;p dir="ltr"&gt;For consent to be valid under the proposed framework (Section 11(4)) , the provision and quality of services provided should not be linked to consenting to processing of personal data that is not directly necessary for that purpose. In WhatsApp’s case, the primary purpose of processing is to provide messaging and communication services on that particular platform. Neither sharing personal data with third party service providers for better marketing of their services on other platforms nor sharing it with Facebook company of products for better integration of services is incidental to the primary purpose of processing. The bundling of consent results in forcing individuals to either accept processing of personal data for all of the purposes outlined or lose the services altogether resulting in an invalid consent. An explicit opt-in mechanism for all those processing operations that are not compatible with the primary purpose of processing will have to be provided to the Indian users if the Bill is enacted in its current form and consent is being relied on as the lawful ground of processing.&lt;/p&gt;
&lt;h3&gt;Data sharing with Facebook&lt;/h3&gt;
&lt;p dir="ltr"&gt;WhatsApp’s policy on sharing of information with Facebook has garnered a significant amount of attention and has also raised privacy concerns amongst WhatsApp users in non-European countries. This is because the policy applicable to non- European countries now does not provide the user option to opt out from sharing the information if the user wants to continue using and operating WhatsApp. The policy under the heading ‘How we work with other Facebook Companies’ states that “As part of the&lt;a href="https://faq.whatsapp.com/general/security-and-privacy/the-facebook-companies"&gt; Facebook Companies&lt;/a&gt;, WhatsApp receives information from, and shares information (see&lt;a href="https://faq.whatsapp.com/general/security-and-privacy/what-information-does-whatsapp-share-with-the-facebook-companies"&gt; here&lt;/a&gt;) with, the other&lt;a href="https://faq.whatsapp.com/general/security-and-privacy/the-facebook-companies"&gt; Facebook Companies&lt;/a&gt;. We may use the information we receive from them, and they may use the information we share with them, to help operate, provide, improve, understand, customize, support, and market our Services and their offerings, including the&lt;a href="https://faq.whatsapp.com/general/security-and-privacy/the-facebook-company-products"&gt; Facebook Company Products&lt;/a&gt;.” The information that may be shared by WhatsApp with Facebook Companies includes; (i) users phone number; (ii) transaction data; (iii) service-related information, (iv) information on how the users interact with others (including businesses); (v) mobile device information; (vi) the user’s IP address; and (vii) and any other data covered by the privacy policy. 
All this information/data will fall within the ambit of personal data in terms of the current version of the Bill and therefore WhatsApp would have to comply with the obligations put on it under the Bill for it to be able to share personal data with other data fiduciaries including Facebook Companies.&lt;/p&gt;
&lt;p dir="ltr"&gt;As noted earlier, it is pertinent to note that the privacy policy is not the same globally. As per the privacy policy applicable to&amp;nbsp; Europe, WhatsApp states that any information that it shares with Facebook Companies is to be used on WhatsApp’s behalf and in accordance with its instructions. Any such information cannot be used for the Facebook Companies own purposes. This statement is not reflected in the privacy policy applicable to non European countries. Facebook has in a &lt;a href="https://www.irishtimes.com/business/technology/whatsapp-says-european-users-do-not-have-to-share-data-with-facebook-1.4452435"&gt;statement &lt;/a&gt;stated that “For the avoidance of any doubt, it is still the case that WhatsApp does not share European region WhatsApp user data with Facebook for the purpose of Facebook using this data to improve its products or advertisements”&lt;/p&gt;
&lt;p dir="ltr"&gt;&lt;strong id="docs-internal-guid-dbd02a4a-7fff-ed41-bc54-e5cce9a8b5ca"&gt;&lt;br /&gt;&lt;/strong&gt;&lt;/p&gt;
&lt;h3&gt;Data sharing with other third party service providers&lt;/h3&gt;
&lt;p dir="ltr"&gt;It is also important to note that sharing of information is not limited to Facebook Companies, but also extends to other third party service providers. However, apart from a vaguely drafted statement stating that WhatsApp works with third party service providers as well as other Facebook Companies to help it to “operate, provide, improve, understand, customize, support, and market our Services”, the privacy policy is silent and does not provide any insight or clear information on (a) the nature of these third party entities; (b) extent of information shared with such third party entities.&amp;nbsp; Further, even though the policy provides a link to the other Facebook Companies (Facebook Payments Inc, Facebook International Limited, Onavo CrowdTangle) that it works with; there is again no clarity as to what are the specific services provided by these companies.&lt;/p&gt;
&lt;p dir="ltr"&gt;One of the rights provided to a data principal under Section 17 (3) and Section 7 (1)(g) of the current version of the Bill, is the right to be informed and the consent to be obtained from the data principal about the individuals or entities with whom personal data may be shared. The data principal also has the right to be informed about and given access to the categories of personal data shared with the other data fiduciaries. However, the policy as it stands on date is silent about both the details of the third parties service providers as well as the categories of personal data that could be shared with them.&lt;/p&gt;
&lt;p dir="ltr"&gt;&lt;strong&gt;&lt;br /&gt;&lt;/strong&gt;&lt;/p&gt;
&lt;h3&gt;Metadata collection and data minimisation&lt;/h3&gt;
&lt;p dir="ltr"&gt;The details on usage and log information in the previous version of the policy were rather vague as a result of which the extent of data collection was difficult to ascertain. The revised version indicates that WhatsApp’s metadata collection went further than most of the other popular messaging applications and the data being collected was linked back to the user and device identity. The principle of data minimisation (Section 6 of the proposed framework) limits the collection of personal data to that which is necessary for the purpose of processing. The compelling reasons that justify the metadata collection for the primary purpose of messaging and communication are so far unclear. The metadata collection section is similar in the privacy policy for the EU region and on the face of it doesn’t look GDPR compliant as well. Collection of those categories of personal data that are not necessary for processing of the primary purpose will need to be discontinued if the Bill is enacted in its current form.&lt;/p&gt;
&lt;p dir="ltr"&gt;&lt;strong&gt;&lt;br /&gt;&lt;/strong&gt;&lt;/p&gt;
&lt;h3&gt;Data Principal rights&lt;/h3&gt;
&lt;p dir="ltr"&gt;The difference between the protection afforded to Indian resident users and European resident users is highlighted in the rights accorded to the data principal under the two privacy policies. The European privacy policy has a section dedicated to how users can exercise their rights and specifies that users have the right to access, rectify, port, and erase their information, as well as the right to restrict and object to certain processing of their information. These rights are a reflection of the protection afforded to data principles under the GDPR.&amp;nbsp; As per the current version of the Bill, the data principal will have the right to&amp;nbsp; (i) confirmation and access (Section 17); (ii) correction and erasure (Section 18); and (iii) data portability (Section 19). If the current version of the Bill is enacted, then WhatsApp will be required to amend its privacy policy regarding its applicability to India and incorporate the rights of data accorded to the data principal .&lt;/p&gt;
&lt;p dir="ltr"&gt;&lt;strong&gt;&lt;br /&gt;&lt;/strong&gt;&lt;/p&gt;
&lt;h3&gt;Grievance redressal&amp;nbsp;&lt;/h3&gt;
&lt;p dir="ltr"&gt;The European Region privacy policy specifies the entity within WhatsApp responsible for addressing the complaints of the users and it further also informs the user that they have the right to approach the Irish Data Protection Commission, or any other competent data protection supervisory authority. None of these provisions are specified in the Non-European Region privacy policy.&amp;nbsp; The current version of the PDP Bill places an obligation on the data fiduciary to establish an effective grievance redressal mechanism (Section 32(1)) and to inform the data principal about their right to approach the Data Protection Authority (which is proposed to be established under the PDP Bill) (Section 7(k)). Additional details regarding the same will have to be provided if the Bill is enacted in its current form.&amp;nbsp;&lt;/p&gt;
&lt;p dir="ltr"&gt;&lt;strong&gt;&lt;br /&gt;&lt;/strong&gt;&lt;/p&gt;
&lt;h3&gt;Clarifications from WhatsApp&amp;nbsp;&lt;/h3&gt;
&lt;p dir="ltr"&gt;On January 13, 2021, WhatsApp published a blog stating that the changes to the privacy policy will not affect users who use the platform messaging with friends and family,&amp;nbsp; the changes will only apply to users who use the platform to communicate with business accounts. As per WhatsApp messages to business accounts on WhatsApp can be shared with third-party service providers, which may include Facebook itself.&amp;nbsp; As per the blog, “But whether you communicate with a business by phone, email, or WhatsApp, it can see what you’re saying and may use that information for its own marketing purposes, which may include advertising on Facebook.” It is important to note that we recognise that the content of the messages and the call remains encrypted, however, the concern arises from the collection and use of ‘metadata.’&amp;nbsp;&lt;/p&gt;
&lt;p dir="ltr"&gt;WhatsApp’s repeated assurances and clarifications asserting their commitment to data privacy falls short. Their insistence that their chats still use end to end encryption and that only interactions with WhatsApp Business will be shared with Facebook indicates ignorance with regard to the different contours of informational privacy. The expectations of privacy that individuals have over their personal data is linked to the extent of control they have over disclosure of such data. The mandatory metadata collection and lack of opt out clauses for data sharing for marketing purposes results in a mere illusion of control through its façade consent collecting process.&lt;/p&gt;
&lt;p dir="ltr"&gt;&lt;strong&gt;&lt;br /&gt;&lt;/strong&gt;&lt;/p&gt;
&lt;p dir="ltr"&gt;For the most part, the proposed framework should provide us the same level of protection offered to EU users of WhatsApp regarding some of the key contentions highlighted above. However, additional data principal rights such as the right to object and right to restrict processing will give additional protections to the data principal in case of data processing for marketing purposes. The uproar over the data collection practices of WhatsApp have cemented the immediate need for an effective data protection legislation in the country. The final draft of the Bill with &lt;a href="https://economictimes.indiatimes.com/news/politics-and-nation/parliamentary-panel-examining-personal-data-protection-bill-recommends-89-changes/articleshow/80138488.cms"&gt;89 new amendments&lt;/a&gt; is expected to be released soon. Considering the renewed apprehensions regarding unwarranted processing of personal data, we can only hope that the amendments have taken into consideration the feedback and comments provided by relevant stakeholders.&amp;nbsp;&lt;/p&gt;
&lt;p dir="ltr"&gt;&lt;br /&gt;&lt;br /&gt;&lt;/p&gt;
&lt;p dir="ltr"&gt;(This post was edited and reviewed by Amber Sinha, Arindrajit Basu and Aman Nair)&lt;/p&gt;
&lt;/span&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/pdp-bill-is-coming-whatsapp-privacy-policy-analysis'&gt;https://cis-india.org/internet-governance/blog/pdp-bill-is-coming-whatsapp-privacy-policy-analysis&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Pallavi Bedi &amp; Shweta Reddy</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>WhatsApp</dc:subject>
    
    
        <dc:subject>Facebook</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2021-01-19T08:12:23Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/response-to-pegasus-questionnaire-issued-by-sc-technical-committee">
    <title>Response to the Pegasus Questionnaire issued by the SC Technical Committee</title>
    <link>https://cis-india.org/internet-governance/blog/response-to-pegasus-questionnaire-issued-by-sc-technical-committee</link>
    <description>
        &lt;b&gt;On March 25, 2022, the Technical Committee appointed by the Supreme Court to examine allegations of unauthorised surveillance using the Pegasus software released a questionnaire seeking responses and comments from the general public.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The questionnaire had 11 questions and the responses had to be submitted through an online form- which was available &lt;a class="external-link" href="https://pegasus-india-investigation.in/invitation-to-comment/-"&gt;here&lt;/a&gt;. The last date for submitting the response was March 31, 2022. CIS had submitted the following responses to the questions in the questionnaire. Access the &lt;b&gt;&lt;a href="https://cis-india.org/internet-governance/response-to-the-pegasus-investigation" class="internal-link"&gt;Response to the Questionnaire&lt;/a&gt;&lt;/b&gt;&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/response-to-pegasus-questionnaire-issued-by-sc-technical-committee'&gt;https://cis-india.org/internet-governance/blog/response-to-pegasus-questionnaire-issued-by-sc-technical-committee&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Anamika Kundu, Digvijay, Arindrajit Basu, Shweta Mohandas and Pallavi Bedi</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>IT Act</dc:subject>
    
    
        <dc:subject>Surveillance</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2022-04-13T14:45:41Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/deccan-herald-aman-nair-and-pallavi-bedi-june-13-2021-pandemic-technology-takes-its-toll-on-data-privacy">
    <title>Pandemic Technology takes its Toll on Data Privacy</title>
    <link>https://cis-india.org/internet-governance/blog/deccan-herald-aman-nair-and-pallavi-bedi-june-13-2021-pandemic-technology-takes-its-toll-on-data-privacy</link>
    <description>
        &lt;b&gt;The absence of any legal framework has meant these tools are now being used for purposes beyond managing the pandemic.&lt;/b&gt;
        &lt;p style="text-align: center; "&gt;The article by Aman Nair and Pallavi Bedi was &lt;a class="external-link" href="https://www.deccanherald.com/specials/pandemic-technology-takes-its-toll-on-data-privacy-996870.html"&gt;published in the Deccan Herald &lt;/a&gt;on June 13, 2021.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: center; "&gt;&lt;img src="https://cis-india.org/home-images/ArogyaSetuApp.jpg" alt="Arogya Setu App" class="image-inline" title="Arogya Setu App" /&gt;&lt;/p&gt;
&lt;p style="text-align: center; "&gt;&lt;span class="discreet"&gt;People show Arogya Setu App installed in their phones while travelling by special New Delhi-Bilaspur train from New Delhi Railway Station. Credit: PTI File Photo&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt; &lt;/p&gt;
&lt;p style="text-align: center; "&gt;&lt;img src="https://cis-india.org/home-images/CovidCertificate.jpg/@@images/672b385b-d0b0-49af-953d-ae96a42be117.jpeg" alt="Covid Certificate" class="image-inline" title="Covid Certificate" /&gt;&lt;/p&gt;
&lt;p style="text-align: center; "&gt;&lt;span class="discreet"&gt;Jabalpur: A beneficiary shows his certificate on his mobile phone after receiving COVID-19 vaccine dose, at Gyan Ganga College in Jabalpur, Saturday, May 15, 2021. (PTI Photo)&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;At a time when technology is spawning smart solutions to combat Covid-19 worldwide, India’s digital response to the pandemic has stoked concerns that surveillance could pose threats to the privacy of the personal data collected. Be it apps or drones, there is widespread criticism that digital tools are being misused to share information without knowledge or consent. At the other end of the spectrum, the great urban-rural digital divide is hampering the already sluggish vaccination drive, exposing vulnerable populations to a fast-mutating virus.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Last year, the Centre, states and municipal corporations launched more than 70 apps relating to Covid-19, demonstrating the country’s digital-driven approach to handling the pandemic. Chief among these was the central government’s contact tracing app Aarogya Setu. Launched under the Digital India programme, the app quickly came under scrutiny over data privacy.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;As per its privacy policy, Aarogya Setu collects personal details such as name, age, sex, profession and location. As there is no underlying legislation forming its basis, and in the absence of a personal data protection bill, serious privacy concerns regarding the collection, storage and use of personal data have been raised.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The government has attempted to mitigate these concerns with reassurances that the data will be used solely in tracing the spread of the virus. However, recent reports from the Kulgam district of Jammu and Kashmir point to the sharing of application data with police. This demonstrates how easy it is to use personal data for purposes other than which it was collected, and presents a serious threat to citizen privacy.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Though Aarogya Setu was initially launched as ‘consensual’ and ‘voluntary’, it soon became mandatory for individuals to download the app for various purposes such as air and rail travel (this order was subsequently withdrawn) and for government officials. Initially it was also mandatory for the private sector, but this was later watered down to state that employers should, on a ‘best effort basis', ensure that the app is downloaded by all employees having compatible phones. However, the ‘best effort basis’ soon translated into mandatory imposition for certain individuals, especially those working in the ‘gig economy’.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Several states had also launched apps for various purposes ranging from contact tracing of suspected Covid patients to monitoring the movement of quarantined patients. As a report by the Centre for Internet and Society observed, given the attention on Aarogya Setu, most of the apps launched by the state governments escaped scrutiny and public attention.Most of these apps either did not have a privacy policy or the policy was vague and often did not provide important details such as who was collecting the data, the time period for retaining the data and whether personal data could be shared with other departments, most notably, law enforcement.Apart from contact tracing apps, the pandemic also ushered in a wave of other apps and digital tools by the government. These include systems such as drones to check whether people are following Covid-19 norms and facial recognition cameras to report to the police whether someone has broken quarantine. Similar to Aarogya Setu, these tools have also largely been brought about in the absence of a legal and regulatory framework.&lt;br /&gt;The absence of any legal framework has meant these tools are now being used for purposes beyond managing the pandemic.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The government is now planning to use facial recognition technology along with Aadhaar toauthenticate people before giving them vaccine shots.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Aarogya Setu is now linked with the vaccination process. Beneficiaries have been provided an option to register through Aarogya Setu. The pandemic has also provided a means for the government to bring in changes to health policies and introduce the National Health Data Management Policy for the creation of a Unique Health Identity Number for citizens.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Vaccination and digital platforms&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The use of digital technology has extended to the vaccination process through the deployment of the Covid Vaccine Intelligence Network (Co-WIN) platform.During the first phase of inoculation, beneficiaries were required to register on the Co-WIN app while in the subsequent phases, registration was to be done on the Co-WIN website. The beneficiary is required to upload a photo identity proof.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;While Aadhaar has been identified as one of the seven documents that can be uploaded for this, the Health Ministry has clarified that Aadhaar is not mandatory for registration either through Co-WIN or through Aarogya Setu. However, as per media reports, certain vaccination centres still seem to insist on Aadhaar identity even though beneficiaries may have used another identity proof to register on the Co-WIN website.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;It is also pertinent to note that the website did not have a privacy policy till the Delhi High Court issued directions on June 2, 2021. The privacy policy hyperlinked on the Co-WIN app directed the user to the Health Data Policy of the National Health Data Management Policy, 2020.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The vaccination drive has been used as a means to push the health identity project forward as beneficiaries who have opted to provide Aadhaar identity proof have also been provided with a health identity number on their vaccination certificate. It is interesting to note that Co-WIN’s privacy policy now states that if the beneficiary uses Aadhaar as identity proof, it can 'opt' to get a Unique Health Id.However, as a recent report revealed, health identity numbers have already been generated for certain beneficiaries without obtaining consent from them for the purpose.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Have the apps been successful?&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;One could argue that privacy concerns are a worthwhile tradeoffin order to contain the spread of thepandemic. But it is worth examining how successful these technologies have been. In reality, the use of digital technology at every stage of combating the pandemic has clearly highlighted the extent of our digital divide. As per data from TRAI, there are around 750 million Internet subscribers in India,which is only a little more than half of India’s estimated 1.3 billion citizens — with this gap having a significant impact on the efficacy of the government’s strategies. Aarogya Setu has fallen far short of its goal, of having near universal adoption. It has limited adoption in much of the country. This has severely limited its efficacy in tracing the spread of the virus. Research from Maulana Azad Medical College has cited socio-economic inequalities,educational barriers and the lack of smartphone penetration as being the key causes behind the app’s limited success, pointing back to the digital divide. Moreover, the app has also brought with it a host of associated problems including lateral surveillance and function creep caused by the addition of new features. All of which, along with the previously mentioned privacy concerns, have served to hamper public trust and adoption.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;A similar situation is seen in the case of vaccination and the Centre’s Co-WIN web portal. The need for registration, first on the Co-WIN app and later on the Co-WIN web portal, has disproportionately affected those who either have no or limited digital access. Many of them belong to vulnerable groups such as migrant and informal sector workers (mainly from disadvantaged castes), LGBTQIA + individuals, sex workers and both urban and rural poor. These issues have also been acknowledged by the Supreme Court, which raised serious concerns about the government being able to achieve its stated object of universal vaccination.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;As the inoculation exercise opened up for the 18-45 age group, it increasingly favoured the urban population who possessed the technological and digital literacy to either create or access a host of tools. One need to only look at the wave of automated CO-WIN bots that arose as soon as the vaccination process was expanded to see how these dynamics manifested.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Ultimately, the digital-driven approach that the governments have adopted has resulted in a number of issues — most notably, data privacy and exclusion. Going forward, government strategies must actively account for these factors and ensure that citize rights are adequately protected.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/deccan-herald-aman-nair-and-pallavi-bedi-june-13-2021-pandemic-technology-takes-its-toll-on-data-privacy'&gt;https://cis-india.org/internet-governance/blog/deccan-herald-aman-nair-and-pallavi-bedi-june-13-2021-pandemic-technology-takes-its-toll-on-data-privacy&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Aman Nair and Pallavi Bedi</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Health Tech</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Technological Protection Measures</dc:subject>
    
    
        <dc:subject>Covid19</dc:subject>
    
    
        <dc:subject>Healthcare</dc:subject>
    

   <dc:date>2021-06-26T06:52:52Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/shweta-reddy-september-17-2021-a-guide-to-drafting-privacy-policy-under-personal-data-protection-bill">
    <title>A Guide to Drafting Privacy Policy under the Personal Data Protection Bill, 2019</title>
    <link>https://cis-india.org/internet-governance/blog/shweta-reddy-september-17-2021-a-guide-to-drafting-privacy-policy-under-personal-data-protection-bill</link>
    <description>
        &lt;b&gt;The Personal Data Protection Bill, 2019, (PDP Bill) which is currently being deliberated by the Joint Parliamentary Committee, is likely to be tabled in the Parliament during the winter session of 2021.&lt;/b&gt;
        
&lt;p style="text-align: justify;"&gt;The Bill in its current form, doesn’t have explicit transitory provisions i.e. a defined timeline for the enforcement of the provisions of the Bill post its notification as an enforceable legislation. Since the necessary subject matter expertise may be limited on short notice and out of budget for certain companies, we intend to release a series of guidance documents that will attempt to simplify the operational requirements of the legislation.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;Certain news reports had earlier suggested that the Joint Parliamentary Committee reviewing the Bill has proposed&amp;nbsp;&lt;a class="external-link" href="https://economictimes.indiatimes.com/news/politics-and-nation/parliamentary-panel-examining-personal-data-protection-bill-recommends-89-changes/articleshow/80138488.cms"&gt;89 new amendments and a new clause&lt;/a&gt;. The nature and content of these amendments so far remain unclear. However, we intend to start the series by addressing some frequently asked questions around meeting the requirements of publishing a privacy notice and shall make the relevant changes post notification of the new Bill. The solutions provided in this guidance document are mostly based on international best practices and any changes in the solutions based on Indian guidelines and the revised PDP Bill will be redlined in the future.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;The frequently asked questions and other specific examples on complying with the requirements of publishing a privacy policy have been compiled based on informal discussions with stakeholders, unsolicited queries from smaller organizations and publicly available details from conferences on the impact of the Bill. We intend to conduct extensive empirical analysis of additional queries or difficulties faced by smaller organizations towards achieving compliance post the notification of the new Bill. Regardless, any smaller organizations(NGOs, start-ups etc.) interested in discussing compliance related queries can get in touch with us.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify;"&gt;Click to download the &lt;a href="https://cis-india.org/internet-governance/guide-to-personal-data-protection-bill.pdf" class="internal-link"&gt;full report here&lt;/a&gt;. The report was reviewed by Pallavi Bedi and Amber Sinha.&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/shweta-reddy-september-17-2021-a-guide-to-drafting-privacy-policy-under-personal-data-protection-bill'&gt;https://cis-india.org/internet-governance/blog/shweta-reddy-september-17-2021-a-guide-to-drafting-privacy-policy-under-personal-data-protection-bill&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>shwetar</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2021-09-20T10:34:40Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/trishi-jindal-and-s-vivek-beyond-the-pdp-bill">
    <title>Beyond the PDP Bill: Governance Choices for the DPA</title>
    <link>https://cis-india.org/internet-governance/blog/trishi-jindal-and-s-vivek-beyond-the-pdp-bill</link>
    <description>
        &lt;b&gt;This article  examines the specific governance choices the Data Protection Authority (DPA) in India  must deliberate on vis-à-vis its standard-setting function, which are distinct from those it will encounter as part of its enforcement and supervision functions.&lt;/b&gt;
        
&lt;p style="text-align: justify;"&gt;The Personal Data Protection Bill, 2019, was introduced in the Lok Sabha on 11 December 2019. It lays down an overarching framework for personal data protection in India. Once revised and approved by Parliament, it is likely to establish the first comprehensive data protection framework for India. However, the provisions of the Bill are only one component of the forthcoming data protection framework It further proposes setting up the Data Protection Authority (DPA) to oversee the final enforcement, supervision, and standard-setting. The Bill consciously chooses to vest the responsibility of administering the framework with a regulator instead of a government department. As an independent agency, the DPA is expected to be autonomous from the legislature and the Central Government and capable of making expert-driven regulatory decisions in enforcing the framework.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;Furthermore, the DPA is not merely an implementing authority; it is also expected to develop privacy regulations for India by setting standards. As such, it will set the day-to-day obligations of regulated entities under its supervision. Thus, the effectiveness with which it carries out its functions will be the primary determinant of the impact of this Bill (or a revised version thereof) and the data protection framework set out under it.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;The final version for the PDP Bill may or may not provide the DPA with clear guidance regarding its functions. In this article, we emphasise the need to look beyond the Bill and instead examine the specific governance choices the DPA must deliberate on vis-à-vis its standard-setting function, which are distinct from those it will encounter as part of its enforcement and supervision functions.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;&lt;strong&gt;A brief timeline of the genesis of a distinct privacy regulator for India&lt;/strong&gt;&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;The vision of an independent regulator for data protection in India emerged over the course of several intervening processes that set out to revise India’s data protection laws. In fact, the need for a dedicated data protection regulation for India, with enforceable obligations and rights, was debated years before the &lt;a href="https://thewire.in/government/privacy-aadhaar-supreme-court"&gt;Aadhaar&lt;/a&gt;, &lt;a href="https://www.thehindu.com/news/national/urgent-need-for-data-protection-laws-experts/article23314655.ece"&gt;Cambridge Analytica&lt;/a&gt;, and &lt;a href="https://www.livemint.com/opinion/online-views/pegasus-has-given-privacy-legislation-a-jab-of-urgency-11628181453098.html"&gt;Pegasus&lt;/a&gt;&lt;sup&gt; &lt;/sup&gt;revelations captured the public imagination and mainstreamed conversations on privacy.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;The &lt;a href="https://cis-india.org/internet-governance/draft-bill-on-right-to-privacy"&gt;Right to Privacy Bill, 2011&lt;/a&gt;, which never took off, recognised the right to privacy in line with Article 21 of the Constitution of India, which pertains to the right to life and personal liberty. The Bill laid down express conditions for collecting and processing data and the rights of data subjects. It also proposed setting up a Data Protection Authority (DPA) to supervise and enforce the law and advise the government in policy matters. Upon review by the Cabinet, it was &lt;a href="https://cis-india.org/internet-governance/draft-bill-on-right-to-privacy"&gt;suggested&lt;/a&gt; that the Authority be revised to an Advisory Council, given its role under the Bill was limited.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;Subsequently, in 2012, the AP Shah Committee Report &lt;a href="https://cis-india.org/internet-governance/blog/report-of-group-of-experts-on-privacy.pdf"&gt;recommended&lt;/a&gt; a principle-based data protection law, focusing on set standards while refraining from providing granular rules, to be enforced through a co-regulatory structure. This structure would consist of central and regional-level privacy commissioners, self-regulatory bodies, and data protection officers appointed by data controllers. There were also a few private members’ bills &lt;a href="https://saveourprivacy.in/media/all/Brief-PDP-Bill-25.12.2020.pdf"&gt;introduced&lt;/a&gt; between 2011 and 2019.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;None of these efforts materialised, and the regulatory regime for data protection and privacy remained embedded within the Information Technology Act, 2000, and the Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011 (SPDI Rules). Though the &lt;a href="https://www.meity.gov.in/writereaddata/files/GSR313E_10511%281%29_0.pdf"&gt;SPDI Rules&lt;/a&gt; require body corporates to secure personal data, their enforcement is &lt;a href="https://www.indiacode.nic.in/show-data?actid=AC_CEN_45_76_00001_200021_1517807324077&amp;amp;orderno=49"&gt;limited&lt;/a&gt; to cases of negligence in abiding by these limited set of obligations pertaining to sensitive personal information only, and which have caused wrongful loss or gain – a high threshold to prove for aggrieved individuals. Otherwise, the &lt;a href="https://www.meity.gov.in/writereaddata/files/GSR314E_10511%281%29_0.pdf"&gt;Intermediary Guidelines&lt;/a&gt;, 2011 require all intermediaries to generally follow these Rules under Rule 3(8).&amp;nbsp; The enforcement of these obligations is &lt;a href="https://www.ikigailaw.com/dispute-resolution-framework-under-the-information-technology-act-2000/#acceptLicense"&gt;entrusted&lt;/a&gt; to adjudicating officers (AO) appointed by the central government, who are typically bureaucrats appointed as AOs in an ex-officio capacity.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;By 2017, the Aadhaar litigations had provided additional traction to the calls for a dedicated and enforceable data protection framework in India. In its judgement, the Supreme Court &lt;a href="https://main.sci.gov.in/supremecourt/2012/35071/35071_2012_Judgement_24-Aug-2017.pdf"&gt;recognised&lt;/a&gt; the right to privacy as a fundamental right in India and stressed the need for a dedicated data protection law. Around the same time, the Ministry of Electronics and Information Technology (MeitY) constituted a &lt;a href="https://pib.gov.in/newsite/PrintRelease.aspx?relid=169420"&gt;committee of experts&lt;/a&gt; under the chairmanship of Justice BN Srikrishna. The Srikrishna Committee undertook public consultations on a 2017 &lt;a href="https://www.meity.gov.in/writereaddata/files/white_paper_on_data_protection_in_india_171127_final_v2.pdf"&gt;white paper&lt;/a&gt;, which culminated in the nearly comprehensive &lt;a href="https://www.meity.gov.in/writereaddata/files/Personal_Data_Protection_Bill,2018.pdf"&gt;Personal Data Protection Bill, 2018&lt;/a&gt;, and an accompanying &lt;a href="https://www.meity.gov.in/writereaddata/files/Data_Protection_Committee_Report.pdf"&gt;report&lt;/a&gt;. This 2018 Bill outlined a regulatory framework of personal data processing for India and defined data processing entities as fiduciaries, which owe a duty of care to individuals to whom personal data relates. The Bill provided for the setting up of an independent regulator that would, among other things, specify further standards for data protection and administer and enforce the provisions of the Bill.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;MeitY invited public comments on this Bill and tabled a revised version, the Personal Data Protection &lt;a href="http://164.100.47.4/BillsTexts/LSBillTexts/Asintroduced/373_2019_LS_Eng.pdf"&gt;Bill&lt;/a&gt;, 2019 (PDP Bill), in the Lok Sabha in December 2019. Following public pressure calling for detailed discussions on the Bill before its passing, it was referred to a &lt;a href="http://loksabhaph.nic.in/Committee/CommitteeInformation.aspx?comm_code=73&amp;amp;tab=1"&gt;Joint Parliamentary Committee&lt;/a&gt; (JPC) constituted for this purpose. It currently remains under review; the JPC is &lt;a href="https://www.hindustantimes.com/india-news/need-state-level-data-protection-authorities-joint-parliamentary-committee-mp-amar-patnaik-101632679181340.html"&gt;reportedly&lt;/a&gt; expected to table its report in the 2021 Winter Session of Parliament. Though the Bill is likely to undergo another &lt;a href="https://www.hindustantimes.com/india-news/over-100-drafting-changes-proposed-to-jpc-on-data-protection-bill-101631730726756.html"&gt;round of revisions&lt;/a&gt; following the JPC’s review, this is the closest India has come to realising its aspirations of establishing a dedicated and enforceable data protection framework.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;This Bill carries forward the choice of a distinct regulatory body, though &lt;a href="https://thewire.in/tech/india-data-protection-authority-needs-constitutional-entrenchment"&gt;questions remain&lt;/a&gt; on the degree of its independence, given the direct control granted to the central government in appointing its members and funding the DPA.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;&lt;strong&gt;Conceptualising an Independent DPA&lt;/strong&gt;&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;The Srikrishna Committee’s 2017 white paper and its 2018 report on the PDP Bill discuss the need for a regulator in the context of &lt;em&gt;enforcement&lt;/em&gt; of its provisions. However, the DPA under the PDP Bill is tasked with extensive powers to frame detailed regulations and codes of conduct to inform the day-to-day obligations of data fiduciaries and processors. To be clear, the standard-setting function for a regulator &lt;a href="https://ssrn.com/abstract=1393647"&gt;entails&lt;/a&gt; laying down the standards based on which regulated entities (i.e. the data fiduciaries) will be held accountable, and the manner in which they may conduct themselves while undertaking the regulated activity (i.e. personal data processing). This is in addition to its administrative and enforcement, and quasi-judicial functions, as outlined below:&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;&lt;strong&gt;Functions of the DPA under the PDP Bill 2019&lt;/strong&gt;&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;&lt;strong&gt;&lt;img src="https://cis-india.org/home-images/PDPBill.png/@@images/93bcf598-962a-48f1-b1b1-78933dac5d27.png" alt="null" class="image-inline" title="PDP" /&gt;&lt;/strong&gt;&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;At this stage, it is important to note that the choice of regulation via a regulator is distinct from the administration of the Bill by the central or state governments. Creating a distinct regulatory body allows government procedures to be replaced with expert-driven decision-making to ensure sound economic regulation of the sector. At the same time, the independence of the regulatory authority &lt;a href="https://www.oxfordhandbooks.com/view/10.1093/law/9780198704898.001.0001/oxfordhb-9780198704898"&gt;insulates it&lt;/a&gt; from political processes. The third advantage of independent regulatory authorities is the scope for ‘operational flexibility’, which is embodied in the relative autonomy of its employees and its decision-making from government scrutiny.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;This is also the rationale provided by the Srikrishna Committee in stating their choice to entrust the administration of the data protection law to an independent DPA. The 2017 white paper that preceded the 2018 Srikrishna Committee Report proposed a distinct regulator to provide expert-driven enforcement of laws for the highly specialised data protection sphere. Secondly, the regulator would serve as a single point of contact for entities seeking guidance and will ensure consistency by issuing rules, standards, and guidelines. The Srikrishna Committee Report concretised this idea and proposed a sector-agnostic regulator that is expected to &lt;a href="https://www.meity.gov.in/writereaddata/files/Data_Protection_Committee_Report.pdf"&gt;undertake&lt;/a&gt; expertise-driven standard-setting, enforcement, and adjudication under the Bill.&lt;sup&gt; &lt;/sup&gt; The PDP Bill carries forward this conception of a DPA, which is distinct from the central government.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;Conceptualised as such, the DPA has a completely new set of questions to contend with. Specifically, regulatory bodies require additional safeguards to overcome the legitimacy and accountability questions that &lt;a href="https://www.oxfordhandbooks.com/view/10.1093/law/9780198704898.001.0001/oxfordhb-9780198704898"&gt;arise&lt;/a&gt; when law-making is carried out not by elected members of the legislature, but via the unelected executive. The DPA would need to incorporate democratic decision-making processes to overcome the deficit of public participation in an expert-driven body. Thus, the meta-objective of ensuring autonomous, expertise-driven, and legitimate regulation of personal data processing necessitates that the regulator has sufficient independence from political interference, is populated with subject matter experts and competent decision-makers, and further has democratic decision-making procedures.&lt;/p&gt;
&lt;p&gt;Further, the standard-setting role of the regulator does not receive sufficient attention in terms of providing distinct procedural or substantive safeguards either in the legislation or public policy guidance.&lt;/p&gt;
&lt;h3&gt;Reconnaissance under the PDP Bill: How well does it guide the DPA?&lt;/h3&gt;
&lt;p style="text-align: justify;"&gt;At this time, the PDP Bill is the primary guidance document that defines the DPA and its overall structure. India also lacks an overarching statute or binding framework that lays down granular guidance on regulation-making by regulatory agencies.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;The PDP Bill, in its current iteration, sets out skeletal provisions to guide the DPA in achieving its objectives. Specifically,&amp;nbsp; the Bill provides guidance limited to the following:&lt;/p&gt;
&lt;ol&gt;
&lt;li style="text-align: justify;"&gt;&lt;em&gt;Parliamentary scrutiny of regulations:&lt;/em&gt; The DPA must table all its regulations before the Parliament. This is meant to accord &lt;a href="https://www.nipfp.org.in/media/medialibrary/2018/08/WP_237_2018_0ciIwuT.pdf"&gt;legislative scrutiny&lt;/a&gt; to binding legal standards promulgated by unelected officials.&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;&lt;em&gt;Consistency with the Act:&lt;/em&gt; All regulations should be consistent with the Act and the rules framed under it. This integrates a standard of administrative law to a limited extent within the regulation-making process. &lt;/li&gt;&lt;/ol&gt;
&lt;p style="text-align: justify;"&gt;However, India’s past track record &lt;a href="https://prsindia.org/theprsblog/how-well-does-parliament-examine-rules-framed-under-various-laws"&gt;indicates&lt;/a&gt; that regulations, once tabled before the Parliament, are rarely questioned or scrutinised. Judicial review is typically based on ‘thin’ procedural considerations such as whether the regulation is unconstitutional, arbitrary, &lt;em&gt;ultra vires&lt;/em&gt;, or goes beyond the statutory obligations or jurisdiction of the regulator. In any event, judicial review is possible only when an instrument is challenged by a litigant, and, therefore, it may not always be a robust &lt;em&gt;ex-ante&lt;/em&gt; check on the exercise of this power. A third challenge arises where instruments other than regulations are issued by the regulator. These could be circulars, directions, guidelines, and even FAQs, which are &lt;a href="https://www.nipfp.org.in/media/medialibrary/2018/08/WP_237_2018_0ciIwuT.pdf"&gt;rarely bound&lt;/a&gt; by even the minimal procedural mandate of being tabled before the Parliament. To be sure, older regulators including the Reserve Bank of India (RBI) and the Securities and Exchange Board of India (SEBI) also face similar issues, which they have attempted to address through various methods including voluntary public consultations, stakeholder meetings, and publication of minutes of meetings. These are useful tools for the DPA to consider as well.&lt;/p&gt;
&lt;p&gt;Apart from these, specific guidance is provided with respect to issuing and approving codes of practice and issuing directions as follows:&lt;/p&gt;
&lt;ol&gt;
&lt;li style="text-align: justify;"&gt;Codes of practice: The DPA is required to (i) ensure transparency,&lt;a href="file:///C:/Users/Admin/AppData/Local/Temp/211105_Governance%20Choices%20for%20the%20DPA%20(1).docx#_ftn1"&gt;&lt;sup&gt;&lt;sup&gt;[1]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; (ii) consult with other sectoral regulators and stakeholders, and (iii) follow a procedure to be prescribed by the central government prior to the notification of codes of practice under the Bill.&lt;a href="file:///C:/Users/Admin/AppData/Local/Temp/211105_Governance%20Choices%20for%20the%20DPA%20(1).docx#_ftn2"&gt;&lt;sup&gt;&lt;sup&gt;[2]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;Directions: The DPA may issue directions to individual, regulated entities or their classes from time to time, provided these entities have been given the opportunity to be heard by the DPA before such directions are issued.&lt;a href="file:///C:/Users/Admin/AppData/Local/Temp/211105_Governance%20Choices%20for%20the%20DPA%20(1).docx#_ftn3"&gt;&lt;sup&gt;&lt;sup&gt;[3]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/li&gt;&lt;/ol&gt;
&lt;p style="text-align: justify;"&gt;However, the meaning of transparency and the process for engaging with sectoral regulators remains unspecified under the Bill. Furthermore, the central government has been provided vast discretion to formulate these procedures, as the Bill does not specify the principles or outcomes sought to be achieved via these procedures. The Bill also does not specify instances where such directions may be issued and in which form.&lt;/p&gt;
&lt;p&gt;Thus, as per its last publicly available iteration, the Bill remains silent on the following:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;The principles that may guide the DPA in its functioning.&lt;/li&gt;
&lt;li&gt;The procedure to be followed for issuing regulations and other subordinate legislation under the Bill.&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;The relevant regulatory instruments, other than regulations and codes of practice – such as circulars, guidelines, FAQs, etc. – that may be issued by the DPA.&lt;/li&gt;
&lt;li&gt;The specifics regarding the members and employees within the DPA who are empowered to make these regulations.&lt;/li&gt;&lt;/ul&gt;
&lt;p style="text-align: justify;"&gt;It is unclear whether the JPC will revise the DPA’s structure or recommend statutory guidance for the DPA in executing any of its functions. This is unlikely, given that parent statutes for other regulators typically omit such guidance. As a result, the DPA may be required to make intentional and proactive choices on these matters, much like their regulatory counterparts in India. These are discussed in the section below.&lt;/p&gt;
&lt;h3 style="text-align: justify;"&gt;Envisaging a Proactive Role for the DPA&lt;/h3&gt;
&lt;p&gt;As the primary regulatory body in charge of the enforcement of the forthcoming data protection framework, what should be the role of the DPA in setting standards for data protection?&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;The complexity of the subject matter, and the DPA’s role as the frontline body to define day-to-day operational standards for data protection for the entire digital economy, necessitates that it develop transparent guiding principles and procedures. Furthermore, given that the DPA’s autonomy and capacity are currently unclear, the DPA will need to make deliberate choices regarding how it conducts itself. In this regard, the skeletal nature of the PDP Bill also allows the DPA to determine its own procedures to carry out its tasks effectively.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;This is not uncommon in India: various regulators have devised frameworks to create benchmarks for themselves. The Airports Economic Regulatory Authority (AERA) is &lt;a href="http://aera.gov.in/aera/upload/uploadfiles/files/AERAACT.pdf"&gt;obligated&lt;/a&gt; to follow a dedicated consultation process as per an explicit transparency mandate under the parent statute. However, the Insolvency and Bankruptcy Board of India (IBBI) has, on its own initiative, &lt;a href="https://ibbi.gov.in/webadmin/pdf/legalframwork/2018/Oct/IBBI(Mechamism%20for%20Issuing%20Regulations)%20Regulations,%202018_2018-10-26%2011:59:43.pdf"&gt;formulated regulations&lt;/a&gt; to guide its regulation-making functions. In other cases, consultation processes have been integrated into the respective framework through judicial intervention: the Telecom Regulatory Authority of India (TRAI) has been mandated to undertake consultations through &lt;a href="https://clpr.org.in/wp-content/uploads/2018/10/Cellular-Operators-v.-TRAI.pdf"&gt;judicial interpretation&lt;/a&gt; of the requirement for transparency under the Telecom Regulatory Authority of India Act, 1997 (TRAI Act).&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;In this regard, we develop a list of considerations that the DPA should look to address while carrying out its standard-setting functions. We also draw on best practices by Indian regulators and abroad, which can help identify feasible solutions for an effective DPA for India.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;The choice of regulatory instruments&lt;/strong&gt;&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;The DPA is empowered to issue regulations, codes of practice, and directions under the Bill. At the same time, regulators in India routinely issue other regulatory instruments to assign obligations and clarify them. Some commonly used regulatory instruments are outlined below. The terms used for instruments are not standard across regulators, and the list and description set out below outline the main concepts and not fixed labels for the instruments.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;&lt;em&gt;Overview of regulatory instruments&lt;/em&gt;&lt;/strong&gt;&lt;em&gt; &lt;/em&gt;&lt;/p&gt;
&lt;table&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;/td&gt;
&lt;td&gt;
&lt;p&gt;&lt;strong&gt;Circulars   and Master Circulars&lt;/strong&gt;&lt;/p&gt;
&lt;/td&gt;
&lt;td&gt;
&lt;p&gt;&lt;strong&gt;Guidelines&lt;/strong&gt;&lt;/p&gt;
&lt;/td&gt;
&lt;td&gt;
&lt;p&gt;&lt;strong&gt;FAQs&lt;/strong&gt;&lt;/p&gt;
&lt;/td&gt;
&lt;td&gt;
&lt;p&gt;&lt;strong&gt;Directions&lt;/strong&gt;&lt;/p&gt;
&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;
&lt;p&gt;&lt;strong&gt;Content&lt;/strong&gt;&lt;/p&gt;
&lt;/td&gt;
&lt;td&gt;
&lt;p&gt;Circulars are used to prescribe detailed obligations   and prohibitions for regulated entities and can mimic regulations. Master   circulars consolidate circulars on a particular topic periodically.&lt;/p&gt;
&lt;/td&gt;
&lt;td&gt;
&lt;p&gt;These may be administrative or substantive,   depending on the practice of the regulator in question.&lt;/p&gt;
&lt;/td&gt;
&lt;td&gt;
&lt;p&gt;Issued in public interest by regulators to   clarify the regulatory framework administered by them. They cannot prescribe   new standards or create obligations.&lt;/p&gt;
&lt;/td&gt;
&lt;td&gt;
&lt;p&gt;Issued to provide focused instructions to   individual entities or class of entities in response to an adjudicatory   action or in lieu of a current challenge.&lt;/p&gt;
&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;
&lt;p&gt;&lt;strong&gt;Binding   character&lt;/strong&gt;&lt;/p&gt;
&lt;/td&gt;
&lt;td&gt;
&lt;p&gt;They are generally &lt;a href="https://indiankanoon.org/doc/1588871/"&gt;binding&lt;/a&gt; in the &lt;a href="https://indiankanoon.org/doc/1316639/"&gt;same manner&lt;/a&gt; as regulations and rules. However, if they go beyond   the parent Act or existing rules and regulations, they may be &lt;a href="https://indiankanoon.org/doc/15876695/"&gt;struck down&lt;/a&gt; following a judicial review.&lt;/p&gt;
&lt;/td&gt;
&lt;td&gt;
&lt;p&gt;They may or may not be binding depending   upon the language employed or the regulator’s practice.&lt;/p&gt;
&lt;/td&gt;
&lt;td&gt;
&lt;p&gt;Unclear whether these are binding and to   what extent. However, crucial clarifications on important concepts sometimes   emerge from FAQs.&lt;/p&gt;
&lt;/td&gt;
&lt;td&gt;
&lt;p&gt;Binding in respect of the class of regulated   entities to whom this is issued.&lt;/p&gt;
&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;
&lt;p&gt;&lt;strong&gt;Parliamentary   scrutiny&lt;/strong&gt;&lt;/p&gt;
&lt;/td&gt;
&lt;td colspan="4"&gt;
&lt;p&gt;Unlike regulations, these do not have to be   laid before the Parliament.&lt;/p&gt;
&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;
&lt;p style="text-align: justify;"&gt;Thus, all these instruments, to varying degrees, have &lt;a href="https://www.ncaer.org/news_details.php?nID=1399"&gt;been used&lt;/a&gt; to create binding obligations for regulated entities. The &lt;a href="https://www.nipfp.org.in/media/medialibrary/2018/08/WP_237_2018_0ciIwuT.pdf"&gt;choice of regulatory instrument&lt;/a&gt; is not made systematically. Indeed, even a &lt;a href="https://www.bis.org/bcbs/publ/d321.pdf"&gt;hierarchy of instruments&lt;/a&gt; and their functions are not clearly set out by most regulators. The &lt;a href="https://www.nipfp.org.in/media/medialibrary/2018/08/WP_237_2018_0ciIwuT.pdf"&gt;rationale&lt;/a&gt; for deciding why a circular is issued as against a regulation is also unclear. A study on regulatory performance in India by Burman and Zaveri (2018) has &lt;a href="https://static1.squarespace.com/static/59c0077a9f745650903ac158/t/5cb62147104c7ba2eaf637e4/1555439944606/Burman+V2.pdf"&gt;highlighted&lt;/a&gt; an over-reliance on instruments such as circulars. As per their study, between 2014 and 2016, RBI and SEBI issued 1,016 and 122 circulars, as against 48 and 51 regulations, respectively. These circulars are not bound by the same pre-consultative mandate nor are they mandated to be laid before the Parliament. While circulars may have&amp;nbsp; been intended for routine to routinely used to lay down administrative or procedural requirements, the study narrows its frame of reference to circulars which lay down substantive regulatory requirements. In this instance, it is unclear why parliamentary scrutiny is mandated for regulations alone, and not for instruments like circulars and directions, even though they lay down similarly substantive requirements. 
Furthermore, there have also been&lt;a href="https://indiacorplaw.in/2014/11/are-sebis-faqs-binding-on-partiessebi.html"&gt; instances&lt;/a&gt; where certain instruments like FAQs have gone beyond their advisory scope to provide new directions or definitions that were not previously shared under binding instruments like regulations or circulars.&lt;/p&gt;
&lt;p&gt;The DPA has been provided specific powers to issue regulations, codes of practice, and directions. However, the rationale for issuing one instead of the other has been &lt;a href="https://www.medianama.com/2020/01/223-pdp-bill-2019-data-protection-authority/"&gt;absent&lt;/a&gt; from the PDP Bill so far. In such a scenario, it is important that the DPA transparently outlines the &lt;em&gt;types&lt;/em&gt; of instruments it wishes to use, whether they are binding or advisory, and the procedure to be followed for issuing each.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Pre-legislative consultative rule-making&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;Participatory and consultative processes have emerged as core components of democratic rule-making by regulators. Transparent consultative mechanisms could also ameliorate capacity challenges in a new regulator (particularly for technical matters) and help enhance public confidence in the regulator.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;In India, several regulators have adopted consultation mechanisms even when there is no specific statutory requirement. &lt;a href="https://www.sebi.gov.in/sebiweb/home/HomeAction.do?doListing=yes&amp;amp;sid=4&amp;amp;smid=35&amp;amp;ssid=38"&gt;SEBI&lt;/a&gt; and &lt;a href="https://ibbi.gov.in/public-comments/comments-on"&gt;IBBI&lt;/a&gt; routinely issue discussion papers and consultation papers. The RBI also issues draft instruments &lt;a href="https://www.rbi.org.in/Scripts/DraftNotificationsGuildelines.aspx"&gt;soliciting comments&lt;/a&gt;. As discussed previously, TRAI and AERA have distinct transparency mandates under which they carry out consultations before issuing regulations. However, these processes are not mandated all forms of subordinate legislation. Taking cognizance of this, the Financial Sector Legislative Reform Committee (FSLRC) has &lt;a href="https://dea.gov.in/sites/default/files/fslrc_report_vol1_1.pdf"&gt;recommended&lt;/a&gt; transparency in the regulation-making process. This was &lt;a href="https://dea.gov.in/sites/default/files/Handbook_GovEnhanc_fslrc_2.pdf"&gt;carried forward&lt;/a&gt; by the Financial Stability and Development Council (FSDC), which recommended that consultation processes should be a prerequisite for all subordinate legislations, including circulars, guidelines, etc. A &lt;a href="https://static1.squarespace.com/static/59c0077a9f745650903ac158/t/5cb62147104c7ba2eaf637e4/1555439944606/Burman+V2.pdf"&gt;study&lt;/a&gt; on regulators’ adherence to these mandates, spanning TRAI, AERA, SEBI, and RBI, demonstrated that this pre-consultation mandate is followed inconsistently, if at all. Predictable consultation practices are therefore critical.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;Furthermore, the study stated that it &lt;a href="https://static1.squarespace.com/static/59c0077a9f745650903ac158/t/5cb62147104c7ba2eaf637e4/1555439944606/Burman+V2.pdf"&gt;could not determine&lt;/a&gt; whether the consultation processes yielded meaningful participation, given that regulators are not obligated to disclose how public feedback was integrated into the rule-making process. Subordinate legislations issued in the form of circulars and guidelines also do not typically undergo the same rigorous consultation processes. Thus, an ideal consultation framework would &lt;a href="https://ec.europa.eu/info/sites/default/files/better_regulation_joining_forces_to_make_better_laws_en_0.pdf"&gt;comprise&lt;/a&gt;:&lt;/p&gt;
&lt;ul&gt;
&lt;li style="text-align: justify;"&gt;Publication of the draft subordinate legislation along with a detailed explanation of the policy objectives. Further, the regulator should publish the internal or external studies conducted to arrive at the proposed legislation to &lt;a href="https://legalinstruments.oecd.org/public/doc/669/51f6da97-c198-4c93-922f-1a5d80beae86.pdf"&gt;engender&lt;/a&gt; meaningful discussion.&lt;/li&gt;
&lt;li&gt;Permitting sufficient time for the public and interested stakeholders to respond to the draft.&lt;/li&gt;
&lt;li&gt;Publishing all feedback received for the public to assess, and allowing them to respond to the feedback.&lt;/li&gt;&lt;/ul&gt;
&lt;p&gt;However, beyond specifying the manner of conducting consultations, it will be important for the DPA to determine when consultations are mandatory and binding, and for which types of subordinate legislation. These are discussed in the next section.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Choice of consultation mandates for distinct regulatory      instruments&lt;/strong&gt;&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;While the Bill provides for consultation processes for issuing and approving codes of practice, no such mechanism has been set out for other instruments. Nevertheless, specifying consultation mandates for different regulatory instruments is important to ensure that decision-making is consistent and regulation-making remains bound by transparent and accountable processes. As discussed above, regulatory instruments such as circulars and FAQs are not necessarily bound by the same consultation mandates in India. This distinction has been clarified in more sophisticated administrative law frameworks abroad. For instance, under the Administrative Procedures Act in the United States (US), all substantive rules made by regulatory agencies are &lt;a href="https://www.reginfo.gov/public/reginfo/Regmap/regmap.pdf"&gt;bound&lt;/a&gt; by a consultation process, which requires notice of the proposed rule-making and public feedback. This does &lt;a href="https://www.federalregister.gov/uploads/2011/01/the_rulemaking_process.pdf"&gt;not preclude&lt;/a&gt; the regulatory agency from issuing clarifications, guidelines, and supplemental information on the rules issued. These documents do not require the consultation process otherwise required for formal rules. However, they cannot be used to expand the scope of the rules, set new legal standards, or have the effect of amending the rules. Nevertheless, agencies are not precluded from choosing to seek public feedback on such documents.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;Similarly, the Information Commissioner’s Office in the United Kingdom (UK) takes into consideration &lt;a href="https://ico.org.uk/about-the-ico/ico-and-stakeholder-consultations/"&gt;public consultations&lt;/a&gt; and &lt;a href="https://ico.org.uk/about-the-ico/ico-and-stakeholder-consultations/ico-call-for-views-on-employment-practices/"&gt;surveys&lt;/a&gt; while issuing toolkits and guidance for regulated entities on how to comply with the data protection framework in the UK.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;Here, the DPA may choose to subject strictly binding instruments like regulations and codes of practice to pre-legislative consultation mandates, while softer mechanisms like FAQs may be subject to the publication of a detailed outline of the policy objective or online surveys to invite non-binding, advisory feedback. For each of these, the DPA will nonetheless need to create specific criteria by which it classifies instruments as binding and advisory, and further outline specific pre-legislative mandates for each category.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Framework for issuing regulatory instruments and instructions&lt;/strong&gt;&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;While the DPA is likely to issue several instruments, the system based on which these instruments will be issued is not yet clear. Without a clearly thought-out framework, different departments within the regulator &lt;a href="https://www.nipfp.org.in/media/medialibrary/2018/08/WP_237_2018_0ciIwuT.pdf"&gt;typically issue&lt;/a&gt; a series of directions, circulars, regulations, and other instruments. This raises questions regarding the consistency between instruments. This also requires stakeholders to go through multiple instruments to find the position of law on a given issue. Older Indian regulators are now facing challenges in adapting their ad hoc system into a framework. For example, the RBI currently issues a series of circulars and guidelines that are periodically consolidated on a subject-matter basis as Master Circulars and Master Directions. These are then updated and published on their website. IBBI also publishes &lt;a href="https://ibbi.gov.in/uploads/publication/e42fddce80e99d28b683a7e21c81110e.pdf"&gt;handbooks&lt;/a&gt; and &lt;a href="https://ibbi.gov.in/publication/information-brochures"&gt;information brochures&lt;/a&gt; that consolidate instruments in an accessible manner.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;While these are useful improvements, these practices cannot keep pace with rapid changes in regulatory instructions and are not complete or user-friendly (for example, the subject-matter based consolidation does not allow for filtering regulatory instructions by entity). Other jurisdictions have developed different techniques such as formal codification processes to consolidate regulations issued by government agencies under one &lt;a href="https://www.govinfo.gov/help/cfr"&gt;unified code&lt;/a&gt;, &lt;a href="https://www.oaic.gov.au/privacy/privacy-registers/privacy-codes-register/"&gt;register&lt;/a&gt;, or &lt;a href="https://www.handbook.fca.org.uk/handbook"&gt;handbook&lt;/a&gt;,&amp;nbsp; websites that allow for searches based on different parameters (subject-matter, type of instrument, chronology, entity-based), and &lt;a href="https://www.handbook.fca.org.uk/handbook-guides"&gt;guides&lt;/a&gt; tailored to different types of entities. The DPA, as a new regulator, can learn from this experience and adopt a consistent framework right from the beginning.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;Further, an ethos of responsive regulation also requires the DPA to evaluate and revise directions and regulations periodically, in response to market and technology trends. A commitment to periodic evaluation of subordinate legislations entrenched in the rules is critical to reducing the dependence on officials and leadership, which may change. For instance, the &lt;a href="https://www.ibbi.gov.in/webadmin/pdf/whatsnew/2018/Oct/Mechanism%20for%20issuing%20regulations%20October%20after%20Board%20meeting%20final_2018-10-22%2020:42:06.pdf"&gt;IBBI&lt;/a&gt; has set out a mandatory review of regulations issued by it every three years.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Dedicating capacity for drafting subordinate legislations&lt;/strong&gt;&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;The DPA has been granted the discretion to appoint experts and staff its offices with the personnel it needs. A &lt;a href="https://www2.deloitte.com/content/dam/Deloitte/nl/Documents/risk/deloitte-nl-risk-reports-resources.pdf"&gt;study&lt;/a&gt; of European data protection authorities shows that by the time the General Data Protection Regulation, 2016 became effective, most of the authorities increased the number of employees with some even reporting a 240% increase. The annual spending on the authorities also went up for most countries. While these authorities do not necessarily frame subordinate legislations, they nonetheless create guidance toolkits and codes of practice as part of their supervisory functions.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;In this regard, the DPA will need to ensure it has dedicated capacity in-house to draft subordinate legislations. Since regulators are generally seen as enforcement authorities, there is inadequate investment in capacity-building for drafting legislations in India.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;Moreover, considering the multiplicity of instruments and guidance documents the DPA is expected to issue, it may seek to create templates for these instruments, along with compulsory constituents of different types of instruments. For instance, the Office of the Australian Information Commissioner is required to include a &lt;a href="https://www.oaic.gov.au/privacy/guidance-and-advice/guidelines-for-developing-codes/"&gt;mandatory set of components&lt;/a&gt; while issuing or approving binding industry codes of practice.&lt;/p&gt;
&lt;h3 style="text-align: justify;"&gt;Conclusion&lt;/h3&gt;
&lt;p style="text-align: justify;"&gt;The Personal Data Protection Bill, 2019 (in the final form recommended by the JPC and accepted by the MeitY) will usher in a new chapter in India’s data protection timeline. While the Bill will finally effectuate a nearly comprehensive data protection framework for India, it will also establish a new regulatory framework that sets up a new regulator, the DPA, to oversee the new data protection law. This DPA will be empowered to regulate entities across sectors and is likely to determine the success of the data protection law in India.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;Furthermore, the DPA must not only contend with the complexity of markets and the fast pace of technological change, but it must also address &lt;a href="https://blog.theleapjournal.org/2018/02/a-pragmatic-approach-to-data-protection.html"&gt;anticipated&lt;/a&gt; regulatory capacity deficits, low levels of user literacy, the number and diversity of enities within its regulatory ambit, and the need to secure individual privacy within and outside the digital realm.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;Thus, looking ahead, we must account for the questions of governance that the forthcoming DPA is likely to face, as these will directly impact how entities and citizens engage with the DPA. In India, regulatory agencies adopt distinct choices to fulfil their functions. Regulators have also &lt;a href="https://static1.squarespace.com/static/59c0077a9f745650903ac158/t/5cb62147104c7ba2eaf637e4/1555439944606/Burman+V2.pdf"&gt;fared variably&lt;/a&gt; in ensuring transparent and accountable decision-making driven by demonstrable expertise. Even if the final form of the PDP Bill does not address these gaps, the DPA has the opportunity to integrate benchmarks and best practices as discussed above within its own governance framework from the get-go as it takes on its daunting responsibilities under the PDP Bill.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;&lt;em&gt;(&lt;span id="docs-internal-guid-6bf51b9e-7fff-d2ac-d0fb-f42bcdd7f599"&gt;The authors are Research Fellow, Law, Technology and Society Initiative and Project Lead, Regulatory Governance Project respectively at the National Law School of India University, Bangalore. Views are personal.)&lt;/span&gt;&lt;/em&gt;&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;&lt;span id="docs-internal-guid-6bf51b9e-7fff-d2ac-d0fb-f42bcdd7f599"&gt;&lt;em&gt;This post was reviewed by Vipul Kharbanda and Shweta Mohandas&lt;/em&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;
&lt;h3 style="text-align: justify;"&gt;References&lt;/h3&gt;
&lt;ul&gt;
&lt;li style="text-align: justify;"&gt;For a discussion on distinct regulatory choices, please see TV Somanathan, &lt;em&gt;The Administrative and Regulatory State&lt;/em&gt; in Sujit Choudhary, Madhav Khosla, et al. (eds), &lt;a href="https://www.oxfordhandbooks.com/view/10.1093/law/9780198704898.001.0001/oxfordhb-9780198704898"&gt;Oxford Handbook of the Indian Constitution&lt;/a&gt; (2016).&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;On best practices for consultative law-making, see generally &lt;em&gt;European Union Better Regulation &lt;/em&gt;&lt;a href="https://ec.europa.eu/info/sites/default/files/better_regulation_joining_forces_to_make_better_laws_en_0.pdf"&gt;&lt;em&gt;Communication&lt;/em&gt;&lt;/a&gt;, &lt;em&gt;Guidelines for Effective Regulatory Consultations &lt;/em&gt;(&lt;a href="https://www.tbs-sct.gc.ca/rtrap-parfa/erc-cer/erc-cer-eng.pdf"&gt;Canada&lt;/a&gt;),&amp;nbsp; and&lt;em&gt; &lt;/em&gt;&lt;a href="https://read.oecd-ilibrary.org/governance/the-governance-of-regulators_9789264209015-en#page81"&gt;&lt;em&gt;OECD&lt;/em&gt;&lt;/a&gt;&lt;em&gt; &lt;/em&gt;&lt;em&gt;Best Practice Principles for Regulatory Policy: The Governance of Regulators&lt;/em&gt;,&lt;em&gt; 2014.&lt;/em&gt;&lt;/li&gt;&lt;/ul&gt;
&lt;hr align="left" size="1" width="33%" /&gt;
&lt;p&gt;&lt;a href="file:///C:/Users/Admin/AppData/Local/Temp/211105_Governance%20Choices%20for%20the%20DPA%20(1).docx#_ftnref1"&gt;&lt;sup&gt;&lt;sup&gt;[1]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Personal Data Protection Bill 2019, § 50(3).&lt;/p&gt;
&lt;p&gt;&lt;a href="file:///C:/Users/Admin/AppData/Local/Temp/211105_Governance%20Choices%20for%20the%20DPA%20(1).docx#_ftnref2"&gt;&lt;sup&gt;&lt;sup&gt;[2]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Personal Data Protection Bill 2019, § 50(4).&lt;/p&gt;
&lt;p&gt;&lt;a href="file:///C:/Users/Admin/AppData/Local/Temp/211105_Governance%20Choices%20for%20the%20DPA%20(1).docx#_ftnref3"&gt;&lt;sup&gt;&lt;sup&gt;[3]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Personal Data Protection Bill 2019, § 51.&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/trishi-jindal-and-s-vivek-beyond-the-pdp-bill'&gt;https://cis-india.org/internet-governance/blog/trishi-jindal-and-s-vivek-beyond-the-pdp-bill&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Trishi Jindal and S.Vivek</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2021-11-10T07:32:33Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/indian-express-rajat-kathuria-isha-suri-big-tech-consumers-privacy-policy">
    <title>Big Tech’s privacy promise to consumers could be good news — and also bad news</title>
    <link>https://cis-india.org/internet-governance/blog/indian-express-rajat-kathuria-isha-suri-big-tech-consumers-privacy-policy</link>
    <description>
        &lt;b&gt;Rajat Kathuria, Isha Suri write: Its use as a tool for market development must balance consumer protection, innovation, and competition.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;In February, Facebook, rebranded as Meta, stated that its revenue in 2022 is anticipated to reduce by $10 billion due to steps undertaken by Apple to enhance user privacy on its mobile operating system. More specifically, Meta attributed this loss to a new AppTrackingTransparency feature that requires apps to request permission from users before tracking them across other apps and websites or sharing their information with and from third parties. Through this change, Apple effectively shut the door on “permissionless” internet tracking and has given consumers more control over how their data is used. Meta alleged that this would hurt small businesses benefiting from access to targeted advertising services and charged Apple with abusing its market power by using its app store to disadvantage competitors under the garb of enhancing user privacy.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;Access the full article published in the &lt;a class="external-link" href="https://indianexpress.com/article/opinion/columns/big-tech-consumers-privacy-policy-7866701/"&gt;Indian Express&lt;/a&gt; on April 13, 2022&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/indian-express-rajat-kathuria-isha-suri-big-tech-consumers-privacy-policy'&gt;https://cis-india.org/internet-governance/blog/indian-express-rajat-kathuria-isha-suri-big-tech-consumers-privacy-policy&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Rajat Kathuria and Isha Suri</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2023-01-18T23:25:28Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/raw/inputs-to-report-on-non-personal-data-governance-framework">
    <title>Inputs to the Report on the Non-Personal Data Governance Framework</title>
    <link>https://cis-india.org/raw/inputs-to-report-on-non-personal-data-governance-framework</link>
    <description>
        &lt;b&gt;This submission presents a response by researchers at the Centre for Internet and Society, India (CIS) to the draft Report on Non-Personal Data Governance Framework prepared by the Committee of Experts under the Chairmanship of Shri Kris Gopalakrishnan. The inputs are authored by Aayush Rathi, Aman Nair, Ambika Tandon, Pallavi Bedi, Sapni Krishna, and Shweta Mohandas (in alphabetical order), and reviewed by Sumandro Chattapadhyay.&lt;/b&gt;
        
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;h4&gt;Text of submitted inputs: &lt;a href="https://cis-india.org/raw/files/cis-inputs-to-report-on-non-personal-data-governance-framework" target="_blank"&gt;Read&lt;/a&gt; (PDF)&lt;/h4&gt;
&lt;h4&gt;Report by the Committee of Experts on Non-Personal Data Governance Framework: &lt;a href="https://static.mygov.in/rest/s3fs-public/mygov_159453381955063671.pdf" target="_blank"&gt;Read&lt;/a&gt; (PDF)&lt;/h4&gt;
&lt;hr /&gt;
&lt;h2&gt;Inputs&lt;/h2&gt;
&lt;h3&gt;Clause 3.7 (v): The role of the Indian government in the operation of data markets&lt;/h3&gt;
&lt;p&gt;While highlighting India’s potential to be one of the world’s top consumer and data markets, the clause also sheds light on concerns about the possibility of data monopolies. It envisions the role of the Indian government as a regulator of, and a catalyst for, domestic data markets.&lt;/p&gt;
&lt;p&gt;In doing so, the clause does not acknowledge the proactive and dominant roles that the Indian government plays in the generation and reuse of data, given its existing data collection practices, the compulsory sharing provisions in the Report, and the provisions that would continue under the Personal Data Protection Bill. In reality, the Indian government’s role is not just that of a catalyst but also of a key player, potentially with monopolistic market power, in the domestic data market, especially due to the ongoing data marketplace initiatives detailed in published policy and vision documents. [1]&lt;/p&gt;
&lt;h3&gt;Clause 3.8 (iv): Introducing collective privacy&lt;/h3&gt;
&lt;p&gt;The introduction of collective privacy has initiated an overdue discussion at the policy level to arrive at privacy formulations that account for limitations in the contemporary dominant social, legal and ethical paradigms of privacy premised on individual interests and personal harm. The notion of collective privacy has garnered contemporary attention with the rise of data processing technologies and business models that thrive on the collection and processing of aggregate information.&lt;/p&gt;
&lt;p&gt;While the Report acknowledges that collective privacy is an evolving concept, it does not attempt to define either the collective or what privacy could entail in the context of a collective. The postulation of collective privacy as a legally binding right is beset with challenges in both domestic and international legal frameworks. [2]&lt;/p&gt;
&lt;p&gt;Central to these challenges is the representation of the group or entity. While the Report illustrates harms to certain collectives that collective privacy could protect against, these illustrated collectives are already recognised in law as rights-holding groups (society members, for example), and/or share pre-determined attributes (sexual orientation, for example).&lt;/p&gt;
&lt;p&gt;The Report does not acknowledge that the very technological processes that may have rendered the articulation of collective privacy necessary are also designed to create ad hoc, newer sets of individuals or groups with shared attributes. [3] In doing so, the Report furthers an ontology of groups as having intuitive, predetermined attributes that exist naturally or in law, whereas the intervention of data collection and processing technologies can determine shared group attributes afresh. Moreover, the Report also treats predetermined attributes as static, and in doing so, ignores a vast existing literature speaking to the fluidity of identities and the intersectionality of the identities that individuals in groups occupy. [4] We fully appreciate the challenges these pose in determining the legal contours of collective privacy. Much of the Report’s recommendations are premised on the idea of a predetermined collective, rendering a more granular exploration of these ideas urgent.&lt;/p&gt;
&lt;p&gt;Further, the Report also puts forth a limited conception of privacy as a safeguard against data-related harms that may be caused to collectives. In doing so, it dilutes the conceptualisation of individual privacy as articulated in Justice K. S. Puttaswamy (Retd.) and Anr. vs Union of India and Ors. Notwithstanding this dilution, the illustrations indicate only harms that may be caused by private actors. Any further recommendations should also envision the harms that may be caused by public data-driven processes, such as those incubated within the state machinery.&lt;/p&gt;
&lt;h3&gt;Clause 4.1 (iii) and Recommendation 1: Defining Non-Personal Data&lt;/h3&gt;
&lt;p&gt;The Report proposes a definition of non-personal data that includes (i) data that was never related to an identified or identifiable natural person, and (ii) aggregated, anonymised personal data such that individual events are “no longer identifiable”. In doing so, it attempts to extend protections to categories of data that fall outside the ambit of the Personal Data Protection Bill, 2019 (hereafter “PDP Bill”). The Report is cognizant of the fallible nature of anonymisation techniques, but fails to indicate how their failures may be addressed. The test of anonymisation for regarding data as non-personal data requires further clarification. Anonymisation, in and of itself, is an ambiguous standard; scholarship has indicated that anonymised data may never be completely anonymous. [5] Despite this, the PDP Bill proposes a high, zero-risk threshold for anonymisation in relation to personal data, defined as “such irreversible process of transforming or converting personal data to a form in which a data principal cannot be identified”. From a plain reading, it appears that the Report proposes a lower threshold for the anonymisation requirements governing non-personal data. It is unclear how non-personal data would then differ from inferred data as described within the definition of personal data under the PDP Bill. This adds regulatory uncertainty, making it imperative for the Committee to articulate bright-line, risk-based principles and rules for the test of anonymisation. Such rules should also indicate the factors to be taken into account in determining whether anonymisation has occurred, and the timescale of reference for anonymisation outcomes. [6]&lt;/p&gt;
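The fallibility of anonymisation noted above can be illustrated with a short, purely hypothetical sketch of a linkage attack: joining an "anonymised" dataset with public auxiliary data on shared quasi-identifiers. All names, values, and the `link` helper below are invented for illustration and do not come from the Report.

```python
# Hypothetical linkage attack: removing direct identifiers (names) does not
# guarantee anonymity when quasi-identifiers survive. All data is invented.

# "Anonymised" health records: names removed, quasi-identifiers retained.
anonymised = [
    {"pin": "560001", "birth_year": 1987, "gender": "F", "diagnosis": "diabetes"},
    {"pin": "560001", "birth_year": 1991, "gender": "M", "diagnosis": "asthma"},
    {"pin": "560034", "birth_year": 1987, "gender": "F", "diagnosis": "hypertension"},
]

# Public auxiliary data (e.g. a voter roll) carrying the same quasi-identifiers.
auxiliary = [
    {"name": "A. Rao",   "pin": "560001", "birth_year": 1987, "gender": "F"},
    {"name": "B. Singh", "pin": "560001", "birth_year": 1991, "gender": "M"},
]

QUASI_IDENTIFIERS = ("pin", "birth_year", "gender")

def link(anon_rows, aux_rows, keys=QUASI_IDENTIFIERS):
    """Re-identify records whose quasi-identifier combination matches exactly
    one person in the auxiliary data."""
    matches = []
    for rec in anon_rows:
        hits = [p for p in aux_rows if all(p[k] == rec[k] for k in keys)]
        if len(hits) == 1:  # a unique combination re-identifies the person
            matches.append((hits[0]["name"], rec["diagnosis"]))
    return matches

print(link(anonymised, auxiliary))
# → [('A. Rao', 'diabetes'), ('B. Singh', 'asthma')]
```

Two of the three "anonymised" records are re-identified because their quasi-identifier combinations are unique in the auxiliary data. This is the intuition behind risk-based (rather than zero-risk) anonymisation standards: the risk depends on what other datasets exist.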
&lt;p&gt;The recommendation also states that the data principal should "also provide consent for anonymisation and usage of this anonymized data while providing consent for collection and usage of his/her personal data". However, the framing of this recommendation fails to mention the data fiduciary’s responsibility to give the data principal notice of how the anonymised data will be used while seeking the data principal’s consent for anonymisation. The notice should clearly indicate that the data principal’s consent is based on their knowledge of the use of the anonymised data.&lt;/p&gt;
&lt;h3&gt;Clause 4.8 (i), (ii): Function of data custodians&lt;/h3&gt;
&lt;p&gt;The Report does not make clear who may perform the role of data custodians. The use of the term ‘data fiduciary’ suggests a potential import of the definition of ‘data fiduciary’ specified under Clause 3.13 of the PDP Bill. However, this needs to be clarified further.&lt;/p&gt;
&lt;h3&gt;Clause 4.8 (iii): Data custodians’ “duty of care”&lt;/h3&gt;
&lt;p&gt;As outlined in the following section on data trustees, it can be difficult for a single entity to maintain a duty of care and act in the best interests of a community when that community consists of sub-communities that may be marginalised.&lt;/p&gt;
&lt;p&gt;Further, ‘duty of care’, ‘best interest’, and ‘absence of harm’ are not sufficient standards for data processing by data custodians. Recommendations obligating data custodians to uphold the rights of data principals, including economic and fundamental rights, need to be incorporated in the framework.&lt;/p&gt;
&lt;h3&gt;Clause 4.9: Data trustees&lt;/h3&gt;
&lt;p&gt;The Committee’s suggestion that the “most appropriate representative body” should be the data trustee (often either the corresponding government entity or a community body) is reasonable at face value. However, in the absence of any clear principles defining what constitutes “most appropriate”, a number of potential issues can arise:&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Lack of means for selecting a data trustee:&lt;/strong&gt; The Report notes that both private and public entities can be selected as data trustees, but offers no principles on how these trustees are to be selected, i.e. whether they are to be chosen directly by the members of a community, and if so, how. Any selection criteria or process prescribed must keep in mind the following point regarding the potential lack of representation for marginalised communities that could arise from the direct selection of a data trustee by a group of people.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Issues with a single data trustee for large-scale communities and when dealing with marginalised communities:&lt;/strong&gt; The report assumes that where a community is spread across a geographic region, or consists of multiple sub-communities, the data trustee will be the closest shared government authority (for example, the Ministry of Health and Family Welfare, Government of India acting as data trustee for data on diabetes among Indian citizens).&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;This idea of a single data trustee assumes that the ‘best interests’ of a community are uniform across that community. This can prove problematic, especially when dealing with data obtained from marginalised communities that forms part of a wider dataset.&lt;/strong&gt; It is entirely possible that a smaller disenfranchised community may have interests that are not aligned with those of the general majority. In such a situation, the Report is unclear as to whether the data trustee would have to ensure the best interests of all groups, or only those of the largest number of people within that community.&lt;/p&gt;
&lt;p&gt;There are power differentials between citizens, government agencies, and the other entities described by the Report. This places citizens at risk of abuse of power by government entities acting as trustees, who are effectively empowered through this policy framework rather than through any representative mechanism. It is recommended that data trustees be appointed by the relevant communities through clear and representative mechanisms. Additionally, any individual should be able to file complaints regarding the discharge of community trust by data trustees. This is necessary because any rights subsequently vested in the community can only be exercised through the data trustee, and become unenforceable in the absence of an appropriate data trustee.&lt;/p&gt;
&lt;p&gt;Any legislation based on this report will therefore have to provide not only a means for selecting the data trustee, but also safeguards ensuring that data collected from marginalised communities is used with their specific best interests in mind, with those interests informed through consultation with the community concerned.&lt;/p&gt;
&lt;h3&gt;Clause 4.10 (iii): Data trusts&lt;/h3&gt;
&lt;p&gt;Clause 4.10 (iii) notes that data custodians may voluntarily share data with these data trusts. However, it is unclear whether such sharing must be done with the express consent of the relevant data trustee.&lt;/p&gt;
&lt;h3&gt;Clause 4.10 (iv): Mandatory sharing and competition&lt;/h3&gt;
&lt;p&gt;The fundamental premise of a mandatory data sharing regime appears increasingly disconnected from its practical impacts. The EU, which earlier championed the cause, now seems reluctant to pursue it in the face of studies that point to the counterproductive effects of such measures. Such mandates could apply to the huge volumes of first-party data that companies collect on their own assets, products, and services, even though such data are among the least likely to create barriers to entry or contribute to abuses of dominant positions. [7] These measures are therefore more likely to chill innovation and investment than to foster a pro-competitive environment. The velocity of big data further undermines the utility of such data sharing mandates. [8] It is recommended that a sectoral analysis of this mandate be undertaken instead of an overarching stipulation.&lt;/p&gt;
&lt;p&gt;The Report suggests extensive data sharing without addressing the extent of the obligation on private players to submit to and process such requests. Meta-data about the data collected may be made easily accessible under transparency mandates; however, access to the detailed underlying data will be difficult in most cases given the current structure of entities operating in cyberspace, as evidenced by poor compliance with similar mandates imposed by courts in the EU. Such a system could easily eliminate the comparative advantage of smaller players, enabling the growth of larger players with more resources at their disposal while throttling smaller ones. It could also have serious implications for data quality and integrity through the sharing of erroneous data, and may compromise access to high-quality digital services in India. If this regime is pursued without amendments to address these concerns, it may prove counterproductive.&lt;/p&gt;
&lt;h3&gt;Clause 5.1 (iv): Grievance redressal against state’s role&lt;/h3&gt;
&lt;p&gt;This clause acknowledges the vast potential for government authorities and other bodies to abuse their power as data trustees. In addition, it should provide for impartial and accessible mechanisms through which citizens can complain against such abuse of power, along with appropriate penalties, including removal of the data trustee.&lt;/p&gt;
&lt;h3&gt;Chapter 7, Recommendation 5: Purpose of data-sharing&lt;/h3&gt;
&lt;p&gt;Recommendation 5 leaves scope for “national security” as a sovereign purpose for data sharing. This continues the trend of including an overarching national security clause, as in the Personal Data Protection Bill, 2019. Provisions could instead enable access to data for sovereign purposes without such a broad definition, replacing it with constitutional terms that limit it to the confines laid down in the Constitution. This would effectively curb misuse of the provision, firmly embed the proposed regulation of non-personal data in constitutional ethos, and prevent future conflicts with fundamental rights.&lt;/p&gt;
&lt;p&gt;Platform companies have leveraged their position in society to take on an ever-greater number of quasi-public functions, exercising new forms of unaccountable, transnational authority. It is not difficult to imagine this trend extending to non-platform companies, or being taken further by these very entities, which also have access to a large share of non-personal data. While a strict division between sovereign purposes and core public interest purposes may be difficult, it is imperative to define both terms more clearly, as a broad-based definition may facilitate reduced accountability. Failing to separate government actions from sovereign purposes could entrench the power imbalance between the State and its people, while in the case of non-governmental entities it would facilitate the encroachment of government functions by private players. Neither case is likely to serve the best interests of the data generators, or of the people at large.&lt;/p&gt;
&lt;h3&gt;Clause 7.1 (i): Data needs of law enforcement&lt;/h3&gt;
&lt;p&gt;Clause 7.1 (i) allows the acquisition of data governed by this framework for crime mapping, devising anticipatory and preventive measures, and for investigations and law enforcement. While such access may be necessary for law enforcement in certain cases, it should be granted only with the express permission of a court of law. Blanket executive access creates a higher risk of misuse by those involved in law enforcement.&lt;/p&gt;
&lt;h3&gt;Clause 7.2 (iv): Use of health data as a pilot&lt;/h3&gt;
&lt;p&gt;The clause suggests using health sector data as a pilot use-case. This is highly undesirable, as the greater part of health-related data is inherently highly sensitive. The high potential of such data to harm data principals should act as a deterrent to its use as a pilot. Given the mass availability of health-related data due to the pandemic, such use would create further points of vulnerability through which data could be illegally monetised and misappropriated. It is recommended that this proposal be scrapped altogether.&lt;/p&gt;
&lt;h3&gt;Clause 7.2 (iii): Power of government bodies&lt;/h3&gt;
&lt;p&gt;As per this clause, data trustees or government bodies (which could also be acting as data trustees) can make requests for data sharing and place such data in appropriate data infrastructures or trusts. This presents a conflict of interest, as a government body could effectively empower itself to be the data trustee. Such cases should be addressed within the scope of the framework.&lt;/p&gt;
&lt;h3&gt;Clause 8.2 (vii): Level-playing field for all Indian actors&lt;/h3&gt;
&lt;p&gt;Under this clause, the “Non-Personal Data Authority (Authority) will ensure a level playing field for all Indian actors to fulfil the objective of maximising Indian data’s value to the Indian economy”. The emphasis on ensuring a level playing field only for Indian actors, instead of a non-discriminatory platform for all actors irrespective of nationality, has the potential of violating India’s trade obligations under the WTO. WTO member states are essentially restricted from discriminating between products and services coming from different WTO members, and between foreign and domestic products and services, unless they can avail of exceptions. There is also no clarity on what constitutes ‘Indian actors’: would a multinational corporation headquartered in a foreign state, but with subsidiaries in India, come within its ambit?&lt;/p&gt;
&lt;h3&gt;Clause 8.2 (x): Composition of the Authority&lt;/h3&gt;
&lt;p&gt;Clause 8.2 (x) states that the Authority will have some members with relevant industry experience. However, apart from this clause, the report is silent on the composition of the Authority. The report recognises that the Authority will need individuals and organisations with specialised knowledge (i.e. of data governance, technology, and the latest research and innovation in the field of non-personal data); however, it does not mention the role of civil society organisations or the need for representation from such organisations in the Authority.&lt;/p&gt;
&lt;p&gt;The report frequently alludes to non-personal data being used in the best interest of the data principal, and it is therefore essential that the composition of the Authority reflect the inherent asymmetry of power between the data principal and the State. Considering that the Authority will also be responsible for the sharing of community data and for determining the code of conduct for such sharing, it is important that the Authority have adequate representation from civil society organisations along with groups or individuals possessing the necessary technological and legal skills.&lt;/p&gt;
&lt;h3&gt;Clause 8.2 (iii) and (vi): Roles and Responsibility of the Authority&lt;/h3&gt;
&lt;p&gt;A majority of the datasets in the country are ‘mixed datasets’, i.e. they consist of both personal and non-personal data. However, there is a lack of clarity about the coordination between the Data Protection Authority constituted under the PDP Bill and the Non-Personal Data Authority with regard to the regulation of such datasets. The Report refers to the European Union’s approach, under which the Non-Personal Data Regulation applies to the non-personal data part of mixed datasets, but where the non-personal and personal data parts are ‘inextricably linked’, the General Data Protection Regulation applies to the whole mixed dataset. However, it is unclear whether the Report proposes the same mechanism for the regulation of mixed datasets.&lt;/p&gt;
&lt;p&gt;Further, the contours of the Committee’s enforcement role should be clearly specified. Will the Committee also have penal powers as prescribed for the Data Protection Authority under the PDP Bill? And will the privacy concerns emanating from the risk of re-identification of anonymised data be addressed by the NPD Committee or by the DPA under the PDP Bill? Ideally, it should be specified that any such privacy concerns fall within the domain of the DPA, as the data is then converted into personal data and the DPA will be empowered to deal with such issues.&lt;/p&gt;
&lt;h3&gt;Endnotes&lt;/h3&gt;
&lt;p&gt;[1] See Ministry of Health and Family Welfare. (2020). National Digital Health Blueprint. Government of India. &lt;a href="https://main.mohfw.gov.in/sites/default/files/Final%20NDHB%20report_0.pdf"&gt;https://main.mohfw.gov.in/sites/default/files/Final%20NDHB%20report_0.pdf&lt;/a&gt;; Tandon, A. (2019). Big Data and Reproductive Health in India: A Case Study of the Mother and Child Tracking System. &lt;a href="https://cis-india.org/raw/big-data-reproductive-health-india-mcts"&gt;https://cis-india.org/raw/big-data-reproductive-health-india-mcts&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;[2] Taylor, L., Floridi, L., van der Sloot, B. eds. (2017) Group Privacy: new challenges of data technologies. Dordrecht: Springer.&lt;/p&gt;
&lt;p&gt;[3] Mittelstadt, B. (2017). From Individual to Group Privacy in Big Data Analytics. Philos. Technol. 30, 475–494.&lt;/p&gt;
&lt;p&gt;[4] See Taylor, L., Floridi, L., van der Sloot, B. eds. (2017) Group Privacy: new challenges of data technologies. Dordrecht: Springer; Tisne, M. (n.d). The Data Delusion: Protecting Individual Data Isn't Enough When The Harm is Collective. Stanford Cyber Policy Centre. &lt;a href="https://cyber.fsi.stanford.edu/publication/data-delusion"&gt;https://cyber.fsi.stanford.edu/publication/data-delusion&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;[5] Rocher, L., Hendrickx, J.M. &amp;amp; de Montjoye, Y. (2019). Estimating the success of re-identifications in incomplete datasets using generative models. Nat Commun 10, 3069. &lt;a href="https://doi.org/10.1038/s41467-019-10933-3"&gt;https://doi.org/10.1038/s41467-019-10933-3&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;[6] Finck, M. &amp;amp; Pallas, F. (2020). They who must not be identified—distinguishing personal from non-personal data under the GDPR. International Data Privacy Law, 10 (1), 11–36. &lt;a href="https://doi.org/10.1093/idpl/ipz026"&gt;https://doi.org/10.1093/idpl/ipz026&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;[7] European Commission (2020). Communication From The Commission To The European Parliament, The Council, The European Economic And Social Committee And The Committee Of The Regions: A European strategy for data. &lt;a href="https://eur-lex.europa.eu/legal-content/EN/TXT/?qid=1593073685620&amp;amp;uri=CELEX:52020DC0066"&gt;https://eur-lex.europa.eu/legal-content/EN/TXT/?qid=1593073685620&amp;amp;uri=CELEX:52020DC0066&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;[8] Modrall, J. (2019). Antitrust risks and Big Data. Norton Rose Fulbright. &lt;a href="https://www.nortonrosefulbright.com/en-in/knowledge/publications/64c13505/antitrust-risks-and-big-data"&gt;https://www.nortonrosefulbright.com/en-in/knowledge/publications/64c13505/antitrust-risks-and-big-data&lt;/a&gt;&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/raw/inputs-to-report-on-non-personal-data-governance-framework'&gt;https://cis-india.org/raw/inputs-to-report-on-non-personal-data-governance-framework&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>sumandro</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Data Systems</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Researchers at Work</dc:subject>
    
    
        <dc:subject>Digital Economy</dc:subject>
    
    
        <dc:subject>Data Governance</dc:subject>
    
    
        <dc:subject>Submissions</dc:subject>
    

   <dc:date>2020-12-30T09:40:52Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>




</rdf:RDF>
