<?xml version="1.0" encoding="utf-8" ?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:syn="http://purl.org/rss/1.0/modules/syndication/" xmlns="http://purl.org/rss/1.0/">




    



<channel rdf:about="https://cis-india.org/search_rss">
  <title>Centre for Internet and Society</title>
  <link>https://cis-india.org</link>
  
  <description>These are the search results for the query, showing results 1 to 15.</description>
  
  
  
  
  <image rdf:resource="https://cis-india.org/logo.png"/>

  <items>
    <rdf:Seq>
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/cis-comments-and-feedback-to-digital-personal-data-protection-rules-2025"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/privacy-policy-framework-for-indian-metal-health-apps"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/reconfiguring-data-governance-insights-from-india-and-eu"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/raw/unpacking-algorithmic-infrastructures"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/quint-shweta-mohandas-and-pallavi-bedi-june-19-2023-cowin-data-breach-health-sensitive-details-policies-solution"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/cis-comments-recommendations-to-digital-data-protection-bill"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/demistifying-data-breaches-in-india"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/nha-data-sharing-guidelines"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/rssr-anamika-kundu-digvijay-s-chaudhary-april-20-2022-cctvs-in-public-spaces-and-data-protection-bill-2021"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/news-nine-shweta-mohandas-and-anamika-kundu-personal-data-protection-bill-must-examine-data-collection-practices-that-emerged-during-pandemic"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/ijlt-shweta-mohandas-and-anamika-kundu-march-6-2022-nothing-to-kid-about-childrens-data-under-the-new-data-protection-bill"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/medianama-february-21-2022-amber-sinha-data-protection-bill-digital-healthcare-case-study"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/medianama-february-18-2021-amber-sinha-data-protection-bill-consent-clause-state-function"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/pallavi-bedi-and-shweta-mohandas-cis-comments-on-data-protection-bill"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/the-competition-law-case-against-whatsapp2019s-2021-privacy-policy-alteration"/>
        
    </rdf:Seq>
  </items>

</channel>


    <item rdf:about="https://cis-india.org/internet-governance/blog/cis-comments-and-feedback-to-digital-personal-data-protection-rules-2025">
    <title>The Centre for Internet and Society’s comments and feedback to the Digital Personal Data Protection Rules 2025</title>
    <link>https://cis-india.org/internet-governance/blog/cis-comments-and-feedback-to-digital-personal-data-protection-rules-2025</link>
    <description>
        &lt;b&gt;The Centre for Internet &amp; Society (CIS) submitted its comments and feedback to the Digital Personal Data Protection Rules 2025 initiated by the Indian government.&lt;/b&gt;
        &lt;p&gt;&lt;b&gt;&lt;span style="text-decoration: underline;"&gt;Rule 3 - Notice given by data fiduciary to data principal&lt;/span&gt;&lt;/b&gt; - Under Section 5(2) of the DPDP Act, when the personal data of the data principal has been processed before the commencement of the Act, the data fiduciary is required to give notice to the data principal “as soon as reasonably practicable”. However, the Rules fail to specify what “reasonably practicable” means, leaving the timeline for notice in such circumstances unclear.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;In addition, under Rule 3(a) the phrase “be presented and be understandable independently” is ambiguous. It is not clear whether the consent notice has to be presented independently of any other information or whether it only needs to be independently understandable and can be presented along with other information. &lt;/li&gt;
&lt;li&gt;In addition to this, we suggest that the requirement of “privacy by design” mentioned in the earlier drafts be brought back, with a focus on preventing deceptive design practices (dark patterns) from being used while collecting data.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;br /&gt;&lt;b&gt;&lt;span style="text-decoration: underline;"&gt;Rule 4 - Registration and obligations of Consent Manager&lt;/span&gt;&lt;/b&gt; - The concept of independent consent managers, similar to account aggregators in the financial sector and consent manager platforms in the EU, is a positive step. However, the Act and the Rules need to flesh out the interplay between the data fiduciary and the consent manager in more detail: for example, how does the data fiduciary know whether a data principal is using a consent manager, under what circumstances can the data fiduciary bypass the consent manager, and what are the penalties or consequences for doing so?&lt;/p&gt;
&lt;p&gt;&lt;b&gt;&lt;span style="text-decoration: underline;"&gt;Rule 6 - Reasonable security safeguards&lt;/span&gt;&lt;/b&gt; - While we appreciate the guidance provided on security measures such as “encryption, obfuscation or masking or the use of virtual tokens”, it would also be useful to refer to the SPDI Rules and include the International Standard IS/ISO/IEC 27001 on Information Technology - Security Techniques - Information Security Management System as an illustration to guide data fiduciaries.&lt;/p&gt;
&lt;p&gt;&lt;b&gt;&lt;span style="text-decoration: underline;"&gt;Rule 7 - Intimation of personal data breach&lt;/span&gt;&lt;/b&gt; - As per the Rules, the data fiduciary, on becoming aware of any personal data breach, is required to notify the data principal and the Data Protection Board without delay; a plain reading of this Rule suggests that the data fiduciary has to report the breach almost immediately, which could be a practical challenge. Further, the absence of any threshold (materiality, gravity of the breach, etc.) for notifying the data principal means that the data fiduciary will have to inform the data principal even of an isolated data breach that may have no impact on the data principal. In this context, we recommend the Rule be amended to state that the data fiduciary should be required to inform the Data Protection Board about every data breach, while the data principal should be informed depending on the gravity and materiality of the breach, and when it is likely to result in high risk to the data principal.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;While the Rules provide for intimation of a data breach, there is no provision requiring the Data Fiduciary to actually take measures to mitigate the risk arising out of the breach. Although there is an obligation to report any such measures to the Data Principal (Rule 7(1)(c)) as well as to the DPBI (Rule 7(2)(b)(iii)), no positive obligation is imposed on the Data Fiduciary to implement them. The Rules and the Act merely presume that the Data Fiduciary will take mitigation measures - which is perhaps why the notification requirements exist - but do not oblige it to do so. This could lead to a situation where a Data Fiduciary takes no measures to mitigate the risks arising out of a data breach, yet remains in compliance with its legal obligations by merely notifying the Data Principal and the DPBI that no measures have been taken. In addition, the SPDI Rules state that in the event of a breach the body corporate is required to demonstrate that it had implemented reasonable security standards. This provision could be incorporated into this Rule to emphasise the need to implement robust security standards, which is one way to curb data breaches and to ensure that there is a protocol to mitigate a breach.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;b&gt;&lt;span style="text-decoration: underline;"&gt;Rule 10 - Verifiable consent for processing of personal data of child or of person with disability who has a lawful guardian&lt;/span&gt;&lt;/b&gt; - The two mechanisms provided under the Rules to verify the age and identity of parents presuppose a high degree of digital literacy on the part of the parents, who may give or refuse consent without fully considering the consequences of doing so. As there is always a risk of individuals not providing correct information about their age or their relationship with the child, platforms may have to verify every user’s age, thereby preventing users from accessing the platform anonymously. Further, there is also a risk of data maximisation rather than data minimisation; i.e., parents may be required to provide far more information than is needed to prove their identity. One recommendation we propose is to remove the processing of children’s personal data from the ambit of this law, and instead create separate standalone legislation dealing with children’s digital rights. Another important issue to highlight here is the capacity of the Data Protection Board to levy fines and impose strictures on platforms: examples from other countries show that platforms are forced to redesign and provide better privacy and data protection mechanisms when the regulator steps in and imposes high penalties.&lt;/p&gt;
&lt;p&gt;&lt;b&gt;&lt;span style="text-decoration: underline;"&gt;Rule 12 - Additional obligations of Significant Data Fiduciary&lt;/span&gt;&lt;/b&gt; - The Rules do not clarify which entities will be considered Significant Data Fiduciaries, leaving that to government notifications. This creates uncertainty for data fiduciaries, especially smaller organisations that might not be able to set up the mechanisms and personnel for conducting data protection impact assessments and audits. The Rules provide that SDFs will have to conduct an annual Data Protection Impact Assessment. While this is a step in the right direction, the Rules are currently silent on the granularity of the DPIA. Similarly, for the “audit” the Rules do not clarify what type of audit is needed and what its parameters are. It is therefore imperative that the government notify the level of detail that the DPIA and the audit need to go into, to ensure that SDFs actually address gaps in their data governance practices rather than use the DPIA as a whitewashing tactic. There is also a need to reduce the ambiguity around parameters and responsibilities, to make it easier for startups and smaller players to comply with the regulations. In addition, while there is a need to protect data and increase the responsibility of organisations collecting sensitive data or large volumes of data, there is also a need to look beyond compliance towards preserving the rights of the data principal. Hence, significant data fiduciaries should also be given the added responsibility of collecting explicit consent from the data principal, and of providing easier access for correction of data, grievance redressal, and withdrawal of consent.&lt;/p&gt;
&lt;p&gt;&lt;b&gt;&lt;span style="text-decoration: underline;"&gt;Rule 14 - Processing of personal data outside India&lt;/span&gt;&lt;/b&gt; - As per section 16 of the Act, the government could, by notification, restrict the transfer of data to specific notified countries. This system of a negative list envisaged under the Act appears to have been somewhat diluted by the use of the phrase “any foreign State” under the Rules. This ambiguity should be addressed and the language in the Rules altered to bring it in line with the Act. Further, the Rules also appear to be ultra vires the Act: under the DPDP Act, personal data could be transferred outside India except to countries on the negative list, whereas the dilution of the provision through the Rules appears to have created a white list, i.e. a permissible list of countries to which data can be transferred.&lt;/p&gt;
&lt;p&gt;&lt;b&gt;&lt;span style="text-decoration: underline;"&gt;Rule 15 Exemption from Act for research, archiving or statistical purposes&lt;/span&gt;- &lt;/b&gt;While creating an exception for research and statistical purposes is an understandable objective, the current wording of the provision is vague and open to mischief. The objective behind the provision is to ensure that research activities are not hindered by the Act’s requirements, such as obtaining consent. However, as currently drafted, it could be argued that a research lab or research centre established by a large company, e.g. Google or Meta, could also seek exemption from the provisions of this Act for conducting “research”. The research conducted may not be shared with the public in general and may be used by the companies that funded or established the research centre. Therefore, further conditions should be attached to this provision to keep such research centres outside the purview of the exemption; conditions such as making the results of the research publicly available, or a public-interest requirement, could be considered for this purpose.&lt;/p&gt;
&lt;p&gt;&lt;b&gt;&lt;span style="text-decoration: underline;"&gt;Rule 22 - Calling for Information from data fiduciary or intermediary&lt;/span&gt; - &lt;/b&gt;This rule read with the seventh schedule appears to dilute the data minimisation and purpose limitation provisions provided for in the Act. The wide ambit of powers appears to be in contravention of the Supreme Court judgement in the Puttaswamy case, which places certain restrictions on the government while collecting personal data. This “omnibus” provision flouts guardrails like necessity and proportionality that are important to safeguard the fundamental right to privacy.&lt;/p&gt;
&lt;p&gt;It should be clarified whether this rule is merely an enabling provision to facilitate the sharing of information, and whether only competent authorities designated under law can avail of this provision.&lt;/p&gt;
&lt;p&gt;&lt;span style="text-decoration: underline;"&gt;Need for Confidentiality&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;Additionally, the rule mandates that the government may “require the Data Fiduciary or intermediary to not disclose” any request for information made under the Act. There is no requirement of confidentiality indicated in the governing section, i.e. section 36, from which Rule 22 derives its authority. Speaking of the avoidance of secrecy in government business, the Supreme Court in State of U.P. v. Raj Narain, (1975) 4 SCC 428 held that:&lt;br /&gt; &lt;i&gt;“In a government of responsibility like ours, where all the agents of the public must be responsible for their conduct, there can be but few secrets. The people of this country have a right to know every public act, everything, that is done in a public way, by their public functionaries. They are entitled to know the particulars of every public transaction in all its bearing. The right to know, which is derived from the concept of freedom of speech, though not absolute, is a factor which should make one wary, when secrecy is claimed for transactions which can, at any rate, have no repercussions on public security. To cover with [a] veil [of] secrecy the common routine business, is not in the interest of the public. Such secrecy can seldom be legitimately desired. It is generally desired for the purpose of parties and politics or personal self-interest or bureaucratic routine. The responsibility of officials to explain and to justify their acts is the chief safeguard against oppression and corruption.” &lt;/i&gt;&lt;br /&gt; In order to ensure that state interests are also protected, there may be an enabling provision whereby confidentiality may be maintained in certain instances, but there has to be a supervisory mechanism whereby such action may be judged on the anvil of legal propriety.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/cis-comments-and-feedback-to-digital-personal-data-protection-rules-2025'&gt;https://cis-india.org/internet-governance/blog/cis-comments-and-feedback-to-digital-personal-data-protection-rules-2025&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Pallavi Bedi, Vipul Kharbanda, Shweta Mohandas, Anubha Sinha and Isha Suri</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Data Governance</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Data Management</dc:subject>
    

   <dc:date>2025-03-06T02:06:44Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/privacy-policy-framework-for-indian-metal-health-apps">
    <title>Privacy Policy Framework for Indian Mental Health Apps </title>
    <link>https://cis-india.org/internet-governance/blog/privacy-policy-framework-for-indian-metal-health-apps</link>
    <description>
        &lt;b&gt;This report analyses the privacy policies of mental health apps in India and provides recommendations for making the policies not only legally compliant but also user-centric.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The report’s findings indicate a significant gap in the structure and content of privacy policies of Indian mental health apps, highlighting the need for a framework that can guide organisations in developing their privacy policies. This report therefore proposes a holistic framework to guide the development of privacy policies for mental health apps in India. It focuses on three key segments that are an essential part of the privacy policy of any mental health app. First, the policy must include the factors considered essential by the Digital Personal Data Protection Act 2023 (DPDPA), such as consent mechanisms, the rights of the data principal, and the provision to withdraw consent. Second, the privacy policy must state how the data users provide to these apps will be used. Finally, developers must include key elements such as provisions for third-party integrations and data retention policies.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;Click to download the full research paper &lt;a class="external-link" href="https://cis-india.org/internet-governance/files/privacy-policy-framework.pdf"&gt;here&lt;/a&gt;&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/privacy-policy-framework-for-indian-metal-health-apps'&gt;https://cis-india.org/internet-governance/blog/privacy-policy-framework-for-indian-metal-health-apps&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Chakshu Sang and Shweta Mohandas</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2025-01-10T00:11:24Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/reconfiguring-data-governance-insights-from-india-and-eu">
    <title>Reconfiguring Data Governance: Insights from India and the EU</title>
    <link>https://cis-india.org/internet-governance/blog/reconfiguring-data-governance-insights-from-india-and-eu</link>
    <description>
        &lt;b&gt;This policy paper is the result of a workshop organised jointly by the Tilburg Institute of Law, Technology and Society, Netherlands, the Centre for Communication Governance at the National Law University Delhi, India and the Centre for Internet &amp; Society, India in January, 2023. The workshop brought together a number of academics, researchers, and industry representatives in Delhi to discuss a range of issues at the core of data governance theory and practice. &lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;&lt;img src="https://cis-india.org/home-images/ReconfiguringDataGovernance.png/@@images/70165fe1-cc66-4cac-9f99-b7485c87218a.png" alt="Reconfiguring Data Governance" class="image-inline" title="Reconfiguring Data Governance" /&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The workshop aimed to compare and assess lessons from data governance from India and the European Union, and to make recommendations on how to design fit-for-purpose institutions for governing data and AI in the European Union and India.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This policy paper collates key takeaways from the workshop by grounding them across three key themes: how we conceptualise data; how institutional mechanisms as well as community-centric mechanisms can work to empower individuals, and what notions of justice these embody; and finally a case study of enforcement of data governance in India to illustrate and evaluate the claims in the first two sections.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This report was a collaborative effort between researchers Siddharth Peter De Souza, Linnet Taylor, and Anushka Mittal at the Tilburg Institute for Law, Technology and Society (Netherlands), Swati Punia, Srishti Joshi, and Jhalak M. Kakkar at the Centre for Communication Governance at the National Law University Delhi (India), and Isha Suri and Arindrajit Basu at the Centre for Internet &amp;amp; Society, India.&lt;/p&gt;
&lt;hr /&gt;
&lt;p&gt;Click to download the &lt;a class="external-link" href="http://cis-india.org/internet-governance/files/reconfiguring-data-governance.pdf"&gt;&lt;b&gt;report&lt;/b&gt;&lt;/a&gt;&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/reconfiguring-data-governance-insights-from-india-and-eu'&gt;https://cis-india.org/internet-governance/blog/reconfiguring-data-governance-insights-from-india-and-eu&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Swati Punia, Srishti Joshi, Siddharth Peter De Souza, Linnet Taylor, Jhalak M. Kakkar, Isha Suri, Arindrajit Basu, and Anushka Mittal</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Data Governance</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Data Management</dc:subject>
    

   <dc:date>2024-02-20T00:30:00Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/raw/unpacking-algorithmic-infrastructures">
    <title>Unpacking Algorithmic Infrastructures: Mapping the Data Supply Chain in the Healthcare Industry in India </title>
    <link>https://cis-india.org/raw/unpacking-algorithmic-infrastructures</link>
    <description>
        &lt;b&gt;The Unpacking Algorithmic Infrastructures project, supported by a grant from the Notre Dame-IBM Tech Ethics Lab, aims to study the AI data supply chain infrastructure in healthcare in India, and to critically analyse the auditing frameworks that are utilised to develop and deploy AI systems in healthcare. It will map the prevalence of AI auditing practices within the sector to arrive at an understanding of frameworks that may be developed to check for ethical considerations - such as algorithmic bias and harm within healthcare systems, especially against marginalised and vulnerable populations.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;There has been increased interest in health data in India over recent years, with health data policies encouraging the sharing of data with different entities. At the same time, there has been growing interest in the deployment of AI in healthcare from startups, hospitals, and multinational technology companies.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Given the invisibility of the algorithmic infrastructures that underlie the digital economy and the important decisions these technologies can make about patients' health, it is important to look at how these systems are developed, how data flows within them, how these systems are tested and verified, and what ethical considerations inform their deployment.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;img src="https://cis-india.org/home-images/ResearchersWork.png/@@images/00a848c7-b7f7-41b4-8bd9-45f2928fd44e.png" alt="Researchers at Work" class="image-inline" title="Researchers at Work" /&gt;&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Research Questions&lt;/h3&gt;
&lt;ol&gt;
&lt;li style="text-align: justify; "&gt;To what extent do organisations take ethical principles into account when developing AI, managing the training and testing datasets, and deploying AI in the healthcare sector?&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;What best practices for auditing can be put in place, based on our critical understanding of AI data supply chains and the auditing frameworks being employed in the healthcare sector?&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;What auditing framework is best suited to organisations in the majority world?&lt;/li&gt;
&lt;/ol&gt;
&lt;h3&gt;Research Design and Methods&lt;/h3&gt;
&lt;p&gt;For this study, we will use a comprehensive mixed-methods approach. We will survey professionals working on designing, developing and deploying AI systems for healthcare in India, across technology and healthcare organizations. We will also undertake in-depth interviews with experts who are part of key stakeholder groups.&lt;/p&gt;
&lt;p&gt;We hereby invite researchers, technologists, healthcare professionals, and others working at the intersection of Artificial Intelligence and Healthcare to speak to us and help inform the study. You may contact Shweta Mohandas at &lt;a href="mailto:shweta@cis-india.org"&gt;shweta@cis-india.org&lt;/a&gt;&lt;/p&gt;
&lt;hr /&gt;
&lt;p&gt;Research Team: Amrita Sengupta, Chetna V. M., Pallavi Bedi, Puthiya Purayil Sneha, Shweta Mohandas and Yatharth.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/raw/unpacking-algorithmic-infrastructures'&gt;https://cis-india.org/raw/unpacking-algorithmic-infrastructures&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Amrita Sengupta, Chetna V. M., Pallavi Bedi, Puthiya Purayil Sneha, Shweta Mohandas and Yatharth</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Health Tech</dc:subject>
    
    
        <dc:subject>RAW Blog</dc:subject>
    
    
        <dc:subject>Research</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Healthcare</dc:subject>
    
    
        <dc:subject>Researchers at Work</dc:subject>
    
    
        <dc:subject>Artificial Intelligence</dc:subject>
    

   <dc:date>2024-01-05T02:38:22Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/quint-shweta-mohandas-and-pallavi-bedi-june-19-2023-cowin-data-breach-health-sensitive-details-policies-solution">
    <title>CoWIN Breach: What Makes India's Health Data an Easy Target for Bad Actors?</title>
    <link>https://cis-india.org/internet-governance/blog/quint-shweta-mohandas-and-pallavi-bedi-june-19-2023-cowin-data-breach-health-sensitive-details-policies-solution</link>
    <description>
        &lt;b&gt;Recent health data policies have failed to even mention the CoWIN platform.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article was &lt;a class="external-link" href="https://www.thequint.com/opinion/cowin-data-breach-health-sensitive-details-policies-solution#read-more"&gt;originally published in the Quint&lt;/a&gt; on 19 June 2023.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;Last week, it was reported that due to an alleged breach of &lt;a href="https://www.thequint.com/fit/cowin-data-breach-private-information-covid-vaccine-telegram-bot"&gt;the CoWIN platform&lt;/a&gt;, details such as Aadhaar and passport numbers of Indians were made public via a Telegram bot.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;While Minister of State for Information Technology &lt;a href="https://www.thequint.com/fit/cowin-data-breach-telegram-bot-covid-19-vaccine-unanswered-questions"&gt;Rajeev Chandrasekhar&lt;/a&gt; put out information acknowledging that there was some form of a data breach, there is no information on how the breach took place or when a past breach may have occurred.&lt;/p&gt;
&lt;blockquote class="quoted" style="text-align: justify; "&gt;This data leak is yet another example of &lt;a href="https://www.thequint.com/opinion/cowin-breach-shows-us-the-structural-problem-with-digital-indias-infrastructure"&gt;our health records&lt;/a&gt; being exposed in the recent past – during the pandemic, there were reports of COVID-19 test results being leaked online. The leaked information included patients’ full names, dates of birth, testing dates, and names of centres in which the tests were held.&lt;/blockquote&gt;
&lt;p style="text-align: justify; "&gt;In December last year, five servers of the &lt;a href="https://www.thequint.com/fit/aiims-ayushman-bharat-digital-mission-health-data"&gt;All India Institute of Medical Science&lt;/a&gt; (AIIMS) in Delhi were under a cyberattack, leaving sensitive personal data of around 3-4 crore patients compromised.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In such cases, the Indian Computer Emergency Response Team (CERT-In) is the agency responsible for looking into the vulnerabilities that may have led to these attacks. However, till date, CERT-In has not made its technical findings on such attacks &lt;a href="https://www.thequint.com/topic/data-breach"&gt;publicly available&lt;/a&gt;.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;The COVID-19 Pandemic Created an Opportunity&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The pandemic saw a number of digitisation policies being rolled out in the health sector; the most notable one being the National Digital Health Mission (or NDHM, later re-branded as the Ayushman Bharat Digital Mission).&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Mobile phone apps and web portals launched by the central and state governments during the pandemic are also examples of this health digitisation push. The rollout of the COVID-19 vaccinations also saw the deployment of the CoWIN platform.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Initially, it was mandatory for individuals to register on CoWIN to get an appointment for vaccination; there was no option for walk-in registration or appointments. But the Centre subsequently modified this rule, and walk-in appointments and registrations on CoWIN became permissible from June 2021.&lt;/p&gt;
&lt;blockquote&gt;However, a study conducted by the Centre for Internet and Society (CIS) found that states such as Jharkhand and Chhattisgarh, which have low internet penetration, permitted on-site registration for vaccinations from the beginning.&lt;/blockquote&gt;
&lt;p&gt;The rollout of the NDHM also saw Health IDs being generated for citizens.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In several reported cases across states, this rollout happened during the COVID-19 vaccination process – without the informed consent of the concerned person.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The &lt;b&gt;beneficiaries who have had their Health IDs created through the vaccination process had not been informed&lt;/b&gt; about the creation of such an ID or their right to opt out of the digital health ecosystem.&lt;/p&gt;
&lt;h3&gt;A Web of Health Data Policies&lt;/h3&gt;
&lt;p&gt;Even before the pandemic, India was working towards a Health ID and a health data management system.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The components of the umbrella National Digital Health Ecosystem (NDHE) are the National Digital Health Blueprint (NDHB), published in 2019, and the NDHM.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Blueprint was created to implement the National Health Stack (published in 2018), which facilitated the creation of Health IDs, while the NDHM was drafted to drive the implementation of the Blueprint and to promote and facilitate the evolution of the NDHE.&lt;/p&gt;
&lt;p&gt;The National Health Authority (NHA), established in 2018, has been given the responsibility of implementing the National Digital Health Mission.&lt;/p&gt;
&lt;blockquote style="text-align: justify; "&gt;2018 also saw the Digital Information Security in Healthcare Act (DISHA), which was to regulate the generation, collection, access, storage, transmission, and use of Digital Health Data ("DHD") and associated personal data.&lt;/blockquote&gt;
&lt;p&gt;However, since its call for public consultation, &lt;b&gt;no progress has been made&lt;/b&gt; on this front.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In addition to documents that chalk out the functioning and the ecosystem of a digitised healthcare system, the NHA has released policy documents such as:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;the Health Data Management Policy (revised three times, with the latest version released in April 2022)&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;the Health Data Retention Policy (released in April 2021)&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Consultation paper on the Unified Health Interface (UHI) (released in December 2022)&lt;/p&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;p style="text-align: justify; "&gt;Along with these policies, in 2022, the NHA released the NHA Data Sharing Guidelines for the Pradhan Mantri Jan Aarogya Yojana (PM-JAY) – India’s state health insurance scheme.&lt;/p&gt;
&lt;blockquote style="text-align: justify; "&gt;However, these &lt;b&gt;draft guidelines repeat the pattern of earlier policies on health data&lt;/b&gt;, wherein there is no reference to the policies that predated them; the PM-JAY Data Sharing Guidelines, published in August 2022, did not even refer to the draft National Digital Health Data Management Policy (published in April 2022).&lt;/blockquote&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Interestingly, the recent health data policies do not mention CoWIN.&lt;/b&gt; Failing to cross-reference or mention preceding policies creates a lack of clarity on which documents are being used as guidelines by healthcare providers.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Can a Data Protection Bill Be the Solution?&lt;/h3&gt;
&lt;p&gt;The draft Data Protection Bill, 2021, defined health data as “…the data related to the state of physical or mental health of the data principal and &lt;b&gt;includes records regarding the past, present or future state of the health of such data principal&lt;/b&gt;, data collected in the course of registration for, or provision of health services, data associated with the data principal to the provision of specific health services.”&lt;/p&gt;
&lt;p&gt;However, this definition as well as the definition of sensitive personal data was removed from the current version of the Bill (Digital Personal Data Protection Bill, 2022).&lt;/p&gt;
&lt;blockquote&gt;Omitting these definitions from the Bill removes a set of data which, if collected, warrants increased responsibility and increased liability. The handling of health data, financial data, government identifiers, and the like needs to come with a higher level of responsibility, as these are sensitive details of a person.&lt;/blockquote&gt;
&lt;p style="text-align: justify; "&gt;The threats posed by this data being leaked are not limited to spam messages, fraud, and impersonation; companies that get hold of this coveted data can also gather insights and train their systems and algorithms on it, without needing to seek consent from anyone or facing the consequences of the harm caused.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;While the current version of the draft DPDP Bill states that the data fiduciary shall notify the data principal of any breach, the draft Bill also states that the Data Protection Board “may” direct the data fiduciary to adopt measures that remedy the breach or mitigate harm caused to the data principal.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Bill also prescribes penalties of up to Rs 250 crore if the data fiduciary fails to take reasonable security safeguards to prevent a personal data breach, and a penalty of up to Rs 200 crore if the fiduciary fails to notify the Data Protection Board and the data principal of such a breach.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;While &lt;b&gt;these steps, if implemented through legislation, would make organisations processing data take data security more seriously&lt;/b&gt;, the removal of sensitive personal data from the Bill’s definitions would mean that data fiduciaries processing health data will not have to take additional steps beyond reasonable security safeguards.&lt;/p&gt;
&lt;p&gt;The &lt;b&gt;absence of a clear indication of security standards&lt;/b&gt; will affect data principals and fiduciaries.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Looking to bring more efficiency to governance systems, the Centre launched the Digital India Mission in 2015. The press release by the central government reporting the approval of the programme by the Cabinet of Ministers speaks of ‘cradle to grave’ digital identity as one of its vision areas.&lt;/p&gt;
&lt;p&gt;The ambitious Universal Health ID and health data management policies are an example of this digitisation mission.&lt;/p&gt;
&lt;blockquote&gt;However, breaches like this are reminders that without proper data security measures, and a system that makes a specific person responsible for data security, the data is always vulnerable to attack.&lt;/blockquote&gt;
&lt;p style="text-align: justify; "&gt;While the UK and Australia have also seen massive data breaches in the past, India is at the start of its health data digitisation journey and has the ability to set up strong security measures, employ experienced professionals, and establish legal resources to ensure that data breaches are minimised and swift action can be taken in case of a breach.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;The first step&lt;/b&gt; to understanding the vulnerabilities would be to publish the CERT-In findings on this breach and guide other institutions to check for similar vulnerabilities, so that they are better prepared for future breaches and attacks.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/quint-shweta-mohandas-and-pallavi-bedi-june-19-2023-cowin-data-breach-health-sensitive-details-policies-solution'&gt;https://cis-india.org/internet-governance/blog/quint-shweta-mohandas-and-pallavi-bedi-june-19-2023-cowin-data-breach-health-sensitive-details-policies-solution&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Shweta Mohandas and Pallavi Bedi</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2023-07-04T09:39:03Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/cis-comments-recommendations-to-digital-data-protection-bill">
    <title>The Centre for Internet and Society’s comments and recommendations on the Digital Data Protection Bill 2022</title>
    <link>https://cis-india.org/internet-governance/blog/cis-comments-recommendations-to-digital-data-protection-bill</link>
    <description>
        &lt;b&gt;The Centre for Internet &amp; Society (CIS) published its comments and recommendations to the Digital Personal Data Protection Bill, 2022, on December 17, 2022.&lt;/b&gt;
        &lt;div class="WordSection1" style="text-align: justify; "&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt; &lt;/span&gt;&lt;/p&gt;
&lt;p align="center" class="MsoNormal" style="text-align:center; "&gt;&lt;span&gt; &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt; &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt; &lt;/span&gt;&lt;/p&gt;
&lt;p align="right" class="MsoNormal" style="text-align:right; "&gt;&lt;span&gt; &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt; &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt; &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt; &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt; &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt; &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt; &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt; &lt;/span&gt;&lt;/p&gt;
&lt;h1&gt;&lt;span&gt;High-Level Comments&lt;/span&gt;&lt;/h1&gt;
&lt;p class="MsoNormal"&gt;&lt;b&gt;&lt;span&gt;1.&lt;span&gt; &lt;/span&gt;&lt;/span&gt;&lt;/b&gt;&lt;b&gt;&lt;span&gt;Rationale for removing the distinction between personal data and sensitive personal data is unclear.&lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt;All the earlier iterations of the Bill as well as the rules made under Section 43A of the Information Technology Act, 2000&lt;a href="#_ftn1" name="_ftnref1"&gt;&lt;sup&gt;&lt;sup&gt;&lt;span&gt;[1]&lt;/span&gt;&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; had classified data into two categories; (i) personal data; and (ii) sensitive personal data. The 2022 version of the Bill has removed this distinction and clubbed all personal data under one umbrella heading of personal data. The rationale for this is unclear, as sensitive personal data means such data which could reveal or be related to eminently private data such as financial data, health data, sexual orientations and biometric data. Considering the sensitive nature of the data, the data classified as sensitive personal data is accorded higher protection and safeguards from processing, therefore by clubbing all data as personal data, the higher protection such as the need for explicit consent to the processing of sensitive personal data, the bar on processing of sensitive personal data for employment purposes has also been removed. &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;b&gt;&lt;span&gt;2.&lt;span&gt; &lt;/span&gt;&lt;/span&gt;&lt;/b&gt;&lt;b&gt;&lt;span&gt;No clear roadmap for the implementation of the Bill&lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt;The 2018 Bill had specified a roadmap for the different provisions of the Bill to come into effect from the date of the Act being notified.&lt;a href="#_ftn2" name="_ftnref2"&gt;&lt;sup&gt;&lt;sup&gt;&lt;span&gt;[2]&lt;/span&gt;&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; It specifically stated the time period within which the Authority had to be established and the subsequent rules and regulations notified. &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt;The present Bill does not specify any such blueprint; it does not provide any details on either when the Bill will be notified or the time period within which the Board shall be established and specific Rules and regulations notified. Considering that certain provisions have been deferred to Rules that have to be framed by the Central government, the absence and/or delayed notification of such rules and regulations will impact the effective functioning of the Bill. Provisions such as Section 10(1) which deals with verifiable parental consent for data of children,  Section 13 (1) which states the manner in which a Data Principal can initiate a right to correction, the process of selection and functioning of consent manager under &lt;/span&gt;&lt;span&gt;3(7)&lt;/span&gt;&lt;span&gt; are few such examples, that when the Act becomes applicable, the data principal will have to wait for the Rules to Act of these provisions, or to get clarity on entities created by the Act. &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt;The absence of any sunrise or sunset provision may disincentivise political or industrial will to support or enforce the provisions of the Bill. An example of such a lack of political will was the establishment of the Cyber Appellate Tribunal. The tribunal was established in 2006 to redress cyber fraud. However, it was virtually a defunct body from 2011 onwards when the last chairperson retired. It was eventually merged with the Telecom Dispute Settlement and Appellate Tribunal in 2017. &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt;We recommend that Bill clearly lays out a time period for the implementation of the different provisions of the Bill, especially a time frame for the establishment of the Board. This is important to give full and effective effect to the right of privacy of the individual. It is also important to ensure that individuals have an effective mechanism to enforce the right and seek recourse in case of any breach of obligations by the data fiduciaries. &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt;The Board must ensure that Data Principals and Fiduciaries have sufficient awareness of the provisions of this Bill before bringing the provisions for punishment into force. This will allow the Data Fiduciaries to align their practices with the provisions of this new legislation and the Board will also have time to define and determine certain provisions that the Bill has left the Board to define. Additionally enforcing penalties for offenses initially must be in a staggered process, combined with provisions such as warnings, in order to allow first time and mistaken offenders which now could include data principals as well, from paying a high price. This will relieve the fear of smaller companies and startups and individuals who might fear processing data for the fear of paying penalties for offenses.&lt;/span&gt;&lt;/p&gt;
&lt;h3&gt;&lt;a name="_kn12ecl3pdrp"&gt;&lt;/a&gt;&lt;span&gt;3.&lt;span&gt; &lt;/span&gt;&lt;/span&gt;&lt;span&gt;Independence of the Data Protection Board of India&lt;/span&gt;&lt;/h3&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt;The Bill proposes the creation of the Data Protection Board of India (Board) in place of the Data Protection Authority. In comparison with the powers of the Board with the 2018 and 2019 version of Personal Data Protection Bill, we witness an abrogation of powers of the Board  to be created, in this Bill. Under Clause 19(2), the strength and composition of the Board, the process of selection, the terms and conditions of appointment and service, and the removal of its Chairperson and other Members shall be such as may be prescribed by the Union Government at a later stage. Further as per Clause 19(3), the Chief Executive of the Board will be appointed by the Union Government and the terms and conditions of her service will also be determined by the Union Government. The functions of the Board have also not been specified under the Bill, the Central Government may assign the functions to be performed by the Board.&lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt;In order to govern data protection effectively, there is a need for a responsive market regulator with a strong mandate, ability to act swiftly, and resources. The political nature of  personal data also requires that the governance of data, particularly the rule-making and adjudicatory functions performed by the Board are independent of the Executive. &lt;/span&gt;&lt;/p&gt;
&lt;h1&gt;&lt;a name="_n9jzjnvile8f"&gt;&lt;/a&gt;&lt;span&gt;Chapter Wise Comments and Recommendations &lt;/span&gt;&lt;/h1&gt;
&lt;h2&gt;&lt;a name="_chp7y0vgrjqa"&gt;&lt;/a&gt;&lt;span&gt;CHAPTER I- PRELIMINARY&lt;/span&gt;&lt;/h2&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt;&lt;span&gt; &lt;/span&gt;●&lt;span&gt; &lt;/span&gt;&lt;/span&gt;&lt;b&gt;&lt;span&gt;Definition:&lt;/span&gt;&lt;/b&gt;&lt;span&gt; While the Bill has added a few new definitions to the Bill including terms such as gains, loss, consent manager etc. there are a few key definitions that have been removed from the earlier versions of the Bill. The removal of certain definitions in the Bill, eg. sensitive personal data, health data, biometric data, transgender status, creating a legal uncertainty about the application of the Bill. &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt;With respect to the existing definitions as well the definition of the term ‘harm’ has been significantly reduced to remove harms such as surveillance from the ambit of harms. In addition, with respect of the definition of the term of harms also, the 2019 version of the Bill under Clause 2 (20) the definition provides a non exhaustive list of harms, by using the phrase “harms include”, however in the new definition the phrase has been altered to “harm”, in relation to a Data Principal, means”, thereby removing the possibility of more harms that are not apparent currently from being within the purview of the Act. We recommend that the definition of harms be made into a non-exhaustive list.&lt;br /&gt; &lt;br /&gt; &lt;/span&gt;&lt;/p&gt;
&lt;h2&gt;&lt;a name="_nhwnuzprx0ir"&gt;&lt;/a&gt;&lt;span&gt;CHAPTER II - OBLIGATIONS OF DATA FIDUCIARY&lt;/span&gt;&lt;/h2&gt;
&lt;p class="MsoNormal"&gt;&lt;b&gt;&lt;span&gt;Notice: &lt;/span&gt;&lt;/b&gt;&lt;span&gt;The revised Clause on notice does away with the comprehensive requirements which were laid out under Clause 7 of the PDP Bill 2019. The current clause does not mention in detail what the notice should contain, while stating that that the notice should be itemised. While it can be reasoned that the Data Fiduciary can find the contents of the notice throughout the bill, such as with the rights of the Data Principal, the removal of a detailed list could create uncertainty for Data Fiduciaries. By leaving the finer details of what a notice should contain, it could cause Data Fiduciaries from missing out key information from the list, which in turn provide incomplete information to the Data Principal. Even in terms of Data Fiduciaries they might not know if they are complying with the provisions of the bill, and could result in them invariably being penalised. In addition to this by requiring less work by the Data Fiduciary and processor, the burden falls on the Data Principal to make sure they know how their data is processed and collected. The purpose of this legislation is to create further rights for individuals and consumers, hence the Bill should strive to put the individual at the forefront.&lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt;In addition to this Clause 6(3) of the Bill states &lt;i&gt;“The Data Fiduciary shall give the Data Principal the option to access the information referred to in sub-sections (1) and (2) in English or any language specified in the Eighth Schedule to the Constitution of India.”&lt;/i&gt; While the inclusion of regional language notices is a welcome step, we suggest that the text be revised as follows &lt;i&gt;“The Data Fiduciary shall give the Data Principal the option to access the information referred to in sub-sections (1) and (2) in English&lt;b&gt; and in&lt;/b&gt; any language specified in the Eighth Schedule to the Constitution of India.” &lt;/i&gt;While the main crux of notice is to let the person know before giving consent, notice in a language that a person cannot read would not lead to meaningful consent.&lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;b&gt;&lt;span&gt;Consent &lt;br /&gt; &lt;br /&gt; &lt;/span&gt;&lt;/b&gt;&lt;span&gt;Clause 3 of the Bill states &lt;i&gt;“request for consent would have the contact details of a Data Protection Officer, where applicable, or of any other person authorised by the Data Fiduciary to respond to any communication from the Data Principal for the purpose of exercise of her rights under the provisions of this Act.” &lt;/i&gt;Ideally this provision should be a part of the notice and should be mentioned in the above section. This is similar to Clause 7(1)(c) of the draft Personal Data Protetion Bill 2019 which requires the notice to state &lt;i&gt;“the identity and contact details of the data fiduciary and the contact details of the data protection officer, if applicable;”. &lt;/i&gt;&lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;b&gt;&lt;span&gt;Deemed Consent&lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt;The Bill  introduces a new type of consent that was absent in the earlier versions of the Bill. We are of the understanding that deemed consent is used to redefine non consensual processing of personal data. The use of the term deemed consent and the provisions under the section while more concise than the earlier versions could create more confusion for Data Principals and Fiduciaries alike. The definition and the examples do not shed light on one of the key issues with voluntary consent - the absence of notice. In addition to this the Bill is also silent on whether deemed consent can be withdrawn or if the data principal has the same rights as those that come from processing of data they have consented to. &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;b&gt;&lt;span&gt;Personal Data Protection of Children &lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt;The age to determine whether a person has the ability to legally consent in the online world has been intertwined with the age of consent under the Indian Contract Act; i.e. 18 years. The Bill makes no distinction between a 5 year old and a 17 year old- both are treated in the same manner. It assumes the same level of maturity for all persons under the age of 18. It is pertinent to note that the law in the offline world does recognise that distinction and also acknowledges the changes in the level of maturity. As per Section 82 of the Indian Penal Code read with Section 83, any act by a child under the age of 12 shall not be considered as an offence. While the maturity of those aged between 12–18 years will be decided by court (individuals between the age of 16–18 years can also be tried as adults for heinous crimes). Similarly, child labour laws in the country allow children above the age of 14 years to work in non-hazardous industry&lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt;There is  a need to evaluate and rethink the idea that children are passive consumers of the internet and hence the consent of the parent is enough. Additionally, the bracketing of all individuals under the age of 18 as children fails to look at how teenages and young people use the internet. This is more important looking at the 2019 data which suggests that two-thirds of India’s internet users are in the 12–29 years age group, with those in the 12–19 age group accounting for about 21.5% of the total internet usage in metro cities. Given that the pandemic has compelled students and schools to adopt and adapt to virtual schools, the reliance on the internet has become ubiquitous with education. Out of an estimated 504 million internet users, nearly one-third are aged under 19. As per the Annual Status on Education Report (ASER) 2020, more than one-third of all schoolchildren are pursuing digital education, either through online classes or recorded videos.&lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt;Instead of setting a blanket age for determining valid consent, we could look at alternative means to determine the appropriate age for children at different levels of maturity, similar to what had been developed by the U.K. Information Commissioner’s Office. The Age Appropriate Code prescribes 15 standards that online services need to follow. It broadly applies to online services "provided for remuneration"—including those supported by online advertising—that process the personal data of and are "likely to be accessed" by children under 18 years of age, even if those services are not targeted at children. This includes apps, search engines, social media platforms, online games and marketplaces, news or educational websites, content streaming services, online messaging services. &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt;The reservation to definition of child under the Bill has also been expressed by some members of the JPC through their dissenting opinion. MP Ritesh Pandey stated that keeping in mind the best interest of the child the Bill should consider a child to be a person who is less than 14 years of age. This would ensure that young people could benefit from the advances in technology without parental consent and reduce the social barriers that young women face in accessing the internet. Similarly Manish Tiwari in his dissenting note also observed that the regulation of the processing of data of children should be based on the type of content or data. The JPC Report observed that the Bill does not require the data fiduciary to take fresh consent of the child, once the child has attained the age of majority, and it also does not give the child the option to withdraw their consent upon reaching the majority age. It therefore, made the following recommendations:&lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt;Registration of data fiduciaries, exclusively dealing with children’s data. Application of the Majority Act to a contract with a child. Obligation of Data fiduciary to inform a child to provide their consent, three months before such child attains majority  Continuation of the services until the child opts out or gives a fresh consent, upon achieving majority. However, these recommendations have not been incorporated into the provisions of the Bill. In addition to this the Bill is silent on the status of non consensual processing and deemed consent with respect to the data of children.&lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt;We recommend that fiduciaries who have services targeted at children should be considered as significant Data Fiduciaries. In addition to this the Bill should also state that the guardians could approach the Data Protection Board on behalf of the child. With these obligations in place, the age of mandatory consent could be reduced and the data fiduciary could have an added responsibility of informing the children in the simplest manner how their data will be used. Such an approach places a responsibility on Data Fiduciaires when implementing services that will be used by children and allows the children to be aware of data processing, when they are interacting with technology.&lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;b&gt;&lt;span&gt;Chapter III-RIGHTS AND DUTIES OF DATA PRINCIPAL&lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;b&gt;&lt;span&gt;Rights of Data Principal&lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt;Clause 12(3) of the Bill while providing the Data Principal the right to be informed of the identities of all the Data Fiduciaries with whom the personal data has been shared, also states that the data principal has the right to be informed of the categories of personal data shared. However the current version of the Bill provides only one category of data that is personal data. &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt;Clause 14 of the Bill talks about the Right of Grievance Redressal, and  states that the Data Principal has the right to readily available means of registering a grievance, however the Bill does not provide in the Notice provisions the need to mention details of a grievance officer or a grievance redressal mechanism. It is only  the additional obligations on significant data fiduciary that mentions the need for a Data Protection officer to be the contact for the grievance redressal mechanism under the provisions of this Bill. The Bill could ideally re-use the provisions of the IT Act SPDI Rules 2011 in which Section 5(7) states &lt;i&gt;“Body corporate shall address any discrepancies and grievances of their provider of the information with respect to processing of information in a time bound manner. For this purpose, the body corporate shall designate a Grievance Officer and publish his name and contact details on its website. The Grievance Officer shall redress the grievances or provider of information expeditiously but within one month ' from the date of receipt of grievance.”&lt;br /&gt; &lt;/i&gt;&lt;br /&gt; The above framing would not only bring clarity to the data fiduciaries on what process to follow for a grievance redressal, it also would reduce the significant burden of theBoard. &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;b&gt;&lt;span&gt;Duties of Data Principals&lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt;The Bill while entisting duties of the Data Principal states that the “Data Principal shall not register a false or frivolous grievance or complaint with a Data Fiduciary or the Board”, however it is very difficult for a Data Principal to and even for the Board to determine what constitutes a “frivolous grievance”. In addition to this the absence of a defined notice provision and the inclusion of deemed consent would mean that the Data Fiduciary could have more information about the matter than the Data Principal. This could mean that the fiduciary could prove that a claim was false or frivolous. Clause 21(12) states that “&lt;i&gt;At any stage after receipt of a complaint, if the Board determines that the complaint is devoid of merit, it may issue a warning or impose costs on the complainant.” &lt;/i&gt;In addition to this Clause 25(1) states that “ &lt;i&gt;If the Board determines on conclusion of an inquiry that non- compliance by &lt;b&gt;a person &lt;/b&gt;is significant, it may, after giving the person a reasonable opportunity of being heard, impose such financial penalty as specified in Schedule 1, not exceeding rupees five hundred crore in each instance.” &lt;/i&gt;The use of the term “person” in this case includes data which could mean that they could be penalised under the provisions of the Bill, which could also include not complying with the duties.&lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt; &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;b&gt;&lt;span&gt;CHAPTER IV- SPECIAL PROVISIONS&lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;b&gt;&lt;span&gt;Transfer of Personal Data outside India&lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt;Clause 17 of the Bill has removed the requirement of data localisation which the 2018 and 2019 Bill required. Personal data can be transferred to countries that will be notified by the central government. There is no need for a copy of the data to be stored locally and no prohibition on transferring sensitive personal data and critical data. Though it is a welcome change that personal data can be transferred outside of India, we would highlight the concerns in permitting unrestricted access to and transfer of all types of data. Certain data such as defence and health data do require sectoral regulation and ringfencing of the transfer of data. &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;b&gt;&lt;span&gt;Exemptions&lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt;Clause 18 of the Bill has widened the scope of government exemptions. Blanket exemption has been given to the State under Clause 18(4) from deleting the personal data even when the purpose for which the data was collected is no longer served or when retention is no longer necessary. The requirement of &lt;i&gt;proportionality, reasonableness and fairness&lt;/i&gt; have been removed for the Central Government to exempt any department or instrumentality from the ambit of the Bill.&lt;/span&gt;&lt;span&gt; &lt;/span&gt;&lt;span&gt;By doing away with the four pronged test, this provision is not in consonance with test laid down by the Supreme Court and are also incompatible with an effective privacy regulation. There is also no provision for either a prior judicial review  of the order by a district judge as envisaged by the Justice Srikrishna Committee Report or post facto review by an oversight committee of the order as laid down under the Indian Telegraph Rules, 1951&lt;a href="#_ftn3" name="_ftnref3"&gt;&lt;sup&gt;&lt;sup&gt;&lt;span&gt;[3]&lt;/span&gt;&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; and the rules framed under Information Technology Act&lt;a href="#_ftn4" name="_ftnref4"&gt;&lt;sup&gt;&lt;sup&gt;&lt;span&gt;[4]&lt;/span&gt;&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;. The provision states that such processing of personal data shall be subject to the procedure, safeguard and oversight mechanisms that may be prescribed.&lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;b&gt;&lt;span&gt; &lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;b&gt;&lt;span&gt; &lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;b&gt;&lt;span&gt; &lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;b&gt;&lt;span&gt; &lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;b&gt;&lt;span&gt; &lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;b&gt;&lt;span&gt; &lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;b&gt;&lt;span&gt; &lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;b&gt;&lt;span&gt; &lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;b&gt;&lt;span&gt; &lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;b&gt;&lt;span&gt; &lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;b&gt;&lt;span&gt; &lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;b&gt;&lt;span&gt; &lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;b&gt;&lt;span&gt; &lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt; &lt;/span&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div style="text-align: justify; "&gt;&lt;br clear="all" /&gt; 
&lt;hr align="left" size="1" width="100%" /&gt;
&lt;div id="ftn1"&gt;
&lt;p class="MsoNormal"&gt;&lt;a href="#_ftnref1" name="_ftn1"&gt;&lt;sup&gt;&lt;span&gt;&lt;sup&gt;&lt;span&gt;[1]&lt;/span&gt;&lt;/sup&gt;&lt;/span&gt;&lt;/sup&gt;&lt;/a&gt;&lt;span&gt; Information Technology (Reasonable security practices and procedures and sensitive personal data or information) Rules, 2011&lt;/span&gt;&lt;span&gt;.&lt;/span&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn2"&gt;
&lt;p class="MsoNormal"&gt;&lt;a href="#_ftnref2" name="_ftn2"&gt;&lt;sup&gt;&lt;span&gt;&lt;sup&gt;&lt;span&gt;[2]&lt;/span&gt;&lt;/sup&gt;&lt;/span&gt;&lt;/sup&gt;&lt;/a&gt;&lt;span&gt; Clause 97 of the 2018 Bill states&lt;i&gt;“(1) For the purposes of this Chapter, the term ‘notified date’ refers to the date notified by the Central Government under sub-section (3) of section 1. (2)The notified date shall be any date within twelve months from the date of enactment of this Act. (3)The following provisions shall come into force on the notified date-(a) Chapter X; (b) Section 107; and (c) Section 108. (4)The Central Government shall, no later than three months from the notified date establish the Authority. (5)The Authority shall, no later than twelve months from the notified date notify the grounds of processing of personal data in respect of the activities listed in sub-section (2) of section 17. (6) The Authority shall no, later than twelve months from the date notified date issue codes of practice  on the following matters-(a) notice under section 8; (b) data quality under section 9; (c) storage limitation under section 10; (d) processing of personal data under Chapter III; (e) processing of sensitive personal data under Chapter IV; (f) security safeguards under section 31; (g) research purposes under section 45;(h) exercise of data principal rights under Chapter VI; (i) methods of de-identification and anonymisation; (j) transparency and accountability measures under Chapter VII. (7)Section 40 shall come into force on such date as is notified by the Central Government for the purpose of that section.(8)The remaining provision of the Act shall come into force eighteen months from the notified date.”&lt;/i&gt;&lt;/span&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn3"&gt;
&lt;p class="MsoNormal"&gt;&lt;a href="#_ftnref3" name="_ftn3"&gt;&lt;sup&gt;&lt;span&gt;&lt;sup&gt;&lt;span&gt;[3]&lt;/span&gt;&lt;/sup&gt;&lt;/span&gt;&lt;/sup&gt;&lt;/a&gt;&lt;span&gt; &lt;/span&gt;&lt;span&gt;Rule 419A (16): The Central Government or the State Government shall constitute a Review Committee. &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt;Rule 419 A(17): The Review Committee shall meet at least once in two months and record its findings whether the directions issued under sub-rule (1) are in accordance with the provisions of sub-section (2) of Section 5 of the said Act. When the Review Committee is of the opinion that the directions are not in accordance with the provisions referred to above it may set aside the directions and orders for destruction of the copies of the intercepted message or class of messages.&lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt; &lt;/span&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn4"&gt;
&lt;p class="MsoNormal"&gt;&lt;a href="#_ftnref4" name="_ftn4"&gt;&lt;sup&gt;&lt;span&gt;&lt;sup&gt;&lt;span&gt;[4]&lt;/span&gt;&lt;/sup&gt;&lt;/span&gt;&lt;/sup&gt;&lt;/a&gt;&lt;span&gt; &lt;/span&gt;&lt;span&gt;Rule 22 of Information Technology (Procedure and Safeguards for Interception, Monitoring and Decryption of Information) Rules, 2009: The Review Committee shall meet at least once in two months and record its findings whether the directions issued under rule 3 are in accordance with the provisions of sub-section (2) of section 69 of the Act and where the Review Committee is of the opinion that the directions are not in accordance with the provisions referred to above, it may set aside the directions and issue an order for destruction of the copies, including corresponding electronic record of the intercepted or monitored or decrypted information.&lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt; &lt;/span&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;/div&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/cis-comments-recommendations-to-digital-data-protection-bill'&gt;https://cis-india.org/internet-governance/blog/cis-comments-recommendations-to-digital-data-protection-bill&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Shweta Mohandas and Pallavi Bedi</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Digital Governance</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2023-01-20T02:35:30Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/demistifying-data-breaches-in-india">
    <title>Demystifying Data Breaches in India</title>
    <link>https://cis-india.org/internet-governance/blog/demistifying-data-breaches-in-india</link>
    <description>
&lt;b&gt;Despite the rate at which data breaches occur and are reported in the media, there seems to be little information about how and when they are resolved. This post examines the discourse on data breaches in India with respect to their historical forms, with a focus on how the specific terminology to describe data security incidents has evolved in mainstream news media reportage.

&lt;/b&gt;
        &lt;p&gt;Edited by Arindrajit Basu and Saumyaa Naidu&lt;/p&gt;
&lt;hr /&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;India saw a &lt;a href="https://theprint.in/india/despite-62-drop-in-data-breaches-india-among-top-5-nations-targeted-by-hackers-study-finds/917197/"&gt;62% drop in data breaches in the first quarter of 2022&lt;/a&gt;. Yet, it ranked fifth on the list of countries most hit by cyberattacks according to a 2022 &lt;a href="https://surfshark.com/blog/data-breach-statistics-by-country"&gt;report by Surfshark&lt;/a&gt;, a Netherlands-based VPN company. Another report &lt;a href="https://analyticsindiamag.com/the-ridiculous-17-5-cr-for-a-data-breach/"&gt;on the cost of data breaches researched by the Ponemon Institute and published by IBM&lt;/a&gt; reveals that the breach of about 29500 records between March 2021 and March 2022 resulted in a 25% increase in the average cost from INR 165 million in 2021 to INR 176 million in 2022.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;These statistics are certainly a cause for concern, especially in the context of India’s rapidly burgeoning digital economy shaped by the pervasive platformization of private and public services such as welfare, banking, finance, health, and shopping among others. Despite the rate at which data breaches occur and are reported in the media, there seems to be little information about how and when they are resolved. This post examines the discourse on data breaches in India with respect to their historical forms, with a focus on how the specific terminology to describe data security incidents has evolved in mainstream news media reportage.&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;While expert articulations of cybersecurity in general and data breaches in particular tend to predominate the public discourse on data privacy, this post aims to situate broader understandings of data breaches within the historical context of India’s IT revolution and delve into specific concepts and terminology that have shaped the broader discourse on data protection. The late 1990s and early 2000s offer a useful point of entry into the genesis of the data security landscape in India.&lt;/span&gt;&lt;/p&gt;
&lt;h3&gt;&lt;span&gt;&lt;/span&gt;&lt;span&gt;Data Breaches and their Predecessor Forms&lt;/span&gt;&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;&lt;/span&gt;&lt;span&gt;The articulation of data security concerns around the late 1990s and early 2000s isn’t always consistent in deploying the phrase, ‘data breach’ to signal cybersecurity concerns in India. The terms such as ‘data/ identity theft’ and ‘data leak’ figure prominently in the public articulation of concerns with the handling of personal information by IT systems, particularly in the context of business process outsourcing (BPO) and e-commerce activities. Other pertinent terms such as “security breach”, “data security”, and ‘“cyberfraud” also capture the specificity of growing concerns around outsourced data to India. At the time, i.e. around mid-2000s regulatory frameworks were still evolving to accommodate and address the complexities arising from a dynamic reconfiguration of the telecommunications and IT landscape in India.&lt;/span&gt;&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;Some of the formative cases that instantiate the usage of the aforementioned terms are instructive to understand shifts in the reporting of such incidents over time. The earliest case during that period concerns&lt;a href="https://www.stop-source-code-theft.com/source-code-theft-cases-in-india/"&gt; a 2002 case concerning the theft and sale of source code&lt;/a&gt; by an IIT Kharagpur student who intended to sell the code to two undercover FBI agents who worked with the CBI to catch the thief. A straightforward case of data theft was framed by media stories around the time as a &lt;a href="https://timesofindia.indiatimes.com/iitian-held-for-stealing-software-source-code/articleshow/20389713.cms"&gt;cybercrime involving the illegal sale&lt;/a&gt; of the source code of a software package, as &lt;a href="https://economictimes.indiatimes.com/ip-laws-lax-but-us-firm-bets-on-india/articleshow/696197.cms?from=mdr"&gt;software theft of intellectual property in the context of outsourcing&lt;/a&gt; and as an instance of &lt;a href="https://www.computerworld.com/article/2573515/at-risk-offshore.html"&gt;industrial espionage in poor nations without laws protecting foreign companies&lt;/a&gt;. This case became the basis of the earliest calls for the protection of data privacy and security in the context of the Indian BPO sector. The Indian IT Act, 2000 at the time only covered &lt;a href="http://pavanduggal.com/wp-content/uploads/2016/01/India-Responds-to-Growing-Concerns-Over-Data-Security.pdf"&gt;unauthorized access and data theft from computers and networks without any provisions for data protection, interception or computer forgery&lt;/a&gt;. 
The BPO boom in India brought with it &lt;a href="https://blj.ucdavis.edu/archives/vol-6-no-2/offshore-outsourcing-to-india.html"&gt;employment opportunities for India’s English-speaking, educated youth but in the absence of concrete data privacy legislation&lt;/a&gt;, the country was regarded as an unsafe destination for outsourcing aside from the political ramifications concerning the loss of American jobs.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;In a major 2005 incident, employees of the Mphasis BFL call centre in Pune extracted sensitive bank account information of Citibank’s American customers to divert INR 1.90 crore into new accounts set up in India. The media coverage of this incident calls it &lt;a href="https://www.indiatoday.in/magazine/economy/story/20050502-pune-call-centre-fraud-rattles-india-booming-bpo-sector-787790-2005-05-01"&gt;India’s first outsourcing cyberfraud and a well planned scam&lt;/a&gt;, a &lt;a href="https://economictimes.indiatimes.com/mphasis-call-centre-fraud-net-widens/articleshow/1077097.cms"&gt;cybercrime in a globalized world&lt;/a&gt;, and a case of &lt;a href="https://timesofindia.indiatimes.com/home/sunday-times/deep-focus/indias-first-bpo-scam-unraveled/articleshow/1086438.cms"&gt;financial fraud and a scam&lt;/a&gt; that required no hacking skills, and a &lt;a href="https://www.infoworld.com/article/2668975/indian-call-center-workers-charged-with-citibank-fraud.html"&gt;case of data theft and misuse&lt;/a&gt;. Within the ambit of cybercrime, media reports of these incidents refer to them as cases of “fraud”, “scam” and “theft''.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;Two other incidents in 2005 set the trend for a critical spotlight on data security practices in India. In a &lt;a href="http://news.bbc.co.uk/2/hi/south_asia/4619859.stm"&gt;June 2005 incident, an employee of a Delhi-based BPO firm, Infinity e-systems, sold the account numbers and passwords of 1000 bank customers &lt;/a&gt;to the British Tabloid, The Sun. The Indian newspaper, Telegraph India, carried an online story headlined, “&lt;a href="https://www.telegraphindia.com/india/bpo-blot-in-british-backlash-indian-sells-secret-data/cid/873737"&gt;BPO Blot in British Backlash: Indian Sells Secret Data&lt;/a&gt;,” which reported that the employee, Kkaran Bahree, 24, was set up by a British journalist, Oliver Harvey. Harvey filmed Bahree accepting wads of cash for the stolen data. Bahree’s theft of sensitive information is described both as a data fraud and a leak in the above 2005 BBC story by Soutik Biswar. Another story on the incident calls it a “&lt;a href="https://www.rediff.com/money/2005/jun/24bpo3.htm"&gt;scam” involving the leakage of credit card information&lt;/a&gt;. The use of the term ‘leak’ appears consistently across other media accounts such as a &lt;a href="https://timesofindia.indiatimes.com/city/delhi/esearch-bpo-employee-sacked-still-missing/articleshow/1153017.cms"&gt;2005 story on Karan Bahree in the Times of India&lt;/a&gt; and another story in the Economic Times about the Australian Broadcasting Corporation’s (ABC) sting operation similar to the one in Delhi, describing the scam by the &lt;a href="https://economictimes.indiatimes.com/hot-links/bpo/karan-bahree-part-ii-shot-in-australia/articleshow/1201347.cms?from=mdr"&gt;fraudsters as a leak&lt;/a&gt; of the online information of Australians. Another media account of the coverage describes the incident in more generic terms such as an “&lt;a href="https://www.tribuneindia.com/2005/20050625/edit.htm"&gt;outsourcing crime&lt;/a&gt;”.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;The other case concerned &lt;a href="https://www.taylorfrancis.com/chapters/mono/10.4324/9781315610689-16/political-economy-data-security-bpo-industry-india-alan-chong-faizal-bin-yahya"&gt;four former employees of Parsec technologies who stole classified information and diverted calls from potential customers&lt;/a&gt;, causing a sudden drop in the productivity of call centres managed by the company in November 2005. Another call centre &lt;a href="http://news.bbc.co.uk/1/hi/uk/7953401.stm"&gt;fraud came to light in 2009 through a BBC sting operation in which British reporters went to Delhi &lt;/a&gt;and secretly filmed a deal with a man selling credit card and debit card details obtained from Symantec call centres, which sold software made by Norton. This BBC story uses the term “breach” to refer to the incident.&lt;/p&gt;
&lt;p dir="ltr"&gt;In the broader framing of these cases generally understood as cybercrime, which received transnational media coverage, the terms “fraud”, “leak”, “scam”, and “theft” appear interchangeably. The term “data breach” does not seem to be a popular or common usage in these media accounts of the BPO-related incidents. A broader sense of breach (of confidentiality, privacy) figures in the media reportage in &lt;a href="https://economictimes.indiatimes.com/hot-links/bpo/cyber-crimes-can-the-west-trust-indian-bpos/articleshow/1157115.cms?from=mdr"&gt;implicitly racial terms of cultural trust&lt;/a&gt;, as a matter of &lt;a href="https://www.news18.com/news/business/bpo-staff-need-ethical-training-poll-248442.html"&gt;ethics and professionalism&lt;/a&gt; and in the &lt;a href="https://www.news18.com/news/business/sting-op-may-spell-doom-for-bpos-248260.html"&gt;language of scandal &lt;/a&gt;in some cases.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;These early cases typify a specific kind of cybercrime concerning the theft or misappropriation of outsourced personal data belonging to British or American residents. What’s remarkable about these cases is the utmost sensitivity of the stolen personal information including financial details, bank account and credit/debit card numbers, passwords, and in one case, source code. While these cases rang the alarm bells on the Indian BPO sector’s data security protocols, they also directed attention to concerns around &lt;a href="https://economictimes.indiatimes.com/hot-links/bpo/cyber-crimes-can-the-west-trust-indian-bpos/articleshow/1157115.cms?from=mdr"&gt;the training of Indian employees on the ethics of data confidentiality and vetting through psychometric tests&lt;/a&gt; for character assessment. In the wake of these incidents, the National Association of Software and Service Companies (NASSCOM), an Indian non-governmental trade and advocacy group,&lt;a href="https://www.computerworld.com/article/2547959/outsourcing-to-india--dealing-with-data-theft-and-misuse.html"&gt; launched a National Skills Registry for IT professionals to enable employers to conduct background checks&lt;/a&gt; in 2006.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;These data theft incidents earned India a global reputation of an unsafe destination for business process outsourcing, seen to be lacking both, a culture of maintaining data confidentiality and concrete legislation for data protection at the time. Importantly, the incidents of data theft or misappropriation were also traceable back to a known source, a BPO employee or a group of malefactors, who often sold sensitive data belonging to foreign nationals to others in India.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;The phrase “data leak” also caught on in another register in the context of the widespread use of camera-equipped mobile phones in India. The 2004 Delhi MMS case offers an instance of a date leak, recapitulating the language of scandal in moralistic terms.&lt;/p&gt;
&lt;h3 dir="ltr"&gt;The Delhi MMS Case&lt;/h3&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;The infamous 2004 incident involved two underage Delhi Public School (DPS) students who recorded themselves in a sexually explicit act on a cellular phone. After a fall out, the male student passed the low-resolution clip on to his friend in which his female friend’s face is seen. The clip, distributed far and wide in India, ended up on the famous e-shopping and auction website, bazee.com leading to &lt;a href="https://indiancaselaw.in/avnish-bajaj-vs-state-dps-mms-scandal-case/"&gt;the arrest of the website’s CEO Avinash Bajaj for hosting the listing for sale&lt;/a&gt;. Another similar case in 2004 mimicked the mechanics of visual capture through hand-held MMS-enabled mobile phones. A two-minute MMS of a top South-Indian actress &lt;a href="https://timesofindia.indiatimes.com/india/web-of-sleaze-now-nude-video-of-top-actress/articleshow/966048.cms"&gt;taking a shower went viral on the Internet in 2004, the year when another MMS of two prominent Bollywood actors kissing&lt;/a&gt; had already done the rounds. The &lt;a href="https://www.journals.upd.edu.ph/index.php/plaridel/article/view/2392"&gt;MMS case also marked the onset of a national moral panic around the amateur uses of mobile phone technologies&lt;/a&gt;, capable of corrupting young Indian minds under a sneaky regime of new media modernity. The MMS case, not strictly the classic case of a data breach - non-visual information generally stored in databases - became an iconic case of a data leak framed in the media as &lt;a href="https://www.telegraphindia.com/india/scandal-in-school-shakes-up-delhi/cid/1667531"&gt;a scandal that shocked the country&lt;/a&gt;, with calls for the regulation of mobile phone use in schools. 
The case continued its scandalous afterlife in a &lt;a href="https://www.heraldgoa.in/Edit/dev-ds-leni-has-a-dps-mms-scandal-connection-/21344"&gt;2009 Bollywood film, Dev D&lt;/a&gt; and another &lt;a href="https://indianexpress.com/article/entertainment/entertainment-others/delhi-mms-scandal-inspires-dibakars-love-sex-aur-dhoka/"&gt;2010 film, Love, Sex and Dhokha&lt;/a&gt;,&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;Taken together, the BPO data thefts and frauds and the data leak scandals prefigure the contemporary discourse on data breaches in the second decade of the 21st century, or what may also be called the Decade of Datafication. The launch of the Indian biometric identity project, Aadhaar, in 2009, which linked access to public services and welfare delivery with biometric identification, resulted in large-scale data collection of the scheme’s subscribers. Such linking raised the spectre of state surveillance as alleged by the critics of Aadhaar, marking a watershed moment in the discourse on data privacy and protection.&lt;/p&gt;
&lt;h3 dir="ltr"&gt;Aadhaar Data Security and Other Data Breaches&lt;/h3&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;Aadhaar was challenged in the Indian Supreme Court in 2012 when &lt;a href="https://www.outlookindia.com/website/story/worries-about-the-aadhaar-monster/296790"&gt;it was made mandatory for welfare and other services such as banking, taxation and mobile telephony&lt;/a&gt;. The national debate on the status of privacy as a cultural practice in Indian society and a fundamental right in the Indian Constitution led to two landmark judgments - the &lt;a href="https://main.sci.gov.in/supremecourt/2012/35071/35071_2012_Judgement_24-Aug-2017.pdf"&gt;2017 Puttaswamy ruling&lt;/a&gt; holding privacy to be a constitutional right subject to limitations and &lt;a href="https://indiankanoon.org/doc/127517806/"&gt;the 2018 Supreme Court judgment holding mandatory Aadhaar to be constitutional only for welfare and taxation but no other service&lt;/a&gt;.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;While these judgments sought to rein in Aadhaar’s proliferating mandatory uses, biometric verification remained the most common mode of identity authentication with &lt;a href="https://www.businesstoday.in/latest/trends/story/aadhaar-not-mandatory-yet-organisations-pose-it-as-a-mandatory-document-335550-2022-05-29"&gt;most organizations claiming it to be mandatory for various purposes&lt;/a&gt;. During the same period from 2010 onwards, a range of data security events concerning Aadhaar came to light. These included &lt;a href="https://www.firstpost.com/tech/news-analysis/aadhaar-security-breaches-here-are-the-major-untoward-incidents-that-have-happened-with-aadhaar-and-what-was-actually-affected-4300349.html"&gt;app-based flaws, government websites publishing Aadhaar details of subscribers, third party leaks of demographic data, duplicate and forged Aadhaar cards and other misuses&lt;/a&gt;.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;In 2015, the Indian government launched its ambitious &lt;a href="https://indiancc.mygov.in/wp-content/uploads/2021/08/mygov-10000000001596725005.pdf"&gt;Digital India Campaign to provide government services to Indian citizens&lt;/a&gt; through online platforms. Yet, data security breach incidents continued to increase, particularly the trade in the sale and purchase of sensitive financial information related to bank accounts and credit card numbers. The online availability of &lt;a href="https://www.livemint.com/Industry/l5WlBjdIDXWehaoKiuAP9J/India-unprepared-to-tackle-online-data-security-report.html"&gt;a rich trove of data, accessible via a simple Google search without the use of any extractive software or hacking skills &lt;/a&gt;within a thriving shadow economy of data buyers and sellers makes India a particularly vulnerable digital economy, especially in the absence of robust legislation. The lack of awareness around digital crimes and low digital literacy further exacerbates the situation given that datafication via government portals, e-commerce, and online apps has outpaced the enforcement of legislative frameworks for data protection and cybersecurity.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;In the context of Aadhaar data security issues, the term “data leak” seems to have more traction in media stories followed by the term “security breach”. Given the complexity of the myriad ways in which Aadhaar data has been breached, terms such as &lt;a href="https://techcrunch.com/2022/06/13/aadhaar-leak-pm-kisan/?guccounter=1&amp;amp;guce_referrer=aHR0cHM6Ly93d3cuZ29vZ2xlLmNvbS8&amp;amp;guce_referrer_sig=AQAAADvQXtC19Gj80LSKVc5jLwnRsREalvM2f6dV3N9KmCs8be6_1Zbvu3J6abPmBxhLlUooLiOjg4JktYDDCXr0OYYvOZ5XFlXa6DfCJk97TvMXM-cs3uJbCJBA-ePqvAC5K4qGZSyDB4OykMEOIKXJpB0CTOourPRc5dBxFFq5JXlB"&gt;data leak and exposure&lt;/a&gt; (of &lt;a href="https://zeenews.india.com/personal-finance/aadhaar-data-breach-over-110-crore-indian-farmers-aadhaar-card-data-compromised-2473666.html"&gt;11 crore Indian farmers’ sensitive information&lt;/a&gt;) add to the specificity of the data security compromise. The term “fraud” also makes a comeback in the context of &lt;a href="https://www.business-standard.com/article/economy-policy/india-s-aadhaar-id-system-delivers-benefits-but-at-risk-of-widespread-fraud-122062400124_1.html"&gt;Aadhaar-related data security incidents&lt;/a&gt;. 
These cases represent a mix of data frauds involving&lt;a href="https://economictimes.indiatimes.com/news/india/alarm-over-fake-id-printing-websites-using-customer-data-for-cyber-fraud/articleshow/94742646.cms"&gt; fake identities&lt;/a&gt;, &lt;a href="https://indianexpress.com/article/cities/delhi/in-new-age-data-theft-fraudsters-steal-thumb-prints-from-land-registries-7914530/"&gt;theft of thumb prints &lt;/a&gt;for instance from land registries and inadvertent data leaks in numerous incidents involving &lt;a href="https://techcrunch.com/2019/01/31/aadhaar-data-leak/"&gt;government employees in Jharkhand&lt;/a&gt;, &lt;a href="https://www.firstpost.com/india/aadhaar-data-leak-details-of-7-82-cr-indians-from-ap-and-telangana-found-on-it-grids-database-6448961.html"&gt;voter ID information of Indian citizens in Andhra Pradesh and Telangana&lt;/a&gt; and &lt;a href="https://www.thehindu.com/sci-tech/technology/major-aadhaar-data-leak-plugged-french-security-researcher/article26584981.ece"&gt;activist reports of Indian government websites leaking Aadhaar data&lt;/a&gt;.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;Aadhaar-related data security events parallel the increase in corporate data breaches during the decade of datafication. The term “data leak” again alternates with the term “data breach” in most media accounts while other terms such as “theft” and “scam” all but disappear in the media coverage of corporate data breaches.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;From 2016 onwards, incidents of corporate data breaches in India continued to rise. A massive &lt;a href="https://thewire.in/banking/debit-card-breach-india-banking"&gt;debit card data breach involving the YES Bank ATMs and point-of-sale (PoS) machines &lt;/a&gt;compromised through malware between May and July of 2016 resulted in the exposure of ATM PINs and non-personal identifiable information of customers. It went &lt;a href="https://www.livemint.com/Industry/Ope7B0jpjoLkemwz6QXirN/SBI-Yes-Bank-MasterCard-deny-data-breach-of-own-systems.html"&gt;undetected for nearly three&lt;/a&gt; months. Another data leak in 2018 concerned a &lt;a href="https://www.zdnet.com/article/another-data-leak-hits-india-aadhaar-biometric-database/"&gt;system run by Indane, a state-owned utility company, which allowed anyone to download private information on all Aadhaar holders &lt;/a&gt;including their names, services they were connected to and the unique 12-digit Aadhaar number. Data breaches continued to be reported in India concurrent with the incidents of data mismanagement related to Aadhaar. Some &lt;a href="https://www.csoonline.com/article/3541148/the-biggest-data-breaches-in-india.html"&gt;prominent data breaches included &lt;/a&gt;a cyberattack on the systems of airline data service provider SITA resulting in the leak of Air India passenger data, leakage of the personal details of the Common Admission Test (CAT) applicants, details of credit card and order preferences of Domino’s pizza customers on the dark web, leakage of COVID-19 patients’ test results leaked by government websites, user data of Justpay and Big Basket for sale on the dark web and an SBI data breach among others between 2019 and 2021.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;The media reportage of these data breaches use the term “cyberattack” to describe the activities of hackers and cybercriminals operating within a&lt;a href="https://www.thehindu.com/sci-tech/technology/internet/most-damaging-cybercrime-services-are-cheap-on-the-dark-web/article37004587.ece"&gt; shadow economy or the dark web&lt;/a&gt;. Recent examples of cyberattacks by hackers who leak user data for sale on the dark web include &lt;a href="https://indianexpress.com/article/technology/tech-news-technology/mobikwik-database-leaked-on-dark-web-company-denies-any-data-breach-7251448/"&gt;8.2 terabytes of 110 million sensitive financial data (KYC details, Aadhaar, credit/debit cards and phone numbers) of the payments app MobiKwik users&lt;/a&gt;, &lt;a href="https://www.firstpost.com/tech/news-analysis/dominos-india-data-breach-name-location-mobile-number-email-of-18-crore-orders-up-for-sale-on-dark-web-9650591.html"&gt;180 million Domino’s pizza orders (name, location, emails, mobile numbers),&lt;/a&gt; and &lt;a href="https://techcrunch.com/2022/07/18/cleartrip-data-breach-dark-web/"&gt;Flipkart’s Cleartrip users’ data&lt;/a&gt;. In these incidents again, three terms appear prominently in the media reportage - cyberattack, data breach, and leak. The term “data breach” remains the most frequently used epithet in the media coverage of the lapses of data security. While it alternates with the term “leak” in the stories, the term “data breach” appears consistently across most headlines in the news stories.&lt;/p&gt;
&lt;p dir="ltr"&gt;The exposure of sensitive, personal, and non-personal data by public and private entities in India is certainly a cause for concern, given the ongoing data protection legislative vacuum.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;The media coverage of data breaches tends to emphasize the quantum of compromised user data aside from the types of data exposed. The media framing of these breaches in &lt;a href="https://www.livemint.com/technology/tech-news/indian-firms-lost-176-million-to-data-breaches-last-fiscal-11658914231530.html"&gt;quantitative terms of financial loss&lt;/a&gt; as well as the &lt;a href="https://www.indiatoday.in/technology/news/story/personal-data-of-3-4-million-paytm-mall-users-reportedly-exposed-in-2020-data-breach-1980690-2022-07-27"&gt;magnitude&lt;/a&gt; and the &lt;a href="https://www.moneycontrol.com/news/business/banks/indian-banks-reported-248-data-breaches-in-last-four-years-says-government-8940891.html"&gt;number of breaches&lt;/a&gt; certainly highlights the gravity of these incidents but harm to individual users is often not addressed.&lt;/p&gt;
&lt;h3 dir="ltr"&gt;Evolving Terminology and the Source of Data Harms&lt;/h3&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;The main difference in the media reportage of the BPO cybersecurity incidents during the early aughts and the contemporary context of datafication is the usage of the term, “data breach”, which figures prominently in contemporary reportage of data security incidents but not so much in the BPO-related cybercrimes.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;THe BPO incidents of data theft and the attendant fraud must be understood in the context of the anxieties brought on by a globalizing world of Internet-enabled systems and transnational communications. In most of these incidents regarded as cybercrimes, the language of fraud and scam ventures further to attribute such illegal actions of the identifiable malefactors to cultural factors such as lack of ethics and professionalism.The usage of the term “data leak” in these media reports functions more specifically to underscore a broader lapse in data security as well as a lack of robust cybersecurity laws. The broader term, “breach”, is occasionally used to refer to these incidents but the term, “data breach” doesn’t appear as such.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;The term “data breach” gains more prominence in media accounts from 2009 onwards in the context of Aadhaar and the online delivery of goods and services by public and private players. The term “data breach” is often used interchangeably with the term “leak” within the broader ambit of cyberattacks in the corporate sector. The media reportage frames Aadhaar-related security lapses as instances of security/data breaches, data leaks, fraud, and occasionally scam.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;In contrast to the handful of data security cases in the BPO sector, data breaches have abounded in the second decade of the twenty-first century. What further differentiates the BPO-related incidents to the contemporary data breaches is the source of the data security lapse. Most corporate data breaches remain attributable to the actions of hackers and cybercriminals while the BPO security lapses were traceable back to ex-employees or insiders with access to sensitive data. We also see in the coverage of the BPO-related incidents, the attribution of such data security lapses to cultural factors including a lack of ethics and professionalism often in racial overtones. The media reportage of the BBC and ABC sting operations suggests that the India BPOs lack of preparedness to handle and maintain personal data confidentiality of foreigners point to the absence of a privacy culture in India. Interestingly, this transnational attribution recurs in a different form in the national debate on &lt;a href="https://huffpost.netblogpro.com/archive/in/entry/indians-don-t-care-about-privacy-but-thankfully-the-law-will-teach-them-what-it-means_a_23179031"&gt;Aadhaar and how Indians don’t care about their privacy&lt;/a&gt;.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;The question of the harms of data breaches to individuals is also an important one. In the discourse on contemporary data breaches, the actual material harm to an individual user is rarely ever established in the media reportage and generally framed as potential harm that could be devastating given the sensitivity of the compromised data. The harm is reported to be predominantly a function of organizational cybersecurity weakness or attributed to hackers and cybercriminals.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;The reporting of harm in collective terms of the number of accounts breached, financial costs of a data breach, the sheer number of breaches and the global rankings of countries with the highest reported cases certainly suggests a problem with cybersecurity and the lack of organizational preparedness. However, this collective framing of a data breach’s impact usually elides an individual user’s experience of harm. Even in the case of Aadhaar-related breaches - a mix of leaking data on government websites and other online portals and breaches - the notion of harm owing to exposed data isn’t clearly established. This is, however, different from the &lt;a href="https://scroll.in/article/1013700/six-types-of-problems-aadhaar-is-causing-and-safeguards-needed-immediately"&gt;extensively documented cases of Aadhaar-related issues&lt;/a&gt; in which welfare benefits have been denied, identities stolen and legitimate beneficiaries erased from the system due to technological errors.&lt;/p&gt;
&lt;h3 dir="ltr"&gt;Future Directions of Research&lt;/h3&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;This brief, qualitative foray into the media coverage of data breaches over two decades has aimed to trace the usage of various terms in two different contexts - the Indian BPO-related incidents and the contemporary context of datafication. It would be worth exploring at length, the relationship between frequent reports of data breaches, and the language used to convey harm in the contemporary context of a concrete data protection legislation vacuum. It would be instructive to examine the specific uses of the terms such as “fraud”, “leak”, “scam”, “theft” and “breach” in media reporting of such data security incidents more exhaustively. Such analysis would elucidate how media reportage shapes public perception towards the safety of user data and an anticipation of attendant harm as data protection legislation continues to evolve.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;Especially with Aadhaar, which represents a paradigm shift in identity verification through digital means, it would be useful to conduct a sentiment analysis of how biometric identity related frauds, scams, and leaks are reported by the mainstream news media. A study of user attitudes and behaviours in response to the specific terminology of data security lapses such as the terms “breach”, “leak”, “fraud”, “scam”, “cybercrime”, and “cyberattack” would further contribute to how lay users understand the gravity of a data security lapse. Such research would go beyond expert understandings of data security incidents that tend to dominate media reportage to elucidate the concerns of lay users and further clarify the cultural meanings of data privacy.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/demistifying-data-breaches-in-india'&gt;https://cis-india.org/internet-governance/blog/demistifying-data-breaches-in-india&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Pawan Singh</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Data Governance</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Data Management</dc:subject>
    

   <dc:date>2022-10-17T16:14:03Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/nha-data-sharing-guidelines">
    <title>NHA Data Sharing Guidelines – Yet Another Policy in the Absence of a Data Protection Act</title>
    <link>https://cis-india.org/internet-governance/blog/nha-data-sharing-guidelines</link>
    <description>
        &lt;b&gt;In July this year, the National Health Authority (NHA) released the NHA Data Sharing Guidelines for the Pradhan Mantri Jan Aarogya Yojana (PM-JAY) just two months after publishing the draft Health Data Management Policy.&lt;/b&gt;
        &lt;p&gt;Reviewed and edited by Anubha Sinha&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;Launched in 2018, PM-JAY is a public health insurance scheme set to cover 10 crore poor and vulnerable families across the country for secondary and tertiary care hospitalisation. Eligible candidates can use the scheme to avail of cashless benefits at any public/private hospital falling under this scheme. Considering the scale and sensitivity of the data, the creation of a well-thought-out data-sharing document is a much-needed step. However, the document – though only a draft – has certain portions that need to be reconsidered, including parts that are not aligned with other healthcare policy documents. In addition, the guidelines should be able to work in tandem with the Personal Data Protection Act whenever it comes into force. With no prior intimation of the publication of the guidelines, and the provision of a mere 10 days for consultation, there was very little scope for stakeholders to submit their comments and participate in the consultation. While the guidelines pertain to the PM-JAY scheme, it is an important document to understand the government’s concerns and stance on the sharing of health data, especially by insurance companies.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Definitions: Ambiguous and incompatible with similar policy documents&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The draft guidelines add to the list of health data–related policies that have been published since the beginning of the pandemic. These include three draft health data management policies published within two years, which have already covered the sharing and management of health data. The draft guidelines repeat the pattern of earlier policies on health data, wherein there is no reference to the policies that predated it; in this case, the guidelines fail to refer to the draft National Digital Health Data Management Policy (published in April 2022). To add to this, the document – by placing the definitions at the end – is difficult to read and understand, especially when terms such as ‘beneficiary’, ‘data principal’, and ‘individual’ are used interchangeably. In the same vein, the document uses the terms ‘data principal’ and ‘data fiduciary’, and the definitions of health data and personal data, from the 2019 PDP Bill, while also referring to the IT Act SDPI Rules and its definition of ‘sensitive personal data’. While the guidelines state that the IT Act and Rules will be the legislation to refer to for these guidelines, it is to be noted that the IT Act under the SPDI Rules covers ‘body corporates’, which under Section 43A(1), is defined as “any company and includes a firm, sole proprietorship or other association of individuals engaged in commercial or professional activities;”. It is difficult to add responsibility and accountability to the organisations under the guidelines when they might not even be covered under this definition.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;With each new policy, civil society organisations have been pointing out the need to have a data protection act before introducing policies and guidelines that deal with the processing and sharing of the data of individuals. Ideally, these policies – even in draft form – should have been published after the Personal Data Protection Bill was enacted, to ensure consistency with the provisions of the law. For example, the guidelines introduce a new category of governance mechanisms under the data-sharing committee headed by a data-sharing officer (DSO). The responsibilities and powers of the DSO are similar to that of the data protection officer under the draft PDP Bill as well as the National Data Health Management Policy (NHDMP). This, in turn, raises the question of whether the DSO and the DPOs under both the PDP Bill and the draft NDMP will have the same responsibilities. Clarity in terms of which of the policies are in force and how they intersect is needed to ensure a smooth implementation. Ideally, having multiple sources of definitions should be addressed at the drafting stage itself.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Guiding Principles: Need to look beyond privacy&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The guidelines enumerate certain principles to govern the use, collection, processing, and transmission of the personal or sensitive personal data of beneficiaries. These principles are accountability, privacy by design, choice and consent, openness/transparency, etc. While these provisions are much needed, their explanation at times misses the mark of why these principles were added. For example, in the case of accountability, the guidelines state that the ‘data fiduciary’ shall be accountable for complying with measures based on the guiding principles However, it does not specify who the fiduciaries would be accountable to and what the steps are to ensure accountability. Similarly, in the case of openness and transparency, the guidelines state that the policies and practices relating to the management of personal data will be available to all stakeholders. However, openness and transparency need to go beyond policies and practices and should consider other aspects of openness, including open data and the use of open-source software and open standards. This again will add to transparency, in that it would specify the rights of the data principal, as the current draft looks at the rights of the data principal merely from a privacy perspective. In the case of purpose limitation as well, the guidelines are tied to the privacy notice, which again puts the burden on the individual (in this case, beneficiary) when the onus should actually be on the data fiduciary. Lastly, under the empowerment of beneficiaries, the guidelines state that the “data principal shall be able to seek correction, amendments, or deletion of such data where it is inaccurate;”. The right to deletion should not be conditional on inaccuracy, especially when entering the scheme is optional and consent-based.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Data sharing with third parties without adequate safeguards&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The guidelines outline certain cases where personal data can be collected, used, or disclosed without the consent of the individual. One of these cases is when the data is anonymised. However, the guidelines do not detail how this anonymisation would be achieved and ensured through the life cycle of the data, especially when the clause states that the data will also be collected without consent. The guidelines also state that the anonymised data could be used for public health management, clinical research, or academic research. The guidelines should have limited the scope of academic research or added certain criteria to gain access to the data; the use of vague terminology could lead to this data (sometimes collected without consent) being de-anonymised or used for studies that could cause harm to the data principal or even a particular community. The guidelines state that the data can be shared as ‘protected health information’ with a government agency for oversight activities authorised by law, epidemic control, or in response to court orders. With the sharing of data, care should be taken to ensure data minimisation and purpose limitations that go beyond the explanations added in the body of the guidelines. In addition, the guidelines also introduce the concept of a ‘clean room’, which is defined as “a secure sandboxed area with access controls, where aggregated and anonymised or de-identified data may be shared for the purposes of developing inference or training models”. The definition does not state who will be developing these training models; it could be a cause of worry if AI companies or even insurance companies have the potential to use this data to train models that could eventually make decisions based on the results. 
The term ‘sandbox’ is explained under the now revoked DP Bill 2021 as “such live testing of new products or services in a controlled or test regulatory environment for which the Authority may or may not permit certain regulatory relaxations for a&lt;br /&gt;specified period for the limited purpose of the testing”. Neither the 2019 Bill nor the IT Act/Rules defines ‘sandbox’; the guidelines should have ideally spent more time explaining how the sandbox system in the ‘Clean Room’ works.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Conclusion&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The draft Data Sharing Guidelines are a welcome step in ensuring that the entities sharing and processing data have guidelines to adhere to, especially since the Data Protection Bill has not been passed yet. The mention of the best practices for data sharing in annexures, including practices for people who have access to the data, is a step in the right direction, which could be made better with regular training and sensitisation. While the guidelines are a good starting point, they still suffer from the issues that have been highlighted in similar health data policies, including not referring to older policies, adding new entities, and the reliance on digital and mobile technology. The guidelines could have added more nuance to the consent and privacy by design sections to ensure other forms of notice, e.g., notice in audio form in different Indian languages. While PM-JAY aims to reach 10 crore poor and vulnerable families, there is a need to look at how to ensure that consent is given according to the guidelines that are “free, informed, clear, and specific”.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/nha-data-sharing-guidelines'&gt;https://cis-india.org/internet-governance/blog/nha-data-sharing-guidelines&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Shweta Mohandas and Pallavi Bedi</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>IT Act</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2022-09-29T15:17:24Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/rssr-anamika-kundu-digvijay-s-chaudhary-april-20-2022-cctvs-in-public-spaces-and-data-protection-bill-2021">
    <title>CCTVs in Public Spaces and the Data Protection Bill, 2021</title>
    <link>https://cis-india.org/internet-governance/blog/rssr-anamika-kundu-digvijay-s-chaudhary-april-20-2022-cctvs-in-public-spaces-and-data-protection-bill-2021</link>
    <description>
        &lt;b&gt;This article has been authored by Ms. Anamika Kundu, Research Assistant at the Centre for Internet and Society, and Digvijay S. Chaudhary, Researcher at the Centre for Internet and Society. This blog is a part of RSRR’s Blog Series on the Right to Privacy and the Legality of Surveillance, in collaboration with the Centre for Internet &amp; Society.&lt;/b&gt;
        &lt;p&gt;&lt;span&gt;The article by Anamika Kundu and Digvijay S. Chaudhary was originally &lt;/span&gt;&lt;a class="external-link" href="https://rsrr.in/2022/04/20/cctv-surveillance-privacy/"&gt;published by RGNUL Student Research Review&lt;/a&gt;&lt;span&gt; on April 20, 2022&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;&lt;img src="https://cis-india.org/home-images/Surveillance.jpg/@@images/f8fad564-44ab-46e2-bd44-29607ea7fd19.jpeg" alt="Surveillance" class="image-inline" title="Surveillance" /&gt;&lt;/span&gt;&lt;/p&gt;
&lt;hr /&gt;
&lt;h2&gt;Introduction&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;In recent times, Indian cities have seen an expansion of state deployed CCTV cameras. According to a recent report, in terms of CCTVs deployed, Delhi was considered as the most surveilled city in the world, surpassing even the most surveilled cities in China. Delhi was not the only Indian city in that list, Chennai and Mumbai also made it to the list. In Hyderabad as well, the development of a Command and Control Centre aims to link the city’s surveillance infrastructure in real-time. Even though studies have shown that there is little correlation between CCTVs and crime control, deployment of CCTV cameras has been justified on the basis of national security and crime deterrence. Such an activity brings about the collection and retention of audio-visual/visual information of all individuals frequenting spaces where CCTV cameras are deployed. This information could be used to identify them (directly or indirectly) based on their looks or other attributes. Potential risks associated with the misuse, and processing of such personal data also arise. These risks include large scale profiling, criminal abuse (law enforcement misusing CCTV information for personal gains), and discriminatory targeting (law enforcement disproportionately focusing on a particular group of people). As these devices capture personal data of individuals, this article seeks data protection safeguards available to data principals against CCTV surveillance employed by the State in a public space under the proposed Data Protection Bill, 2021 (the “DPB”).&lt;/p&gt;
&lt;h2&gt;Safeguards Available Under the Data Protection Bill, 2021&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;To use CCTV surveillance, the measures and compliance listed under the DPB have to be followed. Obligations of data fiduciaries available under Chapter II, such as consent (clause 11), notice requirement (clause 7), and fair and reasonable processing (clause 5) are common to all data processing entities for a variety of activities. Similarly, as the DPB follows the principles of data minimisation (clause 6), storage limitation (clause 9), purpose limitation (clause 5), lawful and fair processing (clause 4), transparency (clause 23), and privacy by design (clause 22), these safeguards too are common to all data processing entities/activities. If a data fiduciary processes personal data of children, it has to comply with the standards stated under clause 16.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Under the DPB, compliance differs on the basis of grounds and purpose of data processing. As such, if compliance standards differ, so do the availability of safeguards under the DPB. Of relevance to this article, there are three standards of compliance under the DPB wherein the standards of safeguards available to a data principal differ. First, cases which would fall under Chapter III and hence, not require consent. Chapter III lists grounds for processing of personal data without consent. Second, cases which would fall under exemption clauses in Chapter VIII. In such cases, the DPB or some of its provisions would be inapplicable. Clause 35 under Chapter VIII gives power to the Central Government to exempt any agency from the application of the DPB. Similarly, Clause 36 under Chapter VIII, exempts certain provisions for certain processing of personal data. Third, cases which would not fall under either of the above Chapters. In such cases, all safeguards available under the DPB would be available to the data principals. Consequently, safeguards available to data principals in each of these standards are different. We will go through each of these separately.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;First, if the grounds of processing of CCTV information is such that it falls under the scope of Chapter III of the DPB, wherein the consent requirement is done away with, then in those cases, the notice requirement has to reflect such purpose, meaning that even if consent is not necessary for certain cases, other requirements under the DPB would still apply. Here, we must note that CCTV deployment by the state on such a large scale may be justified on the basis of conditions stated under clauses 12 and 14 of DPB – specifically, the condition for the performance of state function authorised by law, and public interest. The requirement under clause 12 of “authorised by law” simply means that the state function should have legal backing. Deployment of CCTVs is most likely to fall under clause 12 as various states have enacted legislations providing for CCTV deployment in the name of public safety. As a result, even if section 12 takes away the requirement of consent for certain cases, data principals should be able to exercise all rights accorded to them under the DPB (chapter V) except the right to data portability under clause 19.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Second, processing of personal data via CCTVs by government agencies could be exempted from DPB under clause 35 for certain cases under the clause. Another exemption that is particularly concerning with regard to the use of CCTVs is the exemption provided under clause 36(a). Section 36(a) says that the provisions of chapters II-VII would not apply where the data is processed in the interest of prevention, detection, investigation, and prosecution of any offence under the law. Chapters II-VII govern the obligations of data fiduciaries, grounds where consent would not be required, personal data of children, rights of data principals, transparency and accountability measures, and restrictions on transfer of personal data outside India respectively. In these cases, the requirement of fair and reasonable processing under clause 5 would also not apply. As a broad justification provided for CCTVs deployment by the government is crime control, it is possible that section 36(a) justification can be used to exempt the processing of CCTV footage from the above-mentioned safeguards.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;From the above discussion, the following can be concluded. First, if the grounds of processing fall under Chapter III, then standards of fair and reasonable processing, notice requirement, and all rights except the right to data portability u/s 19 would be available to data principals. Second, if the grounds of processing fall under clause 36, then, in that case, consent requirement, notice requirement, and the rights under DPB would be unavailable as that section mandates the non-application of those chapters. In such a case, even the processing requirements of a fair and reasonable manner stand suspended. Third, if the grounds of processing of CCTV information doesn’t fall under Chapter III, then all obligations listed under Chapter II would have to be followed. Moreover, the data principal would be able to exercise all the rights available under Chapter V of the DPB.&lt;/p&gt;
&lt;h2&gt;Constitutional Standards&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;When the Supreme Court recognised privacy as a fundamental right in the case of Puttaswamy v. Union of India (“Puttaswamy”), it located the principles of informed consent and purpose limitation as central to informational privacy. It recognised that privacy inheres not in spaces but in an individual. It also recognised that privacy is not an absolute right and certain restrictions may be imposed on the exercise of the right. Before listing the constitutional standards that activities infringing privacy must adhere to, it’s important to answer whether there exists a reasonable expectation of privacy in CCTV footage deployed in a public space by the State?&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In Puttaswamy, the court recognised that privacy is not denuded in public spaces. Writing for the plurality judgement, Chandrachud J. recognised that the notion of a reasonable expectation of privacy has elements of both a subjective and an objective nature. Defining these concepts, he writes, “Privacy at a subjective level is a reflection of those areas where an individual desires to be left alone. On an objective plane, privacy is defined by those constitutional values which shape the content of the protected zone where the individual ought to be left alone…hence while the individual is entitled to a zone of privacy, its extent is based not only on the subjective expectation of the individual but on an objective principle which defines a reasonable expectation.” Note how in these sentences the plurality judgement locates “a reasonable expectation” in “constitutional values”. This is important because the meaning of what is reasonable is to be constituted according to constitutional values, not societal norms. A second consideration is that an individual’s reasonable expectation is allied to the purpose for which the information is provided, as held in District Registrar and Collector, Hyderabad v. Canara Bank (“Canara Bank”). Finally, the third consideration is that the phrase is context dependent. For example, in In the matter of an application by JR38 for Judicial Review (Northern Ireland) [2015] UKSC 42, the UK Supreme Court was faced with a scenario where the police published the CCTV footage of the appellant involved in riotous behaviour. 
The question before the court was: “Whether the publication of photographs by the police to identify a young person suspected of being involved in riotous behaviour and attempted criminal damage can ever be a necessary and proportionate interference with that person’s article 8 [privacy] rights?” The majority held that there was no reasonable expectation of privacy in the case because of the nature of the criminal activity the appellant was involved in. However, the majority’s formulation of this conclusion was based on the reasoning that “expectation of privacy” was dependent on the “identification” purpose of the police. The court stated, “Thus, if the photographs had been published for some reason other than identification, the position would have been different and might well have engaged his rights to respect for his private life within article 8.1”. Therefore, as the purpose of publishing the footage was “identification” of the wrongdoer, the reasonable expectation of privacy stood excluded. The Canara Bank case was relied on by the SC in Puttaswamy. The plurality judgement in Puttaswamy also quoted the above paragraphs from the UK Supreme Court judgement.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Finally, the SC in the Aadhaar case laid down the factors of “reasonable expectation of privacy.” Relying on those factors, the Supreme Court observed that demographic information and photographs do not raise a reasonable expectation of privacy. It further held that face photographs for the purpose of identification are not covered by a reasonable expectation of privacy. As this author has recognised, the majority in the Aadhaar case misconstrued the “reasonable expectation of privacy” to lie not in constitutional values, as held in Puttaswamy, but in societal norms. Even with this misapplication of the Puttaswamy principles by the majority in Aadhaar, it is clear that the exclusion of a “reasonable expectation of privacy” in face photographs is valid only for the purpose of “identification”. For purposes other than “identification”, there should exist a reasonable expectation of privacy in CCTV footage. Having recognised the existence of a “reasonable expectation of privacy” in CCTV footage, let’s see how the safeguards mentioned under the DPB stand up to the constitutional standards of privacy laid down in Puttaswamy.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The bench in Puttaswamy located privacy not only in Article 21 but in the entirety of Part III of the Indian Constitution. Where a transgression of privacy relates to different provisions under Part III, the tests evolved under those Articles would apply. Puttaswamy recognised that national security and crime control are legitimate state objectives. However, it also recognised that any limitation on the right must satisfy the proportionality test, which requires a legitimate state aim, a rational nexus, necessity, and a balancing of interests. Since the right to privacy is infringed under the first and second processing standards identified above, CCTV surveillance must satisfy this test. The first requirement of proportionality stands satisfied, as national security and crime control have been recognised to be legitimate state objectives. However, it must be noted that the EU Guidelines on Processing of Personal Data through video devices state that the mere purpose of “safety” or “for your safety” is not sufficiently specific and is contrary to the principle that personal data shall be processed lawfully, fairly and in a transparent manner in relation to the data subject. The second requirement is a rational nexus. As stated above, there is little correlation between crime control and surveillance measures. Even if the state justifies a rational nexus between the state aim and the action employed, it is the necessity limb of the proportionality test where CCTV surveillance measures fail (as explained by this author). Necessity requires us to draw up a list of alternatives and their impact on the individual, and then carry out a balancing analysis of those alternatives. Here, judicial scrutiny of the exemption order under clause 35 is a viable alternative that respects individual rights while not interfering with the state’s aim.&lt;/p&gt;
&lt;h2&gt;Conclusion&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;Informed consent and purpose limitation were stated to be central principles of informational privacy in Puttaswamy. Among the three standards we identified, both principles remain available only in the third. In the first standard, even though the requirement of consent is unavailable, the principle of purpose limitation would still apply to the processing of such data. The second standard is of particular concern, as neither principle is available to data principals. It is worth mentioning that in large-scale monitoring activities such as CCTV surveillance, the safeguards which the DPB lists out would inevitably face an implementation problem: in scenarios where individuals refuse consent to large-scale CCTV monitoring, what alternatives would the government offer those individuals? Practically, CCTV surveillance would fall under the clause 12 standards, where consent would not be required. Even in those cases, would the notice requirement be diminished to “you are under surveillance” notices? And when we talk about the exercise of rights available under the DPB, how would an individual effectively exercise their rights when the data processing is not limited to a particular individual? These questions arise because the safeguards under the DPB (and data protection laws in general) are based on individualistic notions of privacy. Interestingly, individual use of CCTVs has also increased alongside state use. Deployment of CCTVs for personal or domestic purposes would be exempt from the above-mentioned compliances, as it would fall under the exemption provision of clause 36(d). 
Two additional concerns arise in relation to the processing of data concerning CCTVs – the JPC report’s inclusion of Non-Personal Data (“NPD”) within the ambit of the DPB, and the government’s plan to develop a National Automated Facial Recognition System (“AFRS”). A significant part of the data collected by CCTVs would fall within the ambit of NPD. With the JPC’s recommendation, it will be interesting to follow the processing standards for NPD under the DPB. The AFRS has been imagined as a national database of photographs gathered from various agencies, to be used in conjunction with facial recognition technology. The use of facial recognition technology with CCTV cameras raises concerns surrounding biometric data and risks of large-scale profiling. Indeed, section 27 of the DPB reflects this risk and mandates a data protection impact assessment by the data fiduciary for processing involving new technologies, large-scale profiling, or the use of biometric data by such technologies; however, the DPB does not define what “new technology” means. Concerns around biometric data are outside the scope of the present article; still, it would be interesting to look at how the use of facial recognition technology with CCTVs could impact the safeguards under the DPB.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/rssr-anamika-kundu-digvijay-s-chaudhary-april-20-2022-cctvs-in-public-spaces-and-data-protection-bill-2021'&gt;https://cis-india.org/internet-governance/blog/rssr-anamika-kundu-digvijay-s-chaudhary-april-20-2022-cctvs-in-public-spaces-and-data-protection-bill-2021&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Anamika Kundu and Digvijay S Chaudhary</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2022-04-28T02:29:42Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/news-nine-shweta-mohandas-and-anamika-kundu-personal-data-protection-bill-must-examine-data-collection-practices-that-emerged-during-pandemic">
    <title>Personal Data Protection Bill must examine data collection practices that emerged during pandemic</title>
    <link>https://cis-india.org/internet-governance/blog/news-nine-shweta-mohandas-and-anamika-kundu-personal-data-protection-bill-must-examine-data-collection-practices-that-emerged-during-pandemic</link>
    <description>
&lt;b&gt;The PDP bill is speculated to be introduced during the winter session of parliament. The PDP Bill in its current form provides wide-ranging exemptions which allow government agencies to process citizens’ data in order to fulfil their responsibilities. The bill could ensure that employers have some responsibility towards the data they collect from employees.

&lt;/b&gt;
        &lt;p&gt;The article by Shweta Mohandas and Anamika Kundu was &lt;a class="external-link" href="https://www.news9live.com/technology/personal-data-protection-bill-must-examine-data-collection-practices-that-emerged-during-pandemic-137031?infinitescroll=1"&gt;originally published by &lt;strong&gt;news nine&lt;/strong&gt;&lt;/a&gt; on November 29, 2021.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;The Personal Data Protection Bill (PDP) is speculated to be introduced during the winter session of parliament, and the report of the Joint Parliamentary Committee (JPC) has already been &lt;a class="external-link" href="https://www.thehindu.com/news/national/parliamentary-panel-retains-controversial-exemption-clause-in-personal-data-protection-bill/article37633344.ece"&gt;adopted&lt;/a&gt; by the committee on Monday. The report of the JPC comes after almost two years of deliberation and secrecy over what the final version of the Personal Data Protection Bill will look like. Since the publication of the &lt;a class="external-link" href="https://prsindia.org/files/bills_acts/bills_parliament/2019/Personal%20Data%20Protection%20Bill,%202019.pdf"&gt;2019 version&lt;/a&gt; of the PDP Bill, the Covid-19 pandemic and the accompanying public safety measures have opened the way for a number of new organisations and reasons to collect personal data that did not exist in 2019. Hence, along with the changes suggested by multiple civil society organisations and the dissent notes submitted by members of the JPC, the new version of the PDP Bill must also look at how data processing has changed over the span of two years.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Concerns with the bill&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;At the outset, there are certain parts of the PDP Bill which need to be revised in order to uphold the spirit of privacy and individual autonomy laid out in the Puttaswamy judgement. The two sections that need to be brought in line with the privacy judgement are the ones that allow for non-consensual processing of data by the government and by employers. The PDP Bill in its current form provides wide-ranging exemptions which allow government agencies to process citizens' data in order to fulfil their &lt;a class="external-link" href="https://www.livemint.com/news/india/big-brother-on-top-in-data-protection-bill-11576164271430.html"&gt;responsibilities&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In the &lt;a class="external-link" href="https://www.meity.gov.in/writereaddata/files/Personal_Data_Protection_Bill,2018.pdf"&gt;2018 version&lt;/a&gt; of the bill, drafted by the Justice Srikrishna Committee, exemptions granted to the State with regard to processing of data were subject to a four-pronged test which required the processing to be (i) authorised by law; (ii) in accordance with the procedure laid down by the law; (iii) necessary; and (iv) proportionate to the interests being achieved. This four-pronged test was in line with the principles laid down by the Supreme Court in the Puttaswamy judgement. The 2019 version of the PDP Bill has diluted this principle by merely retaining the 'necessity' requirement and removing the others, which is not in consonance with the test laid down by the Supreme Court in Puttaswamy.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Section 35 was also widely discussed in the panel meetings, where members had &lt;a class="external-link" href="https://www.thehindu.com/news/national/parliamentary-panel-retains-controversial-exemption-clause-in-personal-data-protection-bill/article37633344.ece"&gt;argued&lt;/a&gt; for the removal of 'public order' as a ground for exemption. The panel also insisted on '&lt;a class="external-link" href="https://www.thehindu.com/news/national/parliamentary-panel-retains-controversial-exemption-clause-in-personal-data-protection-bill/article37633344.ece"&gt;judicial or parliamentary oversight&lt;/a&gt;' to grant such exemptions. The final report did not accept these suggestions, stating a need to balance the &lt;a class="external-link" href="https://www.thehindu.com/news/national/parliamentary-panel-retains-controversial-exemption-clause-in-personal-data-protection-bill/article37633344.ece"&gt;national security, liberty and privacy&lt;/a&gt; of an individual. There ought to be prior judicial review of the written order exempting a governmental agency from any provisions of the bill; allowing the government to claim an exemption whenever it is satisfied that it is "necessary or expedient" is open to misuse.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Another clause that sidesteps the data principal relates to employee data. Section 13 of the current version of the bill provides the employer with leeway to process employee data (other than sensitive personal data) without consent on two grounds: when consent is not appropriate, or when obtaining consent would involve disproportionate effort on the part of the employer.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The personal data so collected can only be collected for recruitment, termination, attendance, provision of any service or benefit, and assessing performance. This covers almost all of the activities that require data about the employee. Although the 2019 version of the bill excludes non-consensual collection of sensitive personal data (a provision that was missing in the 2018 version), there is still considerable scope to improve this provision and give employees further rights to their data. At the outset, the bill does not define 'employee' and 'employer', which could result in confusion, as there is no single definition of these terms across Indian labour laws.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Additionally, the bill distinguishes between the employee and the consumer, with the consumer of the same company or service having a greater right to their data than an employee: the consumer as a data principal has the option to use any other product or service and the right to withdraw consent at any time, whereas for an employee the consequence of refusing or withdrawing consent could be termination of employment. It is understood that there is a requirement for employee data to be collected, and that consent does not work the same way as it does in the case of a consumer.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The bill could ensure that employers have some responsibility towards the data they collect from employees, such as ensuring that it is used only for the purpose for which it was collected, that employees know how long their data will be retained, and that they know whether the data is being processed by third parties. It is also worth mentioning that the Indian government is India's largest employer, spanning a variety of agencies and public enterprises.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Concerns highlighted by JPC Members&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Turning to the few members of the JPC who moved dissent notes, specifically with regard to governmental exemptions: Jairam Ramesh filed a &lt;a href="https://www.news9live.com/india/parliament-panel-adopts-report-on-data-protection-amid-dissent-by-opposition-135591"&gt;dissent note&lt;/a&gt;, and many other opposition members followed suit. While Jairam Ramesh praised the JPC's functioning, he disagreed with certain aspects of the report. According to him, the 2019 bill is designed in a manner where the right to privacy is given importance only in cases of private activities. He raised concerns regarding the unbridled powers given to the government to exempt itself from any of the provisions.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The amendment suggested by him would require parliamentary approval before an exemption could take place. He also added that Section 12 of the bill, which provides certain scenarios where consent is not needed for processing of personal data, should have been made '&lt;a href="https://www.hindustantimes.com/india-news/mps-file-dissent-notes-over-glaring-lacunae-in-report-on-data-protection-bill-101637566365637.html"&gt;less sweeping&lt;/a&gt;'. Similarly, Gaurav Gogoi's &lt;a href="https://www.hindustantimes.com/india-news/mps-file-dissent-notes-over-glaring-lacunae-in-report-on-data-protection-bill-101637566365637.html"&gt;note&lt;/a&gt; stated that the exemptions would create a surveillance state and similarly criticised Sections 12 and 35 of the bill. He also mentioned that there ought to be parliamentary oversight of the exemptions provided in the bill.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;On the same issue, Congress leader Manish Tiwari noted that the bill creates '&lt;a href="https://timesofindia.indiatimes.com/business/india-business/personal-data-protection-bill-what-is-it-and-why-is-the-opposition-so-unhappy-with-it/articleshow/87869391.cms"&gt;parallel universes&lt;/a&gt;' - one for the private sector, which needs to be compliant, and the other for the State, which can exempt itself. He has opposed the entire bill, stating that there exists an "inherent design flaw". He has raised specific objections to 37 clauses and stated that any blanket exemption to the state goes against the Puttaswamy judgement.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In their joint &lt;a href="https://www.news9live.com/india/tmc-congress-mps-submit-dissent-notes-to-joint-panel-on-personal-data-protection-bill-135491"&gt;dissent note&lt;/a&gt;, Derek O'Brien and Mahua Moitra said that there is a lack of adequate safeguards to protect data principals' privacy, and that there was a lack of time and opportunity for stakeholder consultations. They also pointed out that the independence of the DPA will cease to exist under the present provision allowing the government the power to choose its members and chairman. Amar Patnaik objected to the lack of inclusion of state-level authorities in the bill; without such bodies, he says, there would be a federal override.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Conclusion&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;While a number of issues were highlighted by civil society, the members of the JPC, and the media, the new version of the bill also needs to take into account the shifts that have taken place in view of the pandemic. It should take into consideration the new data collection practices that have emerged during the pandemic, be comprehensive, and leave very few provisions to be decided later by the Rules.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/news-nine-shweta-mohandas-and-anamika-kundu-personal-data-protection-bill-must-examine-data-collection-practices-that-emerged-during-pandemic'&gt;https://cis-india.org/internet-governance/blog/news-nine-shweta-mohandas-and-anamika-kundu-personal-data-protection-bill-must-examine-data-collection-practices-that-emerged-during-pandemic&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Shweta Mohandas and Anamika Kundu</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2022-03-30T15:15:21Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/ijlt-shweta-mohandas-and-anamika-kundu-march-6-2022-nothing-to-kid-about-childrens-data-under-the-new-data-protection-bill">
    <title>Nothing to Kid About – Children's Data Under the New Data Protection Bill</title>
    <link>https://cis-india.org/internet-governance/blog/ijlt-shweta-mohandas-and-anamika-kundu-march-6-2022-nothing-to-kid-about-childrens-data-under-the-new-data-protection-bill</link>
    <description>
        &lt;b&gt;The pandemic has forced policymakers to adapt their approach to people's changing practices, from looking at contactless ways of payment to the shifting of educational institutions online.&lt;/b&gt;
        &lt;p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 iWv3d b+iTF _78FBa _1FoOD iWv3d _1j-51 mm8Nw" style="text-align: justify; "&gt;The article was originally &lt;a class="external-link" href="https://www.ijlt.in/post/nothing-to-kid-about-children-s-data-under-the-new-data-protection-bill"&gt;published in the Indian Journal of Law and Technology&lt;/a&gt;&lt;/p&gt;
&lt;hr /&gt;
&lt;p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 iWv3d b+iTF _78FBa _1FoOD iWv3d _1j-51 mm8Nw" style="text-align: justify; "&gt;For children, the internet has shifted from being a form of entertainment to a medium to connect with friends and seek knowledge and education. However, each time they access the internet, data about them and their choices is inadvertently recorded by companies and unknown third parties. The growth of EdTech apps in India has led to growing concerns regarding children's data privacy. This has prompted the creation of a &lt;a class="_1lsz7 _3Bkfb" href="https://economictimes.indiatimes.com/tech/startups/edtech-firms-work-to-get-communication-right-with-the-asci/articleshow/89082308.cms" rel="noopener noreferrer" target="_blank"&gt;self-regulatory&lt;/a&gt; body, the Indian EdTech Consortium. More recently, the &lt;a class="_1lsz7 _3Bkfb" href="https://economictimes.indiatimes.com/tech/startups/edtech-firms-work-to-get-communication-right-with-the-asci/articleshow/89082308.cms" rel="noopener noreferrer" target="_blank"&gt;Advertising Standards Council of India&lt;/a&gt;&lt;span class="_3zM-5"&gt; has &lt;/span&gt;also started looking at passing a draft regulation to keep a check on EdTech advertisements.&lt;/p&gt;
&lt;p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 iWv3d b+iTF _78FBa _1FoOD iWv3d _1j-51 mm8Nw" style="text-align: justify; "&gt;The Joint Parliamentary Committee (JPC), tasked with drafting and revising the Data Protection Bill, had to consider the number of changes that had happened after the release of the 2019 version of the Bill. The most significant change was the removal of the term “personal data” from the title of the Bill, a move to create a comprehensive Data Protection Bill that includes both personal and non-personal data; certain other provisions of the Bill also featured additions and removals. The JPC, in its revised version of the Bill, has removed an entire class of &lt;a class="_1lsz7 _3Bkfb" href="https://prsindia.org/billtrack/the-personal-data-protection-bill-2019#:~:text=Obligations%20of%20data%20fiduciary%3A%20A,specific%2C%20clear%20and%20lawful%20purpose" rel="noopener noreferrer" target="_blank"&gt;data fiduciaries&lt;/a&gt; – the guardian data fiduciary – which was tasked with greater responsibility for managing children's data. The JPC justified this removal by stating that consent from the guardian of the child is enough to meet the end for which personal data of children are processed by the data fiduciary. However, while thought has been given to how consent is given by the guardian on behalf of the child, there was no change to the age of children in the Bill. Keeping the age of consent under the Bill the same as the age of majority to enter into a contract under the Indian Contract Act, 1872 – 18 years – reveals the disconnect the law has with the ground reality of how children interact with the internet.&lt;/p&gt;
&lt;p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 iWv3d b+iTF _78FBa _1FoOD iWv3d _1j-51 mm8Nw" style="text-align: justify; "&gt;In the current state of affairs, where Indian children are navigating the digital world on their own, there is a need to look deeply at the processing of children’s data as well as ways to ensure that children have information about consent and informational privacy. By placing the onus of granting consent on parents, the PDP Bill fails to look at how consent works in a privacy policy–based consent model and how this, in turn, harms children in the long run.&lt;/p&gt;
&lt;h3 class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 iWv3d aujbK _3M0Fe _1FoOD iWv3d _1j-51 mm8Nw"&gt;1. Age of Consent&lt;/h3&gt;
&lt;p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 iWv3d b+iTF _78FBa _1FoOD iWv3d _1j-51 mm8Nw" style="text-align: justify; "&gt;By setting the age of consent at 18 years, the Data Protection Bill, 2021 brings all individuals under 18 years of age under one umbrella, without making a distinction between the internet usage of a 5-year-old child and a 16-year-old teenager. There is a need to look at the current internet usage habits of children and assess whether requiring parental consent is reasonable or even practical. It is also pertinent to note that the law in the offline world does make a distinction between age and maturity. For example, it has been &lt;a class="_1lsz7 _3Bkfb" href="https://cis-india.org/internet-governance/blog/pallavi-bedi-and-shweta-mohandas-cis-comments-on-data-protection-bill" rel="noopener noreferrer" target="_blank"&gt;highlighted&lt;/a&gt; that Section 82 of the Indian Penal Code, read with Section 83, states that any act by a child under the age of 12 years shall not be considered an offence, while the maturity of those aged between 12 and 18 years will be decided by the court (individuals between the ages of 16 and 18 years can also be tried as adults for heinous crimes). Similarly, child labour laws in the country allow children above the age of 14 years to work in non-hazardous industries, which would qualify them to fall under Section 13 of the Bill, which deals with employee data.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;A 2019 &lt;/span&gt;&lt;a class="_1lsz7 _3Bkfb" href="https://reverieinc.com/wp-content/uploads/2020/09/IAMAI-Digital-in-India-2019-Round-2-Report.pdf" rel="noopener noreferrer" target="_blank"&gt;report&lt;/a&gt;&lt;span&gt; suggests that two-thirds of India’s internet users are in the 12–29 years age group, accounting for about 21.5% of the total internet usage in metro cities. With the emergence of cheaper phones equipped with faster processing and low internet data costs, children are no longer passive consumers of the internet. They have social media accounts and use several applications to interact with others and make purchases. There is a need to examine how children and teenagers interact with the internet as well as the practicality of requiring parental consent for the usage of applications.&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Most applications that require age data request users to type in their date of birth; it is not difficult for a child to input a suitable date that would make it appear that they are &lt;/span&gt;&lt;a class="_1lsz7 _3Bkfb" href="https://www.theguardian.com/media/2013/jul/26/children-lie-age-facebook-asa" rel="noopener noreferrer" target="_blank"&gt;over 18&lt;/a&gt;&lt;span&gt;. In such cases they are still children, but the content presented to them would be meant for adults, including content that might be disturbing or that involves &lt;/span&gt;&lt;a class="_1lsz7 _3Bkfb" href="https://www.theguardian.com/media/2013/jul/26/children-lie-age-facebook-asa" rel="noopener noreferrer" target="_blank"&gt;alcohol and gambling&lt;/a&gt;&lt;span&gt;. Additionally, in their privacy policies, applications sometimes state that they are not suited for, and are restricted from, users under 18. Here, data fiduciaries avoid liability by placing the onus on the user to declare their age and to properly read and understand the privacy policy.&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Reservations about the age of consent under the Bill have also been highlighted by some members of the JPC through their dissenting opinions. &lt;/span&gt;&lt;a class="_1lsz7 _3Bkfb" href="http://164.100.47.193/lsscommittee/Joint%20Committee%20on%20the%20Personal%20Data%20Protection%20Bill,%202019/17_Joint_Committee_on_the_Personal_Data_Protection_Bill_2019_1.pdf#page=221" rel="noopener noreferrer" target="_blank"&gt;MP Ritesh Pandey &lt;/a&gt;&lt;span&gt;suggested that the age of consent should be reduced to 14 years keeping the best interest of the children in mind as well as to support children in benefiting from technological advances. Similarly, &lt;/span&gt;&lt;a class="_1lsz7 _3Bkfb" href="http://164.100.47.193/lsscommittee/Joint%20Committee%20on%20the%20Personal%20Data%20Protection%20Bill,%202019/17_Joint_Committee_on_the_Personal_Data_Protection_Bill_2019_1.pdf#page=221" rel="noopener noreferrer" target="_blank"&gt;MP Manish Tiwari &lt;/a&gt;&lt;span&gt;in his dissenting opinion suggested regulating data fiduciaries based on the type of content they provide or data they collect.&lt;/span&gt;&lt;/p&gt;
&lt;h3&gt;&lt;span&gt;2. How is the 2021 Bill Different from the 2019 Bill?&lt;/span&gt;&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;The &lt;/span&gt;&lt;a class="_1lsz7 _3Bkfb" href="http://164.100.47.4/BillsTexts/LSBillTexts/Asintroduced/373_2019_LS_Eng.pdf" rel="noopener noreferrer" target="_blank"&gt;2019 &lt;/a&gt;&lt;span&gt;draft of the Bill consisted of a class of data fiduciaries called guardian data fiduciaries – entities that operate commercial websites or online services directed at children or which process large volumes of children’s personal data. This class of fiduciaries was barred from profiling, tracking, behavioural monitoring, and running targeted advertising directed at children and undertaking any other processing of personal data that can cause significant harm to the child. In the previous draft, such data fiduciaries were not allowed to engage in ‘profiling, tracking, behavioural monitoring of children, or direct targeted advertising at children’. There was also a prohibition on conducting any activities that might significantly harm the child. As per Chapter IV, any violation could attract a penalty of up to INR 15 crore of the worldwide turnover of the data fiduciary for the preceding financial year, whichever is higher. However, this separate class of data fiduciaries do not have any additional responsibilities. It is also unclear as to whether a data fiduciary that does not by definition fall within such a category would be allowed to engage in activities that could cause ‘significant harm’ to children.&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;The new Bill also does not provide any mechanisms for age verification and only lays down considerations that verification processes should be undertaken. Furthermore, the JPC has suggested that consent options available to the child when they attain the age of majority i.e. 18 years should be included within the rule frame by the Data Protection Authority instead of being an amendment in the Bill.&lt;/span&gt;&lt;/p&gt;
&lt;h3&gt;&lt;span&gt;3. In the Absence of a Guardian Data Fiduciary&lt;/span&gt;&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;The 2018 and 2019 drafts of the PDP Bill consider a child to be any person below the age of 18 years. For a child to access online services, the data fiduciary must first verify the age of the child and obtain consent from their guardian. The Bill does not provide an explicit process for age verification apart from stating that regulations shall be drafted in this regard. The 2019 Bill states that the Data Protection Authority shall specify codes of practice in this matter. Taking best practices into account, there is a need for ‘&lt;/span&gt;&lt;a class="_1lsz7 _3Bkfb" href="https://cuts-ccier.org/pdf/project-brief-highlighting-inclusive-and-practical-mechanisms-to-protect-childrens-data.pdf" rel="noopener noreferrer" target="_blank"&gt;user-friendly and privacy-protecting age verification techniques&lt;/a&gt;&lt;span&gt;’ to encourage safe navigation across the internet. This will require &lt;/span&gt;&lt;a class="_1lsz7 _3Bkfb" href="https://cuts-ccier.org/pdf/bp-global-technological-developments-in-age-verification-and-age-estimation.pdf" rel="noopener noreferrer" target="_blank"&gt;looking at &lt;/a&gt;&lt;span&gt;technological developments and different standards worldwide. There is a need to hold companies &lt;/span&gt;&lt;a class="_1lsz7 _3Bkfb" href="https://www.livemint.com/opinion/columns/theres-a-better-way-to-protect-the-online-privacy-of-kids-11615306723478.html" rel="noopener noreferrer" target="_blank"&gt;accountable&lt;/a&gt;&lt;span&gt; for the protection of children’s online privacy and the harm that their algorithms cause children and to make sure that they are not continued.&lt;/span&gt;&lt;/p&gt;
&lt;p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 iWv3d b+iTF _78FBa _1FoOD iWv3d _1j-51 mm8Nw" style="text-align: justify; "&gt;The JPC in the 2021 version of the Bill removed provisions about guardian data fiduciaries, stating that there was no advantage in creating a different class of data fiduciary. As per the JPC, even those data fiduciaries that did not fall within the said classification would also need to comply with rules pertaining to the personal data of children i.e. with Section 16 of the Bill. Section 16 of the Bill requires the data fiduciary to verify the child’s age and obtain consent from the parent/guardian. The manner of age verification has also een spelt out.  Furthermore, since ‘significant data fiduciaries’ is an existing class, there is still a need to comply with rules related to data processing. The JPC also removed the phrase “in the best interests of, the child” and “is in the best interests of, the child” under sub-clause 16(1), implying that the entire Bill concerned the rights of the data principal and the use of such terms dilutes the purpose of the legislation and could give way to manipulation by the data fiduciary.&lt;/p&gt;
&lt;h3&gt;&lt;span&gt;Conclusion&lt;/span&gt;&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Over the past two years, there has been a significant increase in applications that are targeted at children. There has been a proliferation of EduTech apps, which ideally should have more responsibility as they are processing children's data. We recommend that instead of creating a separate category, such fiduciaries collecting children's data or providing services to children be seen as ‘significant data fiduciaries’ that need to take up additional compliance measures.&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Furthermore, any blanket prohibition on tracking children may obstruct safety measures that could be implemented by data fiduciaries. These fears are also increasing in other jurisdictions as there is a likelihood to restrict data fiduciaries from using software that looks out for such as &lt;/span&gt;&lt;a class="_1lsz7 _3Bkfb" href="https://www.unodc.org/e4j/en/cybercrime/module-12/key-issues/online-child-sexual-exploitation-and-abuse.html" rel="noopener noreferrer" target="_blank"&gt;Child Sexual Abuse Material&lt;/a&gt;&lt;span&gt; as well as  online predatory behaviour. Additionally, concerning the age of consent under the Bill, the JPC could look at international best practices and come up with ways to make sure that children can use the internet and have rights over their data, which would enable them to grow up with more awareness about data protection and privacy. One such example to look at could be the Children's Online Privacy Protection Rule (COPPA) in the US, where the rules apply to operators of websites and online services that collect personal information from kids &lt;/span&gt;&lt;a class="_1lsz7 _3Bkfb" href="https://www.ftc.gov/tips-advice/business-center/guidance/childrens-online-privacy-protection-rule-six-step-compliance" rel="noopener noreferrer" target="_blank"&gt;under 13 &lt;/a&gt;&lt;span&gt;or provide services to children that are directed at a general audience, but have actual knowledge that they collect personal information from such children. A form of combination of this system and the significant data fiduciary classification could be one possible way to ensure that children’s data and privacy are preserved online.&lt;/span&gt;&lt;/p&gt;
&lt;hr /&gt;
&lt;p&gt;The authors are researchers at the Centre for Internet and Society and thank their colleague Arindrajit Basu for his inputs.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/ijlt-shweta-mohandas-and-anamika-kundu-march-6-2022-nothing-to-kid-about-childrens-data-under-the-new-data-protection-bill'&gt;https://cis-india.org/internet-governance/blog/ijlt-shweta-mohandas-and-anamika-kundu-march-6-2022-nothing-to-kid-about-childrens-data-under-the-new-data-protection-bill&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Shweta Mohandas and Anamika Kundu</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Digitalisation</dc:subject>
    
    
        <dc:subject>Digital Knowledge</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Data Management</dc:subject>
    

   <dc:date>2022-03-10T13:19:52Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/medianama-february-21-2022-amber-sinha-data-protection-bill-digital-healthcare-case-study">
    <title>Clause 12 Of The Data Protection Bill And Digital Healthcare: A Case Study</title>
    <link>https://cis-india.org/internet-governance/blog/medianama-february-21-2022-amber-sinha-data-protection-bill-digital-healthcare-case-study</link>
    <description>
        &lt;b&gt;In light of the state’s emerging digital healthcare apparatus, how does Clause 12 alter the consent and purpose limitation model?&lt;/b&gt;
        &lt;p&gt;The blog post was &lt;a class="external-link" href="https://www.medianama.com/2022/02/223-data-protection-bill-digital-healthcare-case-study/"&gt;published in Medianama&lt;/a&gt; on February 21, 2022. This is the second in a two-part series by Amber Sinha.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;In the &lt;a href="https://www.medianama.com/2022/02/223-data-protection-bill-consent-clause-state-function/"&gt;previous post&lt;/a&gt;, I looked at provisions on non-consensual data processing for state functions under the most recent version of recommendations by the Joint Parliamentary Committee on India’s Data Protection Bill (DPB). The true impact of these provisions can only be appreciated in light of ongoing policy developments and real-life implications.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;To appreciate the significance of the dilutions in Clause 12, let us consider the Indian state’s range of schemes promoting digital healthcare. In July 2018, NITI Aayog, a central government policy think tank in India released a strategy and approach paper (Strategy Paper) on the formulation of the National Health Stack which envisions the creation of a federated application programming interface (API)-enabled health information ecosystem. While the Ministry of Health and Family Welfare has focused on the creation of Electronic Health Records (EHR) Standards for India during the last few years and also identified a contractor for the creation of a centralised health information platform (IHIP), this Strategy Paper advocates a completely different approach, which is described as a Personal Health Records (PHR) framework. In 2021, the National Digital Health Mission (NDHM) was launched under which a citizen shall have the option to obtain a digital health ID. A digital health ID is a unique ID and will carry all health records of a person.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;A Stack Model for Big Data Ecosystem in Healthcare&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;A stack model as envisaged in the Strategy Paper, consists of several layers of open APIs connected to each other, often tied together by a unique health identifier. The open nature of APIs has the advantage that it allows public and private actors to build solutions on top of it, which are interoperable with all parts of the stack. It is however worth considering both the ‘openness’ and the role that the state plays in it.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Even though the APIs are themselves open, they are a part of a pre-decided technological paradigm, built by private actors and blessed by the state. Even though innovators can build on it, the options available to them are limited by the information architecture created by the stack model. When such a technological paradigm is created for healthcare reform and health data, the stack model poses additional challenges. By tying the stack model to the unique identity, without appropriate processes in place for access control, siloed information, and encrypted communication, the stack model poses tremendous privacy and security concerns. The broad language under Clause 12 of the DPB needs to be looked at in this context.&lt;/p&gt;
&lt;p&gt;Clause 12 allows non-consensual processing of personal data where it is necessary “for the performance of any function of the state authorised by law” in order to provide a service or benefit from the State. In the previous post, I had highlighted the import of the use of only ‘necessity’ to the exclusion of ‘proportionality’. Now, we need to consider its significance in light of the emerging digital healthcare apparatus being created by the state.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The National Health Stack and National Digital Health Mission together envision an intricate system of data collection and exchange which in a regulatory vacuum would ensure unfettered access to sensitive healthcare data for both the state and private actors registered with the platforms. The Stack framework relies on repositories where data may be accessed from multiple nodes within the system. Importantly, the Strategy Paper also envisions health data fiduciaries to facilitate consent-driven interaction between entities that generate the health data and entities that want to consume the health records for delivering services to the individual. The cast of characters involve the National Health Authority, health care providers and insurers who access the National Health Electronic Registries, unified data from different programmes such as National Health Resource Repository (NHRR), NIN database, NIC and the Registry of Hospitals in Network of Insurance (ROHINI), private actors such as Swasth, iSpirt who assist the Mission as volunteers. The currency that government and private actors are interested in is data.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The promised benefits of healthcare data in an anonymised and aggregate form range from Disease Surveillance to Pharmacovigilance as well as Health Schemes Management Systems and Nutrition Management, benefits which have only been more acutely emphasised during the pandemic. However, the pandemic has also normalised the sharing of sensitive healthcare data with a variety of actors, without much thinking on much-needed data minimisation practises.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The potential misuses of healthcare data include greater state surveillance and control, predatory and discriminatory practices by private actors which rely on Clause 12 to do away with even the pretense of informed consent so long as the processing of data is deemed necessary by the state and its private sector partners to provide any service or benefit.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Subclause (e) in Clause 12, which was added in the last version of the Bill drafted by MeitY and has been retained by the JPC, allows processing wherever it is necessary for ‘any measures’ to provide medical treatment or health services during an epidemic, outbreak or threat to public health. Yet again, the overly-broad language used here is designed to ensure that any annoyances of informed consent can be easily brushed aside wherever the state intends to take any measures under any scheme related to public health.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Effectively, how does the framework under Clause 12 alter the consent and purpose limitation model? Data protection laws introduce an element of control by tying purpose limitation to consent. Individuals provide consent to specified purposes, and data processors are required to respect that choice. Where there is no consent, the purposes of data processing are sought to be limited by the necessity principle in Clause 12. The state (or authorised parties) must be able to demonstrate necessity to the exercise of state function, and data must only be processed for those purposes which flow out of this necessity. However, unlike the consent model, this provides an opportunity to keep reinventing purposes for different state functions.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In the absence of a data protection law, data collected by one agency is shared indiscriminately with other agencies and used for multiple purposes beyond the purpose for which it was collected. The consent and purpose limitation model would have addressed this issue. But, by having a low threshold for non-consensual processing under Clause 12, this form of data processing is effectively being legitimised.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/medianama-february-21-2022-amber-sinha-data-protection-bill-digital-healthcare-case-study'&gt;https://cis-india.org/internet-governance/blog/medianama-february-21-2022-amber-sinha-data-protection-bill-digital-healthcare-case-study&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Data Governance</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2022-03-01T15:07:44Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/medianama-february-18-2021-amber-sinha-data-protection-bill-consent-clause-state-function">
    <title>How Function Of State May Limit Informed Consent: Examining Clause 12 Of The Data Protection Bill</title>
    <link>https://cis-india.org/internet-governance/blog/medianama-february-18-2021-amber-sinha-data-protection-bill-consent-clause-state-function</link>
    <description>
        &lt;b&gt;The collective implication of leaving out ‘proportionality’ from Clause 12 is to provide very wide discretionary powers to the state.&lt;/b&gt;
        &lt;p&gt;The blog post was &lt;a class="external-link" href="https://www.medianama.com/2022/02/223-data-protection-bill-consent-clause-state-function/"&gt;published in Medianama&lt;/a&gt; on February 18, 2022. This is the first of a two-part series by Amber Sinha.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;In 2018, hours after the Committee of Experts led by Justice Srikrishna Committee released their report and draft bill, I wrote &lt;a href="https://www.livemint.com/Opinion/zY8NPWoWWZw8AfI5JQhjmL/Draft-privacy-bill-and-its-loopholes.html"&gt;an opinion piece&lt;/a&gt; providing my quick take on what was good and bad about the bill. A section of my analysis focused on Clause 12 (then Clause 13) which provides for non-consensual processing of personal data for state functions. I called this provision a ‘carte-blanche’ which effectively allowed the state to process a citizen’s data for practically all interactions between them without having to deal with the inconvenience of seeking consent. My former colleague, Pranesh Prakash &lt;a href="https://twitter.com/pranesh/status/1023116679440621568"&gt;pointed out&lt;/a&gt; that this was not a correct interpretation of the provision as I had missed the significance of the word ‘necessary’ which was inserted to act as a check on the powers of the state. He also pointed out, correctly, that in its construction, this provision is equivalent to the position in European General Data Protection Regulation (Article 6 (i) (e)), and is perhaps even more restrictive.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;While I agree with what Pranesh says above (his claims are largely factual, and there can be no basis for disagreement), my view of Clause 12 has not changed. While Clause 35 has been a focus of considerable discourse and analysis, for good reason, I continue to believe that Clause 12 remains among the most dangerous provisions of this bill, and I will try to unpack here, why.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Data Protection Bill 2021 has a chapter on the grounds for processing personal data, and one of those grounds is consent by the individual. The rest of the grounds deal with various situations in which personal data can be processed without seeking consent from the individual. Clause 12 lays down one of the grounds. It allows the state to process data without the consent of the individual in the following cases —&lt;/p&gt;
&lt;p&gt;a)  where it is necessary to respond to a medical emergency&lt;br /&gt;b)  where it is necessary for the state to provide a service or benefit to the individual&lt;br /&gt;c)  where it is necessary for the state to issue any certification, licence or permit&lt;br /&gt;d)  where it is necessary under any central or state legislation, or to comply with a judicial order&lt;br /&gt;e)  where it is necessary for any measures during an epidemic, outbreak or threat to public health&lt;br /&gt;f)  where it is necessary for safety procedures during a disaster or breakdown of public order&lt;/p&gt;
&lt;p&gt;In order to carry out (b) and (c), there is also the added requirement that the state function must be authorised by law.&lt;/p&gt;
&lt;h2&gt;Twin restrictions in Clause 12&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;The use of the words ‘necessary’ and ‘authorised by law’ is intended to pose checks on the powers of the state. The first restriction seeks to limit actions to only those cases where the processing of personal data would be necessary for the exercise of the state function. This should mean that if the state function can be exercised without non-consensual processing of personal data, then it must be done so. Therefore, while acting under this provision, the state should only process my data if it needs to do so, to provide me with the service or benefit. The second restriction means that this would apply to only those state functions which are authorised by law, meaning only those functions which are supported by validly enacted legislation.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;What we need to keep in mind regarding Clause 12 is that the requirement of ‘authorised by law’ does not mean that legislation must provide for that specific kind of data processing. It simply means that the larger state function must have legal backing. The danger is how these provisions may be used with broad mandates. If the activity in question is non-consensual collection and processing of, say, demographic data of citizens to create state resident hubs which will assist in the provision of services such as healthcare, housing, and other welfare functions; all that may be required is that the welfare functions are authorised by law.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Scope of privacy under Puttaswamy&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;It would be worthwhile, at this point, to delve into the nature of restrictions that the landmark Puttaswamy judgement discussed that the state can impose on privacy. The judgement clearly identifies the principles of informed consent and purpose limitation as central to informational privacy. As discussed repeatedly during the course of the hearings and in the judgement, privacy, like any other fundamental right, is not absolute. However, restrictions on the right must be reasonable in nature. In the case of Clause 12, the restrictions on privacy in the form of denial of informed consent need to be tested against a constitutional standard. In Puttaswamy, the bench ​was ​not ​required ​to ​provide ​a ​legal ​test ​to ​determine ​the ​extent ​and ​scope ​of the ​right ​to ​privacy, but they do provide sufficient ​guidance ​for ​us ​to ​contemplate ​how ​the ​limits ​and ​scope ​of ​the ​constitutional ​right ​to ​privacy ​could ​be ​determined ​in ​future ​cases.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Puttaswamy judgement clearly states that “the right to privacy is protected as an intrinsic part of the right to life and personal liberty under Article 21 and as a part of the freedoms guaranteed by Part III of the Constitution.” By locating the right not just in Article 21 but also in the entirety of Part III, the bench clearly requires that “the drill of various Articles to which the right relates must be scrupulously followed.” This means that where transgressions on privacy relate to different provisions in Part III, the different tests under those provisions will apply along with those in Article 21. For instance, where the restrictions relate to personal freedoms, the tests under both Article 19 (right to freedoms) and Article 21 (right to life and liberty) will apply.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In the case of Clause 12, the three tests laid down by Justice Chandrachud are most operative —&lt;br /&gt;a) the existence of a “law”&lt;br /&gt;b) a “legitimate State interest”&lt;br /&gt;c) the requirement of “proportionality”.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The first test is already reflected in the use of the phrase ‘authorised by law’ in Clause 12. The test under Article 21 would imply that the function of the state should not merely be authorised by law, but that the law, in both its substance and procedure, must be ‘fair, just and reasonable.’ The next test is that of ‘legitimate state interest’. In its report, the Joint Parliamentary Committee places emphasis on Justice Chandrachud’s use of “allocation of resources for human development” in an illustrative list of legitimate state interests. The report claims that the ground, functions of the state, thus satisfies the legitimate state interest. We do not dispute this claim.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Proportionality and Clause 12&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;It is the final test of ‘proportionality’ articulated by the Puttaswamy judgement, which is most operative in this context. Unlike Clauses 42 and 43 which include the twin tests of necessity and proportionality, the committee has chosen to only employ one ground in Clause 12. Proportionality is a commonly employed ground in European jurisprudence and common law countries such as Canada and South Africa, and it is also an integral part of Indian jurisprudence. As commonly understood, the proportionality test consists of three parts —&lt;/p&gt;
&lt;p&gt;a)  the limiting measures must be carefully designed, or rationally connected, to the objective&lt;br /&gt;b)  they must impair the right as little as possible&lt;br /&gt;c)  the effects of the limiting measures must not be so severe on individual or group rights that the legitimate state interest, albeit important, is outweighed by the abridgement of rights.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The first test is similar to the test of proximity under Article 19. The test of ‘necessity’ in Clause 12 must be viewed in this context. It must be remembered that the test of necessity is not limited to only situations where it may not be possible to obtain consent while providing benefits. My reservations with the sufficiency of this standard stem from observations made in the report, as well as the relatively small amount of jurisprudence on this term in Indian law.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Srikrishna Report interestingly mentions three kinds of scenarios where consent should not be required — where it is not appropriate, necessary, or relevant for processing. The report goes on to give an example of inappropriateness. In cases where data is being gathered to provide welfare services, there is an imbalance in power between the citizen and the state. Having made that observation, the committee inexplicably arrives at a conclusion that the response to this problem is to further erode the power available to citizens by removing the need for consent altogether under Clause 12. There is limited jurisprudence on the standard of ‘necessity’ under Indian law. The Supreme Court has articulated this test as ‘having reasonable relation to the object the legislation has in view.’ If we look elsewhere for guidance on how to read ‘necessity’, the ECHR in Handyside v United Kingdom held it to be neither “synonymous with indispensable” nor does it have the “flexibility of such expressions as admissible, ordinary, useful, reasonable or desirable.” In short, there must be a pressing social need to satisfy this ground.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;However, the other two tests of proportionality do not find a mention in Clause 12 at all. There is no requirement of ‘narrow tailoring’, that the scope of non-consensual processing must impair the right as little as possible. It is doubly unfortunate that this test does not find a place, as unlike necessity, ‘narrow tailoring’ is a test well understood in Indian law. This means that while there is a requirement to show that processing personal data was necessary to provide a service or benefit, there is no requirement to process data in a way that there is minimal non-consensual processing. The fear is that as long as there is a reasonable relation between processing data and the object of the function of state, state authorities and other bodies authorised by it, do not need to bother with obtaining consent.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Similarly, the third test of proportionality is also not represented in this provision. It provides a test between the abridgement of individual rights and legitimate state interest in question, and it requires that the first must not outweigh the second. The absence of the proportionality test leaves Clause 12 devoid of any such consideration. Therefore, as long as the test of necessity is met under this law, it need not evaluate the denial of consent against the service or benefit that is being provided.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The collective implication of leaving out ‘proportionality’ from Clause 12 is to provide very wide discretionary powers to the state, by setting the threshold to circumvent informed consent extremely low. In the next post, I will demonstrate the ease with which Clause 12 can allow indiscriminate data sharing by focusing on the Indian government’s digital healthcare schemes.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/medianama-february-18-2021-amber-sinha-data-protection-bill-consent-clause-state-function'&gt;https://cis-india.org/internet-governance/blog/medianama-february-18-2021-amber-sinha-data-protection-bill-consent-clause-state-function&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Data Governance</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2022-03-01T14:56:49Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/pallavi-bedi-and-shweta-mohandas-cis-comments-on-data-protection-bill">
    <title>CIS Comments and Recommendations on the Data Protection Bill, 2021</title>
    <link>https://cis-india.org/internet-governance/blog/pallavi-bedi-and-shweta-mohandas-cis-comments-on-data-protection-bill</link>
    <description>
        &lt;b&gt;This document is a revised version of the comments we provided on the 2019 Bill on 20 February 2020, with updates based on the amendments in the 2021 Bill.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;After nearly two years of deliberations and a few changes in its composition, the Joint Parliamentary Committee (JPC), on 17 December 2021, submitted its report on the Personal Data Protection Bill, 2019  (2019 Bill). The report also contains a new version of the law titled the Data Protection Bill, 2021 (2021 Bill). Although there were no major revisions from the previous version other than the inclusion of all data under the ambit of the bill, some provisions were amended.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This document is a revised version of the&lt;a href="https://cis-india.org/accessibility/blog/cis-comments-pdp-bill-2019"&gt; comments&lt;/a&gt; we provided on the 2019 Bill on 20 February 2020, with updates based on the amendments in the 2021 Bill. Through this document we aim to shed light on the issues that we highlighted in our previous comments that have not yet been addressed, along with additional comments on sections that have become more relevant since the pandemic began. In several instances our previous comments have either not been addressed or only partially been addressed; in such instances, we reiterate them.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;These general comments should be read in conjunction with our previous recommendations for the reader to get a comprehensive overview of what has changed from the previous version and what has remained the same. This document can also be read while referencing the new Data Protection Bill 2021 and the JPC’s report to understand some of the significant provisions of the bill.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;&lt;strong&gt;&lt;a href="https://cis-india.org/internet-governance/general-comments-data-protection-bill.pdf" class="internal-link"&gt;Read on to access the comments&lt;/a&gt; | &lt;/strong&gt;&lt;span&gt;Review and editing by Arindrajit Basu. Copy editing: The Clean Copy; Shared under Creative Commons Attribution 4.0 International license&lt;/span&gt;&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/pallavi-bedi-and-shweta-mohandas-cis-comments-on-data-protection-bill'&gt;https://cis-india.org/internet-governance/blog/pallavi-bedi-and-shweta-mohandas-cis-comments-on-data-protection-bill&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Pallavi Bedi and Shweta Mohandas</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2022-02-14T16:07:44Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/the-competition-law-case-against-whatsapp2019s-2021-privacy-policy-alteration">
    <title>The Competition Law Case Against Whatsapp’s 2021 Privacy Policy Alteration</title>
    <link>https://cis-india.org/internet-governance/blog/the-competition-law-case-against-whatsapp2019s-2021-privacy-policy-alteration</link>
    <description>
        &lt;b&gt;Having examined the privacy implications of Whatsapp's changes to its privacy policy in 2021, this issue brief is the second output in our series examining the effects of those changes. This brief examines the changes in the context of data sharing between Whatsapp and Facebook as being an anticompetitive action in violation of the Indian Competition Act, 2002. &lt;/b&gt;
        &lt;span id="docs-internal-guid-2e4a5c52-7fff-f416-6970-948314f0b524"&gt;
&lt;p style="text-align: justify;" dir="ltr"&gt;&amp;nbsp;&lt;/p&gt;
&lt;h3 style="text-align: justify;"&gt;Executive Summary&lt;/h3&gt;
&lt;p style="text-align: justify;" dir="ltr"&gt;On January 4, 2021, Whatsapp announced a revised privacy policy through an in-app notification. It highlighted that the new policy would impact user interactions with business accounts, including those which may be using Facebook's hosting services. The updated policy presented users with the option of either accepting greater data sharing between Whatsapp and Facebook or being unable to use the platform post 15th May, 2021. The updated policy resulted in temporarily slowed growth for Whatsapp and increased growth for other messaging apps like Signal and Telegram. While Whatsapp has chosen to delay the implementation of this policy due to consumer outrage, it is important for us to unpack and understand what this (and similar policies) mean for the digital economy, and its associated competition law concerns. Competition law is one of the sharpest tools available to policy-makers to fairly regulate and constrain the unbridled power of large technology companies.&lt;/p&gt;
&lt;p style="text-align: justify;" dir="ltr"&gt;While it is evident the Indian competition landscape will benefit from revisiting the existing law and policy framework to reign in Big technology companies, we argue that the change in Whatsapp’s privacy policy in 2021 can be held&amp;nbsp; anti-competitive using legal provisions as they presently stand. Therefore, in this issue brief, we largely limit ourselves to evaluating the legality of Whatsapp’s privacy policy within the confines of the present legal system.&amp;nbsp;&lt;/p&gt;
&lt;p style="text-align: justify;" dir="ltr"&gt;First, we dive into an articulation of the present abuse of dominance framework in Indian Competition Law. Second, we analyze whether there was abuse of dominance-bearing in mind an economic analysis of Whatsapp’s role in the relevant market by using tests laid out in previous rulings of the CCI&lt;/p&gt;
&lt;br /&gt;
&lt;p style="text-align: justify;" dir="ltr"&gt;The framework for determining abuse of dominance as per The Competition Act is based on three factors:&lt;/p&gt;
&lt;p style="text-align: justify;" dir="ltr"&gt;1. Determination of relevant market&lt;/p&gt;
&lt;p style="text-align: justify;" dir="ltr"&gt;2. Determination of dominant position&lt;/p&gt;
&lt;p style="text-align: justify;" dir="ltr"&gt;3. Abuse of the dominant position&lt;/p&gt;
&lt;br /&gt;
&lt;p style="text-align: justify;" dir="ltr"&gt;In two previous orders in 2016 and 2020, CCI has held that Whatsapp is dominant in its relevant market based on several factors which we explore. These include:&lt;/p&gt;
&lt;ol&gt;&lt;li style="list-style-type: decimal;" dir="ltr"&gt;
&lt;p style="text-align: justify;" dir="ltr"&gt;Advantage in user base, usage and reach,&lt;/p&gt;
&lt;/li&gt;&lt;li style="list-style-type: decimal;" dir="ltr"&gt;
&lt;p style="text-align: justify;" dir="ltr"&gt;Barriers to entry for other competitors&lt;/p&gt;
&lt;/li&gt;&lt;li style="list-style-type: decimal;" dir="ltr"&gt;
&lt;p style="text-align: justify;" dir="ltr"&gt;Power of acquisition over competitors.&lt;/p&gt;
&lt;/li&gt;&lt;/ol&gt;
&lt;br /&gt;
&lt;p style="text-align: justify;" dir="ltr"&gt;However, in both orders, CCI held that Whatsapp did not abuse its dominance by arguing that the practices in question allowed for user choice. We critique these judgments for not reflecting the market structures and exploitative practices of large technology companies. We also argue that even if we use the test of user choice laid down by the CCI in its previous orders concerning Whatsapp and&amp;nbsp; Facebook, the changes made to the privacy policy in 2021 did abuse dominance,and should be held guilty of violating competition law standards.&lt;/p&gt;
&lt;p style="text-align: justify;" dir="ltr"&gt;Our analysis revolves around examining the explicit and implicit standards of user choice laid out by the CCI in its 2016 and 2020 judgements as the standard for evaluating fairness in an Abuse of Dominance claim.We demonstrate how the 2021 changes failed to meet these standards.&amp;nbsp;&lt;/p&gt;
&lt;p style="text-align: justify;" dir="ltr"&gt;Finally, we conclude by noting that the present case offers a crucial opportunity&amp;nbsp; for&amp;nbsp; India to take a giant step forward in its regulation of big tech companies and harmonise its rulings with regulatory developments around the world.&lt;/p&gt;
&lt;p style="text-align: justify;" dir="ltr"&gt;The full issue brief can be found&amp;nbsp;&lt;a href="https://cis-india.org/internet-governance/whatsapp-privacy-policy-2021-issue-brief-competition-law"&gt;here&lt;/a&gt;&lt;/p&gt;
&lt;/span&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/the-competition-law-case-against-whatsapp2019s-2021-privacy-policy-alteration'&gt;https://cis-india.org/internet-governance/blog/the-competition-law-case-against-whatsapp2019s-2021-privacy-policy-alteration&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Aman Nair and Arindrajit Basu</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Consumer Rights</dc:subject>
    
    
        <dc:subject>Digital Economy</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Facebook</dc:subject>
    
    
        <dc:subject>Competition</dc:subject>
    
    
        <dc:subject>WhatsApp</dc:subject>
    
    
        <dc:subject>Competition Law</dc:subject>
    

   <dc:date>2021-03-24T16:12:09Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>




</rdf:RDF>
