<?xml version="1.0" encoding="utf-8" ?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:syn="http://purl.org/rss/1.0/modules/syndication/" xmlns="http://purl.org/rss/1.0/">
<channel rdf:about="https://cis-india.org/search_rss">
  <title>Centre for Internet and Society</title>
  <link>https://cis-india.org</link>
  
  <description>These are the search results for the query, showing results 1 to 15.</description>
  <image rdf:resource="https://cis-india.org/logo.png"/>

  <items>
    <rdf:Seq>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/mapping-the-legal-and-regulatory-frameworks-of-the-ad-tech-ecosystem-in-india"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/cis-comments-and-feedback-to-digital-personal-data-protection-rules-2025"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/privacy-policy-framework-for-indian-metal-health-apps"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/cis-privacy-international-digital-delivery-and-data-system-for-farmer-income-support"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/deceptive-design-in-voice-interfaces-impact-on-inclusivity-accessibility-and-privacy"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/health-data-management-policies"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/quint-shweta-mohandas-and-pallavi-bedi-june-19-2023-cowin-data-breach-health-sensitive-details-policies-solution"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/cis-comments-recommendations-to-digital-data-protection-bill"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/demistifying-data-breaches-in-india"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/directions-cyber-digital-europe-arindrajit-basu-september-16-2022-getting-the-digital-indo-pacific-economic-framework-right"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/nha-data-sharing-guidelines"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/surveillance-enabling-identity-systems-in-africa-tracing-the-fingerprints-of-aadhaar"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/deployment-of-digital-health-policies-and-technologies-during-covid-19"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/national-data-governance-framework-policy"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/making-voices-heard"/>
    </rdf:Seq>
  </items>

</channel>


    <item rdf:about="https://cis-india.org/internet-governance/blog/mapping-the-legal-and-regulatory-frameworks-of-the-ad-tech-ecosystem-in-india">
    <title>Mapping the Legal and Regulatory Frameworks of the Ad-Tech Ecosystem in India</title>
    <link>https://cis-india.org/internet-governance/blog/mapping-the-legal-and-regulatory-frameworks-of-the-ad-tech-ecosystem-in-india</link>
    <description>
        &lt;b&gt;The purpose of regulation in any sector is essentially twofold: to ensure that the interests of the general public or consumers are protected, and to ensure that the sector itself flourishes and grows. Too much regulation may stifle the commercial potential of a sector, whereas too little runs the risk of leaving consumers vulnerable to harmful practices.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;In this paper, we try to map the legal and regulatory framework dealing with Advertising Technology (Adtech) in India as well as a few other leading jurisdictions. Our analysis is divided into three main parts, the first being general consumer regulations, which apply to all advertising irrespective of the media – to ensure that advertisements are not false or misleading and do not violate any laws of the country. This part also covers the consumer laws which are specific to malpractices in the technology sector such as Dark Patterns, Influencer based advertising, etc.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The second part of the paper covers data protection laws in India and how they are relevant for the Adtech industry. The Adtech industry requires and is based on the collection and processing of large amounts of data from the users. It is therefore important to discuss the data protection and consent requirements that have been laid out in the spate of recent data protection regulations, which have the potential to severely impact the Adtech industry.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The last part of the paper covers the competition angle of the Adtech industry. Like with social media intermediaries, the Adtech industry in the world is also dominated by two or three players and such a scenario always lends itself easily to anti-competitive practices. It is therefore imperative to examine the competition law framework to see whether the laws as they exist are robust enough to deal with any possible anti competitive practices that may be prevalent in the Adtech sector.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The research was reviewed by Pallavi Bedi, it can be &lt;a class="external-link" href="http://cis-india.org/internet-governance/files/mapping-the-legal-and-regulatory-frameworks-of-the-ad-tech-ecosystem-in-india"&gt;accessed here&lt;/a&gt;.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/mapping-the-legal-and-regulatory-frameworks-of-the-ad-tech-ecosystem-in-india'&gt;https://cis-india.org/internet-governance/blog/mapping-the-legal-and-regulatory-frameworks-of-the-ad-tech-ecosystem-in-india&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>vipul</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2025-04-24T14:52:29Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/cis-comments-and-feedback-to-digital-personal-data-protection-rules-2025">
    <title>The Centre for Internet and Society’s comments and feedback on the Digital Personal Data Protection Rules 2025</title>
    <link>https://cis-india.org/internet-governance/blog/cis-comments-and-feedback-to-digital-personal-data-protection-rules-2025</link>
    <description>
        &lt;b&gt;The Centre for Internet &amp; Society (CIS) submitted its comments and feedback to the Digital Personal Data Protection Rules 2025 initiated by the Indian government.&lt;/b&gt;
        &lt;p&gt;&lt;b&gt;&lt;span style="text-decoration: underline;"&gt;Rule 3 - Notice given by data fiduciary to data principal&lt;/span&gt;&lt;/b&gt; - Under Section 5(2) of the DPDP Act, when the personal data of the data principal has been processed before the commencement of the Act, then the data fiduciary is required to give notice to the data principal as soon as reasonably practicable. However, the Rules fail to specify what is meant by reasonably practicable. The timeline for a notice in such circumstances is unclear.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;In addition, under Rule 3(a) the phrase “be presented and be understandable independently” is ambiguous. It is not clear whether the consent notice has to be presented independently of any other information or whether it only needs to be independently understandable and can be presented along with other information. &lt;/li&gt;
&lt;li&gt;In addition, we suggest that the requirement of “privacy by design” mentioned in the earlier drafts be brought back, with a focus on preventing deceptive design practices (dark patterns) being used while collecting data.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;br /&gt;&lt;b&gt;&lt;span style="text-decoration: underline;"&gt;Rule 4 - Registration and obligations of Consent Manager&lt;/span&gt;&lt;/b&gt;- The concept of independent consent managers, similar to account aggregators in the financial sector, and consent manager platforms in the EU is a positive step. However, the Act and the Rules need to flesh out the interplay between the Data Fiduciary and the Consent Managers in a more detailed manner, for example, how does the data fiduciary know if a data principal is using a consent manager, and under what circumstances can the data fiduciary bypass the consent manager, what is the penalty/consequence, etc.&lt;/p&gt;
&lt;p&gt;&lt;b&gt;&lt;span style="text-decoration: underline;"&gt;Rule 6 - Reasonable security safeguards&lt;/span&gt;&lt;/b&gt; - While we appreciate the guidance provided in terms of the measures for security such as “encryption, obfuscation or masking or the use of virtual tokens”, it would also be good to refer to the SPDI Rules and include the example of the The international Standard IS/ISO/IEC 27001 on Information Technology - Security Techniques - Information Security Management System as an illustration to guide data fiduciaries.&lt;/p&gt;
&lt;p&gt;&lt;b&gt;&lt;span style="text-decoration: underline;"&gt;Rule 7 - Intimation of personal data breach&lt;/span&gt;&lt;/b&gt; - As per the Rules, the data fiduciary on becoming aware of any personal data breach is required to notify the data principal and the Data Protection Board without delay; a plain reading of this Rule suggests that data fiduciary has to report the breach almost immediately, and this could be a practical challenge. Further, the absence of any threshold (materiality, gravity of the breach, etc) for notifying the data principal means that the data fiduciary will have to inform the data principal about even an isolated data breach which may not have an impact on the data principal. In this context, we recommend the Rule be amended to state that the data fiduciary should be required to inform the Data Protection Board about every data breach, however the data principal should be informed depending on the gravity and materiality of the breach and when it is likely to result in high risk to the data principal.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Whilst the Rules have provisions for intimation of a data breach, there is no specific provision requiring the Data Fiduciary to take all necessary measures to mitigate the risk arising out of the breach. Although there is an obligation to report any such measures to the Data Principal (Rule 7(1)(c)) as well as to the DPBI (Rule 7(2)(b)(iii)), there is no positive obligation on the Data Fiduciary to actually implement them. The Rules and the Act merely presume that the Data Fiduciary would take mitigation measures; perhaps that is why there are notification requirements for such breaches. This could lead to a situation where a Data Fiduciary takes no measures to mitigate the risks arising out of a data breach, yet remains in compliance with its legal obligations by merely notifying the Data Principal and the DPBI that no measures have been taken. In addition, the SPDI Rules state that in the event of a breach the body corporate is required to demonstrate that it had implemented reasonable security standards. This provision could be incorporated into this Rule to emphasize the need to implement robust security standards, which is one way to curb data breaches and to ensure that there is a protocol to mitigate a breach.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;b&gt;&lt;span style="text-decoration: underline;"&gt;Rule 10 - Verifiable consent for processing of personal data of child or of person with disability who has a lawful guardian&lt;/span&gt;&lt;/b&gt; - The two mechanisms provided under the Rules to verify the age and identity of parents pre-suppose a high degree of digital literacy on the part of the parents. They may either give or refuse consent without thinking too much about the consequences arising out of giving or not giving consent. As there is always a risk of individuals not providing the correct information regarding their age or their relationship with the child, platforms may have to verify every user’s age; thereby preventing users from accessing the platform anonymously. Further, there is also a risk of data maximisation of personal data rather than data minimisation; i.e parents may be required to provide far more information than required to prove their identity. One recommendation/suggestion that we propose is to remove the processing of children's personal data from the ambit of this law, and instead create a separate standalone legislation dealing with children’s digital rights. Another important issue to highlight here is the importance of the Digital Protection Board and its capacity to levy fines and impose strictures on the platforms. We have seen from examples from other countries that platforms are forced to redesign and provide for better privacy and data protection mechanisms when the regulator steps in and imposes high penalties&lt;/p&gt;
&lt;p&gt;&lt;b&gt;&lt;span style="text-decoration: underline;"&gt;Rule 12 - Additional obligations of Significant Data Fiduciary&lt;/span&gt;&lt;/b&gt; - The Rules do not clarify which entities will be considered as a Significant Data Fiduciary, leaving that to the government notifications. This creates uncertainty for data fiduciaries, especially smaller organisations that might not be able to set up the mechanisms and people for conducting data protection impact assessment, and auditing. The Rule provides that SDFs will have to conduct an annual Data Protection Impact Assessment. While this is a step in the right direction, the Rules are currently silent on the granularity of the DPIA. Similarly for “audit” the Rules do not clarify what type of audit is needed and what the parameters are. It is therefore imperative that the government notifies the level of details that the DPIA and the audit need to go into in order to ensure that the SDFs actually address issues where their data governance practices are lacking and not use the DPIA as a whitewashing tactic.There is also a  need to reduce some of the ambiguity with regards to the parameters, and responsibilities in order to make it easier for startups and smaller players to comply with the regulations.  In addition, while there is a need to protect data and increase responsibility on organisations collecting sensitive data or large volumes of data, there is a need to look beyond compliance and look at ways that preserve the rights of the data principal. Hence significant data fiduciaries should also be given the added responsibility of collecting explicit consent from the data principal, and also have easier access for correction of data, grievance redressal and withdrawal of consent.&lt;/p&gt;
&lt;p&gt;&lt;b&gt;&lt;span style="text-decoration: underline;"&gt;Rule 14 - Processing of personal data outside India&lt;/span&gt;&lt;/b&gt; - As per section 16 of the Act the government could, by notification, restrict the transfer of data to specific countries as notified. This system of a negative list envisaged under the Act appears to have been diluted somewhat by the use of the phrase “any foreign State” under the Rules. This ambiguity should be addressed and the language in the Rules may be altered to bring it in line with the Act. Further, the rules also appear to be ultra vires to the Act. As per the DPDP Act, personal data could be shared to outside India, except to countries which were on the negative list, however, the dilution of the provision through the rules appears to have now created a white list of countries; i.e. permissible list of countries to which data can be transferred.&lt;/p&gt;
&lt;p&gt;&lt;b&gt;&lt;span style="text-decoration: underline;"&gt;Rule 15 Exemption from Act for research, archiving or statistical purposes&lt;/span&gt;- &lt;/b&gt;While creating an exception for research and statistical purposes is an understandable objective, the current wording of the provision is vague and subject to mischief. The objective behind the provision is to ensure that research activities are not hindered due to the requirements of taking consent, etc. as required under the Act. However the way the provision is currently drafted, it could be argued that a research lab or a research centre established by a large company, for e.g. Google, Meta, etc. could also seek exemptions from the provisions of this Act for conducting “research”. The research conducted may not be shared with the public in general and may be used by the companies that funded/established the research centre. Therefore there should be further conditions attached to this provision, that would keep such research centers outside the purview of the exemption. Conditions such as making the results of the research publicly available, public interest, etc. could be considered for this purpose.&lt;/p&gt;
&lt;p&gt;&lt;b&gt;&lt;span style="text-decoration: underline;"&gt;Rule 22 - Calling for Information from data fiduciary or intermediary&lt;/span&gt; - &lt;/b&gt;This rule read with the seventh schedule appears to dilute the data minimisation and purpose limitation provisions provided for in the Act. The wide ambit of powers appears to be in contravention of the Supreme Court judgement in the Puttaswamy case, which places certain restrictions on the government while collecting personal data. This “omnibus” provision flouts guardrails like necessity and proportionality that are important to safeguard the fundamental right to privacy.&lt;/p&gt;
&lt;p&gt;It should be clarified whether this rule is merely an enabling provision to facilitate the sharing of information, and whether only designated competent authorities as per law can avail of this provision.&lt;/p&gt;
&lt;p&gt;&lt;span style="text-decoration: underline;"&gt;Need for Confidentiality&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;Additionally, the rule mandates that the government may “require the Data Fiduciary or intermediary to not disclose” any request for information made under the Act. There is no requirement of confidentiality indicated in the governing section, i.e. section 36, from which Rule 22 derives its authority. Talking about the avoidance of secrecy in government business, the Supreme Court in the State of U.P. v. Raj Narain, (1975) 4 SCC 428 held that &lt;br /&gt; &lt;i&gt;“In a government of responsibility like ours, where all the agents of the public must be responsible for their conduct, there can be but few secrets. The people of this country have a right to know every public act, everything, that is done in a public way, by their public functionaries. They are entitled to know the particulars of every public transaction in all its bearing. The right to know, which is derived from the concept of freedom of speech, though not absolute, is a factor which should make one wary, when secrecy is claimed for transactions which can, at any rate, have no repercussions on public security. To cover with [a] veil [of] secrecy the common routine business, is not in the interest of the public. Such secrecy can seldom be legitimately desired. It is generally desired for the purpose of parties and politics or personal self-interest or bureaucratic routine. The responsibility of officials to explain and to justify their acts is the chief safeguard against oppression and corruption.” &lt;/i&gt;&lt;br /&gt; In order to ensure that state interests are also protected, there may be an enabling provision whereby in certain instances confidentiality may be maintained, but there has to be a supervisory mechanism whereby such action may be judged on the anvil of legal propriety.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/cis-comments-and-feedback-to-digital-personal-data-protection-rules-2025'&gt;https://cis-india.org/internet-governance/blog/cis-comments-and-feedback-to-digital-personal-data-protection-rules-2025&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Pallavi Bedi, Vipul Kharbanda, Shweta Mohandas, Anubha Sinha and Isha Suri</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Data Governance</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Data Management</dc:subject>
    

   <dc:date>2025-03-06T02:06:44Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/privacy-policy-framework-for-indian-metal-health-apps">
    <title>Privacy Policy Framework for Indian Mental Health Apps </title>
    <link>https://cis-india.org/internet-governance/blog/privacy-policy-framework-for-indian-metal-health-apps</link>
    <description>
        &lt;b&gt;This report analyses the privacy policies of mental health apps in India and provides recommendations for making the policies not only legally compliant but also user-centric.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The report’s findings indicate a significant gap in the structure and content of privacy policies in Indian mental health apps. This highlights the need to develop a framework that can guide organisations in developing their privacy policies. Therefore, this report proposes a holistic framework to guide the development of privacy policies for mental health apps in India. It focuses on three key segments that are an essential part of the privacy policy of any mental health app. First, it must include factors considered essential by the Digital Personal Data Protection Act 2023 (DPDPA) such as consent mechanisms, rights of the data principal, provision to withdraw consent etc. Second, the privacy policy must state how the data provided by them to these apps will be used. Finally, developers must include key elements, such as provisions for third-party integrations and data retention policies.”&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;Click to download the full research paper &lt;a class="external-link" href="https://cis-india.org/internet-governance/files/privacy-policy-framework.pdf"&gt;here&lt;/a&gt;&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/privacy-policy-framework-for-indian-metal-health-apps'&gt;https://cis-india.org/internet-governance/blog/privacy-policy-framework-for-indian-metal-health-apps&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Chakshu Sang and Shweta Mohandas</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2025-01-10T00:11:24Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/cis-privacy-international-digital-delivery-and-data-system-for-farmer-income-support">
    <title>Digital Delivery and Data System for Farmer Income Support</title>
    <link>https://cis-india.org/internet-governance/blog/cis-privacy-international-digital-delivery-and-data-system-for-farmer-income-support</link>
    <description>
        &lt;b&gt;This report, jointly published by the Centre for Internet &amp; Society and Privacy International, highlights the digital systems deployed by the government to augment farmer income. It analyses the PM-Kisan and Kalia schemes in Odisha and Andhra Pradesh. &lt;/b&gt;
        &lt;h2&gt;Executive Summary&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;This study provides an in-depth analysis of two direct cash transfer schemes in India – Krushak Assistance for Livelihood and Income Augmentation (KALIA) and Pradhan Mantri Kisan Samman Nidhi (PM-KISAN) – which aim to provide income support to farmers. The paper examines the role of data systems in the delivery and transfer of funds to the beneficiaries of these schemes, and analyses their technological framework and processes.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;We find that the use of digital technologies, such as direct benefit transfer (DBT) systems, can improve the efficiency and ensure timely transfer of funds. However, we observe that the technology-only system is not designed with the last beneficiaries in mind; these people not only have no or minimal digital literacy but are also faced with a lack of technological infrastructure, including internet connectivity and access to the system that is largely digital.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Necessary processes need to be implemented and personnel on the ground enhanced in the existing system, to promptly address the grievances of farmers and other challenges.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This study critically analyses the direct cash transfer scheme and its impact on the beneficiaries. We find that despite the benefits of direct benefit transfer (DBT) systems, there have been many instances of failures, such as the exclusion of several eligible households from the database.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The study also looks at gender as one of the components shaping the impact of digitisation on beneficiaries. We also identify infrastructural and policy constraints, in sync with the technological framework adopted and implemented, that impact the implementation of digital systems for the delivery of welfare. These include a lack of reliable internet connectivity in rural areas and low digital literacy among farmers. We analyse policy frameworks at the central and state levels and find discrepancies between the discourse of these schemes and their implementation on the ground.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;We conclude the study by discussing the implications of datafication, which is the process of collecting, analysing, and managing data through the lens of data justice. Datafication can play a crucial role in improving the efficiency and transparency of income support schemes for farmers. However, it is important to ensure that the interests of primary beneficiaries are considered – the system should work as an enabling, not a disabling, factor. This appears to be the case in many instances since the current system does not give primacy to the interests of farmers. We offer recommendations for policymakers and other stakeholders to strengthen these schemes and improve the welfare of farmers and end users.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="https://cis-india.org/internet-governance/files/digital-tools-farmers-report/at_download/file" class="external-link"&gt;&lt;b&gt;Click to download the full report&lt;/b&gt;&lt;/a&gt;&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/cis-privacy-international-digital-delivery-and-data-system-for-farmer-income-support'&gt;https://cis-india.org/internet-governance/blog/cis-privacy-international-digital-delivery-and-data-system-for-farmer-income-support&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>sameet</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Digital Technologies</dc:subject>
    
    
        <dc:subject>Data Governance</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2023-10-18T23:40:25Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/deceptive-design-in-voice-interfaces-impact-on-inclusivity-accessibility-and-privacy">
    <title>Deceptive Design in Voice Interfaces: Impact on Inclusivity, Accessibility, and Privacy </title>
    <link>https://cis-india.org/internet-governance/blog/deceptive-design-in-voice-interfaces-impact-on-inclusivity-accessibility-and-privacy</link>
    <description>
        &lt;b&gt;This article was commissioned by the Pranava Institute as part of their project titled Design Beyond Deception, supported by the University of Notre Dame - IBM's Tech Ethics Lab. The article examines the design of voice interfaces (VIs) to anticipate potential deceptive design patterns in them. It also presents design and regulatory recommendations to mitigate these practices.&lt;/b&gt;
        &lt;p&gt;The original blog post can be accessed &lt;a class="external-link" href="https://www.design.pranavainstitute.com/post/deceptive-design-in-voice-interfaces-impact-on-inclusivity-accessibility-and-privacy"&gt;here&lt;/a&gt;.&lt;/p&gt;
&lt;hr /&gt;
&lt;h3&gt;&lt;b&gt;Introduction&lt;/b&gt;&lt;/h3&gt;
&lt;p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 bCMSCT yMZv8w lnyWN OZy-3 bCMSCT Y9Dpf xVISr" style="text-align: justify; "&gt;Voice Interfaces (VIs) have come a long way in recent years and are easily available as inbuilt technology with smartphones, downloadable applications, or standalone devices. In line with growing mobile and internet connectivity, there is now an increasing interest in India in internet-based multilingual VIs which have the potential to enable people to access services that were earlier restricted by language (primarily English) and interface (text-based systems). This current interest has seen even global voice applications such as Google Home and Amazon’s Alexa being available in &lt;a class="itht3 TWoY9" href="https://www.businesstoday.in/technology/news/story/now-talk-to-alexa-seamlessly-in-hindi-english-and-hinglish-231469-2019-10-09" rel="noopener noreferrer" target="_blank"&gt;Hindi&lt;/a&gt; (Singal, 2019) as well as the &lt;a class="itht3 TWoY9" href="https://voice.cis-india.org/#mapping-actors" rel="noopener noreferrer" target="_blank"&gt;growth&lt;/a&gt; of multilingual voice bots for certain banks, hotels, and hospitals (Mohandas, 2022).&lt;/p&gt;
&lt;p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 bCMSCT yMZv8w lnyWN OZy-3 bCMSCT Y9Dpf xVISr" style="text-align: justify; "&gt;The design of VIs can have a significant impact on the behavior of the people using them. Deceptive design patterns or design practices that trick people into taking actions they might otherwise not take (Tech Policy Design Lab, n.d.), have gradually become pervasive in most digital products and services. Their use in visual interfaces has been widely &lt;a class="itht3 TWoY9" href="https://dl.acm.org/doi/pdf/10.1145/3400899.3400901" rel="noopener noreferrer" target="_blank"&gt;criticized&lt;/a&gt; by researchers (Narayanan, Mathur, Chetty, and Kshirsagar, 2020), along with recent &lt;a class="itht3 TWoY9" href="https://tacd.org/manipulative-design-practices-online-what-policy-solutions-for-the-eu-and-the-u-s/" rel="noopener noreferrer" target="_blank"&gt;policy interventions&lt;/a&gt; (Schroeder and Lützow-Holm Myrstad, 2022) as well. As VIs become more relevant and mainstream, it is critical to anticipate and address the use of deceptive design patterns in them. This article, based on our learnings from the &lt;a class="itht3 TWoY9" href="http://voice.cis-india.org/index.html" rel="noopener noreferrer" target="_blank"&gt;study&lt;/a&gt; of VIs in India, examines the various types of deceptive design patterns in VIs and focuses on their implications in terms of linguistic barriers, accessibility, and privacy.&lt;/p&gt;
&lt;h3&gt;&lt;b&gt;Potential deceptive design patterns in VIs&lt;/b&gt;&lt;/h3&gt;
&lt;p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 bCMSCT yMZv8w lnyWN OZy-3 bCMSCT Y9Dpf xVISr" style="text-align: justify; "&gt;Our research findings suggest that VIs in India are still a long way off from being inclusive, accessible and privacy-preserving. While there has been some development in multilingual VIs in India, their compatibility has been limited to a few Indian languages (Mohandas, 2022) (Naidu, 2022)., The potential of VIs as a tool for people with vision loss and certain cognitive disabilities such as dyslexia is widely recognized (Pradhan, Mehta, and Findlater, 2018), but our conversations suggest that most developers and designers do not consider accessibility when conceptualizing a voice-based product, which leads to interfaces that do not understand non standard speech patterns, or have only text-based privacy policies (Mohandas, 2022). Inaccessible privacy policies full of legal jargon along with the lack of regulations specific to VIs,  also make people vulnerable to privacy risks.&lt;/p&gt;
&lt;p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 bCMSCT yMZv8w lnyWN OZy-3 bCMSCT Y9Dpf xVISr" style="text-align: justify; "&gt;Deceptive design patterns can be used by companies to further these gaps in VIs. As with visual interfaces, the affordances and attributes of VI can determine the way in which they can be used to manipulate behavior. Kentrell Owens, et.al in their recent &lt;a class="itht3 TWoY9" href="https://homes.cs.washington.edu/~kentrell/static/papers/owensEuroUSEC2022-preprint.pdf" rel="noopener noreferrer" target="_blank"&gt;research&lt;/a&gt; lay down six unique properties of VIs that may be used to implement deceptive design patterns (Owens, Gunawan, Choffnes, Emami-Naeini, Kohno, and Roesner, 2022). Expanding upon these properties, and drawing from our research, we look at how they can be exacerbated in India.&lt;/p&gt;
&lt;h3&gt;&lt;b&gt;Making processes cumbersome&lt;/b&gt;&lt;/h3&gt;
&lt;p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 bCMSCT yMZv8w lnyWN OZy-3 bCMSCT Y9Dpf xVISr" style="text-align: justify; "&gt;VIs are often limited by their inability to share large amounts of information through voice. They thus operate in combination with a smartphone app or a website. This can be intentionally used by platforms to make processes such as changing privacy settings or accessing the full privacy notice inconvenient for people to carry out. In India, this is experienced while unsubscribing from services such as Amazon Prime (Owens et al., 2022). Amazon Echo Dot presently allows individuals to subscribe to an Amazon Prime membership using a voice command, but directs them to use the website in order to unsubscribe from the membership. This can also manifest in the form of canceling orders and changing privacy settings.&lt;/p&gt;
&lt;p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 bCMSCT yMZv8w lnyWN OZy-3 bCMSCT Y9Dpf xVISr" style="text-align: justify; "&gt;VIs follow a predetermined linear structure that ensures a tightly controlled interaction. People make decisions based on the information they are provided with at various steps. Changing their decision or switching contexts could involve going back several steps. People may accept undesirable actions from the VI in order to avoid this added effort (Owens et al., 2022). The urgency to make decisions on each step can also cause people to make unfavorable choices such as allowing consent to third party apps. The VI may prompt advertisements and push for the company’s preferred services in this controlled conversation structure, which the user cannot side-step. For example, while setting up the Google voice assistant on any device, it nudges people to sign into their Google account. This means the voice assistant gets access to their web and app activity and location history at this step. While the data management of Google accounts can be tweaked through the settings, it may get skipped during a linear set-up structure. Voice assistants can also push people to opt into features such as ads personalisation, default news sources, and location tracking.&lt;/p&gt;
&lt;h3&gt;&lt;b&gt;Making options difficult to find&lt;/b&gt;&lt;/h3&gt;
&lt;p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 bCMSCT yMZv8w lnyWN OZy-3 bCMSCT Y9Dpf xVISr" style="text-align: justify; "&gt;Discoverability is another challenge for VIs. This means that people might find it difficult to discover available actions or options using just voice commands. This gap can be misused by companies to trick people into making undesirable choices. For instance, while purchasing items, the VI may suggest products that have been sponsored and not share full information on other cheaper products, forcing people to choose without complete knowledge of their options. Many mobile based voice apps in India use a combination of images or icons with the voice prompts to enable discoverability of options and potential actions, which excludes people with vision loss (Naidu, 2022). These apps comprise a voice layer added to an otherwise touch-based visual platform so that people are able to understand and navigate through all available options using the visual interface, and use voice only for purposes such as searching or narrating. This means that these apps cannot be used through voice alone, making them disadvantageous for people with vision loss.&lt;/p&gt;
&lt;h3&gt;&lt;b&gt;Discreet integration with third parties&lt;/b&gt;&lt;/h3&gt;
&lt;p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 bCMSCT yMZv8w lnyWN OZy-3 bCMSCT Y9Dpf xVISr" style="text-align: justify; "&gt;VIs can use the same voice for varying contexts. In the case of Alexa, Skills, which are apps on its platform, have the same voice output and invocation phrases as its own in-built features. End users find it difficult to differentiate between an interaction with Amazon and that with Skills which are third-party applications. This can cause users to share information that they otherwise would not have with third parties (Mozilla Foundation, 2022). There are numerous Amazon Skills inHindi and people might not be aware that the developers of these Skills are &lt;a class="itht3 TWoY9" href="https://www.theverge.com/2021/3/5/22315211/amazon-alexa-skills-how-to-remove-security-privacy-problems" rel="noopener noreferrer" target="_blank"&gt;not vetted &lt;/a&gt;by Amazon. This misunderstanding can create significant privacy or security risks if Skills are linked to contacts, banking, or social media accounts.&lt;/p&gt;
&lt;h3&gt;&lt;b&gt;Lack of language inclusivity &lt;/b&gt;&lt;/h3&gt;
&lt;p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 bCMSCT yMZv8w lnyWN OZy-3 bCMSCT Y9Dpf xVISr" style="text-align: justify; "&gt;The lack of local language support, colloquial translations, and accents can lead to individuals not receiving clear and complete information. VI’s failure to understand certain accents can also make people feel isolated (Harwell, 2018). While in India voice assistants and even voice bots are available in few Indic languages, the default initial setup, privacy policies, and terms and conditions are still in English. The translated policies also use literary language which is difficult for people to understand, and miss out on colloquial terms. This could mean that the person might have not fully understood these notices and hence not have given informed consent. Such use of unclear language and unavailability of information in Indic languages can be viewed as a deceptive design pattern.&lt;/p&gt;
&lt;h3&gt;&lt;b&gt;Making certain choices more apparent &lt;/b&gt;&lt;/h3&gt;
&lt;p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 bCMSCT yMZv8w lnyWN OZy-3 bCMSCT Y9Dpf xVISr" style="text-align: justify; "&gt;The different dimensions of voice such as volume, pitch, rate, fluency, pronunciation, articulation, and emphasis can be controlled and manipulated to implement deceptive design patterns. VIs may present the more privacy-invasive options more loudly or clearly, and the more privacy-preserving options more softly or quickly. It can use tone modulations to shame people into making a specific choice (Owens et al., 2022). For example, media streaming platforms may ask people to subscribe for a premium account to avoid ads in normal volume and mention the option to keep ads in a lower volume. Companies have also been observed to discreetly integrate product advertisements in voice assistants using tone. SKIN, a neurotargeting advertising strategy business, used a change of tone of the voice assistant to suggest a dry throat to advertise a drink (Chatellier, Delcroix, Hary, and Girard-Chanudet, 2019).&lt;/p&gt;
&lt;p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 bCMSCT yMZv8w lnyWN OZy-3 bCMSCT Y9Dpf xVISr" style="text-align: justify; "&gt;The attribution of gender, race, class, and age through stereotyping can create a persona of the VI for the user. This can extend to personality traits, such as an extroverted or an introverted, docile or aggressive character (Simone, 2020). The default use of female voices with a friendly and polite persona for voice assistants has drawn criticism for perpetuating harmful gender stereotypes (Cambre and Kulkarni, 2019). Although there is an option to change the wake word “Alexa” in Amazon’s devices, certain devices and third party apps do not work with another wake word (Ard, 2021). Further, projection of demographics can also be used to employ deceptive design patterns. For example, a VI persona that is constructed to create a perception of intelligence, reliability, and credibility can have a stronger influence on people’s decisions. Additionally, the effort to make voice assistants as human sounding as possible without letting people know they are human, could create a number of &lt;a class="itht3 TWoY9" href="https://www.nytimes.com/2019/05/22/technology/personaltech/ai-google-duplex.html" rel="noopener noreferrer" target="_blank"&gt;issues&lt;/a&gt; (X. Chen and Metz, 2019). First time users might divulge sensitive information thinking that they are interacting with a person. This becomes more ethically challenging when persons with vision loss are not able to know who they are interacting with.&lt;/p&gt;
&lt;h3&gt;&lt;b&gt;Recording without notification &lt;/b&gt;&lt;/h3&gt;
&lt;p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 bCMSCT yMZv8w lnyWN OZy-3 bCMSCT Y9Dpf xVISr" style="text-align: justify; "&gt;Owens et al speak about VIs occupying physical domains due to which they have a much wider impact as opposed to a visual interface (Owens et al., 2022). The always-on nature of virtual assistants could result in personal information of a guest being recorded without their knowledge or consent as consent is only given at the setup stage by the owner of the device or smartphone.&lt;/p&gt;
&lt;h3&gt;&lt;b&gt;Making personalization more convenient through data collection&lt;/b&gt;&lt;/h3&gt;
&lt;p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 bCMSCT yMZv8w lnyWN OZy-3 bCMSCT Y9Dpf xVISr" style="text-align: justify; "&gt;VIs are trained to adapt to the experience and expertise of the user. Virtual assistants provide personalization and the possibility to download a number of skills, save payment information, and phone contacts. In order to facilitate differentiation between multiple users on the same VI, individuals talking to the device are profiled based on their speech patterns and/or voice biometrics. This also helps in controlling or restricting content for children (Naidu, 2022). There is also tracking of commands to identify and list their intent for future use. The increase of specific and verified data can be used to provide better targeted advertisements, as well possibly be shared with law enforcement agencies in certain cases. &lt;a class="itht3 TWoY9" href="https://www.business-standard.com/article/current-affairs/razorpay-shared-donor-data-with-police-claims-alt-news-122070501255_1.html" rel="noopener noreferrer" target="_blank"&gt;Recently&lt;/a&gt;, a payment gateway company was made to share customer information to the law enforcement without their customer’s knowledge. This included not just the information about the client but also revealed sensitive personal data of the people who had used the gateway for transactions to the customer. While providing such details are not illegal and companies are meant to comply with requests from law enforcement, if more people knew of the possibility of every conversation of the house being accessible to law enforcement they would make more informed choices of what the VI records.&lt;/p&gt;
&lt;h3&gt;&lt;b&gt;Reducing friction in actions desired by the platform&lt;/b&gt;&lt;/h3&gt;
&lt;p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 bCMSCT yMZv8w lnyWN OZy-3 bCMSCT Y9Dpf xVISr" style="text-align: justify; "&gt;One of the fundamental advantages of VIs is that it can reduce several steps to perform an action using a single command. While this is helpful to people interacting with it, the feature can also be used to reduce friction from actions that the platform wants them to take. These actions could include sharing sensitive information, providing consent to further data sharing, and making purchases. An &lt;a class="itht3 TWoY9" href="http://insider.com/kids-alexa-buy-700-worth-of-toys-moms-credit-card-2019-12" rel="noopener noreferrer" target="_blank"&gt;&lt;span class="D-jZk"&gt;example&lt;/span&gt;&lt;/a&gt; of this can be seen where children have found it very easy to purchase items using Alexa (BILD, 2019).&lt;/p&gt;
&lt;h3&gt;&lt;b&gt;Recommendations for Designers and Policymakers&lt;/b&gt;&lt;/h3&gt;
&lt;p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 bCMSCT yMZv8w lnyWN OZy-3 bCMSCT Y9Dpf xVISr" style="text-align: justify; "&gt;Through these deceptive design patterns, VIs can obstruct and control information according to the preferences of the platform. This can result in a heightened impact on people with less experience with technology. Presently, profitability is a key driving factor for development and design of VI products. There is more importance given to data-based and technical approaches, and interfaces are often conceptualized by people with technical expertise with lack of inputs from designers at the early stages (Naidu, 2022). Designers also focus more on the usability and functionality of the interfaces by enabling personalization, but are often not as sensitive to safeguarding the rights of individuals using them. In order to tackle deceptive design, designers must work towards prioritizing ethical practice, and building in more agency and control for people who use VIs.&lt;/p&gt;
&lt;p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 bCMSCT yMZv8w lnyWN OZy-3 bCMSCT Y9Dpf xVISr" style="text-align: justify; "&gt;Many of the potential deceptive design patterns can be addressed by designing for accessibility and inclusivity in a privacy preserving manner. This includes vetting third-party apps, providing opt-outs, and clearly communicating privacy notices. Privacy implications can also be prompted by the interface at the time of taking actions. There should be clear notice mechanisms such as a prominent visual cue to alert people when a device is on and recording, along with an easy way to turn off the ‘always listening’ mode. The use of different voice outputs for third party apps can also signal to people about who they are interacting with and what information they would like to share in that context.&lt;/p&gt;
&lt;p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 bCMSCT yMZv8w lnyWN OZy-3 bCMSCT Y9Dpf xVISr" style="text-align: justify; "&gt;Training data that covers a diverse population should be built for more inclusivity. A linear and time-efficient architecture is helpful for people with cognitive disabilities. But, this linearity can be offset by adding conversational markers that let the individual know where they are in the conversation (Pearl, 2016). This could address discoverability as well, allowing people to easily switch between different steps. Speech-only interactions can also allow people with vision loss to access the interface with clarity.&lt;/p&gt;
&lt;p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 bCMSCT yMZv8w lnyWN OZy-3 bCMSCT Y9Dpf xVISr" style="text-align: justify; "&gt;A number of policy documents including the 2019 version of India’s Personal Data Protection Bill, emphasize on the need for privacy by design. But, they do not mention how deceptive design practices could be identified and avoided, or prescribe penalties for using these practices (Naidu, Sheshadri, Mohandas, and Bidare, 2020). In the case of VI particularly, there is a need to look at it as biometric data that is being collected and have related regulations in place to prevent harm to users. In terms of accessibility as well, there could be policies that require not just websites but also apps (including voice based apps) to be compliant with international accessibility guidelines , and to conduct regular audits to ensure that the apps are meeting the accessibility threshold.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/deceptive-design-in-voice-interfaces-impact-on-inclusivity-accessibility-and-privacy'&gt;https://cis-india.org/internet-governance/blog/deceptive-design-in-voice-interfaces-impact-on-inclusivity-accessibility-and-privacy&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Saumyaa Naidu and Shweta Mohandas</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2023-08-08T15:22:51Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/health-data-management-policies">
    <title>Health Data Management Policies - Differences Between the EU and India </title>
    <link>https://cis-india.org/internet-governance/blog/health-data-management-policies</link>
    <description>
        &lt;b&gt;Through this issue brief we would like to highlight the differences in approaches to health data management taken by the EU and India, and look at possible recommendations for India, in creating a privacy preserving health data management policy. &lt;/b&gt;
        &lt;p&gt;This issue brief was reviewed and edited by Pallavi Bedi&lt;/p&gt;
&lt;hr /&gt;
&lt;h2&gt;Introduction&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;Health data has seen an increased interest the world over, on account of the amount of information and inferences that can be drawn not just about a person but also about the population in general. The Covid 19 pandemic also brought about an increased focus on health data, and brought players that earlier did not collect health data to be required to collect such data, including offices and public spaces. This increased interest has led to further thought on how health data is regulated and a greater understanding of the sensitivity of such data, because of which countries are in varying processes to get health data regulated over and above the existing data protection regulations. The regulations not only look at ensuring the privacy of the individual but also look at ways in which this data can be shared with companies, researchers and public bodies to foster innovation and to monetise this valuable data. However for a number of countries the effort is still on the digitisation of health data. India has been in the process of implementing a nationwide health ID that can be used by a person to get all their medical records in one place. The National Health Authority (NHA) has also since 2017 been publishing policies that look at the framework and ecosystem of health data, as well as the management and sharing of health data. However these policies and a scattered implementation of the health ID are being carried out without a data protection legislation in place. In comparison, Europe, which already has an established health Id system, and a data protection legislation (GDPR) is looking at the next stage of health data management through the EU Health Data Space (EUHDS). Through this issue brief we would like to highlight the differences in approaches to health data management taken by the EU and India, and look at possible recommendations for India, in creating a privacy preserving health data management policy.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Background&lt;/h2&gt;
&lt;h3&gt;EU Health Data Space&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;The EU Health Data Space (&lt;b&gt;EUHDS&lt;/b&gt;) was proposed by the EU Council as a way to create an ecosystem which combines rules, standards, practices and infrastructure, around health data under a common governance framework. The EUHDS is set to rely on two pillars; namelyMyHealth@EU and HealthData@EU, where MyHealth@EU facilitates easy flow of health data between patients and healthcare professionals within member states, the HealthData@EU,faciliates secondary use of data which allows policy makers,researchers access to health data to foster research and innovation.&lt;a href="#_ftn1" name="_ftnref1"&gt;&lt;sup&gt;&lt;sup&gt;&lt;span&gt;[1]&lt;/span&gt;&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; The EUHDS aims to provide a trustworthy system to access and process health data and builds up from the General Data Protection Regulation (GDPR), proposed Data Governance Act.&lt;a href="#_ftn2" name="_ftnref2"&gt;&lt;sup&gt;&lt;sup&gt;&lt;span&gt;[2]&lt;/span&gt;&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/span&gt;&lt;/p&gt;
&lt;h3&gt;&lt;span&gt;India’s health data policies&lt;/span&gt;&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;The last few years has seen a flurry of health policies and documents being published and the creation of a framework for the evolution of a National Digital Health Ecosystem (NDHE). The components for this ecosystem were the National Digital Health Blueprint published in 2019 (NDHB) and the National Digital Health Mission (NDHM). The BluePrint was created to implement the National Health Stack (published in 2018)  which facilitated the creation of Health IDs.&lt;a href="#_ftn3" name="_ftnref3"&gt;&lt;sup&gt;&lt;sup&gt;&lt;span&gt;[3]&lt;/span&gt;&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Whereas the NDHM was drafted to drive the implementation of the Blueprint, and promote and facilitate the evolution of NDHE.&lt;a href="#_ftn4" name="_ftnref4"&gt;&lt;sup&gt;&lt;sup&gt;&lt;span&gt;[4]&lt;/span&gt;&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;The National Health Authority (&lt;b&gt;NHA&lt;/b&gt;) established in 2018 has been given the responsibility of implementing the National Digital Health Mission. 2018 also saw the Digital Information Security in Healthcare Act (&lt;b&gt;DISHA&lt;/b&gt;) which was to be a legislation that laid down provisions that regulate the generation, collection, access, storage, transmission and use of Digital Health Data ("DHD") and associated personal data.&lt;a href="#_ftn5" name="_ftnref5"&gt;&lt;sup&gt;&lt;sup&gt;&lt;span&gt;[5]&lt;/span&gt;&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; However since its call for public consultation no progress has been made on this front.&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Along with these three strategy documents the NHA has also released policy documents more particularly the Health Data Management Policy (which was revised three times; the latest version released in April 2022), the Health Data Retention Policy (released April 2021), and the Consultation Paper on Unified Health Interface (UHI) (released March 2021). Along with this in 2022 the NHA released the NHA Data Sharing Guidelines for the Pradhan Mantri Jan Aarogya Yojana (PM-JAY) India’s state health insurance policy. &lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;However these draft guidelines repeat the pattern of earlier policies on health data, wherein there is no reference to the policies that predated it; the PM-JAY’s Data Sharing Guidelines published in August 2022 did not even refer to the draft National Digital Health Data Management Policy (published in April 2022). As stated through the examples above these documents do not cross-refer or mention preceding health data documents, creating a lack of clarity of which documents are being used as guidelines by health care providers. &lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;In addition to this the Personal Data Protection Bill has been revised three times since its release in 2018. The latest version was published for public comments on November 18, 2022; the Bill has removed the distinction between sensitive personal data and personal data and clubbed all personal data under one umbrella heading of personal data.  Health and health data definition has also been deleted; creating further uncertainty with respect to health data as the different policies mentioned above rely on the data protection legislation to define health data. &lt;br /&gt;&lt;/span&gt;&lt;/p&gt;
&lt;h3&gt;&lt;b&gt;&lt;span&gt;Comparison of the Health Data Management Approaches &lt;/span&gt;&lt;/b&gt;&lt;span&gt;&lt;br /&gt; &lt;/span&gt;&lt;span&gt;Interoperability with Data Protection Legislations &lt;/span&gt;&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;&lt;span&gt;&lt;br /&gt;&lt;/span&gt;&lt;/b&gt;&lt;span&gt;At the outset the key difference between the EU and India’s health data management policies has been the legal backing of GDPR which the EUHDS has. EUHDS has a strong base in terms of rules for privacy and data protection as it follows, draws inference and works in tandem with the General Data Protection Regulation (GDPR). The provisions also build upon legislation such as Medical Devices Regulation and the In Vitro Diagnostics Regulation. With particular respect to GDPR the EUHDS draws from the rights set out for protection of personal data including that of electronic health data.&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;The Indian Health data policies however currently exist in the vacuum created by the multiple versions of the Data Protection Bill that are published and repealed or replaced. The current version called the Digital Personal Data Protection Bill 2022 seems to take a step backward in terms of health data. The current version does away with sensitive personal data (which health data was a part of) and keeps only one category of data - personal data. It can be construed that the Bill currently considers all personal data as needing the same level of protection but it is not so in practice. The Bill does not at the moment mandate more responsibilities on data fiduciaries&lt;a href="#_ftn6" name="_ftnref6"&gt;&lt;sup&gt;&lt;sup&gt;&lt;span&gt;[6]&lt;/span&gt;&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; that deal with health data (something that was present in all the earlier versions of the Bill) and in other data protection legislation across different jurisdictions  and leaves the creation of Significant Data Fiduciaries (who have more responsibilities) to be created by rules, based on the sensitivity of data decided by the government at a later date.&lt;a href="#_ftn7" name="_ftnref7"&gt;&lt;sup&gt;&lt;sup&gt;&lt;span&gt;[7]&lt;/span&gt;&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; In addition to this the Bill does not define “health data”, the reason why this is a cause for worry is that the existing health data policies also do not define health data often relying on the definition mentioned in the versions of Data Protection Bill. &lt;/span&gt;&lt;/p&gt;
&lt;h3&gt;&lt;span&gt;Definitions and Scope&lt;/span&gt;&lt;/h3&gt;
&lt;p&gt;&lt;span&gt;The EUHDS defines ‘personal electronic health data’ as data concerning health and genetic data as defined in Regulation (EU) 2016/679&lt;a href="#_ftn8" name="_ftnref8"&gt;&lt;sup&gt;&lt;sup&gt;&lt;span&gt;[8]&lt;/span&gt;&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;, as well as data referring to determinants of health or data processed in relation to the provision of healthcare services, processed in electronic form. Health data by these parameters would then include not just data about a person’s health status, such as reports and diagnoses, but also data from medical devices.&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;In India the Health Data Management Policy 2022, defines “Personal Health Records” (&lt;b&gt;PHR&lt;/b&gt;) as a health record that is initiated and maintained by an individual. The policy also states that  a PHR would be able to reveal a complete and accurate summary of the health and medical history of an individual by gathering data from multiple sources and making this accessible online. However there is no definition of health data which can be used by companies or users to know what comes under health data. The 2018, 2019 and 2021 version of the Data Protection Legislation had definitions of the term health data, however the 2022 version of the Bill does away with the definition.&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;
&lt;h3&gt;&lt;span&gt;Health data and wearable devices&lt;/span&gt;&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;One of the forward looking provisions in the EUHDS is the inclusion of devices that records health data into this legislation. This also includes the requirement of them to be added to registries to provide easy access and scrutiny. The document also requires voluntary labeling of wellness applications and registration of EHR systems and wellness applications. This is not just for the regulation point of view but also in the case of data portability, in order for people to control the data they share. In addition to this in the case where manufacturers of medical devices and high-risk AI systems declare interoperability with the EHR systems, they will need to comply with the essential requirements on interoperability under the EHDS. &lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;In India the health data management policy 2022 while stating the applicable entities and individuals who are part of the ABDM ecosystem&lt;a href="#_ftn9" name="_ftnref9"&gt;&lt;sup&gt;&lt;sup&gt;&lt;span&gt;[9]&lt;/span&gt;&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; mention medical device manufacturers, does not mention device sellers or use terms such as wellness applications or wearable devices. Currently the regulation of medical devices falls under the purview of  the Drugs and Cosmetics Act, 1940 (DCA) read along with the Medical Device Rules, 2017 (MDR). However in 2020 possibly due to the pandemic the Indian Government along with the Drugs Technical Advisory Board (DTAB) issued two notifications the first one expanded the scope of medical devices which earlier was limited to only 37 categories excluding medical apps, and second one notified the Medical Device (Amendment) Rules, 2020. These two changes together brought all medical devices under the DCA as well as expanded the categories of medical devices. However it is still unclear whether fitness tracker apps that come with devices are regulated, as the rules and the DCA still rely on the manufacturer to self-identify as a medical device.&lt;a href="#_ftn10" name="_ftnref10"&gt;&lt;sup&gt;&lt;sup&gt;&lt;span&gt;[10]&lt;/span&gt;&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; However, this regulatory uncertainty has not brought about any change in how this data is being used and insurance companies at times encourage people to sync their fitness tracker data.&lt;a href="#_ftn11" name="_ftnref11"&gt;&lt;sup&gt;&lt;sup&gt;&lt;span&gt;[11]&lt;/span&gt;&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/span&gt;&lt;/p&gt;
&lt;h3&gt;&lt;span&gt;Multiple uses of health data&lt;/span&gt;&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;The EUHDS states two types of uses of data: primary and secondary use of data. In the document the EU states that while there are a number of organisations collecting data, this data is not made available for purposes other than for which it was collected. In order to ensure that researchers, innovators and policy makers can use this data. the EU encourages the data holders to contribute to this effort in making different categories of electronic health data they are holding available for secondary use. The data that can be used for secondary use would also include user generated data such as from devices, applications or other wearables and digital health applications.However, the regulation cautions against using this data for measures and making decisions that are detrimental to the individual, in ways such as increasing insurance premiums. The EUHDS also states that as the data is sensitive personal data care should be taken by the data access bodies, to ensure that while data is being shared it is necessary to ensure that the data will be processed in a privacy preserving manner. This could include through pseudonymisation, anonymisation, generalisation, suppression and randomisation of personal data.&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;While the document states how important it is to have secondary use of the data for public health, research and innovation it also requires that the data is not provided without adequate checks. The EUHDS requires the organisation seeking access to provide several pieces of information and be evaluated by the data access body. The information should include legitimate interest, the necessity and the process the data will go through. In the case where the organisation is seeking pseudonymised data, there is a need to explain why anonymous data would not be sufficient. In order to ensure a comprehensive approach between health data access bodies, the EUHDS states that the European Commission should support the harmonisation of data application, as well as data request.         &lt;br /&gt;&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;In India, while multiple health data documents state the need to share data for public interest, research and innovation, not much thought has been given to ensuring that the data is not misused and that there is harmonisation between bodies that provide the data. Most recently the PMJay documents states that the NHA shall make aggregated and anonymised data available through a public dashboard for the purpose of facilitating health and clinical research, academic research, archiving, statistical analysis, policy formulation, the development and promotion of diagnostic solutions and such other purposes as may be specified by the NHA. Such data can be accessed through a request to the Data Sharing Committee&lt;a href="#_ftn12" name="_ftnref12"&gt;&lt;sup&gt;&lt;sup&gt;&lt;span&gt;[12]&lt;/span&gt;&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; for the sharing of such information through secure modes, including clean rooms and other such secure modes specified by NHA. However the document does not mention what clean rooms are in this context. &lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;The Health Data Management Policy 2022 states that Data fiduciaries (data controllers/ processors according to the data protection legislation) can themselves make anonymised or de-identified data in an aggregated form available based in technical processes and anonymisation protocols which may be specified by the NDHM in consultation with the MeitY. The purposes mentioned in this policy included health and clinical research, academic research, archiving, statistical analysis, policy formulation, the development and promotion of diagnostic solutions and such other purposes as may be specified by the NDHMP. The policy states that in order to access the anonymised or de-identified data the entity requesting the data would have to provide relevant information such as name, purpose of use and nodal person of contact details. While the policy does not go into details about the scrutiny of the organisations seeking this data, it does state that the data will be provided based on the term as may be stipulated. &lt;br /&gt;&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;However the issue arises as both the documents published by the NHA do not have a similar process for getting the data, for example the NDHMP requires the data fiduciary to share the data directly, while the PMJay guidelines requires the data to be shared by the Data Sharing Committee, creating duplicate datasets as well as affecting the quality of the data being shared. &lt;/span&gt;&lt;/p&gt;
&lt;h3&gt;&lt;b&gt;&lt;span&gt;Recommendations for India&lt;/span&gt;&lt;/b&gt;&lt;/h3&gt;
&lt;h3&gt;&lt;span&gt;Need for a data protection legislation&lt;/span&gt;&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;While the EUHDS is still a draft document and the end result could be different based on the consultations and deliberations, the document has a strong base with respect to the privacy and data protection based on the earlier regulations and the GDPR. The definitions of what counts as health data, and the parameters for managing the data creates a more streamlined process for all stakeholders. More importantly the GDPR and other regulations provide a way of recourse for people. In India the health data related  policies and strategy documents have been published and enforced before the data protection legislation is passed. In addition to this India, unlike the EU has just begun looking at a universal health ID and digitisation of the healthcare system, ideally it would be better to take each step at a time, and at first look at the issues that may arise due to the universal health ID. In addition to this, multiple policies, without a strong data protection legislation providing parameters and definitions could mean that the health data management policies only benefit certain people. This also creates uncertainty in terms of where an individual will go in case of harms caused by the processing of their data, and who would be the authority to govern questions around health data. The division of health data management between different documents also creates multiple silos of data management which creates data duplication and issues with data quality. &lt;/span&gt;&lt;/p&gt;
&lt;h3&gt;&lt;span&gt;Secondary use of data&lt;/span&gt;&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;While both the EUHDS and India's Health Data Management Policy look at the sharing of health data with researchers and private organisations in order to foster innovation, the division of sharing of data based on who uses the data is a good way to ensure that only interested parties have access to the data. With respect to the health data policies in India, a number of policies talk about the sharing of anonymised data with researchers, however the documents being scattered could cause the same data to be shared by multiple health data entities, making it possible to identify people. For example, the health data management policy could share anonymised data of health services used by a person, whereas the PMJAY policy could share data about insurance covers, and the researcher could probably match the data and be closer to identifying people. It has also been  revealed in multiple studies that anonymisation of data is not permanent and that the anonymisation can be broken. This is more concerning since the polices do not put limits or checks on who the researchers are and what is the end goal of the data sought by them, the policies seem to rely on the anonymisation of the data as the only check for privacy. This data could be used to de-anonymise people, could be used by companies working with the researchers to get large amounts of data to train their systems, &lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;train data that could lead to greater surveillance, increase insurance scrutiny etc. The NHA and Indian health policy makers could look at the restrictions and checks that the EUHDS creates for the secondary use of data and create systems of checks and categories of researchers and organisations seeking data to ensure minimal risks to an individual’s data. &lt;/span&gt;&lt;/p&gt;
&lt;h2&gt;&lt;b&gt;&lt;span&gt;Conclusion&lt;/span&gt;&lt;/b&gt;&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;While the EU Health data space has been criticised for facilitating vast amounts of data with private companies and the collecting of data by governments, the codification of the legislation does in some way give some way to regulate the flow of health data. While India does not have to emulate the EU and have a similar document, it could look at the best practices and issues that are being highlighted with the EUHDS. Indian lawmakers have looked at the GDPR for guidance for the draft data protection legislation, similarly it could do so with regard to health data and health data management. One possible way to ensure both the free flow of health data and the safeguards of a regulation could be to re-introduce the DISHA Act which much like the EUHDS could act as a legislation which provides an anchor to the multiple health data policies, including standard definition of health data, grievance redressal bodies, and adjudicating authorities and their functions. In addition a legislation dedicated to the health data would also remove the existing burden on the to be formed data protection authority. &lt;/span&gt;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
&lt;div id="ftn1"&gt;
&lt;p&gt;&lt;a href="#_ftnref1" name="_ftn1"&gt;&lt;sup&gt;&lt;sup&gt;&lt;span&gt;[1]&lt;/span&gt;&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;span&gt; “&lt;/span&gt;&lt;span&gt;European Health Data Space&lt;/span&gt;&lt;span&gt;”, European Commission, 03 May 2022,https://health.ec.europa.eu/ehealth-digital-health-and-care/european-health-data-space_en &lt;/span&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn2"&gt;
&lt;p&gt;&lt;a href="#_ftnref2" name="_ftn2"&gt;&lt;sup&gt;&lt;sup&gt;&lt;span&gt;[2]&lt;/span&gt;&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;span&gt;“&lt;/span&gt;&lt;span&gt;European Health Data Space&lt;/span&gt;&lt;span&gt;”&lt;/span&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn3"&gt;
&lt;p&gt;&lt;a href="#_ftnref3" name="_ftn3"&gt;&lt;sup&gt;&lt;sup&gt;&lt;span&gt;[3]&lt;/span&gt;&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;span&gt; “National Digital Health Blueprint”, Ministry of Health and Family Welfare Government of India, https://abdm.gov.in:8081/uploads/ndhb_1_56ec695bc8.pdf&lt;/span&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn4"&gt;
&lt;p&gt;&lt;a href="#_ftnref4" name="_ftn4"&gt;&lt;sup&gt;&lt;sup&gt;&lt;span&gt;[4]&lt;/span&gt;&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;span&gt; “National Digital Health Blueprint”&lt;/span&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn5"&gt;
&lt;p&gt;&lt;a href="#_ftnref5" name="_ftn5"&gt;&lt;sup&gt;&lt;sup&gt;&lt;span&gt;[5]&lt;/span&gt;&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;span&gt; “Mondaq” “DISHA – India's Probable Response To The Law On Protection Of Digital Health Data” accessed 13 June 2023,https://www.mondaq.com/india/healthcare/1059266/disha-india39s-probable-response-to-the-law-on-protection-of-digital-health-data&lt;/span&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn6"&gt;
&lt;p&gt;&lt;a href="#_ftnref6" name="_ftn6"&gt;&lt;sup&gt;&lt;sup&gt;&lt;span&gt;[6]&lt;/span&gt;&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;span&gt;“The Digital Personal Data Protection Bill 2022”, accessed 13 June 2023 , https://www.meity.gov.in/writereaddata/files/The%20Digital%20Personal%20Data%20Potection%20Bill%2C%202022_0.pdf&lt;/span&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn7"&gt;
&lt;p&gt;&lt;a href="#_ftnref7" name="_ftn7"&gt;&lt;sup&gt;&lt;sup&gt;&lt;span&gt;[7]&lt;/span&gt;&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;span&gt;The Digital Personal Data Protection Bill 2022&lt;/span&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn8"&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref8" name="_ftn8"&gt;&lt;sup&gt;&lt;sup&gt;&lt;span&gt;[8]&lt;/span&gt;&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;span&gt; Regulation (EU) 2016/679 defines health data as “Personal data concerning health should include all data pertaining to the health status of a data subject which reveal information relating to the past, current or future physical or mental health status of the data subject. This includes information about the natural person collected in the course of the registration for, or the provision of, health care services as referred to in Directive 2011/24/EU of the European Parliament and of the Council (1) to that natural person; a number, symbol or particular assigned to a natural person to uniquely identify the natural person for health purposes; information derived from the testing or examination of a body part or bodily substance, including from genetic data and biological samples; and any information on, for example, a disease, disability, disease risk, medical history, clinical treatment or the physiological or biomedical state of the data subject independent of its source, for example from a physician or other health professional, a hospital, a medical device or an in vitro diagnostic test. &lt;/span&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn9"&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref9" name="_ftn9"&gt;&lt;sup&gt;&lt;sup&gt;&lt;span&gt;[9]&lt;/span&gt;&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;span&gt; For creating an integrated, uniform and interoperable ecosystem in a patient or individual centric manner, all the government healthcare facilities and programs, in a gradual/phased manner, should start assigning the same number for providing any benefit to individuals.&lt;/span&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn10"&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref10" name="_ftn10"&gt;&lt;sup&gt;&lt;sup&gt;&lt;span&gt;[10]&lt;/span&gt;&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;span&gt; For example a manufacturer of a  fitness tracker which is capable of monitoring heart rate could state that the intended purpose of the device was fitness or wellness as opposed to early detection of heart disease thereby not falling under the purview of the regulation.&lt;/span&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn11"&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref11" name="_ftn11"&gt;&lt;sup&gt;&lt;sup&gt;&lt;span&gt;[11]&lt;/span&gt;&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;span&gt;“&lt;/span&gt;&lt;span&gt;Healthcare Executive” “GOQii Launches GOQii Smart Vital 2.0, an ECG-Enabled Smart Watch with Integrated Outcome based Health Insurance &amp;amp; Life Insurance, accessed 13 June 2023&lt;br /&gt; &lt;/span&gt;&lt;a href="https://www.healthcareexecutive.in/blog/ecg-enabled-smart-watch"&gt;&lt;span&gt;https://www.healthcareexecutive.in/blog/ecg-enabled-smart-watch&lt;/span&gt;&lt;/a&gt;&lt;span&gt; &lt;/span&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn12"&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref12" name="_ftn12"&gt;&lt;sup&gt;&lt;sup&gt;&lt;span&gt;[12]&lt;/span&gt;&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;span&gt; The guidelines only state that the Committee will be responsible for ensuring the compliance of the guidelines in relation to the personal data under its control. And does not go into details of defining the Committee.&lt;/span&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;/div&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/health-data-management-policies'&gt;https://cis-india.org/internet-governance/blog/health-data-management-policies&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>shweta</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Health Management</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Covid19</dc:subject>
    
    
        <dc:subject>Digitisation</dc:subject>
    

   <dc:date>2023-07-10T16:36:25Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/quint-shweta-mohandas-and-pallavi-bedi-june-19-2023-cowin-data-breach-health-sensitive-details-policies-solution">
    <title>CoWIN Breach: What Makes India's Health Data an Easy Target for Bad Actors?</title>
    <link>https://cis-india.org/internet-governance/blog/quint-shweta-mohandas-and-pallavi-bedi-june-19-2023-cowin-data-breach-health-sensitive-details-policies-solution</link>
    <description>
        &lt;b&gt;Recent health data policies have failed to even mention the CoWIN platform.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article was &lt;a class="external-link" href="https://www.thequint.com/opinion/cowin-data-breach-health-sensitive-details-policies-solution#read-more"&gt;originally published in the Quint&lt;/a&gt; on 19 June 2023.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;Last week, it was reported that due to an alleged breach of &lt;a href="https://www.thequint.com/fit/cowin-data-breach-private-information-covid-vaccine-telegram-bot"&gt;the CoWIN platform&lt;/a&gt;, details such as Aadhaar and passport numbers of Indians were made public via a Telegram bot.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;While Minister of State for Information Technology &lt;a href="https://www.thequint.com/fit/cowin-data-breach-telegram-bot-covid-19-vaccine-unanswered-questions"&gt;Rajeev Chandrashekar&lt;/a&gt; put out information acknowledging that there was some form of a data breach, there is no information on how the breach took place or when a past breach may have taken place.&lt;/p&gt;
&lt;blockquote class="quoted" style="text-align: justify; "&gt;This data leak is yet another example of &lt;a href="https://www.thequint.com/opinion/cowin-breach-shows-us-the-structural-problem-with-digital-indias-infrastructure"&gt;our health records&lt;/a&gt; being exposed in the recent past – during the pandemic, there were reports of COVID-19 test results being leaked online. The leaked information included patients’ full names, dates of birth, testing dates, and names of centres in which the tests were held.&lt;/blockquote&gt;
&lt;p style="text-align: justify; "&gt;In December last year, five servers of the &lt;a href="https://www.thequint.com/fit/aiims-ayushman-bharat-digital-mission-health-data"&gt;All India Institute of Medical Science&lt;/a&gt; (AIIMS) in Delhi were under a cyberattack, leaving sensitive personal data of around 3-4 crore patients compromised.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In such cases, the Indian Computer Emergency Response Team (CERT-In) is the agency responsible for looking into the vulnerabilities that may have led to them. However, till date, CERT-In has not made its technical findings into such attacks &lt;a href="https://www.thequint.com/topic/data-breach"&gt;publicly available&lt;/a&gt;.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;The COVID-19 Pandemic Created Opportunity&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The pandemic saw a number of digitisation policies being rolled out in the health sector; the most notable one being the National Digital Health Mission (or NDHM, later re-branded as the Ayushman Bharat Digital Mission).&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Mobile phone apps and web portals launched by the central and state governments during the pandemic are also examples of this health digitisation push. The rollout of the COVID-19 vaccinations also saw the deployment of the CoWIN platform.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Initially, it was mandatory for individuals to register on CoWIN to get an appointment for vaccination, and there was no option for walk-in-registration or to book an appointment. But, the Centre subsequently modified this rule and walk-in appointments and registrations on CoWIN became permissible from June 2021.&lt;/p&gt;
&lt;blockquote&gt;However, a study conducted by the Centre for Internet and Society (CIS) found that states such as Jharkhand and Chhattisgarh, which have low internet penetration, permitted on-site registration for vaccinations from the beginning.&lt;/blockquote&gt;
&lt;p&gt;The rollout of the NDHM also saw Health IDs being generated for citizens.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In several reported cases across states, this rollout happened during the COVID-19 vaccination process – without the informed consent of the concerned person.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The &lt;b&gt;beneficiaries who have had their Health IDs created through the vaccination process had not been informed&lt;/b&gt; about the creation of such an ID or their right to opt out of the digital health ecosystem.&lt;/p&gt;
&lt;h3&gt;A Web of Health Data Policies&lt;/h3&gt;
&lt;p&gt;Even before the pandemic, India was working towards a Health ID and a health data management system.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The components of the umbrella National Digital Health Ecosystem (NDHE) are the National Digital Health Blueprint published in 2019 (NDHB) and the NDHM.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Blueprint was created to implement the National Health Stack (published in 2018) which facilitated the creation of Health IDs. Whereas the NDHM was drafted to drive the implementation of the Blueprint, and promote and facilitate the evolution of NDHE.&lt;/p&gt;
&lt;p&gt;The National Health Authority (NHA), established in 2018, has been given the responsibility of implementing the National Digital Health Mission.&lt;/p&gt;
&lt;blockquote style="text-align: justify; "&gt;2018 also saw the draft Digital Information Security in Healthcare Act (DISHA), which was to regulate the generation, collection, access, storage, transmission, and use of Digital Health Data ("DHD") and associated personal data.&lt;/blockquote&gt;
&lt;p&gt;However, since its call for public consultation, &lt;b&gt;no progress has been made&lt;/b&gt; on this front.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In addition to documents that chalk out the functioning and the ecosystem of a digitised healthcare system, the NHA has released policy documents such as:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;the Health Data Management Policy (which was revised three times; the latest version released in April 2022)&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;the Health Data Retention Policy (released in April 2021)&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Consultation paper on the Unified Health Interface (UHI) (released in December 2022)&lt;/p&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;p style="text-align: justify; "&gt;Along with these policies, in 2022, the NHA released the NHA Data Sharing Guidelines for the Pradhan Mantri Jan Aarogya Yojana (PM-JAY) – India’s state health insurance policy.&lt;/p&gt;
&lt;blockquote style="text-align: justify; "&gt;However these &lt;b&gt;draft guidelines repeat the pattern of earlier policies&lt;/b&gt; &lt;b&gt;on health data&lt;/b&gt;, wherein there is no reference to the policies that predated it; the PM-JAY’s Data Sharing Guidelines, published in August 2022, did not even refer to the draft National Digital Health Data Management Policy (published in April 2022).&lt;/blockquote&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Interestingly, the recent health data policies do not mention CoWIN.&lt;/b&gt; Failing to cross-reference or mention preceding policies creates a lack of clarity on which documents are being used as guidelines by healthcare providers.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Can a Data Protection Bill Be the Solution?&lt;/h3&gt;
&lt;p&gt;The draft Data Protection Bill, 2021, defined health data as “…the data related to the state of physical or mental health of the data principal and &lt;b&gt;includes records regarding the past, present or future state of the health of such data principal&lt;/b&gt;, data collected in the course of registration for, or provision of health services, data associated with the data principal to the provision of specific health services.”&lt;/p&gt;
&lt;p&gt;However, this definition as well as the definition of sensitive personal data was removed from the current version of the Bill (Digital Personal Data Protection Bill, 2022).&lt;/p&gt;
&lt;blockquote&gt;Omitting these definitions from the Bill removes a set of data which, if collected, warrants increased responsibility and increased liability. Handling of health data, financial data, government identifiers, etc, need to come with a higher level of responsibility as they are a list of sensitive details of a person.&lt;/blockquote&gt;
&lt;p style="text-align: justify; "&gt;The threats posed as a result of this data being leaked are not limited to spam messages or fraud and impersonation, but also of companies that can get a hand on this coveted data and gather insights and train their systems and algorithms, without the need to seek consent from anyone, or without facing the consequences of harm caused.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;While the current version of the draft DPDP Bill states that the data fiduciary shall notify the data principal of any breach, the draft Bill also states that the Data Protection Board “may” direct the data fiduciary to adopt measures that remedy the breach or mitigate harm caused to the data principal.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Bill also prescribes penalties of upto Rs 250 crore if the data fiduciary fails to take reasonable security safeguards to prevent a personal data breach, and a penalty of upto Rs 200 crore if the fiduciary fails to notify the data protection board and the data principal of such breach.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;While &lt;b&gt;these steps, if implemented through legislation, would make organisations processing data take their data security more seriously&lt;/b&gt;, the removal of sensitive personal data from the definition of the Bill, would mean that data fiduciaries processing health data will not have to take additional steps other than reasonable security safeguards.&lt;/p&gt;
&lt;p&gt;The &lt;b&gt;absence of a clear indication of security standards&lt;/b&gt; will affect data principals and fiduciaries.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Looking to bring more efficiency to governance systems, the Centre launched the Digital India Mission in 2015. The press release by the central government reporting the approval of the programme by the Cabinet of Ministers speaks of ‘cradle to grave’ digital identity as one of its vision areas.&lt;/p&gt;
&lt;p&gt;The ambitious Universal Health ID and health data management policies are an example of this digitisation mission.&lt;/p&gt;
&lt;blockquote&gt;However, breaches like this are reminders that without proper data security measures, and without a designated person responsible for data security, the data is always vulnerable to attack.&lt;/blockquote&gt;
&lt;p style="text-align: justify; "&gt;While the UK and Australia have also seen massive data breaches in the past, India is at the start of its health data digitisation journey and has the ability to set up strong security measures, employ experienced professionals, and establish legal resources to ensure that data breaches are minimised and swift action can be taken in case of a breach.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;The first step&lt;/b&gt; to understand the vulnerabilities would be to present the CERT-In reports of this breach, and guide other institutions to check for the same so that they are better prepared for future breaches and attacks.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/quint-shweta-mohandas-and-pallavi-bedi-june-19-2023-cowin-data-breach-health-sensitive-details-policies-solution'&gt;https://cis-india.org/internet-governance/blog/quint-shweta-mohandas-and-pallavi-bedi-june-19-2023-cowin-data-breach-health-sensitive-details-policies-solution&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Shweta Mohandas and Pallavi Bedi</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2023-07-04T09:39:03Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/cis-comments-recommendations-to-digital-data-protection-bill">
<title>The Centre for Internet and Society’s comments and recommendations to the Digital Personal Data Protection Bill, 2022</title>
    <link>https://cis-india.org/internet-governance/blog/cis-comments-recommendations-to-digital-data-protection-bill</link>
    <description>
        &lt;b&gt;The Centre for Internet &amp; Society (CIS) published its comments and recommendations to the Digital Personal Data Protection Bill, 2022, on December 17, 2022.&lt;/b&gt;
        &lt;div class="WordSection1" style="text-align: justify; "&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt; &lt;/span&gt;&lt;/p&gt;
&lt;p align="center" class="MsoNormal" style="text-align:center; "&gt;&lt;span&gt; &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt; &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt; &lt;/span&gt;&lt;/p&gt;
&lt;p align="right" class="MsoNormal" style="text-align:right; "&gt;&lt;span&gt; &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt; &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt; &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt; &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt; &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt; &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt; &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt; &lt;/span&gt;&lt;/p&gt;
&lt;h1&gt;&lt;span&gt;High Level Comments&lt;/span&gt;&lt;/h1&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt; &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;b&gt;&lt;span&gt;1.&lt;span&gt; &lt;/span&gt;&lt;/span&gt;&lt;/b&gt;&lt;b&gt;&lt;span&gt;Rationale for removing the distinction between personal data and sensitive personal data is unclear.&lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;b&gt;&lt;span&gt; &lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt;All the earlier iterations of the Bill as well as the rules made under Section 43A of the Information Technology Act, 2000&lt;a href="#_ftn1" name="_ftnref1"&gt;&lt;sup&gt;&lt;sup&gt;&lt;span&gt;[1]&lt;/span&gt;&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; had classified data into two categories; (i) personal data; and (ii) sensitive personal data. The 2022 version of the Bill has removed this distinction and clubbed all personal data under one umbrella heading of personal data. The rationale for this is unclear, as sensitive personal data means such data which could reveal or be related to eminently private data such as financial data, health data, sexual orientations and biometric data. Considering the sensitive nature of the data, the data classified as sensitive personal data is accorded higher protection and safeguards from processing, therefore by clubbing all data as personal data, the higher protection such as the need for explicit consent to the processing of sensitive personal data, the bar on processing of sensitive personal data for employment purposes has also been removed. &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt; &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;b&gt;&lt;span&gt;2.&lt;span&gt; &lt;/span&gt;&lt;/span&gt;&lt;/b&gt;&lt;b&gt;&lt;span&gt;No clear roadmap for the implementation of the Bill&lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;b&gt;&lt;span&gt; &lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt;The 2018 Bill had specified a roadmap for the different provisions of the Bill to come into effect from the date of the Act being notified.&lt;a href="#_ftn2" name="_ftnref2"&gt;&lt;sup&gt;&lt;sup&gt;&lt;span&gt;[2]&lt;/span&gt;&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; It specifically stated the time period within which the Authority had to be established and the subsequent rules and regulations notified. &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt; &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt;The present Bill does not specify any such blueprint; it does not provide any details on either when the Bill will be notified or the time period within which the Board shall be established and specific Rules and regulations notified. Considering that certain provisions have been deferred to Rules that have to be framed by the Central government, the absence and/or delayed notification of such rules and regulations will impact the effective functioning of the Bill. Provisions such as Section 10(1) which deals with verifiable parental consent for data of children,  Section 13 (1) which states the manner in which a Data Principal can initiate a right to correction, the process of selection and functioning of consent manager under &lt;/span&gt;&lt;span&gt;3(7)&lt;/span&gt;&lt;span&gt; are few such examples, that when the Act becomes applicable, the data principal will have to wait for the Rules to Act of these provisions, or to get clarity on entities created by the Act. &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt; &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt;The absence of any sunrise or sunset provision may disincentivise political or industrial will to support or enforce the provisions of the Bill. An example of such a lack of political will was the establishment of the Cyber Appellate Tribunal. The tribunal was established in 2006 to redress cyber fraud. However, it was virtually a defunct body from 2011 onwards when the last chairperson retired. It was eventually merged with the Telecom Dispute Settlement and Appellate Tribunal in 2017. &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt; &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt;We recommend that Bill clearly lays out a time period for the implementation of the different provisions of the Bill, especially a time frame for the establishment of the Board. This is important to give full and effective effect to the right of privacy of the individual. It is also important to ensure that individuals have an effective mechanism to enforce the right and seek recourse in case of any breach of obligations by the data fiduciaries. &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt; &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt;The Board must ensure that Data Principals and Fiduciaries have sufficient awareness of the provisions of this Bill before bringing the provisions for punishment into force. This will allow the Data Fiduciaries to align their practices with the provisions of this new legislation and the Board will also have time to define and determine certain provisions that the Bill has left the Board to define. Additionally enforcing penalties for offenses initially must be in a staggered process, combined with provisions such as warnings, in order to allow first time and mistaken offenders which now could include data principals as well, from paying a high price. This will relieve the fear of smaller companies and startups and individuals who might fear processing data for the fear of paying penalties for offenses.&lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt; &lt;/span&gt;&lt;/p&gt;
&lt;h3&gt;&lt;a name="_kn12ecl3pdrp"&gt;&lt;/a&gt;&lt;span&gt;3.&lt;span&gt; &lt;/span&gt;&lt;/span&gt;&lt;span&gt;Independence of  Data Protection Board of India.&lt;/span&gt;&lt;/h3&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt;The Bill proposes the creation of the Data Protection Board of India (Board) in place of the Data Protection Authority. In comparison with the powers of the Board with the 2018 and 2019 version of Personal Data Protection Bill, we witness an abrogation of powers of the Board  to be created, in this Bill. Under Clause 19(2), the strength and composition of the Board, the process of selection, the terms and conditions of appointment and service, and the removal of its Chairperson and other Members shall be such as may be prescribed by the Union Government at a later stage. Further as per Clause 19(3), the Chief Executive of the Board will be appointed by the Union Government and the terms and conditions of her service will also be determined by the Union Government. The functions of the Board have also not been specified under the Bill, the Central Government may assign the functions to be performed by the Board.&lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt; &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt;In order to govern data protection effectively, there is a need for a responsive market regulator with a strong mandate, ability to act swiftly, and resources. The political nature of  personal data also requires that the governance of data, particularly the rule-making and adjudicatory functions performed by the Board are independent of the Executive. &lt;/span&gt;&lt;/p&gt;
&lt;h1&gt;&lt;a name="_n9jzjnvile8f"&gt;&lt;/a&gt;&lt;span&gt;Chapter Wise Comments and Recommendations &lt;/span&gt;&lt;/h1&gt;
&lt;h2&gt;&lt;a name="_chp7y0vgrjqa"&gt;&lt;/a&gt;&lt;span&gt;CHAPTER I- PRELIMINARY&lt;/span&gt;&lt;/h2&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt;&lt;span&gt; &lt;/span&gt;●&lt;span&gt; &lt;/span&gt;&lt;/span&gt;&lt;b&gt;&lt;span&gt;Definition:&lt;/span&gt;&lt;/b&gt;&lt;span&gt; While the Bill has added a few new definitions to the Bill including terms such as gains, loss, consent manager etc. there are a few key definitions that have been removed from the earlier versions of the Bill. The removal of certain definitions in the Bill, eg. sensitive personal data, health data, biometric data, transgender status, creating a legal uncertainty about the application of the Bill. &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt;With respect to the existing definitions as well the definition of the term ‘harm’ has been significantly reduced to remove harms such as surveillance from the ambit of harms. In addition, with respect of the definition of the term of harms also, the 2019 version of the Bill under Clause 2 (20) the definition provides a non exhaustive list of harms, by using the phrase “harms include”, however in the new definition the phrase has been altered to “harm”, in relation to a Data Principal, means”, thereby removing the possibility of more harms that are not apparent currently from being within the purview of the Act. We recommend that the definition of harms be made into a non-exhaustive list.&lt;br /&gt; &lt;br /&gt; &lt;/span&gt;&lt;/p&gt;
&lt;h2&gt;&lt;a name="_nhwnuzprx0ir"&gt;&lt;/a&gt;&lt;span&gt;CHAPTER II - OBLIGATIONS OF DATA FIDUCIARY&lt;/span&gt;&lt;/h2&gt;
&lt;p class="MsoNormal"&gt;&lt;b&gt;&lt;span&gt;Notice: &lt;/span&gt;&lt;/b&gt;&lt;span&gt;The revised Clause on notice does away with the comprehensive requirements which were laid out under Clause 7 of the PDP Bill 2019. The current clause does not mention in detail what the notice should contain, while stating that that the notice should be itemised. While it can be reasoned that the Data Fiduciary can find the contents of the notice throughout the bill, such as with the rights of the Data Principal, the removal of a detailed list could create uncertainty for Data Fiduciaries. By leaving the finer details of what a notice should contain, it could cause Data Fiduciaries from missing out key information from the list, which in turn provide incomplete information to the Data Principal. Even in terms of Data Fiduciaries they might not know if they are complying with the provisions of the bill, and could result in them invariably being penalised. In addition to this by requiring less work by the Data Fiduciary and processor, the burden falls on the Data Principal to make sure they know how their data is processed and collected. The purpose of this legislation is to create further rights for individuals and consumers, hence the Bill should strive to put the individual at the forefront.&lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt;In addition to this Clause 6(3) of the Bill states &lt;i&gt;“The Data Fiduciary shall give the Data Principal the option to access the information referred to in sub-sections (1) and (2) in English or any language specified in the Eighth Schedule to the Constitution of India.”&lt;/i&gt; While the inclusion of regional language notices is a welcome step, we suggest that the text be revised as follows &lt;i&gt;“The Data Fiduciary shall give the Data Principal the option to access the information referred to in sub-sections (1) and (2) in English&lt;b&gt; and in&lt;/b&gt; any language specified in the Eighth Schedule to the Constitution of India.” &lt;/i&gt;While the main crux of notice is to let the person know before giving consent, notice in a language that a person cannot read would not lead to meaningful consent.&lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;b&gt;&lt;span&gt;Consent &lt;br /&gt; &lt;br /&gt; &lt;/span&gt;&lt;/b&gt;&lt;span&gt;Clause 3 of the Bill states &lt;i&gt;“request for consent would have the contact details of a Data Protection Officer, where applicable, or of any other person authorised by the Data Fiduciary to respond to any communication from the Data Principal for the purpose of exercise of her rights under the provisions of this Act.” &lt;/i&gt;Ideally this provision should be a part of the notice and should be mentioned in the above section. This is similar to Clause 7(1)(c) of the draft Personal Data Protetion Bill 2019 which requires the notice to state &lt;i&gt;“the identity and contact details of the data fiduciary and the contact details of the data protection officer, if applicable;”. &lt;/i&gt;&lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;b&gt;&lt;span&gt;Deemed Consent&lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt;The Bill  introduces a new type of consent that was absent in the earlier versions of the Bill. We are of the understanding that deemed consent is used to redefine non consensual processing of personal data. The use of the term deemed consent and the provisions under the section while more concise than the earlier versions could create more confusion for Data Principals and Fiduciaries alike. The definition and the examples do not shed light on one of the key issues with voluntary consent - the absence of notice. In addition to this the Bill is also silent on whether deemed consent can be withdrawn or if the data principal has the same rights as those that come from processing of data they have consented to. &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;b&gt;&lt;span&gt;Personal Data Protection of Children &lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;b&gt;&lt;span&gt; &lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt;The age to determine whether a person has the ability to legally consent in the online world has been intertwined with the age of consent under the Indian Contract Act; i.e. 18 years. The Bill makes no distinction between a 5 year old and a 17 year old- both are treated in the same manner. It assumes the same level of maturity for all persons under the age of 18. It is pertinent to note that the law in the offline world does recognise that distinction and also acknowledges the changes in the level of maturity. As per Section 82 of the Indian Penal Code read with Section 83, any act by a child under the age of 12 shall not be considered as an offence. While the maturity of those aged between 12–18 years will be decided by court (individuals between the age of 16–18 years can also be tried as adults for heinous crimes). Similarly, child labour laws in the country allow children above the age of 14 years to work in non-hazardous industry&lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt; &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt;There is  a need to evaluate and rethink the idea that children are passive consumers of the internet and hence the consent of the parent is enough. Additionally, the bracketing of all individuals under the age of 18 as children fails to look at how teenages and young people use the internet. This is more important looking at the 2019 data which suggests that two-thirds of India’s internet users are in the 12–29 years age group, with those in the 12–19 age group accounting for about 21.5% of the total internet usage in metro cities. Given that the pandemic has compelled students and schools to adopt and adapt to virtual schools, the reliance on the internet has become ubiquitous with education. Out of an estimated 504 million internet users, nearly one-third are aged under 19. As per the Annual Status on Education Report (ASER) 2020, more than one-third of all schoolchildren are pursuing digital education, either through online classes or recorded videos.&lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt; &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt;Instead of setting a blanket age for determining valid consent, we could look at alternative means to determine the appropriate age for children at different levels of maturity, similar to what had been developed by the U.K. Information Commissioner’s Office. The Age Appropriate Code prescribes 15 standards that online services need to follow. It broadly applies to online services "provided for remuneration"—including those supported by online advertising—that process the personal data of and are "likely to be accessed" by children under 18 years of age, even if those services are not targeted at children. This includes apps, search engines, social media platforms, online games and marketplaces, news or educational websites, content streaming services, online messaging services. &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt; &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt;The reservation to definition of child under the Bill has also been expressed by some members of the JPC through their dissenting opinion. MP Ritesh Pandey stated that keeping in mind the best interest of the child the Bill should consider a child to be a person who is less than 14 years of age. This would ensure that young people could benefit from the advances in technology without parental consent and reduce the social barriers that young women face in accessing the internet. Similarly Manish Tiwari in his dissenting note also observed that the regulation of the processing of data of children should be based on the type of content or data. The JPC Report observed that the Bill does not require the data fiduciary to take fresh consent of the child, once the child has attained the age of majority, and it also does not give the child the option to withdraw their consent upon reaching the majority age. It therefore, made the following recommendations:&lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt; &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt;Registration of data fiduciaries, exclusively dealing with children’s data. Application of the Majority Act to a contract with a child. Obligation of Data fiduciary to inform a child to provide their consent, three months before such child attains majority  Continuation of the services until the child opts out or gives a fresh consent, upon achieving majority. However, these recommendations have not been incorporated into the provisions of the Bill. In addition to this the Bill is silent on the status of non consensual processing and deemed consent with respect to the data of children.&lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt; &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt;We recommend that fiduciaries who have services targeted at children should be considered as significant Data Fiduciaries. In addition to this the Bill should also state that the guardians could approach the Data Protection Board on behalf of the child. With these obligations in place, the age of mandatory consent could be reduced and the data fiduciary could have an added responsibility of informing the children in the simplest manner how their data will be used. Such an approach places a responsibility on Data Fiduciaires when implementing services that will be used by children and allows the children to be aware of data processing, when they are interacting with technology.&lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt; &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;b&gt;&lt;span&gt;Chapter III-RIGHTS AND DUTIES OF DATA PRINCIPAL&lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;b&gt;&lt;span&gt; &lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;b&gt;&lt;span&gt;Rights of Data Principal&lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt;Clause 12(3) of the Bill while providing the Data Principal the right to be informed of the identities of all the Data Fiduciaries with whom the personal data has been shared, also states that the data principal has the right to be informed of the categories of personal data shared. However the current version of the Bill provides only one category of data that is personal data. &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt;Clause 14 of the Bill talks about the Right of Grievance Redressal, and  states that the Data Principal has the right to readily available means of registering a grievance, however the Bill does not provide in the Notice provisions the need to mention details of a grievance officer or a grievance redressal mechanism. It is only  the additional obligations on significant data fiduciary that mentions the need for a Data Protection officer to be the contact for the grievance redressal mechanism under the provisions of this Bill. The Bill could ideally re-use the provisions of the IT Act SPDI Rules 2011 in which Section 5(7) states &lt;i&gt;“Body corporate shall address any discrepancies and grievances of their provider of the information with respect to processing of information in a time bound manner. For this purpose, the body corporate shall designate a Grievance Officer and publish his name and contact details on its website. The Grievance Officer shall redress the grievances or provider of information expeditiously but within one month ' from the date of receipt of grievance.”&lt;br /&gt; &lt;/i&gt;&lt;br /&gt; The above framing would not only bring clarity to the data fiduciaries on what process to follow for a grievance redressal, it also would reduce the significant burden of theBoard. &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;b&gt;&lt;span&gt;Duties of Data Principals&lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt;The Bill while entisting duties of the Data Principal states that the “Data Principal shall not register a false or frivolous grievance or complaint with a Data Fiduciary or the Board”, however it is very difficult for a Data Principal to and even for the Board to determine what constitutes a “frivolous grievance”. In addition to this the absence of a defined notice provision and the inclusion of deemed consent would mean that the Data Fiduciary could have more information about the matter than the Data Principal. This could mean that the fiduciary could prove that a claim was false or frivolous. Clause 21(12) states that “&lt;i&gt;At any stage after receipt of a complaint, if the Board determines that the complaint is devoid of merit, it may issue a warning or impose costs on the complainant.” &lt;/i&gt;In addition to this Clause 25(1) states that “ &lt;i&gt;If the Board determines on conclusion of an inquiry that non- compliance by &lt;b&gt;a person &lt;/b&gt;is significant, it may, after giving the person a reasonable opportunity of being heard, impose such financial penalty as specified in Schedule 1, not exceeding rupees five hundred crore in each instance.” &lt;/i&gt;The use of the term “person” in this case includes data which could mean that they could be penalised under the provisions of the Bill, which could also include not complying with the duties.&lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt; &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;b&gt;&lt;span&gt;CHAPTER IV- SPECIAL PROVISIONS&lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;b&gt;&lt;span&gt;Transfer of Personal Data outside India&lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt;Clause 17 of the Bill has removed the requirement of data localisation which the 2018 and 2019 Bill required. Personal data can be transferred to countries that will be notified by the central government. There is no need for a copy of the data to be stored locally and no prohibition on transferring sensitive personal data and critical data. Though it is a welcome change that personal data can be transferred outside of India, we would highlight the concerns in permitting unrestricted access to and transfer of all types of data. Certain data such as defence and health data do require sectoral regulation and ringfencing of the transfer of data. &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;b&gt;&lt;span&gt;Exemptions&lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt;Clause 18 of the Bill has widened the scope of government exemptions. Blanket exemption has been given to the State under Clause 18(4) from deleting the personal data even when the purpose for which the data was collected is no longer served or when retention is no longer necessary. The requirement of &lt;i&gt;proportionality, reasonableness and fairness&lt;/i&gt; have been removed for the Central Government to exempt any department or instrumentality from the ambit of the Bill.&lt;/span&gt;&lt;span&gt; &lt;/span&gt;&lt;span&gt;By doing away with the four pronged test, this provision is not in consonance with test laid down by the Supreme Court and are also incompatible with an effective privacy regulation. There is also no provision for either a prior judicial review  of the order by a district judge as envisaged by the Justice Srikrishna Committee Report or post facto review by an oversight committee of the order as laid down under the Indian Telegraph Rules, 1951&lt;a href="#_ftn3" name="_ftnref3"&gt;&lt;sup&gt;&lt;sup&gt;&lt;span&gt;[3]&lt;/span&gt;&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; and the rules framed under Information Technology Act&lt;a href="#_ftn4" name="_ftnref4"&gt;&lt;sup&gt;&lt;sup&gt;&lt;span&gt;[4]&lt;/span&gt;&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;. The provision states that such processing of personal data shall be subject to the procedure, safeguard and oversight mechanisms that may be prescribed.&lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;b&gt;&lt;span&gt; &lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;b&gt;&lt;span&gt; &lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;b&gt;&lt;span&gt; &lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;b&gt;&lt;span&gt; &lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;b&gt;&lt;span&gt; &lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;b&gt;&lt;span&gt; &lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;b&gt;&lt;span&gt; &lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;b&gt;&lt;span&gt; &lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;b&gt;&lt;span&gt; &lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;b&gt;&lt;span&gt; &lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;b&gt;&lt;span&gt; &lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;b&gt;&lt;span&gt; &lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;b&gt;&lt;span&gt; &lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt; &lt;/span&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div style="text-align: justify; "&gt;&lt;br clear="all" /&gt; 
&lt;hr align="left" size="1" width="100%" /&gt;
&lt;div id="ftn1"&gt;
&lt;p class="MsoNormal"&gt;&lt;a href="#_ftnref1" name="_ftn1"&gt;&lt;sup&gt;&lt;span&gt;&lt;sup&gt;&lt;span&gt;[1]&lt;/span&gt;&lt;/sup&gt;&lt;/span&gt;&lt;/sup&gt;&lt;/a&gt;&lt;span&gt; Information Technology (Reasonable security practices and procedures and sensitive personal data or information) Rules, 2011&lt;/span&gt;&lt;span&gt;.&lt;/span&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn2"&gt;
&lt;p class="MsoNormal"&gt;&lt;a href="#_ftnref2" name="_ftn2"&gt;&lt;sup&gt;&lt;span&gt;&lt;sup&gt;&lt;span&gt;[2]&lt;/span&gt;&lt;/sup&gt;&lt;/span&gt;&lt;/sup&gt;&lt;/a&gt;&lt;span&gt; Clause 97 of the 2018 Bill states&lt;i&gt;“(1) For the purposes of this Chapter, the term ‘notified date’ refers to the date notified by the Central Government under sub-section (3) of section 1. (2)The notified date shall be any date within twelve months from the date of enactment of this Act. (3)The following provisions shall come into force on the notified date-(a) Chapter X; (b) Section 107; and (c) Section 108. (4)The Central Government shall, no later than three months from the notified date establish the Authority. (5)The Authority shall, no later than twelve months from the notified date notify the grounds of processing of personal data in respect of the activities listed in sub-section (2) of section 17. (6) The Authority shall no, later than twelve months from the date notified date issue codes of practice  on the following matters-(a) notice under section 8; (b) data quality under section 9; (c) storage limitation under section 10; (d) processing of personal data under Chapter III; (e) processing of sensitive personal data under Chapter IV; (f) security safeguards under section 31; (g) research purposes under section 45;(h) exercise of data principal rights under Chapter VI; (i) methods of de-identification and anonymisation; (j) transparency and accountability measures under Chapter VII. (7)Section 40 shall come into force on such date as is notified by the Central Government for the purpose of that section.(8)The remaining provision of the Act shall come into force eighteen months from the notified date.”&lt;/i&gt;&lt;/span&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn3"&gt;
&lt;p class="MsoNormal"&gt;&lt;a href="#_ftnref3" name="_ftn3"&gt;&lt;sup&gt;&lt;span&gt;&lt;sup&gt;&lt;span&gt;[3]&lt;/span&gt;&lt;/sup&gt;&lt;/span&gt;&lt;/sup&gt;&lt;/a&gt;&lt;span&gt; &lt;/span&gt;&lt;span&gt;Rule 419A (16): The Central Government or the State Government shall constitute a Review Committee. &lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt;Rule 419 A(17): The Review Committee shall meet at least once in two months and record its findings whether the directions issued under sub-rule (1) are in accordance with the provisions of sub-section (2) of Section 5 of the said Act. When the Review Committee is of the opinion that the directions are not in accordance with the provisions referred to above it may set aside the directions and orders for destruction of the copies of the intercepted message or class of messages.&lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt; &lt;/span&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn4"&gt;
&lt;p class="MsoNormal"&gt;&lt;a href="#_ftnref4" name="_ftn4"&gt;&lt;sup&gt;&lt;span&gt;&lt;sup&gt;&lt;span&gt;[4]&lt;/span&gt;&lt;/sup&gt;&lt;/span&gt;&lt;/sup&gt;&lt;/a&gt;&lt;span&gt; &lt;/span&gt;&lt;span&gt;Rule 22 of Information Technology (Procedure and Safeguards for Interception, Monitoring and Decryption of Information) Rules, 2009: The Review Committee shall meet at least once in two months and record its findings whether the directions issued under rule 3 are in accordance with the provisions of sub-section (2) of section 69 of the Act and where the Review Committee is of the opinion that the directions are not in accordance with the provisions referred to above, it may set aside the directions and issue an order for destruction of the copies, including corresponding electronic record of the intercepted or monitored or decrypted information.&lt;/span&gt;&lt;/p&gt;
&lt;p class="MsoNormal"&gt;&lt;span&gt; &lt;/span&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;/div&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/cis-comments-recommendations-to-digital-data-protection-bill'&gt;https://cis-india.org/internet-governance/blog/cis-comments-recommendations-to-digital-data-protection-bill&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Shweta Mohandas and Pallavi Bedi</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Digital Governance</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2023-01-20T02:35:30Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/demistifying-data-breaches-in-india">
    <title>Demystifying Data Breaches in India</title>
    <link>https://cis-india.org/internet-governance/blog/demistifying-data-breaches-in-india</link>
    <description>
&lt;b&gt;Despite the rate at which data breaches occur and are reported in the media, there seems to be little information about how and when they are resolved. This post examines the discourse on data breaches in India with respect to their historical forms, with a focus on how the specific terminology to describe data security incidents has evolved in mainstream news media reportage.&lt;/b&gt;
        &lt;p&gt;Edited by Arindrajit Basu and Saumyaa Naidu&lt;/p&gt;
&lt;hr /&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;India saw a &lt;a href="https://theprint.in/india/despite-62-drop-in-data-breaches-india-among-top-5-nations-targeted-by-hackers-study-finds/917197/"&gt;62% drop in data breaches in the first quarter of 2022&lt;/a&gt;. Yet, it ranked fifth on the list of countries most hit by cyberattacks according to a 2022 &lt;a href="https://surfshark.com/blog/data-breach-statistics-by-country"&gt;report by Surfshark&lt;/a&gt;, a Netherlands-based VPN company. Another report &lt;a href="https://analyticsindiamag.com/the-ridiculous-17-5-cr-for-a-data-breach/"&gt;on the cost of data breaches researched by the Ponemon Institute and published by IBM&lt;/a&gt; reveals that the breach of about 29500 records between March 2021 and March 2022 resulted in a 25% increase in the average cost from INR 165 million in 2021 to INR 176 million in 2022.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;These statistics are certainly a cause for concern, especially in the context of India’s rapidly burgeoning digital economy shaped by the pervasive platformization of private and public services such as welfare, banking, finance, health, and shopping among others. Despite the rate at which data breaches occur and are reported in the media, there seems to be little information about how and when they are resolved. This post examines the discourse on data breaches in India with respect to their historical forms, with a focus on how the specific terminology to describe data security incidents has evolved in mainstream news media reportage.&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;While expert articulations of cybersecurity in general and data breaches in particular tend to predominate the public discourse on data privacy, this post aims to situate broader understandings of data breaches within the historical context of India’s IT revolution and delve into specific concepts and terminology that have shaped the broader discourse on data protection. The late 1990s and early 2000s offer a useful point of entry into the genesis of the data security landscape in India.&lt;/span&gt;&lt;/p&gt;
&lt;h3&gt;&lt;span&gt;Data Breaches and their Predecessor Forms&lt;/span&gt;&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;&lt;/span&gt;&lt;span&gt;The articulation of data security concerns around the late 1990s and early 2000s isn’t always consistent in deploying the phrase, ‘data breach’ to signal cybersecurity concerns in India. The terms such as ‘data/ identity theft’ and ‘data leak’ figure prominently in the public articulation of concerns with the handling of personal information by IT systems, particularly in the context of business process outsourcing (BPO) and e-commerce activities. Other pertinent terms such as “security breach”, “data security”, and ‘“cyberfraud” also capture the specificity of growing concerns around outsourced data to India. At the time, i.e. around mid-2000s regulatory frameworks were still evolving to accommodate and address the complexities arising from a dynamic reconfiguration of the telecommunications and IT landscape in India.&lt;/span&gt;&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;Some of the formative cases that instantiate the usage of the aforementioned terms are instructive to understand shifts in the reporting of such incidents over time. The earliest case during that period concerns&lt;a href="https://www.stop-source-code-theft.com/source-code-theft-cases-in-india/"&gt; a 2002 case concerning the theft and sale of source code&lt;/a&gt; by an IIT Kharagpur student who intended to sell the code to two undercover FBI agents who worked with the CBI to catch the thief. A straightforward case of data theft was framed by media stories around the time as a &lt;a href="https://timesofindia.indiatimes.com/iitian-held-for-stealing-software-source-code/articleshow/20389713.cms"&gt;cybercrime involving the illegal sale&lt;/a&gt; of the source code of a software package, as &lt;a href="https://economictimes.indiatimes.com/ip-laws-lax-but-us-firm-bets-on-india/articleshow/696197.cms?from=mdr"&gt;software theft of intellectual property in the context of outsourcing&lt;/a&gt; and as an instance of &lt;a href="https://www.computerworld.com/article/2573515/at-risk-offshore.html"&gt;industrial espionage in poor nations without laws protecting foreign companies&lt;/a&gt;. This case became the basis of the earliest calls for the protection of data privacy and security in the context of the Indian BPO sector. The Indian IT Act, 2000 at the time only covered &lt;a href="http://pavanduggal.com/wp-content/uploads/2016/01/India-Responds-to-Growing-Concerns-Over-Data-Security.pdf"&gt;unauthorized access and data theft from computers and networks without any provisions for data protection, interception or computer forgery&lt;/a&gt;. 
The BPO boom in India brought with it &lt;a href="https://blj.ucdavis.edu/archives/vol-6-no-2/offshore-outsourcing-to-india.html"&gt;employment opportunities for India’s English-speaking, educated youth but in the absence of concrete data privacy legislation&lt;/a&gt;, the country was regarded as an unsafe destination for outsourcing aside from the political ramifications concerning the loss of American jobs.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;In a major 2005 incident, employees of the Mphasis BFL call centre in Pune extracted sensitive bank account information of Citibank’s American customers to divert INR 1.90 crore into new accounts set up in India. The media coverage of this incident calls it &lt;a href="https://www.indiatoday.in/magazine/economy/story/20050502-pune-call-centre-fraud-rattles-india-booming-bpo-sector-787790-2005-05-01"&gt;India’s first outsourcing cyberfraud and a well planned scam&lt;/a&gt;, a &lt;a href="https://economictimes.indiatimes.com/mphasis-call-centre-fraud-net-widens/articleshow/1077097.cms"&gt;cybercrime in a globalized world&lt;/a&gt;, and a case of &lt;a href="https://timesofindia.indiatimes.com/home/sunday-times/deep-focus/indias-first-bpo-scam-unraveled/articleshow/1086438.cms"&gt;financial fraud and a scam&lt;/a&gt; that required no hacking skills, and a &lt;a href="https://www.infoworld.com/article/2668975/indian-call-center-workers-charged-with-citibank-fraud.html"&gt;case of data theft and misuse&lt;/a&gt;. Within the ambit of cybercrime, media reports of these incidents refer to them as cases of “fraud”, “scam” and “theft''.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;Two other incidents in 2005 set the trend for a critical spotlight on data security practices in India. In a &lt;a href="http://news.bbc.co.uk/2/hi/south_asia/4619859.stm"&gt;June 2005 incident, an employee of a Delhi-based BPO firm, Infinity e-systems, sold the account numbers and passwords of 1000 bank customers &lt;/a&gt;to the British Tabloid, The Sun. The Indian newspaper, Telegraph India, carried an online story headlined, “&lt;a href="https://www.telegraphindia.com/india/bpo-blot-in-british-backlash-indian-sells-secret-data/cid/873737"&gt;BPO Blot in British Backlash: Indian Sells Secret Data&lt;/a&gt;,” which reported that the employee, Kkaran Bahree, 24, was set up by a British journalist, Oliver Harvey. Harvey filmed Bahree accepting wads of cash for the stolen data. Bahree’s theft of sensitive information is described both as a data fraud and a leak in the above 2005 BBC story by Soutik Biswar. Another story on the incident calls it a “&lt;a href="https://www.rediff.com/money/2005/jun/24bpo3.htm"&gt;scam” involving the leakage of credit card information&lt;/a&gt;. The use of the term ‘leak’ appears consistently across other media accounts such as a &lt;a href="https://timesofindia.indiatimes.com/city/delhi/esearch-bpo-employee-sacked-still-missing/articleshow/1153017.cms"&gt;2005 story on Karan Bahree in the Times of India&lt;/a&gt; and another story in the Economic Times about the Australian Broadcasting Corporation’s (ABC) sting operation similar to the one in Delhi, describing the scam by the &lt;a href="https://economictimes.indiatimes.com/hot-links/bpo/karan-bahree-part-ii-shot-in-australia/articleshow/1201347.cms?from=mdr"&gt;fraudsters as a leak&lt;/a&gt; of the online information of Australians. Another media account of the coverage describes the incident in more generic terms such as an “&lt;a href="https://www.tribuneindia.com/2005/20050625/edit.htm"&gt;outsourcing crime&lt;/a&gt;”.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;The other case concerned &lt;a href="https://www.taylorfrancis.com/chapters/mono/10.4324/9781315610689-16/political-economy-data-security-bpo-industry-india-alan-chong-faizal-bin-yahya"&gt;four former employees of Parsec technologies who stole classified information and diverted calls from potential customers&lt;/a&gt;, causing a sudden drop in the productivity of call centres managed by the company in November 2005. Another call centre &lt;a href="http://news.bbc.co.uk/1/hi/uk/7953401.stm"&gt;fraud came to light in 2009 through a BBC sting operation in which British reporters went to Delhi &lt;/a&gt;and secretly filmed a deal with a man selling credit card and debit card details obtained from Symantec call centres, which sold software made by Norton. This BBC story uses the term “breach” to refer to the incident.&lt;/p&gt;
&lt;p dir="ltr"&gt;In the broader framing of these cases generally understood as cybercrime, which received transnational media coverage, the terms “fraud”, “leak”, “scam”, and “theft” appear interchangeably. The term “data breach” does not seem to be a popular or common usage in these media accounts of the BPO-related incidents. A broader sense of breach (of confidentiality, privacy) figures in the media reportage in &lt;a href="https://economictimes.indiatimes.com/hot-links/bpo/cyber-crimes-can-the-west-trust-indian-bpos/articleshow/1157115.cms?from=mdr"&gt;implicitly racial terms of cultural trust&lt;/a&gt;, as a matter of &lt;a href="https://www.news18.com/news/business/bpo-staff-need-ethical-training-poll-248442.html"&gt;ethics and professionalism&lt;/a&gt; and in the &lt;a href="https://www.news18.com/news/business/sting-op-may-spell-doom-for-bpos-248260.html"&gt;language of scandal &lt;/a&gt;in some cases.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;These early cases typify a specific kind of cybercrime concerning the theft or misappropriation of outsourced personal data belonging to British or American residents. What’s remarkable about these cases is the utmost sensitivity of the stolen personal information including financial details, bank account and credit/debit card numbers, passwords, and in one case, source code. While these cases rang the alarm bells on the Indian BPO sector’s data security protocols, they also directed attention to concerns around &lt;a href="https://economictimes.indiatimes.com/hot-links/bpo/cyber-crimes-can-the-west-trust-indian-bpos/articleshow/1157115.cms?from=mdr"&gt;the training of Indian employees on the ethics of data confidentiality and vetting through psychometric tests&lt;/a&gt; for character assessment. In the wake of these incidents, the National Association of Software and Service Companies (NASSCOM), an Indian non-governmental trade and advocacy group,&lt;a href="https://www.computerworld.com/article/2547959/outsourcing-to-india--dealing-with-data-theft-and-misuse.html"&gt; launched a National Skills Registry for IT professionals to enable employers to conduct background checks&lt;/a&gt; in 2006.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;These data theft incidents earned India a global reputation of an unsafe destination for business process outsourcing, seen to be lacking both, a culture of maintaining data confidentiality and concrete legislation for data protection at the time. Importantly, the incidents of data theft or misappropriation were also traceable back to a known source, a BPO employee or a group of malefactors, who often sold sensitive data belonging to foreign nationals to others in India.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;The phrase “data leak” also caught on in another register in the context of the widespread use of camera-equipped mobile phones in India. The 2004 Delhi MMS case offers an instance of a date leak, recapitulating the language of scandal in moralistic terms.&lt;/p&gt;
&lt;h3 dir="ltr"&gt;The Delhi MMS Case&lt;/h3&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;The infamous 2004 incident involved two underage Delhi Public School (DPS) students who recorded themselves in a sexually explicit act on a cellular phone. After a fall out, the male student passed the low-resolution clip on to his friend in which his female friend’s face is seen. The clip, distributed far and wide in India, ended up on the famous e-shopping and auction website, bazee.com leading to &lt;a href="https://indiancaselaw.in/avnish-bajaj-vs-state-dps-mms-scandal-case/"&gt;the arrest of the website’s CEO Avinash Bajaj for hosting the listing for sale&lt;/a&gt;. Another similar case in 2004 mimicked the mechanics of visual capture through hand-held MMS-enabled mobile phones. A two-minute MMS of a top South-Indian actress &lt;a href="https://timesofindia.indiatimes.com/india/web-of-sleaze-now-nude-video-of-top-actress/articleshow/966048.cms"&gt;taking a shower went viral on the Internet in 2004, the year when another MMS of two prominent Bollywood actors kissing&lt;/a&gt; had already done the rounds. The &lt;a href="https://www.journals.upd.edu.ph/index.php/plaridel/article/view/2392"&gt;MMS case also marked the onset of a national moral panic around the amateur uses of mobile phone technologies&lt;/a&gt;, capable of corrupting young Indian minds under a sneaky regime of new media modernity. The MMS case, not strictly the classic case of a data breach - non-visual information generally stored in databases - became an iconic case of a data leak framed in the media as &lt;a href="https://www.telegraphindia.com/india/scandal-in-school-shakes-up-delhi/cid/1667531"&gt;a scandal that shocked the country&lt;/a&gt;, with calls for the regulation of mobile phone use in schools. 
The case continued its scandalous afterlife in a &lt;a href="https://www.heraldgoa.in/Edit/dev-ds-leni-has-a-dps-mms-scandal-connection-/21344"&gt;2009 Bollywood film, Dev D&lt;/a&gt; and another &lt;a href="https://indianexpress.com/article/entertainment/entertainment-others/delhi-mms-scandal-inspires-dibakars-love-sex-aur-dhoka/"&gt;2010 film, Love, Sex and Dhokha&lt;/a&gt;,&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;Taken together, the BPO data thefts and frauds and the data leak scandals prefigure the contemporary discourse on data breaches in the second decade of the 21st century, or what may also be called the Decade of Datafication. The launch of the Indian biometric identity project, Aadhaar, in 2009, which linked access to public services and welfare delivery with biometric identification, resulted in large-scale data collection of the scheme’s subscribers. Such linking raised the spectre of state surveillance as alleged by the critics of Aadhaar, marking a watershed moment in the discourse on data privacy and protection.&lt;/p&gt;
&lt;h3 dir="ltr"&gt;Aadhaar Data Security and Other Data Breaches&lt;/h3&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;Aadhaar was challenged in the Indian Supreme Court in 2012 when &lt;a href="https://www.outlookindia.com/website/story/worries-about-the-aadhaar-monster/296790"&gt;it was made mandatory for welfare and other services such as banking, taxation and mobile telephony&lt;/a&gt;. The national debate on the status of privacy as a cultural practice in Indian society and a fundamental right in the Indian Constitution led to two landmark judgments - the &lt;a href="https://main.sci.gov.in/supremecourt/2012/35071/35071_2012_Judgement_24-Aug-2017.pdf"&gt;2017 Puttaswamy ruling&lt;/a&gt; holding privacy to be a constitutional right subject to limitations and &lt;a href="https://indiankanoon.org/doc/127517806/"&gt;the 2018 Supreme Court judgment holding mandatory Aadhaar to be constitutional only for welfare and taxation but no other service&lt;/a&gt;.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;While these judgments sought to rein in Aadhaar’s proliferating mandatory uses, biometric verification remained the most common mode of identity authentication with &lt;a href="https://www.businesstoday.in/latest/trends/story/aadhaar-not-mandatory-yet-organisations-pose-it-as-a-mandatory-document-335550-2022-05-29"&gt;most organizations claiming it to be mandatory for various purposes&lt;/a&gt;. During the same period from 2010 onwards, a range of data security events concerning Aadhaar came to light. These included &lt;a href="https://www.firstpost.com/tech/news-analysis/aadhaar-security-breaches-here-are-the-major-untoward-incidents-that-have-happened-with-aadhaar-and-what-was-actually-affected-4300349.html"&gt;app-based flaws, government websites publishing Aadhaar details of subscribers, third party leaks of demographic data, duplicate and forged Aadhaar cards and other misuses&lt;/a&gt;.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;In 2015, the Indian government launched its ambitious &lt;a href="https://indiancc.mygov.in/wp-content/uploads/2021/08/mygov-10000000001596725005.pdf"&gt;Digital India Campaign to provide government services to Indian citizens&lt;/a&gt; through online platforms. Yet, data security breach incidents continued to increase, particularly the trade in the sale and purchase of sensitive financial information related to bank accounts and credit card numbers. The online availability of &lt;a href="https://www.livemint.com/Industry/l5WlBjdIDXWehaoKiuAP9J/India-unprepared-to-tackle-online-data-security-report.html"&gt;a rich trove of data, accessible via a simple Google search without the use of any extractive software or hacking skills &lt;/a&gt;within a thriving shadow economy of data buyers and sellers makes India a particularly vulnerable digital economy, especially in the absence of robust legislation. The lack of awareness around digital crimes and low digital literacy further exacerbates the situation given that datafication via government portals, e-commerce, and online apps has outpaced the enforcement of legislative frameworks for data protection and cybersecurity.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;In the context of Aadhaar data security issues, the term “data leak” seems to have more traction in media stories followed by the term “security breach”. Given the complexity of the myriad ways in which Aadhaar data has been breached, terms such as &lt;a href="https://techcrunch.com/2022/06/13/aadhaar-leak-pm-kisan/?guccounter=1&amp;amp;guce_referrer=aHR0cHM6Ly93d3cuZ29vZ2xlLmNvbS8&amp;amp;guce_referrer_sig=AQAAADvQXtC19Gj80LSKVc5jLwnRsREalvM2f6dV3N9KmCs8be6_1Zbvu3J6abPmBxhLlUooLiOjg4JktYDDCXr0OYYvOZ5XFlXa6DfCJk97TvMXM-cs3uJbCJBA-ePqvAC5K4qGZSyDB4OykMEOIKXJpB0CTOourPRc5dBxFFq5JXlB"&gt;data leak and exposure&lt;/a&gt; (of &lt;a href="https://zeenews.india.com/personal-finance/aadhaar-data-breach-over-110-crore-indian-farmers-aadhaar-card-data-compromised-2473666.html"&gt;11 crore Indian farmers’ sensitive information&lt;/a&gt;) add to the specificity of the data security compromise. The term “fraud” also makes a comeback in the context of &lt;a href="https://www.business-standard.com/article/economy-policy/india-s-aadhaar-id-system-delivers-benefits-but-at-risk-of-widespread-fraud-122062400124_1.html"&gt;Aadhaar-related data security incidents&lt;/a&gt;. 
These cases represent a mix of data frauds involving&lt;a href="https://economictimes.indiatimes.com/news/india/alarm-over-fake-id-printing-websites-using-customer-data-for-cyber-fraud/articleshow/94742646.cms"&gt; fake identities&lt;/a&gt;, &lt;a href="https://indianexpress.com/article/cities/delhi/in-new-age-data-theft-fraudsters-steal-thumb-prints-from-land-registries-7914530/"&gt;theft of thumb prints&lt;/a&gt;, for instance from land registries, and inadvertent data leaks in numerous incidents involving &lt;a href="https://techcrunch.com/2019/01/31/aadhaar-data-leak/"&gt;government employees in Jharkhand&lt;/a&gt;, &lt;a href="https://www.firstpost.com/india/aadhaar-data-leak-details-of-7-82-cr-indians-from-ap-and-telangana-found-on-it-grids-database-6448961.html"&gt;voter ID information of Indian citizens in Andhra Pradesh and Telangana&lt;/a&gt; and &lt;a href="https://www.thehindu.com/sci-tech/technology/major-aadhaar-data-leak-plugged-french-security-researcher/article26584981.ece"&gt;activist reports of Indian government websites leaking Aadhaar data&lt;/a&gt;.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;Aadhaar-related data security events parallel the increase in corporate data breaches during the decade of datafication. The term “data leak” again alternates with the term “data breach” in most media accounts while other terms such as “theft” and “scam” all but disappear in the media coverage of corporate data breaches.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;From 2016 onwards, incidents of corporate data breaches in India continued to rise. A massive &lt;a href="https://thewire.in/banking/debit-card-breach-india-banking"&gt;debit card data breach involving the YES Bank ATMs and point-of-sale (PoS) machines &lt;/a&gt;compromised through malware between May and July of 2016 resulted in the exposure of ATM PINs and non-personal identifiable information of customers. It went &lt;a href="https://www.livemint.com/Industry/Ope7B0jpjoLkemwz6QXirN/SBI-Yes-Bank-MasterCard-deny-data-breach-of-own-systems.html"&gt;undetected for nearly three&lt;/a&gt; months. Another data leak in 2018 concerned a &lt;a href="https://www.zdnet.com/article/another-data-leak-hits-india-aadhaar-biometric-database/"&gt;system run by Indane, a state-owned utility company, which allowed anyone to download private information on all Aadhaar holders &lt;/a&gt;including their names, services they were connected to and the unique 12-digit Aadhaar number. Data breaches continued to be reported in India concurrent with the incidents of data mismanagement related to Aadhaar. Some &lt;a href="https://www.csoonline.com/article/3541148/the-biggest-data-breaches-in-india.html"&gt;prominent data breaches included &lt;/a&gt;a cyberattack on the systems of airline data service provider SITA resulting in the leak of Air India passenger data, leakage of the personal details of the Common Admission Test (CAT) applicants, details of credit card and order preferences of Domino’s pizza customers on the dark web, leakage of COVID-19 patients’ test results leaked by government websites, user data of Justpay and Big Basket for sale on the dark web and an SBI data breach among others between 2019 and 2021.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;The media reportage of these data breaches use the term “cyberattack” to describe the activities of hackers and cybercriminals operating within a&lt;a href="https://www.thehindu.com/sci-tech/technology/internet/most-damaging-cybercrime-services-are-cheap-on-the-dark-web/article37004587.ece"&gt; shadow economy or the dark web&lt;/a&gt;. Recent examples of cyberattacks by hackers who leak user data for sale on the dark web include &lt;a href="https://indianexpress.com/article/technology/tech-news-technology/mobikwik-database-leaked-on-dark-web-company-denies-any-data-breach-7251448/"&gt;8.2 terabytes of 110 million sensitive financial data (KYC details, Aadhaar, credit/debit cards and phone numbers) of the payments app MobiKwik users&lt;/a&gt;, &lt;a href="https://www.firstpost.com/tech/news-analysis/dominos-india-data-breach-name-location-mobile-number-email-of-18-crore-orders-up-for-sale-on-dark-web-9650591.html"&gt;180 million Domino’s pizza orders (name, location, emails, mobile numbers),&lt;/a&gt; and &lt;a href="https://techcrunch.com/2022/07/18/cleartrip-data-breach-dark-web/"&gt;Flipkart’s Cleartrip users’ data&lt;/a&gt;. In these incidents again, three terms appear prominently in the media reportage - cyberattack, data breach, and leak. The term “data breach” remains the most frequently used epithet in the media coverage of the lapses of data security. While it alternates with the term “leak” in the stories, the term “data breach” appears consistently across most headlines in the news stories.&lt;/p&gt;
&lt;p dir="ltr"&gt;The exposure of sensitive, personal, and non-personal data by public and private entities in India is certainly a cause for concern, given the ongoing data protection legislative vacuum.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;The media coverage of data breaches tends to emphasize the quantum of compromised user data aside from the types of data exposed. The media framing of these breaches in &lt;a href="https://www.livemint.com/technology/tech-news/indian-firms-lost-176-million-to-data-breaches-last-fiscal-11658914231530.html"&gt;quantitative terms of financial loss&lt;/a&gt; as well as the &lt;a href="https://www.indiatoday.in/technology/news/story/personal-data-of-3-4-million-paytm-mall-users-reportedly-exposed-in-2020-data-breach-1980690-2022-07-27"&gt;magnitude&lt;/a&gt; and the &lt;a href="https://www.moneycontrol.com/news/business/banks/indian-banks-reported-248-data-breaches-in-last-four-years-says-government-8940891.html"&gt;number of breaches&lt;/a&gt; certainly highlights the gravity of these incidents but harm to individual users is often not addressed.&lt;/p&gt;
&lt;h3 dir="ltr"&gt;Evolving Terminology and the Source of Data Harms&lt;/h3&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;The main difference in the media reportage of the BPO cybersecurity incidents during the early aughts and the contemporary context of datafication is the usage of the term, “data breach”, which figures prominently in contemporary reportage of data security incidents but not so much in the BPO-related cybercrimes.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;THe BPO incidents of data theft and the attendant fraud must be understood in the context of the anxieties brought on by a globalizing world of Internet-enabled systems and transnational communications. In most of these incidents regarded as cybercrimes, the language of fraud and scam ventures further to attribute such illegal actions of the identifiable malefactors to cultural factors such as lack of ethics and professionalism.The usage of the term “data leak” in these media reports functions more specifically to underscore a broader lapse in data security as well as a lack of robust cybersecurity laws. The broader term, “breach”, is occasionally used to refer to these incidents but the term, “data breach” doesn’t appear as such.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;The term “data breach” gains more prominence in media accounts from 2009 onwards in the context of Aadhaar and the online delivery of goods and services by public and private players. The term “data breach” is often used interchangeably with the term “leak” within the broader ambit of cyberattacks in the corporate sector. The media reportage frames Aadhaar-related security lapses as instances of security/data breaches, data leaks, fraud, and occasionally scam.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;In contrast to the handful of data security cases in the BPO sector, data breaches have abounded in the second decade of the twenty-first century. What further differentiates the BPO-related incidents to the contemporary data breaches is the source of the data security lapse. Most corporate data breaches remain attributable to the actions of hackers and cybercriminals while the BPO security lapses were traceable back to ex-employees or insiders with access to sensitive data. We also see in the coverage of the BPO-related incidents, the attribution of such data security lapses to cultural factors including a lack of ethics and professionalism often in racial overtones. The media reportage of the BBC and ABC sting operations suggests that the India BPOs lack of preparedness to handle and maintain personal data confidentiality of foreigners point to the absence of a privacy culture in India. Interestingly, this transnational attribution recurs in a different form in the national debate on &lt;a href="https://huffpost.netblogpro.com/archive/in/entry/indians-don-t-care-about-privacy-but-thankfully-the-law-will-teach-them-what-it-means_a_23179031"&gt;Aadhaar and how Indians don’t care about their privacy&lt;/a&gt;.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;The question of the harms of data breaches to individuals is also an important one. In the discourse on contemporary data breaches, the actual material harm to an individual user is rarely ever established in the media reportage and generally framed as potential harm that could be devastating given the sensitivity of the compromised data. The harm is reported to be predominantly a function of organizational cybersecurity weakness or attributed to hackers and cybercriminals.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;The reporting of harm in collective terms of the number of accounts breached, financial costs of a data breach, the sheer number of breaches and the global rankings of countries with the highest reported cases certainly suggests a problem with cybersecurity and the lack of organizational preparedness. However, this collective framing of a data breach’s impact usually elides an individual user’s experience of harm. Even in the case of Aadhaar-related breaches - a mix of leaking data on government websites and other online portals and breaches - the notion of harm owing to exposed data isn’t clearly established. This is, however, different from the &lt;a href="https://scroll.in/article/1013700/six-types-of-problems-aadhaar-is-causing-and-safeguards-needed-immediately"&gt;extensively documented cases of Aadhaar-related issues&lt;/a&gt; in which welfare benefits have been denied, identities stolen and legitimate beneficiaries erased from the system due to technological errors.&lt;/p&gt;
&lt;h3 dir="ltr"&gt;Future Directions of Research&lt;/h3&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;This brief, qualitative foray into the media coverage of data breaches over two decades has aimed to trace the usage of various terms in two different contexts - the Indian BPO-related incidents and the contemporary context of datafication. It would be worth exploring at length, the relationship between frequent reports of data breaches, and the language used to convey harm in the contemporary context of a concrete data protection legislation vacuum. It would be instructive to examine the specific uses of the terms such as “fraud”, “leak”, “scam”, “theft” and “breach” in media reporting of such data security incidents more exhaustively. Such analysis would elucidate how media reportage shapes public perception towards the safety of user data and an anticipation of attendant harm as data protection legislation continues to evolve.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;Especially with Aadhaar, which represents a paradigm shift in identity verification through digital means, it would be useful to conduct a sentiment analysis of how biometric identity related frauds, scams, and leaks are reported by the mainstream news media. A study of user attitudes and behaviours in response to the specific terminology of data security lapses such as the terms “breach”, “leak”, “fraud”, “scam”, “cybercrime”, and “cyberattack” would further contribute to how lay users understand the gravity of a data security lapse. Such research would go beyond expert understandings of data security incidents that tend to dominate media reportage to elucidate the concerns of lay users and further clarify the cultural meanings of data privacy.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/demistifying-data-breaches-in-india'&gt;https://cis-india.org/internet-governance/blog/demistifying-data-breaches-in-india&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Pawan Singh</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Data Governance</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Data Management</dc:subject>
    

   <dc:date>2022-10-17T16:14:03Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/directions-cyber-digital-europe-arindrajit-basu-september-16-2022-getting-the-digital-indo-pacific-economic-framework-right">
    <title>Getting the (Digital) Indo-Pacific Economic Framework Right</title>
    <link>https://cis-india.org/internet-governance/blog/directions-cyber-digital-europe-arindrajit-basu-september-16-2022-getting-the-digital-indo-pacific-economic-framework-right</link>
    <description>
        &lt;b&gt;On the eve of the Tokyo Quad Summit in May 2022, President Biden unveiled the Indo-Pacific Economic Framework (IPEF), visualising cooperation across the Indo-Pacific based on four pillars: trade; supply chains; clean energy, decarbonisation and infrastructure; and tax and anti-corruption. Galvanised by the US, the other 13 founding members of the IPEF are Australia, Brunei Darussalam, India, Indonesia, Japan, Republic of Korea, Malaysia, New Zealand, Philippines, Singapore, Thailand and Vietnam. The first official in-person Ministerial meeting was held in Los Angeles on 9 September 2022.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article was &lt;a class="external-link" href="https://directionsblog.eu/getting-the-digital-indo-pacific-economic-framework-right/"&gt;originally published in Directions&lt;/a&gt; on 16 September 2022.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;It is still early days. Given the broad and noncommittal scope of the &lt;a href="http://indiamediamonitor.in/ViewImg.aspx?rfW3mQFhdxZsqXnJzK5Xi5+XYlnW6zXnPDF3Ad56Y/KdgI1zvICzrodtLI85MPKdVO1fIh79GUlPfyXY2/bE2g==" rel="noreferrer noopener" target="_blank"&gt;economic arrangement&lt;/a&gt;, it is unlikely that the IPEF will lead to a trade deal among members in the short run. Instead, experts believe that this new arrangement is designed to serve as a ‘&lt;a href="https://indianexpress.com/article/opinion/columns/building-on-common-ground-7963518/" rel="noreferrer noopener" target="_blank"&gt;framework or starting point&lt;/a&gt;’ for members to cooperate on geo-economic issues relevant to the Indo-Pacific, buoyed in no small part by the United States’ desire to make up lost ground and counter Chinese economic influence in the region.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;United States Trade Representative (USTR) Katherine Tai has underscored the relevance of the Indo-Pacific digital economy to the US agenda with the IPEF. She has emphasized the &lt;a href="https://www.whitehouse.gov/briefing-room/press-briefings/2022/05/23/on-the-record-press-call-on-the-launch-of-the-indo-pacific-economic-framework/" rel="noreferrer noopener" target="_blank"&gt;importance of&lt;/a&gt; collaboratively addressing key connectivity and technology challenges, including standards on cross-border data flows, data localisation and online privacy, as well as the discriminatory and unethical use of artificial intelligence. This is an ambitious agenda given the divergence among members in terms of technological advancement, domestic policy preferences and international negotiating stances at digital trade forums. There is a significant risk that imposing external standards or values on this evolving and politically-contested digital economy landscape will not work, and may even undermine the core potential of the IPEF in the Indo-Pacific. This post evaluates the domestic policy preferences and strategic interests of the Framework’s member states, and how the IPEF can navigate key points of divergence in order to achieve meaningful outcomes.&lt;/p&gt;
&lt;h3&gt;&lt;strong&gt;State of domestic digital policy among IPEF members&lt;/strong&gt;&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Data localisation is a core point of divergence in global digital policymaking. It continues to dominate discourse and trigger dissent at all &lt;a href="https://www.ikigailaw.com/the-data-localization-debate-in-international-trade-law/#acceptLicense" rel="noreferrer noopener" target="_blank"&gt;international trade forums&lt;/a&gt;, including the World Trade Organization. IPEF members have a range of domestic mandates restricting cross-border flows, which vary in scope, format and rigidity (see table below)&lt;strong&gt;. &lt;/strong&gt;Most countries only have a conditional data localisation requirement, meaning data can only be transferred to countries where it is accorded an equivalent level of protection – unless the individual whose data is being transferred consents to said transfer. &lt;a href="https://www.lexology.com/library/detail.aspx?g=ee977f2e-ecfb-45cf-9f63-186a78a49512#:~:text=Australia%20has%20no%20broad%20data,transferred%20or%20processed%20outside%20Australia." rel="noreferrer noopener" target="_blank"&gt;Australia &lt;/a&gt;and the &lt;a href="https://www.acq.osd.mil/dpap/pdi/docs/FAQs_Network_Penetration_Reporting_and_Contracting_for_Cloud_Services_(01-27-2017).pdf" rel="noreferrer noopener" target="_blank"&gt;United States&lt;/a&gt; have sectoral localisation requirements for health and defence data respectively. India presently has multiple sectoral data localisation requirements. In particular, a 2018 Reserve Bank of India (RBI) &lt;a href="https://www.rbi.org.in/Scripts/NotificationUser.aspx?Id=11244&amp;amp;Mode=0" rel="noreferrer noopener" target="_blank"&gt;directive&lt;/a&gt; imposed strict local storage requirements along with a 24-hour window for foreign processing of payments data generated in India. 
The RBI imposed a &lt;a href="https://theprint.in/economy/what-is-data-localisation-why-mastercard-amex-diners-club-cant-add-more-customers-in-india/703790/" rel="noreferrer noopener" target="_blank"&gt;moratorium&lt;/a&gt; on the issuance of new cards by several US-based card companies until compliance issues with the data localisation directive were resolved. Furthermore, several iterations of India’s recently &lt;a href="https://www.thehindu.com/sci-tech/technology/internet/explained-why-has-the-government-withdrawn-the-personal-data-protection-bill-2019/article65736155.ece" rel="noreferrer noopener" target="_blank"&gt;withdrawn &lt;/a&gt;Personal Data Protection Bill contained localisation requirements for some categories of personal data.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Indonesia and Vietnam have &lt;a href="https://thediplomat.com/2020/01/the-retreat-of-the-data-localization-brigade-india-indonesia-and-vietnam/" rel="noreferrer noopener" target="_blank"&gt;diluted&lt;/a&gt; the scopes of their data localisation mandates to apply, respectively, only to companies providing public services and to companies not complying with other local laws. These dilutions may have occurred in response to concerted pushback from foreign technology companies operating in these countries. In addition to sectoral restrictions on the transfer of geospatial data, South Korea&lt;a href="https://carnegieendowment.org/2021/08/17/korean-approach-to-data-localization-pub-85165" rel="noreferrer noopener" target="_blank"&gt; retains &lt;/a&gt;several procedural checks on cross-border flows, including formalities regarding providing notice to individual users.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Moving onto another issue flagged by USTR Tai, while all IPEF members recognise the right to information privacy at an overarching or constitutional level, the legal and policy contours of data protection are at different stages of evolution in different countries. &lt;a href="https://www.dlapiperdataprotection.com/index.html?t=law&amp;amp;c=JP#:~:text=Personal%20Information%20Protection%20Commission,-Kasumigaseki%20Common%20Gate&amp;amp;text=Japan%20does%20not%20have%20a%20central%20registration%20system.&amp;amp;text=There%20is%20no%20specific%20legal,(eg%20Chief%20Privacy%20Officer)." rel="noreferrer noopener" target="_blank"&gt;Japan&lt;/a&gt;, &lt;a href="https://www.dlapiperdataprotection.com/index.html?t=law&amp;amp;c=KR" rel="noreferrer noopener" target="_blank"&gt;South Korea&lt;/a&gt;, &lt;a href="https://www.pdp.gov.my/jpdpv2/assets/2020/01/Introduction-to-Personal-Data-Protection-in-Malaysia.pdf" rel="noreferrer noopener" target="_blank"&gt;Malaysia&lt;/a&gt;, &lt;a href="https://www.linklaters.com/en/insights/data-protected/data-protected---new-zealand#:~:text=There%20is%20no%20data%20portability%20right%20in%20New%20Zealand.&amp;amp;text=While%20there%20is%20no%20%22right,a%20correction%20to%20that%20information." rel="noreferrer noopener" target="_blank"&gt;New Zealand,&lt;/a&gt; &lt;a href="https://www.privacy.gov.ph/data-privacy-act/#:~:text=%E2%80%93%20(a)%20The%20personal%20information,against%20any%20other%20unlawful%20processing." rel="noreferrer noopener" target="_blank"&gt;Philippines&lt;/a&gt;, &lt;a href="https://www.pdpc.gov.sg/Overview-of-PDPA/The-Legislation/Personal-Data-Protection-Act#:~:text=What%20is%20the%20PDPA%3F,Banking%20Act%20and%20Insurance%20Act." 
rel="noreferrer noopener" target="_blank"&gt;Singapore&lt;/a&gt; and &lt;a href="https://www.trade.gov/market-intelligence/thailand-personal-data-protection-act#:~:text=The%20legislation%20mandates%20that%20data,1%20million%20in%20criminal%20fines." rel="noreferrer noopener" target="_blank"&gt;Thailand &lt;/a&gt;have data protection frameworks in place. Data protection frameworks in India and Brunei are under consultation. Notably, the US does not have a comprehensive federal framework on data privacy, although there are patchworks of data privacy regulations at both the federal and state levels.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Regulation and strategic thinking on artificial intelligence (AI) are also at varying levels of development among IPEF members. India has produced a slew of policy papers on Responsible Artificial Intelligence. The most recent &lt;a href="https://www.niti.gov.in/sites/default/files/2021-08/Part2-Responsible-AI-12082021.pdf" rel="noreferrer noopener" target="_blank"&gt;policy paper&lt;/a&gt; published by NITI AAYOG (the Indian government’s think tank) refers to constitutional values and endorses a risk-based approach to AI regulation, much like that adopted by the EU. The US National Security Commission on Artificial Intelligence (NSCAI), chaired by Google CEO Eric Schmidt, expressed concerns about the US ceding AI leadership ground to China. The NSCAI’s final &lt;a href="https://www.nscai.gov/" rel="noreferrer noopener" target="_blank"&gt;report &lt;/a&gt;emphasised the need for US leadership of a ‘coalition of democracies’ as an alternative to China’s autocratic and control-oriented model. Singapore has also made key strides on trusted AI, launching &lt;a href="https://www.pdpc.gov.sg/news-and-events/announcements/2022/05/launch-of-ai-verify---an-ai-governance-testing-framework-and-toolkit" rel="noreferrer noopener" target="_blank"&gt;A.I. verify&lt;/a&gt; – the world’s first AI Governance Testing Framework for companies that wish to demonstrate their use of responsible AI through a minimum verifiable product.&lt;/p&gt;
&lt;h3&gt;&lt;strong&gt;IPEF and pipe dreams of digital trade&lt;/strong&gt;&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Some members of the IPEF are signatories to other regional trade agreements. With the exception of Fiji, India and the US, all the IPEF countries are members of the Regional Comprehensive Economic Partnership &lt;a href="https://www.dfat.gov.au/trade/agreements/in-force/rcep#:~:text=RCEP%20entered%20into%20force%20on,Australia%20as%20an%20original%20party." rel="noreferrer noopener" target="_blank"&gt;(RCEP)&lt;/a&gt;, which also includes China. Five IPEF member countries are also members of the &lt;a href="https://www.dfat.gov.au/trade/agreements/in-force/cptpp/comprehensive-and-progressive-agreement-for-trans-pacific-partnership" rel="noreferrer noopener" target="_blank"&gt;Comprehensive and Progressive Trans-Pacific Partnership (CPTPP)&lt;/a&gt; that President Trump backed out of in 2017. Several IPEF members also have bilateral or trilateral trading agreements among themselves, an example being the &lt;a href="https://www.mfat.govt.nz/en/trade/free-trade-agreements/free-trade-agreements-in-force/digital-economy-partnership-agreement-depa/" rel="noreferrer noopener" target="_blank"&gt;Digital Economic Partnership Agreement (DEPA)&lt;/a&gt; between Singapore, New Zealand and Chile.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;img src="https://cis-india.org/home-images/Pie.png" alt="Pie" class="image-inline" title="Pie" /&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;All these ‘mega-regional’ trading agreements contain provisions on data flows, including prohibitions on domestic legal provisions that mandate local computing facilities or restrict cross-border data transfers. Notably, these agreements also incorporate &lt;a href="https://publications.clpr.org.in/the-philosophy-and-law-of-information-regulation-in-india/chapter/indias-engagement-with-global-trade-regimes-on-cross-border-data-flows/" rel="noreferrer noopener" target="_blank"&gt;exceptions&lt;/a&gt; to these rules. The CPTPP includes within its ambit an exception on the grounds of ‘legitimate public policy objectives’ of the member, while the RCEP incorporates an additional exception for ‘essential security interests’.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;IPEF members are also spearheading &lt;a href="https://www.hinrichfoundation.com/research/article/wto/can-the-wto-build-consensus-on-digital-trade/" rel="noreferrer noopener" target="_blank"&gt;multilateral efforts &lt;/a&gt;related to the digital economy: Australia, Japan and Singapore are working as convenors of the plurilateral Joint Statement Initiative (JSI) at the World Trade Organization (WTO), which counts 86 WTO members as parties. India (along with South Africa) vehemently &lt;a href="https://docs.wto.org/dol2fe/Pages/SS/directdoc.aspx?filename=q:/WT/GC/W819.pdf&amp;amp;Open=True" rel="noreferrer noopener" target="_blank"&gt;opposes&lt;/a&gt; this plurilateral push on the grounds that the WTO is a multilateral forum functioning on consensus and a plurilateral trade agreement should not be negotiated within the aegis of the WTO. They fear, rightly, that such gambits close out the domestic policy space, especially for evolving digital economy regimes where keen debate and contestation exist among domestic stakeholders. While wary of the implications of the JSI, other IPEF members, such as Indonesia, have cautiously joined the initiative to ensure that they have a voice at the table.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;It is unlikely that the IPEF will lead to a digital trade arrangement in the short run. Policymaking on issues as complex as the digital economy that must respond to specific social, economic and (geo)political realities cannot be steamrolled through external trade agreements. For instance, after the Los Angeles Ministerial India &lt;a href="https://www.business-standard.com/article/economy-policy/india-opts-out-of-joining-ipef-trade-pillar-to-wait-for-final-contours-122091000344_1.html" rel="noreferrer noopener" target="_blank"&gt;opted out&lt;/a&gt; of the IPEF trade pillar citing both India’s evolving domestic legislative framework on data and privacy as well as a broader lack of consensus among IPEF members on several issues, including digital trade. Commerce Minister Piyush Goyal explained that India would wait for the “&lt;a href="https://pib.gov.in/PressReleasePage.aspx?PRID=1858243" rel="noreferrer noopener" target="_blank"&gt;final contours&lt;/a&gt;” of the digital trade track to emerge before making any commitments.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Besides, brokering a trade agreement through the IPEF runs a risk of redundancy. Already, there exists a ‘&lt;a href="https://www.rieti.go.jp/en/columns/a01_0193.html" rel="noreferrer noopener" target="_blank"&gt;spaghetti bowl’&lt;/a&gt; of regional trading agreements that IPEF members can choose from, in addition to forming bilateral trade ties with each other.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This is why Washington has been clear about calling the IPEF an ‘&lt;a href="https://theprint.in/diplomacy/india-set-to-join-us-led-indo-pacific-economic-arrangement-next-week-with-aim-to-counter-china/963795/" rel="noreferrer noopener" target="_blank"&gt;economic arrangement&lt;/a&gt;’ and not a trade agreement. Membership does not imply any legal obligations. Rather than duplicating ongoing efforts or setting unrealistic targets, the IPEF is an opportunity for all players to shape conversations, share best practices and reach compromises, which could feed back into ongoing efforts to negotiate trade deals. For example, several members of RCEP have domestic data localisation mandates that do not violate trade deals because the agreement carves out exceptions that legitimise domestic policy decisions. Exchanges on how these exceptions work in future trade agreements could be a part of the IPEF arrangement and nudge states towards framing digital trade negotiations through other channels, including at the WTO. Furthermore, states like Singapore that have launched AI self-governance mechanisms could share best practices on how these mechanisms were developed as well as evaluations of how they have helped policy goals be met. And these exchanges shouldn’t be limited to existing IPEF members. If the forum works well, countries that share strategic interests in the region with IPEF members, including, most notably, the European Union, may also want to get involved and further develop partnerships in the region.&lt;/p&gt;
&lt;h3&gt;&lt;strong&gt;Countering China&lt;/strong&gt;&lt;/h3&gt;
&lt;p&gt;Talking shop on digital trade should certainly not be the only objective of the IPEF. The US has made it clear that it wants the message emanating from the IPEF ‘&lt;a href="https://www.business-standard.com/article/international/biden-to-visit-japan-for-quad-summit-to-have-bilateral-meetings-with-modi-122051900128_1.html" rel="noreferrer noopener" target="_blank"&gt;to be heard in Beijing&lt;/a&gt;’. Indeed, the IPEF offers an opportunity for the reassertion of US economic interests in a region where President Trump’s withdrawal from the CPTPP has left a vacuum for China to fill. Accordingly, it is no surprise that the IPEF has representation from several regions of the Indo-Pacific: South Asia, Southeast Asia and the Pacific.&lt;/p&gt;
&lt;p&gt;This should be an urgent policy priority for all IPEF members. Since its initial announcement in 2015, the &lt;a href="https://www.cfr.org/china-digital-silk-road/" rel="noreferrer noopener" target="_blank"&gt;Digital Silk Road (DSR)&lt;/a&gt;, the digital arm of China’s Belt and Road Initiative, has spearheaded &lt;a href="https://www.iiss.org/blogs/research-paper/2021/02/china-digital-silk-road-implications-for-defence-industry" rel="noreferrer noopener" target="_blank"&gt;massive investments&lt;/a&gt; by the Chinese private sector (allegedly under close control of the Chinese state) in e-commerce, fintech, smart cities, data centres, fibre optic cables and telecom networks. This expansion has also happened in the Indo-Pacific, unhampered by China’s aggressive geopolitical posturing in the region through maritime land grabs in the South China Sea. With the exception of &lt;a href="https://www.scmp.com/news/asia/southeast-asia/article/3024479/vietnam-shuns-huawei-it-seeks-build-aseans-first-5g" rel="noreferrer noopener" target="_blank"&gt;Vietnam&lt;/a&gt;, which remains wary of China’s economic expansionism, countries in Southeast Asia welcome Chinese investments, extolling their developmental benefits. Several IPEF members – &lt;a href="https://www.iseas.edu.sg/wp-content/uploads/2022/05/ISEAS_Perspective_2022_57.pdf" rel="noreferrer noopener" target="_blank"&gt;including&lt;/a&gt; Indonesia, Malaysia and Singapore – have associations with Chinese private sector companies, predominantly Huawei and ZTE. A &lt;a href="https://carnegieendowment.org/2022/07/11/localization-and-china-s-tech-success-in-indonesia-pub-87477" rel="noreferrer noopener" target="_blank"&gt;study&lt;/a&gt; evaluating Indonesia’s response to such investments indicates that while Indonesian policymakers are aware of the risks posed by Chinese infrastructure, their calculus remains unaltered: development and capacity building remain their primary focuses. Furthermore, on the specific question of surveillance, given evidence that other countries, such as the US and Australia, also use digital infrastructure for surveillance, the threat from China is not perceived as a unique risk.&lt;/p&gt;
&lt;h3&gt;&lt;strong&gt;Setting expectations and approaches&lt;/strong&gt;&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Still, the risks of excessive dependence on one country for the development of digital infrastructure are well known. While the IPEF cannot realistically expect to displace the DSR, it can be utilised to provide countries with alternatives. This can only be done by issuing carrots rather than sticks. A US narrative extolling ‘digital democracy’ is unlikely to gain traction in a region characterised by a diversity of political systems that is focused on economic and development needs. At the same time, an excessive focus on thorny domestic policy issues – such as data localisation and the pipe dream of yet another mega-regional trade deal – could risk derailing the geo-economic benefits of the IPEF.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Instead, the IPEF must focus on capacity building, training and private sector investment in infrastructure across the Indo-Pacific. The US must position itself as a geopolitically reliable ally, interested in the overall stability of the digital Indo-Pacific, beyond its own economic or policy preferences. This applies equally to other external actors, like the EU, who may be interested in engaging with or shaping the digital economic landscape in the Indo-Pacific.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Countering Chinese economic influence and complementing security agendas set through other fora – such as the Quadrilateral Security Dialogue – should be the primary objective of the IPEF. It is crucial that unrealistic ambitions seeking convergence on values or domestic policy do not undermine strategic interests and dilute the immense potential of the IPEF in catalysing a more competitive and secure digital Indo-Pacific.&lt;/p&gt;
&lt;h3&gt;&lt;strong&gt;Table: Domestic policy positions on data localisation and data protection&lt;/strong&gt;&lt;/h3&gt;
&lt;p&gt;&lt;img src="https://cis-india.org/home-images/Table.png/@@images/8e9a5192-5f6c-4666-8d78-e0863111534a.png" alt="Table" class="image-inline" title="Table" /&gt;&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/directions-cyber-digital-europe-arindrajit-basu-september-16-2022-getting-the-digital-indo-pacific-economic-framework-right'&gt;https://cis-india.org/internet-governance/blog/directions-cyber-digital-europe-arindrajit-basu-september-16-2022-getting-the-digital-indo-pacific-economic-framework-right&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>arindrajit</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Digital Governance</dc:subject>
    
    
        <dc:subject>Digital Economy</dc:subject>
    

   <dc:date>2022-10-03T14:56:22Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/nha-data-sharing-guidelines">
    <title>NHA Data Sharing Guidelines – Yet Another Policy in the Absence of a Data Protection Act</title>
    <link>https://cis-india.org/internet-governance/blog/nha-data-sharing-guidelines</link>
    <description>
        &lt;b&gt;In July this year, the National Health Authority (NHA) released the NHA Data Sharing Guidelines for the Pradhan Mantri Jan Aarogya Yojana (PM-JAY) just two months after publishing the draft Health Data Management Policy.&lt;/b&gt;
        &lt;p&gt;Reviewed and edited by Anubha Sinha&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;Launched in 2018, PM-JAY is a public health insurance scheme set to cover 10 crore poor and vulnerable families across the country for secondary and tertiary care hospitalisation. Eligible candidates can use the scheme to avail of cashless benefits at any public/private hospital falling under this scheme. Considering the scale and sensitivity of the data, the creation of a well-thought-out data-sharing document is a much-needed step. However, the document – though only a draft – has certain portions that need to be reconsidered, including parts that are not aligned with other healthcare policy documents. In addition, the guidelines should be able to work in tandem with the Personal Data Protection Act whenever it comes into force. With no prior intimation of the publication of the guidelines, and the provision of a mere 10 days for consultation, there was very little scope for stakeholders to submit their comments and participate in the consultation. While the guidelines pertain to the PM-JAY scheme, it is an important document to understand the government’s concerns and stance on the sharing of health data, especially by insurance companies.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Definitions: Ambiguous and incompatible with similar policy documents&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The draft guidelines add to the list of health data–related policies that have been published since the beginning of the pandemic. These include three draft health data management policies published within two years, which have already covered the sharing and management of health data. The draft guidelines repeat the pattern of earlier policies on health data, wherein there is no reference to the policies that predated it; in this case, the guidelines fail to refer to the draft National Digital Health Data Management Policy (published in April 2022). To add to this, the document – by placing the definitions at the end – is difficult to read and understand, especially when terms such as ‘beneficiary’, ‘data principal’, and ‘individual’ are used interchangeably. In the same vein, the document uses the terms ‘data principal’ and ‘data fiduciary’, and the definitions of health data and personal data, from the 2019 PDP Bill, while also referring to the IT Act SDPI Rules and its definition of ‘sensitive personal data’. While the guidelines state that the IT Act and Rules will be the legislation to refer to for these guidelines, it is to be noted that the IT Act under the SPDI Rules covers ‘body corporates’, which under Section 43A(1), is defined as “any company and includes a firm, sole proprietorship or other association of individuals engaged in commercial or professional activities;”. It is difficult to add responsibility and accountability to the organisations under the guidelines when they might not even be covered under this definition.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;With each new policy, civil society organisations have been pointing out the need to have a data protection act before introducing policies and guidelines that deal with the processing and sharing of the data of individuals. Ideally, these policies – even in draft form – should have been published after the Personal Data Protection Bill was enacted, to ensure consistency with the provisions of the law. For example, the guidelines introduce a new category of governance mechanisms under the data-sharing committee headed by a data-sharing officer (DSO). The responsibilities and powers of the DSO are similar to that of the data protection officer under the draft PDP Bill as well as the National Data Health Management Policy (NHDMP). This, in turn, raises the question of whether the DSO and the DPOs under both the PDP Bill and the draft NDMP will have the same responsibilities. Clarity in terms of which of the policies are in force and how they intersect is needed to ensure a smooth implementation. Ideally, having multiple sources of definitions should be addressed at the drafting stage itself.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Guiding Principles: Need to look beyond privacy&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The guidelines enumerate certain principles to govern the use, collection, processing, and transmission of the personal or sensitive personal data of beneficiaries. These principles are accountability, privacy by design, choice and consent, openness/transparency, etc. While these provisions are much needed, their explanation at times misses the mark of why these principles were added. For example, in the case of accountability, the guidelines state that the ‘data fiduciary’ shall be accountable for complying with measures based on the guiding principles However, it does not specify who the fiduciaries would be accountable to and what the steps are to ensure accountability. Similarly, in the case of openness and transparency, the guidelines state that the policies and practices relating to the management of personal data will be available to all stakeholders. However, openness and transparency need to go beyond policies and practices and should consider other aspects of openness, including open data and the use of open-source software and open standards. This again will add to transparency, in that it would specify the rights of the data principal, as the current draft looks at the rights of the data principal merely from a privacy perspective. In the case of purpose limitation as well, the guidelines are tied to the privacy notice, which again puts the burden on the individual (in this case, beneficiary) when the onus should actually be on the data fiduciary. Lastly, under the empowerment of beneficiaries, the guidelines state that the “data principal shall be able to seek correction, amendments, or deletion of such data where it is inaccurate;”. The right to deletion should not be conditional on inaccuracy, especially when entering the scheme is optional and consent-based.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Data sharing with third parties without adequate safeguards&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The guidelines outline certain cases where personal data can be collected, used, or disclosed without the consent of the individual. One of these cases is when the data is anonymised. However, the guidelines do not detail how this anonymisation would be achieved and ensured through the life cycle of the data, especially when the clause states that the data will also be collected without consent. The guidelines also state that the anonymised data could be used for public health management, clinical research, or academic research. The guidelines should have limited the scope of academic research or added certain criteria to gain access to the data; the use of vague terminology could lead to this data (sometimes collected without consent) being de-anonymised or used for studies that could cause harm to the data principal or even a particular community. The guidelines state that the data can be shared as ‘protected health information’ with a government agency for oversight activities authorised by law, epidemic control, or in response to court orders. With the sharing of data, care should be taken to ensure data minimisation and purpose limitations that go beyond the explanations added in the body of the guidelines. In addition, the guidelines also introduce the concept of a ‘clean room’, which is defined as “a secure sandboxed area with access controls, where aggregated and anonymised or de-identified data may be shared for the purposes of developing inference or training models”. The definition does not state who will be developing these training models; it could be a cause of worry if AI companies or even insurance companies have the potential to use this data to train models that could eventually make decisions based on the results. 
The term ‘sandbox’ is explained under the now withdrawn DP Bill, 2021 as “such live testing of new products or services in a controlled or test regulatory environment for which the Authority may or may not permit certain regulatory relaxations for a specified period for the limited purpose of the testing”. Neither the 2019 Bill nor the IT Act/Rules defines ‘sandbox’; ideally, the guidelines should have explained in greater detail how the sandbox system in the ‘Clean Room’ works.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Conclusion&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The draft Data Sharing Guidelines are a welcome step in ensuring that the entities sharing and processing data have guidelines to adhere to, especially since the Data Protection Bill has not been passed yet. The mention of the best practices for data sharing in annexures, including practices for people who have access to the data, is a step in the right direction, which could be made better with regular training and sensitisation. While the guidelines are a good starting point, they still suffer from the issues that have been highlighted in similar health data policies, including not referring to older policies, adding new entities, and the reliance on digital and mobile technology. The guidelines could have added more nuance to the consent and privacy by design sections to ensure other forms of notice, e.g., notice in audio form in different Indian languages. While PM-JAY aims to reach 10 crore poor and vulnerable families, there is a need to look at how to ensure that consent is given according to the guidelines that are “free, informed, clear, and specific”.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/nha-data-sharing-guidelines'&gt;https://cis-india.org/internet-governance/blog/nha-data-sharing-guidelines&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Shweta Mohandas and Pallavi Bedi</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>IT Act</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2022-09-29T15:17:24Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/surveillance-enabling-identity-systems-in-africa-tracing-the-fingerprints-of-aadhaar">
    <title>Surveillance Enabling Identity Systems in Africa: Tracing the Fingerprints of Aadhaar</title>
    <link>https://cis-india.org/internet-governance/blog/surveillance-enabling-identity-systems-in-africa-tracing-the-fingerprints-of-aadhaar</link>
    <description>
        &lt;b&gt;Biometric identity systems are being introduced around the world with a focus on promoting human development and social and economic inclusion, rather than the earlier goal of security. As a result, these systems are being encouraged in developing countries, particularly in Africa and Asia, sometimes with disastrous consequences.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;In this report, we       identify the different external actors that influencing this       “developmental” agenda. These range from philanthropic       organisations, private companies, and technology vendors, to state       and international institutions. Most notable among these is the       World Bank, whose influence we investigated in the form of case       studies of Nigeria and Kenya. We also explored the role played by       the “success” of the Aadhaar programme in India on these new ID       systems. A key characteristic of the growing “digital identity for       development” trend is the consolidation of different databases       that record beneficiary data for government programmes into one       unified platform, accessed by a unique biometric ID. This “Aadhaar       model” has emerged as a default model to be adopted in developing       countries, with little concern for the risks it introduces. Read       and download the full report &lt;a href="https://cis-india.org/internet-governance/surveillance-enabling-identity-systems-in-africa" class="internal-link"&gt;here&lt;/a&gt;.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/surveillance-enabling-identity-systems-in-africa-tracing-the-fingerprints-of-aadhaar'&gt;https://cis-india.org/internet-governance/blog/surveillance-enabling-identity-systems-in-africa-tracing-the-fingerprints-of-aadhaar&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Shruti Trikanad and Vrinda Bhandari</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Surveillance</dc:subject>
    
    
        <dc:subject>Aadhaar</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2022-08-09T08:17:32Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/deployment-of-digital-health-policies-and-technologies-during-covid-19">
    <title>Deployment of Digital Health Policies and Technologies: During Covid-19</title>
    <link>https://cis-india.org/internet-governance/blog/deployment-of-digital-health-policies-and-technologies-during-covid-19</link>
    <description>
        &lt;b&gt;In the last twenty years or so, the Indian government has adopted several digital mechanisms to deliver services to its citizens. &lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;Digitisation of public services in India began with taxation, land record keeping, and passport details recording, but it was soon extended to cover most governmental services - with the latest being public health. The digitisation of healthcare system in India had begun prior to the pandemic. However, given the push digital health has received in recent years especially with an increase in the intensity of activity during the pandemic, we thought it is important to undertake a comprehensive study of India's digital health policies and implementation. The project report comprises a desk-based research review of the existing literature on digital health technologies in India and interviews with on-field healthcare professionals who are responsible for implementing technologies on the ground.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;The report by Privacy International and the Centre for Internet &amp;amp; Society can be &lt;a href="https://cis-india.org/internet-governance/deployment-of-digital-health-policies-and-technologies" class="internal-link"&gt;&lt;strong&gt;accessed here&lt;/strong&gt;&lt;/a&gt;.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/deployment-of-digital-health-policies-and-technologies-during-covid-19'&gt;https://cis-india.org/internet-governance/blog/deployment-of-digital-health-policies-and-technologies-during-covid-19&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>pallavi</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Digitalisation</dc:subject>
    
    
        <dc:subject>Digital Health</dc:subject>
    
    
        <dc:subject>Digital Knowledge</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Digital Media</dc:subject>
    
    
        <dc:subject>Digital Technologies</dc:subject>
    
    
        <dc:subject>Digitisation</dc:subject>
    

   <dc:date>2022-07-21T14:49:56Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/national-data-governance-framework-policy">
    <title>The Government’s Increased Focus on Regulating Non-Personal Data: A Look at the Draft National Data Governance Framework Policy </title>
    <link>https://cis-india.org/internet-governance/blog/national-data-governance-framework-policy</link>
    <description>
        &lt;b&gt;Digvijay Chaudhary and Anamika Kundu wrote an article on the National Data Governance Framework Policy. It was edited by Shweta Mohandas.&lt;/b&gt;
        &lt;h2&gt;Introduction&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;Non Personal Data (‘NPD’) can be &lt;a href="https://www.taylorfrancis.com/chapters/edit/10.4324/9780429022241-8/regulating-non-personal-data-age-big-data-bart-van-der-sloot"&gt;understood&lt;/a&gt; as any information not relating to an identified or identifiable natural person. The origin of such data can be both human and non-human. Human NPD would be such data which has been anonymised in such a way that the person to whom the data relates cannot be re-identified. Non-human NPD would mean any such data that did not relate to a human being in the first place, for example, weather data. There has been a gradual demonstrated interest in NPD by the government in recent times. This new focus on regulating non personal data can be owed to the economic incentive it provides. In its report, the Sri Krishna committee, released in 2018 agreed that NPD holds considerable strategic or economic interest for the nation, however, it left the questions surrounding NPD to a future committee.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;History of NPD Regulation&lt;/h2&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;In 2020, the Ministry of Electronics and Information Technology (‘MEITY’) constituted an expert committee (‘NPD Committee’) to study various issues relating to NPD and to make suggestions on the regulation of non-personal data. The NPD Committee differentiated NPD into human and non-human NPD, based on the data’s origin. Human NPD would include all information that has been stripped of any personally identifiable information and non-human NPD meant any information that did not contain any personally identifiable information in the first place (eg. weather data). The final report of the NPD Committee is awaited but the Committee came out with a &lt;a href="https://static.mygov.in/rest/s3fs-public/mygov_160922880751553221.pdf"&gt;revised draft&lt;/a&gt; of its recommendations in December 2020. In its December 2020 report, the NPD Committee proposed the creation of a National Data Protection Authority (‘NPDA’) as it felt this is a new and emerging area of regulation. Thereafter, the Joint Parliamentary Committee  on the Personal Data Protection Bill, 2019 (‘JPC’) came out with its &lt;a href="http://164.100.47.193/lsscommittee/Joint%20Committee%20on%20the%20Personal%20Data%20Protection%20Bill,%202019/17_Joint_Committee_on_the_Personal_Data_Protection_Bill_2019_1.pdf"&gt;version of the Data Protection Bill &lt;/a&gt;where it amended the short title of the PDP Bill 2019 to Data Protection Bill, 2021 widening the ambit of the Bill to include all types of data. The JPC report focuses only on human NPD, noting that non-personal data is essentially derived from one of the three sets of data - personal data, sensitive personal data, critical personal data - which is either anonymized or is in some way converted into non-re-identifiable data.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;On February 21, 2022,  the Ministry of Electronics and Information Technology (‘MEITY’) came out with the &lt;a href="https://www.meity.gov.in/content/draft-india-data-accessibility-use-policy-2022"&gt;Draft India Data Accessibility and Use Policy, 2022&lt;/a&gt; (‘Draft Policy’). The Draft Policy was strongly criticised mainly due to its aims to monetise data through its sale and licensing to body corporates. The Draft Policy had stated that anonymised and non-personal data collected by the State that has “&lt;a href="https://www.medianama.com/2022/06/223-new-data-governance-policy-privacy/"&gt;undergone value addition&lt;/a&gt;” could be sold for an “appropriate price”. During the Draft Policy’s consultation process, it had been withdrawn several times and then finally removed from the website.&lt;a href="https://www.meity.gov.in/writereaddata/files/Draft%20India%20Data%20Accessibility%20and%20Use%20Policy_0.pdf"&gt; The National Data Governance Framework Policy&lt;/a&gt; (‘NDGF Policy’) is a successor to this Draft Policy. There is a change in the language put forth in the NDGF Policy from the Draft Policy, where the latter mainly focused on monetary growth. The new NDGF Policy aims to regulate anonymised non-personal data (‘NPD’) kept with governmental authorities and make it accessible for research and improving governance. It wishes to create an ‘India Datasets programme’ which will consist of the aforementioned datasets. While  MEITY has opened the draft for public comments, is a need to spell out the procedure in some ways for stakeholders to draft recommendations for the NDGF policies in an informed manner. Through this piece, we discuss the NDGF Policy in terms of issues related to the absence of a comprehensive Data Protection Framework in India and the jurisdictional overlap of authorities under the NDGF Policy and DPB.&lt;/p&gt;
&lt;h2 dir="ltr" style="text-align: justify; "&gt;What the National Data Governance Framework Policy Says&lt;/h2&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;Presently in India, NPD is stored in a variety of governmental departments and bodies. It is difficult to access and use this stored data for governmental functions without modernising collection and management of governmental data. Through the NDGF Policy, the government aims to build an Indian data storehouse of anonymised non-personal datasets and make it accessible for both improving governance and encouraging research. It imagines the establishment of an Indian Data Office (‘IDO’)  set up by MEITY , which shall be responsible for consolidating data access and sharing of non-personal data across the government. In addition, it also mandates a Data Management Unit for every Ministry/department that would work closely with the IDO. IDO will also be responsible for issuing protocols for sharing NPD. The policy further imagines an Indian Data Council (‘IDC’) whose function would be to define frameworks for important datasets, finalise data standards, and Metadata standards and also review the implementation of the policy. The NDGF Policy has provided a broad structure concerning the setting up of anonymisation standards, data retention policies, data quality, and data sharing toolkit. The NDGF Policy states that these standards shall be developed and notified by the IDO or MEITY or the Ministry in question and need to be adhered to by all entities.&lt;/p&gt;
&lt;h2 dir="ltr" style="text-align: justify; "&gt;The Data Protection Framework in India&lt;/h2&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;The report adopted by the JPC, felt that it is simpler to enact a single law and a single regulator to oversee all the data that originates from any data principal and is in the custody of any data fiduciary. According to the JPC, the draft Bill deals with various kinds of data at various levels of security. The JPC also recommended that since the Data Protection Bill (‘DPB’) will handle both personal and non-personal data, any further policy / legal framework on non-personal data may be made a part of the same enactment instead of any separate legislation. The draft DPB states that what is to be done with the NDP shall be decided by the government from time to time according to its policy. As such, neither the DPB, 2021 nor the NDGF Policy go into details of regulating NPD but only provide a broad structure of facilitating free-flow of NPD, without taking into account the &lt;a href="https://cis-india.org/internet-governance/cis-comments-revised-npd-report/view"&gt;specific concerns&lt;/a&gt; that have been raised since the NPD committee came out with its draft report on regulating NPD dated December 2020.&lt;/p&gt;
&lt;h2 dir="ltr" style="text-align: justify; "&gt;Jurisdictional overlaps among authorities and other concerns&lt;/h2&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;Under the NDGF policy, all guidelines and rules shall be published by a body known as the Indian Data Management Office (‘IDMO’). The IDMO is set to function under the MEITY and work with the Central government, state governments and other stakeholders to set standards. Currently, there is no sign of when the DPB will be passed as law. According to the JPC, the reason for including NPD within the DPB was because of the impossibility to differentiate between PD and NPD. There are also certain overlaps between the DPB and the NDGF which are not discussed by the NDGF. NDGF does not discuss the overlap between the IDMO and Data Protection Authority (‘DPA’) established under the DPB 2021.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;Under the DPB, the DPA is tasked with specifying codes of practice under clause 49. On the other hand, the NDGF has imagined the setting up of IDO, IDMO, and the IDC, which shall be responsible for issuing codes of practice such as data retention, and data anonymisation, and data quality standards. As such, there appears to be some overlap in the functions of the to-be-constituted DPA and the NDGF Policy.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;Furthermore, while the NDGF Policy aims to promote openness with respect to government data, there is a conflict with &lt;a href="https://opengovdata.org/"&gt;open government data (‘OGD’) principle&lt;/a&gt;s when there is a price attached to such data. OGD is data which is collected and processed by the government for free use, reuse and distribution. Any database created by the government must be publicly accessible to ensure compliance with the OGD principles.&lt;/p&gt;
&lt;h2 dir="ltr" style="text-align: justify; "&gt;Conclusion&lt;/h2&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;Streamlining datasets across different authorities is a huge challenge for the government and hence the NGDF policy in its current draft requires a lot of clarification. The government can take inspiration from the European Union which in 2018, came out with a principles-based approach coupled with self-regulation on the framework of the free flow of non-personal data. The &lt;a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52019DC0250&amp;amp;from=EN"&gt;guidance&lt;/a&gt; on the free-flow of non-personal data defines non-personal data based on the origin of data - data which originally did not relate to any personal data (non-human NPD) and data which originated from personal data but was subsequently anonymised (human NPD). The &lt;a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52019DC0250&amp;amp;from=EN"&gt;regulation&lt;/a&gt; further realises the reality of mixed data sets and regulates only the non-personal part of such datasets and where the datasets are inextricably linked, the GDPR would apply to such datasets. Moreover, any policy that seeks to govern the free flow of NPD ought to make it clear that in case of re-identification of anonymised data, such re-identified data would be considered personal data. The DPB, 2021 and the NGDF, both fail to take into account this difference.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/national-data-governance-framework-policy'&gt;https://cis-india.org/internet-governance/blog/national-data-governance-framework-policy&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Digvijay Chaudhary and Anamika Kundu</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Open Data</dc:subject>
    
    
        <dc:subject>Open Government Data</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2022-06-30T13:24:35Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/making-voices-heard">
    <title>Making Voices Heard</title>
    <link>https://cis-india.org/internet-governance/blog/making-voices-heard</link>
    <description>
        &lt;b&gt;We are happy to announce the launch of our final report on the study ‘Making Voices Heard: Privacy, Inclusivity, and Accessibility of Voice Interfaces in India’. The study was undertaken with support from the Mozilla Corporation.&lt;/b&gt;
        &lt;p style="text-align: center; "&gt;&lt;img src="https://cis-india.org/home-images/WebsiteHeader.jpg/@@images/8d8ed2a0-f0e4-44d7-8938-493b186402c5.jpeg" alt="Making Voices Heard" class="image-inline" title="Making Voices Heard" /&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;We believe that voice interfaces have the potential to democratise the use of the internet by addressing limitations related to reading and writing on digital text-only platforms and devices. This report examines the current landscape of voice interfaces in India, with a focus on concerns related to privacy and data protection, linguistic barriers, and accessibility for persons with disabilities (PwDs).&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The report features a visual mapping of 23 voice interfaces and technologies publicly available in India, along with a literature survey, a policy brief towards development and use of voice interfaces and a design brief documenting best practices and users’ needs, both with a focus on privacy, languages, and accessibility considerations, and a set of case studies on three voice technology platforms. &lt;span&gt;Read and download the full report &lt;a class="external-link" href="http://voice.cis-india.org/"&gt;here&lt;/a&gt;&lt;/span&gt;&lt;/p&gt;
&lt;hr /&gt;
&lt;h3&gt;Credits&lt;/h3&gt;
&lt;p&gt;&lt;strong&gt;Research&lt;/strong&gt;: Shweta Mohandas, Saumyaa Naidu, Deepika Nandagudi Srinivasa, Divya Pinheiro, and Sweta Bisht.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Conceptualisation, Planning, and Research Inputs&lt;/strong&gt;: Sumandro Chattapadhyay, and Puthiya Purayil Sneha.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Illustration&lt;/strong&gt;: Kruthika NS (Instagram @theworkplacedoodler).&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Website Design&lt;/strong&gt;: Saumyaa Naidu.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Website Development&lt;/strong&gt;: Sumandro Chattapadhyay, and Pranav M Bidare.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Review and Editing&lt;/strong&gt;: Puthiya Purayil Sneha, Divyank Katira, Pranav M Bidare, Torsha Sarkar, Pallavi Bedi, and Divya Pinheiro.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Copy Editing&lt;/strong&gt;: The Clean Copy&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/making-voices-heard'&gt;https://cis-india.org/internet-governance/blog/making-voices-heard&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>shweta</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Voice User Interface</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Accessibility</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Research</dc:subject>
    
    
        <dc:subject>Featured</dc:subject>
    
    
        <dc:subject>Homepage</dc:subject>
    

   <dc:date>2022-06-27T16:18:36Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>




</rdf:RDF>
