<?xml version="1.0" encoding="utf-8" ?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:syn="http://purl.org/rss/1.0/modules/syndication/" xmlns="http://purl.org/rss/1.0/">
<channel rdf:about="https://cis-india.org/internet-governance/blog/online-anonymity/search_rss">
  <title>We are anonymous, we are legion</title>
  <link>https://cis-india.org</link>
  
  <description>These are the search results for the query, showing results 31 to 45.</description>
  
  <image rdf:resource="https://cis-india.org/logo.png"/>

  <items>
    <rdf:Seq>
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/deployment-of-digital-health-policies-and-technologies-during-covid-19"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/what-are-the-consumer-protection-concerns-with-crypto-assets"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/comments-to-draft-amendments-to-the-it-rules-2021"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/national-data-governance-framework-policy"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/making-voices-heard"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/comments-to-the-draft-national-health-data-management-policy-2.0"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/rssr-anamika-kundu-digvijay-s-chaudhary-april-20-2022-cctvs-in-public-spaces-and-data-protection-bill-2021"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/rethinking-acquisition-of-digital-devices-by-law-enforcement-agencies"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/comments-to-the-draft-motor-vehicle-aggregators-scheme-2021"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/news-nine-shweta-mohandas-and-anamika-kundu-personal-data-protection-bill-must-examine-data-collection-practices-that-emerged-during-pandemic"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/ijlt-shweta-mohandas-and-anamika-kundu-march-6-2022-nothing-to-kid-about-childrens-data-under-the-new-data-protection-bill"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/medianama-february-21-2022-amber-sinha-data-protection-bill-digital-healthcare-case-study"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/medianama-february-18-2021-amber-sinha-data-protection-bill-consent-clause-state-function"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/pallavi-bedi-and-shweta-mohandas-cis-comments-on-data-protection-bill"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/the-hindu-arindrajit-basu-february-8-2022-notes-for-india-as-the-digital-trade-juggernaut-rolls-on"/>
        
    </rdf:Seq>
  </items>

</channel>


    <item rdf:about="https://cis-india.org/internet-governance/blog/deployment-of-digital-health-policies-and-technologies-during-covid-19">
    <title>Deployment of Digital Health Policies and Technologies: During Covid-19</title>
    <link>https://cis-india.org/internet-governance/blog/deployment-of-digital-health-policies-and-technologies-during-covid-19</link>
    <description>
        &lt;b&gt;In the last twenty years or so, the Indian government has adopted several digital mechanisms to deliver services to its citizens. &lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;Digitisation of public services in India began with taxation, land records, and passport details, but was soon extended to cover most governmental services, the latest being public health. The digitisation of India's healthcare system had begun before the pandemic. However, given the push digital health has received in recent years, especially the intensified activity during the pandemic, we thought it important to undertake a comprehensive study of India's digital health policies and their implementation. The project report comprises a desk-based review of the existing literature on digital health technologies in India and interviews with healthcare professionals responsible for implementing these technologies on the ground.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;The report by Privacy International and the Centre for Internet &amp;amp; Society can be &lt;a href="https://cis-india.org/internet-governance/deployment-of-digital-health-policies-and-technologies" class="internal-link"&gt;&lt;strong&gt;accessed here&lt;/strong&gt;&lt;/a&gt;.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/deployment-of-digital-health-policies-and-technologies-during-covid-19'&gt;https://cis-india.org/internet-governance/blog/deployment-of-digital-health-policies-and-technologies-during-covid-19&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>pallavi</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Digitalisation</dc:subject>
    
    
        <dc:subject>Digital Health</dc:subject>
    
    
        <dc:subject>Digital Knowledge</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Digital Media</dc:subject>
    
    
        <dc:subject>Digital Technologies</dc:subject>
    
    
        <dc:subject>Digitisation</dc:subject>
    

   <dc:date>2022-07-21T14:49:56Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/what-are-the-consumer-protection-concerns-with-crypto-assets">
    <title>What Are The Consumer Protection Concerns With Crypto-Assets?</title>
    <link>https://cis-india.org/internet-governance/blog/what-are-the-consumer-protection-concerns-with-crypto-assets</link>
    <description>
        &lt;b&gt;Existing consumer protection regulations are not sufficient to cover the extent of protection that a crypto-investor would require.&lt;/b&gt;
        &lt;p&gt;The article was &lt;a class="external-link" href="https://www.medianama.com/2022/07/223-addressing-the-consumer-protection-concerns-associated-with-crypto-assets/"&gt;published in Medianama&lt;/a&gt; on July 8, 2022.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;Crypto-asset regulation is at the forefront of Indian financial regulators’ minds. On the 6th of June, the Securities and Exchange Board of India (SEBI), &lt;a href="https://www.businessinsider.in/investment/news/sebi-raises-concern-on-crypto-says-that-its-decentralised-nature-makes-them-harder-to-regulate/articleshow/92079830.cms"&gt;in a response&lt;/a&gt; to the Parliamentary Standing Committee on Finance, expressed clear consumer protection concerns associated with crypto-assets.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This statement follows &lt;a href="https://www.rbi.org.in/commonman/English/Scripts/PressReleases.aspx?Id=2474"&gt;multiple notices&lt;/a&gt; issued by the Reserve Bank of India (RBI) warning consumers of the risks related to crypto-assets, and even a &lt;a href="https://rbi.org.in/Scripts/NotificationUser.aspx?Id=12103"&gt;failed attempt&lt;/a&gt; to prevent banks from transacting with any individual trading crypto-assets. Yet, in spite of these multiple warnings, and a significant drop in trading volume due to the introduction of a new taxation structure, crypto-assets still have managed to establish themselves as a legitimate financial instrument in the minds of many.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Recent global developments, however, seem to validate the concerns held by both the RBI and SEBI.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The bear market that crypto finds itself in has sent shockwaves throughout the ecosystem, crippling some of the most established tokens in the space. Take, for example, the &lt;a href="https://indianexpress.com/article/technology/crypto/luna-terra-crash-a-brief-history-of-failed-algorithmic-stablecoins-7934293/"&gt;death spiral&lt;/a&gt; of the algorithmic stablecoin Terra USD and its sister token Luna—with Terra USD going from a top-10-traded crypto-token to being practically worthless. The volatility of token prices has had a significant knock-on effect on crypto-related services. Following Terra’s crash, the centralised finance (CeFi) platform Celsius—which provided quasi-banking facilities for crypto holders—also halted all withdrawals. More recently, the crypto-asset hedge fund Three Arrows also filed for bankruptcy following its inability to meet its debt obligations and protect its assets from creditors looking to get their money back.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Underpinning these stories of failing corporations are the very real experiences of investors and consumers—many of whom have lost a &lt;a href="https://www.bloomberg.com/news/articles/2022-05-14/terra-s-45-billion-face-plant-creates-a-crowd-of-crypto-losers"&gt;significant amount of wealth&lt;/a&gt;. This has been a direct result of the messaging around crypto-assets. Crypto-assets have been promoted through popular culture as a means of achieving financial freedom and accruing wealth quickly. It is this narrative that lured numerous regular citizens to invest substantial portions of their income into crypto-asset trading. At the same time, the crypto-asset space is littered with a number of scams and schemes designed to trick unaware consumers. These schemes, primarily taking the form of ‘&lt;a href="https://www.investor.gov/introduction-investing/investing-basics/glossary/pump-and-dump-schemes"&gt;pump and dump&lt;/a&gt;’ schemes, represent a significant issue for investors in the space.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;It seems, therefore, that any attempt to ensure consumer protection in the crypto-space must adopt two key strategies:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;span&gt;First, it must re-orient the narrative away from crypto as a simple means of getting wealthy—and ensure that consumers who invest in crypto do so with full knowledge of the risks associated with crypto-assets;&lt;/span&gt;&lt;/li&gt;
&lt;li&gt;&lt;span&gt;Second, it must provide consumers with sufficient recourse in cases where they have been subject to fraud.&lt;/span&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p style="text-align: justify; "&gt;In this article, we examine the existing regulatory framework around grievance redressal for consumers in India—and whether these safeguards are sufficient to protect consumers trading crypto-assets. We further suggest practical measures that the government can adopt going forward.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;What is the Current Consumer Protection Framework Around Crypto-assets?&lt;/h3&gt;
&lt;p&gt;Safeguards Under the Consumer Protection Act and E-commerce Rules&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;The increased adoption of e-commerce by consumers in India forced legislators to address the lack of regulation for the protection of consumer interests. This legislative expansion may extend to protecting the interests of investors and consumers trading in crypto-assets. &lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The groundwork for consumer welfare was laid in the new Consumer Protection Act, 2019 which defined e-commerce as the “buying or selling of goods or services including digital products over digital or electronic network.” It also empowered the Union Government to take measures and issue rules for the protection of consumer rights and interests, and the prevention of unfair trade practices in e-commerce.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Within a year, the Union Government exercised its power to issue operative rules known as the Consumer Protection (E-Commerce) Rules, 2020 (the “Rules”), which amongst other things, sought to prohibit unfair trade practices across all models of e-commerce. The Rules define an e-commerce entity as one which owns, operates or manages a digital or electronic facility or platform (which includes a website as well as mobile applications) for electronic commerce.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The definition of e-commerce is not limited to physical goods but also covers services and digital products. One can therefore plausibly assume that it would apply to a number of crypto-exchanges, as well as to certain entities offering decentralised finance (DeFi) services. This is because crypto tokens—be they cryptocurrencies like Bitcoin, Ethereum, or Dogecoin—are not considered currency or securities under Indian law, but can be said to be digital products since they are digital goods.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The fact that the digital products being traded on the e-commerce entity originated outside Indian territory would make no difference as far as the applicability of the Rules is concerned. The Rules apply even to e-commerce entities not established in India, but which systematically offer goods or services to consumers in India. The concept of systematically offering goods or services across territorial boundaries appears to have been taken from the E-evidence Directive of the European Union and seeks to target only those entities which intend to do substantial business within India while excluding those who do not focus on the Indian market and have only a minuscule presence here.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Additionally, the Rules impose certain duties and obligations on e-commerce entities, such as:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;span&gt;The appointment of a nodal officer or a senior designated functionary who is resident in India, to ensure compliance with the provisions of the Consumer Protection Act;&lt;/span&gt;&lt;/li&gt;
&lt;li&gt;The prohibition on the adoption of any unfair trading practices, thereby making the most important requirements of consumer protection applicable to e-commerce;&lt;/li&gt;
&lt;li&gt;The establishment of a grievance redressal mechanism and specifying an outer limit of one month for redressal of complaints;&lt;/li&gt;
&lt;li&gt;The prohibition on imposing cancellation charges on the consumer, unless a similar charge is also borne by the e-commerce entity if it cancels the purchase order unilaterally for any reason;&lt;/li&gt;
&lt;li&gt;The prohibition on price manipulation to gain unreasonable profit by imposing an unjustified price on the consumers; &lt;/li&gt;
&lt;li&gt;The prohibition on discrimination between consumers of the same class or an arbitrary classification of consumers that affects their rights; etc.&lt;/li&gt;
&lt;/ul&gt;
&lt;p style="text-align: justify; "&gt;The Rules also impose certain liabilities on e-commerce entities relating to the tracking of shipments, the accuracy of the information on the goods or services being offered, information and ranking of sellers, tracking complaints, and information regarding payment mechanisms. Most importantly, the Rules explicitly make the grievance redressal mechanism under the Consumer Protection Act, 2019 applicable to e-commerce entities in case they violate any of the requirements under the Rules.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;What this means is that at present crypto-exchanges and crypto-service providers clearly fall within the ambit of consumer protection legislation in India. In real terms, this means that consumers can rest assured that in any crypto transaction their rights must be accounted for by the corporation.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;With crypto-related scams &lt;a href="https://www.ftc.gov/news-events/data-visualizations/data-spotlight/2022/06/reports-show-scammers-cashing-crypto-craze"&gt;exploding globally since 2021&lt;/a&gt;, it is likely that Indian investors will come into contact with, or be subject to, various scams and schemes in the crypto marketplace. Therefore, it is imperative that consumers and investors know the steps they can take in case they fall victim to a scam. Currently, any consumer who is the victim of a fraud or scam in the crypto space would, as per the current legal regime, have two primary redressal remedies:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;span&gt;Lodging a criminal complaint with the police, usually the cyber cell, regarding the fraud. It then becomes the police’s responsibility to investigate the case, trace the perpetrators, and ensure that they are held accountable under relevant legal provisions. &lt;/span&gt;&lt;/li&gt;
&lt;li&gt;Lodging a civil complaint before the consumer forum or even the civil courts claiming compensation and damages for the loss caused. In this process, the onus is on the consumer to follow up and prove that they have been defrauded.&lt;/li&gt;
&lt;/ul&gt;
&lt;p style="text-align: justify; "&gt;Filing a consumer complaint may impose an extra burden on the consumer to prove the fraud—especially if the consumer is unable to get complete and accurate information regarding the transaction. Additionally, in most cases, a consumer complaint is filed when the perpetrator is still accessible and can be located by the consumer. If the perpetrator has absconded, however, the consumer would have no choice but to lodge a criminal complaint. That said, if the perpetrators have already absconded, it may be difficult even for the police to be of much help, considering the anonymity that is built into the technology.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Therefore, perhaps the best protection that can be afforded to the consumer is where the regulatory regime is geared towards the prevention of frauds and scams by establishing a licensing and supervisory regime for crypto businesses.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;A Practical Guide to Consumer Protection and Crypto-assets&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;What is apparent is that existing regulations are not sufficient to cover the extent of protection that a crypto-investor would require. Ideally, this gap would be covered by dedicated legislation that looks to cover the range of issues within the crypto-ecosystem. However, in the absence of the (still pending) government crypto bill, we are forced to consider how consumers can currently be protected and made aware of the risks associated with crypto-assets.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;On the question of informing customers of the risks associated, we must address one of the primary means through which consumers become aware of crypto-assets: advertising. Currently, crypto-asset advertising follows a &lt;a href="https://ascionline.in/images/pdf/vda-guidelines-23.02.22.pdf"&gt;code&lt;/a&gt; set down by the &lt;a href="https://www.google.com/search?client=safari&amp;amp;rls=en&amp;amp;q=Advertising+Council+of+India&amp;amp;ie=UTF-8&amp;amp;oe=UTF-8"&gt;Advertising Standards Council of India&lt;/a&gt;, a self-regulating, non-government body. As such, there is currently no government body that enforces binding advertising standards on crypto and crypto-service providers.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;While self-regulation has generally been an acceptable practice in the case of advertising, the advertising of financial products has differed slightly. For example, Schedule VI of the &lt;a href="https://www.sebi.gov.in/acts/mfreg96.html#sch6#sch6"&gt;Securities and Exchange Board of India (Mutual Funds) Regulations, 1996&lt;/a&gt;, lays down detailed guidelines associated with the advertising of mutual funds. Crypto-assets can, depending on their form, perform similar functions to currencies, securities, and assets. Moreover, they carry a clear financial risk—as such their advertising should come under the purview of a recognised financial regulator. In the absence of a dedicated crypto bill, an existing regulator—such as SEBI or the RBI—should use their ad-hoc power to bring crypto-assets and their advertising under their purview.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This would allow the government not only to ensure that advertising guidelines are followed, but also to dictate the exact nature of those guidelines. It could then issue standards pertaining to disclaimers, and prevent crypto-service providers from advertising crypto-assets as easy to understand, as guaranteeing a return on investment, or with other misleading messages.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Moreover, financial institutions such as the RBI and SEBI may consider increasing efforts to inform consumers of the financial and economic risks associated with crypto-assets by undertaking dedicated public awareness campaigns. Strongly enforced advertising guidelines, coupled with widespread and comprehensive awareness efforts, would allow the average consumer to understand the risks associated with crypto-assets, thereby re-orienting the prevailing narrative around them.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;On the question of providing consumers with clear recourse, current financial regulators might consider setting up a joint working group to examine the extent of financial fraud associated with crypto-assets. Such a body can be tasked with providing consumers with clear information related to crypto-asset scams and schemes, how to spot them, and the next steps they must take in case they fall victim to one.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;&lt;em&gt;Aman Nair is a policy officer at the Centre for Internet &amp;amp; Society (CIS), India, focusing on fintech, data governance, and digital cooperative research. Vipul Kharbanda is a non-resident fellow at CIS, focusing on the fintech research agenda of the organisation.&lt;/em&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;/ul&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/what-are-the-consumer-protection-concerns-with-crypto-assets'&gt;https://cis-india.org/internet-governance/blog/what-are-the-consumer-protection-concerns-with-crypto-assets&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Aman Nair and Vipul Kharbanda</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Consumer Rights</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Cryptography</dc:subject>
    

   <dc:date>2022-07-18T15:22:02Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/comments-to-draft-amendments-to-the-it-rules-2021">
    <title>Comments to the draft amendments to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021</title>
    <link>https://cis-india.org/internet-governance/blog/comments-to-draft-amendments-to-the-it-rules-2021</link>
    <description>
        &lt;b&gt;The Centre for Internet &amp;amp; Society (CIS) presented its comments on the draft amendments to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (‘the rules’), which were released for public consultation on 6 June 2022.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;These comments examine whether the proposed amendments adhere to established principles of constitutional law, intermediary liability, and other relevant legal doctrines. We thank the Ministry of Electronics and Information Technology (MEITY) for allowing us this opportunity. Our comments are divided into two parts. In the first part, we reiterate some of our comments on the existing version of the rules, which we believe remain relevant to the proposed amendments as well. In the second part, we provide issue-wise comments that we believe need to be addressed before the amendments to the rules are finalised.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;To access the full text of the Comments to the draft amendments to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, &lt;a href="https://cis-india.org/internet-governance/blog/comments-to-draft-amendments-to-it-rules-2021.pdf" class="internal-link"&gt;click here&lt;/a&gt;&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/comments-to-draft-amendments-to-the-it-rules-2021'&gt;https://cis-india.org/internet-governance/blog/comments-to-draft-amendments-to-the-it-rules-2021&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Anamika Kundu, Digvijay Chaudhary, Divyansha Sehgal, Isha Suri and Torsha Sarkar</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Digital Media</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Intermediary Liability</dc:subject>
    
    
        <dc:subject>Information Technology</dc:subject>
    

   <dc:date>2022-07-07T02:39:28Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/national-data-governance-framework-policy">
    <title>The Government’s Increased Focus on Regulating Non-Personal Data: A Look at the Draft National Data Governance Framework Policy</title>
    <link>https://cis-india.org/internet-governance/blog/national-data-governance-framework-policy</link>
    <description>
        &lt;b&gt;Digvijay Chaudhary and Anamika Kundu wrote an article on the National Data Governance Framework Policy. It was edited by Shweta Mohandas.&lt;/b&gt;
        &lt;h2&gt;Introduction&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;Non-Personal Data (‘NPD’) can be &lt;a href="https://www.taylorfrancis.com/chapters/edit/10.4324/9780429022241-8/regulating-non-personal-data-age-big-data-bart-van-der-sloot"&gt;understood&lt;/a&gt; as any information not relating to an identified or identifiable natural person. The origin of such data can be either human or non-human. Human NPD is data that has been anonymised in such a way that the person to whom it relates cannot be re-identified. Non-human NPD is data that did not relate to a human being in the first place, for example, weather data. The government has shown a growing interest in NPD in recent times, a focus that can largely be attributed to the economic incentives such data provides. In its 2018 report, the Srikrishna Committee agreed that NPD holds considerable strategic or economic interest for the nation; however, it left the questions surrounding NPD to a future committee.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;History of NPD Regulation&lt;/h2&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;In 2020, the Ministry of Electronics and Information Technology (‘MEITY’) constituted an expert committee (‘NPD Committee’) to study various issues relating to NPD and to make suggestions on its regulation. The NPD Committee differentiated NPD into human and non-human NPD, based on the data’s origin: human NPD includes all information that has been stripped of any personally identifiable information, while non-human NPD means any information that did not contain personally identifiable information in the first place (e.g. weather data). The final report of the NPD Committee is awaited, but the Committee came out with a &lt;a href="https://static.mygov.in/rest/s3fs-public/mygov_160922880751553221.pdf"&gt;revised draft&lt;/a&gt; of its recommendations in December 2020. In that report, the NPD Committee proposed the creation of a National Data Protection Authority (‘NPDA’), as it felt this is a new and emerging area of regulation. Thereafter, the Joint Parliamentary Committee on the Personal Data Protection Bill, 2019 (‘JPC’) came out with its &lt;a href="http://164.100.47.193/lsscommittee/Joint%20Committee%20on%20the%20Personal%20Data%20Protection%20Bill,%202019/17_Joint_Committee_on_the_Personal_Data_Protection_Bill_2019_1.pdf"&gt;version of the Data Protection Bill&lt;/a&gt;, where it amended the short title of the PDP Bill, 2019 to the Data Protection Bill, 2021, widening the ambit of the Bill to include all types of data. The JPC report focuses only on human NPD, noting that non-personal data is essentially derived from one of three sets of data - personal data, sensitive personal data, and critical personal data - which is either anonymised or in some way converted into non-re-identifiable data.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;On February 21, 2022, MEITY came out with the &lt;a href="https://www.meity.gov.in/content/draft-india-data-accessibility-use-policy-2022"&gt;Draft India Data Accessibility and Use Policy, 2022&lt;/a&gt; (‘Draft Policy’). The Draft Policy was strongly criticised, mainly for its aim of monetising data through its sale and licensing to body corporates. It had stated that anonymised and non-personal data collected by the State that has “&lt;a href="https://www.medianama.com/2022/06/223-new-data-governance-policy-privacy/"&gt;undergone value addition&lt;/a&gt;” could be sold for an “appropriate price”. During the consultation process, the Draft Policy was withdrawn several times and then finally removed from the website.&lt;a href="https://www.meity.gov.in/writereaddata/files/Draft%20India%20Data%20Accessibility%20and%20Use%20Policy_0.pdf"&gt; The National Data Governance Framework Policy&lt;/a&gt; (‘NDGF Policy’) is a successor to the Draft Policy. The language of the NDGF Policy marks a shift from the Draft Policy, which mainly focused on monetary growth. The NDGF Policy aims to regulate anonymised non-personal data kept with governmental authorities and make it accessible for research and for improving governance. It wishes to create an ‘India Datasets programme’ which will consist of the aforementioned datasets. While MEITY has opened the draft for public comments, there is a need to spell out the procedure so that stakeholders can draft recommendations on the NDGF Policy in an informed manner. Through this piece, we discuss the NDGF Policy in terms of the issues arising from the absence of a comprehensive data protection framework in India and the jurisdictional overlap of authorities under the NDGF Policy and the DPB.&lt;/p&gt;
&lt;h2 dir="ltr" style="text-align: justify; "&gt;What the National Data Governance Framework Policy Says&lt;/h2&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;Presently, NPD in India is stored across a variety of governmental departments and bodies, and it is difficult to access and use this stored data for governmental functions without modernising the collection and management of governmental data. Through the NDGF Policy, the government aims to build an Indian data storehouse of anonymised non-personal datasets and make it accessible both for improving governance and for encouraging research. It envisages an Indian Data Office (‘IDO’), set up by MEITY, which shall be responsible for consolidating access to and sharing of non-personal data across the government, as well as for issuing protocols for sharing NPD. In addition, it mandates a Data Management Unit for every ministry/department that would work closely with the IDO. The policy further envisages an Indian Data Council (‘IDC’) whose function would be to define frameworks for important datasets, finalise data and metadata standards, and review the implementation of the policy. The NDGF Policy provides a broad structure for setting up anonymisation standards, data retention policies, data quality standards, and a data sharing toolkit, and states that these standards shall be developed and notified by the IDO, MEITY, or the ministry in question, and must be adhered to by all entities.&lt;/p&gt;
&lt;h2 dir="ltr" style="text-align: justify; "&gt;The Data Protection Framework in India&lt;/h2&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;The report adopted by the JPC, felt that it is simpler to enact a single law and a single regulator to oversee all the data that originates from any data principal and is in the custody of any data fiduciary. According to the JPC, the draft Bill deals with various kinds of data at various levels of security. The JPC also recommended that since the Data Protection Bill (‘DPB’) will handle both personal and non-personal data, any further policy / legal framework on non-personal data may be made a part of the same enactment instead of any separate legislation. The draft DPB states that what is to be done with the NDP shall be decided by the government from time to time according to its policy. As such, neither the DPB, 2021 nor the NDGF Policy go into details of regulating NPD but only provide a broad structure of facilitating free-flow of NPD, without taking into account the &lt;a href="https://cis-india.org/internet-governance/cis-comments-revised-npd-report/view"&gt;specific concerns&lt;/a&gt; that have been raised since the NPD committee came out with its draft report on regulating NPD dated December 2020.&lt;/p&gt;
&lt;h2 dir="ltr" style="text-align: justify; "&gt;Jurisdictional overlaps among authorities and other concerns&lt;/h2&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;Under the NDGF policy, all guidelines and rules shall be published by a body known as the Indian Data Management Office (‘IDMO’). The IDMO is set to function under the MEITY and work with the Central government, state governments and other stakeholders to set standards. Currently, there is no sign of when the DPB will be passed as law. According to the JPC, the reason for including NPD within the DPB was because of the impossibility to differentiate between PD and NPD. There are also certain overlaps between the DPB and the NDGF which are not discussed by the NDGF. NDGF does not discuss the overlap between the IDMO and Data Protection Authority (‘DPA’) established under the DPB 2021.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;Under the DPB, the DPA is tasked with specifying codes of practice under clause 49. On the other hand, the NDGF has imagined the setting up of IDO, IDMO, and the IDC, which shall be responsible for issuing codes of practice such as data retention, and data anonymisation, and data quality standards. As such, there appears to be some overlap in the functions of the to-be-constituted DPA and the NDGF Policy.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;Furthermore, while the NDGF Policy aims to promote openness with respect to government data, there is a conflict with &lt;a href="https://opengovdata.org/"&gt;open government data (‘OGD’) principle&lt;/a&gt;s when there is a price attached to such data. OGD is data which is collected and processed by the government for free use, reuse and distribution. Any database created by the government must be publicly accessible to ensure compliance with the OGD principles.&lt;/p&gt;
&lt;h2 dir="ltr" style="text-align: justify; "&gt;Conclusion&lt;/h2&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;Streamlining datasets across different authorities is a huge challenge for the government and hence the NGDF policy in its current draft requires a lot of clarification. The government can take inspiration from the European Union which in 2018, came out with a principles-based approach coupled with self-regulation on the framework of the free flow of non-personal data. The &lt;a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52019DC0250&amp;amp;from=EN"&gt;guidance&lt;/a&gt; on the free-flow of non-personal data defines non-personal data based on the origin of data - data which originally did not relate to any personal data (non-human NPD) and data which originated from personal data but was subsequently anonymised (human NPD). The &lt;a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52019DC0250&amp;amp;from=EN"&gt;regulation&lt;/a&gt; further realises the reality of mixed data sets and regulates only the non-personal part of such datasets and where the datasets are inextricably linked, the GDPR would apply to such datasets. Moreover, any policy that seeks to govern the free flow of NPD ought to make it clear that in case of re-identification of anonymised data, such re-identified data would be considered personal data. The DPB, 2021 and the NGDF, both fail to take into account this difference.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/national-data-governance-framework-policy'&gt;https://cis-india.org/internet-governance/blog/national-data-governance-framework-policy&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Digvijay Chaudhary and Anamika Kundu</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Open Data</dc:subject>
    
    
        <dc:subject>Open Government Data</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2022-06-30T13:24:35Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/making-voices-heard">
    <title>Making Voices Heard</title>
    <link>https://cis-india.org/internet-governance/blog/making-voices-heard</link>
    <description>
        &lt;b&gt;We are happy to announce the launch of our final report on the study ‘Making Voices Heard: Privacy, Inclusivity, and Accessibility of Voice Interfaces in India’. The study was undertaken with support from the Mozilla Corporation.&lt;/b&gt;
        &lt;p style="text-align: center; "&gt;&lt;img src="https://cis-india.org/home-images/WebsiteHeader.jpg/@@images/8d8ed2a0-f0e4-44d7-8938-493b186402c5.jpeg" alt="Making Voices Heard" class="image-inline" title="Making Voices Heard" /&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;We believe that voice interfaces have the potential to democratise the use of the internet by addressing limitations related to reading and writing on digital text-only platforms and devices. This report examines the current landscape of voice interfaces in India, with a focus on concerns related to privacy and data protection, linguistic barriers, and accessibility for persons with disabilities (PwDs).&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The report features a visual mapping of 23 voice interfaces and technologies publicly available in India, along with a literature survey, a policy brief towards development and use of voice interfaces and a design brief documenting best practices and users’ needs, both with a focus on privacy, languages, and accessibility considerations, and a set of case studies on three voice technology platforms. &lt;span&gt;Read and download the full report &lt;a class="external-link" href="http://voice.cis-india.org/"&gt;here&lt;/a&gt;&lt;/span&gt;&lt;/p&gt;
&lt;hr /&gt;
&lt;h3&gt;Credits&lt;/h3&gt;
&lt;p&gt;&lt;strong&gt;Research&lt;/strong&gt;: Shweta Mohandas, Saumyaa Naidu, Deepika Nandagudi Srinivasa, Divya Pinheiro, and Sweta Bisht.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Conceptualisation, Planning, and Research Inputs&lt;/strong&gt;: Sumandro Chattapadhyay, and Puthiya Purayil Sneha.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Illustration&lt;/strong&gt;: Kruthika NS (Instagram @theworkplacedoodler). &lt;strong&gt;Website Design&lt;/strong&gt;: Saumyaa Naidu. &lt;strong&gt;Website Development&lt;/strong&gt;: Sumandro Chattapadhyay, and Pranav M Bidare.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Review and Editing&lt;/strong&gt;: Puthiya Purayil Sneha, Divyank Katira, Pranav M Bidare, Torsha Sarkar, Pallavi Bedi, and Divya Pinheiro.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Copy Editing&lt;/strong&gt;: The Clean Copy&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/making-voices-heard'&gt;https://cis-india.org/internet-governance/blog/making-voices-heard&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>shweta</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Voice User Interface</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Accessibility</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Research</dc:subject>
    
    
        <dc:subject>Featured</dc:subject>
    
    
        <dc:subject>Homepage</dc:subject>
    

   <dc:date>2022-06-27T16:18:36Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/comments-to-the-draft-national-health-data-management-policy-2.0">
    <title>Comments to the Draft National Health Data Management Policy 2.0</title>
    <link>https://cis-india.org/internet-governance/blog/comments-to-the-draft-national-health-data-management-policy-2.0</link>
    <description>
        &lt;b&gt;Anamika Kundu, Shweta Mohandas and Pallavi Bedi along with 9 other organizations / individuals drafted comments to the Draft National Health Data Management Policy 2.0. &lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;This is a joint submission on behalf of (i) Access Now, (ii) Article 21, (iii) Centre for New Economic Studies, (iv) Center for Internet and Society, (v) Internet Freedom Foundation, (vi) Centre for Justice, Law and Society at Jindal Global Law School, (vii) Priyam Lizmary Cherian, Advocate, High Court of Delhi (ix) Swasti-Health Catalyst, (x) Population Fund of India.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;At the outset, we would like to thank the National Health Authority (NHA) for inviting public comments on the draft version of the National Health Data Management Policy 2.0 (NDHMPolicy 2.0) (Policy) We have not provided comments to each section/clause, but have instead highlighted specific broad concerns which we believe are essential to be addressed prior tothe launch of NDHM Policy 2.0.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;Read on to &lt;a href="https://cis-india.org/internet-governance/draft-national-health-management-policy" class="internal-link"&gt;view the full submission here&lt;/a&gt;&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/comments-to-the-draft-national-health-data-management-policy-2.0'&gt;https://cis-india.org/internet-governance/blog/comments-to-the-draft-national-health-data-management-policy-2.0&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Anamika Kundu, Shweta Mohandas and Pallavi Bedi</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Health Tech</dc:subject>
    
    
        <dc:subject>Health Management</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Healthcare</dc:subject>
    

   <dc:date>2022-05-24T16:06:15Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/rssr-anamika-kundu-digvijay-s-chaudhary-april-20-2022-cctvs-in-public-spaces-and-data-protection-bill-2021">
    <title>CCTVs in Public Spaces and the Data Protection Bill, 2021</title>
    <link>https://cis-india.org/internet-governance/blog/rssr-anamika-kundu-digvijay-s-chaudhary-april-20-2022-cctvs-in-public-spaces-and-data-protection-bill-2021</link>
    <description>
        &lt;b&gt;This article has been authored by Ms. Anamika Kundu, Research Assistant at the Centre for Internet and Society, and Digvijay S. Chaudhary, Researcher at the Centre for Internet and Society. This blog is a part of RSRR’s Blog Series on the Right to Privacy and the Legality of Surveillance, in collaboration with the Centre for Internet &amp; Society.&lt;/b&gt;
        &lt;p&gt;&lt;span&gt;The article by Anamika Kundu and Digvijay S. Chaudhary was originally &lt;/span&gt;&lt;a class="external-link" href="https://rsrr.in/2022/04/20/cctv-surveillance-privacy/"&gt;published by RGNUL Student Research Review&lt;/a&gt;&lt;span&gt; on April 20, 2022&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;&lt;img src="https://cis-india.org/home-images/Surveillance.jpg/@@images/f8fad564-44ab-46e2-bd44-29607ea7fd19.jpeg" alt="Surveillance" class="image-inline" title="Surveillance" /&gt;&lt;/span&gt;&lt;/p&gt;
&lt;hr /&gt;
&lt;h2&gt;Introduction&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;In recent times, Indian cities have seen an expansion of state deployed CCTV cameras. According to a recent report, in terms of CCTVs deployed, Delhi was considered as the most surveilled city in the world, surpassing even the most surveilled cities in China. Delhi was not the only Indian city in that list, Chennai and Mumbai also made it to the list. In Hyderabad as well, the development of a Command and Control Centre aims to link the city’s surveillance infrastructure in real-time. Even though studies have shown that there is little correlation between CCTVs and crime control, deployment of CCTV cameras has been justified on the basis of national security and crime deterrence. Such an activity brings about the collection and retention of audio-visual/visual information of all individuals frequenting spaces where CCTV cameras are deployed. This information could be used to identify them (directly or indirectly) based on their looks or other attributes. Potential risks associated with the misuse, and processing of such personal data also arise. These risks include large scale profiling, criminal abuse (law enforcement misusing CCTV information for personal gains), and discriminatory targeting (law enforcement disproportionately focusing on a particular group of people). As these devices capture personal data of individuals, this article seeks data protection safeguards available to data principals against CCTV surveillance employed by the State in a public space under the proposed Data Protection Bill, 2021 (the “DPB”).&lt;/p&gt;
&lt;h2&gt;Safeguards Available Under the Data Protection Bill, 2021&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;To use CCTV surveillance, the measures and compliance listed under the DPB have to be followed. Obligations of data fiduciaries available under Chapter II, such as consent (clause 11), notice requirement (clause 7), and fair and reasonable processing (clause 5) are common to all data processing entities for a variety of activities. Similarly, as the DPB follows the principles of data minimisation (clause 6), storage limitation (clause 9), purpose limitation (clause 5), lawful and fair processing (clause 4), transparency (clause 23), and privacy by design (clause 22), these safeguards too are common to all data processing entities/activities. If a data fiduciary processes personal data of children, it has to comply with the standards stated under clause 16.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Under the DPB, compliance differs on the basis of grounds and purpose of data processing. As such, if compliance standards differ, so do the availability of safeguards under the DPB. Of relevance to this article, there are three standards of compliance under the DPB wherein the standards of safeguards available to a data principal differ. First, cases which would fall under Chapter III and hence, not require consent. Chapter III lists grounds for processing of personal data without consent. Second, cases which would fall under exemption clauses in Chapter VIII. In such cases, the DPB or some of its provisions would be inapplicable. Clause 35 under Chapter VIII gives power to the Central Government to exempt any agency from the application of the DPB. Similarly, Clause 36 under Chapter VIII, exempts certain provisions for certain processing of personal data. Third, cases which would not fall under either of the above Chapters. In such cases, all safeguards available under the DPB would be available to the data principals. Consequently, safeguards available to data principals in each of these standards are different. We will go through each of these separately.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;First, if the grounds of processing of CCTV information is such that it falls under the scope of Chapter III of the DPB, wherein the consent requirement is done away with, then in those cases, the notice requirement has to reflect such purpose, meaning that even if consent is not necessary for certain cases, other requirements under the DPB would still apply. Here, we must note that CCTV deployment by the state on such a large scale may be justified on the basis of conditions stated under clauses 12 and 14 of DPB – specifically, the condition for the performance of state function authorised by law, and public interest. The requirement under clause 12 of “authorised by law” simply means that the state function should have legal backing. Deployment of CCTVs is most likely to fall under clause 12 as various states have enacted legislations providing for CCTV deployment in the name of public safety. As a result, even if section 12 takes away the requirement of consent for certain cases, data principals should be able to exercise all rights accorded to them under the DPB (chapter V) except the right to data portability under clause 19.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Second, processing of personal data via CCTVs by government agencies could be exempted from DPB under clause 35 for certain cases under the clause. Another exemption that is particularly concerning with regard to the use of CCTVs is the exemption provided under clause 36(a). Section 36(a) says that the provisions of chapters II-VII would not apply where the data is processed in the interest of prevention, detection, investigation, and prosecution of any offence under the law. Chapters II-VII govern the obligations of data fiduciaries, grounds where consent would not be required, personal data of children, rights of data principals, transparency and accountability measures, and restrictions on transfer of personal data outside India respectively. In these cases, the requirement of fair and reasonable processing under clause 5 would also not apply. As a broad justification provided for CCTVs deployment by the government is crime control, it is possible that section 36(a) justification can be used to exempt the processing of CCTV footage from the above-mentioned safeguards.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;From the above discussion, the following can be concluded. First, if the grounds of processing fall under Chapter III, then standards of fair and reasonable processing, notice requirement, and all rights except the right to data portability u/s 19 would be available to data principals. Second, if the grounds of processing fall under clause 36, then, in that case, consent requirement, notice requirement, and the rights under DPB would be unavailable as that section mandates the non-application of those chapters. In such a case, even the processing requirements of a fair and reasonable manner stand suspended. Third, if the grounds of processing of CCTV information doesn’t fall under Chapter III, then all obligations listed under Chapter II would have to be followed. Moreover, the data principal would be able to exercise all the rights available under Chapter V of the DPB.&lt;/p&gt;
&lt;h2&gt;Constitutional Standards&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;When the Supreme Court recognised privacy as a fundamental right in the case of Puttaswamy v. Union of India (“Puttaswamy”), it located the principles of informed consent and purpose limitation as central to informational privacy. It recognised that privacy inheres not in spaces but in an individual. It also recognised that privacy is not an absolute right and certain restrictions may be imposed on the exercise of the right. Before listing the constitutional standards that activities infringing privacy must adhere to, it’s important to answer whether there exists a reasonable expectation of privacy in CCTV footage deployed in a public space by the State?&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In Puttaswamy, the court recognised that privacy is not denuded in public spaces. Writing for the plurality judgement, Chandrachud J. recognised that the notion of a reasonable expectation of privacy has elements both of a subjective and objective nature. Defining these concepts, he writes, “Privacy at a subjective level is a reflection of those areas where an individual desire to be left alone. On an objective plane, privacy is defined by those constitutional values which shape the content of the protected zone where the individual ought to be left alone…hence while the individual is entitled to a zone of privacy, its extent is based not only on the subjective expectation of the individual but on an objective principle which defines a reasonable expectation.” Note how in the above sentences, the plurality judgement recognises “a reasonable expectation” to be inherent in “constitutional values”. This is important as the meaning of what’s reasonable is to be constituted according to constitutional values and not societal norms. A second consideration that the phrase “reasonable expectation of privacy” requires is that an individual’s reasonable expectation is allied to the purpose for which the information is provided, as held in the case of Hyderabad v. Canara Bank (“Canara Bank”). Finally, the third consideration in defining the phrase is that it is context dependent. For example, in the case of In the matter of an application by JR38 for Judicial Review (Northern Ireland) 242 (2015) (link here), the UK Supreme Court was faced with a scenario where the police published the CCTV footage of the appellant involved in riotous behaviour. 
The question before the court was: “Whether the publication of photographs by the police to identify a young person suspected of being involved in riotous behaviour and attempted criminal damage can ever be a necessary and proportionate interference with that person’s article 8 [privacy] rights?” The majority held that there was no reasonable expectation of privacy in the case because of the nature of the criminal activity the appellant was involved in. However, the majority’s formulation of this conclusion was based on the reasoning that “expectation of privacy” was dependent on the “identification” purpose of the police. The court stated, “Thus, if the photographs had been published for some reason other than identification, the position would have been different and might well have engaged his rights to respect for his private life within article 8.1”. Therefore, as the purpose of publishing the footage was “identification” of the wrongdoer, the reasonable expectation of privacy stood excluded. The Canara Bank case was relied on by the SC in Puttaswamy. The plurality judgement in Puttaswamy also quoted the above paragraphs from the UK Supreme Court judgement.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Finally, the SC in the Aadhaar case, laid down the factors of “reasonable expectation of privacy.” Relying on those factors, the Supreme Court observed that demographic information and photographs do not raise a reasonable expectation of privacy. It further held that face photographs for the purpose of identification are not covered by a reasonable expectation of privacy. As this author has recognised, the majority in the Aadhaar case misconstrued the “reasonable expectation of privacy” to lie not in constitutional values as held in Puttaswamy but in societal norms. Even with the misapplication of the Puttaswamy principles by the majority in Aadhaar, it is clear that the exclusion of a “reasonable expectation of privacy” in face photographs is valid only for the purpose of “identification”. For purposes other than “identification”, there should exist a reasonable expectation of privacy in CCTV footage. Having recognised the existence of “reasonable expectation of privacy” in CCTV footage, let’s see how the safeguards mentioned under the DPB stand the constitutional standards of privacy laid down in Puttaswamy.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The bench in Puttaswamy located privacy not only in Article 21 but the entirety of part III of the Indian Constitution. Where transgression to privacy relates to different provisions under Part III, the tests evolved under those Articles would apply. Puttaswamy recognised that national security and crime control are legitimate state objectives. However, it also recognised that any limitation on the right must satisfy the proportionality test. The proportionality test requires a legitimate state aim, rational nexus, necessity, and balancing of interests. Infringement on the right to privacy occurs under the first and second standard. The first requirement of proportionality stands justified as national security and crime control have been recognised to be legitimate state objectives. However, it must be noted that the EU Guidelines on Processing of Personal Data through video devices state that the mere purpose of “safety” or “for your safety” is not sufficiently specific and is contrary to the principle that personal data shall be processed lawfully, fairly and in a transparent manner in relation to the data subject. The second requirement is a rational nexus. As stated above, there is little correlation between crime control and surveillance measures. Even if the state justifies a rational nexus between state aim and the action employed, it is the necessity part of the proportionality test where the CCTV surveillance measures fail (as explained by this author). Necessity requires us to draw a list of alternatives and their impact on an individual, and then do a balancing analysis with regard to the alternatives. Here, judicial scrutiny of the exemption order under clause 35 is a viable alternative that respects individual rights while at the same time, not interfering with the state’s aim.&lt;/p&gt;
&lt;h2&gt;Conclusion&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;Informed consent and purpose limitation were stated to be central principles of informational privacy in Puttaswamy. Among the three standards we identified, the principles of informed consent and purpose limitation remain available only in the third standard. In the first standard, even though the requirement of consent has become unavailable, the principle of purpose limitation would still be applicable to the processing of such data. The second standard is of particular concern wherein neither of those principles is available to data principals. It is worth mentioning here that in large scale monitoring activities such as CCTV surveillance, the safeguards which the DPB lists out would inevitably have an implementation flaw. The reason is that in scenarios where individuals refuse consent for large scale CCTV monitoring, what alternatives would the government offer to those individuals? Practically, CCTV surveillance would fall under clause 12 standards where consent would not be required. Even in those cases, would the notice requirement safeguard be diminished to “you are under surveillance” notices? When we talk about exercise of rights available under the DPB, how would an individual effectively exercise their right when the data processing is not limited to a particular individual? These questions arise because the safeguards under the DPB (and data protection laws in general) are based on individualistic notions of privacy. Interestingly, individual use cases of CCTVs have also increased with an increase in state use of CCTVs. Deployment of CCTVs for personal or domestic purposes would be exempt from the above-mentioned compliances as that would fall under the exemption provision of clause 36(d). 
Two additional concerns arise in relation to processing of data concerning CCTVs – the JPC report’s inclusion of Non-Personal Data (“NPD”) within the ambit of DPB, and the government’s plan to develop a National Automated Facial Recognition System (“AFRS”). A significant part of the data collected by CCTVs would fall within the ambit of NPD.With the JPC’s recommendation, it will be interesting to follow the processing standards for NPD under the DPB. AFRS has been imagined as a national database of photographs gathered from various agencies to be used in conjunction with facial recognition technology. The use of facial recognition technology with CCTV cameras raises concerns surrounding biometric data, and risks of large scale profiling. Indeed, section 27 of the DPB reflects this risk and mandates a data protection impact assessment to be undertaken by the data fiduciary with respect to processing involving new technologies or large scale profiling or use of biometric data by such technologies, however the DPB does not define what “new technology” means. Concerns around biometric data are outside the scope of the present article, however, it would be interesting to look at how the use of facial recognition technology with CCTVs could impact the safeguards under DPB.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/rssr-anamika-kundu-digvijay-s-chaudhary-april-20-2022-cctvs-in-public-spaces-and-data-protection-bill-2021'&gt;https://cis-india.org/internet-governance/blog/rssr-anamika-kundu-digvijay-s-chaudhary-april-20-2022-cctvs-in-public-spaces-and-data-protection-bill-2021&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Anamika Kundu and Digvijay S Chaudhary</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2022-04-28T02:29:42Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/rethinking-acquisition-of-digital-devices-by-law-enforcement-agencies">
    <title>Rethinking Acquisition of Digital Devices by Law Enforcement Agencies</title>
    <link>https://cis-india.org/internet-governance/blog/rethinking-acquisition-of-digital-devices-by-law-enforcement-agencies</link>
    <description>
        &lt;b&gt;This article has been selected as a part of The Right to Privacy and the Legality of Surveillance series organized in collaboration with the RGNUL Student Research Review (RSRR) Journal.&lt;/b&gt;
        
&lt;p&gt;Read the article originally published in &lt;a class="external-link" href="https://rsrr.in/blog/"&gt;RGNUL Student Research Review (RSRR) Journal &lt;/a&gt;&lt;/p&gt;
&lt;hr /&gt;
&lt;p&gt;&lt;strong&gt;Abstract&lt;/strong&gt;&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;The Criminal Procedure Code was created in the 1970s when the concept of the right to privacy was highly unacknowledged. Following the &lt;em&gt;Puttuswamy&lt;/em&gt; &lt;em&gt;I &lt;/em&gt;(2017) judgement of the Supreme Court affirming the right to privacy, these antiquated codes must be re-evaluated. Today, the police can acquire digital devices through summons and gain direct access to a person’s life, despite the summons mechanism having been intended for targeted, narrow enquiries. Once in possession of a device, the police attempt to circumvent the right against self-incrimination by demanding biometric passwords, arguing that the right does not cover biometric information . However, due to the extent of information available on digital devices, courts ought to be cautious and strive to limit the power of the police to compel such disclosures, taking into consideration the &lt;em&gt;right to privacy&lt;/em&gt; judgement.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Keywords: &lt;/strong&gt;Privacy, Criminal Procedural Law, CrPC, Constitutional Law&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Introduction&lt;/strong&gt;&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;New challenges confront the Indian criminal investigation framework, particularly in the context of law enforcement agencies (LEAs) acquiring digital devices and their passwords. Criminal procedure codes delimiting police authority and procedures were created before the widespread use of digital devices and are no longer pertinent to the modern age due to the magnitude of information available on a single device. A single device could provide more information to LEAs than a complete search of a person’s home; yet, the acquisition of a digital device is not treated with the severity and caution it deserves. Following the affirmation of the right to privacy in &lt;em&gt;Puttuswamy I &lt;/em&gt;(2017), criminal procedure codes must be revamped, taking into consideration that the acquisition of a person’s digital device constitutes a major infringement on their right to privacy.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Acquisition of digital devices by LEAs through summons&lt;/strong&gt;&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;&lt;a href="https://www.indiacode.nic.in/bitstream/123456789/15272/1/the_code_of_criminal_procedure%2C_1973.pdf"&gt;Section 91 of the Criminal Procedure Code&lt;/a&gt; (CrPc) grants powers to a court or police officer in charge of a police station to compel a person to produce any form of document or ‘thing’ necessary and desirable to a criminal investigation. In &lt;a href="https://indiankanoon.org/doc/1395576/"&gt;&lt;em&gt;Rama Krishna v State&lt;/em&gt;&lt;/a&gt;,&lt;em&gt; &lt;/em&gt;‘necessary’ and ‘desirable’ have been interpreted as any piece of evidence relevant to the investigation or a link in the chain of evidence. &lt;a href="https://deliverypdf.ssrn.com/delivery.php?ID=040088020003014069081068085012117023096031065012091090091115088031084097097081123000002033027047006112028087095120074083084003037094022080065067076089116106115025106025062083007085091067067124080091064096069093075026018100087109120024076084123086119022&amp;amp;EXT=pdf&amp;amp;INDEX=TRUE"&gt;Abhinav Sekhri&lt;/a&gt;, a criminal law litigator and writer, has argued that the wide wording of this section allows summons to be directed towards the retrieval of specific digital devices.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;As summons are target-specific, the section has minimal safeguards. However, several issues arise in the context of summons regarding digital devices. In the current day, access to a user’s personal device can provide comprehensive insight into their life and personality due to the vast amounts of private and personal information stored on it. In &lt;a href="https://www.supremecourt.gov/opinions/13pdf/13-132_8l9c.pdf"&gt;&lt;em&gt;Riley v California&lt;/em&gt;&lt;/a&gt;, the Supreme Court of the United States (SCOTUS) observed that due to the nature of the content present on digital devices, summons for them are equivalent to a roving search, i.e., demanding the simultaneous production of all contents of the home, bank records, call records, and lockers. The &lt;em&gt;Riley&lt;/em&gt; decision correctly highlights the need for courts to recognise that digital devices ought to be treated distinctly compared to other forms of physical evidence due to the repository of information stored on digital devices.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;The burden the state must surpass in order to issue summons is low as the relevancy requirement is easily provable. As noted in &lt;a href="https://www.supremecourt.gov/opinions/13pdf/13-132_8l9c.pdf"&gt;&lt;em&gt;Riley&lt;/em&gt;&lt;/a&gt;, police must identify which evidence on a device is relevant. Due to the sheer amount of data on phones, it is very easy for police to claim that there will surely be some form of connection between the content on the device and the case. Due to the wide range of offences available for Indian LEAs to cite, it is easy for them to argue that the content on the device is relevant to any number of possible offences. LEAs rarely face consequences for slamming the accused with a huge roster of charges – even if many of them are baseless – leading to the system being prone to abuse. The Indian Supreme Court in its judgement in &lt;a href="https://indiankanoon.org/doc/1068532/"&gt;&lt;em&gt;Canara Bank&lt;/em&gt;&lt;/a&gt; noted that the burden of proof must be higher for LEAs when investigations violate the right to privacy. &lt;a href="https://www.ijlt.in/_files/ugd/066049_03e4a2b28a5e49f6a59b861aa4554ede.pdf"&gt;Tarun Krishnakumar&lt;/a&gt; notes that the trickle-down effect of &lt;em&gt;Puttuswamy I&lt;/em&gt; will lead to new privacy challenges with regards to a summons to appear in court. &lt;em&gt;Puttuswamy I&lt;/em&gt;, will provide the bedrock and constitutional framework, within which future challenges to the criminal process will be undertaken. It is important for the court to recognise the transformative potential within the &lt;a href="https://main.sci.gov.in/supremecourt/2012/35071/35071_2012_Judgement_24-Aug-2017.pdf"&gt;&lt;em&gt;Puttuswamy&lt;/em&gt;&lt;/a&gt; judgement to help ensure that the right to privacy of citizens is safeguarded. 
The colonial logic of policing – wherein criminal procedure law was merely a tool to maximise the interest of the state at the cost of the people – must be abandoned. Courts ought to devise a framework under Section 91 to ensure that summons are narrowly framed to target specific information or content within digital devices. Additionally, the digital device must be collected following a judicial authority issuing the summons and not a police authority. Prior judicial warrants will require LEAs to demonstrate their requirement for the digital device; on estimating the impact on privacy, the authority can issue a suitable summons. Currently, the only consideration is if the item will furnish evidence relevant to the investigation; however, judges ought to balance the need for the digital device in the LEA’s investigation with the users’ right to privacy, dignity, and autonomy.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;&lt;a href="https://main.sci.gov.in/supremecourt/2012/35071/35071_2012_Judgement_24-Aug-2017.pdf"&gt;&lt;em&gt;Puttuswamy I&lt;/em&gt;&lt;/a&gt;&lt;em&gt; &lt;/em&gt;provides a triple test encompassing legality, necessity, and proportionality to test privacy claims. Legality requires that the measure be prescribed by law, necessity analyses if it is the least restrictive means being adopted by the state, and proportionality checks if the objective pursued by the measure is proportional to the degree of infringement of the right. The relevance standard, as mentioned before, is inadequate as it does not provide enough safeguards against abuse. The police can issue summons based on the slightest of suspicions and thus get access to a digital device, following which they can conduct a roving enquiry of the device to find evidence of any other offence, unrelated to the original cause of suspicion.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;Unilateral police summons of digital devices cannot pass the triple test as it is grossly disproportionate and lacks any form of safeguard against the police. The current system has no mechanism for overseeing the LEAs; as long as LEAs themselves are of the view that they require the device, they can acquire it. In &lt;a href="https://www.supremecourt.gov/opinions/13pdf/13-132_8l9c.pdf"&gt;&lt;em&gt;Riley&lt;/em&gt;&lt;/a&gt;, SCOTUS has already held that warrantless seizure of digital devices constitutes a violation of the right to privacy. India ought to also adopt a requirement of a prior judicial warrant for the procurement of devices by LEAs. A re-imagined criminal process would have to abide by the triple test in particular proportionality wherein the benefit claimed by the state ought not to be disproportionate to the impact on the fundamental right to privacy; and further, a framework must be proposed to provide safeguards against abuse.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Compelling the production of passwords of devices&lt;/strong&gt;&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;In police investigations, gaining possession of a physical device is merely the first step in acquiring the data on the device, as the LEAs still require the passcodes needed to unlock the device. LEAs compelling the production of passcodes to gain access to potentially incriminating data raises obvious questions regarding the right against self-incrimination; however, in the context of digital devices, several privacy issues may crop up as well.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;In &lt;a href="https://main.sci.gov.in/judgment/judis/4157.pdf"&gt;&lt;em&gt;Kathi Kalu Oghad&lt;/em&gt;&lt;/a&gt;, the SC held that compelling the production of fingerprints of an accused person to compare them with fingerprints discovered by the LEA in the course of their investigation does not violate the right to protection against self-incrimination of the accused. &lt;a href="https://lawschoolpolicyreview.com/2019/10/16/biometrics-as-passwords-the-slippery-scope-of-self-incrimination/"&gt;It has been argued&lt;/a&gt; that the ratio in the judgement prohibits the compelling of disclosure of passwords and biometrics for unlocking devices because &lt;a href="https://main.sci.gov.in/judgment/judis/4157.pdf"&gt;&lt;em&gt;Kathi Kalu Oghad&lt;/em&gt;&lt;/a&gt; only dealt with the production of fingerprints in order to compare the fingerprints with pre-existing evidence, as opposed to unlocking new evidence by utilising the fingerprint. However, the judgement deals with self-incrimination and does not address any privacy issues.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;The right against self-incrimination approach alone may not be enough to resolve all concerns. Firstly, there may be varying levels of protection provided to different forms of password protections on digital devices; text- and pattern-based passcodes are inarguably protected under Art. 20(3) of the Constitution. However, the protection of biometrics-based passcodes relies upon the correct interpretation of the &lt;a href="https://main.sci.gov.in/judgment/judis/4157.pdf"&gt;&lt;em&gt;Kathi Kalu Oghad&lt;/em&gt;&lt;/a&gt; precedent. Secondly, Art. 20(3) only protects the accused in investigations and not when non-accused digital devices are acquired by LEAs and the passcodes of the devices demanded.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;Therefore, considering the aforementioned points, it is pertinent to remember that the right against self-incrimination does not exist in a vacuum separate from privacy. It originates from the concept of decisional autonomy – the right of individuals to make decisions about matters intimate to their life without interference from the state and society. &lt;a href="https://main.sci.gov.in/supremecourt/2012/35071/35071_2012_Judgement_24-Aug-2017.pdf"&gt;&lt;em&gt;Puttuswamy I&lt;/em&gt;&lt;/a&gt; observed that decisional autonomy is the bedrock of the right to privacy, as privacy allows an individual to make these intimate decisions away from the glare of society and/or the state. This has heightened importance in this context as interference with such autonomy could lead to the person in question facing criminal prosecution. The SC in &lt;a href="https://main.sci.gov.in/jonew/judis/36303.pdf"&gt;&lt;em&gt;Selvi v Karnataka&lt;/em&gt;&lt;/a&gt;&lt;em&gt; &lt;/em&gt;and &lt;a href="https://main.sci.gov.in/supremecourt/2012/35071/35071_2012_Judgement_24-Aug-2017.pdf"&gt;&lt;em&gt;Puttuswamy I&lt;/em&gt;&lt;/a&gt; has repeatedly affirmed that the right against self-incrimination and the right to privacy are linked concepts, with the court observing that the right to remain silent is an integral aspect of decisional autonomy.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;In &lt;a href="http://karnatakajudiciary.kar.nic.in:8080/repository/rep_judgmentcase.php"&gt;&lt;em&gt;Virendra Khanna&lt;/em&gt;&lt;/a&gt;, the Karnataka High Court (HC) dealt with the privacy and self-incrimination concerns caused by LEAs compelling the disclosure of passwords. The HC brushes aside concerns related to privacy by noting that the right to privacy is not absolute and that an exception to the right to privacy is state interest and protection of law and order (para 5.11), and that unlawful disclosure of material to third parties could be an actionable wrong (para 15). The court’s interpretation of privacy effectively provides a free pass for the police to interfere with the right to privacy under the pretext of a criminal investigation. This conception of privacy is inadequate as the issue of proportionality is avoided, and the court does not attempt to ensure that the interference is proportionate with the outcome.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;US courts also see the compelling of production of passcodes as an issue of self-incrimination as well as privacy. In its judgement in &lt;a href="https://casetext.com/case/in-re-application-for-a-search-warrant?__cf_chl_f_tk=lTxiJpZIvKfkIBtGQJtMObSmqhdRUZdjGk5hXeMfprQ-1642253001-0-gaNycGzNCJE"&gt;&lt;em&gt;Application for a Search Warrant&lt;/em&gt;&lt;/a&gt;, a US court observed that compelling the disclosure of passcodes existed at an intersection of the right to privacy and self-incrimination; the right against self-incrimination serves to protect the privacy interests of suspects.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;Disclosure of passwords to digital devices amounts to an intrusion of the privacy of the suspect as the collective contents on the digital device effectively amount to providing LEAs with a method to observe a person’s mind and identity. Police investigative techniques cannot override fundamental rights and must respect the personal autonomy of suspects – particularly, the choice between silence and speech. Through the production of passwords, LEAs can effectively get a snapshot of a suspect’s mind. This is analogous to the polygraph and narco-analysis test struck down as unconstitutional by the SC in &lt;a href="https://main.sci.gov.in/jonew/judis/36303.pdf"&gt;&lt;em&gt;Selvi&lt;/em&gt;&lt;/a&gt; as it violates decisional autonomy.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;As &lt;a href="https://theproofofguilt.blogspot.com/2021/03/mobile-phones-and-criminal.html"&gt;Sekhri&lt;/a&gt; noted, a criminal process that reflects the aspirations of the &lt;em&gt;Puttuswamy &lt;/em&gt;judgement would require LEAs to first explain with reasonable detail the material which they wish to find in the digital devices. Secondly, they must provide a timeline for the investigation to ensure that individuals are not subjected to inexhaustible investigations with police roving through their devices indefinitely. Thirdly, such a criminal process must demand, a higher burden to be discharged from the state if the privacy of the individual is infringed upon. These aspirations should form the bedrock of a system of judicial warrants that LEAs ought to be required to comply with if they wish to compel the disclosure of passwords from individuals. The framework proposed above is similar to the &lt;a href="http://karnatakajudiciary.kar.nic.in:8080/repository/rep_judgmentcase.php"&gt;&lt;em&gt;Virendra Khanna&lt;/em&gt;&lt;/a&gt;&lt;em&gt; &lt;/em&gt;guidelines, as they provide a system of checks and balances that ensure that the intrusion on privacy is carried out proportionately; additionally, it would require LEAs to show a real requirement to demand access to the device. The independent eyes of a judicial magistrate provide a mechanism of oversight and a check against abuse of power by LEAs.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Conclusion&lt;/strong&gt;&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;The criminal law apparatus is the most coercive power available to the state, and, therefore, privacy rights will become meaningless unless they can withstand it. Several criminal procedures in the country are rooted in colonial statutes, where the rights of the populace being policed were never a consideration; hence, a radical shift is required. However, post-1947 and &lt;em&gt;Puttuswamy&lt;/em&gt;, the ignorance and refusal to submit to the rights of the population can no longer be justified and significant reformulation is necessary to guarantee meaningful protections to device owners. There is a need to ensure that the rights of individuals are protected, especially when the motivation for their infringement is the supposed noble intentions of the criminal justice system. Failing to defend the right to privacy in these moments would be an invitation for allowing the power of the state to increase and inevitably become absolute.&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/rethinking-acquisition-of-digital-devices-by-law-enforcement-agencies'&gt;https://cis-india.org/internet-governance/blog/rethinking-acquisition-of-digital-devices-by-law-enforcement-agencies&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Harikartik Ramesh</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Surveillance</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2022-05-02T09:27:54Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/comments-to-the-draft-motor-vehicle-aggregators-scheme-2021">
    <title>Comments to the draft Motor Vehicle Aggregators Scheme, 2021</title>
    <link>https://cis-india.org/internet-governance/blog/comments-to-the-draft-motor-vehicle-aggregators-scheme-2021</link>
    <description>
        &lt;b&gt;This submission presents a response by researchers at the Centre for Internet and Society, India (CIS) to the draft Motor Vehicle Aggregators Scheme, 2021 published by the Transport Department, Government of National Capital Territory of Delhi, (hereafter “draft Scheme”).&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;&lt;span&gt;CIS, established in Bengaluru in 2008 as a non-profit organisation, undertakes interdisciplinary research on internet and digital technologies from public policy andacademic perspectives. Through its diverse initiatives, CIS explores, intervenes in, and advances contemporary discourse and regulatory practices around internet, technology,and society in India, and elsewhere.&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;CIS is grateful for the opportunity to submit its comments to the draft Scheme. Please find below our thematically organised comments.&lt;/span&gt;&lt;/p&gt;
&lt;hr /&gt;
&lt;p&gt;&lt;a style="text-align: justify; " href="https://cis-india.org/internet-governance/comments-draft-motor-vehicle-aggregators-scheme.pdf" class="internal-link"&gt;&lt;strong&gt;Click here&lt;/strong&gt;&lt;/a&gt;&lt;span style="text-align: justify; "&gt; to read more.&lt;/span&gt;&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/comments-to-the-draft-motor-vehicle-aggregators-scheme-2021'&gt;https://cis-india.org/internet-governance/blog/comments-to-the-draft-motor-vehicle-aggregators-scheme-2021&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Chiara Furtado, Aayush Rathi and Abhishek Sekharan</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Motor Vehicle</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2022-04-01T15:25:06Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/news-nine-shweta-mohandas-and-anamika-kundu-personal-data-protection-bill-must-examine-data-collection-practices-that-emerged-during-pandemic">
    <title>Personal Data Protection Bill must examine data collection practices that emerged during pandemic</title>
    <link>https://cis-india.org/internet-governance/blog/news-nine-shweta-mohandas-and-anamika-kundu-personal-data-protection-bill-must-examine-data-collection-practices-that-emerged-during-pandemic</link>
    <description>
        &lt;b&gt;The PDP Bill is speculated to be introduced during the winter session of parliament soon. The PDP Bill in its current form provides wide-ranging exemptions which allow government agencies to process citizens’ data in order to fulfil their responsibilities. The bill could ensure that employers have some responsibility towards the data they collect from employees.&lt;/b&gt;
        &lt;p&gt;The article by Shweta Mohandas and Anamika Kundu was &lt;a class="external-link" href="https://www.news9live.com/technology/personal-data-protection-bill-must-examine-data-collection-practices-that-emerged-during-pandemic-137031?infinitescroll=1"&gt;originally published by &lt;strong&gt;news nine&lt;/strong&gt;&lt;/a&gt; on November 29, 2021.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;The Personal Data Protection Bill (PDP) is speculated to be introduced during the winter session of the parliament soon, and the report of the Joint Parliamentary Committee (JPC) has already been &lt;a class="external-link" href="https://www.thehindu.com/news/national/parliamentary-panel-retains-controversial-exemption-clause-in-personal-data-protection-bill/article37633344.ece"&gt;adopted&lt;/a&gt; by the committee on Monday. The Report of the JPC comes after almost two years of deliberation and secrecy over how the final version of the Personal Data Protection Bill will be. Since the publication of the &lt;a class="external-link" href="https://prsindia.org/files/bills_acts/bills_parliament/2019/Personal%20Data%20Protection%20Bill,%202019.pdf"&gt;2019 version&lt;/a&gt; of the PDP Bill, the Covid 19 pandemic and the public safety measures have opened the way for a number of new organisations and reasons to collect personal data that was non-existent in 2019. Hence along with changes that have been suggested by multiple civil society organisations, the dissent notes submitted by the members of the JPC, the new version of the PDP Bill must also look at how data processing has changed over the span of two years.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Concerns with the bill&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;At the outset there are certain parts of the PDP Bill which need to be revised in order to uphold the spirit of privacy and individual autonomy laid out in the Puttaswamy judgement. The two sections that need to be in line with the privacy judgement are the ones that allow for non consensual processing of data by the government, and by employers. The PDP Bill in its current form provides wide-ranging exemptions which allow government agencies to process citizen's data in order to fulfil its &lt;a class="external-link" href="https://www.livemint.com/news/india/big-brother-on-top-in-data-protection-bill-11576164271430.html"&gt;responsibilities&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In the &lt;a class="external-link" href="https://www.meity.gov.in/writereaddata/files/Personal_Data_Protection_Bill,2018.pdf"&gt;2018 version&lt;/a&gt; of bill, drafted by the Justice Srikrishna Committee exemptions granted to the State with regard to processing of data was subject to a four pronged test which required the processing to be (i) authorised by law; (ii) in accordance with the procedure laid down by the law; (iii) necessary; and (iv) proportionate to the interests being achieved. This four pronged test was in line with the principles laid down by the Supreme Court in the Puttaswamy judgement. The 2019 version of the PDP Bill has diluted this principle by merely retaining the 'necessity principle' and removing the other requirements which is not in consonance with the test laid down by the Supreme Court in Puttaswamy.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Section 35 was also widely discussed in the panel meetings where members had &lt;a class="external-link" href="https://www.thehindu.com/news/national/parliamentary-panel-retains-controversial-exemption-clause-in-personal-data-protection-bill/article37633344.ece"&gt;argued&lt;/a&gt; the removal of 'public order' as a ground for exemption. The panel also insisted for '&lt;a class="external-link" href="https://www.thehindu.com/news/national/parliamentary-panel-retains-controversial-exemption-clause-in-personal-data-protection-bill/article37633344.ece"&gt;judicial or parliamentary oversight&lt;/a&gt;' to grant such exemptions. The final report did not accept these suggestions stating a need to balance &lt;a class="external-link" href="https://www.thehindu.com/news/national/parliamentary-panel-retains-controversial-exemption-clause-in-personal-data-protection-bill/article37633344.ece"&gt;national security, liberty and privacy&lt;/a&gt; of an individual. There ought to be prior judicial review of the written order exempting the governmental agency from any provisions of the bill. Allowing the government to claim an exemption if it is satisfied to be "necessary or expedient" can be misused.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Another clause which gives the data principal a wide berth is with respect to employee data Section 13 of the current version of the bill provides the employer with a leeway into processing employee data (other than sensitive personal data) without consent based on two grounds: when consent is not appropriate, or when obtaining consent would involve disproportionate effort on the part of the employer.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The personal data so collected can only be collected for recruitment, termination, attendance, provision of any service or benefit, and assessing performance. This covers almost all of the activities that require data of the employee. Although the 2019 version of the bill excludes non-consensual collection of sensitive personal data (a provision that was missing in the 2018 version of the bill), there is still a lot of scope to improve this provision and provide employees further right to their data. At the outset the bill does not define employee and employer, which could result in confusion as there is no one definition of these terms across Indian Labour Laws.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Additionally, the bill distinguishes between employee and consumer, where the consumer of the same company or service has a greater right to their data than an employee. In the sense that the consumer as a data principal has the option to use any other product or service and also has the right to withdraw consent at any time, in the case of an employee the consequence of refusing consent or withdrawing consent would be being terminated from the employment. It is understood that there is a requirement for employee data to be collected, and that consent does not work the same way as it does in the case of a consumer.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The bill could ensure that employers have some responsibility towards the data they collect from the employees, such as ensuring that they are only used for the purpose for which they were collected, the employee knows how long their data will be retained, and know if the data is being processed by third parties. It is also worth mentioning that the Indian government is India's largest employer spanning a variety of agencies and public enterprises.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Concerns highlighted by JPC Members&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Going back to the few members of the JPC who have moved dissent notes, specifically with regard to governmental exemptions. Jairam Ramesh filed a &lt;a href="https://www.news9live.com/india/parliament-panel-adopts-report-on-data-protection-amid-dissent-by-opposition-135591"&gt;dissent note&lt;/a&gt;, to which many other opposition members followed suit. While Jairam Ramesh praised the JPC's functioning, he disagreed with certain aspects of the Report. According to him, the 2019 bill is designed in a manner where the right to privacy is given importance only in cases of private activities. He raised concerns regarding the unbridled powers given to the government to exempt itself from any of the provisions.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The amendment suggested by him would require parliamentary approval before exemption would take place. He also added that Section 12 of the bill which provided certain scenarios where consent was not needed for processing of personal data should have been made '&lt;a href="https://www.hindustantimes.com/india-news/mps-file-dissent-notes-over-glaring-lacunae-in-report-on-data-protection-bill-101637566365637.html"&gt;less sweeping&lt;/a&gt;'. Similarly, Gaurav Gogoi's &lt;a href="https://www.hindustantimes.com/india-news/mps-file-dissent-notes-over-glaring-lacunae-in-report-on-data-protection-bill-101637566365637.html"&gt;note&lt;/a&gt; stated that the exemptions would create a surveillance state and similarly criticised Section 12 and 35 of the bill. He also mentioned that there ought to be parliamentary oversight for the exemptions provided in the bill.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;On the same issue, Congress leader Manish Tiwari noted that the bill creates '&lt;a href="https://timesofindia.indiatimes.com/business/india-business/personal-data-protection-bill-what-is-it-and-why-is-the-opposition-so-unhappy-with-it/articleshow/87869391.cms"&gt;parallel universes&lt;/a&gt;' - one for the private sector which needs to be compliant and the other for the State which can exempt itself. He has opposed the entire bill stating there exists an "inherent design flaw". He has raised specific objections to 37 clauses and stated that any blanket exemptions to the state goes against the Puttaswamy Judgement.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In their joint &lt;a href="https://www.news9live.com/india/tmc-congress-mps-submit-dissent-notes-to-joint-panel-on-personal-data-protection-bill-135491"&gt;dissent note&lt;/a&gt;, Derek O'Brien and Mahua Mitra have said that there is a lack of adequate safeguards to protect the data principals' privacy and the lack of time and opportunity for stakeholder consultations. They have also pointed out that the independence of the DPA will cease to exist with the present provision of allowing the government powers to choose members and the chairman. Amar Patnaik is to object to the lack of inclusion of state level authorities in the bill. Without such bodies, he says, there would be federal override.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Conclusion&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;While a number of issues were highlighted by civil society, the members of the JPC, and the media, the new version of the bill should also need to take into account the shifts that have taken place in view of the pandemic. The new version of the data protection bill should take into consideration the changes and new data collection practices that have emerged during the pandemic, be comprehensive and leave very little provisions to be decided later by the Rules.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/news-nine-shweta-mohandas-and-anamika-kundu-personal-data-protection-bill-must-examine-data-collection-practices-that-emerged-during-pandemic'&gt;https://cis-india.org/internet-governance/blog/news-nine-shweta-mohandas-and-anamika-kundu-personal-data-protection-bill-must-examine-data-collection-practices-that-emerged-during-pandemic&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Shweta Mohandas and Anamika Kundu</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2022-03-30T15:15:21Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/ijlt-shweta-mohandas-and-anamika-kundu-march-6-2022-nothing-to-kid-about-childrens-data-under-the-new-data-protection-bill">
    <title>Nothing to Kid About – Children's Data Under the New Data Protection Bill</title>
    <link>https://cis-india.org/internet-governance/blog/ijlt-shweta-mohandas-and-anamika-kundu-march-6-2022-nothing-to-kid-about-childrens-data-under-the-new-data-protection-bill</link>
    <description>
        &lt;b&gt;The pandemic has forced policymakers to adapt their approach to people's changing practices, from looking at contactless ways of payment to the shifting of educational institutions online.&lt;/b&gt;
        &lt;p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 iWv3d b+iTF _78FBa _1FoOD iWv3d _1j-51 mm8Nw" style="text-align: justify; "&gt;The article was originally &lt;a class="external-link" href="https://www.ijlt.in/post/nothing-to-kid-about-children-s-data-under-the-new-data-protection-bill"&gt;published in the Indian Journal of Law and Technology&lt;/a&gt;&lt;/p&gt;
&lt;hr /&gt;
&lt;p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 iWv3d b+iTF _78FBa _1FoOD iWv3d _1j-51 mm8Nw" style="text-align: justify; "&gt;For children, the internet has shifted from being a form of entertainment to a medium to connect with friends and seek knowledge and education. However, each time they access the internet, data about them and their choices are inadvertently recorded by companies and unknown third parties. The growth of EdTech apps in India has led to growing concerns regarding children's data privacy. This has led to the creation of a &lt;a class="_1lsz7 _3Bkfb" href="https://economictimes.indiatimes.com/tech/startups/edtech-firms-work-to-get-communication-right-with-the-asci/articleshow/89082308.cms" rel="noopener noreferrer" target="_blank"&gt;self-regulatory&lt;/a&gt; body, the Indian EdTech Consortium. More recently, the &lt;a class="_1lsz7 _3Bkfb" href="https://economictimes.indiatimes.com/tech/startups/edtech-firms-work-to-get-communication-right-with-the-asci/articleshow/89082308.cms" rel="noopener noreferrer" target="_blank"&gt;Advertising Standard Council of India&lt;/a&gt;&lt;span class="_3zM-5"&gt; has &lt;/span&gt;also started looking at passing a draft regulation to keep a check on EdTech advertisements.&lt;/p&gt;
&lt;p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 iWv3d b+iTF _78FBa _1FoOD iWv3d _1j-51 mm8Nw" style="text-align: justify; "&gt;The Joint Parliamentary Committee (JPC), tasked with drafting and revising the Data Protection Bill, had to consider the number of changes that had happened after the release of the 2019 version of the Bill. While the most significant change was the removal of the term “personal data” from the title of the Bill, in a move to create a comprehensive Data Protection Bill that includes both personal and non personal data. Certain other provisions of the Bill also featured additions and removals. The JPC, in its revised version of the Bill has removed an entire class of &lt;a class="_1lsz7 _3Bkfb" href="https://prsindia.org/billtrack/the-personal-data-protection-bill-2019#:~:text=Obligations%20of%20data%20fiduciary%3A%20A,specific%2C%20clear%20and%20lawful%20purpose" rel="noopener noreferrer" target="_blank"&gt;data fiduciaries&lt;/a&gt; – guardian data fiduciary – which was tasked with greater responsibility for managing children's data. While the JPC justified the removal of the guardian data fiduciary stating that consent from the guardian of the child is enough to meet the end for which personal data of children are processed by the data fiduciary. While thought has been given to looking at how consent is given by the guardian on behalf of the child, there was no change in the age of children in the Bill. Keeping the age of consent under the Bill as the same as the age of majority to enter into a contract under the 1872 Indian Contract Act – 18 years – reveals the disconnect the law has with the ground reality of how children interact with the internet.&lt;/p&gt;
&lt;p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 iWv3d b+iTF _78FBa _1FoOD iWv3d _1j-51 mm8Nw" style="text-align: justify; "&gt;In the current state of affairs where Indian children are navigating the digital world on their own there is a need to look deeply at the processing of children’s data as well as ways to ensure that children have information about consent and informational privacy. By placing the onus of granting consent on parents, the PDP Bill fails to look at how consent works in a privacy policy–based consent model and how this, in turn, harms children in the long run.&lt;/p&gt;
&lt;h3 class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 iWv3d aujbK _3M0Fe _1FoOD iWv3d _1j-51 mm8Nw"&gt;1. Age of Consent&lt;/h3&gt;
&lt;p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 iWv3d b+iTF _78FBa _1FoOD iWv3d _1j-51 mm8Nw" style="text-align: justify; "&gt;By setting the age of consent as 18 years under the Data Protection Bill, 2021, it brings all individuals under 18 years of age under one umbrella without making a distinction between the internet usage of a 5-year-old child and a 16-year-old teenager. There is a need to look at the current internet usage habits of children and assess whether requiring parental consent is reasonable or even practical. It is also pertinent to note that the law in the offline world does make the distinction between age and maturity. For example, it has been &lt;a class="_1lsz7 _3Bkfb" href="https://cis-india.org/internet-governance/blog/pallavi-bedi-and-shweta-mohandas-cis-comments-on-data-protection-bill" rel="noopener noreferrer" target="_blank"&gt;highlighted&lt;/a&gt; that Section 82 of the Indian Penal Code, read with Section 83, states that any act by a child under the age of 12 years shall not be considered an offence, while the maturity of those aged between 12–18 years will be decided by the court (individuals between the age of 16–18 years can also be tried as adults for heinous crimes). Similarly, child labour laws in the country allow children above the age of 14 years to work in non-hazardous industries, which would qualify them to fall under Section 13 of the Bill, which deals with employee data.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;A 2019 &lt;/span&gt;&lt;a class="_1lsz7 _3Bkfb" href="https://reverieinc.com/wp-content/uploads/2020/09/IAMAI-Digital-in-India-2019-Round-2-Report.pdf" rel="noopener noreferrer" target="_blank"&gt;report&lt;/a&gt;&lt;span&gt; suggests that two-thirds of India’s internet users are in the 12–29 years age group, accounting for about 21.5% of the total internet usage in metro cities. With the emergence of cheaper phones equipped with faster processing and low internet data costs, children are no longer passive consumers of the internet. They have social media accounts and use several applications to interact with others and make purchases. There is a need to examine how children and teenagers interact with the internet as well as the practicality of requiring parental consent for the usage of applications.&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Most applications that require age data request users to type in their date of birth; it is not difficult for a child to input a suitable date that would make it appear that they are &lt;/span&gt;&lt;a class="_1lsz7 _3Bkfb" href="https://www.theguardian.com/media/2013/jul/26/children-lie-age-facebook-asa" rel="noopener noreferrer" target="_blank"&gt;over 18&lt;/a&gt;&lt;span&gt;. In this case they are still children but the content that will be presented to them would be those that are meant for adults including content that might be disturbing or those involving use of &lt;/span&gt;&lt;a class="_1lsz7 _3Bkfb" href="https://www.theguardian.com/media/2013/jul/26/children-lie-age-facebook-asa" rel="noopener noreferrer" target="_blank"&gt;alcohol and gambling. &lt;/a&gt;&lt;span&gt;Additionally, in their privacy policies, applications sometimes state that they are not suited for and restricted from users under 18. Here, data fiduciaries avoid liability by placing the onus on the user to declare their age and properly read and understand the privacy policy.&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Reservations about the age of consent under the Bill have also been highlighted by some members of the JPC through their dissenting opinions. &lt;/span&gt;&lt;a class="_1lsz7 _3Bkfb" href="http://164.100.47.193/lsscommittee/Joint%20Committee%20on%20the%20Personal%20Data%20Protection%20Bill,%202019/17_Joint_Committee_on_the_Personal_Data_Protection_Bill_2019_1.pdf#page=221" rel="noopener noreferrer" target="_blank"&gt;MP Ritesh Pandey &lt;/a&gt;&lt;span&gt;suggested that the age of consent should be reduced to 14 years keeping the best interest of the children in mind as well as to support children in benefiting from technological advances. Similarly, &lt;/span&gt;&lt;a class="_1lsz7 _3Bkfb" href="http://164.100.47.193/lsscommittee/Joint%20Committee%20on%20the%20Personal%20Data%20Protection%20Bill,%202019/17_Joint_Committee_on_the_Personal_Data_Protection_Bill_2019_1.pdf#page=221" rel="noopener noreferrer" target="_blank"&gt;MP Manish Tiwari &lt;/a&gt;&lt;span&gt;in his dissenting opinion suggested regulating data fiduciaries based on the type of content they provide or data they collect.&lt;/span&gt;&lt;/p&gt;
&lt;h3&gt;&lt;span&gt;2. How is the 2021 Bill Different from the 2019 Bill?&lt;/span&gt;&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;The &lt;/span&gt;&lt;a class="_1lsz7 _3Bkfb" href="http://164.100.47.4/BillsTexts/LSBillTexts/Asintroduced/373_2019_LS_Eng.pdf" rel="noopener noreferrer" target="_blank"&gt;2019 &lt;/a&gt;&lt;span&gt;draft of the Bill consisted of a class of data fiduciaries called guardian data fiduciaries – entities that operate commercial websites or online services directed at children or which process large volumes of children’s personal data. This class of fiduciaries was barred from profiling, tracking, behavioural monitoring, and running targeted advertising directed at children and undertaking any other processing of personal data that can cause significant harm to the child. In the previous draft, such data fiduciaries were not allowed to engage in ‘profiling, tracking, behavioural monitoring of children, or direct targeted advertising at children’. There was also a prohibition on conducting any activities that might significantly harm the child. As per Chapter IV, any violation could attract a penalty of up to INR 15 crore of the worldwide turnover of the data fiduciary for the preceding financial year, whichever is higher. However, this separate class of data fiduciaries do not have any additional responsibilities. It is also unclear as to whether a data fiduciary that does not by definition fall within such a category would be allowed to engage in activities that could cause ‘significant harm’ to children.&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;The new Bill also does not provide any mechanisms for age verification and only lays down considerations that verification processes should be undertaken. Furthermore, the JPC has suggested that consent options available to the child when they attain the age of majority i.e. 18 years should be included within the rule frame by the Data Protection Authority instead of being an amendment in the Bill.&lt;/span&gt;&lt;/p&gt;
&lt;h3&gt;&lt;span&gt;3. In the Absence of a Guardian Data Fiduciary&lt;/span&gt;&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;The 2018 and 2019 drafts of the PDP Bill consider a child to be any person below the age of 18 years. For a child to access online services, the data fiduciary must first verify the age of the child and obtain consent from their guardian. The Bill does not provide an explicit process for age verification apart from stating that regulations shall be drafted in this regard. The 2019 Bill states that the Data Protection Authority shall specify codes of practice in this matter. Taking best practices into account, there is a need for ‘&lt;/span&gt;&lt;a class="_1lsz7 _3Bkfb" href="https://cuts-ccier.org/pdf/project-brief-highlighting-inclusive-and-practical-mechanisms-to-protect-childrens-data.pdf" rel="noopener noreferrer" target="_blank"&gt;user-friendly and privacy-protecting age verification techniques&lt;/a&gt;&lt;span&gt;’ to encourage safe navigation across the internet. This will require &lt;/span&gt;&lt;a class="_1lsz7 _3Bkfb" href="https://cuts-ccier.org/pdf/bp-global-technological-developments-in-age-verification-and-age-estimation.pdf" rel="noopener noreferrer" target="_blank"&gt;looking at &lt;/a&gt;&lt;span&gt;technological developments and different standards worldwide. There is a need to hold companies &lt;/span&gt;&lt;a class="_1lsz7 _3Bkfb" href="https://www.livemint.com/opinion/columns/theres-a-better-way-to-protect-the-online-privacy-of-kids-11615306723478.html" rel="noopener noreferrer" target="_blank"&gt;accountable&lt;/a&gt;&lt;span&gt; for the protection of children’s online privacy and the harm that their algorithms cause children and to make sure that they are not continued.&lt;/span&gt;&lt;/p&gt;
&lt;p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 iWv3d b+iTF _78FBa _1FoOD iWv3d _1j-51 mm8Nw" style="text-align: justify; "&gt;The JPC in the 2021 version of the Bill removed provisions about guardian data fiduciaries, stating that there was no advantage in creating a different class of data fiduciary. As per the JPC, even those data fiduciaries that did not fall within the said classification would also need to comply with rules pertaining to the personal data of children i.e. with Section 16 of the Bill. Section 16 of the Bill requires the data fiduciary to verify the child’s age and obtain consent from the parent/guardian. The manner of age verification has also een spelt out.  Furthermore, since ‘significant data fiduciaries’ is an existing class, there is still a need to comply with rules related to data processing. The JPC also removed the phrase “in the best interests of, the child” and “is in the best interests of, the child” under sub-clause 16(1), implying that the entire Bill concerned the rights of the data principal and the use of such terms dilutes the purpose of the legislation and could give way to manipulation by the data fiduciary.&lt;/p&gt;
&lt;h3&gt;&lt;span&gt;Conclusion&lt;/span&gt;&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Over the past two years, there has been a significant increase in applications that are targeted at children. There has been a proliferation of EduTech apps, which ideally should have more responsibility as they are processing children's data. We recommend that instead of creating a separate category, such fiduciaries collecting children's data or providing services to children be seen as ‘significant data fiduciaries’ that need to take up additional compliance measures.&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Furthermore, any blanket prohibition on tracking children may obstruct safety measures that could be implemented by data fiduciaries. These fears are also increasing in other jurisdictions as there is a likelihood to restrict data fiduciaries from using software that looks out for such as &lt;/span&gt;&lt;a class="_1lsz7 _3Bkfb" href="https://www.unodc.org/e4j/en/cybercrime/module-12/key-issues/online-child-sexual-exploitation-and-abuse.html" rel="noopener noreferrer" target="_blank"&gt;Child Sexual Abuse Material&lt;/a&gt;&lt;span&gt; as well as  online predatory behaviour. Additionally, concerning the age of consent under the Bill, the JPC could look at international best practices and come up with ways to make sure that children can use the internet and have rights over their data, which would enable them to grow up with more awareness about data protection and privacy. One such example to look at could be the Children's Online Privacy Protection Rule (COPPA) in the US, where the rules apply to operators of websites and online services that collect personal information from kids &lt;/span&gt;&lt;a class="_1lsz7 _3Bkfb" href="https://www.ftc.gov/tips-advice/business-center/guidance/childrens-online-privacy-protection-rule-six-step-compliance" rel="noopener noreferrer" target="_blank"&gt;under 13 &lt;/a&gt;&lt;span&gt;or provide services to children that are directed at a general audience, but have actual knowledge that they collect personal information from such children. A form of combination of this system and the significant data fiduciary classification could be one possible way to ensure that children’s data and privacy are preserved online.&lt;/span&gt;&lt;/p&gt;
&lt;hr /&gt;
&lt;p&gt;The authors are researchers at the Centre for Internet and Society and thank their colleague Arindrajit Basu for his inputs.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/ijlt-shweta-mohandas-and-anamika-kundu-march-6-2022-nothing-to-kid-about-childrens-data-under-the-new-data-protection-bill'&gt;https://cis-india.org/internet-governance/blog/ijlt-shweta-mohandas-and-anamika-kundu-march-6-2022-nothing-to-kid-about-childrens-data-under-the-new-data-protection-bill&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Shweta Mohandas and Anamika Kundu</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Digitalisation</dc:subject>
    
    
        <dc:subject>Digital Knowledge</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Data Management</dc:subject>
    

   <dc:date>2022-03-10T13:19:52Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/medianama-february-21-2022-amber-sinha-data-protection-bill-digital-healthcare-case-study">
    <title>Clause 12 Of The Data Protection Bill And Digital Healthcare: A Case Study</title>
    <link>https://cis-india.org/internet-governance/blog/medianama-february-21-2022-amber-sinha-data-protection-bill-digital-healthcare-case-study</link>
    <description>
        &lt;b&gt;In light of the state’s emerging digital healthcare apparatus, how does Clause 12 alter the consent and purpose limitation model?&lt;/b&gt;
        &lt;p&gt;The blog post was &lt;a class="external-link" href="https://www.medianama.com/2022/02/223-data-protection-bill-digital-healthcare-case-study/"&gt;published in Medianama&lt;/a&gt; on February 21, 2022. This is the second in a two-part series by Amber Sinha.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;In the &lt;a href="https://www.medianama.com/2022/02/223-data-protection-bill-consent-clause-state-function/"&gt;previous post&lt;/a&gt;, I looked at provisions on non-consensual data processing for state functions under the most recent version of recommendations by the Joint Parliamentary Committee on India’s Data Protection Bill (DPB). The true impact of these provisions can only be appreciated in light of ongoing policy developments and real-life implications.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;To appreciate the significance of the dilutions in Clause 12, let us consider the Indian state’s range of schemes promoting digital healthcare. In July 2018, NITI Aayog, a central government policy think tank in India released a strategy and approach paper (Strategy Paper) on the formulation of the National Health Stack which envisions the creation of a federated application programming interface (API)-enabled health information ecosystem. While the Ministry of Health and Family Welfare has focused on the creation of Electronic Health Records (EHR) Standards for India during the last few years and also identified a contractor for the creation of a centralised health information platform (IHIP), this Strategy Paper advocates a completely different approach, which is described as a Personal Health Records (PHR) framework. In 2021, the National Digital Health Mission (NDHM) was launched under which a citizen shall have the option to obtain a digital health ID. A digital health ID is a unique ID and will carry all health records of a person.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;A Stack Model for Big Data Ecosystem in Healthcare&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;A stack model as envisaged in the Strategy Paper, consists of several layers of open APIs connected to each other, often tied together by a unique health identifier. The open nature of APIs has the advantage that it allows public and private actors to build solutions on top of it, which are interoperable with all parts of the stack. It is however worth considering both the ‘openness’ and the role that the state plays in it.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Even though the APIs are themselves open, they are a part of a pre-decided technological paradigm, built by private actors and blessed by the state. Even though innovators can build on it, the options available to them are limited by the information architecture created by the stack model. When such a technological paradigm is created for healthcare reform and health data, the stack model poses additional challenges. By tying the stack model to the unique identity, without appropriate processes in place for access control, siloed information, and encrypted communication, the stack model poses tremendous privacy and security concerns. The broad language under Clause 12 of the DPB needs to be looked at in this context.&lt;/p&gt;
&lt;p&gt;Clause 12 allows non-consensual processing of personal data where it is necessary “for the performance of any function of the state authorised by law” in order to provide a service or benefit from the State. In the previous post, I had highlighted the import of the use of only ‘necessity’ to the exclusion of ‘proportionality’. Now, we need to consider its significance in light of the emerging digital healthcare apparatus being created by the state.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The National Health Stack and National Digital Health Mission together envision an intricate system of data collection and exchange which in a regulatory vacuum would ensure unfettered access to sensitive healthcare data for both the state and private actors registered with the platforms. The Stack framework relies on repositories where data may be accessed from multiple nodes within the system. Importantly, the Strategy Paper also envisions health data fiduciaries to facilitate consent-driven interaction between entities that generate the health data and entities that want to consume the health records for delivering services to the individual. The cast of characters involve the National Health Authority, health care providers and insurers who access the National Health Electronic Registries, unified data from different programmes such as National Health Resource Repository (NHRR), NIN database, NIC and the Registry of Hospitals in Network of Insurance (ROHINI), private actors such as Swasth, iSpirt who assist the Mission as volunteers. The currency that government and private actors are interested in is data.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The promised benefits of healthcare data in an anonymised and aggregate form range from Disease Surveillance to Pharmacovigilance as well as Health Schemes Management Systems and Nutrition Management, benefits which have only been more acutely emphasised during the pandemic. However, the pandemic has also normalised the sharing of sensitive healthcare data with a variety of actors, without much thinking on much-needed data minimisation practises.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The potential misuses of healthcare data include greater state surveillance and control, predatory and discriminatory practices by private actors which rely on Clause 12 to do away with even the pretense of informed consent so long as the processing of data is deemed necessary by the state and its private sector partners to provide any service or benefit.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Subclause (e) in Clause 12, which was added in the last version of the Bill drafted by MeitY and has been retained by the JPC, allows processing wherever it is necessary for ‘any measures’ to provide medical treatment or health services during an epidemic, outbreak or threat to public health. Yet again, the overly-broad language used here is designed to ensure that any annoyances of informed consent can be easily brushed aside wherever the state intends to take any measures under any scheme related to public health.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Effectively, how does the framework under Clause 12 alter the consent and purpose limitation model? Data protection laws introduce an element of control by tying purpose limitation to consent. Individuals provide consent to specified purposes, and data processors are required to respect that choice. Where there is no consent, the purposes of data processing are sought to be limited by the necessity principle in Clause 12. The state (or authorised parties) must be able to demonstrate necessity to the exercise of state function, and data must only be processed for those purposes which flow out of this necessity. However, unlike the consent model, this provides an opportunity to keep reinventing purposes for different state functions.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In the absence of a data protection law, data collected by one agency is shared indiscriminately with other agencies and used for multiple purposes beyond the purpose for which it was collected. The consent and purpose limitation model would have addressed this issue. But, by having a low threshold for non-consensual processing under Clause 12, this form of data processing is effectively being legitimised.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/medianama-february-21-2022-amber-sinha-data-protection-bill-digital-healthcare-case-study'&gt;https://cis-india.org/internet-governance/blog/medianama-february-21-2022-amber-sinha-data-protection-bill-digital-healthcare-case-study&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Data Governance</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2022-03-01T15:07:44Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/medianama-february-18-2021-amber-sinha-data-protection-bill-consent-clause-state-function">
    <title>How Function Of State May Limit Informed Consent: Examining Clause 12 Of The Data Protection Bill</title>
    <link>https://cis-india.org/internet-governance/blog/medianama-february-18-2021-amber-sinha-data-protection-bill-consent-clause-state-function</link>
    <description>
        &lt;b&gt;The collective implication of leaving out ‘proportionality’ from Clause 12 is to provide very wide discretionary powers to the state.&lt;/b&gt;
        &lt;p&gt;The blog post was &lt;a class="external-link" href="https://www.medianama.com/2022/02/223-data-protection-bill-consent-clause-state-function/"&gt;published in Medianama&lt;/a&gt; on February 18, 2022. This is the first of a two-part series by Amber Sinha.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;In 2018, hours after the Committee of Experts led by Justice Srikrishna Committee released their report and draft bill, I wrote &lt;a href="https://www.livemint.com/Opinion/zY8NPWoWWZw8AfI5JQhjmL/Draft-privacy-bill-and-its-loopholes.html"&gt;an opinion piece&lt;/a&gt; providing my quick take on what was good and bad about the bill. A section of my analysis focused on Clause 12 (then Clause 13) which provides for non-consensual processing of personal data for state functions. I called this provision a ‘carte-blanche’ which effectively allowed the state to process a citizen’s data for practically all interactions between them without having to deal with the inconvenience of seeking consent. My former colleague, Pranesh Prakash &lt;a href="https://twitter.com/pranesh/status/1023116679440621568"&gt;pointed out&lt;/a&gt; that this was not a correct interpretation of the provision as I had missed the significance of the word ‘necessary’ which was inserted to act as a check on the powers of the state. He also pointed out, correctly, that in its construction, this provision is equivalent to the position in European General Data Protection Regulation (Article 6 (i) (e)), and is perhaps even more restrictive.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;While I agree with what Pranesh says above (his claims are largely factual, and there can be no basis for disagreement), my view of Clause 12 has not changed. While Clause 35 has been a focus of considerable discourse and analysis, for good reason, I continue to believe that Clause 12 remains among the most dangerous provisions of this bill, and I will try to unpack here, why.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Data Protection Bill 2021 has a chapter on the grounds for processing personal data, and one of those grounds is consent by the individual. The rest of the grounds deal with various situations in which personal data can be processed without seeking consent from the individual. Clause 12 lays down one of the grounds. It allows the state to process data without the consent of the individual in the following cases —&lt;/p&gt;
&lt;p&gt;a)  where it is necessary to respond to a medical emergency&lt;br /&gt;b)  where it is necessary for the state to provide a service or benefit to the individual&lt;br /&gt;c)  where it is necessary for the state to issue any certification, licence or permit&lt;br /&gt;d)  where it is necessary under any central or state legislation, or to comply with a judicial order&lt;br /&gt;e)  where it is necessary for any measures during an epidemic, outbreak or threat to public health&lt;br /&gt;f)  where it is necessary for safety procedures during a disaster or breakdown of public order&lt;/p&gt;
&lt;p&gt;In order to carry out (b) and (c), there is also the added requirement that the state function must be authorised by law.&lt;/p&gt;
&lt;h2&gt;Twin restrictions in Clause 12&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;The use of the words ‘necessary’ and ‘authorised by law’ is intended to pose checks on the powers of the state. The first restriction seeks to limit actions to only those cases where the processing of personal data would be necessary for the exercise of the state function. This should mean that if the state function can be exercised without non-consensual processing of personal data, then it must be done so. Therefore, while acting under this provision, the state should only process my data if it needs to do so, to provide me with the service or benefit. The second restriction means that this would apply to only those state functions which are authorised by law, meaning only those functions which are supported by validly enacted legislation.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;What we need to keep in mind regarding Clause 12 is that the requirement of ‘authorised by law’ does not mean that legislation must provide for that specific kind of data processing. It simply means that the larger state function must have legal backing. The danger is how these provisions may be used with broad mandates. If the activity in question is non-consensual collection and processing of, say, demographic data of citizens to create state resident hubs which will assist in the provision of services such as healthcare, housing, and other welfare functions; all that may be required is that the welfare functions are authorised by law.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Scope of privacy under Puttaswamy&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;It would be worthwhile, at this point, to delve into the nature of restrictions that the landmark Puttaswamy judgement discussed that the state can impose on privacy. The judgement clearly identifies the principles of informed consent and purpose limitation as central to informational privacy. As discussed repeatedly during the course of the hearings and in the judgement, privacy, like any other fundamental right, is not absolute. However, restrictions on the right must be reasonable in nature. In the case of Clause 12, the restrictions on privacy in the form of denial of informed consent need to be tested against a constitutional standard. In Puttaswamy, the bench ​was ​not ​required ​to ​provide ​a ​legal ​test ​to ​determine ​the ​extent ​and ​scope ​of the ​right ​to ​privacy, but they do provide sufficient ​guidance ​for ​us ​to ​contemplate ​how ​the ​limits ​and ​scope ​of ​the ​constitutional ​right ​to ​privacy ​could ​be ​determined ​in ​future ​cases.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Puttaswamy judgement clearly states that “the right to privacy is protected as an intrinsic part of the right to life and personal liberty under Article 21 and as a part of the freedoms guaranteed by Part III of the Constitution.” By locating the right not just in Article 21 but also in the entirety of Part III, the bench clearly requires that “the drill of various Articles to which the right relates must be scrupulously followed.” This means that where transgressions on privacy relate to different provisions in Part III, the different tests under those provisions will apply along with those in Article 21. For instance, where the restrictions relate to personal freedoms, the tests under both Article 19 (right to freedoms) and Article 21 (right to life and liberty) will apply.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In the case of Clause 12, the three tests laid down by Justice Chandrachud are most operative —&lt;br /&gt;a) the existence of a “law”&lt;br /&gt;b) a “legitimate State interest”&lt;br /&gt;c) the requirement of “proportionality”.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The first test is already reflected in the use of the phrase ‘authorised by law’ in Clause 12. The test under Article 21 would imply that the function of the state should not merely be authorised by law, but that the law, in both its substance and procedure, must be ‘fair, just and reasonable.’ The next test is that of ‘legitimate state interest’. In its report, the Joint Parliamentary Committee places emphasis on Justice Chandrachud’s use of “allocation of resources for human development” in an illustrative list of legitimate state interests. The report claims that the ground, functions of the state, thus satisfies the legitimate state interest. We do not dispute this claim.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Proportionality and Clause 12&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;It is the final test of ‘proportionality’ articulated by the Puttaswamy judgement, which is most operative in this context. Unlike Clauses 42 and 43 which include the twin tests of necessity and proportionality, the committee has chosen to only employ one ground in Clause 12. Proportionality is a commonly employed ground in European jurisprudence and common law countries such as Canada and South Africa, and it is also an integral part of Indian jurisprudence. As commonly understood, the proportionality test consists of three parts —&lt;/p&gt;
&lt;p&gt;a)  the limiting measures must be carefully designed, or rationally connected, to the objective&lt;br /&gt;b)  they must impair the right as little as possible&lt;br /&gt;c)  the effects of the limiting measures must not be so severe on individual or group rights that the legitimate state interest, albeit important, is outweighed by the abridgement of rights.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The first test is similar to the test of proximity under Article 19. The test of ‘necessity’ in Clause 12 must be viewed in this context. It must be remembered that the test of necessity is not limited to only situations where it may not be possible to obtain consent while providing benefits. My reservations with the sufficiency of this standard stem from observations made in the report, as well as the relatively small amount of jurisprudence on this term in Indian law.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Srikrishna Report interestingly mentions three kinds of scenarios where consent should not be required — where it is not appropriate, necessary, or relevant for processing. The report goes on to give an example of inappropriateness. In cases where data is being gathered to provide welfare services, there is an imbalance in power between the citizen and the state. Having made that observation, the committee inexplicably arrives at a conclusion that the response to this problem is to further erode the power available to citizens by removing the need for consent altogether under Clause 12. There is limited jurisprudence on the standard of ‘necessity’ under Indian law. The Supreme Court has articulated this test as ‘having reasonable relation to the object the legislation has in view.’ If we look elsewhere for guidance on how to read ‘necessity’, the ECHR in Handyside v United Kingdom held it to be neither “synonymous with indispensable” nor does it have the “flexibility of such expressions as admissible, ordinary, useful, reasonable or desirable.” In short, there must be a pressing social need to satisfy this ground.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;However, the other two tests of proportionality do not find a mention in Clause 12 at all. There is no requirement of ‘narrow tailoring’, that the scope of non-consensual processing must impair the right as little as possible. It is doubly unfortunate that this test does not find a place, as unlike necessity, ‘narrow tailoring’ is a test well understood in Indian law. This means that while there is a requirement to show that processing personal data was necessary to provide a service or benefit, there is no requirement to process data in a way that there is minimal non-consensual processing. The fear is that as long as there is a reasonable relation between processing data and the object of the function of state, state authorities and other bodies authorised by it, do not need to bother with obtaining consent.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Similarly, the third test of proportionality is also not represented in this provision. It provides a test between the abridgement of individual rights and legitimate state interest in question, and it requires that the first must not outweigh the second. The absence of the proportionality test leaves Clause 12 devoid of any such consideration. Therefore, as long as the test of necessity is met under this law, it need not evaluate the denial of consent against the service or benefit that is being provided.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The collective implication of leaving out ‘proportionality’ from Clause 12 is to provide very wide discretionary powers to the state, by setting the threshold to circumvent informed consent extremely low. In the next post, I will demonstrate the ease with which Clause 12 can allow indiscriminate data sharing by focusing on the Indian government’s digital healthcare schemes.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/medianama-february-18-2021-amber-sinha-data-protection-bill-consent-clause-state-function'&gt;https://cis-india.org/internet-governance/blog/medianama-february-18-2021-amber-sinha-data-protection-bill-consent-clause-state-function&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Data Governance</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2022-03-01T14:56:49Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/pallavi-bedi-and-shweta-mohandas-cis-comments-on-data-protection-bill">
    <title>CIS Comments and Recommendations on the Data Protection Bill, 2021</title>
    <link>https://cis-india.org/internet-governance/blog/pallavi-bedi-and-shweta-mohandas-cis-comments-on-data-protection-bill</link>
    <description>
        &lt;b&gt;This document is a revised version of the comments we provided on the 2019 Bill on 20 February 2020, with updates based on the amendments in the 2021 Bill.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;After nearly two years of deliberations and a few changes in its composition, the Joint Parliamentary Committee (JPC), on 17 December 2021, submitted its report on the Personal Data Protection Bill, 2019  (2019 Bill). The report also contains a new version of the law titled the Data Protection Bill, 2021 (2021 Bill). Although there were no major revisions from the previous version other than the inclusion of all data under the ambit of the bill, some provisions were amended.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This document is a revised version of the&lt;a href="https://cis-india.org/accessibility/blog/cis-comments-pdp-bill-2019"&gt; comments&lt;/a&gt; we provided on the 2019 Bill on 20 February 2020, with updates based on the amendments in the 2021 Bill. Through this document we aim to shed light on the issues that we highlighted in our previous comments that have not yet been addressed, along with additional comments on sections that have become more relevant since the pandemic began. In several instances our previous comments have either not been addressed or only partially been addressed; in such instances, we reiterate them.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;These general comments should be read in conjunction with our previous recommendations for the reader to get a comprehensive overview of what has changed from the previous version and what has remained the same. This document can also be read while referencing the new Data Protection Bill 2021 and the JPC’s report to understand some of the significant provisions of the bill.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;&lt;strong&gt;&lt;a href="https://cis-india.org/internet-governance/general-comments-data-protection-bill.pdf" class="internal-link"&gt;Read on to access the comments&lt;/a&gt; | &lt;/strong&gt;&lt;span&gt;Review and editing by Arindrajit Basu. Copy editing: The Clean Copy; Shared under Creative Commons Attribution 4.0 International license&lt;/span&gt;&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/pallavi-bedi-and-shweta-mohandas-cis-comments-on-data-protection-bill'&gt;https://cis-india.org/internet-governance/blog/pallavi-bedi-and-shweta-mohandas-cis-comments-on-data-protection-bill&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Pallavi Bedi and Shweta Mohandas</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2022-02-14T16:07:44Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/the-hindu-arindrajit-basu-february-8-2022-notes-for-india-as-the-digital-trade-juggernaut-rolls-on">
    <title>Notes for India as the digital trade juggernaut rolls on</title>
    <link>https://cis-india.org/internet-governance/blog/the-hindu-arindrajit-basu-february-8-2022-notes-for-india-as-the-digital-trade-juggernaut-rolls-on</link>
    <description>
        &lt;b&gt;Sitting out trade negotiations could result in the country losing out on opportunities to shape the rules.&lt;/b&gt;
&lt;p&gt;The article by Arindrajit Basu was &lt;a class="external-link" href="https://www.thehindu.com/opinion/op-ed/notes-for-india-as-the-digital-trade-juggernaut-rolls-on/article38393921.ece"&gt;published in The Hindu&lt;/a&gt; on February 8, 2022.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;Despite the cancellation of the Twelfth Ministerial Conference (MC12) of the World Trade Organization (WTO) late last year (scheduled date, November 30, 2021-December 3, 2021) due to COVID-19, digital trade negotiations continue their ambitious march forward. On December 14, Australia, Japan, and Singapore, co-convenors of the plurilateral Joint Statement Initiative (JSI) on e-commerce, welcomed the ‘substantial progress’ made at the talks over the past three years and stated that they expected a convergence on more issues by the end of 2022.&lt;/p&gt;
&lt;h3&gt;Holding out&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;But  therein lies the rub: even though JSI members account for over 90% of  global trade, and the initiative welcomes newer entrants, over half of  WTO members (largely from the developing world) continue to opt out of  these negotiations. They fear being arm-twisted into accepting global  rules that could etiolate domestic policymaking and economic growth.  India and South Africa have led the resistance and been the JSI’s most  vocal critics. India has thus far resisted pressures from the developed  world to jump onto the JSI bandwagon, largely through coherent legal  argumentation against the JSI and a long-term developmental vision. Yet,  given the increasingly fragmented global trading landscape and the  rising importance of the global digital economy, can India tailor its  engagement with the WTO to better accommodate its economic and  geopolitical interests?&lt;/p&gt;
&lt;h3&gt;&lt;strong&gt;Global rules on digital trade&lt;/strong&gt;&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The  WTO emerged in a largely analogue world in 1994. It was only at the  Second Ministerial Conference (1998) that members agreed on core rules  for e-commerce regulation. A temporary moratorium was imposed on customs  duties relating to the electronic transmission of goods and services.  This moratorium has been renewed continuously, to consistent opposition  from India and South Africa. They argue that the moratorium imposes  significant costs on developing countries as they are unable to benefit  from the revenue customs duties would bring.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The  members also agreed to set up a work programme on e-commerce across  four issue areas at the General Council: goods, services, intellectual  property, and development. Frustrated by a lack of progress in the two  decades that followed, 70 members brokered the JSI in December 2017 to  initiate exploratory work on the trade-related aspects of e-commerce.  Several countries, including developing countries, signed up in 2019  despite holding contrary views to most JSI members on key issues.  Surprise entrants, China and Indonesia, argued that they sought to shape  the rules from within the initiative rather than sitting on the  sidelines.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;India  and South Africa have rightly pointed out that the JSI contravenes the  WTO’s consensus-based framework, where every member has a voice and vote  regardless of economic standing. Unlike the General Council Work  Programme, which India and South Africa have attempted to revitalise in  the past year, the JSI does not include all WTO members. For the process  to be legally valid, the initiative must either build consensus or  negotiate a plurilateral agreement outside the aegis of the WTO.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;India  and South Africa’s positioning strikes a chord at the heart of the  global trading regime: how to balance the sovereign right of states to  shape domestic policy with international obligations that would enable  them to reap the benefits of a global trading system.&lt;/p&gt;
&lt;h3&gt;&lt;strong&gt;A contested regime&lt;/strong&gt;&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;There  are several issues upon which the developed and developing worlds  disagree. One such issue concerns international rules relating to the  free flow of data across borders. Several countries, both within and  outside the JSI, have imposed data localisation mandates that compel  corporations to store and process data within territorial borders. This  is a key policy priority for India. Several payment card companies,  including Mastercard and American Express, were prohibited from issuing  new cards for failure to comply with a 2018 financial data localisation  directive from the Reserve Bank of India. The Joint Parliamentary  Committee (JPC) on data protection has recommended stringent  localisation measures for sensitive personal data and critical personal  data in India’s data protection legislation. However, for nations and  industries in the developed world looking to access new digital markets,  these restrictions impose unnecessary compliance costs, thus arguably  hampering innovation and supposedly amounting to unfair protectionism.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;There  is a similar disagreement regarding domestic laws that mandate the  disclosure of source codes. Developed countries believe that this  hampers innovation, whereas developing countries believe it is essential  for algorithmic transparency and fairness — which was another key  recommendation of the JPC report in December 2021.&lt;/p&gt;
&lt;h3&gt;&lt;strong&gt;India’s choices&lt;/strong&gt;&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;India’s  global position is reinforced through narrative building by political  and industrial leaders alike. Data sovereignty is championed as a means  of resisting ‘data colonialism’, the exploitative economic practices and  intensive lobbying of Silicon Valley companies. Policymaking for  India’s digital economy is at a critical juncture. Surveillance reform,  personal data protection, algorithmic governance, and non-personal data  regulation must be galvanised through evidenced insights,and work for  individuals, communities, and aspiring local businesses — not just  established larger players.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Hastily  signing trading obligations could reduce the space available to frame  appropriate policy. But sitting out trade negotiations will mean that  the digital trade juggernaut will continue unchecked, through  mega-regional trading agreements such as the Regional Comprehensive  Economic Partnership (RCEP) and the Comprehensive and Progressive  Agreement for Trans-Pacific Partnership (CPTPP). India could risk  becoming an unwitting standard-taker in an already fragmented trading  regime and lose out on opportunities to shape these rules instead.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Alternatives  exist; negotiations need not mean compromise. For example, exceptions  to digital trade rules, such as ‘legitimate public policy objective’ or  ‘essential security interests’, could be negotiated to preserve  policymaking where needed while still acquiescing to the larger  agreement. Further, any outcome need not be an all-or-nothing  arrangement. Taking a cue from the Digital Economy Partnership Agreement  (DEPA) between Singapore, Chile, and New Zealand, India can push for a  framework where countries can pick and choose modules with which they  wish to comply. These combinations can be amassed incrementally as  emerging economies such as India work through domestic regulations.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Despite  its failings, the WTO plays a critical role in global governance and is  vital to India’s strategic interests. Negotiating without surrendering  domestic policy-making holds the key to India’s digital future.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;&lt;i&gt;Arindrajit Basu is Research Lead at the Centre for Internet and Society, India. The views expressed are personal. The author would like to thank The Clean Copy for edits on a draft of this article.&lt;/i&gt;&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/the-hindu-arindrajit-basu-february-8-2022-notes-for-india-as-the-digital-trade-juggernaut-rolls-on'&gt;https://cis-india.org/internet-governance/blog/the-hindu-arindrajit-basu-february-8-2022-notes-for-india-as-the-digital-trade-juggernaut-rolls-on&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>basu</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Digitalisation</dc:subject>
    
    
        <dc:subject>Digital Knowledge</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>E-Commerce</dc:subject>
    
    
        <dc:subject>Digital India</dc:subject>
    

   <dc:date>2022-02-09T15:04:36Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>




</rdf:RDF>
