The Centre for Internet and Society
https://cis-india.org
These are the search results for the query, showing results 11 to 25.
The Government’s Increased Focus on Regulating Non-Personal Data: A Look at the Draft National Data Governance Framework Policy
https://cis-india.org/internet-governance/blog/national-data-governance-framework-policy
<b>Digvijay Chaudhary and Anamika Kundu wrote an article on the National Data Governance Framework Policy. It was edited by Shweta Mohandas.</b>
<h2>Introduction</h2>
<p style="text-align: justify; ">Non Personal Data (‘NPD’) can be <a href="https://www.taylorfrancis.com/chapters/edit/10.4324/9780429022241-8/regulating-non-personal-data-age-big-data-bart-van-der-sloot">understood</a> as any information not relating to an identified or identifiable natural person. The origin of such data can be both human and non-human. Human NPD would be such data which has been anonymised in such a way that the person to whom the data relates cannot be re-identified. Non-human NPD would mean any such data that did not relate to a human being in the first place, for example, weather data. There has been a gradual demonstrated interest in NPD by the government in recent times. This new focus on regulating non personal data can be owed to the economic incentive it provides. In its report, the Sri Krishna committee, released in 2018 agreed that NPD holds considerable strategic or economic interest for the nation, however, it left the questions surrounding NPD to a future committee.</p>
<h2 style="text-align: justify; ">History of NPD Regulation</h2>
<p dir="ltr" style="text-align: justify; ">In 2020, the Ministry of Electronics and Information Technology (‘MEITY’) constituted an expert committee (‘NPD Committee’) to study various issues relating to NPD and to make suggestions on the regulation of non-personal data. The NPD Committee differentiated NPD into human and non-human NPD, based on the data’s origin. Human NPD would include all information that has been stripped of any personally identifiable information and non-human NPD meant any information that did not contain any personally identifiable information in the first place (eg. weather data). The final report of the NPD Committee is awaited but the Committee came out with a <a href="https://static.mygov.in/rest/s3fs-public/mygov_160922880751553221.pdf">revised draft</a> of its recommendations in December 2020. In its December 2020 report, the NPD Committee proposed the creation of a National Data Protection Authority (‘NPDA’) as it felt this is a new and emerging area of regulation. Thereafter, the Joint Parliamentary Committee on the Personal Data Protection Bill, 2019 (‘JPC’) came out with its <a href="http://164.100.47.193/lsscommittee/Joint%20Committee%20on%20the%20Personal%20Data%20Protection%20Bill,%202019/17_Joint_Committee_on_the_Personal_Data_Protection_Bill_2019_1.pdf">version of the Data Protection Bill </a>where it amended the short title of the PDP Bill 2019 to Data Protection Bill, 2021 widening the ambit of the Bill to include all types of data. The JPC report focuses only on human NPD, noting that non-personal data is essentially derived from one of the three sets of data - personal data, sensitive personal data, critical personal data - which is either anonymized or is in some way converted into non-re-identifiable data.</p>
<p dir="ltr" style="text-align: justify; ">On February 21, 2022, the Ministry of Electronics and Information Technology (‘MEITY’) came out with the <a href="https://www.meity.gov.in/content/draft-india-data-accessibility-use-policy-2022">Draft India Data Accessibility and Use Policy, 2022</a> (‘Draft Policy’). The Draft Policy was strongly criticised mainly due to its aims to monetise data through its sale and licensing to body corporates. The Draft Policy had stated that anonymised and non-personal data collected by the State that has “<a href="https://www.medianama.com/2022/06/223-new-data-governance-policy-privacy/">undergone value addition</a>” could be sold for an “appropriate price”. During the Draft Policy’s consultation process, it had been withdrawn several times and then finally removed from the website.<a href="https://www.meity.gov.in/writereaddata/files/Draft%20India%20Data%20Accessibility%20and%20Use%20Policy_0.pdf"> The National Data Governance Framework Policy</a> (‘NDGF Policy’) is a successor to this Draft Policy. There is a change in the language put forth in the NDGF Policy from the Draft Policy, where the latter mainly focused on monetary growth. The new NDGF Policy aims to regulate anonymised non-personal data (‘NPD’) kept with governmental authorities and make it accessible for research and improving governance. It wishes to create an ‘India Datasets programme’ which will consist of the aforementioned datasets. While MEITY has opened the draft for public comments, is a need to spell out the procedure in some ways for stakeholders to draft recommendations for the NDGF policies in an informed manner. Through this piece, we discuss the NDGF Policy in terms of issues related to the absence of a comprehensive Data Protection Framework in India and the jurisdictional overlap of authorities under the NDGF Policy and DPB.</p>
<h2 dir="ltr" style="text-align: justify; ">What the National Data Governance Framework Policy Says</h2>
<p dir="ltr" style="text-align: justify; ">Presently in India, NPD is stored in a variety of governmental departments and bodies. It is difficult to access and use this stored data for governmental functions without modernising collection and management of governmental data. Through the NDGF Policy, the government aims to build an Indian data storehouse of anonymised non-personal datasets and make it accessible for both improving governance and encouraging research. It imagines the establishment of an Indian Data Office (‘IDO’) set up by MEITY , which shall be responsible for consolidating data access and sharing of non-personal data across the government. In addition, it also mandates a Data Management Unit for every Ministry/department that would work closely with the IDO. IDO will also be responsible for issuing protocols for sharing NPD. The policy further imagines an Indian Data Council (‘IDC’) whose function would be to define frameworks for important datasets, finalise data standards, and Metadata standards and also review the implementation of the policy. The NDGF Policy has provided a broad structure concerning the setting up of anonymisation standards, data retention policies, data quality, and data sharing toolkit. The NDGF Policy states that these standards shall be developed and notified by the IDO or MEITY or the Ministry in question and need to be adhered to by all entities.</p>
<h2 dir="ltr" style="text-align: justify; ">The Data Protection Framework in India</h2>
<p dir="ltr" style="text-align: justify; ">The report adopted by the JPC, felt that it is simpler to enact a single law and a single regulator to oversee all the data that originates from any data principal and is in the custody of any data fiduciary. According to the JPC, the draft Bill deals with various kinds of data at various levels of security. The JPC also recommended that since the Data Protection Bill (‘DPB’) will handle both personal and non-personal data, any further policy / legal framework on non-personal data may be made a part of the same enactment instead of any separate legislation. The draft DPB states that what is to be done with the NDP shall be decided by the government from time to time according to its policy. As such, neither the DPB, 2021 nor the NDGF Policy go into details of regulating NPD but only provide a broad structure of facilitating free-flow of NPD, without taking into account the <a href="https://cis-india.org/internet-governance/cis-comments-revised-npd-report/view">specific concerns</a> that have been raised since the NPD committee came out with its draft report on regulating NPD dated December 2020.</p>
<h2 dir="ltr" style="text-align: justify; ">Jurisdictional overlaps among authorities and other concerns</h2>
<p dir="ltr" style="text-align: justify; ">Under the NDGF policy, all guidelines and rules shall be published by a body known as the Indian Data Management Office (‘IDMO’). The IDMO is set to function under the MEITY and work with the Central government, state governments and other stakeholders to set standards. Currently, there is no sign of when the DPB will be passed as law. According to the JPC, the reason for including NPD within the DPB was because of the impossibility to differentiate between PD and NPD. There are also certain overlaps between the DPB and the NDGF which are not discussed by the NDGF. NDGF does not discuss the overlap between the IDMO and Data Protection Authority (‘DPA’) established under the DPB 2021.</p>
<p dir="ltr" style="text-align: justify; ">Under the DPB, the DPA is tasked with specifying codes of practice under clause 49. On the other hand, the NDGF has imagined the setting up of IDO, IDMO, and the IDC, which shall be responsible for issuing codes of practice such as data retention, and data anonymisation, and data quality standards. As such, there appears to be some overlap in the functions of the to-be-constituted DPA and the NDGF Policy.</p>
<p dir="ltr" style="text-align: justify; ">Furthermore, while the NDGF Policy aims to promote openness with respect to government data, there is a conflict with <a href="https://opengovdata.org/">open government data (‘OGD’) principle</a>s when there is a price attached to such data. OGD is data which is collected and processed by the government for free use, reuse and distribution. Any database created by the government must be publicly accessible to ensure compliance with the OGD principles.</p>
<h2 dir="ltr" style="text-align: justify; ">Conclusion</h2>
<p dir="ltr" style="text-align: justify; ">Streamlining datasets across different authorities is a huge challenge for the government and hence the NGDF policy in its current draft requires a lot of clarification. The government can take inspiration from the European Union which in 2018, came out with a principles-based approach coupled with self-regulation on the framework of the free flow of non-personal data. The <a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52019DC0250&from=EN">guidance</a> on the free-flow of non-personal data defines non-personal data based on the origin of data - data which originally did not relate to any personal data (non-human NPD) and data which originated from personal data but was subsequently anonymised (human NPD). The <a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52019DC0250&from=EN">regulation</a> further realises the reality of mixed data sets and regulates only the non-personal part of such datasets and where the datasets are inextricably linked, the GDPR would apply to such datasets. Moreover, any policy that seeks to govern the free flow of NPD ought to make it clear that in case of re-identification of anonymised data, such re-identified data would be considered personal data. The DPB, 2021 and the NGDF, both fail to take into account this difference.</p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/national-data-governance-framework-policy'>https://cis-india.org/internet-governance/blog/national-data-governance-framework-policy</a>
</p>
Authors: Digvijay Chaudhary and Anamika Kundu · Tags: Open Data, Open Government Data, Internet Governance, Privacy · Published: 2022-06-30

Making Voices Heard
https://cis-india.org/internet-governance/blog/making-voices-heard
<b>We are happy to announce the launch of our final report on the study ‘Making Voices Heard: Privacy, Inclusivity, and Accessibility of Voice Interfaces in India’. The study was undertaken with support from the Mozilla Corporation.</b>
<p style="text-align: center; "><img src="https://cis-india.org/home-images/WebsiteHeader.jpg/@@images/8d8ed2a0-f0e4-44d7-8938-493b186402c5.jpeg" alt="Making Voices Heard" class="image-inline" title="Making Voices Heard" /></p>
<p style="text-align: justify; ">We believe that voice interfaces have the potential to democratise the use of the internet by addressing limitations related to reading and writing on digital text-only platforms and devices. This report examines the current landscape of voice interfaces in India, with a focus on concerns related to privacy and data protection, linguistic barriers, and accessibility for persons with disabilities (PwDs).</p>
<p style="text-align: justify; ">The report features a visual mapping of 23 voice interfaces and technologies publicly available in India, along with a literature survey, a policy brief towards development and use of voice interfaces and a design brief documenting best practices and users’ needs, both with a focus on privacy, languages, and accessibility considerations, and a set of case studies on three voice technology platforms. <span>Read and download the full report <a class="external-link" href="http://voice.cis-india.org/">here</a></span></p>
<hr />
<h3>Credits</h3>
<p><strong>Research</strong>: Shweta Mohandas, Saumyaa Naidu, Deepika Nandagudi Srinivasa, Divya Pinheiro, and Sweta Bisht.</p>
<p><strong>Conceptualisation, Planning, and Research Inputs</strong>: Sumandro Chattapadhyay, and Puthiya Purayil Sneha.</p>
<p><strong>Illustration</strong>: Kruthika NS (Instagram @theworkplacedoodler). <strong>Website Design</strong>: Saumyaa Naidu. <strong>Website Development</strong>: Sumandro Chattapadhyay and Pranav M Bidare.</p>
<p><strong>Review and Editing</strong>: Puthiya Purayil Sneha, Divyank Katira, Pranav M Bidare, Torsha Sarkar, Pallavi Bedi, and Divya Pinheiro.</p>
<p><strong>Copy Editing</strong>: The Clean Copy</p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/making-voices-heard'>https://cis-india.org/internet-governance/blog/making-voices-heard</a>
</p>
Posted by: shweta · Tags: Voice User Interface, Privacy, Accessibility, Internet Governance, Research, Featured, Homepage · Published: 2022-06-27

CCTVs in Public Spaces and the Data Protection Bill, 2021
https://cis-india.org/internet-governance/blog/rssr-anamika-kundu-digvijay-s-chaudhary-april-20-2022-cctvs-in-public-spaces-and-data-protection-bill-2021
<b>This article has been authored by Ms. Anamika Kundu, Research Assistant at the Centre for Internet and Society, and Digvijay S. Chaudhary, Researcher at the Centre for Internet and Society. This blog is a part of RSRR’s Blog Series on the Right to Privacy and the Legality of Surveillance, in collaboration with the Centre for Internet & Society.</b>
<p><span>The article by Anamika Kundu and Digvijay S. Chaudhary was originally </span><a class="external-link" href="https://rsrr.in/2022/04/20/cctv-surveillance-privacy/">published by RGNUL Student Research Review</a><span> on April 20, 2022</span></p>
<p><span><img src="https://cis-india.org/home-images/Surveillance.jpg/@@images/f8fad564-44ab-46e2-bd44-29607ea7fd19.jpeg" alt="Surveillance" class="image-inline" title="Surveillance" /></span></p>
<hr />
<h2>Introduction</h2>
<p style="text-align: justify; ">In recent times, Indian cities have seen an expansion of state deployed CCTV cameras. According to a recent report, in terms of CCTVs deployed, Delhi was considered as the most surveilled city in the world, surpassing even the most surveilled cities in China. Delhi was not the only Indian city in that list, Chennai and Mumbai also made it to the list. In Hyderabad as well, the development of a Command and Control Centre aims to link the city’s surveillance infrastructure in real-time. Even though studies have shown that there is little correlation between CCTVs and crime control, deployment of CCTV cameras has been justified on the basis of national security and crime deterrence. Such an activity brings about the collection and retention of audio-visual/visual information of all individuals frequenting spaces where CCTV cameras are deployed. This information could be used to identify them (directly or indirectly) based on their looks or other attributes. Potential risks associated with the misuse, and processing of such personal data also arise. These risks include large scale profiling, criminal abuse (law enforcement misusing CCTV information for personal gains), and discriminatory targeting (law enforcement disproportionately focusing on a particular group of people). As these devices capture personal data of individuals, this article seeks data protection safeguards available to data principals against CCTV surveillance employed by the State in a public space under the proposed Data Protection Bill, 2021 (the “DPB”).</p>
<h2>Safeguards Available Under the Data Protection Bill, 2021</h2>
<p style="text-align: justify; ">To use CCTV surveillance, the measures and compliance listed under the DPB have to be followed. Obligations of data fiduciaries available under Chapter II, such as consent (clause 11), notice requirement (clause 7), and fair and reasonable processing (clause 5) are common to all data processing entities for a variety of activities. Similarly, as the DPB follows the principles of data minimisation (clause 6), storage limitation (clause 9), purpose limitation (clause 5), lawful and fair processing (clause 4), transparency (clause 23), and privacy by design (clause 22), these safeguards too are common to all data processing entities/activities. If a data fiduciary processes personal data of children, it has to comply with the standards stated under clause 16.</p>
<p style="text-align: justify; ">Under the DPB, compliance differs on the basis of grounds and purpose of data processing. As such, if compliance standards differ, so do the availability of safeguards under the DPB. Of relevance to this article, there are three standards of compliance under the DPB wherein the standards of safeguards available to a data principal differ. First, cases which would fall under Chapter III and hence, not require consent. Chapter III lists grounds for processing of personal data without consent. Second, cases which would fall under exemption clauses in Chapter VIII. In such cases, the DPB or some of its provisions would be inapplicable. Clause 35 under Chapter VIII gives power to the Central Government to exempt any agency from the application of the DPB. Similarly, Clause 36 under Chapter VIII, exempts certain provisions for certain processing of personal data. Third, cases which would not fall under either of the above Chapters. In such cases, all safeguards available under the DPB would be available to the data principals. Consequently, safeguards available to data principals in each of these standards are different. We will go through each of these separately.</p>
<p style="text-align: justify; ">First, if the grounds of processing of CCTV information is such that it falls under the scope of Chapter III of the DPB, wherein the consent requirement is done away with, then in those cases, the notice requirement has to reflect such purpose, meaning that even if consent is not necessary for certain cases, other requirements under the DPB would still apply. Here, we must note that CCTV deployment by the state on such a large scale may be justified on the basis of conditions stated under clauses 12 and 14 of DPB – specifically, the condition for the performance of state function authorised by law, and public interest. The requirement under clause 12 of “authorised by law” simply means that the state function should have legal backing. Deployment of CCTVs is most likely to fall under clause 12 as various states have enacted legislations providing for CCTV deployment in the name of public safety. As a result, even if section 12 takes away the requirement of consent for certain cases, data principals should be able to exercise all rights accorded to them under the DPB (chapter V) except the right to data portability under clause 19.</p>
<p style="text-align: justify; ">Second, processing of personal data via CCTVs by government agencies could be exempted from DPB under clause 35 for certain cases under the clause. Another exemption that is particularly concerning with regard to the use of CCTVs is the exemption provided under clause 36(a). Section 36(a) says that the provisions of chapters II-VII would not apply where the data is processed in the interest of prevention, detection, investigation, and prosecution of any offence under the law. Chapters II-VII govern the obligations of data fiduciaries, grounds where consent would not be required, personal data of children, rights of data principals, transparency and accountability measures, and restrictions on transfer of personal data outside India respectively. In these cases, the requirement of fair and reasonable processing under clause 5 would also not apply. As a broad justification provided for CCTVs deployment by the government is crime control, it is possible that section 36(a) justification can be used to exempt the processing of CCTV footage from the above-mentioned safeguards.</p>
<p style="text-align: justify; ">From the above discussion, the following can be concluded. First, if the grounds of processing fall under Chapter III, then standards of fair and reasonable processing, notice requirement, and all rights except the right to data portability u/s 19 would be available to data principals. Second, if the grounds of processing fall under clause 36, then, in that case, consent requirement, notice requirement, and the rights under DPB would be unavailable as that section mandates the non-application of those chapters. In such a case, even the processing requirements of a fair and reasonable manner stand suspended. Third, if the grounds of processing of CCTV information doesn’t fall under Chapter III, then all obligations listed under Chapter II would have to be followed. Moreover, the data principal would be able to exercise all the rights available under Chapter V of the DPB.</p>
<h2>Constitutional Standards</h2>
<p style="text-align: justify; ">When the Supreme Court recognised privacy as a fundamental right in the case of Puttaswamy v. Union of India (“Puttaswamy”), it located the principles of informed consent and purpose limitation as central to informational privacy. It recognised that privacy inheres not in spaces but in an individual. It also recognised that privacy is not an absolute right and certain restrictions may be imposed on the exercise of the right. Before listing the constitutional standards that activities infringing privacy must adhere to, it’s important to answer whether there exists a reasonable expectation of privacy in CCTV footage deployed in a public space by the State?</p>
<p style="text-align: justify; ">In Puttaswamy, the court recognised that privacy is not denuded in public spaces. Writing for the plurality judgement, Chandrachud J. recognised that the notion of a reasonable expectation of privacy has elements both of a subjective and objective nature. Defining these concepts, he writes, “Privacy at a subjective level is a reflection of those areas where an individual desire to be left alone. On an objective plane, privacy is defined by those constitutional values which shape the content of the protected zone where the individual ought to be left alone…hence while the individual is entitled to a zone of privacy, its extent is based not only on the subjective expectation of the individual but on an objective principle which defines a reasonable expectation.” Note how in the above sentences, the plurality judgement recognises “a reasonable expectation” to be inherent in “constitutional values”. This is important as the meaning of what’s reasonable is to be constituted according to constitutional values and not societal norms. A second consideration that the phrase “reasonable expectation of privacy” requires is that an individual’s reasonable expectation is allied to the purpose for which the information is provided, as held in the case of Hyderabad v. Canara Bank (“Canara Bank”). Finally, the third consideration in defining the phrase is that it is context dependent. For example, in the case of In the matter of an application by JR38 for Judicial Review (Northern Ireland) 242 (2015) (link here), the UK Supreme Court was faced with a scenario where the police published the CCTV footage of the appellant involved in riotous behaviour. The question before the court was: “Whether the publication of photographs by the police to identify a young person suspected of being involved in riotous behaviour and attempted criminal damage can ever be a necessary and proportionate interference with that person’s article 8 [privacy] rights?” The majority held that there was no reasonable expectation of privacy in the case because of the nature of the criminal activity the appellant was involved in. However, the majority’s formulation of this conclusion was based on the reasoning that “expectation of privacy” was dependent on the “identification” purpose of the police. The court stated, “Thus, if the photographs had been published for some reason other than identification, the position would have been different and might well have engaged his rights to respect for his private life within article 8.1”. Therefore, as the purpose of publishing the footage was “identification” of the wrongdoer, the reasonable expectation of privacy stood excluded. The Canara Bank case was relied on by the SC in Puttaswamy. The plurality judgement in Puttaswamy also quoted the above paragraphs from the UK Supreme Court judgement.</p>
<p style="text-align: justify; ">Finally, the SC in the Aadhaar case, laid down the factors of “reasonable expectation of privacy.” Relying on those factors, the Supreme Court observed that demographic information and photographs do not raise a reasonable expectation of privacy. It further held that face photographs for the purpose of identification are not covered by a reasonable expectation of privacy. As this author has recognised, the majority in the Aadhaar case misconstrued the “reasonable expectation of privacy” to lie not in constitutional values as held in Puttaswamy but in societal norms. Even with the misapplication of the Puttaswamy principles by the majority in Aadhaar, it is clear that the exclusion of a “reasonable expectation of privacy” in face photographs is valid only for the purpose of “identification”. For purposes other than “identification”, there should exist a reasonable expectation of privacy in CCTV footage. Having recognised the existence of “reasonable expectation of privacy” in CCTV footage, let’s see how the safeguards mentioned under the DPB stand the constitutional standards of privacy laid down in Puttaswamy.</p>
<p style="text-align: justify; ">The bench in Puttaswamy located privacy not only in Article 21 but the entirety of part III of the Indian Constitution. Where transgression to privacy relates to different provisions under Part III, the tests evolved under those Articles would apply. Puttaswamy recognised that national security and crime control are legitimate state objectives. However, it also recognised that any limitation on the right must satisfy the proportionality test. The proportionality test requires a legitimate state aim, rational nexus, necessity, and balancing of interests. Infringement on the right to privacy occurs under the first and second standard. The first requirement of proportionality stands justified as national security and crime control have been recognised to be legitimate state objectives. However, it must be noted that the EU Guidelines on Processing of Personal Data through video devices state that the mere purpose of “safety” or “for your safety” is not sufficiently specific and is contrary to the principle that personal data shall be processed lawfully, fairly and in a transparent manner in relation to the data subject. The second requirement is a rational nexus. As stated above, there is little correlation between crime control and surveillance measures. Even if the state justifies a rational nexus between state aim and the action employed, it is the necessity part of the proportionality test where the CCTV surveillance measures fail (as explained by this author). Necessity requires us to draw a list of alternatives and their impact on an individual, and then do a balancing analysis with regard to the alternatives. Here, judicial scrutiny of the exemption order under clause 35 is a viable alternative that respects individual rights while at the same time, not interfering with the state’s aim.</p>
<h2>Conclusion</h2>
<p style="text-align: justify; ">Informed consent and purpose limitation were stated to be central principles of informational privacy in Puttaswamy. Among the three standards we identified, the principles of informed consent and purpose limitation remain available only in the third standard. In the first standard, even though the requirement of consent has become unavailable, the principle of purpose limitation would still be applicable to the processing of such data. The second standard is of particular concern wherein neither of those principles is available to data principals. It is worth mentioning here that in large scale monitoring activities such as CCTV surveillance, the safeguards which the DPB lists out would inevitably have an implementation flaw. The reason is that in scenarios where individuals refuse consent for large scale CCTV monitoring, what alternatives would the government offer to those individuals? Practically, CCTV surveillance would fall under clause 12 standards where consent would not be required. Even in those cases, would the notice requirement safeguard be diminished to “you are under surveillance” notices? When we talk about exercise of rights available under the DPB, how would an individual effectively exercise their right when the data processing is not limited to a particular individual? These questions arise because the safeguards under the DPB (and data protection laws in general) are based on individualistic notions of privacy. Interestingly, individual use cases of CCTVs have also increased with an increase in state use of CCTVs. Deployment of CCTVs for personal or domestic purposes would be exempt from the above-mentioned compliances as that would fall under the exemption provision of clause 36(d). Two additional concerns arise in relation to processing of data concerning CCTVs – the JPC report’s inclusion of Non-Personal Data (“NPD”) within the ambit of DPB, and the government’s plan to develop a National Automated Facial Recognition System (“AFRS”). A significant part of the data collected by CCTVs would fall within the ambit of NPD.With the JPC’s recommendation, it will be interesting to follow the processing standards for NPD under the DPB. AFRS has been imagined as a national database of photographs gathered from various agencies to be used in conjunction with facial recognition technology. The use of facial recognition technology with CCTV cameras raises concerns surrounding biometric data, and risks of large scale profiling. Indeed, section 27 of the DPB reflects this risk and mandates a data protection impact assessment to be undertaken by the data fiduciary with respect to processing involving new technologies or large scale profiling or use of biometric data by such technologies, however the DPB does not define what “new technology” means. Concerns around biometric data are outside the scope of the present article, however, it would be interesting to look at how the use of facial recognition technology with CCTVs could impact the safeguards under DPB.</p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/rssr-anamika-kundu-digvijay-s-chaudhary-april-20-2022-cctvs-in-public-spaces-and-data-protection-bill-2021'>https://cis-india.org/internet-governance/blog/rssr-anamika-kundu-digvijay-s-chaudhary-april-20-2022-cctvs-in-public-spaces-and-data-protection-bill-2021</a>
</p>
Authors: Anamika Kundu and Digvijay S Chaudhary · Tags: Internet Governance, Data Protection, Privacy · Published: 2022-04-28

Rethinking Acquisition of Digital Devices by Law Enforcement Agencies
https://cis-india.org/internet-governance/blog/rethinking-acquisition-of-digital-devices-by-law-enforcement-agencies
<b>This article has been selected as a part of The Right to Privacy and the Legality of Surveillance series organized in collaboration with the RGNUL Student Research Review (RSRR) Journal.</b>
<p>Read the article originally published in <a class="external-link" href="https://rsrr.in/blog/">RGNUL Student Research Review (RSRR) Journal </a></p>
<hr />
<p><strong>Abstract</strong></p>
<p style="text-align: justify;">The Criminal Procedure Code was created in the 1970s when the concept of the right to privacy was highly unacknowledged. Following the <em>Puttuswamy</em> <em>I </em>(2017) judgement of the Supreme Court affirming the right to privacy, these antiquated codes must be re-evaluated. Today, the police can acquire digital devices through summons and gain direct access to a person’s life, despite the summons mechanism having been intended for targeted, narrow enquiries. Once in possession of a device, the police attempt to circumvent the right against self-incrimination by demanding biometric passwords, arguing that the right does not cover biometric information . However, due to the extent of information available on digital devices, courts ought to be cautious and strive to limit the power of the police to compel such disclosures, taking into consideration the <em>right to privacy</em> judgement.</p>
<p><strong>Keywords: </strong>Privacy, Criminal Procedural Law, CrPc, Constitutional Law</p>
<p><strong>Introduction</strong></p>
<p style="text-align: justify;">New challenges confront the Indian criminal investigation framework, particularly in the context of law enforcement agencies (LEAs) acquiring digital devices and their passwords. Criminal procedure codes delimiting police authority and procedures were created before the widespread use of digital devices and are no longer pertinent to the modern age due to the magnitude of information available on a single device. A single device could provide more information to LEAs than a complete search of a person’s home; yet, the acquisition of a digital device is not treated with the severity and caution it deserves. Following the affirmation of the right to privacy in <em>Puttuswamy I </em>(2017), criminal procedure codes must be revamped, taking into consideration that the acquisition of a person’s digital device constitutes a major infringement on their right to privacy.</p>
<p><strong>Acquisition of digital devices by LEAs through summons</strong></p>
<p style="text-align: justify;"><a href="https://www.indiacode.nic.in/bitstream/123456789/15272/1/the_code_of_criminal_procedure%2C_1973.pdf">Section 91 of the Criminal Procedure Code</a> (CrPc) grants powers to a court or police officer in charge of a police station to compel a person to produce any form of document or ‘thing’ necessary and desirable to a criminal investigation. In <a href="https://indiankanoon.org/doc/1395576/"><em>Rama Krishna v State</em></a>,<em> </em>‘necessary’ and ‘desirable’ have been interpreted as any piece of evidence relevant to the investigation or a link in the chain of evidence. <a href="https://deliverypdf.ssrn.com/delivery.php?ID=040088020003014069081068085012117023096031065012091090091115088031084097097081123000002033027047006112028087095120074083084003037094022080065067076089116106115025106025062083007085091067067124080091064096069093075026018100087109120024076084123086119022&EXT=pdf&INDEX=TRUE">Abhinav Sekhri</a>, a criminal law litigator and writer, has argued that the wide wording of this section allows summons to be directed towards the retrieval of specific digital devices.</p>
<p style="text-align: justify;">As summons are target-specific, the section has minimal safeguards. However, several issues arise in the context of summons regarding digital devices. In the current day, access to a user’s personal device can provide comprehensive insight into their life and personality due to the vast amounts of private and personal information stored on it. In <a href="https://www.supremecourt.gov/opinions/13pdf/13-132_8l9c.pdf"><em>Riley v California</em></a>, the Supreme Court of the United States (SCOTUS) observed that due to the nature of the content present on digital devices, summons for them are equivalent to a roving search, i.e., demanding the simultaneous production of all contents of the home, bank records, call records, and lockers. The <em>Riley</em> decision correctly highlights the need for courts to recognise that digital devices ought to be treated distinctly compared to other forms of physical evidence due to the repository of information stored on digital devices.</p>
<p style="text-align: justify;">The burden the state must surpass in order to issue summons is low as the relevancy requirement is easily provable. As noted in <a href="https://www.supremecourt.gov/opinions/13pdf/13-132_8l9c.pdf"><em>Riley</em></a>, police must identify which evidence on a device is relevant. Due to the sheer amount of data on phones, it is very easy for police to claim that there will surely be some form of connection between the content on the device and the case. Due to the wide range of offences available for Indian LEAs to cite, it is easy for them to argue that the content on the device is relevant to any number of possible offences. LEAs rarely face consequences for slamming the accused with a huge roster of charges – even if many of them are baseless – leading to the system being prone to abuse. The Indian Supreme Court in its judgement in <a href="https://indiankanoon.org/doc/1068532/"><em>Canara Bank</em></a> noted that the burden of proof must be higher for LEAs when investigations violate the right to privacy. <a href="https://www.ijlt.in/_files/ugd/066049_03e4a2b28a5e49f6a59b861aa4554ede.pdf">Tarun Krishnakumar</a> notes that the trickle-down effect of <em>Puttuswamy I</em> will lead to new privacy challenges with regards to a summons to appear in court. <em>Puttuswamy I</em>, will provide the bedrock and constitutional framework, within which future challenges to the criminal process will be undertaken. It is important for the court to recognise the transformative potential within the <a href="https://main.sci.gov.in/supremecourt/2012/35071/35071_2012_Judgement_24-Aug-2017.pdf"><em>Puttuswamy</em></a> judgement to help ensure that the right to privacy of citizens is safeguarded. The colonial logic of policing – wherein criminal procedure law was merely a tool to maximise the interest of the state at the cost of the people – must be abandoned. Courts ought to devise a framework under Section 91 to ensure that summons are narrowly framed to target specific information or content within digital devices. Additionally, the digital device must be collected following a judicial authority issuing the summons and not a police authority. Prior judicial warrants will require LEAs to demonstrate their requirement for the digital device; on estimating the impact on privacy, the authority can issue a suitable summons. Currently, the only consideration is if the item will furnish evidence relevant to the investigation; however, judges ought to balance the need for the digital device in the LEA’s investigation with the users’ right to privacy, dignity, and autonomy.</p>
<p style="text-align: justify;"><a href="https://main.sci.gov.in/supremecourt/2012/35071/35071_2012_Judgement_24-Aug-2017.pdf"><em>Puttuswamy I</em></a><em> </em>provides a triple test encompassing legality, necessity, and proportionality to test privacy claims. Legality requires that the measure be prescribed by law, necessity analyses if it is the least restrictive means being adopted by the state, and proportionality checks if the objective pursued by the measure is proportional to the degree of infringement of the right. The relevance standard, as mentioned before, is inadequate as it does not provide enough safeguards against abuse. The police can issue summons based on the slightest of suspicions and thus get access to a digital device, following which they can conduct a roving enquiry of the device to find evidence of any other offence, unrelated to the original cause of suspicion.</p>
<p style="text-align: justify;">Unilateral police summons of digital devices cannot pass the triple test as it is grossly disproportionate and lacks any form of safeguard against the police. The current system has no mechanism for overseeing the LEAs; as long as LEAs themselves are of the view that they require the device, they can acquire it. In <a href="https://www.supremecourt.gov/opinions/13pdf/13-132_8l9c.pdf"><em>Riley</em></a>, SCOTUS has already held that warrantless seizure of digital devices constitutes a violation of the right to privacy. India ought to also adopt a requirement of a prior judicial warrant for the procurement of devices by LEAs. A re-imagined criminal process would have to abide by the triple test in particular proportionality wherein the benefit claimed by the state ought not to be disproportionate to the impact on the fundamental right to privacy; and further, a framework must be proposed to provide safeguards against abuse.</p>
<p><strong>Compelling the production of passwords of devices</strong></p>
<p style="text-align: justify;">In police investigations, gaining possession of a physical device is merely the first step in acquiring the data on the device, as the LEAs still require the passcodes needed to unlock the device. LEAs compelling the production of passcodes to gain access to potentially incriminating data raises obvious questions regarding the right against self-incrimination; however, in the context of digital devices, several privacy issues may crop up as well.</p>
<p style="text-align: justify;">In <a href="https://main.sci.gov.in/judgment/judis/4157.pdf"><em>Kathi Kalu Oghad</em></a>, the SC held that compelling the production of fingerprints of an accused person to compare them with fingerprints discovered by the LEA in the course of their investigation does not violate the right to protection against self-incrimination of the accused. <a href="https://lawschoolpolicyreview.com/2019/10/16/biometrics-as-passwords-the-slippery-scope-of-self-incrimination/">It has been argued</a> that the ratio in the judgement prohibits the compelling of disclosure of passwords and biometrics for unlocking devices because <a href="https://main.sci.gov.in/judgment/judis/4157.pdf"><em>Kathi Kalu Oghad</em></a> only dealt with the production of fingerprints in order to compare the fingerprints with pre-existing evidence, as opposed to unlocking new evidence by utilising the fingerprint. However, the judgement deals with self-incrimination and does not address any privacy issues.</p>
<p style="text-align: justify;">The right against self-incrimination approach alone may not be enough to resolve all concerns. Firstly, there may be varying levels of protection provided to different forms of password protections on digital devices; text- and pattern-based passcodes are inarguably protected under Art. 20(3) of the Constitution. However, the protection of biometrics-based passcodes relies upon the correct interpretation of the <a href="https://main.sci.gov.in/judgment/judis/4157.pdf"><em>Kathi Kalu Oghad</em></a> precedent. Secondly, Art. 20(3) only protects the accused in investigations and not when non-accused digital devices are acquired by LEAs and the passcodes of the devices demanded.</p>
<p style="text-align: justify;">Therefore, considering the aforementioned points, it is pertinent to remember that the right against self-incrimination does not exist in a vacuum separate from privacy. It originates from the concept of decisional autonomy – the right of individuals to make decisions about matters intimate to their life without interference from the state and society. <a href="https://main.sci.gov.in/supremecourt/2012/35071/35071_2012_Judgement_24-Aug-2017.pdf"><em>Puttuswamy I</em></a> observed that decisional autonomy is the bedrock of the right to privacy, as privacy allows an individual to make these intimate decisions away from the glare of society and/or the state. This has heightened importance in this context as interference with such autonomy could lead to the person in question facing criminal prosecution. The SC in <a href="https://main.sci.gov.in/jonew/judis/36303.pdf"><em>Selvi v Karnataka</em></a><em> </em>and <a href="https://main.sci.gov.in/supremecourt/2012/35071/35071_2012_Judgement_24-Aug-2017.pdf"><em>Puttuswamy I</em></a> has repeatedly affirmed that the right against self-incrimination and the right to privacy are linked concepts, with the court observing that the right to remain silent is an integral aspect of decisional autonomy.</p>
<p style="text-align: justify;">In <a href="http://karnatakajudiciary.kar.nic.in:8080/repository/rep_judgmentcase.php"><em>Virendra Khanna</em></a>, the Karnataka High Court (HC) dealt with the privacy and self-incrimination concerns caused by LEAs compelling the disclosure of passwords. The HC brushes aside concerns related to privacy by noting that the right to privacy is not absolute and that an exception to the right to privacy is state interest and protection of law and order (para 5.11), and that unlawful disclosure of material to third parties could be an actionable wrong (para 15). The court’s interpretation of privacy effectively provides a free pass for the police to interfere with the right to privacy under the pretext of a criminal investigation. This conception of privacy is inadequate as the issue of proportionality is avoided, and the court does not attempt to ensure that the interference is proportionate with the outcome.</p>
<p style="text-align: justify;">US courts also see the compelling of production of passcodes as an issue of self-incrimination as well as privacy. In its judgement in <a href="https://casetext.com/case/in-re-application-for-a-search-warrant?__cf_chl_f_tk=lTxiJpZIvKfkIBtGQJtMObSmqhdRUZdjGk5hXeMfprQ-1642253001-0-gaNycGzNCJE"><em>Application for a Search Warrant</em></a>, a US court observed that compelling the disclosure of passcodes existed at an intersection of the right to privacy and self-incrimination; the right against self-incrimination serves to protect the privacy interests of suspects.</p>
<p style="text-align: justify;">Disclosure of passwords to digital devices amounts to an intrusion of the privacy of the suspect as the collective contents on the digital device effectively amount to providing LEAs with a method to observe a person’s mind and identity. Police investigative techniques cannot override fundamental rights and must respect the personal autonomy of suspects – particularly, the choice between silence and speech. Through the production of passwords, LEAs can effectively get a snapshot of a suspect’s mind. This is analogous to the polygraph and narco-analysis test struck down as unconstitutional by the SC in <a href="https://main.sci.gov.in/jonew/judis/36303.pdf"><em>Selvi</em></a> as it violates decisional autonomy.</p>
<p style="text-align: justify;">As <a href="https://theproofofguilt.blogspot.com/2021/03/mobile-phones-and-criminal.html">Sekhri</a> noted, a criminal process that reflects the aspirations of the <em>Puttuswamy </em>judgement would require LEAs to first explain with reasonable detail the material which they wish to find in the digital devices. Secondly, they must provide a timeline for the investigation to ensure that individuals are not subjected to inexhaustible investigations with police roving through their devices indefinitely. Thirdly, such a criminal process must demand, a higher burden to be discharged from the state if the privacy of the individual is infringed upon. These aspirations should form the bedrock of a system of judicial warrants that LEAs ought to be required to comply with if they wish to compel the disclosure of passwords from individuals. The framework proposed above is similar to the <a href="http://karnatakajudiciary.kar.nic.in:8080/repository/rep_judgmentcase.php"><em>Virendra Khanna</em></a><em> </em>guidelines, as they provide a system of checks and balances that ensure that the intrusion on privacy is carried out proportionately; additionally, it would require LEAs to show a real requirement to demand access to the device. The independent eyes of a judicial magistrate provide a mechanism of oversight and a check against abuse of power by LEAs.</p>
<p><strong>Conclusion</strong></p>
<p style="text-align: justify;">The criminal law apparatus is the most coercive power available to the state, and, therefore, privacy rights will become meaningless unless they can withstand it. Several criminal procedures in the country are rooted in colonial statutes, where the rights of the populace being policed were never a consideration; hence, a radical shift is required. However, post-1947 and <em>Puttuswamy</em>, the ignorance and refusal to submit to the rights of the population can no longer be justified and significant reformulation is necessary to guarantee meaningful protections to device owners. There is a need to ensure that the rights of individuals are protected, especially when the motivation for their infringement is the supposed noble intentions of the criminal justice system. Failing to defend the right to privacy in these moments would be an invitation for allowing the power of the state to increase and inevitably become absolute.</p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/rethinking-acquisition-of-digital-devices-by-law-enforcement-agencies'>https://cis-india.org/internet-governance/blog/rethinking-acquisition-of-digital-devices-by-law-enforcement-agencies</a>
</p>
Author: Harikartik Ramesh · Tags: Surveillance, Internet Governance, Privacy · Published: 2022-05-02

Comments to the draft Motor Vehicle Aggregators Scheme, 2021
https://cis-india.org/internet-governance/blog/comments-to-the-draft-motor-vehicle-aggregators-scheme-2021
<b>This submission presents a response by researchers at the Centre for Internet and Society, India (CIS) to the draft Motor Vehicle Aggregators Scheme, 2021 published by the Transport Department, Government of National Capital Territory of Delhi, (hereafter “draft Scheme”).</b>
<p style="text-align: justify; "><span>CIS, established in Bengaluru in 2008 as a non-profit organisation, undertakes interdisciplinary research on internet and digital technologies from public policy andacademic perspectives. Through its diverse initiatives, CIS explores, intervenes in, and advances contemporary discourse and regulatory practices around internet, technology,and society in India, and elsewhere.</span></p>
<p style="text-align: justify; "><span>CIS is grateful for the opportunity to submit its comments to the draft Scheme. Please find below our thematically organised comments.</span></p>
<hr />
<p><a style="text-align: justify; " href="https://cis-india.org/internet-governance/comments-draft-motor-vehicle-aggregators-scheme.pdf" class="internal-link"><strong>Click here</strong></a><span style="text-align: justify; "> to read more.</span></p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/comments-to-the-draft-motor-vehicle-aggregators-scheme-2021'>https://cis-india.org/internet-governance/blog/comments-to-the-draft-motor-vehicle-aggregators-scheme-2021</a>
</p>
Authors: Chiara Furtado, Aayush Rathi and Abhishek Sekharan · Tags: Motor Vehicle, Internet Governance, Privacy · Published: 2022-04-01

Personal Data Protection Bill must examine data collection practices that emerged during pandemic
https://cis-india.org/internet-governance/blog/news-nine-shweta-mohandas-and-anamika-kundu-personal-data-protection-bill-must-examine-data-collection-practices-that-emerged-during-pandemic
<b>The PDP Bill is speculated to be introduced during the winter session of Parliament. The PDP Bill in its current form provides wide-ranging exemptions which allow government agencies to process citizens’ data in order to fulfil their responsibilities. The bill could ensure that employers have some responsibility towards the data they collect from employees.</b>
<p>The article by Shweta Mohandas and Anamika Kundu was <a class="external-link" href="https://www.news9live.com/technology/personal-data-protection-bill-must-examine-data-collection-practices-that-emerged-during-pandemic-137031?infinitescroll=1">originally published by <strong>news nine</strong></a> on November 29, 2021.</p>
<hr />
<p style="text-align: justify; ">The Personal Data Protection Bill (PDP) is speculated to be introduced during the winter session of the parliament soon, and the report of the Joint Parliamentary Committee (JPC) has already been <a class="external-link" href="https://www.thehindu.com/news/national/parliamentary-panel-retains-controversial-exemption-clause-in-personal-data-protection-bill/article37633344.ece">adopted</a> by the committee on Monday. The Report of the JPC comes after almost two years of deliberation and secrecy over how the final version of the Personal Data Protection Bill will be. Since the publication of the <a class="external-link" href="https://prsindia.org/files/bills_acts/bills_parliament/2019/Personal%20Data%20Protection%20Bill,%202019.pdf">2019 version</a> of the PDP Bill, the Covid 19 pandemic and the public safety measures have opened the way for a number of new organisations and reasons to collect personal data that was non-existent in 2019. Hence along with changes that have been suggested by multiple civil society organisations, the dissent notes submitted by the members of the JPC, the new version of the PDP Bill must also look at how data processing has changed over the span of two years.</p>
<h3 style="text-align: justify; ">Concerns with the bill</h3>
<p style="text-align: justify; ">At the outset there are certain parts of the PDP Bill which need to be revised in order to uphold the spirit of privacy and individual autonomy laid out in the Puttaswamy judgement. The two sections that need to be in line with the privacy judgement are the ones that allow for non consensual processing of data by the government, and by employers. The PDP Bill in its current form provides wide-ranging exemptions which allow government agencies to process citizen's data in order to fulfil its <a class="external-link" href="https://www.livemint.com/news/india/big-brother-on-top-in-data-protection-bill-11576164271430.html">responsibilities</a>.</p>
<p style="text-align: justify; ">In the <a class="external-link" href="https://www.meity.gov.in/writereaddata/files/Personal_Data_Protection_Bill,2018.pdf">2018 version</a> of bill, drafted by the Justice Srikrishna Committee exemptions granted to the State with regard to processing of data was subject to a four pronged test which required the processing to be (i) authorised by law; (ii) in accordance with the procedure laid down by the law; (iii) necessary; and (iv) proportionate to the interests being achieved. This four pronged test was in line with the principles laid down by the Supreme Court in the Puttaswamy judgement. The 2019 version of the PDP Bill has diluted this principle by merely retaining the 'necessity principle' and removing the other requirements which is not in consonance with the test laid down by the Supreme Court in Puttaswamy.</p>
<p style="text-align: justify; ">Section 35 was also widely discussed in the panel meetings where members had <a class="external-link" href="https://www.thehindu.com/news/national/parliamentary-panel-retains-controversial-exemption-clause-in-personal-data-protection-bill/article37633344.ece">argued</a> the removal of 'public order' as a ground for exemption. The panel also insisted for '<a class="external-link" href="https://www.thehindu.com/news/national/parliamentary-panel-retains-controversial-exemption-clause-in-personal-data-protection-bill/article37633344.ece">judicial or parliamentary oversight</a>' to grant such exemptions. The final report did not accept these suggestions stating a need to balance <a class="external-link" href="https://www.thehindu.com/news/national/parliamentary-panel-retains-controversial-exemption-clause-in-personal-data-protection-bill/article37633344.ece">national security, liberty and privacy</a> of an individual. There ought to be prior judicial review of the written order exempting the governmental agency from any provisions of the bill. Allowing the government to claim an exemption if it is satisfied to be "necessary or expedient" can be misused.</p>
<p style="text-align: justify; ">Another clause which gives the data principal a wide berth is with respect to employee data Section 13 of the current version of the bill provides the employer with a leeway into processing employee data (other than sensitive personal data) without consent based on two grounds: when consent is not appropriate, or when obtaining consent would involve disproportionate effort on the part of the employer.</p>
<p style="text-align: justify; ">The personal data so collected can only be collected for recruitment, termination, attendance, provision of any service or benefit, and assessing performance. This covers almost all of the activities that require data of the employee. Although the 2019 version of the bill excludes non-consensual collection of sensitive personal data (a provision that was missing in the 2018 version of the bill), there is still a lot of scope to improve this provision and provide employees further right to their data. At the outset the bill does not define employee and employer, which could result in confusion as there is no one definition of these terms across Indian Labour Laws.</p>
<p style="text-align: justify; ">Additionally, the bill distinguishes between employee and consumer, where the consumer of the same company or service has a greater right to their data than an employee. In the sense that the consumer as a data principal has the option to use any other product or service and also has the right to withdraw consent at any time, in the case of an employee the consequence of refusing consent or withdrawing consent would be being terminated from the employment. It is understood that there is a requirement for employee data to be collected, and that consent does not work the same way as it does in the case of a consumer.</p>
<p style="text-align: justify; ">The bill could ensure that employers have some responsibility towards the data they collect from the employees, such as ensuring that they are only used for the purpose for which they were collected, the employee knows how long their data will be retained, and know if the data is being processed by third parties. It is also worth mentioning that the Indian government is India's largest employer spanning a variety of agencies and public enterprises.</p>
<h3 style="text-align: justify; ">Concerns highlighted by JPC Members</h3>
<p style="text-align: justify; ">Going back to the few members of the JPC who have moved dissent notes, specifically with regard to governmental exemptions. Jairam Ramesh filed a <a href="https://www.news9live.com/india/parliament-panel-adopts-report-on-data-protection-amid-dissent-by-opposition-135591">dissent note</a>, to which many other opposition members followed suit. While Jairam Ramesh praised the JPC's functioning, he disagreed with certain aspects of the Report. According to him, the 2019 bill is designed in a manner where the right to privacy is given importance only in cases of private activities. He raised concerns regarding the unbridled powers given to the government to exempt itself from any of the provisions.</p>
<p style="text-align: justify; ">The amendment suggested by him would require parliamentary approval before exemption would take place. He also added that Section 12 of the bill which provided certain scenarios where consent was not needed for processing of personal data should have been made '<a href="https://www.hindustantimes.com/india-news/mps-file-dissent-notes-over-glaring-lacunae-in-report-on-data-protection-bill-101637566365637.html">less sweeping</a>'. Similarly, Gaurav Gogoi's <a href="https://www.hindustantimes.com/india-news/mps-file-dissent-notes-over-glaring-lacunae-in-report-on-data-protection-bill-101637566365637.html">note</a> stated that the exemptions would create a surveillance state and similarly criticised Section 12 and 35 of the bill. He also mentioned that there ought to be parliamentary oversight for the exemptions provided in the bill.</p>
<p style="text-align: justify; ">On the same issue, Congress leader Manish Tiwari noted that the bill creates '<a href="https://timesofindia.indiatimes.com/business/india-business/personal-data-protection-bill-what-is-it-and-why-is-the-opposition-so-unhappy-with-it/articleshow/87869391.cms">parallel universes</a>' - one for the private sector which needs to be compliant and the other for the State which can exempt itself. He has opposed the entire bill stating there exists an "inherent design flaw". He has raised specific objections to 37 clauses and stated that any blanket exemptions to the state goes against the Puttaswamy Judgement.</p>
<p style="text-align: justify; ">In their joint <a href="https://www.news9live.com/india/tmc-congress-mps-submit-dissent-notes-to-joint-panel-on-personal-data-protection-bill-135491">dissent note</a>, Derek O'Brien and Mahua Mitra have said that there is a lack of adequate safeguards to protect the data principals' privacy and the lack of time and opportunity for stakeholder consultations. They have also pointed out that the independence of the DPA will cease to exist with the present provision of allowing the government powers to choose members and the chairman. Amar Patnaik is to object to the lack of inclusion of state level authorities in the bill. Without such bodies, he says, there would be federal override.</p>
<h3 style="text-align: justify; ">Conclusion</h3>
<p style="text-align: justify; ">While a number of issues were highlighted by civil society, the members of the JPC, and the media, the new version of the bill should also need to take into account the shifts that have taken place in view of the pandemic. The new version of the data protection bill should take into consideration the changes and new data collection practices that have emerged during the pandemic, be comprehensive and leave very little provisions to be decided later by the Rules.</p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/news-nine-shweta-mohandas-and-anamika-kundu-personal-data-protection-bill-must-examine-data-collection-practices-that-emerged-during-pandemic'>https://cis-india.org/internet-governance/blog/news-nine-shweta-mohandas-and-anamika-kundu-personal-data-protection-bill-must-examine-data-collection-practices-that-emerged-during-pandemic</a>
</p>
Shweta Mohandas and Anamika Kundu | Internet Governance, Data Protection, Privacy | 2022-03-30T15:15:21Z | Blog Entry

Clause 12 Of The Data Protection Bill And Digital Healthcare: A Case Study
https://cis-india.org/internet-governance/blog/medianama-february-21-2022-amber-sinha-data-protection-bill-digital-healthcare-case-study
<b>In light of the state’s emerging digital healthcare apparatus, how does Clause 12 alter the consent and purpose limitation model?</b>
<p>The blog post was <a class="external-link" href="https://www.medianama.com/2022/02/223-data-protection-bill-digital-healthcare-case-study/">published in Medianama</a> on February 21, 2022. This is the second in a two-part series by Amber Sinha.</p>
<hr />
<p style="text-align: justify; ">In the <a href="https://www.medianama.com/2022/02/223-data-protection-bill-consent-clause-state-function/">previous post</a>, I looked at provisions on non-consensual data processing for state functions under the most recent version of recommendations by the Joint Parliamentary Committee on India’s Data Protection Bill (DPB). The true impact of these provisions can only be appreciated in light of ongoing policy developments and real-life implications.</p>
<p style="text-align: justify; ">To appreciate the significance of the dilutions in Clause 12, let us consider the Indian state’s range of schemes promoting digital healthcare. In July 2018, NITI Aayog, a central government policy think tank in India released a strategy and approach paper (Strategy Paper) on the formulation of the National Health Stack which envisions the creation of a federated application programming interface (API)-enabled health information ecosystem. While the Ministry of Health and Family Welfare has focused on the creation of Electronic Health Records (EHR) Standards for India during the last few years and also identified a contractor for the creation of a centralised health information platform (IHIP), this Strategy Paper advocates a completely different approach, which is described as a Personal Health Records (PHR) framework. In 2021, the National Digital Health Mission (NDHM) was launched under which a citizen shall have the option to obtain a digital health ID. A digital health ID is a unique ID and will carry all health records of a person.</p>
<h2 style="text-align: justify; ">A Stack Model for Big Data Ecosystem in Healthcare</h2>
<p style="text-align: justify; ">A stack model as envisaged in the Strategy Paper, consists of several layers of open APIs connected to each other, often tied together by a unique health identifier. The open nature of APIs has the advantage that it allows public and private actors to build solutions on top of it, which are interoperable with all parts of the stack. It is however worth considering both the ‘openness’ and the role that the state plays in it.</p>
<p style="text-align: justify; ">Even though the APIs are themselves open, they are a part of a pre-decided technological paradigm, built by private actors and blessed by the state. Even though innovators can build on it, the options available to them are limited by the information architecture created by the stack model. When such a technological paradigm is created for healthcare reform and health data, the stack model poses additional challenges. By tying the stack model to the unique identity, without appropriate processes in place for access control, siloed information, and encrypted communication, the stack model poses tremendous privacy and security concerns. The broad language under Clause 12 of the DPB needs to be looked at in this context.</p>
<p>Clause 12 allows non-consensual processing of personal data where it is necessary “for the performance of any function of the state authorised by law” in order to provide a service or benefit from the State. In the previous post, I had highlighted the import of the use of only ‘necessity’ to the exclusion of ‘proportionality’. Now, we need to consider its significance in light of the emerging digital healthcare apparatus being created by the state.</p>
<p style="text-align: justify; ">The National Health Stack and National Digital Health Mission together envision an intricate system of data collection and exchange which in a regulatory vacuum would ensure unfettered access to sensitive healthcare data for both the state and private actors registered with the platforms. The Stack framework relies on repositories where data may be accessed from multiple nodes within the system. Importantly, the Strategy Paper also envisions health data fiduciaries to facilitate consent-driven interaction between entities that generate the health data and entities that want to consume the health records for delivering services to the individual. The cast of characters involve the National Health Authority, health care providers and insurers who access the National Health Electronic Registries, unified data from different programmes such as National Health Resource Repository (NHRR), NIN database, NIC and the Registry of Hospitals in Network of Insurance (ROHINI), private actors such as Swasth, iSpirt who assist the Mission as volunteers. The currency that government and private actors are interested in is data.</p>
<p style="text-align: justify; ">The promised benefits of healthcare data in an anonymised and aggregate form range from Disease Surveillance to Pharmacovigilance as well as Health Schemes Management Systems and Nutrition Management, benefits which have only been more acutely emphasised during the pandemic. However, the pandemic has also normalised the sharing of sensitive healthcare data with a variety of actors, without much thinking on much-needed data minimisation practises.</p>
<p style="text-align: justify; ">The potential misuses of healthcare data include greater state surveillance and control, predatory and discriminatory practices by private actors which rely on Clause 12 to do away with even the pretense of informed consent so long as the processing of data is deemed necessary by the state and its private sector partners to provide any service or benefit.</p>
<p style="text-align: justify; ">Subclause (e) in Clause 12, which was added in the last version of the Bill drafted by MeitY and has been retained by the JPC, allows processing wherever it is necessary for ‘any measures’ to provide medical treatment or health services during an epidemic, outbreak or threat to public health. Yet again, the overly-broad language used here is designed to ensure that any annoyances of informed consent can be easily brushed aside wherever the state intends to take any measures under any scheme related to public health.</p>
<p style="text-align: justify; ">Effectively, how does the framework under Clause 12 alter the consent and purpose limitation model? Data protection laws introduce an element of control by tying purpose limitation to consent. Individuals provide consent to specified purposes, and data processors are required to respect that choice. Where there is no consent, the purposes of data processing are sought to be limited by the necessity principle in Clause 12. The state (or authorised parties) must be able to demonstrate necessity to the exercise of state function, and data must only be processed for those purposes which flow out of this necessity. However, unlike the consent model, this provides an opportunity to keep reinventing purposes for different state functions.</p>
<p style="text-align: justify; ">In the absence of a data protection law, data collected by one agency is shared indiscriminately with other agencies and used for multiple purposes beyond the purpose for which it was collected. The consent and purpose limitation model would have addressed this issue. But, by having a low threshold for non-consensual processing under Clause 12, this form of data processing is effectively being legitimised.</p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/medianama-february-21-2022-amber-sinha-data-protection-bill-digital-healthcare-case-study'>https://cis-india.org/internet-governance/blog/medianama-february-21-2022-amber-sinha-data-protection-bill-digital-healthcare-case-study</a>
</p>
amber | Data Governance, Internet Governance, Data Protection, Privacy | 2022-03-01T15:07:44Z | Blog Entry

How Function Of State May Limit Informed Consent: Examining Clause 12 Of The Data Protection Bill
https://cis-india.org/internet-governance/blog/medianama-february-18-2021-amber-sinha-data-protection-bill-consent-clause-state-function
<b>The collective implication of leaving out ‘proportionality’ from Clause 12 is to provide very wide discretionary powers to the state.</b>
<p>The blog post was <a class="external-link" href="https://www.medianama.com/2022/02/223-data-protection-bill-consent-clause-state-function/">published in Medianama</a> on February 18, 2022. This is the first of a two-part series by Amber Sinha.</p>
<hr />
<p style="text-align: justify; ">In 2018, hours after the Committee of Experts led by Justice Srikrishna Committee released their report and draft bill, I wrote <a href="https://www.livemint.com/Opinion/zY8NPWoWWZw8AfI5JQhjmL/Draft-privacy-bill-and-its-loopholes.html">an opinion piece</a> providing my quick take on what was good and bad about the bill. A section of my analysis focused on Clause 12 (then Clause 13) which provides for non-consensual processing of personal data for state functions. I called this provision a ‘carte-blanche’ which effectively allowed the state to process a citizen’s data for practically all interactions between them without having to deal with the inconvenience of seeking consent. My former colleague, Pranesh Prakash <a href="https://twitter.com/pranesh/status/1023116679440621568">pointed out</a> that this was not a correct interpretation of the provision as I had missed the significance of the word ‘necessary’ which was inserted to act as a check on the powers of the state. He also pointed out, correctly, that in its construction, this provision is equivalent to the position in European General Data Protection Regulation (Article 6 (i) (e)), and is perhaps even more restrictive.</p>
<p style="text-align: justify; ">While I agree with what Pranesh says above (his claims are largely factual, and there can be no basis for disagreement), my view of Clause 12 has not changed. While Clause 35 has been a focus of considerable discourse and analysis, for good reason, I continue to believe that Clause 12 remains among the most dangerous provisions of this bill, and I will try to unpack here, why.</p>
<p style="text-align: justify; ">The Data Protection Bill 2021 has a chapter on the grounds for processing personal data, and one of those grounds is consent by the individual. The rest of the grounds deal with various situations in which personal data can be processed without seeking consent from the individual. Clause 12 lays down one of the grounds. It allows the state to process data without the consent of the individual in the following cases —</p>
<p>a) where it is necessary to respond to a medical emergency<br />b) where it is necessary for the state to provide a service or benefit to the individual<br />c) where it is necessary for the state to issue any certification, licence or permit<br />d) where it is necessary under any central or state legislation, or to comply with a judicial order<br />e) where it is necessary for any measures during an epidemic, outbreak or other threat to public health<br />f) where it is necessary for safety procedures during a disaster or breakdown of public order</p>
<p>In order to carry out (b) and (c), there is also the added requirement that the state function must be authorised by law.</p>
<h2>Twin restrictions in Clause 12</h2>
<p style="text-align: justify; ">The use of the words ‘necessary’ and ‘authorised by law’ is intended to pose checks on the powers of the state. The first restriction seeks to limit actions to only those cases where the processing of personal data would be necessary for the exercise of the state function. This should mean that if the state function can be exercised without non-consensual processing of personal data, then it must be done so. Therefore, while acting under this provision, the state should only process my data if it needs to do so, to provide me with the service or benefit. The second restriction means that this would apply to only those state functions which are authorised by law, meaning only those functions which are supported by validly enacted legislation.</p>
<p style="text-align: justify; ">What we need to keep in mind regarding Clause 12 is that the requirement of ‘authorised by law’ does not mean that legislation must provide for that specific kind of data processing. It simply means that the larger state function must have legal backing. The danger is how these provisions may be used with broad mandates. If the activity in question is non-consensual collection and processing of, say, demographic data of citizens to create state resident hubs which will assist in the provision of services such as healthcare, housing, and other welfare functions; all that may be required is that the welfare functions are authorised by law.</p>
<h2 style="text-align: justify; ">Scope of privacy under Puttaswamy</h2>
<p style="text-align: justify; ">It would be worthwhile, at this point, to delve into the nature of restrictions that the landmark Puttaswamy judgement discussed that the state can impose on privacy. The judgement clearly identifies the principles of informed consent and purpose limitation as central to informational privacy. As discussed repeatedly during the course of the hearings and in the judgement, privacy, like any other fundamental right, is not absolute. However, restrictions on the right must be reasonable in nature. In the case of Clause 12, the restrictions on privacy in the form of denial of informed consent need to be tested against a constitutional standard. In Puttaswamy, the bench was not required to provide a legal test to determine the extent and scope of the right to privacy, but they do provide sufficient guidance for us to contemplate how the limits and scope of the constitutional right to privacy could be determined in future cases.</p>
<p style="text-align: justify; ">The Puttaswamy judgement clearly states that “the right to privacy is protected as an intrinsic part of the right to life and personal liberty under Article 21 and as a part of the freedoms guaranteed by Part III of the Constitution.” By locating the right not just in Article 21 but also in the entirety of Part III, the bench clearly requires that “the drill of various Articles to which the right relates must be scrupulously followed.” This means that where transgressions on privacy relate to different provisions in Part III, the different tests under those provisions will apply along with those in Article 21. For instance, where the restrictions relate to personal freedoms, the tests under both Article 19 (right to freedoms) and Article 21 (right to life and liberty) will apply.</p>
<p style="text-align: justify; ">In the case of Clause 12, the three tests laid down by Justice Chandrachud are most operative —<br />a) the existence of a “law”<br />b) a “legitimate State interest”<br />c) the requirement of “proportionality”.</p>
<p style="text-align: justify; ">The first test is already reflected in the use of the phrase ‘authorised by law’ in Clause 12. The test under Article 21 would imply that the function of the state should not merely be authorised by law, but that the law, in both its substance and procedure, must be ‘fair, just and reasonable.’ The next test is that of ‘legitimate state interest’. In its report, the Joint Parliamentary Committee places emphasis on Justice Chandrachud’s use of “allocation of resources for human development” in an illustrative list of legitimate state interests. The report claims that the ground, functions of the state, thus satisfies the legitimate state interest. We do not dispute this claim.</p>
<h2 style="text-align: justify; ">Proportionality and Clause 12</h2>
<p style="text-align: justify; ">It is the final test of ‘proportionality’ articulated by the Puttaswamy judgement, which is most operative in this context. Unlike Clauses 42 and 43 which include the twin tests of necessity and proportionality, the committee has chosen to only employ one ground in Clause 12. Proportionality is a commonly employed ground in European jurisprudence and common law countries such as Canada and South Africa, and it is also an integral part of Indian jurisprudence. As commonly understood, the proportionality test consists of three parts —</p>
<p>a) the limiting measures must be carefully designed, or rationally connected, to the objective<br />b) they must impair the right as little as possible<br />c) the effects of the limiting measures must not be so severe on individual or group rights that the legitimate state interest, albeit important, is outweighed by the abridgement of rights.</p>
<p style="text-align: justify; ">The first test is similar to the test of proximity under Article 19. The test of ‘necessity’ in Clause 12 must be viewed in this context. It must be remembered that the test of necessity is not limited to only situations where it may not be possible to obtain consent while providing benefits. My reservations with the sufficiency of this standard stem from observations made in the report, as well as the relatively small amount of jurisprudence on this term in Indian law.</p>
<p style="text-align: justify; ">The Srikrishna Report interestingly mentions three kinds of scenarios where consent should not be required — where it is not appropriate, necessary, or relevant for processing. The report goes on to give an example of inappropriateness. In cases where data is being gathered to provide welfare services, there is an imbalance in power between the citizen and the state. Having made that observation, the committee inexplicably arrives at a conclusion that the response to this problem is to further erode the power available to citizens by removing the need for consent altogether under Clause 12. There is limited jurisprudence on the standard of ‘necessity’ under Indian law. The Supreme Court has articulated this test as ‘having reasonable relation to the object the legislation has in view.’ If we look elsewhere for guidance on how to read ‘necessity’, the ECHR in Handyside v United Kingdom held it to be neither “synonymous with indispensable” nor does it have the “flexibility of such expressions as admissible, ordinary, useful, reasonable or desirable.” In short, there must be a pressing social need to satisfy this ground.</p>
<p style="text-align: justify; ">However, the other two tests of proportionality do not find a mention in Clause 12 at all. There is no requirement of ‘narrow tailoring’, that the scope of non-consensual processing must impair the right as little as possible. It is doubly unfortunate that this test does not find a place, as unlike necessity, ‘narrow tailoring’ is a test well understood in Indian law. This means that while there is a requirement to show that processing personal data was necessary to provide a service or benefit, there is no requirement to process data in a way that there is minimal non-consensual processing. The fear is that as long as there is a reasonable relation between processing data and the object of the function of state, state authorities and other bodies authorised by it, do not need to bother with obtaining consent.</p>
<p style="text-align: justify; ">Similarly, the third test of proportionality is also not represented in this provision. It provides a test between the abridgement of individual rights and legitimate state interest in question, and it requires that the first must not outweigh the second. The absence of the proportionality test leaves Clause 12 devoid of any such consideration. Therefore, as long as the test of necessity is met under this law, it need not evaluate the denial of consent against the service or benefit that is being provided.</p>
<p style="text-align: justify; ">The collective implication of leaving out ‘proportionality’ from Clause 12 is to provide very wide discretionary powers to the state, by setting the threshold to circumvent informed consent extremely low. In the next post, I will demonstrate the ease with which Clause 12 can allow indiscriminate data sharing by focusing on the Indian government’s digital healthcare schemes.</p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/medianama-february-18-2021-amber-sinha-data-protection-bill-consent-clause-state-function'>https://cis-india.org/internet-governance/blog/medianama-february-18-2021-amber-sinha-data-protection-bill-consent-clause-state-function</a>
</p>
amber | Data Governance, Internet Governance, Data Protection, Privacy | 2022-03-01T14:56:49Z | Blog Entry

CIS Comments and Recommendations on the Data Protection Bill, 2021
https://cis-india.org/internet-governance/blog/pallavi-bedi-and-shweta-mohandas-cis-comments-on-data-protection-bill
<b>This document is a revised version of the comments we provided on the 2019 Bill on 20 February 2020, with updates based on the amendments in the 2021 Bill.</b>
<p style="text-align: justify; ">After nearly two years of deliberations and a few changes in its composition, the Joint Parliamentary Committee (JPC), on 17 December 2021, submitted its report on the Personal Data Protection Bill, 2019 (2019 Bill). The report also contains a new version of the law titled the Data Protection Bill, 2021 (2021 Bill). Although there were no major revisions from the previous version other than the inclusion of all data under the ambit of the bill, some provisions were amended.</p>
<p style="text-align: justify; ">This document is a revised version of the<a href="https://cis-india.org/accessibility/blog/cis-comments-pdp-bill-2019"> comments</a> we provided on the 2019 Bill on 20 February 2020, with updates based on the amendments in the 2021 Bill. Through this document we aim to shed light on the issues that we highlighted in our previous comments that have not yet been addressed, along with additional comments on sections that have become more relevant since the pandemic began. In several instances our previous comments have either not been addressed or only partially been addressed; in such instances, we reiterate them.</p>
<p style="text-align: justify; ">These general comments should be read in conjunction with our previous recommendations for the reader to get a comprehensive overview of what has changed from the previous version and what has remained the same. This document can also be read while referencing the new Data Protection Bill 2021 and the JPC’s report to understand some of the significant provisions of the bill.</p>
<hr />
<p style="text-align: justify; "><strong><a href="https://cis-india.org/internet-governance/general-comments-data-protection-bill.pdf" class="internal-link">Read on to access the comments</a> | </strong><span>Review and editing by Arindrajit Basu. Copy editing: The Clean Copy; Shared under Creative Commons Attribution 4.0 International license</span></p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/pallavi-bedi-and-shweta-mohandas-cis-comments-on-data-protection-bill'>https://cis-india.org/internet-governance/blog/pallavi-bedi-and-shweta-mohandas-cis-comments-on-data-protection-bill</a>
</p>
Pallavi Bedi and Shweta Mohandas | Internet Governance, Data Protection, Privacy | 2022-02-14T16:07:44Z | Blog Entry

Facial Recognition Technology in India
https://cis-india.org/internet-governance/facial-recognition-technology-in-india.pdf
<p>
For more details visit <a href='https://cis-india.org/internet-governance/facial-recognition-technology-in-india.pdf'>https://cis-india.org/internet-governance/facial-recognition-technology-in-india.pdf</a>
</p>
Elonnai Hickok, Pallavi Bedi, Aman Nair and Amber Sinha | Privacy, Internet Governance, Facial Recognition | 2021-09-02T16:17:44Z | File

Facial Recognition Technology in India
https://cis-india.org/internet-governance/blog/hrbdt-and-cis-august-31-2021-facial-recognition-technology-in-india
<b>The Human Rights, Big Data and Technology Project, University of Essex, UK and the Centre for Internet & Society (CIS) have jointly published a research paper on facial recognition technology. Authors, Elonnai Hickok, Pallavi Bedi, Aman Nair and Amber Sinha, examine technological tools such as CCTV and FRT which are increasingly being deployed by the government.</b>
<h3>Executive Summary</h3>
<p style="text-align: justify; ">Over the past two decades there has been a sustained effort at digitising India’s governance structure in order to foster development and innovation. The field of law enforcement and safety has seen significant change in that direction, with technological tools such as Closed Circuit Television (CCTV) and Facial Recognition Technology (FRT) increasingly being deployed by the government.</p>
<p style="text-align: justify; ">Yet for all its increased use, there is still a lack of a coherent legal and regulatory framework governing FRT in India. Towards informing such a framework, this paper seeks to document present uses of FRT in India, specifically by law enforcement agencies and central and state governments, understand the applicability of existing legal frameworks to the use of FRT, and define key areas that need to be addressed when using the technology in India. We also briefly look at how the coverage of FRT has increased beyond law enforcement; it now covers educational institutions, employment purposes, and it is now being used for providing Covid-19 vaccines.</p>
<p style="text-align: justify; ">We begin by examining use cases of FRT systems by various divisions of central and state governments. In doing so, it becomes apparent that there is a lack of uniform standards or guidelines at either the state or central level - leading to different FRT systems having differing standards of applicability and scope of use. And while the use of such systems seems to be growing at a rapid rate, questions around their legality persist.</p>
<p style="text-align: justify; ">It is unclear whether the use of FRT is compliant with the fundamental right to privacy as affirmed by the Supreme Court in 2017 in <i>Puttaswamy</i>. While the right to privacy is not an absolute right, for the state to curtail this right, the restrictions will have to comply with a three-fold requirement— first, being the need for explicit legislative mandate in instances where the government looks to curtail the right. However, the FRT systems we have analysed do not have such a mandate and are often the result of administrative or executive decisions with no legislative blessing or judicial oversight.</p>
<p style="text-align: justify; ">We further locate the use of FRT technology within the country’s wider legislative, judicial and constitutional frameworks governing surveillance. We also briefly articulate comparative perspectives on the use of FRT in other jurisdictions. We further analyse the impact of the proposed Personal Data Protection Bill on the deployment of FRT. Finally, we propose a set of recommendations to develop a path forward for the technology’s use which include the need for a comprehensive legal and regulatory framework that governs the use of FRT. Such a framework must take into consideration the necessity of use, proportionality, consent, security, retention, redressal mechanisms, purpose limitation, and other such principles. Since the use of FRT in India is also at a nascent stage, it is imperative that there is greater public research and dialogue into its development and use to ensure that any harms that may arise in the field are mitigated.</p>
<hr />
<p>Click to download the entire <a href="https://cis-india.org/internet-governance/facial-recognition-technology-in-india.pdf" class="external-link">research paper here</a></p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/hrbdt-and-cis-august-31-2021-facial-recognition-technology-in-india'>https://cis-india.org/internet-governance/blog/hrbdt-and-cis-august-31-2021-facial-recognition-technology-in-india</a>
</p>
Elonnai Hickok, Pallavi Bedi, Aman Nair and Amber Sinha | Privacy, Internet Governance, Facial Recognition | 2021-09-02T16:21:24Z | Blog Entry

Techno-solutionist Responses to COVID-19
https://cis-india.org/internet-governance/blog/economic-and-political-weekly-july-17-2021-amber-sinha-pallavi-bedi-aman-nair-techno-solutionist-responses-to-covid-19
<b>The Indian state has increasingly adopted a digital approach to service delivery over the past decade, with vaccination being the latest area to be subsumed by this strategy. In the context of the need for universal vaccination, the limitations of the government’s vaccination platform Co-WIN need to be analysed.</b>
<p><span style="text-align: justify; ">The article by Amber Sinha, Pallavi Bedi, and Aman Nair was published in the </span><a class="external-link" href="https://www.epw.in/journal/2021/29/commentary/techno-solutionist-responses-covid-19.html" style="text-align: justify; ">Economic & Political Weekly</a><span style="text-align: justify; ">, Vol. 56, Issue No. 29, 17 Jul, 2021.</span></p>
<hr />
<p style="text-align: justify; ">Over the last two decades, slowly but steadily, the governance agenda of the Indian state has moved to the digital realm. In 2006, the National e-Governance Plan (NeGP) was approved by the Indian state wherein a massive infrastructure was developed to reach the remotest corners and facilitate easy access of government services efficiently at affordable costs. The first set of NeGP projects focused on digitalising governance schemes that dealt with taxation, regulation of corporate entities, issuance of passports, and pensions. Over a period of time, they have come to include most interactions between the state and citizens from healthcare to education, transportation to employment, and policing to housing. Upon the launch of the Digital India Mission by the union government, the NeGP was subsumed under the e-Gov and e-Kranti components of the project. The original press release by the central government reporting the approval by the cabinet of ministers of the Digital India programme speaks of “cradle to grave” digital identity as one of its vision areas. This identity was always intended to be “unique, lifelong, online and authenticable.”</p>
<p style="text-align: justify; ">Since the inception of the Digital India campaign by the current government, there have been various concerns raised about the privacy issues posed by this project. The initiative includes over 50 “mission mode projects” in various stages of implementation. All of these projects entail collection of vast quantities of personally identifiable information of the citizens. However, most of these initiatives do not have clearly laid down privacy policies. There is also a lack of properly articulated access control mechanism and doubts exist over important issues such as data ownership owing to most projects involving public–private partnership which involves a private organisation collecting, processing and retaining large amounts of data. Most importantly, they have continued to exist and prosper in a state of regulatory vacuum with no data protection legislation to govern them. Further, the state of digital divide and digital literacy in India should automatically underscore the need to not rely solely on digital solutions.</p>
<hr />
<p><span>Click to </span><a class="external-link" href="https://www.epw.in/journal/2021/29/commentary/techno-solutionist-responses-covid-19.html">read the full article here</a></p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/economic-and-political-weekly-july-17-2021-amber-sinha-pallavi-bedi-aman-nair-techno-solutionist-responses-to-covid-19'>https://cis-india.org/internet-governance/blog/economic-and-political-weekly-july-17-2021-amber-sinha-pallavi-bedi-aman-nair-techno-solutionist-responses-to-covid-19</a>
</p>
Amber Sinha, Pallavi Bedi and Aman Nair | Digital Governance, Privacy, Digitalisation, Co-WIN, Covid19, Digital Technologies, Internet Governance, Technology, E-Governance | 2021-08-10T15:34:06Z | Blog Entry

Do We Really Need an App for That? Examining the Utility and Privacy Implications of India’s Digital Vaccine Certificates
https://cis-india.org/internet-governance/blog/do-we-really-need-an-app-for-that-examining-the-utility-and-privacy-implications-of-india2019s-digital-vaccine-certificates
<b>We examine the purported benefits of digital vaccine certificates over regular paper-based ones and analyse the privacy implications of their use.</b>
<p><em>This blogpost was edited by Gurshabad Grover, Yesha Tshering Paul, and Amber Sinha.<br />It was originally published on <a href="https://digitalid.design/vaccine-certificates.html">Digital Identities: Design and Uses</a> and is cross-posted here.<br /></em></p>
<p>In an experiment to streamline its COVID-19 immunisation drive, India has adopted a centralised vaccine administration system called CoWIN (or COVID Vaccine Intelligence Network). In addition to facilitating registration for both online and walk-in vaccine appointments, the system also allows for the <a href="https://verify.cowin.gov.in/" target="_blank">digital verification</a> of vaccine certificates, which it issues to people who have received a dose. This development aligns with a global trend, as many countries have adopted or are in the process of adopting “vaccine passports” to facilitate safe movement of people while resuming commercial activity.
<br /><br />Some places, such as the <a href="https://www.schengenvisainfo.com/news/all-your-questions-on-eus-covid-19-vaccine-certificate-answered/" target="_blank">EU</a>, have constrained the scope of use of their vaccine certificates to international travel. The Indian government, however, has so far <a href="https://www.livemint.com/opinion/columns/vaccination-certificates-need-a-framework-to-govern-their-use-11618160385602.html" target="_blank">skirted</a> important questions around where and when this technology should be used. By allowing <a href="https://verify.cowin.gov.in/" target="_blank">anyone</a> to use the online CoWIN portal to scan and verify certificates, and even providing a way for the private-sector to incorporate this functionality into their applications, the government has opened up the possibility of these digital certificates being used, and even mandated, for domestic everyday use such as going to a grocery shop, a crowded venue, or a workplace.
<br /><br />In this blog post, we examine the purported benefits of digital vaccine certificates over regular paper-based ones, analyse the privacy implications of their use, and present recommendations to make them more privacy respecting. We hope that such an analysis can help inform policy on appropriate use of this technology and improve its privacy properties in cases where its use is warranted.
<br /><br />We also note that while this post only examines the merits of a technological solution put out by the government, it is more important to <a href="https://www.accessnow.org/cms/assets/uploads/2021/04/Covid-Vaccine-Passports-Threaten-Human-Rights.pdf" target="_blank">consider</a> the effects that placing restrictions on the movement of unvaccinated people has on their civil liberties in the face of a vaccine rollout that is inequitable along many lines, including <a href="https://thewire.in/gender/women-falling-behind-in-indias-covid-19-vaccination-drive" target="_blank">gender</a>, <a href="https://www.thehindu.com/sci-tech/science/will-25-covid-19-vaccines-for-private-hospitals-aggravate-inequity/article34799098.ece" target="_blank">caste-class</a>, and <a href="https://scroll.in/article/994871/tech-savvy-indians-drive-to-villages-for-covid-19-vaccinations-those-without-smartphones-lose-out" target="_blank">access to technology</a>.</p>
<h4>How do digital vaccine certificates work?</h4>
<p>Every vaccine recipient in the country is required to be registered on the CoWIN platform using one of <a href="https://www.cowin.gov.in/faq" target="_blank">seven</a> existing identity documents. [1] <a name="ref1"></a> Once a vaccine is administered, CoWIN generates a vaccine certificate which the recipient can access on the CoWIN website. The certificate is a single page document that contains the recipient’s personal information — their name, age, gender, identity document details, unique health ID, a reference ID — and some details about the vaccine given.<a name="ref2"></a> [2] It also includes a “secure QR code” and a link to CoWIN’s verification <a href="https://verify.cowin.gov.in/" target="_blank">portal</a>.
<br /><br />The verification portal allows for the verification of a certificate by scanning the attached QR code. Upon completion, the portal displays a success message along with some of the information printed on the certificate.
<br /><br />Verification is done using a cryptographic mechanism known as <a href="https://en.wikipedia.org/wiki/Digital_signature" target="_blank">digital signatures</a>, which are encoded into the QR code attached to a vaccine certificate. This mechanism allows “offline verification”, which means that the CoWIN verification portal or any private sector app attempting to verify a certificate does not need to contact the CoWIN servers to establish its authenticity. It instead uses a “public key” issued by CoWIN beforehand to verify the digital signature attached to the certificate.
<br /><br />The benefit of this convoluted design is that it protects user privacy. Performing verification offline, without contacting the CoWIN servers, precludes CoWIN from gleaning sensitive metadata about how the vaccine certificate is used. This means that CoWIN does not learn where and when an individual uses their vaccine certificate, or who is verifying it, closing off a potential avenue for mass surveillance. [3] However, given how certificate revocation checks are being implemented (detailed in the privacy implications section below), CoWIN ends up learning this information anyway.</p>
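<p>To make the offline verification flow described above more concrete, the following is a minimal sketch of how a verifier might check a signed certificate payload against a pre-distributed issuer public key, using Python's <code>cryptography</code> library. The use of RSA with SHA-256, and the assumption that the QR code yields a raw payload and a detached signature, are illustrative choices of ours; the actual CoWIN QR format and signature scheme are not reproduced here.</p>
<pre>
# Illustrative sketch only: the real CoWIN certificate format and signature
# scheme are not reproduced here. Assumes an RSA/SHA-256 signature over the
# certificate payload and a PEM public key distributed to verifiers in advance.
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding
from cryptography.exceptions import InvalidSignature

def verify_offline(payload: bytes, signature: bytes, issuer_public_key_pem: bytes) -> bool:
    """Return True if the signature over the payload checks out against the issuer key.

    No network call is made: this is the 'offline verification' property
    described above. The issuer never learns where or when the certificate
    was presented.
    """
    public_key = serialization.load_pem_public_key(issuer_public_key_pem)
    try:
        public_key.verify(signature, payload, padding.PKCS1v15(), hashes.SHA256())
        return True
    except InvalidSignature:
        return False
</pre>
<p>The key point is that everything the verifier needs (the payload, the signature, and the issuer's public key) is already on the device, which is why a well-implemented verifier has no technical reason to contact the issuer at scan time.</p>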
<h4>Where is digital verification useful?</h4>
<p>The primary argument for the adoption of digital verification of vaccine certificates over visual examination of regular paper-based ones is security. In the face of vaccine hesitancy, there are concerns that people may forge vaccine certificates to get around any restrictions that may be put in place on the movement of unvaccinated people. The use of digital signatures serves to allay these fears.
<br /><br />In its current form, however, digital verification of vaccine certificates is no more secure than visually inspecting paper-based ones. While the “secure QR code” attached to digital certificates can be used to verify the authenticity of the certificate itself, the CoWIN verification portal does not provide any mechanism nor does it instruct verifiers to authenticate the identity of the person presenting the certificate. This means that unless an accompanying identity document is also checked, an individual can simply present someone else’s certificate.
<br /><br />There are no simple solutions to this limitation; adding a requirement to inspect identity documents in addition to digital verification of the vaccine certificate would not be a strong enough security measure to prevent the use of duplicate vaccine certificates. People who are motivated enough to forge a vaccine certificate can also duplicate one of the seven ID documents which can be used to register on CoWIN, some of which are simple paper-based documents. [4] Requiring even stronger identity checks, such as the use of Aadhaar-based biometrics, would make digital verification of vaccine certificates more secure. However, this would be a wildly disproportionate incursion on user privacy — allowing for the mass collection of metadata like when and where a certificate is used — something that digital vaccine certificates were explicitly designed to prevent. Additionally, in Russia, people were <a href="https://www.washingtonpost.com/world/europe/moscow-fake-vaccine-coronavirus/2021/06/26/0881e1e4-cf98-11eb-a224-bd59bd22197c_story.html" target="_blank">found</a> issuing fake certificates by discarding real vaccine doses instead of administering them. No technological solution can prevent such fraud.
<br /><br />As such, the utility of digital certificates is limited to uses such as international travel, where border control agencies already have strong identity checks in place for travellers. Any everyday usage of the digital verification functionality on vaccine certificates would not present any benefit over visually examining a piece of paper or a screen.</p>
<h4>Privacy implications of digital certificates</h4>
<p>In addition to providing little security utility over manual inspection of certificates, digital certificates also present privacy issues; these are listed below, along with recommendations to mitigate them:
<br /><br /><em>(i) The verification portal leaks sensitive metadata to CoWIN’s servers:</em> An analysis of network requests made by the CoWIN verification portal reveals that it conducts a ‘revocation check’ each time a certificate is verified. This check was also found in the source <a href="https://github.com/egovernments/DIVOC/blob/e667697b47a50a552b8d0a8c89a950180217b945/interfaces/vaccination-api.yaml#L385" target="_blank">code</a>, which is made openly available. <a name="ref5"></a>[5]</p>
<p>Revocation checks are an important security consideration while using digital signatures. They allow the issuing authority (CoWIN, in this case) to revoke a certificate in case the account associated with it is lost or stolen, or if a certificate requires correction. However, the way they have been implemented here presents a significant privacy issue. Sending certificate details to the server on every verification attempt allows it to learn about where and when an individual is using their vaccine certificate.
<br /><br />We note that the revocation check performed by the CoWIN portal does not necessarily mean that it is storing this information. Nevertheless, sending certificate information to the server directly contradicts claims of an “offline verification” process, which is the basis of the design of these digital certificates.
<br /><br /><strong>Recommendations:</strong> Implementing privacy-respecting revocation checks, such as Certificate Revocation Lists [6] or Range Queries [7], would mitigate this issue. However, these solutions are either complex or present bandwidth and storage tradeoffs for the verifier; a brief illustrative sketch of the revocation-list approach appears at the end of this section, and of the range-query approach alongside reference [7] below.
<br /><br /><em>(ii) Oversharing of personally identifiable information:</em> CoWIN’s vaccine certificates include more personally identifiable information (name, age, gender, identity document details and unique health ID) than is required for the purpose of verifying the certificate. An examination of the vaccine certificates available to us revealed that while the Aadhaar number is appropriately masked, other personal identifiers such as the passport number and unique health ID were not masked. Additionally, the inclusion of demographic details such as age and gender provides little security benefit (it only marginally limits the pool of duplicate certificates that could be used) and is not required in light of the security analysis above.
<br /><br /><strong>Recommendation:</strong> Personal identifiers (such as passport number and unique health ID) should be appropriately masked and demographic details (age, gender) can be removed.
<br /><br />The minimal set of data required for identity-linked usage for digital verification, as described above, is a full name and masked ID document details. All other personally identifying information can be removed. In case of paper-based certificates, which is suggested for domestic usage, only the details about vaccine validity would suffice and no personal information is required.
<br /><br /><em>(iii) Making information available digitally increases the likelihood of collection:</em> All of the personal information printed on the certificate is also encoded into the QR code. This is <a href="https://www.bbc.com/news/uk-scotland-57208607" target="_blank">necessary</a> because the digital signature verification process also verifies the integrity of this information (i.e. it wasn't modified). A side effect of this is that the personal information is made readily available in digital form to verifiers when it is scanned, making it easy for them to store. Such collection is especially likely with private sector apps, whose operators may be interested in collecting demographic information and personal identifiers to track customer behaviour.
<br /><br /><strong>Recommendation:</strong> Removing extraneous information from the certificate, as suggested above, mitigates this risk as well.</p>
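<p>As a rough illustration of the recommendation under (i) above, the sketch below shows a revocation check done against a locally cached list of revoked certificate IDs, synced periodically, rather than a per-scan query to the issuer. The endpoint URL, response shape, and the choice of a simple ID list are assumptions made for illustration, not the actual CoWIN API.</p>
<pre>
# Illustrative sketch of a privacy-respecting revocation check. The endpoint
# URL and response shape are assumptions, not the CoWIN API.
import requests

REVOCATION_LIST_URL = "https://issuer.example/revoked-certificate-ids"  # hypothetical

_revoked_ids = set()

def sync_revocation_list():
    """Periodically download the full list of revoked certificate IDs.

    The issuer learns only that this verifier refreshed its list, not which
    certificate was scanned, when, or where.
    """
    global _revoked_ids
    response = requests.get(REVOCATION_LIST_URL, timeout=10)
    response.raise_for_status()
    _revoked_ids = set(response.json())

def is_revoked(certificate_id):
    """Check revocation entirely against the local cache: no per-scan request."""
    return certificate_id in _revoked_ids
</pre>
<p>The tradeoff noted above still applies: the list can grow long, so the verifier pays in bandwidth and storage for what it gains in privacy.</p>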
<h4>Conclusion</h4>
<p>Our analysis reveals that without incorporating strong, privacy-invasive identity checks, digital verification of vaccine certificates does not provide any security benefit over manually inspecting a piece of paper. The utility of digital verification is limited to purposes that already conduct strong identity checks.
<br /><br />In addition to their limited applicability, in their current form, these digital certificates also generate a trail of data and metadata, giving both government and industry an opportunity to infringe upon the privacy of the individuals using them.
<br /><br />Keeping this in mind, the adoption of this technology should be discouraged for everyday use.</p>
<p> </p>
<h4>References</h4>
<p>[1] Exceptions <a href="https://web.archive.org/web/20210511045921/https://www.mohfw.gov.in/pdf/SOPforCOVID19VaccinationofPersonswithoutPrescribedIdentityCards.pdf" target="_blank">exist</a> for people without state-issued identity documents.</p>
<p>[2] This information was gathered by inspecting three vaccine certificates linked to the author’s CoWIN account, which they were authorised to view, and may not be fully accurate.</p>
<p>[3] This design is similar to Aadhaar’s “<a href="https://resident.uidai.gov.in/offline-kyc" target="_blank">offline KYC</a>” process.</p>
<p>[4] “Aadhaar Card: UIDAI says downloaded versions on ordinary paper, mAadhaar perfectly valid”, <em>Zee Business</em>, April 29 2019, <em>https://www.zeebiz.com/india/news-aadhaar-card-uidai-says-downloaded-versions-on-ordinary-paper-maadhaar-perfectly-valid-96790</em>.</p>
<p>[5] This check was also verified to be present in the reference <a href="https://github.com/egovernments/DIVOC/blob/261a61093b89990fe34698f9ba17367d4cb74c34/public_app/src/components/CertificateStatus/index.js#L125" target="_blank">code</a> made available for private-sector applications incorporating this functionality, suggesting that private sector apps will also be affected by this.</p>
<p>[6] <a href="https://en.wikipedia.org/wiki/Certificate_revocation_list" target="_blank">Certificate Revocation Lists</a> allow the server to provide a list of revoked certificates to the verifier, instead of the verifier querying the server each time. This, however, can place heavy bandwidth and storage requirements on the verifying app as this list can potentially grow long.</p>
<p>[7] Range Queries are described in this <a href="https://www.ics.uci.edu/~gts/paps/st06.pdf" target="_blank">paper</a>. In this method, the verifier requests revocation status from the server by specifying a range of certificate identifiers within which the certificate being verified lies. If there are any revoked certificates within this range, the server will send their identifiers to the verifier, who can then check if the certificate in question is on the list. For this to work, the range selected must be sufficiently large to include enough potential candidates to keep the server from guessing which one is in use.</p>
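<p>For readers who prefer code to prose, here is a minimal sketch of the range-query idea in reference [7]. The endpoint, the integer certificate IDs, and the fixed bucket size are assumptions made for illustration; the cited paper describes the scheme in full, including how ranges should be chosen so the server cannot easily guess the certificate in use.</p>
<pre>
# Illustrative sketch of a range-query revocation check (see reference [7]).
# Endpoint, ID format, and bucket size are assumptions for illustration.
import requests

RANGE_QUERY_URL = "https://issuer.example/revoked-in-range"  # hypothetical
BUCKET_SIZE = 100_000  # how many consecutive certificate IDs share one query

def is_revoked_range_query(certificate_id: int) -> bool:
    """Ask the server only for revocations within a broad range of IDs.

    The server sees the range, not the exact certificate being verified,
    so it can only guess which of the BUCKET_SIZE candidates was scanned.
    """
    start = (certificate_id // BUCKET_SIZE) * BUCKET_SIZE
    end = start + BUCKET_SIZE - 1
    response = requests.get(
        RANGE_QUERY_URL, params={"start": start, "end": end}, timeout=10
    )
    response.raise_for_status()
    revoked_in_range = set(response.json())  # IDs revoked within [start, end]
    return certificate_id in revoked_in_range
</pre>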
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/do-we-really-need-an-app-for-that-examining-the-utility-and-privacy-implications-of-india2019s-digital-vaccine-certificates'>https://cis-india.org/internet-governance/blog/do-we-really-need-an-app-for-that-examining-the-utility-and-privacy-implications-of-india2019s-digital-vaccine-certificates</a>
</p>
divyank | Privacy, Digital ID, Covid19, Appropriate Use of Digital ID | 2021-08-03T05:13:28Z | Blog Entry

State of Consumer Digital Security in India
https://cis-india.org/internet-governance/blog/state-of-consumer-digital-security-in-india
<b>This report attempts to identify the existing state of digital safety in India, with a mapping of digital threats, which will aid stakeholders in identifying and addressing digital security problems in the country. This project was funded by the Asia Foundation.</b>
<p style="text-align: justify;"> </p>
<p style="text-align: justify;">Since 2006, successive Union governments in India have shown increased focus on digital governance. The National e-Governance Plan was launched by the UPA government in2006, and several digital projects led by the state such as digitisation of the filing of taxes, appointment process for passports, corporate governance, and the Aadhaar programme(India’s unique digital identity system that utilises biometric and demographic data) arose under it, in the form of mission mode projects (projects that are part of a broader National e-governance initiative, each focusing on specific e-Governance aspects, like banking, land records, or commercial taxes). In 2014, when the NDA government came to power, the National e-Governance Plan was subsumed under the government’s flagship project of Digital India, and several mission mode projects were added. In the meantime, the internet connectivity, first in the form of wire connectivity, and later in the form of mobile connectivity has increased greatly. In the same period, use of digital services, first in new services native to the Internet such as email, social networking, instant messaging, and later the platformization and disruption of traditional business models in transportation, healthcare, finance and virtually every sector, has led to a deluge of digital private service providers in India.</p>
<p style="text-align: justify;">Currently, India has 500 million internet users — over a third of its total population — making it the country with the second largest number of Internet users after China. The uptake of these technological services has also been accompanied by several kinds of digital threats that an average digital consumer in India must regularly contend with. This report is a mapping of consumer-facing digital threats in India and is intended to aid stakeholders in identifying and addressing digital security problems. The first part of the report categorises digital threats into four kinds, Personal Data Threats, Online Content Related Threats, Financial Threats, and Online Sexual Harassment Threats. Threats under each category are then defined, with detailed consumer-facing consequences, and past instances where harm has been caused because of these threats.</p>
<hr />
<p>Read the full report <a href="https://cis-india.org/internet-governance/report-state-of-consumer-digital-security-in-india" class="internal-link" title="Report - State of Consumer Digital Security in India">here</a>.</p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/state-of-consumer-digital-security-in-india'>https://cis-india.org/internet-governance/blog/state-of-consumer-digital-security-in-india</a>
</p>
No publisher | pranav | Digital Governance | Privacy | Digital Knowledge | Internet Governance | Digital Media | 2021-07-05T11:07:24Z | Blog Entry
Advanced biometric technologies and new market entries tackle fraud, chase digital ID billions
https://cis-india.org/internet-governance/news/biometric-update-june-26-2021-chris-burt-advanced-biometric-technologies-and-new-market-entries-tackle-fraud-chase-digital-id-billions
<b>Amid forecasts of rapid growth and huge market potential, Techsign and Ping Identity have launched digital ID platforms, alongside new services, features, and even an investment fund.</b>
<p style="text-align: justify; ">The blog post by Chris Burt was <a class="external-link" href="https://www.biometricupdate.com/202106/advanced-biometric-technologies-and-new-market-entries-tackle-fraud-chase-digital-id-billions">published by Biometric Update</a> on June 26, 2021.</p>
<p style="text-align: justify; ">A new camera solution for under-display 3D face biometrics from Infineon and partners, and IPO filings by Clear and SenseTime show parallel investment activity in biometrics, meanwhile, and experts from Veridium and Intellicheck provide insight into the shifting technology and fraud landscapes, among the most widely-read stories this week on Biometric Update.</p>
<h2 style="text-align: justify; ">Top biometrics news of the week</h2>
<p style="text-align: justify; ">Several areas of the digital identity market continued to be very active, with a new investment fund launched to support startups in digital commerce and payments, Yoti joining a regulatory sandbox, Techsign launching a digital ID platform, and Mastercard and b.well reporting positive results from a recent pilot for their biometric healthcare platform. All this activity contributes to explaining Juniper Research’s <a href="https://www.biometricupdate.com/202106/digital-identity-verification-market-forecast-to-reach-16-7b-by-2026">forecast of rapid growth</a> in the sector to $16.7 billion in 2026, driven largely by spending on remote onboarding.</p>
<p style="text-align: justify; ">Okta CEO Todd McKinnon, meanwhile, told Barron’s that the total addressable market for identity and access management providers like Okta is something like <a href="https://www.biometricupdate.com/202106/okta-ceo-says-total-addressable-identity-and-access-management-market-near-80b">$80 billion</a>, as well as that effective integration is the key to solving biometrics challenges in the space. Entrust and Yubico formed an integration partnership, LoginRadius launched a new feature, Jamf launched a biometric tool for enterprises, and a certification program for IAM professionals was launched.</p>
<p style="text-align: justify; ">A list of goods for sale on the dark web includes a listing for <a href="https://www.biometricupdate.com/202106/biometric-selfies-and-forged-passports-identities-for-sale-on-the-dark-web">selfies holding an American ID credential</a>, which in theory could be used in a biometric spoofing attack. Cybersecurity researcher Luana Pascu helps guide readers through the report, and shares insights such as on the status of faked vaccination certificates on dark web marketplaces.</p>
<p style="text-align: justify; ">Ensuring the validity of the ID document a biometric identity verification process is based on, without adding too much friction, often means adopting <a href="https://www.biometricupdate.com/202106/intellicheck-ceo-on-building-the-foundations-for-biometric-verification-and-fraud-protection">layered risk profiling</a>, Intellicheck CEO Bryan Lewis tells <em>Biometric Update</em> in a sponsored post. The company has deep roots in detecting fraudulent documents and has found that even scanning the barcode on an identity document will not necessarily catch a fake if the unique security elements are not validated as part of the scan.</p>
<p style="text-align: justify; ">Fourthline Anti-Financial Crime Head Ro Paddock writes in a Biometric Update guest post about the ever-increasing sophistication of fraud attacks, which reached the level of computer-generated <a href="https://www.biometricupdate.com/202106/the-fraudsters-new-game-face">3D masks and deepfakes</a> during the pandemic,. In response, information-sharing between organizations will be necessary to understand the scope of these new threats, and how to defend against them.</p>
<p style="text-align: justify; ">Philippines’ election commission has launched an app to allow people to preregister for the <a href="https://www.biometricupdate.com/202106/philippines-launches-app-to-fast-track-biometric-voter-registration">voter roll online</a> before enrolling their biometrics in person, as the country continues digitizing its public services. Governments in Pakistan, Haiti and Nigeria are also making moves to improve the accessibility and trustworthiness of their electoral processes.</p>
<p style="text-align: justify; ">A partnership between Research ICT Africa and the Centre for Internet and Society, supported by the Omidyar Network, to explore the development of digital ID systems for the African context is explained in a <a href="https://researchictafrica.net/2021/06/21/why-digital-id-matters/" target="_blank">blog post</a>. The project will be based on an adaptation of the Evaluation Framework for Digital Identities which the CIS used to assess India’s Aadhaar system, with rule of law, rights and risk-based tests, and presented in a series of posts.</p>
<p style="text-align: justify; ">Details of Clear’s IPO plans emerged, including its intention to raise up to <a href="https://www.biometricupdate.com/202106/clear-ipo-could-raise-up-to-396m-in-hot-biometrics-investment-market">$396 million</a> on the NYSE. The $2.2 billion valuation aligns with some comparable companies, by revenue multiple, but the lower voting power of the shares on offer could be a restraining factor.</p>
<p style="text-align: justify; ">An even bigger IPO could be held by SenseTime later this year, with the Chinese AI firm looking to raise up to $2 billion <a href="https://www.biometricupdate.com/202106/not-smarting-from-us-sanctions-sensetime-says-its-ipo-is-on-again">on the Hong Kong exchange</a>. The company has been talking about a public stock launch since before the company was hit with restrictions to U.S. trade, which it indicates have had little impact.</p>
<p style="text-align: justify; ">The latest major funding round in digital identity is the largest yet, with <a href="https://www.biometricupdate.com/202106/transmit-security-raises-543m-to-grow-biometric-passwordless-authentication">Transmit Security raising $543 million</a> at a $2.2 billion valuation to expand the market reach of its passwordless biometric authentication technology. The company claims it is the highest ever Series A funding round in cybersecurity.</p>
<p style="text-align: justify; ">Bob Eckel, Aware CEO and International Biometrics + Identity Association (IBIA) Director and Board Member, discusses why people should own their own identity, identifying things and protecting supply chains, and his background in setting up air traffic control systems used all over the world with the Requis <a href="https://requis.com/podcasts/podcast-bob-eckel-biometrics-future-secured-identities/" target="_blank">Supply Chain Next podcast</a>. In the longer term Eckel sees biometric replacing passwords, and in the shorter term being used to make processes touchless.</p>
<p style="text-align: justify; ">Veridium CTO John Callahan guides Biometric Update through recent NIST guidance on the <a href="https://www.biometricupdate.com/202106/nist-touchless-fingerprint-biometrics-guidance-confirms-interoperability">interoperable use of contactless fingerprints</a> with contact-based back-end AFIS systems. The guidance, which changes definitions within the NIST ITL biometric container standard, but advises that the associated image quality metric does not apply to contactless prints, could spark further investment in the modality.</p>
<p style="text-align: justify; ">A new time-of-flight 3D imaging solution that could be used to implement facial authentication from <a href="https://www.biometricupdate.com/202106/under-display-camera-for-3d-face-biometrics-developed-by-infineon-pmd-arcsoft">under the display of mobile devices</a> without notches or bezels has been developed by partners Infineon, pmdtechnologies and ArcSoft. Based on the REAL3 sensor and ArcSoft’s computer vision algorithms, the solution is expected to reach availability in Q3 2021.</p>
<p style="text-align: justify; "><a href="https://www.biometricupdate.com/202106/ping-identity-adds-behavioral-biometrics-and-bot-detection-with-securedtouch-acquisition">Ping Identity has acquired SecuredTouch</a> in a deal with undisclosed financial details to integrate its behavioral biometrics-based continuous user authentication with the PingOne enterprise cloud platform. Ping also launched a consumer application for reusable credentials and added unified management features to its cloud platform at its Identiverse 2021 event.</p>
<p style="text-align: justify; ">Notre Dame-IBM Technology Ethics Lab Founding Director Elizabeth Renieris joins the MIT Sloan Management Review’s <a href="https://sloanreview.mit.edu/audio/starting-now-on-technology-ethics-elizabeth-renieris/" target="_blank">Me, Myself and AI podcast</a> to discuss the role of the lab, her path past and through some of the digital identity space’s key ethical developments, and the need to take the long view on technology to understand its ethical implications. Renieris makes a pitch for process-oriented regulations, based on the best understanding we have at the time.</p>
<p style="text-align: justify; ">ProctorU’s announcement that it will no longer sell fully-automated remote proctoring services is seen as a win in the battle against “the AI shell game” by the <a href="https://www.eff.org/deeplinks/2021/06/long-overdue-reckoning-online-proctoring-companies-may-finally-be-here" target="_blank">Electronic Frontier Foundation</a>. The descriptions of the balance between the automated and human decision-making by AI proctoring providers amount to doublespeak, the EFF says, before panning their human review processes, accuracy rates, and use of facial recognition.</p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/news/biometric-update-june-26-2021-chris-burt-advanced-biometric-technologies-and-new-market-entries-tackle-fraud-chase-digital-id-billions'>https://cis-india.org/internet-governance/news/biometric-update-june-26-2021-chris-burt-advanced-biometric-technologies-and-new-market-entries-tackle-fraud-chase-digital-id-billions</a>
</p>
No publisher | Chris Burt | Privacy | Internet Governance | UIDAI | Biometrics | Aadhaar | 2021-06-28T01:13:05Z | News Item