<?xml version="1.0" encoding="utf-8" ?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:syn="http://purl.org/rss/1.0/modules/syndication/" xmlns="http://purl.org/rss/1.0/">




    



<channel rdf:about="https://cis-india.org/search_rss">
  <title>Centre for Internet and Society</title>
  <link>https://cis-india.org</link>
  
  <description>These are the search results for the query, showing results 1 to 15.</description>
  
  
  
  
  <image rdf:resource="https://cis-india.org/logo.png"/>

  <items>
    <rdf:Seq>
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/medianama-february-18-2021-amber-sinha-data-protection-bill-consent-clause-state-function"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/medianama-february-21-2022-amber-sinha-data-protection-bill-digital-healthcare-case-study"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/the-wire-amber-sinha-august-2-2017-should-an-inability-to-precisely-define-privacy-render-it-untenable-as-a-right"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/privacy-is-not-a-unidimensional-concept"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/right-to-be-forgotten-a-tale-of-two-judgments"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/asian-age-amber-sinha-april-10-2017-privacy-in-the-age-of-big-data"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/openness/blog-old/comments-on-the-right-to-information-rules-2017"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/the-wire-amber-sinha-february-21-2017-can-the-judiciary-upturn-the-lok-sabha-speakers-decision-on-aadhaar"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/analysis-of-key-provisions-of-aadhaar-act-regulations"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/jobs/call-for-design-interns-201906"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/appropriate-use-of-digital-identity-alliance-announcement"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/events/rankathon-on-digital-rights-delhi-jan-08-2017"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/social-media-monitoring"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/privacy-after-big-data-workshop-report"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/deep-packet-inspection-how-it-works-and-its-impact-on-privacy"/>
        
    </rdf:Seq>
  </items>

</channel>


    <item rdf:about="https://cis-india.org/internet-governance/blog/medianama-february-18-2021-amber-sinha-data-protection-bill-consent-clause-state-function">
    <title>How Function Of State May Limit Informed Consent: Examining Clause 12 Of The Data Protection Bill</title>
    <link>https://cis-india.org/internet-governance/blog/medianama-february-18-2021-amber-sinha-data-protection-bill-consent-clause-state-function</link>
    <description>
        &lt;b&gt;The collective implication of leaving out ‘proportionality’ from Clause 12 is to provide very wide discretionary powers to the state.&lt;/b&gt;
        &lt;p&gt;The blog post was &lt;a class="external-link" href="https://www.medianama.com/2022/02/223-data-protection-bill-consent-clause-state-function/"&gt;published in Medianama&lt;/a&gt; on February 18, 2022. This is the first of a two-part series by Amber Sinha.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;In 2018, hours after the Committee of Experts led by Justice Srikrishna Committee released their report and draft bill, I wrote &lt;a href="https://www.livemint.com/Opinion/zY8NPWoWWZw8AfI5JQhjmL/Draft-privacy-bill-and-its-loopholes.html"&gt;an opinion piece&lt;/a&gt; providing my quick take on what was good and bad about the bill. A section of my analysis focused on Clause 12 (then Clause 13) which provides for non-consensual processing of personal data for state functions. I called this provision a ‘carte-blanche’ which effectively allowed the state to process a citizen’s data for practically all interactions between them without having to deal with the inconvenience of seeking consent. My former colleague, Pranesh Prakash &lt;a href="https://twitter.com/pranesh/status/1023116679440621568"&gt;pointed out&lt;/a&gt; that this was not a correct interpretation of the provision as I had missed the significance of the word ‘necessary’ which was inserted to act as a check on the powers of the state. He also pointed out, correctly, that in its construction, this provision is equivalent to the position in European General Data Protection Regulation (Article 6 (i) (e)), and is perhaps even more restrictive.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;While I agree with what Pranesh says above (his claims are largely factual, and there can be no basis for disagreement), my view of Clause 12 has not changed. While Clause 35 has been a focus of considerable discourse and analysis, for good reason, I continue to believe that Clause 12 remains among the most dangerous provisions of this bill, and I will try to unpack here, why.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Data Protection Bill 2021 has a chapter on the grounds for processing personal data, and one of those grounds is consent by the individual. The rest of the grounds deal with various situations in which personal data can be processed without seeking consent from the individual. Clause 12 lays down one of the grounds. It allows the state to process data without the consent of the individual in the following cases —&lt;/p&gt;
&lt;p&gt;a)  where it is necessary to respond to a medical emergency&lt;br /&gt;b)  where it is necessary for the state to provide a service or benefit to the individual&lt;br /&gt;c)  where it is necessary for the state to issue any certification, licence or permit&lt;br /&gt;d)  where it is necessary under any central or state legislation, or to comply with a judicial order&lt;br /&gt;e)  where it is necessary for any measures during an epidemic, outbreak or other threat to public health&lt;br /&gt;f)  where it is necessary for safety procedures during a disaster or breakdown of public order&lt;/p&gt;
&lt;p&gt;In order to carry out (b) and (c), there is also the added requirement that the state function must be authorised by law.&lt;/p&gt;
&lt;h2&gt;Twin restrictions in Clause 12&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;The use of the words ‘necessary’ and ‘authorised by law’ is intended to pose checks on the powers of the state. The first restriction seeks to limit actions to only those cases where the processing of personal data would be necessary for the exercise of the state function. This should mean that if the state function can be exercised without non-consensual processing of personal data, then it must be done so. Therefore, while acting under this provision, the state should only process my data if it needs to do so, to provide me with the service or benefit. The second restriction means that this would apply to only those state functions which are authorised by law, meaning only those functions which are supported by validly enacted legislation.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;What we need to keep in mind regarding Clause 12 is that the requirement of ‘authorised by law’ does not mean that legislation must provide for that specific kind of data processing. It simply means that the larger state function must have legal backing. The danger is how these provisions may be used with broad mandates. If the activity in question is non-consensual collection and processing of, say, demographic data of citizens to create state resident hubs which will assist in the provision of services such as healthcare, housing, and other welfare functions; all that may be required is that the welfare functions are authorised by law.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Scope of privacy under Puttaswamy&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;It would be worthwhile, at this point, to delve into the nature of restrictions that the landmark Puttaswamy judgement discussed that the state can impose on privacy. The judgement clearly identifies the principles of informed consent and purpose limitation as central to informational privacy. As discussed repeatedly during the course of the hearings and in the judgement, privacy, like any other fundamental right, is not absolute. However, restrictions on the right must be reasonable in nature. In the case of Clause 12, the restrictions on privacy in the form of denial of informed consent need to be tested against a constitutional standard. In Puttaswamy, the bench ​was ​not ​required ​to ​provide ​a ​legal ​test ​to ​determine ​the ​extent ​and ​scope ​of the ​right ​to ​privacy, but they do provide sufficient ​guidance ​for ​us ​to ​contemplate ​how ​the ​limits ​and ​scope ​of ​the ​constitutional ​right ​to ​privacy ​could ​be ​determined ​in ​future ​cases.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Puttaswamy judgement clearly states that “the right to privacy is protected as an intrinsic part of the right to life and personal liberty under Article 21 and as a part of the freedoms guaranteed by Part III of the Constitution.” By locating the right not just in Article 21 but also in the entirety of Part III, the bench clearly requires that “the drill of various Articles to which the right relates must be scrupulously followed.” This means that where transgressions on privacy relate to different provisions in Part III, the different tests under those provisions will apply along with those in Article 21. For instance, where the restrictions relate to personal freedoms, the tests under both Article 19 (right to freedoms) and Article 21 (right to life and liberty) will apply.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In the case of Clause 12, the three tests laid down by Justice Chandrachud are most operative —&lt;br /&gt;a) the existence of a “law”&lt;br /&gt;b) a “legitimate State interest”&lt;br /&gt;c) the requirement of “proportionality”.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The first test is already reflected in the use of the phrase ‘authorised by law’ in Clause 12. The test under Article 21 would imply that the function of the state should not merely be authorised by law, but that the law, in both its substance and procedure, must be ‘fair, just and reasonable.’ The next test is that of ‘legitimate state interest’. In its report, the Joint Parliamentary Committee places emphasis on Justice Chandrachud’s use of “allocation of resources for human development” in an illustrative list of legitimate state interests. The report claims that the ground, functions of the state, thus satisfies the legitimate state interest. We do not dispute this claim.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Proportionality and Clause 12&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;It is the final test of ‘proportionality’ articulated by the Puttaswamy judgement, which is most operative in this context. Unlike Clauses 42 and 43 which include the twin tests of necessity and proportionality, the committee has chosen to only employ one ground in Clause 12. Proportionality is a commonly employed ground in European jurisprudence and common law countries such as Canada and South Africa, and it is also an integral part of Indian jurisprudence. As commonly understood, the proportionality test consists of three parts —&lt;/p&gt;
&lt;p&gt;a)  the limiting measures must be carefully designed, or rationally connected, to the objective&lt;br /&gt;b)  they must impair the right as little as possible&lt;br /&gt;c)  the effects of the limiting measures must not be so severe on individual or group rights that the legitimate state interest, albeit important, is outweighed by the abridgement of rights.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The first test is similar to the test of proximity under Article 19. The test of ‘necessity’ in Clause 12 must be viewed in this context. It must be remembered that the test of necessity is not limited to only situations where it may not be possible to obtain consent while providing benefits. My reservations with the sufficiency of this standard stem from observations made in the report, as well as the relatively small amount of jurisprudence on this term in Indian law.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Srikrishna Report interestingly mentions three kinds of scenarios where consent should not be required — where it is not appropriate, necessary, or relevant for processing. The report goes on to give an example of inappropriateness. In cases where data is being gathered to provide welfare services, there is an imbalance in power between the citizen and the state. Having made that observation, the committee inexplicably arrives at a conclusion that the response to this problem is to further erode the power available to citizens by removing the need for consent altogether under Clause 12. There is limited jurisprudence on the standard of ‘necessity’ under Indian law. The Supreme Court has articulated this test as ‘having reasonable relation to the object the legislation has in view.’ If we look elsewhere for guidance on how to read ‘necessity’, the ECHR in Handyside v United Kingdom held it to be neither “synonymous with indispensable” nor does it have the “flexibility of such expressions as admissible, ordinary, useful, reasonable or desirable.” In short, there must be a pressing social need to satisfy this ground.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;However, the other two tests of proportionality do not find a mention in Clause 12 at all. There is no requirement of ‘narrow tailoring’, that the scope of non-consensual processing must impair the right as little as possible. It is doubly unfortunate that this test does not find a place, as unlike necessity, ‘narrow tailoring’ is a test well understood in Indian law. This means that while there is a requirement to show that processing personal data was necessary to provide a service or benefit, there is no requirement to process data in a way that there is minimal non-consensual processing. The fear is that as long as there is a reasonable relation between processing data and the object of the function of state, state authorities and other bodies authorised by it, do not need to bother with obtaining consent.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Similarly, the third test of proportionality is also not represented in this provision. It provides a test between the abridgement of individual rights and legitimate state interest in question, and it requires that the first must not outweigh the second. The absence of the proportionality test leaves Clause 12 devoid of any such consideration. Therefore, as long as the test of necessity is met under this law, it need not evaluate the denial of consent against the service or benefit that is being provided.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The collective implication of leaving out ‘proportionality’ from Clause 12 is to provide very wide discretionary powers to the state, by setting the threshold to circumvent informed consent extremely low. In the next post, I will demonstrate the ease with which Clause 12 can allow indiscriminate data sharing by focusing on the Indian government’s digital healthcare schemes.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/medianama-february-18-2021-amber-sinha-data-protection-bill-consent-clause-state-function'&gt;https://cis-india.org/internet-governance/blog/medianama-february-18-2021-amber-sinha-data-protection-bill-consent-clause-state-function&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Data Governance</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2022-03-01T14:56:49Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/medianama-february-21-2022-amber-sinha-data-protection-bill-digital-healthcare-case-study">
    <title>Clause 12 Of The Data Protection Bill And Digital Healthcare: A Case Study</title>
    <link>https://cis-india.org/internet-governance/blog/medianama-february-21-2022-amber-sinha-data-protection-bill-digital-healthcare-case-study</link>
    <description>
        &lt;b&gt;In light of the state’s emerging digital healthcare apparatus, how does Clause 12 alter the consent and purpose limitation model?&lt;/b&gt;
        &lt;p&gt;The blog post was &lt;a class="external-link" href="https://www.medianama.com/2022/02/223-data-protection-bill-digital-healthcare-case-study/"&gt;published in Medianama&lt;/a&gt; on February 21, 2022. This is the second in a two-part series by Amber Sinha.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;In the &lt;a href="https://www.medianama.com/2022/02/223-data-protection-bill-consent-clause-state-function/"&gt;previous post&lt;/a&gt;, I looked at provisions on non-consensual data processing for state functions under the most recent version of recommendations by the Joint Parliamentary Committee on India’s Data Protection Bill (DPB). The true impact of these provisions can only be appreciated in light of ongoing policy developments and real-life implications.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;To appreciate the significance of the dilutions in Clause 12, let us consider the Indian state’s range of schemes promoting digital healthcare. In July 2018, NITI Aayog, a central government policy think tank in India released a strategy and approach paper (Strategy Paper) on the formulation of the National Health Stack which envisions the creation of a federated application programming interface (API)-enabled health information ecosystem. While the Ministry of Health and Family Welfare has focused on the creation of Electronic Health Records (EHR) Standards for India during the last few years and also identified a contractor for the creation of a centralised health information platform (IHIP), this Strategy Paper advocates a completely different approach, which is described as a Personal Health Records (PHR) framework. In 2021, the National Digital Health Mission (NDHM) was launched under which a citizen shall have the option to obtain a digital health ID. A digital health ID is a unique ID and will carry all health records of a person.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;A Stack Model for Big Data Ecosystem in Healthcare&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;A stack model as envisaged in the Strategy Paper, consists of several layers of open APIs connected to each other, often tied together by a unique health identifier. The open nature of APIs has the advantage that it allows public and private actors to build solutions on top of it, which are interoperable with all parts of the stack. It is however worth considering both the ‘openness’ and the role that the state plays in it.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Even though the APIs are themselves open, they are a part of a pre-decided technological paradigm, built by private actors and blessed by the state. Even though innovators can build on it, the options available to them are limited by the information architecture created by the stack model. When such a technological paradigm is created for healthcare reform and health data, the stack model poses additional challenges. By tying the stack model to the unique identity, without appropriate processes in place for access control, siloed information, and encrypted communication, the stack model poses tremendous privacy and security concerns. The broad language under Clause 12 of the DPB needs to be looked at in this context.&lt;/p&gt;
&lt;p&gt;Clause 12 allows non-consensual processing of personal data where it is necessary “for the performance of any function of the state authorised by law” in order to provide a service or benefit from the State. In the previous post, I had highlighted the import of the use of only ‘necessity’ to the exclusion of ‘proportionality’. Now, we need to consider its significance in light of the emerging digital healthcare apparatus being created by the state.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The National Health Stack and National Digital Health Mission together envision an intricate system of data collection and exchange which in a regulatory vacuum would ensure unfettered access to sensitive healthcare data for both the state and private actors registered with the platforms. The Stack framework relies on repositories where data may be accessed from multiple nodes within the system. Importantly, the Strategy Paper also envisions health data fiduciaries to facilitate consent-driven interaction between entities that generate the health data and entities that want to consume the health records for delivering services to the individual. The cast of characters involve the National Health Authority, health care providers and insurers who access the National Health Electronic Registries, unified data from different programmes such as National Health Resource Repository (NHRR), NIN database, NIC and the Registry of Hospitals in Network of Insurance (ROHINI), private actors such as Swasth, iSpirt who assist the Mission as volunteers. The currency that government and private actors are interested in is data.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The promised benefits of healthcare data in an anonymised and aggregate form range from Disease Surveillance to Pharmacovigilance as well as Health Schemes Management Systems and Nutrition Management, benefits which have only been more acutely emphasised during the pandemic. However, the pandemic has also normalised the sharing of sensitive healthcare data with a variety of actors, without much thinking on much-needed data minimisation practises.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The potential misuses of healthcare data include greater state surveillance and control, predatory and discriminatory practices by private actors which rely on Clause 12 to do away with even the pretense of informed consent so long as the processing of data is deemed necessary by the state and its private sector partners to provide any service or benefit.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Subclause (e) in Clause 12, which was added in the last version of the Bill drafted by MeitY and has been retained by the JPC, allows processing wherever it is necessary for ‘any measures’ to provide medical treatment or health services during an epidemic, outbreak or threat to public health. Yet again, the overly-broad language used here is designed to ensure that any annoyances of informed consent can be easily brushed aside wherever the state intends to take any measures under any scheme related to public health.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Effectively, how does the framework under Clause 12 alter the consent and purpose limitation model? Data protection laws introduce an element of control by tying purpose limitation to consent. Individuals provide consent to specified purposes, and data processors are required to respect that choice. Where there is no consent, the purposes of data processing are sought to be limited by the necessity principle in Clause 12. The state (or authorised parties) must be able to demonstrate necessity to the exercise of state function, and data must only be processed for those purposes which flow out of this necessity. However, unlike the consent model, this provides an opportunity to keep reinventing purposes for different state functions.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In the absence of a data protection law, data collected by one agency is shared indiscriminately with other agencies and used for multiple purposes beyond the purpose for which it was collected. The consent and purpose limitation model would have addressed this issue. But, by having a low threshold for non-consensual processing under Clause 12, this form of data processing is effectively being legitimised.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/medianama-february-21-2022-amber-sinha-data-protection-bill-digital-healthcare-case-study'&gt;https://cis-india.org/internet-governance/blog/medianama-february-21-2022-amber-sinha-data-protection-bill-digital-healthcare-case-study&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Data Governance</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2022-03-01T15:07:44Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/the-wire-amber-sinha-august-2-2017-should-an-inability-to-precisely-define-privacy-render-it-untenable-as-a-right">
    <title>Should an Inability to Precisely Define Privacy Render It Untenable as a Right?</title>
    <link>https://cis-india.org/internet-governance/blog/the-wire-amber-sinha-august-2-2017-should-an-inability-to-precisely-define-privacy-render-it-untenable-as-a-right</link>
    <description>
        &lt;b&gt;The judges may still be able to articulate the manner in which limits for a right to privacy may be arrived at, without explicitly specifying them.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article was &lt;a class="external-link" href="https://thewire.in/163695/inability-precisely-define-privacy-render-untenable-right/"&gt;published in the Wire&lt;/a&gt; on August 2, 2017.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;Ludwig Wittgenstein wrote in his book, &lt;i&gt;Philosophical Investigations&lt;/i&gt;,  that things which we expect to be connected by one essential common  feature, may be connected by a series of overlapping similarities, where  no one feature is common. Instead of having one definition that works  as a grand unification theory, concepts often draw from a common pool of  characteristics. Drawing from overlapping characteristics that exist  between family members, Wittgenstein uses the phrase ‘family  resemblances’ to refer to such concepts.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In his book, &lt;i&gt;Understanding Privacy&lt;/i&gt;, Daniel Solove makes a  case for privacy being a family resemblance concept. Responding to the  discontent in conceptualising privacy, Solove attempted to ground  privacy not in a tightly defined idea, but around a web of diverse yet  connected ideas. Some of the diverse human experiences that we  instinctively associate with privacy are bodily privacy, relationships  and family, home and private spaces, sexual identity, personal  communications, ability to make decisions without intrusions and sharing  of personal data. While these are widely diverse concepts, intrusions  upon or interferences with these experiences are all understood as  infringements of our privacy.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Other scholars too have recognised this dynamic, evolving and  difficult to pinpoint nature of privacy. Robert Post described privacy  as a concept “engorged with various and distinct meanings.” Helen  Nissenbaum advocates a dynamic idea of privacy to be understood in terms  of contextual norms.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The ongoing arguments in the Supreme Court on the existence of a  constitutional right to privacy can also be viewed in the context of the  idea of privacy as a family resemblance concept. In their arguments,  the counsels for the petitioners have tried to make a case for privacy  as a multi-dimensional fundamental right. Senior advocate Gopal  Subramanium argued before the court that privacy inheres in the concept  of liberty and dignity under Constitution of India, and is presupposed  by various other rights such as freedom of speech, good conscience, and  freedom to practice religion. He further goes on say that there are four  aspects to privacy – spatial, decisional, informational and the right  to develop personality. Shyam Divan, also arguing for the petitioners,  further added that privacy includes the right to be left alone, freedom  of thought, freedom to dissent, bodily integrity and informational  self-determination.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;When the chief justice brought up the need to define the extent of  the right to privacy, the counsels raised concerns about the right being  defined too specifically. This reluctance was borne out of the  recognition that by its very nature, the right to privacy is a cluster  of rights, with multiple dimensions manifesting themselves in different  ways depending on the context. Both advocates, Subramaniam and Arvind  Datar, argued that court must not engage in an exercise to definitively  catalog all the different aspects of the right, foreclosing the future  development of the law on point. This reluctance was also a result of  the fact that the court has isolated the question of the existence of  the right to privacy and how it may apply in the case of the Aadhaar  project. Usually judges are able to ground legal principles in the  relevant facts of the case while developing precedents. The referral to  this bench is only on the limited question of the existence of a  constitutional right to privacy. Therefore, any limits that are  articulated by the court on the right exist without the benefit of a  context.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;On the other hand, the Attorney General (AG) argued that this very  aspect of privacy was a rationale for not declaring it a fundamental  right. At various points during the arguments, he indicated that the  ambiguous and vague nature of the concept of privacy made it unsuitable  as a fundamental right. Similarly, Tushar Mehta, arguing for Unique  Identification Authority of India, also sought to deny privacy’s  existence as a fundamental right as it is too subjective and vague.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The above argument assumes that the inability to precisely define  privacy renders its untenable as a right. The key question is whether  this lack of a common denominator makes privacy too vague a right,  liable to expansive misinterpretations. Conceptions that do not have  fixed and sharp boundaries, are not boundless. What it means is that the  boundaries can often be fuzzy and in a state of constant evolution, but  the limits and boundaries always exist.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;At one point during the hearings, Justice Rohinton Nariman wanted the  counsels to work on the parameters of challenge for state action with  respect to privacy. As mentioned earlier, in the absence of facts to  work with, such an exercise is fraught with risks. However, the judges  may still be able to articulate the manner in which such limits may be  arrived at, without specifying them. Justice Nariman himself later  agrees that the judicial examination must proceed on a case by case  basis, taking into account not only the tests under Article 14,19 and 21  under which petitioners have tried to locate privacy, but also under  any other concurrent rights which may be infringed.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The AG also argued that the infringement of privacy in itself does not amount to a violation of the rights under Article 21; rather, in some cases the transgressions on privacy may lead to an infringement of a person’s right to liberty, and only in such cases should the fundamental rights be invoked. Thus, the argument made was that there was no need to declare privacy a fundamental right, but only to acknowledge that limiting privacy may sometimes lead to violations of already existing rights. This argument may have been more cogent had he identified specific dimensions of privacy which, according to him, do not qualify as fundamental rights. However, this might have meant conceding that other dimensions of privacy do, in fact, amount to fundamental rights.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;It must be remembered that the problem of changing or multiple meanings is not limited to privacy. As the bench noted, drawing comparisons to the concepts of ‘liberty’ and ‘dignity’, these are constitutionally recognised values which equally suffer from a multitude of meanings based on context. The government’s position here is in line with critiques of privacy that Daniel Solove seeks to bust in his book. The idea of privacy evolves with time and people. And people, whether from a developed or a developing polity, have an instinctive appreciation for it. The absence of a precise definition does not necessarily do great disservice to a concept, especially one that is fundamental to our freedoms.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/the-wire-amber-sinha-august-2-2017-should-an-inability-to-precisely-define-privacy-render-it-untenable-as-a-right'&gt;https://cis-india.org/internet-governance/blog/the-wire-amber-sinha-august-2-2017-should-an-inability-to-precisely-define-privacy-render-it-untenable-as-a-right&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2017-08-04T01:49:56Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/privacy-is-not-a-unidimensional-concept">
    <title>Privacy is not a unidimensional concept</title>
    <link>https://cis-india.org/internet-governance/privacy-is-not-a-unidimensional-concept</link>
    <description>
&lt;b&gt;The right to privacy is important not only for our negotiations with the information age but also to counter the transgressions of a welfare state. A robust right to privacy is essential for all citizens in India to defend their individual autonomy in the face of invasive state actions purportedly for the public good. The ruling of this nine-judge bench will have far-reaching impact on the extent and scope of rights available to us all.&lt;/b&gt;
        
&lt;div&gt;This article, written by Amber Sinha was published in the &lt;a class="external-link" href="http://economictimes.indiatimes.com/news/politics-and-nation/aadhar-privacy-is-not-a-unidimensional-concept/articleshow/59716562.cms"&gt;Economic Times&lt;/a&gt; on July 23, 2017.&amp;nbsp;&lt;/div&gt;
&lt;div&gt;
      &lt;br /&gt;&lt;/div&gt;
&lt;div&gt;In a disappointing case of judicial evasion by the apex court,
      it has taken over 600 days since the reference order was passed
      on August 11, 2015, for this bench to be constituted. Over two days
      of arguments, the counsels for the petitioners have presented
      before the court why the right to privacy, despite not finding a
      mention in the Constitution of India, is a fundamental right
      essential to a person’s dignity and liberty, and must be read into
      not one but multiple articles of the Constitution. The government
      will make its arguments in the coming week.&lt;/div&gt;
&lt;div&gt;One must wonder why we are debating the contours of the right
      to privacy, which 40 years of jurisprudence had lulled us into
      believing we already had. The answer to that can be found in a
      series of hearings in the Aadhaar case that began in 2012. Justice
      KS Puttaswamy, a former Karnataka High Court judge, filed a
      petition before the Supreme Court questioning the validity of the
      Aadhaar project due to its lack of legislative basis (the Aadhaar
      Act has since been passed, in 2016) and its transgressions on our
      fundamental rights. Over time, a number of other petitions also
      made their way to the apex court, challenging different aspects of
      the Aadhaar project. Since then, five different interim orders by
      the Supreme Court have stated that no person should suffer because
      they do not have an Aadhaar number. Aadhaar, according to the
      court, could not be made mandatory to avail benefits and services
      from government schemes. Further, the court has limited the use of
      Aadhaar to specific schemes: LPG, PDS, MGNREGA, the National Social
      Assistance Programme, the Pradhan Mantri Jan Dhan Yojna and EPFO.&lt;br /&gt;
      &lt;br /&gt;&lt;/div&gt;
&lt;div&gt;The real spanner in the works in this case was
      the stand taken by Mukul Rohatgi, then attorney general of India,
      who, in a hearing before the court in July 2015, stated that there
      is no constitutionally guaranteed right to privacy. His reliance
      was on two Supreme Court judgments in MP Sharma v Satish Chandra
      (1954) and Kharak Singh v State of Uttar Pradesh (1962): both
      cases, decided by eight- and six-judge benches respectively,
      denied the existence of a constitutional right to privacy. As the
      subsequent judgments which upheld the right to privacy were by
      smaller benches, Rohatgi claimed that MP Sharma and Kharak Singh
      still prevailed over them, until they were overruled by a larger
      bench.&lt;/div&gt;
&lt;div&gt;The reference to a larger bench has since delayed the entire
      matter, even as a number of government schemes have made Aadhaar
      mandatory. This reading of privacy as a unidimensional concept by
      the courts is, with due respect, erroneous. Privacy, as a concept,
      includes within its scope spatial, familial, informational and
      decisional aspects. We all have a legitimate expectation of
      privacy in our private spaces, such as our homes, and in our
      personal relationships. Similarly, we must be able to exercise
      some control over how personal data, like our financial
      information, are disseminated. Most importantly, privacy gives us
      the space to make autonomous choices and decisions without
      external interference. All these dimensions of privacy must stand
      as distinct rights. In MP Sharma, the court rejected a certain
      aspect of the right of privacy by refusing to acknowledge a right
      against search and seizure. This in no way prevented the court,
      even in the form of a smaller bench, from ruling on any other
      aspects of privacy, including those that are relevant to the
      Aadhaar case.&lt;/div&gt;
&lt;div&gt;&amp;nbsp;&lt;/div&gt;
&lt;div&gt;The limited referral to this bench means that the court will
      have to rule on the status of privacy and its possible limitations
      in isolation, without even going into the details of the Aadhaar
      case (based on the nature of protection that this bench accords to
      privacy, the petitioners and defendants in the Aadhaar case will
      have to argue afresh on whether the project does infringe this
      most fundamental right). There are no facts of the case to ground
      the legal principles in, and defining the contours of a right can
      be a difficult exercise. The court must be wary of how any limits
      they put on the right may be used in future. Equally, it is
      important to articulate that any limitations on the right to
      privacy due to competing interests such as national security and
      public interest must be imposed only when necessary and always be
      proportionate. &lt;br /&gt;
      &lt;br /&gt;&lt;/div&gt;
&lt;p&gt;
    It will not be enough for the court to merely state that we have a
    constitutional right to privacy. The judges would be well advised to
    cut through the muddle of existing privacy jurisprudence and
    unequivocally establish the various facets of the right. Without
    that, we may not be able to withstand the modern dangers of
    surveillance, and the denial of bodily integrity and
    self-determination through the forcible collection of information.
    The nine judges, in their collective wisdom, must not only ensure
    that we have a right to privacy, but also clearly articulate a
    robust reading of this right capable of withstanding the growing
    interferences with our autonomy.&lt;/p&gt;
&lt;div&gt;&amp;nbsp;&lt;/div&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/privacy-is-not-a-unidimensional-concept'&gt;https://cis-india.org/internet-governance/privacy-is-not-a-unidimensional-concept&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Aadhaar</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2017-08-07T08:02:20Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/right-to-be-forgotten-a-tale-of-two-judgments">
    <title>Right to be Forgotten: A Tale of Two Judgments</title>
    <link>https://cis-india.org/internet-governance/blog/right-to-be-forgotten-a-tale-of-two-judgments</link>
    <description>
        &lt;b&gt;In the last few months, there have been contrasting judgments from two Indian high courts, Karnataka and Gujarat, on matters relating to the right to be forgotten. The two high courts heard pleas on issues to do with the right of individuals to have either personal information redacted from the text of judgments available online or such judgments removed from publicly available sources.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;While one High Court (Karnataka) ordered the removal of personal details from the judgment,&lt;a href="#_ftn1" name="_ftnref1"&gt;&lt;sup&gt;[1]&lt;/sup&gt;&lt;/a&gt; the other (Gujarat) dismissed the plea&lt;a href="#_ftn2" name="_ftnref2"&gt;&lt;sup&gt;[2]&lt;/sup&gt;&lt;/a&gt;. In this post, we try to understand the global jurisprudence on the right to be forgotten, and how the contrasting judgments in India may be located within it.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Background&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The ‘right to be forgotten’ has gained prominence since a matter was referred to the Court of Justice of the European Union (CJEU) in 2014 by a Spanish court.&lt;a href="#_ftn3" name="_ftnref3"&gt;&lt;sup&gt;[3]&lt;/sup&gt;&lt;/a&gt; In this case, Mario Costeja González had disputed the fact that a Google search of his name continued to show results leading to an auction notice of his repossessed home. The fact that Google continued to make available in its search results an event in his past, which had long been resolved, was claimed by González to be a breach of his privacy. He filed a complaint with the Spanish Data Protection Agency (AEPD in its Spanish acronym) to have the online newspaper reports about him, as well as the related search results appearing on Google, deleted or altered. While the AEPD did not agree to his demand to have the newspaper reports altered, it ordered Google Spain and Google, Inc. to remove the links in question from their search results. The case was brought in appeal before the Spanish High Court, which referred the matter to the CJEU. In a judgement with far-reaching implications, the CJEU held that where the information is ‘inaccurate, inadequate, irrelevant or excessive,’ individuals have the right to ask search engines to remove links with personal information about them. The court also ruled that even if the physical servers of the search engine provider are located outside the jurisdiction of the relevant Member State of the EU, these rules would apply if the provider has a branch office or subsidiary in the Member State.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The ‘right to be forgotten’ is a misnomer, and essentially when we speak of it in the context of the proposed laws in EU, we refer to the rights of individuals to seek erasure of certain data that concerns them. The basis of what has now evolved into this right is contained in the 1995 EU Data Protection Directive, with Article 12 of the Directive allowing a person to seek deletion of personal data once it is no longer required.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Critical to our understanding of the rationale for how the ‘right to be forgotten’ is being framed in the EU is an appreciation of how European laws perceive the privacy of individuals. Unlike in the United States (US), where privacy may be seen as a corollary of personal liberty protecting against unreasonable state intrusions, European laws view privacy as an aspect of personal dignity, and are more concerned with protection from third parties, particularly the media. The most important way in which this manifests itself is in where the burden to protect privacy rights lies. In Europe, privacy policy often dictates intervention from the state, whereas in the US, in many cases it is up to the individuals to protect their privacy.&lt;a href="#_ftn4" name="_ftnref4"&gt;&lt;sup&gt;[4]&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Since the advent of the Internet, both the nature and quantity of information existing about individuals has changed dramatically. This personal information is no longer limited to newspaper reports and official or government records either. Our use of social media, micro-discussions on Twitter, photographs and videos uploaded by us or others tagging us, every page or event we like, favourite or share—all contribute to our digital footprint. Add to this the information created not by us but about us by both public and private bodies storing data about individuals in databases, our digital shadows begin to far exceed the data we create ourselves. It is abundantly clear that we exist in a world of Big Data, which relies on algorithms tracking repeated behaviour by our digital selves. It is in this context that a mechanism which enables the purging of some of this digital shadow makes sense.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Further, it is not only the nature and quantity of information that has changed, but also the means through which this information can be accessed. In the pre-internet era, access to records was often made difficult by procedural hurdles. Permissions or valid justifications were required to access certain kinds of data. Even for information available in the public domain, the process of gaining access was often far too cumbersome. Now digital information not only continues to exist indefinitely, but can also be readily accessed through search engines. It is in this context that, in a 2007 paper, Viktor Mayer-Schönberger pioneered the idea of memory and forgetting for the digital age.&lt;a href="#_ftn5" name="_ftnref5"&gt;&lt;sup&gt;[5]&lt;/sup&gt;&lt;/a&gt; He proposed that all forms of personal data should carry an additional metadata field specifying an expiration date, switching the default from information existing endlessly to information having a temporal limit after which it is deleted. While this may be a radical suggestion, we have since seen proposals to allow individuals some control over information about them.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In 2016, the EU released the final version of the General Data Protection Regulation. The regulation provides for a right to erasure under Article 17, which would enable a data-subject to seek deletion of data.&lt;a href="#_ftn6" name="_ftnref6"&gt;&lt;sup&gt;[6]&lt;/sup&gt;&lt;/a&gt; Notably, except in the heading of the provision, Article 17 makes no reference to the word ‘forgetting.’ Rather the right made available in this regulation is in the form of making possible ‘erasure’ and ‘abstention from further dissemination.’ This is significant because what the proposed regulations provide for is not an overarching framework to enable or allow ‘forgetting’ but a limited right which may be used to delete certain data or search results. Providing a true right to be forgotten would pose issues of interpretation as to what ‘forgetting’ might mean in different contexts and the extent of measures that data controllers would have to employ to ensure it. The proposed regulation attempts to provide a specific remedy which can be exercised in the defined circumstances without having to engage with the question of ‘forgetting’.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The primary arguments made against the ‘right to be forgotten’ have come from its conflict with the right to freedom of speech. Jonathan Zittrain has argued against the rationale that the right to be forgotten merely alters results on search engines without deleting the actual source and thus does not curtail the freedom of expression.&lt;a href="#_ftn7" name="_ftnref7"&gt;&lt;sup&gt;[7]&lt;/sup&gt;&lt;/a&gt; He has compared this altering of search results to letting a book remain in the library but making the catalogue unavailable. According to Zittrain, a better approach would be to allow data subjects to provide their side of the story and more context to the information about them, rather than allowing any kind of erasure. Unlike in the US, the European approach is to balance free speech against other concerns. So while one of the exceptions in sub-clause (3) of Article 17 provides that information may not be deleted where it is necessary to exercise the right to free speech, free speech does not completely trump privacy as the value that must be protected. On the other hand, US constitutional law would tend to give more credence to First Amendment rights and allow them to be compromised only in very limited circumstances. As per the position of the US Supreme Court in &lt;i&gt;Florida Star&lt;/i&gt; v. &lt;i&gt;B.J.F.&lt;/i&gt;, lawfully obtained information may be restricted from publication only in cases involving a ‘state interest of the highest order’. This position would allow any potential right to be forgotten to be exercised in only the most limited of circumstances, and privacy and reputational harm would not satisfy the standard. For these reasons, the right to be forgotten as it exists in Article 17 may be unworkable in the US.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Issues in application&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Significant technical challenges remain in the effective and consistent application of Article 17. One key issue concerns how ‘personal data’ is defined and understood, and how its interpretation will affect this right in different contexts. Under the regulation, the term ‘personal data’ includes any information relating to an individual. Some ambiguity remains about whether information which may not uniquely identify a person, but identifies them as part of a small group, could be considered within the scope of personal data. This becomes relevant, for instance, where one seeks the erasure of information which, without referring to an individual, points fingers at a family. At the same time, the piece of information sought to be erased by a person may often contain personal information about more than one individual. There is no clarity over whether the consensus of all the individuals concerned should be required, and if not, on what parameters the wishes of one individual should prevail over the others. Another important question, as yet unanswered, is whether the same standards for removal of content should apply to private individuals and to those in public life.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The issue of what constitutes personal data, and can therefore be erased, gets further complicated in cases of derived data about individuals used in statistics and other forms of aggregated content. While it would be difficult to argue that the right to be forgotten needs to be extended to such forms of information, not erasing such derived content poses the risk of the primary information being inferred from it. In addition, Article 17(1)(a) provides for deletion in cases where the data is no longer necessary for the purposes for which it was collected or used. The standards for circumstances which satisfy this criterion are, as yet, unclear and may only be fully understood through a consistent application of this law.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Finally, once there are reasonable grounds to seek erasure of information, it is not clear how this erasure will be enforced practically. It may not be prudent to require that all copies of the impugned data be deleted beyond recovery, to the extent technologically possible. A more reasonable solution might be to permit the data to remain available in encrypted form, much like certain records are sealed and subject to the strictest confidentiality obligations. In most cases, it may be sufficient to ensure that the records of the impugned data are removed from search results and database reports without actually tampering with the information as it may exist. These are some of the challenges which the practical application of this right will face, and it is necessary to take them into account in enforcing the proposed regulations.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;The two Indian judgments&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;In the first case (before the Gujarat High Court), the petitioner entered a plea for “permanent restraint [on] free public exhibition of the judgment and order.” The judgment in question concerned proceedings against the petitioner for a number of offences, including culpable homicide amounting to murder. The petitioner was acquitted both by the Sessions court and by the High Court before which he was pleading. The petitioner’s primary contention was that despite the judgment being classified as ‘unreportable’, it was published by an online repository of judgments and was also indexed by Google search. The decision of the High Court to dismiss the petition rested on the following factors: a) the failure on the part of the petitioner to show any provisions in law which are attracted, or any threat to the constitutional right to life and liberty; and b) that publication on a website does not amount to ‘reporting’, as reporting refers only to that by law reports.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;While the second point of reasoning made by the court is problematic, both in terms of the function of precedent served by reported judgments and in the basis for reducing the scope of ‘reporting’ to law reports alone, the first point is of direct relevance to our current discussion. The lack of available legal provisions points to the absence of data protection legislation in India. Had there been privacy legislation addressing how personal information may be dealt with, it is possible that it would have had instructive provisions for situations like these. In the absence of such a law, the only recourse an individual has is to seek constitutional protection under one of the fundamental rights, most notably Article 21, which over the years has emerged as the infinite repository of unenumerated rights. However, rights under Article 21 are typically of a vertical nature, i.e., available only against the state. Their application in cases where a private party is involved remains questionable, at best.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In contrast, in the second case, the Karnataka High Court ruled in favor of the petitioner. In this case, the petitioner’s daughter had instituted both criminal and civil proceedings against a person. However, the parties later arrived at a compromise, one of the conditions of which was the quashing of all the proceedings which had been initiated. The petitioner raised concerns that his daughter’s name appeared in the cause title and was easily searchable. The court, while making vague references to the “trend in the Western countries” of following the “right to be forgotten” as a matter of rule in sensitive cases involving women in general, and in highly sensitive cases involving rape or affecting the modesty and reputation of the person concerned, held in the petitioner’s favor and ordered that the name be redacted from the cause title and the body of the order before release to any service provider. The second judgment is all the more problematic because, while it makes a reference to jurisprudence in other countries, it bases the relief not on the fundamental right to privacy but on the idea of the modesty and reputation of women, which has no clear basis in either Indian or comparative jurisprudence.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Conclusion&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The above two cases demonstrate the problem of the lack of a clear legal basis in the judiciary’s interpretation of the right to be forgotten. Not only did the courts fail to take refuge in any clear legal provisions in Indian law while ruling on the existence of this right, they also did not engage in any analysis of comparative jurisprudence such as the GDPR or the Costeja judgment. Such ad-hoc jurisprudence underlines the need for data protection legislation, as in its absence divergent views are likely to be taken on this issue, without clear legal direction. It is likely that most matters concerning the right to erasure will involve private parties as data controllers. In such cases, the existing jurisprudence on the right to privacy as interpreted under Article 21 may also be of limited value. Further, as has been pointed out above, the right to be forgotten needs to be very clearly qualified by conditions, and its conflict with the right to freedom of expression under Article 19 must be addressed. Therefore, it is imperative that a comprehensive data protection law addresses these issues.&lt;/p&gt;
&lt;hr style="text-align: justify; " /&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref1" name="_ftn1"&gt;&lt;sup&gt;[1]&lt;/sup&gt;&lt;/a&gt; Sri Vasunathan vs The Registrar, available at &lt;a href="http://www.iltb.net/2017/02/karnataka-hc-on-the-right-to-be-forgotten/"&gt;http://www.iltb.net/2017/02/karnataka-hc-on-the-right-to-be-forgotten/&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref2" name="_ftn2"&gt;&lt;sup&gt;[2]&lt;/sup&gt;&lt;/a&gt; Dharmraj Bhanushankar Dave v. State of Gujarat, available at &lt;a href="https://drive.google.com/file/d/0BzXilfcxe7yueXFJWG5mZ1pKaTQ/view"&gt;https://drive.google.com/file/d/0BzXilfcxe7yueXFJWG5mZ1pKaTQ/view&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref3" name="_ftn3"&gt;&lt;sup&gt;[3]&lt;/sup&gt;&lt;/a&gt; Google Spain et al v. Mario Costeja González, available at &lt;a href="http://curia.europa.eu/juris/document/document_print.jsf?doclang=EN&amp;amp;docid=152065"&gt;http://curia.europa.eu/juris/document/document_print.jsf?doclang=EN&amp;amp;docid=152065&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref4" name="_ftn4"&gt;&lt;sup&gt;[4]&lt;/sup&gt;&lt;/a&gt; &lt;a href="http://www.europarl.europa.eu/RegData/etudes/STUD/2015/536459/IPOL_STU(2015)536459_EN.pdf"&gt;http://www.europarl.europa.eu/RegData/etudes/STUD/2015/536459/IPOL_STU(2015)536459_EN.pdf&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref5" name="_ftn5"&gt;&lt;sup&gt;[5]&lt;/sup&gt;&lt;/a&gt; Mayer-Schoenberger, Viktor, Useful Void: The Art of Forgetting in the Age of Ubiquitous Computing (April 2007). KSG Working Paper No. RWP07-022. Available at SSRN: https://ssrn.com/abstract=976541 or &lt;a href="http://dx.doi.org/10.2139/ssrn.976541"&gt;http://dx.doi.org/10.2139/ssrn.976541&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref6" name="_ftn6"&gt;&lt;sup&gt;[6]&lt;/sup&gt;&lt;/a&gt; Article 17 (1) states: &lt;i&gt;The data subject shall have the right to obtain from the controller the erasure of personal data concerning him or her without undue delay and the controller shall have the obligation to erase personal data without undue delay where one of the following grounds applies: &lt;/i&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;i&gt;(a) the personal data are no longer necessary in relation to the purposes for which they were collected or otherwise processed;&lt;/i&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;i&gt;(b) the data subject withdraws consent on which the processing is based according to point (a) of Article 6(1), or point (a) of Article 9(2), and where there is no other legal ground for the processing;&lt;/i&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;i&gt;(c) the data subject objects to the processing pursuant to Article 21(1) and there are no overriding legitimate grounds for the processing, or the data subject objects to the processing pursuant to Article 21(2);&lt;/i&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;i&gt;(d) the personal data have been unlawfully processed;&lt;/i&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;i&gt;(e) the personal data have to be erased for compliance with a legal obligation in Union or Member State law to which the controller is subject;&lt;/i&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;i&gt;(f) the personal data have been collected in relation to the offer of information society services referred to in Article 8(1).&lt;/i&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref7" name="_ftn7"&gt;&lt;sup&gt;[7]&lt;/sup&gt;&lt;/a&gt; Zittrain, Jonathan, “Don’t Force Google to ‘Forget’”, The New York Times, May 14, 2014. Available at &lt;a href="https://www.nytimes.com/2014/05/15/opinion/dont-force-google-to-forget.html"&gt;https://www.nytimes.com/2014/05/15/opinion/dont-force-google-to-forget.html&lt;/a&gt;.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/right-to-be-forgotten-a-tale-of-two-judgments'&gt;https://cis-india.org/internet-governance/blog/right-to-be-forgotten-a-tale-of-two-judgments&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Right to be Forgotten</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2017-04-07T02:27:03Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/asian-age-amber-sinha-april-10-2017-privacy-in-the-age-of-big-data">
    <title>Privacy in the Age of Big Data</title>
    <link>https://cis-india.org/internet-governance/blog/asian-age-amber-sinha-april-10-2017-privacy-in-the-age-of-big-data</link>
    <description>
        &lt;b&gt;Personal data is freely accessible, shared and even sold, and those to whom this information belongs have little control over its flow.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article was published in the &lt;a class="external-link" href="http://www.asianage.com/india/all-india/100417/privacy-in-the-age-of-big-data.html"&gt;Asian Age&lt;/a&gt; on April 10, 2017.&lt;/p&gt;
&lt;hr style="text-align: justify; " /&gt;
&lt;p style="text-align: justify; "&gt;In 2011 it was estimated that the quantity of data produced globally surpassed 1.8 zettabyte. By 2013, it had increased to 4 zettabytes. This is a result of digital services which involve constant data trails left behind by human activity. This expansion in the volume, velocity, and variety of data available, together with the development of innovative forms of statistical analytics on the data collected, is generally referred to as “Big Data”. Despite significant (though largely unrealised) promises about Big Data, which range from improved decision-making, increased efficiency and productivity to greater personalisation of services, concerns remain about the impact of such datafication of all human activity on an individual’s privacy. Privacy has evolved into a sweeping concept, including within its scope matters pertaining to control over one’s body, physical space in one’s home, protection from surveillance, and from search and seizure, protection of one’s reputation as well as one’s thoughts. This generalised and vague conception of privacy not only comes with great judicial discretion, it also thwarts a fair understanding of the subject. Robert Post called privacy a concept so complex and “entangled in competing and contradictory dimensions, so engorged with various and distinct meanings”, that he sometimes “despairs whether it can be usefully addressed at all”.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This also leaves the idea of privacy vulnerable to considerable suspicion and ridicule. However, while there is a lack of clarity over the exact contours of what constitutes privacy, there is general agreement over its fundamental importance to our ability to lead whole lives. In order to understand the impact of datafied societies on privacy, it is important to first delve into the manner in which we exercise our privacy. The ideas of privacy and data management that are prevalent can be traced to the Fair Information Practice Principles (FIPP). These principles are the forerunners of most privacy regimes internationally, such as the OECD Privacy Guidelines, APEC Framework, or the nine National Privacy Principles articulated by the Justice A.P. Shah Committee Report. All of these frameworks have rights to notice, consent and correction, and how the data may be used, as their fundamental principles. It makes the data subject to the decision-making agent about where and when her/his personal data may be used, by whom, and in what way. The individual needs to be notified and his consent obtained before his personal data is used. If the scope of usage extends beyond what he has agreed to, his consent will be required for the increased scope.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In theory, this system sounds fair. Privacy is a value tied to the personal liberty and dignity of an individual. It is only appropriate that the individual should be the one holding the reins and taking the large decisions about the use of his personal data. This makes the individual empowered and allows him to weigh his own interests in exercising his consent. The allure of this paradigm is that in one elegant stroke, it seeks to ensure that consent is informed and free and also to implement an acceptable trade-off between privacy and competing concerns. This approach worked well when the number of data collectors were less and the uses of data was narrower and more defined. Today’s infinitely complex and labyrinthine data ecosystem is beyond the comprehension of most ordinary users. Despite a growing willingness to share information online, most people have no understanding of what happens to their data.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The quantity of data being generated is expanding at an exponential rate. From smartphones and televisions, trains and airplanes, sensor-equipped buildings and even the infrastructures of our cities, data now streams constantly from almost every sector and function of daily life, “creating countless new digital puddles, lakes, tributaries and oceans of information”. The inadequacy of the regulatory approaches and the absence of a comprehensive data protection regulation is exacerbated by the emergence of data-driven business models in the private sector and the adoption of data-driven governance approach by the government. The Aadhaar project, with over a billion registrants, is intended to act as a platform for a number of digital services, all of which produce enormous troves of data. The original press release by the Central Government reporting the approval by the Cabinet of Ministers of the Digital India programme, speaks of “cradle to grave” digital identity as one of its vision areas.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;While the very idea of the government wanting to track its citizens’ lives from cradle to grave is creepy enough in itself, let us examine for a minute what this form of datafied surveillance will entail. A host of schemes under Digital India shall collect and store information through the life cycle of an individual. The result, as we can see, is building databases on individuals, which when combined, will provide a 360 degree view into the lives of individuals. Alongside the emergence of India Stack, a set of APIs built on top of the Aadhaar, conceptualised by iSPIRT, a consortium of select IT companies from India, to be deployed and managed by several agencies, including the National Payments Corporation of India, promises to provide a platform over which different private players can build their applications.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The sum of these interconnected parts will lead to a complete loss of anonymity, greater surveillance and impact free speech and individual choice. The move towards a cashless economy — with sharp nudges from the government — could lead to lack of financial agencies in case of technological failures as has been the case in experiments with digital payments in Africa. Lack of regulation in emerging data driven sectors such as Fintech can enable predatory practices where right to remotely deny financial services can be granted to private sector companies. An architecture such as IndiaStack enables datafication of financial transactions in a way that enables linked and structured data that allows continued use of the transaction data collected. It is important to recognise that at the stage of giving consent, there are too many unknowns for us to make informed decisions about the future uses of our personal data. Despite blanket approvals allowing any kind of use granted contractually through terms of use and privacy policies, there should be legal obligations overriding this consent for certain kinds of uses that may require renewed consent.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Biometrics-based identification in UK: &lt;/b&gt;In  2005, researchers from London School of Economics and Political Science  came out with a detailed report on the UK Identity Cards Bill (‘UK  Bill’) — the proposed legislation for a national identification system  based on biometrics. The project also envisaged a centralised database  (like India) that would store personal information along with the entire  transaction history of every individual. The report pointed strongly  against the centralising storage of information and suggested other  alternatives such as a system based on smartcards (where biometrics are  stored on the card itself) or offline biometric-reader terminals.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;As per the report, the alternatives would also have been cheaper as neither required real-time online connectivity. In India, online authentication is a far greater challenge. According to Network Readiness Index, 2016, India ranks 91, whereas UK is placed eight. Poor Internet connectivity can raise a lot of problems in the future including paralysis of transactions. The UK identification project was subsequently discarded as a result of the privacy and cost considerations raised in this report.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Aadhaar: Privacy concerns&lt;/h3&gt;
&lt;ol style="text-align: justify; "&gt;
&lt;li&gt;Once the data is collected through National Information Utilities, it will be privatised and controlled by private utilities.&lt;/li&gt;
&lt;li&gt;Once an individual’s data is entered in the system, it cannot be deleted. That individual will have no control over it.&lt;/li&gt;
&lt;li&gt;Aadhaar data (demographic details along with photographs) is shared with private entities, including telecom companies, as per the Aadhaar (Targeted Delivery of Financial and Other Subsidies, Benefits and Services) Act, 2016, with the consent of the Aadhaar number holder, to fulfil their e-KYC requirements. The data is shared in encrypted form through a secured channel.&lt;/li&gt;
&lt;li&gt;119 banks are live on the Aadhaar Enabled Payment System (AEPS).&lt;/li&gt;
&lt;li&gt;More than 33.87 crore transactions have taken place through AEPS, up from only 46 lakh in May 2014.&lt;/li&gt;
&lt;li&gt;As on 30-9-2016, 78 government schemes were linked to Aadhaar.&lt;/li&gt;
&lt;li&gt;The Aadhaar (Targeted Delivery of Financial and Other Subsidies, Benefits and Services) Act, 2016, provides that no core-biometric information (fingerprints, iris scan) shall be shared with anyone for any reason whatsoever (Sec 29) and that the biometric information shall not be used for any purpose other than generation of Aadhaar and authentication.&lt;/li&gt;
&lt;li&gt;Access to the data repository of UIDAI, called the Central Identities Data Repository(CIDR), is provided to third parties or private companies.&lt;/li&gt;
&lt;/ol&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Central Monitoring System&lt;/b&gt; (CMS) is already live in  Delhi, New Delhi and Mumbai. Union minister Ravi Shankar Prasad revealed  this in one of his replies in the Lok Sabha last year. CMS has been set  up to automate the process of Lawful Interception &amp;amp; Monitoring of  telecommunications.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Central Monitoring System&lt;/b&gt; (CMS) is already live in  Delhi, New Delhi and Mumbai. Union minister Ravi Shankar Prasad revealed  this in one of his replies in the Lok Sabha last year. CMS has been set  up to automate the process of Lawful Interception &amp;amp; Monitoring of  telecommunications.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Lawful Intercept &lt;/b&gt;and Monitoring (LIM) systems are used  by the Indian Government to intercept records of voice, SMSes, GPRS  data, details of a subscriber’s application and recharge history and  call detail record (CDR) and monitor Internet traffic, emails,  web-browsing, Skype and any other Internet activity of Indian users.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/asian-age-amber-sinha-april-10-2017-privacy-in-the-age-of-big-data'&gt;https://cis-india.org/internet-governance/blog/asian-age-amber-sinha-april-10-2017-privacy-in-the-age-of-big-data&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Aadhaar</dc:subject>
    
    
        <dc:subject>Big Data</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2017-04-11T14:43:59Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/openness/blog-old/comments-on-the-right-to-information-rules-2017">
    <title>Comments on the Right to Information Rules, 2017</title>
    <link>https://cis-india.org/openness/blog-old/comments-on-the-right-to-information-rules-2017</link>
    <description>
        &lt;b&gt;On March 31st, 2017, the Ministry of Personnel, Public Grievances and Pensions, Department of Personnel and Training released a Circular framing rules under the Right to Information Act, 2005 (“RTI Rules”). The Ministry invited comments on the RTI Rules. CIS submitted its comments on April 25, 2017.&lt;/b&gt;
        
&lt;h3 dir="ltr"&gt;1. Preliminary&lt;/h3&gt;
&lt;p dir="ltr"&gt;1.1 On March 31st, 2017, the Ministry of Personnel, Public Grievances and Pensions, Department of Personnel and Training released a Circular framing rules under the Right to Information Act, 2005 (“RTI Rules”). The Ministry invited comments on on the RTI Rules.&lt;/p&gt;
&lt;h3 dir="ltr"&gt;2. The Centre for Internet and Society&lt;/h3&gt;
&lt;p dir="ltr"&gt;2.1. The Centre for Internet and Society, (“CIS”), is a non-profit organisation that undertakes interdisciplinary research on internet and digital technologies from policy and academic perspectives. The areas of focus include digital accessibility for persons with diverse abilities, access to knowledge, intellectual property rights, openness (including open data, free and open source software, open standards, and open access), internet governance, telecommunication reform, digital privacy, and cyber-security.&lt;/p&gt;
&lt;h3 dir="ltr"&gt;3. Comments&lt;/h3&gt;
&lt;p dir="ltr"&gt;3.1 General Comments&lt;/p&gt;
&lt;p dir="ltr"&gt;The new RTI Rules introduce various procedural hurdles and provides a great deal of discretionary power to the CIC in dealing with RTI applications and appeals. One of the provisions which has attracted attention in the past also is the abatement of appeals upon the death of the RTI applications. This provision, explored in more detail is especially objectionable in light of the threats that RTI activists face.&lt;/p&gt;
&lt;p&gt;&lt;strong id="docs-internal-guid-f3638231-aeb5-9d2f-4329-a2fd7d07f81a"&gt;&lt;br /&gt;&lt;/strong&gt;&lt;/p&gt;
&lt;p dir="ltr"&gt;3.2 Specific Comments&lt;/p&gt;
&lt;p dir="ltr"&gt;3.2.1 Rule 4 of the RTI Rules states that the fees for providing information under the RTI Act would be ‘as notified by Central Government from time to time’. While the RTI Rules also prescribe the fee for filing RTI applications, this phrase provides a window to increase the fees through subsequent notifications. We recommend that the phrase “or as notified by Central Government from time to time” be deleted in order prevent prohibitive increase in the fees in future.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;&lt;br /&gt;&lt;/strong&gt;&lt;/p&gt;
&lt;p dir="ltr"&gt;3.2.2 Rule 4 of the RTI Rules also specifies the fees for provision of information via floppies and diskettes. There is no plausible reason to engage in continued rulemaking applicable to outdated modes of data storage. It would be of much more help if the rules were to prescribe fees for CDs, DVDs and email. We also submit that no fees need be charged for information provided through emails, and this mode of communication must be adopted where possible.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;&lt;br /&gt;&lt;/strong&gt;&lt;/p&gt;
&lt;p dir="ltr"&gt;3.2.3 Rule 8 (1)(viii) states that every appellant must affirm that they have not filed an appeal pertaining to similar matters before the Commission or any court. However, the same matter can lead to multiple counts of causes of actions, and the principle of res judicata barring further action should not apply in these cases. Therefore, it is recommended that this requirement is deleted.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;&lt;br /&gt;&lt;/strong&gt;&lt;/p&gt;
&lt;p dir="ltr"&gt;3.2.4 &amp;nbsp;Rule 12 permits the withdrawal of an appeal on the request of the appellant and &amp;nbsp;the &amp;nbsp;abatement &amp;nbsp;of &amp;nbsp;an &amp;nbsp;appeal &amp;nbsp;on &amp;nbsp;the &amp;nbsp;death &amp;nbsp;of &amp;nbsp;the &amp;nbsp;appellant. This provisions needs to be evaluated in light of the increasing number of cases of threats received by RTI activists. There have been close to 400 documented cases of attacks on RTI applicants,[1] including cases of murder and physical assault. This provision will serve to enable withdrawal of RTI appeals through harassment and other means of coercion.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;&lt;br /&gt;&lt;/strong&gt;&lt;/p&gt;
&lt;p dir="ltr"&gt;Further, the abatement of an appeal upon death of an RTI appellant is a clause without any merit and could translate into murders of appellants to cause abatement of the appeal. Additionally, the Supreme Court’s judgment in the matter of Union of India v. Namit Sharma[2] must be kept in mind which clarified the position that RTI applications and appeals are not in the nature of lis and deal with the question of whether requested information ought to be disclosed. Therefore, there is no reason why appeals should abate upon the demise of the appellant.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;&lt;br /&gt;&lt;/strong&gt;&lt;/p&gt;
&lt;p dir="ltr"&gt;3.2.5 &amp;nbsp;Rule 14 permits the CIC to return complaints due to non-compliance with the procedural rules in Rule 13. Such rules[3] have been used in the past to return complaints on unreasonable or artificial grounds. This is an example of additional procedural hurdles introduced by through the rulemaking process instead of making the process more citizen friendly.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;&lt;br /&gt;&lt;/strong&gt;&lt;/p&gt;
&lt;p dir="ltr"&gt;3.2.6 Rule 15 (iii) of the RTI Rules gives the CIC the discretion to close a case without even allowing hearing to the applicant. There is no requirement on the CIC to provide a detailed reasoning of its determination either. This rule is in violation of the right to be heard before adjudication under natural justice principles.&lt;/p&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;p dir="ltr"&gt;3.7 The redressal mechanism under Rule 16 of the RTI Rules leaves a lot to be desired. Beginning with the use of the term ‘communication’ to refer to the complaint regarding a non-compliance of the CIC’s order, the rule takes a cavalier approach to addressing the significant number of cases of non-compliance with the CIC’s order. Further, there is no clear procedure spelt out with regard to how the CIC will deal with such matters and whether parties may be heard before making an adjudication. Further, there is an inconsistency in that a communication may be rejected if not submitted in the prescribed format, whereas in the case of appeals it clearly stated that they may not be returned/rejected only on the ground of non-compliance with the format.&lt;/p&gt;
&lt;p dir="ltr"&gt;&amp;nbsp;&lt;/p&gt;
&lt;p dir="ltr"&gt;[1]  http://attacksonrtiusers.org&lt;/p&gt;
&lt;p dir="ltr"&gt;[2]  https://indiankanoon.org/doc/47938967/&lt;/p&gt;
&lt;p dir="ltr"&gt;[3]  Rule 9 of the RTI Rules, 2012.&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/openness/blog-old/comments-on-the-right-to-information-rules-2017'&gt;https://cis-india.org/openness/blog-old/comments-on-the-right-to-information-rules-2017&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Openness</dc:subject>
    
    
        <dc:subject>RTI</dc:subject>
    
    
        <dc:subject>Call for Comments</dc:subject>
    

   <dc:date>2017-04-27T09:25:42Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/the-wire-amber-sinha-february-21-2017-can-the-judiciary-upturn-the-lok-sabha-speakers-decision-on-aadhaar">
    <title>Can the Judiciary Upturn the Lok Sabha Speaker’s Decision on Aadhaar?</title>
    <link>https://cis-india.org/internet-governance/blog/the-wire-amber-sinha-february-21-2017-can-the-judiciary-upturn-the-lok-sabha-speakers-decision-on-aadhaar</link>
    <description>
        &lt;b&gt;When ruling on the petition filed by Jairam Ramesh challenging passing the Aadhaar Act as a money Bill, the court has differing precedents to look at.&lt;/b&gt;
        &lt;p&gt;The article was &lt;a class="external-link" href="https://thewire.in/110795/aadhaar-money-bill-judiciary/"&gt;published in the Wire&lt;/a&gt; on February 21, 2017.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;In &lt;a href="http://thewire.in/2016/04/24/the-aadhaar-act-is-not-a-money-bill-31297/" target="_blank" title="an earlier article"&gt;an earlier article&lt;/a&gt;, I had argued that the characterisation of the &lt;a href="https://www.google.co.in/url?sa=t&amp;amp;rct=j&amp;amp;q=&amp;amp;esrc=s&amp;amp;source=web&amp;amp;cd=5&amp;amp;cad=rja&amp;amp;uact=8&amp;amp;ved=0ahUKEwj0xo6U_KDSAhVHLo8KHcygCVEQFggvMAQ&amp;amp;url=https%3A%2F%2Fuidai.gov.in%2Fimages%2Fthe_aadhaar_act_2016.pdf&amp;amp;usg=AFQjCNHDmJKdO8jdfGZJKLKRJQpHdf1Frw&amp;amp;sig2=B_YbWncu6eyZHJ1MFTD0NA" rel="external nofollow" target="_blank" title="Aadhaar (Targeted Delivery of Financial and Other Subsidies, Benefits and Services) Act"&gt;Aadhaar (Targeted Delivery of Financial and Other Subsidies, Benefits and Services) Act&lt;/a&gt;,  as a money Bill by Sumitra Mahajan, speaker of the Lok Sabha, was  erroneous. Specifically, I had argued that upon perusal of Article 110  (1) of the constitution, the Aadhaar Act does not satisfy the conditions  required of a money Bill. For a legislation to be classified as a money  Bill, it must comprise of ‘only’ provisions dealing with the following  matters: (a) imposition, regulation and abolition of any tax, (b)  borrowing or other financial obligations of the government of India, (c)  custody, withdrawal from or payment into the Consolidated Fund of India  (CFI) or Contingent Fund of India, (d) appropriation of money out of  CFI, (e) expenditure charged on the CFI or (f) receipt or custody or  audit of money into CFI or public account of India; or (g) any matter  incidental to any of the matters specified in sub-clauses (a) to (f).&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Article 110 is modelled on Section 1(2) of the UK’s Parliament Act, 1911, which also defines money Bills as those only dealing with certain enumerated matters. The use of the word ‘only’ was brought up by Ghanshyam Singh Gupta during the constituent assembly debates. He pointed out that the use of the word ‘only’ limits the scope money Bills to only those legislations which did not deal with other matters. His amendment to delete the word ‘only’ was rejected, clearly establishing the intent of the framers of the constitution to keep the ambit of money Bills extremely narrow. G.V. Mavalankar, the first speaker of Lok Sabha, had stated that the word ‘only’ must not be construed so as to give an overly restrictive meaning. For instance, a Bill which deals with taxation could have provisions which deal with the administration of the tax. The finance minister, Arun Jaitley, referred to these words by Mavalankar, justifying the classification of the Aadhaar Act as a money Bill.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;While the Aadhaar Bill does makes references to benefits, subsidies and services funded by the CFI, even a cursory reading of the Bill reveals its main objectives as creating a right to obtain a unique identification number and providing for a statutory apparatus to regulate the entire process. Any reasonable reading of the legislation would be hard pressed to view all provisions in the Aadhaar Act, aside from the one creating a charge on the CFI, as merely administrative provisions incidental to the creation such charge. The mere fact of establishing the Aadhaar number as the identification mechanism for benefits and subsidies funded by the CFI does not give it the character of a money Bill. The Bill merely speaks of facilitating access to unspecified subsidies and benefits rather than their creation and provision being the primary object of the legislation. Erskine May’s seminal textbook, Parliamentary Practice, is instructive in this respect and makes it clear that a legislation which simply makes a charge on the consolidated fund does not becomes a money Bill if otherwise its character is not that of one. Further, the subordinate regulations notified under the Aadhaar Act deal almost entirely with matters to do with enrolment, updation, authentication of the Aadhaar number and related matters such as data security regulations and sharing of information collected, rather than the provision of benefits or subsidies or disbursal of funds otherwise from the CFI.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;However, in the context of the petition filed by former Union minister Jairam Ramesh challenging the passage of the law on Aadhaar as a money Bill, the more important question is whether the judiciary has a right to question the speaker’s decision in such a matter. If not, any other questions about whether the legislation is a money Bill will remain merely academic in nature.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Irregularity vs illegality&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Article 110 (3) clearly states that with regard to the question whether a legislation is a money Bill or not, the decision of the speaker is final and binding. The question is whether such a clause completely excludes any judicial review. Further, Article 122 prohibits the courts from questioning the validity of any proceedings in parliament on the ground of any alleged irregularity of procedure.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;During the arguments in the court, the attorney general questioned the locus standi of Ramesh. The petition has been made under Article 32 of the constitution and the government argued that no fundamental rights of Ramesh were violated. However, the court has asked Ramesh to make his submission and adjourned the hearing to July. The petition by Ramesh would hinge largely on the powers of the judiciary to question the decision of the speaker of the Lok Sabha.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The powers of privilege that parliamentarians enjoy are integral to the principle of separation of powers. The rationale behind parliamentary privilege is to prevent interference in the lawmakers’ powers to perform essential functions. The ability to speak and vote inside the legislature without the fear of punishment is certainly essential to the role of a lawmaker. However, the extent of this protection lies at the centre of this discussion. During the constituent assembly debates, H.V. Kamath and others had argued for a schedule to exhaustively codify the existing privileges. However, B.R. Ambedkar pointed to the difficulty of doing so and parliamentary privilege on the lines of the British parliamentary practice was retained in the constitution. In the last few decades, a judicial position has emerged that courts could exercise a limited degree of scrutiny over privileges, as they are primarily responsible for interpreting the constitution.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In the matter of &lt;a href="https://indiankanoon.org/doc/1757390/" rel="external nofollow" target="_blank" title="Raja Ram Pal vs The Hon’ble Speaker, Lok Sabha"&gt;&lt;i&gt;Raja Ram Pal vs The Hon’ble Speaker, Lok Sabh&lt;/i&gt;a&lt;/a&gt;,  it had been clarified that proceedings of the legislature were immune  from questioning by courts in the case of procedural irregularity but  not in the case of illegality. In this case, the Supreme Court while  dealing with Article 122 stated that it does not oust review by the  judiciary in cases of “gross illegality, irrationality, violation of  constitutional mandate, mala fides, non-compliance with rules of natural  justice and perversity.”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In 1968, the speaker of the Punjab legislative assembly adjourned the  proceedings for a period of two months following rowdy behaviour.  Subsequently, an ordinance preventing such a suspension was promulgated  and the legislature was summoned by the governor to consider some  expedient financial matters. The speaker disagreed with the decision and  after some confusion, the deputy speaker passed a few Bills as money  Bills. While looking into the question of what was protected from  judicial review, the &lt;a href="https://indiankanoon.org/doc/36589/" rel="external nofollow" target="_blank" title="court stated"&gt;court stated&lt;/a&gt; that the protection did not extend to breaches of mandatory provisions  of the constitution, only to directory provisions. By that logic, if  Article 110 (1) is seen as a mandatory provision, a breach of its  provisions could lead to an interpretation that the Supreme Court may  well question an erroneous decision by the speaker of the Lok Sabha to  certify a legislation as a money Bill. The use of the word “shall” in  Article 110 (1), the nature and design of the provision, its overriding  impact on the other constitutional provisions granting the Rajya Sabha  powers are ample evidence of its mandatory nature. Based on the above,  Anup Surendranath has &lt;a href="http://ccgdelhi.org/doc/%28CCG-NLU%29%20Aadhaar%20Money%20Bill.pdf" rel="external nofollow" target="_blank" title="argued"&gt;argued&lt;/a&gt; that  the passage of the Aadhaar Act as a money Bill when it does not satisfy  the constitutional conditions for it does amount to a gross illegality.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The judicial precedent in &lt;i&gt;&lt;a href="https://indiankanoon.org/doc/60568976/" rel="external nofollow" target="_blank" title="Mohd. Saeed Siddiqui vs State of Uttar Pradesh"&gt;Mohd. Saeed Siddiqui vs State of Uttar Pradesh&lt;/a&gt;&lt;/i&gt; where the matter of the court’s power to question the decision of a  speaker was considered, though, leans in the other direction. In 2012,  the &lt;a href="https://www.google.co.in/url?sa=t&amp;amp;rct=j&amp;amp;q=&amp;amp;esrc=s&amp;amp;source=web&amp;amp;cd=1&amp;amp;cad=rja&amp;amp;uact=8&amp;amp;ved=0ahUKEwiRtov_iKHSAhVLuo8KHYhsClcQFggbMAA&amp;amp;url=http%3A%2F%2Fwww.lawsofindia.org%2Fdownloadfile.php%3Flawid%3D7834%26file%3Duttar_pradesh%2F1981%2F1981UP7.pdf%26pageurl%3D%252Fsingle%252Falpha%252F7.html&amp;amp;usg=AFQjCNGRW8-NChXALunaUbjZRrlM4IvCkA&amp;amp;sig2=rg6YCMf7qRqNw08NnctuhQ" rel="external nofollow" target="_blank" title="Uttar Pradesh Lokayukta and Up-Lokayuktas (Amendment) Act"&gt;Uttar Pradesh Lokayukta and Up-Lokayuktas (Amendment) Act&lt;/a&gt;,  2012 was passed as money Bill by the Uttar Pradesh state legislature.  Subsequently, a writ petition was filed challenging its constitutional  validity. A three-judge bench of the Supreme Court looked into the  application of Article 212. It is the provision corresponding to Article  122, dealing with the power of the courts to inquire into the  proceedings of the state legislature. The court held that Article 212  makes “it clear that the finality of the decision of the Speaker and the  proceedings of the State Legislature being important privilege of the  State Legislature, viz., freedom of speech, debate and proceedings are  not to be inquired by the Courts.” Importantly, ‘proceedings of the  legislature’ were deemed to include within its scope everything done in  transacting parliamentary business, including the passage of the Bill.  
While the court did acknowledge the limitations of parliamentary  privilege as established in the &lt;i&gt;Raja Ram Pal&lt;/i&gt; case, it did not adequately take into account the reasoning in it.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Aadhaar Act is a legislation which makes it mandatory of all  residents to enrol for a biometric identification system in order to  avail certain subsidies, benefits and services. It has huge potential  risks for individual privacy and national security and has been the  subject of an extremely high profile Public Interest Litigation. Its  passage as a money Bill, without any oversight from the Rajya Sabha and  an opportunity for substantial debate and discussion, is a fraud on the  Constitution. Whether or not the court chooses to see it that way  remains to be seen.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/the-wire-amber-sinha-february-21-2017-can-the-judiciary-upturn-the-lok-sabha-speakers-decision-on-aadhaar'&gt;https://cis-india.org/internet-governance/blog/the-wire-amber-sinha-february-21-2017-can-the-judiciary-upturn-the-lok-sabha-speakers-decision-on-aadhaar&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Aadhaar</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2017-02-27T15:44:56Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/analysis-of-key-provisions-of-aadhaar-act-regulations">
    <title>Analysis of Key Provisions of the Aadhaar Act Regulations </title>
    <link>https://cis-india.org/internet-governance/blog/analysis-of-key-provisions-of-aadhaar-act-regulations</link>
    <description>
        &lt;b&gt;In exercise of the powers conferred by the Aadhaar (Targeted Delivery of Financial and other Subsidies, Benefits and Services) Act, 2016 (Aadhaar Act), the UIDAI came out with a set of five regulations in late 2016. In this policy brief, we look at the five regulations and their key provisions, and highlight the issues that these regulations leave unresolved or unaddressed, as well as the new issues they create.&lt;/b&gt;
        &lt;hr style="text-align: justify; " /&gt;
&lt;p style="text-align: justify; "&gt;This blog post was edited by Elonnai Hickok&lt;/p&gt;
&lt;hr style="text-align: justify; " /&gt;
&lt;h3 style="text-align: justify; "&gt;Introduction&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;At the outset it is important to note that a concerning feature of these regulations is that they intend to govern the processes of a body which has been in existence for over six years, and has engaged in all the activities sought to be governed by these policies at a massive scale, considering the claims of over one billion Aadhaar number holders. However, the regulation do not acknowledge, let alone address past processes, practices, enrollments, authentications, use of technology etc.  this fact, and there are no provisions that effectively address  the past operations of the UIDAI. Below is an analysis of the five regulations issued thus far by the UIDAI.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Unique Identification Authority of India (Transactions of Business at Meetings of the Authority) Regulations&lt;a href="#_ftn1" name="_ftnref1"&gt;&lt;sup&gt;&lt;sup&gt;[1]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;These regulations framed under clause (h) of sub-section (2) of section 54 read with sub-section (1) of section 19 of the Aadhaar Act, deal with the meetings of the UIDAI, the process following up to each meeting, and the manner in which all meetings are to be conducted.&lt;/p&gt;
&lt;h4 style="text-align: justify; "&gt;Provision: Sub-Regulation 3.&lt;/h4&gt;
&lt;p style="text-align: justify; "&gt;Meetings of the Authority– (1) There shall be no less than three meetings of the Authority in a financial year on such dates and at such places as the Chairperson may direct and the interval between any two meetings shall not in any case, be longer than five months&lt;/p&gt;
&lt;h5 style="text-align: justify; "&gt;Observations:&lt;/h5&gt;
&lt;p style="text-align: justify; "&gt;The number of times that UIDAI would meet in a year is far too less, taking in account the significance of the responsibilities of UIDAI as the sole body for policy making for all issues related to Aadhaar. In contrast, the Telecom Regulatory Authority of India is required to meet at least once a month. Other bodies such as SEBI and IRDAI are also required to meet at least four times&lt;a href="#_ftn2" name="_ftnref2"&gt;&lt;sup&gt;&lt;sup&gt;[2]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; and six times&lt;a href="#_ftn3" name="_ftnref3"&gt;&lt;sup&gt;&lt;sup&gt;[3]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; in a year respectively.&lt;/p&gt;
&lt;h4 style="text-align: justify; "&gt;Provision: Sub-Regulation 8 (5)&lt;/h4&gt;
&lt;p style="text-align: justify; "&gt;Decisions taken at every meeting of the Authority shall be published on the website of Authority unless the Chairperson determines otherwise on grounds of ensuring confidentiality.&lt;/p&gt;
&lt;h5 style="text-align: justify; "&gt;Observations:&lt;/h5&gt;
&lt;p style="text-align: justify; "&gt;The Chairperson has the power to determine withholding publication of the decisions of the meeting on the broad grounds of ‘confidentiality’. Given the fact that the decisions taken by UIDAI as a public body can have very real implications for the rights of residents, the ground of confidentiality is not sufficient to warrant withholding publication. It is curious that instead of referring to the clearly defined exceptions laid down in other similar provisions such as the exceptions in Section 8 of the Right to Information Act, 2005, the rules merely refer to vague and undefined criteria of ‘confidentiality’.&lt;/p&gt;
&lt;h4 style="text-align: justify; "&gt;Provision: Sub-Regulation 14 (4)&lt;/h4&gt;
&lt;p style="text-align: justify; "&gt;Members of the Authority and invitees shall sign an initial Declaration at the first meeting of the Authority for maintaining the confidentiality of the business transacted at meetings of the Authority in Schedule II.&lt;/p&gt;
&lt;h5 style="text-align: justify; "&gt;Observations:&lt;/h5&gt;
&lt;p style="text-align: justify; "&gt;The above provision, combined with the fact that there is no provision regarding publication of the minutes of the meetings of UIDAI raise serious questions about the transparency of  its functioning.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Unique Identification Authority of India (Enrolment and Update) Regulations&lt;a href="#_ftn4" name="_ftnref4"&gt;&lt;sup&gt;&lt;sup&gt;[4]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;These regulations, framed under  sub-section (1), and sub-clauses (a), (b), (d,) (e), (j), (k), (l), (n), (r), (s), and (v) of sub-section (2), of Section 54 of the Aadhaar Act deals with the enrolment process, the generation of an Aadhaar number, updation of information and governs the conduct of enrolment agencies and associated third parties.&lt;/p&gt;
&lt;h4 style="text-align: justify; "&gt;Provisions:&lt;/h4&gt;
&lt;p style="text-align: justify; "&gt;Sub-Regulation 8 (2), (3) and (4)&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The standard enrolment/update software shall have the security features as may be specified by the Authority for this purpose.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;All equipment used in enrolment, such as computers, printers, biometric devices and other accessories shall be as per the specifications issued by the Authority for this purpose.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The biometric devices used for enrolment shall meet the specifications, and shall be certified as per the procedure, as may be specified by the Authority for this purpose.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Sub-Regulation 3 (2)&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The standards for collecting the biometric information shall be as specified by the Authority for this purpose.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Sub-Regulation 4 (5)&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The standards of the above demographic information shall be as may be specified by the Authority for this purpose.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Sub-Regulation 6 (2)&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;For residents who are unable to provide any biometric information contemplated by these regulations, the Authority shall provide for handling of such exceptions in the enrolment and update software, and such enrolment shall be carried out as per the procedure as may be specified by the Authority for this purpose.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Sub-Regulation 14 (2)&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In case of rejection due to duplicate enrolment, resident may be informed about the enrolment against which his Aadhaar number has been generated in the manner as may be specified by the Authority.&lt;/p&gt;
&lt;h5 style="text-align: justify; "&gt;Observations:&lt;/h5&gt;
&lt;p style="text-align: justify; "&gt;Though in February 2017,  the UIDAI published technical specifications for registered devices&lt;a href="#_ftn5" name="_ftnref5"&gt;&lt;sup&gt;&lt;sup&gt;[5]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;, the regulations  leave unaddressed issues such as lack of appropriately defined security safeguards in the Aadhaar. There is a general trend of continued deferrals in the regulations by stating that matters would be specified later on important aspects such as rejection of applications, uploading of the enrolment packet to the CIDR, the procedure for enrolling residents with biometric exceptions, the procedure for informing residents about acceptance/rejection of enrolment application, specifying the convenience fee for updation of residents’ information, the procedure for authenticating individuals across services etc.c. There is a clear failure to exercise the mandate delegated to UIDAI, leaving key matters to determined at a future unspecified date. The delay and ambiguity around when regulations will be defined is  all the more problematic  in light of the fact that the project has been implemented since 2010 and the Aadhaar number is now mandatory for availing a number of services.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Further it is important to note that a number of policies put out by the UIDAI predate these regulations, on which the regulations are  completely silent, thus neither endorsing previous policies  nor suggesting that they may be revisited. Further, the regulations choose to not engage with the question of operation of the Aadhaar project, enrolment and storage of data etc prior to the notification of these regulations, or the policies which these regulations may regularise. For instance, the regulations do not specify any measures to deal with issues arising out of enrolment devices used prior to the development of the February 2017 specifications.&lt;/p&gt;
&lt;h4 style="text-align: justify; "&gt;Provision: Sub-Regulation 32&lt;/h4&gt;
&lt;p style="text-align: justify; "&gt;The Authority shall set up a contact centre to act as a central point of contact for resolution of queries and grievances of residents, accessible to residents through toll free number(s) and/ or e-mail, as may be specified by the Authority for this purpose.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;(2) The contact centre shall:&lt;/p&gt;
&lt;ol style="text-align: justify; "&gt;
&lt;li&gt;Provide a mechanism to log queries or grievances and provide residents with a unique reference number for further tracking till closure of the matter;&lt;/li&gt;
&lt;li&gt;Provide regional language support to the extent possible;&lt;/li&gt;
&lt;li&gt;Ensure safety of any information received from residents in relation to their identity information;&lt;/li&gt;
&lt;li&gt;Comply with the procedures and processes as may be specified by the Authority for this purpose.&lt;/li&gt;
&lt;/ol&gt;
&lt;p style="text-align: justify; "&gt;(3) Residents may also raise grievances by visiting the regional offices of the Authority or through any other officers or channels as may be specified by the Authority.&lt;/p&gt;
&lt;h5 style="text-align: justify; "&gt;Observations:&lt;/h5&gt;
&lt;p style="text-align: justify; "&gt;While the setting up of a grievance redressal mechanism under the regulations is a welcome move, there is little clarity about the procedure to be followed, nor is a timeline for it specified. The chapter on grievance redressal is in fact one of the shortest chapters in the regulations. The only provision in this chapter deals with the setting up of a contact centre, a curious choice of term for what is supposed to be the primary quasi judicial grievance redressal body for the Aadhaar project. In line with the indifferent and insouciant terminology of ‘contact centre’, the chapter is restricted to the matters of the logging of queries and grievances by the contact centre, and does not address the matter of procedure or timelines, and even the substantive provisions about the nature of redress available. Furthermore, the obligation on the contact centre to protect information received is limited to ‘ensuring safety’ an ambiguous standard that does not speak to any other standards in Indian law.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Aadhaar (Authentication) Regulations, 2016&lt;a href="#_ftn6" name="_ftnref6"&gt;&lt;sup&gt;&lt;sup&gt;[6]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;These regulations, framed under  sub-section (1), and sub-clauses (f) and (w) of sub-section (2) of Section 54 of the Aadhaar Act deals with the authentication framework for Aadhaar numbers, the governance of authentication agencies and the procedure for collection, storage of authentication data and records.&lt;/p&gt;
&lt;h4 style="text-align: justify; "&gt;Provisions:&lt;/h4&gt;
&lt;p style="text-align: justify; "&gt;Sub-Regulation 5 (1)&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;At the time of authentication, a requesting entity shall inform the Aadhaar number holder of the following details:—&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;(a) the nature of information that will be shared by the Authority upon authentication;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;(b) the uses to which the information received during authentication may be put; and&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;(c) alternatives to submission of identity information&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Sub-Regulation 6 (2)&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;A requesting entity shall obtain the consent referred to in sub-regulation (1) above in physical or preferably in electronic form and maintain logs or records of the consent obtained in the manner and form as may be specified by the Authority for this purpose.&lt;/p&gt;
&lt;h5 style="text-align: justify; "&gt;Observations:&lt;/h5&gt;
&lt;p style="text-align: justify; "&gt;Sub-regulation 5 mentions that at the time of authentication, requesting entities shall inform the Aadhaar number holder of alternatives to submission of identity information for the purpose of authentication. Similarly, sub-regulation 6 mentions that requesting entity shall obtain the consent of the Aadhaar number holder for the authentication. However, in neither of the above circumstances do the regulations specify the clearly defined options that must be made available to the Aadhaar number holder in case they do not wish submit identity information, nor do the regulations specify the procedure to be followed in case the Aadhaar number holder does not provide consent.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Most significantly, this provision does little by way of allaying the fears raised by the language in Section 8 (4) of the Aadhaar Act which states that UIDAI “shall respond to an authentication query with a positive, negative or any other appropriate response sharing such identity information.” This section gives a very wide discretion to UIDAI to share personal identity information with third parties, and the regulations do not temper or qualify this power in any way.&lt;/p&gt;
&lt;h4 style="text-align: justify; "&gt;Sub-Regulation 11 (1) and (4)&lt;/h4&gt;
&lt;p style="text-align: justify; "&gt;The Authority may enable an Aadhaar number holder to permanently lock his biometrics and temporarily unlock it when needed for biometric authentication.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Authority may make provisions for Aadhaar number holders to remove such permanent locks at any point in a secure manner.&lt;/p&gt;
&lt;h5 style="text-align: justify; "&gt;Observations:&lt;/h5&gt;
&lt;p style="text-align: justify; "&gt;A welcome provision in the regulation is that of biometric locking which allows Aadhaar number holders to permanently lock his biometrics and temporarily unlock it only when needed for biometric authentication. However, in the same breath, the regulation also provides for the UIDAI to make provisions to remove such locking without any specified grounds for doing so.&lt;/p&gt;
&lt;h4 style="text-align: justify; "&gt;Provision: Sub-Regulation 18 (2), (3) and (4)&lt;/h4&gt;
&lt;p style="text-align: justify; "&gt;The logs of authentication transactions shall be maintained by the requesting entity for a period of 2 (two) years, during which period an Aadhaar number holder shall have the right to access such logs, in accordance with the procedure as may be specified.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Upon expiry of the period specified in sub-regulation (2), the logs shall be archived for a period of five years or the number of years as required by the laws or regulations governing the entity, whichever is later, and upon expiry of the said period, the logs shall be deleted except those records required to be retained by a court or required to be retained for any pending disputes.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The requesting entity shall not share the authentication logs with any person other than the concerned Aadhaar number holder upon his request or for grievance redressal and resolution of disputes or with the Authority for audit purposes. The authentication logs shall not be used for any purpose other than stated in this sub-regulation.&lt;/p&gt;
&lt;h5 style="text-align: justify; "&gt;Observations:&lt;/h5&gt;
&lt;p style="text-align: justify; "&gt;While it is specified that the authentication logs collected by the requesting entities shall not be shared with any person other than the concerned Aadhaar number holder upon their request or for grievance redressal and resolution of disputes or with the Authority for audit purposes, and that the authentication logs may not be used for any other purpose, the maintenance of the logs for a period of seven years seems excessive. Similarly, the UIDAI is also supposed to store Authentication transaction data for over five years. This is in violation of the widely recognized data minimisation principles which seeks that data collectors and data processors delete personal data records when the purpose for which it has been collected if fulfilled. While retention of data for audit and dispute-resolution purpose is legitimate, the lack of specification of security standards and the overall lack of transparency and inadequate grievance redressal mechanism greatly exacerbate the risks associated with data retention.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Aadhaar (Sharing of Information) Regulations, 2016 and Aadhaar (Data security) Regulations, 2016&lt;a href="#_ftn7" name="_ftnref7"&gt;&lt;sup&gt;&lt;sup&gt;[7]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Framed under the powers conferred by sub-section (1), and sub-clause (o) of sub-section (2), of Section 54 read with sub-clause (k) of sub-section (2) of Section 23, and sub-sections&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;(2) and (4) of Section 29, of the Aadhaar Act, the Sharing of Information regulations look at the restrictions on sharing of identity information collected by the UIDAI and requesting entities. The Data Security regulation, framed under powers conferred by clause (p) of subsection (2) of section 54 of the Aadhaar Act, looks at security obligations of all service providers engaged by the UIDAI.&lt;/p&gt;
&lt;h4 style="text-align: justify; "&gt;Provision: Sub-Regulation 6 (1)&lt;/h4&gt;
&lt;p style="text-align: justify; "&gt;All agencies, consultants, advisors and other service providers engaged by the Authority, and ecosystem partners such as registrars, requesting entities, Authentication User Agencies and Authentication Service Agencies shall get their operations audited by an information systems auditor certified by a recognised body under the Information Technology Act, 2000 and furnish certified audit reports to the Authority, upon request or at time periods specified by the Authority.&lt;/p&gt;
&lt;h5 style="text-align: justify; "&gt;Observations:&lt;/h5&gt;
&lt;p style="text-align: justify; "&gt;The regulation states that audits shall be conducted by an information systems auditor certified by a recognised body under the Information Technology Act, 2000. However, there is no such certifying body under the Information Technology Act. This suggests a lack of diligence in framing the rules, and will inevitably to lead to inordinate delays, or alternately, a lack of a clear procedure in the appointment of  an auditor. Further, instead of prescribing a regular and proactive process of audits, the regulation only limits audits to when requested or as deemed appropriate by UIDAI. This is another, in line of many provisions, whose implication is power being concentrated in the hands of  UIDAI, with little scope for accountability and transparency.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Conclusion&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;In conclusion, it must be stated that the regulations promulgated by the UIDAI leave a lot to be desired. Some of the most important issues raised against the Aadhaar Act, which were delegated to the UIDAI’s rule making powers have not been addressed at all. Some of the most important issues such as data security policies, right to access records of Aadhaar number holders, procedure to be followed by the grievance redressal bodies, uploading of the enrolment packet to the CIDR, procedure for enrolling residents with biometric exceptions, procedure for informing residents about acceptance/rejection of enrolment application have left unaddressed and ‘may be specified’ at a later data. These failures leave a gaping hole especially in light of the absence of a comprehensive data protection legislation in India, as well the speed and haste with the enrolment and seeding has been done by the UIDAI, and the number of services, both private and public, which are using or planning to use the Aadhaar number and the authentication process as a primary identifier for residents.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref1" name="_ftn1"&gt;&lt;sup&gt;&lt;sup&gt;[1]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Available at &lt;a href="https://uidai.gov.in/legal-framework/acts/regulations.html"&gt;https://uidai.gov.in/legal-framework/acts/regulations.html&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref2" name="_ftn2"&gt;&lt;sup&gt;&lt;sup&gt;[2]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;a href="https://www.irda.gov.in/ADMINCMS/cms/frmGeneral_Layout.aspx?page=PageNo62&amp;amp;flag=1"&gt;https://www.irda.gov.in/ADMINCMS/cms/frmGeneral_Layout.aspx?page=PageNo62&amp;amp;flag=1&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref3" name="_ftn3"&gt;&lt;sup&gt;&lt;sup&gt;[3]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;a href="http://www.sebi.gov.in/acts/boardregu.html"&gt;http://www.sebi.gov.in/acts/boardregu.html&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref4" name="_ftn4"&gt;&lt;sup&gt;&lt;sup&gt;[4]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Available at &lt;a href="https://uidai.gov.in/legal-framework/acts/regulations.html"&gt;https://uidai.gov.in/legal-framework/acts/regulations.html&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt; &lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref5" name="_ftn5"&gt;&lt;sup&gt;&lt;sup&gt;[5]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Available at:  https://uidai.gov.in/images/resource/aadhaar_registered_devices_2_0_09112016.pdf&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref6" name="_ftn6"&gt;&lt;sup&gt;&lt;sup&gt;[6]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Available at &lt;a href="https://uidai.gov.in/legal-framework/acts/regulations.html"&gt;https://uidai.gov.in/legal-framework/acts/regulations.html&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref7" name="_ftn7"&gt;&lt;sup&gt;&lt;sup&gt;[7]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Available at &lt;a href="https://uidai.gov.in/legal-framework/acts/regulations.html"&gt;https://uidai.gov.in/legal-framework/acts/regulations.html&lt;/a&gt;&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/analysis-of-key-provisions-of-aadhaar-act-regulations'&gt;https://cis-india.org/internet-governance/blog/analysis-of-key-provisions-of-aadhaar-act-regulations&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>UID</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>UIDAI</dc:subject>
    
    
        <dc:subject>Biometrics</dc:subject>
    
    
        <dc:subject>Aadhaar</dc:subject>
    

   <dc:date>2017-04-03T14:05:01Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/jobs/call-for-design-interns-201906">
    <title>Call for Design Interns</title>
    <link>https://cis-india.org/jobs/call-for-design-interns-201906</link>
    <description>
        &lt;b&gt;CIS is seeking graphic design interns to create communication material (information and data visualizations, publication layouts, presentations, etc.) for our projects. The intern will assist our researchers in presenting their research in accessible and easy-to-understand forms, as well as design social media collaterals. They will be working with a multi-disciplinary team across two cities, and be supervised by a designer.&lt;/b&gt;
        
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;h4&gt;Who can apply?&lt;/h4&gt;
&lt;p&gt;Students of design or recent design graduates, who are available to work full-time for at least a month, and have experience in editorial design and creating data visualizations. Others who can demonstrate similar skills and aptitude are also welcome to apply. Applicants with an interest in digital technology research would be preferred.&lt;/p&gt;
&lt;p&gt;Our work is strengthened by the diversity in background, culture, experience, religion, caste, sexual orientation, gender, gender identity, race, ethnicity, age and disability. We welcome applications from candidates belonging to marginalised communities.&lt;/p&gt;
&lt;h4&gt;Skills&lt;/h4&gt;
&lt;ul&gt;
&lt;li&gt;Comfortable working with Adobe InDesign, Illustrator, and Photoshop,&lt;/li&gt;
&lt;li&gt;Comfortable working with Google Docs and Slides, and&lt;/li&gt;
&lt;li&gt;Knowledge of HTML/CSS will be preferred.&lt;/li&gt;&lt;/ul&gt;
&lt;h4&gt;Duration of the internship&lt;/h4&gt;
&lt;p&gt;1 – 2 months&lt;/p&gt;
&lt;h4&gt;Location&lt;/h4&gt;
&lt;p&gt;Bangalore or New Delhi&lt;/p&gt;
&lt;h4&gt;Remuneration&lt;/h4&gt;
&lt;p&gt;A modest stipend will be paid&lt;/p&gt;
&lt;h4&gt;How to apply?&lt;/h4&gt;
&lt;p&gt;To apply, please send –&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Resumé,&lt;/li&gt;
&lt;li&gt;Relevant work samples (less than 5MB), and&lt;/li&gt;
&lt;li&gt;Link to online portfolio, if any.&lt;/li&gt;&lt;/ul&gt;
&lt;p&gt;Applications should be sent to Saumyaa Naidu (saumyaa [at] cis-india.org) and Karan Saini (karan [at] cis-india.org) by &lt;strong&gt;June 28, 2019&lt;/strong&gt;.&lt;/p&gt;
&lt;h4&gt;Organisational policies&lt;/h4&gt;
&lt;p&gt;All interns working at CIS must read and abide by CIS' &lt;a href="https://cis-india.org/about/policies" target="_blank"&gt;organisational policies&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/jobs/call-for-design-interns-201906'&gt;https://cis-india.org/jobs/call-for-design-interns-201906&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>


   <dc:date>2019-06-12T06:16:13Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/appropriate-use-of-digital-identity-alliance-announcement">
    <title>Announcement of a Three-Region Research Alliance on the Appropriate Use of Digital Identity</title>
    <link>https://cis-india.org/internet-governance/blog/appropriate-use-of-digital-identity-alliance-announcement</link>
    <description>
        &lt;b&gt;Omidyar Network has recently announced its decision to invest in the establishment of a three-region research alliance — to be co-led by the Institute for Technology &amp; Society (ITS), Brazil, the Centre for Intellectual Property and Information Technology Law (CIPIT), Kenya, and the CIS, India — on the Appropriate Use of Digital Identity. As part of this Alliance, we at the CIS will look at the policy objectives of digital identity projects, how technological policy choices can be thought through to meet those objectives, and how legitimate uses of a digital identity framework may be evaluated.&lt;/b&gt;
        
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;p&gt;As governments across the globe are implementing new, digital foundational identification systems or modernizing existing ID programs, there is a dire need for greater research and discussion about appropriate design choices for a digital identity framework. There is significant momentum on digital ID, especially after the adoption of UN Sustainable Development Goal 16.9, which calls for legal identity for all by 2030. Given the importance of this subject, its implications for both the development agenda as well its impact on civil, social and economic rights, there is a need for more focused research that can enable policymakers to take better decisions, guide civil society in different jurisdictions to comment on and raise questions about digital identity schemes, and provide actionable material to the industry to create identity solutions that are privacy enhancing and inclusive.&lt;/p&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;h4&gt;Excerpt from the &lt;a href="https://www.omidyar.com/blog/appropriate-use-digital-identity-why-we-invested-three-region-research%C2%A0alliance" target="_blank"&gt;blog post by Subhashish Bhadra&lt;/a&gt; announcing this new research alliance&lt;/h4&gt;
&lt;p&gt;...In the absence of any widely-accepted thinking on this issue, we run the risk of digital identity systems suffering from mission creep, that is being made mandatory or being used for an ever-expanding set of services. We believe this creates several risks. First, people may be excluded from services if they do not have a digital identity or because it malfunctions. Second, this approach creates a wider digital footprint that can be used to create a profile of an individual, sometimes without consent. This can increase privacy risk. Third, this approach increases the power of institutions versus individuals and can be used as rationale to intentionally deny services, especially to vulnerable or persecuted groups.&lt;/p&gt;
&lt;p&gt;Three exceptional research groups have undertaken the effort of answering this complex and important question. Over the next six months, these think tanks will conduct independent research, as well as involve experts from across the globe. Based in South America, Africa, and Asia, these institutions represent the collective wisdom and experiences of three very distinct geographies in emerging markets. While drawing on their local context, this research effort is globally oriented. The think tanks will create a set of recommendations and tools that can be used by stakeholders to engage with digital identity systems in any part of the world...&lt;/p&gt;
&lt;p&gt;This research will use a collaborative and iterative process. The researchers will put out some ideas every few weeks, with the objective of seeking thoughts, questions, and feedback from various stakeholders. They will participate in several digital rights and identity events across the globe over the next several months. They will also organize webinars to seek input from and present their interim findings to interested communities from across the globe. Each of these provides an opportunity for you to share your thoughts and help this research program provide an independent, rigorous, transparent, and holistic answer to the question of when it’s appropriate for digital identity to be used. We need a diversity of viewpoints and collaborative dissent to help solve the most pressing issues of our times.&lt;/p&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/appropriate-use-of-digital-identity-alliance-announcement'&gt;https://cis-india.org/internet-governance/blog/appropriate-use-of-digital-identity-alliance-announcement&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Digital ID</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Appropriate Use of Digital ID</dc:subject>
    
    
        <dc:subject>Featured</dc:subject>
    
    
        <dc:subject>Digital Identity</dc:subject>
    
    
        <dc:subject>Homepage</dc:subject>
    

   <dc:date>2019-05-13T09:06:23Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/events/rankathon-on-digital-rights-delhi-jan-08-2017">
    <title>Rankathon on Digital Rights (Delhi, January 08)</title>
    <link>https://cis-india.org/internet-governance/events/rankathon-on-digital-rights-delhi-jan-08-2017</link>
    <description>
        &lt;b&gt;Please join us on Sunday, January 08, at the CIS office in Hauz Khas, Delhi, for a rankathon to visualise and contribute to the findings of the Ranking Digital Rights study, and to critique the underlying methodology. The event will begin at 10:00 in the morning and participants can focus on one or more of three kinds of tasks: 1) visualising the CIS and Ranking Digital Rights data, 2) evaluating additional companies using the RDR methodology, and 3) evaluating the RDR methodology and its suitability for independent use.&lt;/b&gt;
        
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;h4&gt;Download: &lt;a href="https://github.com/cis-india/website/raw/master/docs/CIS_RDRIndia-Rankathon_08012017_Invitation.pdf"&gt;Invitation&lt;/a&gt; (PDF)&lt;/h4&gt;
&lt;hr /&gt;
&lt;p&gt;The &lt;a href="https://rankingdigitalrights.org/"&gt;Ranking Digital Rights Corporate Responsibility Index&lt;/a&gt; is a project hosted by the Open Technology Institute at the New America Foundation that aims to rank Information and Communications Technology (ICT) companies with respect to their Governance, Freedom of Expression, and Privacy practices. The inaugural Corporate Accountability Index, released in November 2015, evaluated 16 companies based on the project’s methodology, which included 31 indicators in total.&lt;/p&gt;
&lt;p&gt;Towards developing an understanding of how Indian ICT companies are recognising and upholding the digital rights of their users, and to raise public awareness about the same, the Centre for Internet and Society (CIS), with the support of &lt;a href="https://privacyinternational.org/"&gt;Privacy International&lt;/a&gt;, has studied 8 Indian ICT companies, using the same methodology as the 2015 Corporate Accountability Index, to gain greater insight into company practices and initiate public dialogues.&lt;/p&gt;
&lt;p&gt;Please join us on Sunday, January 08, at the CIS office in Hauz Khas, Delhi, for a rankathon to visualise and contribute to the findings of the Ranking Digital Rights study, and to critique the underlying methodology. The event will begin at 10:00 in the morning and participants can focus on one or more of three kinds of tasks:&lt;/p&gt;
&lt;ul&gt;&lt;li&gt;
&lt;p&gt;visualising the CIS and Ranking Digital Rights data,&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;evaluating additional companies using the RDR methodology, and&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;evaluating the RDR methodology and its suitability for independent use.&lt;/p&gt;
&lt;/li&gt;&lt;/ul&gt;
&lt;p&gt;The event is open to all but the venue has limited space. The participants are requested to RSVP by sending an email to &lt;a href="mailto:nisha@cis-india.org?subject=RSVP: Rankathon on Digital Rights"&gt;nisha@cis-india.org&lt;/a&gt;. The final date for registering for the event is &lt;strong&gt;January 04&lt;/strong&gt;.&lt;/p&gt;
&lt;p&gt;All visualisations and other outputs produced at the event will be published under open licenses. All participants are expected to bring their own laptops or any other items needed for their work. CIS will offer data, help with understanding how the Ranking Digital Rights methodology works, refreshments, and any other support as needed.&lt;/p&gt;
&lt;p&gt;We are also organising a discussion event on Saturday, January 07, at the India Islamic Cultural Centre, Delhi, to present our findings on digital rights practices of 8 Indian ICT companies, followed by an open structured discussion on the methodology of the Ranking Digital Rights study. Please find more details about this &lt;a href="http://cis-india.org/internet-governance/events/discussion-on-ranking-digital-rights-in-india-delhi-jan-07-2017"&gt;here&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;We look forward to your participation and contribution to the discussion. Please support us by sharing this invitation with your colleagues and networks.&lt;/p&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/events/rankathon-on-digital-rights-delhi-jan-08-2017'&gt;https://cis-india.org/internet-governance/events/rankathon-on-digital-rights-delhi-jan-08-2017&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Freedom of Speech and Expression</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Digital Rights</dc:subject>
    

   <dc:date>2016-12-29T07:10:09Z</dc:date>
   <dc:type>Event</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/social-media-monitoring">
    <title>Social Media Monitoring</title>
    <link>https://cis-india.org/internet-governance/blog/social-media-monitoring</link>
    <description>
        &lt;b&gt;We see a trend of social media and communication monitoring and surveillance initiatives in India which have the potential to create a chilling effect on free speech online and to raise questions about the privacy of individuals. In this paper, Amber Sinha looks at social media monitoring as a tool for surveillance and the current state of social media surveillance in India, and evaluates how the existing regulatory framework in India may deal with such practices in future.&lt;/b&gt;
        
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;h4&gt;Social Media Monitoring: &lt;a href="http://cis-india.org/internet-governance/files/social-media-monitoring/at_download/file"&gt;Download&lt;/a&gt; (PDF)&lt;/h4&gt;
&lt;hr /&gt;
&lt;h3&gt;&lt;strong&gt;Introduction&lt;/strong&gt;&lt;/h3&gt;
&lt;p&gt;In 2014, the Government of India launched the much-lauded and popular citizen outreach website MyGov.in. A press release by the government announced that it had roped in the global consulting firm PwC to assist in the data mining exercise to process and filter key points emerging from debates on MyGov.in. While this was a welcome move, the release also mentioned that the government intended to monitor social media sites in order to gauge popular opinion. Further, earlier this year, the government set up the National Media Analytics Centre (NMAC) to monitor blogs, media channels, news outlets and social media platforms. The tracking software used by NMAC will generate tags to classify posts and comments on social media into negative, positive and neutral categories, paying special attention to “belligerent” comments, and also look at past patterns of posts. A project called NETRA, which would intercept and analyse internet traffic using pre-defined filters, was reported in the media a few years ago. Alongside, we see other initiatives which intend to use social media data for predictive policing purposes, such as CCTNS and Social Media Labs.&lt;/p&gt;
&lt;p&gt;Thus, we see a trend of social media and communication monitoring and surveillance initiatives announced by the government which have the potential to create a chilling effect on free speech online and to raise questions about the privacy of individuals. Various commentators have raised concerns about the legal validity of such programmes and whether they violate the fundamental rights to privacy and free expression, as well as the existing surveillance laws in India. The lack of legislation governing these programmes often translates into an absence of transparency and due procedure. Further, a great deal of personal communication now exists in the public domain, which renders futile the traditional principles that govern interception and monitoring of personal communications. In the last few years, the blogosphere and social media websites in India have also changed and become platforms for greater dissemination of political content, often accompanied by significant vitriol, ‘trolling’ and abuse. Thus, we see greater policing of public or semi-public spaces online. In this paper, we look at social media monitoring as a tool for surveillance and the current state of social media surveillance in India, and evaluate how the existing regulatory framework in India may deal with such practices in future.&lt;/p&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/social-media-monitoring'&gt;https://cis-india.org/internet-governance/blog/social-media-monitoring&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Social Media</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Surveillance</dc:subject>
    

   <dc:date>2017-01-16T14:23:13Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/privacy-after-big-data-workshop-report">
    <title>Privacy after Big Data - Workshop Report</title>
    <link>https://cis-india.org/internet-governance/blog/privacy-after-big-data-workshop-report</link>
    <description>
        &lt;b&gt;The Centre for Internet and Society (CIS) and the Sarai programme, CSDS, organised a workshop on 'Privacy after Big Data: What Changes? What should Change?' on Saturday, November 12, 2016 at Centre for the Study of Developing Societies in New Delhi. &lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;This workshop aimed to build a dialogue around some of the key government-led big data initiatives in India and elsewhere that are contributing significant new challenges and concerns to the ongoing debates on the right to privacy. It was an open event.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In this age of big data, discussions about privacy are intertwined with the use of technology and the data deluge. Though big data possesses enormous value for driving innovation and contributing to productivity and efficiency, privacy concerns have gained significance in the dialogue around regulated use of data and the means by which individual privacy might be compromised through means such as surveillance, or protected. The tremendous opportunities big data creates in varied sectors ranges from financial technology, governance, education, health, welfare schemes, smart cities to name a few. With the UID project re-animating the Right to Privacy debate in India, and the financial technology ecosystem growing rapidly, striking a balance between benefits of big data and privacy concerns is a critical policy question that demands public dialogue and research to inform an evidence based decision. Also, with the advent of potential big data initiatives like the ambitious Smart Cities Mission under the Digital India Scheme, which would rely on harvesting large data sets and the use of analytics in city subsystems to make public utilities and services efficient, the tasks of ensuring data security on one hand and protecting individual privacy on the other become harder.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This workshop sought to discuss some of the emerging problems due to the advent of big data and possible ways to address these problems. The workshop began with Amber Sinha of CIS and Sandeep Mertia of Sarai introducing the topic of big data and implications for privacy. Both speakers tried to define big data and brief history of the evolution of the term and raised questions about how we understand it. Dr. Usha Ramanathan spoke on the right to privacy in the context of the ongoing Aadhaar case and Vipul Kharbanda introduced the concept of Habeas Data as a possible solution to the privacy problems posed by big data.  Amelia Andersotter discussed national centralised digital ID systems and their evolution in Europe, often operating at a cross-functional scale, and highlighted its implications for discussions on data protection, welfare governance, and exclusion from public and private services. Srikanth Lakshmanan spoke of the issues with technology and privacy, and possible technological solutions.  Dr. Anupam Saraph discussed the rise of digital banking and Aadhaar based payments and its potential use for corrupt practices. Astha Kapoor of Microsave spoke about her experience of implementation of digital money solution in rural India.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Post lunch, Dr. Anja Kovacs and Mathew Rice spoke on the rise of mass communication surveillance across the world, and the evolving challenges of regulating surveillance by government agencies. Mathew also spoke of privacy movements by citizens and civil society in regions. In the final speaking session, Apar Gupta and Kritika Bhardwaj traced the history of jurisprudence on the right to privacy and the existing regulations and procedures. In the final session, the participants discussed various possible solutions to privacy threats from big data and identity projects including better regulation, new approached such as harms based regulation and privacy risk assessments, and conceiving privacy as a horizontal right. The workshop ended with vote of thanks from the organizers.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The agenda for the event can be accessed &lt;a href="https://github.com/cis-india/website/raw/master/docs/CIS-Sarai_PrivacyAfterBigData_ConceptAgenda.pdf"&gt;here&lt;/a&gt;, and the transcript is available &lt;a class="external-link" href="http://cis-india.org/internet-governance/files/privacy-after-big-data/"&gt;here&lt;/a&gt;.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/privacy-after-big-data-workshop-report'&gt;https://cis-india.org/internet-governance/blog/privacy-after-big-data-workshop-report&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2017-01-27T01:09:17Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/deep-packet-inspection-how-it-works-and-its-impact-on-privacy">
    <title>Deep Packet Inspection: How it Works and its Impact on Privacy</title>
    <link>https://cis-india.org/internet-governance/blog/deep-packet-inspection-how-it-works-and-its-impact-on-privacy</link>
    <description>
        &lt;b&gt;In the last few years, there has been extensive debate and discussion around network neutrality in India. The online campaign in favor of network neutrality in India was led by Savetheinternet.in. The campaign was a spectacular success and facilitated the sending of over a million emails supporting the cause of network neutrality, eventually leading to a ban on differential pricing. Following in the footsteps of the Shreya Singhal judgement, the fact that the issue of net neutrality has managed to attract wide public attention is an encouraging sign for a free and open Internet in India. Since the debate has focused largely on zero rating, other kinds of network practices affecting network neutrality, and their impact on other values, have yet to be comprehensively explored in the Indian context. In this article, the author focuses on network management in general, and deep packet inspection in particular, and how it impacts the privacy of users.&lt;/b&gt;
        &lt;h3 style="text-align: justify; "&gt;&lt;a name="_ek69t4linon1"&gt;&lt;/a&gt; Background&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;In the last few years, there has been extensive debate and discussion around network neutrality in India. The online campaign in favor of Network Neutrality was led by Savetheinternet.in in India. The campaign, captured in detail by an article in Mint,	&lt;a href="#_ftn1" name="_ftnref1"&gt;&lt;sup&gt;&lt;sup&gt;[1]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; was a spectacular success and facilitated sending over a million emails supporting 	the cause of network neutrality, eventually leading to ban on differential pricing. Following in the footsteps of the Shreya Singhal judgement, the fact 	that the issue of net neutrality has managed to attract wide public attention is an encouraging sign for a free and open Internet in India. Since the 	debate has been focused largely on zero rating, other kinds of network practices impacting network neutrality have yet to be comprehensively explored in 	the Indian context, nor their impact on other values. In this article, I focus on network management, in general, and deep packet inspection, in particular 	and how it impacts the privacy of users.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;&lt;a name="_ft3wpj7p1jf1"&gt;&lt;/a&gt; The Architecture of the Internet&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The Internet exists as a network acting as an intermediary between providers of content and it users.	&lt;a href="#_ftn2" name="_ftnref2"&gt;&lt;sup&gt;&lt;sup&gt;[2]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Traditionally, the network did not distinguish between those who provided content 	and those who were recipients of this service, in fact often, the users also functioned as content providers. The architectural design of the Internet 	mandated that all content be broken down into data packets which were transmitted through nodes in the network transparently from the source machine to the 	destination machine.&lt;a href="#_ftn3" name="_ftnref3"&gt;&lt;sup&gt;&lt;sup&gt;[3]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; As discussed in detail later, as per the OSI model, the network 	consists of 7 layers. We will go into each of these layers in detail below, however is important to understand that at the base is the physical layer of 	cables and wires, while at the top is application layer which contains all the functions that people want to perform on the Internet and the content 	associated with it. The layers in the middle can be characterised as the protocol layers for the purpose of this discussion. What makes the architecture of 	the Internet remarkable is that these layers are completely independent of each other, and in most cases, indifferent to the other layers. The protocol 	layer is what impacts net neutrality. It is this layer which provides the standards for the manner in which the data must flow through the network. The 	idea was for the it to be as simple and feature free as possible such that it is only concerned with the transmission data as fast as possible ('best 	efforts principle') while innovations are pushed to the layers above or below it.&lt;a href="#_ftn4" name="_ftnref4"&gt;&lt;sup&gt;&lt;sup&gt;[4]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This aspect of the Internet's architectural design, which mandates that network features are implemented as the end points only (destination and source 	machine), i.e. at the application level, is called the 'end to end principle'.&lt;a href="#_ftn5" name="_ftnref5"&gt;&lt;sup&gt;&lt;sup&gt;[5]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; This 	means that the intermediate nodes do not differentiate between the data packets in any way based on source, application or any other feature and are only concerned with transmitting data as fast as possible, thus creating what has been described as a 'dumb' or neutral network.	&lt;a href="#_ftn6" name="_ftnref6"&gt;&lt;sup&gt;&lt;sup&gt;[6]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; This feature of the Internet architecture was also considered essential to what 	Jonathan Zittrain has termed as the 'generative' model of the Internet.&lt;a href="#_ftn7" name="_ftnref7"&gt;&lt;sup&gt;&lt;sup&gt;[7]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Since, the 	Internet Protocol remains a simple layer incapable of discrimination of any form, it meant that no additional criteria could be established for what kind 	of application would access the Internet. Thus, the network remained truly open and ensured that the Internet does not privilege or become the preserve of 	a class of applications, nor does it differentiate between the different kinds of technologies that comprise the physical layer below.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;While the above model speaks of a dumb network not differentiating between the data packets that travel through it, in truth, the network operators engage 	in various kinds of practices that priorities, throttle or discount certain kinds of data packets. In her thesis essay at the Oxford Internet Institute, 	Alissa Cooper&lt;a href="#_ftn8" name="_ftnref8"&gt;&lt;sup&gt;&lt;sup&gt;[8]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; states that traffic management involves three different set of 	criteria- a) Some subsets of traffic needs to be managed, and arriving at a criteria to identify those subsets the criteria can be based on source, 	destination, application or users, b) Trigger for the traffic management measure which - could be based upon time of the day, usage threshold or a specific 	network condition, and c) the traffic treatment put into practice when the trigger is met. The traffic treatment can be of three kinds. The first is 	Blocking, in which traffic is prevented from being delivered. The second is Prioritization under which identified traffic is sent sooner or later. This is 	usually done in cases of congestion and one kind of traffic needs to be prioritized. The third kind of treatment is Rate limiting where identified traffic 	is limited to a defined sending rate.&lt;a href="#_ftn9" name="_ftnref9"&gt;&lt;sup&gt;&lt;sup&gt;[9]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; The dumb network does not interfere with an 	application's operation, nor is it sensitive to the needs of an application, and in this way it treats all information sent over it as equal. In such a 	network, the content of the packets is not examined, and Internet providers act according to the destination of the data as opposed to any other factor. 	However, in order to perform traffic management in various circumstances, Deep packet Inspection technology, which does look at the content of data packets 	is commonly used by service providers.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;&lt;a name="_r7ojhgh467u5"&gt;&lt;/a&gt; Deep Packet Inspection&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Deep packet inspection (DPI) enables the examination of the content of a data packets being sent over the Internet. Christopher Parsons explains the header 	and the payload of a data packet with respect to the OSI model. In order to understand this better, it is more useful to speak of network in terms of the 	seven layers in the OSI model as opposed to the three layers discussed above.&lt;a href="#_ftn10" name="_ftnref10"&gt;&lt;sup&gt;&lt;sup&gt;[10]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Under the OSI model, the top layer, the Application Layer is in contact with the software making a data request. For instance, if the activity in question 	is accessing a webpage, the web-browser makes a request to access a page which is then passed on to the lower layers. The next layer is the Presentation 	Layer which deals with the format in which the data is presented. This lateral performs encryption and compression of the data. In the above example, this 	would involve asking for the HTML file. Next comes the Session Layer which initiates, manages and ends communication between the sender and receiver. In 	the above example, this would involve transmitting and regulating the data of the webpage including its text, images or any other media. These three layers 	are part of the 'payload' of the data packet.&lt;a href="#_ftn11" name="_ftnref11"&gt;&lt;sup&gt;&lt;sup&gt;[11]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The next four layers are part of the 'header' of the data packet. It begins with the Transport Layer which collects data from the Payload and creates a 	connection between the point of origin and the point of receipt, and assembles the packets in the correct order. In terms of accessing a webpage, this 	involves connecting the requesting computer system with the server hosting the data, and ensuring the data packets are put together in an arrangement which 	is cohesive when they are received. The next layer is the Data Link Layer. This layer formats the data packets in such a way that that they are compatible 	with the medium being used for their transmission. The final layer is the Physical Layer which determines the actual media used for transmitting the 	packets.&lt;a href="#_ftn12" name="_ftnref12"&gt;&lt;sup&gt;&lt;sup&gt;[12]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The transmission of the data packet occurs between the client and server, and packet inspect occurs through some equipment placed between the client and 	the server. There are various ways in which packet inspection has been classified and the level of depth that the inspection needs to qualify in order to 	be categorized as Deep Packet Inspection. We rely on Parson's classification system in this article. According to him, there are three broad categories of 	packet inspection - shallow, medium and deep.&lt;a href="#_ftn13" name="_ftnref13"&gt;&lt;sup&gt;&lt;sup&gt;[13]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Shallow packet inspection involves the inspection of the only the header, and usually checking it against a blacklist. The focus in this form of inspection 	is on the source and destination (IP address and packet;s port number). This form of inspection primarily deals with the Data Link Layer and Network Layer 	information of the packet. Shallow Packet Inspection is used by firewalls.&lt;a href="#_ftn14" name="_ftnref14"&gt;&lt;sup&gt;&lt;sup&gt;[14]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Medium Packet Inspection involves equipment existing between computers running the applications and the ISP or Internet gateways. They use application 	proxies where the header information is inspected against their loaded parse-list and used to look at a specific flows. These kinds of inspections 	technologies are used to look for specific kinds of traffic flows and take pre-defined actions upon identifying it. In this case, the header and a small 	part of the payload is also being examined.&lt;a href="#_ftn15" name="_ftnref15"&gt;&lt;sup&gt;&lt;sup&gt;[15]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Finally, Deep Packet Inspection (DPI) enables networks to examine the origin, destination as well the content of data packets (header and payload). These 	technologies look for protocol non-compliance, spam, harmful code or any specific kinds of data that the network wants to monitor. The feature of the DPI 	technology that makes it an important subject of study is the different uses it can be put to. The use cases vary from real time analysis of the packets to 	interception, storage and analysis of contents of a packets.&lt;a href="#_ftn16" name="_ftnref16"&gt;&lt;sup&gt;&lt;sup&gt;[16]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;&lt;a name="_pi28w1745j15"&gt;&lt;/a&gt; The different purposes of DPI&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Network Management and QoS&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The primary justification for DPI presented is network management, and as a means to guarantee and ensure a certain minimum level of QoS (Quality of 	Service). Quality of Service (QoS) as a value conflicting with the objectives of Network Neutrality, has emerged as a significant discussion point in this 	topic. Much like network neutrality, QoS is also a term thrown around in vague, general and non-definitive references. The factors that come into play in 	QoS are network imposed delay, jitter, bandwidth and reliability. Delay, as the name suggests, is the time taken for a packet to be passed by the sender to the receiver. Higher levels of delay are characterized by more data packets held 'in transit' in the network.	&lt;a href="#_ftn17" name="_ftnref17"&gt;&lt;sup&gt;&lt;sup&gt;[17]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; A paper by Paul Ferguson and Geoff Huston described the TCP as a 'self clocking' 	protocol.&lt;a href="#_ftn18" name="_ftnref18"&gt;&lt;sup&gt;&lt;sup&gt;[18]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; This enables the transmission rate of the sender to be adjusted as per 	the rate of reception by the receiver. As the delay and consequent stress on the protocol increases, this feedback ability begins to lose its sensitivity. 	This becomes most problematic in cases of VoIP and video applications. The idea of QoS generally entails consistent service quality with low delay, low 	jitter and high reliability through a system of preferential treatment provided to some traffic on a criteria formulated around the need of such traffic to 	have greater latency sensitivity and low delay and jitter. This is where Deep Packet Inspection comes into play. In 1991, Cisco pioneered the use of a new 	kind of router that could inspect data packets flowing through the network. DPI is able to look inside the packets and its content, enabling it to classify 	packets according to a formulated policy. 
DPI, which was used a security tool, to begin with, is a powerful tool as it allows ISPs to limit or block 	specific applications or improve performances of applications in telephony, streaming and real-time gaming. Very few scholars believe in an all-or-nothing approach to network neutrality and QoS and debate often comes down to what forms of differentiations are reasonable for service providers to practice.	&lt;a href="#_ftn19" name="_ftnref19"&gt;&lt;sup&gt;&lt;sup&gt;[19]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
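&lt;p style="text-align: justify; "&gt;The classification step described above can be sketched in a few lines. This is an illustrative toy, not any vendor's implementation; the packet payloads and the signature-to-class policy are hypothetical.&lt;/p&gt;

```python
# Toy DPI-style classifier: match payload bytes against a hypothetical
# policy mapping application signatures to QoS traffic classes.

POLICY = {
    b"BitTorrent protocol": "p2p",   # signature seen in BitTorrent handshakes
    b"SIP/2.0": "realtime",          # crude stand-in for VoIP signalling
}

def classify_packet(payload):
    """Return the traffic class whose signature appears in the payload."""
    for signature, traffic_class in POLICY.items():
        if signature in payload:
            return traffic_class
    return "best-effort"             # default: no preferential treatment

# A router applying a QoS policy would then queue "realtime" packets
# ahead of "best-effort" ones.
```

&lt;p style="text-align: justify; "&gt;A real DPI engine performs this matching in hardware at line rate, but the logic is the same: inspect the payload, look up a policy, act on the result.&lt;/p&gt;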
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Security&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Deep Packet inspection was initially intended as a measure to manage the network and protect it from transmitting malicious programs . As mentioned above, Shallow Packet Inspection was used to secure LANs and keep out certain kinds of unwanted traffic.	&lt;a href="#_ftn20" name="_ftnref20"&gt;&lt;sup&gt;&lt;sup&gt;[20]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Similarly, DPI is used for identical purposes, where it is felt useful to 	enhance security and complete a 'deeper' inspection that also examines the payload along with the header information.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Surveillance&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The third purpose of DPI is what concerns privacy theorists the most. The fact that DPI technologies enable the network operators to have access to the actual content of the data packets puts them a position of great power as well as making them susceptible to significant pressure from the state.	&lt;a href="#_ftn21" name="_ftnref21"&gt;&lt;sup&gt;&lt;sup&gt;[21]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; For instance, in US, the ISPs are required to conform to the provisions of the 	Communications Assistance for Law Enforcement Act (CALEA) which means they need to have some surveillance capacities designed into their systems. What is 	more disturbing for privacy theorists compared to the use of DPI for surveillance under legislation like CALEA, are the other alleged uses by organisation 	like the National Security Agency through back end access to the information via the ISPs. Aside from the US government, there have been various reports of use of DPI by governments in countries like China,&lt;a href="#_ftn22" name="_ftnref22"&gt;&lt;sup&gt;&lt;sup&gt;[22]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Malaysia&lt;a href="#_ftn23" name="_ftnref23"&gt;&lt;sup&gt;&lt;sup&gt;[23]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; and Singapore.	&lt;a href="#_ftn24" name="_ftnref24"&gt;&lt;sup&gt;&lt;sup&gt;[24]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Behavioral targeting&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;DPI also enables very granular tracking of the online activities of Internet users. This information is invaluable for the purposes of behavioral targeting 	of content and advertising. Traditionally, this has been done through cookies and other tracking software. DPI allows new way to do this, so far exercised 	only through web-based tools to ISPs and their advertising partners. DPI will enable the ISPs to monitor contents of data packets and use this to create profiles of users which can later be employed for purposes such as targeted advertising.	&lt;a href="#_ftn25" name="_ftnref25"&gt;&lt;sup&gt;&lt;sup&gt;[25]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;&lt;a name="_gn60r7ifwcge"&gt;&lt;/a&gt; Impact on Privacy&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Each of the above use-cases has significant implications for the privacy of Internet users as the technology in question involves access, tracking or 	retention of their online communication and usage activity.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Alyssa Cooper compares DPI with other technologies carrying out content inspection such as caching services and individual users employing firewalls or packet sniffers. She argues that one of the most distinguishing feature of DPI is the potential for "mission-creep."	&lt;a href="#_ftn26" name="_ftnref26"&gt;&lt;sup&gt;&lt;sup&gt;[26]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Kevin Werbach writes that while networks may deploy DPI for implementation under 	CALEA or traffic peer-to-peer shaping, once deployed DPI techniques can be used for completely different purposes such as pattern matching of intercepted 	content and storage of raw data or conclusions drawn from the data.&lt;a href="#_ftn27" name="_ftnref27"&gt;&lt;sup&gt;&lt;sup&gt;[27]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; This scope of 	mission creep is even more problematic as it is completely invisible. As opposed to other technologies which rely on cookies or other web-based services, 	the inspection occurs not at the end points, but somewhere in the middle of the network, often without leaving any traces on the user's system, thus 	rendering them virtually undiscoverable.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Much like other forms of surveillance, DPI threatens the sense that the web is a space where people can engage freely with a wide range of people and 	services. For such a space to continue to exist, it is important for people to feel secure about their communication and transaction on medium. This notion 	of trust is severely harmed by a sense that users are being surveilled and their communication intercepted. This has obvious chilling effect on free speech 	and could also impact electronic commerce.&lt;a href="#_ftn28" name="_ftnref28"&gt;&lt;sup&gt;&lt;sup&gt;[28]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Allyssa Cooper also points out another way in which DPI differs from other content tracking technologies. As the DPI is deployed by the ISPs, it creates a 	greater barrier to opting out and choosing another service. There are only limited options available to individuals as far as ISPs are concerned. 	Christopher Parsons does a review of ISPs using DPI technology in UK, US and Canada and offers that various ISPs do provide in their terms of services that 	they use DPI for network management purposes. However, this information is often not as easily accessible as the terms and conditions of online services. 	A;so, As opposed to online services, where it is relatively easier to migrate to another service, due to both presence of more options and the ease of 	migration, it is a much longer and more difficult process to change one's ISP.&lt;a href="#_ftn29" name="_ftnref29"&gt;&lt;sup&gt;&lt;sup&gt;[29]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;&lt;a name="_n5w8euzb4xhb"&gt;&lt;/a&gt; Measures to mitigate risk&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Currently, there are no existing regulatory frameworks in India which deal govern DPI technology in any way. The International Telecommunications Union 	(ITU) prescribes a standard for DPI&lt;a href="#_ftn30" name="_ftnref30"&gt;&lt;sup&gt;&lt;sup&gt;[30]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; however, the standard does not engage with 	any questions of privacy and requires all DPI technologies to be capable of identifying payload data, and prescribing classification rules for specific 	applications, thus, conflicting with notions of application agnosticism in network management. More importantly, the requirements to identify, decrypt and 	analyse tunneled and encrypted data threaten the reasonable expectation of privacy when sending and receiving encrypted communication. In this final 	section, I look at some possible principles and practices that may be evolved in order to mitigate privacy risks caused due to DPI technology.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Limiting 'depth' and breadth&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;It has been argued that inherently what DPI technology intends to do is matching of patterns in the inspected content against a pre-defined list which is 	relevant to the purpose how which DPI is employed. Much like data minimization principles applicable to data controllers and data processors, it is 	possible for network operators to minimize the depth of the inspection (restrict it to header information only or limited payload information) so as to 	serve the purpose at hand. For instance, in cases where the ISP is looking to identify peer-to-peer traffic, there are protocols which declare their names 	in the application header itself. Similarly, a network operators looking to generate usage data about email traffic can do so simply by looking at port 	number and checking them against common email ports.&lt;a href="#_ftn31" name="_ftnref31"&gt;&lt;sup&gt;&lt;sup&gt;[31]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; However, this mitigation 	strategy may not work well for other use-cases such as blocking malicious software or prohibited content or monitoring for the sake of behavioral 	advertising.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;While depth referred to the degree of inspection within data packets, breadth refers to the volume of packets being inspected. Alyssa Cooper argues that 	for many DPI use cases, it may be possible to rely on pattern matching on only the first few data packets in a flow, in order to arrive at sufficient data 	to take appropriate response. Cooper uses the same example about peer-to-peer traffic. In some cases, the protocol name may appear on the header file of 	only the first packet of a flow between two peers. In such circumstances, the network operators need not look beyond the header files of the first packet 	in a flow, and can apply the network management rule to the entire flow.&lt;a href="#_ftn32" name="_ftnref32"&gt;&lt;sup&gt;&lt;sup&gt;[32]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Data retention&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Aside from the depth and breadth of inspection, another important question whether and for along is there a need for data retention. All use cases may not 	require any kind of data retention and even in case where DPI is used for behavioral advertising, only the conclusions drawn may be retained instead of 	retaining the payload data.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Transparency&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;One of the issues is that DPI technology is developed and deployed outside the purview of standard organizations like ISO. Hence, there has been a lack of 	open, transparent standards development process in which participants have deliberated the impact of the technology. It is important for DPI to undergo 	these process which are inclusive, in that there is participation by non-engineering stakeholders to highlight the public policy issues such as privacy. Further, aside from the technology, the practices by networks need to be more transparent.	&lt;a href="#_ftn33" name="_ftnref33"&gt;&lt;sup&gt;&lt;sup&gt;[33]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Disclosure of the presence of DPI, the level of detail being inspected or retained and the purpose for deployment of DPI can be done. Some ISPs provide some of these details in their terms of service and website notices.	&lt;a href="#_ftn34" name="_ftnref34"&gt;&lt;sup&gt;&lt;sup&gt;[34]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; However, as opposed to web-based services, users have limited interaction with 	their ISP. It would be useful for ISPs to enable greater engagement with their users and make their practices more transparent.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Conclusion&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The very nature of of the DPI technology renders some aspects of recognized privacy principles like notice and consent obsolete. The current privacy frameworks under FIPP&lt;a href="#_ftn35" name="_ftnref35"&gt;&lt;sup&gt;&lt;sup&gt;[35]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; and OECD	&lt;a href="#_ftn36" name="_ftnref36"&gt;&lt;sup&gt;&lt;sup&gt;[36]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; rely on the idea of empowering the individual by providing them with knowledge 	and this knowledge enables them to make informed choices. However, for this liberal conception of privacy to function meaningfully, it is necessary that 	there are real and genuine choices presented to the alternatives. While some principles like data minimisation, necessity and proportionality and purpose 	limitation can be instrumental in ensuring that DPI technology is used only for legitimate purposes, however, without effective opt-out mechanisms and 	limited capacity of individual to assess the risks, the efficacy of privacy principles may be far from satisfactory.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The ongoing Aadhaar case and a host of surveillance projects like CMS, NATGRID, NETRA&lt;a href="#_ftn37" name="_ftnref37"&gt;&lt;sup&gt;&lt;sup&gt;[37]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; and NMAC	&lt;a href="#_ftn38" name="_ftnref38"&gt;&lt;sup&gt;&lt;sup&gt;[38]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; have raised concerns about the state conducting mass-surveillance, particularly 	of online content. In this regard, it is all the more important to recognise the potential of Deep Packet Inspection technologies for impact on privacy 	rights of individuals. Earlier, the Centre for Internet and Society had filed Right to Information applications with the Department of Telecommunications, Government of India regarding the use of DPI, and the government had responded that there was no direction/reference to the ISPs to employ DPI technology.	&lt;a href="#_ftn39" name="_ftnref39"&gt;&lt;sup&gt;&lt;sup&gt;[39]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Similarly, MTNL also responded to the RTI Applications and denied using the 	technology.&lt;a href="#_ftn40" name="_ftnref40"&gt;&lt;sup&gt;&lt;sup&gt;[40]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; It is notable though, that they did not respond to the questions 	about the traffic management policies they follow. Thus, so far there has been little clarity on actual usage of DPI technology by the ISPs.&lt;/p&gt;
&lt;div style="text-align: justify; "&gt;
&lt;hr /&gt;
&lt;div id="ftn1"&gt;
&lt;p&gt;&lt;a href="#_ftnref1" name="_ftn1"&gt;&lt;sup&gt;&lt;sup&gt;[1]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Ashish Mishra, "India's Net Neutrality Crusaders", available at 			&lt;a href="http://mintonsunday.livemint.com/news/indias-net-neutrality-crusaders/2.3.2289565628.html"&gt; http://mintonsunday.livemint.com/news/indias-net-neutrality-crusaders/2.3.2289565628.html &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn2"&gt;
&lt;p&gt;&lt;a href="#_ftnref2" name="_ftn2"&gt;&lt;sup&gt;&lt;sup&gt;[2]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;a href="http://www.livinginternet.com/i/iw_arch.htm"&gt;http://www.livinginternet.com/i/iw_arch.htm&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn3"&gt;
&lt;p&gt;&lt;a href="#_ftnref3" name="_ftn3"&gt;&lt;sup&gt;&lt;sup&gt;[3]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Vinton Cerf and Robert Kahn, "A protocol for packet network intercommunication", available at 			&lt;a href="https://www.semanticscholar.org/paper/A-protocol-for-packet-network-intercommunication-Cerf-Kahn/7b2fdcdfeb5ad8a4adf688eb02ce18b2c38fed7a"&gt; https://www.semanticscholar.org/paper/A-protocol-for-packet-network-intercommunication-Cerf-Kahn/7b2fdcdfeb5ad8a4adf688eb02ce18b2c38fed7a &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn4"&gt;
&lt;p&gt;&lt;a href="#_ftnref4" name="_ftn4"&gt;&lt;sup&gt;&lt;sup&gt;[4]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Paul Ganley and Ben Algove, "Network Neutrality-A User's Guide", available at			&lt;a href="http://wiki.commres.org/pds/NetworkNeutrality/NetNeutrality.pdf"&gt;http://wiki.commres.org/pds/NetworkNeutrality/NetNeutrality.pdf&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn5"&gt;
&lt;p&gt;&lt;a href="#_ftnref5" name="_ftn5"&gt;&lt;sup&gt;&lt;sup&gt;[5]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; J H Saltzer, D D Clark and D P Reed, "End-to-End arguments in System Design", available at			&lt;a href="http://web.mit.edu/Saltzer/www/publications/endtoend/endtoend.pdf"&gt;http://web.mit.edu/Saltzer/www/publications/endtoend/endtoend.pdf&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn6"&gt;
&lt;p&gt;&lt;a href="#_ftnref6" name="_ftn6"&gt;&lt;sup&gt;&lt;sup&gt;[6]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;i&gt;Supra&lt;/i&gt; Note 4.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn7"&gt;
&lt;p&gt;&lt;a href="#_ftnref7" name="_ftn7"&gt;&lt;sup&gt;&lt;sup&gt;[7]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Jonathan Zittrain, The future of Internet - and how to stop it, (Yale University Press and Penguin UK, 2008) available at 			&lt;a href="https://dash.harvard.edu/bitstream/handle/1/4455262/Zittrain_Future%20of%20the%20Internet.pdf?sequence=1"&gt; https://dash.harvard.edu/bitstream/handle/1/4455262/Zittrain_Future%20of%20the%20Internet.pdf?sequence=1 &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn8"&gt;
&lt;p&gt;&lt;a href="#_ftnref8" name="_ftn8"&gt;&lt;sup&gt;&lt;sup&gt;[8]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Alissa Cooper, How Regulation and Competition Influence Discrimination in Broadband Traffic Management: A Comparative Study of Net Neutrality in 			the United States and the United Kingdom available at 			&lt;a href="http://ora.ox.ac.uk/objects/uuid:757d85af-ec4d-4d8a-86ab-4dec86dab568"&gt; http://ora.ox.ac.uk/objects/uuid:757d85af-ec4d-4d8a-86ab-4dec86dab568 &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn9"&gt;
&lt;p&gt;&lt;a href="#_ftnref9" name="_ftn9"&gt;&lt;sup&gt;&lt;sup&gt;[9]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;i&gt;Id&lt;/i&gt; .&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn10"&gt;
&lt;p&gt;&lt;a href="#_ftnref10" name="_ftn10"&gt;&lt;sup&gt;&lt;sup&gt;[10]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Christopher Parsons, "The Politics of Deep Packet Inspection: What Drives Surveillance by Internet Service Providers?", available at 			&lt;a href="https://www.christopher-parsons.com/the-politics-of-deep-packet-inspection-what-drives-surveillance-by-internet-service-providers/"&gt; https://www.christopher-parsons.com/the-politics-of-deep-packet-inspection-what-drives-surveillance-by-internet-service-providers/ &lt;/a&gt; at 15.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn11"&gt;
&lt;p&gt;&lt;a href="#_ftnref11" name="_ftn11"&gt;&lt;sup&gt;&lt;sup&gt;[11]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;i&gt;Ibid&lt;/i&gt; at 16.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn12"&gt;
&lt;p&gt;&lt;a href="#_ftnref12" name="_ftn12"&gt;&lt;sup&gt;&lt;sup&gt;[12]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;i&gt;Id&lt;/i&gt; .&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn13"&gt;
&lt;p&gt;&lt;a href="#_ftnref13" name="_ftn13"&gt;&lt;sup&gt;&lt;sup&gt;[13]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;i&gt;Ibid&lt;/i&gt; at 19.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn14"&gt;
&lt;p&gt;&lt;a href="#_ftnref14" name="_ftn14"&gt;&lt;sup&gt;&lt;sup&gt;[14]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;i&gt;Id&lt;/i&gt; .&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn15"&gt;
&lt;p&gt;&lt;a href="#_ftnref15" name="_ftn15"&gt;&lt;sup&gt;&lt;sup&gt;[15]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;i&gt;Id&lt;/i&gt; .&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn16"&gt;
&lt;p&gt;&lt;a href="#_ftnref16" name="_ftn16"&gt;&lt;sup&gt;&lt;sup&gt;[16]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Jay Klein, "Digging Deeper Into Deep Packet Inspection (DPI)", available at			&lt;a href="http://spi.unob.cz/papers/2007/2007-06.pdf"&gt;http://spi.unob.cz/papers/2007/2007-06.pdf&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn17"&gt;
&lt;p&gt;&lt;a href="#_ftnref17" name="_ftn17"&gt;&lt;sup&gt;&lt;sup&gt;[17]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Tim Wu, "Network Neutrality: Broadband Discrimination", available at			&lt;a href="http://papers.ssrn.com/sol3/papers.cfm?abstract_id=388863"&gt;http://papers.ssrn.com/sol3/papers.cfm?abstract_id=388863&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn18"&gt;
&lt;p&gt;&lt;a href="#_ftnref18" name="_ftn18"&gt;&lt;sup&gt;&lt;sup&gt;[18]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Paul Ferguson and Geoff Huston, "Quality of Service on the Internet: Fact, Fiction,&lt;/p&gt;
&lt;p&gt;or Compromise?", available at &lt;a href="http://www.potaroo.net/papers/1998-6-qos/qos.pdf"&gt;http://www.potaroo.net/papers/1998-6-qos/qos.pdf&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn19"&gt;
&lt;p&gt;&lt;a href="#_ftnref19" name="_ftn19"&gt;&lt;sup&gt;&lt;sup&gt;[19]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Barbara van Schewick, "Network Neutrality and Quality of Service: What a non-discrimination Rule should look like", available at 			&lt;a href="http://cyberlaw.stanford.edu/downloads/20120611-NetworkNeutrality.pdf"&gt; http://cyberlaw.stanford.edu/downloads/20120611-NetworkNeutrality.pdf &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn20"&gt;
&lt;p&gt;&lt;a href="#_ftnref20" name="_ftn20"&gt;&lt;sup&gt;&lt;sup&gt;[20]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;i&gt;Supra&lt;/i&gt; Note 14.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn21"&gt;
&lt;p&gt;&lt;a href="#_ftnref21" name="_ftn21"&gt;&lt;sup&gt;&lt;sup&gt;[21]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Paul Ohm, "The Rise and Fall of Invasive ISP Surveillance," available at 			&lt;a href="http://paulohm.com/classes/infopriv10/files/ExcerptOhmISPSurveillance.pdf"&gt; http://paulohm.com/classes/infopriv10/files/ExcerptOhmISPSurveillance.pdf &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn22"&gt;
&lt;p&gt;&lt;a href="#_ftnref22" name="_ftn22"&gt;&lt;sup&gt;&lt;sup&gt;[22]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Ben Elgin and Bruce Einhorn, "The great firewall of China", available at 			&lt;a href="http://www.bloomberg.com/news/articles/2006-01-22/the-great-firewall-of-china"&gt; http://www.bloomberg.com/news/articles/2006-01-22/the-great-firewall-of-china &lt;/a&gt; .&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn23"&gt;
&lt;p&gt;&lt;a href="#_ftnref23" name="_ftn23"&gt;&lt;sup&gt;&lt;sup&gt;[23]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Mike Wheatley, "Malaysia's Web Heavily Censored Before Controversial Elections", available at 			&lt;a href="http://siliconangle.com/blog/2013/05/06/malaysias-web-heavily-censored-before-controversial-elections/"&gt; http://siliconangle.com/blog/2013/05/06/malaysias-web-heavily-censored-before-controversial-elections/ &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn24"&gt;
&lt;p&gt;&lt;a href="#_ftnref24" name="_ftn24"&gt;&lt;sup&gt;&lt;sup&gt;[24]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Fazal Majid, "Deep packet inspection rears it ugly head" available at			&lt;a href="https://majid.info/blog/telco-snooping/"&gt;https://majid.info/blog/telco-snooping/&lt;/a&gt;.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn25"&gt;
&lt;p&gt;&lt;a href="#_ftnref25" name="_ftn25"&gt;&lt;sup&gt;&lt;sup&gt;[25]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Alissa Cooper, "Doing the DPI Dance: Assessing the Privacy Impact of Deep Packet Inspection," in W. Aspray and P. Doty (Eds.), Privacy in America: 			Interdisciplinary Perspectives, Plymouth, UK: Scarecrow Press, 2011 at 151.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn26"&gt;
&lt;p&gt;&lt;a href="#_ftnref26" name="_ftn26"&gt;&lt;sup&gt;&lt;sup&gt;[26]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;i&gt;Ibid&lt;/i&gt; at 148.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn27"&gt;
&lt;p&gt;&lt;a href="#_ftnref27" name="_ftn27"&gt;&lt;sup&gt;&lt;sup&gt;[27]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Kevin Werbach, "Breaking the Ice: Rethinking Telecommunications Law for the Digital Age", Journal of Telecommunications and High Technology, 			available at &lt;a href="http://www.jthtl.org/articles.php?volume=4"&gt;http://www.jthtl.org/articles.php?volume=4&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn28"&gt;
&lt;p&gt;&lt;a href="#_ftnref28" name="_ftn28"&gt;&lt;sup&gt;&lt;sup&gt;[28]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;i&gt;Supra &lt;/i&gt; Note 25 at 149.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn29"&gt;
&lt;p&gt;&lt;a href="#_ftnref29" name="_ftn29"&gt;&lt;sup&gt;&lt;sup&gt;[29]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;i&gt;Supra &lt;/i&gt; Note 25 at 147.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn30"&gt;
&lt;p&gt;&lt;a href="#_ftnref30" name="_ftn30"&gt;&lt;sup&gt;&lt;sup&gt;[30]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; International Telecommunications Union, Recommendation ITU-T.Y.2770, Requirements for Deep Packet Inspection in next generation networks, available 			at &lt;a href="https://www.itu.int/rec/T-REC-Y.2770-201211-I/en"&gt;https://www.itu.int/rec/T-REC-Y.2770-201211-I/en&lt;/a&gt;.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn31"&gt;
&lt;p&gt;&lt;a href="#_ftnref31" name="_ftn31"&gt;&lt;sup&gt;&lt;sup&gt;[31]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;i&gt;Supra &lt;/i&gt; Note 25 at 154.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn32"&gt;
&lt;p&gt;&lt;a href="#_ftnref32" name="_ftn32"&gt;&lt;sup&gt;&lt;sup&gt;[32]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;i&gt;Ibid&lt;/i&gt; at 156.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn33"&gt;
&lt;p&gt;&lt;a href="#_ftnref33" name="_ftn33"&gt;&lt;sup&gt;&lt;sup&gt;[33]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;i&gt;Supra&lt;/i&gt; Note 10.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn34"&gt;
&lt;p&gt;&lt;a href="#_ftnref34" name="_ftn34"&gt;&lt;sup&gt;&lt;sup&gt;[34]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Paul Ohm, "The Rise and Fall of Invasive ISP Surveillance", available at 			&lt;a href="http://paulohm.com/classes/infopriv10/files/ExcerptOhmISPSurveillance.pdf"&gt; http://paulohm.com/classes/infopriv10/files/ExcerptOhmISPSurveillance.pdf &lt;/a&gt; .&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn35"&gt;
&lt;p&gt;&lt;a href="#_ftnref35" name="_ftn35"&gt;&lt;sup&gt;&lt;sup&gt;[35]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;a href="http://www.nist.gov/nstic/NSTIC-FIPPs.pdf"&gt;http://www.nist.gov/nstic/NSTIC-FIPPs.pdf&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn36"&gt;
&lt;p&gt;&lt;a href="#_ftnref36" name="_ftn36"&gt;&lt;sup&gt;&lt;sup&gt;[36]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;a href="https://www.oecd.org/sti/ieconomy/oecdguidelinesontheprotectionofprivacyandtransborderflowsofpersonaldata.htm"&gt; https://www.oecd.org/sti/ieconomy/oecdguidelinesontheprotectionofprivacyandtransborderflowsofpersonaldata.htm &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn37"&gt;
&lt;p&gt;&lt;a href="#_ftnref37" name="_ftn37"&gt;&lt;sup&gt;&lt;sup&gt;[37]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; "India's Surveillance State" Software Freedom Law Centre, available at 			&lt;a href="http://sflc.in/indias-surveillance-state-our-report-on-communications-surveillance-in-india/"&gt; http://sflc.in/indias-surveillance-state-our-report-on-communications-surveillance-in-india/ &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn38"&gt;
&lt;p&gt;&lt;a href="#_ftnref38" name="_ftn38"&gt;&lt;sup&gt;&lt;sup&gt;[38]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Amber Sinha, "Are we losing our right to privacy and freedom on speech on Indian Internet", DNA, available at 			&lt;a href="http://www.dnaindia.com/scitech/column-are-we-losing-the-right-to-privacy-and-freedom-of-speech-on-indian-internet-2187527"&gt; http://www.dnaindia.com/scitech/column-are-we-losing-the-right-to-privacy-and-freedom-of-speech-on-indian-internet-2187527 &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn39"&gt;
&lt;p&gt;&lt;a href="#_ftnref39" name="_ftn39"&gt;&lt;sup&gt;&lt;sup&gt;[39]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;a href="http://cis-india.org/telecom/use-of-dpi-technology-by-isps.pdf"&gt;http://cis-india.org/telecom/use-of-dpi-technology-by-isps.pdf&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn40"&gt;
&lt;p&gt;&lt;a href="#_ftnref40" name="_ftn40"&gt;&lt;sup&gt;&lt;sup&gt;[40]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Smita Mujumdar, "Use of DPI Technology by ISPs - Response by the Department of Telecommunications" available at 			&lt;a href="http://cis-india.org/telecom/dot-response-to-rti-on-use-of-dpi-technology-by-isps"&gt; http://cis-india.org/telecom/dot-response-to-rti-on-use-of-dpi-technology-by-isps &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;/div&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/deep-packet-inspection-how-it-works-and-its-impact-on-privacy'&gt;https://cis-india.org/internet-governance/blog/deep-packet-inspection-how-it-works-and-its-impact-on-privacy&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2016-12-16T23:14:49Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>




</rdf:RDF>
