<?xml version="1.0" encoding="utf-8" ?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:syn="http://purl.org/rss/1.0/modules/syndication/" xmlns="http://purl.org/rss/1.0/">
<channel rdf:about="https://cis-india.org/search_rss">
  <title>Centre for Internet and Society</title>
  <link>https://cis-india.org</link>
  
  <description>These are the search results for the query, showing results 21 to 35.</description>
  <image rdf:resource="https://cis-india.org/logo.png"/>

  <items>
    <rdf:Seq>
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/privacy-is-not-a-unidimensional-concept"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/privacy-after-big-data-compilation-of-early-research"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/news-nine-shweta-mohandas-and-anamika-kundu-personal-data-protection-bill-must-examine-data-collection-practices-that-emerged-during-pandemic"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/ijlt-shweta-mohandas-and-anamika-kundu-march-6-2022-nothing-to-kid-about-childrens-data-under-the-new-data-protection-bill"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/nha-data-sharing-guidelines"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/medianama-namaprivacy-the-future-of-user-data-delhi-sep-6"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/marco-civil-da-internet"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/it-act-and-commerce"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/medianama-february-18-2021-amber-sinha-data-protection-bill-consent-clause-state-function"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/gdpr-and-india-a-comparative-analysis"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/electoral-databases-2013-privacy-and-security-concerns"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/does-the-safe-harbor-program-adequately-address-third-parties-online"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/divergence-between-the-general-data-protection-regulation-and-the-personal-data-protection-bill-2019"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/major-security-flaw-namo-app"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/demistifying-data-breaches-in-india"/>
        
    </rdf:Seq>
  </items>

</channel>


    <item rdf:about="https://cis-india.org/internet-governance/privacy-is-not-a-unidimensional-concept">
    <title>Privacy is not a unidimensional concept</title>
    <link>https://cis-india.org/internet-governance/privacy-is-not-a-unidimensional-concept</link>
    <description>
        &lt;b&gt;The right to privacy is important not only for our negotiations with the information age but also to counter the transgressions of a welfare state. A robust right to privacy is essential for all citizens in India to defend their individual autonomy in the face of invasive state actions purportedly for the public good. The ruling of this nine-judge bench will have far-reaching impact on the extent and scope of rights available to us all.&lt;/b&gt;
        
&lt;div&gt;This article, written by Amber Sinha, was published in the &lt;a class="external-link" href="http://economictimes.indiatimes.com/news/politics-and-nation/aadhar-privacy-is-not-a-unidimensional-concept/articleshow/59716562.cms"&gt;Economic Times&lt;/a&gt; on July 23, 2017.&lt;/div&gt;
&lt;div&gt;
      &lt;br /&gt;&lt;/div&gt;
&lt;div&gt;In a disappointing case of judicial evasion by the apex court,
      it has taken over 600 days since the reference order was passed on
      August 11, 2015, for this bench to be constituted. Over two days
      of arguments, the counsels for the petitioners have presented
      before the court why the right to privacy, despite not finding a
      mention in the Constitution of India, is a fundamental right
      essential to a person’s dignity and liberty, and must be read into
      not one but multiple articles of the Constitution. The government
      will make its arguments in the coming week.&lt;/div&gt;
&lt;div&gt;One must wonder why we are debating the contours of the right
      to privacy, which 40 years of jurisprudence had lulled us into
      believing we already had. The answer to that can be found in a
      series of hearings in the Aadhaar case that began in 2012. Justice
      KS Puttaswamy, a former Karnataka High Court judge, filed a
      petition before the Supreme Court, questioning the validity of the
      Aadhaar project due to its lack of legislative basis (the
      Aadhaar Act has since been passed, in 2016) and its transgressions on our
      fundamental rights. Over time, a number of other petitions also
      made their way to the apex court, challenging different aspects of
      the Aadhaar project. Since then, five different interim orders by
      the Supreme Court have stated that no person should suffer because
      they do not have an Aadhaar number. Aadhaar, according to the
      court, could not be made mandatory to avail benefits and services
      from government schemes. Further, the court has limited the use of
      Aadhaar to specific schemes: LPG, PDS, MGNREGA, National Social
      Assistance Programme, the Pradhan Mantri Jan Dhan Yojna and EPFO.&lt;br /&gt;
      &lt;br /&gt;&lt;/div&gt;
&lt;div&gt;The real spanner in the works in the progress of this case was
      the stand taken by Mukul Rohatgi, then attorney general of India
      who, in a hearing before the court in July 2015, stated that there
      is no constitutionally guaranteed right to privacy. His reliance
      was on two Supreme Court judgments in MP Sharma v Satish Chandra
      (1954) and Kharak Singh v State of Uttar Pradesh (1962): both
      cases, decided by eight- and six-judge benches respectively,
      denied the existence of a constitutional right to privacy. As the
      subsequent judgments which upheld the right to privacy were by
      smaller benches, Rohatgi claimed that MP Sharma and Kharak Singh
      still prevailed over them, until they were overruled by a larger
      bench.&lt;/div&gt;
&lt;div&gt;The reference to a larger bench has since delayed the entire
      matter, even as a number of government schemes have made Aadhaar
      mandatory. This reading of privacy as a unidimensional concept by
      the courts is, with due respect, erroneous. Privacy, as a concept,
      includes within its scope spatial, familial, informational and
      decisional aspects. We all have a legitimate expectation of
      privacy in our private spaces, such as our homes, and in our
      personal relationships. Similarly, we must be able to exercise
      some control over how personal data, like our financial
      information, are disseminated. Most importantly, privacy gives us
      the space to make autonomous choices and decisions without
      external interference. All these dimensions of privacy must stand
      as distinct rights. In MP Sharma, the court rejected a certain
      aspect of the right of privacy by refusing to acknowledge a right
      against search and seizure. This in no way prevented the court,
      even in the form of a smaller bench, from ruling on any other
      aspects of privacy, including those that are relevant to the
      Aadhaar case.&lt;/div&gt;
&lt;div&gt;&amp;nbsp;&lt;/div&gt;
&lt;div&gt;The limited referral to this bench means that the court will
      have to rule on the status of privacy and its possible limitations
      in isolation, without even going into the details of the Aadhaar
      case (based on the nature of protection that this bench accords to
      privacy, the petitioners and defendants in the Aadhaar case will
      have to argue afresh on whether the project infringes this
      most fundamental right). There are no facts of the case to ground
      the legal principles in, and defining the contours of a right can
      be a difficult exercise. The court must be wary of how any limits
      they put on the right may be used in future. Equally, it is
      important to articulate that any limitations on the right to
      privacy due to competing interests such as national security and
      public interest must be imposed only when necessary and always be
      proportionate. &lt;br /&gt;
      &lt;br /&gt;&lt;/div&gt;
&lt;p&gt;It will not be enough for the court to merely state that we have a
    constitutional right to privacy. They would be well advised to cut
    through the muddle of existing privacy jurisprudence, and
    unequivocally establish the various facets of the right. Without
    that, we may not be able to withstand the modern dangers of
    surveillance, denial of bodily integrity and self-determination
    through forcible collection of information. The nine judges, in
    their collective wisdom, must not only ensure that we have a right
    to privacy, but also clearly articulate a robust reading of this
    right capable of withstanding the growing interferences with our
    autonomy.&lt;/p&gt;
&lt;div&gt;&amp;nbsp;&lt;/div&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/privacy-is-not-a-unidimensional-concept'&gt;https://cis-india.org/internet-governance/privacy-is-not-a-unidimensional-concept&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Aadhaar</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2017-08-07T08:02:20Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/privacy-after-big-data-compilation-of-early-research">
    <title>Privacy after Big Data: Compilation of Early Research</title>
    <link>https://cis-india.org/internet-governance/blog/privacy-after-big-data-compilation-of-early-research</link>
    <description>
        &lt;b&gt;Evolving data science, technologies, techniques, and practices, including big data, are enabling shifts in how the public and private sectors carry out their functions and responsibilities, deliver services, and allow innovative production and service models to emerge. In this compilation we have put together a series of articles that we have developed as we explore the impacts – positive and negative – of big data. This is a growing body of research that we are exploring, and it is relevant to multiple areas of our work including privacy and surveillance. Feedback and comments on the compilation are welcome and appreciated.&lt;/b&gt;
        
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;h4&gt;&lt;a href="https://github.com/cis-india/website/raw/master/docs/CIS_PrivacyAfterBigData_CompilationOfEarlyResearch_2016.11.pdf"&gt;Download the Compilation&lt;/a&gt; (PDF)&lt;/h4&gt;
&lt;hr /&gt;
&lt;h3&gt;&lt;strong&gt;Privacy after Big Data&lt;/strong&gt;&lt;/h3&gt;
&lt;p&gt;Evolving data science, technologies, techniques, and practices, including big data, are enabling shifts in how the public and private sectors carry out their functions and responsibilities, deliver services, and allow innovative production and service models to emerge. For example, in the public sector, the Indian government has considered replacing the traditional poverty line with targeted subsidies based on individual household income and assets. The my.gov.in platform aims to enable the participation of connected citizens, pulling in online public opinion in a structured manner on key governance topics in the country. The 100 Smart Cities Mission seeks to leverage big data analytics and techniques to deliver services and govern citizens within city sub-systems. In the private sector, emerging financial technology companies are developing credit scoring models using big, small, social, and fragmented data so that people with no formal credit history can be offered loans. These models promote efficiency and reduction in cost through personalization and are powered by a wide variety of data sources including mobile data, social media data, web usage data, and passively collected data from usage of IoT or connected devices.&lt;/p&gt;
&lt;p&gt;These data technologies and solutions are enabling business models that are based on the ideals of ‘less’: cash-less, presence-less, and paper-less. This push towards an economy premised upon a foundational digital ID, in the prevailing absence of legal frameworks, leads to a substantive loss of anonymity and privacy for individual citizens and consumers vis-a-vis both the state and the private sector. Indeed, the present use of these techniques runs contrary to the notion of the ‘sunlight effect’ - making the individual fully transparent (often without their knowledge) to the state and private sector, while the algorithms and means of reaching a decision are opaque and inaccessible to the individual.&lt;/p&gt;
&lt;p&gt;These techniques, characterized by the volume of data processed, the variety of sources data is processed from, and the ability to both contextualize - learning new insights from disconnected data points - and de-contextualize - finding correlation rather than causation - have also increased the value of all forms of data. In some ways, big data has placed data on a level playing field as far as monetisation and joining up are concerned. Metadata can be just as valuable to an entity as content data. As data science techniques evolve to find new ways of collecting, processing, and analyzing data, the benefits of the same are clear and tangible, while the harms are less clear, but significantly present.&lt;/p&gt;
&lt;p&gt;Is it possible for an algorithm to discriminate? Will incorrect decisions be made based on data collected? Will populations be excluded from necessary services if they do not engage with certain models, or do emerging models overlook certain populations? Can such tools be used to surveil individuals at a level of granularity that was formerly not possible, and before a crime occurs? Can such tools be used to violate rights – for example, to target certain types of speech or groups online? And importantly, when these practices are opaque to the individual, how can one seek appropriate and effective remedy?&lt;/p&gt;
&lt;p&gt;Traditionally, data protection standards have defined and established protections for certain categories of data. Yet, data science techniques have evolved beyond data protection principles. It is now infinitely harder to obtain informed consent from an individual when data that is collected can be used for multiple purposes by multiple bodies. Providing notice for every use is also more difficult – as is fulfilling requirements of data minimization. Some say privacy is dead in the era of big data. Others say privacy needs to be re-conceptualized, while others say protecting privacy now, more than ever, requires a ‘regulatory sandbox’ that brings together technical design, markets, legislative reforms, self regulation, and innovative regulatory frameworks. It also demands an expanding of the narrative around privacy – one that has largely been focused on harms such as misuse of data or unauthorized collection – to include discrimination, marginalization, and competition harms.&lt;/p&gt;
&lt;p&gt;In this compilation we have put together a series of articles that we have developed as we explore the impacts – positive and negative – of big data. This includes looking at India’s data protection regime in the context of big data, reviewing literature on the benefits and harms of big data, studying emerging predictive policing techniques that rely on big data, and analyzing closely the impact of big data on specific privacy principles such as consent. This is a growing body of research that we are exploring, and it is relevant to multiple areas of our work including privacy and surveillance. Feedback and comments on the compilation are welcome and appreciated.&lt;/p&gt;
&lt;p&gt;&lt;em&gt;Elonnai Hickok&lt;/em&gt;&lt;br /&gt;Director - Internet Governance&lt;/p&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/privacy-after-big-data-compilation-of-early-research'&gt;https://cis-india.org/internet-governance/blog/privacy-after-big-data-compilation-of-early-research&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Saumyaa Naidu</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Human Rights</dc:subject>
    
    
        <dc:subject>IT Act</dc:subject>
    
    
        <dc:subject>Big Data</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Smart Cities</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Information Technology</dc:subject>
    
    
        <dc:subject>Publications</dc:subject>
    

   <dc:date>2016-11-12T01:37:03Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/news-nine-shweta-mohandas-and-anamika-kundu-personal-data-protection-bill-must-examine-data-collection-practices-that-emerged-during-pandemic">
    <title>Personal Data Protection Bill must examine data collection practices that emerged during pandemic</title>
    <link>https://cis-india.org/internet-governance/blog/news-nine-shweta-mohandas-and-anamika-kundu-personal-data-protection-bill-must-examine-data-collection-practices-that-emerged-during-pandemic</link>
    <description>
        &lt;b&gt;The PDP Bill is speculated to be introduced during the winter session of the parliament soon. The PDP Bill in its current form provides wide-ranging exemptions which allow government agencies to process citizens’ data in order to fulfil their responsibilities. The bill could ensure that employers have some responsibility towards the data they collect from employees.&lt;/b&gt;
        &lt;p&gt;The article by Shweta Mohandas and Anamika Kundu was &lt;a class="external-link" href="https://www.news9live.com/technology/personal-data-protection-bill-must-examine-data-collection-practices-that-emerged-during-pandemic-137031?infinitescroll=1"&gt;originally published by &lt;strong&gt;news nine&lt;/strong&gt;&lt;/a&gt; on November 29, 2021.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;The Personal Data Protection Bill (PDP) is speculated to be introduced during the winter session of the parliament soon, and the report of the Joint Parliamentary Committee (JPC) has already been &lt;a class="external-link" href="https://www.thehindu.com/news/national/parliamentary-panel-retains-controversial-exemption-clause-in-personal-data-protection-bill/article37633344.ece"&gt;adopted&lt;/a&gt; by the committee on Monday. The Report of the JPC comes after almost two years of deliberation and secrecy over what the final version of the Personal Data Protection Bill will look like. Since the publication of the &lt;a class="external-link" href="https://prsindia.org/files/bills_acts/bills_parliament/2019/Personal%20Data%20Protection%20Bill,%202019.pdf"&gt;2019 version&lt;/a&gt; of the PDP Bill, the Covid-19 pandemic and the public safety measures have opened the way for a number of new organisations and reasons to collect personal data that were non-existent in 2019. Hence, along with the changes suggested by multiple civil society organisations and the dissent notes submitted by members of the JPC, the new version of the PDP Bill must also look at how data processing has changed over the span of two years.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Concerns with the bill&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;At the outset, there are certain parts of the PDP Bill which need to be revised in order to uphold the spirit of privacy and individual autonomy laid out in the Puttaswamy judgement. The two sections that need to be brought in line with the privacy judgement are the ones that allow for non-consensual processing of data by the government, and by employers. The PDP Bill in its current form provides wide-ranging exemptions which allow government agencies to process citizens' data in order to fulfil their &lt;a class="external-link" href="https://www.livemint.com/news/india/big-brother-on-top-in-data-protection-bill-11576164271430.html"&gt;responsibilities&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In the &lt;a class="external-link" href="https://www.meity.gov.in/writereaddata/files/Personal_Data_Protection_Bill,2018.pdf"&gt;2018 version&lt;/a&gt; of the bill, drafted by the Justice Srikrishna Committee, the exemptions granted to the State with regard to processing of data were subject to a four-pronged test which required the processing to be (i) authorised by law; (ii) in accordance with the procedure laid down by the law; (iii) necessary; and (iv) proportionate to the interests being achieved. This four-pronged test was in line with the principles laid down by the Supreme Court in the Puttaswamy judgement. The 2019 version of the PDP Bill has diluted this principle by merely retaining the 'necessity principle' and removing the other requirements, which is not in consonance with the test laid down by the Supreme Court in Puttaswamy.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Section 35 was also widely discussed in the panel meetings, where members had &lt;a class="external-link" href="https://www.thehindu.com/news/national/parliamentary-panel-retains-controversial-exemption-clause-in-personal-data-protection-bill/article37633344.ece"&gt;argued&lt;/a&gt; for the removal of 'public order' as a ground for exemption. The panel also insisted on '&lt;a class="external-link" href="https://www.thehindu.com/news/national/parliamentary-panel-retains-controversial-exemption-clause-in-personal-data-protection-bill/article37633344.ece"&gt;judicial or parliamentary oversight&lt;/a&gt;' to grant such exemptions. The final report did not accept these suggestions, citing a need to balance &lt;a class="external-link" href="https://www.thehindu.com/news/national/parliamentary-panel-retains-controversial-exemption-clause-in-personal-data-protection-bill/article37633344.ece"&gt;national security, liberty and privacy&lt;/a&gt; of an individual. There ought to be prior judicial review of the written order exempting a governmental agency from any provisions of the bill. Allowing the government to claim an exemption whenever it is satisfied that doing so is "necessary or expedient" can be misused.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Another clause which gives the employer a wide berth concerns employee data. Section 13 of the current version of the bill provides the employer with leeway to process employee data (other than sensitive personal data) without consent on two grounds: when consent is not appropriate, or when obtaining consent would involve disproportionate effort on the part of the employer.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The personal data so collected can only be collected for recruitment, termination, attendance, provision of any service or benefit, and assessing performance. This covers almost all of the activities that require data of the employee. Although the 2019 version of the bill excludes non-consensual collection of sensitive personal data (a provision that was missing in the 2018 version of the bill), there is still a lot of scope to improve this provision and provide employees further right to their data. At the outset the bill does not define employee and employer, which could result in confusion as there is no one definition of these terms across Indian Labour Laws.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Additionally, the bill distinguishes between employee and consumer, where the consumer of the same company or service has a greater right to their data than an employee. While the consumer as a data principal has the option to use any other product or service and the right to withdraw consent at any time, for an employee the consequence of refusing or withdrawing consent could be termination of employment. It is understood that there is a requirement for employee data to be collected, and that consent does not work the same way as it does in the case of a consumer.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The bill could ensure that employers have some responsibility towards the data they collect from employees, such as ensuring that the data are used only for the purpose for which they were collected, that the employee knows how long their data will be retained, and whether the data are being processed by third parties. It is also worth mentioning that the Indian government is India's largest employer, spanning a variety of agencies and public enterprises.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Concerns highlighted by JPC Members&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Returning to the few members of the JPC who moved dissent notes, specifically with regard to governmental exemptions: Jairam Ramesh filed a &lt;a href="https://www.news9live.com/india/parliament-panel-adopts-report-on-data-protection-amid-dissent-by-opposition-135591"&gt;dissent note&lt;/a&gt;, and many other opposition members followed suit. While Jairam Ramesh praised the JPC's functioning, he disagreed with certain aspects of the Report. According to him, the 2019 bill is designed in a manner where the right to privacy is given importance only in cases of private activities. He raised concerns regarding the unbridled powers given to the government to exempt itself from any of the provisions.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The amendment suggested by him would require parliamentary approval before any exemption takes effect. He also added that Section 12 of the bill, which provides certain scenarios where consent is not needed for the processing of personal data, should have been made '&lt;a href="https://www.hindustantimes.com/india-news/mps-file-dissent-notes-over-glaring-lacunae-in-report-on-data-protection-bill-101637566365637.html"&gt;less sweeping&lt;/a&gt;'. Similarly, Gaurav Gogoi's &lt;a href="https://www.hindustantimes.com/india-news/mps-file-dissent-notes-over-glaring-lacunae-in-report-on-data-protection-bill-101637566365637.html"&gt;note&lt;/a&gt; stated that the exemptions would create a surveillance state and likewise criticised Sections 12 and 35 of the bill. He also mentioned that there ought to be parliamentary oversight for the exemptions provided in the bill.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;On the same issue, Congress leader Manish Tiwari noted that the bill creates '&lt;a href="https://timesofindia.indiatimes.com/business/india-business/personal-data-protection-bill-what-is-it-and-why-is-the-opposition-so-unhappy-with-it/articleshow/87869391.cms"&gt;parallel universes&lt;/a&gt;' - one for the private sector, which needs to be compliant, and the other for the State, which can exempt itself. He has opposed the entire bill, stating that it has an "inherent design flaw". He has raised specific objections to 37 clauses and stated that any blanket exemptions to the state go against the Puttaswamy Judgement.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In their joint &lt;a href="https://www.news9live.com/india/tmc-congress-mps-submit-dissent-notes-to-joint-panel-on-personal-data-protection-bill-135491"&gt;dissent note&lt;/a&gt;, Derek O'Brien and Mahua Moitra have said that there is a lack of adequate safeguards to protect the data principals' privacy, and that there was a lack of time and opportunity for stakeholder consultations. They have also pointed out that the independence of the DPA will cease to exist under the present provision allowing the government the power to choose its members and chairman. Amar Patnaik objected to the lack of inclusion of state-level authorities in the bill. Without such bodies, he says, there would be federal override.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Conclusion&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;While a number of issues were highlighted by civil society, the members of the JPC, and the media, the new version of the bill also needs to take into account the shifts and new data collection practices that have emerged during the pandemic, be comprehensive, and leave very little to be decided later by the Rules.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/news-nine-shweta-mohandas-and-anamika-kundu-personal-data-protection-bill-must-examine-data-collection-practices-that-emerged-during-pandemic'&gt;https://cis-india.org/internet-governance/blog/news-nine-shweta-mohandas-and-anamika-kundu-personal-data-protection-bill-must-examine-data-collection-practices-that-emerged-during-pandemic&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Shweta Mohandas and Anamika Kundu</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2022-03-30T15:15:21Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/ijlt-shweta-mohandas-and-anamika-kundu-march-6-2022-nothing-to-kid-about-childrens-data-under-the-new-data-protection-bill">
    <title>Nothing to Kid About – Children's Data Under the New Data Protection Bill</title>
    <link>https://cis-india.org/internet-governance/blog/ijlt-shweta-mohandas-and-anamika-kundu-march-6-2022-nothing-to-kid-about-childrens-data-under-the-new-data-protection-bill</link>
    <description>
        &lt;b&gt;The pandemic has forced policymakers to adapt their approach to people's changing practices, from looking at contactless ways of payment to the shifting of educational institutions online.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article was originally &lt;a class="external-link" href="https://www.ijlt.in/post/nothing-to-kid-about-children-s-data-under-the-new-data-protection-bill"&gt;published in the Indian Journal of Law and Technology&lt;/a&gt;.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;For children, the internet has shifted from being a form of entertainment to a medium to connect with friends and seek knowledge and education. However, each time they access the internet, data about them and their choices are recorded by companies and unknown third parties. The growth of EdTech apps in India has led to growing concerns regarding children's data privacy. This has led to the creation of a &lt;a href="https://economictimes.indiatimes.com/tech/startups/edtech-firms-work-to-get-communication-right-with-the-asci/articleshow/89082308.cms" rel="noopener noreferrer" target="_blank"&gt;self-regulatory&lt;/a&gt; body, the Indian EdTech Consortium. More recently, the &lt;a href="https://economictimes.indiatimes.com/tech/startups/edtech-firms-work-to-get-communication-right-with-the-asci/articleshow/89082308.cms" rel="noopener noreferrer" target="_blank"&gt;Advertising Standards Council of India&lt;/a&gt; has also started looking at passing a draft regulation to keep a check on EdTech advertisements.&lt;/p&gt;
&lt;p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 iWv3d b+iTF _78FBa _1FoOD iWv3d _1j-51 mm8Nw" style="text-align: justify; "&gt;The Joint Parliamentary Committee (JPC), tasked with drafting and revising the Data Protection Bill, had to consider the number of changes that had happened after the release of the 2019 version of the Bill. While the most significant change was the removal of the term “personal data” from the title of the Bill, in a move to create a comprehensive Data Protection Bill that includes both personal and non personal data. Certain other provisions of the Bill also featured additions and removals. The JPC, in its revised version of the Bill has removed an entire class of &lt;a class="_1lsz7 _3Bkfb" href="https://prsindia.org/billtrack/the-personal-data-protection-bill-2019#:~:text=Obligations%20of%20data%20fiduciary%3A%20A,specific%2C%20clear%20and%20lawful%20purpose" rel="noopener noreferrer" target="_blank"&gt;data fiduciaries&lt;/a&gt; – guardian data fiduciary – which was tasked with greater responsibility for managing children's data. While the JPC justified the removal of the guardian data fiduciary stating that consent from the guardian of the child is enough to meet the end for which personal data of children are processed by the data fiduciary. While thought has been given to looking at how consent is given by the guardian on behalf of the child, there was no change in the age of children in the Bill. Keeping the age of consent under the Bill as the same as the age of majority to enter into a contract under the 1872 Indian Contract Act – 18 years – reveals the disconnect the law has with the ground reality of how children interact with the internet.&lt;/p&gt;
&lt;p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 iWv3d b+iTF _78FBa _1FoOD iWv3d _1j-51 mm8Nw" style="text-align: justify; "&gt;In the current state of affairs where Indian children are navigating the digital world on their own there is a need to look deeply at the processing of children’s data as well as ways to ensure that children have information about consent and informational privacy. By placing the onus of granting consent on parents, the PDP Bill fails to look at how consent works in a privacy policy–based consent model and how this, in turn, harms children in the long run.&lt;/p&gt;
&lt;h3 class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 iWv3d aujbK _3M0Fe _1FoOD iWv3d _1j-51 mm8Nw"&gt;1. Age of Consent&lt;/h3&gt;
&lt;p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 iWv3d b+iTF _78FBa _1FoOD iWv3d _1j-51 mm8Nw" style="text-align: justify; "&gt;By setting the age of consent as 18 years under the Data Protection Bill, 2021, it brings all individuals under 18 years of age under one umbrella without making a distinction between the internet usage of a 5-year-old child and a 16-year-old teenager. There is a need to look at the current internet usage habits of children and assess whether requiring parental consent is reasonable or even practical. It is also pertinent to note that the law in the offline world does make the distinction between age and maturity. For example, it has been &lt;a class="_1lsz7 _3Bkfb" href="https://cis-india.org/internet-governance/blog/pallavi-bedi-and-shweta-mohandas-cis-comments-on-data-protection-bill" rel="noopener noreferrer" target="_blank"&gt;highlighted&lt;/a&gt; that Section 82 of the Indian Penal Code, read with Section 83, states that any act by a child under the age of 12 years shall not be considered an offence, while the maturity of those aged between 12–18 years will be decided by the court (individuals between the age of 16–18 years can also be tried as adults for heinous crimes). Similarly, child labour laws in the country allow children above the age of 14 years to work in non-hazardous industries, which would qualify them to fall under Section 13 of the Bill, which deals with employee data.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;A 2019 &lt;/span&gt;&lt;a class="_1lsz7 _3Bkfb" href="https://reverieinc.com/wp-content/uploads/2020/09/IAMAI-Digital-in-India-2019-Round-2-Report.pdf" rel="noopener noreferrer" target="_blank"&gt;report&lt;/a&gt;&lt;span&gt; suggests that two-thirds of India’s internet users are in the 12–29 years age group, accounting for about 21.5% of the total internet usage in metro cities. With the emergence of cheaper phones equipped with faster processing and low internet data costs, children are no longer passive consumers of the internet. They have social media accounts and use several applications to interact with others and make purchases. There is a need to examine how children and teenagers interact with the internet as well as the practicality of requiring parental consent for the usage of applications.&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Most applications that require age data request users to type in their date of birth; it is not difficult for a child to input a suitable date that would make it appear that they are &lt;/span&gt;&lt;a class="_1lsz7 _3Bkfb" href="https://www.theguardian.com/media/2013/jul/26/children-lie-age-facebook-asa" rel="noopener noreferrer" target="_blank"&gt;over 18&lt;/a&gt;&lt;span&gt;. In this case they are still children but the content that will be presented to them would be those that are meant for adults including content that might be disturbing or those involving use of &lt;/span&gt;&lt;a class="_1lsz7 _3Bkfb" href="https://www.theguardian.com/media/2013/jul/26/children-lie-age-facebook-asa" rel="noopener noreferrer" target="_blank"&gt;alcohol and gambling. &lt;/a&gt;&lt;span&gt;Additionally, in their privacy policies, applications sometimes state that they are not suited for and restricted from users under 18. Here, data fiduciaries avoid liability by placing the onus on the user to declare their age and properly read and understand the privacy policy.&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Reservations about the age of consent under the Bill have also been highlighted by some members of the JPC through their dissenting opinions. &lt;/span&gt;&lt;a class="_1lsz7 _3Bkfb" href="http://164.100.47.193/lsscommittee/Joint%20Committee%20on%20the%20Personal%20Data%20Protection%20Bill,%202019/17_Joint_Committee_on_the_Personal_Data_Protection_Bill_2019_1.pdf#page=221" rel="noopener noreferrer" target="_blank"&gt;MP Ritesh Pandey &lt;/a&gt;&lt;span&gt;suggested that the age of consent should be reduced to 14 years keeping the best interest of the children in mind as well as to support children in benefiting from technological advances. Similarly, &lt;/span&gt;&lt;a class="_1lsz7 _3Bkfb" href="http://164.100.47.193/lsscommittee/Joint%20Committee%20on%20the%20Personal%20Data%20Protection%20Bill,%202019/17_Joint_Committee_on_the_Personal_Data_Protection_Bill_2019_1.pdf#page=221" rel="noopener noreferrer" target="_blank"&gt;MP Manish Tiwari &lt;/a&gt;&lt;span&gt;in his dissenting opinion suggested regulating data fiduciaries based on the type of content they provide or data they collect.&lt;/span&gt;&lt;/p&gt;
&lt;h3&gt;&lt;span&gt;2. How is the 2021 Bill Different from the 2019 Bill?&lt;/span&gt;&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;The &lt;/span&gt;&lt;a class="_1lsz7 _3Bkfb" href="http://164.100.47.4/BillsTexts/LSBillTexts/Asintroduced/373_2019_LS_Eng.pdf" rel="noopener noreferrer" target="_blank"&gt;2019 &lt;/a&gt;&lt;span&gt;draft of the Bill consisted of a class of data fiduciaries called guardian data fiduciaries – entities that operate commercial websites or online services directed at children or which process large volumes of children’s personal data. This class of fiduciaries was barred from profiling, tracking, behavioural monitoring, and running targeted advertising directed at children and undertaking any other processing of personal data that can cause significant harm to the child. In the previous draft, such data fiduciaries were not allowed to engage in ‘profiling, tracking, behavioural monitoring of children, or direct targeted advertising at children’. There was also a prohibition on conducting any activities that might significantly harm the child. As per Chapter IV, any violation could attract a penalty of up to INR 15 crore of the worldwide turnover of the data fiduciary for the preceding financial year, whichever is higher. However, this separate class of data fiduciaries do not have any additional responsibilities. It is also unclear as to whether a data fiduciary that does not by definition fall within such a category would be allowed to engage in activities that could cause ‘significant harm’ to children.&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;The new Bill also does not provide any mechanisms for age verification and only lays down considerations that verification processes should be undertaken. Furthermore, the JPC has suggested that consent options available to the child when they attain the age of majority i.e. 18 years should be included within the rule frame by the Data Protection Authority instead of being an amendment in the Bill.&lt;/span&gt;&lt;/p&gt;
&lt;h3&gt;&lt;span&gt;3. In the Absence of a Guardian Data Fiduciary&lt;/span&gt;&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;The 2018 and 2019 drafts of the PDP Bill consider a child to be any person below the age of 18 years. For a child to access online services, the data fiduciary must first verify the age of the child and obtain consent from their guardian. The Bill does not provide an explicit process for age verification apart from stating that regulations shall be drafted in this regard. The 2019 Bill states that the Data Protection Authority shall specify codes of practice in this matter. Taking best practices into account, there is a need for ‘&lt;/span&gt;&lt;a class="_1lsz7 _3Bkfb" href="https://cuts-ccier.org/pdf/project-brief-highlighting-inclusive-and-practical-mechanisms-to-protect-childrens-data.pdf" rel="noopener noreferrer" target="_blank"&gt;user-friendly and privacy-protecting age verification techniques&lt;/a&gt;&lt;span&gt;’ to encourage safe navigation across the internet. This will require &lt;/span&gt;&lt;a class="_1lsz7 _3Bkfb" href="https://cuts-ccier.org/pdf/bp-global-technological-developments-in-age-verification-and-age-estimation.pdf" rel="noopener noreferrer" target="_blank"&gt;looking at &lt;/a&gt;&lt;span&gt;technological developments and different standards worldwide. There is a need to hold companies &lt;/span&gt;&lt;a class="_1lsz7 _3Bkfb" href="https://www.livemint.com/opinion/columns/theres-a-better-way-to-protect-the-online-privacy-of-kids-11615306723478.html" rel="noopener noreferrer" target="_blank"&gt;accountable&lt;/a&gt;&lt;span&gt; for the protection of children’s online privacy and the harm that their algorithms cause children and to make sure that they are not continued.&lt;/span&gt;&lt;/p&gt;
&lt;p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 iWv3d b+iTF _78FBa _1FoOD iWv3d _1j-51 mm8Nw" style="text-align: justify; "&gt;The JPC in the 2021 version of the Bill removed provisions about guardian data fiduciaries, stating that there was no advantage in creating a different class of data fiduciary. As per the JPC, even those data fiduciaries that did not fall within the said classification would also need to comply with rules pertaining to the personal data of children i.e. with Section 16 of the Bill. Section 16 of the Bill requires the data fiduciary to verify the child’s age and obtain consent from the parent/guardian. The manner of age verification has also een spelt out.  Furthermore, since ‘significant data fiduciaries’ is an existing class, there is still a need to comply with rules related to data processing. The JPC also removed the phrase “in the best interests of, the child” and “is in the best interests of, the child” under sub-clause 16(1), implying that the entire Bill concerned the rights of the data principal and the use of such terms dilutes the purpose of the legislation and could give way to manipulation by the data fiduciary.&lt;/p&gt;
&lt;h3&gt;&lt;span&gt;Conclusion&lt;/span&gt;&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Over the past two years, there has been a significant increase in applications that are targeted at children. There has been a proliferation of EduTech apps, which ideally should have more responsibility as they are processing children's data. We recommend that instead of creating a separate category, such fiduciaries collecting children's data or providing services to children be seen as ‘significant data fiduciaries’ that need to take up additional compliance measures.&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Furthermore, any blanket prohibition on tracking children may obstruct safety measures that could be implemented by data fiduciaries. These fears are also increasing in other jurisdictions as there is a likelihood to restrict data fiduciaries from using software that looks out for such as &lt;/span&gt;&lt;a class="_1lsz7 _3Bkfb" href="https://www.unodc.org/e4j/en/cybercrime/module-12/key-issues/online-child-sexual-exploitation-and-abuse.html" rel="noopener noreferrer" target="_blank"&gt;Child Sexual Abuse Material&lt;/a&gt;&lt;span&gt; as well as  online predatory behaviour. Additionally, concerning the age of consent under the Bill, the JPC could look at international best practices and come up with ways to make sure that children can use the internet and have rights over their data, which would enable them to grow up with more awareness about data protection and privacy. One such example to look at could be the Children's Online Privacy Protection Rule (COPPA) in the US, where the rules apply to operators of websites and online services that collect personal information from kids &lt;/span&gt;&lt;a class="_1lsz7 _3Bkfb" href="https://www.ftc.gov/tips-advice/business-center/guidance/childrens-online-privacy-protection-rule-six-step-compliance" rel="noopener noreferrer" target="_blank"&gt;under 13 &lt;/a&gt;&lt;span&gt;or provide services to children that are directed at a general audience, but have actual knowledge that they collect personal information from such children. A form of combination of this system and the significant data fiduciary classification could be one possible way to ensure that children’s data and privacy are preserved online.&lt;/span&gt;&lt;/p&gt;
&lt;hr /&gt;
&lt;p&gt;The authors are researchers at the Centre for Internet and Society and thank their colleague Arindrajit Basu for his inputs.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/ijlt-shweta-mohandas-and-anamika-kundu-march-6-2022-nothing-to-kid-about-childrens-data-under-the-new-data-protection-bill'&gt;https://cis-india.org/internet-governance/blog/ijlt-shweta-mohandas-and-anamika-kundu-march-6-2022-nothing-to-kid-about-childrens-data-under-the-new-data-protection-bill&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Shweta Mohandas and Anamika Kundu</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Digitalisation</dc:subject>
    
    
        <dc:subject>Digital Knowledge</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Data Management</dc:subject>
    

   <dc:date>2022-03-10T13:19:52Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/nha-data-sharing-guidelines">
    <title>NHA Data Sharing Guidelines – Yet Another Policy in the Absence of a Data Protection Act</title>
    <link>https://cis-india.org/internet-governance/blog/nha-data-sharing-guidelines</link>
    <description>
        &lt;b&gt;In July this year, the National Health Authority (NHA) released the NHA Data Sharing Guidelines for the Pradhan Mantri Jan Aarogya Yojana (PM-JAY) just two months after publishing the draft Health Data Management Policy.&lt;/b&gt;
        &lt;p&gt;Reviewed and edited by Anubha Sinha&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;Launched in 2018, PM-JAY is a public health insurance scheme set to cover 10 crore poor and vulnerable families across the country for secondary and tertiary care hospitalisation. Eligible candidates can use the scheme to avail of cashless benefits at any public/private hospital falling under this scheme. Considering the scale and sensitivity of the data, the creation of a well-thought-out data-sharing document is a much-needed step. However, the document – though only a draft – has certain portions that need to be reconsidered, including parts that are not aligned with other healthcare policy documents. In addition, the guidelines should be able to work in tandem with the Personal Data Protection Act whenever it comes into force. With no prior intimation of the publication of the guidelines, and the provision of a mere 10 days for consultation, there was very little scope for stakeholders to submit their comments and participate in the consultation. While the guidelines pertain to the PM-JAY scheme, it is an important document to understand the government’s concerns and stance on the sharing of health data, especially by insurance companies.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Definitions: Ambiguous and incompatible with similar policy documents&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The draft guidelines add to the list of health data–related policies that have been published since the beginning of the pandemic. These include three draft health data management policies published within two years, which have already covered the sharing and management of health data. The draft guidelines repeat the pattern of earlier policies on health data, wherein there is no reference to the policies that predated it; in this case, the guidelines fail to refer to the draft National Digital Health Data Management Policy (published in April 2022). To add to this, the document – by placing the definitions at the end – is difficult to read and understand, especially when terms such as ‘beneficiary’, ‘data principal’, and ‘individual’ are used interchangeably. In the same vein, the document uses the terms ‘data principal’ and ‘data fiduciary’, and the definitions of health data and personal data, from the 2019 PDP Bill, while also referring to the IT Act SDPI Rules and its definition of ‘sensitive personal data’. While the guidelines state that the IT Act and Rules will be the legislation to refer to for these guidelines, it is to be noted that the IT Act under the SPDI Rules covers ‘body corporates’, which under Section 43A(1), is defined as “any company and includes a firm, sole proprietorship or other association of individuals engaged in commercial or professional activities;”. It is difficult to add responsibility and accountability to the organisations under the guidelines when they might not even be covered under this definition.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;With each new policy, civil society organisations have been pointing out the need to have a data protection act before introducing policies and guidelines that deal with the processing and sharing of the data of individuals. Ideally, these policies – even in draft form – should have been published after the Personal Data Protection Bill was enacted, to ensure consistency with the provisions of the law. For example, the guidelines introduce a new category of governance mechanisms under the data-sharing committee headed by a data-sharing officer (DSO). The responsibilities and powers of the DSO are similar to that of the data protection officer under the draft PDP Bill as well as the National Data Health Management Policy (NHDMP). This, in turn, raises the question of whether the DSO and the DPOs under both the PDP Bill and the draft NDMP will have the same responsibilities. Clarity in terms of which of the policies are in force and how they intersect is needed to ensure a smooth implementation. Ideally, having multiple sources of definitions should be addressed at the drafting stage itself.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Guiding Principles: Need to look beyond privacy&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The guidelines enumerate certain principles to govern the use, collection, processing, and transmission of the personal or sensitive personal data of beneficiaries. These principles are accountability, privacy by design, choice and consent, openness/transparency, etc. While these provisions are much needed, their explanation at times misses the mark of why these principles were added. For example, in the case of accountability, the guidelines state that the ‘data fiduciary’ shall be accountable for complying with measures based on the guiding principles However, it does not specify who the fiduciaries would be accountable to and what the steps are to ensure accountability. Similarly, in the case of openness and transparency, the guidelines state that the policies and practices relating to the management of personal data will be available to all stakeholders. However, openness and transparency need to go beyond policies and practices and should consider other aspects of openness, including open data and the use of open-source software and open standards. This again will add to transparency, in that it would specify the rights of the data principal, as the current draft looks at the rights of the data principal merely from a privacy perspective. In the case of purpose limitation as well, the guidelines are tied to the privacy notice, which again puts the burden on the individual (in this case, beneficiary) when the onus should actually be on the data fiduciary. Lastly, under the empowerment of beneficiaries, the guidelines state that the “data principal shall be able to seek correction, amendments, or deletion of such data where it is inaccurate;”. The right to deletion should not be conditional on inaccuracy, especially when entering the scheme is optional and consent-based.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Data sharing with third parties without adequate safeguards&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The guidelines outline certain cases where personal data can be collected, used, or disclosed without the consent of the individual. One of these cases is when the data is anonymised. However, the guidelines do not detail how this anonymisation would be achieved and ensured through the life cycle of the data, especially when the clause states that the data will also be collected without consent. The guidelines also state that the anonymised data could be used for public health management, clinical research, or academic research. The guidelines should have limited the scope of academic research or added certain criteria to gain access to the data; the use of vague terminology could lead to this data (sometimes collected without consent) being de-anonymised or used for studies that could cause harm to the data principal or even a particular community. The guidelines state that the data can be shared as ‘protected health information’ with a government agency for oversight activities authorised by law, epidemic control, or in response to court orders. With the sharing of data, care should be taken to ensure data minimisation and purpose limitations that go beyond the explanations added in the body of the guidelines. In addition, the guidelines also introduce the concept of a ‘clean room’, which is defined as “a secure sandboxed area with access controls, where aggregated and anonymised or de-identified data may be shared for the purposes of developing inference or training models”. The definition does not state who will be developing these training models; it could be a cause of worry if AI companies or even insurance companies have the potential to use this data to train models that could eventually make decisions based on the results. 
The term ‘sandbox’ is explained under the now revoked DP Bill 2021 as “such live testing of new products or services in a controlled or test regulatory environment for which the Authority may or may not permit certain regulatory relaxations for a&lt;br /&gt;specified period for the limited purpose of the testing”. Neither the 2019 Bill nor the IT Act/Rules defines ‘sandbox’; the guidelines should have ideally spent more time explaining how the sandbox system in the ‘Clean Room’ works.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Conclusion&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The draft Data Sharing Guidelines are a welcome step in ensuring that the entities sharing and processing data have guidelines to adhere to, especially since the Data Protection Bill has not been passed yet. The mention of the best practices for data sharing in annexures, including practices for people who have access to the data, is a step in the right direction, which could be made better with regular training and sensitisation. While the guidelines are a good starting point, they still suffer from the issues that have been highlighted in similar health data policies, including not referring to older policies, adding new entities, and the reliance on digital and mobile technology. The guidelines could have added more nuance to the consent and privacy by design sections to ensure other forms of notice, e.g., notice in audio form in different Indian languages. While PM-JAY aims to reach 10 crore poor and vulnerable families, there is a need to look at how to ensure that consent is given according to the guidelines that are “free, informed, clear, and specific”.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/nha-data-sharing-guidelines'&gt;https://cis-india.org/internet-governance/blog/nha-data-sharing-guidelines&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Shweta Mohandas and Pallavi Bedi</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>IT Act</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2022-09-29T15:17:24Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/medianama-namaprivacy-the-future-of-user-data-delhi-sep-6">
    <title>MediaNama - #NAMAprivacy: The Future of User Data (Delhi, Sep 6)</title>
    <link>https://cis-india.org/internet-governance/news/medianama-namaprivacy-the-future-of-user-data-delhi-sep-6</link>
    <description>
        &lt;b&gt;MediaNama is hosting a full day conference on "the future of user data in India", on the 6th of September 2017, which is particularly significant given the recent Supreme Court ruling on the fundamental right to privacy, and two government consultations: one at the TRAI, and another at MEITY. This discussion is supported by Facebook, Google, and Microsoft. Sumandro Chattapadhyay, Research Director, will participate as a speaker in the session titled "regulating storage, sharing and transfer of data."&lt;/b&gt;
        
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;h4&gt;Details&lt;/h4&gt;
&lt;p&gt;Time: September 6th 2017, 9 am to 4:30 pm&lt;/p&gt;
&lt;p&gt;Venue: Gulmohar Hall, India Habitat Centre, Lodhi Road (please enter from Gate #3)&lt;/p&gt;
&lt;p&gt;Agenda: &lt;a href="https://www.medianama.com/2017/08/223-agenda-namaprivacy-future-of-user-data/"&gt;https://www.medianama.com/2017/08/223-agenda-namaprivacy-future-of-user-data/&lt;/a&gt;&lt;/p&gt;
&lt;h4&gt;Announced Speakers&lt;/h4&gt;
&lt;ul&gt;&lt;li&gt;Chinmayi Arun, Centre for Communication Governance at NLU Delhi&lt;/li&gt;
&lt;li&gt;Malavika Raghavan, IFMR Finance Foundation&lt;/li&gt;
&lt;li&gt;Renuka Sane, NIPFP&lt;/li&gt;
&lt;li&gt;Smitha Krishna Prasad, Centre for Communication Governance at NLU Delhi&lt;/li&gt;
&lt;li&gt;Ananth Padmanabhan, Carnegie India&lt;/li&gt;
&lt;li&gt;Avinash Ramachandra, Amazon&lt;/li&gt;
&lt;li&gt;Hitesh Oberoi, Naukri&lt;/li&gt;
&lt;li&gt;Jochai Ben-Avie, Mozilla&lt;/li&gt;
&lt;li&gt;Mrinal Sinha, Mobikwik&lt;/li&gt;
&lt;li&gt;Murari Sreedharan, Bankbazaar&lt;/li&gt;
&lt;li&gt;Sumandro Chattapadhyay, Centre for Internet and Society&lt;/li&gt;&lt;/ul&gt;
&lt;h4&gt;Facilitators&lt;/h4&gt;
&lt;ul&gt;&lt;li&gt;Saikat Datta, Asia Times Online&lt;/li&gt;
&lt;li&gt;Shashidar KJ, MediaNama&lt;/li&gt;
&lt;li&gt;Nikhil Pahwa, MediaNama&lt;/li&gt;&lt;/ul&gt;
&lt;h4&gt;Attendees&lt;/h4&gt;
&lt;p&gt;We have confirmed 140+ attendees from: Adobe, Amber Health, Amazon, APCO Worldwide, Bank Bazaar, Bloomberg-Quint, Blume Ventures, Broadband India Forum, Business Standard, BuzzFeed News, CCOAI, CEIP, Change Alliance, Chase India, CIS, CNN News18, DEF, Deloitte, DNA, DSCI, E2E Networks, British High Commission, Eurus Network Services, FICCI, Firefly Networks, Flipkart, Forrester Research, Fortumo, DoT, MEITY, IAMAI, IBM, ICRIER, IFMR Finance Foundation, IIMC, Indian Law Institute, Indic Project, Info Edge, ISPAI, IT for Change, ITU-APT, Jamia Millia Islamia, Jindal Global Law School, Mimir Technologies, Mozilla, Newslaundry, NIPFP, Nishith Desai Associates, NIXI, NLU-Delhi, ORF, Paytm, PLR Chambers, PRS Legislative Research, Publicis Groupe, Quartz India, Reliance Jio, Reuters, Saikrishna &amp;amp; Associates, Scroll.in, SFLC.in, Spectranet, The Economic Times, The Indian Express, The Times of India, The Wire, Times Internet, Twitter, and more.&lt;/p&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/medianama-namaprivacy-the-future-of-user-data-delhi-sep-6'&gt;https://cis-india.org/internet-governance/news/medianama-namaprivacy-the-future-of-user-data-delhi-sep-6&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>sumandro</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Big Data</dc:subject>
    
    
        <dc:subject>Digital Economy</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Data Governance</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Digital Rights</dc:subject>
    

   <dc:date>2017-09-05T10:22:12Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/marco-civil-da-internet">
    <title>Marco Civil da Internet: Brazil’s ‘Internet Constitution’</title>
    <link>https://cis-india.org/internet-governance/blog/marco-civil-da-internet</link>
    <description>
        &lt;b&gt;On March 25, 2014, Brazil's lower house of parliament passed bill no. 2126/2011, popularly known as Marco Civil da Internet. The Marco Civil is a charter of Internet user-rights and service provider responsibilities, committed to freedom of speech and expression, privacy, and accessibility and openness of the Internet. In this post, the author looks at the pros and cons of the bill.&lt;/b&gt;
        &lt;h3&gt;&lt;em&gt;&lt;strong&gt;Introduction:&lt;/strong&gt;&lt;/em&gt;&lt;/h3&gt;
&lt;div style="text-align: justify; "&gt;
&lt;div&gt;
&lt;div style="text-align: justify; "&gt;Ten months ago, Edward Snowden’s revelations of the U.S. National Security Agency’s extensive, warrantless spying dawned on us. Citizens and presidents alike expressed their outrage at this sweeping violation of their privacy. While India’s position remained carefully neutral, or indeed, supportive of NSA’s surveillance, Germany, France and Brazil cut the U.S. no slack. Indeed, at the 68th session of the United Nations General Assembly, Brazilian President Dilma Rousseff (whose office the NSA had placed under surveillance) stated, “&lt;em&gt;Tampering in such a manner in the affairs of other countries is a breach of International Law and is an affront to the principles that must guide the relations among them, especially among friendly nations.&lt;/em&gt;” Brazil, she said, would “&lt;em&gt;redouble its efforts to adopt legislation, technologies and mechanisms to protect us from the illegal interception of communications and data.&lt;/em&gt;”&lt;/div&gt;
&lt;div style="text-align: justify; "&gt;&lt;/div&gt;
&lt;div style="text-align: justify; "&gt;&lt;/div&gt;
&lt;div&gt;Some may say that Brazil has lived up to its word. Later this month, Brazil will host &lt;em&gt;NETmundial&lt;/em&gt;, the Global Multi-stakeholder Meeting on the Future of Internet Governance, jointly organized by the Brazilian Internet Steering Committee (CGI.br) and the organization /1Net. The elephantine invisible presence of Snowden vests NETmundial with the hope and responsibility of laying the ground for a truly multi-stakeholder model for governing various aspects of the Internet; a model where governments are an integral part, but not the only decision-makers. The global Internet community, comprising users, corporations, governments, the technical community, and NGOs and think-tanks, is hoping to devise a workable method to divest the U.S. Government of its &lt;em&gt;de facto&lt;/em&gt; control over the Internet, which it wields through its contracts to manage the domain name system and the root zone.&lt;/div&gt;
&lt;div&gt;&lt;/div&gt;
&lt;div&gt;&lt;span&gt; &lt;/span&gt;&lt;/div&gt;
&lt;div&gt;But as Internet governance expert Dr. Jeremy Malcolm put it, these technical aspects do not make or break the Internet. The real questions in Internet governance concern the rights of users, corporations and netizens worldwide. Sir Tim Berners-Lee, when he &lt;a class="external-link" href="http://www.theguardian.com/technology/2014/mar/12/online-magna-carta-berners-lee-web"&gt;called for&lt;/a&gt; an Internet Bill of Rights, meant much the same. For Sir Tim, an open, neutral Internet is imperative if we are to keep our governments open, and foster “&lt;em&gt;good democracy, healthcare, connected communities and diversity of culture&lt;/em&gt;”. Some countries agree. The Philippines envisaged a &lt;em&gt;Magna Carta&lt;/em&gt; for Internet Freedom, though the Bill remains pending in the Philippine parliament.&lt;/div&gt;
&lt;div&gt;&lt;/div&gt;
&lt;div&gt;&lt;span&gt; &lt;/span&gt;&lt;/div&gt;
&lt;h3&gt;&lt;strong&gt;&lt;em&gt;Marco Civil da Internet:&lt;/em&gt;&lt;/strong&gt;&lt;/h3&gt;
&lt;div&gt;Last week, on March 25, 2014, the Brazilian Chamber of Deputies (the lower house of parliament) passed the &lt;em&gt;Marco Civil da Internet&lt;/em&gt;, bill 2126/2011, a charter of Internet rights. The &lt;em&gt;Marco Civil&lt;/em&gt; is considered by the global Internet community as a one-of-a-kind bill, with Sir Tim Berners-Lee &lt;a class="external-link" href="http://www.webfoundation.org/2014/03/marco-civil-statement-of-support-from-sir-tim-berners-lee/?utm_source=hootsuite&amp;amp;utm_campaign=hootsuite"&gt;hailing&lt;/a&gt; how the “&lt;em&gt;groundbreaking, inclusive and participatory process has resulted in a policy that balances the rights and responsibilities of the individuals, governments and corporations who use the Internet&lt;/em&gt;”.&lt;/div&gt;
&lt;div&gt;&lt;/div&gt;
&lt;div&gt;&lt;/div&gt;
&lt;div&gt;The &lt;em&gt;Marco Civil&lt;/em&gt;’s journey began with a two-stage public consultation process in October 2009, under the aegis of the Brazilian Ministry of Justice’s Department of Legislative Affairs, jointly with the Getulio Vargas Foundation’s Center for Technology and Society of the Law School of Rio de Janeiro (CTS-FGV). The collaborative process &lt;a class="external-link" href="http://observatoriodainternet.br/wp-content/uploads/2012/11/Internet-Policy-Report-Brazil-2011.pdf"&gt;involved&lt;/a&gt; a 45-day consultation process in which over 800 comments were received, following which a second consultation in May 2010 received over 1200 comments from individuals, civil society organizations and corporations involved in the telecom and technology industries. Based on comments, the initial draft of the bill was revamped to include issues of popular, public importance, such as intermediary liability and online freedom of speech.&lt;/div&gt;
&lt;div&gt;&lt;/div&gt;
&lt;div&gt;&lt;/div&gt;
&lt;div&gt;An official English translation of the &lt;em&gt;Marco Civil&lt;/em&gt; is as yet unavailable. But an &lt;a class="external-link" href="https://docs.google.com/document/d/1kJYQx-l_BVa9-3FZX23Vk9IfibH9x6E9uQfFT4e4V9I/pub"&gt;unofficial translation&lt;/a&gt; (please note that the file is uploaded on Google Drive), triangulated against &lt;a class="external-link" href="http://infojustice.org/archives/32527"&gt;online&lt;/a&gt; &lt;a class="external-link" href="http://www.zdnet.com/brazil-passes-groundbreaking-internet-governance-bill-7000027740/"&gt;commentary&lt;/a&gt; on &lt;a class="external-link" href="http://www.zdnet.com/all-you-need-to-know-about-brazils-internet-constitution-7000022726/"&gt;the bill&lt;/a&gt;, reveals that the following issues were of primary importance:&lt;/div&gt;
&lt;div&gt;&lt;/div&gt;
&lt;div&gt;&lt;/div&gt;
&lt;h3&gt;&lt;strong&gt;&lt;em&gt;The fundamentals:&lt;/em&gt;&lt;/strong&gt;&lt;/h3&gt;
&lt;div&gt;The fundamental principles of the &lt;em&gt;Marco Civil&lt;/em&gt; reveal a commitment to openness, accessibility, neutrality and democratic collaboration on the Internet. Art. 2 (see unofficial translation) sets out the fundamental principles that form the basis of the law. It pledges to adhere to freedom of speech and expression, along with an acknowledgement of the global scale of the network, its openness and collaborative nature, its plurality and diversity. It aims to foster free enterprise and competition on the Internet, while ensuring consumer protection and upholding human rights, personality development and the exercise of citizenship in digital media, in line with the network’s social purposes. Not only this, but Art. 4 of the bill pledges to promote universal access to the Internet, as well as “&lt;em&gt;to information, knowledge and participation in cultural life and public affairs&lt;/em&gt;”. It aims to promote innovation and open technology standards, while ensuring interoperability.&lt;/div&gt;
&lt;div&gt;&lt;/div&gt;
&lt;div&gt;&lt;span&gt; &lt;/span&gt;&lt;/div&gt;
&lt;div&gt;The &lt;em&gt;Marco Civil&lt;/em&gt; expands on its commitment to human rights and accessibility by laying down a “&lt;em&gt;discipline of Internet use in Brazil&lt;/em&gt;”. Art. 3 of the bill guarantees freedom of expression, communication and expression of thoughts, under the terms of the Federal Constitution of Brazil, while at the same time guaranteeing privacy and protection of personal data, and preserving network neutrality. It also focuses on preserving network stability and security, by emphasizing accountability and adopting “&lt;em&gt;technical measures consistent with international standards and by encouraging the implementation of best practices&lt;/em&gt;”.&lt;/div&gt;
&lt;div&gt;&lt;/div&gt;
&lt;div&gt;&lt;/div&gt;
&lt;div&gt;These principles, however, are buttressed by rights assured to Internet users and responsibilities of and exceptions provided to service providers.&lt;/div&gt;
&lt;div&gt;&lt;/div&gt;
&lt;div&gt;&lt;/div&gt;
&lt;/div&gt;
&lt;h3&gt;&lt;strong&gt;&lt;em&gt;Rights and responsibilities of users and service providers:&lt;/em&gt;&lt;/strong&gt;&lt;/h3&gt;
&lt;div&gt;&lt;strong&gt;&lt;span style="text-decoration: underline;"&gt;Net neutrality:&lt;/span&gt;&lt;/strong&gt;&lt;/div&gt;
&lt;div&gt;Brazil becomes one of the few countries in the world (joining the likes of the Netherlands, Chile and, in part, Israel) to preserve network neutrality by legislation. Art. 9 of the &lt;em&gt;Marco Civil&lt;/em&gt; requires all Internet providers “&lt;em&gt;to treat any data package with isonomy, regardless of content, origin and destination, service, terminal or application&lt;/em&gt;”. Not only this, but Internet providers are enjoined from blocking, monitoring or filtering content during any stage of transmission or routing of data. Deep packet inspection is also forbidden. Exceptions may be made to discriminate among network traffic &lt;em&gt;only&lt;/em&gt; on the basis of essential technical requirements for services-provision, and for emergency services prioritization. Even this requires the Internet provider to inform users in advance of such traffic discrimination, and to act proportionately, transparently and with equal protection.&lt;/div&gt;
&lt;div&gt;&lt;/div&gt;
&lt;div&gt;&lt;/div&gt;
&lt;div&gt;&lt;strong&gt;&lt;span style="text-decoration: underline;"&gt;Data retention, privacy and data protection:&lt;/span&gt;&lt;/strong&gt;&lt;/div&gt;
&lt;div&gt;The &lt;em&gt;Marco Civil&lt;/em&gt; includes provisions for the retention of personal data and communications by service providers, and access to the same by law enforcement authorities. However, the recording and retention of, and access to, Internet connection records and application access-logs, as well as any personal data and communication, are required to meet the standards for “&lt;em&gt;the conservation of intimacy, private life, honor and image of the parties directly or indirectly involved&lt;/em&gt;” (Art. 10). Specifically, access to identifying information and the contents of personal communication may be obtained &lt;em&gt;only&lt;/em&gt; upon judicial authorization.&lt;/div&gt;
&lt;div&gt;&lt;/div&gt;
&lt;div&gt;&lt;/div&gt;
&lt;div&gt;Moreover, where data is collected within Brazilian territory, processes of collection, storage, custody and treatment of the abovementioned data are required to comply with Brazilian laws, especially the right to privacy and confidentiality of personal data and private communications and records (Art. 11). Interestingly, this compliance requirement is applicable also to entities incorporated in foreign jurisdictions, which offer services to Brazilians, or where a subsidiary or associate entity of the corporation in question has establishments in Brazil. While this is undoubtedly a laudable protection for Brazilians or service providers located in Brazil, it is possible that conflicts may arise (&lt;a class="external-link" href="http://www.economist.com/news/americas/21599781-brazils-magna-carta-web-net-closes?frsc=dg%7Ca&amp;amp;fsrc=scn/tw_app_ipad"&gt;with penal consequences&lt;/a&gt;) between standards and terms of data retention and access by authorities in other jurisdictions. In the predictable absence of harmonization of such laws, perhaps rules of conflicts of law may prove helpful.&lt;/div&gt;
&lt;div&gt;&lt;/div&gt;
&lt;div&gt;&lt;/div&gt;
&lt;div&gt;While data retention remained a point of contention (Brazil initially sought to ensure a 5-year data retention period), under the &lt;em&gt;Marco Civil&lt;/em&gt;&lt;span&gt;, Internet providers are required to retain connection records for 1 year under rules of strict confidentiality; this responsibility cannot be delegated to third parties (Art. 13). Providers of the Internet connection itself (such as Reliance or Airtel in India) are forbidden from retaining records of access to applications on the Internet (Art. 14). While law enforcement authorities may request a longer retention period, a court order (filed for by the authority within 60 days from the date of such request) is required to access the records themselves. In the event the authority fails to file for such a court order within the stipulated period, or if the court order is denied, the service provider must protect the confidentiality of the connection records.&lt;/span&gt;&lt;/div&gt;
&lt;div&gt;&lt;span&gt;&lt;br /&gt;&lt;/span&gt;&lt;/div&gt;
&lt;div&gt;&lt;span&gt; &lt;/span&gt;&lt;/div&gt;
&lt;div&gt;Though initially excluded from the &lt;em&gt;Marco Civil&lt;/em&gt;, the current draft passed by the Chamber of Deputies requires Internet application providers (such as Google or Facebook) to retain access-logs for their applications for 6 months (Art. 15). Logs for other applications may not be retained without previous consent of the owner, and in any case, the provider cannot retain personal data that is in excess of the purpose for which consent was given by the owner. As for connection records, law enforcement authorities may request a greater retention period, but require a court order to access the data itself.&lt;/div&gt;
&lt;div&gt;&lt;/div&gt;
&lt;div&gt;&lt;/div&gt;
&lt;div&gt;These requirements must be understood in light of the rights that the &lt;em&gt;Marco Civil&lt;/em&gt; guarantees to users. Art. 7, which enumerates these user-rights, does not however set forth their &lt;em&gt;content&lt;/em&gt;; this is probably left to judicial interpretation of rights enshrined in the Federal Constitution. In any event, Art. 7 guarantees to all Internet users the “&lt;em&gt;inviolability of intimacy and privacy&lt;/em&gt;”, including the confidentiality of all Internet communications, along with “&lt;em&gt;compensation for material or moral damages resulting from violation&lt;/em&gt;”. In this regard, it assures that users are entitled to a guarantee that no personal data or communication shall be shared with third parties in the absence of express consent, and to “&lt;em&gt;clear and complete information on the collection, use, storage, treatment and protection of their personal data&lt;/em&gt;”. Indeed, where contracts violate the requirements of inviolability and secrecy of private communications, or where a dispute resolution clause does not permit the user to approach Brazilian courts as an alternative, Art. 8 renders such contracts null and void.&lt;/div&gt;
&lt;div&gt;&lt;/div&gt;
&lt;div&gt;&lt;/div&gt;
&lt;div&gt;Most importantly, Art. 7 states that users are entitled to clear and complete information about how connection records and access logs shall be stored and protected, and to publicity of terms/policies of use of service providers. Additionally, Art. 7 emphasizes quality of service and accessibility to the Internet, and forbids suspension of Internet connections except for failure of payments. Read comprehensively, therefore, Arts. 7-15 of the &lt;em&gt;Marco Civil prima facie&lt;/em&gt; set down robust protections for private and personal data and communications.&lt;/div&gt;
&lt;div&gt;&lt;/div&gt;
&lt;div&gt;&lt;/div&gt;
&lt;div&gt;An initial draft of the &lt;em&gt;Marco Civil&lt;/em&gt; &lt;a class="external-link" href="http://www.zdnet.com/companies-brace-for-brazil-local-data-storage-requirements-7000027092/"&gt;sought to mandate&lt;/a&gt; local storage of all Brazilians’ data within Brazilian territory. This came in response to Snowden’s revelations of NSA surveillance, and President Rousseff, in her &lt;a class="external-link" href="http://gadebate.un.org/sites/default/files/gastatements/68/BR_en.pdf"&gt;statement&lt;/a&gt; to the United Nations, declared that Brazil sought to protect itself from “&lt;em&gt;illegal interception of communications and data&lt;/em&gt;”. However, the implication of this local storage requirement was the creation of a &lt;a class="external-link" href="http://bigstory.ap.org/article/brazil-looks-break-us-centric-internet"&gt;geographically isolated&lt;/a&gt; Brazilian Internet, with repercussions for the very openness and interoperability that the &lt;em&gt;Marco Civil&lt;/em&gt; itself sought to protect. Moreover, there are &lt;a class="external-link" href="http://www.gp-digital.org/gpd-update/data-retention-provisions-in-the-marco-civil/"&gt;implications&lt;/a&gt; for efficiency and business; for instance, small businesses may be unable to source the money or capacity to comply with local storage requirements. Such requirements also fix storage locations on political grounds, rather than on considerations of effective storage. Amid widespread protest from corporations and civil society, this requirement was &lt;a class="external-link" href="http://www.zdnet.com/brazil-gives-up-on-local-data-storage-demands-net-neutrality-7000027493/"&gt;withdrawn&lt;/a&gt;, which, some say, propelled the quick passage of the bill in the Chamber of Deputies.&lt;/div&gt;
&lt;div&gt;&lt;/div&gt;
&lt;div&gt;&lt;/div&gt;
&lt;/div&gt;
&lt;div style="text-align: justify; "&gt;
&lt;div&gt;&lt;strong&gt;&lt;span style="text-decoration: underline;"&gt;Intermediary liability:&lt;/span&gt;&lt;/strong&gt;&lt;/div&gt;
&lt;div&gt;Laws of many countries make service providers liable for third party content that infringes copyright or that is otherwise against the law (such as pornography or other offensive content). For instance, Section 79 of the Indian Information Technology Act, 2000 (as amended in 2008) is such a provision where intermediaries (i.e., those who host user-generated content, but do not create the content themselves) may be held liable. However, stringent intermediary liability regimes create the possibility of private censorship, where intermediaries resort to blocking or filtering user-generated content that they fear may violate laws, sometimes even without intimating the creator of the infringing content. The &lt;em&gt;Marco Civil&lt;/em&gt; addresses this possibility of censorship by creating a restricted intermediary liability provision. Please note, however, that the bill expressly excludes from its ambit copyright violations, which a &lt;a class="external-link" href="http://infojustice.org/archives/31993"&gt;copyright reforms bill&lt;/a&gt; seeks to address.&lt;/div&gt;
&lt;div&gt;&lt;/div&gt;
&lt;div&gt;&lt;/div&gt;
&lt;div&gt;At first instance, the &lt;em&gt;Marco Civil&lt;/em&gt; exempts service providers from civil liability for third party content (Art. 18). Moreover, intermediaries are liable for damages arising out of third party content &lt;em&gt;only&lt;/em&gt; where such intermediaries do not comply with court orders (which may require removal of content, etc.) (Art. 19). This leaves questions of infringement and censorship to the judiciary, which the author believes is the right forum to adjudicate such issues. Moreover, wherever identifying information is available, Art. 20 mandates the intermediary to apprise the creator of the infringing content of the reasons for the removal of his/her content, with information that enables the creator to defend him- or herself in court. This measure of transparency is particularly laudable; for instance, in India, no such intimation is required by law, and you or I as journalists, bloggers or other creators of content may never know why our content is taken down, or be equipped to defend ourselves in court against the plaintiff or petitioner who sought removal of our content. Finally, a due diligence requirement is placed on the intermediary in circumstances where third party content discloses, “&lt;em&gt;without consent of its participants, of photos, videos or other materials containing nudity or sexual acts of private character&lt;/em&gt;”. As per Art. 21, where the intermediary does not take down such content upon being intimated by the concerned participant, it may be held secondarily liable for infringement of privacy.&lt;/div&gt;
&lt;div&gt;&lt;/div&gt;
&lt;div&gt;&lt;/div&gt;
&lt;div&gt;This restricted intermediary liability regime is further strengthened by a requirement of specific identification of infringing content, which both the court order issued under Art. 20 and the take-down request under Art. 21 must fulfill. This requirement is missing, for instance, under Section 79 of the Indian Information Technology Act, which creates a diligence and liability regime without requiring identifiability of infringing content.&lt;/div&gt;
&lt;div&gt;&lt;/div&gt;
&lt;div&gt;&lt;/div&gt;
&lt;h3&gt;&lt;strong&gt;&lt;em&gt;Conclusion:&lt;/em&gt;&lt;/strong&gt;&lt;/h3&gt;
&lt;div&gt;Brazil’s ‘Internet Constitution’ has done much to add to the ongoing discussion on the rights and responsibilities of users and providers. By expressly adopting protections for net neutrality, online privacy and freedom of expression, the Marco Civil sets itself up as a model for Internet rights at the municipal level, short of a Utopian bill of rights. Indeed, in an effusive statement of support for the bill, Sir Tim Berners-Lee stated: “&lt;em&gt;If Marco Civil is passed, without further delay or amendment, this would be the best possible birthday gift for Brazilian and global Web users.&lt;/em&gt;”&lt;/div&gt;
&lt;div&gt;&lt;/div&gt;
&lt;div&gt;&lt;/div&gt;
&lt;div&gt;Of course, the &lt;em&gt;Marco Civil&lt;/em&gt; is not without its failings. Authors &lt;a class="external-link" href="http://infojustice.org/archives/32527"&gt;say&lt;/a&gt; that the data retention requirements placed on connection and application providers, with leeway provided for law enforcement authorities to lengthen retention periods, are problematic. Moreover, the discussions surrounding data localization and a ‘walled-off’ Internet that protects against surveillance ignore the interoperability and openness that form the core of the Internet.&lt;/div&gt;
&lt;div&gt;&lt;/div&gt;
&lt;div&gt;&lt;/div&gt;
&lt;div&gt;On the whole, though, the &lt;em&gt;Marco Civil&lt;/em&gt; may be considered a victory on many counts. It is possibly the first successful example of a national legislation that is the outcome of a broad, consultative process with civil society and other affected entities. It expressly affirms Brazil’s commitment to the protection of privacy and freedom of expression, as well as to Internet accessibility and the openness of the network. It aims to eliminate the possibility of private censorship online, while upholding the privacy rights of users. It seeks to reduce the potential for abuse of personal data and communication by government authorities, by requiring judicial authorization for the same. In a world where warrantless government spying extends across national borders, such a provision is novel and desirable. One hopes that, when the global Internet community sits down at its various fora to identify and enumerate principles for Internet governance, it will look to the &lt;em&gt;Marco Civil&lt;/em&gt; as an example of standards that governments may adhere to, and not necessarily resort to the lowest common denominator of international rights and protections.&lt;/div&gt;
&lt;/div&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/marco-civil-da-internet'&gt;https://cis-india.org/internet-governance/blog/marco-civil-da-internet&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>geetha</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Freedom of Speech and Expression</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Net Neutrality</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    

   <dc:date>2014-06-19T10:38:10Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/it-act-and-commerce">
    <title>IT Act and Commerce</title>
    <link>https://cis-india.org/internet-governance/blog/it-act-and-commerce</link>
    <description>
        &lt;b&gt;This is a guest post by Rahul Matthan, partner in the law firm Trilegal, and widely regarded as one of the leading experts on information technology law in India.  In this post, Mr. Matthan looks at the provisions in the amended Information Technology Act of interest to commerce, namely electronic signatures and data protection.&lt;/b&gt;
        
&lt;p&gt;This post analyses the amendments brought about to the Information Technology Act, 2000 (“IT Act 2000”) through the recent 2008 amendments (“IT Act 2008”).&lt;/p&gt;
&lt;h2&gt;Definitions&lt;/h2&gt;
&lt;p&gt;The IT Act 2008 has introduced a few additional definitions to the list of definitions originally included in the IT Act 2000. These definitions have either amplified the existing provisions or been introduced in order to address new issues required to be defined in the context of the newly introduced provisions in the statute. Some of the significant definitions have been discussed below:&lt;/p&gt;
&lt;h3&gt;Computer Network&lt;/h3&gt;
&lt;p&gt;The definition of “computer network” has been amended to specifically include the wireless interconnection of computers. While wireless technology did fall within the scope of the IT Act under the rather generic head of “other communication media”, the Amendment Act clarifies the scope of the IT Act by expressly including the term “wireless”.&lt;/p&gt;
&lt;h3&gt;Communication Devices&lt;/h3&gt;
&lt;p&gt;The IT Amendment Bill, 2006, had provided an explanation for “communication devices” under Section 66A. This definition has been moved into the definition section and now applies across all sections of the IT Act 2008. “Communication devices” is defined to mean “a cell phone, personal digital assistance (PDA) device or combination of both or any device used to communicate, send or transmit any text, video, audio or image”.&lt;/p&gt;
&lt;p&gt;There has been case law, even under the IT Act 2000, holding that mobile phones fall within the ambit of the Act, as a result of which all the provisions of the Act that apply to computers are equally applicable to mobile phones. This amendment only makes that position more explicit.&lt;/p&gt;
&lt;h2&gt;Electronic Signatures&lt;br /&gt;&lt;/h2&gt;
&lt;p&gt;One of the major criticisms of the IT Act 2000 was the fact that it was not a technology neutral legislation. This was specifically so in relation to the provisions in the IT Act 2000 relating to the use of digital signatures for the purpose of authentication of electronic records. The statute made specific reference to the use of asymmetric cryptosystem technologies in the context of digital signatures, and, in effect, any authentication method that did not use this technology was not recognised under the IT Act 2000.&lt;/p&gt;
&lt;p&gt;The IT Act 2008 has attempted to make this more technology neutral. In doing so, the attempt has been to bring the law in line with the United Nations Commission on International Trade Law Model Law on Electronic Signatures (“Model Law”).&lt;/p&gt;
&lt;h3&gt;Replacement of Digital Signatures&lt;/h3&gt;
&lt;p&gt;The first significant change in the IT Act 2008 is the replacement of the term “digital signatures” with “electronic signatures” in almost all the provisions in the IT Act 2000. In some provisions, reference continues to be made to digital signatures, but the net effect of the amendments is to treat digital signatures as a subset (or an example of one type) of electronic signatures.&lt;/p&gt;
&lt;p&gt;Electronic signatures have been defined as the authentication of an electronic record using the authentication techniques specified in the 2nd Schedule to the Act, provided they are reliable. &amp;nbsp;&lt;/p&gt;
&lt;p&gt;The reliability criterion has been introduced, very much along the lines of the Model Law. However, the contents of the 2nd Schedule are yet to be stipulated, which means that despite the existence of a reliability standard, the only authentication method available at this point in time is the digital signature regime.&lt;/p&gt;
&lt;h3&gt;Dual Requirement&lt;/h3&gt;
&lt;p&gt;One significant implication of this amendment is the introduction of a dual requirement – to meet the reliability standard as well as to be included in the 2nd Schedule. However, structuring the authentication procedures in this manner undermines the objective tests of neutrality borrowed from the Model Law, since an authentication method may meet the reliability test but will not be deemed legally enforceable unless it is notified in the 2nd Schedule.&lt;/p&gt;
&lt;p&gt;Additionally, there will be grounds for challenging electronic signatures that are notified to the 2nd Schedule, if it can be shown that the signature so notified is not reliable under the terms of the reliability criteria. This can act as an impediment to the recognition of electronic signatures by notification.&lt;/p&gt;
&lt;h3&gt;Emphasis on Digital Signatures&lt;/h3&gt;
&lt;p&gt;Another concern is the treatment of digital signatures in the post amendment statute. The IT Act 2008 continues to retain all the provisions relating to digital signatures within the main body of the statute. The term “digital signature” has not been uniformly substituted with “electronic signature” throughout the statute. In certain provisions this leads to a certain amount of absurdity, such as in those relating to representations made as to the issuance, suspension or revocation of digital signature certificates; due to the lack of uniformity, these principles now apply only to digital signatures and not to all types of electronic signatures. &amp;nbsp;&lt;/p&gt;
&lt;p&gt;It would have been preferable if the provisions relating to digital signatures had been moved in their entirety to the 2nd Schedule. Digital signatures would then have become just another class of electronic signatures listed in the Schedule. By omitting to do this, the drafters have left digital signature-specific provisions in the main body of the statute, which challenges its technology neutrality.&lt;/p&gt;
&lt;h3&gt;Certifying Authorities&lt;/h3&gt;
&lt;p&gt;The IT Act 2008 has made the certifying authority the repository of all electronic signatures issued under the statute. Given that there are, at present, multiple certifying authorities, this provision is impractical. Instead, the statute should have either referred to the Controller of Certifying Authorities or should have been worded to state that each certifying authority would be the repository for all electronic signature certificates issued by it.&lt;/p&gt;
&lt;h3&gt;Impact on Other Statutes&lt;/h3&gt;
&lt;p&gt;Since the enactment of the IT Act 2000, amendments have been carried out in other statutes, relying on the concept of digital signatures. For instance, the Negotiable Instruments Act, 1881, makes the use of a digital signature essential for an electronic cheque. While the IT Act 2008 has expanded the scope of the available authentication measures, by introducing the technologically neutral concept of electronic signatures, corresponding amendments in other statutes like the Negotiable Instruments Act, 1881, will need to be carried out, so that they are not limited in their application to digital signatures.&lt;/p&gt;
&lt;h2&gt;Data Protection&lt;br /&gt;&lt;/h2&gt;
&lt;p&gt;Prior to the passing of the IT Act 2008, the concept of 'data protection' was not recognised in India. The amendments have now introduced some amount of legal protection for data stored in the electronic medium. This section analyses the changes sought to be introduced and their impact on data protection law in India.&lt;/p&gt;
&lt;h3&gt;Data under the IT Act 2000&lt;/h3&gt;
&lt;p&gt;The only provision under the IT Act 2000, which dealt with unauthorised access and damage to data, was Section 43. Under that section, penalties were prescribed in respect of any person who downloads copies or extracts data from a computer system, introduces computer contaminants or computer viruses into a computer system or damages any data residing in a computer system.&lt;/p&gt;
&lt;h3&gt;Data under the IT Act 2008&lt;/h3&gt;
&lt;p&gt;Under the IT Act 2008, far-reaching changes have been made in relation to data. Two sections have been inserted specifically for that purpose – Sections 43-A and 72-A, one dealing with the civil and the other with the criminal remedies in relation to the breach of data related obligations.&lt;/p&gt;
&lt;h3&gt;The Civil Remedies for Data Protection&lt;/h3&gt;
&lt;p&gt;The newly introduced Section 43-A reads as follows:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Compensation for failure to protect data - Where a body corporate, possessing, dealing or handling any sensitive personal data or information in a computer resource which it owns, controls or operates, is negligent in implementing and maintaining reasonable security practices and procedures and thereby causes wrongful loss or wrongful gain to any person, such body corporate shall be liable to pay damages by way of compensation, to the person so affected.&lt;/p&gt;
&lt;p&gt; Explanation - For the purposes of this section:&lt;/p&gt;
&lt;p&gt; (i)&amp;nbsp; “Body Corporate” means any company and includes a firm, sole proprietorship or other association of individuals engaged in commercial or professional activities;&lt;/p&gt;
&lt;p&gt;(ii) “Reasonable Security Practices and Procedures” means security practices and procedures designed to protect such information from unauthorised access, damage, use, modification, disclosure or impairment, as may be specified in an agreement between the parties or as may be specified in any law for the time being in force and in the absence of such agreement or any law, such reasonable security practices and procedures, as may be prescribed by the Central Government in consultation with such professional bodies or associations as it may deem fit; and&lt;/p&gt;
&lt;p&gt;(iii)&amp;nbsp; “Sensitive Personal Data or Information” means such personal information as may be prescribed by the Central Government in consultation with such professional bodies or associations as it may deem fit.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;While at first this provision appears to address several long-standing concerns relating to data protection in India, it contains several insidious flaws that could affect the development of data protection jurisprudence in the country.&lt;/p&gt;
&lt;h3&gt;Non-Electronic Data&lt;/h3&gt;
&lt;p&gt;In the first instance, there is no mention, under this provision, of non-electronic data. Most international data protection statutes recognise and protect data stored in any electronic medium or a relevant filing system (including, for instance, a salesperson's diary). The newly introduced provisions of the IT Act 2008 do not provide any protection for data stored in a non-electronic medium.&lt;/p&gt;
&lt;p&gt;It could be argued that, given the legislative focus of this statute (it is called the Information Technology Act for a reason), it would be inappropriate to include within it protection for forms of data that do not relate to the digital or electronic medium. That argument is valid as far as it goes, but those who look to the new provisions of the IT Act 2008 as the answer to the data protection concerns the country has faced all these years must temper their enthusiasm: these provisions merely provide solutions for electronic data.&lt;/p&gt;
&lt;h3&gt;Classification of Data&lt;/h3&gt;
&lt;p&gt;Most international data protection statutes distinguish between different categories of personal data, specifying different levels of protection for personal information and sensitive personal information. Sensitive personal information receives the higher level of protection, as its loss, unauthorised access or disclosure is considered to have a deeper impact on the data subject.&lt;/p&gt;
&lt;p&gt;The new provisions of the IT Act 2008 make no such distinction. Section 43-A applies to all “sensitive personal data or information” but does not specify how personal data not deemed to be sensitive is to be treated. In essence, personal information and sensitive personal information do not appear to be differentially treated in the context of data protection.&lt;/p&gt;
&lt;h3&gt;Consequences&lt;/h3&gt;
&lt;p&gt;Under most international data protection statutes, the person in “control” of the data is liable for the consequences of disclosure, loss or unauthorised access to such information. This ensures that liability is restricted to those who actually have the ability to control the manner in which the data is treated. &amp;nbsp;&lt;/p&gt;
&lt;p&gt;However, under the new provisions of the IT Act 2008, mere possession of information, coupled with its subsequent misuse, would render the person possessing it liable in damages. While there is likely to be a debate on what constitutes possession and how it differs from control, there can be little doubt that by referring to “possession” in addition to “operation” and “control”, the IT Act 2008 has widened the net considerably.&lt;/p&gt;
&lt;h3&gt;Negligence in Implementing Security Practices&lt;/h3&gt;
&lt;p&gt;Section 43-A specifically places liability on a body corporate only if such body corporate has been negligent in implementing its security practices and procedures in relation to the data possessed, controlled or handled by it. The choice of language here is significant. The statute specifically refers to the term “negligence” in relation to the security practices and procedures as opposed to stipulating a clear, pass-fail type obligation to conform.&lt;/p&gt;
&lt;p&gt;There is a significant difference between “negligence to implement” and “failure to implement”. Under the former, a breach arises only where the body corporate required to follow reasonable security practices with regard to the data in its possession or control fails to perform the required action, and it can be proved that a reasonable man in the same circumstances would have performed it. To make a body corporate liable under this Section, it is therefore not enough to demonstrate that security procedures were not followed; it must additionally be proved that the body corporate was negligent.&lt;/p&gt;
&lt;h3&gt;Wrongful Loss and Gain&lt;/h3&gt;
&lt;p&gt;The Section appears to have been constructed on the basis that a breach has occurred in the event that any “wrongful gain” or “wrongful loss” was suffered. These terms have not been defined either under statutes or through any judicial precedents in the civil context. However, these terms do have a definition under criminal law in India. The Indian Penal Code, 1860 (“IPC”), defines “Wrongful Gain” to mean gain, by unlawful means, of property to which the person gaining is not legally entitled; and “Wrongful Loss” to mean the loss by unlawful means of property to which the person losing it is legally entitled.&lt;/p&gt;
&lt;p&gt;There does not appear to be any greater significance in the use of these terms, even though they are typically found in criminal statutes. Beyond a slight ambiguity as to purpose, their use in the IT Act is therefore unremarkable.&lt;/p&gt;
&lt;h3&gt;Limitation on Liability&lt;/h3&gt;
&lt;p&gt;The provisions of Section 43 originally capped the total liability for a breach at Rs. 5,00,00,000 (five crore rupees). The original text of Section 43-A contained the same limitation of liability in respect of its data protection provisions. Before the bill was passed into law, this limitation was removed, and a breach of Section 43-A is now not subject to any limitation of liability.&lt;/p&gt;
&lt;h3&gt;Reasonable Security Practices and Procedures&lt;/h3&gt;
&lt;p&gt;Section 43-A makes a reference to “reasonable security practices and procedures” and stipulates that a breach has been caused only if such practices and procedures have not been followed. There are three methods by which reasonable security practices and procedures can be established:&lt;/p&gt;
&lt;ul&gt;&lt;li&gt; By agreement;&lt;/li&gt;&lt;li&gt;By law; and&lt;/li&gt;&lt;li&gt;By prescription by the Central Government.&lt;/li&gt;&lt;/ul&gt;
&lt;p&gt;As there is no law in India which sets out an appropriate definition for the term, and since it will be some time before the Central Government comes out with the necessary regulations, it would appear that the only option available is for the parties to agree between themselves on how the sensitive personal data and information exchanged under their contract is to be handled.&lt;/p&gt;
&lt;p&gt;As a corollary, till such time as the government establishes the necessary rules in relation to these security practices and procedures, if a body corporate does not enter into an agreement with the person providing the information as to the reasonable security practices and procedures that would apply, the body corporate cannot be brought within the purview of this section for any loss or damage to data.&lt;/p&gt;
&lt;h3&gt;The Criminal Remedies for Unlawful Disclosure of Information&lt;/h3&gt;
&lt;p&gt;In addition to the civil remedies spelled out in such detail in Section 43-A, the newly introduced Section 72-A of the IT Act 2008 could be used to impose criminal sanctions on any person who discloses information in breach of a contract for services. While it is not a data protection provision in quite the same way that Section 43-A is, there are enough similarities in purpose for it to achieve much the same result.&lt;/p&gt;
&lt;p&gt;Section 72-A reads:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Punishment for Disclosure of information in breach of lawful contract - Save as otherwise provided in this Act or any other law for the time being in force, any person including an intermediary who, while providing services under the terms of lawful contract, has secured access to any material containing personal information about another person, with the intent to cause or knowing that he is likely to cause wrongful loss or wrongful gain discloses, without the consent of the person concerned, or in breach of a lawful contract, such material to any other person shall be punished with imprisonment for a term which may extend to three years, or with a fine which may extend to Rupees five lakh, or with both.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;In substance, this provision appears to be focused on providing criminal remedies in the context of breach of confidentiality obligations under service contracts; given that the section specifically refers to the disclosure of personal information obtained under that service contract, it is fair to classify this as a provision that addresses data protection issues.&lt;/p&gt;
&lt;h3&gt;Personal Information&lt;/h3&gt;
&lt;p&gt;The IT Act 2008 does not define “personal information”. Equally, there are no judicial precedents that provide any clarity on the term. The Right to Information Act, 2005 does provide a definition for “personal information”, but that definition is inappropriate in the context of the IT Act 2008. In the absence of a useable definition for the term “personal information”, it becomes difficult to assess the scope and ambit of the provision and in particular to understand the extent to which it is enforceable.&lt;/p&gt;
&lt;h3&gt;"Willful"&lt;/h3&gt;
&lt;p&gt;The section only applies to persons who willfully disclose personal information, causing wrongful loss or gain. Hence, in order to make a person liable, it has to be proved that the person disclosing the personal information did so with the intention of causing, or with the knowledge that it was likely to cause, wrongful loss or gain. It would be a valid defence to claim that any loss caused was unintentional.&lt;/p&gt;
&lt;h3&gt;Service Contracts&lt;/h3&gt;
&lt;p&gt;The section appears to be particular about the fact that it only applies in the context of personal information obtained under a contract for services. This appears to rule out confidential information (that is not of a personal nature) that has been received under any other form of agreement (including, for example, a technology license agreement). The section is clearly intended to protect against the misuse of personal information and cannot be adapted to provide a wider level of protection against all breaches of confidential information. That said, employers now have a much stronger weapon against employees who leave with the personal records of other fellow employees.&lt;/p&gt;
&lt;h3&gt;Consent&lt;/h3&gt;
&lt;p&gt;This section also clearly applies only to those disclosures of personal information with the intent to cause wrongful loss or gain which have taken place without the consent of the person whose personal information is being disclosed. What remains to be seen is how the law will deal with situations where a general consent for disclosures has been obtained at the time of recruitment.&lt;/p&gt;
&lt;p&gt;Such clauses are made effective around the world by including opt-in and opt-out provisions, allowing the employee either to expressly agree to the disclosure of his personal information or to specifically exclude himself from the ambit of any such disclosures.&lt;/p&gt;
&lt;h3&gt;Media of Material&lt;/h3&gt;
&lt;p&gt;This section, unlike several other provisions of the IT Act 2008, deals with all manner of materials without requiring them to be digital. However, while disclosure of information stored in the non-electronic medium has been recognised, in the absence of a clear definition of personal information, it is difficult to ascertain the application and enforcement of this section.&lt;/p&gt;
&lt;h3&gt;What’s Missing&lt;/h3&gt;
&lt;p&gt;In order to be a truly effective data protection statute, the IT Act 2008 must include provisions relating to the collection, circumstances of collection, control, utilisation and proper disposal of data. At present, the statute is silent on these aspects. In many ways, the statute addresses the particular concerns of companies or corporate entities looking for protection in relation to data outsourced to another corporate entity for processing. Within these specific parameters the statute works well. However, it does little to protect the average citizen of the country from the theft of personal data. Until we have statutory recognition of these issues, we will not be able to say that we have an effective data protection law in India.&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/it-act-and-commerce'&gt;https://cis-india.org/internet-governance/blog/it-act-and-commerce&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>pranesh</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>IT Act</dc:subject>
    
    
        <dc:subject>Digital Governance</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Authentication</dc:subject>
    
    
        <dc:subject>Security</dc:subject>
    

   <dc:date>2011-08-02T07:41:45Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/medianama-february-18-2021-amber-sinha-data-protection-bill-consent-clause-state-function">
    <title>How Function Of State May Limit Informed Consent: Examining Clause 12 Of The Data Protection Bill</title>
    <link>https://cis-india.org/internet-governance/blog/medianama-february-18-2021-amber-sinha-data-protection-bill-consent-clause-state-function</link>
    <description>
        &lt;b&gt;The collective implication of leaving out ‘proportionality’ from Clause 12 is to provide very wide discretionary powers to the state.&lt;/b&gt;
        &lt;p&gt;The blog post was &lt;a class="external-link" href="https://www.medianama.com/2022/02/223-data-protection-bill-consent-clause-state-function/"&gt;published in Medianama&lt;/a&gt; on February 18, 2022. This is the first of a two-part series by Amber Sinha.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;In 2018, hours after the Committee of Experts led by Justice Srikrishna Committee released their report and draft bill, I wrote &lt;a href="https://www.livemint.com/Opinion/zY8NPWoWWZw8AfI5JQhjmL/Draft-privacy-bill-and-its-loopholes.html"&gt;an opinion piece&lt;/a&gt; providing my quick take on what was good and bad about the bill. A section of my analysis focused on Clause 12 (then Clause 13) which provides for non-consensual processing of personal data for state functions. I called this provision a ‘carte-blanche’ which effectively allowed the state to process a citizen’s data for practically all interactions between them without having to deal with the inconvenience of seeking consent. My former colleague, Pranesh Prakash &lt;a href="https://twitter.com/pranesh/status/1023116679440621568"&gt;pointed out&lt;/a&gt; that this was not a correct interpretation of the provision as I had missed the significance of the word ‘necessary’ which was inserted to act as a check on the powers of the state. He also pointed out, correctly, that in its construction, this provision is equivalent to the position in European General Data Protection Regulation (Article 6 (i) (e)), and is perhaps even more restrictive.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;While I agree with what Pranesh says above (his claims are largely factual, and there can be no basis for disagreement), my view of Clause 12 has not changed. While Clause 35 has been a focus of considerable discourse and analysis, for good reason, I continue to believe that Clause 12 remains among the most dangerous provisions of this bill, and I will try to unpack here, why.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Data Protection Bill 2021 has a chapter on the grounds for processing personal data, and one of those grounds is consent by the individual. The rest of the grounds deal with various situations in which personal data can be processed without seeking consent from the individual. Clause 12 lays down one of the grounds. It allows the state to process data without the consent of the individual in the following cases —&lt;/p&gt;
&lt;p&gt;a)  where it is necessary to respond to a medical emergency&lt;br /&gt;b)  where it is necessary for the state to provide a service or benefit to the individual&lt;br /&gt;c)  where it is necessary for the state to issue any certification, licence or permit&lt;br /&gt;d)  where it is necessary under any central or state legislation, or to comply with a judicial order&lt;br /&gt;e)  where it is necessary for any measures during an epidemic, outbreak of disease or other threat to public health&lt;br /&gt;f)  where it is necessary for safety procedures during a disaster or a breakdown of public order&lt;/p&gt;
&lt;p&gt;In order to carry out (b) and (c), there is also the added requirement that the state function must be authorised by law.&lt;/p&gt;
&lt;h2&gt;Twin restrictions in Clause 12&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;The use of the words ‘necessary’ and ‘authorised by law’ is intended to pose checks on the powers of the state. The first restriction seeks to limit actions to only those cases where the processing of personal data would be necessary for the exercise of the state function. This should mean that if the state function can be exercised without non-consensual processing of personal data, then it must be done so. Therefore, while acting under this provision, the state should only process my data if it needs to do so, to provide me with the service or benefit. The second restriction means that this would apply to only those state functions which are authorised by law, meaning only those functions which are supported by validly enacted legislation.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;What we need to keep in mind regarding Clause 12 is that the requirement of ‘authorised by law’ does not mean that legislation must provide for that specific kind of data processing. It simply means that the larger state function must have legal backing. The danger is how these provisions may be used with broad mandates. If the activity in question is non-consensual collection and processing of, say, demographic data of citizens to create state resident hubs which will assist in the provision of services such as healthcare, housing, and other welfare functions; all that may be required is that the welfare functions are authorised by law.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Scope of privacy under Puttaswamy&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;It would be worthwhile, at this point, to delve into the nature of restrictions that the landmark Puttaswamy judgement discussed that the state can impose on privacy. The judgement clearly identifies the principles of informed consent and purpose limitation as central to informational privacy. As discussed repeatedly during the course of the hearings and in the judgement, privacy, like any other fundamental right, is not absolute. However, restrictions on the right must be reasonable in nature. In the case of Clause 12, the restrictions on privacy in the form of denial of informed consent need to be tested against a constitutional standard. In Puttaswamy, the bench ​was ​not ​required ​to ​provide ​a ​legal ​test ​to ​determine ​the ​extent ​and ​scope ​of the ​right ​to ​privacy, but they do provide sufficient ​guidance ​for ​us ​to ​contemplate ​how ​the ​limits ​and ​scope ​of ​the ​constitutional ​right ​to ​privacy ​could ​be ​determined ​in ​future ​cases.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Puttaswamy judgement clearly states that “the right to privacy is protected as an intrinsic part of the right to life and personal liberty under Article 21 and as a part of the freedoms guaranteed by Part III of the Constitution.” By locating the right not just in Article 21 but also in the entirety of Part III, the bench clearly requires that “the drill of various Articles to which the right relates must be scrupulously followed.” This means that where transgressions on privacy relate to different provisions in Part III, the different tests under those provisions will apply along with those in Article 21. For instance, where the restrictions relate to personal freedoms, the tests under both Article 19 (right to freedoms) and Article 21 (right to life and liberty) will apply.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In the case of Clause 12, the three tests laid down by Justice Chandrachud are most operative —&lt;br /&gt;a) the existence of a “law”&lt;br /&gt;b) a “legitimate State interest”&lt;br /&gt;c) the requirement of “proportionality”.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The first test is already reflected in the use of the phrase ‘authorised by law’ in Clause 12. The test under Article 21 would imply that the function of the state should not merely be authorised by law, but that the law, in both its substance and procedure, must be ‘fair, just and reasonable.’ The next test is that of ‘legitimate state interest’. In its report, the Joint Parliamentary Committee places emphasis on Justice Chandrachud’s use of “allocation of resources for human development” in an illustrative list of legitimate state interests. The report claims that the ground, functions of the state, thus satisfies the legitimate state interest. We do not dispute this claim.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Proportionality and Clause 12&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;It is the final test of ‘proportionality’ articulated by the Puttaswamy judgement, which is most operative in this context. Unlike Clauses 42 and 43 which include the twin tests of necessity and proportionality, the committee has chosen to only employ one ground in Clause 12. Proportionality is a commonly employed ground in European jurisprudence and common law countries such as Canada and South Africa, and it is also an integral part of Indian jurisprudence. As commonly understood, the proportionality test consists of three parts —&lt;/p&gt;
&lt;p&gt;a)  the limiting measures must be carefully designed, or rationally connected, to the objective&lt;br /&gt;b)  they must impair the right as little as possible&lt;br /&gt;c)  the effects of the limiting measures must not be so severe on individual or group rights that the legitimate state interest, albeit important, is outweighed by the abridgement of rights.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The first test is similar to the test of proximity under Article 19. The test of ‘necessity’ in Clause 12 must be viewed in this context. It must be remembered that the test of necessity is not limited to only situations where it may not be possible to obtain consent while providing benefits. My reservations with the sufficiency of this standard stem from observations made in the report, as well as the relatively small amount of jurisprudence on this term in Indian law.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Srikrishna Report interestingly mentions three kinds of scenarios where consent should not be required — where it is not appropriate, necessary, or relevant for processing. The report goes on to give an example of inappropriateness. In cases where data is being gathered to provide welfare services, there is an imbalance in power between the citizen and the state. Having made that observation, the committee inexplicably arrives at a conclusion that the response to this problem is to further erode the power available to citizens by removing the need for consent altogether under Clause 12. There is limited jurisprudence on the standard of ‘necessity’ under Indian law. The Supreme Court has articulated this test as ‘having reasonable relation to the object the legislation has in view.’ If we look elsewhere for guidance on how to read ‘necessity’, the ECHR in Handyside v United Kingdom held it to be neither “synonymous with indispensable” nor does it have the “flexibility of such expressions as admissible, ordinary, useful, reasonable or desirable.” In short, there must be a pressing social need to satisfy this ground.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;However, the other two tests of proportionality do not find a mention in Clause 12 at all. There is no requirement of ‘narrow tailoring’, that the scope of non-consensual processing must impair the right as little as possible. It is doubly unfortunate that this test does not find a place, as unlike necessity, ‘narrow tailoring’ is a test well understood in Indian law. This means that while there is a requirement to show that processing personal data was necessary to provide a service or benefit, there is no requirement to process data in a way that there is minimal non-consensual processing. The fear is that as long as there is a reasonable relation between processing data and the object of the function of state, state authorities and other bodies authorised by it, do not need to bother with obtaining consent.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Similarly, the third test of proportionality is also not represented in this provision. It provides a test between the abridgement of individual rights and legitimate state interest in question, and it requires that the first must not outweigh the second. The absence of the proportionality test leaves Clause 12 devoid of any such consideration. Therefore, as long as the test of necessity is met under this law, it need not evaluate the denial of consent against the service or benefit that is being provided.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The collective implication of leaving out ‘proportionality’ from Clause 12 is to provide very wide discretionary powers to the state, by setting the threshold to circumvent informed consent extremely low. In the next post, I will demonstrate the ease with which Clause 12 can allow indiscriminate data sharing by focusing on the Indian government’s digital healthcare schemes.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/medianama-february-18-2021-amber-sinha-data-protection-bill-consent-clause-state-function'&gt;https://cis-india.org/internet-governance/blog/medianama-february-18-2021-amber-sinha-data-protection-bill-consent-clause-state-function&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Data Governance</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2022-03-01T14:56:49Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/gdpr-and-india-a-comparative-analysis">
    <title>GDPR and India: A Comparative Analysis</title>
    <link>https://cis-india.org/internet-governance/blog/gdpr-and-india-a-comparative-analysis</link>
    <description>
        &lt;b&gt;At present, companies world over are in the process of assessing the impact that EU General Data Protection Regulations (“GDPR”) will have on their businesses.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The post is written by Aditi Chaturvedi and edited by Amber Sinha&lt;/p&gt;
&lt;hr style="text-align: justify; " /&gt;
&lt;p style="text-align: justify; "&gt;High administrative  fines in case of non-compliance with GDPR provisions are a driving force behind these concerns as they can lead to loss of business for various countries such as India.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;To a large extent, future of business will depend on how well India responds to the changing regulatory  changes unfolding globally. India  will have to  assess her preparedness and make convincing changes to retain the status as a  dependable processing destination. This document gives a brief overview of data protection provisions of the Information Technology Act, 2000 followed by a comparative analysis of the key  provisions of GDPR and Information Technology  Act and the Rules notified under it.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a class="external-link" href="http://cis-india.org/internet-governance/files/gdpr-and-india"&gt;Download the full blog post&lt;/a&gt;&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/gdpr-and-india-a-comparative-analysis'&gt;https://cis-india.org/internet-governance/blog/gdpr-and-india-a-comparative-analysis&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Aditi Chaturvedi</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2017-11-28T15:17:39Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/electoral-databases-2013-privacy-and-security-concerns">
    <title>Electoral Databases – Privacy and Security Concerns</title>
    <link>https://cis-india.org/internet-governance/blog/electoral-databases-2013-privacy-and-security-concerns</link>
    <description>
        &lt;b&gt;In this blogpost, Snehashish Ghosh analyzes privacy and security concerns which have surfaced with the digitization, centralization and standardization of the electoral database and argues that even though the law provides the scope for protection of electoral databases, the State has not taken any steps to ensure its safety.&lt;/b&gt;
&lt;p style="text-align: justify; "&gt;The recent move by the Election Commission of India (ECI) to partner with Google to provide electoral look-up and information services for citizens has faced heavy criticism on the grounds of data security and privacy.&lt;a href="#_edn1" name="_ednref1"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;[i]&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; After due consideration, the ECI has decided to drop the plan.&lt;a href="#_edn2" name="_ednref2"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;[ii]&lt;/span&gt;&lt;/span&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The plan to partner with Google had led to much apprehension about Google gaining access to the database of 790 million voters, including personal information such as age, place of birth and residence. It could also have gained access to cell phone numbers and email addresses had voters chosen to enroll via the online portal on the ECI website. Although the plan has been cancelled, this does not necessarily mean that the largest database of citizens of India is safe from security breaches or abuse. In fact, the personal information of every voter in a constituency can be accessed by anyone through the ECI website, and the publication of electoral rolls is mandated by law.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Publication of Electoral Rolls&lt;/b&gt;&lt;br /&gt;The electoral roll essentially contains the name of the voter, the name of a relation (son of/wife of, etc.), age, sex, address and the photo identity card number. The main objective of creating and maintaining electoral rolls and issuing the Electoral Photo Identity Card (EPIC) was to ensure a free and fair election in which each voter is able to cast his own vote as per his own choice. In other words, the main purpose of the exercise was to curtail bogus voting. This is achieved by cross-referencing the EPIC with the electoral roll.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The process of creation and maintenance of electoral rolls is governed by the Registration of Electors Rules, 1960. Rule 22 requires the registration officer to publish the roll, with the list of amendments, at his office for inspection and public information. Furthermore, the ECI may direct the registration officer to send two copies of the electoral roll to every political party for which a symbol has been exclusively reserved by the ECI. It can safely be concluded that the electoral roll of a constituency is a public document&lt;a href="#_edn3" name="_ednref3"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;[iii]&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; given that the roll is published and can be circulated on the direction of the ECI.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;With the computational turn, the ECI in 1998 took the decision to digitize the electoral databases. Furthermore, printed electoral rolls and compact discs containing the rolls are available for sale to the general public.&lt;a href="#_edn4" name="_ednref4"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;[iv]&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; In addition, the electoral rolls for the entire country are available on the ECI website.&lt;a href="#_edn5" name="_ednref5"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;[v]&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; However, the current database is not uniform and standardized, and entries in some constituencies are available only in the local language. The ECI has taken steps to make the database uniform, standardized and centralized.&lt;a href="#_edn6" name="_ednref6"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;[vi]&lt;/span&gt;&lt;/span&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Security Concerns&lt;/b&gt;&lt;br /&gt;The Registration of Electors Rules, 1960 is an archaic piece of delegated legislation which is still in force and casts a statutory duty on the ECI to publish the electoral rolls. The publication of electoral rolls is not a threat to security when they are distributed in hard copy and their availability is limited. The security risks emerge only with the digitization of the electoral database: the uniformity, standardization and centralization that digitization allows also make the database vulnerable to abuse. The law has failed to evolve with the change in technology.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In a recent article, Bill Davidow analyzes "the dark side of Moore’s Law" and argues that with the growth in processing power there has been a growth in surveillance capabilities; fittingly, the article is titled “&lt;i&gt;With Great Computing Power Comes Great Surveillance&lt;/i&gt;”.&lt;a href="#_edn7" name="_ednref7"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;[vii]&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; Drawing on Davidow’s argument, with the exponential growth in computing power, search has become convenient, fast and cheap. A uniform, standardized and centralized database bearing the personal information of 790 million voters can be searched and categorized in accordance with the search terms. The personal information of the voters can be used for good, but it can equally be abused if it falls into the wrong hands. Big data analysis and computing power make it easier to target voters, as bits and pieces of personal information give a bigger picture of an individual, a community, etc. This can be considered intrusive of individual privacy, since the personal information of every voter is made available in the public domain.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;For example, the availability of a centralized, searchable database of voters along with their age would allow the appropriate authorities to identify wards or constituencies which have a high population of voters above the age of 65. This would help the authorities to set up polling booths at closer locations with special amenities. However, the same database can be used to search for the density of members of a particular community in a ward or constituency based on the name, age and sex of the voters. This information can be used to disrupt elections, target vulnerable communities during an election and rig elections.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Current IT law does not mandate the protection of the electoral database&lt;/b&gt;&lt;br /&gt;A centralized electoral database of the entire country can be considered a critical information infrastructure (CII), given the impact it may have on elections, which are the cornerstone of any democracy. Under Section 70 of the Information Technology Act, 2000 (IT Act), CII means “the computer resource, incapacitation or destruction of which, shall have debilitating impact on national security, economy.”&lt;a href="#_edn8" name="_ednref8"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;[viii]&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; However, the appropriate Government has not notified the electoral database as a protected system&lt;a href="#_edn9" name="_ednref9"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;[ix]&lt;/span&gt;&lt;/span&gt;&lt;/a&gt;. Therefore, the information security practices and procedures for a protected system are not applicable to the electoral database.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Information Technology Rules (IT Rules) are also not applicable to electoral databases &lt;i&gt;per se&lt;/i&gt;. Since the ECI is not a body corporate, the Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011 (&lt;i&gt;hereinafter &lt;/i&gt;Reasonable Security Practices Rules) do not apply to electoral databases. Even though the Reasonable Security Practices Rules apply only to body corporates, the electoral database does fall within the ambit of the definition of “personal information”&lt;a href="#_edn10" name="_ednref10"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;[x]&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; and should arguably be made subject to the Rules.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The intent of the ECI for hosting the entire country’s electoral database online &lt;i&gt;inter alia&lt;/i&gt; is to provide electronic service delivery to the citizens. It seeks to provide “electoral look up services for citizens ... for better electoral information services.”&lt;a href="#_edn11" name="_ednref11"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;[xi]&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; However, the Information Technology (Electronic Service Delivery) Rules, 2011 are not applicable to the electoral database given that it is not notified by the appropriate Government as a service to be delivered electronically. Hence, the encryption and security standards for electronic service delivery are not applicable to electoral rolls.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The IT Act and the IT Rules provide a reasonable scope for the appropriate Government to include electoral databases within the ambit of protected system and electronic service delivery. However, the appropriate government has not taken any steps to notify electoral database as protected system or a mode of electronic service delivery under the existing laws.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Conclusion&lt;/b&gt;&lt;br /&gt;Publication of electoral rolls is a necessary part of the election process. It ensures free and fair elections and promotes transparency and accountability. But unfettered access to electronic electoral databases may have the adverse effect of endangering the very goals it seeks to achieve, because the electronic database may pose a threat to the privacy of voters and also lead to security breaches. It may be argued that the ECI is mandated by law to publish the electoral database and hence is beyond the operation of the IT Act. But Section 81 of the IT Act has an overriding effect on any law inconsistent therewith. The appropriate Government should take the necessary steps under the IT Act and notify electoral databases as a protected system.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;It is recommended that the Registration of Electors Rules, 1960 be amended, taking into account the advancement in technology. The amended Rules should aim at restricting unfettered electronic access to the electoral database and also introduce purpose limitations on its use. It should also be noted that more adequate and robust data protection and privacy laws should be put in place to regulate the collection, use, storage and processing of databases which are critical to national security.&lt;/p&gt;
&lt;div&gt;
&lt;hr align="left" size="1" width="100%" /&gt;
&lt;div id="edn1"&gt;
&lt;p class="MsoEndnoteText" style="text-align: justify; "&gt;&lt;a href="#_ednref1" name="_edn1"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;[i]&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; Pratap Vikram Singh, Post-uproar, EC’s Google tie-up plan may go for a toss, Governance Now, January 7, 2014 available at &lt;a class="external-link" href="http://www.governancenow.com/news/regular-story/post-uproar-ecs-google-tie-plan-may-go-toss"&gt;http://www.governancenow.com/news/regular-story/post-uproar-ecs-google-tie-plan-may-go-toss&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="edn2"&gt;
&lt;p class="MsoEndnoteText" style="text-align: justify; "&gt;&lt;a href="#_ednref2" name="_edn2"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;[ii]&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; Press Note No. ECI/PN/1/2014, Election Commission of India, January 9, 2014, available at &lt;a class="external-link" href="http://eci.nic.in/eci_main1/current/PN09012014.pdf"&gt;http://eci.nic.in/eci_main1/current/PN09012014.pdf&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="edn3"&gt;
&lt;p class="MsoEndnoteText" style="text-align: justify; "&gt;&lt;a href="#_ednref3" name="_edn3"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;[iii]&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; Section 74, Indian Evidence Act, 1872&lt;/p&gt;
&lt;/div&gt;
&lt;div id="edn4"&gt;
&lt;p class="MsoEndnoteText" style="text-align: justify; "&gt;&lt;a href="#_ednref4" name="_edn4"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;[iv]&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; &lt;a class="external-link" href="http://eci.nic.in/eci_main1/the_function.aspx"&gt;eci.nic.in/eci_main1/the_function.aspx&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="edn5"&gt;
&lt;p class="MsoEndnoteText" style="text-align: justify; "&gt;&lt;a href="#_ednref5" name="_edn5"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;[v]&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; &lt;a class="external-link" href="http://eci.nic.in/eci_main1/Linkto_erollpdf.aspx"&gt;http://eci.nic.in/eci_main1/Linkto_erollpdf.aspx&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="edn6"&gt;
&lt;p class="MsoEndnoteText" style="text-align: justify; "&gt;&lt;a href="#_ednref6" name="_edn6"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;[vi]&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; “At present, in most States and UTs the Electoral Database is kept at the district level. In some cases it is kept even with the vendors. In most States/UTs it is maintained in MS Access, while in some cases it is on a primitive technology like FoxPro and in some other cases on advanced RDBMS like Oracle or Sql Server. The database is not kept in bilingual form in some of the States/UTs, despite instructions of the Commission. In most cases Unicode fonts are not used. The database structure not being uniform in the country, makes it almost impossible for the different databases to talk to each other” – Election Commission of India, Revision of Electoral Rolls with reference to 01-01-2010 as the qualifying date – Integration and Standardization of the database- reg., No. 23/2009-ERS, January 6, 2010, available at &lt;a class="external-link" href="http://eci.nic.in/eci_main/eroll&amp;amp;epic/ins06012010.pdf"&gt;eci.nic.in/eci_main/eroll&amp;amp;epic/ins06012010.pdf&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="edn7"&gt;
&lt;p class="MsoEndnoteText"&gt;&lt;a href="#_ednref7" name="_edn7"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;[vii]&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; &lt;a class="external-link" href="http://www.theatlantic.com/technology/archive/2014/01/with-great-computing-power-comes-great-surveillance/282933/"&gt;http://www.theatlantic.com/technology/archive/2014/01/with-great-computing-power-comes-great-surveillance/282933/&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="edn8"&gt;
&lt;p class="MsoEndnoteText" style="text-align: justify; "&gt;&lt;a href="#_ednref8" name="_edn8"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;[viii]&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; Section 70, Information Technology Act, 2000&lt;/p&gt;
&lt;/div&gt;
&lt;div id="edn9"&gt;
&lt;p class="MsoEndnoteText" style="text-align: justify; "&gt;&lt;a href="#_ednref9" name="_edn9"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;[ix]&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; Computer resource which directly or indirectly affects the facility of Critical Information Infrastructure&lt;/p&gt;
&lt;/div&gt;
&lt;div id="edn10"&gt;
&lt;p class="MsoEndnoteText" style="text-align: justify; "&gt;&lt;a href="#_ednref10" name="_edn10"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;[x]&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; Rule 2(1)(i), Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011&lt;/p&gt;
&lt;/div&gt;
&lt;div id="edn11"&gt;
&lt;p class="MsoEndnoteText" style="text-align: justify; "&gt;&lt;a href="#_ednref11" name="_edn11"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;[xi]&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; Press Note No. ECI/PN/1/2014, Election Commission of India, January 9, 2014, available at &lt;a class="external-link" href="http://eci.nic.in/eci_main1/current/PN09012014.pdf"&gt;http://eci.nic.in/eci_main1/current/PN09012014.pdf&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;/div&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/electoral-databases-2013-privacy-and-security-concerns'&gt;https://cis-india.org/internet-governance/blog/electoral-databases-2013-privacy-and-security-concerns&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>snehashish</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Digital Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Cybersecurity</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Safety</dc:subject>
    
    
        <dc:subject>Information Technology</dc:subject>
    
    
        <dc:subject>Cyber Security</dc:subject>
    
    
        <dc:subject>Security</dc:subject>
    
    
        <dc:subject>e-Governance</dc:subject>
    
    
        <dc:subject>Transparency, Politics</dc:subject>
    
    
        <dc:subject>E-Governance</dc:subject>
    

   <dc:date>2014-01-16T11:07:21Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/does-the-safe-harbor-program-adequately-address-third-parties-online">
    <title>Does the Safe-Harbor Program Adequately Address Third Parties Online?</title>
    <link>https://cis-india.org/internet-governance/blog/does-the-safe-harbor-program-adequately-address-third-parties-online</link>
    <description>
&lt;b&gt;While many citizens outside of the US and EU benefit from the data privacy provisions of the Safe Harbor Program, it remains unclear how successfully the program can govern privacy practices when third parties continue to gain more rights over personal data. Using Facebook as a site of analysis, I will attempt to shed light on the deficiencies of the framework for addressing the complexity of data flows in the online ecosystem.&lt;/b&gt;
        
&lt;p&gt;To date, the EU-US Safe Harbor Program leads in governing the complex and multi-directional flows of personal information online. As commerce began to thrive in the online context, the European Union was faced with the challenge of ensuring that personal information exchanged through online services was granted a level of protection on par with the provisions set out in EU privacy law. This was important, notably as the piecemeal and sectoral approach to privacy legislation in the United States was deemed incompatible with the EU approach. While the Safe Harbor Program did not aim to protect the privacy of citizens outside of the European Union per se, the program has in practice set minimum standards for online data privacy due to the international success of American online services.&lt;/p&gt;

&lt;p&gt;While many citizens outside of the US and EU benefit from the Safe Harbor Program, it remains unclear how successful the program will be in an online ecosystem where third parties are being granted ever more rights over the data they receive from first parties. Using Facebook as a site of analysis, I will attempt to shed light on the deficiencies of the framework for addressing the complexity of data flows in the online ecosystem. First, I will argue that the Safe Harbor Program does not do enough to ensure that participants are held reasonably responsible for third-party privacy practices. Second, I will argue that the information asymmetries created between first-party sites, citizens, and governance bodies vis-à-vis third parties obscure the application of the Safe Harbor model.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;The EU-US
Safe-Harbor Agreement&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;In 1995, and based on earlier &lt;a href="http://www.oecd.org/document/18/0,3343,en_2649_34255_1815186_1_1_1_1,00.html"&gt;OECD guidelines&lt;/a&gt;, the EU Data Directive on the “protection of individuals with regard to the processing of personal data and the free movement of such data” was passed&lt;a name="_ednref1" href="#_edn1"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; [1]. The original purpose of the Directive was not only to increase privacy protection within the European Union, but also to promote trade liberalization and a single integrated market in the EU. After the Data Directive was passed, each member state of the EU incorporated the principles of the directive into national law accordingly.&lt;/p&gt;
&lt;p&gt;While the Directive was successful in harmonizing data privacy in the European Union, it also embodied extraterritorial provisions, giving it reach&lt;a name="_ednref2" href="#_edn2"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; beyond the EU. Article 25 of the Directive states that the EU Commission may ban data transfers to third countries that do not ensure “an adequate level of protection” of data privacy rights&lt;a name="_ednref3" href="#_edn3"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; [2]. Also, Article 26 of the Directive, expanding on Article 25, states that personal data cannot be &lt;em&gt;transferred&lt;/em&gt; to a country that “does not ensure an adequate level of protection” if the data controller does not enter into a contract that adduces adequate privacy safeguards&lt;a name="_ednref4" href="#_edn4"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; [3].&lt;/p&gt;
&lt;p&gt;In light of the increased occurrence of cross-border information flows, the Data Directive by itself was not effective enough to ensure that privacy principles were enforced outside of the EU. Articles 25 and 26 of the Directive had essentially deemed all cross-border data flows to the US in contravention of EU privacy law. Therefore, the EU-US Safe Harbor was established by the EU Council and the US Department of Commerce as a way of reconciling the divergent levels of privacy protection set out in these jurisdictions, while also promoting online commerce.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Social Networking
Sites and the Safe-Harbor Principles&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;The case of social networking sites exemplifies the ease with which data is transferred, processed, and stored across jurisdictions. While many of the top social networking sites are registered American entities, they continue to attract users not only from the EU, but also internationally. To comply with EU law, many social networking sites, including LinkedIn, Facebook, Myspace, and Bebo, now adhere to the principles of the program. The enforcement of the Safe Harbor takes place in the United States in accordance with U.S. law and relies, to a great degree, on enforcement by the private sector. TRUSTe, an independent certification program and dispute mechanism, has become the most popular governance mechanism for the Safe Harbor Program among social networking sites.&lt;/p&gt;
&lt;p&gt;Drawing broadly on the principles embodied within the EU Data Directive and the OECD Guidelines, the seven principles of the Safe Harbor were developed: Notice, Choice, Onward Transfer, Access and Accuracy, Security, Data Integrity and Enforcement. The principle of “Notice” sets out that organizations must inform individuals about the purposes for which they collect and use information about them, how to contact the organization with any inquiries or complaints, the types of third parties to which they disclose the information, and the choices and means the organization offers individuals for limiting its use and disclosure.&lt;/p&gt;
&lt;p&gt;“Choice” ensures that individuals have the opportunity to opt out of having their personal information disclosed to a third party, and that information is not used for purposes incompatible with those for which it was originally collected. The “Onward Transfer” principle ensures that a third party receiving information subscribes to the Safe Harbor principles, is subject to the Directive, or enters into a written agreement which requires the third party to provide at least the same level of privacy protection as is required by the relevant principles.&lt;/p&gt;
&lt;p&gt;The principles of “Security” and “Data Integrity” seek to ensure that reasonable precautions are taken to protect against the loss or misuse of data, and that information is not used in a manner incompatible with the purposes for which it has been collected, minimizing the risk that personal information will be misused or abused. Individuals are also granted the right, through the Access principle, to view the personal information about them that an organization holds, and to ensure that it is up to date and accurate. The “Enforcement” principle works to ensure that there is an effective mechanism for assuring compliance with the principles, and that there are consequences for the organization when the principles are not followed.&lt;/p&gt;
&lt;p&gt;The principles of the program are quite clear and enforceable in the first-party context, despite some prevailing ambiguities. The privacy policies of most social networking services have become increasingly clear and straightforward since their inception. Facebook, for example, has revamped its &lt;a href="http://www.facebook.com/privacy/explanation.php"&gt;privacy regime&lt;/a&gt; several times, and gives explicit notice to users of how their information is being used. The privacy policy also explains the relationship between third parties and users’ personal information, including how it may be used by advertisers, search engines, and fellow members.&lt;/p&gt;
&lt;p&gt;With respect to third-party advertisers, the principle of “choice” is clearly granted by most social networking services. For example, the &lt;a href="http://www.networkadvertising.org/"&gt;Network Advertising Initiative&lt;/a&gt; (NAI), a self-regulatory initiative of the online advertising industry, clearly lists its member websites and allows individuals to opt out of any targeted advertising conducted by its members. In Facebook’s description of “cookies” in its privacy policy, a direct link to the NAI’s opt-out features is given, allowing individuals to make somewhat informed choices about their participation in such programs. This is, of course, in light of the fact that most users do not read or understand the privacy policies provided by social networking sites&lt;a name="_ednref5" href="#_edn5"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; [4]. It is also important to note that Google, a major player in the online advertising business, does not grant users of Buzz and Orkut the same opt-out options as sites such as Facebook and Bebo.&lt;/p&gt;
&lt;p&gt;Under the auspices of the US Federal Trade Commission, the Safe Harbor Program has also successfully investigated and settled several privacy-related breaches which have taken place on social networking sites. Among the most famous cases is &lt;a href="http://www.beaconclasssettlement.com/"&gt;Lane et al. v. Facebook et al.&lt;/a&gt;, a class action suit brought against Facebook’s Beacon advertising program. The US Federal Trade Commission was quick to initiate an investigation of the program after many privacy groups and individuals became critical of its questionable advertising practices. The Beacon program was designed to allow Facebook users to share information with their friends about actions taken on affiliated, third-party sites. This included, for example, the movie rentals a user had made through the Blockbuster website.&lt;/p&gt;
&lt;p&gt;The plaintiffs filed a suit alleging that Facebook and its affiliates did not give users adequate notice and choice about Beacon and the collection and use of users’ personal information. The Beacon program was ultimately found to be in breach of US law, including the &lt;a href="http://epic.org/privacy/vppa/"&gt;Video Privacy Protection Act&lt;/a&gt;, which bans the disclosure of personally identifiable rental information. Facebook has announced the settlement of the lawsuit, which brought not individual settlements but a marked end to the program and the creation of a 9.5 million dollar &lt;a href="http://www.p2pnet.net/story/37119"&gt;Facebook Privacy Fund&lt;/a&gt; dedicated to privacy and data-related issues. Other privacy-related investigations of social networking sites launched by the FTC under the Safe Harbor Program include Facebook’s &lt;a href="http://www.eff.org/deeplinks/2009/12/facebooks-new-privacy-changes-good-bad-and-ugly"&gt;privacy changes&lt;/a&gt; in late 2009, and Google’s recently released &lt;a href="http://www.networkworld.com/news/2010/032910-lawmakers-ask-for-ftc-investigation.html"&gt;Buzz application&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Despite the headway the Safe Harbor is making, many privacy-related questions remain ambiguous with respect to the responsibilities of social networking sites under the program. For example, Bebo &lt;a href="http://www.bebo.com/Privacy2.jsp"&gt;reserves the right&lt;/a&gt; to supplement a social profile with additional information collected from publicly available sources and from other companies. Bebo does adhere to the “notice” principle, as it makes known to users through its privacy policy how their information will be used. However, it remains unclear whether the appropriate disclosures are given by Bebo as required by the Safe Harbor framework, notably as “publicly available information” remains a broad and obscure concept in the privacy policy. It is also unclear whether Bebo users are able, under the “Choice” principle, to refuse to have their profiles supplemented by other information sources. Also, under the “Access” principle, do individuals have the right to review all information held about them as “Bebo users”? The right to review information held by a social networking site is an important one that should be upheld, most notably as supplementary information from outside social networking services is employed to profile individual users in ways which may categorize individuals undesirably.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;The Third Party Problem&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;Cooperation between social networking sites and the Safe
Harbor has improved, and most of these sites now have privacy policies which
explicitly address the principles of the Program. It should also be noted that public interest groups such as EPIC, the Center for Digital Democracy, and the Electronic Frontier Foundation have played a key role in ensuring that data privacy breaches are brought to the attention of the FTC under the program. While the program has somewhat adequately
addressed the privacy practices of first party participants, the number of
third parties on social networking sites calls into question the
comprehensiveness and effectiveness of the Safe Harbor program. Facebook itself, as a first-party site, may adhere to the Safe Harbor Program; however, its growing number of third-party platform members may not always adhere to best practices in the field, nor can Facebook or the Safe Harbor Program guarantee that they do.&lt;/p&gt;
&lt;p&gt;The Safe Harbor Program does require that all participants
take certain security measures when transferring data to a third party. Third parties must either subscribe to the safe harbor principles or be subject to the EU Data Protection Directive. Alternatively, an organization may enter into a written agreement with a third party requiring it to provide at least the same level of privacy protection as the program principles demand. Third parties of participating sites are therefore, de facto, bound by the safe harbor principles by way of their agreements with a first-party participant of the program. This is the approach taken by
most social networking sites and their third parties.&lt;/p&gt;
&lt;p&gt;It is important to note, however, that third parties are not governed directly by regulatory bodies such as the FTC. The safe harbor website also &lt;a href="http://www.export.gov/safeharbor/eu/eg_main_018476.asp"&gt;explicitly notes&lt;/a&gt; that the program does not apply to third parties. Under these provisions, then, Facebook must adhere to the principles of the program, while its third-party platform members (such as social gaming companies) must do so only indirectly, under a separate contract with Facebook. The effectiveness of this indirect mode of governing third-party privacy practices is questionable for several reasons.&lt;/p&gt;
&lt;p&gt;Firstly, while Facebook does take steps to ensure that third parties use information from Facebook in a manner consistent with the safe harbor principles, the company explicitly &lt;a href="http://www.facebook.com/policy.php"&gt;waives any guarantee&lt;/a&gt; that third parties will “follow their rules”. Prior to allowing third parties to access any information about users, Facebook requires them to &lt;a href="http://www.facebook.com/terms.php"&gt;agree to terms&lt;/a&gt; that limit their use of information, and also uses technical measures to ensure that they obtain only authorized information. Facebook also warns users to “always review the policies of third party applications and websites to make sure you are comfortable with the ways in which they use information”. Users are thus not only expected to read the privacy policies of every third-party application, but also to report applications that may violate privacy principles. In this sense, Facebook not only waives responsibility for third-party privacy breaches, but also places a further regulatory onus on the user.&lt;/p&gt;
&lt;p&gt;As the program guidelines express, the safe harbor relies to a great degree on enforcement by the private sector. However, such a self-regulatory framework may lead the industry into a state of regulatory malaise. Under the safe harbor program, Facebook must ensure that the privacy practices of third parties are adequate; yet at the same time the company may waive its responsibility for third-party compliance with safe harbor principles. It therefore remains unclear where responsibility for third parties exactly lies. When third parties are not directly answerable to the governing bodies of the safe harbor program, and when first parties can waive responsibility for their practices, where does the incentive to effectively regulate third parties come from?&lt;/p&gt;
&lt;p&gt;While Facebook may in fact take reasonable legal and technical measures to ensure third-party compliance, the room for potential dissonance between speech and deed is worrisome. Facebook is required to ensure that third parties provide “&lt;a href="http://www.export.gov/safeharbor/eu/eg_main_018476.asp"&gt;at least the same level of privacy protection&lt;/a&gt;” as it does. In practice, however, this is not yet the case. A quick survey of twelve of the most popular Platform Applications in the gaming category showed&lt;a name="_ednref6" href="#_edn6"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; that third parties are not granting their users the “same level of privacy protection”[5]. For example, section 9.2.3 of Facebook’s “&lt;a href="http://www.facebook.com/terms.php"&gt;Rights and Responsibilities&lt;/a&gt;” for developers/operators of applications and sites states that they must “have a privacy policy or otherwise make it clear to users what user data you are going to use and how you will use, display, or share that data”.&lt;/p&gt;
&lt;p&gt;However, of the twelve gaming applications surveyed, four companies failed to make privacy policies available to users &lt;em&gt;before&lt;/em&gt; users granted the application access to their personal information, including that of their friends&lt;a name="_ednref7" href="#_edn7"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;/span&gt;&lt;/span&gt;&lt;/a&gt;[6]. After a search of the websites of these four social gaming companies, two were found to have posted no privacy policy on their central websites at all. This practice is in direct breach of the contract made between these companies and Facebook, as mentioned above. Beyond the failure of many applications to clearly post privacy policies, many of the provisions set out in these policies were questionable vis-à-vis safe harbor principles.&lt;/p&gt;
&lt;p&gt;For example, Zynga, maker of the popular games Mafia Wars and FarmVille, reserves the right to “maintain copies of your content indefinitely”. This practice is contrary to the Safe Harbor principle that information should not be kept for longer than is required to run a service. Electronic Arts maintains similar data-retention provisions in its privacy policy. Such practices are all the more worrisome given that both companies also reserve the right to collect information on users from other sources to supplement the profiles they hold, including (but not limited to) newspapers and Internet sources such as blogs, instant messaging services, and other games. It is also notable that only one of the twelve social gaming companies surveyed directly participates in the safe harbor program.&lt;/p&gt;
&lt;p&gt;In addition to the difficulty of ensuring that safe harbor principles are adhered to by third parties, the information asymmetries which exist between first-party sites, citizens, and governance bodies vis-à-vis third parties complicate this model. Foremost, it is clear that Facebook, despite its resources, cannot keep tabs on the practices of all of its applications. This calls into question whether industry self-regulation can really guarantee that privacy is respected by third parties in this context. Furthermore, citizens’ limited knowledge and understanding of how third parties use their information is particularly problematic when a system relies so heavily on users to report suspected privacy breaches. The same is likely true of governments. As one legal scholar, promoting a more laissez-faire approach to third-party regulation, notes, multiple and invisible third-party relationships present challenges to traditional forms of legal regulation&lt;a name="_ednref8" href="#_edn8"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;/span&gt;&lt;/span&gt;&lt;/a&gt;[7].&lt;/p&gt;
&lt;p&gt;In an “open” social ecosystem, the sheer volume of data flowing between users of social networking sites and third-party players has become increasingly difficult to regulate effectively. While the safe harbor program has been successful in establishing best practices and minimum standards for data privacy, it is also clear that governance bodies and public interest groups have focused most of their attention on large industry players such as Facebook. This has left smaller third-party players on social networking sites in the shadows of any substantive regulatory concern. If one thing has become clear, it is that governments may no longer be able to effectively govern the flows of data in the burgeoning context of “open data”.&lt;/p&gt;
&lt;p&gt;As I have demonstrated, it remains questionable whether Facebook can effectively regulate third parties’ data collection practices. Imposing more stringent responsibilities on safe harbor participants could be a positive step. It would arguably be undue to impose liability on social networking sites for the data breaches of third parties. However, it is not unreasonable to require sites like Facebook to go beyond setting “minimum standards” for data privacy towards more active enforcement, even if through TRUSTe or another regulatory body. If the safe harbor is to be effective, it cannot allow program participants to simply waive liability for third-party privacy practices. Otherwise, the indemnity granted to third parties on social networking sites may render the safe harbor program more effective at sustaining the non-liability of third parties than at protecting the data privacy of citizens.&lt;/p&gt;
&lt;div&gt;&lt;/div&gt;
&lt;div&gt;
&lt;hr align="left" size="1" width="33%" /&gt;

&lt;/div&gt;
&lt;p class="discreet"&gt;&lt;a name="_edn1" href="#_ednref1"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;/span&gt;&lt;/span&gt;&lt;/a&gt;[1] Official Directive 95/46/EC&lt;/p&gt;
&lt;p class="discreet"&gt;&lt;a name="_edn2" href="#_ednref2"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;/span&gt;&lt;/span&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p class="discreet"&gt;&lt;a name="_edn3" href="#_ednref3"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;/span&gt;&lt;/span&gt;&lt;/a&gt;[2] 95/46/EC&lt;/p&gt;
&lt;p class="discreet"&gt;[3] Ibid&lt;/p&gt;
&lt;p class="discreet"&gt;&lt;a name="_edn4" href="#_ednref4"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;/span&gt;&lt;/span&gt;&lt;/a&gt;&lt;a name="_edn5" href="#_ednref5"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;/span&gt;&lt;/a&gt;[4] See Acquisit,
A. a. (n.d.). Imagined Communities: Awareness, Information Sharing, and Privacy
on Facebook. &lt;em&gt;PET 2006&lt;/em&gt;&lt;/p&gt;
&lt;p class="discreet"&gt;&lt;a name="_edn6" href="#_ednref6"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;/span&gt;&lt;/span&gt;&lt;/a&gt;[5] Of the Privacy Policy browsed include, Zynga, Rock
You!, Crowdstar, Mind Jolt, Electronic Arts, Pop Cap Games, Slash Key, Playdom,
Meteor Games, Broken Bulb Studios, Wooga, and American Global Network.&lt;/p&gt;
&lt;p class="discreet"&gt;&lt;a name="_edn7" href="#_ednref7"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;/span&gt;&lt;/span&gt;&lt;/a&gt;[6] By adding an application, users are also sharing with
third parties the information of their friends if they do not specifically &amp;nbsp;opt out of this practice.&lt;/p&gt;
&lt;p class="discreet"&gt;[7]See&lt;strong&gt;
&lt;/strong&gt;&amp;nbsp;Milina, S. (2003).
Let the Market Do its Job: Advocating an Integrated Laissez-Faire Approach to
Online Profiling. &lt;em&gt;Cardozo Arts and Entertainment Law Journal&lt;/em&gt; .&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/does-the-safe-harbor-program-adequately-address-third-parties-online'&gt;https://cis-india.org/internet-governance/blog/does-the-safe-harbor-program-adequately-address-third-parties-online&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>rebecca</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Facebook</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Social Networking</dc:subject>
    

   <dc:date>2011-08-02T07:19:34Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/divergence-between-the-general-data-protection-regulation-and-the-personal-data-protection-bill-2019">
    <title>Divergence between the General Data Protection Regulation and the Personal Data Protection Bill, 2019</title>
    <link>https://cis-india.org/internet-governance/blog/divergence-between-the-general-data-protection-regulation-and-the-personal-data-protection-bill-2019</link>
    <description>
        &lt;b&gt;&lt;/b&gt;
        
&lt;p&gt;Our note on the divergence between the General Data Protection Regulation and the Personal Data Protection Bill can be downloaded as a PDF &lt;a href="https://cis-india.org/internet-governance/divergence-between-the-gdpr-and-pdp-bill-2019" class="internal-link" title="Divergence between the GDPR and PDP Bill 2019"&gt;here&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;The European Union’s General Data Protection Regulation (GDPR), which replaced the 1995 EU Data Protection Directive, came into effect in May 2018. It harmonises data protection regulations across the European Union. In India, the Ministry of Electronics and
Information Technology had constituted a Committee of Experts (chaired by
Justice Srikrishna) to frame recommendations for a data protection framework in
India. The Committee submitted its report and a draft Personal Data Protection
Bill in July 2018 (2018 Bill). Public comments were sought on the bill till
October 2018. The Central Government revised the Bill and introduced the
revised version of the Personal Data Protection Bill (PDP Bill) on December 11,
2019 in the Lok Sabha.&lt;/p&gt;
&lt;p&gt;The PDP Bill has incorporated certain
aspects of the GDPR, such as requirements for notice to be given to the data
principal, consent for processing of data, establishment of a data protection
authority, etc. However, there are some differences, and in this note we highlight the areas of divergence between the two. The note covers only provisions that are common to the GDPR and the PDP Bill; it does not cover the provisions on (i) the Appellate Tribunal; (ii) Finance, Accounts and Audit; and (iii) Non-Personal Data.&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/divergence-between-the-general-data-protection-regulation-and-the-personal-data-protection-bill-2019'&gt;https://cis-india.org/internet-governance/blog/divergence-between-the-general-data-protection-regulation-and-the-personal-data-protection-bill-2019&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Pallavi Bedi</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2020-02-21T11:08:50Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/major-security-flaw-namo-app">
    <title>Developer team fixed vulnerabilities in Honorable PM's app and API</title>
    <link>https://cis-india.org/internet-governance/blog/major-security-flaw-namo-app</link>
    <description>
        &lt;b&gt;The official app of Narendra Modi, the Indian Prime Minister, was found to contain a security flaw in 2015 that exposed millions of people's personal data.  A few days ago a very similar flaw was reported again.  This post by Bhavyanshu Parasher, who found the flaw and sought to get it fixed last year, explains the technical details behind the security vulnerability.&lt;/b&gt;
        &lt;p&gt;&lt;strong&gt;This blog post has been authored by Bhavyanshu Parasher&lt;/strong&gt;. The original post can be&lt;a class="external-link" href="https://bhavyanshu.me/major-security-flaw-pm-app/09/29/2015"&gt; read here&lt;/a&gt;.&lt;/p&gt;
&lt;hr /&gt;
&lt;h2 style="text-align: justify; "&gt;What were the issues?&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;The main issue was how the app was communicating with the API served by narendramodi.in.&lt;/span&gt;&lt;/p&gt;
&lt;div id="_mcePaste" style="text-align: justify; "&gt;&lt;ol&gt;
&lt;li&gt;I was able to extract private data, like email addresses, of each registered user just by iterating over user IDs.&lt;/li&gt;
&lt;li&gt;There was no authentication check for API endpoints. For example, I was able to comment as any user just by hand-crafting the requests.&lt;/li&gt;
&lt;li&gt;The API was still being served over HTTP instead of HTTPS.&lt;/li&gt;
&lt;/ol&gt;&lt;/div&gt;
&lt;h3 style="text-align: justify; "&gt;Fixed&lt;/h3&gt;
&lt;ol style="text-align: justify; "&gt;
&lt;li&gt;The most important issue of all. Unauthorized access to personal info, like email addresses, is fixed. I have tested it and can confirm it.&lt;/li&gt;
&lt;li&gt;A check to verify if a valid user is making the request to API endpoint is fixed. I have tested it and can confirm it.&lt;/li&gt;
&lt;li&gt;Blocked HTTP. Every response is served over HTTPS. People on older versions (which were served over HTTP) will get a message regarding this. I have tested it. It says something like “Please update to the latest version of the Narendra Modi App to use this feature and access the latest news and exciting new features”. It’s good that they have figured out a way to deal with people running older versions of the app. At least now they will update the app.&lt;/li&gt;
&lt;/ol&gt;
&lt;h2 style="text-align: justify; "&gt;Detailed Vulnerability Disclosure&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;Found major security loophole in how the app accesses the “api.narendramodi.in/api/” API. At the time of disclosure, API was being served over “HTTP” as well as “HTTPS”. People who were still using the older version of the app were accessing endpoints over HTTP. This was an issue because data (passwords, email addresses) was being transmitted as plain text. In simple terms, your login credentials could easily be intercepted. MITM attack could easily fetch passwords and email addresses. Also, if your ISP keeps log of data, which it probably does, then they might already have your email address, passwords etc in plain text. So if you were using this app,&lt;strong&gt; I would suggest you to change your password immediately&lt;/strong&gt;. Can’t leave out a possibility of it being compromised.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Another major problem was that the token needed to access API was giving a false sense of security to developers. The access token could easily be fetched &amp;amp; anyone could send hand-crafted HTTP requests to the server. It would result in a valid JSON response without authenticating the user making the request. This included accessing user-data (primarily email address, fb profile pictures of those registered via fb) for any user and posting comments as any registered user of the app. There was no authentication check on the API endpoint. Let me explain you with a demo.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The API endpoint to fetch user profile information (email address) was getprofile. Before the vulnerability was fixed, the endpoint was accessible via “http://www.narendramodi.in/api/getprofile?userid=useridvalue&amp;amp;token=sometokenvalue”. As you can see, it only required two parameters. userid, which we could easily iterate on starting from 1 &amp;amp; token which was a fixed value. There was no authentication check on API access layer. Hand-crafting such requests resulted in a valid JSON response which exposed critical data like email addresses of each and every user. I quickly wrote a very simply script to fetch some data to demonstrate. Here is the sample output for xrange(1,10).&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;img src="https://cis-india.org/home-images/App.png/@@images/7bec3ca6-0808-4d19-9711-bc084b507f61.png" alt="App" class="image-inline" title="App" /&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Not just email addresses, using this method you could spam on any article pretending to be any user of the app. There was no authentication check as to who was making what requests to the API. See,&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;img src="https://cis-india.org/home-images/copy_of_App.png/@@images/2e499adb-b621-4bc4-a490-f8957c9ac1d7.png" alt="App" class="image-inline" title="App" /&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;They have fixed all these vulnerabilities. I still believe it wouldn’t have taken so long if I would have been able to get in touch with team of engineers directly right from the beginning. In future, I hope they figure out an easier way to communicate. Such issues must be addressed as soon as they are found but the communication gap cost us lot of time. The team did a great job by fixing the issues and that’s what matters.&lt;/p&gt;
&lt;hr style="text-align: justify; " /&gt;
&lt;h2 style="text-align: justify; "&gt;Disclosure to officials&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;The email address provided on Google play store returned a response stating “The email account that you tried to reach is over quota”. Had to get in touch with authorities via twitter.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Vulnerability disclosed to authorities on 30th sep, 2015 around 5:30 AM&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;img src="https://cis-india.org/home-images/Tweet1.png" alt="Tweet 1" class="image-inline" title="Tweet 1" /&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;After about 30 hours of reporting the vulnerabillity&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;img src="https://cis-india.org/home-images/Tweet2.png" alt="Tweet 2" class="image-inline" title="Tweet 2" /&gt;&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Proposed Solution&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Consulted &lt;/span&gt;&lt;a href="https://twitter.com/pranesh_prakash"&gt;@pranesh_prakash&lt;/a&gt;&lt;span&gt; as well regarding the issue.&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;&lt;img src="https://cis-india.org/home-images/Tweet3.png" alt="Tweet 3" class="image-inline" title="Tweet 3" /&gt;&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;After this, I mailed them a solution regarding the issues.&lt;/p&gt;
&lt;hr style="text-align: justify; " /&gt;
&lt;h2 style="text-align: justify; "&gt;Discussion with developer&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;Received &lt;strong&gt;phone call&lt;/strong&gt; from a developer. Discussed possible solutions to fix it.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;strong&gt;The solution that I proposed could not be implemented &lt;/strong&gt;since the vulnerability is caused by a design flaw that should have been thought about right from the beginning when they started developing the app. It just proved how difficult it is to fix such issues for mobile apps. For web apps, it’s lot easier. Why? Because for mobile apps, you need to consider backward compatibility. If they applied my proposed solution, it would crash app for people running the older versions. Main problem is that &lt;strong&gt;people don’t upgrade to latest versions leaving themselves vulnerable to security flaws&lt;/strong&gt;. The one I proposed is a better way of doing it I think but it will break for people using older versions as stated by the developer. Though, they (developers) have come up with solutions that I think would fix most of the issues and can be considered an alternative.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;img src="https://cis-india.org/home-images/Tweet4.png" alt="Tweet 4" class="image-inline" title="Tweet 4" /&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;On Oct 3rd, I received mail from one of the developers who informed me they have fixed it. I could not check it out at that time as I was busy but I checked it around 5 PM. &lt;strong&gt;I can now confirm they have fixed all three issues&lt;/strong&gt;.&lt;/p&gt;
&lt;hr style="text-align: justify; " /&gt;
&lt;h2 style="text-align: justify; "&gt;Update 12/02/2016&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;&lt;a class="external-link" href="http://www.dailyo.in/variety/narendra-modi-namo-app-hacker-security-concerns-javed-khatri-demonetisation-survey-bjp-voter-data/story/1/14347.html"&gt;This vulnerability&lt;/a&gt; in NM app is similar to the one I got fixed last year. Like I said before also, the vulnerability is because of how the API has been designed. They released the same patch which they did back then. Removing email addresses from the JSON output is not really a patch. I wonder why would they introduce personal information in JSON output again if they knew that’s a privacy problem and has been reported by me a year back. He showed how he was able to follow any user being any user. Similarly, I was able to comment on any post using account of any user of the app. When I talked to the developer back then he mentioned it will be difficult to migrate users to a newer/secure version of the app so they are releasing this patch for the meantime. It was more of a backward compatibility issue because of how API was designed. The only solution to this problem is to rewrite the API from scratch and add standard auth methods for API. That should take care of most of vulnerabilities.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Also read:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="external-link" href="http://www.newindianexpress.com/nation/2016/dec/02/narendra-modi-app-hacked-by-youngster-points-out-risk-to-7-million-users-data-1544933--1.html"&gt;Narendra Modi app hacked by youngster, points out risk to 7 million users’ data&lt;/a&gt; (New Indian Express; December 2, 2016)&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="http://indiatoday.intoday.in/story/security-22-year-old-hacks-modi-app-private-data-7-million/1/825661.html"&gt;Security flaw: 22-year-old hacks Modi app and accesses private data of 7 million people&lt;/a&gt; (India Today; December 2, 2016)&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="http://thewire.in/84148/tech-security-namo-api/"&gt;The NaMo App Non-Hack is Small Fry – the Tech Security on Government Apps Is Worse&lt;/a&gt; (The Wire; December 3, 2016)&lt;/li&gt;
&lt;/ul&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/major-security-flaw-namo-app'&gt;https://cis-india.org/internet-governance/blog/major-security-flaw-namo-app&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>pranesh</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Security</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Cyber Security</dc:subject>
    
    
        <dc:subject>Hacking</dc:subject>
    
    
        <dc:subject>Mobile Apps</dc:subject>
    
    
        <dc:subject>Data Management</dc:subject>
    

   <dc:date>2016-12-04T19:08:56Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/demistifying-data-breaches-in-india">
    <title>Demystifying Data Breaches in India</title>
    <link>https://cis-india.org/internet-governance/blog/demistifying-data-breaches-in-india</link>
    <description>
        &lt;b&gt;Despite the rate at which data breaches occur and are reported in the media, there seems to be little information about how and when they are resolved. This post examines the discourse on data breaches in India with respect to their  historical forms, with a focus on how the specific terminology to describe data security incidents has evolved in mainstream news media reportage.

&lt;/b&gt;
        &lt;p&gt;Edited by Arindrajit Basu and Saumyaa Naidu&lt;/p&gt;
&lt;hr /&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;India saw a &lt;a href="https://theprint.in/india/despite-62-drop-in-data-breaches-india-among-top-5-nations-targeted-by-hackers-study-finds/917197/"&gt;62% drop in data breaches in the first quarter of 2022&lt;/a&gt;. Yet, it ranked fifth on the list of countries most hit by cyberattacks according to a 2022 &lt;a href="https://surfshark.com/blog/data-breach-statistics-by-country"&gt;report by Surfshark&lt;/a&gt;, a Netherlands-based VPN company. Another report &lt;a href="https://analyticsindiamag.com/the-ridiculous-17-5-cr-for-a-data-breach/"&gt;on the cost of data breaches researched by the Ponemon Institute and published by IBM&lt;/a&gt; reveals that the breach of about 29500 records between March 2021 and March 2022 resulted in a 25% increase in the average cost from INR 165 million in 2021 to INR 176 million in 2022.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;These statistics are certainly a cause for concern, especially in the context of India’s rapidly burgeoning digital economy shaped by the pervasive platformization of private and public services such as welfare, banking, finance, health, and shopping among others. Despite the rate at which data breaches occur and are reported in the media, there seems to be little information about how and when they are resolved. This post examines the discourse on data breaches in India with respect to their historical forms, with a focus on how the specific terminology to describe data security incidents has evolved in mainstream news media reportage.&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;While expert articulations of cybersecurity in general and data breaches in particular tend to predominate the public discourse on data privacy, this post aims to situate broader understandings of data breaches within the historical context of India’s IT revolution and delve into specific concepts and terminology that have shaped the broader discourse on data protection. The late 1990s and early 2000s offer a useful point of entry into the genesis of the data security landscape in India.&lt;/span&gt;&lt;/p&gt;
&lt;h3&gt;&lt;span&gt;&lt;/span&gt;&lt;span&gt;Data Breaches and their Predecessor Forms&lt;/span&gt;&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;&lt;/span&gt;&lt;span&gt;The articulation of data security concerns around the late 1990s and early 2000s isn’t always consistent in deploying the phrase, ‘data breach’ to signal cybersecurity concerns in India. The terms such as ‘data/ identity theft’ and ‘data leak’ figure prominently in the public articulation of concerns with the handling of personal information by IT systems, particularly in the context of business process outsourcing (BPO) and e-commerce activities. Other pertinent terms such as “security breach”, “data security”, and ‘“cyberfraud” also capture the specificity of growing concerns around outsourced data to India. At the time, i.e. around mid-2000s regulatory frameworks were still evolving to accommodate and address the complexities arising from a dynamic reconfiguration of the telecommunications and IT landscape in India.&lt;/span&gt;&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;Some of the formative cases that instantiate the usage of the aforementioned terms are instructive to understand shifts in the reporting of such incidents over time. The earliest case during that period concerns&lt;a href="https://www.stop-source-code-theft.com/source-code-theft-cases-in-india/"&gt; a 2002 case concerning the theft and sale of source code&lt;/a&gt; by an IIT Kharagpur student who intended to sell the code to two undercover FBI agents who worked with the CBI to catch the thief. A straightforward case of data theft was framed by media stories around the time as a &lt;a href="https://timesofindia.indiatimes.com/iitian-held-for-stealing-software-source-code/articleshow/20389713.cms"&gt;cybercrime involving the illegal sale&lt;/a&gt; of the source code of a software package, as &lt;a href="https://economictimes.indiatimes.com/ip-laws-lax-but-us-firm-bets-on-india/articleshow/696197.cms?from=mdr"&gt;software theft of intellectual property in the context of outsourcing&lt;/a&gt; and as an instance of &lt;a href="https://www.computerworld.com/article/2573515/at-risk-offshore.html"&gt;industrial espionage in poor nations without laws protecting foreign companies&lt;/a&gt;. This case became the basis of the earliest calls for the protection of data privacy and security in the context of the Indian BPO sector. The Indian IT Act, 2000 at the time only covered &lt;a href="http://pavanduggal.com/wp-content/uploads/2016/01/India-Responds-to-Growing-Concerns-Over-Data-Security.pdf"&gt;unauthorized access and data theft from computers and networks without any provisions for data protection, interception or computer forgery&lt;/a&gt;. 
The BPO boom in India brought with it &lt;a href="https://blj.ucdavis.edu/archives/vol-6-no-2/offshore-outsourcing-to-india.html"&gt;employment opportunities for India’s English-speaking, educated youth but in the absence of concrete data privacy legislation&lt;/a&gt;, the country was regarded as an unsafe destination for outsourcing aside from the political ramifications concerning the loss of American jobs.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;In a major 2005 incident, employees of the Mphasis BFL call centre in Pune extracted sensitive bank account information of Citibank’s American customers to divert INR 1.90 crore into new accounts set up in India. The media coverage of this incident calls it &lt;a href="https://www.indiatoday.in/magazine/economy/story/20050502-pune-call-centre-fraud-rattles-india-booming-bpo-sector-787790-2005-05-01"&gt;India’s first outsourcing cyberfraud and a well planned scam&lt;/a&gt;, a &lt;a href="https://economictimes.indiatimes.com/mphasis-call-centre-fraud-net-widens/articleshow/1077097.cms"&gt;cybercrime in a globalized world&lt;/a&gt;, and a case of &lt;a href="https://timesofindia.indiatimes.com/home/sunday-times/deep-focus/indias-first-bpo-scam-unraveled/articleshow/1086438.cms"&gt;financial fraud and a scam&lt;/a&gt; that required no hacking skills, and a &lt;a href="https://www.infoworld.com/article/2668975/indian-call-center-workers-charged-with-citibank-fraud.html"&gt;case of data theft and misuse&lt;/a&gt;. Within the ambit of cybercrime, media reports of these incidents refer to them as cases of “fraud”, “scam” and “theft''.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;Two other incidents in 2005 set the trend for a critical spotlight on data security practices in India. In a &lt;a href="http://news.bbc.co.uk/2/hi/south_asia/4619859.stm"&gt;June 2005 incident, an employee of a Delhi-based BPO firm, Infinity e-systems, sold the account numbers and passwords of 1000 bank customers &lt;/a&gt;to the British Tabloid, The Sun. The Indian newspaper, Telegraph India, carried an online story headlined, “&lt;a href="https://www.telegraphindia.com/india/bpo-blot-in-british-backlash-indian-sells-secret-data/cid/873737"&gt;BPO Blot in British Backlash: Indian Sells Secret Data&lt;/a&gt;,” which reported that the employee, Kkaran Bahree, 24, was set up by a British journalist, Oliver Harvey. Harvey filmed Bahree accepting wads of cash for the stolen data. Bahree’s theft of sensitive information is described both as a data fraud and a leak in the above 2005 BBC story by Soutik Biswar. Another story on the incident calls it a “&lt;a href="https://www.rediff.com/money/2005/jun/24bpo3.htm"&gt;scam” involving the leakage of credit card information&lt;/a&gt;. The use of the term ‘leak’ appears consistently across other media accounts such as a &lt;a href="https://timesofindia.indiatimes.com/city/delhi/esearch-bpo-employee-sacked-still-missing/articleshow/1153017.cms"&gt;2005 story on Karan Bahree in the Times of India&lt;/a&gt; and another story in the Economic Times about the Australian Broadcasting Corporation’s (ABC) sting operation similar to the one in Delhi, describing the scam by the &lt;a href="https://economictimes.indiatimes.com/hot-links/bpo/karan-bahree-part-ii-shot-in-australia/articleshow/1201347.cms?from=mdr"&gt;fraudsters as a leak&lt;/a&gt; of the online information of Australians. Another media account of the coverage describes the incident in more generic terms such as an “&lt;a href="https://www.tribuneindia.com/2005/20050625/edit.htm"&gt;outsourcing crime&lt;/a&gt;”.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;The other case concerned &lt;a href="https://www.taylorfrancis.com/chapters/mono/10.4324/9781315610689-16/political-economy-data-security-bpo-industry-india-alan-chong-faizal-bin-yahya"&gt;four former employees of Parsec technologies who stole classified information and diverted calls from potential customers&lt;/a&gt;, causing a sudden drop in the productivity of call centres managed by the company in November 2005. Another call centre &lt;a href="http://news.bbc.co.uk/1/hi/uk/7953401.stm"&gt;fraud came to light in 2009 through a BBC sting operation in which British reporters went to Delhi &lt;/a&gt;and secretly filmed a deal with a man selling credit card and debit card details obtained from Symantec call centres, which sold software made by Norton. This BBC story uses the term “breach” to refer to the incident.&lt;/p&gt;
&lt;p dir="ltr"&gt;In the broader framing of these cases generally understood as cybercrime, which received transnational media coverage, the terms “fraud”, “leak”, “scam”, and “theft” appear interchangeably. The term “data breach” does not seem to be a popular or common usage in these media accounts of the BPO-related incidents. A broader sense of breach (of confidentiality, privacy) figures in the media reportage in &lt;a href="https://economictimes.indiatimes.com/hot-links/bpo/cyber-crimes-can-the-west-trust-indian-bpos/articleshow/1157115.cms?from=mdr"&gt;implicitly racial terms of cultural trust&lt;/a&gt;, as a matter of &lt;a href="https://www.news18.com/news/business/bpo-staff-need-ethical-training-poll-248442.html"&gt;ethics and professionalism&lt;/a&gt; and in the &lt;a href="https://www.news18.com/news/business/sting-op-may-spell-doom-for-bpos-248260.html"&gt;language of scandal &lt;/a&gt;in some cases.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;These early cases typify a specific kind of cybercrime concerning the theft or misappropriation of outsourced personal data belonging to British or American residents. What’s remarkable about these cases is the utmost sensitivity of the stolen personal information including financial details, bank account and credit/debit card numbers, passwords, and in one case, source code. While these cases rang the alarm bells on the Indian BPO sector’s data security protocols, they also directed attention to concerns around &lt;a href="https://economictimes.indiatimes.com/hot-links/bpo/cyber-crimes-can-the-west-trust-indian-bpos/articleshow/1157115.cms?from=mdr"&gt;the training of Indian employees on the ethics of data confidentiality and vetting through psychometric tests&lt;/a&gt; for character assessment. In the wake of these incidents, the National Association of Software and Service Companies (NASSCOM), an Indian non-governmental trade and advocacy group,&lt;a href="https://www.computerworld.com/article/2547959/outsourcing-to-india--dealing-with-data-theft-and-misuse.html"&gt; launched a National Skills Registry for IT professionals to enable employers to conduct background checks&lt;/a&gt; in 2006.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;These data theft incidents earned India a global reputation of an unsafe destination for business process outsourcing, seen to be lacking both, a culture of maintaining data confidentiality and concrete legislation for data protection at the time. Importantly, the incidents of data theft or misappropriation were also traceable back to a known source, a BPO employee or a group of malefactors, who often sold sensitive data belonging to foreign nationals to others in India.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;The phrase “data leak” also caught on in another register in the context of the widespread use of camera-equipped mobile phones in India. The 2004 Delhi MMS case offers an instance of a date leak, recapitulating the language of scandal in moralistic terms.&lt;/p&gt;
&lt;h3 dir="ltr"&gt;The Delhi MMS Case&lt;/h3&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;The infamous 2004 incident involved two underage Delhi Public School (DPS) students who recorded themselves in a sexually explicit act on a cellular phone. After a fall out, the male student passed the low-resolution clip on to his friend in which his female friend’s face is seen. The clip, distributed far and wide in India, ended up on the famous e-shopping and auction website, bazee.com leading to &lt;a href="https://indiancaselaw.in/avnish-bajaj-vs-state-dps-mms-scandal-case/"&gt;the arrest of the website’s CEO Avinash Bajaj for hosting the listing for sale&lt;/a&gt;. Another similar case in 2004 mimicked the mechanics of visual capture through hand-held MMS-enabled mobile phones. A two-minute MMS of a top South-Indian actress &lt;a href="https://timesofindia.indiatimes.com/india/web-of-sleaze-now-nude-video-of-top-actress/articleshow/966048.cms"&gt;taking a shower went viral on the Internet in 2004, the year when another MMS of two prominent Bollywood actors kissing&lt;/a&gt; had already done the rounds. The &lt;a href="https://www.journals.upd.edu.ph/index.php/plaridel/article/view/2392"&gt;MMS case also marked the onset of a national moral panic around the amateur uses of mobile phone technologies&lt;/a&gt;, capable of corrupting young Indian minds under a sneaky regime of new media modernity. The MMS case, not strictly the classic case of a data breach - non-visual information generally stored in databases - became an iconic case of a data leak framed in the media as &lt;a href="https://www.telegraphindia.com/india/scandal-in-school-shakes-up-delhi/cid/1667531"&gt;a scandal that shocked the country&lt;/a&gt;, with calls for the regulation of mobile phone use in schools. 
The case continued its scandalous afterlife in a &lt;a href="https://www.heraldgoa.in/Edit/dev-ds-leni-has-a-dps-mms-scandal-connection-/21344"&gt;2009 Bollywood film, Dev D&lt;/a&gt; and another &lt;a href="https://indianexpress.com/article/entertainment/entertainment-others/delhi-mms-scandal-inspires-dibakars-love-sex-aur-dhoka/"&gt;2010 film, Love, Sex and Dhokha&lt;/a&gt;,&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;Taken together, the BPO data thefts and frauds and the data leak scandals prefigure the contemporary discourse on data breaches in the second decade of the 21st century, or what may also be called the Decade of Datafication. The launch of the Indian biometric identity project, Aadhaar, in 2009, which linked access to public services and welfare delivery with biometric identification, resulted in large-scale data collection of the scheme’s subscribers. Such linking raised the spectre of state surveillance as alleged by the critics of Aadhaar, marking a watershed moment in the discourse on data privacy and protection.&lt;/p&gt;
&lt;h3 dir="ltr"&gt;Aadhaar Data Security and Other Data Breaches&lt;/h3&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;Aadhaar was challenged in the Indian Supreme Court in 2012 when &lt;a href="https://www.outlookindia.com/website/story/worries-about-the-aadhaar-monster/296790"&gt;it was made mandatory for welfare and other services such as banking, taxation and mobile telephony&lt;/a&gt;. The national debate on the status of privacy as a cultural practice in Indian society and a fundamental right in the Indian Constitution led to two landmark judgments - the &lt;a href="https://main.sci.gov.in/supremecourt/2012/35071/35071_2012_Judgement_24-Aug-2017.pdf"&gt;2017 Puttaswamy ruling&lt;/a&gt; holding privacy to be a constitutional right subject to limitations and &lt;a href="https://indiankanoon.org/doc/127517806/"&gt;the 2018 Supreme Court judgment holding mandatory Aadhaar to be constitutional only for welfare and taxation but no other service&lt;/a&gt;.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;While these judgments sought to rein in Aadhaar’s proliferating mandatory uses, biometric verification remained the most common mode of identity authentication with &lt;a href="https://www.businesstoday.in/latest/trends/story/aadhaar-not-mandatory-yet-organisations-pose-it-as-a-mandatory-document-335550-2022-05-29"&gt;most organizations claiming it to be mandatory for various purposes&lt;/a&gt;. During the same period from 2010 onwards, a range of data security events concerning Aadhaar came to light. These included &lt;a href="https://www.firstpost.com/tech/news-analysis/aadhaar-security-breaches-here-are-the-major-untoward-incidents-that-have-happened-with-aadhaar-and-what-was-actually-affected-4300349.html"&gt;app-based flaws, government websites publishing Aadhaar details of subscribers, third party leaks of demographic data, duplicate and forged Aadhaar cards and other misuses&lt;/a&gt;.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;In 2015, the Indian government launched its ambitious &lt;a href="https://indiancc.mygov.in/wp-content/uploads/2021/08/mygov-10000000001596725005.pdf"&gt;Digital India Campaign to provide government services to Indian citizens&lt;/a&gt; through online platforms. Yet, data security breach incidents continued to increase, particularly the trade in the sale and purchase of sensitive financial information related to bank accounts and credit card numbers. The online availability of &lt;a href="https://www.livemint.com/Industry/l5WlBjdIDXWehaoKiuAP9J/India-unprepared-to-tackle-online-data-security-report.html"&gt;a rich trove of data, accessible via a simple Google search without the use of any extractive software or hacking skills &lt;/a&gt;within a thriving shadow economy of data buyers and sellers makes India a particularly vulnerable digital economy, especially in the absence of robust legislation. The lack of awareness around digital crimes and low digital literacy further exacerbates the situation given that datafication via government portals, e-commerce, and online apps has outpaced the enforcement of legislative frameworks for data protection and cybersecurity.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;In the context of Aadhaar data security issues, the term “data leak” seems to have more traction in media stories followed by the term “security breach”. Given the complexity of the myriad ways in which Aadhaar data has been breached, terms such as &lt;a href="https://techcrunch.com/2022/06/13/aadhaar-leak-pm-kisan/?guccounter=1&amp;amp;guce_referrer=aHR0cHM6Ly93d3cuZ29vZ2xlLmNvbS8&amp;amp;guce_referrer_sig=AQAAADvQXtC19Gj80LSKVc5jLwnRsREalvM2f6dV3N9KmCs8be6_1Zbvu3J6abPmBxhLlUooLiOjg4JktYDDCXr0OYYvOZ5XFlXa6DfCJk97TvMXM-cs3uJbCJBA-ePqvAC5K4qGZSyDB4OykMEOIKXJpB0CTOourPRc5dBxFFq5JXlB"&gt;data leak and exposure&lt;/a&gt; (of &lt;a href="https://zeenews.india.com/personal-finance/aadhaar-data-breach-over-110-crore-indian-farmers-aadhaar-card-data-compromised-2473666.html"&gt;11 crore Indian farmers’ sensitive information&lt;/a&gt;) add to the specificity of the data security compromise. The term “fraud” also makes a comeback in the context of &lt;a href="https://www.business-standard.com/article/economy-policy/india-s-aadhaar-id-system-delivers-benefits-but-at-risk-of-widespread-fraud-122062400124_1.html"&gt;Aadhaar-related data security incidents&lt;/a&gt;. 
These cases represent a mix of data frauds involving&lt;a href="https://economictimes.indiatimes.com/news/india/alarm-over-fake-id-printing-websites-using-customer-data-for-cyber-fraud/articleshow/94742646.cms"&gt; fake identities&lt;/a&gt; and the &lt;a href="https://indianexpress.com/article/cities/delhi/in-new-age-data-theft-fraudsters-steal-thumb-prints-from-land-registries-7914530/"&gt;theft of thumb prints &lt;/a&gt;from land registries, for instance, as well as inadvertent data leaks in numerous incidents involving &lt;a href="https://techcrunch.com/2019/01/31/aadhaar-data-leak/"&gt;government employees in Jharkhand&lt;/a&gt;, &lt;a href="https://www.firstpost.com/india/aadhaar-data-leak-details-of-7-82-cr-indians-from-ap-and-telangana-found-on-it-grids-database-6448961.html"&gt;voter ID information of Indian citizens in Andhra Pradesh and Telangana&lt;/a&gt;, and &lt;a href="https://www.thehindu.com/sci-tech/technology/major-aadhaar-data-leak-plugged-french-security-researcher/article26584981.ece"&gt;activist reports of Indian government websites leaking Aadhaar data&lt;/a&gt;.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;Aadhaar-related data security events parallel the increase in corporate data breaches during the decade of datafication. The term “data leak” again alternates with the term “data breach” in most media accounts while other terms such as “theft” and “scam” all but disappear in the media coverage of corporate data breaches.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;From 2016 onwards, incidents of corporate data breaches in India continued to rise. A massive &lt;a href="https://thewire.in/banking/debit-card-breach-india-banking"&gt;debit card data breach involving the YES Bank ATMs and point-of-sale (PoS) machines &lt;/a&gt;compromised through malware between May and July of 2016 resulted in the exposure of ATM PINs and non-personal identifiable information of customers. It went &lt;a href="https://www.livemint.com/Industry/Ope7B0jpjoLkemwz6QXirN/SBI-Yes-Bank-MasterCard-deny-data-breach-of-own-systems.html"&gt;undetected for nearly three&lt;/a&gt; months. Another data leak in 2018 concerned a &lt;a href="https://www.zdnet.com/article/another-data-leak-hits-india-aadhaar-biometric-database/"&gt;system run by Indane, a state-owned utility company, which allowed anyone to download private information on all Aadhaar holders &lt;/a&gt;including their names, services they were connected to and the unique 12-digit Aadhaar number. Data breaches continued to be reported in India concurrent with the incidents of data mismanagement related to Aadhaar. Some &lt;a href="https://www.csoonline.com/article/3541148/the-biggest-data-breaches-in-india.html"&gt;prominent data breaches included &lt;/a&gt;a cyberattack on the systems of airline data service provider SITA resulting in the leak of Air India passenger data, leakage of the personal details of the Common Admission Test (CAT) applicants, details of credit card and order preferences of Domino’s pizza customers on the dark web, leakage of COVID-19 patients’ test results leaked by government websites, user data of Justpay and Big Basket for sale on the dark web and an SBI data breach among others between 2019 and 2021.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;The media reportage of these data breaches use the term “cyberattack” to describe the activities of hackers and cybercriminals operating within a&lt;a href="https://www.thehindu.com/sci-tech/technology/internet/most-damaging-cybercrime-services-are-cheap-on-the-dark-web/article37004587.ece"&gt; shadow economy or the dark web&lt;/a&gt;. Recent examples of cyberattacks by hackers who leak user data for sale on the dark web include &lt;a href="https://indianexpress.com/article/technology/tech-news-technology/mobikwik-database-leaked-on-dark-web-company-denies-any-data-breach-7251448/"&gt;8.2 terabytes of 110 million sensitive financial data (KYC details, Aadhaar, credit/debit cards and phone numbers) of the payments app MobiKwik users&lt;/a&gt;, &lt;a href="https://www.firstpost.com/tech/news-analysis/dominos-india-data-breach-name-location-mobile-number-email-of-18-crore-orders-up-for-sale-on-dark-web-9650591.html"&gt;180 million Domino’s pizza orders (name, location, emails, mobile numbers),&lt;/a&gt; and &lt;a href="https://techcrunch.com/2022/07/18/cleartrip-data-breach-dark-web/"&gt;Flipkart’s Cleartrip users’ data&lt;/a&gt;. In these incidents again, three terms appear prominently in the media reportage - cyberattack, data breach, and leak. The term “data breach” remains the most frequently used epithet in the media coverage of the lapses of data security. While it alternates with the term “leak” in the stories, the term “data breach” appears consistently across most headlines in the news stories.&lt;/p&gt;
&lt;p dir="ltr"&gt;The exposure of sensitive, personal, and non-personal data by public and private entities in India is certainly a cause for concern, given the ongoing data protection legislative vacuum.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;The media coverage of data breaches tends to emphasize the quantum of compromised user data aside from the types of data exposed. The media framing of these breaches in &lt;a href="https://www.livemint.com/technology/tech-news/indian-firms-lost-176-million-to-data-breaches-last-fiscal-11658914231530.html"&gt;quantitative terms of financial loss&lt;/a&gt; as well as the &lt;a href="https://www.indiatoday.in/technology/news/story/personal-data-of-3-4-million-paytm-mall-users-reportedly-exposed-in-2020-data-breach-1980690-2022-07-27"&gt;magnitude&lt;/a&gt; and the &lt;a href="https://www.moneycontrol.com/news/business/banks/indian-banks-reported-248-data-breaches-in-last-four-years-says-government-8940891.html"&gt;number of breaches&lt;/a&gt; certainly highlights the gravity of these incidents but harm to individual users is often not addressed.&lt;/p&gt;
&lt;h3 dir="ltr"&gt;Evolving Terminology and the Source of Data Harms&lt;/h3&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;The main difference in the media reportage of the BPO cybersecurity incidents during the early aughts and the contemporary context of datafication is the usage of the term, “data breach”, which figures prominently in contemporary reportage of data security incidents but not so much in the BPO-related cybercrimes.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;THe BPO incidents of data theft and the attendant fraud must be understood in the context of the anxieties brought on by a globalizing world of Internet-enabled systems and transnational communications. In most of these incidents regarded as cybercrimes, the language of fraud and scam ventures further to attribute such illegal actions of the identifiable malefactors to cultural factors such as lack of ethics and professionalism.The usage of the term “data leak” in these media reports functions more specifically to underscore a broader lapse in data security as well as a lack of robust cybersecurity laws. The broader term, “breach”, is occasionally used to refer to these incidents but the term, “data breach” doesn’t appear as such.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;The term “data breach” gains more prominence in media accounts from 2009 onwards in the context of Aadhaar and the online delivery of goods and services by public and private players. The term “data breach” is often used interchangeably with the term “leak” within the broader ambit of cyberattacks in the corporate sector. The media reportage frames Aadhaar-related security lapses as instances of security/data breaches, data leaks, fraud, and occasionally scam.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;In contrast to the handful of data security cases in the BPO sector, data breaches have abounded in the second decade of the twenty-first century. What further differentiates the BPO-related incidents to the contemporary data breaches is the source of the data security lapse. Most corporate data breaches remain attributable to the actions of hackers and cybercriminals while the BPO security lapses were traceable back to ex-employees or insiders with access to sensitive data. We also see in the coverage of the BPO-related incidents, the attribution of such data security lapses to cultural factors including a lack of ethics and professionalism often in racial overtones. The media reportage of the BBC and ABC sting operations suggests that the India BPOs lack of preparedness to handle and maintain personal data confidentiality of foreigners point to the absence of a privacy culture in India. Interestingly, this transnational attribution recurs in a different form in the national debate on &lt;a href="https://huffpost.netblogpro.com/archive/in/entry/indians-don-t-care-about-privacy-but-thankfully-the-law-will-teach-them-what-it-means_a_23179031"&gt;Aadhaar and how Indians don’t care about their privacy&lt;/a&gt;.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;The question of the harms of data breaches to individuals is also an important one. In the discourse on contemporary data breaches, the actual material harm to an individual user is rarely ever established in the media reportage and generally framed as potential harm that could be devastating given the sensitivity of the compromised data. The harm is reported to be predominantly a function of organizational cybersecurity weakness or attributed to hackers and cybercriminals.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;The reporting of harm in collective terms of the number of accounts breached, financial costs of a data breach, the sheer number of breaches and the global rankings of countries with the highest reported cases certainly suggests a problem with cybersecurity and the lack of organizational preparedness. However, this collective framing of a data breach’s impact usually elides an individual user’s experience of harm. Even in the case of Aadhaar-related breaches - a mix of leaking data on government websites and other online portals and breaches - the notion of harm owing to exposed data isn’t clearly established. This is, however, different from the &lt;a href="https://scroll.in/article/1013700/six-types-of-problems-aadhaar-is-causing-and-safeguards-needed-immediately"&gt;extensively documented cases of Aadhaar-related issues&lt;/a&gt; in which welfare benefits have been denied, identities stolen and legitimate beneficiaries erased from the system due to technological errors.&lt;/p&gt;
&lt;h3 dir="ltr"&gt;Future Directions of Research&lt;/h3&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;This brief, qualitative foray into the media coverage of data breaches over two decades has aimed to trace the usage of various terms in two different contexts - the Indian BPO-related incidents and the contemporary context of datafication. It would be worth exploring at length, the relationship between frequent reports of data breaches, and the language used to convey harm in the contemporary context of a concrete data protection legislation vacuum. It would be instructive to examine the specific uses of the terms such as “fraud”, “leak”, “scam”, “theft” and “breach” in media reporting of such data security incidents more exhaustively. Such analysis would elucidate how media reportage shapes public perception towards the safety of user data and an anticipation of attendant harm as data protection legislation continues to evolve.&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;Especially with Aadhaar, which represents a paradigm shift in identity verification through digital means, it would be useful to conduct a sentiment analysis of how biometric identity related frauds, scams, and leaks are reported by the mainstream news media. A study of user attitudes and behaviours in response to the specific terminology of data security lapses such as the terms “breach”, “leak”, “fraud”, “scam”, “cybercrime”, and “cyberattack” would further contribute to how lay users understand the gravity of a data security lapse. Such research would go beyond expert understandings of data security incidents that tend to dominate media reportage to elucidate the concerns of lay users and further clarify the cultural meanings of data privacy.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/demistifying-data-breaches-in-india'&gt;https://cis-india.org/internet-governance/blog/demistifying-data-breaches-in-india&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Pawan Singh</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Data Governance</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Data Management</dc:subject>
    

   <dc:date>2022-10-17T16:14:03Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>




</rdf:RDF>
