<?xml version="1.0" encoding="utf-8" ?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:syn="http://purl.org/rss/1.0/modules/syndication/" xmlns="http://purl.org/rss/1.0/">




    



<channel rdf:about="https://cis-india.org/search_rss">
  <title>Centre for Internet and Society</title>
  <link>https://cis-india.org</link>
  
  <description>
    
            These are the search results for the query, showing results 181 to 195.
        
  </description>
  
  
  
  
  <image rdf:resource="https://cis-india.org/logo.png"/>

  <items>
    <rdf:Seq>
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/india-legal-live-june-21-2018-data-privacy"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/is-privacy-obsolete"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/economic-times-june-10-2018-sunil-abraham-why-npci-and-facebook-need-urgent-regulatory-attention"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/bloomberg-quint-june-9-2018-draft-bill-seeks-to-revolutionise-data-collection-storage-in-india"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/comments-on-the-draft-national-policy-on-official-statistics"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/economic-times-june-6-2018-akshatha-m-ec-disables-easy-access-to-electoral-data-across-states"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/regulation-of-cross-border-transfers-of-personal-data-in-asia"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/economic-times-may-29-mugdha-variyar-alexas-recording-leak-in-us-echoes-privacy-issues-here"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/design-concerns-in-creating-privacy-notices"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/emerging-technologies-issues-way-forward"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/privacy-in-the-digital-age-addressing-common-challenges-seizing-opportunities"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/indian-intermediary-liability-regime"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/epw-amber-sinha-may-18-2018-for-indias-data-protection-regime-to-be-efficient-policymakers-should-treat-privacy-as-a-social-good"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/hack-read-waqas-may-15-2018-indian-cricket-board-exposes-personal-data-of-thousands-of-players"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/the-wire-karan-saini-may-11-2018-aadhaar-remains-an-unending-security-nightmare-for-a-billion-indians"/>
        
    </rdf:Seq>
  </items>

</channel>


    <item rdf:about="https://cis-india.org/internet-governance/news/india-legal-live-june-21-2018-data-privacy">
    <title>Data Privacy: Footprints on the Web</title>
    <link>https://cis-india.org/internet-governance/news/india-legal-live-june-21-2018-data-privacy</link>
    <description>
        &lt;b&gt;Technology has made data protection a hot button issue. Now, a group of eminent citizens, mostly lawyers, have formulated a draft privacy bill, a legal framework that protects the individual’s right to privacy, but it faces legal jurisdiction issues &lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The blog post by Sujit Bhar was published in &lt;a class="external-link" href="http://www.indialegallive.com/constitutional-law-news/acts-and-bills-news/data-privacy-footprints-on-the-web-50261"&gt;IndiaLegal&lt;/a&gt; on June 21, 2018.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;Lack of data privacy is a modern day peril. Quite like the individual’s right to privacy—one that has been raised to the level of a Fundamental Right by the Supreme Court—data privacy today is prime, because technology has made our lives fully dependant on associated data. Hence, by extension of the same logic and arguments that the top court used for personal privacy, data privacy should be protected.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The methodology to be adopted, though, is not as easy to determine given the lack of legislation in the field, the improbability of existing technology to ensure complete privacy and because of legal jurisdiction issues.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Also, to what extent data privacy can and should be allowed is a legal argument that needs to be supported by other fields of knowledge. The Supreme Court decision to award privacy as a Fundamental Right will act as a plinth in determining this.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;To that end a group of eminent citizens, mostly lawyers, came together and formulated a draft privacy bill with the objective of slicing through banal arguments that would ensue if this was to wait for public re-reference/debate.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The proponents—Apar Gupta, Gautam Bhatia, Kritika Bhardwaj, Maansi Verma, Naman M Aggarwal, Praavita Kashyap, Prasanna S, Raman Jit Singh Chima, Ujwala Uppaluri and Vrinda Bhandari—have tried to develop their own privacy bill, based on the foundation of the Privacy (Protection) Bill, 2013, “which was drafted over a series of roundtables and inputs conducted by the Centre for Internet and Society, Bangalore”.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In doing so the group started from what it calls “seven privacy principles”, derived from various constitutional and expert texts.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Principle 1: Individual rights are at the centre of privacy and data protection.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This says that “the individual and her rights are primary. The law on privacy must empower you by advancing your right to privacy…”including “your right to autonomy and dignity.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Principle 2: A data protection law must be based on privacy principles.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Here reference is made to the report of the Justice AP Shah Committee of Experts. It’s a method that has been left flexible, to accommodate fast developing technology. There is a reference to Moore’s Law in this. Moore’s Law has remained one of the most overwhelmingly true laws of the IT industry. Originating in 1970, it says that processor speeds, or overall processing power for computers “will double every two years”. While that has remained true till now, with the development of multiple core processors, this law too has seemingly run its course. With the world changing at such a fast pace, if the data privacy bill/law does not remain flexible, it would also be quickly consigned to a museum of laws. Hence this flexible approach will be crucial.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Principle 3: A strong privacy commission must be created to enforce the privacy principles.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This is the part of establishing an oversight authority, “a strong body to ensure that the data protection rights are put into practice and enforced”. This structure has been treated for something “that works in principle and in practice.”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;There is one part that says that this proposed “Privacy Commission”, has been “provided wide powers of investigation, adjudication, rule-making and enforcement. The Commission should adopt an approach that builds accountability for the rights of users by having powers to impose penalties that are proportionate to the harm and build deterrence.” This, obviously, means that it will be stepping onto the toes of other laws and that would be a rough road to navigate. However, as the group’s own philosophy says that the problem with technology oriented legislation is that it takes catching up with the progress of technology. To overcome this, the group wants to “make sure that the Privacy Code is not outdated” and hence wants to make sure that the “Privacy Commission can exercise rule making powers to give effect to the data protection principles under the regulation”.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The other part of the philosophy is of acknowledging and addressing public complaints. Hence the legal rigidity of regular acts would be dismissed. How this can work with enforcement agencies, though, will remain a matter of debate. The draft bill says that the “Privacy Commission must serve as the forum for the redressal of the general public’s grievances”, and that “Privacy Commissions should have the ability to investigate (independently through the office of a Director General), hold hearings and pass orders with directions and fines”.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;That could be legal nightmare, because unlike a simple code, the bill has to pass through parliament to become an act, and legislators are the ones who have final say in remodelling an existing law. How much power they would agree to delegate is anybody’s guess.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Of course, the draft also calls for the courts to welcome public opinion. There seems to be a slight hitch in the wording, which says that “…while the Privacy Commission serves as the forum for redressal, the public should retain the remedies of approaching the civil courts (even in instances where harm is suffered by a group of people) and of filing police complaints directly”. That questions even the oversight authority of the commission. There is another objective—a hope, one would say—that the Privacy Commission must have jurisdiction over the government, as it does over the private sector. The Privacy Commission should have overriding power and superintendence over all legal entities in matter of data protection and privacy”. While this sounds good on paper, the issue of national security can override all. At this point, according to a cyber security expert, there is talk within the Indian government on how to deal with the social media messaging app WhatsApp. Technically, as the company points out, messaging through an app is encrypted (military grade encryption, it is said) end-to-end. Hence terrorist groups have zeroed in on this as a common idea exchange platform. There could possibly be restrictive legislation on this. That could strike at the heart of data privacy.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The government’s reaction, though, could become counter-productive. This could be visible in what the Justice Srikrishna-led Committee of Experts possibly could recommend.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Principle 4: The government should respect user privacy. Technically, if this bill, in its current form, has to go through parliament, members of both houses should be willing to accept that it will have no snooping powers, ever. The way the government fought tooth and nail against personal privacy in court—and the Aadhaar verdict is still awaited—this proposal seems unlikely to have an easy passage. The draft says: “It is imperative that the government, its arms, bodies and programmes be compliant with the privacy protection principles through a data protection law.”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;There is a caveat within this, saying: “We support the use of digital technologies for public benefit. However, they should not be privileged over fundamental rights.” The proposal also says: “The government is responsible for the delivery of many essential services to the public of India. These services must not be withheld from an individual, due to such individual not sharing data with the government. Withholding services on the pretext of requirement of collection of data effectively amounts to extortion of consent. Individuals cannot be forced to trade away their data and citizenship at the altar of being permitted to use government services and access legal entitlements on welfare.” This will have to wait its validation or dismissal through the Aadhaar verdict.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Principle 5: A complete privacy code comes with surveillance reform&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This is another tricky issue for any government. It talks about how the Snowden revelations “brought to public knowledge that our personal data is collected in an indiscriminate manner by governments”. The draft calls this collection procedure “dragnet surveillance”, because it “contravenes the principles of necessity, proportionality and purpose limitation”. Necessity and proportionality have been argued in detail during the Aadhaar debate in court and till that verdict is out, it would, possibly, not be right to delve into this, though a recommendation for procedural safeguards might run into the same wall as in the case of encrypted software in social media apps. The draft accepts the possibility of “individual interception and surveillance”, but says “this should be severely limited in substance and practice through procedural safeguards”.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Principle 6: The right to information needs to be strengthened and protected&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This basically refers to the Right to Information Act and seems completely justified, with Information Commissioners being “exempted from interference or control by the Privacy Commissioner”.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Principle 7: International protections and harmonisation to protect the open internet must be incorporated&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Another contentious issue, being fuelled by the loss of face by Facebook in its effort to introduce graded access (with paywalls).&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The group widens its scope in stating that “we need to be guided by the &lt;a href="http://www.indialegallive.com/topic/supreme-court"&gt;Supreme Court’s&lt;/a&gt; Right to Privacy decision and make reference to the European Union’s General Data Protection Regulation”. More interestingly, the group admits that every law will have certain exceptions. It says: “…but without clear wording sometimes exceptions swallow up the rule. We adopted a three part test in our drafting process in which any exceptions to these privacy principles should be: (a) worded clearly; (b) limited in purpose, necessary and proportionate to the aim; and (c) accompanied by sufficient procedural safeguards”.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;On the face of it, the overall draft represents a novel and upright way of thinking, and if some of this is accepted while the government mulls the Justice Srikrishna Committee’s recommendations (expected late this month), it would be a good beginning.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/india-legal-live-june-21-2018-data-privacy'&gt;https://cis-india.org/internet-governance/news/india-legal-live-june-21-2018-data-privacy&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Admin</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2018-06-25T16:48:34Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/is-privacy-obsolete">
    <title>Is Privacy Obsolete?</title>
    <link>https://cis-india.org/internet-governance/news/is-privacy-obsolete</link>
    <description>
        &lt;b&gt;Pranesh Prakash was a panelist at this event organized by TERI in Bangalore on June 22, 2018.&lt;/b&gt;
        &lt;p&gt;&lt;img src="https://cis-india.org/home-images/copy_of_BIC.png/@@images/5fdcc0f8-eef2-4d3d-b33b-800722a235e1.png" alt="BIC" class="image-inline" title="BIC" /&gt;&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/is-privacy-obsolete'&gt;https://cis-india.org/internet-governance/news/is-privacy-obsolete&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Admin</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2018-06-23T05:01:21Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/economic-times-june-10-2018-sunil-abraham-why-npci-and-facebook-need-urgent-regulatory-attention">
    <title>Why NPCI and Facebook need urgent regulatory attention </title>
    <link>https://cis-india.org/internet-governance/blog/economic-times-june-10-2018-sunil-abraham-why-npci-and-facebook-need-urgent-regulatory-attention</link>
    <description>
        &lt;b&gt;The world’s oldest networked infrastructure, money, is increasingly dematerialising and fusing with the world’s latest networked infrastructure, the Internet. &lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article was published in the &lt;a class="external-link" href="https://economictimes.indiatimes.com/industry/banking/finance/banking/why-npci-and-facebook-need-urgent-regulatory-attention/articleshow/64522587.cms"&gt;Economic Times&lt;/a&gt; on June 10, 2018.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;As the network effects compound, disruptive acceleration hurtle us towards financial utopia, or dystopia. Our fate depends on what we get right and what we get wrong with the law, code and architecture, and the market.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Internet, unfortunately, has completely transformed from how it was first architected. From a federated, generative network based on free software and open standards, into a centralised, environment with an increasing dependency on proprietary technologies.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In countries like Myanmar, some citizens misconstrue a single social media website, Facebook, for the internet, according to LirneAsia research. India is another market where Facebook could still get its brand mistaken for access itself by some users coming online. This is Facebook put so many resources into the battle over Basics, in the run-up to India’s network neutrality regulation. an odd corporation.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;On hand, its business model is what some term surveillance capitalism. On the other hand, by acquiring WhatsApp and by keeping end-toend (E2E) encryption “on”, it has ensured that one and a half billion users can concretely exercise their right to privacy. At the time of the acquisition, WhatsApp founders believed Facebook’s promise that it would never compromise on their high standards of privacy and security. But 18 months later, Facebook started harvesting data and diluting E2E.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In April this year, my colleague Ayush Rathi and I wrote in Asia Times that WhatsApp no longer deletes multimedia on download but continues to store it on its servers. Theoretically, using the very same mechanism, Facebook could also be retaining encrypted text messages and comprehensive metadata from WhatsApp users indefinitely without making this obvious.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;My friend, Srikanth Lakshmanan, founder of the CashlessConsumer collective, is a keen observer of this space. He says in India, “we are seeing an increasing push towards a bank-led model, thanks to National Payments Corporation of India (NPCI) and its control over Unified Payments Interface (UPI), which is also known as the cashless layer of the India Stack.”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;NPCI is best understood as a shape shifter. Arundhati Ramanathan puts it best when she says “depending on the time and context, NPCI is a competitor. It is a platform. It is a regulator. It is an industry association. It is a profitable non-profit. It is a rule maker. It is a judge. It is a bystander.”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This results in UPI becoming, what Lakshmanan calls, a NPCI-club-good rather than a new generation digital public good. He also points out that NPCI has an additional challenge of opacity — “it doesn’t provide any metrics on transaction failures, and being a private body, is not subject to proactive or reactive disclosure requirements under the RTI.”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Technically, he says, UPI increases fragility in our financial ecosystem since it “is a centralised data maximisation network where NPCI will always have the superset of data.” Given that NPCI has opted for a bank-led model in India, it is very unlikely that Facebook able to leverage its monopoly the social media market duopoly it shares with in the digital advertising market to become a digital payments monopoly.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;However, NCPI and Facebook both share the following traits — one, an insatiable appetite for personal information; two, a fetish for hypercentralisation; three, a marginal commitment to transparency, and four, poor track record as a custodian of consumer trust. The marriage between these like-minded entities has already had a dubious beginning.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Previously, every financial technology wanting direct access to the NPCI infrastructure had to have a tie-up with a bank. But for Facebook and Google, as they are large players, it was decided to introduce a multi-bank model. This was definitely the right thing to do from a competition perspective. But, unfortunately, the marriage between the banks and the internet giant was arranged by NPCI in an opaque process and WhatsApp was exempted from the full NPCI certification process for its beta launch.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Both NPCI and Facebook need urgent regulatory attention. A modern data protection law and a more proactive competition regulator is required for Facebook. The NPCI will hopefully also be subjected to the upcoming data protection law. But it also requires a range of design, policy and governance fixes to ensure greater privacy and security via data minimisation and decentralisation; greater accountability and transparency to the public; separation of powers for better governance and open access policies to prevent anti-competitive behaviour.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/economic-times-june-10-2018-sunil-abraham-why-npci-and-facebook-need-urgent-regulatory-attention'&gt;https://cis-india.org/internet-governance/blog/economic-times-june-10-2018-sunil-abraham-why-npci-and-facebook-need-urgent-regulatory-attention&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>sunil</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Social Media</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2018-06-12T02:07:42Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/bloomberg-quint-june-9-2018-draft-bill-seeks-to-revolutionise-data-collection-storage-in-india">
    <title>Citizens’ Draft Privacy Bill Seeks To Revolutionise Data Collection, Storage In India</title>
    <link>https://cis-india.org/internet-governance/news/bloomberg-quint-june-9-2018-draft-bill-seeks-to-revolutionise-data-collection-storage-in-india</link>
    <description>
        &lt;b&gt;A draft privacy bill proposes sweeping reforms to the way personal data is collected, processed and stored in India.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The blog post by Arpan Chaturvedi was published in &lt;a class="external-link" href="https://www.bloombergquint.com/law-and-policy/2018/06/08/draft-bill-seeks-to-revolutionise-data-collection-storage-in-india"&gt;Bloomberg Quint&lt;/a&gt; on June 9, 2018. CIS research was quoted.&lt;/p&gt;
&lt;hr style="text-align: justify; " /&gt;
&lt;p style="text-align: justify; "&gt;Titled Indian Privacy Code, 2018, the draft proposes that “all data collected, processed and stored by data controllers and data processors prior to the date on which this Act comes into force shall be destroyed within a period of two years from the date on which this Act comes into force”.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The draft has been put together by a group of lawyers and policy analysts and uploaded on the website of ‘Save our Privacy’ — a public initiative to put forth a model law on data protection. The initiative is backed by the India Privacy Foundation.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;No person, including a data controller and data processor, shall collect any personal data without obtaining the consent of the data subject to whom it pertains, the draft bill says. Collection of personal data without consent can happen only when:&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;It’s necessary for the provision of an emergency medical service.&lt;/li&gt;
&lt;li&gt;It’s necessary to prevent, investigate or prosecute a cognizable offence.&lt;/li&gt;
&lt;li&gt;It’s exempted by a privacy commission that the draft seeks to institute.&lt;/li&gt;
&lt;/ul&gt;
&lt;p style="text-align: justify; "&gt;Also, the draft bill proposes that no person shall store any personal data for a period longer than is necessary to achieve the purpose for which it was collected or received. The same applies to the processing of personal data.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The draft bill has been submitted to the Justice Sri Krishna Committee — which will deliberate on a data-protection framework for the country. The committee’s first draft is likely to be submitted this month.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The bill prescribes punishment for offenses related to interception of communication, surveillance, abetment, repeat offenders and offenses by companies.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The bill, according to information on the website, is based on seven principles, foremost of which is the importance of individual rights. The others are:&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;A data protection law must be based on privacy principles and guidelines discussed in the report of Justice AP Shah Committee of Experts; the Supreme Court judgement on Right to Privacy and European Union’s General Data Protection Regulation.&lt;/li&gt;
&lt;li&gt;A strong privacy commission must be created to enforce privacy principles. The commission should be granted wide powers of investigation, adjudication, rule-making and enforcement. The privacy commission must have jurisdiction over the government as well as private bodies.&lt;/li&gt;
&lt;li&gt;The government must respect user privacy. The government cannot deny essential services to citizens if they choose not to share data with it. The draft says government withholding services on pretext of collection of information effectively amounts to “extortion of consent”.&lt;/li&gt;
&lt;li&gt;A complete privacy code must come with surveillance reform. Even when individual interception and surveillance is carried out, this should be severely limited in substance and practice through procedural safeguards.&lt;/li&gt;
&lt;li&gt;Strengthen the Right To Information Act and exempt information commissioners from interference or control by the privacy commissioner&lt;/li&gt;
&lt;li&gt;International protection and harmonisation is a must to protect the open internet. The group suggests the law must have extraterritorial effect and apply to web services and platforms which are accessible in India and gather personal data of Indians.&lt;/li&gt;
&lt;/ul&gt;
&lt;p style="text-align: justify; "&gt;The bill takes inspiration from the Privacy (Protection) Bill, 2013 which was drafted over a series of roundtable discussions and inputs conducted by the Centre for Internet and Society, Bengaluru.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The individuals who were involved in the drafting of the model law are Raman Jit Singh Cheema, Apar Gupta, Gautam Bhatia, Kritika Bhardwaj, Maansi Verma, Naman N Aggarwal, Praavita Kashyap, Prasanna S, Ujjwala Uppaluri, Vrinda Bhandari.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/bloomberg-quint-june-9-2018-draft-bill-seeks-to-revolutionise-data-collection-storage-in-india'&gt;https://cis-india.org/internet-governance/news/bloomberg-quint-june-9-2018-draft-bill-seeks-to-revolutionise-data-collection-storage-in-india&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Admin</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2018-06-11T02:47:46Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/comments-on-the-draft-national-policy-on-official-statistics">
    <title>Comments on the Draft National Policy on Official Statistics</title>
    <link>https://cis-india.org/internet-governance/blog/comments-on-the-draft-national-policy-on-official-statistics</link>
    <description>
        &lt;b&gt;This submission presents comments by the Centre for Internet &amp; Society, India (“CIS”) on the Draft National Policy on Official Statistics which was released to the public by the Ministry of Statistics and Programme Implementation on 17th May 2018 for comments and views.&lt;/b&gt;
        &lt;p&gt;Edited by Swaraj Barooah. Download a PDF of the submission &lt;a class="external-link" href="http://cis-india.org/internet-governance/files/comments-on-draft-national-policy-on-official-statistics"&gt;here&lt;/a&gt;&lt;/p&gt;
&lt;hr /&gt;
&lt;h3&gt;Preliminary&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;CIS appreciates the Government’s efforts in realising the importance of the need for high quality statistical information enshrined in the Fundamental Principles of Official Statistics as adopted by the UN General Assembly in January 2014. CIS is grateful for the opportunity to put forth its views on the draft policy. This submission was made on 31st May, 2018.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;First, this submission highlights some general defects in the draft policy: there is lack of principles guiding data dissemination policies; there are virtually no positive mandates set for Government bodies for secure storage and transmission of data; and while privacy is mentioned as a concern, it has been overlooked in designing the principles of the implementation of surveys. Then, this submission puts forward specific comments suggesting improvements to various sections in the draft policy.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;CIS would also like to point out the short timeline between the publication of the &lt;a class="external-link" href="http://mospi.gov.in/announcements/suggestions-invited-draft-national-policy-official-statistics"&gt;draft policy&lt;/a&gt; (18th May, 2018), and the deadline set for the stakeholders to submit their comments (31st May, 2018). Considering that the policy has widespread implications for all Ministries, citizens, and State legislation rights (proposed changes include a Constitutional Amendment), it is necessary that such call-for-comments are publicised widely, and enough time is given to the public so that the Government can receive well-researched comments.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;General Comments&lt;/h2&gt;
&lt;h3&gt;Data dissemination&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;For data dissemination, the draft policy does not stress upon a general principle or set of principles, and often disregards principles specified in the Fundamental Principles of Official Statistics, which are the very principles the Government intends to draw its policies on official statistics from. Rather it relies on context-specific provisions that fail to summarise and articulate a general philosophy for the dissemination of official statistics, and fails to practically embody some stated goals. The first principle on Official Statistics, as realised by the United Nations General Assembly, clearly states that: “[...] official  statistics  that  meet  the  test  of  practical utility  are  to  be  compiled  and  made  available  on  an  impartial  basis  by  official statistical agencies to honour citizens’ entitlement to &lt;a class="external-link" href="https://unstats.un.org/unsd/dnss/gp/FP-New-E.pdf"&gt;public information&lt;/a&gt;.”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Let us compare this with Section 5.1.7 (9) of the draft policy, which refers to policies regarding core statistics: it mentions a data “warehouse” to be maintained by the NSO which should be accessible to private and public bodies. While this does point towards an open data policy, such a vision has not been articulated in any part thereof.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The draft policy, at the outset, should have general guiding principles of publishing data openly and freely (once it meets the utility test, and it has been ensured that individual privacy will not be violated by the publishing of such statistics). This should serve well to inform further regulations and related policies governing the use and publishing of statistics, like the &lt;a class="external-link" href="https://cis-india.org/internet-governance/comments-on-the-statistical-disclosure-control-report"&gt;Statistical Disclosure Control Report&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;A general commitment to a well-articulated policy on data dissemination will ensure easy-to-follow principles for the various Ministries that will refer to the document. The additional principles that come with open data principles should also be described by the policy document: a commitment to publishing data in a machine-readable format, making it available in multiple data formats (.txt, .csv, etc.), and including its metadata.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Data storage and usage&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;In the absence of a regime for data protection, it is absolutely necessary that a national policy on statistics provide positive mandates for the encryption of all digitally-stored personal and sensitive information collected through surveys. Even though the current draft of the policy mentions the need to protect confidential information, it sets no mandatory requirements on the Government to ensure the security of such information, especially on digital platforms.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Additionally, all transmission of potentially sensitive information should be done with the digital signatures of the employee/Department/Ministry authorising said transmission. This will ensure the integrity and authenticity of the information, and provide with an auditable trail of the information flowing between entities in the various bodies.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Data privacy&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;It is appreciable that Section 5.7.9 of the draft policy notes, “[a]ll statistical surveys represent a degree of privacy invasion, which is justified by the need for an alternative public good, namely information.” However, all statistical surveys may not be proportionate in their invasiveness, even if they might serve a legitimate public goal in the future.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The draft policy does not address how privacy concerns can be taken into account while designing the survey itself. A necessary outcome of the realisation of the possible privacy violations that may arise due to surveys is that all data collection be “minimally intrusive”, the data be securely stored (see previous comment section, ‘Data storage and usage’), and the surveyed users have control over the data even after they have parted with their information.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Since the policy deals extensively with the implementation of surveys, the following should details should be clearly laid out in the policy:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;The extent to which an individual has control over the data they have provided to the surveying agency.&lt;/li&gt;
&lt;li&gt;The means of redressal available to an individual who feels that his/her privacy has been violated through the publication of certain statistical information.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 style="text-align: justify; "&gt;Specific Comments&lt;/h2&gt;
&lt;p&gt;Section 5.1: Dichotomising official statistics as core statistics and other official statistics&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Comments&lt;/strong&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The reasons for dichotomising official statistics has not been appropriately substantiated with evidence, considering the wide implications of policy proposals that arise from the definition of “core statistics.”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Firstly, the descriptions of what constitutes “core statistics” casts too wide a net by only having a single vague qualitative criterion, i.e. “national importance.” All the other characteristics of the “core statistics” are either recommendations or requirements as to how the data will be handled and thus, pose no filter to what can constitute “core statistics.” The wide net is apparent in the fact that even the initially-proposed list of “core statistics”, given in Annex-II of the policy, has 120 categories of statistics.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Secondly, the policy does not provide reasons for why the characteristics of “core statistics”, highlighted in Section 5.1.5, should not apply to all official statistics at the various levels of Government. Therefore, the utility of the proposed dichotomy has also not been appropriately substantiated with illustrative examples of how “core statistics” should be considered qualitatively different from all official statistics.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This definition may lead to widespread disagreement between the States and the Centre, because Section 5.2 proposes that “core statistics” be added to the Union List of the Seventh Schedule of the Constitution. How the proposal may affect Centre-State responsibilities and relations pertaining to the collection and dissemination of statistics is elaborated in the next section.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;strong&gt;Recommendations&lt;/strong&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The policy should not make a forced dichotomy between “core” and (&lt;i&gt;ipso facto&lt;/i&gt;) non-core statistics. If a distinction is to be made for any reason(s) (such as for the purposes of delineating administrative roles) then such reason must be clearly defined, along with a clear explanation for why such a dichotomy would alleviate the described problem. The definitions should have tangible and unambiguous qualitative criteria.&lt;/p&gt;
&lt;p&gt;Section 5.2: Constitutional amendment in respect of core statistics&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Comments&lt;/strong&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The main proposal in the section is that the Seventh Schedule of the Constitution be amended to include “core statistics” in the Union List. This would give the Parliament the legislative competence to regulate the collection, storage, publication and sharing of such statistics, and the Central Government the power to enforce such legislation. Annex-II provides a tentative list of what would constitute “core statistics”; as is apparent, this list is wide-ranging and consists over 120 items which span the gamut of administrative responsibilities.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The list includes items such as “Landholdings Number, area, tenancy, land utilisation [...]” (S. No. 21), and “Statistics on land records” (S. No. 111) while most responsibilities of land regulation currently lie with the States. Similarly, items in Annex-II venture into statistics related to petroleum, water, agriculture, electricity, and industry; some of which are in the Concurrent or State List.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Statistics are metadata. There is no reason for why the administration of a particular subject lie with the State, and the regulation of data about such subject should lie with solely with the Central Government. It is important to recognise that adding the vaguely defined “core statistics” to the Union List, while enabling the Central Government to execute and plan such statistical exercises, will also prevent the States from enacting any legislation that regulates the management of statistics regarding its own administrative responsibilities.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The regulation of State Government records in general has been a contentious issue, and its place in our federal structure has been debated several times &lt;a class="external-link" href="https://thewire.in/tech/states-power-enact-data-protection-laws"&gt;in the Parliament&lt;/a&gt;&lt;span&gt;:&lt;/span&gt; the enactment of Public Records Act, 1993; the Right to Information Act, 2005; and the Collection of Statistics Act, 2008 are predicated on an assumption of such competence lying with the Parliament. However, it is equally important to recognise the role States have played in advancing transparency of Government records. For example, State-level Acts analogous to the Right to Information Act existed in Tamil Nadu and Karnataka before the Central Government enactment.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;strong&gt;Recommendations&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;We strongly recommend that “statistics” be included in the Concurrent List, so that States are free to enact progressive legislation which advances transparency and accountability, and is not in derogation of Parliamentary legislation.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Ministry should view this statistical policy document as a venue to set the minimum standards for the collection, handling and publication of statistics regarding its various functions. If the item is added to the Concurrent List, the States, through local legislation, will only have the power to improve on the Central standards since in a case of conflict, State-levels laws will be superseded by Parliamentary ones.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Section 5.3: Mechanism for regulating core statistics including auditing&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;strong&gt;Comments&lt;/strong&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The draft policy in Section 5.3.2 says, “[...] The Committee will be assisted by a Search Committee headed by the Vice-Chairperson of the NITI Aayog, in which a few technical experts could be included as Members.” The non-commital nature of the word ‘could’ in this statement detracts from the importance of having technical experts on this committee, by making their inclusion optional. The policy also does not specify who has the power to include technical experts as Members in the Search Committee. The statement should include either a minimum number of a  specific number or members, and not use the non-committal word “could”&lt;/p&gt;
&lt;p&gt;The National Statistical Development Council, as mentioned in Section 5.3.9, is supposed to “handle Centre-State relations in the areas of official statistics, the Council should be represented by Chief Ministers of six States to be nominated by the Centre” (Section 5.3.10). The draft does not elaborate on the rationale behind including just six States in the Council, nor does it recommend any mechanism on the basis of which the Centre will nominate States to the Council.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Recommendations&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;The policy should recommend a minimum number of technical experts who &lt;i&gt;must&lt;/i&gt; be included in the Search Committee, along with a clear process for how such members are to be appointed.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Additionally, the policy appropriately recognises the great diversity in India and the unique challenges faced by each State. Thus, each State has its unique requirements. Since in Section 5.3.11, the policy recommends that council meet at a low frequency of at least once in a year, all States should be represented in the Council.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Section 5.4: Official Machinery to implement directions on core statistics&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;strong&gt;Comments&lt;/strong&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The functions of Statistics Wing in the MOSPI, laid out in Section 5.4.7, include advisory functions which overlap with functions of National Statistical Commission (NSC) mentioned in Section 5.3.5. Some regulatory functions of Statistics Wing, like “conducting quality checks and auditing of statistical surveys/data sets”, overlap with the regulatory functions of NSC mentioned in Section 5.3.7.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In section 5.3.1, the draft policy explicitly mentions that “what is feasible and desirable is that production of official statistics should continue with the Government, whereas the related regulatory and advisory functions could be kept outside the Government”. But Statistics Wing is a part of the government and it also has regulatory and advisory functions. It will adversely affect the power of NSC as an autonomous body.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;There are inconsistencies in the draft-policy regarding the importance and need of a decentralized statistical system. In section 3 [Objectives], it has been emphasized that the Indian Statistical System shall function within decentralized structure of the system.  But, in section 5.4.15, the draft says that decentralized statistical system poses a variety of problems, and advocates for a unified statistical system. Again, in section 5.15, draft emphasizes the development of sub-national statistical systems. These views are inconsistent and create confusion regarding the nature of statistical system that policy wants to pursue.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;strong&gt;Recommendations&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;The functions of the NSC should be kept in its exclusive domain. Any such overlapping functions should be allocated to one agency taking into consideration the Fundamental Principles on Official Statistics.&lt;/p&gt;
&lt;p&gt;The inconsistencies regarding the decentralisation philosophy of the statistical system should be addressed.&lt;/p&gt;
&lt;p&gt;Section 5.5: Identifying statistical products required through committees&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Comments&lt;/strong&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;While Section 5.5.2 recognises data confidentiality as a goal for statistical coordination, it does not take into account the violation of privacy that might occur due to the sharing of data. For example, a certain individual might agree to share personal information with a particular Ministry, but have apprehensions about it being shared with other Ministries or private parties.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;strong&gt;Recommendations&lt;/strong&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;We recommend that point 4 in Section 5.5.2 be read as, “enabling sharing of data without compromising the privacy of individuals and the confidentiality/security of data.”The value of of the individual privacy stems from both the recent Supreme Court judgment that affirmed privacy as a Fundamental Right, and also Principle 6 of the of the Fundamental Principles of Official Statistics. Realising privacy as a goal in this section will add a realm of individual control that is already articulated in Section 5.7.9.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Annex-VII: Guidelines on Outsourcing statistical activities&lt;/h2&gt;
&lt;h3&gt;Comments&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Section 6 defines “sensitive information” in an all-inclusive manner and does not leave space for further inclusion of any information that may be interpreted as sensitive. For example, biometric data has not been listed as “sensitive information”.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Section 9.1, draft says, “[t]he identity of the Government agency and the Contractor may be made available to informants at the time of collection of data”. It is imperative that informants have the right to verify the identity of the Government agency and the Contractor before parting with their personal information.&lt;/p&gt;
&lt;h3&gt;Recommendations&lt;/h3&gt;
&lt;p&gt;The definition of “sensitive information” should be broad-based with scope for further inclusion of any kind of data that may be deemed “sensitive.”&lt;/p&gt;
&lt;p&gt;Section 9.1 must mandate that the identity of the Government agency and the Contractor be made available to informants at the time of collection of data.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Section 9.6 can be redrafted to state that each informant must be informed of the manner in which the informant could access the data collected from the informant in a statistical project, as also of the measures taken to deny access on that information to others, except in the cases specified by the policy.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Section 10.2 can be improved to state that if information exists in a physical form that makes the removal of the identity of informants impracticable (e.g. on paper), the information should be recorded in another medium and the original records must be destroyed.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/comments-on-the-draft-national-policy-on-official-statistics'&gt;https://cis-india.org/internet-governance/blog/comments-on-the-draft-national-policy-on-official-statistics&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Gurshabad Grover and Sandeep Kumar</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2018-06-07T02:54:18Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/economic-times-june-6-2018-akshatha-m-ec-disables-easy-access-to-electoral-data-across-states">
    <title>EC disables easy access to electoral data across states </title>
    <link>https://cis-india.org/internet-governance/news/economic-times-june-6-2018-akshatha-m-ec-disables-easy-access-to-electoral-data-across-states</link>
    <description>
        &lt;b&gt;The recently-concluded Assembly elections may have set more than just one precedent with implications for the entire nation. &lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article by Akshatha M was published in &lt;a class="external-link" href="https://economictimes.indiatimes.com/news/politics-and-nation/ec-disables-easy-access-to-electoral-data-across-states/articleshow/64474558.cms"&gt;Economic Times&lt;/a&gt; on June 5, 2018. Sunil Abraham was quoted.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;While the poll result led to what many see as the beginning of a national front comprising regional parties, the steps Karnataka’s chief electoral officer took to protect the privacy of its electoral rolls will be emulated across the country. &lt;br /&gt;&lt;br /&gt;The Election Commission of India, in an internal circular issued in January, ordered the chief electoral officers of all states and union territories to publish electoral rolls only in image PDF and CAPTCHA formats. These formats ensure that no individual can access electoral data, except as readonly files. While the image PDF format disables the search option in the rolls, CAPTCHA does not allow visitors to either extract or download the rolls.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;“It has been decided that electoral rolls should be published on (the) website in image PDF only. If presently-available PDF electoral rolls are not image PDF, then the same shall be done immediately,” the EC circular said. &lt;br /&gt;&lt;br /&gt;It all started in the latter part of 2017 when Karnataka’s chief electoral officer published the draft electoral rolls as image PDF with CAPTCHA formats. Earlier, rolls were published in text PDF (minus CAPTCHA) format. Electoral analysts and citizen groups took exception to the new formats on the grounds that that did not allow them to analyse errors.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;CEO Sanjiv Kumar’s contention was that analysts were seeking easy access to data. He defended his move on the grounds that the personal data of voters need to be protected. The tiff eventually reached the EC’s doorstep. It turns out that the chief election commissioner was convinced about Sanjiv Kumar’s intent.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Speaking to ET, CEC Om Prakash Rawat said: “After Cambridge Analytica and Facebook episodes, the EC has decided to protect voters’ data from data harvesting and data manipulation as a precautionary measure. We are also working towards adding special data security features in the electoral rolls.” &lt;br /&gt;&lt;br /&gt;Electoral roll analysts continue to see the EC’s decision as a bid to cover up flaws in the rolls. “Their argument is self-defeating on two counts: One, an individual can still extract the data, though it is a little time-consuming. Two, voter data is sold in broad daylight for 7 paise per record and despite knowing this, the election authorities have not taken any action to prevent the same,” said Bengaluru-based electoral roll analyst PG Bhat.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Data security researchers say the EC decision to have the new formats is no long-term solution.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Sunil Abraham, executive director of the Centre for Internet and Society, said that the image PDF format would not be a longterm solution at a time when the optical character recognition software has become all-powerful. “The EC should first remove EPIC numbers from the public database as it allows people who are into data-mining exercise to combine data. The solution would be to have mandatory registration in order to access data, even for voters,” he said.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;He said the EC should have a system that enables it to track those who access electoral data. “Mass export of data should be permitted only for those who are monitoring electoral rolls at the polling-station level,” he said.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/economic-times-june-6-2018-akshatha-m-ec-disables-easy-access-to-electoral-data-across-states'&gt;https://cis-india.org/internet-governance/news/economic-times-june-6-2018-akshatha-m-ec-disables-easy-access-to-electoral-data-across-states&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Admin</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2018-06-29T01:59:02Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/regulation-of-cross-border-transfers-of-personal-data-in-asia">
    <title>CIS contributes to ABLI Compendium on Regulation of Cross-Border Transfers of Personal Data in Asia</title>
    <link>https://cis-india.org/internet-governance/blog/regulation-of-cross-border-transfers-of-personal-data-in-asia</link>
    <description>
        &lt;b&gt;The Asian Business Law Institute, based in Singapore, published a compendium on “Regulation of cross-border transfer of personal data in Asia”. This was part of an exercise to explore legal convergence around issues such as data protection, enforcement of foreign judgments and the principle of restructuring in Asia.&lt;/b&gt;
        
&lt;p style="text-align: justify;"&gt;The compendium contains 14 detailed reports written by legal practitioners, legal scholars and researchers in their respective jurisdictions, on the regulation of cross-border data transfers in the wider Asian region (Australia, China, Hong Kong SAR, India, Indonesia, Japan, South Korea, Macau SAR, Malaysia, New Zealand, Philippines, Singapore, Thailand, and Vietnam).&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;The compendium is intended to act as a springboard for the next phase of ABLI's project, which will be devoted to the in-depth study of the differences and commonalities between Asian legal systems on these issues and – where feasible – the drafting of recommendations and/or policy options to achieve convergence in this area of law in Asia.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify;"&gt;The chapter titled Jurisdictional Report India was authored by Amber Sinha and Elonnai Hickok. The compendium can be &lt;a class="external-link" href="http://abli.asia/PUBLICATIONS/Data-Privacy-Project"&gt;accessed here&lt;/a&gt;.&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/regulation-of-cross-border-transfers-of-personal-data-in-asia'&gt;https://cis-india.org/internet-governance/blog/regulation-of-cross-border-transfers-of-personal-data-in-asia&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Amber Sinha and Elonnai Hickok</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2018-06-03T15:10:11Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/economic-times-may-29-mugdha-variyar-alexas-recording-leak-in-us-echoes-privacy-issues-here">
    <title>Alexa’s recording leak in US ‘echoes’ privacy issues here </title>
    <link>https://cis-india.org/internet-governance/news/economic-times-may-29-mugdha-variyar-alexas-recording-leak-in-us-echoes-privacy-issues-here</link>
    <description>
        &lt;b&gt;Market analyst Sanjay Mehta (name changed) has been keeping his Amazon Echo smart speaker mostly unplugged since reports surfaced last week of the device’s voice assistant, Alexa, inadvertently recording and sending out conversations of a family in the US. &lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article by Mugdha Variyar was published in the &lt;a class="external-link" href="https://economictimes.indiatimes.com/small-biz/startups/newsbuzz/alexas-recording-leak-in-us-echoes-privacy-issues-here/articleshow/64363491.cms"&gt;Economic Times&lt;/a&gt; on May 29, 2018. Sunil Abraham was quoted.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;Digital rights activist Nikhil Pahwa keeps his Google Home smart speaker occasionally plugged out, citing the propensity of the device’s voice assistant to assume it is being queried even when it is not. In the Portland case involving Echo, Alexa had misinterpreted a family’s conversation to be a request to record and send the conversation to a person in the family’s contacts list.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In India, as internet consumers become comfortable using AI-powered voice assistants to play music, set tasks and seek information, they are also waking up to the fragility of data privacy, especially after the infamous Facebook-Cambridge Analytica episode. Indian laws, though, are yet to catch up with technology such as these, say privacy experts.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Globally too, governments are grappling with framing policy around data and privacy. That said, the European Union’s tough privacy laws on how companies can handle user data, introduced last week, are forcing companies to seek consent from customers globally to use their data.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;According to Singapore-based market research firm Canalys, 108,000 units of Amazon Echo devices were shipped to sales channels in India in the first quarter of this year. As for Google Home, which was launched here in April, 25,000 devices have been shipped so far.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;“It is always the company’s fault when such incidents (Alexa’s recording leak) happen. But if it does happen in India, it will also be the government’s fault since there is a big vacuum when it comes to protecting privacy in the digital age,” said Sunil Abraham, executive director of Centre for Internet and Society.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Abraham said a recording device in homes could open up the possibility of hacking or wiretapping. He, however, added that the Amazon incident would not necessarily create any panic. Amazon did not respond to specific queries about what steps it was taking to ensure such incidents do not occur again.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Google said it provides a Home user control through its activity control feature, ability to delete voice-recording history and control permissions to personal data on Gmail, as well as the option to mute the device.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Abraham cited the principles of data minimisation, that is, bare minimum collection of data, and minimal data retention policies with the user, as the main policy requirements, especially to prevent incidents such as the Alexa leak. “We are hopeful that the Srikrishna Committee will include this in the data privacy law,” he added.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;While there needs to be a strong law, there also needs to be a strong citizen advocacy, where users take a company to court for privacy breach. Alexa users should also be sending queries to Amazon about what steps they are taking for privacy protection.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/economic-times-may-29-mugdha-variyar-alexas-recording-leak-in-us-echoes-privacy-issues-here'&gt;https://cis-india.org/internet-governance/news/economic-times-may-29-mugdha-variyar-alexas-recording-leak-in-us-echoes-privacy-issues-here&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Admin</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2018-05-30T00:49:26Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/design-concerns-in-creating-privacy-notices">
    <title>Design Concerns in Creating Privacy Notices</title>
    <link>https://cis-india.org/internet-governance/blog/design-concerns-in-creating-privacy-notices</link>
    <description>
        &lt;b&gt;The purpose of privacy notices and choice mechanisms is to notify users of the data practices of a system, so they can make informed privacy decisions. &lt;/b&gt;
        
&lt;p&gt;This blog post was edited by Elonnai Hickok.&lt;/p&gt;
&lt;hr /&gt;
&lt;h2&gt;The Role of Design in Enabling Informed Consent&lt;/h2&gt;
&lt;p align="left"&gt;Currently, privacy notices and choice mechanisms, are largely ineffective. Privacy and security researchers have concluded that privacy notices not only fail to help consumers make informed privacy decisions but are mostly ignored by them. [1] They have been reduced to being a mere necessity to ensure legal compliance for companies. The design of privacy systems has an essential role in determining whether the users read the notices and understand them. While it is important to assess the data practices of a company, the communication of privacy policies to users is also a key factor in ensuring that the users are protected from privacy threats. If they do not read or understand the privacy policy, they are not protected by it at all.&lt;/p&gt;
&lt;p align="left"&gt;The visual communication of a privacy notice is determined by the User Interface (UI) and User Experience (UX) design of that online platform. User experience design is broadly about creating the logical flow from one step to the next in any digital system, and user interface design ensures that each screen or page that the user interacts with has a consistent visual language and styling. This compliments the path created by the user experience designer. [2] UI/UX design still follows the basic principles of visual communication where information is made understandable, usable and interesting with the use of elements such as colours, typography, scale, and spacing.&lt;/p&gt;
&lt;p align="left"&gt;In order to facilitate informed consent, the design principles are to be applied to ensure that the privacy policy is presented clearly, and in the most accessible form. A paper by Batya Friedman, Peyina Lin, and Jessica K. Miller, ‘Informed Consent By Design’, presents a model of informed consent for information systems. [3] It mentions the six components of the model; Disclosure, Comprehension, Voluntariness, Competence, Agreement, Minimal Distraction. The design of a notice should achieve these components to enable informed consent. Disclosure and comprehension lead to the user being ‘informed’ while ‘consent’ encompasses voluntariness, competence, and agreement. Finally, The tasks of being informed and giving consentshould happen with minimal distraction, without diverting users from their primary taskor overwhelming them with unnecessary noise.[4]&lt;/p&gt;
&lt;p align="left"&gt;UI/UX design builds upon user behaviour to anticipate their interaction with the platform. It has led to practices where the UI/UX design is directed at influencing the user to respond in a way that is desired by the system. For instance, the design of default options prompts users to allow the system to collect their data when the ‘Allow’ button is checked by default. Such practices where the interface design is used to push users in a particular direction are called “dark patterns”.[5] These are tricks used in websites and apps that make users buy or sign up for things that they did not intend to. [6] Dark patterns are often followed as UI/UX trends without the consequences on users being questioned. This has had implications on the design of privacy systems as well. Privacy notices are currently being designed to be invisible instead of drawing attention towards them.&lt;/p&gt;
&lt;p align="left"&gt;Moreover, most communication designers believe that privacy notices are beyond their scope of expertise. They do not consider themselves accountable for how a notice comes across to the user. Designers also believe that they have limited agency when it comes to designing privacy notices as most of the decisions have been already taken by the company or the service. They can play a major role in communicating privacy concerns at an interface level, but the issues of privacy are much deeper. Designers tend to find ways of informing the user without compromising the user experience, and in the process choose aesthetic decisions over informed consent.&lt;/p&gt;
&lt;p align="left"&gt;&amp;nbsp;&lt;/p&gt;
&lt;h2 style="text-align: justify;"&gt;Issues with Visual Communication of Privacy Notices&lt;/h2&gt;
&lt;p align="left"&gt;The ineffectiveness of privacy notices can be attributed to several broad issues such as the complex language and length, their timing, and location. In 2015, the Center for Plain Language [7] published a privacy-policy analysis report [8] for TIME.com [9], evaluating internet-based companies’ privacy policies to determine how well they followed plain language guidelines. The report concluded that among the most popular companies, Google and Facebook had the more accessible notices, while Apple, Uber, and Twitter were ranked as less accessible. The timing of notices is also crucial in ensuring that it is read by the users. The primary task for the user is to avail the service being offered. The goals of security and privacy are valued but are only secondary in this process. [10] Notices are presented at a time when they are seen as a barrier between the user and the service. People thus, choose to ignore the notices and move on to their primary task. Another concern is disassociated notices or notices which are presented on a separate website or manual. The added effort of going to an external website also gets in the way of the users which leads to them not reading the notice. While most of these issues can be dealt with at the strategic level of designing the notice, there are also specific visual communication design issues that are required to be addressed.&lt;/p&gt;
&lt;h3 style="text-align: justify;"&gt;Invisible Structure and Organisation of Information&lt;/h3&gt;
&lt;p align="left"&gt;Long spells of text with no visible structure or content organisation is the lowest form of privacy notices. These are the blocks of text where the information is flattened with no visual markers such as a section separator, or contrasting colour and typography to distinguish between the types of content. In such notices, the headings and subheadings are also not easy to locate and comprehend. For a user, the large block of text appears to be pointless and irrelevant, and they begin to dismiss or ignore it. Further, the amount of time it would take for the user to read the entire text and comprehend it successfully, is simply impractical, considering the number of websites they visit regularly.&lt;/p&gt;
&lt;p align="center"&gt;&lt;img src="https://cis-india.org/home-images/CollectionandUseofPersonalInformation.jpg" alt="null" class="image-inline" title="Collection and Use of Personal Information" /&gt;&lt;/p&gt;
&lt;p align="center"&gt;&lt;em&gt;The privacy policy notice by Apple [11] with no use of colours or visuals.&lt;/em&gt;&lt;/p&gt;
&lt;p align="center"&gt;&amp;nbsp;&lt;/p&gt;
&lt;p align="center"&gt;&lt;img src="https://cis-india.org/home-images/PrivacyPolicyTwitter.jpg" alt="null" class="image-inline" title="Privacy Policy Twitter" /&gt;&lt;/p&gt;
&lt;p align="center"&gt;&lt;em&gt;The privacy policy notice by Twitter [12] no visual segregator&lt;/em&gt;&lt;/p&gt;
&lt;h3 style="text-align: justify;"&gt;&lt;br /&gt;&lt;/h3&gt;
&lt;h3 style="text-align: justify;"&gt;Visual Contrast Between Front Interface and Privacy Notices&lt;/h3&gt;
&lt;p align="left"&gt;The front facing interface of an app or website is designed to be far more engaging than the privacy notice pages. There is a visible difference in the UI/UX design of the pages, almost as if the privacy notices were not designed at all. In case of Uber’s mobile app, the process of adding a destination, selecting the type of cab and confirming a ride has been made simple to do for any user. This interface has been thought through keeping in mind the users’ behaviour and needs. It allows for quick and efficient use of the service. As opposed to the process of buying into the service, the privacy notice on the app is complex and unclear.&lt;/p&gt;
&lt;p align="center"&gt;&lt;img class="image-inline image-inline" src="UberApp.jpg" alt="Uber App Interface 2" height="397" width="224" /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; &lt;img class="image-inline image-inline" src="UberApp_PrivacyNotice.jpg" alt="Uber App Interface" height="397" width="224" /&gt;&lt;/p&gt;
&lt;p align="center"&gt;&lt;em&gt;Uber mobile app screenshots of the front interface (left) and the policy notice page (right)&lt;/em&gt;&lt;/p&gt;
&lt;h3 style="text-align: justify;"&gt;&lt;br /&gt;&lt;/h3&gt;
&lt;h3 style="text-align: justify;"&gt;Gaining Trust Through the Initial Pitch&lt;/h3&gt;
&lt;p align="left"&gt;A pattern in the privacy notices of most companies is that they attempt to establish credibility and gain confidence by stating that they respect the users’ privacy. This can be seen in the introductory text of the privacy notices of Apple and LinkedIn. The underlying intent seems to be that since the company understands that the users’ privacy is important, the users can rely on them and not read the full notice.&lt;/p&gt;
&lt;p align="center"&gt;&lt;img src="https://cis-india.org/home-images/ApplePrivacyNote.jpg" alt="null" class="image-inline" title="Apple Privacy Note" /&gt;&lt;/p&gt;
&lt;p align="center"&gt;&lt;em&gt;Introduction text to Apple’s privacy policy notice [13]&lt;/em&gt;&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;&amp;nbsp;&lt;/p&gt;
&lt;p align="center"&gt;&lt;img src="https://cis-india.org/home-images/LinkedInPrivacyNote.jpg" alt="null" class="image-inline" title="LinkedIn Privacy Note" /&gt;&lt;/p&gt;
&lt;p align="center"&gt;&lt;em&gt;Introduction text to LinkedIn’s privacy policy notice [14]&lt;/em&gt;&lt;/p&gt;
&lt;h3 style="text-align: justify;"&gt;&lt;br /&gt;&lt;/h3&gt;
&lt;h3 style="text-align: justify;"&gt;Low Navigability&lt;/h3&gt;
&lt;p align="left"&gt;The text heavy notices need clear content pockets which can be navigated through easily using mechanisms such as menu bar. Navigability of a document allows for quick locating of sections, and moving between them. Several companies miss to follow this. Apple and Twitter privacy notices (shown above), have low navigability as the reader has no prior indication of how many sections there are in the notice. The reader could have summarised the content based on the titles of the sections if it were available in a table of contents or a menu. Lack of a navigation system leads to endless scrolling to reach the end of the page.&lt;/p&gt;
&lt;p align="left"&gt;Facebook privacy notice, on the other hand is an example of good navigability. It uses typography and colour to build a clear structure of information that can be navigated through easily using the side menu. The menu doubles up as a table of contents for the reader. The side menu however, does not remain visible while scrolling down the page. This means while the user is reading through a section, they cannot switch to a different section from the menu directly. They will need to click on the ‘Return to top’ button and then select the section from the menu.&lt;/p&gt;
&lt;p align="center"&gt;&lt;img src="https://cis-india.org/home-images/DataPolicy.jpg" alt="null" class="image-inline" title="Data Policy" /&gt;&lt;/p&gt;
&lt;p align="center"&gt;&lt;em&gt;Navigation menu in the Facebook Data Policy page [15]&lt;/em&gt;&lt;/p&gt;
&lt;h3 style="text-align: justify;"&gt;&lt;br /&gt;&lt;/h3&gt;
&lt;h3 style="text-align: justify;"&gt;Lack of Visual Support&lt;/h3&gt;
&lt;p align="left"&gt;Privacy notices can rely heavily on visuals to convey the policies more efficiently. These could be visual summaries or supporting infographics. The data flow on the platform and how it would affect the users can be clearly visualised using infographics. But, most notices fail to adopt them. The Linkedin privacy notice [16] page shows a video at the beginning of its privacy policy. Although this could have been an opportunity to explain the policy in the video, LinkedIn only gives an introduction to the notice and follows it with a pitch to use the platform. The only visual used in notices currently are icons. Facebook uses icons to identify the different sections so that they can be located easily. But, apart from being identifiers of sections, these icons do not contribute to the communication of the policy. It does not make reading of the full policy any easier.&lt;/p&gt;
&lt;h3 style="text-align: justify;"&gt;&lt;br /&gt;&lt;/h3&gt;
&lt;h3 style="text-align: justify;"&gt;Icon Heavy ‘Visual’ Privacy Notices&lt;/h3&gt;
&lt;p align="left"&gt;The complexity of privacy notices has led to the advent of online tools and generators that create short notices or summaries for apps and websites to supplement the full text versions of policies. Most of these short notices use icons as a way of visually depicting the categories of data that is being collected and shared. iubenda [17], an online tool, generates policy notice summary and full text based on the inputs given by the client. It asks for the services offered by the site or app, and the type of data collection. Icons are used alongside the text headings to make the summary seem more ‘visual’ and hence more easily consumable. It makes the summary more inviting to read, but does not reduce the time for reading.&lt;/p&gt;
&lt;p align="left"&gt;Another icon-based policy summary generator was created by KnowPrivacy. [18] They developed a policy coding methodology by creating icon sets for types of data collected, general data practices, and data sharing. The use of icons in these short notices is more meaningful as they show which type of data is collected or not collected, shared or not shared at a glance without any text. This facilitates comparison between data practices of different apps.&lt;/p&gt;
&lt;p align="center"&gt;&lt;img src="https://cis-india.org/home-images/Google.jpg" alt="null" class="image-inline" title="Google" /&gt;&lt;/p&gt;
&lt;p align="center"&gt;&lt;em&gt;Icon based short policy notice created for Google by KnowPrivacy [19]&lt;/em&gt;&lt;/p&gt;
&lt;h2 style="text-align: justify;"&gt;&lt;br /&gt;&lt;/h2&gt;
&lt;h2 style="text-align: justify;"&gt;Initiatives to Counter Issues with the Design of Privacy Notices&lt;/h2&gt;
&lt;p align="left"&gt;Several initiatives have called out the issues with privacy notices and some have even countered them with tools and resources. The TIME.com ranking of internet-based companies’ privacy policies brought attention to the fact that some of the most popular platforms have ineffective policy notices. A user rights initiative called Terms of Services; Didn’t Read [20] rates and labels websites’ terms &amp;amp; privacy policies.&amp;nbsp;There is also the Usable Privacy Policy Project which develops techniques to semi-automatically analyze privacy policies with crowdsourcing, natural language processing, and machine learning. [21] It uses artificial intelligence to sift through the most popular sites on the Internet, including Facebook, Reddit, and Twitter, and annotate their privacy policies. They realise that it is not practical for people to read privacy policies. Thus, their aim is to use technology to extract statements from the notices and match them with things that people care about. However, even AI has not been fully successful in making sense of the dense documents and missed out some important context. [22]&lt;/p&gt;
&lt;p align="left"&gt;One of the more provocative initiatives is the Me and My Shadow ‘Lost in Small Print’ [23] project. It shows the text for the privacy notices of companies like LinkedIn, Facebook, WhatsApp, etc. and then ‘reveals’ the data collection and use information that would closely affect the users.&lt;/p&gt;
&lt;p align="left"&gt;Issues with notices have also been addressed by standardising their format, so people can interpret the information faster. The Platform for Privacy Preferences Project (P3P) [24] was one of the initial efforts in enabling websites to share their privacy practices in a standard format. Similar to KnowPrivacy’s policy coding, there are more design initiatives that are focusing on short privacy notice design. An organisation offering services in Privacy Compliance and Risk Management Solutions called TrustArc, [25] is also in the process of designing an interactive icon-based privacy short notice.&lt;/p&gt;
&lt;p align="center"&gt;&lt;img src="https://cis-india.org/home-images/PrivacySummary.jpg" alt="null" class="image-inline" title="Privacy Summary" /&gt;&lt;/p&gt;
&lt;p align="center"&gt;&lt;em&gt;TrustArc’s proposed design [26] for the short notice for a sample site&lt;/em&gt;&lt;/p&gt;
&lt;p align="left"&gt;Most efforts have been done in simplifying the notices so as to decode the complex terminology. But, there have been very few evaluations and initiatives to improve the design of these notices.&lt;/p&gt;
&lt;h2&gt;&lt;br /&gt;&lt;/h2&gt;
&lt;h2&gt;Recommendations&lt;/h2&gt;
&lt;h3&gt;Multilayered Privacy Notices&lt;/h3&gt;
&lt;p align="left"&gt;One of the existing suggestions on increasing usability of privacy notices are multilayered privacy notices. [27] Multilayered privacy notices comprise a very short notice designed for use on portable digital devices where there is limited space, condensed notice that contains all the key factors in an easy to understand way, and a complete notice with all the legal requirements. [28] Some of the examples above use this in the form of short notices and summaries. The very short notice layer consists of who is collecting the information, primary uses of information, and contact details of the organisation.[29] Condensed notice layer covers scope or who does the notice apply to, personal information collected, uses and sharing, choices, specific legal requirements if any, and contact information. [30] In order to maintain consistency, the sequence of topics in the condensed and the full notice must be same. Words and phrases should also be consistent in both layers. Although an effective way of simplifying information, multi-layered notices must be reconsidered along with the timing of notices. For instance, it could be more suitable to show very short notices at the time of collection or sharing of user data.&lt;/p&gt;
&lt;h3 align="left"&gt;Supporting Infographics&lt;/h3&gt;
&lt;p align="left"&gt;Based on their visual design, the currently available privacy notices can be broadly classified into 4 categories; (i) the text only notices which do not have a clearly visible structure, (ii) the text notices with a contents menu that helps in informing of the structure and in navigating, (iii) the notices with basic use of visual elements such as icons used only to identify sections or headings, (iv) multilayered notices or notices with short summary before giving out the full text. There is still a lack of visual aid in all these formats. The use of visuals in the form of infographics to depict data flows could be more helpful for the users both in short summaries and complete text of policy notices.&lt;/p&gt;
&lt;h3 align="left"&gt;Integrating the Privacy Notices with the Rest of the System&lt;/h3&gt;
&lt;p align="left"&gt;The design of privacy notices usually seems disconnected to the rest of the app or website. The UI/UX design of privacy notices requires as much attention as the consumer-facing interface of a system. The contribution of the designer has to be more than creating a clean layout for the text of the notice. The integration of privacy notices with the rest of the system is also related to the early involvement of the designer in the project. The designer needs to understand the information flows and data practices of a system in order to determine whether privacy notices are needed, who should be notified, and about what. This means that decisions such as selecting the categories to be represented in the short or condensed notice, the datasets within these categories, and the ways of representing them would all be part of the design process. The design interventions cannot be purely visual or UI/UX based. They need to be worked out keeping in mind the information architecture, content design, and research. By integrating the notices, strategic decisions on the timing and layering of content can be made as well, apart from the aesthetic decisions. Just as the aim of the front face of the interface in a system makes it easier for the user to avail the service, the policy notice should also help the user in understanding the consequences, by giving them clear notice of the unexpected collection or uses of their data.&lt;/p&gt;
&lt;h3 align="left"&gt;Practice Based Frameworks on Designing Privacy Notices&lt;/h3&gt;
&lt;p align="left"&gt;There is little guidance available to communication designers for the actual design of privacy notices which is specific to the requirements and characteristics of a system. [31] The UI/UX practice needs to be expanded to include ethical ways of designing privacy notices online. The paper published by Florian Schaub, Rebecca Balebako, Adam L. Durity, and Lorrie Faith Cranor, called, ‘A Design Space for Effective Privacy Notice’ in 2015 offers a comprehensive design frame­work and standardised vocabulary for describing privacy notice options. [32] The objective of the paper is to allow designers to use this framework and vocabulary in creating effective privacy notices. The design space suggested has four key dimensions, ‘timing’, ‘channel’, ‘modality’ and ‘control’. [33] It also provides options for each of these dimensions. For example, ‘timing’ options are ‘at setup’, ‘just in time’, ‘context-dependent’, ‘periodic’, ‘persistent’, and ‘on demand’. The dimensions and options in the design space can be expanded to accommodate new systems and interaction methods.&lt;/p&gt;
&lt;h3 align="left"&gt;Considering the Diversity of Audiences&lt;/h3&gt;
&lt;p align="left"&gt;For the various mobile apps and services, there are multiple user groups who use them. The privacy notices are hence not targeted to one kind of an audience. There are diverse audiences who have different privacy preferences for the same system. [34] The privacy preferences of these diverse groups of users’ must be accommodated. In a typical design process for any system, multiple user personas are identified. The needs and behaviour of each persona is used to determine the design of the interface. Privacy preferences must also be observed as part of these considerations for personas, especially while designing the privacy notices. Different users may need different kinds of notices based on which data practices affect them.[35] Thus, rather than mandating a single mechanism for obtaining informed consent for all users in all situations, designers need to provide users with a range of mechanisms and levels of control. [36]&lt;/p&gt;
&lt;h3 align="left"&gt;Ethical Framework for Design Practitioners&lt;/h3&gt;
&lt;p align="left"&gt;An ethical framework is required for design practitioners that can be followed at the level of both deciding the information flow and the experience design. With the prevalence of ‘dark patterns’, the visual design of notices is used to trick users into accepting it. Design ethics can play a huge role in countering such practices. Will Dayable, co-director at Squareweave, [37] a developer of web and mobile apps, suggests that UI/UX designers should “Design Like They’re (Users are) Drunk”. [38]&amp;nbsp;&amp;nbsp;He asks designers to imagine the user to be in a hurry and still allow them access to all the information necessary for making a decision. He concludes that good privacy UX and UI is about actually trying to communicate with users rather than trying to slip one past them. In principle, an ethical design practice would respect the rights of the users and proactively design to facilitate informed consent.&lt;/p&gt;
&lt;h2 style="text-align: justify;"&gt;&lt;br /&gt;&lt;/h2&gt;
&lt;h2 style="text-align: justify;"&gt;Reconceptualising Privacy Notices&lt;/h2&gt;
&lt;p align="left"&gt;Based on the above recommendations, a guiding sample for multilayered privacy notices has been created. Each system would need its own structure and mechanisms for notices, which are integrated with its data practice, audiences, and medium, but this sample notice provides basic guidelines for creating effective and accessible privacy notices. The aesthetic decisions would also vary based on the interface design of a system.&lt;/p&gt;
&lt;p align="center"&gt;&lt;img src="https://cis-india.org/home-images/SampleEye.jpg" alt="null" class="image-inline" title="Sample Eye" /&gt;&lt;/p&gt;
&lt;p align="center"&gt;&lt;em&gt;Sample Fixed Icon for Privacy Notifications&lt;/em&gt;&lt;/p&gt;
&lt;p align="left"&gt;A fixed icon can appear along with all privacy notifications on the system, so that the users can immediately know that the notification is about a privacy concern. This icon should capture attention instantly and suggest a sense of caution. Besides its use as a call to attention, the icon can also lead to a side panel for privacy implications from all actions that the user takes.&lt;/p&gt;
&lt;p align="center"&gt;&lt;img src="https://cis-india.org/home-images/SampleVeryShortNotice.jpg" alt="null" class="image-inline" title="Sample Very Short Notice" /&gt;&lt;/p&gt;
&lt;p align="center"&gt;&lt;em&gt;Sample Very Short Notice on Desktop and Mobile Platforms&lt;/em&gt;&lt;/p&gt;
&lt;p align="left"&gt;The very short notices can be shown when an action from the user would lead to data collection or sharing. The notice mechanism should be designed to provide notices at different times tailored to a user’s needs in that context. The styling and placement of the ‘Allow’ and ‘Don’t Allow’ buttons should not be biased towards the ‘Allow’ option. The text used in very short and condensed notice layers should be engaging yet honest in its communication.&lt;/p&gt;
&lt;p align="center"&gt;&lt;img src="https://cis-india.org/home-images/DataCollected.jpg" alt="null" class="image-inline" title="Data Collected" /&gt;&lt;/p&gt;
&lt;p align="center"&gt;&lt;em&gt;Sample Summary Notice&lt;/em&gt;&lt;/p&gt;
&lt;p align="left"&gt;The summary or the condensed notice layer should allow the user to gauge at a glance, how the data policy is going to affect them. This can be combined with a menu that lists the topics covered in the full notice. The menu would double up as a navigation mechanism for users. It should be visible to users even as they scroll down to the full notice. The condensed notice can also be supported by an infographic depicting the flow of data in the system.&lt;/p&gt;
&lt;p align="center"&gt;&lt;img src="https://cis-india.org/home-images/DataCollection.jpg" alt="null" class="image-inline" title="Data Collection" /&gt;&lt;/p&gt;
&lt;p align="center"&gt;&lt;em&gt;Sample Navigation Menu&lt;/em&gt;&lt;/p&gt;
&lt;p align="left"&gt;All the images in this section use sample text for the purpose of illustrating the structure and layout&lt;/p&gt;
&lt;p align="left"&gt;The full notice can be made accessible by creating a clear information hierarchy in the text. The menu which is available on the side while scrolling down the text would facilitate navigation and familiarity with the structure of the notice.&lt;/p&gt;
&lt;h2 style="text-align: justify;"&gt;&lt;br /&gt;&lt;/h2&gt;
&lt;h2 style="text-align: justify;"&gt;Conclusion&lt;/h2&gt;
&lt;p align="left"&gt;The presentation of privacy notices directly influences the decisions of users online and ineffective notices make users vulnerable to their data being misused. But currently, there is little conversation about privacy and data protection among designers. Design practice has to become sensitive to privacy and security requirements. Designers need to take the accountability of creating accessible notices which are beneficial to the users, rather than to the companies issuing them. They must prioritise the well-being of users over aesthetics and user experience even. The aesthetics of a platform must be directed at achieving transparency in the privacy notice by making it easily readable.&lt;/p&gt;
&lt;p align="left"&gt;The design community in India has a more urgent task at hand of building a design practice that is informed by privacy. Comparing the privacy notices of Indian and global companies, Indian companies have an even longer way to go in terms of communicating the notices effectively. Most Indian companies such as Swiggy, [39] 99acres, [40] and Paytm [41] have completely textual privacy policy notices with no clear information hierarchy or navigation. Ola Cabs [42]&amp;nbsp; provides an external link to their privacy notice, which opens as a pdf, making it even more inaccessible. Thus, there is a complete lack of design input in the layout of these notices.&lt;/p&gt;
&lt;p align="left"&gt;Designers must engage in conversations with technologists and researchers, and include privacy and other user rights in design education in order to prepare practitioners for creating more valuable digital platforms.&lt;/p&gt;
&lt;hr /&gt;
&lt;ol&gt;
&lt;li&gt;&lt;a href="https://www.ftc.gov/system/files/documents/public_comments/2015/10/00038-97832.pdf"&gt;https://www.ftc.gov/system/files/documents/public_comments/2015/10/00038-97832.pdf&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.fastcodesign.com/3032719/ui-ux-who-does-what-a-designers-guide-to-the-tech-industry"&gt;https://www.fastcodesign.com/3032719/ui-ux-who-does-what-a-designers-guide-to-the-tech-industry&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://vsdesign.org/publications/pdf/Security_and_Usability_ch24.pdf"&gt;https://vsdesign.org/publications/pdf/Security_and_Usability_ch24.pdf&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://vsdesign.org/publications/pdf/Security_and_Usability_ch24.pdf"&gt;https://vsdesign.org/publications/pdf/Security_and_Usability_ch24.pdf&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://fieldguide.gizmodo.com/dark-patterns-how-websites-are-tricking-you-into-givin-1794734134"&gt;https://fieldguide.gizmodo.com/dark-patterns-how-websites-are-tricking-you-into-givin-1794734134&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://darkpatterns.org/"&gt;https://darkpatterns.org/&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://centerforplainlanguage.org/"&gt;https://centerforplainlanguage.org/&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://centerforplainlanguage.org/wp-content/uploads/2016/11/TIME-privacy-policy-analysis-report.pdf"&gt;https://centerforplainlanguage.org/wp-content/uploads/2016/11/TIME-privacy-policy-analysis-report.pdf&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="http://time.com/3986016/google-facebook-twitter-privacy-policies/"&gt;http://time.com/3986016/google-facebook-twitter-privacy-policies/&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.safaribooksonline.com/library/view/security-and-usability/0596008279/ch04.html"&gt;https://www.safaribooksonline.com/library/view/security-and-usability/0596008279/ch04.html&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.apple.com/legal/privacy/en-ww/?cid=wwa-us-kwg-features-com"&gt;https://www.apple.com/legal/privacy/en-ww/?cid=wwa-us-kwg-features-com&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://twitter.com/privacy?lang=en"&gt;https://twitter.com/privacy?lang=en&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.apple.com/legal/privacy/en-ww/?cid=wwa-us-kwg-features-com"&gt;https://www.apple.com/legal/privacy/en-ww/?cid=wwa-us-kwg-features-com&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.linkedin.com/legal/privacy-policy"&gt;https://www.linkedin.com/legal/privacy-policy&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.facebook.com/privacy/explanation"&gt;https://www.facebook.com/privacy/explanation&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.linkedin.com/legal/privacy-policy"&gt;https://www.linkedin.com/legal/privacy-policy&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="http://www.iubenda.com/blog/2013/06/13/privacy%C2%ADpolicy%C2%ADfor%C2%ADandroid%C2%ADapp/"&gt;http://www.iubenda.com/blog/2013/06/13/privacy­policy­for­android­app/&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="http://knowprivacy.org/policies_methodology.html"&gt;http://knowprivacy.org/policies_methodology.html&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="http://knowprivacy.org/profiles/google"&gt;http://knowprivacy.org/profiles/google&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://tosdr.org/"&gt;https://tosdr.org/&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://explore.usableprivacy.org/"&gt;https://explore.usableprivacy.org/&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://motherboard.vice.com/en_us/article/a3yz4p/browser-plugin-to-read-privacy-policy-carnegie-mellon"&gt;https://motherboard.vice.com/en_us/article/a3yz4p/browser-plugin-to-read-privacy-policy-carnegie-mellon&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://myshadow.org/lost-in-small-print"&gt;https://myshadow.org/lost-in-small-print&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.w3.org/P3P/"&gt;https://www.w3.org/P3P/&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="http://www.trustarc.com/blog/2011/02/17/privacy-short-notice-designpart-i-background/"&gt;http://www.trustarc.com/blog/2011/02/17/privacy-short-notice-designpart-i-background/&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="http://www.trustarc.com/blog/?p=1253"&gt;http://www.trustarc.com/blog/?p=1253&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.ftc.gov/system/files/documents/public_comments/2015/10/00038-97832.pdf"&gt;https://www.ftc.gov/system/files/documents/public_comments/2015/10/00038-97832.pdf&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.informationpolicycentre.com/uploads/5/7/1/0/57104281/ten_steps_to_develop_a_multilayered_privacy_notice__white_paper_march_2007_.pdf"&gt;https://www.informationpolicycentre.com/uploads/5/7/1/0/57104281/ten_steps_to_develop_a_multilayered_privacy_notice__white_paper_march_2007_.pdf&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.informationpolicycentre.com/uploads/5/7/1/0/57104281/ten_steps_to_develop_a_multilayered_privacy_notice__white_paper_march_2007_.pdf"&gt;https://www.informationpolicycentre.com/uploads/5/7/1/0/57104281/ten_steps_to_develop_a_multilayered_privacy_notice__white_paper_march_2007_.pdf&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.informationpolicycentre.com/uploads/5/7/1/0/57104281/ten_steps_to_develop_a_multilayered_privacy_notice__white_paper_march_2007_.pdf"&gt;https://www.informationpolicycentre.com/uploads/5/7/1/0/57104281/ten_steps_to_develop_a_multilayered_privacy_notice__white_paper_march_2007_.pdf&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.ftc.gov/system/files/documents/public_comments/2015/10/00038-97832.pdf"&gt;https://www.ftc.gov/system/files/documents/public_comments/2015/10/00038-97832.pdf&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.ftc.gov/system/files/documents/public_comments/2015/10/00038-97832.pdf"&gt;https://www.ftc.gov/system/files/documents/public_comments/2015/10/00038-97832.pdf&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.ftc.gov/system/files/documents/public_comments/2015/10/00038-97832.pdf"&gt;https://www.ftc.gov/system/files/documents/public_comments/2015/10/00038-97832.pdf&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.safaribooksonline.com/library/view/security-and-usability/0596008279/ch04.html"&gt;https://www.safaribooksonline.com/library/view/security-and-usability/0596008279/ch04.html&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.ftc.gov/system/files/documents/public_comments/2015/10/00038-97832.pdf"&gt;https://www.ftc.gov/system/files/documents/public_comments/2015/10/00038-97832.pdf&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://vsdesign.org/publications/pdf/Security_and_Usability_ch24.pdf"&gt;https://vsdesign.org/publications/pdf/Security_and_Usability_ch24.pdf&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.squareweave.com.au/"&gt;https://www.squareweave.com.au/&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://iapp.org/news/a/how-ui-and-ux-can-ko-privacy/"&gt;https://iapp.org/news/a/how-ui-and-ux-can-ko-privacy/&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.swiggy.com/privacy-policy"&gt;https://www.swiggy.com/privacy-policy&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.99acres.com/load/Company/privacy"&gt;https://www.99acres.com/load/Company/privacy&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://pages.paytm.com/privacy.html"&gt;https://pages.paytm.com/privacy.html&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://s3-ap-southeast-1.amazonaws.com/ola-prod-website/privacy_policy.pdf"&gt;https://s3-ap-southeast-1.amazonaws.com/ola-prod-website/privacy_policy.pdf&lt;/a&gt;&lt;/li&gt;&lt;/ol&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/design-concerns-in-creating-privacy-notices'&gt;https://cis-india.org/internet-governance/blog/design-concerns-in-creating-privacy-notices&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>saumyaa</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2018-06-06T13:45:40Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/emerging-technologies-issues-way-forward">
    <title>Emerging Technologies: Issues &amp; Way Forward</title>
    <link>https://cis-india.org/internet-governance/news/emerging-technologies-issues-way-forward</link>
    <description>
        &lt;b&gt;Aayush Rathi and Gurshabad Grover attended a two-day conference on 'Emerging Technologies: Issues &amp; Way Forward' organised by the Technology Policy team at the National Institute of Public Finance and Policy (NIPFP), held on 23rd and 24th May in Bangalore.&lt;/b&gt;
        &lt;p&gt;The themes for discussion included:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Privacy, surveillance and data protection&lt;/li&gt;
&lt;li&gt;Regulation of emerging technologies&lt;/li&gt;
&lt;li&gt;Building sound regulators for technology policy, and&lt;/li&gt;
&lt;li&gt;Fintech regulation&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;a class="external-link" href="http://cis-india.org/internet-governance/nipfp-bangalore-agenda"&gt;Click here&lt;/a&gt; to read the agenda&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/emerging-technologies-issues-way-forward'&gt;https://cis-india.org/internet-governance/news/emerging-technologies-issues-way-forward&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Admin</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2018-05-26T00:39:11Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/privacy-in-the-digital-age-addressing-common-challenges-seizing-opportunities">
    <title>Privacy in the Digital Age: Addressing Common Challenges, Seizing Opportunities</title>
    <link>https://cis-india.org/internet-governance/news/privacy-in-the-digital-age-addressing-common-challenges-seizing-opportunities</link>
    <description>
        &lt;b&gt;DG Justice and Consumers of the European Union is organizing a conference on privacy in the digital age on May 25, 2018 in New Delhi.&lt;/b&gt;
        
&lt;h3 style="text-align: center;"&gt;&lt;img src="https://cis-india.org/home-images/copy_of_India_posterwall_20180517page001.jpg/@@images/bc1bb559-cf77-4518-b4d3-a367e5a2f04f.jpeg" alt="null" class="image-inline" title="India Poster Wall" /&gt;&lt;/h3&gt;
&lt;hr /&gt;
&lt;h3 style="text-align: justify;"&gt;Agenda&lt;/h3&gt;
&lt;p style="text-align: justify;"&gt;Friday 25 May 2018, Reception to follow, The Lalit Hotel, Barakhamba Avenue, Connaught Place, New Delhi, India&lt;/p&gt;
&lt;ul style="text-align: justify;"&gt;
&lt;li&gt;9:00 a.m. Registration and welcome coffee&lt;/li&gt;
&lt;li&gt;9:20 a.m. Welcome: Vera Jourova, EU Commissioner for Justice and Consumers (by video)&lt;/li&gt;
&lt;li&gt;9:30 a.m. Opening remarks: Justice B.N. Srikrishna, chair of the Committee of Experts on a Data Protection Framework for India &lt;br /&gt;Tomasz Kozlowski, Ambassador of the European Union to India&lt;/li&gt;&lt;/ul&gt;
&lt;p style="text-align: justify;"&gt;10:00 a.m. &lt;strong&gt;Panel 1 - Setting the scene: India at the crossroads&lt;/strong&gt;&lt;/p&gt;
&lt;ul style="text-align: justify;"&gt;
&lt;li&gt;Moderator: Sunil Abraham, Executive Director, Centre for Internet and Society, India&lt;br /&gt;Vinayak Godse, Senior Director, Data Protection, Data Security Council of India&amp;nbsp;&lt;br /&gt;Raman Jit Singh Chima, Policy Director, Access Now, India&lt;br /&gt;Amba Kak, Public Policy Advisor, Mozilla, India&lt;/li&gt;
&lt;li&gt;11:00 a.m.: Coffee break&lt;/li&gt;&lt;/ul&gt;
&lt;p style="text-align: justify;"&gt;11:15 a.m. &lt;strong&gt;Panel 2 - Modern data protection laws: towards global convergence&lt;/strong&gt;&lt;/p&gt;
&lt;ul style="text-align: justify;"&gt;
&lt;li&gt;Moderator: Clarisse Girot, Data Privacy Project Lead, Asian Business Law Institute, Singapore&lt;br /&gt;Ralf Sauer, Deputy Head of Unit, International data flows and protection, European Commission, Brussels &lt;br /&gt;Malavika Jayaram, Executive Director, Digital Asia Hub, Hong Kong&lt;br /&gt;Graham Greenleaf, Professor of Law &amp;amp; Information Systems, University of New South Wales, Australia (by video)&lt;/li&gt;&lt;/ul&gt;
&lt;p style="text-align: justify;"&gt;12:15 p.m. &lt;strong&gt;Panel 3 - Privacy and data security: a business opportunity&lt;/strong&gt;&lt;/p&gt;
&lt;ul style="text-align: justify;"&gt;
&lt;li&gt;Moderator: Ralf Sauer, Deputy Head of Unit,&amp;nbsp;International data flows and protection, European Commission, Brussels&lt;br /&gt;Srinivas Poorsarla, Vice President and Head (Global), Privacy and Data Protection, Infosys, India&lt;br /&gt;Ravi Sogi, Head - Product Security and Privacy, Philips&lt;br /&gt;Riccardo Masucci, Global Director of Privacy Policy, Intel, Washington DC&lt;/li&gt;&lt;/ul&gt;
&lt;p style="text-align: justify;"&gt;1:15 p.m.: Reception&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/privacy-in-the-digital-age-addressing-common-challenges-seizing-opportunities'&gt;https://cis-india.org/internet-governance/news/privacy-in-the-digital-age-addressing-common-challenges-seizing-opportunities&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Admin</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2018-05-24T10:45:56Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/indian-intermediary-liability-regime">
    <title>Indian Intermediary Liability Regime: Compliance with the Manila Principles on Intermediary Liability</title>
    <link>https://cis-india.org/internet-governance/blog/indian-intermediary-liability-regime</link>
    <description>
        &lt;b&gt;This report assesses the compliance of the Indian intermediary liability framework with the Manila Principles on Intermediary Liability, and recommends substantive legislative changes to bring the legal framework in line with the Manila Principles. &lt;/b&gt;
        &lt;p&gt;&lt;span style="text-align: justify; "&gt;The report was edited by Elonnai Hickok and Swaraj Barooah.&lt;/span&gt;&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;The report is an examination of Indian laws based upon the background paper to the Manila Principles as the explanatory text on which these recommendations have been based, and not an assessment of the principles themselves. To do this, the report considers the Indian regime in the context of each of the principles defined in the Manila Principles. As such, the explanatory text to the Manila Principles recognizes that diverse national and political scenario may require different intermediary liability legal regimes, however, this paper relies only on the best practices prescribed under the Manila Principles.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The report is divided into the following sections&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Principle I: Intermediaries should be shielded by law from liability for third-party content&lt;/li&gt;
&lt;li&gt;Principle II: Content must not be required to be restricted without an order by a judicial authority&lt;/li&gt;
&lt;li&gt;Principle III: Requests for restrictions of content must be clear, be unambiguous, and follow due process&lt;/li&gt;
&lt;li&gt;Principle IV: Laws and content restriction orders and practices must comply with the tests of necessity and proportionality&lt;/li&gt;
&lt;li&gt;Principle V: Laws and content restriction policies and practices must respect due process&lt;/li&gt;
&lt;li&gt;Principle VI: Transparency and accountability must be built into laws and content restriction policies and practices&lt;/li&gt;
&lt;li&gt;Conclusion&lt;/li&gt;
&lt;/ul&gt;
&lt;p style="text-align: justify; "&gt;&lt;a class="external-link" href="http://cis-india.org/internet-governance/files/indian-intermediary-liability-regime"&gt;Download the Full report here&lt;/a&gt;&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/indian-intermediary-liability-regime'&gt;https://cis-india.org/internet-governance/blog/indian-intermediary-liability-regime&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>divij</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Intermediary Liability</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2018-05-20T15:14:21Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/epw-amber-sinha-may-18-2018-for-indias-data-protection-regime-to-be-efficient-policymakers-should-treat-privacy-as-a-social-good">
    <title>India's Data Protection Framework Will Need to Treat Privacy as a Social and Not Just an Individual Good</title>
    <link>https://cis-india.org/internet-governance/blog/epw-amber-sinha-may-18-2018-for-indias-data-protection-regime-to-be-efficient-policymakers-should-treat-privacy-as-a-social-good</link>
    <description>
        &lt;b&gt;The idea that technological innovations may compete with the privacy of individuals assumes that there is social and/or economic good in allowing unrestricted access to data. However, it must be remembered that data is potentially a toxic asset if it is not collected, processed, secured and shared in the appropriate way.&lt;/b&gt;
        &lt;div class="field-label-hidden      field-type-text-with-summary field-name-body field" style="text-align: justify; "&gt;
&lt;div class="field-items"&gt;
&lt;div class="even field-item"&gt;
&lt;p&gt;Published in Economic &amp;amp; Political Weekly, Volume 53, Issue No. 18, 05 May, 2018. Article can be &lt;a class="external-link" href="http://www.epw.in/engage/article/for-indias-data-protection-regime-to-be-efficient-policymakers-should-treat-privacy-as-a-social-good"&gt;accessed online here&lt;/a&gt;.&lt;/p&gt;
&lt;hr /&gt;
&lt;p&gt;In July 2017, the Ministry of Electronics and Information Technology (MeitY) in India set up a committee headed by a former judge, B N Srikrishna, to address the growing clamour for privacy protections at a time when both private collection of data and public projects like Aadhaar are reported to pose major privacy risks (Maheshwari 2017). The Srikrishna Committee is in the process of providing its input, which will go on to inform India’s data-protection law.&lt;/p&gt;
&lt;p&gt;While the committee released a white paper with provisional views, seeking feedback a few months ago, it may be discussing a data protection framework without due consideration of how data practices have evolved.&lt;/p&gt;
&lt;p&gt;In early 2018, a series of stories based on investigative journalism by the &lt;em&gt;Guardian&lt;/em&gt; and the &lt;em&gt;Observer&lt;/em&gt; revealed that the data of 87 million Facebook users was used for the Trump campaign by a political consulting firm, Cambridge Analytica, without their permission. Aleksandr Kogan, a psychology researcher at the University of Cambridge, created an application called “thisisyourdigitallife” and collected data from 270,000 participants through a personality test using Facebook’s application programming interface (API), which allows developers to integrate with various parts of the Facebook platform (Fruchter et al 2018). This data was collected purportedly for academic research purposes only. Kogan’s application also collected profile data from each of the participants’ friends, roughly 87 million people.&lt;/p&gt;
&lt;p&gt;The kinds of practices concerning the sharing and processing of data exhibited in this case are not unique. These are, in fact, common to the data economy in India as well. It can be argued that the Facebook–Cambridge Analytica incident is representative of data practices in the data-driven digital economy. These new practices pose important questions for data protection laws globally, and how these may need to evolve to address data protection, particularly for India, which is in the process of drafting its own data protection law.&lt;/p&gt;
&lt;h2&gt;&lt;strong&gt;Privacy as Control&lt;/strong&gt;&lt;/h2&gt;
&lt;p&gt;Most modern data protection laws focus on individual control. In this context, the definition by the late Alan Westin (2015) characterises privacy as:&lt;/p&gt;
&lt;blockquote style="padding-left: 20px; "&gt;
&lt;p&gt;The claim of individuals, groups, or institutions to determine for themselves when, how, and to what extent information about them is communicated to others.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;The idea of “privacy as control” is what finds articulation in data protection policies across jurisdictions, beginning with the Fair Information Practice Principles (FIPPs) from the United States (US) (Dixon 2006). These FIPPs are the building blocks of modern information privacy law (Schwartz 1999) and not only play a significant role in the development of privacy laws in the US, but also inform data protection laws in most privacy regimes internationally (Rotenberg 2001), including the nine “National Privacy Principles” articulated by the Justice A P Shah Committee in India. Much of this approach is also reflected in the white paper released by the committee, led by Justice Srikrishna, towards the creation of data protection laws in India (Srikrishna 2017).&lt;/p&gt;
&lt;p&gt;This approach essentially involves the following steps (Cate 2006):&lt;/p&gt;
&lt;p&gt;(i) Data controllers are required to tell individuals what data they wish to collect and use, and give them a choice to share the data. &lt;br /&gt; (ii) Upon sharing, the individuals have rights such as being granted access, and data controllers have obligations such as securing the data with appropriate technologies and procedures, and only using it for the purposes identified.&lt;/p&gt;
&lt;p&gt;The objective in this approach is to empower the individual and allow them to weigh their own interests in exercising their consent. The allure of this paradigm is that, in one elegant stroke, it seeks to “ensure that consent is informed and free and thereby also (seeks) to implement an acceptable tradeoff between privacy and competing concerns” (Sloan and Warner 2014). This approach is also easy to enforce for both regulators and businesses. Data collectors and processors only need to ensure that they comply with their privacy policies, and can thus reduce their liability while, theoretically, consumers have the information required to exercise choice. In recent years, however, the emergence of big data, the “Internet of Things,” and algorithmic decision-making has significantly compromised the notice and consent model (Solove 2013).&lt;/p&gt;
&lt;h2&gt;&lt;strong&gt;Limitations of Consent &lt;/strong&gt;&lt;/h2&gt;
&lt;p&gt;Some cognitive problems, such as long and difficult-to-understand privacy notices, have always existed with regard to the issue of informed consent, but lately these problems have become aggravated. Privacy notices often come in the form of long legal documents, much to the detriment of the readers’ ability to understand them. These policies are “long, complicated, full of jargon and change frequently” (Cranor 2012).&lt;/p&gt;
&lt;p&gt;Kent Walker (2001) lists five problems that privacy notices typically suffer from:&lt;/p&gt;
&lt;p&gt;(i) Overkill: Long and repetitive text in small print.&lt;br /&gt; (ii) Irrelevance: Describing situations of little concern to most consumers.&lt;br /&gt; (iii) Opacity: Broad terms that reflect limited truth, and are unhelpful in tracking and controlling the information collected and stored.&lt;br /&gt; (iv) Non-comparability: The simplification required to achieve comparability compromises accuracy.&lt;br /&gt; (v) Inflexibility: Failure to keep pace with new business models.&lt;/p&gt;
&lt;p&gt;Today, data is collected continuously with every use of online services, making it humanly impossible to exercise meaningful consent. &lt;br /&gt; The quantity of data being generated is expanding at an exponential rate. With connected devices, smartphones, appliances transmitting data about our usage, and even smart cities themselves, data now streams constantly from almost every sector and function of daily life, “creating countless new digital puddles, lakes, tributaries and oceans of information” (Bollier 2010).&lt;/p&gt;
&lt;p&gt;The infinitely complex nature of the data ecosystem renders consent of little value even in cases where individuals may be able to read and comprehend privacy notices. As the uses of data are so diverse, and often not limited by a purpose identified at the beginning, individuals cannot conceptualise how their data will be aggregated and possibly used or reused.&lt;/p&gt;
&lt;p&gt;Seemingly innocuous bits of data revealed at different stages could be combined to reveal sensitive information about the individual. While the regulatory framework is designed such that individuals are expected to engage in a cost–benefit analysis of trading their data to avail services, this ecosystem makes such individual analysis impossible.&lt;/p&gt;
&lt;h2&gt;&lt;strong&gt;Conflicts Between Big Data and Individual Control&lt;/strong&gt;&lt;/h2&gt;
&lt;p&gt;The thrust of big data technologies is that the value of data resides not in its primary purposes, but in its numerous secondary purposes, where data is reused many times over (Schoenberger and Cukier 2013).&lt;/p&gt;
&lt;p&gt;On the other hand, the idea of privacy as control draws from the “data minimisation” principle, which requires organisations to limit the collection of personal data to the minimum extent necessary to achieve their legitimate purpose, and to delete data no longer required. Control is exercised and privacy is enhanced by ensuring data minimisation. These two concepts are in direct conflict. Modern data-driven businesses want to retain as much data as possible for secondary uses. Since these secondary uses are, by their nature, unanticipated, their practices run counter to the very principle of purpose limitation (Tene and Polonetsky 2012).&lt;/p&gt;
&lt;p&gt;It is evident from such data-sharing practices, as demonstrated by the Cambridge Analytica–Facebook story, that platform architectures are designed with a clear view to collect as much data as possible. This is amply demonstrated by the provision of a “friends permission” feature by Facebook on its platform to allow individuals to share information not just about themselves, but also about their friends. For the principle of informed consent to be meaningfully implemented, it is necessary for users to have access to information about intended data practices, purposes and usage, so that they consciously share data about themselves.&lt;/p&gt;
&lt;p&gt;In reality, however, privacy policies are more likely to serve as liability disclaimers for companies than any kind of guarantee of privacy for consumers. A case in point is Mark Zuckerberg’s facile claim that there was no “data breach” in the Cambridge Analytica–Facebook incident. Instead of asking each of the 87 million users whether they wanted their data to be collected and shared further, Facebook designed a platform that required consent in any form only from 270,000 users. Not only were users denied the opportunity to give consent, their consent was assumed through a feature which was on by default. This is representative of how privacy trade-offs are conceived by current data-driven business models. Participation in a digital ecosystem is by itself deemed as users’ consent to relinquish control over how their data is collected, who may have access to it, and what purposes it may be used for.&lt;/p&gt;
&lt;p&gt;Yet, Zuckerberg would have us believe that the primary privacy issue of concern is not about how his platform enabled the collection of users’ data without their explicit consent, but about the subsequent unauthorised sharing of the data by Kogan. Zuckerberg’s insistence that collection of data of people without their consent is not a data breach is reminiscent of the UIDAI’s recent claims in India that publication of Aadhaar numbers and related information by several government websites is not a data breach, so long as its central biometric database is secure (Sharma 2018). In such cases also, the intended architecture ensured the seeding of other databases with Aadhaar numbers, thus creating multiple potential points of failure through disclosure. Similarly, the design flaws in direct benefit transfers enabled Airtel to create payments bank accounts without the customers’ knowledge (&lt;em&gt;Hindu Business Line&lt;/em&gt; 2017). Such claims clearly suggest the very limited responsibility data controllers (both public and private) are willing to take for personal data that they collect, while wilfully facilitating and encouraging data practices which may lead to greater risk to data.&lt;/p&gt;
&lt;p&gt;On this note, it is also relevant to point out that the Srikrishna committee white paper begins by identifying informational privacy and data innovation as its two key objectives. It states that “a firm legal framework for data protection is the foundation on which data-driven innovation and entrepreneurship can flourish in India.”&lt;/p&gt;
&lt;p&gt;Conversations around privacy and data have become inevitably linked to the idea of technological innovation as a competing interest. Before engaging in such conversations, it is important to acknowledge that the value of innovation as a competing interest is itself questionable. It is not a competing right, nor a legitimate public interest endeavour, nor a proven social good.&lt;/p&gt;
&lt;p&gt;The idea that, in policymaking, technological innovations may compete with the privacy of individuals assumes that there is social and/or economic good in allowing unrestricted access to data. The social argument is premised on the promise that mathematical models and computational capacity are capable of identifying key insights from data. In turn, these insights may be useful in public and private decision-making. However, it must be remembered that data is potentially a toxic asset if it is not collected, processed, secured and shared in the appropriate way. Sufficient research suggests that indiscriminate data collection is greatly increasing the ratio of noise to signal, and can lead to erroneous insights. Further, the greater the amount of data collected, the greater the attack surface exposed to cybersecurity risks. Incidents such as Facebook–Cambridge Analytica demonstrate the toxicity of data in various ways and underscore the need for data regulation at every stage of the data lifecycle (Schneier 2016). These are important tempering factors that need to be kept in mind while evaluating data innovation as a key mover of policy or regulation.&lt;/p&gt;
&lt;h2&gt;&lt;strong&gt;Privacy as Social Good&lt;/strong&gt;&lt;/h2&gt;
&lt;p&gt;As             long as privacy is framed as arising primarily from             individual control, data controllers will continue to engage             in practices that compromise the ability to exercise choice.             There is a need to view privacy as a social good, and             policymaking should ensure its preservation and enhancement.             Contractual protections and legal sanctions can themselves             do little if platform architectures are designed to do the             exact opposite.&lt;/p&gt;
&lt;p&gt;More importantly, policymaking needs to recognise privacy not merely as an individual right, available for individuals to forego when engaging with data-driven business models, but also as a social good. Recognising something as a social good deems it desirable by definition, and a legitimate goal of law and policy, rather than leaving its achievement completely to market forces.&lt;/p&gt;
&lt;p&gt;The             Puttaswamy judgment (K Puttaswamy v Union of India             2017) lends sufficient weight to privacy’s social value by             identifying it as fundamental to any individual development             through its dependence on solitude, anonymity, and temporary             releases from social duties.&lt;/p&gt;
&lt;p&gt;Sociological scholarship demonstrates that different types of social relationships, be it Gesellschaft (interest groups and acquaintances) or Gemeinschaft (friendship, love, and marriage), and the nature of these relationships, depend on the ability to conceal certain things (Simmel 1906). Demonstrating this in the context of friendships, it has been stated that such relationships “present a very peculiar synthesis in regard to the question of discretion, of reciprocal revelation and concealment.” Friendships, much like most other social relationships, are very much dependent on our ability to selectively present ourselves to others. Contrast this with Zuckerberg’s stated aim of making the world more “open,” where information about people flows freely and effectively without any individual control. Contrast this also with government projects such as Aadhaar, which intends to act as one universal identity that can provide a 360-degree view of citizens.&lt;/p&gt;
&lt;p&gt;Other scholars such as Julie Cohen (2012) and Anita Allen (2011) have demonstrated that data that a person produces or has control over concerns both herself and others. Individuals can be exposed not only because of their own actions and choices, but can also be made vulnerable merely because others have been careless with their data. This point is amply demonstrated in the Facebook–Cambridge Analytica incident. What this means is that the protection of privacy requires not just individual action, but in a sense, group co-ordination. It is my argument that this group interest in privacy as a social good must be the basis of policymaking and regulation of data in the future, in addition to the idea of privacy as an individual right. In the absence of attention to the social good aspect of privacy, individual consumers are left to their own devices to negotiate their privacy trade-offs with large companies and governments, and are significantly compromised.&lt;/p&gt;
&lt;p&gt;What this translates into is that regulatory and data protection frameworks should not be value-neutral in their conception of privacy as a facet of individual control. The complete reliance of data regulation on the data subject to make an informed choice is, in my opinion, an idea that has run its course. If privacy is viewed as a social good, then the data protection framework, including the laws and the architecture, must be designed with a view to protecting it, rather than leaving it entirely to market forces.&lt;/p&gt;
&lt;h2&gt;&lt;strong&gt;The Way Forward&lt;/strong&gt;&lt;/h2&gt;
&lt;p&gt;Data             protection laws need to be re-evaluated, and policymakers             must recognise Lawrence Lessig’s dictum that “code is law.”             Like laws, architecture and norms can play a fundamental             role in regulation. Regulatory intervention for technology             need not mean regulation of technology only, but also how             technology itself may be leveraged for regulation (Lessig             2006; Reidenberg 1998). It is key that the latter is not             left only in the hands of private players. &lt;br /&gt; Zuckerberg, in his testimony (&lt;em&gt;Washington Post&lt;/em&gt; 2018) before             the United States Senate's Commerce and Judiciary             committees, asserted that "AI tools" are central to any             strategy for addressing hate speech, fake news, and             manipulations that use data ecosystems for targeting.&lt;/p&gt;
&lt;p&gt;What             is most concerning in his testimony is the complete lack of             mention of standards, public scrutiny and peer-review             processes, which “AI tools” and regulatory technologies need             to be subject to. Further, it cannot be expected that             data-driven businesses will view privacy as a social good or             be publicly accountable.&lt;/p&gt;
&lt;p&gt;As             policymakers in India gear up for writing the country’s data             protection law, they must acknowledge that their             responsibility extends to creating norms and principles that             will inform future data-driven platforms and regulatory             technologies.&lt;/p&gt;
&lt;p&gt;Since             issues of privacy and data protection will have to be             increasingly addressed at the level of how architectures             enable data collection, and more importantly how data is             used after collection, policymakers must recognise that             being neutral about these practices is no longer enough.             They must take normative positions on data collection,             processing and sharing practices. These positions cannot be             implemented through laws only, but need to be translated             into technological solutions and norms.  Unless a             multipronged approach comprising laws, architecture and             norms is adopted, India’s new data protection regime may end             up with limited efficacy.&lt;/p&gt;
&lt;/div&gt;
&lt;/div&gt;
&lt;/div&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/epw-amber-sinha-may-18-2018-for-indias-data-protection-regime-to-be-efficient-policymakers-should-treat-privacy-as-a-social-good'&gt;https://cis-india.org/internet-governance/blog/epw-amber-sinha-may-18-2018-for-indias-data-protection-regime-to-be-efficient-policymakers-should-treat-privacy-as-a-social-good&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2018-05-18T06:22:57Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/hack-read-waqas-may-15-2018-indian-cricket-board-exposes-personal-data-of-thousands-of-players">
    <title>Indian Cricket Board Exposes Personal Data of Thousands of Players</title>
    <link>https://cis-india.org/internet-governance/news/hack-read-waqas-may-15-2018-indian-cricket-board-exposes-personal-data-of-thousands-of-players</link>
    <description>
        &lt;b&gt;The IT security researchers at Kromtech Security Center discovered a trove of personal and sensitive data belonging to around 15,000 to 20,000 Indian applicants participating in cricket seasons 2015-2018.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The blog post was published on &lt;a class="external-link" href="https://www.hackread.com/indian-cricket-board-exposes-data-of-cricketers/"&gt;Hack Read&lt;/a&gt; on May 15, 2018.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;The authority responsible for protecting this data was The Board of Control for Cricket in India (BCCI) but it was left exposed to the public in two misconfigured AWS (Amazon Web Service) S3 cloud storage buckets.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="https://mackeepersecurity.com/post/bcci-exposed-players-personal-sensitive-data" rel="noopener" target="_blank"&gt;According to the analysis&lt;/a&gt; from Kromtech researchers, the data was divided into different categories of players including those under 19 years old. The data was accessible to anyone with an Internet connection and basic knowledge of using AWS cloud storage.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The data was discovered earlier this month and included names, date of birth, place of birth, permanent addresses, email IDs, proficiency details, medical records, birth certificate number, passport number, SSC certificate number, PAN card number, mobile number, landline and phone number of the person who can be contacted in case of emergency.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;img alt="Indian Cricket Board Exposes Personal Data of Thousands of Players" src="https://www.hackread.com/wp-content/uploads/2018/05/indian-cricket-board-exposes-personal-data-of-thousands-of-players-1.png?x62286" /&gt;&lt;/p&gt;
&lt;p&gt;Screenshot of one of the files that were exposed (Image credit: Kromtech)&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;At the time of publishing this article, the BCCI was informed by Kromtech researchers and both misconfigured buckets were secured. However, this is not the first time when such sensitive information was leaked online. In 2017, Bangalore-based Centre for Internet and Society (CIS) &lt;a href="https://www.hackread.com/indian-biometric-system-data-leaked/" rel="noopener" target="_blank"&gt;found that&lt;/a&gt; names, addresses, date of birth, PAN card details, Aadhaar card numbers and other relevant details of millions of Indian citizen could be found with just a simple Google search.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;On the other hand, lately, AWS buckets have been &lt;a href="https://www.hackread.com/localblox-exposes-millions-of-facebook-linkedin-data/" rel="noopener" target="_blank"&gt;making headlines for the wrong reasons&lt;/a&gt;. Until now, there have been tons of cases in which misconfigured AWS buckets have been found carrying highly sensitive and confidential data &lt;a href="https://www.hackread.com/unprotected-s3-cloud-bucket-exposed-100gb-of-classified-nsa-data/" rel="noopener" target="_blank"&gt;such as classified NSA documents&lt;/a&gt; or details about &lt;a href="https://www.hackread.com/misconfigured-amazon-s3-buckets-exposed-us-militarys-social-media-spying-campaign/" rel="noopener" target="_blank"&gt;US Military’s social media spying campaign&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In two such cases, malicious hackers were able to compromise AWS buckets belonging to &lt;a href="https://www.hackread.com/hackers-compromise-tesla-cloud-server-to-mine-cryptocurrency/" rel="noopener" target="_blank"&gt;Tesla Motors&lt;/a&gt; and &lt;a href="https://www.hackread.com/la-times-website-hacked-mine-monero-cryptocurrency/" rel="noopener" target="_blank"&gt;LA Times&lt;/a&gt; to secretly mine cryptocurrency. Therefore, if you are an AWS user make sure your cloud server is properly secured.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/hack-read-waqas-may-15-2018-indian-cricket-board-exposes-personal-data-of-thousands-of-players'&gt;https://cis-india.org/internet-governance/news/hack-read-waqas-may-15-2018-indian-cricket-board-exposes-personal-data-of-thousands-of-players&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Admin</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Aadhaar</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2018-05-18T05:01:50Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/the-wire-karan-saini-may-11-2018-aadhaar-remains-an-unending-security-nightmare-for-a-billion-indians">
    <title>Aadhaar Remains an Unending Security Nightmare for a Billion Indians</title>
    <link>https://cis-india.org/internet-governance/news/the-wire-karan-saini-may-11-2018-aadhaar-remains-an-unending-security-nightmare-for-a-billion-indians</link>
    <description>
        &lt;b&gt;Yesterday was the 38th and last day of hearings in the Supreme Court case challenging the constitutional validity of India’s biometric authentication programme. After weeks of arguments from both sides, the Supreme Court has now reserved the matter for judgement.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article by Karan Saini was published in the &lt;a class="external-link" href="https://thewire.in/government/aadhaar-remains-an-unending-security-nightmare-for-a-billion-indians"&gt;Wire&lt;/a&gt; on May 11, 2018.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;Since its inception, the Aadhaar project has lurched from controversy to scandal. In the last two years, the debate has heavily centred around issues of data security, privacy and government overreach. This debate, unfortunately, like with most things Aadhaar, has been obfuscated in no small part due to the manner in which the Unique Identification Authority of India (UIDAI) reacts to critical public discussion.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;As India waits for the apex court’s judgement, this is as good time as any to take stock of the security and privacy flaws underpinning the Aadhaar ecosystem.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Poor security standards&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Let’s start with the lackadaisical attitude towards information security. As has become evident in the &lt;a href="https://cis-india.org/internet-governance/information-security-practices-of-aadhaar-or-lack-thereof/view" target="_blank"&gt;past&lt;/a&gt;, harvesting and collecting Aadhaar numbers – or acquiring scans and prints of valid Aadhaar cards – has become a trivial matter.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;There are several government websites which implement Aadhaar authentication while at the same time lack in basic security practices such as the use of SSL to encrypt user traffic and/or the use of captchas to protect against brute-force or scraping attacks. This includes the biometric attendance website of the &lt;a href="http://dgftbct.attendance.gov.in/register/myemp" rel="noopener" target="_blank"&gt;Director General of Foreign Trade&lt;/a&gt;, the website for the &lt;a href="http://nfsm.gov.in/dbt/aadhaarverification.aspx" rel="noopener" target="_blank"&gt;National Food Security Mission&lt;/a&gt; and the &lt;a href="http://medleaprhry.gov.in/PvtAddRecord.aspx" rel="noopener" target="_blank"&gt;Medleapr website&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;With numerous government websites being susceptible, problematic issues such as the use of open directories to store sensitive data gives us a look into how even the bare minimum – when it comes to adhering to security best practices – isn’t enforced across the gamut of websites which interface with Aadhaar.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;It should not be acceptable practice to have government websites with open web directories containing PDF scans of dozens of Aadhaar cards available for just about anyone to view and/or download. Yet, over the past year and even before, many government websites have been found to either inadvertently or knowingly publish this information without much regard for the potential consequences it could have.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The UIDAI has repeatedly shown an attitude of hostility and dismissiveness when it comes to fixing security and privacy issues which are present in the Aadhaar ecosystem. It has also shown no signs of how it plans to tackle this problem.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In my personal experience as a security researcher, I have found and reported a cache of more than 40,000 scanned Aadhaar cards being available through an unsecured database managed by a private company, which relied on those scans for the purposes of verifying and maintaining records of their customers.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;What’s worse is that the media reports regarding Aadhaar information being exposed may only be scratching the surface of the issue as more data may actually be susceptible to access and theft, and simply yet to be found and publicly reported. For example, data could be leaking through publicly available data stores of third-party companies interfacing with Aadhaar, or through inadequately secured API and sensitive portals without proper access controls.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Not all security incidents become a matter of public knowledge, so what we know at any given point about the illegal exposure of Aadhaar information may just be a glimpse of what is actually out there.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;It should be acknowledged that the possession of these 12-digit numbers and their corresponding demographic information can open up room for potential fraud –  or at the very least make it easier for criminals to carry out identity theft and SIM and banking fraud.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;A &lt;a href="https://thewire.in/economy/aadhaar-fraud-uidai" target="_blank"&gt;detailed analysis&lt;/a&gt; of all publicly-reported Aadhaar-related or Aadhaar-enabled fraud over the last few years shows that the problem is not only real but deserves far more attention than what it has received so far.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Threat level infinity&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Taking a step back, it’s clear that the Aadhaar project snowballed into an ecosystem that it now struggles to control.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;For instance, demographic information – as is stated in the draft for the &lt;a href="https://www.uidai.gov.in/images/the_aadhaar_act_2016.pdf" rel="noopener" target="_blank"&gt;Aadhaar Act&lt;/a&gt; (NIDAI Bill 2010) – was originally considered confidential information, meaning no entity could request your demographic information such as name, address, phone number etc. for purposes of eKYC.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;However, as the ecosystem has progressed, the implementation and usage of eKYC have also changed and grown significantly with companies like PayTM utilising eKYC for the purposes of requesting and verifying customer information. It should be considered that data which has been collected by any of these companies through Aadhaar can be accessed by them in the future for an indefinite period of time depending on their own policies regarding storage and retention of the data.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;If there ever is a breach of the CIDR or a mirrored silo containing a significant amount of Aadhaar-related data, it would directly affect more than one billion people. To put this in perspective, it would easily be the single largest breach of data in terms of the sheer number of people affected &lt;i&gt;and&lt;/i&gt; it would have far-reaching consequences for everyone affected which might be very hard to offset.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;On a comparatively smaller scale – although just as serious, if not more in terms of potential implications – would be a breach of any given state’s resident data hub (SRDH) repository. In some cases, SRDHs &lt;a href="https://www.thenewsminute.com/article/13-lakh-aadhaar-numbers-leaked-andhra-govt-website-linked-personal-details-80178" rel="noopener" target="_blank"&gt;have been known to integrate data&lt;/a&gt; acquired from other sources containing information regarding parameters such as caste, banking details, religion, employment status, salaries, and &lt;a href="https://webcache.googleusercontent.com/search?q=cache:-HMXusc-Nm4J:https://mpsrdh.gov.in/aboutUsCitizen.html+&amp;amp;cd=2&amp;amp;hl=en&amp;amp;ct=clnk&amp;amp;gl=in&amp;amp;client=firefox-b-ab" rel="noopener" target="_blank"&gt;then linking the same&lt;/a&gt; to residents’ corresponding Aadhaar data.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Damage control would be costly and painstaking due to the number of people enrolled. What adds to the disastrous consequences is that one cannot just deactivate their Aadhaar or opt-out of the programme the way they would with say a compromised Facebook or Twitter account. You can always deactivate Facebook. You cannot deactivate your Aadhaar. It should be noted that even with biometrics set to ‘disabled’, Aadhaar verification transactions can be verified through OTP.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Additionally, the Aadhaar ecosystem is such that information about individuals can be accessed not just from UIDAI servers but also from other third-party databases where Aadhaar numbers are linked with their own respective datasets. Due to this aspect – multiple points of failure are introduced for possible compromise of data, especially because third-party databases are almost certainly not as secure as the CIDR.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Recently, after taking a closer look at the ecosystem of websites which incorporate the use of Aadhaar based authentication, I &lt;a href="https://www.karansaini.com/extracting-aadhaar-linked-phone-numbers/" rel="noopener" target="_blank"&gt;discovered that it was possible&lt;/a&gt; to extract the phone number linked to any given Aadhaar through the use of websites which poorly implemented Aadhaar text-based (OTP) authentication.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This process worked by first retrieving the last four digits of the phone number linked to an Aadhaar using any website which reveals this information (this includes DigiLocker, NFSM.gov.in and seems to be standard practice which seems to be enforced by UIDAI) and then performing an enumeration attack on the first six digits using websites which allow the user to provide both their Aadhaar number and the verified phone number linked to it.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This again highlights that while secure practices might be followed by the UIDAI, the errors in implementation and other flaws are introduced neverthelessby third parties who interface with Aadhaar, posing a risk to the privacy and security of its data.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;The bank mapper rabbit hole&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;As of February 24, 2017, it &lt;a href="https://thewire.in/government/india-inc-needs-to-fix-numerous-basic-%20information-security-flaws-quickly)" target="_blank"&gt;was possible&lt;/a&gt; to retrieve bank linking status information directly from UIDAI’s website without any prior verification.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;However, after this information was reported, the ‘&lt;a href="https://uidai.gov.in/" rel="noopener" target="_blank"&gt;uidai.gov.in&lt;/a&gt;’ website was updated to first require requesters to prove their identity before retrieving Aadhaar bank-linking data from the endpoint on their website.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;A year later – when business technology news site &lt;i&gt;ZDNet &lt;/i&gt;published their report regarding a flawed API on the website of a state-owned utility company (later revealed to be Indane) – part of the data revealed included bank linking status information which was identical to what was previously revealed on UIDAI’s website without proper authentication.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This suggests that both the Indane API and UIDAI website utilised the National Payments Corporation of India (NPCI) to retrieve bank-linking data – but as of now, this remains conjecture since Indane never put out a statement or gave a public comment regarding the flawed API on their website.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;More importantly, what this also suggests is that the NPCI never placed any controls or security mechanisms (such as request throttling or access controls) on the lookup requests it processed for the UIDAI (and seemingly for Indane as well).&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This means that while the UIDAI may have fixed their website to not reveal bank linking data without proper verification – the issue was not rectified at its core by the NPCI – allowing the same to happen a year later in Indane’s case. This practice also classifies as a case of security through obscurity, &lt;a href="http://users.softlab.ntua.gr/~taver/security/secur3.html" rel="noopener" target="_blank"&gt;which&lt;/a&gt; “is the belief that a system of any sort can be secure so long as nobody outside of its implementation group is allowed to find out anything about its internal mechanisms”.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Who is on the hook?&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;There is a lack of needed accountability when it comes to data breaches. Have any of the organisations against whom allegations of data breach been made been investigated and acted on? Have fines been imposed on those responsible for allowing access/theft of user data? Have there been reports published by any of the affected organisations in which they investigate any alleged breaches to either provide insight regarding the breach and its impact, the scale of data accessed, logs of access and other crucial evidence or dismiss the allegations by proving that there was no intrusion which took place?&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Most of the times, organisations do not even accept that a breach has taken place, let alone take responsibility for the same and strive to better protect user data in the future.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Switching to ‘PR spin mode’ should never be the answer when dealing with the data of billion-plus Indian citizens and residents. This can be observed in almost all cases where a breach or security lapse was alleged.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The UIDAI has also acquired the dubious reputation of sending legal notices and slapping cases on journalists and security researchers who seek to highlight the security and privacy problems ailing the Aadhaar infrastructure.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In March 2017, a case against Sameer Kochhar – chairman of the Skoch Group – was filed on the basis of a complaint from Yashwant Kumar of the UIDAI allegedly for “spreading rumours on the internet about vulnerability of the Aadhaar system”. Kochhar had written an article in February 2017 titled “Is a Deep State at Work to Steal Digital India?” in which a request replay attack on biometric Aadhaar authentication was demonstrated.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Two months later, The Centre for Internet and Society published a report regarding several government websites which were inadvertently leaking millions of Aadhaar card numbers. A few days after this report was published, the UIDAI &lt;a href="https://in.reuters.com/article/india-aadhaar-breach/critics-of-aadhaar-project-say-they-have-%20been-harassed-put-under-surveillance-idINKCN1FX1SS" rel="noopener" target="_blank"&gt;sent a legal notice to the organisation&lt;/a&gt;, stating that the people involved with the report had to be “brought to justice”.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In January 2018, an investigative story was published by Rachna Khaira of &lt;em&gt;The Tribune&lt;/em&gt; newspaper – in which she reported that access to an Aadhaar portal was being sold by “agents” for as cheap as Rs 500. In response to this story – the UIDAI first sought to discredit the investigative work by calling it a ‘case of misreporting’ – after which they attempted to downplay the magnitude of the report by citing that biometrics were safe and had not been breached.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Following this, the Delhi crime branch registered an FIR against the reporter and others named in the article on the basis of a complaint by a UIDAI official, with charges ranging from forgery, cheating by impersonation and unauthorised access of a computer system.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In March 2018, &lt;em&gt;ZDNet&lt;/em&gt; published a report about Aadhaar-related data leaking from an unsecured API on a utility provider’s website. This was the result of days of testing to first confirm the existence issue and its scope. It was preempted by more than a month of attempted communication through several channels of communication – email, phone, even direct messages via Twitter – with both Indane and the UIDAI (and even the Indian Consulate in New York).&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;But still, when the report was published after a lack of acknowledgement/response from affected parties, the UIDAI was quick to deny the report as well as any possibility of such a thing occurring. The Aadhaar agency then released a statement in which they said they were ‘contemplating legal action’ against the publication of their report.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Data security and privacy laws won’t do much to affect the dismissive and hostile attitude the UIDAI seems to have regarding the people that investigate and report on security and privacy issues relating to Aadhaar.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Hide and seek&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;In general, when it comes to reports of security breaches and security incidents, many authorities in India prefer playing the blame-game. This was seen latest in response to an internal letter (ironically marked as ‘SECRET’) that was circulated on social media – which mentioned that data was stolen from the Aadhaar Seeding portal of the EPFO by hackers exploiting a known vulnerability in the Apache Struts framework.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Following this – the EPFO &lt;a href="https://economictimes.indiatimes.com/wealth/personal-finance-news/epfo-slams-aadhaar-data-theft-reports-on-social-media/articleshow/63999631.cms?utm_source=WAPusers&amp;amp;utm_medium=whatsappshare&amp;amp;utm_campaign=socialsharebutton&amp;amp;from=mdr" rel="noopener" target="_blank"&gt;quickly switched to PR mode&lt;/a&gt; and publicly issued a statement through their official Twitter account (@socialepfo) denying the breach – saying that “There is no leak from EPFO database. We have already shut down the alleged Aadhaar seeding site run by Common Service Centres on 22.03.2018.”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Every time reports of a potential breach or leak of data circulate, Indian government agencies are quick to come out and announce that no breach has taken place. However, this is always to be taken just on the basis of their saying so, as opposed to the reports which they’re meant to be arguing (in some cases) contain verifiable evidence which is the result of arduous investigative work.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Regardless, passing around the blame and in cases completely denying security incidents is not something authorities should be doing when it concerns the data of more than a billion people.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In response to a recent story by &lt;em&gt;Asia Times&lt;/em&gt; &lt;a href="https://www.thewire.in/government/cracked-aadhaar-enrolment-software-being-sold" rel="noopener" target="_blank"&gt;regarding Aadhaar enrolment software being cracked and sold&lt;/a&gt;, the UIDAI sought to discredit and discount the report through messages shared on their social media profiles – where they stated that the report was “baseless, false, misleading and irresponsible”.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The UIDAI should have an interest in protecting any and all data which stems from or relates to Aadhaar as it has to do with a project they are ultimately responsible for. It should not matter whether the leak occurred from a portal on EPFO’s website, an API without proper access controls on Indane’s website, a website of the Andhra Pradesh state government, through biometric request replay attacks, through sold access to admin portals and cracked software, or however else. It should ultimately be the UIDAI’s responsibility to not only be reactive about these issues when they’re brought to light but to do so in such a way which does not hinder reporters from continuing their work.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Additionally, if the UIDAI wishes to keep its systems as secure as they could be – they should proactively seek such reports about flaws or vulnerabilities in critical infrastructure pertaining to their project.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;The way forward&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;In April 2018, the head of the Indian Computer Emergency Response Team (CERT-IN), &lt;a href="https://factordaily.com/vulnerability-reported-cert/" rel="noopener" target="_blank"&gt;rather defensively noted&lt;/a&gt; that “not a single person had reported any incident” to the organisation.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;CERT-In, a part of the IT ministry, is the central agency responsible for dealing with security issues and incidents. To put it bluntly, it has not done a very great job of outreach when it comes to the people it ultimately relies on: security researchers and hackers.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In India, there is an abundance of skills and talent when it comes to IT security and this could be of immense help to organisations responsible for managing critical infrastructure – but only if they cared enough to utilise it to the fullest extent.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Ajay Bhushan Pandey, the CEO of UIDAI,  promised a secure and legal bug reporting environment for the Aadhaar ecosystem sometime in 2017. However, almost a year later, there are no tangible signs of any steps being taken to ensure the same. In fact, the UIDAI would already be straying from their usual course of action if they stopped harassing people reporting on issues of security and privacy with regard to Aadhaar.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;It has been suggested that the UIDAI employ a bug bounty programme – which involves rewarding hackers with monetary compensation or through means such as an addition to a ‘Security Hall of Fame’ as an incentive.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;I personally believe that there is no need for a bug bounty programme in its traditional sense – meaning that UIDAI should not have to provide material incentives to attract hackers to report valid issues to them. Simply acknowledging the work of those that discover and report valid issues should more than likely be incentive enough to get talent on-board.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The US Department of Defense (DoD) employs a similar approach &lt;a href="https://www.hackerone.com/sites/default/files/2018-03/Distributed%20Defense-How%20Governments%20Deploy%20Hacker-Powered%20Security.pdf" rel="noopener" target="_blank"&gt;where they invite hackers from the world&lt;/a&gt; over to test their systems for security vulnerabilities/bugs and then report them in a responsible manner. What the hackers get in return is the acknowledgement of their skill and devotion to ensuring the security of DoD’s platform. Something similar needs to be set up with regard to critical information infrastructures in India so that issues can be reported by anyone who wishes to do so – without hassle and/or fear of persecution hanging over the heads of hackers.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/the-wire-karan-saini-may-11-2018-aadhaar-remains-an-unending-security-nightmare-for-a-billion-indians'&gt;https://cis-india.org/internet-governance/news/the-wire-karan-saini-may-11-2018-aadhaar-remains-an-unending-security-nightmare-for-a-billion-indians&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Admin</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Aadhaar</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2018-05-13T16:28:40Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>




</rdf:RDF>
