<?xml version="1.0" encoding="utf-8" ?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:syn="http://purl.org/rss/1.0/modules/syndication/" xmlns="http://purl.org/rss/1.0/">
<channel rdf:about="https://cis-india.org/search_rss">
  <title>Centre for Internet and Society</title>
  <link>https://cis-india.org</link>
  <description>These are the search results for the query, showing results 701 to 715.</description>
  <image rdf:resource="https://cis-india.org/logo.png"/>

  <items>
    <rdf:Seq>
        <rdf:li rdf:resource="https://cis-india.org/internet-governance/e-governance-identity-privacy.pdf"/>
        <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/due-diligence-project-fgd-by-un-women"/>
        <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/dsci-infosys-roundtable"/>
        <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/dscis-bangalore-chapter-meet"/>
        <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/dsci-bpm-2013-conference-notes"/>
        <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/dsci-best-practices-meet"/>
        <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/driving-in-the-surveillance-society-cameras-rfid-black-boxes"/>
        <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/draft-intl-principles-on-communications-surveillance-and-human-rights"/>
        <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/draft-human-dna-profiling-bill-april-2012"/>
        <rdf:li rdf:resource="https://cis-india.org/internet-governance/hindustan-times-june-8-2018-vidhi-choudhary-draft-bill-proposes-rs-1-crore-fine-3-year-jail-for-data-privacy-violation"/>
        <rdf:li rdf:resource="https://cis-india.org/openness/blog-old/does-the-social-web-need-a-googopoly"/>
        <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/does-the-safe-harbor-program-adequately-address-third-parties-online"/>
        <rdf:li rdf:resource="https://cis-india.org/news/harvard-university-may-13-2014-does-size-matter"/>
        <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/do-we-really-need-an-app-for-that-examining-the-utility-and-privacy-implications-of-india2019s-digital-vaccine-certificates"/>
        <rdf:li rdf:resource="https://cis-india.org/internet-governance/do-we-need-the-aadhar-scheme"/>
    </rdf:Seq>
  </items>

</channel>


    <item rdf:about="https://cis-india.org/internet-governance/e-governance-identity-privacy.pdf">
    <title>E-Governance, Identity &amp; Privacy</title>
    <link>https://cis-india.org/internet-governance/e-governance-identity-privacy.pdf</link>
    <description>
        &lt;b&gt;This chapter will look at different legislation, projects, and policies pertaining to e-governance and identity that India has put in place, and examine both the strengths and the weaknesses of these through the lens of privacy.&lt;/b&gt;
        
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/e-governance-identity-privacy.pdf'&gt;https://cis-india.org/internet-governance/e-governance-identity-privacy.pdf&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>praskrishna</dc:creator>
    <dc:rights></dc:rights>

        <dc:subject>Internet Governance</dc:subject>
        <dc:subject>Privacy</dc:subject>

   <dc:date>2012-09-26T06:17:03Z</dc:date>
   <dc:type>File</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/due-diligence-project-fgd-by-un-women">
    <title>Due Diligence Project FGD by UN Women</title>
    <link>https://cis-india.org/internet-governance/news/due-diligence-project-fgd-by-un-women</link>
    <description>
        &lt;b&gt;On October 11, 2019, Radhika Radhakrishnan attended a focus group discussion at the UN House, New Delhi, organized by UN Women for their multi-country research study on online violence (Due Diligence Project).&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The purpose of the discussion was to provide a better understanding of the nature and scope of this form of violence against women and girls (VAWG), and to provide recommendations to inform policies, plans, programming and advocacy on the issue.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/due-diligence-project-fgd-by-un-women'&gt;https://cis-india.org/internet-governance/news/due-diligence-project-fgd-by-un-women&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Admin</dc:creator>
    <dc:rights></dc:rights>
        <dc:subject>Due Diligence</dc:subject>
        <dc:subject>Internet Governance</dc:subject>
        <dc:subject>Privacy</dc:subject>

   <dc:date>2019-10-20T07:11:13Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/dsci-infosys-roundtable">
    <title>DSCI-Infosys Roundtable</title>
    <link>https://cis-india.org/internet-governance/news/dsci-infosys-roundtable</link>
    <description>
        &lt;b&gt;Sunil Abraham participated as a speaker in this meeting organized by Infosys in Bangalore on March 25, 2019.&lt;/b&gt;
        &lt;p&gt;AGENDA:&lt;/p&gt;
&lt;table&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;
&lt;p align="center"&gt;10:00-10:15 AM&lt;/p&gt;
&lt;/td&gt;
&lt;td&gt;
&lt;p&gt;Opening Remarks: Infosys&lt;/p&gt;
&lt;p&gt;Context Setting: DSCI and Infosys&lt;/p&gt;
&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;
&lt;p align="center"&gt;10:15-11:00 AM&lt;/p&gt;
&lt;/td&gt;
&lt;td&gt;
&lt;p&gt;&lt;b&gt;Elements shaping the Data Economy&lt;/b&gt;&lt;/p&gt;
&lt;p&gt;§ Digitization: Personalization, Experience, Productivity &amp;amp; Possibilities&lt;/p&gt;
&lt;p&gt;§ Global Internet Platforms: Transforming B2C and B2B&lt;/p&gt;
&lt;p&gt;§ Phantomization of Technology &amp;amp; Business Models&lt;/p&gt;
&lt;p&gt;§ Changing nature of deliveries: value-driven, subscription-based and platform-based&lt;/p&gt;
&lt;p&gt;§ Product Economy: Data-centric Designs, Start-ups and Unicorns&lt;/p&gt;
&lt;p&gt;§ IoT and Industry 4.0: Next-generation service &amp;amp; business lines&lt;/p&gt;
&lt;p&gt;§ Data flow and how it’s shaping trade in goods and services&lt;/p&gt;
&lt;p&gt;§ Role of data in delivering public services and improving public order&lt;/p&gt;
&lt;p&gt;§ Artificial Intelligence: at the specific product/service level and its ramifications for the industrial and national economy&lt;/p&gt;
&lt;p&gt;§ Technology: role of data in developing next-generation tech platforms&lt;/p&gt;
&lt;p align="right"&gt;&lt;i&gt;Discussion Facilitation: DSCI and Infosys&lt;/i&gt;&lt;/p&gt;
&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;
&lt;p align="center"&gt;11:00-11:45 AM&lt;/p&gt;
&lt;/td&gt;
&lt;td&gt;
&lt;p&gt;&lt;b&gt;Tech’s Dilemmas&lt;/b&gt;&lt;/p&gt;
&lt;p&gt;§ Scale and reach of Big Tech: Industrial Capitalism versus Internet Capitalism&lt;/p&gt;
&lt;p&gt;§ Competition&lt;/p&gt;
&lt;p&gt;§ Influence on personal, social, transactional, economic and political life&lt;/p&gt;
&lt;p&gt;§ Stressed relations with the values of the modern value system&lt;/p&gt;
&lt;p&gt;§ Ethical issues: human rights, social harmony, public space decency, healthy electoral processes, information warfare...&lt;/p&gt;
&lt;p&gt;§ Data Privacy&lt;/p&gt;
&lt;p&gt;§ Tech’s response: locking down of data, editorial/censorship controls...&lt;/p&gt;
&lt;p&gt;§ Challenges of law enforcement, fraud management and supervision&lt;/p&gt;
&lt;p&gt;§ Relevance to national security objectives&lt;/p&gt;
&lt;p&gt;......&lt;/p&gt;
&lt;p&gt;§ Principles of Responsible Innovation&lt;/p&gt;
&lt;p&gt;§ Ideas under discussion/experimentation&lt;/p&gt;
&lt;p align="right"&gt;&lt;i&gt;Discussion Facilitation: DSCI and Infosys&lt;/i&gt;&lt;/p&gt;
&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;
&lt;p align="center"&gt;11:45 AM-12:15 PM&lt;/p&gt;
&lt;/td&gt;
&lt;td&gt;
&lt;p&gt;&lt;b&gt;Shaping the Data Economy&lt;/b&gt;&lt;/p&gt;
&lt;p&gt;§ Structures and approaches: state-controlled, private sector-led, decentralized&lt;/p&gt;
&lt;p&gt;§ Directions: legal/policy, innovation, investments, architectures (like India Stack)&lt;/p&gt;
&lt;p&gt;§ Searching for the role of liberal economic principles&lt;/p&gt;
&lt;p&gt;§ Open architectures and an open data ecosystem&lt;/p&gt;
&lt;p&gt;§ Positions, obligations, burdens and liabilities for protecting rights, creating a level playing field, ensuring competition...&lt;/p&gt;
&lt;p&gt;§ Regulatory approaches: establishing supervisory controls&lt;/p&gt;
&lt;p&gt;§ National security: interventions, mandates and cooperation&lt;/p&gt;
&lt;p align="right"&gt;&lt;i&gt;Discussion Facilitation: DSCI and Infosys&lt;/i&gt;&lt;/p&gt;
&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;
&lt;p align="center"&gt;12:15-12:30 PM&lt;/p&gt;
&lt;/td&gt;
&lt;td&gt;
&lt;p&gt;&lt;b&gt;Discussion Summary&lt;/b&gt;&lt;/p&gt;
&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;
&lt;p align="center"&gt;12:30 PM onwards&lt;/p&gt;
&lt;/td&gt;
&lt;td&gt;
&lt;p align="center"&gt;&lt;b&gt;Lunch&lt;/b&gt;&lt;/p&gt;
&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/dsci-infosys-roundtable'&gt;https://cis-india.org/internet-governance/news/dsci-infosys-roundtable&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Admin</dc:creator>
    <dc:rights></dc:rights>

        <dc:subject>Internet Governance</dc:subject>
        <dc:subject>Privacy</dc:subject>

   <dc:date>2019-04-05T02:06:00Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/dscis-bangalore-chapter-meet">
    <title>DSCI's Bangalore chapter meet</title>
    <link>https://cis-india.org/internet-governance/news/dscis-bangalore-chapter-meet</link>
    <description>
        &lt;b&gt;On January 29, 2019, Karan Saini and Gurshabad Grover participated in the Bangalore chapter meet organized by the Data Security Council of India in Bangalore.&lt;/b&gt;
        &lt;p&gt;&lt;img src="https://cis-india.org/home-images/DSCI.png/@@images/5964984e-07ca-4be0-8a63-98b2490b5032.png" alt="DSCI" class="image-inline" title="DSCI" /&gt;&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/dscis-bangalore-chapter-meet'&gt;https://cis-india.org/internet-governance/news/dscis-bangalore-chapter-meet&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Admin</dc:creator>
    <dc:rights></dc:rights>

        <dc:subject>Internet Governance</dc:subject>
        <dc:subject>Privacy</dc:subject>

   <dc:date>2019-02-02T01:47:51Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/dsci-bpm-2013-conference-notes">
    <title>DSCI Best Practices Meet 2013</title>
    <link>https://cis-india.org/internet-governance/blog/dsci-bpm-2013-conference-notes</link>
    <description>
        &lt;b&gt;The DSCI Best Practices Meet 2013 was organized on July 12, 2013 at Hyatt Regency, Anna Salai in Chennai. Kovey Coles attended the meet and shares a summary of the happenings in this blog post.&lt;/b&gt;
        &lt;hr /&gt;
&lt;p&gt;&lt;i&gt;This research was undertaken as part of the 'SAFEGUARDS' project that CIS is undertaking with Privacy International and IDRC&lt;/i&gt;&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;Last year’s annual Best Practices Meet, sponsored by the Data Security Council of India (DSCI), was held here in Bangalore, and featured CIS associates as panelists for an agenda focused mostly on mobility in technology. This year, the event was continued in nearby Chennai, where many of India’s top stakeholders in cyber security came together at the Hyatt hotel to discuss the modern cyber security landscape. Several of the key points of the day emphasized how industry needed to be especially keen on cyber security today. Early speakers explained how many cyberattacks occur as opportunistic attacks on financial institutions, and that these breaches often take months to be discovered, with the discovery usually being made by a third party. For those reasons, it was repeatedly mentioned throughout the day that modern entities must anticipate attacks as inevitable, and prepare themselves to be able to respond and successfully bounce back.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Several panelists at the event expanded upon the evolving challenges facing industries, and explained why service-based industries continually grow more susceptible to cyberattack. There were representatives from Microsoft, Flextronics, MyEasyDoc, and others, who explained how the technological demands of modern consumers have inadvertently resulted in weaker security. For example, with customers expecting real-time access to data rather than periodic data reports, e.g., financial data reports, industries must now keep their data open, which weakens database security. Overall, the primary challenge faced by the industry was effectively summarized by Microsoft India CSO Ganapathi Subramaniam, who stated that within web services, “Security and usability are inversely proportional.” Essentially, the more convenient a product, the less secure its infrastructure.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Despite discussion of the difficulties facing modern producers and consumers, there were undoubtedly highlights of optimism at the conference. A presentation by event sponsor Juniper Networks shed light on practices which combat cyberattackers, including rerouting perceived Distributed Denial of Service (DDoS) attacks and fingerprinting suspected hackers through a series of characteristics rather than just IP addresses (these characteristics include browser version, fonts, add-ons, time zone, and more). Notably, there was a call for cooperation on all fronts in combatting cybercrime and for public-private partnerships (PPP), and many citizens stood and spoke on behalf of civil society’s incorporation in the process as well. One speaker, Retired Brig. Abhimanyu Ghosh, admirably tore down sector divisions in the face of cyber security threats, saying “We all want to secure ourselves. It is not a question of industry versus government, government versus industry. Government needs industry, and industry needs government.”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Finally, a few speakers used their opportunity at the conference to highlight issues related to the rights and responsibilities of both citizens and government on the internet. Nikhil Moro, a scholar at the Hindu Center for Politics and Public Policy, spoke at length about the urgent condition of laws which undermine freedom of speech and freedom of expression in India, especially online. His talk, which occurred near the end of the event, stirred the crowd to discussion, and helped remind the attendees of the breadth of issues which demand attention in the realm of a growing internet presence.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/dsci-bpm-2013-conference-notes'&gt;https://cis-india.org/internet-governance/blog/dsci-bpm-2013-conference-notes&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>kovey</dc:creator>
    <dc:rights></dc:rights>

        <dc:subject>Internet Governance</dc:subject>
        <dc:subject>Privacy</dc:subject>

   <dc:date>2013-07-26T08:18:01Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/dsci-best-practices-meet">
    <title>DSCI Best Practices Meet</title>
    <link>https://cis-india.org/internet-governance/news/dsci-best-practices-meet</link>
    <description>
        &lt;b&gt;Udbhav Tiwari represented CIS on a panel titled "Reposing Trust in Citizen Identity Systems" at the DSCI Best Practices Meet held at the ITC Gardenia on June 22 and 23, 2017, in Bangalore.&lt;/b&gt;
        &lt;p&gt;The event discussions centred around privacy and Aadhaar.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/dsci-best-practices-meet'&gt;https://cis-india.org/internet-governance/news/dsci-best-practices-meet&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>praskrishna</dc:creator>
    <dc:rights></dc:rights>

        <dc:subject>Internet Governance</dc:subject>
        <dc:subject>Privacy</dc:subject>

   <dc:date>2017-07-07T01:39:23Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/driving-in-the-surveillance-society-cameras-rfid-black-boxes">
    <title>Driving in the Surveillance Society: Cameras, RFID tags and Black Boxes...</title>
    <link>https://cis-india.org/internet-governance/blog/driving-in-the-surveillance-society-cameras-rfid-black-boxes</link>
    <description>
        &lt;b&gt;In this post, Maria Xynou looks at red light cameras, RFID tags and black boxes used to monitor vehicles in India.&lt;/b&gt;
        &lt;hr /&gt;
&lt;p&gt;&lt;i&gt;This research was undertaken as part of the 'SAFEGUARDS' project that CIS is undertaking with Privacy International and IDRC&lt;/i&gt;.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;How many times in your life have you heard of people being involved in car accidents and of pedestrians being hit by red-light-running vehicles? What if there could be a solution for all of this? Well, several countries, including the United States, the United Kingdom and Singapore, have &lt;a href="http://www.thenewspaper.com/rlc/docs/syn310.pdf"&gt;already adopted measures&lt;/a&gt; to tackle vehicle accidents and fatalities, some of which include traffic enforcement cameras and other security measures. India is currently joining the league by not only installing red light cameras, but by also including radio frequency identification (RFID) tags on vehicles’ number plates, as well as by installing electronic toll collection systems and black boxes in some automobiles. Although such measures could potentially increase our safety, &lt;a href="http://arstechnica.com/tech-policy/2012/09/your-car-tracked-the-rapid-rise-of-license-plate-readers/2/"&gt;privacy concerns&lt;/a&gt; have arisen as it remains unclear how the data collected will be used.&lt;/p&gt;
&lt;h2&gt;&lt;b&gt;Red light cameras&lt;/b&gt;&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Last week, the Chennai police announced that it plans&lt;/span&gt;&lt;a href="http://articles.timesofindia.indiatimes.com/2011-05-12/chennai/29535601_1_red-light-camera-system-red-light-cameras-traffic-signals"&gt; to install traffic enforcement cameras&lt;/a&gt;&lt;span&gt;, otherwise known as red light cameras, at 240 traffic signals over the coming months, in order to put an end to car thefts in the city. Red light cameras, which capture images of vehicles entering an intersection against a red traffic light, have been installed in Bangalore since &lt;/span&gt;&lt;a href="http://www.traffictechnologytoday.com/news.php?NewsID=2767"&gt;early 2008&lt;/a&gt;&lt;span&gt; and a&lt;/span&gt;&lt;a href="http://ibnlive.in.com/news/study-finds-red-light-cameras-cuts-crashes/142065-57-132.html"&gt; study&lt;/a&gt;&lt;span&gt; indicates that they have reduced traffic violation rates. A &lt;/span&gt;&lt;a href="http://www.thenewspaper.com/rlc/docs/syn310.pdf"&gt;2003 report by the National Cooperative Highway Research Programme (NCHRP)&lt;/a&gt;&lt;span&gt; examined studies from the previous 30 years in the United States, the United Kingdom, Australia and Singapore and concluded that red light cameras ‘improve the overall safety of intersections when they are used’.&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;&lt;/span&gt;&lt;span&gt;However, how are traffic violation rates even measured? According to &lt;/span&gt;&lt;a href="http://blogs.wsj.com/numbersguy/seeing-red-1208/"&gt;Barbara Langland Orban&lt;/a&gt;&lt;span&gt;, an associate professor of health policy and management at the University of South Florida:&lt;/span&gt;&lt;/p&gt;
&lt;blockquote class="italized"&gt;&lt;i&gt;“Safety is measured in crashes, in particular injury crashes, and violations are not a proxy for injuries. Also, violations can be whatever number an agency chooses to report, which is called an ‘endogenous variable’ in research and not considered meaningful as the number can be manipulated. In contrast, injuries reflect the number of people who seek medical care, which cannot be manipulated by the reporting methods of jurisdictions.”&lt;/i&gt;&lt;/blockquote&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Last year, the Bombay state government informed the High Court that the &lt;/span&gt;&lt;a href="http://www.indianexpress.com/news/cctvs-not-fit-to-detect-traffic-violations-state-to-hc/910392"&gt;100 CCTV cameras&lt;/a&gt;&lt;span&gt; installed at traffic junctions in 2006-2007 were unsuitable for traffic enforcement because they lacked the capacity for automatic processing. Nonetheless, red light cameras, which are capable of monitoring speed and intersections with stop signals, are currently proliferating in India. Yet, questions remain: Do red light cameras adequately increase public safety? Do they serve financial interests? Do they violate drivers’ &lt;/span&gt;&lt;a href="http://www.thehindu.com/opinion/op-ed/of-constitutional-due-process/article436586.ece"&gt;due-process rights&lt;/a&gt;&lt;span&gt;?&lt;/span&gt;&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;&lt;b&gt;RFID tags and Black Boxes&lt;/b&gt;&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;A communication revolution is upon us, as the Maharashtra state transport department is currently including radio &lt;/span&gt;&lt;a href="http://www.dnaindia.com/mumbai/report_maharashtra-rto-spy-to-breathe-down-drivers-neck_1625521"&gt;frequency identification (RFID) tags on each and every number plate of vehicles&lt;/a&gt;&lt;span&gt;. This ultimately means that the state will be able to monitor your vehicle’s real-time movement and track your whereabouts. RFID tags are supposedly used not only to increase public safety by tracking down offenders, but also to streamline public transport timetables. Thus, the movement of buses and cars would be precisely monitored and would provide passengers minute-to-minute information at bus stops. Following the &lt;/span&gt;&lt;a href="http://www.hsrpdelhi.com/Rule50.pdf"&gt;2001 amendment of Rule 50 of the Central Motor Vehicles Rules&lt;/a&gt;&lt;span&gt;, 1989, new number plates with RFID tags have been made mandatory for all types of motor vehicles throughout India.&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;RFID technology has also been launched at Maharashtra’s &lt;/span&gt;&lt;a href="http://articles.timesofindia.indiatimes.com/2012-08-18/mumbai/33261046_1_rfid-stickers-border-check-posts"&gt;state border check-posts&lt;/a&gt;&lt;span&gt;. Since last year, the state government has been circulating RFID stickers to trucks, trailers and tankers, which would not only result in heavy goods vehicles not having to wait in long queues for clearance at check-posts, but would also supposedly put an end to corruption by RTO officials.&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;By &lt;/span&gt;&lt;a href="http://articles.timesofindia.indiatimes.com/2013-03-07/mumbai/37530519_1_plazas-on-national-highways-toll-plazas-toll-collection"&gt;31 March 2014&lt;/a&gt;&lt;span&gt;, it is estimated that RFID-based electronic toll collection (ETC) systems will be installed on all national highways in India. According to &lt;/span&gt;&lt;a href="http://netindian.in/news/2013/03/05/00023379/electronic-toll-collection-all-national-highways-march-2014-joshi"&gt;Dr. Joshi&lt;/a&gt;&lt;span&gt;, the Union Minister for Road Transport and Highways:&lt;/span&gt;&lt;/p&gt;
&lt;blockquote class="italized" style="text-align: justify; "&gt;&lt;i&gt;“&lt;/i&gt;&lt;i&gt;The RFID technology&lt;/i&gt;&lt;i&gt; shall expedite the clearing of traffic at toll plazas and the need of carrying cash shall also be eliminated when toll plazas shall be duly integrated with each other throughout India.”&lt;/i&gt;&lt;/blockquote&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Although Dr. Joshi’s mission to create a quality highway network across India and to increase the transparency of the system seems rational, the ETC system raises privacy concerns, as it &lt;/span&gt;&lt;a href="http://articles.timesofindia.indiatimes.com/2013-03-07/mumbai/37530519_1_plazas-on-national-highways-toll-plazas-toll-collection"&gt;uniquely identifies each vehicle&lt;/a&gt;&lt;span&gt;, collects data and provides general vehicle and traffic monitoring. This could potentially lead to a privacy violation, as India currently lacks adequate statutory provisions which could safeguard the use of our data from potential abuse. All we know is that our vehicles are being monitored, but it remains unclear how the data collected will be used, shared and retained, which raises concerns.&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;The cattle and pedestrians roaming the streets in India appear to have increased the need for the installation of an &lt;/span&gt;&lt;a href="http://www.thehindu.com/news/national/article3636417.ece"&gt;Event Data Recorder (EDR)&lt;/a&gt;&lt;span&gt;, otherwise known as a black box, which is a device capable of recording information related to crashes or accidents. The purpose of a black box is to record the speed of the vehicle at the point of impact in the case of an accident and whether the driver had applied the brakes. This would help insurance companies in deciding whether or not to entertain insurance claims, as well as to determine whether a driver is responsible for an accident.&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Black boxes for vehicles are already being designed, tested and installed in some vehicles in India at an affordable cost. In fact, manufacturers in India have recommended that the government make it &lt;/span&gt;&lt;a href="http://www.thehindu.com/news/national/article3636417.ece"&gt;mandatory for cars&lt;/a&gt;&lt;span&gt; to be fitted with the device, rather than it being optional. But can we have privacy when our cars are being monitored? This is essentially a case of proactive monitoring which has not been adequately justified yet, as it remains unclear how information would be used, who would be authorised to use and share such information, and whether its use would be accounted for to the individual.&lt;/span&gt;&lt;/p&gt;
&lt;h2&gt;&lt;b&gt;Are monitored cars safer?&lt;/b&gt;&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;The trade-off is clear: the privacy and anonymity of our movements are given up in exchange for the promise of safety. But are we even getting any safety in return? According to a &lt;/span&gt;&lt;a href="http://www.fhwa.dot.gov/publications/research/safety/05049/05049.pdf"&gt;2005 Federal Highway Administration study&lt;/a&gt;&lt;span&gt;, although intersections with cameras show a decrease in front-into-side crashes, they also show an increase in rear-end crashes. Other&lt;/span&gt;&lt;a href="http://www.techdirt.com/articles/20091218/1100537428.shtml"&gt; studies&lt;/a&gt;&lt;span&gt; of red light cameras in the US have shown that more accidents have occurred since the installation of traffic enforcement cameras at intersections. Although no such research has been undertaken in India yet, the effectiveness, necessity and utility of red light cameras remain ambiguous.&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Furthermore, there have been &lt;/span&gt;&lt;a href="http://www.usatoday.com/story/news/nation/2013/03/08/speed-camera-ruling/1974369/"&gt;claims&lt;/a&gt;&lt;span&gt; that the installation of red light cameras, ETCs, RFID tags, black boxes and other technologies does not primarily serve the purpose of public security, but financial gain. A huge debate has arisen in the United States on whether such monitoring of vehicles actually improves safety, or whether its primary objective is to serve financial interests. Red light cameras have already generated about $1.5 million in fines in the village of Elmwood Place, Ohio, which leads critics to believe that the installation of such cameras has more to do with revenue enhancement than safety. The same type of question applies to India, and yet a clear-cut answer has not been reached.&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Companies which manufacture &lt;/span&gt;&lt;a href="http://dir.indiamart.com/impcat/vehicle-tracking-systems.html"&gt;vehicle tracking systems&lt;/a&gt;&lt;span&gt; are widespread in India, which makes the monitoring of our cars a vivid reality. Yet, there is a lack of statutory provisions in India for the privacy of our vehicles’ real-time movements and hence, we are being monitored without any safeguards. Major privacy concerns arise in regard to the monitoring of vehicles in India, as the following questions have not been adequately addressed: What type of data is collected in India through the monitoring of vehicles? Who can legally authorize access to such data? Who can have access to such data and under what conditions? Is data being shared with third parties and, if so, under what conditions? How long is such data retained?&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;And more importantly: Why is it important to address the above questions? Does it even matter if the movement of our vehicles is being monitored? How would that affect us personally? Well, the monitoring of our cars implies a huge probability that it´s not our vehicles per se which are under the microscope,&lt;/span&gt;&lt;a href="http://www.farnish.plus.com/amatterofscale/mirrors/omni/surveillance.htm"&gt; but us&lt;/a&gt;&lt;span&gt;. And while the tracking of our movement might not end us up arrested, interrogated, tortured or imprisoned tomorrow...it might in the future. As long as we are being monitored,&lt;/span&gt;&lt;a href="http://www.samharris.org/blog/item/the-trouble-with-profiling"&gt; we are all suspects&lt;/a&gt;&lt;span&gt; and we may potentially be treated as any other offender who is suspected to have committed a crime. The current statutory omission in India to adequately regulate the use of traffic enforcement cameras, RFID tags, black boxes and other technologies used to track and monitor the movement of our vehicles can potentially violate our due process rights and infringe upon our right to privacy and other human rights. Thus, the collection, access, use, analysis, sharing and retention of data acquired through the monitoring of vehicles in India should be strictly regulated to ensure that we are not exposed to our defenceless control.&lt;/span&gt;&lt;/p&gt;
&lt;h2&gt;&lt;b&gt;Maneuvering our monitoring&lt;/b&gt;&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Nowadays, surveillance appears to be the quick-fix solution for everything related to public security; but that does not need to be the case.&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Instead of installing red light cameras monitoring our cars´ movements and bombarding us with fines, other ´simple´ measures could be enforced in India, such as&lt;/span&gt;&lt;a href="http://d2dtl5nnlpfr0r.cloudfront.net/tti.tamu.edu/documents/0-4196-2.pdf"&gt; increasing the duration of the yellow light&lt;/a&gt;&lt;span&gt; between the green and the red, &lt;/span&gt;&lt;a href="http://www.motorists.org/red-light-cameras/alternatives"&gt;re-timing lights&lt;/a&gt;&lt;span&gt; so drivers will encounter fewer red ones or increasing the visibility distance of the traffic lights so that it is more likely for a driver to stop. Such measures should be enforced by governments, especially since the monitoring of our vehicles is not adequately justified.&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Strict laws regulating the use of all technologies monitoring vehicles in India, whether red light cameras, RFID tags or black boxes, should be enacted now. Such regulations should clearly specify the terms of monitoring vehicles, as well as the conditions under which data can be collected, accessed, shared, used, processed and stored. The enactment of regulations on the monitoring of vehicles in India could minimize the potential for citizens´ due process rights to be breached, as well as to ensure that their right to privacy and other human rights are legally protected. This would just be another step towards preventing ubiquitous surveillance and if governments are interested in protecting their citizens´ human rights as they claim they do, then there is no debate on the necessity of regulating the monitoring of our vehicles. The question though which remains is:&lt;/span&gt;&lt;/p&gt;
&lt;blockquote class="quoted"&gt;&lt;i&gt;Should we be monitored at all?&lt;/i&gt;&lt;/blockquote&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/driving-in-the-surveillance-society-cameras-rfid-black-boxes'&gt;https://cis-india.org/internet-governance/blog/driving-in-the-surveillance-society-cameras-rfid-black-boxes&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>maria</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>SAFEGUARDS</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2013-07-12T15:26:33Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/draft-intl-principles-on-communications-surveillance-and-human-rights">
    <title>Draft International Principles on Communications Surveillance and Human Rights</title>
    <link>https://cis-india.org/internet-governance/blog/draft-intl-principles-on-communications-surveillance-and-human-rights</link>
    <description>
        &lt;b&gt;These principles were developed by Privacy International and the Electronic Frontier Foundation and seek to define an international standard for the surveillance of communications. The Centre for Internet and Society has been contributing feedback to the principles. &lt;/b&gt;
        &lt;hr /&gt;
&lt;p&gt;The principles are still in draft form. The most recent version can be accessed &lt;a class="external-link" href="http://necessaryandproportionate.net"&gt;here&lt;/a&gt;. &lt;i&gt;This research was undertaken as part of the 'SAFEGUARDS' project that CIS is undertaking with Privacy International and IDRC&lt;/i&gt;.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;Our goal is that these principles will provide civil society groups, industry, and governments with a framework against which we can evaluate whether current or proposed surveillance laws and practices are consistent with human rights. We are concerned that governments are failing to develop legal frameworks to adhere to international human rights and adequately protect communications privacy, particularly in light of innovations in surveillance laws and techniques.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;These principles are the outcome of a consultation with experts from civil society groups and industry across the world. It began with a meeting in Brussels in October 2012 to address shared concerns relating to the global expansion of government access to communications. Since the Brussels meeting we have conducted further consultations with international experts in communications surveillance law, policy and technology.&lt;a href="#fn1" name="fr1"&gt;[1]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;We are now launching a global consultation on these principles. Please send us comments and suggestions by January 3rd 2013, by emailing rights (at) eff (dot) org.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Preamble&lt;/b&gt;&lt;br /&gt;Privacy is a fundamental human right, and is central to the maintenance of democratic societies. It is essential to human dignity and it reinforces other rights, such as freedom of expression and association, and is recognised under international human rights law.&lt;a href="#fn2" name="fr2"&gt;[2]&lt;/a&gt; Activities that infringe on the right to privacy, including the surveillance of personal communications by public authorities, can only be justified where they are necessary for a legitimate aim, strictly proportionate, and prescribed by law.&lt;a href="#fn3" name="fr3"&gt;[3]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Before public adoption of the Internet, well-established legal principles and logistical burdens inherent in monitoring communications generally limited access to personal communications by public authorities. In recent decades, those logistical barriers to mass surveillance have decreased significantly. The explosion of digital communications content and information about communications, or “communications metadata”, the falling cost of storing and mining large sets of data, and the commitment of personal content to third party service providers make surveillance possible at an unprecedented scale.&lt;a href="#fn4" name="fr4"&gt;[4]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;While it is universally accepted that access to communications content must only occur in exceptional situations, the frequency with which public authorities are seeking access to information about an individual’s communications or use of electronic devices is rising dramatically—without adequate scrutiny. &lt;a href="#fn5" name="fr5"&gt;[5]&lt;/a&gt; When accessed and analysed, communications metadata may create a profile of an individual's private life, including medical conditions, political and religious viewpoints, interactions and interests, disclosing even greater detail than would be discernible from the content of a communication alone. &lt;a href="#fn6" name="fr6"&gt;[6]&lt;/a&gt; Despite this, legislative and policy instruments often afford communications metadata a lower level of protection and do not place sufficient restrictions on how they can be subsequently used by agencies, including how they are data-mined, shared, and retained.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;It is therefore necessary that governments, international organisations, civil society and private service providers articulate principles establishing the minimum necessary level of protection for digital communications and communications metadata (collectively "information") to match the goals articulated in international instruments on human rights— including a democratic society governed by the rule of law. The purpose of these principles is to:&lt;/p&gt;
&lt;ol&gt;
&lt;li style="text-align: justify; "&gt;Provide guidance for legislative changes and advancements related to communications and   communications metadata to ensure that pervasive use of modern  communications technology does not result in an erosion of privacy.&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;Establish appropriate      safeguards to regulate access by public authorities (government agencies,      departments, intelligence services or law enforcement agencies) to      communications and communications metadata about an individual’s use of an      electronic service or communication media. &lt;/li&gt;
&lt;/ol&gt;
&lt;p style="text-align: justify; "&gt;We call on governments to establish stronger protections as required by their constitutions and human rights obligations, or as they recognize that technological changes or other factors require increased protection.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;These principles focus primarily on rights to be asserted against state surveillance activities. We note that governments are required not only to respect human rights in their own conduct, but to protect and promote the human rights of individuals in general.&lt;a href="#fn7" name="fr7"&gt;[7]&lt;/a&gt; Companies are required to follow data protection rules and yet are also compelled to respond to lawful requests. Like other initiatives,&lt;a href="#fn8" name="fr8"&gt;[8]&lt;/a&gt; we hope to provide some clarity by providing the below principles on how state surveillance laws must protect human rights.&lt;/p&gt;
&lt;p&gt;&lt;b&gt;The Principles&lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Legality&lt;/b&gt;: Any limitation to the right to privacy must be prescribed by law. Neither the Executive nor the Judiciary may adopt or implement a measure that interferes with the right to privacy without a previous act by the Legislature that results from a comprehensive and participatory process. Given the rate of technological change, laws enabling limitations on the right to privacy should be subject to periodic review by means of a participatory legislative or regulatory process&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Legitimate Purpose&lt;/b&gt;: Laws should only allow access to communications or communications metadata by authorised public authorities for investigative purposes and in pursuit of a legitimate purpose, consistent with a free and democratic society.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Necessity&lt;/b&gt;: Laws allowing access to communications or communications metadata by authorised public authorities should limit such access to that which is strictly and demonstrably necessary, in the sense that an overwhelmingly positive justification exists, and justifiable in a democratic society in order for the authority to pursue its legitimate purposes, and which the authority would otherwise be unable to pursue. The onus of establishing this justification, in judicial as well as in legislative processes, is on the government.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Adequacy&lt;/b&gt;: Public authorities should restrain themselves from adopting or implementing any measure of intrusion allowing access to communications or communications metadata that is not appropriate for fulfillment of the legitimate purpose that justified establishing that measure.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Competent Authority&lt;/b&gt;: Authorities capable of making determinations relating to communications or communications metadata must be competent and must act with independence and have adequate resources in exercising the functions assigned to them.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Proportionality&lt;/b&gt;: Public authorities should only order the preservation and access to specifically identified, targeted communications or communications metadata on a case-by-case basis, under a specified legal basis. Competent authorities must ensure that all formal requirements are fulfilled and must determine the validity of each specific attempt to access or receive communications or communications metadata, and that each attempt is proportionate in relation to the specific purposes of the case at hand. Communications and communications metadata are inherently sensitive and their acquisition should be regarded as highly intrusive. As such, requests should &lt;b&gt;at a minimum&lt;/b&gt; establish a) that there is a very high degree of probability that a serious crime has been or will be committed; b) and that evidence of such a crime would be found by accessing the communications or communications metadata sought; c) other less invasive investigative techniques have been exhausted; and d) that a plan to ensure that the information collected will be only that information reasonably related to the crime and that any excess information collected will be promptly destroyed or returned. Neither the scope of information types, the number or type of persons whose information is sought, the amount of data sought, the retention of that data held by the authorities, nor the level of secrecy afforded to the request should go beyond what is demonstrably necessary to achieve a specific investigation.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Due process&lt;/b&gt;: Due process requires that governments must respect and guarantee an individual’s human rights, that any interference with such rights must be authorised in law, and that the lawful procedure that governs how the government can interfere with those rights is properly enumerated and available to the general public.&lt;a href="#fn9" name="fr9"&gt;[9]&lt;/a&gt;While criminal investigations and other considerations of public security and safety may warrant limited access to information by public authorities, the granting of such access must be subject to guarantees of procedural fairness. Every request for access should be subject to prior authorisation by a competent authority, except when there is imminent risk of danger to human life. &lt;a href="#fn10" name="fr10"&gt;[10]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;User notification&lt;/b&gt;: Notwithstanding the notification and transparency requirements that governments should bear, service providers should notify a user that a public authority has requested his or her communications or communications metadata with enough time and information about the request so that a user may challenge the request. In specific cases where the public authority wishes to delay the notification of the affected user or in an emergency situation where sufficient time may not be reasonable, the authority should be obliged to demonstrate that such notification would jeopardize the course of investigation to the competent judicial authority reviewing the request. In such cases, it is the responsibility of the public authority to notify the individual affected and the service provider as soon as the risk is lifted or after the conclusion of the investigation, whichever is sooner.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Transparency about use of government surveillance&lt;/b&gt;: The access capabilities of public authorities and the process for access should be prescribed by law and should be transparent to the public. The government and service providers should provide the maximum possible transparency about the access by public authorities without imperiling ongoing investigations, and with enough information so that individuals have sufficient knowledge to fully comprehend the scope and nature of the law, and when relevant, challenge it. Service providers must also publish the procedure they apply to deal with data requests from public authorities.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Oversight&lt;/b&gt;: An independent oversight mechanism should be established to ensure transparency of lawful access requests. This mechanism should have the authority to access information about public authorities' actions, including, where appropriate, access to secret or classified information, to assess whether public authorities are making legitimate use of their lawful capabilities, and to publish regular reports and data relevant to lawful access. This is in addition to any oversight already provided through another branch of government such as parliament or a judicial authority. This mechanism must provide – at a minimum – aggregate information on the number of requests, the number of requests that were rejected, and a specification of the number of requests per service provider and per type of crime. &lt;a href="#fn11" name="fr11"&gt;[11]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Integrity of communications and systems&lt;/b&gt;: It is the responsibility of service providers to transmit and store communications and communications metadata securely and to a degree that is minimally necessary for operation. It is essential that new communications technologies incorporate security and privacy in the design phases. In order, in part, to ensure the integrity of the service providers’ systems, and in recognition of the fact that compromising security for government purposes almost always compromises security more generally, governments shall not compel service providers to build surveillance or monitoring capability into their systems. Nor shall governments require that these systems be designed to collect or retain particular information purely for law enforcement or surveillance purposes. Moreover, &lt;i&gt;a priori&lt;/i&gt; data retention or collection should never be required of service providers and orders for communications and communications metadata preservation must be decided on a case-by-case basis. Finally, present capabilities should be subject to audit by an independent public oversight body.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Safeguards for international cooperation&lt;/b&gt;: In response to changes in the flows of information and the technologies and services that are now used to communicate, governments may have to work across borders to fight crime. Mutual legal assistance treaties (MLATs) should ensure that, where the laws of more than one state could apply to communications and communications metadata, the higher/highest of the available standards should be applied to the data. Mutual legal assistance processes and how they are used should also be clearly documented and open to the public. The processes should distinguish between when law enforcement agencies can collaborate for purposes of intelligence as opposed to sharing actual evidence. Moreover, governments cannot use international cooperation as a means to surveil people in ways that would be unlawful under their own laws. States must verify that the data collected or supplied, and the mode of analysis under MLAT, is in fact limited to what is permitted. In the absence of an MLAT, service providers should not respond to requests of the government of a particular country requesting information of users if the requests do not include the same safeguards as providers would require from domestic authorities, and the safeguards do not match these principles.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Safeguards against illegitimate access&lt;/b&gt;: To protect individuals against unwarranted attempts to access communications and communications metadata, governments should ensure that those authorities and organisations who initiate, or are complicit in, unnecessary, disproportionate or extra-legal interception or access are subject to sufficient and significant dissuasive penalties, including protection and rewards for whistleblowers, and that individuals affected by such activities are able to access avenues for redress. Any information obtained in a manner that is inconsistent with these principles is inadmissible as evidence in any proceeding, as is any evidence derivative of such information.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Cost of surveillance&lt;/b&gt;: The financial cost of providing access to user data should be borne by the public authority undertaking the investigation. Financial constraints place an institutional check on the overuse of orders, but the payments should not exceed the service provider’s actual costs for reviewing and responding to orders, as such would provide a perverse financial incentive in opposition to user’s rights.&lt;/p&gt;
&lt;p&gt;&lt;b&gt;Signatories&lt;/b&gt;&lt;/p&gt;
&lt;p&gt;&lt;b&gt;Organisations&lt;/b&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Article 19 (International)&lt;/li&gt;
&lt;li&gt;Bits of Freedom (Netherlands)&lt;/li&gt;
&lt;li&gt;Center for Internet &amp;amp; Society India (CIS India)&lt;/li&gt;
&lt;li&gt;Derechos Digitales (Chile)&lt;/li&gt;
&lt;li&gt;Electronic Frontier Foundation (International)&lt;/li&gt;
&lt;li&gt;Privacy International (International)&lt;/li&gt;
&lt;li&gt;Samuelson-Glushko Canadian Internet Policy and Public Interest Clinic (Canada)&lt;/li&gt;
&lt;li&gt;Statewatch (UK)&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;b&gt;Individuals&lt;/b&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Renata Avila, human rights lawyer (Guatemala)&lt;/li&gt;
&lt;/ul&gt;
&lt;hr /&gt;
&lt;p&gt;&lt;b&gt;Footnotes&lt;/b&gt;&lt;/p&gt;
&lt;ol&gt;
&lt;p style="text-align: justify; "&gt;[&lt;a href="#fr1" name="fn1"&gt;1&lt;/a&gt;]For more information about the      background to these principles and the process undertaken, see      https://www.privacyinternational.org/blog/towards-international-principles-on-communications-surveillance&lt;br /&gt;[&lt;a href="#fr2" name="fn2"&gt;2&lt;/a&gt;]Universal Declaration of Human      Rights Article 12, United Nations Convention on Migrant Workers Article      14, UN Convention of the Protection of the Child Article 16, International      Covenant on Civil and Political Rights, International Covenant on Civil      and Political Rights Article 17; regional conventions including Article 10      of the African Charter on the Rights and Welfare of the Child, Article 11      of the American Convention on Human Rights, Article 4 of the African Union      Principles on Freedom of Expression, Article 5 of the American Declaration      of the Rights and Duties of Man, Article 21 of the Arab Charter on Human      Rights, and Article 8 of the European Convention for the Protection of      Human Rights and Fundamental Freedoms; Johannesburg Principles on National      Security, Free Expression and Access to Information, Camden Principles on      Freedom of Expression and Equality.&lt;br /&gt;[&lt;a href="#fr3" name="fn3"&gt;3&lt;/a&gt;]Martin Scheinin, “Report of the      Special Rapporteur on the promotion and protection of human rights and      fundamental freedoms while countering terrorism,” p11, available at &lt;a href="http://www2.ohchr.org/english/issues/terrorism/rapporteur/docs/A_HRC_13_37_AEV.pdf"&gt;http://www2.ohchr.org/english/issues/terrorism/rapporteur/docs/A_HRC_13_37_AEV.pdf&lt;/a&gt;.      See also General Comments No. 
27, Adopted by The Human Rights Committee      Under Article 40, Paragraph 4, Of The International Covenant On Civil And      Political Rights, CCPR/C/21/Rev.1/Add.9, November 2, 1999, available at &lt;a href="http://www.unhchr.ch/tbs/doc.nsf/0/6c76e1b8ee1710e380256824005a10a9?Opendocument"&gt;http://www.unhchr.ch/tbs/doc.nsf/0/6c76e1b8ee1710e380256824005a10a9?Opendocument&lt;/a&gt;.&lt;br /&gt;[&lt;a href="#fr4" name="fn4"&gt;4&lt;/a&gt;]Communications metadata may      include information about our identities (subscriber information, device      information), interests, including medical conditions, political and      religious viewpoints (websites visited, books and other materials read,      watched or listened to, searches conducted, resources used), interactions      (origins and destinations of communications, people interacted with,      friends, family, acquaintances), location (places and times, proximities      to others); in sum, logs of nearly every action in modern life, our mental      states, interests, intentions, and our innermost thoughts.&lt;br /&gt;[&lt;a href="#fr5" name="fn5"&gt;5&lt;/a&gt;]For example, in the United      Kingdom alone, there are now approximately 500,000 requests for      communications metadata every year, currently under a self-authorising      regime for law enforcement agencies, who are able to authorise their own      requests for access to information held by service providers. Meanwhile,      data provided by Google’s Transparency reports shows that requests for      user data from the U.S. 
alone rose from 8888 in 2010 to 12,271 in 2011.&lt;br /&gt;[&lt;a href="#fr6" name="fn6"&gt;6&lt;/a&gt;]See as examples, a review of      Sandy Petland’s work, ‘Reality Mining’, in MIT’s Technology Review, 2008,      available at &lt;a href="http://www2.technologyreview.com/article/409598/tr10-reality-mining/"&gt;http://www2.technologyreview.com/article/409598/tr10-reality-mining/&lt;/a&gt; and also see Alberto Escudero-Pascual and Gus Hosein, ‘Questioning lawful      access to traffic data’, Communications of the ACM, Volume 47 Issue 3,      March 2004, pages 77 - 82.&lt;br /&gt;[&lt;a href="#fr7" name="fn7"&gt;7&lt;/a&gt;]Report of the UN Special      Rapporteur on the promotion and protection of the right to freedom of      opinion and expression, Frank La Rue, May 16 2011, available at &lt;a href="http://www2.ohchr.org/english/bodies/hrcouncil/docs/17session/a.hrc.17.27_en.pdf"&gt;http://www2.ohchr.org/english/bodies/hrcouncil/docs/17session/a.hrc.17.27_en.pdf&lt;/a&gt;&lt;br /&gt;[&lt;a href="#fr8" name="fn8"&gt;8&lt;/a&gt;]The Global Network Initiative      establishes standards to help the ICT sector protect the privacy and free      expression of their users. See &lt;a href="http://www.globalnetworkinitiative.org/"&gt;http://www.globalnetworkinitiative.org/&lt;/a&gt;&lt;br /&gt;[&lt;a href="#fr9" name="fn9"&gt;9&lt;/a&gt;]As defined by international and      regional conventions mentioned above.&lt;br /&gt;[&lt;a href="#fr10" name="fn10"&gt;10&lt;/a&gt;]Where judicial review is waived      in such emergency cases, a warrant must be retroactively sought within 24      hours.&lt;br /&gt;[&lt;a href="#fr11" name="fn11"&gt;11&lt;/a&gt;]One example of such a report is      the US Wiretap report, published by the US Court service. Unfortunately      this applies only to interception of communications, and not to access to      communications metadata. 
See &lt;a href="http://www.uscourts.gov/Statistics/WiretapReports/WiretapReport2011.aspx"&gt;http://www.uscourts.gov/Statistics/WiretapReports/WiretapReport2011.aspx&lt;/a&gt;.      The UK Interception of Communications Commissioner publishes a report that      includes some aggregate data but it is does not provide sufficient data to      scrutinise the types of requests, the extent of each access request, the      purpose of the requests, and the scrutiny applied to them. See &lt;a href="http://www.intelligencecommissioners.com/sections.asp?sectionID=2&amp;amp;type=top"&gt;http://www.intelligencecommissioners.com/sections.asp?sectionID=2&amp;amp;type=top&lt;/a&gt;.&lt;/p&gt;
&lt;/ol&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/draft-intl-principles-on-communications-surveillance-and-human-rights'&gt;https://cis-india.org/internet-governance/blog/draft-intl-principles-on-communications-surveillance-and-human-rights&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>elonnai</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>SAFEGUARDS</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2013-07-12T15:55:45Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/draft-human-dna-profiling-bill-april-2012">
    <title>Draft Human DNA Profiling Bill (April 2012): High Level Concerns</title>
    <link>https://cis-india.org/internet-governance/blog/draft-human-dna-profiling-bill-april-2012</link>
    <description>
        &lt;b&gt;In 2007 the Draft Human DNA Profiling Bill was piloted by the Centre for DNA Fingerprinting and Diagnostics, with the objective of regulating the use of DNA for forensic and other purposes. In February 2012 another draft of the Bill was leaked. The February 2012 Bill was drafted by the Department of Biotechnology. Another working draft of the Bill was created in April 2012. The most recent version of the Bill seeks to create DNA databases at the state, regional, and national level. &lt;/b&gt;
        &lt;hr /&gt;
&lt;p&gt;&lt;i&gt;This research was undertaken as part of the 'SAFEGUARDS' project that CIS is undertaking with Privacy International and IDRC&lt;/i&gt;.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;Each database will contain profiles of victims, offenders, suspects, missing persons and volunteers for the purpose of establishing identity in criminal and civil proceedings. The Bill also establishes a process for certifying DNA laboratories, and creating a DNA board for overseeing the carrying out of the Act. Though it is important to carefully regulate the use of DNA for criminal purposes, and such a law is needed in India, the present working draft of the Bill is lacking important safeguards and contains overreaching provisions, which could lead to violation of individual rights. The text of the 2012 draft is still being discussed and has not been finalized.  Below are high level concerns that CIS has with the April 2012 draft Human DNA Profiling Bill.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Broad offences and instances of when DNA can be collected&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The schedule of the Bill lists applicable instances for human DNA profiling and addition to the DNA database. Under this list, the Bill lays out nine Acts, for example the Indian Penal Code and the Protection of Civil Rights Act, and states that offences under these Acts are applicable instances of human DNA profiling. This allows the scope of the database to be expansive, as any individual who has committed an offence found under any of these Acts to be placed on the DNA database, and might include offences for which DNA evidence is not useful.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In the schedule under section C &lt;b&gt;Civil disputes and other civil matters &lt;/b&gt;the Bill lists a number of civil disputes and civil matters for which DNA can be taken and entered onto the database. For example:&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;&lt;i&gt;(v) Issues relating to immigration or emigration &lt;/i&gt;&lt;/li&gt;
&lt;li&gt;&lt;i&gt;(vi) Issues relating to establishment of individual identity &lt;/i&gt;&lt;/li&gt;
&lt;li&gt;&lt;i&gt;(vii) Any other civil matter as may be specified by the regulations of the Board &lt;/i&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p style="text-align: justify; "&gt;In these instances no crime has been committed and there is no justification for taking the DNA of the individual without their consent. In cases of civil disputes&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Recommendation:&lt;i&gt; &lt;/i&gt;&lt;/b&gt;Offences for which DNA can be collected must be criminal and must be specified individually by the Bill. When DNA is used in civil cases, the consent of the individual must be taken. In civil cases a DNA profile should not be stored on the database. DNA profiling and storage on a database should not be allowed in instances like v, vi, vii listed above.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Inadequate level of authorization for sharing of information&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The Bill allows for the DNA Data Bank Manager to determine when it is appropriate to communicate whether the DNA profile received is already contained in the Data Bank, and any other information contained in the Data Bank in relation to the DNA profile received.&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;Section 35 (1): “…&lt;i&gt;shall communicate, for the purposes of the investigation or prosecution in a criminal offence, the following information to a court, tribunal, law enforcement agency, or DNA laboratory in India which the DNA Data Bank Manager considers is concerned with it, appropriate, namely (a) as to whether the DNA profile received is already contained in the Data Bank; and (b) any information, other than the DNA profile received, is contained in the Data Bank in relation to  the DNA profile received.&lt;/i&gt;”&lt;/li&gt;
&lt;/ul&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Recommendation&lt;/b&gt;: The Data Bank Manager should not be given the power to determine appropriate instances for the communication of information. Law enforcement agencies, DNA laboratories, etc. should be required to gain prior authorization, from the DNA Board, before requesting the disclosure of information from the DNA Data Bank Manager. Upon receiving proof of authorization, the DNA databank can share the requested information.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Inaccurate understanding of infallibility of DNA&lt;/h3&gt;
&lt;p&gt;The preamble to the Bill inaccurately states:&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;i&gt;The Dexoxyribose Nucleic Acid (DNA) analysis of body substances is a powerful technology that makes it possible to determine whether the source of origin of one body substance is identical to that of another, and further to establish the biological relationship, if any between two individuals, living or dead without any doubt.&lt;/i&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Recommendation:&lt;i&gt; &lt;/i&gt;&lt;/b&gt;The Bill should recognize that DNA evidence is not infallible. For example, false matches can occur based on the type of profiling system used, and that error can take place in the chain of custody of the DNA sample.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;i&gt;The “definition” of DNA profiling is too loose in the Bill. Any technology used to create DNA profiles is subject to error. The estimate of this error should be experimentally obtained, rather than being a theoretical projection.&lt;/i&gt;&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Inadequate access controls&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The Bill only restricts access to information on the DNA database that relates to a victim or to a person who has been excluded as a suspect in relevant investigations.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;i&gt;Section 43: Access to the information in the National DNA Data Bank shall be restricted in the manner as may be prescribed if the information relates to a DNA profile derived from a) a victim of an offence which forms or formed the object of the relevant investigation, or b) a person who has been excluded as a suspect in the relevant investigation.&lt;/i&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Recommendation:&lt;/b&gt; Though it is important that access is restricted in these instances, access should also be restricted for: volunteers, missing persons, and victims. Broad access to every index in the database should not be permitted when a DNA sample for a crime is being searched for a match. Ideally, a crime scene index will be created, and samples will only be compared to that specific crime scene. The access procedure should be transparent with regular information published in an annual report, minutes of oversight meetings taken, etc.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Lack of standards and process for collection of DNA samples&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;In three places the Bill mentions that a procedure for the collection of DNA profiles will be established, yet no process is enumerated in the actual text of the Bill.&lt;/p&gt;
&lt;ul&gt;
&lt;li style="text-align: justify; "&gt;&lt;i&gt;Section 12 (w) “The Board will have the power to… specify by regulation, the list of applicable instances of human DNA profiling and the sources and manner of collection of samples in addition to the lists contained in the Schedule. &lt;/i&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;ul&gt;
&lt;li style="text-align: justify; "&gt;&lt;i&gt;Section 66(d) “The Central Government will have the power to make Rules pertaining to… The list of applicable instances of human DNA profiling and the sources and manner of collection of samples in addition to the lists contained in the Schedule under clause (w) of section 12. &lt;/i&gt;&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;&lt;i&gt;Schedule: In the title “List of applicable instances of Human DNA Profiling and Sources and Manner of Collection of Samples for DNA Profiling”. But the schedule does not detail the manner of collection of samples for DNA profiling&lt;/i&gt;.&lt;/li&gt;
&lt;/ul&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Recommendation&lt;/b&gt;: According to the Criminal Procedure Code, section 53 and 54, DNA samples can only be collected by certified medical professionals. This must be reflected by the Bill. The Bill should also state that the collection of DNA must take place in a secure location and in a secure manner. When DNA is collected, consent must be taken, unless the individual is convicted of a crime for which DNA evidence is directly relevant or the court has ordered the collection. When DNA is collected, personal identification information should not be sent with samples to laboratories, and all transfers of data (from police station to lab) must be secure. Upon collection, information regarding the collection of information and potential use and misuse of DNA information must be provided to the individual.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Inadequate appeal process&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The provisions in the Bill allow aggrieved individuals to bring complaints to the DNA Board. If the complaint is not addressed, the individual can take the complaint to the court. Though grievances can be taken to the Board and the court, it is not clear if the individual has the right to appeal the collection, analysis, sharing, and use of his/her DNA. The text of section 58 implies that the Board and the Central government will have the power to take action based on complaints. This power was not listed above in the sections where the powers of the board and the central government are defined, thus it is unclear what actions the Board or the Central Government would be able to take on complaint.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;i&gt;Section 58: No court shall take cognizance of any offence punishable under this Act or any rules or regulations made thereunder save on a complaint made by the Central Government or its officer or Board or its officer or any other person authorized by them: Provided that nothing contained in this sub-section shall prevent an aggrieved person from approaching a court, if upon his application to the Central Government or the Board, no action is taken by them within a period of three months from the date of receipt of the application.&lt;/i&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Recommendation&lt;/b&gt;: Individuals should be allowed to appeal a decision to collect DNA or share a DNA profile, and take any grievance directly to the court. If the Board or the Central Government will have a role in hearing complaints, etc. These must be enumerated in the provisions of the Act.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Inclusion of population testing&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Though the main focus of the Bill is for the use of DNA in criminal and civil cases, the provisions of the Bill also allow for population testing and research to be done on collected samples.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;i&gt;Section 4: The Board shall consist of the following Members appointed from amongst persons of ability, integrity, and standing who have knowledge or experience in DNA profiling including.. (m) A population geneticist to be nominated by the President, Indian National Science Academy, Den Delhi-Member. &lt;/i&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;i&gt;Section 40: Information relating to DNA profiles, DNA samples and records relating thereto shall be made available in the following instances, namely, (e) for creation and maintenance of a population statistics database that is to be used, as prescribed, or the purposes of identification research, protocol development or quality control provide that it does not contain any personally identifiable information and does not violate ethical norms. &lt;/i&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Recommendation&lt;/b&gt;: Delete these provisions. If DNA testing is going to done for population analysis purposes, regulations for this must be provided for in a separate legislation, stored in separate database, informed consent taken from each participant, and an ethics board must be established. It is not sufficient or ethical to conduct population testing only on DNA samples from victims, offenders, suspects, and volunteers.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Provisions delegated to regulation that need to be incorporated into text of Bill&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The Bill empowers the board to formulate regulations for, and the Central Government to make Rules to, a number of provisions that should be within the text of the Bill itself. By leaving these provisions to Regulations and Rules, the Bill is a skeleton which when enacted will only allow for DNA Labs to be certified and DNA databases to be established.  Aspects that need to be included as provisions include:&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;i&gt;Section 12: The Board shall exercise and discharge the following functions for the purposes of this Act namely &lt;/i&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li style="text-align: justify; "&gt;&lt;i&gt;Section 12(j) – authorizing procedures for communication of DNA profile for civil proceedings and for crime investigation by law enforcement and other agencies.&lt;/i&gt;&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;&lt;i&gt;Section 12(p) – making specific recommendations to (ii) ensure the accuracy, security, and confidentiality of DNA information, (iii) ensure the timely removal and destruction of obsolete, expunged or inaccurate DNA information (iv) take any other necessary steps required to be taken to protect privacy.&lt;/i&gt;&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;&lt;i&gt;Section 12(w) – Specifying, by regulation, the list of applicable instances of human DNA profiling and the sources a manner of collection of samples in addition to the lists contained in the Schedule. &lt;/i&gt;&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;&lt;i&gt;Section 12(u) – establishing procedure for cooperation in criminal investigation between various investigation agencies within the country and with international agencies.&lt;/i&gt;&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;&lt;i&gt;Section 12(x) – Enumerating the guidelines for storage of biological substances and their destruction. &lt;/i&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p style="text-align: justify; "&gt;&lt;i&gt;Section 65(1) The Central Government may, by notification, make rules for carrying out the purposes of this Act&lt;/i&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li style="text-align: justify; "&gt;&lt;i&gt;Section 65 (c) – The officials who are authorized to receive the communication pertaining to information as to whether a person’s DNA profile is contained in the offenders’ index under sub-section (2) of section 35&lt;/i&gt;&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;&lt;i&gt;Section 65 (d) – The manner in which the DNA profile of a person from the offenders’ index shall be expunged under sub-section (2) of section 37&lt;/i&gt;&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;&lt;i&gt; Section 65 (e) – The manner in which the DNA profile of a person from the offender’s index shall be expunged under sub-section (3) of section 37 &lt;/i&gt;&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;&lt;i&gt;Section 65 (h) – The manner in which access to the information in the DNA data Bank shall be restricted under section 43 &lt;/i&gt;&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;&lt;i&gt;Section 65 (zg) – Authorization of other persons, if any, for collection of non-intimate forensic procedures under Part II of the Schedule. &lt;/i&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;h3&gt;Broad Language that needs to be specified or deleted&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;There are a number of places in the Bill which use broad and vague language. This is problematic as it expands the potential scope of the Bill. Instances where broad language is used includes:&lt;/p&gt;
&lt;p&gt;Preamble:  &lt;i&gt;There is, thus, need to regulate the use of human DNA Profiles through an Act passed by the Parliament only for Lawful purposes of establishing identity in a criminal or civil proceeding and for other specified purposes.&lt;/i&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li style="text-align: justify; "&gt;&lt;i&gt;Section 12: The Board may make regulations for (j) authorizing procedures for communications of DNA profile for civil proceedings and for crime investigation by law enforcement and other agencies. &lt;/i&gt;&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;&lt;i&gt;Section 12: The Board may make regulations for (y) undertaking any other activity which in the opinion of the Board advances the purposes of this Act. &lt;/i&gt;&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;&lt;i&gt;Section 12: The Board may make regulations for (z) performing such other functions as may be assigned to it by the Central Government from time to time. &lt;/i&gt;&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;&lt;i&gt;Section 32: The indices maintained under sub-section (4) shall include information of data based on DNA analysis prepared by a DNA laboratory duly approved by the Board under section 15 of the Act and of records relating thereto, in accordance with the standards as may be specified by the regulations made by the Board.&lt;/i&gt;&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;&lt;i&gt;Section 35 (1) On receipt of a DNA profile for entry in the DNA Data Bank, the DNA Data Bank Manager shall cause it to be compared with the DNA profiles in the DNA Data Bank and shall communication, for purposes of the investigation or prosecution in a criminal offence, the following information…(a) as to whether the DNA profile received is already contained in the Data Bank and (b) any information other than the DNA profile received, is contained in the Data Bank in relation to the DNA profile received. (2) The information as to whether a person’s DNA profile is contained in the offenders’ index may be communicated to an official who is authorized to receive the same as prescribed.&lt;/i&gt;&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;&lt;i&gt;Section 39: All DNA profiles and DNA samples and records thereof shall be used solely for the purpose of facilitating identification of the perpetrator of a specified offence under Part I of the Schedule. Provided that such profiles or samples may be used to identify victims of accidents or disasters or missing persons or for purposes related to civil disputes and other civil matters listed in Part 1 of the Schedule for other purposes as may be specified by the regulations made by the board. &lt;/i&gt;&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;&lt;i&gt;Section 40: Information relating to DNA profiles, DNA samples and records relating thereto shall be made available in the following instances, namely (g) for any other purposes, as may be prescribed. &lt;/i&gt;&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;&lt;i&gt;Schedule, C Civil disputes and other civil matters vii) any other civil matter as may be specified y the regulations made by the Board. &lt;/i&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;b&gt;Recommendation&lt;/b&gt;: All broad and vague language should be deleted and replaced with specific language.&lt;/p&gt;
&lt;h3&gt;Jurisdiction&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;Section 1(2) It extends to the whole of India.&lt;/li&gt;
&lt;/ul&gt;
&lt;ul&gt;
&lt;li style="text-align: justify; "&gt;Section 2(f) “Crime scene index” means an index of DNA profiles derived from forensic material found (i) at any place (whether within or outside of India) where a specified offence was, or is reasonably suspected of having been, committed. &lt;/li&gt;
&lt;/ul&gt;
&lt;p style="text-align: justify; "&gt;The validity of DNA profiles found outside of India is unclear as the Act only extends to the whole of India.&lt;/p&gt;
&lt;h3&gt;Inconsistent provisions&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The Bill contains provisions that are inconsistent including:&lt;/p&gt;
&lt;ul&gt;
&lt;li style="text-align: justify; "&gt;&lt;i&gt;Preamble … from collection to reporting and also to establish a National DNA Data Bank and for matters connected therewith or incidental thereto. &lt;/i&gt;&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;&lt;i&gt;Section 32 (1) The Central Government shall, by notification establish a National DNA Data Bank and as many Regional DNA Data Banks there under for every State or a group of States, as necessary. (2) Every State Government may, by notification establish a State DNA Data Bank which shall share the information with the National DNA Data Bank. The National DNA Data Bank shall receive DNA data from State DNA Data Banks…&lt;/i&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Recommendation&lt;/b&gt;: The introduction to the Bill states that only a National DNA Data Bank will be established, yet in the provisions of the Bill it states that Regional and State level DNA databanks will also be established. It should be clarified in the introduction to the Bill that state level, regional level, and a national level DNA database will be created.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Inadequate qualifications of DNA Data Bank Manager&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Section 33: “&lt;i&gt;The DNA Data Bank Manager shall be a person not below the rank of Joint Secretary to the Government of India or equivalent and he shall report to the Member –Secretary of the Board. The DNA Data Bank Manager shall be a scientist with understanding of computer applications and statistics.&lt;/i&gt;”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Recommendation&lt;/b&gt;: This is not sufficient qualifications. The DNA Data Bank Manager needs to have experience and expertise handling, working with, and managing DNA for forensic purposes.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Lack of restrictions on labs seeking certification&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;According to section 16(2), before withdrawing approval granted to a DNA laboratory...the Board will give time to the laboratory...for taking necessary steps to comply with such directions...and conditions.” &lt;br /&gt;&lt;b&gt;Recommendation&lt;/b&gt;: This section should specify that during the time period of gaining certification, the DNA laboratory is not allowed to process DNA.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Incomplete terms for use of DNA in courts&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Section 45 of the Bill allows any individual undergoing a sentence of imprisonment or under sentence of death to apply to the court which convicted him for an order for DNA testing. The Bill lists seven conditions that must be met for this DNA evidence to be accepted and used in court. &lt;br /&gt;&lt;b&gt;Recommendation&lt;/b&gt;: This section speaks only to the use of DNA in courts upon request by a convicted individual. This section should lay down standards for all instances of use of DNA in courts. Included in this, the provision should clarify that when DNA is used, corroborating evidence will be required in courts, and if confirmatory samples will be taken from defendants. Individuals should also have the right to have a second sample taken and re-analyzed as a check, and individuals must have a right to obtain re-analysis of crime scene forensic evidence in the event of appeal.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Inadequate privacy protections&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Besides section 38 which requires that all DNA profiles, samples, and records are kept confidential, the Bill leaves all other privacy protections to be recommended by the DNA profiling Board.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;i&gt;Section 12(o) The Board shall exercise and discharge the following functions…“Making recommendation for provision of privacy protection laws, regulations and practices relating to access to, or use of, store DNA samples or DNA analyses with a view to ensure that such protections are sufficient.” &lt;/i&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Recommendation&lt;/b&gt;: Basic privacy protections such as access, use, and storage of DNA samples should be written into the provisions of the Bill and not left as recommendations for the Board to make.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Missing Provisions&lt;/h2&gt;
&lt;ol&gt;
&lt;li style="text-align: justify; "&gt;&lt;b&gt;Notification to the individual:&lt;/b&gt; There are no provisions that ensure that notification is given to an individual if his/her information is legally accessed or shared. Notification to the individual would be appropriate in section 36, which allows for the sharing of DNA profiles with foreign states, and section 35, which allows for the sharing of information with a court, tribunal, law enforcement agency, or DNA laboratory. As part of the notification, an individual should be given the right to appeal the decision.&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;&lt;b&gt;Consent: &lt;/b&gt;There are no provisions which speak to consent being taken from individuals whose DNA is collected. Consent must be taken from volunteers, missing persons (or their families), victims, and suspects. DNA can be taken compulsorily from offenders after they have been convicted. If an individual refuses to provide a DNA sample, a judge can override the decisions and order that a DNA sample be taken. In all cases that DNA is collected without consent, it must be clear that DNA evidence is directly relevant to the case.&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;&lt;b&gt;Right to request deletion of DNA profile from database: &lt;/b&gt;There are no provisions which give volunteers (children volunteers when they become adults), victims, and missing persons the right to request that their profile be deleted from the DNA database. This could be provided in section 37 which speaks to the expunction of records of acquitted convicts. &lt;/li&gt;
&lt;li style="text-align: justify; "&gt;&lt;b&gt;Right of individuals to bring a private cause of action: &lt;/b&gt;There are no provisions which give the individual the right to bring a privacy cause of action for the unlawful storage of private information in the national, regional, or state DNA database. This is an important check against the unlawful collection, analysis, and storage of private genetic information on the database. &lt;/li&gt;
&lt;li style="text-align: justify; "&gt;&lt;b&gt;Right to review one's personal data: &lt;/b&gt;There are no provisions that allow an individual to review his/her information contained on the state, regional, or national database. This is an important check against the unlawful collection, analysis, and storage of private genetic information on the database. &lt;/li&gt;
&lt;li style="text-align: justify; "&gt;&lt;b&gt;Independence of DNA laboratories and DNA banks from the police: &lt;/b&gt;There are no provisions which ensure that DNA laboratories and DNA data banks remain independent from the police. This is an important check in ensuring against the tampering of DNA evidence. &lt;/li&gt;
&lt;li style="text-align: justify; "&gt;&lt;b&gt;Established profiling standard: &lt;/b&gt;The Bill does not mandate the use of one single profiling standard. This is important in order to minimize false matches occurring by chance and to ensure consistency across DNA testing and profiling. &lt;/li&gt;
&lt;li style="text-align: justify; "&gt;&lt;b&gt;Destruction of DNA samples: &lt;/b&gt;There are no provisions mandating that original samples of DNA be deleted. DNA samples should be destroyed once the DNA profiles needed for identification purposes have been obtained from them – allowing for sufficient time for quality assurance (six months). Furthermore, only a barcode and no identifying details should be sent to labs with samples for analysis.&lt;/li&gt;
&lt;/ol&gt; 
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/draft-human-dna-profiling-bill-april-2012'&gt;https://cis-india.org/internet-governance/blog/draft-human-dna-profiling-bill-april-2012&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>elonnai</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>SAFEGUARDS</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2013-07-12T15:36:59Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/hindustan-times-june-8-2018-vidhi-choudhary-draft-bill-proposes-rs-1-crore-fine-3-year-jail-for-data-privacy-violation">
    <title>Draft bill proposes Rs 1 crore fine, 3 year jail for data privacy violation</title>
    <link>https://cis-india.org/internet-governance/hindustan-times-june-8-2018-vidhi-choudhary-draft-bill-proposes-rs-1-crore-fine-3-year-jail-for-data-privacy-violation</link>
    <description>
        &lt;b&gt;The move comes at a time when user data of Indians is under threat from social media firms accused of data mining and sharing information with private companies for advertising and marketing purposes. There has also been a growing concern over Aadhaar.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article by Vidhi Choudhury was published in the &lt;a class="external-link" href="https://www.hindustantimes.com/india-news/draft-bill-proposes-rs-1-crore-fine-3-year-jail-for-data-privacy-violation/story-Cbxt5LxKhINJiDdtipZlGI.html"&gt;Hindustan Times&lt;/a&gt; on June 8, 2018. Sunil Abraham was quoted.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;Even as a 10-member government panel is due to submit its recommendations for a new data privacy bill, a group of lawyers on Friday uploaded a model citizens’ code, which they said could give the panel pointers to what India’s final privacy law should be.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Internet Freedom Foundation (IFF) launched its community project, ‘Save our Privacy’, in what it described as a bid to safeguard individuals’ right to privacy. This model code, titled ‘Indian Privacy Code, 2018’, has been drafted by lawyers such Gautam Bhatia, Apar Gupta and Raman Jit Singh Chima, among others.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Many of these lawyers made a joint submission to the Justice BN Srikrishna Committee in the past. On Friday, they sent him an email with the copy of the code with its seven core principles. The core principles follow what IFF calls a “rights-based approach to protect people from harmful use of their personal data”.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;“In a world where personal data has power, people need to be put in charge of their own lives,” said New Delhi-based lawyer Apar Gupta.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The draft bill sets a penalty of up to Rs 1 crore for the violation of privacy of citizens and a prison sentence of up to three years. It also provides for a penalty of up to Rs 10 crore to anyone found to be performing surveillance unlawfully, with a prison term of up to five years.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The move comes at a time when user data of Indians is under potential threat from social media companies that have been accused of data mining and sharing user information with private firms for advertising and marketing purposes.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;There has also been a growing concern in India over the validity of the Aadhaar law. A Constitution bench of the Supreme Court has finished hearing a slew of petitions against the unique identity number and has reserved its judgment.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;On 31 July, the government constituted the panel headed by Justice Srikrishna to study various issues relating to data protection and suggest a draft data protection bill.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;IFF said in a statement that it had concerns over the “composition, lack of diversity and transparency” of the committee. It also said it was concerned about the lack of urgency India had shown about making a privacy law, and that its civil society project was important to build awareness on privacy and data protection in India.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;“The Indian Privacy Code, 2018 ensures that right to privacy does not undermine the Right to Information Act. All the other existing laws including the Telegraph Act and the Aadhaar Act should be subject to this law,” Chima said.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;“We hope the Justice BN Srikrishna Committee considers and adopts the language we propose,” he added.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;According to a senior official at the home ministry who spoke on the condition of anonymity, the privacy bill hasn’t come up for discussion yet. “In any case, the said bill will be taken up by the IT ministry first. The IT ministry will be responsible for piloting the proposed bill on privacy and MHA will, in the later stages, give its opinion on security issues related to the proposed bill,” he said.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;A government official on condition of anonymity said that its for the Justice Srikirshna Committee to look at the model privacy code launched today and decide what they want to use from it.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;When contacted, Ajay Sawhney, secretary for ministry of electronics and technology said: “The Justice Srikrishna Committee will submit its report shortly.”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;“The reason civil society is doing this is because the government is not sharing their draft bills,” said Sunil Abraham, founder of think tank Centre for Internet and Society (CIS). In 2013, CIS had also published a citizen’s draft privacy protection bill.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;i&gt;(With inputs from Azaan Javaid)&lt;/i&gt;&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/hindustan-times-june-8-2018-vidhi-choudhary-draft-bill-proposes-rs-1-crore-fine-3-year-jail-for-data-privacy-violation'&gt;https://cis-india.org/internet-governance/hindustan-times-june-8-2018-vidhi-choudhary-draft-bill-proposes-rs-1-crore-fine-3-year-jail-for-data-privacy-violation&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Admin</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2018-06-29T16:48:48Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/openness/blog-old/does-the-social-web-need-a-googopoly">
    <title>Does the Social Web need a Googopoly?</title>
    <link>https://cis-india.org/openness/blog-old/does-the-social-web-need-a-googopoly</link>
    <description>
        &lt;b&gt;While the utility of the new social tool Buzz is still under question, the bold move into the social space taken last week by the Google Buzz team has Gmail users questioning the privacy implications of the new feature. In this post, I posit that Buzz highlights two privacy challenges of the social web. First, the application has sidestepped the consensual and contextual qualities desirable of social spaces. Second, Google’s move highlights the increasingly competitive and convergent nature of the social media landscape.&lt;/b&gt;
        
&lt;p&gt;Last week, to the surprise of many, Google launched its new social networking platform, Buzz. The new service is Google’s effort to amplify the “social nature” of its services by integrating them under one platform and adding some extra social utility. The social application runs from the Gmail interface, but also links other Google accounts a user may have, including albums on Picasa and Google Reader. The service also allows sharing from external sources, such as photos on Flickr and videos from YouTube. Users can post, like, or dislike the status updates of others, which may be publicly searchable if the user opts in. Before a Gmail user may fully participate in the Google Buzz service, a unique Google Personal Profile must be created.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;User Consent&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;Much of the buzz surrounding the new social networking service last week wasn’t about the application itself. Instead, an uproar of privacy concerns dominated the Buzz scene, with many critics quickly labeling Buzz a “&lt;a href="http://news.cnet.com/8301-31322_3-10451428-256.html"&gt;privacy nightmare&lt;/a&gt;”. A &lt;a href="http://digitaldaily.allthingsd.com/20100216/epic-files-ftc-complaint-over-google-buzz/?mod=ATD_rss"&gt;formal complaint&lt;/a&gt; has already been filed with the US Federal Trade Commission in response to the new service. A second-year Harvard Law student has also filed a &lt;a href="http://abcnews.go.com/Technology/google-buzz-draws-class-action-suit-harvard-student/story?id=9875095&amp;amp;page=1"&gt;class-action suit&lt;/a&gt; against the company over its privacy practices.&lt;/p&gt;
&lt;p&gt;Much of the privacy talk thus far has focused on issues of consent, or the lack thereof. Upon Buzz’s launch, Gmail users were automatically subscribed as “opting in” to the service. Google used the private address books of millions of Gmail accounts to build social networks from the contacts users email and chat with most. To entice users into using the service, Gmail users were set to auto-follow all of their contacts, and in turn, to be followed by them too. Furthermore, all new Buzz users were set to automatically share all public Picasa albums and Google Reader items with their new social graph. It has been argued that social network services should be &lt;a href="http://jonoscript.wordpress.com/2010/02/20/buzz-off-google-social-networks-should-always-be-opt-in-not-opt-out/"&gt;opt-in, rather than opt-out&lt;/a&gt;, and that Buzz has violated the consensual nature of the social web.&lt;/p&gt;
&lt;p&gt;Illuminating the complications of building a social graph from one’s inbox is the story of an Australian woman, who remains anonymous. As she claims, most of the emails she currently receives through her Gmail account are from her abusive ex-boyfriend. Because Google assumed that Gmail users would like to be “auto-followed” by their Gmail contacts (mirroring Twitter’s friendship protocol), items shared between herself and her new boyfriend through her Google Reader account became public to her broader social graph, including her ex-boyfriend and his harassing friends.&lt;/p&gt;
&lt;p&gt;In a &lt;a href="http://www.gizmodo.com.au/2010/02/fck-you-google/"&gt;blog response&lt;/a&gt; directed to Google’s Buzz team, the woman scornfully wrote: “&lt;em&gt;F*ck you, Google. My privacy concerns are not trite. They are linked to my actual physical safety, and I will now have to spend the next few days maintaining that safety by continually knocking down followers as they pop up. A few days is how long I expect it will take before you either knock this shit off, or I delete every Google account I have ever had and use Bing out of f*cking spite&lt;/em&gt;”. As this case demonstrates, the people we email most often may not be our closest friends. As email has replaced the telephone for many as the dominant mode of communication, some contacts may be friends, but many others may not be.&lt;/p&gt;
&lt;p&gt;In response to the uproar, tweaks to Buzz’s privacy features have since been made. Todd Jackson, Buzz’s product manager, also posted a &lt;a href="http://gmailblog.blogspot.com/2010/02/millions-of-buzz-users-and-improvements.html"&gt;public apology&lt;/a&gt; to the official Gmail Blog late last week for not “getting everything quite right”. The service will now assume the more user-centric “auto-suggest” model, allowing users to selectively choose the contacts they wish to follow, and will also no longer auto-link Picasa and Reader content. However, as &lt;a href="http://digitaldaily.allthingsd.com/20100216/epic-files-ftc-complaint-over-google-buzz/?mod=ATD_rss"&gt;EPIC’s complaint notes&lt;/a&gt;, many are still unsatisfied with the opt-out nature of the service, arguing that users should be able to opt into the service if they so choose, rather than having to delist themselves from a service they didn’t necessarily sign up for. Ethical quandaries also still loom over Google’s misuse of users’ private contact lists to jumpstart its new service.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Contextual Integrity &lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;The attacks on personal privacy resulting from Google’s model are vast. As the case of the Australian woman illuminates, the concept of the “online friend” was taken completely out of context by Buzz’s initial auto-follow model. Many of the contacts we make on a daily basis need not be made public through the Google profile. For most users, Buzz’s privacy breach may be benign, or annoying at most. However, for those engaged in sensitive social or political relationships via their Gmail chat or email accounts, the revelation of a common contact could have been potentially damaging. A reporter from CNET has cleverly labeled Buzz “&lt;a href="http://news.cnet.com/8301-17939_109-10451703-2.html"&gt;socially awkward networking&lt;/a&gt;”, as bringing diverse contacts under one umbrella doesn’t exactly make the most social sense. In response, Gmail users are required to sort through and filter their Buzz followers accordingly, or to disable the service altogether.&lt;/p&gt;
&lt;p&gt;Besides questions of who is stalking whom, the assumptive and public nature of Google’s new move has cast a shadow of doubt among Gmail users regarding Google’s ability to maintain the privacy and contextual integrity of the Gmail account. Should one account be the place to socialize and to “do business”? Gmail is, and should remain, an email service. However, Buzz takes the email experience onto new and questionable ground. Do Gmail users feel entirely comfortable having their personal email, social graph, and chat functions all come under the auspices of one platform? Many users felt they had been lured into using a social networking service that they didn’t sign up for in the first place.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Social Media Competition&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;In addition to Google’s attempt to integrate its various service offerings, Buzz is seen as an obvious attempt to bolster competitiveness in the social media market. In 2004, Google released Orkut. While that service has become big in countries such as Brazil and India, it has been overshadowed by sites such as Facebook in other jurisdictions, and has not been able to prove itself as a mainstream space for networking. In the past year, Google also launched Google Wave, a tool that mixes e-mail with instant messaging and the ability for several people to collaborate on documents. However, the application failed to win over audiences, and was considered one of the &lt;a href="http://www.readwriteweb.com/archives/top_10_failures_of_2009.php"&gt;top failures of 2009&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;With Google unable to effectively saturate the social media ecosystem, Buzz is an attempt to compete with the searchable and real-time experiences provided by the social media giants Facebook and Twitter. Increased competition within the social media market could be a positive development for privacy, as social media companies could arguably compete on their ability to provide users with preferable privacy architectures. To the contrary, however, such competition has thus far had negative ramifications for user privacy, as the recent Buzz and Facebook moves illustrate. Facebook’s loosened privacy settings were a &lt;a href="http://www.economist.com/specialreports/displaystory.cfm?story_id=15350984"&gt;competitive knee-jerk&lt;/a&gt; to Twitter’s searchable and real-time experience. Through a Twitter search, individuals can come to know what people are saying about a certain topic, event, or product, and as a result the service has received a great deal of attention from users and non-users, such as advertisers, alike.&lt;/p&gt;
&lt;p&gt;In an attempt to one-up their competition, the “Twitterization” of Facebook followed in two distinct stages. First was the implementation of the Facebook News Feed, which gave users a real-time account of the actions of their friends on the site. Many argued that this feature invaded user privacy. Facebook, however, argued that it was only making available information that was already accessible through individual profile pages. The News Feed, as it happens, effectively took user information and actions on the site out of their original context by streaming this information live for others’ easy viewing. Information users once had to rummage for had become accessible in real time on the homepage of the service.&lt;/p&gt;
&lt;p&gt;Second, Facebook’s recent &lt;a href="http://www.eff.org/deeplinks/2009/12/facebooks-new-privacy-changes-good-bad-and-ugly"&gt;privacy scandal&lt;/a&gt; was a step towards making profile information more searchable and accessible to third parties, as is most often the case with the more public feeds on Twitter. As &lt;a href="http://www.simplyzesty.com/twitter/unrelenting-twitterization-facebook-continues/"&gt;one commentator notes&lt;/a&gt;, “&lt;em&gt;Facebook used to be very private but private is not great for search, to have great search you need all of the data to be publicly available as it mostly is on Twitter. Facebook have not quite nailed real time search yet but they are getting there and it will soon be a great way of examining sentiment across different demographics&lt;/em&gt;”. As a result, information on Facebook such as name, profile picture, friends list, location, and fan pages has become open-access information. In addition, users on Facebook have been subjected to a new privacy regime without notice, leaving their profile pages generally more open, and searchable through Google.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Converging the Online
Self&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;The impact Buzz alone can make on the social media landscape remains questionable (Gmail has only 140 million accounts, a far cry from Facebook’s 400+ million dedicated users). However, regardless of Google’s ability, or inability, to claim hegemony over the social web landscape, the abuse of private information to launch a new service has raised serious debate over privacy and the future of social networking. The Buzz service marks more than yet another new social networking service that brushes aside the privacy of users. As user control and privacy become increasingly peripheral concerns, Google’s shift toward privacy decontrol also signifies a worrisome supply-side shift towards the “convergence” of online identity.&lt;/p&gt;
&lt;p&gt;Within this new dominant paradigm, privacy concerns are often interpreted as antithetical to competitiveness in the social media marketplace. Instead of an imagined ecosystem based on user control and privacy preference, it can now be inferred that the competitiveness of social networking services will continue to disrupt the delicate balance between the public and the private online. Even though greater visibility and searchability of the social profile may not be in the public interest, Google’s recent move works to reinforce the new status quo of “openness”. Furthermore, it is questionable how concentrated and integrated a user may want their online activities to become. A critical discourse of online privacy must, therefore, take into account the ways in which the social web renders the user increasingly transparent through networks of networking services.&lt;/p&gt;
&lt;p&gt;Google’s Buzz illustrates this point quite well. Initially, Gmail was a straightforward email service. Next, the AdWords advertising service and Gmail chat were integrated into the Gmail experience. Because Google was using the confidential emails of its Gmail users, privacy concerns began to mount upon the launch of the AdWords service. However, the turmoil surrounding AdWords died down, notably as Google continues to reassert that it is bots, not humans, that scan the emails in order to provide the AdWords service. Next, there gradually occurred a convergence of Google services under a single social profile, or “email address”. A single Gmail account potentially includes the use of Google Reader, Calendar, chat, groups, and an Orkut account. In terms of behaviorally targeted advertising, Google has recently announced that it will provide personalized search results even to users who have not signed up for Google services. This will be done through the placement of a cookie on all machines to provide targeted advertising seamlessly through each Google search and browsing session.&lt;/p&gt;
&lt;p&gt;While many argue that the collection of non-personally identifiable information poses no privacy harm, this assumption needs reassessment. As Google comes to offer us more, it also comes to learn more, and Buzz signifies this trend towards a Googopolized social web. To add another layer of complexity to Google’s hegemony, users of the Buzz service are also required to create a “Google Profile”, which is searchable online and displays real-time status updates, comments, and connections from other social network services, such as Facebook and Twitter. As Google recently launched the beta version of the new &lt;a href="http://googleblog.blogspot.com/2009/10/introducing-google-social-search-i.html"&gt;Social Search&lt;/a&gt;, Buzz was just the service required to increase the relevance of the new service by encouraging Gmail users to publish even more personal information. The creation of a personal Google profile, which is indexed and searchable, raises many concerns about privacy and identity, and doubts are continually raised over &lt;a href="http://www.businessinsider.com/hey-google-thi-i-why-privacy-matter-2010-2"&gt;how much Google should come to know&lt;/a&gt; about us.&lt;/p&gt;
&lt;p&gt;While Google’s services have arguably made the online social experience more seamless and tailored, it is questionable how relevant, or even desirable, such a shift may be. At present, it may appear that Google is wearing far too many hats, and users should be wary of placing all of their eggs into one basket. As the launch of Buzz has shown us, user consent and the contextual integrity of private personal information can be compromised when a diverse number of online services are integrated and given a social spin. When competition among social web providers drives users to lose control of private information that is inherently theirs, critical questions surrounding competition, convergence, and privacy require critical exploration.&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/openness/blog-old/does-the-social-web-need-a-googopoly'&gt;https://cis-india.org/openness/blog-old/does-the-social-web-need-a-googopoly&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>rebecca</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Social Networking</dc:subject>
    
    
        <dc:subject>Competition</dc:subject>
    
    
        <dc:subject>Google Buzz</dc:subject>
    

   <dc:date>2011-08-18T05:06:37Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/does-the-safe-harbor-program-adequately-address-third-parties-online">
    <title>Does the Safe-Harbor Program Adequately Address Third Parties Online?</title>
    <link>https://cis-india.org/internet-governance/blog/does-the-safe-harbor-program-adequately-address-third-parties-online</link>
    <description>
        &lt;b&gt;While many citizens outside of the US and EU benefit from the data privacy provisions of the Safe Harbor Program, it remains unclear how successfully the program can govern privacy practices when third parties continue to gain more rights over personal data. Using Facebook as a site of analysis, I will attempt to shed light on the deficiencies of the framework for addressing the complexity of data flows in the online ecosystem.&lt;/b&gt;
        
&lt;p&gt;To date, the EU-US Safe Harbor Program leads in governing the complex and multi-directional flows of personal information online. As commerce began to thrive in the online context, the European Union was faced with the challenge of ensuring that personal information exchanged through online services was granted levels of protection on par with the provisions set out in EU privacy law. This was important, notably because the piecemeal and sectoral approach to privacy legislation in the United States was deemed incompatible with the EU approach. While the Safe Harbor program did not aim to protect the privacy of citizens outside of the European Union per se, the program has in practice set minimum standards for online data privacy due to the international success of American online services.&lt;/p&gt;

&lt;p&gt;While many citizens outside of the US and EU benefit from the Safe Harbor Program, it remains unclear how successful the program will be in an online ecosystem where third parties are being granted increasingly more rights over the data they receive from first parties. Using Facebook as a site of analysis, I will attempt to shed light on the deficiencies of the framework for addressing the complexity of data flows in the online ecosystem. First, I will argue that the Safe Harbor program does not do enough to ensure that participants are held reasonably responsible for third-party privacy practices. Second, I will argue that the information asymmetries created between first-party sites, citizens, and governance bodies vis-à-vis third parties obscure the application of the Safe Harbor model.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;The EU-US
Safe-Harbor Agreement&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;In 1995, and based on earlier &lt;a href="http://www.oecd.org/document/18/0,3343,en_2649_34255_1815186_1_1_1_1,00.html"&gt;OECD guidelines&lt;/a&gt;, the EU Data Directive on the “protection of individuals with regard to the processing of personal data and the free movement of such data” was passed&lt;a name="_ednref1" href="#_edn1"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; [1]. The original purpose of the EU Privacy Directive was not only to increase privacy protection within the European Union, but also to promote trade liberalization and a single integrated market in the EU. After the Data Directive was passed, each member state of the EU incorporated the principles of the directive into national law accordingly.&lt;/p&gt;
&lt;p&gt;While the Directive was successful in harmonizing data privacy in the European Union, it also embodied extraterritorial provisions, giving it reach&lt;a name="_ednref2" href="#_edn2"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; beyond the EU. Article 25 of the Directive states that the EU Commission may ban data transfers to third countries that do not ensure “an adequate level of protection” of data privacy rights&lt;a name="_ednref3" href="#_edn3"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; [2]. Also, Article 26 of the Directive, expanding on Article 25, states that personal data cannot be &lt;em&gt;transferred&lt;/em&gt; to a country that “does not ensure an adequate level of protection” if the data controller does not enter into a contract that adduces adequate privacy safeguards&lt;a name="_ednref4" href="#_edn4"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; [3].&lt;/p&gt;
&lt;p&gt;In light of the increased occurrence of cross-border information flows, the Data Directive itself was not effective enough to ensure that privacy principles were enforced outside of the EU. Articles 25 and 26 of the Directive had essentially deemed all cross-border data flows to the US in contravention of EU privacy law. Therefore, the EU-US Safe Harbor was established by the EU Council and the US Department of Commerce as a way of mending the variant levels of privacy protection set out in these jurisdictions, while also promoting online commerce.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Social Networking
Sites and the Safe-Harbor Principles&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;The case of social networking sites exemplifies the ease with which data is transferred, processed, and stored between jurisdictions. While many of the top social networking sites are registered American entities, they continue to attract users not only from the EU, but also internationally. In compliance with EU law, many social networking sites, including LinkedIn, Facebook, Myspace, and Bebo, now adhere to the principles of the program. The enforcement of the Safe Harbor takes place in the United States in accordance with US law and relies, to a great degree, on enforcement by the private sector. TRUSTe, an independent certification program and dispute mechanism, has become the most popular governance mechanism for the Safe Harbor program among social networking sites.&lt;/p&gt;
&lt;p&gt;Drawing broadly on the principles embodied within the EU Data Directive and the OECD Guidelines, the seven principles of the Safe Harbor were developed. These principles are Notice, Choice, Onward Transfer, Access and Accuracy, Security, Data Integrity, and Enforcement. The principle of “Notice” sets out that organizations must inform individuals about the purposes for which they collect and use information about them, how to contact the organization with any inquiries or complaints, the types of third parties to which they disclose the information, and the choices and means the organization offers individuals for limiting its use and disclosure.&lt;/p&gt;
&lt;p&gt;“Choice” ensures that individuals have the opportunity to opt out of having their personal information disclosed to a third party, and ensures that information is not used for purposes incompatible with those for which it was originally collected. The “Onward Transfer” principle ensures that any third party receiving information subscribes to the Safe Harbor principles, is subject to the Directive, or enters into a written agreement requiring it to provide at least the same level of privacy protection as is required by the relevant principles.&lt;/p&gt;
&lt;p&gt;The principles of “Security” and “Data Integrity” seek to ensure that reasonable precautions are taken to protect against the loss or misuse of data, and that information is not used in a manner incompatible with the purposes for which it has been collected, minimizing the risk that personal information will be misused or abused. Individuals are also granted the right, through the access principle, to view the personal information about them that an organization holds, and to ensure that it is up to date and accurate. The “Enforcement” principle works to ensure that there is an effective mechanism for assuring compliance with the principles, and that there are consequences for the organization when the principles are not followed.&lt;/p&gt;
&lt;p&gt;The principles of the program are quite clear and enforceable in the first-party context, despite some prevailing ambiguities. The privacy policies of most social networking services have become increasingly clear and straightforward since their inception. Facebook, for example, has revamped its &lt;a href="http://www.facebook.com/privacy/explanation.php"&gt;privacy regime&lt;/a&gt; several times, and gives explicit notice to users of how their information is being used. The privacy policy also explains the relationship between third parties and a user’s personal information, including how it may be used by advertisers, search engines, and fellow members.&lt;/p&gt;
&lt;p&gt;With respect to third-party advertisers, principles of “choice” are clearly granted by most social networking services. For example, the &lt;a href="http://www.networkadvertising.org/"&gt;Network Advertising Initiative&lt;/a&gt;, a self-regulatory initiative of the online advertising industry, clearly lists its member websites and allows individuals to opt out of any targeted advertising conducted by its members. In Facebook’s description of “cookies” in its privacy policy, a direct link to the NAI’s opt-out feature is given, allowing individuals to make somewhat informed choices about their participation in such programs. This is, of course, tempered by the fact that most users do not read or understand the privacy policies provided by social networking sites&lt;a name="_ednref5" href="#_edn5"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; [4]. It is also important to note that Google, a major player in the online advertising business, does not grant users of Buzz and Orkut the same opt-out options as sites such as Facebook and Bebo.&lt;/p&gt;
&lt;p&gt;Under the auspices of the US Federal Trade Commission, the Safe Harbor Program has also successfully investigated and settled several privacy-related breaches that have taken place on social networking sites. Among the most famous cases is &lt;a href="http://www.beaconclasssettlement.com/"&gt;Lane et al. v. Facebook et al.&lt;/a&gt;, a class action suit brought against Facebook’s Beacon advertising program. The US Federal Trade Commission was quick to initiate an investigation of the program after many privacy groups and individuals became critical of its questionable advertising practices. The Beacon program was designed to allow Facebook users to share information with their friends about actions taken on affiliated, third-party sites. This included, for example, the movie rentals a user had made through the Blockbuster website.&lt;/p&gt;
&lt;p&gt;The Plaintiffs filed a suit, alleging that Facebook and its
affiliates did not give users adequate notice and choice about Beacon and the
collection and use of users’ personal information.&amp;nbsp; The Beacon program was ultimately found to
be in breach of US law, including the &lt;a href="http://epic.org/privacy/vppa/"&gt;Video
Privacy Protection Act&lt;/a&gt;, which bans the disclosure of personally identifiable
rental information.&amp;nbsp; Facebook has
announced a settlement of the lawsuit that provides no individual payouts,
but rather ends the program and establishes a 9.5 million dollar &lt;a href="http://www.p2pnet.net/story/37119"&gt;Facebook Privacy Fund&lt;/a&gt; dedicated to
privacy and data-related issues.&amp;nbsp; Other privacy
related investigations of social networking sites launched by the FTC under the
Safe Harbor Program include Facebook’s &lt;a href="http://www.eff.org/deeplinks/2009/12/facebooks-new-privacy-changes-good-bad-and-ugly"&gt;privacy
changes&lt;/a&gt; in late 2009, and Google’s recently released &lt;a href="http://www.networkworld.com/news/2010/032910-lawmakers-ask-for-ftc-investigation.html"&gt;Buzz
application&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Despite the headway the Safe Harbor is making, many privacy
related questions remain about the responsibilities of social networking
sites under the program.&amp;nbsp; For example,
Bebo &lt;a href="http://www.bebo.com/Privacy2.jsp"&gt;reserves the right&lt;/a&gt; to
supplement a social profile with additional information collected from publicly
available sources and from other companies.&amp;nbsp; Bebo does adhere to the “notice” principle, as
it makes known to users through its privacy policy how their information will
be used. However, it remains unclear whether Bebo gives the disclosures
required by the Safe Harbor Framework, notably because “publicly
available information” remains a broad and obscure concept in the privacy policy.&amp;nbsp; It is also unclear whether Bebo users
are able, under the “choice” principle, to refuse to have their profiles
supplemented by other information sources.&amp;nbsp; Also, under the “access”
principle, do individuals have the right to review all information held about them as “Bebo
users”?&amp;nbsp; The right to review information
held by a social networking site is an important one that should be upheld,
especially as supplementary information from outside the social networking
service may be used to profile and categorize individual users in undesirable ways.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;The Third Party Problem&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;Cooperation between social networking sites and the Safe
Harbor has improved, and most of these sites now have privacy policies which
explicitly address the principles of the Program.&amp;nbsp;&amp;nbsp; It should also be noted that public interest
groups, such as EPIC, the Center for Digital Democracy, and the Electronic
Frontier Foundation, have played a key role in ensuring that data privacy
breaches are brought to the attention of the FTC under the program.&amp;nbsp; While the program has somewhat adequately
addressed the privacy practices of first party participants, the number of
third parties on social networking sites calls into question the
comprehensiveness and effectiveness of the Safe Harbor program.&amp;nbsp; Facebook itself as a first party site may adhere
to the Safe Harbor Program.&amp;nbsp; However, its
growing number of third party platform members may not always adhere to best practices
in the field, nor can Facebook or the Safe Harbor Program guarantee that they
do so.&lt;/p&gt;
&lt;p&gt;The Safe Harbor Program does require that all participants
take certain security measures when transferring data to a third party.&amp;nbsp; Third parties must either subscribe to the
safe harbor principles, or be subject to the EU Data Protection Directive.&amp;nbsp; Alternatively, an organization may also
enter into a written agreement with a third party requiring that they provide
at least the same level of privacy protection as is required by program
principles.&amp;nbsp; Therefore, third parties of
participating program sites are, de facto, bound by the safe harbor principles by
way of entering into an agreement with a first party participant of the
program. &amp;nbsp;This is the approach taken by
most social networking sites and their third parties.&lt;/p&gt;
&lt;p&gt;It is important to note, however, that third parties are not
governed directly by regulatory bodies such as the FTC.&amp;nbsp; The safe harbor website also &lt;a href="http://www.export.gov/safeharbor/eu/eg_main_018476.asp"&gt;explicitly notes&lt;/a&gt;
that the program does not apply to third parties.&amp;nbsp; Therefore, as per these provisions, Facebook must
adhere to the principles of the program, while its third party platform members
(such as social gaming companies) must do so only indirectly, as per a separate
contract with Facebook.&amp;nbsp; The
effectiveness of this indirect mode of governing third party privacy
practices is questionable for numerous reasons.&lt;/p&gt;
&lt;p&gt;Firstly, while Facebook does take steps to ensure that
third parties use information from Facebook in a manner which is consistent with
the safe harbor principles, the company explicitly &lt;a href="http://www.facebook.com/policy.php"&gt;waives any guarantee&lt;/a&gt; that third
parties will “follow their rules”. &amp;nbsp;&amp;nbsp;Prior to allowing third parties to access any
information about users, Facebook requires third parties to &lt;a href="http://www.facebook.com/terms.php"&gt;agree to terms&lt;/a&gt; that limit their
use of information, and also use technical measures to ensure that they only
obtain authorized information.&amp;nbsp;&amp;nbsp; Facebook
also warns users to “always review the policies of third party applications and
websites to make sure you are comfortable with the ways in which they use
information”.&amp;nbsp; Not only are users
required to read the privacy policies of every third party application, but they are
also expected to report applications which may be in violation of privacy
principles.&amp;nbsp; In this sense, Facebook not
only waives responsibility for third party privacy breaches, but also places further
regulatory onus upon the user.&lt;/p&gt;
&lt;p&gt;As the program guidelines express, the safe harbor relies to
a great degree on enforcement by the private sector.&amp;nbsp; However, such a self-regulatory
framework may lead the industry into a state of regulatory malaise.&amp;nbsp; Under the safe harbor program, Facebook must
ensure that the privacy practices of third parties are adequate.&amp;nbsp; Yet, at the same time, the company may
waive its responsibility for third party compliance with safe
harbor principles.&amp;nbsp; Therefore, it remains
questionable where responsibility for third parties exactly lies.&amp;nbsp; When third parties are not directly
answerable to the governing bodies of the safe harbor program, and when first parties
can waive responsibility for their practices, where does the incentive to
effectively regulate third parties come from?&amp;nbsp;&lt;/p&gt;
&lt;p&gt;While Facebook may in fact take reasonable legal and technical
measures to ensure third party compliance, the room for potential dissonance
between speech and deed is worrisome.&amp;nbsp; Facebook is required to ensure that third
parties provide “&lt;a href="http://www.export.gov/safeharbor/eu/eg_main_018476.asp"&gt;at least the same
level of privacy protection&lt;/a&gt;” as they do.&amp;nbsp;
However, in practice, this has yet to become the case.&amp;nbsp; A quick survey of twelve of the most popular
Platform Applications in the gaming category showed&lt;a name="_ednref6" href="#_edn6"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;/span&gt;&lt;/span&gt;&lt;/a&gt;
that third parties are not granting their users the “same level of privacy
protection”[5].&amp;nbsp; For example, section 9.2.3
of Facebook’s “&lt;a href="http://www.facebook.com/terms.php"&gt;Rights and
Responsibilities&lt;/a&gt;” for Developers/Operators of applications/sites states
that they must “have a privacy policy or otherwise make it clear to users what
user data you are going to use and how you will use, display, or share that
data”.&amp;nbsp;&lt;/p&gt;
&lt;p&gt;However, out of the 12 gaming applications surveyed, four
companies failed to make privacy policies available to users &lt;em&gt;before&lt;/em&gt; they granted the application
access to their personal information, including that of their friends&lt;a name="_ednref7" href="#_edn7"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; [6].&amp;nbsp; After searching for the privacy policies on
the websites of each of the four social gaming companies, two completely failed
to post privacy policies on their central websites. &amp;nbsp;&amp;nbsp;This practice is in direct breach of the
contract made between these companies and Facebook, as mentioned above.&amp;nbsp; In addition to many applications failing to clearly
post privacy policies, many of the provisions set out in these policies were
questionable vis-à-vis safe harbor principles.&amp;nbsp;&lt;/p&gt;
&lt;p&gt;For example, Zynga, maker of the popular games Mafia Wars and
FarmVille, reserves the right to “maintain copies of your content
indefinitely”.&amp;nbsp; This practice remains contrary
to Safe Harbor principles, which state that information should not be kept for
longer than required to run a service.&amp;nbsp;
Electronic Arts also maintains similar provisions for data retention in
its privacy policy.&amp;nbsp;&amp;nbsp; Such practices are
rather worrisome in light of the fact that both companies also reserve the
right to collect information on users from other sources to supplement profiles
held.&amp;nbsp; This includes (but is not limited
to) newspapers and Internet sources such as blogs, instant messaging services, and
other games.&amp;nbsp;&amp;nbsp; It is also worth
noting that only one of the twelve social gaming companies surveyed directly
participates in the safe harbor program.&amp;nbsp;&lt;/p&gt;
&lt;p&gt;In addition to the difficulties of ensuring that safe harbor
principles are adhered to by third parties, the information asymmetries which
exist between first party sites, citizens, and governance bodies vis-à-vis
third parties complicate this model.&amp;nbsp; Foremost,
it is clear that Facebook, despite its resources, cannot keep tabs on the
practices of all of their applications.&amp;nbsp;&amp;nbsp;
This calls into question whether industry self-regulation can really guarantee
that privacy is respected by third parties in this context.&amp;nbsp; Furthermore, the lack of knowledge or
understanding held by citizens about how third parties use their information
is particularly problematic when a system relies so heavily on users to report
suspected privacy breaches.&amp;nbsp; The same is
likely to be true for governments, too.&amp;nbsp; As
one legal scholar, promoting a more laissez-faire approach to third party
regulation, notes, multiple and invisible third party relationships present
challenges to traditional forms of legal regulation&lt;a name="_ednref8" href="#_edn8"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; [7].&amp;nbsp;&lt;/p&gt;
&lt;p&gt;In an “open” social ecosystem, the sheer volume of data
flows between users of social networking sites and third party players appears
to have become increasingly difficult to effectively regulate.&amp;nbsp; While the safe harbor program has been
successful in establishing best practices and minimum standards for data
privacy, it is also clear that governance bodies, and public interest groups,
have focused most attention on large industry players such as Facebook.&amp;nbsp; This has left smaller third party players on
social networking sites in the shadows of any substantive regulatory concern.&amp;nbsp; If
one thing has become clear, it is that governments may no longer be
able to effectively govern the flows of data in the burgeoning context of “open
data”.&amp;nbsp;&amp;nbsp;&lt;/p&gt;
&lt;p&gt;As I have demonstrated, it remains questionable whether or
not Facebook can regulate third parties’ data collection practices
effectively.&amp;nbsp; Imposing more stringent
responsibilities on safe harbor participants could be a positive step.&amp;nbsp; It is reasonable to assume that it would be
excessive to impose liability on social networking sites for the data breaches of
third parties.&amp;nbsp; However, it is not
unreasonable to require that sites like Facebook go beyond setting “minimum
standards” for data privacy towards taking a more active enforcement role, even if
through TRUSTe or another regulatory body.&amp;nbsp;
If the safe harbor is to be effective, it cannot allow program participants
to simply waive liability for third party privacy practices.&amp;nbsp; The indemnity granted to third parties on social
networking sites may render the safe harbor program more effective at sustaining
the non-liability of third parties than at protecting the data privacy of
citizens.&lt;/p&gt;
&lt;div&gt;&lt;/div&gt;
&lt;div&gt;
&lt;hr align="left" size="1" width="33%" /&gt;

&lt;/div&gt;
&lt;p class="discreet"&gt;&lt;a name="_edn1" href="#_ednref1"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;/span&gt;&lt;/span&gt;&lt;/a&gt;[1] Official Directive 95/46/EC&lt;/p&gt;
&lt;p class="discreet"&gt;&lt;a name="_edn2" href="#_ednref2"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;/span&gt;&lt;/span&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p class="discreet"&gt;&lt;a name="_edn3" href="#_ednref3"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;/span&gt;&lt;/span&gt;&lt;/a&gt;[2] 95/46/EC&lt;/p&gt;
&lt;p class="discreet"&gt;[3] Ibid&lt;/p&gt;
&lt;p class="discreet"&gt;&lt;a name="_edn4" href="#_ednref4"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;/span&gt;&lt;/span&gt;&lt;/a&gt;&lt;a name="_edn5" href="#_ednref5"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;/span&gt;&lt;/a&gt;[4] See Acquisit,
A. a. (n.d.). Imagined Communities: Awareness, Information Sharing, and Privacy
on Facebook. &lt;em&gt;PET 2006&lt;/em&gt;&lt;/p&gt;
&lt;p class="discreet"&gt;&lt;a name="_edn6" href="#_ednref6"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;/span&gt;&lt;/span&gt;&lt;/a&gt;[5] Of the Privacy Policy browsed include, Zynga, Rock
You!, Crowdstar, Mind Jolt, Electronic Arts, Pop Cap Games, Slash Key, Playdom,
Meteor Games, Broken Bulb Studios, Wooga, and American Global Network.&lt;/p&gt;
&lt;p class="discreet"&gt;&lt;a name="_edn7" href="#_ednref7"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;/span&gt;&lt;/span&gt;&lt;/a&gt;[6] By adding an application, users are also sharing with
third parties the information of their friends if they do not specifically opt out of this practice.&lt;/p&gt;
&lt;p class="discreet"&gt;[7]See&lt;strong&gt;
&lt;/strong&gt;&amp;nbsp;Milina, S. (2003).
Let the Market Do its Job: Advocating an Integrated Laissez-Faire Approach to
Online Profiling. &lt;em&gt;Cardozo Arts and Entertainment Law Journal&lt;/em&gt; .&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/does-the-safe-harbor-program-adequately-address-third-parties-online'&gt;https://cis-india.org/internet-governance/blog/does-the-safe-harbor-program-adequately-address-third-parties-online&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>rebecca</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Facebook</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Social Networking</dc:subject>
    

   <dc:date>2011-08-02T07:19:34Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/news/harvard-university-may-13-2014-does-size-matter">
    <title>Does Size Matter? A Tale of Performing Welfare, Producing Bodies and Faking Identity</title>
    <link>https://cis-india.org/news/harvard-university-may-13-2014-does-size-matter</link>
    <description>
        &lt;b&gt;Malavika Jayaram gave a talk.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;&lt;a class="external-link" href="http://cyber.law.harvard.edu/events/luncheon/2014/05/jayaram"&gt;This was published by the website of Berkman Center for Internet and Society&lt;/a&gt;&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;Big Data doesn’t get much bigger than India’s identity project. The world’s largest biometric database - currently consisting of almost 600 million enrolled - seduces with promises of inclusion, legitimacy and visibility. By locating this techno-utopian vision within the larger surveillance state that a unique identifier facilitates, Malavika will describe the ‘welfare industrial complex’ that imagines the poor as the next emerging market. She will highlight the risks of the body as password, of implementing e-governance in a legal vacuum, and of digitization reinforcing existing inequalities. The export of technologies of control - once they have been tested on a massive population that has little agency and limited ability to withhold consent - transforms this project from a site of local activism to one with global repercussions. By offering a perspective that is somewhat different from the traditional western focus of privacy, she hopes to generate a more inclusive discourse about what it means to be autonomous and empowered in the face of paternalistic development projects. She will highlight, in particular, the varied ways in which the project is already being subverted and re-purposed, in ways that are humorous and poignant.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;About Malavika&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Malavika is a Fellow at the Berkman Center for Internet and Society at  Harvard University, focusing on privacy, identity and free expression.  She is also a Fellow at the Centre for Internet and Society, Bangalore,  and the author of the India chapter for the Data Protection &amp;amp;  Privacy volume in the Getting the Deal Done series. Malavika is one of  10 Indian lawyers in The International Who's Who of Internet e-Commerce  &amp;amp; Data Protection Lawyers directory. In August 2013, she was voted  one of India’s leading lawyers - one of only 8 women to be featured in  the “40 under 45” survey conducted by Law Business Research, London. In a  different life, she spent 8 years in London, practicing law with global  firm Allen &amp;amp; Overy in the Communications, Media &amp;amp; Technology  group, and as VP and Technology Counsel at Citigroup. She is working on a  PhD about the development of a privacy jurisprudence and discourse in  India, viewed partly through the lens of the Indian biometric ID  project.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Podcast&lt;/h3&gt;
&lt;p&gt;Watch the podcast &lt;a class="external-link" href="http://castroller.com/podcasts/BerkmanCenterFor/4060529"&gt;at this link&lt;/a&gt;.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/news/harvard-university-may-13-2014-does-size-matter'&gt;https://cis-india.org/news/harvard-university-may-13-2014-does-size-matter&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>praskrishna</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2014-06-04T09:45:49Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/do-we-really-need-an-app-for-that-examining-the-utility-and-privacy-implications-of-india2019s-digital-vaccine-certificates">
    <title>Do We Really Need an App for That? Examining the Utility and Privacy Implications of India’s Digital Vaccine Certificates</title>
    <link>https://cis-india.org/internet-governance/blog/do-we-really-need-an-app-for-that-examining-the-utility-and-privacy-implications-of-india2019s-digital-vaccine-certificates</link>
    <description>
        &lt;b&gt;We examine the purported benefits of digital vaccine certificates over regular paper-based ones and analyse the privacy implications of their use.&lt;/b&gt;
        
&lt;p&gt;&lt;em&gt;This blogpost was edited by Gurshabad Grover, Yesha Tshering Paul, and Amber Sinha.&lt;br /&gt;It was originally published on &lt;a href="https://digitalid.design/vaccine-certificates.html"&gt;Digital Identities: Design and Uses&lt;/a&gt; and is cross-posted here.&lt;br /&gt;&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;In an experiment to streamline its COVID-19 immunisation drive, India has adopted a centralised vaccine administration system called CoWIN (or COVID Vaccine Intelligence Network). In addition to facilitating registration for both online and walk-in vaccine appointments, the system also allows for the &lt;a href="https://verify.cowin.gov.in/" target="_blank"&gt;digital verification&lt;/a&gt; of vaccine certificates, which it issues to people who have received a dose. This development aligns with a global trend, as many countries have adopted or are in the process of adopting “vaccine passports” to facilitate safe movement of people while resuming commercial activity.
    &lt;br /&gt;&lt;br /&gt;Some places, such as the &lt;a href="https://www.schengenvisainfo.com/news/all-your-questions-on-eus-covid-19-vaccine-certificate-answered/" target="_blank"&gt;EU&lt;/a&gt;, have constrained the scope of use of their vaccine certificates to international travel. The Indian government, however, has so far &lt;a href="https://www.livemint.com/opinion/columns/vaccination-certificates-need-a-framework-to-govern-their-use-11618160385602.html" target="_blank"&gt;skirted&lt;/a&gt; important questions around where and when this technology should be used. By allowing &lt;a href="https://verify.cowin.gov.in/" target="_blank"&gt;anyone&lt;/a&gt; to use the online CoWIN portal to scan and verify certificates, and even providing a way for the private-sector to incorporate this functionality into their applications, the government has opened up the possibility of these digital certificates being used, and even mandated, for domestic everyday use such as going to a grocery shop, a crowded venue, or a workplace.
    &lt;br /&gt;&lt;br /&gt;In this blog post, we examine the purported benefits of digital vaccine certificates over regular paper-based ones, analyse the privacy implications of their use, and present recommendations to make them more privacy respecting. We hope that such an analysis can help inform policy on appropriate use of this technology and improve its privacy properties in cases where its use is warranted.
    &lt;br /&gt;&lt;br /&gt;We also note that while this post only examines the merits of a technological solution put out by the government, it is more important to &lt;a href="https://www.accessnow.org/cms/assets/uploads/2021/04/Covid-Vaccine-Passports-Threaten-Human-Rights.pdf" target="_blank"&gt;consider&lt;/a&gt; the effects that placing restrictions on the movement of unvaccinated people has on their civil liberties in the face of a vaccine rollout that is inequitable along many lines, including &lt;a href="https://thewire.in/gender/women-falling-behind-in-indias-covid-19-vaccination-drive" target="_blank"&gt;gender&lt;/a&gt;, &lt;a href="https://www.thehindu.com/sci-tech/science/will-25-covid-19-vaccines-for-private-hospitals-aggravate-inequity/article34799098.ece" target="_blank"&gt;caste-class&lt;/a&gt;, and &lt;a href="https://scroll.in/article/994871/tech-savvy-indians-drive-to-villages-for-covid-19-vaccinations-those-without-smartphones-lose-out" target="_blank"&gt;access to technology&lt;/a&gt;.&lt;/p&gt;
&lt;h4&gt;How do digital vaccine certificates work?&lt;/h4&gt;
&lt;p&gt;Every vaccine recipient in the country is required to be registered on the CoWIN platform using one of &lt;a href="https://www.cowin.gov.in/faq" target="_blank"&gt;seven&lt;/a&gt; existing identity documents. [1] &lt;a name="ref1"&gt;&lt;/a&gt; Once a vaccine is administered, CoWIN generates a vaccine certificate which the recipient can access on the CoWIN website. The certificate is a single page document that contains the recipient’s personal information — their name, age, gender, identity document details, unique health ID, a reference ID — and some details about the vaccine given.&lt;a name="ref2"&gt;&lt;/a&gt; [2] It also includes a “secure QR code” and a link to CoWIN’s verification &lt;a href="https://verify.cowin.gov.in/" target="_blank"&gt;portal&lt;/a&gt;.
  &lt;br /&gt;&lt;br /&gt;The verification portal allows for the verification of a certificate by scanning the attached QR code. Upon completion, the portal displays a success message along with some of the information printed on the certificate.
  &lt;br /&gt;&lt;br /&gt;Verification is done using a cryptographic mechanism known as &lt;a href="https://en.wikipedia.org/wiki/Digital_signature" target="_blank"&gt;digital signatures&lt;/a&gt;, which are encoded into the QR code attached to a vaccine certificate. This mechanism allows “offline verification”, which means that the CoWIN verification portal or any private sector app attempting to verify a certificate does not need to contact the CoWIN servers to establish its authenticity. It instead uses a “public key” issued by CoWIN beforehand to verify the digital signature attached to the certificate.
  &lt;br /&gt;&lt;br /&gt;The benefit of this convoluted design is that it protects user privacy. Performing verification offline and not contacting the CoWIN servers, precludes CoWIN from gleaning sensitive metadata about usage of the vaccine certificate. This means that CoWIN does not learn about where and when an individual uses their vaccine certificate, and who is verifying it. This closes off a potential avenue for mass surveillance. [3] However, given how certificate revocation checks are being implemented (detailed in the privacy implications section below), CoWIN ends up learning this information anyway.&lt;/p&gt;
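&lt;p&gt;As a toy illustration of how offline verification of this kind works, the sketch below uses textbook RSA with tiny, insecure demonstration numbers (real certificates use standardised signature schemes with large keys; the key values and payload here are invented, not CoWIN’s). The point is that the verifier needs only the issuer’s public key and the scanned QR contents, with no network call to the issuer:&lt;/p&gt;

```python
import hashlib

# Toy "textbook RSA" key pair -- insecure demo numbers, for illustration only.
# Real vaccine certificates use standardised schemes with large keys.
p, q = 61, 53
n = p * q        # public modulus (part of the issuer's public key)
e = 17           # public exponent (published by the issuer)
d = 2753         # private exponent (held only by the issuer)

def sign(payload: bytes) -> int:
    """Issuer side: sign a digest of the certificate payload."""
    digest = int.from_bytes(hashlib.sha256(payload).digest(), "big") % n
    return pow(digest, d, n)

def verify(payload: bytes, signature: int) -> bool:
    """Verifier side: needs only (n, e) and the scanned data --
    no request to the issuer's servers is required."""
    digest = int.from_bytes(hashlib.sha256(payload).digest(), "big") % n
    return pow(signature, e, n) == digest

cert = b'{"name": "A. Person", "dose": 2}'   # invented payload
sig = sign(cert)
print(verify(cert, sig))                     # True
print(verify(b'{"name": "Forged"}', sig))    # almost certainly False
```

&lt;p&gt;Because verification is a purely local computation on the public key, a verifier app could in principle check certificates with no connection to the issuer at all, which is exactly the privacy property the design aims for.&lt;/p&gt;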
&lt;h4&gt;Where is digital verification useful?&lt;/h4&gt;
&lt;p&gt;The primary argument for the adoption of digital verification of vaccine certificates over visual examination of regular paper-based ones is security. In the face of vaccine hesitancy, there are concerns that people may forge vaccine certificates to get around any restrictions that may be put in place on the movement of unvaccinated people. The use of digital signatures serves to allay these fears.
&lt;br /&gt;&lt;br /&gt;In its current form, however, digital verification of vaccine certificates is no more secure than visually inspecting paper-based ones. While the “secure QR code” attached to digital certificates can be used to verify the authenticity of the certificate itself, the CoWIN verification portal does not provide any mechanism nor does it instruct verifiers to authenticate the identity of the person presenting the certificate. This means that unless an accompanying identity document is also checked, an individual can simply present someone else’s certificate.
&lt;br /&gt;&lt;br /&gt;There are no simple solutions to this limitation; adding a requirement to inspect identity documents in addition to digital verification of the vaccine certificate would not be a strong enough security measure to prevent the use of duplicate vaccine certificates. People who are motivated enough to forge a vaccine certificate can also duplicate one of the seven ID documents which can be used to register on CoWIN, some of which are simple paper-based documents. [4] Requiring even stronger identity checks, such as the use of Aadhaar-based biometrics, would make digital verification of vaccine certificates more secure. However, this would be a wildly disproportionate incursion on user privacy — allowing for the mass collection of metadata like when and where a certificate is used — something that digital vaccine certificates were explicitly designed to prevent. Additionally, in Russia, people were &lt;a href="https://www.washingtonpost.com/world/europe/moscow-fake-vaccine-coronavirus/2021/06/26/0881e1e4-cf98-11eb-a224-bd59bd22197c_story.html" target="_blank"&gt;found&lt;/a&gt; issuing fake certificates by discarding real vaccine doses instead of administering them. No technological solution can prevent such fraud.
&lt;br /&gt;&lt;br /&gt;As such, the utility of digital certificates is limited to uses such as international travel, where border control agencies already have strong identity checks in place for travellers. Any everyday usage of the digital verification functionality on vaccine certificates would not present any benefit over visually examining a piece of paper or a screen.&lt;/p&gt;
&lt;h4&gt;Privacy implications of digital certificates&lt;/h4&gt;
&lt;p&gt;In addition to providing little security utility over manual inspection of certificates, digital certificates also present privacy issues. These are listed below along with recommendations to mitigate them:
&lt;br /&gt;&lt;br /&gt;&lt;em&gt;(i) The verification portal leaks sensitive metadata to CoWIN’s servers:&lt;/em&gt; An analysis of network requests made by the CoWin verification portal reveals that it conducts a ‘revocation check’ each time a certificate is verified. This check was also found in the source &lt;a href="https://github.com/egovernments/DIVOC/blob/e667697b47a50a552b8d0a8c89a950180217b945/interfaces/vaccination-api.yaml#L385" target="_blank"&gt;code&lt;/a&gt;, which is made openly available&lt;a name="ref5"&gt;&lt;/a&gt;.
[5]&lt;/p&gt;
&lt;p&gt;Revocation checks are an important security consideration while using digital signatures. They allow the issuing authority (CoWIN, in this case) to revoke a certificate in case the account associated with it is lost or stolen, or if a certificate requires correction. However, the way they have been implemented here presents a significant privacy issue. Sending certificate details to the server on every verification attempt allows it to learn about where and when an individual is using their vaccine certificate.
&lt;br /&gt;&lt;br /&gt;We note that the revocation check performed by the CoWIN portal does not necessarily mean that it is storing this information. Nevertheless, sending certificate information to the server directly contradicts claims of an “offline verification” process, which is the basis of the design of these digital certificates.
&lt;br /&gt;&lt;br /&gt;&lt;strong&gt;Recommendations:&lt;/strong&gt; Implementing privacy-respecting revocation checks such as Certificate Revocation Lists, [6] or Range Queries [7] would mitigate this issue. However, these solutions are either complex or present bandwidth and storage tradeoffs for the verifier.
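&lt;br /&gt;&lt;br /&gt;A minimal sketch of the Certificate Revocation List approach, assuming a hypothetical periodic publication of revoked certificate IDs (the IDs and refresh mechanism below are invented), might look like this:

```python
# Sketch of a CRL-style revocation check: the verifier periodically
# downloads the full list of revoked certificate IDs, then checks each
# scanned certificate locally, so the issuer never learns where or when
# a particular certificate was scanned. All IDs here are hypothetical.

revoked: set = set()

def refresh_crl(published_ids):
    """Replace the local CRL copy, e.g. on a daily schedule."""
    revoked.clear()
    revoked.update(published_ids)

def is_revoked(certificate_id: str) -> bool:
    """Purely local lookup -- no per-scan request to any server."""
    return certificate_id in revoked

refresh_crl({"cert-0042", "cert-0977"})  # simulated daily download
print(is_revoked("cert-0042"))  # True
print(is_revoked("cert-1234"))  # False
```

The tradeoff noted above is visible in the sketch: the verifier must fetch and store the entire list, which grows with every revocation.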
&lt;br /&gt;&lt;br /&gt;&lt;em&gt;(ii) Oversharing of personally identifiable information:&lt;/em&gt; CoWIN’s vaccine certificates include more personally identifiable information (name, age, gender, identity document details and unique health ID) than is required for the purpose of verifying the certificate. An examination of the vaccine certificates available to us revealed that while the Aadhaar number is appropriately masked, other personal identifiers such as the passport number and unique health ID were not masked. Additionally, the inclusion of demographic details, such as age and gender, provides only a marginal security benefit (narrowing the pool of duplicate certificates that could be used) and is not required in light of the security analysis above.
&lt;br /&gt;&lt;br /&gt;&lt;strong&gt;Recommendation:&lt;/strong&gt; Personal identifiers (such as passport number and unique health ID) should be appropriately masked and demographic details (age, gender) can be removed.
&lt;br /&gt;&lt;br /&gt;The minimal set of data required for identity-linked usage with digital verification, as described above, is a full name and masked identity document details. All other personally identifying information can be removed. In the case of paper-based certificates, which are suggested for domestic usage, only details about the vaccine’s validity are needed; no personal information is required.
&lt;br /&gt;&lt;br /&gt;&lt;em&gt;(iii) Making information available digitally increases the likelihood of collection:&lt;/em&gt; All of the personal information printed on the certificate is also encoded into the QR code. This is &lt;a href="https://www.bbc.com/news/uk-scotland-57208607" target="_blank"&gt;necessary&lt;/a&gt; because the digital signature verification process also verifies the integrity of this information (i.e., that it was not modified). A side effect is that the personal information is made readily available in digital form to verifiers when it is scanned, making it easy for them to store. This is especially likely in private-sector apps, whose operators may be interested in collecting demographic information and personal identifiers to track customer behaviour.
&lt;br /&gt;&lt;br /&gt;&lt;strong&gt;Recommendation:&lt;/strong&gt; Removing extraneous information from the certificate, as suggested above, mitigates this risk as well.&lt;/p&gt;
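&lt;p&gt;&lt;em&gt;Illustrative sketch:&lt;/em&gt; the tamper-evidence property described above can be shown with a short, self-contained example. The HMAC below is only a symmetric, standard-library stand-in for the asymmetric signatures the certificates actually use, and all field names, values and keys are hypothetical:&lt;/p&gt;

```python
import hashlib
import hmac
import json

# Hypothetical stand-in for the issuer's signing key. Real certificates use
# asymmetric signatures, but an HMAC demonstrates the same integrity property.
ISSUER_KEY = b"issuer-signing-key"

def sign(payload: dict) -> str:
    body = json.dumps(payload, sort_keys=True).encode()
    return hmac.new(ISSUER_KEY, body, hashlib.sha256).hexdigest()

def verify(payload: dict, signature: str) -> bool:
    return hmac.compare_digest(sign(payload), signature)

cert = {"name": "A. Person", "age": 34, "doses": 2}
signature = sign(cert)
assert verify(cert, signature)

# Editing or removing any signed field breaks verification. This is why every
# printed detail must also be encoded in the QR code -- and why the verifier
# necessarily receives all of it in digital, easily stored form.
tampered = dict(cert, age=21)
assert not verify(tampered, signature)
```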
&lt;h4&gt;Conclusion&lt;/h4&gt;
&lt;p&gt;Our analysis reveals that without incorporating strong, privacy-invasive identity checks, digital verification of vaccine certificates does not provide any security benefit over manually inspecting a piece of paper. The utility of digital verification is limited to purposes that already conduct strong identity checks.
&lt;br /&gt;&lt;br /&gt;In addition to their limited applicability, in their current form, these digital certificates also generate a trail of data and metadata, giving both government and industry an opportunity to infringe upon the privacy of the individuals using them.
&lt;br /&gt;&lt;br /&gt;Keeping this in mind, the adoption of this technology should be discouraged for everyday use.&lt;/p&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;h4&gt;References&lt;/h4&gt;
&lt;p&gt;[1] Exceptions &lt;a href="https://web.archive.org/web/20210511045921/https://www.mohfw.gov.in/pdf/SOPforCOVID19VaccinationofPersonswithoutPrescribedIdentityCards.pdf" target="_blank"&gt;exist&lt;/a&gt; for people without state-issued identity documents.&lt;/p&gt;
&lt;p&gt;[2] This information was gathered by inspecting three vaccine certificates linked to the author’s CoWIN account, which they were authorised to view, and may not be fully accurate.&lt;/p&gt;
&lt;p&gt;[3] This design is similar to Aadhaar’s “&lt;a href="https://resident.uidai.gov.in/offline-kyc" target="_blank"&gt;offline KYC&lt;/a&gt;” process.&lt;/p&gt;
&lt;p&gt;[4] “Aadhaar Card: UIDAI says downloaded versions on ordinary paper, mAadhaar perfectly valid”, &lt;em&gt;Zee Business&lt;/em&gt;, April 29 2019, &lt;em&gt;https://www.zeebiz.com/india/news-aadhaar-card-uidai-says-downloaded-versions-on-ordinary-paper-maadhaar-perfectly-valid-96790&lt;/em&gt;.&lt;/p&gt;
&lt;p&gt;[5] This check was also verified to be present in the reference &lt;a href="https://github.com/egovernments/DIVOC/blob/261a61093b89990fe34698f9ba17367d4cb74c34/public_app/src/components/CertificateStatus/index.js#L125" target="_blank"&gt;code&lt;/a&gt; made available for private-sector applications incorporating this functionality, suggesting that private sector apps will also be affected by this.&lt;/p&gt;
&lt;p&gt;[6] &lt;a href="https://en.wikipedia.org/wiki/Certificate_revocation_list" target="_blank"&gt;Certificate Revocation Lists&lt;/a&gt; allow the server to provide a list of revoked certificates to the verifier, instead of the verifier querying the server each time. This, however, can place heavy bandwidth and storage requirements on the verifying app as this list can potentially grow long.&lt;/p&gt;
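&lt;p&gt;A minimal sketch of the CRL approach in [6], with hypothetical certificate IDs: the verifier downloads the revocation list in bulk (say, daily), so each individual scan is a purely local lookup and the server never learns where or when a certificate was used.&lt;/p&gt;

```python
# Local copy of the revocation list, refreshed in bulk rather than per scan.
revoked_ids: set = set()

def refresh_crl(downloaded_list) -> None:
    # 'downloaded_list' would come from a periodically published, signed CRL.
    revoked_ids.clear()
    revoked_ids.update(downloaded_list)

def is_revoked(certificate_id: str) -> bool:
    # No network request is made here, so no usage metadata leaks.
    return certificate_id in revoked_ids

refresh_crl(["cert-0042", "cert-0317"])
assert is_revoked("cert-0042")
assert not is_revoked("cert-9999")
```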
&lt;p&gt;[7] Range Queries are described in this &lt;a href="https://www.ics.uci.edu/~gts/paps/st06.pdf" target="_blank"&gt;paper&lt;/a&gt;. In this method, the verifier requests revocation status from the server by specifying a range of certificate identifiers within which the certificate being verified lies. If there are any revoked certificates within this range, the server will send their identifiers to the verifier, who can then check if the certificate in question is on the list. For this to work, the range selected must be sufficiently large to include enough potential candidates to keep the server from guessing which one is in use.&lt;/p&gt;
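&lt;p&gt;The range-query idea in [7] can be sketched as follows. Certificate IDs are integers here for simplicity, and the server-side revocation set is made-up data; the final membership check happens on the verifier’s device:&lt;/p&gt;

```python
# Hypothetical server-side set of revoked certificate IDs.
SERVER_REVOKED = {1203, 4501, 4577}

def server_revoked_in_range(lo: int, hi: int) -> list:
    # The server sees only the range, not which single ID the verifier holds.
    return sorted(i for i in SERVER_REVOKED if i in range(lo, hi + 1))

def check_revoked(cert_id: int, window: int = 1000) -> bool:
    # The window must be wide enough that the server cannot guess which of
    # the many candidate IDs is actually being verified.
    lo = (cert_id // window) * window
    revoked_in_window = server_revoked_in_range(lo, lo + window - 1)
    return cert_id in revoked_in_window  # final check is local

assert check_revoked(4577)      # revoked certificate is caught
assert not check_revoked(4502)  # non-revoked ID in the same window passes
```

The privacy/bandwidth trade-off mentioned above shows up directly in `window`: a wider range hides the certificate better but returns more revoked IDs per query.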

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/do-we-really-need-an-app-for-that-examining-the-utility-and-privacy-implications-of-india2019s-digital-vaccine-certificates'&gt;https://cis-india.org/internet-governance/blog/do-we-really-need-an-app-for-that-examining-the-utility-and-privacy-implications-of-india2019s-digital-vaccine-certificates&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>divyank</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Digital ID</dc:subject>
    
    
        <dc:subject>Covid19</dc:subject>
    
    
        <dc:subject>Appropriate Use of Digital ID</dc:subject>
    

   <dc:date>2021-08-03T05:13:28Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/do-we-need-the-aadhar-scheme">
    <title>Do we need the Aadhar scheme?</title>
    <link>https://cis-india.org/internet-governance/do-we-need-the-aadhar-scheme</link>
    <description>
        &lt;b&gt;"Decentralisation and privacy are preconditions for security. Digital signatures don’t require centralised storage and are much more resilient in terms of security", Sunil Abraham in the Business Standard on 1 February 2012.&lt;/b&gt;
        
&lt;p&gt;We don’t need Aadhar because we already have a much more robust identity management and authentication system based on digital signatures that has a proven track record of working at a “billions-of-users” scale on the internet with reasonable security. The Unique Identification (UID) project based on the so-called “infallibility of biometrics” is deeply flawed in design. These design disasters waiting to happen cannot be permanently thwarted by band-aid policies.&lt;/p&gt;
&lt;p&gt;Biometrics are poor authentication factors because, once compromised, they cannot be re-secured, unlike digital signatures. Additionally, an individual’s biometrics can be harvested remotely without his or her conscious cooperation. The iris can be captured remotely without a person’s knowledge using a high-resolution digital camera.&lt;/p&gt;
&lt;p&gt;Biometrics are poor identification factors in a country where the registrars have commercial motivation to create ghost identities. For example, bank managers trying to achieve targets for deposits by opening benami accounts. Biometrics for these ghost identities can be imported from other countries or generated endlessly using image processing software. The de-duplication engine at the Unique Identification Authority of India (UIDAI) will be fooled into thinking that these are unique residents.&lt;/p&gt;
&lt;p&gt;An authentication system does not require a centralised database of authentication factors and transaction details. This is like arguing that the global system of e-commerce needs a centralised database of passwords and logs or, to use an example from the real world, to secure New Delhi, all citizens must deposit duplicate keys to their private property with the police.&lt;/p&gt;
&lt;p&gt;Decentralisation and privacy are preconditions for security. The “end-to-end principle” used to design internet security is also in compliance with Gandhian principles of Panchayat Raj. Digital signatures don’t require centralised storage of private keys and are, therefore, much more resilient in terms of security.&lt;/p&gt;
&lt;p&gt;Biometrics as authentication factors require the government to store biometrics of all citizens but citizens are not allowed to store biometrics of politicians and bureaucrats. The state authenticates the citizen but the citizen cannot conversely authenticate the state. Digital signatures as an authentication factor, on the other hand, do not require this asymmetry, since citizens can store public keys of state actors and authenticate them. The equitable power relationship thus established allows both parties to store a legally non-repudiable audit trail for critical transactions like delivery of welfare services. Biometrics exacerbate the existing power asymmetry between citizens and state, unlike digital signatures, which are a peer-authentication technology.&lt;/p&gt;
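&lt;p&gt;The mutual-authentication argument above can be sketched in a few lines. This is an illustrative example only (it requires the third-party Python &lt;em&gt;cryptography&lt;/em&gt; package, and the record and key names are hypothetical): because key pairs are cheap, both an official and a citizen can sign a record, and either side — or an auditor — can verify the other using published public keys alone.&lt;/p&gt;

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Hypothetical key pairs: one for a government official, one for a citizen.
official_key = Ed25519PrivateKey.generate()
citizen_key = Ed25519PrivateKey.generate()

record = b"ration delivery: 5kg rice, 2012-02-01"

# The official signs the disbursement; the citizen countersigns receipt.
official_sig = official_key.sign(record)
citizen_sig = citizen_key.sign(record)

def is_valid(public_key, signature, message) -> bool:
    # Verification needs only the public key -- no central store of secrets.
    try:
        public_key.verify(signature, message)
        return True
    except InvalidSignature:
        return False

assert is_valid(official_key.public_key(), official_sig, record)
assert is_valid(citizen_key.public_key(), citizen_sig, record)
# A signature does not verify against the wrong party's public key.
assert not is_valid(citizen_key.public_key(), official_sig, record)
```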
&lt;p&gt;Privacy protections should be inversely proportional to power. The transparency demanded of politicians, bureaucrats and large corporations cannot be made mandatory for ordinary citizens. Surveillance must be directed at big-ticket corruption, at the top of the pyramid and not retail fraud at the bottom. Even for retail fraud, the power asymmetry will result in corruption innovating to circumvent technical safeguards. Government officials should be required by law to digitally sign the movement of resources each step of the way till it reaches a citizen. Open data initiatives should make such records available for public scrutiny. With support from civil society and the media, citizens will themselves address retail fraud. To solve corruption, the state should become more transparent to the citizen and not vice versa.&lt;/p&gt;
&lt;p&gt;UIDAI’s latest 23-page biometrics report is supposed to dispel the home ministry’s security anxieties. It says “biometric data is collected by software provided by the UIDAI, which immediately encrypts and applies a digital signature.” Surely, what works for UIDAI, that is digital signatures, should work for citizens too. The report does not cover even the most basic attack — for example, the registrar could pretend that UIDAI software is faulty and harvest biometrics again using a parallel set-up. If biometrics are infallible, as the report proclaims, then sections in the draft UID Bill that criminalise attempts to defraud the system should be deleted.&lt;/p&gt;
&lt;p&gt;The compromise between UIDAI and the home ministry appears to be a turf battle for states where security concerns trump developmental aspirations. This compromise does nothing to address the issues raised by the Parliamentary Standing Committee on Finance, headed by the Bharatiya Janata Party’s Yashwant Sinha.&lt;/p&gt;
&lt;p&gt;Read the &lt;a class="external-link" href="http://www.business-standard.com/india/news/do-we-needaadhar-scheme/463324/"&gt;original published in the Business Standard&lt;/a&gt; on 1 February 2012&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/do-we-need-the-aadhar-scheme'&gt;https://cis-india.org/internet-governance/do-we-need-the-aadhar-scheme&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>sunil</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2012-02-03T10:11:24Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>




</rdf:RDF>
