<?xml version="1.0" encoding="utf-8" ?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:syn="http://purl.org/rss/1.0/modules/syndication/" xmlns="http://purl.org/rss/1.0/">




    



<channel rdf:about="https://cis-india.org/search_rss">
  <title>Centre for Internet and Society</title>
  <link>https://cis-india.org</link>
  
  <description>These are the search results for the query, showing results 541 to 555.</description>
  
  
  
  
  <image rdf:resource="https://cis-india.org/logo.png"/>

  <items>
    <rdf:Seq>
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/global-voices-february-11-2016-netizen-report"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/a-case-for-greater-privacy-paternalism"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/openness/open-data-hackathons-are-great-but-address-privacy-and-license-concerns"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/data-privacy-day-2016"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/events/big-data-governance-india"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/events/network-neutrality-regulation-across-south-asia-a-roundtable-on-aspects-of-differential-pricing"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/what-are-peoples-rights-in-digital-world"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/a-critique-of-consent-in-information-privacy"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/human-rights-in-the-age-of-digital-technology-a-conference-to-discuss-the-evolution-of-privacy-and-surveillance"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/livemint-moulishree-srivastava-january-5-2016-nasscom-against-differential-pricing-for-data-services"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/reply-to-rti-application-under-rti-act-of-2005-from-vanya-rakesh"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/eight-key-privacy-events-in-india-in-the-year-2015"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/rti-response-regarding-the-uidai"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/unbundling-issues-of-privacy-data-security-identity-matrics-for-financial-inclusion"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/kick-off-meeting-for-the-politics-of-data-project"/>
        
    </rdf:Seq>
  </items>

</channel>


    <item rdf:about="https://cis-india.org/internet-governance/news/global-voices-february-11-2016-netizen-report">
    <title>Netizen Report: The EU Wrestles With Facebook Over Privacy</title>
    <link>https://cis-india.org/internet-governance/news/global-voices-february-11-2016-netizen-report</link>
    <description>
        &lt;b&gt;Global Voices Advocacy's Netizen Report offers an international snapshot of challenges, victories, and emerging trends in Internet rights around the world. &lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The blog post published in Global Voices on February 11, 2016 quotes Pranesh Prakash and Subhashish Panigrahi.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;In the latest development in the negotiations between the United States and European Union over data transfer rules, Reuters reports France’s data protection authority gave Facebook&lt;a href="http://www.reuters.com/article/us-facebook-france-privacy-idUSKCN0VH1U1"&gt;&lt;span&gt; &lt;/span&gt;&lt;span&gt;three months to stop tracking&lt;/span&gt;&lt;/a&gt; non-users’ Web activity without their consent, and ordered Facebook to cease some transfers of personal data to the United States or face fines. In response, Facebook asserted it does not use the now-defunct&lt;a href="https://en.wikipedia.org/wiki/International_Safe_Harbor_Privacy_Principles"&gt;&lt;span&gt; &lt;/span&gt;&lt;span&gt;Safe Harbor&lt;/span&gt;&lt;/a&gt; agreement to move data to the United States and instead has set up alternative legal structures to keep its data transfers in line with EU law. Despite this, Facebook was forced last year to&lt;a href="http://venturebeat.com/2016/02/08/french-data-privacy-regulator-to-facebook-you-have-3-months-to-stop-tracking-non-users/"&gt;&lt;span&gt; &lt;/span&gt;&lt;span&gt;stop tracking Belgian non-users&lt;/span&gt;&lt;/a&gt; after it was taken to court by the Belgian regulator. Last week, the United States and European Union agreed upon a new legal framework to replace Safe Harbor, but as it is not yet operational, several European data protection authorities are still deciding whether data transfers should be restricted.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;&lt;strong&gt;Big Blow for Facebook’s Free Basics&lt;/strong&gt;&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Indian regulators &lt;a href="http://inbministry.blogspot.in/2016/02/telecom-regulatory-authority-of-india.html"&gt;&lt;span&gt;officially banned “differential pricing”&lt;/span&gt;&lt;/a&gt;, or discriminatory tariffs placed on data services depending on their content. This means that Internet users in India are guaranteed equal access to any website they want, regardless of how they connect to the Internet, &lt;a href="https://advox.globalvoices.org/2016/02/09/a-good-day-for-the-internet-everywhere-india-bans-differential-data-pricing/"&gt;&lt;span&gt;says Global Voices’ Subhashish Panigrahi&lt;/span&gt;&lt;/a&gt;. The decision is a particular blow to Facebook’s Free Basics application, which uses differential pricing mechanisms to make accessing Facebook, WhatsApp and a limited number of other websites free to users who do not pay for mobile data plans. Though Facebook promotes the program as a means of increasing digital access, it has faced a backlash in India and a number of other countries. Internet policy expert &lt;a href="https://twitter.com/pranesh/status/696732814083907584?ref_src=twsrc%5Etfw"&gt;&lt;span&gt;Pranesh Prakash emphasized&lt;/span&gt;&lt;/a&gt; that though the ruling is a win for open access in India, these efforts must continue until India is truly and equally connected.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;&lt;strong&gt;Google’s new scheme to combat online extremism &lt;/strong&gt;&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;In an effort to combat groups like ISIS that recruit online, Google has launched a &lt;a href="http://www.theguardian.com/uk-news/2016/feb/02/google-pilot-extremist-anti-radicalisation-information"&gt;&lt;span&gt;pilot scheme&lt;/span&gt;&lt;/a&gt; to point users who search for extremist terms toward anti-radicalization links. It announced the new effort on February 2 at a &lt;a href="http://data.parliament.uk/writtenevidence/committeeevidence.svc/evidencedocument/home-affairs-committee/countering-extremism/oral/28376.html"&gt;&lt;span&gt;meeting&lt;/span&gt;&lt;/a&gt; with the U.K. Home Affairs Select Committee on Countering Extremism. Representatives of Twitter and Facebook were also challenged by members of Parliament on their role in combating the spread of terrorist material. Twitter &lt;a href="http://www.nytimes.com/2016/02/06/technology/twitter-account-suspensions-terrorism.html"&gt;&lt;span&gt;announced&lt;/span&gt;&lt;/a&gt; that it had suspended 125,000 accounts associated with extremism since mid-2015 in response to pressure from the US government. However, as the New York Times’ Mike Isaac notes, “these companies must walk a fine line between bearing responsibility for their platforms and avoiding becoming the arbiter of what constitutes free speech.”&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;&lt;strong&gt;What’s going to happen to Ukraine’s database of ‘explicit content’?&lt;/strong&gt;&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The Ukrainian censorship body, the National Expert Commission for Protection of Public Morality, dissolved last year, but its&lt;a href="https://globalvoices.org/2016/02/05/ukrainian-censors-explicit-content-database-is-up-for-grabs/"&gt;&lt;span&gt; legacy lives on&lt;/span&gt;&lt;/a&gt; as a database of “explicit content” that no one in the government seems to know what to do with. The database includes a sizeable amount of content “containing elements of sexual nature and erotica,” but the commission was also well known for its &lt;a href="http://www.mediaite.com/tv/ukraine-govt-wants-to-ban-spongebob-promotes-homosexuality/"&gt;&lt;span&gt;attempt to ban&lt;/span&gt;&lt;/a&gt; SpongeBob SquarePants, Shrek, and the Teletubbies. Users have suggested the team responsible for dissolving the commission make the content more widely available, so they can see where taxpayers’ money went.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;&lt;strong&gt;How to protect yourself from government hacking&lt;/strong&gt;&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Hacking human rights workers, journalists, and NGOs has become &lt;a href="https://www.amnesty.org/en/latest/campaigns/2016/01/brief-history-of-government-hacking-human-rights-organizations/"&gt;&lt;span&gt;common practice &lt;/span&gt;&lt;/a&gt;for governments around the world, according to Amnesty International’s Morgan Marquis-Boire and Electronic Frontier Foundation’s Eva Galperin. In a post for Amnesty International, the two provide a brief history of government hacking and give suggestions for NGOs and human rights organizations to protect themselves.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;&lt;strong&gt;Taking on Russia’s invasive surveillance &lt;/strong&gt;&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Two Russian Internet service providers are taking the Federal Security Service to court to&lt;a href="https://advox.globalvoices.org/2016/02/03/isps-take-kremlin-to-court-over-online-surveillance/"&gt;&lt;span&gt; challenge the surveillance system&lt;/span&gt;&lt;/a&gt; employed by Russian federal police to spy on Internet use. ISPs play a critical role in making surveillance possible, by installing expensive equipment that provides police access—making this case a significant affront to Russia’s invasive surveillance apparatus.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;&lt;strong&gt;Telegram in Iran&lt;/strong&gt;&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Messaging app Telegram’s &lt;a href="http://www.theguardian.com/world/2016/feb/08/telegram-the-instant-messaging-app-freeing-up-iranians-conversations?CMP=share_btn_tw"&gt;&lt;span&gt;growing influence&lt;/span&gt;&lt;/a&gt; is being characterized as a major factor in the dissemination of information leading up to Iran’s Feb. 26 parliamentary elections, but &lt;a href="https://globalvoices.org/2015/08/28/is-telegrams-compliance-with-iran-compromising-the-digital-security-of-its-users/"&gt;&lt;span&gt;the platform&lt;/span&gt;&lt;/a&gt;’s susceptibility to state manipulation is also becoming more apparent. After the arrest of former BBC journalist Bahman Doroshafaei, the government &lt;a href="https://motherboard.vice.com/read/iran-telegram-account-bbc-journalist"&gt;&lt;span&gt;took over his Telegram account&lt;/span&gt;&lt;/a&gt; and started to message his contacts. Some believe this was an effort to extract sensitive information or to distribute spyware. Fatemeh Shams, a friend of Doroshafaei, posted the following warning to her Facebook account:&lt;/p&gt;
&lt;blockquote style="text-align: justify; "&gt;
&lt;p&gt;Someone has been talking to me for two hours from Bahman's hacked Telegram account and now is chatting with my friends with my account..If anyone messaged you on Telegram [from my account] please ignore it. I've lost access to my account.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p style="text-align: justify; "&gt;&lt;em&gt;&lt;strong&gt;Mahsa Alimardani, &lt;/strong&gt;&lt;a href="https://advocacy.globalvoicesonline.org/author/ellery-roberts-biddle/"&gt;&lt;strong&gt;&lt;span&gt;Ellery Roberts Biddle&lt;/span&gt;&lt;/strong&gt;&lt;/a&gt;&lt;strong&gt;, Hae-in Lim and&lt;/strong&gt;&lt;a href="https://advocacy.globalvoicesonline.org/author/sarahbmyers/"&gt;&lt;strong&gt;&lt;span&gt; Sarah Myers West&lt;/span&gt;&lt;/strong&gt;&lt;/a&gt; &lt;strong&gt;contributed to this report.&lt;/strong&gt;&lt;/em&gt;&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/global-voices-february-11-2016-netizen-report'&gt;https://cis-india.org/internet-governance/news/global-voices-february-11-2016-netizen-report&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>praskrishna</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2016-02-27T07:39:01Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/a-case-for-greater-privacy-paternalism">
    <title>A Case for Greater Privacy Paternalism?</title>
    <link>https://cis-india.org/internet-governance/blog/a-case-for-greater-privacy-paternalism</link>
    <description>
        &lt;b&gt;This is the second part of a series of three articles exploring the issues with the privacy self management framework and potential alternatives. &lt;/b&gt;
        
&lt;div align="left"&gt;&amp;nbsp;&lt;/div&gt;
&lt;h3 align="left" style="text-align: justify;"&gt;The first part of the series can be accessed &lt;a class="external-link" href="http://cis-india.org/internet-governance/blog/a-critique-of-consent-in-information-privacy"&gt;here&lt;/a&gt;.&lt;/h3&gt;
&lt;p align="left"&gt;&amp;nbsp;&lt;/p&gt;
&lt;h3 align="left" style="text-align: justify;"&gt;Background&lt;/h3&gt;
&lt;p align="left" style="text-align: justify;"&gt;The current data privacy protection framework across most jurisdictions is built around a rights based approach which entrusts the individual with having the wherewithal to make informed decisions about her interests and well-being.&lt;a name="_ftnref1" href="#_ftn1"&gt;&lt;sup&gt;&lt;sup&gt;[1]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; In his book, &lt;em&gt;The Phantom Public&lt;/em&gt;, published in 1925, Walter Lippmann argues that the rights based approach rests on the idea of a sovereign and omnicompetent citizen who can direct public affairs; this idea, however, is a mere phantom or an abstraction.&lt;a name="_ftnref2" href="#_ftn2"&gt;&lt;sup&gt;&lt;sup&gt;[2]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Jonathan Obar, Assistant Professor of Communication and Digital Media Studies in the Faculty of Social Science and Humanities at the University of Ontario Institute of Technology, states that Lippmann's thesis remains equally relevant in the context of current models of self-management, particularly for privacy.&lt;a name="_ftnref3" href="#_ftn3"&gt;&lt;sup&gt;&lt;sup&gt;[3]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; In &lt;a href="http://cis-india.org/internet-governance/blog/a-critique-of-consent-in-information-privacy"&gt;the previous post&lt;/a&gt;, Scott Mason and I looked at the limitations of a 'notice and consent' regime for privacy governance. Having established the deficiencies of the existing framework for data protection, I will now look at some of the alternatives proposed that may serve to address these issues.&lt;/p&gt;
&lt;p align="left" style="text-align: justify;"&gt;In this article, I will look at paternalistic solutions posed as alternatives to the privacy self-management regime. I will look at theories of paternalism and libertarianism in the context of privacy, with reference to the works of some of the leading philosophers on jurisprudence and political science. The paper will attempt to clarify the main concepts and the arguments put forward by both the proponents and opponents of privacy paternalism. The first alternative solution draws on Anita Allen's thesis in her book, &lt;em&gt;Unpopular Privacy&lt;/em&gt;,&lt;a name="_ftnref4" href="#_ftn4"&gt;&lt;sup&gt;&lt;sup&gt;[4]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; which deals with the question of whether individuals have a moral obligation to protect their own privacy. Allen expands the idea of rights to protect one's own self interests and duties towards others to the notion that we may have certain duties not only towards others but also towards ourselves, because of their overall impact on society. In the next section, we will look at the idea of 'libertarian paternalism' as put forth by Cass Sunstein and Richard Thaler&lt;a name="_ftnref5" href="#_ftn5"&gt;&lt;sup&gt;&lt;sup&gt;[5]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; and what its impact could be on privacy governance.&lt;/p&gt;
&lt;p align="left" style="text-align: justify;"&gt;&lt;strong&gt;Paternalism&lt;/strong&gt;&lt;/p&gt;
&lt;p align="left" style="text-align: justify;"&gt;Gerald Dworkin, Professor Emeritus at the University of California, Davis, defines paternalism as "interference of a state or an individual with another person, against their will, and defended or motivated by a claim that the person interfered with will be better off or protected from harm."&lt;a name="_ftnref6" href="#_ftn6"&gt;&lt;sup&gt;&lt;sup&gt;[6]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Any act of paternalism will involve some limitation on the autonomy of the subject of the regulation, usually without the consent of the subject, and premised on the belief that such an act shall either improve the welfare of the subject or prevent it from diminishing.&lt;a name="_ftnref7" href="#_ftn7"&gt;&lt;sup&gt;&lt;sup&gt;[7]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Seana Shiffrin, Professor of Philosophy and Pete Kameron Professor of Law and Social Justice at UCLA, takes a broader view of paternalism and includes within its scope not only matters which are aimed at improving the subject's welfare, but also the replacement of the subject's judgement about matters which may otherwise have lain legitimately within the subject's control.&lt;a name="_ftnref8" href="#_ftn8"&gt;&lt;sup&gt;&lt;sup&gt;[8]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; In that sense, Shiffrin's view is interesting, for it dispenses with both the requirement for active interference, and the requirement that such an act be premised on the subject's well-being.&lt;/p&gt;
&lt;p align="left" style="text-align: justify;"&gt;The central premise of John Stuart Mill's &lt;em&gt;On Liberty&lt;/em&gt; is that the only justifiable purpose for exerting power over the will of an individual is to prevent harm to others. "His own good, either physical or moral," according to Mill, "is not a sufficient warrant." However, various scholars over the years have found Mill's absolute prohibition problematic and support some degree of paternalism. John Rawls' Principle of Fairness, for instance, has been argued to be inherently paternalistic. In a nutshell, what makes paternalism controversial is that it involves coercion or interference, which in any theory of normative ethics or political science needs to be justified based on certain identified criteria. Staunch opponents of paternalism believe that this justification can never be met. Most scholars, however, do not argue that all forms of paternalism are untenable, and the bulk of scholarship on paternalism is devoted to formulating the conditions under which this justification is satisfied.&lt;/p&gt;
&lt;p align="left" style="text-align: justify;"&gt;Paternalism interferes with self-autonomy in two ways, according to Peter de Marneffe, Professor of Philosophy at the School of Historical, Philosophical and Religious Studies, Arizona State University.&lt;a name="_ftnref9" href="#_ftn9"&gt;&lt;sup&gt;&lt;sup&gt;[9]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; The first is the prohibition principle, under which a person's autonomy is violated by being prohibited from making a choice. The second is the opportunity principle, which undermines the autonomy of a person by reducing his opportunities to make a choice. Both cases must be predicated upon a finding that the paternalistic act will lead to welfare or greater autonomy. According to de Marneffe, there are three conditions under which such acts of paternalism are justified: the welfare benefits must be substantial, they must be evident, and they must outweigh the benefits of self-autonomy.&lt;a name="_ftnref10" href="#_ftn10"&gt;&lt;sup&gt;&lt;sup&gt;[10]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p align="left" style="text-align: justify;"&gt;There are two main strands of argument made against paternalism.&lt;a name="_ftnref11" href="#_ftn11"&gt;&lt;sup&gt;&lt;sup&gt;[11]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; The first argues that interference with the choices of informed adults will always be an inferior option to letting them decide for themselves, as each person is the 'best judge' of his or her interests. The second strand does not engage with the question of whether paternalism can produce better decisions for individuals, but states that any benefit derived from the paternalist act is outweighed by the harm of the violation of self-autonomy. Most proponents of soft paternalism build on this premise by trying to demonstrate that not all paternalistic acts violate self-autonomy. There are various forms of paternalism that we do not question despite them interfering with our autonomy - seat belt laws and restrictions on tobacco advertising being a few of them. If we try to locate arguments for self-autonomy in the Kantian framework, autonomy refers not just to the ability to do what one chooses, but to rational self-governance.&lt;a name="_ftnref12" href="#_ftn12"&gt;&lt;sup&gt;&lt;sup&gt;[12]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; This theory automatically "opens the door for justifiable paternalism."&lt;a name="_ftnref13" href="#_ftn13"&gt;&lt;sup&gt;&lt;sup&gt;[13]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; In this paper, I assume that certain forms of paternalism are justified. In the remaining two sections, I will look at two different theories advocating greater paternalism in the context of privacy governance and try to examine the merits and issues with such measures.&lt;/p&gt;
&lt;p align="left" style="text-align: justify;"&gt;&lt;strong&gt;A moral obligation to protect one's privacy&lt;/strong&gt;&lt;/p&gt;
&lt;p align="left" style="text-align: justify;"&gt;Modest Paternalism&lt;/p&gt;
&lt;p align="left" style="text-align: justify;"&gt;In her book, &lt;em&gt;Unpopular Privacy&lt;/em&gt;,&lt;a name="_ftnref14" href="#_ftn14"&gt;&lt;sup&gt;&lt;sup&gt;[14]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Anita Allen states that people do not place enough emphasis on the value of privacy. The right of individuals, under the 'notice and consent' regime, to exercise their free will and give up their rights to privacy as they deem fit is, according to her, problematic. The data protection law in most jurisdictions is designed to be largely value-neutral, in that it does not sit in judgement on the nature of the information that is being revealed or how the collector uses it. Its primary emphasis is on providing the data subject with information about the above and allowing him to make informed decisions. In &lt;a href="http://cis-india.org/internet-governance/blog/a-critique-of-consent-in-information-privacy"&gt;my previous post&lt;/a&gt;, Scott Mason and I had discussed that as online connectivity becomes increasingly important to participation in modern life, the choice to withdraw completely is becoming less and less of a genuine option.&lt;a name="_ftnref15" href="#_ftn15"&gt;&lt;sup&gt;&lt;sup&gt;[15]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Lamenting that people put little emphasis on privacy and often give away information which, upon retrospection and due consideration, they would feel they ought not to have disclosed, Allen proposes what she calls 'modest paternalism', in which regulations mandate that individuals not waive their privacy in certain limited circumstances.&lt;/p&gt;
&lt;p align="left" style="text-align: justify;"&gt;Allen acknowledges the tension between her arguments in favor of paternalism and her avowed support for the liberal ideals of autonomy and that government 	interference should be limited, to the extent possible. However, she tries to make a case for greater paternalism in the context of privacy. She begins by 	categorizing privacy as a "primary good" essential for "self respect, trusting relationships, positions of responsibility and other forms of flourishing." In another article, Allen states that this "technophilic generation appears to have made disclosure the default rule of everyday life."&lt;a name="_ftnref16" href="#_ftn16"&gt;&lt;sup&gt;&lt;sup&gt;[16]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Relying on various anecdotes and examples of individuals' disregard for privacy, 	she argues that privacy is so "neglected in contemporary life that democratic states, though liberal and feminist, could be justified in undertaking a 	rescue mission that includes enacting paternalistic privacy laws for the benefit of un-eager beneficiaries." She does state that in most cases it may be 	more advantageous to educate and incentivise individuals towards making choices that favor greater privacy protection. However, in exceptional cases, 	paternalism would be justified as a tool to ensure greater privacy.&lt;/p&gt;
&lt;p align="left" style="text-align: justify;"&gt;A duty towards oneself&lt;/p&gt;
&lt;p align="left" style="text-align: justify;"&gt;In an article for the Harvard Symposium on Privacy in 2013, Allen states that laws generally provide a framework built around rights of individuals that enable self-protection and duties towards others. G. A. Cohen describes Robert Nozick's views, which represent this libertarian philosophy, as follows: "The thought is that each person is the morally rightful owner of himself. He possesses over himself, as a matter of moral right, all those rights that a slaveholder has over a chattel slave as a matter of legal right, and he is entitled, morally speaking, to dispose over himself in the way such a slaveholder is entitled, legally speaking, to dispose over his slave."&lt;a name="_ftnref17" href="#_ftn17"&gt;&lt;sup&gt;&lt;sup&gt;[17]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; As per the libertarian philosophy espoused by Nozick, everyone is licensed to abuse themselves in the same manner slaveholders abused their slaves.&lt;/p&gt;
&lt;p align="left" style="text-align: justify;"&gt;Allen asks whether there is a duty towards oneself and, if such a duty exists, whether it should be reflected in policy or law. She accepts that a range of philosophers consider the idea of duties to oneself illogical or untenable.&lt;a name="_ftnref18" href="#_ftn18"&gt;&lt;sup&gt;&lt;sup&gt;[18]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Allen, however, relies on the works of scholars such as Lara Denis, Paul Eisenberg and Daniel Kading who have located such a duty. She develops a schematic of two kinds of duties - first-order duties that require us to protect ourselves for the sake of others, and second-order, derivative duties to protect ourselves. Throughout the essay, she relies on the Kantian framework of the categorical imperative to build the moral thrust of her arguments. The Kantian view of paternalism would justify those acts which interfere with an individual's autonomy in order to prevent her from exercising her autonomy irrationally, and draw her towards rational ends that agree with her conception of the good.&lt;a name="_ftnref19" href="#_ftn19"&gt;&lt;sup&gt;&lt;sup&gt;[19]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; However, Allen goes one step further and locates the genesis of duties to both others (perfect duties) and oneself (imperfect duties) in the categorical imperative. Her main thesis is that there are certain situations where we have a moral duty to protect our own privacy, where failure to do so would have an impact on either specific others or society at large.&lt;/p&gt;
&lt;p align="left" style="text-align: justify;"&gt;Issues&lt;/p&gt;
&lt;p align="left" style="text-align: justify;"&gt;Having built this interesting and somewhat controversial premise, Allen does not sufficiently expand upon it to present a nuanced solution. She provides a number of anecdotes but does not formulate any criteria for when privacy duties could be self-regarding. Her test for what kinds of paternalistic acts are justified is also extremely broad. She argues for paternalism where it protects privacy rights that "enhance liberty, liberal ways of life, well-being and expanded opportunity." She does not clearly define the threshold at which policy should move from incentives to regulatory mandate, nor does she elaborate upon what forms of paternalism would both serve the purpose of protecting privacy and ensure that there is no unnecessary interference with the rights of individuals.&lt;a name="_ftnref20" href="#_ftn20"&gt;&lt;sup&gt;&lt;sup&gt;[20]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p align="left" style="text-align: justify;"&gt;&lt;strong&gt;Nudge and libertarian paternalism&lt;/strong&gt;&lt;/p&gt;
&lt;p align="left" style="text-align: justify;"&gt;What is nudge?&lt;/p&gt;
&lt;p align="left" style="text-align: justify;"&gt;In 2008, Richard Thaler and Cass Sunstein published their book &lt;em&gt;Nudge: Improving Decisions About Health, Wealth, and Happiness&lt;/em&gt;.&lt;a name="_ftnref21" href="#_ftn21"&gt;&lt;sup&gt;&lt;sup&gt;[21]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; The central thesis of the book is that in order to make most of our decisions, we rely on a menu of options made available to us, and the order and structure of those choices is characterised by Thaler and Sunstein as "choice architecture." According to them, the choice architecture has a significant impact on the choices that we make. The book looks at examples from a food cafeteria, the position of restrooms, and how opt-in versus opt-out defaults influence the retirement plans that people choose. This choice architecture influences our behavior without coercion or a set of incentives, as conventional public policy theory would have us expect. The book draws on work done by cognitive scientists such as Daniel Kahneman&lt;a name="_ftnref22" href="#_ftn22"&gt;&lt;sup&gt;&lt;sup&gt;[22]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; and Amos Tversky&lt;a name="_ftnref23" href="#_ftn23"&gt;&lt;sup&gt;&lt;sup&gt;[23]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; as well as Thaler's own research in behavioral economics.&lt;a name="_ftnref24" href="#_ftn24"&gt;&lt;sup&gt;&lt;sup&gt;[24]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; The key takeaway from cognitive science and behavioral economics used in this book is that choice architecture influences our actions in predictable ways and leads to predictably irrational behavior. Thaler and Sunstein believe that this presents a great potential for policy makers, who can tweak the choice architecture in their specific domains to influence the decisions made by their subjects and nudge them towards behavior that is beneficial to them and/or society.&lt;/p&gt;
&lt;p align="left" style="text-align: justify;"&gt;The great attraction of the argument made by Thaler and Sunstein is that it offers a compromise between forbearance and mandatory regulation. If we identify the two ends of the policy spectrum as: a) paternalists, who believe in maximum interference through legal regulations that coerce behavior to meet the stated goals of the policy, and b) libertarians, who believe in the free market theory that relies on individuals making decisions in their best interests, 'nudging' falls somewhere in the middle, leading to the oxymoronic yet strangely apt phrase, "libertarian paternalism." The idea is to design choices in such a way that they influence decision-making so as to increase individual and societal welfare. In his book, &lt;em&gt;The Laws of Fear&lt;/em&gt;, Cass Sunstein argues that the anti-paternalistic position is incoherent, as "there is no way to avoid effects on behavior and choices."&lt;/p&gt;
&lt;p align="left" style="text-align: justify;"&gt;The proponents of libertarian paternalism respond to the commonly posed question of who decides the optimal and desirable results of choice architecture by stating that this form of paternalism does not promote a perfectionist standard of welfare but an individualistic and subjective one. According to them, choices are not prohibited, cordoned off or made to carry significant barriers. However, it is often difficult to conclude what is better for the welfare of people, even from their own point of view. The claim that nudges lead to choices that make people better off by their own standards seems more and more untenable. What nudges do is lead people towards broad welfare goals which the choice architects believe will make people's lives better in the longer term.&lt;a name="_ftnref25" href="#_ftn25"&gt;&lt;sup&gt;&lt;sup&gt;[25]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p align="left" style="text-align: justify;"&gt;&lt;strong&gt;How could nudges apply to privacy?&lt;/strong&gt;&lt;/p&gt;
&lt;p align="left" style="text-align: justify;"&gt;Our &lt;a href="http://cis-india.org/internet-governance/blog/a-critique-of-consent-in-information-privacy"&gt;previous post&lt;/a&gt; echoes the assertion made by Thaler and Sunstein that traditional rational choice theory, which assumes that individuals will make rationally optimal choices in their self-interest when provided with a set of incentives and disincentives, is largely a fiction. We have argued that this assertion holds true in the context of the privacy protection principles of notice and informed consent. Daniel Solove has argued that insights from cognitive science, particularly the theory of the nudge, would offer an acceptable compromise between the inefficacy of privacy self-management and the dangers of paternalism.&lt;a name="_ftnref26" href="#_ftn26"&gt;&lt;sup&gt;&lt;sup&gt;[26]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; His rationale is that while nudges influence choice, they are not overly paternalistic in that they still give the individual the option of making choices contrary to those sought by the choice architecture. This is an important distinction, and it demonstrates that 'nudging' is less coercive than paternalistic policies as we generally understand them.&lt;/p&gt;
&lt;p align="left" style="text-align: justify;"&gt;One of the nudging techniques which makes a lot of sense in the context of data protection policies is the use of defaults. It relies on the oft-mentioned status quo bias.&lt;a name="_ftnref27" href="#_ftn27"&gt;&lt;sup&gt;&lt;sup&gt;[27]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; This is mentioned by Thaler and Sunstein with respect to encouraging retirement savings plans and organ donation, but it would apply equally to privacy. A number of data collectors have maximum disclosure as their default setting, and users rarely make the effort to understand and change these settings. A rule mandating that data collectors set optimal defaults, ensuring that the most sensitive information is subjected to the least degree of disclosure unless the user chooses otherwise, would ensure greater privacy protection.&lt;/p&gt;
&lt;p align="left" style="text-align: justify;"&gt;Ryan Calo and Dr. Victoria Groom explored an alternative to the traditional notice and consent regime at the Center for Internet and Society, Stanford University.&lt;a name="_ftnref28" href="#_ftn28"&gt;&lt;sup&gt;&lt;sup&gt;[28]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; They conducted a two-phase experimental study. In the first phase, a standard privacy notice was compared with a control condition and a simplified notice to see if improving readability impacted the response of users. In the second phase, the notice was compared with five notice strategies, of which four were intended to enhance privacy-protective behavior and one was intended to lower it. Shara Monteleone and her team used a similar approach but with a much larger sample size.&lt;a name="_ftnref29" href="#_ftn29"&gt;&lt;sup&gt;&lt;sup&gt;[29]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; One of the primary behavioral insights used was that when we perform repetitive activities, such as accepting online terms and conditions or privacy notices, we tend to use our automatic or fast thinking instead of reflective or slow thinking.&lt;a name="_ftnref30" href="#_ftn30"&gt;&lt;sup&gt;&lt;sup&gt;[30]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Changing such choices requires leveraging the automatic behavior of individuals.&lt;/p&gt;
&lt;p align="left" style="text-align: justify;"&gt;Alessandro Acquisti, Professor of Information Technology and Public Policy at the Heinz College, Carnegie Mellon University, has applied methodologies from behavioral economics to investigate privacy decision-making.&lt;a name="_ftnref31" href="#_ftn31"&gt;&lt;sup&gt;&lt;sup&gt;[31]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; He highlights a variety of factors that distort decision-making, such as "inconsistent preferences and frames of judgment; opposing or contradictory needs (such as the need for publicity combined with the need for privacy); incomplete information about risks, consequences, or solutions inherent to provisioning (or protecting) personal information; bounded cognitive abilities that limit our ability to consider or reflect on the consequences of privacy-relevant actions; and various systematic (and therefore predictable) deviations from the abstractly rational decision process." Taking the example of social networking sites collecting sensitive information, Acquisti looks at three kinds of policy solutions: a) a hard paternalistic approach, which bans making certain kinds of information visible on the site; b) a usability approach, which entails designing the system in a way that makes it most intuitive and easy for users to decide whether to provide the information; and c) a soft paternalistic approach, which seeks to aid decision-making by providing additional information, such as how many people would have access to the information if provided, and by setting defaults such that the information is not visible to others unless the user explicitly chooses otherwise. The last two approaches are typically cited as examples of nudging approaches to privacy.&lt;/p&gt;
&lt;p align="left" style="text-align: justify;"&gt;Another method is to use tools that lead to decreased disclosure of information. For example, tools like the Social Media Sobriety Test&lt;a name="_ftnref32" href="#_ftn32"&gt;&lt;sup&gt;&lt;sup&gt;[32]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; or Mail Goggles&lt;a name="_ftnref33" href="#_ftn33"&gt;&lt;sup&gt;&lt;sup&gt;[33]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; block online services during hours set by the user, during which the user expects to be at their most vulnerable, unless the user can pass a dexterity examination.&lt;a name="_ftnref34" href="#_ftn34"&gt;&lt;sup&gt;&lt;sup&gt;[34]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Rebecca Balebako and her team are building privacy-enhancing tools for Facebook and Twitter that will nudge users towards restricting who they share their location with on Facebook and towards restricting their tweets to a smaller group of people.&lt;a name="_ftnref35" href="#_ftn35"&gt;&lt;sup&gt;&lt;sup&gt;[35]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Ritu Gulia and Dr. Sapna Gambhir have suggested nudges for social networking websites that randomly select pictures of the people who will have access to the information, to emphasise the public or private setting of a post.&lt;a name="_ftnref36" href="#_ftn36"&gt;&lt;sup&gt;&lt;sup&gt;[36]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; These approaches try to address the myopia bias, whereby we choose immediate access to a service over long-term privacy harms.&lt;/p&gt;
&lt;p align="left" style="text-align: justify;"&gt;The use of nudges as envisioned in the examples above is in some ways an extension of existing research which advocates design standards that make privacy notices more easily intelligible.&lt;a name="_ftnref37" href="#_ftn37"&gt;&lt;sup&gt;&lt;sup&gt;[37]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; However, studies show only insignificant improvements from these methods. Nudging, in that sense, goes one step further. Instead of trying to make notices more readable and enable informed consent, the design standard is intended simply to lead to the choices that the architects deem optimal.&lt;/p&gt;
&lt;p align="left" style="text-align: justify;"&gt;&lt;strong&gt;Issues with nudging&lt;/strong&gt;&lt;/p&gt;
&lt;p align="left" style="text-align: justify;"&gt;One of the primary justifications that Thaler and Sunstein put forward for nudging is that choice architecture is ubiquitous. The manner in which options are presented to us impacts how we make decisions, whether or not it was intended to do so, and there is no such thing as a neutral architecture. This inevitability, according to them, makes a strong case for nudging people towards choices that will lead to their well-being. However, this assessment does not support their further argument that libertarian paternalism nudges people towards choices that are better from their own point of view. It is my contention that various examples of libertarian paternalism, as put forth by Thaler and Sunstein, do in fact interfere with our self-autonomy, as the choice architecture leads us not to the options that we would choose for ourselves in a fictional neutral environment, but to those options that the architects believe are good for us. This substitution of judgment would satisfy the definition given by Seana Shiffrin. Second, the fact that there is no such thing as a neutral architecture is, by itself, not justification enough for nudging. If we view the issue only from the point of view of normative ethics, assuming that coercion and interference are undesirable, intentional interference is much worse than unintentional interference.&lt;/p&gt;
&lt;p align="left" style="text-align: justify;"&gt;However, there are certain nudges that rely primarily on providing information, dispensing advice and rational persuasion.&lt;a name="_ftnref38" href="#_ftn38"&gt;&lt;sup&gt;&lt;sup&gt;[38]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; The freedom of choice is preserved in these circumstances. Libertarians may argue that even in these circumstances the shaping of choice is problematic. This issue, J S Blumenthal-Barby argues, is adequately addressed by the publicity condition, a concept borrowed by Thaler and Sunstein from John Rawls.&lt;a name="_ftnref39" href="#_ftn39"&gt;&lt;sup&gt;&lt;sup&gt;[39]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; The principle states that officials should never use a technique they would be uncomfortable defending to the public; nudging is no exception. However, this seems like a simplistic solution to a complex problem. Nudges are meant to rely on inherent psychological tendencies, leveraging the theories of automatic and subconscious thinking described by Daniel Kahneman in his book &lt;em&gt;Thinking, Fast and Slow&lt;/em&gt;.&lt;a name="_ftnref40" href="#_ftn40"&gt;&lt;sup&gt;&lt;sup&gt;[40]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; In that sense, while transparency is desirable, it may not be very effective.&lt;/p&gt;
&lt;p align="left" style="text-align: justify;"&gt;Other commentators note that while behavioral economics can show why people make certain decisions, it may not be able to reliably predict how people will behave in different circumstances; the burden of extrapolating the observations into meaningful nudges may prove too heavy.&lt;a name="_ftnref41" href="#_ftn41"&gt;&lt;sup&gt;&lt;sup&gt;[41]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; However, the most oft-quoted criticism of nudging is that it relies on officials to formulate the desired goals towards which the choice architecture will lead us.&lt;a name="_ftnref42" href="#_ftn42"&gt;&lt;sup&gt;&lt;sup&gt;[42]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; The judgments of these officials could be flawed and subject to influence by large corporations.&lt;a name="_ftnref43" href="#_ftn43"&gt;&lt;sup&gt;&lt;sup&gt;[43]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; These concerns echo the best judge argument made against all forms of paternalism, mentioned earlier in this essay. J S Blumenthal-Barby, Assistant Professor at the Center for Medical Ethics and Health Policy, Baylor College of Medicine, also examines the claim that choice architects will be susceptible to the same biases while designing the choice environment.&lt;a name="_ftnref44" href="#_ftn44"&gt;&lt;sup&gt;&lt;sup&gt;[44]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; The first argument in response is that experts who extensively study decision-making may be less prone to these errors. Second, Blumenthal-Barby argues that even with errors and biases, a choice architecture which attempts to right the wrongs of a random and unstructured choice environment is a preferable option.&lt;a name="_ftnref45" href="#_ftn45"&gt;&lt;sup&gt;&lt;sup&gt;[45]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p align="left" style="text-align: justify;"&gt;&lt;strong&gt;Conclusion&lt;/strong&gt;&lt;/p&gt;
&lt;p align="left" style="text-align: justify;"&gt;Most libertarians will find problematic the notion that individuals could be prevented from sharing some information about themselves. Anita Allen's idea of self-regarding duties is at odds with how we understand rights and duties in most jurisdictions. Her attempt to locate an ethical duty to protect one's privacy, while interesting, is not backed by a formulation of how such a duty would work. While she relies largely on a Kantian framework, her definition of paternalism, as can be drawn from her writing, is broader than that articulated by Kant himself. On the other hand, Thaler and Sunstein's book &lt;em&gt;Nudge&lt;/em&gt; and their related writings do attempt to build a framework of how nudging would work, and they answer some questions they anticipate would be raised against the idea of libertarian paternalism.&lt;/p&gt;
&lt;p align="left" style="text-align: justify;"&gt;By and large, I feel that Thaler and Sunstein's idea of libertarian paternalism could be justified in the context of privacy and data protection governance. It would be fair to say that the first two of de Marneffe's conditions under which such acts of paternalism are justified&lt;a name="_ftnref46" href="#_ftn46"&gt;&lt;sup&gt;&lt;sup&gt;[46]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; are largely satisfied by nudges that ensure greater privacy protection: if nudges can ensure greater privacy protection, their benefits are both substantial and evident. However, the larger question is whether these purported benefits outweigh the costs of loss of self-autonomy. Given the numerous ways in which the 'notice and consent' framework is ineffective and leads to very little informed consent, it can be argued that there is little exercise of autonomy to begin with, and hence the loss of self-autonomy is not substantial. Some of the conceptual issues which cast doubt on the ability of nudges to solve complex problems remain unanswered, and we will have to wait for more analysis by both cognitive scientists and policy-makers. However, given the growing inefficacy of the existing privacy protection framework, it would be a good idea to begin using some insights from cognitive science and behavioral economics to ensure greater privacy protection.&lt;/p&gt;
&lt;p align="left" style="text-align: justify;"&gt;The current value-neutrality of data protection law with respect to the kind of data collected and its use, and its complete reliance on the data subject to make an informed choice, is, in my opinion, an idea that has run its course. Rather than focussing solely on controls at the stage of data collection, I believe we need a more robust theory of how to govern the subsequent uses of data. This will be the focus of the next part of this series, in which I will look at the greater use of a risk-based approach to privacy protection.&lt;/p&gt;
&lt;div align="left" style="text-align: justify;"&gt;&lt;br clear="all" /&gt;
&lt;hr size="1" width="33%" /&gt;
&lt;div id="ftn1"&gt;
&lt;p&gt;&lt;a name="_ftn1" href="#_ftnref1"&gt;&lt;sup&gt;&lt;sup&gt;[1]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; With invaluable inputs from Scott Mason.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn2"&gt;
&lt;p&gt;&lt;a name="_ftn2" href="#_ftnref2"&gt;&lt;sup&gt;&lt;sup&gt;[2]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Walter Lippmann, The Phantom Public, Transaction Publishers, 1925.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn3"&gt;
&lt;p&gt;&lt;a name="_ftn3" href="#_ftnref3"&gt;&lt;sup&gt;&lt;sup&gt;[3]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Jonathan Obar, Big Data and the Phantom Public: Walter Lippmann and the fallacy of data privacy self management, Big Data and Society, 2015, available at &lt;a href="http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2239188"&gt;http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2239188&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn4"&gt;
&lt;p&gt;&lt;a name="_ftn4" href="#_ftnref4"&gt;&lt;sup&gt;&lt;sup&gt;[4]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Anita Allen, Unpopular Privacy: What we must hide?, Oxford University Press USA, 2011.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn5"&gt;
&lt;p&gt;&lt;a name="_ftn5" href="#_ftnref5"&gt;&lt;sup&gt;&lt;sup&gt;[5]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Richard Thaler and Cass Sunstein, Nudge, Improving decisions about health, wealth and happinessYale University Press, 2008.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn6"&gt;
&lt;p&gt;&lt;a name="_ftn6" href="#_ftnref6"&gt;&lt;sup&gt;&lt;sup&gt;[6]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;a href="http://plato.stanford.edu/entries/paternalism/"&gt;http://plato.stanford.edu/entries/paternalism/&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn7"&gt;
&lt;p&gt;&lt;a name="_ftn7" href="#_ftnref7"&gt;&lt;sup&gt;&lt;sup&gt;[7]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Christian Coons and Michael Weber, ed., Paternalism: Theory and Practice; Cambridge University Press, 2013. at 29.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn8"&gt;
&lt;p&gt;&lt;a name="_ftn8" href="#_ftnref8"&gt;&lt;sup&gt;&lt;sup&gt;[8]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Seana Shiffrin, Paternalism, Unconscionability Doctrine, and Accommodation, available at			&lt;a href="http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2682745"&gt;http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2682745&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn9"&gt;
&lt;p&gt;&lt;a name="_ftn9" href="#_ftnref9"&gt;&lt;sup&gt;&lt;sup&gt;[9]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Peter de Marneffe, Self Sovereignty and Paternalism, from Christian Coons and Michael Weber, ed., Paternalism: Theory and Practice; Cambridge 			University Press, 2013. at 58.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn10"&gt;
&lt;p&gt;&lt;a name="_ftn10" href="#_ftnref10"&gt;&lt;sup&gt;&lt;sup&gt;[10]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;em&gt;Id&lt;/em&gt; .&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn11"&gt;
&lt;p&gt;&lt;a name="_ftn11" href="#_ftnref11"&gt;&lt;sup&gt;&lt;sup&gt;[11]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Christian Coons and Michael Weber, ed., Paternalism: Theory and Practice; Cambridge University Press, 2013. at 74.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn12"&gt;
&lt;p&gt;&lt;a name="_ftn12" href="#_ftnref12"&gt;&lt;sup&gt;&lt;sup&gt;[12]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Christian Coons and Michael Weber, ed., Paternalism: Theory and Practice; Cambridge University Press, 2013. at 115.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn13"&gt;
&lt;p&gt;&lt;a name="_ftn13" href="#_ftnref13"&gt;&lt;sup&gt;&lt;sup&gt;[13]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;em&gt;Ibid&lt;/em&gt; at 116.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn14"&gt;
&lt;p&gt;&lt;a name="_ftn14" href="#_ftnref14"&gt;&lt;sup&gt;&lt;sup&gt;[14]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Anita Allen, Unpopular Privacy: What we must hide?, Oxford University Press USA, 2011.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn15"&gt;
&lt;p&gt;&lt;a name="_ftn15" href="#_ftnref15"&gt;&lt;sup&gt;&lt;sup&gt;[15]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Janet Vertasi, My Experiment Opting Out of Big Data Made Me Look Like a Criminal, 2014, available at			&lt;a href="http://time.com/83200/privacy-internet-big-data-opt-out/"&gt;http://time.com/83200/privacy-internet-big-data-opt-out/&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn16"&gt;
&lt;p&gt;&lt;a name="_ftn16" href="#_ftnref16"&gt;&lt;sup&gt;&lt;sup&gt;[16]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Anita Allen, Privacy Law: Positive Theory and Normative Practice, available at 			&lt;a href="http://harvardlawreview.org/2013/06/privacy-law-positive-theory-and-normative-practice/"&gt; http://harvardlawreview.org/2013/06/privacy-law-positive-theory-and-normative-practice/ &lt;/a&gt; .&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn17"&gt;
&lt;p&gt;&lt;a name="_ftn17" href="#_ftnref17"&gt;&lt;sup&gt;&lt;sup&gt;[17]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; G A Cohen, Self ownership, world ownership and equality, available at 			&lt;a href="http://journals.cambridge.org/action/displayAbstract?fromPage=online&amp;amp;aid=3093280"&gt; http://journals.cambridge.org/action/displayAbstract?fromPage=online&amp;amp;aid=3093280 &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn18"&gt;
&lt;p&gt;&lt;a name="_ftn18" href="#_ftnref18"&gt;&lt;sup&gt;&lt;sup&gt;[18]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Marcus G. Singer, On Duties to Oneself, available at			&lt;a href="http://www.jstor.org/stable/2379349?seq=1#page_scan_tab_contents"&gt;http://www.jstor.org/stable/2379349?seq=1#page_scan_tab_contents&lt;/a&gt;; 			Kurt Baier, The moral point of view: A rational basis of ethics, available at 			&lt;a href="https://www.uta.edu/philosophy/faculty/burgess-jackson/Baier,%20The%20Moral%20Point%20of%20View%20%281958%29%20%28Excerpt%20on%20Ethical%20Egoism%29.pdf"&gt; https://www.uta.edu/philosophy/faculty/burgess-jackson/Baier,%20The%20Moral%20Point%20of%20View%20%281958%29%20%28Excerpt%20on%20Ethical%20Egoism%29.pdf &lt;/a&gt; .&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn19"&gt;
&lt;p&gt;&lt;a name="_ftn19" href="#_ftnref19"&gt;&lt;sup&gt;&lt;sup&gt;[19]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Michael Cholbi, Kantian Paternalism and suicide intervention, from Christian Coons and Michael Weber, ed., Paternalism: Theory and Practice; 			Cambridge University Press, 2013.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn20"&gt;
&lt;p&gt;&lt;a name="_ftn20" href="#_ftnref20"&gt;&lt;sup&gt;&lt;sup&gt;[20]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Eric Posner, Liberalism and Concealment, available at 			&lt;a href="https://newrepublic.com/article/94037/unpopular-privacy-anita-allen"&gt; https://newrepublic.com/article/94037/unpopular-privacy-anita-allen &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn21"&gt;
&lt;p&gt;&lt;a name="_ftn21" href="#_ftnref21"&gt;&lt;sup&gt;&lt;sup&gt;[21]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Richard Thaler and Cass Sunstein, Nudge, Improving decisions about health, wealth and happinessYale University Press, 2008.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn22"&gt;
&lt;p&gt;&lt;a name="_ftn22" href="#_ftnref22"&gt;&lt;sup&gt;&lt;sup&gt;[22]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Daniel Kahneman, Thinking, fast and slow, Farrar, Straus and Giroux, 2011.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn23"&gt;
&lt;p&gt;&lt;a name="_ftn23" href="#_ftnref23"&gt;&lt;sup&gt;&lt;sup&gt;[23]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Daniel Kahneman, Paul Slovic and Amos Tversky, Judgment under uncertainty: heuristics and biases, Cambridge University Press, 1982; Daniel Kahneman 			and Amos Tversky, Choices, Values and Frames, Cambridge University Press, 2000.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn24"&gt;
&lt;p&gt;&lt;a name="_ftn24" href="#_ftnref24"&gt;&lt;sup&gt;&lt;sup&gt;[24]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Richard Thaler, Advances in behavioral finance, Russell Sage Foundation, 1993.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn25"&gt;
&lt;p&gt;&lt;a name="_ftn25" href="#_ftnref25"&gt;&lt;sup&gt;&lt;sup&gt;[25]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Thaler, Sunstein and Balz, Choice Architecture, available at			&lt;a href="http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1583509"&gt;http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1583509&lt;/a&gt;.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn26"&gt;
&lt;p&gt;&lt;a name="_ftn26" href="#_ftnref26"&gt;&lt;sup&gt;&lt;sup&gt;[26]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Daniel Solove, Privacy self-management and consent dilemma, 2013 available at			 &lt;a href="http://scholarship.law.gwu.edu/cgi/viewcontent.cgi?article=2093&amp;amp;context=faculty_publications"&gt; http://scholarship.law.gwu.edu/cgi/viewcontent.cgi?article=2093&amp;amp;context=faculty_publications &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn27"&gt;
&lt;p&gt;&lt;a name="_ftn27" href="#_ftnref27"&gt;&lt;sup&gt;&lt;sup&gt;[27]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Frederik Borgesius, Behavioral sciences and the regulation of privacy on the Internet, available at			&lt;a href="http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2513771"&gt;http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2513771&lt;/a&gt;.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn28"&gt;
&lt;p&gt;&lt;a name="_ftn28" href="#_ftnref28"&gt;&lt;sup&gt;&lt;sup&gt;[28]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Ryan Calo and Dr. Victoria Groom, Reversing the Privacy Paradox: An experimental study, available at			&lt;a href="http://ssrn.com/abstract=1993125"&gt;http://ssrn.com/abstract=1993125&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn29"&gt;
&lt;p&gt;&lt;a name="_ftn29" href="#_ftnref29"&gt;&lt;sup&gt;&lt;sup&gt;[29]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Shara Monteleon et al, Nudges to Privacy Behavior: Exploring an alternative approahc to privacy notices, available at 			&lt;a href="http://publications.jrc.ec.europa.eu/repository/bitstream/JRC96695/jrc96695.pdf"&gt; http://publications.jrc.ec.europa.eu/repository/bitstream/JRC96695/jrc96695.pdf &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn30"&gt;
&lt;p&gt;&lt;a name="_ftn30" href="#_ftnref30"&gt;&lt;sup&gt;&lt;sup&gt;[30]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Daniel Kahneman, Thinking, fast and slow, Farrar, Straus and Giroux, 2011.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn31"&gt;
&lt;p&gt;&lt;a name="_ftn31" href="#_ftnref31"&gt;&lt;sup&gt;&lt;sup&gt;[31]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Alessandro Acquisti, Nudging Privacy, available at 			&lt;a href="http://www.heinz.cmu.edu/~acquisti/papers/acquisti-privacy-nudging.pdf"&gt; http://www.heinz.cmu.edu/~acquisti/papers/acquisti-privacy-nudging.pdf &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn32"&gt;
&lt;p&gt;&lt;a name="_ftn32" href="#_ftnref32"&gt;&lt;sup&gt;&lt;sup&gt;[32]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;a href="http://www.webroot.com/En_US/sites/sobrietytest/test.php?url=0"&gt;http://www.webroot.com/En_US/sites/sobrietytest/test.php?url=0&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn33"&gt;
&lt;p&gt;&lt;a name="_ftn33" href="#_ftnref33"&gt;&lt;sup&gt;&lt;sup&gt;[33]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;a href="http://google.about.com/od/m/g/mail_goggles.htm"&gt;http://google.about.com/od/m/g/mail_goggles.htm&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn34"&gt;
&lt;p&gt;&lt;a name="_ftn34" href="#_ftnref34"&gt;&lt;sup&gt;&lt;sup&gt;[34]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Rebecca Balebako et al, Nudging Users towards privacy on mobile devices, available at			&lt;a href="https://www.andrew.cmu.edu/user/pgl/paper6.pdf"&gt;https://www.andrew.cmu.edu/user/pgl/paper6.pdf&lt;/a&gt;.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn35"&gt;
&lt;p&gt;&lt;a name="_ftn35" href="#_ftnref35"&gt;&lt;sup&gt;&lt;sup&gt;[35]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;em&gt;Id&lt;/em&gt; .&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn36"&gt;
&lt;p&gt;&lt;a name="_ftn36" href="#_ftnref36"&gt;&lt;sup&gt;&lt;sup&gt;[36]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Ritu Gulia and Dr. Sapna Gambhir, Privacy and Privacy Nudges for OSNs: A Review, available at			&lt;a href="http://www.ijircce.com/upload/2014/march/14L_Privacy.pdf"&gt;http://www.ijircce.com/upload/2014/march/14L_Privacy.pdf&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn37"&gt;
&lt;p&gt;&lt;a name="_ftn37" href="#_ftnref37"&gt;&lt;sup&gt;&lt;sup&gt;[37]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Annie I. Anton et al., Financial Privacy Policies and the Need for Standardization, 2004 available at &lt;a href="https://ssl.lu.usi.ch/entityws/Allegati/pdf_pub1430.pdf"&gt;https://ssl.lu.usi.ch/entityws/Allegati/pdf_pub1430.pdf&lt;/a&gt;; Florian Schaub, R. Balebako et al, "A Design Space for effective privacy notices" available at			 &lt;a href="https://www.usenix.org/system/files/conference/soups2015/soups15-paper-schaub.pdf"&gt; https://www.usenix.org/system/files/conference/soups2015/soups15-paper-schaub.pdf &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn38"&gt;
&lt;p&gt;&lt;a name="_ftn38" href="#_ftnref38"&gt;&lt;sup&gt;&lt;sup&gt;[38]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Daniel Hausman and Bryan Welch argue that these cases are mistakenly characterized as nudges. They believe that nudges do not try to inform the 			automatic system, but manipulate the inherent cognitive biases. Daniel Hausman and Bryan Welch, Debate: To Nudge or Not to Nudge, Journal of 			Political Philosophy 18(1).&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn39"&gt;
&lt;p&gt;&lt;a name="_ftn39" href="#_ftnref39"&gt;&lt;sup&gt;&lt;sup&gt;[39]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Ryan Calo, Code, Nudge or Notice, available at&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn40"&gt;
&lt;p&gt;&lt;a name="_ftn40" href="#_ftnref40"&gt;&lt;sup&gt;&lt;sup&gt;[40]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Daniel Kahneman, Thinking, fast and slow, Farrar, Straus and Giroux, 2011.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn41"&gt;
&lt;p&gt;&lt;a name="_ftn41" href="#_ftnref41"&gt;&lt;sup&gt;&lt;sup&gt;[41]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Evan Selinger and Kyle Powys Whyte, Nudging cannot solve complex policy problems.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn42"&gt;
&lt;p&gt;&lt;a name="_ftn42" href="#_ftnref42"&gt;&lt;sup&gt;&lt;sup&gt;[42]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Mario J. Rizzo &amp;amp; Douglas Glen Whitman, The Knowledge Problem of New Paternalism, available at			&lt;a href="http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1310732"&gt;http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1310732&lt;/a&gt;; Pierre Schlag, Nudge, Choice Architecture, and Libertarian Paternalism, available at			&lt;a href="http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1585362"&gt;http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1585362&lt;/a&gt;.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn43"&gt;
&lt;p&gt;&lt;a name="_ftn43" href="#_ftnref43"&gt;&lt;sup&gt;&lt;sup&gt;[43]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Edward L. Glaeser, Paternalism and Psychology, available at			&lt;a href="http://papers.ssrn.com/sol3/papers.cfm?abstract_id=917383"&gt;http://papers.ssrn.com/sol3/papers.cfm?abstract_id=917383&lt;/a&gt;.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn44"&gt;
&lt;p&gt;&lt;a name="_ftn44" href="#_ftnref44"&gt;&lt;sup&gt;&lt;sup&gt;[44]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; J S BLumenthal-Barby, Choice Architecture: A mechanism for improving decisions&lt;/p&gt;
&lt;p&gt;while preserving liberty?, from Christian Coons and Michael Weber, ed., Paternalism: Theory and Practice; Cambridge University Press, 2013.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn45"&gt;
&lt;p&gt;&lt;a name="_ftn45" href="#_ftnref45"&gt;&lt;sup&gt;&lt;sup&gt;[45]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;em&gt;Id&lt;/em&gt; .&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn46"&gt;
&lt;p&gt;&lt;a name="_ftn46" href="#_ftnref46"&gt;&lt;sup&gt;&lt;sup&gt;[46]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; According to de Marneffe, there are three conditions under which such acts of paternalism are justified - the benefits of welfare should be 			substantial, evident and must outweigh the benefits of self-autonomy. Peter de Marneffe, Self Sovereignty and Paternalism, from Christian Coons and 			Michael Weber, ed., Paternalism: Theory and Practice; Cambridge University Press, 2013. at 58.&lt;/p&gt;
&lt;/div&gt;
&lt;/div&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/a-case-for-greater-privacy-paternalism'&gt;https://cis-india.org/internet-governance/blog/a-case-for-greater-privacy-paternalism&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Amber Sinha</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2016-02-20T07:28:43Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/openness/open-data-hackathons-are-great-but-address-privacy-and-license-concerns">
    <title>Open Data Hackathons are Great, but Address Privacy and License Concerns</title>
    <link>https://cis-india.org/openness/open-data-hackathons-are-great-but-address-privacy-and-license-concerns</link>
    <description>
        &lt;b&gt;This is to cross-publish a blog post from DataMeet website regarding a letter shared with the organisers of Urban Hack 2015, Bangalore, in response to a set of privacy and license concerns identified and voiced during the hackathon by DataMeet members. Sumandro Chattapadhyay co-authored and co-signed the letter. The blog post is written by Nisha Thompson.&lt;/b&gt;
        
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;p&gt;Hackathons are a source of confusion and frustration for us. DataMeet actively does not do them unless there is a very specific outcome the community wants, like &lt;a href="https://github.com/datameet/maps/tree/master/parliamentary-constituencies"&gt;freeing a whole dataset&lt;/a&gt; or introducing &lt;a href="http://datameet.org/2015/05/13/mumbai-meet-6-data-science-hackathon/"&gt;open data to a new audience&lt;/a&gt;. We feel that they cause burnout, are not productive, and in general don't help create a healthy community of civic tech and open data enthusiasts.&lt;/p&gt;
&lt;p&gt;That is not to say we feel others shouldn't do them; they are very good opportunities to spark discussion and introduce new audiences to problems in the social sector. &lt;a href="http://www.datakind.org/chapters/datakind-blr"&gt;DataKind&lt;/a&gt;, &lt;a href="https://rhokbangalore.wordpress.com/"&gt;RHOK&lt;/a&gt; and numerous others regularly host hackathons, or variations of them, to stir the pot and bring new people into civic tech, and these can be successful starts to long-term connections and experiments. A lot of people in the DataMeet community participate in and enjoy hackathons.&lt;/p&gt;
&lt;p&gt;However, with great data access comes great responsibility. We always want to make sure that when a dataset is opened, even if no output is achieved, at least no harm is done.&lt;/p&gt;
&lt;p&gt;Last October an open data hackathon, &lt;a href="https://www.hackerearth.com/sprints/urban-hack/"&gt;Urban Hack&lt;/a&gt;, run by HackerEarth, &lt;a href="http://www.nasscom.in/"&gt;NASSCOM&lt;/a&gt;, &lt;a href="http://www.xrci.xerox.com/"&gt;XEROX&lt;/a&gt;, &lt;a href="https://console.ng.bluemix.net/?cm_mmc=EcoDISA-_-Bluemix_day-_-11-15-14::12-31-15-_-UrbanHack"&gt;IBM&lt;/a&gt; and &lt;a href="http://wri-india.org/"&gt;World Resources Institute India&lt;/a&gt; wanted to bring out open data and spark innovation in the transport and crime space by making datasets from the &lt;a href="http://mybmtc.com/"&gt;Bangalore Metropolitan Transport Corporation (BMTC)&lt;/a&gt; and the Bangalore City Police available to work with. A DataMeet member (&lt;a href="http://www.lostprogrammer.com/"&gt;Srinivas Kodali&lt;/a&gt;), a huge transport data enthusiast, was participating and wanted to take a look at what was being made available.&lt;/p&gt;
&lt;p&gt;In the morning, shortly after the event started, I received a call from him saying that a dataset had been made available that seemed to violate privacy and data security. We contacted the organizers and they took it down; later we realized it was quite a sensitive dataset and a few hundred people had already downloaded it. We were also distressed that they had not clarified the ownership or license of the data, and had linked to sources like &lt;a href="http://openbangalore.org/"&gt;Open Bangalore&lt;/a&gt; without specifying licensing, which violated the license.&lt;/p&gt;
&lt;p&gt;The organizers were quite well known and had been involved with hackathons before, so it was a little distressing to see these mistakes being made. We were concerned that the government partners (who had not participated in these types of events before) were also being exposed to poor practices. As smart cities initiatives take over the Indian urban space, we began to realize that this is a mistake that shouldn't happen again.&lt;/p&gt;
&lt;p&gt;Along with &lt;a href="http://cis-india.org/"&gt;Centre for Internet and Society&lt;/a&gt; and Random Hacks of Kindness we sent the organizers, Bangalore City Police and BMTC a letter about the breach in protocol. We wanted to make sure everyone was aware of the issues and that measures were taken to not repeat these mistakes.&lt;/p&gt;
&lt;p&gt;You can see the letter here:&lt;/p&gt;
&lt;p&gt;&lt;iframe src="https://www.documentcloud.org/documents/2702333-Appropriate-and-Responsible-Practices-for.html" height="500" width="600"&gt;&lt;/iframe&gt;&lt;/p&gt;
&lt;p&gt;We are very proud of the DataMeet community and Srinivas for bringing this violation to the attention of the organizers. As people who participate in hackathons and other data events, it is imperative that we keep privacy and security in mind at all times. In a space like India, where a lot of these concepts are new to institutions like the government, it is essential that we always use these opportunities not only to showcase the power of open data but also good practices for protecting privacy and ensuring security.&lt;/p&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;p&gt;&lt;em&gt;Originally posted on DataMeet website: &lt;a href="http://datameet.org/2016/02/02/to-hack-or-not-to-hack/"&gt;http://datameet.org/2016/02/02/to-hack-or-not-to-hack/&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/openness/open-data-hackathons-are-great-but-address-privacy-and-license-concerns'&gt;https://cis-india.org/openness/open-data-hackathons-are-great-but-address-privacy-and-license-concerns&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>sumandro</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Open Data</dc:subject>
    
    
        <dc:subject>Open Government Data</dc:subject>
    
    
        <dc:subject>Featured</dc:subject>
    
    
        <dc:subject>Hackathon</dc:subject>
    
    
        <dc:subject>Openness</dc:subject>
    

   <dc:date>2016-02-05T20:37:18Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/data-privacy-day-2016">
    <title>Data Privacy Day 2016</title>
    <link>https://cis-india.org/internet-governance/news/data-privacy-day-2016</link>
    <description>
        &lt;b&gt;The Bangalore chapter of Data Privacy Day was organized by Data Security Council of India on January 28, 2016 at Electronic City in Bangalore. Sunil Abraham was a panelist.&lt;/b&gt;
        &lt;h3&gt;Agenda&lt;/h3&gt;
&lt;table class="listing"&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;th&gt;&lt;img src="https://cis-india.org/home-images/DSCI.jpg/@@images/db4d4755-b12d-47fc-85fa-bf728f2b82b8.jpeg" alt="DSCI" class="image-inline" title="DSCI" /&gt;&lt;/th&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/data-privacy-day-2016'&gt;https://cis-india.org/internet-governance/news/data-privacy-day-2016&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>praskrishna</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2016-01-29T15:34:18Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/events/big-data-governance-india">
    <title>Big Data and Governance in India</title>
    <link>https://cis-india.org/internet-governance/events/big-data-governance-india</link>
    <description>
        &lt;b&gt;The Centre for Internet &amp;amp; Society (CIS) is happy to invite you to a discussion on the role of Big Data in governance in India, with a focus on Digital India, the UID Scheme and the Smart Cities Mission, on January 23, 2016 at the CIS office in Bangalore from 11 a.m. to 4 p.m.&lt;/b&gt;
        &lt;h3&gt;&lt;a href="https://cis-india.org/internet-governance/blog/background-note-big-data" class="internal-link"&gt;Background Note&lt;/a&gt;&lt;/h3&gt;
&lt;hr /&gt;
&lt;p&gt;The roundtable discussion intends to delve deeper into various issues around the role of big data in Government schemes and projects like the Digital India, the UID Scheme and the 100 Smart Cities Mission. Some of the topics would include:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Use/Assumptions about use of Big Data.&lt;/li&gt;
&lt;li&gt;The public dialogue in the context of Big Data, rights, and governance.&lt;/li&gt;
&lt;li&gt;Status and Role of India's data protection standards impacted by Big Data.&lt;/li&gt;
&lt;li&gt;Legal hurdles posed by Big Data.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;We look forward to making this a forum for knowledge exchange and a learning opportunity for our friends and colleagues attending the discussion.&lt;/p&gt;
&lt;p&gt;&lt;b&gt;Contact:&lt;/b&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Vanya Rakesh vanya@cis-india.org +919586572707&lt;/li&gt;
&lt;li&gt;Amber Sinha amber@cis-india.org +919620180343&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;Agenda&lt;/h2&gt;
&lt;table class="plain"&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Introduction&lt;br /&gt;11:00 am - 11.30 am&lt;br /&gt;&lt;br /&gt;&lt;/td&gt;
&lt;td&gt;Introduction about “Big Data in the Global South: Mitigating Harms” and “Big Data in Indian Governance”.&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Digital India&lt;br /&gt;11.30 am - 1:00 pm&lt;br /&gt;&lt;br /&gt;&lt;/td&gt;
&lt;td&gt;Discussion&lt;br /&gt;&lt;br /&gt; 
&lt;ul&gt;
&lt;li&gt;Schemes under Digital India and how Big Data pertains to them&lt;/li&gt;
&lt;li&gt;Scale and nature of data being collected&lt;/li&gt;
&lt;li&gt;Actors involved&lt;/li&gt;
&lt;li&gt;Research Methodology and coding&lt;/li&gt;
&lt;li&gt;“Cradle to grave” identity&lt;/li&gt;
&lt;li&gt;Need for privacy legislation/data protection policies&lt;/li&gt;
&lt;/ul&gt;
&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;1:00 pm- 2:00 pm &lt;br /&gt;&lt;/td&gt;
&lt;td&gt;Lunch&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Big Data and Smart Cities&lt;br /&gt;2:00 pm - 3:30pm &lt;br /&gt;&lt;br /&gt;&lt;/td&gt;
&lt;td&gt;Discussion&lt;br /&gt;&lt;br /&gt; 
&lt;ul&gt;
&lt;li&gt;Use/Assumptions about use of Big Data in Smart cities.&lt;/li&gt;
&lt;li&gt;Organisations/companies driving the use of Big Data in Governance in India&lt;/li&gt;
&lt;li&gt;The public dialogue around the scheme in the context of big data, rights, and governance&lt;/li&gt;
&lt;li&gt;Impact of Big Data on India's Data Protection Standards&lt;/li&gt;
&lt;li&gt;Impact of Big Data on other legislation/policy besides privacy. What type of 'legal hurdles' could Big Data pose?&lt;/li&gt;
&lt;li&gt;Need for creating regulatory/legal framework&lt;/li&gt;
&lt;/ul&gt;
&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;3:30pm-4:00pm&lt;/td&gt;
&lt;td&gt;Tea/Coffee&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;
&lt;h2&gt;Detailed Agenda&lt;/h2&gt;
&lt;h3&gt;Digital India&lt;/h3&gt;
&lt;p&gt;&lt;b&gt;Scope of schemes under Digital India and how Big Data pertains to them&lt;/b&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;What are the ways in which Big Data is defined?&lt;/li&gt;
&lt;li&gt;What aspects of Digital India initiatives pertain to Big Data?&lt;/li&gt;
&lt;li&gt;What could be the harms/benefits of Big Data for Digital India?&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;b&gt;Scale and nature of data being collected&lt;/b&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;What do the schemes intend to quantify?&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;b&gt;Actors involved&lt;/b&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;What kinds of issues arise in the PPP model?&lt;/li&gt;
&lt;li&gt;Questions about ownership of data, access-control and security&lt;/li&gt;
&lt;li&gt;Application of Section 43A rules to private parties involved&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;b&gt;Research Methodology and coding&lt;/b&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;What are the relevant questions that need to be asked in mapping each scheme?&lt;/li&gt;
&lt;li&gt;How do we view e-governance initiatives vis-a-vis privacy principles?&lt;/li&gt;
&lt;li&gt;What are the rights of citizens, and how are they impacted?&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;b&gt;“Cradle to grave” identity&lt;/b&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;What does ‘cradle to grave’ digital identity mean?&lt;/li&gt;
&lt;li&gt;What is the impact of using the Aadhaar number?&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;b&gt;Need for privacy legislation/data protection policies&lt;/b&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;What aspects of the right to privacy pertain to the schemes?&lt;/li&gt;
&lt;li&gt;Extending the Section 43A rules to government agencies&lt;/li&gt;
&lt;li&gt;Justice Shah committee’s nine privacy principles.&lt;/li&gt;
&lt;/ul&gt;
&lt;h3&gt;Big Data and Smart Cities&lt;/h3&gt;
&lt;p&gt;&lt;b&gt;Use/Assumptions about use of Big Data in Smart cities&lt;/b&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;What can be termed as big data in the context of smart cities?&lt;/li&gt;
&lt;li&gt;What would be the role of big data?&lt;/li&gt;
&lt;li&gt;Where do we see use/potential use of big data in the smart cities?&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;b&gt;What bodies/companies are driving the use of Big Data in Governance in India? &lt;/b&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Identifying actors involved.&lt;/li&gt;
&lt;li&gt;Defining the role of government bodies, private companies (such as IT companies and consultants), etc. in the use of big data. Clarity on ownership, storage, use, re-use, and deletion of data. The question of accountability in case of breach/misuse.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;b&gt;What has been the public dialogue around a scheme in the context of big data, rights, and governance? &lt;/b&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Weighing promises of big data.&lt;/li&gt;
&lt;li&gt;Weighing challenges of big data.&lt;/li&gt;
&lt;li&gt;Concerns around big data: data security, privacy, digital resilience of infrastructure, risks of identity management, circumvention of democracy, social exclusion, right to equality, right to access, etc.&lt;/li&gt;
&lt;li&gt;Issue of governance and implementation: role of SPVs.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;b&gt;How are India's data protection standards impacted by Big Data? &lt;/b&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Need for developing standards.&lt;/li&gt;
&lt;li&gt;Drawing from existing international standards.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;b&gt;Is there other legislation/policy besides privacy impacted by Big Data? What type of 'legal hurdles' could Big Data pose?&lt;/b&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Legal landscaping: impact on current laws/policies/provisions.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;b&gt;Need for creating regulatory/legal framework?&lt;/b&gt;&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/events/big-data-governance-india'&gt;https://cis-india.org/internet-governance/events/big-data-governance-india&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>praskrishna</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Big Data</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Smart Cities</dc:subject>
    
    
        <dc:subject>Event</dc:subject>
    

   <dc:date>2016-01-17T01:57:45Z</dc:date>
   <dc:type>Event</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/events/network-neutrality-regulation-across-south-asia-a-roundtable-on-aspects-of-differential-pricing">
    <title>Network Neutrality Regulation across South Asia: A Roundtable on Aspects of Differential Pricing</title>
    <link>https://cis-india.org/internet-governance/events/network-neutrality-regulation-across-south-asia-a-roundtable-on-aspects-of-differential-pricing</link>
    <description>
        &lt;b&gt;The Centre for Internet and Society (CIS), in association with Observer Research Foundation and IT for Change, in collaboration with the Annenberg School for Communication at the University of Pennsylvania, is pleased to announce a roundtable on ‘Network Neutrality Regulation Across South Asia: Aspects of Differential Pricing’ that will take place on January 22, 2016 from 11.00 a.m. to 5.00 p.m. at TERI in Bangalore.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;&lt;b&gt;&lt;a href="https://cis-india.org/internet-governance/blog/network-neutrality-across-south-asia" class="internal-link"&gt;Download the Invite&lt;/a&gt;&lt;/b&gt;&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;The objective of this roundtable will be to look into the issue of differential pricing in light of TRAI’s recent consultation process, with the specific intention of research building. The network neutrality debate has gained significant momentum in India during the past year, with competing interests of internet service providers, OTTs and the public giving rise to important questions of ICT regulation and policy.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;With Facebook looking to expand its zero rated walled garden, Free Basics, into nascent markets, differential pricing is an important point of regulatory policy not just in India, but in jurisdictions across South Asia. These countries have limited connectivity, large consumer potential and low internet penetration which bring to the fore questions of access, diversity, competition and innovation. To this end, the roundtable will seek to address the regulatory and market aspects of differential pricing as well as the impact on rights. Broadly, the roundtable will be forward looking and seek to build future research agendas.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Draft Agenda&lt;/h3&gt;
&lt;table class="plain"&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;11:00 – 11:30&lt;/td&gt;
&lt;td&gt;Tea and Registration&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;11:30 – 12:30&lt;/td&gt;
&lt;td&gt;Roundtable 1: Framing the issue:&lt;br /&gt; 
&lt;ul&gt;
&lt;li&gt;The practice of differential pricing&lt;/li&gt;
&lt;li&gt;Examples of differential pricing&lt;/li&gt;
&lt;li&gt;Stakeholder perspectives&lt;/li&gt;
&lt;li&gt;Competition and market effect of differential pricing&lt;/li&gt;
&lt;li&gt;Larger social consequences of differential pricing&lt;/li&gt;
&lt;/ul&gt;
&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;12:30 – 1:00&lt;/td&gt;
&lt;td&gt;Lunch&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;1:00 – 2:30&lt;/td&gt;
&lt;td&gt;
&lt;p&gt;Roundtable 2: Regulatory response:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Discerning governmental actions&lt;/li&gt;
&lt;li&gt;Locating public interest&lt;/li&gt;
&lt;li&gt;Moving from research to action&lt;/li&gt;
&lt;/ul&gt;
&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;2:30 – 3:00&lt;/td&gt;
&lt;td&gt;Tea&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;3:00 – 4:30&lt;/td&gt;
&lt;td&gt;
&lt;p&gt;Roundtable 3: Impact on rights:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Access&lt;/li&gt;
&lt;li&gt;Freedom of expression&lt;/li&gt;
&lt;li&gt;Privacy&lt;/li&gt;
&lt;li&gt;Equity and Social Justice&lt;/li&gt;
&lt;/ul&gt;
&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;4:30 – 5:00&lt;/td&gt;
&lt;td&gt;Discussion and research agenda building&lt;br /&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;
&lt;h3&gt;Roundtable Questions:&lt;/h3&gt;
&lt;p&gt;Roundtable 1: FRAMING THE ISSUE:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;What is differential pricing and how does it work? What are the technical components and policy components of differential pricing? What are examples of differential pricing?&lt;/li&gt;
&lt;li&gt;What has been the response from different stakeholders to differential pricing schemes? What are the arguments for/against differential pricing?&lt;/li&gt;
&lt;li&gt;What could be the market effect of differential pricing?&lt;/li&gt;
&lt;li&gt;What are possible larger social impacts of differential pricing?&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Roundtable 2: REGULATORY RESPONSE:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;How have governments responded to differential pricing? What can these responses tell us about the position of governments?&lt;/li&gt;
&lt;li&gt;What are the different components for consideration with developing a regulatory response? What are different forms of regulation for differential pricing?&lt;/li&gt;
&lt;li&gt;What type of policy research around differential pricing can drive meaningful action?&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Roundtable 3: IMPACT ON RIGHTS:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;How does differential pricing impact the right to access, freedom of expression, privacy, and equity and social justice?&lt;/li&gt;
&lt;li&gt;Are there ways to mitigate this impact through regulation? Market incentives? Company policy?&lt;/li&gt;
&lt;li&gt;What are forms of redress that individuals could seek in the context of differential pricing?&lt;/li&gt;
&lt;/ul&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/events/network-neutrality-regulation-across-south-asia-a-roundtable-on-aspects-of-differential-pricing'&gt;https://cis-india.org/internet-governance/events/network-neutrality-regulation-across-south-asia-a-roundtable-on-aspects-of-differential-pricing&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>praskrishna</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Free Basics</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Freedom of Speech and Expression</dc:subject>
    
    
        <dc:subject>Event</dc:subject>
    

   <dc:date>2016-01-17T02:41:13Z</dc:date>
   <dc:type>Event</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/what-are-peoples-rights-in-digital-world">
    <title>What are People's Rights in Digital World</title>
    <link>https://cis-india.org/internet-governance/news/what-are-peoples-rights-in-digital-world</link>
    <description>
        &lt;b&gt;Vanya Rakesh participated in this workshop organized by IT for Change on December 4, 2015 in Bangalore.&lt;/b&gt;
        &lt;table class="plain"&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;th&gt;
&lt;p&gt;&lt;img src="https://cis-india.org/home-images/PeoplesRights.jpg" alt="Peoples Rights" class="image-inline" title="Peoples Rights" /&gt;&lt;/p&gt;
&lt;/th&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Above: Participants from the workshop&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;This workshop by IT for Change to build  conceptions of rights with regard to the digital realm based on our tacit formative consciousness about them and undertake such an exercise to draw the first outlines of the social contract that must underpin our pervasively digital existence. IT for Change brought together thought leaders engaged in rights frameworks (including rights activists across domains and digital rights activists) to participate in this preliminary inquiry, to build from scratch a conception of what constitutes an equitable and just digital society, and what individual and collective rights would be commensurate to such a conception.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;For more info &lt;a class="external-link" href="http://sflc.in/workshop-on-what-are-peoples-rights-in-the-digital-world/"&gt;click here&lt;/a&gt;.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/what-are-peoples-rights-in-digital-world'&gt;https://cis-india.org/internet-governance/news/what-are-peoples-rights-in-digital-world&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>praskrishna</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Freedom of Speech and Expression</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2016-01-12T01:51:53Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/a-critique-of-consent-in-information-privacy">
    <title>A Critique of Consent in Information Privacy</title>
    <link>https://cis-india.org/internet-governance/blog/a-critique-of-consent-in-information-privacy</link>
    <description>
        &lt;b&gt;The idea of informed consent in privacy law is supposed to ensure the autonomy of an individual in any exercise which involves sharing of the individual's personal information. Consent is usually taken through a document, a privacy notice, signed or otherwise agreed to by the participant.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;&lt;b&gt;Notice and Consent as cornerstone of privacy law&lt;/b&gt;&lt;br /&gt;The privacy notice, which is the primary subject of this article, conveys all pertinent information, including risks and benefits to the participant, and in the possession of such knowledge, they can make an informed choice about whether to participate or not.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Most modern laws and data privacy principles seek to focus on individual control. In this context, the definition by the late Alan Westin, former Professor 	of Public Law &amp;amp; Government Emeritus, Columbia University, which characterises privacy as "the claim of individuals, groups, or institutions to determine for themselves when, how, and to what extent information about them is communicated to other,"	&lt;a href="#_ftn1" name="_ftnref1"&gt;&lt;sup&gt;&lt;sup&gt;[1]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; is most apt. The idea of privacy as control is what finds articulation in data protection policies across jurisdictions beginning from the Fair Information Practice Principles (FIPP) from the United States.	&lt;a href="#_ftn2" name="_ftnref2"&gt;&lt;sup&gt;&lt;sup&gt;[2]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Paul Schwarz, the Jefferson E. Peyser Professor at UC Berkeley School of Law and a Director of the Berkeley Center for Law and Technology, called the FIPP the building blocks of modern information privacy law.	&lt;a href="#_ftn3" name="_ftnref3"&gt;&lt;sup&gt;&lt;sup&gt;[3]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; These principles trace their history to a report called 'Records, Computers and 	Rights of Citizens'&lt;a href="#_ftn4" name="_ftnref4"&gt;&lt;sup&gt;&lt;sup&gt;[4]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; prepared by an Advisory Committee appointed by the US Department 	of Health, Education and Welfare in 1973 in response to the increasing automation in data systems containing information about individuals. 
The Committee's 	mandate was to "explore the impact of computers on record keeping about individuals and, in addition, to inquire into, and make recommendations regarding, 	the use of the Social Security number."&lt;a href="#_ftn5" name="_ftnref5"&gt;&lt;sup&gt;&lt;sup&gt;[5]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; The most important legacy of this report was 	the articulation of five principles which would not only play a significant role in the privacy laws in US but also inform data protection law in most 	privacy regimes internationally&lt;a href="#_ftn6" name="_ftnref6"&gt;&lt;sup&gt;&lt;sup&gt;[6]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; like the OECD Privacy Guidelines, the EU Data 	Protection Principles, the FTC Privacy Principles, APEC Framework or the nine National Privacy Principles articulated by the Justice A P Shah Committee 	Report which are reflected in the Privacy Bill, 2014 in India. Fred Cate, the C. Ben Dutton Professor of Law at the Indiana University Maurer School of 	Law, effectively summarises the import of all of these privacy regimes as follows:&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;"All of these data protection instruments reflect the same approach: tell individuals what data you wish to collect or use, give them a choice, grant them 	access, secure those data with appropriate technologies and procedures, and be subject to third-party enforcement if you fail to comply with these 	requirements or individuals' expressed preferences"&lt;a href="#_ftn7" name="_ftnref7"&gt;&lt;sup&gt;&lt;sup&gt;[7]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This makes the individual empowered and allows them to weigh their own interests in exercising their consent. The allure of this paradigm is that in one 	elegant stroke, it seeks to "ensure that consent is informed and free and thereby also to implement an acceptable tradeoff between privacy and competing 	concerns."&lt;a href="#_ftn8" name="_ftnref8"&gt;&lt;sup&gt;&lt;sup&gt;[8]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; This system was originally intended to be only one of the multiple ways 	in data processing would be governed, along with other substantive principles such as data quality, however, it soon became the dominant and often the only 	mechanism.&lt;a href="#_ftn9" name="_ftnref9"&gt;&lt;sup&gt;&lt;sup&gt;[9]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; In recent years however, the emergence of Big Data and the nascent development of the Internet of Things has led many commentators to begin questioning the workability of consent as a principle of privacy.	&lt;a href="#_ftn10" name="_ftnref10"&gt;&lt;sup&gt;&lt;sup&gt;[10]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; In this article we will look closely at the some of issues with the concept of 	informed consent, and how these notions have become more acute in recent years. Following an analysis of these issues, we will conclude by arguing that 	today consent, as the cornerstone of privacy law, may in fact be thought of as counter-productive and that a rethinking of a principle based approach to 	privacy may be necessary.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Problems with Consent&lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;To a certain extent, there are some cognitive problems that have always existed with the issue of informed consent such as long and difficult to understand 	privacy notices,&lt;a href="#_ftn11" name="_ftnref11"&gt;&lt;sup&gt;&lt;sup&gt;[11]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; although, in recent past with these problems have become much 	more aggravated. Fred Cate points out that FIPPs at their inception were broad principles which included both substantive and procedural aspects. However, 	as they were translated into national laws, the emphasis remained on the procedural aspect of notice and consent. From the idea of individual or societal 	welfare as the goals of privacy, the focus had shifted to individual control.&lt;a href="#_ftn12" name="_ftnref12"&gt;&lt;sup&gt;&lt;sup&gt;[12]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; With data collection occurring with every use of online services, and complex data sets being created, it is humanly impossible to exercise rational 	decision-making about the choice to allow someone to use our personal data. The thrust of Big Data technologies is that the value of data resides not in its primary purposes but in its numerous secondary purposes where data is re-used many times over.	&lt;a href="#_ftn13" name="_ftnref13"&gt;&lt;sup&gt;&lt;sup&gt;[13]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; In that sense, the very idea of Big Data conflicts with the data minimization 	principle.&lt;a href="#_ftn14" name="_ftnref14"&gt;&lt;sup&gt;&lt;sup&gt;[14]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; The idea is to retain as much data as possible for secondary uses. Since, these secondary uses are, by their nature, unanticipated, its runs counter to the the very idea of the purpose limitation principle.	
&lt;a href="#_ftn15" name="_ftnref15"&gt;&lt;sup&gt;&lt;sup&gt;[15]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; The notice and consent requirement has simply led to a proliferation of long and 	complex privacy notices which are seldom read and even more rarely understood. We will articulate some issues with privacy notices which have always 	existed, and have only become more exacerbated in the context of Big Data and the Internet of Things.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;1. &lt;/b&gt; &lt;b&gt;Failure to read/access privacy notices &lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The notice and consent principle relies on the ability of the individual to make an informed choice after reading the privacy notice. The purpose of a 	privacy notice is to act as a public announcement of the internal practices on collection, processing, retention and sharing of information and make the 	user aware of the same.&lt;a href="#_ftn16" name="_ftnref16"&gt;&lt;sup&gt;&lt;sup&gt;[16]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; However, in order to do so the individual must first be 	able to access the privacy notices in an intelligible format and read them. Privacy notices come in various forms, ranging from documents posted as privacy policies on a website, to click through notices in a mobile app, to signs posted in public spaces informing about the presence of CCTV cameras.	&lt;a href="#_ftn17" name="_ftnref17"&gt;&lt;sup&gt;&lt;sup&gt;[17]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In order for the principle of notice and consent to work, the privacy notices need to be made available in a language understood by the user. As per 	estimates, about 840 million people (11% of the world population) can speak or understand English. However, most privacy notices online are not available 	in the local language in different regions.&lt;a href="#_ftn18" name="_ftnref18"&gt;&lt;sup&gt;&lt;sup&gt;[18]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Further, with the ubiquity of 	smartphones and advent of Internet of Things, constrained interfaces on mobile screens and wearables make the privacy notices extremely difficult to read. 	It must be remembered that privacy notices often run into several pages, and smaller screens effectively ensure that most users do not read through them. Further, connected wearable devices often have "little or no interfaces that readily permit choices."	&lt;a href="#_ftn19" name="_ftnref19"&gt;&lt;sup&gt;&lt;sup&gt;[19]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; As more and more devices are connected, this problem will only get more 	pronounced. Imagine in a world where refrigerators act as the intermediary disclosing information to your doctor or supermarket, at what point does the 	data subject step in and exercise consent.&lt;a href="#_ftn20" name="_ftnref20"&gt;&lt;sup&gt;&lt;sup&gt;[20]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Another aspect that needs to be understood is that unlike earlier when data collectors were far and few in between, the user could theoretically make a 	rational choice taking into account the purpose of data collection. However, in the world of Big Data, consent often needs to be provided while the user is 	trying to access services. In that context click through privacy notices such as those required to access online application, are treated simply as an 	impediment that must be crossed in order to get access to services. The fact that the consent need to be given in real time almost always results in 	disregarding what the privacy notices say.&lt;a href="#_ftn21" name="_ftnref21"&gt;&lt;sup&gt;&lt;sup&gt;[21]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Finally, some scholars have argued that while individual control over data may be appealing in theory, it merely gives an illusion of enhanced privacy but 	not the reality of meaningful choice.&lt;a href="#_ftn22" name="_ftnref22"&gt;&lt;sup&gt;&lt;sup&gt;[22]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Research demonstrates that the presence of 	the term 'privacy policy' leads people to the false assumption that if a company has a privacy policy in place, it automatically means presence of 	substantive and responsible limits on how data is handled.&lt;a href="#_ftn23" name="_ftnref23"&gt;&lt;sup&gt;&lt;sup&gt;[23]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Joseph Turow, the 	Robert Lewis Shayon Professor of Communication at the Annenberg School for Communication, and his team for example has demonstrated how "[w]hen consumers 	see the term 'privacy policy,' they believe that their personal information will be protected in specific ways; in particular, they assume that a website 	that advertises a privacy policy will not share their personal information."&lt;a href="#_ftn24" name="_ftnref24"&gt;&lt;sup&gt;&lt;sup&gt;[24]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; In 	reality, however, privacy policies are more likely to serve as liability disclaimers for companies than any kind of guarantee of privacy for consumers. 	Most people tend to ignore privacy policies.&lt;a href="#_ftn25" name="_ftnref25"&gt;&lt;sup&gt;&lt;sup&gt;[25]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Cass Sunstein states that our 	cognitive capacity to make choices and take decisions is limited. When faced with an overwhelming number of choices to make, most of us do not read privacy 	notices and resort to default options.&lt;a href="#_ftn26" name="_ftnref26"&gt;[26]&lt;/a&gt; The requirement to make choices, sometimes several times in a day, imposes significant burden on the consumers as well the business seeking such consent.	
&lt;a href="#_ftn27" name="_ftnref27"&gt;&lt;sup&gt;&lt;sup&gt;[27]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;2. &lt;/b&gt; &lt;b&gt;Failure to understand privacy notices&lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;FTC chairperson Edith Ramirez stated: "In my mind, the question is not whether consumers should be given a say over unexpected uses of their data; rather, 	the question is how to provide simplified notice and choice."&lt;a href="#_ftn28" name="_ftnref28"&gt;&lt;sup&gt;&lt;sup&gt;[28]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Privacy notices 	often come in the form of long legal documents much to the detriment of the readers' ability to understand them. These policies are "long, complicated, 	full of jargon and change frequently."&lt;a href="#_ftn29" name="_ftnref29"&gt;&lt;sup&gt;&lt;sup&gt;[29]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Kent walker list five problems that 	privacy notices typically suffer from - a) overkill - long and repetitive text in small print, b) irrelevance - describing situations of little concern to 	most consumers, c) opacity - broad terms the reflect the truth that is impossible to track and control all the information collected and stored, d) 	non-comparability - simplification required to achieve comparability will lead to compromising accuracy, and e) inflexibility - failure to keep pace with 	new business models.&lt;a href="#_ftn30" name="_ftnref30"&gt;&lt;sup&gt;&lt;sup&gt;[30]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Erik Sherman did a review of twenty three corporate privacy 	notices and mapped them against three indices which give approximate level of education necessary to understand text on a first read. His results show that most of policies can only be understood on the first read by people of a grade level of 15 or above.	&lt;a href="#_ftn31" name="_ftnref31"&gt;&lt;sup&gt;&lt;sup&gt;[31]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; FTC Chairperson Timothy Muris summed up the problem with long privacy notices when he said, "Acres of trees died to produce a blizzard of barely comprehensible privacy notices."	&lt;a href="#_ftn32" name="_ftnref32"&gt;&lt;sup&gt;&lt;sup&gt;[32]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Margaret Jane Radin, the former Henry King Ransom Professor of Law Emerita at the University of Michigan, provides a good definition of free consent. It 	"involves a knowing understanding of what one is doing in a context in which it is actually&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;possible for or to do otherwise, and an affirmative action in doing something, rather&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;than a merely passive acquiescence in accepting something."&lt;a href="#_ftn33" name="_ftnref33"&gt;&lt;sup&gt;&lt;sup&gt;[33]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; There have been various proposals advocating a more succinct and simpler standard for privacy notices,&lt;a href="#_ftn34" name="_ftnref34"&gt;&lt;sup&gt;&lt;sup&gt;[34]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; or multi-layered notices&lt;a href="#_ftn35" name="_ftnref35"&gt;&lt;sup&gt;&lt;sup&gt;[35]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; or representing the information in the form of a table.	&lt;a href="#_ftn36" name="_ftnref36"&gt;&lt;sup&gt;&lt;sup&gt;[36]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; However, studies show only an insignificant improvement in the understanding by consumers when privacy policies are represented in graphic formats like tables and labels.	&lt;a href="#_ftn37" name="_ftnref37"&gt;&lt;sup&gt;&lt;sup&gt;[37]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; It has also been pointed out that it is impossible to convey complex data 	policies in simple and clear language.&lt;a href="#_ftn38" name="_ftnref38"&gt;&lt;sup&gt;&lt;sup&gt;[38]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;3. &lt;/b&gt; &lt;b&gt;Failure to anticipate/comprehend the consequences of consent&lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Today's infinitely complex and labyrinthine data ecosystem is beyond the comprehension of most ordinary users. Despite a growing willingness to share 	information online, most have no understanding of what happens to their data once they have uploaded it - Where it goes? Whom it is held by? Under what 	conditions? For what purpose? Or how might it be used, aggregated, hacked, or leaked in the future? For the most part, the above operations are "invisible, 	managed at distant centers, from behind the scenes, by unmanned powers."&lt;a href="#_ftn39" name="_ftnref39"&gt;&lt;sup&gt;&lt;sup&gt;[39]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The perceived opportunities and benefits of Big Data have led to an acceptance of the indiscriminate collection of as much data as possible as well as the 	retention of that data for unspecified future analysis. For many advocates, such practices are absolutely essential if Big Data is to deliver on its 	promises.. Experts have argued that key privacy principles particularly those of collection limitation, data minimization and purpose limitation should not 	be applied to Big Data processing.&lt;a href="#_ftn40" name="_ftnref40"&gt;&lt;sup&gt;&lt;sup&gt;[40]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; As mentioned above, in the case of Big Data, 	the value of the data collected comes often not from its primary purpose but from its secondary uses. Deriving value from datasets involves amalgamating 	diverse datasets and executing speculative and exploratory kinds of analysis in order to discover hidden insights and correlations that might have 	previously gone unnoticed.&lt;a href="#_ftn41" name="_ftnref41"&gt;&lt;sup&gt;&lt;sup&gt;[41]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; As such organizations are today routinely reprocessing 	data collected from individuals for purposes not directly related to the services they provide to the customer. These secondary uses of data are becoming increasingly valuable sources of revenue for companies as the value of data in and of itself continues to rise.	&lt;a href="#_ftn42" name="_ftnref42"&gt;&lt;sup&gt;&lt;sup&gt;[42]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Purpose Limitation&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The principle of purpose limitation has served as a key component of data protection for decades. Purposes given for the processing of users' data should 	be given at the time of collection and consent and should be "specified, explicit and legitimate". In practice however, reasons given typically include phrases such as, 'for marketing purposes' or 'to improve the user experience' that are vague and open to interpretation.	&lt;a href="#_ftn43" name="_ftnref43"&gt;&lt;sup&gt;&lt;sup&gt;[43]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Some commentators whilst conceding the fact that purpose limitation in the era of Big Data may not be possible have instead attempted to emphasise the 	notion of 'compatible use' requirements. In the view of Working Party on the protection of individuals with regard to the processing of person data, for 	example, use of data for a purpose other than that originally stated at the point of collection should be subject to a case-by-case review of whether not 	further processing for different purpose is justifiable - i.e., compatible with the original purpose. Such a review may take into account for example, the 	context in which the data was originally collected, the nature or sensitivity of the data involved, and the existence of relevant safeguards to insure fair 	processing of the data and prevent undue harm to the data subject.&lt;a href="#_ftn44" name="_ftnref44"&gt;&lt;sup&gt;&lt;sup&gt;[44]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;On the other hand, Big Data advocates have argued that an assessment of legitimate interest rather than compatibility with the initial purpose is far 	better suited to Big Data processing.&lt;a href="#_ftn45" name="_ftnref45"&gt;&lt;sup&gt;&lt;sup&gt;[45]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; They argue that today the notion of purpose 	limitation has become outdated. Whereas previously data was collected largely as a by-product of the purpose for which it was being collected. If for 	example, we opted to use a service the information we provided was for the most part necessary to enable the provision of that service. Today however, the 	utility of data is no longer restricted to the primary purpose for which it is collected but can be used to provide all kinds of secondary services and 	resources, reduce waste, increase efficiency and improve decision-making.&lt;a href="#_ftn46" name="_ftnref46"&gt;&lt;sup&gt;&lt;sup&gt;[46]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; These 	kinds of positive externalities, Big Data advocates insist, are only made possible by the reprocessing of data.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Unfortunately for the notion of consent the nature of these secondary purposes are rarely evident at the time of collection. Instead the true value of the 	data can often only be revealed when it is amalgamated with other diverse datasets and subjected to various forms of analysis to help reveal hidden and 	non-obvious correlations and insights.&lt;a href="#_ftn47" name="_ftnref47"&gt;&lt;sup&gt;&lt;sup&gt;[47]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; The uncertain and speculative value of 	data therefore means that it is impossible to provide "specific, explicit, and legitimate" details about how a given data set will be used or how it might 	be aggregated in future. Without this crucial information data subjects have no basis upon which they can make an informed decision about whether or not to 	provide consent. Robert Sloan and Richard Warner argue that it is impossible for a privacy notice to contain enough information to enable free consent. 	They argue that current data collection practices are highly complex and that these practices involve collection of information at one stage for one purpose and then retain, analyze, and distribute it for a variety of other purposes in unpredictable ways.	&lt;a href="#_ftn48" name="_ftnref48"&gt;&lt;sup&gt;&lt;sup&gt;[48]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Helen Nissenbaum points to the ever changing nature of data flow and the 	cognitive challenges it poses. "Even if, for a given moment, a&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;snapshot of the information flows could be grasped, the realm is in constant flux, with new firms entering the picture, new analytics, and new back end contracts forged: in other words, we are dealing with a recursive capacity that is indefinitely extensible."	&lt;a href="#_ftn49" name="_ftnref49"&gt;&lt;sup&gt;&lt;sup&gt;[49]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Scale and Aggregation&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Today the quantity of data being generated is expanding at an exponential rate. From smartphones and televisions, trains and airplanes, sensor-equipped 	buildings and even the infrastructures of our cities, data now streams constantly from almost every sector and function of daily life, 'creating countless 	new digital puddles, lakes, tributaries and oceans of information'.&lt;a href="#_ftn50" name="_ftnref50"&gt;&lt;sup&gt;&lt;sup&gt;[50]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; In 2011 it 	was estimated that the quantity of data produced globally would surpass 1.8 zettabytes , by 2013 that had grown to 4 zettabytes , and with the nascent development of the Internet of Things gathering pace, these trends are set to continue.	&lt;a href="#_ftn51" name="_ftnref51"&gt;&lt;sup&gt;&lt;sup&gt;[51]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Big Data by its very nature requires the collection and processing of very large 	and very diverse data sets. Unlike other forms scientific research and analysis which utilize various sampling techniques to identify and target the types 	of data most useful to the research questions, Big Data instead seeks to gather as much data as possible, in order to achieve full resolution of the 	phenomenon being studied, a task made much easier in recent years as a result of the proliferation of internet enabled devices and the growth of the 	Internet of Things. This goal of attaining comprehensive coverage exists in tension however with the key privacy principles of collection limitation and data minimization which seek to limit both the quantity and variety of data collected about an individual to the absolute minimum.	&lt;a href="#_ftn52" name="_ftnref52"&gt;&lt;sup&gt;&lt;sup&gt;[52]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The dilution of the purpose limitation principle entails that even those who understand privacy notices and are capable of making rational choices about 	it, cannot conceptualize how their data will be aggregated and possibly used or re-used. Seemingly innocuous bits of data revealed at different stages 	could be combined to reveal sensitive information about the individual. Daniel Solove, the John Marshall Harlan Research Professor of Law at the George 	Washington University Law School, in his book, "The Digital Person", calls it the aggregation effect. He argues that the ingenuity of the data mining techniques and the insights and predictions that could be made by it render any cost-benefit analysis that an individual could make ineffectual.	&lt;a href="#_ftn53" name="_ftnref53"&gt;&lt;sup&gt;&lt;sup&gt;[53]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;4. &lt;/b&gt; &lt;b&gt;Failure to opt-out&lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The traditional choice against the collection of personal data that users have had access to, at least in theory, is the option to 'opt-out' of certain 	services. This draws from the free market theory that individuals exercise their free will when they use services and always have the option of opting out, 	thus, arguing against regulation but relying on the collective wisdom of the market to weed out harms. The notion that the provision of data should be a 	matter of personal choice on the part of the individual and that the individual can, if they chose decide to 'opt-out' of data collection, for example by 	ceasing use of a particular service, is an important component of privacy and data protection frameworks. The proliferation of internet-enabled devices, 	their integration into the built environment and the real-time nature of data collection and analysis however are beginning to undermine this concept. For 	many critics of Big Data, the ubiquity of data collection points as well as the compulsory provision of data as a prerequisite for the access and use of many key online services, is making opting-out of data collection not only impractical but in some cases impossible.	&lt;a href="#_ftn54" name="_ftnref54"&gt;&lt;sup&gt;&lt;sup&gt;[54]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Whilst sceptics may object that individuals are still free to stop using services that require data. As online connectivity becomes increasingly important to participation in modern life, the choice to withdraw completely is becoming less of a genuine choice.	&lt;a href="#_ftn55" name="_ftnref55"&gt;&lt;sup&gt;&lt;sup&gt;[55]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Information flows not only from the individuals it is about but also from what 	other people say about them. Financial transactions made online or via debit/credit cards can be analysed to derive further information about the 	individual. If opting-out makes you look anti-social, criminal, or unethical, the claims that we are exercising free will seems murky and leads one to 	wonder whether we are dealing with coercive technologies.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Another issue with the consent and opt-out paradigm is the binary nature of the choice. This binary nature of consent makes a mockery of the notion that 	consent can function as an effective tool of personal data management. What it effectively means is that one can either agree with the long privacy 	notices, or choose to abandon the desired service. "This binary choice is not what the privacy architects envisioned four decades ago when they imagined 	empowered individuals making informed decisions about the processing of their personal data. In practice, it certainly is not the optimal mechanism to ensure that either information privacy or the free flow of information is being protected."	&lt;a href="#_ftn56" name="_ftnref56"&gt;&lt;sup&gt;&lt;sup&gt;[56]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Conclusion: 'Notice and Consent' is counter-productive&lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;There continues to be an unwillingness amongst many privacy advocates to concede that the concept of consent is fundamentally broken, as Simon Davies, a 	privacy advocate based in London, comments 'to do so could be seen as giving ground to the data vultures', and risks further weakening an already 	dangerously fragile privacy framework.&lt;a href="#_ftn57" name="_ftnref57"&gt;&lt;sup&gt;&lt;sup&gt;[57]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Nevertheless, as we begin to transition 	into an era of ubiquitous data collection, evidence is becoming stronger that consent is not simply ineffective, but may in some instances might be 	counter-productive to the goals of privacy and data protection.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;As already noted, the notion that privacy agreements produce anything like truly informed consent has long since been discredited; given this fact, one may 	ask for whose benefit such agreements are created? One may justifiably argue that far from being for the benefit and protection of users, privacy agreement 	may in fact be fundamentally to the benefit of data brokers, who having gained the consent of users can act with near impunity in their use of the data 	collected. Thus, an overly narrow focus on the necessity of consent at the point of collection, risks diverting our attention from the arguably more important issue of how our data is stored, analysed and distributed by data brokers following its collection.	&lt;a href="#_ftn58" name="_ftnref58"&gt;&lt;sup&gt;&lt;sup&gt;[58]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Furthermore, given the often complicated and cumbersome processes involved in gathering consent from users, some have raised concerns that the mechanisms 	put in place to garner consent could themselves morph into surveillance mechanisms. Davies, for example cites the case of the EU Cookie Directive, which 	required websites to gain consent for the collection of cookies. Davies observes how, 'a proper audit and compliance element in the system could require 	the processing of even more data than the original unregulated web traffic. Even if it was possible for consumers to use some kind of gateway intermediary 	to manage the consent requests, the resulting data collection would be overwhelming''. Thus in many instances there exists a fundamental tension between the requirement placed on companies to gather consent and the equally important principle of data minimization.	&lt;a href="#_ftn59" name="_ftnref59"&gt;&lt;sup&gt;&lt;sup&gt;[59]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Given the above issues with notice and informed consent in the context of information privacy, and the fact that it is counterproductive to the larger 	goals of privacy law, it is important to revisit the principle or rights based approach to data protection, and consider a paradigm shift where one moves 	to a risk based approach that takes into account the actual threats of sharing data rather than relying on what has proved to be an ineffectual system of 	individual control. We will be dealing with some of these issues in a follow up to this article.&lt;/p&gt;
&lt;div style="text-align: justify; "&gt;
&lt;hr /&gt;
&lt;div id="ftn1"&gt;
&lt;p&gt;&lt;a href="#_ftnref1" name="_ftn1"&gt;&lt;sup&gt;&lt;sup&gt;[1]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Alan Westin, Privacy and Freedom, Atheneum, New York, 2015.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn2"&gt;
&lt;p&gt;&lt;a href="#_ftnref2" name="_ftn2"&gt;&lt;sup&gt;&lt;sup&gt;[2]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; FTC Fair Information Practice Principles (FIPP) available at			&lt;a href="https://www.it.cornell.edu/policies/infoprivacy/principles.cfm"&gt;https://www.it.cornell.edu/policies/infoprivacy/principles.cfm&lt;/a&gt;.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn3"&gt;
&lt;p&gt;&lt;a href="#_ftnref3" name="_ftn3"&gt;&lt;sup&gt;&lt;sup&gt;[3]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Paul M. Schwartz, "Privacy and Democracy in Cyberspace," 52 Vanderbilt Law Review 1607, 1614 (1999).&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn4"&gt;
&lt;p&gt;&lt;a href="#_ftnref4" name="_ftn4"&gt;&lt;sup&gt;&lt;sup&gt;[4]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; US Secretary's Advisory Committee on Automated Personal Data Systems, Records, Computers and the Rights of Citizens, available at			&lt;a href="http://www.justice.gov/opcl/docs/rec-com-rights.pdf"&gt;http://www.justice.gov/opcl/docs/rec-com-rights.pdf&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn5"&gt;
&lt;p&gt;&lt;a href="#_ftnref5" name="_ftn5"&gt;&lt;sup&gt;&lt;sup&gt;[5]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;a href="https://epic.org/privacy/ppsc1977report/c13.htm"&gt;https://epic.org/privacy/ppsc1977report/c13.htm&lt;/a&gt; .&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn6"&gt;
&lt;p&gt;&lt;a href="#_ftnref6" name="_ftn6"&gt;&lt;sup&gt;&lt;sup&gt;[6]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Marc Rotenberg, "Fair Information Practices and the Architecture of Privacy: What Larry Doesn't Get," available at 			&lt;a href="https://journals.law.stanford.edu/sites/default/files/stanford-technology-law-review/online/rotenberg-fair-info-practices.pdf"&gt; https://journals.law.stanford.edu/sites/default/files/stanford-technology-law-review/online/rotenberg-fair-info-practices.pdf &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn7"&gt;
&lt;p&gt;&lt;a href="#_ftnref7" name="_ftn7"&gt;&lt;sup&gt;&lt;sup&gt;[7]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Fred Cate, The Failure of Information Practice Principles, available at			&lt;a href="http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1156972"&gt;http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1156972&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn8"&gt;
&lt;p&gt;&lt;a href="#_ftnref8" name="_ftn8"&gt;&lt;sup&gt;&lt;sup&gt;[8]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Robert Sloan and Richard Warner, Beyong Notice and Choice: Privacy, Norms and Consent, 2014, available at 			&lt;a href="https://www.suffolk.edu/documents/jhtl_publications/SloanWarner.pdf"&gt; https://www.suffolk.edu/documents/jhtl_publications/SloanWarner.pdf &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn9"&gt;
&lt;p&gt;&lt;a href="#_ftnref9" name="_ftn9"&gt;&lt;sup&gt;&lt;sup&gt;[9]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Fred Cate, Viktor Schoenberger, Notice and Consent in a world of Big Data, available at			&lt;a href="http://idpl.oxfordjournals.org/content/3/2/67.abstract"&gt;http://idpl.oxfordjournals.org/content/3/2/67.abstract&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn10"&gt;
&lt;p&gt;&lt;a href="#_ftnref10" name="_ftn10"&gt;&lt;sup&gt;&lt;sup&gt;[10]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Daniel Solove, Privacy self-management and consent dilemma, 2013 available at 			&lt;a href="http://scholarship.law.gwu.edu/cgi/viewcontent.cgi?article=2093&amp;amp;context=faculty_publications"&gt; http://scholarship.law.gwu.edu/cgi/viewcontent.cgi?article=2093&amp;amp;context=faculty_publications &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn11"&gt;
&lt;p&gt;&lt;a href="#_ftnref11" name="_ftn11"&gt;&lt;sup&gt;&lt;sup&gt;[11]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Ben Campbell, Informed consent in developing countries: Myth or Reality, available at 			&lt;a href="https://www.dartmouth.edu/~ethics/docs/Campbell_informedconsent.pdf"&gt; https://www.dartmouth.edu/~ethics/docs/Campbell_informedconsent.pdf &lt;/a&gt; ;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn12"&gt;
&lt;p&gt;&lt;a href="#_ftnref12" name="_ftn12"&gt;&lt;sup&gt;&lt;sup&gt;[12]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;i&gt;Supra&lt;/i&gt; Note 7.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn13"&gt;
&lt;p&gt;&lt;a href="#_ftnref13" name="_ftn13"&gt;&lt;sup&gt;&lt;sup&gt;[13]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Viktor Mayer Schoenberger and Kenneth Cukier, Big Data: A Revolution that will transform how we live, work and think" John Murray, London, 2013 at 			153.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn14"&gt;
&lt;p&gt;&lt;a href="#_ftnref14" name="_ftn14"&gt;&lt;sup&gt;&lt;sup&gt;[14]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; The Data Minimization principle requires organizations to limit the collection of personal data to the minimum extent necessary to obtain their 			legitimate purpose and to delete data no longer required.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn15"&gt;
&lt;p&gt;&lt;a href="#_ftnref15" name="_ftn15"&gt;&lt;sup&gt;&lt;sup&gt;[15]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Omer Tene and Jules Polonetsky, "Big Data for All: Privacy and User Control in the Age of Analytics," SSRN Scholarly Paper, available at			&lt;a href="http://papers.ssrn.com/abstract=2149364"&gt;http://papers.ssrn.com/abstract=2149364&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn16"&gt;
&lt;p&gt;&lt;a href="#_ftnref16" name="_ftn16"&gt;&lt;sup&gt;&lt;sup&gt;[16]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Florian Schaub, R. Balebako et al, "A Design Space for effective privacy notices" available at 			&lt;a href="https://www.usenix.org/system/files/conference/soups2015/soups15-paper-schaub.pdf"&gt; https://www.usenix.org/system/files/conference/soups2015/soups15-paper-schaub.pdf &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn17"&gt;
&lt;p&gt;&lt;a href="#_ftnref17" name="_ftn17"&gt;&lt;sup&gt;&lt;sup&gt;[17]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Daniel Solove, The Digital Person: Technology and Privacy in the Information Age, NYU Press, 2006.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn18"&gt;
&lt;p&gt;&lt;a href="#_ftnref18" name="_ftn18"&gt;&lt;sup&gt;&lt;sup&gt;[18]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;a href="http://www.ethnologue.com/statistics/size"&gt;http://www.ethnologue.com/statistics/size&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn19"&gt;
&lt;p&gt;&lt;a href="#_ftnref19" name="_ftn19"&gt;&lt;sup&gt;&lt;sup&gt;[19]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Opening Remarks of FTC Chairperson Edith Ramirez Privacy and the IoT: Navigating Policy Issues International Consumer Electronics Show Las Vegas, 			Nevada January 6, 2015 available at 			&lt;a href="https://www.ftc.gov/system/files/documents/public_statements/617191/150106cesspeech.pdf"&gt; https://www.ftc.gov/system/files/documents/public_statements/617191/150106cesspeech.pdf &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn20"&gt;
&lt;p&gt;&lt;a href="#_ftnref20" name="_ftn20"&gt;&lt;sup&gt;&lt;sup&gt;[20]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;a href="http://www.privacysurgeon.org/blog/incision/why-the-idea-of-consent-for-data-processing-is-becoming-meaningless-and-dangerous/"&gt; http://www.privacysurgeon.org/blog/incision/why-the-idea-of-consent-for-data-processing-is-becoming-meaningless-and-dangerous/ &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn21"&gt;
&lt;p&gt;&lt;a href="#_ftnref21" name="_ftn21"&gt;&lt;sup&gt;&lt;sup&gt;[21]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;i&gt;Supra&lt;/i&gt; Note 10.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn22"&gt;
&lt;p&gt;&lt;a href="#_ftnref22" name="_ftn22"&gt;&lt;sup&gt;&lt;sup&gt;[22]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;i&gt;Supra&lt;/i&gt; Note 7.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn23"&gt;
&lt;p&gt;&lt;a href="#_ftnref23" name="_ftn23"&gt;&lt;sup&gt;&lt;sup&gt;[23]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Chris Jay Hoofnagle &amp;amp; Jennifer King, Research Report: What Californians Understand&lt;/p&gt;
&lt;p&gt;About Privacy Online, available at &lt;a href="http://ssrn.com/abstract=1262130"&gt;http://ssrn.com/abstract=1262130&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn24"&gt;
&lt;p&gt;&lt;a href="#_ftnref24" name="_ftn24"&gt;&lt;sup&gt;&lt;sup&gt;[24]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Joseph Turrow, Michael Hennesy, Nora Draper, The Tradeoff Fallacy, available at 			&lt;a href="https://www.asc.upenn.edu/sites/default/files/TradeoffFallacy_1.pdf"&gt; https://www.asc.upenn.edu/sites/default/files/TradeoffFallacy_1.pdf &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn25"&gt;
&lt;p&gt;&lt;a href="#_ftnref25" name="_ftn25"&gt;&lt;sup&gt;&lt;sup&gt;[25]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Saul Hansell, "Compressed Data: The Big Yahoo Privacy Storm That Wasn't," New York Times, May 13, 2002 available at 			&lt;a href="http://www.nytimes.com/2002/05/13/business/compressed-data-the-big-yahoo-privacy-storm-that-wasn-t.html?_r=0"&gt; http://www.nytimes.com/2002/05/13/business/compressed-data-the-big-yahoo-privacy-storm-that-wasn-t.html?_r=0 &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn26"&gt;
&lt;p&gt;&lt;a href="#_ftnref26" name="_ftn26"&gt;[26]&lt;/a&gt; Cass Sunstein, Choosing not to choose: Understanding the Value of Choice, Oxford University Press, 2015.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn27"&gt;
&lt;p&gt;&lt;a href="#_ftnref27" name="_ftn27"&gt;&lt;sup&gt;&lt;sup&gt;[27]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; For example, Acxiom, processes more than 50 trillion data transactions a year. 			&lt;a href="http://www.nytimes.com/2012/06/17/technology/acxiom-the-quiet-giant-of-consumer-database-marketing.html?pagewanted=all&amp;amp;_r=0"&gt; http://www.nytimes.com/2012/06/17/technology/acxiom-the-quiet-giant-of-consumer-database-marketing.html?pagewanted=all&amp;amp;_r=0 &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn28"&gt;
&lt;p&gt;&lt;a href="#_ftnref28" name="_ftn28"&gt;&lt;sup&gt;&lt;sup&gt;[28]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Opening Remarks of FTC Chairperson Edith Ramirez Privacy and the IoT: Navigating Policy Issues International Consumer Electronics Show Las Vegas, 			Nevada January 6, 2015 available at 			&lt;a href="https://www.ftc.gov/system/files/documents/public_statements/617191/150106cesspeech.pdf"&gt; https://www.ftc.gov/system/files/documents/public_statements/617191/150106cesspeech.pdf &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn29"&gt;
&lt;p&gt;&lt;a href="#_ftnref29" name="_ftn29"&gt;&lt;sup&gt;&lt;sup&gt;[29]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; L. F. Cranor. Necessary but not sufficient: Standardized mechanisms for privacy notice and choice. Journal on Telecommunications and High Technology Law, 10:273, 2012, available at			&lt;a href="http://jthtl.org/content/articles/V10I2/JTHTLv10i2_Cranor.PDF"&gt;http://jthtl.org/content/articles/V10I2/JTHTLv10i2_Cranor.PDF&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn30"&gt;
&lt;p&gt;&lt;a href="#_ftnref30" name="_ftn30"&gt;&lt;sup&gt;&lt;sup&gt;[30]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Kent Walker, The Costs of Privacy, 2001 available at 			&lt;a href="https://www.questia.com/library/journal/1G1-84436409/the-costs-of-privacy"&gt; https://www.questia.com/library/journal/1G1-84436409/the-costs-of-privacy &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn31"&gt;
&lt;p&gt;&lt;a href="#_ftnref31" name="_ftn31"&gt;&lt;sup&gt;&lt;sup&gt;[31]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Erik Sherman, "Privacy Policies are great - for Phds", CBS News, available at			&lt;a href="http://www.cbsnews.com/news/privacy-policies-are-great-for-phds/"&gt;http://www.cbsnews.com/news/privacy-policies-are-great-for-phds/&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn32"&gt;
&lt;p&gt;&lt;a href="#_ftnref32" name="_ftn32"&gt;&lt;sup&gt;&lt;sup&gt;[32]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Timothy J. Muris, Protecting Consumers' Privacy: 2002 and Beyond, available at			&lt;a href="http://www.ftc.gov/speeches/muris/privisp1002.htm"&gt;http://www.ftc.gov/speeches/muris/privisp1002.htm&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn33"&gt;
&lt;p&gt;&lt;a href="#_ftnref33" name="_ftn33"&gt;&lt;sup&gt;&lt;sup&gt;[33]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Margaret Jane Radin, Humans, Computers, and Binding Commitment, 1999 available at			&lt;a href="http://www.repository.law.indiana.edu/ilj/vol75/iss4/1/"&gt;http://www.repository.law.indiana.edu/ilj/vol75/iss4/1/&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn34"&gt;
&lt;p&gt;&lt;a href="#_ftnref34" name="_ftn34"&gt;&lt;sup&gt;&lt;sup&gt;[34]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Annie I. Anton et al., Financial Privacy Policies and the Need for Standardization, 2004 available at			&lt;a href="https://ssl.lu.usi.ch/entityws/Allegati/pdf_pub1430.pdf"&gt;https://ssl.lu.usi.ch/entityws/Allegati/pdf_pub1430.pdf&lt;/a&gt;; Florian Schaub, R. 			Balebako et al, "A Design Space for effective privacy notices" available at 			&lt;a href="https://www.usenix.org/system/files/conference/soups2015/soups15-paper-schaub.pdf"&gt; https://www.usenix.org/system/files/conference/soups2015/soups15-paper-schaub.pdf &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn35"&gt;
&lt;p&gt;&lt;a href="#_ftnref35" name="_ftn35"&gt;&lt;sup&gt;&lt;sup&gt;[35]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; The Center for Information Policy Leadership, Hunton &amp;amp; Williams LLP, "Ten Steps To Develop A Multi-Layered Privacy Notice" available at 			&lt;a href="https://www.informationpolicycentre.com/files/Uploads/Documents/Centre/Ten_Steps_whitepaper.pdf"&gt; https://www.informationpolicycentre.com/files/Uploads/Documents/Centre/Ten_Steps_whitepaper.pdf &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn36"&gt;
&lt;p&gt;&lt;a href="#_ftnref36" name="_ftn36"&gt;&lt;sup&gt;&lt;sup&gt;[36]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Allen Levy and Manoj Hastak, Consumer Comprehension of Financial Privacy Notices, Interagency Notice Project, available at			&lt;a href="https://www.sec.gov/comments/s7-09-07/s70907-21-levy.pdf"&gt;https://www.sec.gov/comments/s7-09-07/s70907-21-levy.pdf&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn37"&gt;
&lt;p&gt;&lt;a href="#_ftnref37" name="_ftn37"&gt;&lt;sup&gt;&lt;sup&gt;[37]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Patrick Gage Kelly et al., Standardizing Privacy Notices: An Online Study of the Nutrition Label Approach available at 			&lt;a href="https://www.ftc.gov/sites/default/files/documents/public_comments/privacy-roundtables-comment-project-no.p095416-544506-00037/544506-00037.pdf"&gt; https://www.ftc.gov/sites/default/files/documents/public_comments/privacy-roundtables-comment-project-no.p095416-544506-00037/544506-00037.pdf &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn38"&gt;
&lt;p&gt;&lt;a href="#_ftnref38" name="_ftn38"&gt;&lt;sup&gt;&lt;sup&gt;[38]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Howard Latin, "Good" Warnings, Bad Products, and Cognitive Limitations, 41 UCLA Law Review available at 			&lt;a href="https://litigation-essentials.lexisnexis.com/webcd/app?action=DocumentDisplay&amp;amp;crawlid=1&amp;amp;srctype=smi&amp;amp;srcid=3B15&amp;amp;doctype=cite&amp;amp;docid=41+UCLA+L.+Rev.+1193&amp;amp;key=1c15e064a97759f3f03fb51db62a79a5"&gt; https://litigation-essentials.lexisnexis.com/webcd/app?action=DocumentDisplay&amp;amp;crawlid=1&amp;amp;srctype=smi&amp;amp;srcid=3B15&amp;amp;doctype=cite&amp;amp;docid=41+UCLA+L.+Rev.+1193&amp;amp;key=1c15e064a97759f3f03fb51db62a79a5 &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn39"&gt;
&lt;p&gt;&lt;a href="#_ftnref39" name="_ftn39"&gt;&lt;sup&gt;&lt;sup&gt;[39]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Jonathan Obar, Big Data and the Phantom Public: Walter Lippmann and the fallacy of data privacy self management, Big Data and Society, 2015, available at&lt;a href="http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2239188"&gt; &lt;/a&gt; &lt;a href="http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2239188"&gt;http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2239188&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn40"&gt;
&lt;p&gt;&lt;a href="#_ftnref40" name="_ftn40"&gt;&lt;sup&gt;&lt;sup&gt;[40]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Viktor Mayer Schoenberger and Kenneth Cukier, Big Data: A Revolution that will transform how we live, work and think" John Murray, London, 2013.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn41"&gt;
&lt;p&gt;&lt;a href="#_ftnref41" name="_ftn41"&gt;&lt;sup&gt;&lt;sup&gt;[41]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;i&gt;Supra&lt;/i&gt; Note 15.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn42"&gt;
&lt;p&gt;&lt;a href="#_ftnref42" name="_ftn42"&gt;&lt;sup&gt;&lt;sup&gt;[42]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;i&gt;Supra&lt;/i&gt; Note 40.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn43"&gt;
&lt;p&gt;&lt;a href="#_ftnref43" name="_ftn43"&gt;&lt;sup&gt;&lt;sup&gt;[43]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Article 29 Working Party, (2013) Opinion 03/2013 on Purpose Limitation, Article 29, available at: 			&lt;a href="http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2013/wp203_en.pdf"&gt; http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2013/wp203_en.pdf &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn44"&gt;
&lt;p&gt;&lt;a href="#_ftnref44" name="_ftn44"&gt;&lt;sup&gt;&lt;sup&gt;[44]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Ibid.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn45"&gt;
&lt;p&gt;&lt;a href="#_ftnref45" name="_ftn45"&gt;&lt;sup&gt;&lt;sup&gt;[45]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; It remains unclear however whose interest would be accounted, existing EU legislation would allow commercial/data broker/third party interests to 			trump those of the user, effectively allowing re-processing of personal data irrespective of whether that processing would be in the interest of 			the user.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn46"&gt;
&lt;p&gt;&lt;a href="#_ftnref46" name="_ftn46"&gt;&lt;sup&gt;&lt;sup&gt;[46]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;i&gt;Supra&lt;/i&gt; Note 40.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn47"&gt;
&lt;p&gt;&lt;a href="#_ftnref47" name="_ftn47"&gt;&lt;sup&gt;&lt;sup&gt;[47]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;i&gt;Supra&lt;/i&gt; Note 10.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn48"&gt;
&lt;p&gt;&lt;a href="#_ftnref48" name="_ftn48"&gt;&lt;sup&gt;&lt;sup&gt;[48]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Robert Sloan and Richard Warner, Beyong Notice and Choice: Privacy, Norms and Consent, 2014, available at 			&lt;a href="https://www.suffolk.edu/documents/jhtl_publications/SloanWarner.pdf"&gt; https://www.suffolk.edu/documents/jhtl_publications/SloanWarner.pdf &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn49"&gt;
&lt;p&gt;&lt;a href="#_ftnref49" name="_ftn49"&gt;&lt;sup&gt;&lt;sup&gt;[49]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Helen Nissenbaum, A Contextual Approach to Privacy Online, available at			&lt;a href="http://www.amacad.org/publications/daedalus/11_fall_nissenbaum.pdf"&gt;http://www.amacad.org/publications/daedalus/11_fall_nissenbaum.pdf&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn50"&gt;
&lt;p&gt;&lt;a href="#_ftnref50" name="_ftn50"&gt;&lt;sup&gt;&lt;sup&gt;[50]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; D Bollier, The Promise and Peril of Big Data. The Aspen Institute, 2010, available at: 			&lt;a href="http://www.aspeninstitute.org/sites/default/files/content/docs/pubs/The_Promise_and_Peril_of_Big_Data.pdf"&gt; http://www.aspeninstitute.org/sites/default/files/content/docs/pubs/The_Promise_and_Peril_of_Big_Data.pdf &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn51"&gt;
&lt;p&gt;&lt;a href="#_ftnref51" name="_ftn51"&gt;&lt;sup&gt;&lt;sup&gt;[51]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Meeker, M. &amp;amp; Yu, L. Internet Trends, Kleiner Perkins Caulfield Byers, (2013),			&lt;a href="http://www.slideshare.net/kleinerperkins/kpcb-internet-trends-2013"&gt;http://www.slideshare.net/kleinerperkins/kpcb-internet-trends-2013&lt;/a&gt; .&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn52"&gt;
&lt;p&gt;&lt;a href="#_ftnref52" name="_ftn52"&gt;&lt;sup&gt;&lt;sup&gt;[52]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;i&gt;Supra&lt;/i&gt; Note 40.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn53"&gt;
&lt;p&gt;&lt;a href="#_ftnref53" name="_ftn53"&gt;&lt;sup&gt;&lt;sup&gt;[53]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;i&gt;Supra&lt;/i&gt; Note 17.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn54"&gt;
&lt;p&gt;&lt;a href="#_ftnref54" name="_ftn54"&gt;&lt;sup&gt;&lt;sup&gt;[54]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Janet Vertasi, My Experiment Opting Out of Big Data Made Me Look Like a Criminal, 2014, available at			&lt;a href="http://time.com/83200/privacy-internet-big-data-opt-out/"&gt;http://time.com/83200/privacy-internet-big-data-opt-out/&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn55"&gt;
&lt;p&gt;&lt;a href="#_ftnref55" name="_ftn55"&gt;&lt;sup&gt;&lt;sup&gt;[55]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Ibid.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn56"&gt;
&lt;p&gt;&lt;a href="#_ftnref56" name="_ftn56"&gt;&lt;sup&gt;&lt;sup&gt;[56]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;a href="http://www.techpolicy.com/NoticeConsent-inWorldBigData.aspx"&gt;http://www.techpolicy.com/NoticeConsent-inWorldBigData.aspx&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn57"&gt;
&lt;p&gt;&lt;a href="#_ftnref57" name="_ftn57"&gt;&lt;sup&gt;&lt;sup&gt;[57]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Simon Davies, Why the idea of consent for data processing is becoming meaningless and dangerous, available at 			&lt;a href="http://www.privacysurgeon.org/blog/incision/why-the-idea-of-consent-for-data-processing-is-becoming-meaningless-and-dangerous/"&gt; http://www.privacysurgeon.org/blog/incision/why-the-idea-of-consent-for-data-processing-is-becoming-meaningless-and-dangerous/ &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn58"&gt;
&lt;p&gt;&lt;a href="#_ftnref58" name="_ftn58"&gt;&lt;sup&gt;&lt;sup&gt;[58]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;i&gt;Supra&lt;/i&gt; Note 10.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn59"&gt;
&lt;p&gt;&lt;a href="#_ftnref59" name="_ftn59"&gt;&lt;sup&gt;&lt;sup&gt;[59]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Simon Davies, Why the idea of consent for data processing is becoming meaningless and dangerous, available at 			&lt;a href="http://www.privacysurgeon.org/blog/incision/why-the-idea-of-consent-for-data-processing-is-becoming-meaningless-and-dangerous/"&gt; http://www.privacysurgeon.org/blog/incision/why-the-idea-of-consent-for-data-processing-is-becoming-meaningless-and-dangerous/ &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;/div&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/a-critique-of-consent-in-information-privacy'&gt;https://cis-india.org/internet-governance/blog/a-critique-of-consent-in-information-privacy&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Amber Sinha and Scott Mason</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2016-01-18T02:20:10Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/human-rights-in-the-age-of-digital-technology-a-conference-to-discuss-the-evolution-of-privacy-and-surveillance">
    <title>Human Rights in the Age of Digital Technology: A Conference to Discuss the Evolution of Privacy and Surveillance</title>
    <link>https://cis-india.org/internet-governance/blog/human-rights-in-the-age-of-digital-technology-a-conference-to-discuss-the-evolution-of-privacy-and-surveillance</link>
    <description>
        &lt;b&gt;The Centre for Internet and Society organised a conference in roundtable format called ‘Human Rights in the Age of Digital Technology: A Conference to Discuss the Evolution of Privacy and Surveillance’. The conference was held at the India Habitat Centre on October 30, 2015, and was designed to be a forum for discussion, knowledge exchange and agenda building, to draw a shared road map for the coming months.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;In India, the Right to Privacy has been interpreted to mean an individual's’ right to be left alone. In the age of massive use of Information and Communications Technology, it has become imperative to have this right protected. The Supreme Court has held in a number of its decisions that the right to privacy is implicit in the fundamental right to life and personal liberty under Article 21 of the Indian Constitution, though Part III does not explicitly mention this right. The Supreme Court has identified the right to privacy most often in the context of state surveillance and introduced the standards of compelling state interest, targetted surveillance and oversight mechanism which have been incorporated in the forms of rules under the Indian Telegraph Act, 1885.  Of late, privacy concerns have gained importance in India due to the initiation of national programmes like the UID Scheme, DNA Profiling, the National Encryption Policy, etc. attracting criticism for their impact on the right to privacy. To add to the growing concerns, the Attorney General, Mukul Rohatgi argued in the ongoing Aadhaar case that the judicial position on whether the right to privacy is a fundamental right is unclear and has questioned the entire body of jurisprudence on right to privacy in the last few decades.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Participation&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The roundtable saw participation from various civil society organisation such as Centre for Communication Governance, The Internet Democracy Project, as well as individual researchers like Dr. Usha Ramanathan and Colonel Mathew.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Introductions&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Vipul Kharbanda, Consultant, CIS made the introductions and laid down the agenda for the day. Vipul presented a brief overview of the kind of work of CIS is engaged in around privacy and surveillance, in areas including among others, the Human DNA Profiling Bill, 2014, the Aadhaar Project, the Privacy Bill and surveillance laws in India. It was also highlighted that CIS was engaged in work in the field of Big Data in light of the growing voices wanting to use Big Data in the Smart Cities projects, etc and one of the questions was to analyse whether the 9 Privacy Principles would still be valid in a Big Data and IoT paradigm.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;The Aadhaar Case&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Dr. Usha Ramanathan began by calling the Aadhaar project an identification project as opposed to an identity project. She brought up various aspects of project ranging from the myth of voluntariness, the strong and often misleading marketing that has driven the project, the lack of mandate to collect biometric data and the problems with the technology itself. She highlighted  inconsistencies, irrationalities and lack of process that has characterised the Aadhaar project since its inception. A common theme that she identified in how the project has been run was the element of ad-hoc-ness about many important decisions taken on a national scale and migrating from existing systems to the Aadhaar framework. She particularly highlighted the fact that as civil society actors trying to make sense of the project, an acute problem faced was the lack of credible information available. In that respect, she termed it as ‘powerpoint-driven project’ with a focus on information collection but little information available about the project itself. Another issue that Dr. Ramanathan brought up was that the lack of concern that had been exhibited by most people in sharing their biometric information without being aware of what it would be used, was in some ways symptomatic of they way we had begun to interact with technology and willingly giving information about ourselves, with little thought. Dr Ramanathan’s presentation detailed the response to the project from various quarters in the form of petitions in different high courts in India, how the cases were received by the courts and the contradictory response from the government at various stages. Alongside, she also sought to place the Aadhaar case in the context of various debates and issues, like its conflict with the National Population Register, exclusion, issues around ownership of data collected, national security implications and impact on privacy and surveillance. 
Aside from the above issues, Dr. Ramanathan also posited that the kind of flat idea of identity envisaged by projects like Aadhaar is problematic in that it adversely impacts how people can live, act and define themselves. In summation, she termed the behavior of the government as irresponsible for the manner in which it has changed its stand on issues to suit the expediency of the moment, and was particularly severe on the Attorney General raising questions about the existence of a fundamental right to privacy and casually putting in peril jurisprudence on civil liberties that has evolved over decades.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Colonel Mathew concurred with Dr. Ramanathan that the Aadhaar Project was not about identity but about identification. Prasanna developed on this further saying that while identity was a right unto the individual, identification was something done to you by others. Colonel Mathew further presented a brief history of the Aadhaar case, and how the significant developments over the last few years have played out in the courts. One of the important questions that Colonel Mathew addressed was the claim of uniqueness made by the UID project. He pointed to research conducted by Hans Varghese Mathew which analysed the data on biometric collection and processing released by the UID and demonstrated that there was a clear probability of a duplication in 1 out of every 97 enrolments. He also questioned the oft-repeated claim that UID would give identification to those without it and allow them to access welfare schemes. In this context, he pointed at the failures of the introducer system and the fact that only 0.03% of those registered have been enrolled through the introducer system. Colonel Mathew also questioned the change in stance by the ruling party, BJP which had earlier declared that the UID project should be scrapped as it was a threat to national security. According to him, the prime mover of the scheme were corporate interests outside the country interested in the data to be collected. This, he claimed created very serious risks to the national security. Prasanna further added to this point stating that while, on the face of it, some of the claims of threats to national security may sound alarmist in nature, if one were to critically study the manner in which the data had collected for this project, the concerns appeared justified.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;The Draft Encryption Policy&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Amber Sinha, Policy Officer at CIS, made a presentation on the brief appearance of the Draft Encryption Policy which was released in October this year, and withdrawn by the government within a day. Amber provided an overview of the policy emphasising on clauses around limitations on kind of encryption algorithms and key sizes individuals and organisations could use and the ill-advised procedures that needed to be followed. After the presentation, the topic was opened for discussion. The initial part of the discussion was focussed on specific clauses that threatened privacy and could serve the ends of enabling greater surveillance of the electronic communications of individuals and organisations, most notably having an exhaustive list of encryption algorithms, and the requirement to keep all encrypted communication in plain text format for a period of 90 days. We also attempted to locate the draft policy in the context of privacy debates in India as well as the global response to encryption. Amber emphasised that while mandating minimum standards of encryption for communication between government agencies may be a honorable motive, as it is concerned with matters of national security, however when this is extended to private parties and involved imposes upward thresholds on the kinds of encryption they can use, it stems from the motive of surveillance. Nayantara, of The Internet Democracy Project, pointed out that there had been global push back against encryption by governments in various countries like US, Russia, China, Pakistan, Israel, UK, Tunisia and Morocco. In India also, the IT Act places limits on encryption. Her points stands further buttressed by the calls against encryption in the aftermath of the terrorist attacks in Paris last month.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;It also intended to have a session on the Human DNA Profiling Bill led by Dr. Menaka Guruswamy. However, due to certain issues in scheduling and paucity of time, we were not able to have the session.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Questions Raised&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;On Aadhaar, some of the questions raised included the question of  applicability of the Section 43A, IT Act rules to the private parties involved in the process. The issue of whether Aadhaar can be tool against corruption was raised by Vipul. However, Colonel Mathew demonstrated through his research that issues like corruption in the TPDS system and MNREGA which Aadhaar is supposed to solve, are not effectively addressed by it but that there were simpler solutions to these problems. &lt;br /&gt;&lt;br /&gt;Ranjit raised questions about the different contexts of privacy, and referred to the work of Helen Nissenbaum. He spoke about the history of freely providing biometric information in India, initially for property documents and how it has gradually been used for surveillance. He argued has due to this tradition, many people in India do not view sharing of biometric information as infringing on their privacy. Dipesh Jain, student at Jindal Global Law School pointed to challenges like how individual privacy is perceived in India, its various contexts, and people resorting to the oft-quoted dictum of ‘why do you want privacy if you have nothing to hide’. In the context, it is pertinent to mention the response of Edward Snowden to this question who said, “Arguing that you don't care about the right to privacy because you have nothing to hide is no different than saying you don't care about free speech because you have nothing to say.” Aakash Solanki, researcher &lt;br /&gt;&lt;br /&gt;Vipul and Amber also touched upon the new challenges that are upon us in a world of Big Data where traditional ways to ensure data protection through data minimisation principle and the methods like anonymisation may not work. 
With advances in computer science and mathematics threatening to re-identify anonymized datasets, and more and more reliances of secondary uses of data coupled with the inadequacy of the idea of informed consent, a significant paradigm shift may be required in how we view privacy laws. &lt;br /&gt;&lt;br /&gt;A number of action items going forward were also discussed, where different individuals volunteered to lead research on issues like the UBCC set up by the UIDAI, GSTN, the first national data utility, looking the recourses available to individual where his data is held by parties outside India’s jurisdiction.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/human-rights-in-the-age-of-digital-technology-a-conference-to-discuss-the-evolution-of-privacy-and-surveillance'&gt;https://cis-india.org/internet-governance/blog/human-rights-in-the-age-of-digital-technology-a-conference-to-discuss-the-evolution-of-privacy-and-surveillance&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Amber Sinha</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2016-01-11T02:12:49Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/livemint-moulishree-srivastava-january-5-2016-nasscom-against-differential-pricing-for-data-services">
    <title>Nasscom against differential pricing for data services</title>
    <link>https://cis-india.org/internet-governance/news/livemint-moulishree-srivastava-january-5-2016-nasscom-against-differential-pricing-for-data-services</link>
    <description>
        &lt;b&gt;The National Association of Software and Services Companies says it should be the regulator that decides on such content, not firms.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article by Moulishree Srivastava was &lt;a class="external-link" href="http://www.livemint.com/Consumer/j1P4yZ3brS4Ttk6kUqy1QJ/Nasscom-against-differential-pricing-for-data-services.html"&gt;published in Livemint &lt;/a&gt;on January 5, 2016. Pranesh Prakash gave inputs.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;India’s top software lobby on Monday said if select web content needs  to be provided cheaper for some Indians, it must be the regulator that  decides on such content, not companies.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In its response to a consultation paper by the Telecom Regulatory  Authority of India (Trai) on differential pricing for data usage, the  National Association of Software and Services Companies (Nasscom)  objected to plans such as Free Basics and Airtel Zero where companies  choose content to be provided at different speeds and prices, but backed  powers for the regulator to allow such a model if the regulator deems  they are in “public interest”, while adhering to principles of net  neutrality.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;“We strongly oppose any model where telecom service providers (TSPs)  or their partners have a say or discretion in choosing content that is  made available at favourable rates, speed... any differential pricing by  TSP either directly such as Airtel Zero or indirectly as in the case of  Free Basics through a platform provider which limits access to the  internet services or websites (selected by the TSP or by the partners)  violate the idea of net neutrality,” said R. Chandrashekhar, president,  Nasscom.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;“But when we recognize the reality of India as a country which has  low internet penetration and even lower broadband penetration, apart  from low levels of digital literacy and limited local language  content... there may be a need to provide certain services in public  interest at differential or lower prices which the regulator feels are  necessary,” he said.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;“Therefore, it is important that the regulator should have the power  to allow differential pricing for certain types or classes of services  that are deemed to be in public interest and based on mandatory prior  approvals,” he said. “Any such programmes should abide by the principles  of net neutrality and not constrain innovation in any way and not  constrain innovation in any way.”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Differential pricing for data usage means offering services at  different price points to different users. However, analysts say it  could lead to an anti-competitive environment, hurting small companies  and start-ups, while giving the TSPs and their partner platforms  near-monopolistic access to the vast amount of user data that has  potential commercial value in a country such as India where privacy laws  are not strong.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Differential pricing is a significant aspect of the net neutrality  debate that erupted in India in 2015, when Trai released a consultation  paper in April. Soon, telecom operator Bharti Airtel Ltd launched Zero, a  marketing platform that allows customers to access mobile applications  for free but charges the application providers.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Facebook’s Free Basics service (the new name for Internet.org) aims  to offer people without the Internet free access to a handful of  websites and a range of services through mobile phones, which net  neutrality activists say will violate the core principle that everyone  should have unrestricted access to Internet and it should not be  regulated by a company.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Following the outrage, Trai put Free Basics on hold, asking Reliance  Communications Ltd to furnish the detailed terms and conditions of its  Free Basics service. The next step will be announced later this month.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In an op-ed in the &lt;i&gt;Times of India&lt;/i&gt; last week, Nandan Nilekani,  co-founder of Infosys Ltd. and former chairman of Unique Identification  Authority of India, publicly criticized Facebook’s Free Basics, calling  it a walled garden.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;“The walled garden of Free Basics goes against the spirit of openness  on the internet, and in the guise of being pro-poor, balkanises it.  Only Free Basics-approved websites will be accessible for free,” he said  in the article which he co-authored with Viral Shah who led the design  of government’s subsidy platforms using Aadhaar. “In theory, anyone  meeting the technical guidelines today can participate. However,  services that may potentially compete with telco offerings may not join  Free Basics. Since Facebook does not currently subsidise free usage,  telcos will have to foot the bill by raising prices.”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;He said schemes such as direct benefit transfer for Internet data  packs would be better compared to programmes such as Free Basics.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Nasscom, in its response, recommended “mandatory prior approval of  such services by the regulator and sharing of periodic information on  tariff plans seek to lower the price as well as zero rating services,”  adding that these programmes should abide by the principle of net  neutrality, meaning it should not limit consumers access to pre-defined  set of services or websites.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;“Any such differential pricing programs should have explicit approval  of the regulator—and should be deemed to be in the public interest and  the onus of proving it to be in the public interest in the first  instance would be on service provider and before Trai arrives at a final  decision a public consultation is also advised because of the dangers  involved,” Nasscom said. “Even after the approval, suitable oversight  mechanism should be maintained by the regulator in all such case.”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Pranesh Prakash, policy director at the Centre for Internet and  Society (CIS), said Nasscom’s approach to make differential pricing  plans and options as an exception rather than the rule was quite  reasonable. “It says that if differential pricing services adhere to the  guidelines of being non-discriminatory, non-anti-competitive,  non-predatory, non-ambiguous and transparent, they can be allowed under  the supervision of the regulator, which is similar to the position  adopted by CIS,” he said.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;“Though some of their positions are ambiguous—for instance what they  mean by non-discriminatory, and whether they are okay with differential  pricing between classes of applications, are unclear—and some of their  recommendations increase regulatory complexity, such as their proposal  for independent not-for-profit entities with independent boards to own  and manage such differential pricing programs, by and large it is a  useful submission,” Prakash added.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/livemint-moulishree-srivastava-january-5-2016-nasscom-against-differential-pricing-for-data-services'&gt;https://cis-india.org/internet-governance/news/livemint-moulishree-srivastava-january-5-2016-nasscom-against-differential-pricing-for-data-services&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>praskrishna</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Social Media</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Free Basics</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Freedom of Speech and Expression</dc:subject>
    

   <dc:date>2016-01-06T15:12:17Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/reply-to-rti-application-under-rti-act-of-2005-from-vanya-rakesh">
    <title>Reply to RTI Application under RTI Act of 2005 from Vanya Rakesh</title>
    <link>https://cis-india.org/internet-governance/blog/reply-to-rti-application-under-rti-act-of-2005-from-vanya-rakesh</link>
    <description>
        &lt;b&gt;Unique Identification Authority of India replied to the RTI application filed by Vanya Rakesh. &lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;Madam,&lt;/p&gt;
&lt;ol style="text-align: justify; "&gt;
&lt;li&gt;Please refer to your RTI application dated 3.12.2015 received in the Division on 10.12.2015 on the subject mentioned above requesting to provide the information in electronic form via the email address vanya@cis-india.org, copies of the artwork in print media released by UIDAI to create awareness about use of Aadhaar not being mandatory.&lt;/li&gt;
&lt;li&gt;I am directed to furnish herewith in electronic form, copy of the artwork in print media released / published in the epapers edition of the Times of India and Dainik Jagran in their respective editions of dated 29.8.2015 in a soft copy, about obtaining of Aadhaar not being mandatory for a citizen, as desired.&lt;/li&gt;
&lt;li&gt;In case you want to go for an appeal in connection with the information provided, you may appeal to the Appellate Authority indicated below within thirty days from the date of receipt of this letter.&lt;br /&gt;Shri Harish Lal Verma,&lt;br /&gt;Deputy Director (Media),&lt;br /&gt;Unique Identification Authority of India&lt;br /&gt;3rd Floor, Tower – II, Jeevan Bharati Building,&lt;br /&gt;New Delhi – 110001.&lt;/li&gt;
&lt;/ol&gt;
&lt;p style="text-align: justify; "&gt;&lt;br /&gt;Yours faithfully,&lt;br /&gt;&lt;br /&gt;(T Gou Khangin)&lt;br /&gt;Section Officer &amp;amp; CPIO Media Division&lt;br /&gt;&lt;br /&gt;Copy for information to: Deputy Director (Establishment) &amp;amp; Nodal CPIO&lt;/p&gt;
&lt;hr /&gt;
&lt;p&gt;Below scanned copies:&lt;/p&gt;
&lt;table class="plain"&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;th&gt;RTI Reply&lt;/th&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;img src="https://cis-india.org/home-images/RTIReplytoSh.VanyaRakesh.jpg" alt="RTI Reply" class="image-inline" title="RTI Reply" /&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;
&lt;table class="plain"&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;th&gt;Coverage in Dainik Jagran&lt;br /&gt;&lt;/th&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;img src="https://cis-india.org/home-images/DainikJagran29.08.2015.png" alt="Dainik Jagran" class="image-inline" title="Dainik Jagran" /&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;
&lt;p&gt;&lt;b&gt;&lt;a href="https://cis-india.org/internet-governance/blog/uid-ad" class="internal-link"&gt;Download the coverage in the Times of India here&lt;/a&gt;&lt;/b&gt;. Read the earlier blog entry &lt;a class="external-link" href="http://cis-india.org/internet-governance/blog/rti-response-regarding-the-uidai"&gt;here&lt;/a&gt;.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/reply-to-rti-application-under-rti-act-of-2005-from-vanya-rakesh'&gt;https://cis-india.org/internet-governance/blog/reply-to-rti-application-under-rti-act-of-2005-from-vanya-rakesh&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>vanya</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Aadhaar</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2016-01-13T02:40:57Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/eight-key-privacy-events-in-india-in-the-year-2015">
    <title>Eight Key Privacy Events in India in the Year 2015</title>
    <link>https://cis-india.org/internet-governance/blog/eight-key-privacy-events-in-india-in-the-year-2015</link>
    <description>
        &lt;b&gt;As the year draws to a close, we are enumerating some of the key privacy related events in India that transpired in 2015. Much like the last few years, this year, too, was an eventful one in the context of privacy.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;While we did not witness, as one had hoped, any progress in the passage of a privacy law, the year saw significant developments with respect to the ongoing 	Aadhaar case. The statement by the Attorney General, India's foremost law officer, that there is a lack of clarity over whether the right to privacy is a fundamental right, and the fact the the matter is yet unresolved was a huge setback to the jurisprudence on privacy.	&lt;a href="#_ftn1" name="_ftnref1"&gt;[1]&lt;/a&gt; However, the court has recognised a purpose limitation as applicable into the Aadhaar scheme, limiting 	the sharing of any information collected during the enrollment of residents in UID. A draft Encryption Policy was released and almost immediately withdrawn 	in the face of severe public backlash, and an updated Human DNA Profiling Bill was made available for comments. Prime Minister Narendra Modi's much 	publicised project "Digital India" was in news throughout the year, and it also attracted its' fair share of criticism in light of the lack of privacy 	safeguards it offered. Internationally, a lawsuit brought by Maximilian Schrems, an Austrian privacy activist, dealt a body blow to the fifteen year old 	Safe Harbour Framework in place for data transfers between EU and USA. Below, we look at what were, according to us, the eight most important privacy 	events in India, in 2015.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;1. &lt;/b&gt; &lt;b&gt;August 11, 2015 order on Aadhaar not being compulsory&lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In 2012, a writ petition was filed by Judge K S Puttaswamy challenging the government's policy in its attempt to enroll all residents of India in the UID 	project and linking the Aadhaar card with various government services. A number of other petitioners who filed cases against the Aadhaar scheme have also 	been linked with this petition and the court has been hearing them together. On September 11, 2015, the Supreme Court reiterated its position in earlier orders made on September 23, 2013 and March 24, 2014 stating that the Aadhaar card shall not be made compulsory for any government services.	&lt;a href="#_ftn2" name="_ftnref2"&gt;[2]&lt;/a&gt; Building on its earlier position, the court passed the following orders:&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;a) The government must give wide publicity in the media that it was not mandatory for a resident to obtain an Aadhaar card,&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;b) The production of an Aadhaar card would not be a condition for obtaining any benefits otherwise due to a citizen,&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;c) Aadhaar card would not be used for any purpose other than the PDS Scheme, for distribution of foodgrains and cooking fuel such as kerosene and for the 	LPG distribution scheme.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;d) The information about an individual obtained by the UIDAI while issuing an Aadhaar card shall not be used for any other purpose, save as above, except 	as may be directed by a Court for the purpose of criminal investigation.&lt;a href="#_ftn3" name="_ftnref3"&gt;[3]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Despite this being the fifth court order given by the Supreme Court&lt;a href="#_ftn4" name="_ftnref4"&gt;[4]&lt;/a&gt; stating that the Aadhaar card cannot 	be a mandatory requirement for access to government services or subsidies, repeated violations continue. One of the violations which has been widely 	reported is the continued requirement of an Aadhaar number to set up a Digital Locker account which also led to activist, Sudhir Yadav filing a petition in 	the Supreme Court.&lt;a href="#_ftn5" name="_ftnref5"&gt;[5]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;2. &lt;/b&gt; &lt;b&gt;No Right to Privacy - Attorney General to SC&lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Attorney General, Mukul Rohatgi argued before the Supreme Court in the Aadhaar case that the Constitution of India did not provide for a fundamental 	Right to Privacy.&lt;a href="#_ftn6" name="_ftnref6"&gt;[6]&lt;/a&gt; He referred to the body of case in the Supreme Court dealing with this issue and made a 	reference to the 1954 case, MP Sharma v. Satish Chandra&lt;a href="#_ftn7" name="_ftnref7"&gt;[7]&lt;/a&gt; stating that there was "clear divergence of 	opinion" on the Right to Privacy and termed it as "a classic case of unclear position of law." He also referred to the discussion on this matter in the 	Constitutional Assembly Debates and pointed to the fact the framers of the Constitution did not intend for this to be a fundamental right. He said the 	matter needed to be referred to a nine judge Constitution bench.&lt;a href="#_ftn8" name="_ftnref8"&gt;[8]&lt;/a&gt; This raises serious questions over the 	jurisprudence developed by the Supreme Court on the right to privacy over the last five decades. The matter is currently pending resolution by a larger 	bench which needs to be constituted by the Chief Justice of India.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;3. &lt;/b&gt; &lt;b&gt;Shreya Singhal judgment and Section 69A, IT Act&lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In the much celebrated judgment, Shreya Singhal v. Union of India, in March 2015, the Supreme Court struck down Section 66A of the Information Technology 	Act, 2000 as unconstitutional and laid down guidelines for online takedowns under the Internet intermediary rules. However, significantly, the court also 	upheld Section 69A and the blocking rules under this provision. It was held to be a narrowly-drawn provision with adequate safeguards. The rules prescribe 	a procedure for blocking which involves receipt of a blocking request, examination of the request by the Committee and a review committee which performs 	oversight functions. However, commentators have pointed to the opacity of the process in the rules under this provisions. While the rules mandate that a 	hearing is given to the originator of the content, this safeguard is widely disregarded. The judgment did not discuss Section 69 of the Information 	Technology Act, 2000 which deal with decrypting of electronic communication, however, the Department of Electronic and Information Technology brought up 	this issue subsequently, through a Draft Encryption Policy, discussed below.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;4. &lt;/b&gt; &lt;b&gt;Circulation and recall of Draft Encryption Policy&lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;On October 19, 2015, the Department of Electronic and Information Technology (DeitY) released for public comment a draft National Encryption Policy. The draft received an immediate and severe backlash from commentators, and was withdrawn by September 22, 2015.	&lt;a href="#_ftn9" name="_ftnref9"&gt;[9]&lt;/a&gt; The government blamed a junior official for the poor drafting of the document and noted that it had been 	released without a review by the Telecom Minister, Ravi Shankar Prasad and other senior officials.&lt;a href="#_ftn10" name="_ftnref10"&gt;[10]&lt;/a&gt; The 	main areas of contention were a requirement that individuals store plain text versions of all encrypted communication for a period of 90 days, to be made 	available to law enforcement agencies on demand; the government's right to prescribe key-strength, algorithms and ciphers; and only government-notified 	encryption products and vendors registered with the government being allowed to be used for encryption.&lt;a href="#_ftn11" name="_ftnref11"&gt;[11]&lt;/a&gt; The purport of the above was to limit the ways in which citizens could encrypt electronic communication, and to allow adequate access to law enforcement 	agencies. The requirement to keep all encrypted information in plain text format for a period of 90 days garnered particular criticism as it would allow 	for creation of a 'honeypot' of unencrypted data, which could attract theft and attacks.&lt;a href="#_ftn12" name="_ftnref12"&gt;[12]&lt;/a&gt; The withdrawal of the draft policy is not the final chapter in this story, as the Telecom Minister has promised that the Department will come back with a revised policy.	
&lt;a href="#_ftn13" name="_ftnref13"&gt;[13]&lt;/a&gt; This attempt to put restrictions on use of encryption technologies is not only in line with a host of 	surveillance initiatives that have mushroomed in India in the last few years,&lt;a href="#_ftn14" name="_ftnref14"&gt;[14]&lt;/a&gt; but also finds resonance with a global trend which has seen various governments and law enforcement organisations argue against encryption.	&lt;a href="#_ftn15" name="_ftnref15"&gt;[15]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;5. &lt;/b&gt; &lt;b&gt;Privacy concerns raised about Digital India&lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Digital India initiative includes over thirty Mission Mode Projects in various stages of implementation.	&lt;a href="#_ftn16" name="_ftnref16"&gt;[16]&lt;/a&gt; All of these projects entail collection of vast quantities of personally identifiable information of 	the citizens. However, most of these initiatives do not have clearly laid down privacy policies.&lt;a href="#_ftn17" name="_ftnref17"&gt;[17]&lt;/a&gt; There 	is also a lack of properly articulated access control mechanisms and doubts over important issues such as data ownership owing to most projects involving public private partnership which involves private organisation collecting, processing and retaining large amounts of data.	&lt;a href="#_ftn18" name="_ftnref18"&gt;[18]&lt;/a&gt; Ahead of Prime Minister Modi's visit to the US, over 100 hundred prominent US based academics released a statement raising concerns about "lack of safeguards about privacy of information, and thus its potential for abuse" in the Digital India project.	&lt;a href="#_ftn19" name="_ftnref19"&gt;[19]&lt;/a&gt; It has been pointed out that the initiatives could enable a "cradle-to-grave digital identity that is unique, lifelong, and authenticable, and it plans to widely use the already mired in controversy Aadhaar program as the identification system."	&lt;a href="#_ftn20" name="_ftnref20"&gt;[20]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;6. &lt;/b&gt; &lt;b&gt;Issues with Human DNA Profiling Bill, 2015&lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Human DNA Profiling Bill, 2015 envisions the creation of national and regional DNA databases comprising DNA profiles of the categories of persons 	specified in the Bill.&lt;a href="#_ftn21" name="_ftnref21"&gt;[21]&lt;/a&gt; The categories include offenders, suspects, missing persons, unknown deceased 	persons, volunteers and such other categories specified by the DNA Profiling Board which has oversight over these banks. The Bill grants wide discretionary powers to the Board to introduce new DNA indices and make DNA profiles available for new purposes it may deem fit.	&lt;a href="#_ftn22" name="_ftnref22"&gt;[22]&lt;/a&gt; These, and the lack of proper safeguards surrounding issues like consent, retention and collection 	pose serious privacy risks if the Bill becomes a law. Significantly, there is no element of purpose limitation in the proposed law, which would allow the 	DNA samples to be re-used for unspecified purposes.&lt;a href="#_ftn23" name="_ftnref23"&gt;[23]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;7. &lt;/b&gt; &lt;b&gt;Impact of the Schrems ruling on India&lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In Schrems v. Data Protection Commissioner, the Court of Justice in European Union (CJEU) annulled the Commission Decision 2000/520 according to which US 	data protection rules were deemed sufficient to satisfy EU privacy rules enabling transfers of personal data from EU to US, otherwise known as the 'Safe 	Harbour' framework. The court ruled that broad formulations of derogations on grounds of national security, public interest and law enforcement in place in 	the US goes beyond the test of proportionality and necessity under the Data Protection rules.&lt;a href="#_ftn24" name="_ftnref24"&gt;[24]&lt;/a&gt; This 	judgment could also have implications for the data processing industry in India. For a few years now, a framework similar to the Safe Harbour has been 	under discussion for transfer of data between India and EU. The lack of a privacy legislation has been among the significant hurdles in arriving at a 	framework.&lt;a href="#_ftn25" name="_ftnref25"&gt;[25]&lt;/a&gt; In the absence of a Safe Harbour framework, the companies in India rely on alternate 	mechanisms such as Binding Corporate Rules (BCR) or Model Contractual Clauses. These contracts impose the obligation on the data exporters and importers to 	ensure that 'adequate level of data protection' is provided. The Schrems judgement makes it clear that 'adequate level of data protection' entails a regime 	that is 'essentially equivalent' to that envisioned under Directive 95/46.&lt;a href="#_ftn26" name="_ftnref26"&gt;[26]&lt;/a&gt; What this means is that any 	new framework of protection between EU and other countries like US or India will necessarily have to meet this test of essential equivalence. 
The PRISM 	programme in the US and a host of surveillance programmes that have been initiated by the government in India in the last few years could pose problems in 	satisfying this test of essential equivalence as they do not conform to the proportionality and necessity principles.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;8. &lt;/b&gt; &lt;b&gt;The definition of "unfair trade practices" in the Consumer Protection Bill, 2015&lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Consumer Protection Bill, 2015, tabled in the Parliament towards the end of the monsoon session&lt;a href="#_ftn27" name="_ftnref27"&gt;[27]&lt;/a&gt; has 	introduced an expansive definition of the term "unfair trade practices." The definition as per the Bill includes the disclosure "to any other person any 	personal information given in confidence by the consumer."&lt;a href="#_ftn28" name="_ftnref28"&gt;[28]&lt;/a&gt; This clause exclude from the scope of unfair 	trade practices, disclosures under provisions of any law in force or in public interest. This provision could have significant impact on the personal data 	protection law in India. Currently, the only law governing data protection law are the Reasonable security practices and procedures and sensitive personal 	data or information Rules, 2011&lt;a href="#_ftn29" name="_ftnref29"&gt;[29]&lt;/a&gt; prescribed under Section 43A of the Information Technology Act, 2000. Under these rules, sensitive personal data or information is protected in that their disclosure requires prior permission from the data subject.	&lt;a href="#_ftn30" name="_ftnref30"&gt;[30]&lt;/a&gt; For other kinds of personal information not categorized as sensitive personal data or information, the only recourse of data subjects in case to claim breach of the terms of privacy policy which constitutes a lawful contract.	&lt;a href="#_ftn31" name="_ftnref31"&gt;[31]&lt;/a&gt; The Consumer Protection Bill, 2015, if enacted as law, could significantly expand the scope of 	protection available to data subjects. First, unlike the Section 43A rules, the provisions of the Bill would be applicable to physical as well as 	electronic collection of personal information. 
Second, disclosure to a third party of personal information other than sensitive personal data or 	information could also have similar 'prior permission' criteria under the Bill, if it can be shown that the information was shared by the consumer in 	confidence.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;What we see above are events largely built around a few trends that we have been witnessing in the context of privacy in India, in particular and across 	the world, in general. Lack of privacy safeguards in initiatives like the Aadhaar project and Digital India is symptomatic of policies that are not 	comprehensive in their scope, and consequently fail to address key concerns. Dr Usha Ramanathan has called these policies "powerpoint based policies" which are implemented based on proposals which are superficial in their scope and do not give due regard to their impact on a host of issues.	&lt;a href="#_ftn32" name="_ftnref32"&gt;[32]&lt;/a&gt; Second, the privacy concerns posed by the draft Encryption Policy and the Human DNA Profiling Bill point to the motive of surveillance that is in line with other projects introduced with the intent to protect and preserve national security.	&lt;a href="#_ftn33" name="_ftnref33"&gt;[33]&lt;/a&gt; Third, the incidents that championed the cause of privacy like the Schrems judgment have largely been 	initiated by activists and civil society actors, and have typically entailed the involvement of the judiciary, often the single recourse of actors in the 	campaign for the protection of civil rights. It must be noted that jurisprudence on the right to privacy in India has not moved beyond the guidelines set 	forth by the Supreme Court in PUCL v. Union of India.&lt;a href="#_ftn34" name="_ftnref34"&gt;[34]&lt;/a&gt; However, new mass surveillance programmes and 	massive collection of personal data by both public and private parties through various schemes mandated a re-look at the standards laid down twenty years 	ago. The privacy issue pending resolution by a larger bench in the Aadhaar case affords an opportunity to revisit those principles in light of how 	surveillance has changed in the last two decades and strengthen privacy and data protection.&lt;/p&gt;
&lt;div style="text-align: justify; "&gt;
&lt;hr /&gt;
&lt;div id="ftn1"&gt;
&lt;p&gt;&lt;a href="#_ftnref1" name="_ftn1"&gt;&lt;sup&gt;&lt;sup&gt;[1]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; Right to Privacy not a fundamental right, cannot be invoked to scrap Aadhar: Centre tells Supreme Court, available at 			&lt;a href="http://articles.economictimes.indiatimes.com/2015-07-23/news/64773078_1_fundamental-right-attorney-general-mukul-rohatgi-privacy"&gt; http://articles.economictimes.indiatimes.com/2015-07-23/news/64773078_1_fundamental-right-attorney-general-mukul-rohatgi-privacy &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn2"&gt;
&lt;p&gt;&lt;a href="#_ftnref2" name="_ftn2"&gt;&lt;sup&gt;&lt;sup&gt;[2]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; SC allows govt to link Aadhaar card with PDS and LPG subsidies, available at 			&lt;a href="http://timesofindia.indiatimes.com/india/SC-allows-govt-to-link-Aadhaar-card-with-PDS-and-LPG-subsidies/articleshow/48436223.cms"&gt; http://timesofindia.indiatimes.com/india/SC-allows-govt-to-link-Aadhaar-card-with-PDS-and-LPG-subsidies/articleshow/48436223.cms &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn3"&gt;
&lt;p&gt;&lt;a href="#_ftnref3" name="_ftn3"&gt;&lt;sup&gt;&lt;sup&gt;[3]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; &lt;a href="http://judis.nic.in/supremecourt/imgs1.aspx?filename=42841"&gt;http://judis.nic.in/supremecourt/imgs1.aspx?filename=42841&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn4"&gt;
&lt;p&gt;&lt;a href="#_ftnref4" name="_ftn4"&gt;&lt;sup&gt;&lt;sup&gt;[4]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; Five SC Orders Later, Aadhaar Requirement Continues to Haunt Many, available at 			&lt;a href="http://thewire.in/2015/09/19/five-sc-orders-later-aadhaar-requirement-continues-to-haunt-many-11065/"&gt; http://thewire.in/2015/09/19/five-sc-orders-later-aadhaar-requirement-continues-to-haunt-many-11065/ &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn5"&gt;
&lt;p&gt;&lt;a href="#_ftnref5" name="_ftn5"&gt;[5]&lt;/a&gt; Digital Locker scheme challenged in Supreme Court, available at 			&lt;a href="http://www.moneylife.in/article/digital-locker-scheme-challenged-in-supreme-court/42607.html"&gt; http://www.moneylife.in/article/digital-locker-scheme-challenged-in-supreme-court/42607.html &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn6"&gt;
&lt;p&gt;&lt;a href="#_ftnref6" name="_ftn6"&gt;&lt;sup&gt;&lt;sup&gt;[6]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; Privacy not a fundamental right, argues Mukul Rohatgi for Govt as Govt affidavit says otherwise, available at 			&lt;a href="http://www.legallyindia.com/Constitutional-law/privacy-not-a-fundamental-right-argues-mukul-rohatgi-for-govt-as-govt-affidavit-says-otherwise"&gt; http://www.legallyindia.com/Constitutional-law/privacy-not-a-fundamental-right-argues-mukul-rohatgi-for-govt-as-govt-affidavit-says-otherwise &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn7"&gt;
&lt;p&gt;&lt;a href="#_ftnref7" name="_ftn7"&gt;&lt;sup&gt;&lt;sup&gt;[7]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; 1954 SCR 1077.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn8"&gt;
&lt;p&gt;&lt;a href="#_ftnref8" name="_ftn8"&gt;&lt;sup&gt;&lt;sup&gt;[8]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; Supra Note 1.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn9"&gt;
&lt;p&gt;&lt;a href="#_ftnref9" name="_ftn9"&gt;&lt;sup&gt;&lt;sup&gt;[9]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; Government to withdraw draft encryption policy, available at 			&lt;a href="http://www.thehindu.com/news/national/govt-to-withdraw-draft-encryption-policy/article7677348.ece"&gt; http://www.thehindu.com/news/national/govt-to-withdraw-draft-encryption-policy/article7677348.ece &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn10"&gt;
&lt;p&gt;&lt;a href="#_ftnref10" name="_ftn10"&gt;&lt;sup&gt;&lt;sup&gt;[10]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; Encryption policy poorly worded by officer: Telecom Minister Ravi Shankar Prasad, available at 			&lt;a href="http://economictimes.indiatimes.com/articleshow/49068406.cms?utm_source=contentofinterest&amp;amp;utm_medium=text&amp;amp;utm_campaign=cppst"&gt; http://economictimes.indiatimes.com/articleshow/49068406.cms?utm_source=contentofinterest&amp;amp;utm_medium=text&amp;amp;utm_campaign=cppst &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn11"&gt;
&lt;p&gt;&lt;a href="#_ftnref11" name="_ftn11"&gt;&lt;sup&gt;&lt;sup&gt;[11]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; Updated: India's draft encryption policy puts user privacy in danger, available at 			&lt;a href="http://www.medianama.com/2015/09/223-india-draft-encryption-policy/"&gt; http://www.medianama.com/2015/09/223-india-draft-encryption-policy/ &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn12"&gt;
&lt;p&gt;&lt;a href="#_ftnref12" name="_ftn12"&gt;&lt;sup&gt;&lt;sup&gt;[12]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; Bhairav Acharya, The short-lived adventure of India's encryption policy, available at 			&lt;a href="http://notacoda.net/2015/10/10/the-short-lived-adventure-of-indias-encryption-policy/"&gt; http://notacoda.net/2015/10/10/the-short-lived-adventure-of-indias-encryption-policy/ &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn13"&gt;
&lt;p&gt;&lt;a href="#_ftnref13" name="_ftn13"&gt;&lt;sup&gt;&lt;sup&gt;[13]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; Supra Note 9.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn14"&gt;
&lt;p&gt;&lt;a href="#_ftnref14" name="_ftn14"&gt;&lt;sup&gt;&lt;sup&gt;[14]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; Maria Xynou, Big democracy, big surveillance: India's surveillance state, available at 			&lt;a href="https://www.opendemocracy.net/opensecurity/maria-xynou/big-democracy-big-surveillance-indias-surveillance-state"&gt; https://www.opendemocracy.net/opensecurity/maria-xynou/big-democracy-big-surveillance-indias-surveillance-state &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn15"&gt;
&lt;p&gt;&lt;a href="#_ftnref15" name="_ftn15"&gt;&lt;sup&gt;&lt;sup&gt;[15]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; China passes controversial anti-terrorism law to access encrypted user accounts, available at 			&lt;a href="http://www.theverge.com/2015/12/27/10670346/china-passes-law-to-access-encrypted-communications"&gt; http://www.theverge.com/2015/12/27/10670346/china-passes-law-to-access-encrypted-communications &lt;/a&gt; ; Police renew call against encryption technology that can help hide terrorists, available at 			&lt;a href="http://www.washingtontimes.com/news/2015/nov/16/paris-terror-attacks-renew-encryption-technology-s/?page=all"&gt; http://www.washingtontimes.com/news/2015/nov/16/paris-terror-attacks-renew-encryption-technology-s/?page=all &lt;/a&gt; .&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn16"&gt;
&lt;p&gt;&lt;a href="#_ftnref16" name="_ftn16"&gt;&lt;sup&gt;&lt;sup&gt;[16]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; &lt;a href="http://www.mmp.cips.org.in/digital-india/"&gt;http://www.mmp.cips.org.in/digital-india/&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn17"&gt;
&lt;p&gt;&lt;a href="#_ftnref17" name="_ftn17"&gt;[17]&lt;/a&gt; &lt;a href="http://slides.com/cisindia/big-data-in-indian-governance-preliminary-findings#/"&gt; http://slides.com/cisindia/big-data-in-indian-governance-preliminary-findings#/ &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn18"&gt;
&lt;p&gt;&lt;a href="#_ftnref18" name="_ftn18"&gt;&lt;sup&gt;&lt;sup&gt;[18]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; Indira Jaising, Digital India Schemes Must Be Preceded by a Data Protection and Privacy Law, available at 			&lt;a href="http://thewire.in/2015/07/04/digital-india-schemes-must-be-preceded-by-a-data-protection-and-privacy-law-5471/"&gt; http://thewire.in/2015/07/04/digital-india-schemes-must-be-preceded-by-a-data-protection-and-privacy-law-5471/ &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn19"&gt;
&lt;p&gt;&lt;a href="#_ftnref19" name="_ftn19"&gt;&lt;sup&gt;&lt;sup&gt;[19]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; US academics raise privacy concerns over 'Digital India' campaign, available at			&lt;a href="http://yourstory.com/2015/08/us-digital-india-campaign/"&gt;http://yourstory.com/2015/08/us-digital-india-campaign/&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn20"&gt;
&lt;p&gt;&lt;a href="#_ftnref20" name="_ftn20"&gt;&lt;sup&gt;&lt;sup&gt;[20]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; Lisa Hayes, Digital India's Impact on Privacy: Aadhaar numbers, biometrics, and more, available at 			&lt;a href="https://cdt.org/blog/digital-indias-impact-on-privacy-aadhaar-numbers-biometrics-and-more/"&gt; https://cdt.org/blog/digital-indias-impact-on-privacy-aadhaar-numbers-biometrics-and-more/ &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn21"&gt;
&lt;p&gt;&lt;a href="#_ftnref21" name="_ftn21"&gt;&lt;sup&gt;&lt;sup&gt;[21]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; &lt;a href="http://www.prsindia.org/uploads/media/draft/Draft%20Human%20DNA%20Profiling%20Bill%202015.pdf"&gt; http://www.prsindia.org/uploads/media//draft/Draft%20Human%20DNA%20Profiling%20Bill%202015.pdf &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn22"&gt;
&lt;p&gt;&lt;a href="#_ftnref22" name="_ftn22"&gt;&lt;sup&gt;&lt;sup&gt;[22]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; Comments on India's Human DNA Profiling Bill (June 2015 version), available at 			&lt;a href="http://www.genewatch.org/uploads/f03c6d66a9b354535738483c1c3d49e4/IndiaDNABill_FGPI_15.pdf"&gt; http://www.genewatch.org/uploads/f03c6d66a9b354535738483c1c3d49e4/IndiaDNABill_FGPI_15.pdf &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn23"&gt;
&lt;p&gt;&lt;a href="#_ftnref23" name="_ftn23"&gt;&lt;sup&gt;&lt;sup&gt;[23]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; Elonnai Hickok, Vanya Rakesh and Vipul Kharbanda, CIS Comments and Recommendations to the Human DNA Profiling Bill, June 2015, available at 			&lt;a href="http://cis-india.org/internet-governance/blog/cis-comments-and-recommendations-to-human-dna-profiling-bill-2015"&gt; http://cis-india.org/internet-governance/blog/cis-comments-and-recommendations-to-human-dna-profiling-bill-2015 &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn24"&gt;
&lt;p&gt;&lt;a href="#_ftnref24" name="_ftn24"&gt;&lt;sup&gt;&lt;sup&gt;[24]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; &lt;a href="http://curia.europa.eu/jcms/upload/docs/application/pdf/2015-10/cp150117en.pdf"&gt; http://curia.europa.eu/jcms/upload/docs/application/pdf/2015-10/cp150117en.pdf &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn25"&gt;
&lt;p&gt;&lt;a href="#_ftnref25" name="_ftn25"&gt;&lt;sup&gt;&lt;sup&gt;[25]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; Jyoti Pandey, Contestations of Data, ECJ Safe Harbor Ruling and Lessons for India, available at 			&lt;a href="http://cis-india.org/internet-governance/blog/contestations-of-data-ecj-safe-harbor-ruling-and-lessons-for-india"&gt; http://cis-india.org/internet-governance/blog/contestations-of-data-ecj-safe-harbor-ruling-and-lessons-for-india &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn26"&gt;
&lt;p&gt;&lt;a href="#_ftnref26" name="_ftn26"&gt;&lt;sup&gt;&lt;sup&gt;[26]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; Simon Cox, Case Watch: Making Sense of the Schrems Ruling on Data Transfer, available at 			&lt;a href="https://www.opensocietyfoundations.org/voices/case-watch-making-sense-schrems-ruling-data-transfer"&gt; https://www.opensocietyfoundations.org/voices/case-watch-making-sense-schrems-ruling-data-transfer &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn27"&gt;
&lt;p&gt;&lt;a href="#_ftnref27" name="_ftn27"&gt;&lt;sup&gt;&lt;sup&gt;[27]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; &lt;a href="http://www.prsindia.org/billtrack/the-consumer-protection-bill-2015-3965/"&gt; http://www.prsindia.org/billtrack/the-consumer-protection-bill-2015-3965/ &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn28"&gt;
&lt;p&gt;&lt;a href="#_ftnref28" name="_ftn28"&gt;&lt;sup&gt;&lt;sup&gt;[28]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; Section 2(41) (I) of the Consumer Protection Bill, 2015.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn29"&gt;
&lt;p&gt;&lt;a href="#_ftnref29" name="_ftn29"&gt;&lt;sup&gt;&lt;sup&gt;[29]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; &lt;a href="http://www.ijlt.in/pdffiles/IT-(Reasonable%20Security%20Practices)-Rules-2011.pdf"&gt; http://www.ijlt.in/pdffiles/IT-%28Reasonable%20Security%20Practices%29-Rules-2011.pdf &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn30"&gt;
&lt;p&gt;&lt;a href="#_ftnref30" name="_ftn30"&gt;&lt;sup&gt;&lt;sup&gt;[30]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; Rule 6 of Reasonable security practices and procedures and sensitive personal data or information Rules, 2011&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn31"&gt;
&lt;p&gt;&lt;a href="#_ftnref31" name="_ftn31"&gt;&lt;sup&gt;&lt;sup&gt;[31]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; Rule 4 of Reasonable security practices and procedures and sensitive personal data or information Rules, 2011&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn32"&gt;
&lt;p&gt;&lt;a href="#_ftnref32" name="_ftn32"&gt;&lt;sup&gt;&lt;sup&gt;[32]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; &lt;a href="http://cis-india.org/internet-governance/events/communication-rights-in-the-age-of-digital-technology"&gt; http://cis-india.org/internet-governance/events/communication-rights-in-the-age-of-digital-technology &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn33"&gt;
&lt;p&gt;&lt;a href="#_ftnref33" name="_ftn33"&gt;&lt;sup&gt;&lt;sup&gt;[33]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; Supra Note 11.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn34"&gt;
&lt;p&gt;&lt;a href="#_ftnref34" name="_ftn34"&gt;&lt;sup&gt;&lt;sup&gt;[34]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; Chaitanya Ramachandra, PUCL V. Union of India Revisited: Why India's Sureveillance Law must be redesigned for the Digital Age, available at 			&lt;a href="http://nujslawreview.org/wp-content/uploads/2015/10/Chaitanya-Ramachandran.pdf"&gt; http://nujslawreview.org/wp-content/uploads/2015/10/Chaitanya-Ramachandran.pdf &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;/div&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/eight-key-privacy-events-in-india-in-the-year-2015'&gt;https://cis-india.org/internet-governance/blog/eight-key-privacy-events-in-india-in-the-year-2015&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Amber Sinha</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2016-01-03T05:43:42Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/rti-response-regarding-the-uidai">
    <title>RTI response regarding the UIDAI</title>
    <link>https://cis-india.org/internet-governance/blog/rti-response-regarding-the-uidai</link>
    <description>
        &lt;b&gt;This is a response to the RTI filed regarding UIDAI&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The Supreme Curt of India, by virtue of an order dated 11th August 2015, directed the Government to widely publicize in electronic and print media, including radio and television networks that obtaining Aadhar card is not mandatory for the citizens to avail welfare schemes of the Government. (until the matter is resolved). CIS filed an RTI to get information about the steps taken by Government in this regard, the initiatives taken, and details about the expenditure incurred to publicize and inform the public about Aadhar not being mandatory to avail welfare schemes of the Government. &lt;br /&gt;&lt;br /&gt;Response: It has been informed that an advisory was issued by UIDAI headquarters to all regional offices to comply with the order, along with several advertisement campaigns. The total cost incurred so far by UIDAI for this is Rs. 317.30 lakh.&lt;/p&gt;
&lt;hr /&gt;
&lt;p&gt;&lt;b&gt;&lt;a href="https://cis-india.org/internet-governance/blog/rti.pdf" class="internal-link"&gt;Download the Response&lt;/a&gt;&lt;/b&gt;&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/rti-response-regarding-the-uidai'&gt;https://cis-india.org/internet-governance/blog/rti-response-regarding-the-uidai&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>vanya</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2015-12-22T02:57:21Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/unbundling-issues-of-privacy-data-security-identity-matrics-for-financial-inclusion">
    <title>Unbundling Issues of Privacy, Data Security, Identity Metrics for Financial Inclusion</title>
    <link>https://cis-india.org/internet-governance/news/unbundling-issues-of-privacy-data-security-identity-matrics-for-financial-inclusion</link>
    <description>
        &lt;b&gt;This event was organized by Indicus Foundation and MicroSave on December 10, 2015 at the Metropolitan Hotel and Spa, New Delhi. Sunil Abraham was a speaker.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;While the initiative towards financial inclusion has gathered new impetus with the PMJDY and the accelerated roll out of benefits, there is also a parallel narrative of concerns over the legality and fundamental constitutionality of identity verification, which is a centre piece for delivery of financial benefits and services. These divergent narratives have now reached the Supreme Court.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;At one end of the spectrum are the voices that avow the power of biometric technology to irrepudiately establish biological identity; at the other, the alarmism over targeting, concentration and misuse of personal information contained in the world’s biggest personal database. There is also a third extreme position of whether Indian citizens are entitled to the right to privacy constitutionally, and whether the right to privacy includes the right to refuse a national identity number or metric altogether. That India has yet to enact a Privacy Bill and the National Identity Authority Bill on which rests the statutory basis for UIDAI and Aadhaar only adds to the quagmire.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Several issues lie intertwined in this miasma: Privacy as an absolute right; Definition and Limits of Personal Information and Sensitive Personal Information; Consent protocols over use of personal information; Data Security; Appropriate and inclusive technology platforms; and Responsibilities and Liabilities governing the use of personal information for bonafide purposes. These straddle multiple domains: data accuracy and irrepudiability; storage, security and encryption; and sharing of information for transaction processing including across national boundaries. Unfortunately, all of these tend to get lumped together in the public debate.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The aim of this workshop is to unbundle the issues and understand each of them from the perspective of financial inclusion, to be able to answer these questions:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;How essential and critical is a unified identity metric for digital financial transactions? How essential is it that such a metric be biometric?&lt;/li&gt;
&lt;li&gt;To what extent does the centralised storage of biometric data represent risks of personal safety and national security, compared to the information on election voter lists, passport offices, census data, and bank accounts?&lt;/li&gt;
&lt;li&gt;What are the possible sources of transactional risk and security breaches in data sharing, and what are the international best practices?&lt;/li&gt;
&lt;li&gt;Is the present Aadhaar architecture robust enough to: address all the genuine and reasonable concerns over leakage and misuse of sensitive personal information; and to ensure that no genuine identity holder is turned away from a service, entitlement or benefit to which (s)he has a right or claim?&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;In this direction, we have the privilege to interact in this workshop with experts from The Centre for Internet and Society, and Data Security Council of India who have been at the forefront of the discussions on privacy and data security aspects of technology based innovations including for financial inclusion.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://cis-india.org/internet-governance/blog/icfi-workshop" class="internal-link"&gt;Download the Workshop Schedule here&lt;/a&gt;&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/unbundling-issues-of-privacy-data-security-identity-matrics-for-financial-inclusion'&gt;https://cis-india.org/internet-governance/news/unbundling-issues-of-privacy-data-security-identity-matrics-for-financial-inclusion&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>praskrishna</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2016-01-03T10:45:19Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/kick-off-meeting-for-the-politics-of-data-project">
    <title>Kick Off Meeting for the Politics of Data Project</title>
    <link>https://cis-india.org/internet-governance/news/kick-off-meeting-for-the-politics-of-data-project</link>
    <description>
        &lt;b&gt;Tactical Technology Collective (TTC) organized this event in Phnom Penh on December 7 and 8, 2015. Amber Sinha participated in it.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The areas TTC is planning to focus on in the Politics of Data project include:&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;Politics of Data: exploring questions about what it means to live in a data society and how it impacts our autonomy and privacy. Me and My Shadow, one of the projects under Politics of Data, looks at the digital traces that we leave behind and how these pieces of information are created, stored and collected. It provides people with resources to learn how these digital traces can create stories or profiles about them, and how to minimise their digital traces online.&lt;/li&gt;
&lt;li&gt;Digital Security and Privacy: through this programme, they intend to work with rights advocates, journalists, activists and others to build their digital security skills.&lt;/li&gt;
&lt;li&gt;Exposing and Shaping Issues: this part of the programme will explore new forms of finding, creating and representing evidence by advocacy and activist groups and individuals.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;The meetings saw participation from a host of organisations in Asia including Bytes for All, Cambodian Center for Human Rights, OpenNet, Community Legal Education Center, Engage Media, iPlural and Mido.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/kick-off-meeting-for-the-politics-of-data-project'&gt;https://cis-india.org/internet-governance/news/kick-off-meeting-for-the-politics-of-data-project&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>praskrishna</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2016-01-12T16:42:29Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>




</rdf:RDF>
