<?xml version="1.0" encoding="utf-8" ?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:syn="http://purl.org/rss/1.0/modules/syndication/" xmlns="http://purl.org/rss/1.0/">




    



<channel rdf:about="https://cis-india.org/internet-governance/blog/online-anonymity/search_rss">
  <title>We are anonymous, we are legion</title>
  <link>https://cis-india.org</link>
  
  <description>These are the search results for the query, showing results 1446 to 1460.</description>
  
  
  
  
  <image rdf:resource="https://cis-india.org/logo.png"/>

  <items>
    <rdf:Seq>
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/are-we-throwing-our-data-protection-regimes-under-the-bus"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/the-hindu-businessline-august-28-p-anima-the-new-tattler-in-town"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/the-changing-landscape-of-ict-governance-and-practice-convergence-and-big-data"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/cis-comments-and-recommendations-to-human-dna-profiling-bill-2015"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/responsible-data-forum"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/the-times-of-india-sandhya-soman-august-23-2015-the-seedy-underbelly-of-revenge-porn"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/hindustan-times-august-20-2015-aloke-tikku-stats-from-2014-reveal-horror-of-scrapped-section-66-a-of-it-act"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/security-privacy-transparency-and-technology"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/review-of-policy-debate-around-big-data-and-internet-of-things"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/right-to-privacy-in-peril"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/big-data-and-information-technology-rules-2011"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/hardnewsmedia-august-10-2015-abeer-kapoor-net-neutrality-india-is-a-keybattle-ground"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/cis-submission-to-unga-wsis-review"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/desi-blitz-august-7-2015-nazhat-khan-india-partially-lifts-porn-ban"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/open-magazine-august-7-2015-ullekh-np-genetic-profiling"/>
        
    </rdf:Seq>
  </items>

</channel>


    <item rdf:about="https://cis-india.org/internet-governance/blog/are-we-throwing-our-data-protection-regimes-under-the-bus">
    <title>Are we Throwing our Data Protection Regimes under the Bus?</title>
    <link>https://cis-india.org/internet-governance/blog/are-we-throwing-our-data-protection-regimes-under-the-bus</link>
    <description>
        &lt;b&gt;In this blog post Rohan examines why the principle of consent offers an increasingly weak aegis for protecting our data.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;Consent is complicated. What we think of as reasonably obtained consent varies substantially with the circumstance. For example, in treating rape cases, the UK justice system has moved to recognise complications like alcohol and its effect on explicit consent&lt;a href="#_ftn1" name="_ftnref1"&gt;[1]&lt;/a&gt;. Yet in contracts, consent may be implied simply when one person accepts another’s work on a contract without objections&lt;a href="#_ftn2" name="_ftnref2"&gt;[2]&lt;/a&gt;. These situations highlight the differences between the various forms of informed consent and the implications on its validity.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Consent has emerged as a key principle in regulating the use of personal data, and different countries have adopted different regimes, ranging from comprehensive regimes like that of the EU to more sectoral approaches like that of the USA. However, in a modern epoch characterised by commonplace big data analytics, many commentators have challenged the efficacy and relevance of consent in data protection. I argue that we may even risk throwing our data protection regimes under the proverbial bus should we continue to focus on consent as a key pillar of data protection.&lt;/p&gt;
&lt;h3&gt;Consent as a tool in Data Protection Regimes&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Even a cursory review of current data protection laws around the world shows the extent of the law’s reliance on consent. In the EU for example, Article 7 of the Data Protection Directive, passed in 1995, provides that data processing is only legitimate when “the data subject has unambiguously given his consent”&lt;a href="#_ftn3" name="_ftnref3"&gt;[3]&lt;/a&gt;. Article 8, which guards against processing of sensitive data, provides that such prohibitions may be lifted when “the data subject has given his explicit consent to the processing of those data”&lt;a href="#_ftn4" name="_ftnref4"&gt;[4]&lt;/a&gt;. Even as the EU attempts to strengthen data protection within the bloc with the proposed reforms to data protection&lt;a href="#_ftn5" name="_ftnref5"&gt;[5]&lt;/a&gt;, the focus on the consent of the data subject remains strong. There are proposals for an “unambiguous consent by the data subject”&lt;a href="#_ftn6" name="_ftnref6"&gt;[6]&lt;/a&gt; requirement to be put in place. Such consent would be mandatory before any data processing can occur&lt;a href="#_ftn7" name="_ftnref7"&gt;[7]&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Although the USA has adopted a very different overall approach to data protection and privacy, consent is an equally integral part of its data protection frameworks. In his book Protectors of Privacy&lt;a href="#_ftn8" name="_ftnref8"&gt;[8]&lt;/a&gt;, Abraham Newman describes two main types of privacy legislation: comprehensive and limited. He argues that places like the EU have adopted comprehensive regimes, which primarily seek to protect individuals because of the “informational and power asymmetry” between individuals and organisations&lt;a href="#_ftn9" name="_ftnref9"&gt;[9]&lt;/a&gt;. On the other hand, he classifies the American approach as limited, focusing on more sectoral protections and principles of fair information practice instead of overarching legislation&lt;a href="#_ftn10" name="_ftnref10"&gt;[10]&lt;/a&gt;. These sectoral laws include the Fair Credit Reporting Act&lt;a href="#_ftn11" name="_ftnref11"&gt;[11]&lt;/a&gt; (which governs consumer credit reporting), the Privacy Act&lt;a href="#_ftn12" name="_ftnref12"&gt;[12]&lt;/a&gt; (which governs data collected by the Federal government) and the Electronic Communications Privacy Act&lt;a href="#_ftn13" name="_ftnref13"&gt;[13]&lt;/a&gt; (which deals with email communications), among others. However, the Federal Trade Commission describes itself as having only “limited authority over the collection and dissemination of personal data collected online”&lt;a href="#_ftn14" name="_ftnref14"&gt;[14]&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This is because the general data processing that is commonplace in today’s era of big data is only regulated by the privacy protections that come from the Federal Trade Commission’s (FTC) Fair Information Practice Principles (FIPPs). Expectedly, consent is equally important under the FTC’s FIPPs. The FTC describes the principle of consent as “the second widely-accepted core principle of fair information practice”&lt;a href="#_ftn15" name="_ftnref15"&gt;[15]&lt;/a&gt; in addition to the principle of notice. Other guidelines on fair data processing published by organisations like the Organisation for Economic Cooperation and Development&lt;a href="#_ftn16" name="_ftnref16"&gt;[16]&lt;/a&gt; (OECD) or Canadian Standards Association&lt;a href="#_ftn17" name="_ftnref17"&gt;[17]&lt;/a&gt; (CSA) also include consent as a key mechanism in data protection.&lt;/p&gt;
&lt;h3&gt;The origins of consent in privacy and data protection&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Given the clearly extensive reliance on consent in data protection, it seems prudent to examine the origins of consent in privacy and data protection. Just why does consent have so much weight in data protection?&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;One reason is that data protection, along with inextricably linked concerns about privacy, could be said to be rooted in protecting private property. It was argued that the “early parameters of what was to become the right to privacy were set in cases dealing with unconventional property claims”&lt;a href="#_ftn18" name="_ftnref18"&gt;[18]&lt;/a&gt;, such as unconsented publication of personal letters&lt;a href="#_ftn19" name="_ftnref19"&gt;[19]&lt;/a&gt; or photographs&lt;a href="#_ftn20" name="_ftnref20"&gt;[20]&lt;/a&gt;. It was the publication of Brandeis and Warren’s well-known article “The Right to Privacy”&lt;a href="#_ftn21" name="_ftnref21"&gt;[21]&lt;/a&gt; that developed “the current philosophical dichotomy between privacy and property rights”&lt;a href="#_ftn22" name="_ftnref22"&gt;[22]&lt;/a&gt;, as they asserted that privacy protections ought to be recognised as a right in and of themselves and needed separate protection&lt;a href="#_ftn23" name="_ftnref23"&gt;[23]&lt;/a&gt;. Indeed, it was Warren and Brandeis who famously borrowed Justice Cooley's expression that privacy is the “right to be let alone”&lt;a href="#_ftn24" name="_ftnref24"&gt;[24]&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;On the other side of the debate are scholars like Epstein and Posner, who see privacy protections as part of protecting personal property under tort law&lt;a href="#_ftn25" name="_ftnref25"&gt;[25]&lt;/a&gt;. However, the central point is that most scholars seem to acknowledge the relationship between privacy and private property. Even Brandeis and Warren themselves argued that one general aim of privacy is “to protect the privacy of private life, and to whatever degree and in whatever connection a man's life has ceased to be private”&lt;a href="#_ftn26" name="_ftnref26"&gt;[26]&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;It is also important to locate the idea of consent within the domain of privacy and private property protections. Ostensibly, consent seems to have the effect of lessening the privacy protections afforded in a particular situation to a person, because by acquiescing to the situation, one could be seen as waiving their privacy concerns. Brandeis and Warren concur with this position as they acknowledge how “the right to privacy ceases upon the publication of the facts by the individual, or with his consent”&lt;a href="#_ftn27" name="_ftnref27"&gt;[27]&lt;/a&gt;. They assert that this is “but another application of the rule which has become familiar in the law of literary and artistic property”&lt;a href="#_ftn28" name="_ftnref28"&gt;[28]&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Perhaps the most eloquent articulation of the importance of consent in privacy comes from Sir Edward Coke’s idea that “every man’s house is his castle”&lt;a href="#_ftn29" name="_ftnref29"&gt;[29]&lt;/a&gt;. Though the ‘Castle Doctrine’ has been used as a justification for protecting one’s property with the use of force&lt;a href="#_ftn30" name="_ftnref30"&gt;[30]&lt;/a&gt;, I think that implied in the ‘Castle Doctrine’ is the idea that consent is necessary in order to preserve privacy. If not, why would anyone be justified in preventing trespass, other than to prevent unconsented entry or use of their property? The doctrine of “Volenti non fit injuria”&lt;a href="#_ftn31" name="_ftnref31"&gt;[31]&lt;/a&gt;, or ‘to one who consents no injury is done’, is thus the very embodiment of the role of consent in protecting private property. And as conceptions of private property develop to recognise that the data one gives out is part of one’s private property, for example in &lt;i&gt;US v. Jones&lt;/i&gt;, which led scholars to assert that “people should be able to maintain reasonable expectations of privacy in some information voluntarily disclosed to third parties”&lt;a href="#_ftn32" name="_ftnref32"&gt;[32]&lt;/a&gt;, so too does consent act as an important aspect of privacy protection.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Yet, linking privacy with private property is not universally accepted as the conception of privacy. For instance, Alan Westin, in his book Privacy and Freedom&lt;a href="#_ftn33" name="_ftnref33"&gt;[33]&lt;/a&gt;, describes privacy as “the right to control information about oneself”&lt;a href="#_ftn34" name="_ftnref34"&gt;[34]&lt;/a&gt;. Another scholar, Ruth Gavison, contends instead that “our interest in privacy is related to our concern over our accessibility to others: the extent to which we are known to others, the extent to which others have physical access to us, and the extent to which we are the subject of others' attention”&lt;a href="#_ftn35" name="_ftnref35"&gt;[35]&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;While these alternative notions about privacy’s foundational principles may differ from those related to linking privacy with private property, locating consent within these formulations of privacy is possible. Regarding Westin’s argument, I think that implicit in the right to control one’s information are ideas about individual autonomy, which is exercised through giving or withholding one’s consent. Similarly, Gavison herself states that privacy functions to advance “liberty, autonomy and selfhood”&lt;a href="#_ftn36" name="_ftnref36"&gt;[36]&lt;/a&gt;. Consent plays a key role in upholding this liberty, autonomy and selfhood that privacy affords us. Clearly therefore, it is far from unfounded to claim that consent is an integral part of protecting privacy.&lt;/p&gt;
&lt;h3&gt;Consent, Big Data and Data protection&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Given the solid underpinnings of the principle of consent in privacy protection, it was hardly a coincidence that consent became an integral part of data protection. However, with the rise of big data practices, one quickly finds that consent ceases to work effectively as a tool for protecting privacy. In a big data context, Solove argues that privacy regulation rooted in consent is ineffective, because garnering consent amidst ubiquitous data collection for all the online services one uses as part of daily life is unmanageable&lt;a href="#_ftn37" name="_ftnref37"&gt;[37]&lt;/a&gt;. Additionally, the secondary uses of one’s data are difficult to assess at the point of collection, and subsequently meaningful consent for secondary use is difficult to obtain&lt;a href="#_ftn38" name="_ftnref38"&gt;[38]&lt;/a&gt;. This section examines these two primary consequences of prioritising consent amidst big data practices.&lt;/p&gt;
&lt;h3&gt;Consent places unrealistic and unfair expectations on the Individual&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;As noted by Tene and Polonetsky, the first concern is that current privacy frameworks which emphasize informed consent “impose significant, sometimes unrealistic, obligations on both organizations and individuals”&lt;a href="#_ftn39" name="_ftnref39"&gt;[39]&lt;/a&gt;. The premise behind this argument stems from the way that consent is often garnered by organisations, especially regarding use of their services. An examination of various terms of use policies from banks, online video streaming websites, social networking sites, online fashion or more general online shopping websites reveals a deluge of information that the user has to comprehend. Moreover, there are too many “entities collecting and using personal data to make it feasible for people to manage their privacy separately with each entity”&lt;a href="#_ftn40" name="_ftnref40"&gt;[40]&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;As Cate and Mayer-Schönberger note in the Microsoft Global Privacy Summit Summary Report, “almost everywhere that individuals venture, especially online, they are presented with long and complex privacy notices routinely written by lawyers for lawyers, and then requested to either “consent” or abandon the use of the desired service”&lt;a href="#_ftn41" name="_ftnref41"&gt;[41]&lt;/a&gt;. In some cases, organisations try to simplify these policies for the users of their service, but such initiatives make up the minority of terms of use policies. Tene and Polonetsky assert that “it is common knowledge among practitioners in the field that privacy policies serve more as liability disclaimers for businesses than as assurances of privacy for consumers”&lt;a href="#_ftn42" name="_ftnref42"&gt;[42]&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;However, it is equally important to consider the principle of consent from the perspective of companies. At a time when many businesses have to comply with numerous regulations and processes in the name of ‘compliance’&lt;a href="#_ftn43" name="_ftnref43"&gt;[43]&lt;/a&gt;, the obligations for obtaining consent could burden some businesses. Firms have to gather consent while also enhancing user or customer experiences, which is a tricky balance to strike. For example, requiring consent at every stage may make the user experience much worse. Imagine having to give consent for your profile to be uploaded every time you achieve a high score in a video game. At the same time, “organizations are expected to explain their data processing activities on increasingly small screens and obtain consent from often-uninterested individuals”&lt;a href="#_ftn44" name="_ftnref44"&gt;[44]&lt;/a&gt;. Given these factors, it is somewhat understandable that companies garner consent up front for all possible (secondary) uses, as obtaining it again later is often not feasible.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Nonetheless, this results in situations where “data processors can perhaps too easily point to the formality of notice and consent and thereby abrogate much of their responsibility”&lt;a href="#_ftn45" name="_ftnref45"&gt;[45]&lt;/a&gt;. The totality of the situation shows the odds stacked against the individual. It could even be argued that this is one manifestation of the informational and power asymmetry that exists between individuals and organisations&lt;a href="#_ftn46" name="_ftnref46"&gt;[46]&lt;/a&gt;, because users may unwittingly agree to unfair, unclear or even unknown terms, conditions and data practices. Not only are individuals greatly misinformed about the data collected about them, but the vast majority of people do not even read these Terms and Conditions or End User License Agreements&lt;a href="#_ftn47" name="_ftnref47"&gt;[47]&lt;/a&gt;. Solove also argues that “people often lack enough expertise to adequately assess the consequences of agreeing to certain present uses or disclosures of their data”&lt;a href="#_ftn48" name="_ftnref48"&gt;[48]&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;While the organisational practice of providing extensive and complicated terms of use policies is not illegal, the fact that, by one estimation, it would take 76 working days to review the privacy policies you have agreed to online&lt;a href="#_ftn49" name="_ftnref49"&gt;[49]&lt;/a&gt;, or, by another, that in the USA the opportunity cost society incurs in reading privacy policies is $781 billion&lt;a href="#_ftn50" name="_ftnref50"&gt;[50]&lt;/a&gt;, should not go unnoticed. I do think it is unfair for the law to put users into such situations, where they are “forced to make overly complex decisions based on limited information”&lt;a href="#_ftn51" name="_ftnref51"&gt;[51]&lt;/a&gt;. There have been laudable attempts by some government organisations, like Canada’s Office of the Privacy Commissioner and the USA’s Federal Trade Commission, to provide guidance to firms to make their privacy policies more accessible&lt;a href="#_ftn52" name="_ftnref52"&gt;[52]&lt;/a&gt;. However, these are hard to enforce. Therefore, it can be assumed that when users have neither the expertise nor the rigour to review privacy policies effectively, the consent they provide will naturally be far from informed.&lt;/p&gt;
&lt;h3&gt;Secondary use, Aggregation and Superficial Consent&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;What amplifies this informational asymmetry is the potential for the aggregation of individuals’ data and the subsequent secondary use of the data collected. “Even if people made rational decisions about sharing individual pieces of data in isolation, they greatly struggle to factor in how their data might be aggregated in the future”&lt;a href="#_ftn53" name="_ftnref53"&gt;[53]&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This has to do with the prevalence of big data analytics that characterises our modern epoch, and it has major implications for the nature and meaningfulness of the consent users provide. By definition, “big data analysis seeks surprising correlations”&lt;a href="#_ftn54" name="_ftnref54"&gt;[54]&lt;/a&gt;, and some of its most insightful results are counterintuitive and nearly impossible to conceive of at the point of primary data collection. One noteworthy example comes from the USA, with Walmart’s predictive analytics. By studying the purchasing patterns of its loyalty card holders&lt;a href="#_ftn55" name="_ftnref55"&gt;[55]&lt;/a&gt;, the company ascertained that prior to a hurricane the most popular items that people tend to buy are actually Pop-Tarts (a pre-baked toaster pastry) and beer&lt;a href="#_ftn56" name="_ftnref56"&gt;[56]&lt;/a&gt;. These correlations are highly counterintuitive and far from what people expect to be necessities before a hurricane. These insights led to Walmart stores being stocked with the most relevant products at the time of need. This is one example of how data might be repurposed and aggregated for a novel purpose, but the question about the nature of the consent obtained by Walmart for the collection and analysis of the shopping habits of its loyalty card holders nonetheless stands.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;One reason secondary uses make consent less meaningful has been articulated by De Zwart et al, who observe that “the idea of consent becomes unworkable in an environment where it is not known, even by the people collecting and selling data, what will happen to the data”&lt;a href="#_ftn57" name="_ftnref57"&gt;[57]&lt;/a&gt;. Taken together with Solove’s aggregation effect, two points become apparent:&lt;/p&gt;
&lt;ol&gt;
&lt;li style="text-align: justify; "&gt;Data we consent to be collected about us may be aggregated with other data we may have revealed in the past. While separately they may be innocuous, there is a risk of future aggregation to create new information which one may find overly intrusive and not consent to. However, current data protection regimes make it hard for one to provide such consent, because there is no way for the user to know how his past and present data may be aggregated in the future.&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;Data we consent to be collected for one specific purpose may be used in a myriad of other ways. The user has virtually no way to know how their data might be repurposed, because oftentimes neither do the collectors of that data&lt;a href="#_ftn58" name="_ftnref58"&gt;[58]&lt;/a&gt;.&lt;/li&gt;
&lt;/ol&gt;
&lt;p style="text-align: justify; "&gt;Therefore, regulators’ reliance on principles of purpose limitation and the mechanism of consent for robust data protection seems suboptimal at the very least, as big data practices of aggregation, repurposing and secondary use become commonplace.&lt;/p&gt;
&lt;h3&gt;Other problems with the mechanism of consent in the context of Big Data&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;On one end of the spectrum are situations where organisations garner consent for future secondary uses at the time of data collection. As discussed earlier, this is currently the common practice for organisations and the likelihood of users providing informed consent is low.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;However, equally valid is considering the situations on the other end of the spectrum, where obtaining user consent for secondary use becomes too expensive and cumbersome&lt;a href="#_ftn59" name="_ftnref59"&gt;[59]&lt;/a&gt;. As a result, potentially socially valuable secondary use of data for research and innovation, or simply “the practice of informed and reflective citizenship”&lt;a href="#_ftn60" name="_ftnref60"&gt;[60]&lt;/a&gt;, may not take place. While potential social research may be hindered by the consent requirement, the reality that one cannot give meaningful consent to unknown secondary uses of data is more pressing. Essentially, not knowing what you are consenting to scarcely provides the individual with any semblance of strong privacy protection, and so the consent that individuals provide is superficial at best.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Many scholars also point to the binary nature of consent as it stands today&lt;a href="#_ftn61" name="_ftnref61"&gt;[61]&lt;/a&gt;. Solove describes consent in data protection as nuanced&lt;a href="#_ftn62" name="_ftnref62"&gt;[62]&lt;/a&gt;, while Cate and Mayer-Schönberger go further, asserting that “binary choice is not what the privacy architects envisioned four decades ago when they imagined empowered individuals making informed decisions about the processing of their personal data”. This dichotomous nature of consent further reduces its usefulness in data protection regimes.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Whether data collection is opted into or opted out of also has a bearing on the nature of the consent obtained. Many argue that regulations with options to opt out are not effective, as “opt-out consent might be the product of mere inertia or lack of awareness of the option to opt out”&lt;a href="#_ftn63" name="_ftnref63"&gt;[63]&lt;/a&gt;. This is in line with initiatives around the world to make gathering consent more explicit by offering options to opt in instead of opt out. Notable articulations of the impetus to embrace opt-in regimes include those of former FTC chairman Jon Leibowitz as early as 2007&lt;a href="#_ftn64" name="_ftnref64"&gt;[64]&lt;/a&gt;, and opt-in requirements are being actively considered by the EU in the reform of its data protection laws&lt;a href="#_ftn65" name="_ftnref65"&gt;[65]&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;However, as Solove rightly points out, opt-in consent is problematic as well&lt;a href="#_ftn66" name="_ftnref66"&gt;[66]&lt;/a&gt;. There are a few reasons for this: first, many data collectors have the “sophistication and motivation to find ways to generate high opt-in rates”&lt;a href="#_ftn67" name="_ftnref67"&gt;[67]&lt;/a&gt; by “conditioning products, services, or access on opting in”&lt;a href="#_ftn68" name="_ftnref68"&gt;[68]&lt;/a&gt;. In essence, they leave individuals no choice but to opt into data collection, because using their particular product or service is dependent, or ‘conditional’, on explicit consent. A pertinent example of this is the end-user license agreement for Apple’s iTunes Store&lt;a href="#_ftn69" name="_ftnref69"&gt;[69]&lt;/a&gt;. Solove rightly notes that “if people want to download apps from the store, they have no choice but to agree. This requirement is akin to an opt-in system — affirmative consent is being sought. But hardly any bargaining or choosing occurs in this process”&lt;a href="#_ftn70" name="_ftnref70"&gt;[70]&lt;/a&gt;. Second, as stated earlier, obtaining consent runs the risk of impeding potential innovation or research because it is too cumbersome or expensive to obtain&lt;a href="#_ftn71" name="_ftnref71"&gt;[71]&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Third, as Tene and Polonetsky argue, “collective action problems threaten to generate a suboptimal equilibrium where individuals fail to opt into societally beneficial data processing in the hope of free-riding on others’ good will”&lt;a href="#_ftn72" name="_ftnref72"&gt;[72]&lt;/a&gt;. A useful example to illustrate this comes from another context where obtaining consent is the difference between life and death: organ donation. The gulf in consenting donors between countries with an opt in regime for organ donation and countries with an opt out regime is staggering. Even countries that are culturally similar, such as Austria and Germany, exhibit vast differences in donation rates – Austria at 99% compared to just 12% in Germany&lt;a href="#_ftn73" name="_ftnref73"&gt;[73]&lt;/a&gt;. This suggests that in terms of obtaining consent (especially for socially valuable actions), opt in methods may be limiting, because people may have an aversion to anything being presumed about their choices, even if costs of opting out are low&lt;a href="#_ftn74" name="_ftnref74"&gt;[74]&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;What the above section demonstrates is how consent may be somewhat limited as a tool for data protection regimes, especially in a big data context. That said, consent is not in itself a useless or outdated concept. The points raised above articulate the problems that extensive reliance on consent poses in a big data context. Consent should still remain a part of data protection regimes. However, there are both better ways to obtain consent (for organisations that collect data) as well as other areas to focus regulatory attention on aside from the point of data collection.&lt;/p&gt;
&lt;h3&gt;What can organisations do to obtain more meaningful consent?&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Organisations that collect data could alter the way they obtain user consent. Most people can attest to having checked a box lying surreptitiously next to the words ‘I agree’, thereby agreeing to the Terms and Conditions or End-user License Agreement for a particular service or product. This is in line with the need for both parties to assent to the terms of a contract as part of making a contract valid&lt;a href="#_ftn75" name="_ftnref75"&gt;[75]&lt;/a&gt;. Some of the more common types of online agreements that users enter into are Clickwrap and Browsewrap agreements. A Clickwrap agreement is “formed entirely in an online environment such as the Internet, which sets forth the rights and obligations between parties”&lt;a href="#_ftn76" name="_ftnref76"&gt;[76]&lt;/a&gt;. They “require a user to click "I agree" or “I accept” before the software can be downloaded or installed”&lt;a href="#_ftn77" name="_ftnref77"&gt;[77]&lt;/a&gt;. On the other hand, Browsewrap agreements “try to characterize your simple use of their website as your ‘agreement’ to a set of terms and conditions buried somewhere on the site”&lt;a href="#_ftn78" name="_ftnref78"&gt;[78]&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Because Browsewrap agreements do not “require a user to engage in any affirmative conduct”&lt;a href="#_ftn79" name="_ftnref79"&gt;[79]&lt;/a&gt;, the kind of consent that these agreements obtain is highly superficial. In fact, many argue that such agreements are slightly unscrupulous, because users are seldom aware that they exist&lt;a href="#_ftn80" name="_ftnref80"&gt;[80]&lt;/a&gt;, often hidden in small print&lt;a href="#_ftn81" name="_ftnref81"&gt;[81]&lt;/a&gt; or below the download button&lt;a href="#_ftn82" name="_ftnref82"&gt;[82]&lt;/a&gt;, for example. And the courts have begun to consider unfair such terms and practices, which “hold website users accountable for terms and conditions of which a reasonable Internet user would not be aware just by using the site”&lt;a href="#_ftn83" name="_ftnref83"&gt;[83]&lt;/a&gt;. For example, in &lt;i&gt;In re Zappos.com Inc., Customer Data Security Breach Litigation&lt;/i&gt;, the court said of the site’s Terms of Use (which formed a browsewrap agreement):&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;“The Terms of Use is inconspicuous, buried in the middle to bottom of every Zappos.com webpage among many other links, and the website never directs a user to the Terms of Use. No reasonable user would have reason to click on the Terms of Use”&lt;a href="#_ftn84" name="_ftnref84"&gt;[84]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Clearly, courts recognise the potential for consent or assent to be obtained in a hardly transparent or hands on manner. Organisations that collect data should be aware of this and consider other options for obtaining consent.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;A few commentators have suggested that organisations switch to using Clickwrap or clickthrough agreements to obtain consent. Undergirding this argument is the fact that courts have on numerous occasions, upheld the validity of a Clickwrap agreement. Such cases include &lt;i&gt;Groff v. America Online, Inc&lt;a href="#_ftn85" name="_ftnref85"&gt;&lt;b&gt;[85]&lt;/b&gt;&lt;/a&gt;&lt;/i&gt; and &lt;i&gt;Hotmail Corporation v. Van Money Pie, Inc&lt;a href="#_ftn86" name="_ftnref86"&gt;&lt;b&gt;[86]&lt;/b&gt;&lt;/a&gt;&lt;/i&gt;. These cases built upon the precedent-setting case of &lt;i&gt;Pro CD v. Zeidenberg&lt;/i&gt;, in which the court ruled that “Shrinkwrap licenses are enforceable unless their terms are objectionable on grounds applicable to contracts in general”&lt;a href="#_ftn87" name="_ftnref87"&gt;[87]&lt;/a&gt;. Shrinkwrap licenses, which refer to end user license agreements printed on the shrinkwrap of a software product which a user will definitely notice and have the opportunity to read before opening and using the product, and the rules that govern them, have seen application to clickthrough agreements. As Bayley rightly noted, the validity of clickthrough agreements is dependent on “reasonable notice and opportunity to review—whether the placement of the terms and click-button afforded the user a reasonable opportunity to find and read the terms without much effort”&lt;a href="#_ftn88" name="_ftnref88"&gt;[88]&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;From the perspective of companies and other organisations which attempt to garner consent from users to collect and process their data, utilizing Clickwrap agreements might be one useful solution to consider in obtaining more meaningful and informed consent. In fact Bayley contends that clear Clickwrap agreements are “the “best practice” mechanism for creating a contractual relationship between an online service and a user”&lt;a href="#_ftn89" name="_ftnref89"&gt;[89]&lt;/a&gt;. He suggests the following mechanism for acquiring clear and informed consent via contractual agreement&lt;a href="#_ftn90" name="_ftnref90"&gt;[90]&lt;/a&gt;:&lt;/p&gt;
&lt;ol&gt;
&lt;li style="text-align: justify; "&gt;Conspicuously present the TOS to the user prior to any payment (or other commitment by the user) or installation of software (or other changes to a user’s machine or browser, like cookies, plug-ins, etc.)&lt;/li&gt;
&lt;li&gt;Allow the user to easily read and navigate all of the terms (i.e. be in a normal, readable typeface with no scroll box)&lt;/li&gt;
&lt;li&gt;Provide an opportunity to print, and/or save a copy of, the terms&lt;/li&gt;
&lt;li&gt;Offer the user the option to decline as prominently and by the same method as the option to agree&lt;/li&gt;
&lt;li&gt;Ensure the TOS is easy to locate online after the user agrees.&lt;/li&gt;
&lt;/ol&gt;
&lt;p style="text-align: justify; "&gt;These principles make a lot of sense for organisations, as it requires relatively minor procedural changes instead of more transformational efforts to alter the way the validate their data processing processes entirely.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Herzfield adds two further suggestions to this list. First, organisations should not allow any use of their product or service until “express and active manifestation of assent”&lt;a href="#_ftn91" name="_ftnref91"&gt;[91]&lt;/a&gt;. Also, they should institute processes where users re-iterate their consent and assent to the terms of use&lt;a href="#_ftn92" name="_ftnref92"&gt;[92]&lt;/a&gt;. He goes further to propose a baseline that organisations should follow: “companies should always provide at least inquiry notice of all terms, and require counterparties to manifest assent, through action or inaction, in a manner that reasonable people would clearly understand to be assent”&lt;a href="#_ftn93" name="_ftnref93"&gt;[93]&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;While obtaining informed and meaningful consent is neither fool proof nor a process which has widely accepted clear steps, what is clear is that current efforts by organisations may be insufficient. As Cate and Mayer-Schönberger note, “data processors can perhaps too easily point to the formality of notice and consent and thereby abrogate much of their responsibility”&lt;a href="#_ftn94" name="_ftnref94"&gt;[94]&lt;/a&gt;. One thing they can do to both ensure more meaningful and informed consent (from the perspective of the users) and preventing potential legal action for unscrupulous or unfair terms is to change the way they obtain consent from opt out to opt in.&lt;/p&gt;
&lt;h3&gt;Conclusion: how should regulation change?&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;In conclusion, the current emphasis and extensive use of consent in data protection seems to be limited in effectively protecting against illegitimate processing of data in a big data context. More people are starting to use online services extensively. This is coupled by the fact that organisations are realizing the value of collecting and analysing user data to carry out data-driven analytics for insights that can improve the efficacy of the product. Clearly, data protection has never been more crucial.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;However not only does emphasising consent seem less relevant, because the consent organisations obtain is seldom informed, but it may even jeopardise the intentions of data protection. Commentators are quick to point out how nimble firms are at acquiring consent in newer ways that may comply with laws but still allow them to maintain their advantageous position of asymmetric power. Kuner, Cate, Millard and Svantesson, all eminent scholars in the field of Big data, asked the prescient question: “Is there a proper role for individual consent?”&lt;a href="#_ftn95" name="_ftnref95"&gt;[95]&lt;/a&gt;They believe consent still has a role, but that finding this role in the Big data context is challenging&lt;a href="#_ftn96" name="_ftnref96"&gt;[96]&lt;/a&gt;. However, there is surprising consensus on the approach that should be taken as data protection regimes shift away from consent.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In fact, the alternative is staring at us in the face: data protection regimes have to look elsewhere, to other points along the data analysis process for aspects to regulate and ensure legitimate and fair processing of data. One compelling idea which had broad-based support during the aforementioned Microsoft Privacy Summit was that “new approaches must shift responsibility away from data subjects toward data users and toward a focus on accountability for responsible data stewardship”&lt;a href="#_ftn97" name="_ftnref97"&gt;[97]&lt;/a&gt;, ie creating regulations to guide data processing instead of the data collection. De Zwart et al. suggest that regulation must instead “focus on the processes involved in establishing algorithms and the use of the resulting conclusions”&lt;a href="#_ftn98" name="_ftnref98"&gt;[98]&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This might involve regulations relating to requiring data collectors to publish the queries they run on the data. This would be a solution that balances maintaining the ‘trade secret’ of the firm, who has creatively designed an algorithm, with ensuring fairness and legitimacy in data processing. One manifestation of this approach is in conceptualising procedural data due process which “would regulate the fairness of Big Data’s analytical processes with regard to how they use personal data (or metadata derived from or associated with personal data) in any adjudicative process, including processes whereby Big Data is being used to determine attributes or categories for an individual”&lt;a href="#_ftn99" name="_ftnref99"&gt;[99]&lt;/a&gt;. While there is debate regarding the usefulness of a data due process, the idea of data due process is just part of the consortium of ideas surrounding alternatives to consent in data protection. The main point is that “greater transparency should be required if there are fewer opportunities for consent or if personal data can be lawfully collected without consent”&lt;a href="#_ftn100" name="_ftnref100"&gt;[100]&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;It is also worth considering exactly what a single use of group or individual’s data is, and what types of uses or processes require a “greater form of authorization”&lt;a href="#_ftn101" name="_ftnref101"&gt;[101]&lt;/a&gt;. Certain data processes could require special affirmative consent to be procured, which is not applicable for other less intimate matters. Canada’s Office of the Privacy Commissioner released a privacy toolkit for organisations, in which they provide some exceptions to the consent principle, one of which is if data collection “is clearly in the individual’s interests and consent is not available in a timely way”&lt;a href="#_ftn102" name="_ftnref102"&gt;[102]&lt;/a&gt;. Some therefore suggest that “if notice and consent are reserved for more appropriate uses, individuals might pay more attention when this mechanism is used”&lt;a href="#_ftn103" name="_ftnref103"&gt;[103]&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Another option for regulators is to consider the development and implementation of a sticky privacy policies regime. This refers to “machine-readable policies [that] can stick to data to define allowed usage and obligations as it travels across multiple parties, enabling users to improve control over their personal information”&lt;a href="#_ftn104" name="_ftnref104"&gt;[104]&lt;/a&gt;. Sticky privacy policies seem to alleviate the risk of repurposed, unanticipated uses of data because users who consent to giving out their data will be consenting to how it is used thereafter. However, the counter to sticky policies is that it places even greater obligations on users to decide how they would like their data used, not just at one point but for the long term. To expect organisations to state their purposes for future use of individuals data or that individuals are to give informed consent to such uses seems farfetched from both perspectives.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Still another solution draws from the noted scholar Helen Nissenbaum’s work on privacy. She argues that “the benchmark of privacy is contextual integrity”&lt;a href="#_ftn105" name="_ftnref105"&gt;[105]&lt;/a&gt;. ”Contextual integrity ties adequate protection for privacy to norms of specific contexts, demanding that information gathering and dissemination be appropriate to that context and obey the governing norms of distribution within it”&lt;a href="#_ftn106" name="_ftnref106"&gt;[106]&lt;/a&gt;. According to this line of thinking, legislators should instead focus their attention on what constitutes appropriateness in certain contexts, although this could be a challenging task as contexts merge and understandings of appropriateness change according to the circumstances of a context. .&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;While there is little consensus regarding the numerous ways to focus regulatory attention on data processing and the uses of data collected, there is more support for a shift away from consent, as exemplified by the Microsoft privacy Summit:&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;“There was broad general agreement that privacy frameworks that rely heavily on individual notice and consent are neither sustainable in the face of dramatic increases in the volume and velocity of information flows nor desirable because of the burden they place on individuals to understand the issues, make choices, and then engage in oversight and enforcement.”&lt;a href="#_ftn107" name="_ftnref107"&gt;[107]&lt;/a&gt; I think Cate and Mayer- Schönberger make for the most valid conclusion to this article, as well as to summarise the debate I have presented. They say that “in short, ensuring individual control over personal data is not only an increasingly unattainable objective of data protection, but in many settings it is an undesirable one as well.”&lt;a href="#_ftn108" name="_ftnref108"&gt;[108]&lt;/a&gt; We might very well be throwing the entire data protection regimes under the bus.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref1" name="_ftn1"&gt;[1]&lt;/a&gt; Gordon Rayner and Bill Gardner, “Men Must Prove a Woman Said ‘Yes’ under Tough New Rape Rules - Telegraph,” &lt;i&gt;The Telegraph&lt;/i&gt;, January 28, 2015, sec. Law and Order, http://www.telegraph.co.uk/news/uknews/law-and-order/11375667/Men-must-prove-a-woman-said-Yes-under-tough-new-rape-rules.html.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref2" name="_ftn2"&gt;[2]&lt;/a&gt; Legal Information Institute, “Implied Consent,” accessed August 25, 2015, https://www.law.cornell.edu/wex/implied_consent.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref3" name="_ftn3"&gt;[3]&lt;/a&gt; European Parliament, Council of the European Union, &lt;i&gt;Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the Protection of Individuals with Regard to the Processing of Personal Data and on the Free Movement of Such Data&lt;/i&gt;, 1995, http://eur-lex.europa.eu/legal-content/en/TXT/?uri=CELEX:31995L0046.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref4" name="_ftn4"&gt;[4]&lt;/a&gt; See supra note 3.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref5" name="_ftn5"&gt;[5]&lt;/a&gt; European Commission, “Stronger Data Protection Rules for Europe,” &lt;i&gt;European Commission Press Release Database&lt;/i&gt;, June 15, 2015, http://europa.eu/rapid/press-release_MEMO-15-5170_en.htm.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref6" name="_ftn6"&gt;[6]&lt;/a&gt; Council of the European Union, “Data Protection: Council Agrees on a General Approach,” June 15, 2015, http://www.consilium.europa.eu/en/press/press-releases/2015/06/15-jha-data-protection/.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref7" name="_ftn7"&gt;[7]&lt;/a&gt; See supra note 6.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref8" name="_ftn8"&gt;[8]&lt;/a&gt; Abraham L. Newman, &lt;i&gt;Protectors of Privacy: Regulating Personal Data in the Global Economy&lt;/i&gt; (Ithaca, NY: Cornell University Press, 2008).&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref9" name="_ftn9"&gt;[9]&lt;/a&gt; See supra note 8, at 24.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref10" name="_ftn10"&gt;[10]&lt;/a&gt; Ibid.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref11" name="_ftn11"&gt;[11]&lt;/a&gt; 15 U.S.C. §1681.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref12" name="_ftn12"&gt;[12]&lt;/a&gt; 5 U.S.C. § 552a.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref13" name="_ftn13"&gt;[13]&lt;/a&gt; 18 U.S.C. § 2510-22.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref14" name="_ftn14"&gt;[14]&lt;/a&gt; Federal Trade Commission, “Privacy Online: A Report to Congress,” June 1998, https://www.ftc.gov/sites/default/files/documents/reports/privacy-online-report-congress/priv-23a.pdf: 40.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref15" name="_ftn15"&gt;[15]&lt;/a&gt; See supra note 14, at 8.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref16" name="_ftn16"&gt;[16]&lt;/a&gt; Organisation for Economic Cooperation and Development, “2013 OECD Privacy Guidelines,” 2013, http://www.oecd.org/internet/ieconomy/privacy-guidelines.htm.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref17" name="_ftn17"&gt;[17]&lt;/a&gt; Canadian Standards Association, “Canadian Standards Association Model Code,” March 1996, https://www.cippguide.org/2010/06/29/csa-model-code/.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref18" name="_ftn18"&gt;[18]&lt;/a&gt; Mary Chlopecki, “The Property Rights Origins of Privacy Rights | Foundation for Economic Education,” August 1, 1992, http://fee.org/freeman/the-property-rights-origins-of-privacy-rights.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref19" name="_ftn19"&gt;[19]&lt;/a&gt; See &lt;i&gt;Pope v&lt;/i&gt;.&lt;i&gt; Curl &lt;/i&gt;(1741), available &lt;a href="http://www.commonlii.org/uk/cases/EngR/1741/500.pdf"&gt;here&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref20" name="_ftn20"&gt;[20]&lt;/a&gt; See &lt;i&gt;Prince Albert v. Strange&lt;/i&gt; (1849), available &lt;a href="http://www.bailii.org/ew/cases/EWHC/Ch/1849/J20.html"&gt;here&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref21" name="_ftn21"&gt;[21]&lt;/a&gt; Samuel D. Warren and Louis D. Brandeis, “The Right to Privacy,” &lt;i&gt;Harvard Law Review&lt;/i&gt; 4, no. 5 (December 15, 1890): 193–220, doi:10.2307/1321160.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref22" name="_ftn22"&gt;[22]&lt;/a&gt; See supra note 18.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref23" name="_ftn23"&gt;[23]&lt;/a&gt; Ibid.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref24" name="_ftn24"&gt;[24]&lt;/a&gt; See supra note 21.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref25" name="_ftn25"&gt;[25]&lt;/a&gt; See for example, Richard Epstein, “Privacy, Property Rights, and Misrepresentations,” &lt;i&gt;Georgia Law Review&lt;/i&gt;, January 1, 1978, 455. And Richard Posner, “The Right of Privacy,” &lt;i&gt;Sibley Lecture Series&lt;/i&gt;, April 1, 1978, http://digitalcommons.law.uga.edu/lectures_pre_arch_lectures_sibley/22.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref26" name="_ftn26"&gt;[26]&lt;/a&gt; See supra note 21, at 215.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref27" name="_ftn27"&gt;[27]&lt;/a&gt; &lt;a href="http://www.english.illinois.edu/-people-/faculty/debaron/582/582%20readings/right%20to%20privacy.pdf"&gt;See supra note 21, at 218&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref28" name="_ftn28"&gt;[28]&lt;/a&gt; &lt;a href="http://www.english.illinois.edu/-people-/faculty/debaron/582/582%20readings/right%20to%20privacy.pdf"&gt;Ibid.&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref29" name="_ftn29"&gt;[29]&lt;/a&gt; Adrienne W. Fawcett, “Q: Who Said: ‘A Man’s Home Is His Castle’?,” &lt;i&gt;Chicago Tribune&lt;/i&gt;, September 14, 1997, http://articles.chicagotribune.com/1997-09-14/news/9709140446_1_castle-home-sir-edward-coke.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref30" name="_ftn30"&gt;[30]&lt;/a&gt; Brendan Purves, “Castle Doctrine from State to State,” &lt;i&gt;South Source&lt;/i&gt;, July 15, 2011, http://source.southuniversity.edu/castle-doctrine-from-state-to-state-46514.aspx.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref31" name="_ftn31"&gt;[31]&lt;/a&gt; “Volenti Non Fit Injuria,” &lt;i&gt;E-Lawresources&lt;/i&gt;, accessed August 25, 2015, http://e-lawresources.co.uk/Volenti-non-fit-injuria.php.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref32" name="_ftn32"&gt;[32]&lt;/a&gt; Bryce Clayton Newell, “Local Law Enforcement Jumps on the Big Data Bandwagon: Automated License Plate Recognition Systems, Information Privacy, and Access to Government Information,” SSRN Scholarly Paper (Rochester, NY: Social Science Research Network, October 16, 2013), http://papers.ssrn.com/abstract=2341182.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref33" name="_ftn33"&gt;[33]&lt;/a&gt; Alan Westin, &lt;i&gt;Privacy and Freedom&lt;/i&gt; (Ig Publishing, 2015).&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref34" name="_ftn34"&gt;[34]&lt;/a&gt; Helen Nissenbaum, “Privacy as Contextual Integrity,” &lt;i&gt;Washington Law Review&lt;/i&gt; 79 (2004): 119.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref35" name="_ftn35"&gt;[35]&lt;/a&gt; Ruth Gavison, “Privacy and the Limits of Law,” &lt;i&gt;The Yale Law Journal&lt;/i&gt; 89, no. 3 (January 1, 1980): 421–71, doi:10.2307/795891: 423.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref36" name="_ftn36"&gt;[36]&lt;/a&gt; Ibid.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref37" name="_ftn37"&gt;[37]&lt;/a&gt; Daniel J. Solove, “Privacy Self-Management and the Consent Dilemma,” SSRN Scholarly Paper (Rochester, NY: Social Science Research Network, November 4, 2012), &lt;a href="http://papers.ssrn.com/abstract=2171018"&gt;http://papers.ssrn.com/abstract=2171018&lt;/a&gt;: 1888.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref38" name="_ftn38"&gt;[38]&lt;/a&gt; Ibid, at 1889.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref39" name="_ftn39"&gt;[39]&lt;/a&gt; Omer Tene and Jules Polonetsky, “Big Data for All: Privacy and User Control in the Age of Analytics,” SSRN Scholarly Paper (Rochester, NY: Social Science Research Network, September 20, 2012), &lt;a href="http://papers.ssrn.com/abstract=2149364"&gt;http://papers.ssrn.com/abstract=2149364&lt;/a&gt;: 261.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref40" name="_ftn40"&gt;[40]&lt;/a&gt; See supra note 37, at 1881.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref41" name="_ftn41"&gt;[41]&lt;/a&gt; Fred H. Cate and Viktor Mayer-Schönberger, “Notice and Consent in a World of Big Data - Microsoft Global Privacy Summit Summary Report and Outcomes,” Microsoft Global Privacy Summit, November 9, 2012, &lt;a href="http://www.microsoft.com/en-us/download/details.aspx?id=35596"&gt;http://www.microsoft.com/en-us/download/details.aspx?id=35596&lt;/a&gt;: 3.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref42" name="_ftn42"&gt;&lt;sup&gt;&lt;sup&gt;[42]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; See supra note 39.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref43" name="_ftn43"&gt;[43]&lt;/a&gt; See for example, US Securities and Exchange Commission, “Corporation Finance Small Business Compliance Guides,” accessed August 26, 2015, &lt;a href="https://www.sec.gov/info/smallbus/secg.shtml"&gt;https://www.sec.gov/info/smallbus/secg.shtml&lt;/a&gt; and Australian Securities &amp;amp; Investments Commission, “Compliance for Small Business,” accessed August 26, 2015, http://asic.gov.au/for-business/your-business/small-business/compliance-for-small-business/.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref44" name="_ftn44"&gt;[44]&lt;/a&gt; See supra note 39.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref45" name="_ftn45"&gt;[45]&lt;/a&gt; See supra note 41.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref46" name="_ftn46"&gt;[46]&lt;/a&gt; See supra note 8, at 24.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref47" name="_ftn47"&gt;[47]&lt;/a&gt; See for example, James Daley, “Don’t Waste Time Reading Terms and Conditions,” &lt;i&gt;The Telegraph&lt;/i&gt;, September 3, 2014, and Robert Glancy, “Will You Read This Article about Terms and Conditions? You Really Should Do,” &lt;i&gt;The Guardian&lt;/i&gt;, April 24, 2014, sec. Comment is free, http://www.theguardian.com/commentisfree/2014/apr/24/terms-and-conditions-online-small-print-information.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref48" name="_ftn48"&gt;[48]&lt;/a&gt; See supra note 37, at 1886.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref49" name="_ftn49"&gt;[49]&lt;/a&gt; Alex Hudson, “Is Small Print in Online Contracts Enforceable?,” &lt;i&gt;BBC News&lt;/i&gt;, accessed August 26, 2015, http://www.bbc.com/news/technology-22772321.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref50" name="_ftn50"&gt;[50]&lt;/a&gt; Aleecia M. McDonald and Lorrie Faith Cranor, “Cost of Reading Privacy Policies, The,” &lt;i&gt;I/S: A Journal of Law and Policy for the Information Society&lt;/i&gt; 4 (2009 2008): 541&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref51" name="_ftn51"&gt;[51]&lt;/a&gt; See supra note 41, at 4.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref52" name="_ftn52"&gt;[52]&lt;/a&gt; For Canada, see Office of the Privacy Commissioner of Canada, “Fact Sheet: Ten Tips for a Better Online Privacy Policy and Improved Privacy Practice Transparency,” October 23, 2013, &lt;a href="https://www.priv.gc.ca/resource/fs-fi/02_05_d_56_tips2_e.asp"&gt;https://www.priv.gc.ca/resource/fs-fi/02_05_d_56_tips2_e.asp&lt;/a&gt;. And Office of the Privacy Commissioner of Canada, “Privacy Toolkit - A Guide for Businesses and Organisations to Canada’s Personal Information Protection and Electronic Documents Act,” accessed August 26, 2015, https://www.priv.gc.ca/information/pub/guide_org_e.pdf.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;For USA, see Federal Trade Commission, “Internet of Things: Privacy &amp;amp; Security in a Connected World,” Staff Report (Federal Trade Commission, January 2015), https://www.ftc.gov/system/files/documents/reports/federal-trade-commission-staff-report-november-2013-workshop-entitled-internet-things-privacy/150127iotrpt.pdf.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref53" name="_ftn53"&gt;[53]&lt;/a&gt; See supra note 37, at 1889.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref54" name="_ftn54"&gt;[54]&lt;/a&gt; See supra note 39, at 261.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref55" name="_ftn55"&gt;[55]&lt;/a&gt; Jakki Geiger, “The Surprising Link Between Hurricanes and Strawberry Pop-Tarts: Brought to You by Clean, Consistent and Connected Data,” &lt;i&gt;The Informatica Blog - Perspectives for the Data Ready Enterprise&lt;/i&gt;, October 3, 2014, http://blogs.informatica.com/2014/03/10/the-surprising-link-between-strawberry-pop-tarts-and-hurricanes-brought-to-you-by-clean-consistent-and-connected-data/#fbid=PElJO4Z_kOu.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref56" name="_ftn56"&gt;[56]&lt;/a&gt; Constance L. Hays, “What Wal-Mart Knows About Customers’ Habits,” &lt;i&gt;The New York Times&lt;/i&gt;, November 14, 2004, http://www.nytimes.com/2004/11/14/business/yourmoney/what-walmart-knows-about-customers-habits.html.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref57" name="_ftn57"&gt;[57]&lt;/a&gt; M. J. de Zwart, S. Humphreys, and B. Van Dissel, “Surveillance, Big Data and Democracy: Lessons for Australia from the US and UK,” &lt;i&gt;Http://www.unswlawjournal.unsw.edu.au/issue/volume-37-No-2&lt;/i&gt;, 2014, &lt;a href="https://digital.library.adelaide.edu.au/dspace/handle/2440/90048"&gt;https://digital.library.adelaide.edu.au/dspace/handle/2440/90048&lt;/a&gt;: 722.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref58" name="_ftn58"&gt;[58]&lt;/a&gt; Ibid.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref59" name="_ftn59"&gt;[59]&lt;/a&gt; See supra note 41, at 3.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref60" name="_ftn60"&gt;[60]&lt;/a&gt; Julie E. Cohen, “What Privacy Is For,” SSRN Scholarly Paper (Rochester, NY: Social Science Research Network, November 5, 2012), http://papers.ssrn.com/abstract=2175406.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref61" name="_ftn61"&gt;[61]&lt;/a&gt; See supra note 37, at 1901.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref62" name="_ftn62"&gt;[62]&lt;/a&gt; Ibid.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref63" name="_ftn63"&gt;[63]&lt;/a&gt; See supra note 37, at 1899.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref64" name="_ftn64"&gt;[64]&lt;/a&gt; Jon Leibowitz, “So Private, So Public: Individuals, The Internet &amp;amp; The paradox of behavioural marketing” November 1, 2007, https://www.ftc.gov/sites/default/files/documents/public_statements/so-private-so-public-individuals-internet-paradox-behavioral-marketing/071031ehavior_0.pdf: 6.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref65" name="_ftn65"&gt;[65]&lt;/a&gt; See supra note 5.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref66" name="_ftn66"&gt;[66]&lt;/a&gt; See supra note 37, at 1898.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref67" name="_ftn67"&gt;[67]&lt;/a&gt; Ibid.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref68" name="_ftn68"&gt;[68]&lt;/a&gt; Ibid.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref69" name="_ftn69"&gt;[69]&lt;/a&gt; Ibid.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref70" name="_ftn70"&gt;[70]&lt;/a&gt; Ibid.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref71" name="_ftn71"&gt;[71]&lt;/a&gt; See supra note 41, at 3.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref72" name="_ftn72"&gt;[72]&lt;/a&gt; See supra note 39, at 261.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref73" name="_ftn73"&gt;[73]&lt;/a&gt; Richard H. Thaler, “Making It Easier to Register as an Organ Donor,” &lt;i&gt;The New York Times&lt;/i&gt;, September 26, 2009, http://www.nytimes.com/2009/09/27/business/economy/27view.html.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref74" name="_ftn74"&gt;[74]&lt;/a&gt; Ibid.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref75" name="_ftn75"&gt;[75]&lt;/a&gt; &lt;i&gt;The Oxford Introductions to U.S. Law: Contracts&lt;/i&gt;, 1 edition (New York: Oxford University Press, 2010): 67.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref76" name="_ftn76"&gt;[76]&lt;/a&gt; Francis M. Buono and Jonathan A. Friedman, “Maximizing the Enforceability of Click-Wrap Agreements,” &lt;i&gt;Journal of Technology Law &amp;amp; Policy&lt;/i&gt; 4, no. 3 (1999), http://jtlp.org/vol4/issue3/friedman.html.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref77" name="_ftn77"&gt;[77]&lt;/a&gt; North Carolina State University, “Clickwraps,” &lt;i&gt;Software @ NC State Information Technology&lt;/i&gt;, accessed August 26, 2015, http://software.ncsu.edu/clickwraps.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref78" name="_ftn78"&gt;[78]&lt;/a&gt; Ed Bayley, “The Clicks That Bind: Ways Users ‘Agree’ to Online Terms of Service,” &lt;i&gt;Electronic Frontier Foundation&lt;/i&gt;, November 16, 2009, https://www.eff.org/wp/clicks-bind-ways-users-agree-online-terms-service.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref79" name="_ftn79"&gt;[79]&lt;/a&gt; Ibid, at 2.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref80" name="_ftn80"&gt;[80]&lt;/a&gt; Ibid.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref81" name="_ftn81"&gt;[81]&lt;/a&gt; See &lt;i&gt;Nguyen v. Barnes &amp;amp; Noble Inc&lt;/i&gt;., (9&lt;sup&gt;th&lt;/sup&gt; Cir. 2014), available &lt;a href="http://cdn.ca9.uscourts.gov/datastore/opinions/2014/08/18/12-56628.pdf"&gt;here&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref82" name="_ftn82"&gt;[82]&lt;/a&gt; See &lt;i&gt;Specht v. Netscape Communications Corp.&lt;/i&gt;,(2d Cir. 2002), available &lt;a href="http://cyber.law.harvard.edu/stjohns/Specht_v_Netscape.pdf"&gt;here&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref83" name="_ftn83"&gt;[83]&lt;/a&gt; See supra note 78, at 2.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref84" name="_ftn84"&gt;[84]&lt;/a&gt; See &lt;i&gt;In Re: Zappos.com, Inc., Customer Data Security Breach Litigation&lt;/i&gt;, No. 3:2012cv00325: pg 8 line 23-26, available &lt;a href="http://digitalcommons.law.scu.edu/cgi/viewcontent.cgi?article=1152&amp;amp;context=historical"&gt;here&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref85" name="_ftn85"&gt;[85]&lt;/a&gt; See &lt;i&gt;Groff v. America Online&lt;/i&gt;, Inc., 1998, available &lt;a href="http://www.internetlibrary.com/cases/lib_case20.cfm"&gt;here&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref86" name="_ftn86"&gt;[86]&lt;/a&gt; Hotmail Corp. v. Van$ Money Pie, Inc., 1998, available &lt;a href="http://cyber.law.harvard.edu/property00/alternatives/hotmail.html"&gt;here&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref87" name="_ftn87"&gt;[87]&lt;/a&gt; ProCD Inc. v. Zeidenberg, (7th. Cir. 1996), available &lt;a href="https://www.law.cornell.edu/copyright/cases/86_F3d_1447.htm"&gt;here&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref88" name="_ftn88"&gt;[88]&lt;/a&gt; See supra note 78, at 1.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref89" name="_ftn89"&gt;[89]&lt;/a&gt; See supra note 78, at 2.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref90" name="_ftn90"&gt;[90]&lt;/a&gt; Ibid.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref91" name="_ftn91"&gt;[91]&lt;/a&gt; Oliver Herzfeld, “Are Website Terms Of Use Enforceable?,” &lt;i&gt;Forbes&lt;/i&gt;, January 22, 2013, http://www.forbes.com/sites/oliverherzfeld/2013/01/22/are-website-terms-of-use-enforceable/.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref92" name="_ftn92"&gt;[92]&lt;/a&gt; Ibid.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref93" name="_ftn93"&gt;[93]&lt;/a&gt; Ibid.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref94" name="_ftn94"&gt;[94]&lt;/a&gt; See supra note 41, at 3.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref95" name="_ftn95"&gt;[95]&lt;/a&gt; Christopher Kuner et al., “The Challenge of ‘big Data’ for Data Protection,” &lt;i&gt;International Data Privacy Law&lt;/i&gt; 2, no. 2 (May 1, 2012): 47–49, doi:10.1093/idpl/ips003: 49.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref96" name="_ftn96"&gt;[96]&lt;/a&gt; Ibid.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref97" name="_ftn97"&gt;[97]&lt;/a&gt; See supra note 41, at 5.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref98" name="_ftn98"&gt;[98]&lt;/a&gt; See supra note 57, at 723.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref99" name="_ftn99"&gt;[99]&lt;/a&gt; Kate Crawford and Jason Schultz, “Big Data and Due Process: Toward a Framework to Redress Predictive Privacy Harms,” SSRN Scholarly Paper (Rochester, NY: Social Science Research Network, October 1, 2013), http://papers.ssrn.com/abstract=2325784: 109.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref100" name="_ftn100"&gt;[100]&lt;/a&gt; See supra note 41, at 13.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref101" name="_ftn101"&gt;[101]&lt;/a&gt; See supra note 41, at 5.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref102" name="_ftn102"&gt;[102]&lt;/a&gt; See supra note 52, Privacy Toolkit, at 14.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref103" name="_ftn103"&gt;[103]&lt;/a&gt; See supra note 41, at 6.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref104" name="_ftn104"&gt;[104]&lt;/a&gt; Siani Pearson and Marco Casassa Mont, “Sticky Policies: An Approach for Managing Privacy across Multiple Parties,” &lt;i&gt;Computer&lt;/i&gt;, 2011.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref105" name="_ftn105"&gt;[105]&lt;/a&gt; See supra note 34, at 138.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref106" name="_ftn106"&gt;[106]&lt;/a&gt; See supra note 34, at 118.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref107" name="_ftn107"&gt;[107]&lt;/a&gt; See supra note 41, at 5.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref108" name="_ftn108"&gt;[108]&lt;/a&gt; See supra note 41, at 4.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/are-we-throwing-our-data-protection-regimes-under-the-bus'&gt;https://cis-india.org/internet-governance/blog/are-we-throwing-our-data-protection-regimes-under-the-bus&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Rohan George</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    

   <dc:date>2015-09-10T14:02:08Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/the-hindu-businessline-august-28-p-anima-the-new-tattler-in-town">
    <title>The new tattler in town</title>
    <link>https://cis-india.org/internet-governance/news/the-hindu-businessline-august-28-p-anima-the-new-tattler-in-town</link>
    <description>
        &lt;b&gt;WhatsApp messages, and in particular ‘admins’ of WhatsApp groups, come under pressure as rumour-mongering catches the attention of the police.&lt;/b&gt;
        &lt;p class="body" style="text-align: justify; "&gt;The article by P. Anima was published in the &lt;a class="external-link" href="http://www.thehindubusinessline.com/blink/know/the-new-tattler-in-town/article7587041.ece"&gt;Hindu Businessline&lt;/a&gt; on August 28, 2015. Sunil Abraham was quoted.&lt;/p&gt;
&lt;hr /&gt;
&lt;p class="body" style="text-align: justify; "&gt;In early August, Solapur in southeast Maharashtra was  gripped by a strange fear. Like most small towns, Solapur rarely makes  the headlines except when drought deepens. That changed as alarmed  villagers in almost all of the district’s 11 tehsils camped outdoors day  and night, on the look out for an unseen enemy. “In the &lt;i&gt;bastis&lt;/i&gt; (villages), residents kept night vigils, sitting around a fire,” Deepak  Homkar, a local journalist, recalls. Rumours were flying thick — of  theft, widespread looting and possible kidnapping of children. And all  of it over WhatsApp, the instant messaging app. Similar scenes were  reported from Ahmedabad a month earlier. Rumours of dacoity and  terrorist attacks spread panic in areas around Ahmedabad. Arrests were  made of those who had allegedly sent fear-mongering texts, but the  damage had already been done.&lt;/p&gt;
&lt;p class="body" style="text-align: justify; "&gt;With over 800 million,  and growing, active users worldwide, WhatsApp is popular among the 160  million smartphone users in India too. Neatly slotting lives into groups  of friends, work and family, it allows users to flit in and out of  interactions. But a fair amount of trouble-making is also springing up  from the app.&lt;/p&gt;
&lt;p class="body" style="text-align: justify; "&gt;In Solapur, the police tasked with putting the rumours to an end had visited &lt;i&gt;bastis&lt;/i&gt;,  narrowed down the suspected smartphone users and randomly checked their  WhatsApp messages. “We found these rumours on some, not other  [phones],” says a police official. After 36 hours of search, 16 young  men were held under IPC 505 1(B) for spreading alarm and fear in  Pandharpur tehsil alone. It included those who allegedly sent the  message and several ‘admins’ (those who open and manage the group  accounts). After being questioned and warned against repeating such  texts, the men were let off.&lt;/p&gt;
&lt;p class="body" style="text-align: justify; "&gt;The complicity of the  WhatsApp admin, whether as a passive onlooker or whether they forwarded  the messages themselves, remains hazy. “It is possible that some admins  may have forwarded the text, but I spoke to at least one who was held  only because he managed the group,” says Homkar.&lt;/p&gt;
&lt;p class="body" style="text-align: justify; "&gt;Are  the admins culpable in such situations? “Not at all,” say internet  experts, if the admin has had nothing to do with the fear-mongering  texts. But if the admin has forwarded a potentially harmful message,  he/she is accountable like anyone else. “It is then an act done  knowingly,” says Prasanth Sugathan, counsel at the Delhi-based Software  Freedom Law Centre. “The act of forwarding makes you accountable. The  burden of truth is on you,” he adds. Chinmayi Arun, research director at  Centre for Communication Governance, National Law University, Delhi,  pitches in, “Just being an inactive administrator of a large group, who  may not be able to vet all of its content, is very different from  forwarding rumours. People who forward rumours should be responsible  enough to at least highlight their doubtful veracity.”&lt;/p&gt;
&lt;p class="body" style="text-align: justify; "&gt;However,  if the admin is unconnected to the group activities, he cannot be held  for merely starting the group, they say. “Unless, of course, you have  started it for an illegal activity or to cause an offence,” says Sunil  Abraham, executive director of the Bengaluru-based The Centre for  Internet and Society. The WhatsApp admin, they point out, is a mere  intermediary. One who isn’t vested with any power, except to add or  remove members from the group.&lt;/p&gt;
&lt;p class="body" style="text-align: justify; "&gt;Twenty-three-year-old  Hrishikesh, from Dhanbad, is currently admin in six groups. In three,  he is one among multiple admins. He hardly keeps tab on the goings-on in  this space and is not acquainted with all members, he says. “A WhatsApp  admin has no control, no facility to moderate or tweak a message,” says  Sugathan. Abraham trots out Section 79 of the IT Act. “It gives the  admin immunity from liability that emerges from content posted by the  members,” he says. The best way to track the original senders in such  cases, he says, is to rope in the help of the telecom department, the  other intermediary (in the case of WhatsApp, the owner Facebook) and  blend it with some ‘old-fashioned’ detective work.&lt;/p&gt;
&lt;p class="body" style="text-align: justify; "&gt;“Counter  bad speech with good speech,” says Abraham, and that is often the best  way to deal with rumour-mongering. Instances like those at Solapur and  Ahmedabad have been rare, he reasons. “Such stuff can be dealt better  with education rather than regulation. All types of nuisance shouldn’t  be regulated. The cost of implementing new laws and training police  personnel for it is not cheap. In these cases, SMSes from the police  could go to every single mobile user in the district, telling them the  rumours are false.”&lt;/p&gt;
&lt;p class="body" style="text-align: justify; "&gt;Sugathan concurs with this.  Facebook, radio and other mass media should be used by the police to  quell rumours, he says. He points out that in the aftermath of the 2011  London riots, although social media was blamed for aggravating the  situation, there were ample warnings against shutting it down during  such times. “Blocking the medium is blocking an avenue for information.  One cannot arrest each and every person. So educating people works  better,” says Sugathan. Some like Abraham consider these hiccups  inevitable in our evolving use of social media. A new technology is  often considered sacrosanct and reliable. “From repeated exposure  emerges critical understanding. It will take us another five years to  know that Wikipedia is not the source of truth.”&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/the-hindu-businessline-august-28-p-anima-the-new-tattler-in-town'&gt;https://cis-india.org/internet-governance/news/the-hindu-businessline-august-28-p-anima-the-new-tattler-in-town&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>praskrishna</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    

   <dc:date>2015-09-26T16:31:00Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/the-changing-landscape-of-ict-governance-and-practice-convergence-and-big-data">
    <title>The Changing Landscape of ICT Governance and Practice - Convergence and Big Data</title>
    <link>https://cis-india.org/internet-governance/news/the-changing-landscape-of-ict-governance-and-practice-convergence-and-big-data</link>
    <description>
        &lt;p style="text-align: justify; "&gt;Sharat Chandra Ram was granted the &lt;a href="http://www.cprsouth.org/2015/02/call-for-applications-2015-young-scholar-awards/"&gt;Young Scholar Award 2015&lt;/a&gt; to attend the &lt;i&gt;Young Scholar Workshop (August 24 - 25, 2015)&lt;/i&gt; followed by main &lt;a href="http://www.cprsouth.org/"&gt;&lt;i&gt;CPRSouth2015 conference&lt;/i&gt; (Communication Policy Research South) conference &lt;i&gt;(26th - 28th August 2015&lt;/i&gt;)&lt;/a&gt; - "The Changing Landscape of ICT Governance and Practice - Convergence and Big Data"  that was co-organized by the 'Innovation Center for Big Data and Digital Convergence, Yuan Ze University, Taiwan. The agenda for Young Scholar 2015 pre-conferernce workshop can be accessed &lt;a class="external-link" href="http://www.cprsouth.org/cprsouth-2015-call-for-abstracts/cprsouth-2015-young-scholar-awards-call-for-applications/"&gt;here&lt;/a&gt;. The CPR South 2015: Conference Programme agenda can be accessed &lt;a class="external-link" href="http://www.cprsouth.org/cprsouth-2015-call-for-abstracts/cpr-south-2015-conference-programme/"&gt;here&lt;/a&gt;.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/the-changing-landscape-of-ict-governance-and-practice-convergence-and-big-data'&gt;https://cis-india.org/internet-governance/news/the-changing-landscape-of-ict-governance-and-practice-convergence-and-big-data&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>praskrishna</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Big Data</dc:subject>
    

   <dc:date>2015-09-07T13:48:37Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/cis-comments-and-recommendations-to-human-dna-profiling-bill-2015">
    <title>CIS Comments and Recommendations to the Human DNA Profiling Bill, June 2015</title>
    <link>https://cis-india.org/internet-governance/blog/cis-comments-and-recommendations-to-human-dna-profiling-bill-2015</link>
    <description>
        &lt;b&gt;The Centre for Internet &amp;amp; Society (CIS) submitted clause-by-clause comments on the Human DNA Profiling Bill that was circulated by the Department of Biotechnology on June 9, 2015.&lt;/b&gt;
        &lt;p class="Standard" style="text-align: justify; "&gt;The Centre for Internet and Society is a non-profit research organisation that works on policy issues relating to privacy, freedom of expression, accessibility for persons with diverse abilities, access to knowledge, intellectual property rights and openness. It engages in academic research to explore and affect the shape and form of Internet, along with its relationship with the Society, with particular emphasis on South-South dialogues and exchange. The Centre for Internet and Society was also a member of the Expert Committee which was constituted in the year 2013 by the Department of Biotechnology to discuss the draft Human DNA Profiling Bill.&lt;/p&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;&lt;span&gt;Missing aspects from the Bill&lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;The Human DNA Profiling Bill, 2015 has overlooked and has not touched upon the following crucial factors :&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;&lt;span&gt;Objects Clause&lt;/span&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;An ‘objects clause,’ detailing the intention of the legislature and containing principles to inform the application of a statute, in the main body of the statute is an enforceable mechanism to give directions to a statute and can be a formidable primary aid in statutory interpretation. [See, for example, section 83 of the Patents Act, 1970 that directly informed the Order of the Controller of Patents, Mumbai, in the matter of NATCO Pharma and Bayer Corporation in Compulsory Licence Application No. 1 of 2011.] Therefore, the Bill should incorporate an objects clause that makes clear that&lt;/p&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;“DNA profiles merely estimate the identity of persons, they do not conclusively establish unique identity, therefore forensic DNA profiling should only have probative value and not be considered as conclusive proof.&lt;/p&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;The Act recognises that all individuals have a right to privacy that must be continuously weighed against efforts to collect and retain DNA and in order to protect this right to privacy the principles of notice, confidentiality, collection limitation, personal autonomy, purpose limitation and data minimization must be adhered to at all times.”&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;&lt;span&gt;Collection and Consent&lt;/span&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;The Bill does not contain provisions regarding instances when the DNA samples can be collected from the individuals without consent (nor does the Bill establish or refer to an authorization procedure for such collection), when DNA samples can be collected from individuals only with informed consent, and how and in what instances individuals can withdraw their consent.  The issue of whether DNA samples can be collected without the consent of the individual is a vexed one and requires complex questions relating to individual privacy as well as the right against self incrimination. While the question of whether an accused can be made to give samples of blood, semen, etc. which had been in issue in a wide gamut of decisions in India has finally been settled by section 53 of the Code of Criminal Procedure, which allows collection of medical evidence from an accused, thus laying to rest any claims based on the right against self incrimination. However there are still issues dealing with the right to privacy and the violation thereof due to the non-consensual collection of DNA samples. This is an issue which needs to be addressed in this Act itself and should not be left unaddressed as this would only lead to a lack of clarity and protracted court cases to determine this issue. An illustration of this problem is where the Bill allows for collection of intimate body samples. There is a need for inclusion of stringent safeguard measures regarding the same since without such safeguards, the collection of intimate body samples would be an outright infringement of privacy. Further, maintaining a database for convicts and suspects is one thing, however collecting and storing intimate samples of individuals is a gross violation of the citizens’ right to privacy, and without adequate mechanisms regarding consent and security, stands at a huge risk of being misused.&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;&lt;span&gt;Privacy Safeguards&lt;/span&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;Presently, the Bill is being introduced without comprehensive privacy safeguards in place on issues such as consent, collection, retention, etc. as is evident from the comments made below. Though the DNA Board is given the responsibility of recommending best practices pertaining to privacy  (clause 13 (l)) – this is not adequate given the fact that India does not have a comprehensive privacy legislation. Though &lt;a href="http://deity.gov.in/sites/upload_files/dit/files/GSR313E_10511(1).pdf"&gt;section 43A and associated Rules&lt;/a&gt; of the Information Technology Act would apply to the collection, use, and sharing of DNA data by DNA laboratories  (as they would fall under the definition of ‘body corporate’ under the IT Act), the National and State Data Banks and the DNA Board would not clearly be body corporate as per the IT Act and would not fall under the ambit of the provision or Rules.  Safeguards are needed to protect against the invasion of informational privacy and physical privacy at the level of these State controlled bodies.  The fact that the Bill is to be introduced into Parliament prior to the enactment of a privacy legislation in India is significant as according to discussions in the &lt;a href="http://cis-india.org/internet-governance/blog/expert-committee-meetings.zip/view"&gt;Record Notes of the &lt;/a&gt;4h Meeting of the &lt;a href="http://cis-india.org/internet-governance/blog/expert-committee-meetings.zip/view"&gt;Expert Committee&lt;/a&gt; - &lt;i&gt;“the Expert Committee also discussed and emphasized that the Privacy Bill is being piloted by the Government. That Bill will over-ride all the other provisions on privacy issues in the DNA Bill.”&lt;/i&gt;&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;&lt;span&gt;Lack of restriction on type of analysis to be performed&lt;/span&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;The Bill currently does not provide any restriction on the types of analysis that can be performed on a DNA sample or profile. This could allow for DNA samples to be analyzed for purposes beyond basic identification of an individual – such as for health, genetic, or racial purposes. As a form of purpose limitation the Bill should define narrowly the types of analysis that can be performed on a DNA sample.&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;&lt;span&gt;Purpose Limitation&lt;/span&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;The Bill does not explicitly restrict the use of a DNA sample or DNA profile to the purpose it was originally collected and created for. This could allow for the re-use of samples and profiles for unintended purposes.&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;&lt;span&gt;Annual Public Reporting&lt;/span&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;The Bill does not require the DNA Board to disclose publicly available information on an annual basis regarding the functioning and financial aspects of matters contained within the Bill. Such disclosure is crucial in ensuring that the public is able to make informed decisions. Categories that could be included in such reports include: Number of DNA profiles added to each indice within the databank, total number of DNA profiles contained in the database, number of DNA profiles deleted from the database, the number of matches between crime scene DNA profiles and DNA profiles, the number of cases in which DNA profiles were used in and the percentage in which DNA profiles assisted in the final conclusion of the case, and the number and categories of DNA profiles shared with international entities.&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;&lt;span&gt;Elimination Index&lt;/span&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;An elimination indice containing the profiles of medical professionals, police, laboratory personnel etc. working on a case is necessary in case they contaminate collected samples by accident.&lt;/p&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;&lt;span&gt;Clause by Clause Recommendations&lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;As stated the Human DNA Profiling Bill 2015 is to &lt;i&gt;regulate the use of DNA analysis of human body substances profiles and to establish the DNA Profiling Board for laying down the standards for laboratories, collection of human body substances, custody trail from collection to reporting and also to establish a National DNA Data Bank.&lt;/i&gt;&lt;/p&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;Comment:&lt;/b&gt;&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;/ul&gt;
&lt;ol style="text-align: justify; "&gt;
&lt;li&gt;As stated, the purpose of the Human DNA Profiling Bill is to broadly regulate the use of DNA analysis and establish a DNA Data Bank. Despite this, the majority of provisions in the Bill pertain to the collection, use, access, etc. of DNA samples and profiles for civil and criminal purposes. The result is an ‘unbalanced Bill’, with the majority of provisions focusing on issues related to forensic use. At the same time, the Bill is not a comprehensive forensic bill, resulting in legislative gaps.&lt;/li&gt;
&lt;li&gt;Additionally, the Bill contains provisions beyond the stated purpose. These include:&lt;/li&gt;
&lt;/ol&gt; 
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;Facilitating the creation of a Data Bank for statistical purposes (Clause 33(e))&lt;/li&gt;
&lt;li&gt;Establishing state and regional level databanks in addition to a national level databank (Clause 24)&lt;/li&gt;
&lt;li&gt;Developing procedure and providing for the international sharing of DNA profiles with foreign Governments, organizations, institutions, or agencies. (Clause 29)&lt;/li&gt;
&lt;/ul&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;Recommendation:&lt;/b&gt;&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;The Bill should ideally be limited to regulating the use of DNA samples and profiles for criminal purposes. If the scope remains broad, all purposes should be equally and comprehensively regulated.&lt;/li&gt;
&lt;li&gt;The stated purpose of the Bill should address all aspects of the Bill. Provisions beyond the scope of the Bill should be removed.&lt;/li&gt;
&lt;/ul&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;span&gt;Chapter 1: Preliminary&lt;/span&gt;&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;&lt;b&gt;Clause 2: &lt;/b&gt;This clause defines the terms used in the Bill.&lt;/li&gt;
&lt;/ul&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Comment: &lt;/b&gt;A number of terms are incomplete and some terms used in the Bill have not been included in the list of definitions.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Recommendation:&lt;/b&gt;&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;The definition of “DNA Data Bank Manager” in clause 2(1)(g) must be renamed “National DNA Data Bank Manager”.&lt;/li&gt;
&lt;li&gt;The definition of “DNA laboratory” in clause 2(1)(h) should refer to the specific clauses that empower the Central Government and State Governments to license and recognise DNA laboratories. This is a drafting error.&lt;/li&gt;
&lt;/ul&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;The definition of “DNA profile” in clause 2(1)(i) is too vague. Merely the results of an analysis of a DNA sample may not be sufficient to create an actual DNA profile. Further, the results of the analysis may yield DNA information that, because of incompleteness or lack of information, is inconclusive. These incomplete bits of information should not be recognised as DNA profiles. This definition should be amended to clearly specify the contents of a complete and valid DNA profile, containing, at least, numerical representations of 17 or more loci of short tandem repeats that are sufficient to estimate the biometric individuality of a person. The definition of “DNA profile” also does not restrict the analysis to forensic DNA profiles: this means additional information, such as health-related information, could be analyzed and stored against the wishes of the individual, even though such information plays no role in solving crimes.&lt;/li&gt;
&lt;/ul&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;The term “known sample” that is defined in clause 2(1)(m) is not used anywhere outside the definitions clause and should be removed.&lt;/li&gt;
&lt;/ul&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;The definition of “offender” in clause 2(1)(q) is vague because it does not specify the offences for which an “offender” needs to be convicted. It is also linked to an unclear definition of the term “under trial”, which does not specify the nature of pending criminal proceedings and, therefore, could be used to describe simple offences such as, for example, failure to pay an electricity bill, which also attracts criminal penalties.&lt;/li&gt;
&lt;/ul&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;The term “proficiency testing” that is defined in clause 2(1)(t) is not used anywhere in the text of the DNA Bill and should be removed.&lt;/li&gt;
&lt;/ul&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;The definitions of “quality assurance”, “quality manual” and “quality system” serve no enforceable purpose since they are used only in relation to the DNA Profiling Board’s rule making powers under Chapter IX, clause 58. Their inclusion in the definitions clause is redundant. Accordingly, these definitions should be removed.&lt;/li&gt;
&lt;/ul&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;The term “suspect” defined in clause 2(1)(za) is vague and imprecise. The standard by which suspicion is to be measured, and by whom suspicion may be entertained – whether police or others, has not been specified. The term “suspect” is not defined in either the Code of Criminal Procedure, 1973 ("CrPC") or the Indian Penal Code, 1860 ("IPC").&lt;/li&gt;
&lt;/ul&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;The term “volunteer” defined in clause 2(zf) only addresses consent from the parent or guardian of a child or an incapable person. This term should be amended to include informed consent from any volunteer.&lt;/li&gt;
&lt;/ul&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;&lt;span&gt;Chapter II: DNA Profiling Board&lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;&lt;b&gt;Clause 4:&lt;/b&gt; This clause addresses the composition of the DNA Profiling Board.&lt;/li&gt;
&lt;/ul&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;Comment&lt;/b&gt;: The size and composition of the Board that is staffed under clause 4 is extremely large. The number of members remains to be 15, as it was in the 2012 Bill.&lt;/p&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;Recommendation:&lt;/b&gt; Drawing from the experiences of other administrative and regulatory bodies in India, the size of the Board should be reduced to no more than five members. The Board must contain at least:&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;One ex-Judge or senior lawyer&lt;/li&gt;
&lt;li&gt;Civil society – both institutional and non-institutional&lt;/li&gt;
&lt;li&gt;Privacy advocates&lt;/li&gt;
&lt;/ul&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;Note:&lt;/b&gt; The reduction of the size of the Board was agreed upon by &lt;a href="http://cis-india.org/internet-governance/blog/expert-committee-meetings.zip/view"&gt;the Expert Committee from 16 members (2012 Bill) to 11 member&lt;/a&gt;s. This recommendation has not been incorporated.&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;&lt;b&gt;Clause 5(1): &lt;/b&gt;The clause specifies the term of the Chairperson of the DNA Profiling Board to be five years and also states that the person shall not be eligible for re-appointment or extension of the term so specified.&lt;/li&gt;
&lt;/ul&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;Comment:&lt;/b&gt; The Chairperson of the Board, who is first mentioned in clause 5(1), has not been duly and properly appointed.&lt;/p&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;Recommendation:&lt;/b&gt; Clause 4 should be amended to mention the appointment of the Chairperson and other Members.&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;&lt;b&gt;Clause 7: &lt;/b&gt;The clause requires members to react on a case-by-case basis to the business of the Board by recusing themselves from deliberations and voting where necessary.&lt;/li&gt;
&lt;/ul&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;Comment:&lt;/b&gt; This clause addresses the issue of conflict of interest only in narrow cases and does not provide penalty if a member fails to adhere to the laid out procedure.&lt;/p&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;Recommendation:&lt;/b&gt; The Bill should require members to make full and public disclosures of their real and potential conflicts of interest and the Chairperson must have the power to prevent such members from voting on interested matters. Failure to follow such anti-collusion and anti-corruption safeguards should attract criminal penalties.&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;&lt;b&gt;Clause 12(5)&lt;/b&gt;: The clause states that the Board shall have the power to co-opt such number of persons as it may deem necessary to attend the meetings of the Board and take part in its proceedings, but such persons will not have the right to vote.&lt;/li&gt;
&lt;/ul&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;Comment:&lt;/b&gt; While serving on the Expert Committee, CIS provided &lt;a href="http://cis-india.org/internet-governance/blog/dna-dissent"&gt;language   regarding&lt;/a&gt; how the Board could consult with the public. This language has not been fully incorporated.&lt;/p&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;Recommendation: &lt;/b&gt;As per the recommendation of CIS, the following language should be adopted in the Bill: &lt;i&gt;The Board, in carrying out its functions and activities, shall be required to consult with all persons and groups of persons whose rights and related interests may be affected or impacted by any DNA collection, storage, or profiling activity. The Board shall, while considering any matter under its purview, co-opt or include any person, group of persons, or organisation, in its meetings and activities if it is satisfied that that person, group of persons, or organisation, has a substantial interest in the matter and that it is necessary in the public interest to allow such participation. The Board shall, while consulting or co-opting persons, ensure that meetings, workshops, and events are conducted at different places in India to ensure equal regional participation and activities.&lt;/i&gt;&lt;i&gt; &lt;/i&gt;&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;&lt;b&gt;Clause 13:&lt;/b&gt; The clause lays down the functions to be performed by the DNA Profiling Board, which include its role in the regulation of DNA Data Banks, DNA Laboratories, and the techniques to be adopted for the collection of DNA samples.&lt;/li&gt;
&lt;/ul&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;Comment: &lt;/b&gt;While serving on the Expert Committee, &lt;a href="http://cis-india.org/internet-governance/blog/expert-committee-meetings.zip/view"&gt;CIS recommended&lt;/a&gt; that the functions of the DNA Profiling Board should be limited to licensing, developing standards and norms, safeguarding privacy and other rights, ensuring public transparency, promoting information and debate and a few other limited functions necessary for a regulatory authority.&lt;/p&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;Furthermore, this clause delegates a number of functions to the Board that places the Board in the role of a manager and regulator for issues pertaining to DNA Profiling including functions of the DNA Databases, DNA Laboratories, ethical concerns, privacy concerns etc.&lt;/p&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;Recommendation: &lt;/b&gt;As per CIS’s recommendations the functions of the Board should be limited to licensing, developing standards and norms, safeguarding privacy and other rights, ensuring public transparency, promoting information and debate and a few other limited functions necessary for a regulatory authority.&lt;/p&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;Towards this, the Board should be comprised of separate Committees to address these different functions. At the minimum, there should be a Committee addressing regulatory issues pertaining to the functioning of Data Banks and Laboratories and an Ethics Committee to provide independent scrutiny of ethical issues.  Additionally:&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;Clause 13(j) allows the Board to disseminate best practices concerning the collection and analysis of DNA samples to ensure quality and consistency. The process for collection of DNA samples and analysis should be established in the Bill itself or by regulations. Best practices are not enforceable and do not formalize a procedure.&lt;/li&gt;
&lt;li&gt;Clause 13(q) allows the Board to establish procedures for cooperation in criminal investigations between various investigation agencies within the country and with international agencies. This procedure should, at a minimum, be subject to oversight by the Ministry of External Affairs.&lt;/li&gt;
&lt;/ul&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;&lt;span&gt;Chapter III: Approval of DNA Laboratories&lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;&lt;b&gt;Clause 15:&lt;/b&gt; This clause states that every DNA Laboratory has to make an application before the Board for the purpose of undertaking DNA profiling and also for renewal.&lt;/li&gt;
&lt;/ul&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;Comment: &lt;/b&gt;Though the Bill requires DNA Laboratories to make an application for the undertaking DNA Profiling, it does not clarify that the Lab must receive approval before collection and analysis of DNA samples and profiles.&lt;/p&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;Recommendation:&lt;/b&gt; The Bill should clarify that all DNA Laboratories must receive approval for functioning prior to the collection or analysis of any DNA samples and profiles.&lt;/p&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;&lt;span&gt;Chapter IV: Standards, Quality Control and Quality Assurance Obligations of DNA Laboratory and Infrastructure and Training&lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;&lt;b&gt;Clause 19: &lt;/b&gt;This clause defines the obligations of a DNA laboratory. Sub-section (d) maintains that one such obligation is the sharing of the 'DNA data' prepared and maintained by the laboratory with the State DNA Data Bank and the National DNA Data Bank.&lt;/li&gt;
&lt;/ul&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;Comment:&lt;/b&gt; ‘DNA Data’ is a new term that has not been defined under Clause 2  of the Bill. It is thus unclear what data would be shared between State DNA data banks and the National DNA data bank - DNA samples? DNA profiles? associated records?  It is also unclear in what manner and on what basis the information would be shared.&lt;/p&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;Recommendation:&lt;/b&gt; The term ‘DNA Data’ should be defined to clarify what information will be shared between State and National DNA Data Banks. The flow of and access to data between the State DNA Data Bank and National DNA Data Bank should also be established in the Bill.&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;&lt;b&gt;Clause 22: &lt;/b&gt;The clause lays down the measures to be adopted by a DNA Laboratory, and 22(h) includes a provision requiring annual audits to be conducted according to prescribed standards.&lt;/li&gt;
&lt;/ul&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;Comment:&lt;/b&gt;&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;The definition of “audit” given in the ‘Explanation’ to clause 22 is relevant for measuring training programmes and laboratory conditions. However, the term “audit” is subsequently used in an entirely different manner in Chapter VII, which relates to financial information and transparency.&lt;/li&gt;
&lt;li&gt;The standards for the destruction of DNA samples have not been included within the list of measures that DNA laboratories must take. &lt;/li&gt;
&lt;/ul&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;Recommendation:&lt;/b&gt;&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;The definition of ‘audit’ must be amended or removed, as the term is being used in different contexts. The term “audit” has a well-established use for financial information that does not require a definition.&lt;/li&gt;
&lt;li&gt;Standards for the destruction of DNA samples should be developed and included as a measure DNA laboratories must take. &lt;/li&gt;
&lt;/ul&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;&lt;b&gt;Clause 23:&lt;/b&gt; This clause lays down the sources for the collection of samples for the purpose of DNA profiling. 23(1)(a) includes collection from bodily substances and 23(1)(c) includes clothing and other objects. Explanation (b) provides a definition of 'intimate body sample'.&lt;/li&gt;
&lt;/ul&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;Comment:&lt;/b&gt;&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;Permitting the collection of DNA samples from bodily substances, clothing, and other objects allows for broad collection of DNA samples without contextualizing such collection. In contrast, &lt;i&gt;23(1)(b) Scene of occurrence or scene of crime&lt;/i&gt; limits the collection of samples to a specific context.&lt;/li&gt;
&lt;li&gt;This clause also raises the issue of consent and invasion of an individual's privacy. If “intimate body samples” are to be taken from individuals without their consent, this would be an invasion of the person’s right to bodily privacy (except in the specific instance when it is done in pursuance of section 53 of the Criminal Procedure Code).&lt;/li&gt;
&lt;/ul&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;Recommendation:&lt;/b&gt;&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;Sources for the collection of DNA samples should be contextualized to prevent broad, unaccounted for, or unregulated collection. Clauses (a) and (c) should be deleted and replaced with contexts in which DNA collection would be permitted.&lt;/li&gt;
&lt;li&gt;The Bill should specify the circumstances in which non-intimate samples can be collected and the process for the same.&lt;/li&gt;
&lt;li&gt;The Bill should specify that intimate body samples can only be taken with informed consent except as per section 53 of the Criminal Procedure Code.&lt;/li&gt;
&lt;li&gt;The Bill should require that any individual who has a sample taken (intimate or non-intimate) is provided with notice of their rights and of the future uses of their DNA sample and profile.&lt;/li&gt;
&lt;/ul&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;/ul&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;span&gt;Chapter V: DNA Data Bank &lt;/span&gt;&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;&lt;b&gt;Clause 24: &lt;/b&gt;This clause addresses the establishment of DNA Data Banks at the State and National level. 24(5) establishes that the National DNA Data Bank will receive data from State DNA Data Banks and store the approved DNA profiles as per regulations.&lt;/li&gt;
&lt;/ul&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;Comment:&lt;/b&gt;&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;As noted previously, ‘DNA Data’ is a new term that has not been defined in the Bill. It is thus unclear what data would be shared between State DNA Data Banks and the National DNA Data Bank: DNA samples? DNA profiles? Associated records?&lt;/li&gt;
&lt;li&gt;The process for sharing Data between the State and National Data Banks is not defined.&lt;/li&gt;
&lt;/ul&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;Recommendation:&lt;/b&gt;&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;The term ‘DNA Data’ should be defined to clarify what information will be shared between State and National DNA Data Banks. &lt;/li&gt;
&lt;li&gt;The process for the National DNA Data Bank receiving DNA data from State DNA Data Banks and DNA laboratories needs to be defined in the Bill or by regulation, including specifying how frequently information will be shared.&lt;/li&gt;
&lt;/ul&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;&lt;b&gt;Clause 25:&lt;/b&gt; This clause establishes standards for the maintenance of indices by DNA Data Banks. 25(1) states that every DNA Data Bank needs to maintain the prescribed indices for various categories of data, including indices for crime scenes, suspects, offenders, missing persons, unknown deceased persons, and volunteers, and such other indices as may be specified by regulation. &lt;b&gt;25(2)&lt;/b&gt; states that, in addition to the indices, the DNA Data Bank should contain information regarding each of the DNA profiles: either the identity of the person from whose bodily substance the profile was derived, in the case of a suspect or an offender, or the case reference number of the investigation associated with such bodily substances, in other cases. &lt;b&gt;25(3)&lt;/b&gt; states that the indices maintained shall include information regarding the data which is based on the DNA profiling and the relevant records.&lt;/li&gt;
&lt;/ul&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;/ul&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;Comment&lt;/b&gt;:&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;25(1): The creation of multiple indices cannot be justified and must be limited, since the collection of biological source material is an invasion of privacy that must be conducted only under strict conditions, when the potential harm to individuals is outweighed by the public good. This balance may only be struck when dealing with the collection and profiling of samples from certain categories of offenders. The implications of collecting and profiling DNA samples from corpses, suspects, missing persons, and others are vast. Specifically, a 'volunteer' index could possibly be used for racial, community, or religious profiling.&lt;/li&gt;
&lt;li&gt;25(2): This clause requires the names of individuals to be connected to their profiles, and hence accessible to persons having access to the databank.&lt;/li&gt;
&lt;li&gt;25(3): The clause states that only information related to DNA profiling and the relevant records will be stored in an index, yet it is unclear what such information might be. This could allow inconsistencies in the data stored in an index and could allow unnecessary information to be stored.&lt;/li&gt;
&lt;/ul&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;/ul&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;Recommendation:&lt;/b&gt;&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;&lt;b&gt;25(1) &lt;/b&gt;Ideally, DNA databanks should be created for dedicated purposes. This would mean that a databank for forensic purposes should contain only an offenders’ index and a crime scene index, while a databank for missing persons would contain only a missing persons’ index, etc. If numerous indices are going to be contained in one databank, the Bill needs to recognize the sensitivity of each index as well as the differences between indices, and lay down appropriate and strict conditions for the collection of data for each index, the addition of data into it, and the use, access, and retention of data within it.&lt;/li&gt;
&lt;li&gt;&lt;b&gt;25(2) &lt;/b&gt;DNA profiles, once developed, should be maintained with complete anonymity and kept separate from the names of their owners. This amendment becomes even more important when we consider that an “offender” may be convicted by a lower court, have his or her profile included in the data bank, and later be acquitted. Until such a person is acquitted, his or her profile with the identifying information would remain in the data bank, which is an invasion of privacy.&lt;/li&gt;
&lt;li&gt;&lt;b&gt;25(3)&lt;/b&gt; What information will be stored in indices should be clearly defined in the Bill and should be tailored appropriately to each category of index.&lt;/li&gt;
&lt;/ul&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;&lt;b&gt;Clause 28:&lt;/b&gt; This clause addresses the comparison and communication of DNA profiles. 28(1) states that a DNA profile entered in the offenders' or crime scene index shall be compared by the DNA Data Bank Manager against profiles contained in the DNA Data Bank, and that the DNA Data Bank Manager will communicate such information to any court, tribunal, law enforcement agency, or approved DNA laboratory which he may consider appropriate for the purpose of investigation. 28(2) allows any information relating to a person's DNA profile contained in the suspects' index or offenders' index to be communicated to authorised persons.&lt;/li&gt;
&lt;/ul&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;Comment&lt;/b&gt;:&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;28(1)(a-c) allows the DNA Data Bank Manager to communicate: (1) that the DNA profile is not contained in the Data Bank, and what information is not contained; (2) that the DNA profile is contained in the Data Bank, and what information is contained; and (3) that, in the opinion of the Manager, the DNA profile is similar to one stored in the Data Bank. These options of communication are problematic because they allow all associated information to be communicated, even if such information is not necessary, and they allow the DNA Data Bank Manager to communicate that a profile is 'similar' without defining what 'similar' would constitute.&lt;/li&gt;
&lt;li&gt;28(1) only addresses the comparison of DNA profiles entered  into the offenders index or the crime scene index against all other profiles entered into the DNA Data Bank.&lt;/li&gt;
&lt;li&gt;28(1) gives the DNA Data Bank manager broad discretion in determining if information should be communicated and requires no accountability for such a decision.&lt;/li&gt;
&lt;li&gt;28(2) only addresses information in the suspect's and offender's index and does not address information in any other index.&lt;/li&gt;
&lt;/ul&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;Recommendation:&lt;/b&gt;&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;Rather than allowing broad searches across the entire database, the Bill should be clear about which profiles can be compared against which indices. Such distinctions must take into consideration whether a profile was taken with consent and what was consented to.&lt;/li&gt;
&lt;li&gt;Ideally, the response from the DNA Data Bank Manager should be limited to a 'yes' or 'no', and further information should be revealed only on receipt of a court order.&lt;/li&gt;
&lt;li&gt;The Bill should define what constitutes 'similar'.&lt;/li&gt;
&lt;li&gt;A process for determining if information should be communicated should be established in the Bill and followed by the DNA Data Bank Manager. The Manager should also be held accountable through oversight mechanisms for such decisions. This is particularly important, as a DNA laboratory would be a private body.&lt;/li&gt;
&lt;li&gt;Information stored in any index should be disclosed to only authorized parties. &lt;/li&gt;
&lt;/ul&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;&lt;b&gt;Clause 29: &lt;/b&gt;This clause provides for the comparison and sharing of DNA profiles with foreign Governments, organisations, institutions, or agencies. 29(1) allows the DNA Data Bank Manager to run a comparison of the received profile against all indices in the databank and communicate specified responses through the Central Bureau of Investigation.&lt;/li&gt;
&lt;/ul&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;/ul&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;Comment: &lt;/b&gt;This clause allows for international disclosures of DNA profiles of  Indians through a procedure that is to be established by the Board (see clause 13(q))&lt;/p&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;Recommendation:&lt;/b&gt; The disclosure of DNA profiles of Indians with international entities should be done via the MLAT process as it is the typical process followed when sharing information with international entities for law enforcement purposes.&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;&lt;b&gt;Clause 30:&lt;/b&gt; This clause provides for the permanent retention of information pertaining to a convict in the offenders’ index and the expunging of such information in case of a court order establishing acquittal of a person, or the conviction being set aside.&lt;/li&gt;
&lt;/ul&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;Comment&lt;/b&gt;: This clause addresses only the retention and expunging of records of a  convict stored in the offenders index upon the receipt of a court order or the conviction being set aside. This implies that records in all other indices - including volunteers - can be retained permanently. This clause also does not address situations where an individuals DNA profile is added to the databank, but the case never goes to court.&lt;/p&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;Recommendation&lt;/b&gt;: The Bill should establish retention standards and deletion standards for each indice that it creates. Furthermore, the Bill should require the immediate destruction of DNA samples once a DNA profile for identification purposes has been created. An exception to this should be the destruction of samples stored in the crime scene index.&lt;/p&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;&lt;span&gt;Chapter VI: Confidentiality of and Access to DNA Profiles, Samples, and Records&lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;&lt;b&gt;Clause 33&lt;/b&gt;: This provision lays down the cases and the persons to which information pertaining to DNA profiles, samples and records stored in the DNA Data Bank shall be made available. Specifically, 33(e) permits disclosure for the creation and maintenance of a population statistics Data Bank.&lt;/li&gt;
&lt;/ul&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;Comment:&lt;/b&gt;&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;This clause addresses disclosure of information in the DNA Data Bank, but does not directly address the use of DNA samples or DNA profiles. This allows for the possibility of re-use of samples and profiles.&lt;/li&gt;
&lt;li&gt;There is no limitation on the information that can be disclosed. The clause allows for any information stored in the Data Bank to be disclosed for a number of circumstances/to a variety of people.&lt;/li&gt;
&lt;li&gt;There is no authorization process for the disclosure of such information. Of the circumstances listed, an authorization process is mentioned only for the disclosure of information in investigations relating to civil disputes or other civil matters, with the concurrence of the court. This implies that there is no procedure for authorizing the disclosure of information for identification purposes in criminal cases, in judicial proceedings, for facilitating the prosecution and adjudication of criminal cases, for the purpose of taking defence by an accused in a criminal case, or for the creation and maintenance of a population statistics Data Bank.&lt;/li&gt;
&lt;/ul&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;Recommendation:&lt;/b&gt;&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;The Bill should establish an authorization process for the disclosure of information stored in a data bank. This process must limit the disclosure of information to what is necessary and proportionate for achieving the requested purpose.&lt;/li&gt;
&lt;li&gt;Clause 33(e) should be deleted, as the non-consensual disclosure of DNA profiles for the study of population genetics is specifically illegal. The use of the database for statistical purposes should be limited to understanding the effectiveness of the databank.&lt;/li&gt;
&lt;li&gt;Clause 33(f) should be deleted, as it is not necessary for DNA profiles to be stored in a database to be useful for civil purposes. Samples for civil purposes are needed only for the relevant case and by specified persons.&lt;/li&gt;
&lt;li&gt;Clause 33(g) should be deleted, as it allows the scope of cases in which DNA can be disclosed to be expanded as prescribed.&lt;/li&gt;
&lt;/ul&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;&lt;b&gt;Clause 34: &lt;/b&gt;This clause allows access to information for operation, maintenance, and training purposes.&lt;/li&gt;
&lt;li&gt;&lt;b&gt;Comment&lt;/b&gt;: This clause would allow individuals in training access to data stored on the database for training purposes. This places the security of the databank and the data stored in the databank at risk.&lt;/li&gt;
&lt;li&gt;&lt;b&gt;Recommendation:&lt;/b&gt; Training of individuals should be conducted via simulation only.&lt;/li&gt;
&lt;/ul&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;&lt;b&gt;Clause 35: &lt;/b&gt;This clause allows access to information in the DNA Data Bank for the purpose of a one-time keyboard search. A one-time keyboard search allows information from a DNA sample to be compared with information in the index without the information from the DNA sample being included in the index. The clause allows an authorized individual to carry out such a search on information obtained from a DNA sample lawfully collected for the purpose of a criminal investigation, except if the DNA sample was submitted for elimination purposes.&lt;/li&gt;
&lt;li&gt;&lt;b&gt;Comment: &lt;/b&gt;The purpose and scope of this clause are unclear. The clause allows the sample to be compared against 'the index' without specifying which index, and it refers to 'information obtained from a DNA sample' rather than a profile. Thus, the clause appears to allow any information derived from a DNA sample collected for a criminal investigation to be compared against all data within the databank, without recording such information. Such a comparison is vast in scope and open to abuse.&lt;/li&gt;
&lt;li&gt;&lt;b&gt;Recommendation: &lt;/b&gt;To ensure that this provision is not used for conducting searches outside the scope of the original purpose, only DNA profiles, rather than 'information derived from a sample', should be allowed to be compared; only the indices relevant to the sample should be compared; and the search should be authorized and justified.&lt;/li&gt;
&lt;li&gt;&lt;b&gt;Clause 36&lt;/b&gt;: This clause addresses the restriction of access to information in the crime scene index if the individual is a victim of a specified offence or has been eliminated as a suspect in an investigation.&lt;/li&gt;
&lt;/ul&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;Comment:&lt;/b&gt;&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;This clause only addresses restriction of access to the crime scene index and does not address restriction of access to other indices.&lt;/li&gt;
&lt;li&gt;This clause restricts access to the index only for certain categories of individuals and for a specific status of a person. Oddly, the clause does not include authorization or rank as a means of determining or restricting access.&lt;/li&gt;
&lt;/ul&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;Recommendation:&lt;/b&gt;&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;This clause should be amended to lay down standards for restriction of access for all indices.&lt;/li&gt;
&lt;li&gt;Access to all information in the databank should be restricted by default, and permission should be based on authorization rather than on the category or status of the individual.&lt;/li&gt;
&lt;/ul&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;&lt;b&gt;Clause 38&lt;/b&gt;: This clause sets out a post-conviction right related to criminal procedure and evidence.&lt;/li&gt;
&lt;/ul&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;Comment: &lt;/b&gt;This clause would fundamentally alter the nature of India’s criminal justice system, which currently does not contain specific provisions for post-conviction testing rights.&lt;/p&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;Recommendation:&lt;/b&gt; This clause should be deleted and the issue of post conviction rights related to criminal procedure and evidence referenced to the appropriate legislation.  Clause 38 is implicated by Article 20(2) of the Constitution of India and by section 300 of the CrPC. The principle of autrefois acquit that informs section 300 of the CrPC specifically deals with exceptions to the rule against double jeopardy that permit re-trials. [See, for instance, Sangeeta Mahendrabhai Patel (2012) 7 SCC 721.] The person must be duly accorded with a right to know rules may provide for- the authorized persons to whom information relating to a person’s DNA profile contained in the offenders’ index shall be communicated. Alternatively, this right could be limited only to accused persons who’s trial is still at the stage of production of evidence in the Trial Court. This suggestion is being made because unless the right as it currently stands, is limited in some manner, every convict with the means to engage a lawyer would ask for DNA analysis of the evidence in his/her case thereby flooding the system with useless requests risking a breakdown of the entire machinery.&lt;/p&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;&lt;span&gt;Chapter VII: Finance, Accounts, and Audit&lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;Clause 39: &lt;/b&gt;This clause allows the Central Government to make grants and loans to the DNA Board after due appropriation by Parliament.&lt;/p&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;Comment: &lt;/b&gt;This clause allows the Central Government to grant and loan money to the DNA Board, but does not require any proof or justification for the sum of money being given.&lt;/p&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;Recommendation: &lt;/b&gt;This clause should require a formal cost benefit analysis, and financial assessment prior to the giving of any grants or loans.&lt;/p&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;&lt;span&gt;Chapter VIII: Offences and Penalties&lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;&lt;span&gt;Chapter IX: Miscellaneous&lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;Clause 53: &lt;/b&gt;This clause allows protects the Central Government and the Members of the Board from suit, prosecution, or other legal proceedings for actions that they have taken in good faith.&lt;/p&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;Comment: &lt;/b&gt;Though it is important to take into consideration if an action has been taken in good faith, absolving the Government and Board from accountability for actions leaves little course of redress for the individual. This is particularly true as the Central Government and the Board are given broad powers under the Bill.&lt;b&gt; &lt;/b&gt;&lt;/p&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;Recommended: &lt;/b&gt;If the Central Government and the Board will be protected for actions taken in good faith, their powers should be limited. Specifically, they should not have the ability to widen the scope of the Bill.&lt;/p&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;Clause 57:&lt;/b&gt; This clause states that the Central Government will have the powers to make Rules for a number of defined issues.&lt;/p&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;Comment:&lt;/b&gt; 57(d) allows for the regulations to be created regarding the use of population statistics Data Bank created and maintained for the purposes of identification research and protocol development or quality control.&lt;/p&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;Recommendation:&lt;/b&gt; 57(d) should be deleted as any use for the creation of a population statistics Data Bank created and maintained for the purposes of identification research and protocol  development or quality control is beyond the scope of the Bill.&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;&lt;b&gt;Clause 58: &lt;/b&gt;This clause empowers the Board to make regulations regarding a number of aspects related to the Bill.&lt;/li&gt;
&lt;li&gt;&lt;b&gt;Comment&lt;/b&gt;: There are a number of functions for which the Board can make regulations that should instead be defined within the Bill itself, to ensure that the scope of the Bill does not expand without Parliamentary oversight and approval.&lt;/li&gt;
&lt;li&gt;&lt;b&gt;Recommendation:&lt;/b&gt; 58(2)(g) should be deleted, as it allows the Board to create regulations for other relevant uses of DNA techniques and technologies; 58(2)(u) should be deleted, as it allows the Board to include new categories of indices in databanks; and 58(2)(aa) should be deleted, as it allows the Board to decide which other indices a DNA profile may be compared with when DNA profiles are shared with foreign Governments, organizations, or institutions.&lt;/li&gt;
&lt;/ul&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;/ul&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Clause 61:&lt;/b&gt; This clause states that no civil court will have jurisdiction to entertain any suit or proceeding in respect of any matter which the Board is empowered to determine and no injunction shall be granted.&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;/ul&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;Comment:&lt;/b&gt; This clause in practice will limit the recourse that individuals can take and will exclude the Board from the oversight of civil or criminal courts.&lt;/p&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;Recommendation:&lt;/b&gt; The power to collect, store and analyse human DNA samples has wide reaching consequences for people whose samples are being utilised for this purpose, specially if their samples are being labeled in specific indexes such as “index of offenders”, etc. The individual should therefore have a right to approach the court of law to safeguard his/her rights. Therefore this provision barring the jurisdiction of the courts should be deleted.&lt;/p&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;&lt;span&gt;Schedule&lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;&lt;b&gt;Schedule A:&lt;/b&gt; The schedule refers to section 33(f), which allows information in relation to DNA profiles, DNA samples, and records in a DNA Data Bank to be communicated, with the concurrence of the court, in investigations relating to civil disputes, other civil matters, or the offences and cases listed in the schedule.&lt;/li&gt;
&lt;/ul&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;Comment: &lt;/b&gt;As 33(f) requires the concurrence of the court for disclosure of information, it is unclear what purpose the schedule serves. If the Schedule is meant to serve as a guide to the Court on appropriate instances for the disclosure of information stored in the DNA databank – the schedule is too general by listing entire Acts, while at the same time being too specific by naming specific Acts. Ideally, courts should use principles and the greater public interest to reach a decision as to whether or not disclosure of information in the DNA databank is appropriate. At a minimum these principles should include necessity (of the disclosure) and proportionality (of the type/amount of information disclosed).&lt;/p&gt;
&lt;p class="Standard" style="text-align: justify; "&gt;&lt;b&gt;Recommendation: &lt;/b&gt;As we recommended the deletion of clause 33(f) as it is not necessary to databank DNA profiles for civil purposes, the schedule should also be deleted.&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;&lt;b&gt;Note: &lt;/b&gt;The schedule differs drastically from previous drafts, from discussions held in the Expert Committee, and from the recommendations agreed upon. As per the Meeting Minutes of the &lt;a href="http://cis-india.org/internet-governance/blog/expert-committee-meetings.zip/view"&gt;Expert Committee&lt;/a&gt; meeting held on November 10, 2014: &lt;i&gt;“The Committee recommended incorporation of the comments received from the members of the Expert Committee appropriately in the draft Bill... Point no. 1 suggested by Mr. Sunil Abraham in the Schedule of the draft Bill to define the cases in which DNA samples can be collected without consent by incorporating point no. 1 (i.e. ‘Any offence under the Indian Penal Code, 1860 if it is listed as a cognizable offence in Part I of the First Schedule of the Code of Criminal Procedure, 1973’)”&lt;/i&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;hr /&gt;
&lt;p&gt;Download CIS submission &lt;a href="https://cis-india.org/internet-governance/blog/cis-human-dna-profiling-bill-2015" class="internal-link"&gt;here&lt;/a&gt;. See the cover letter &lt;a href="https://cis-india.org/internet-governance/blog/cover-letter-for-dna-profiling-bill-2015" class="internal-link"&gt;here&lt;/a&gt;.&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;/ul&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/cis-comments-and-recommendations-to-human-dna-profiling-bill-2015'&gt;https://cis-india.org/internet-governance/blog/cis-comments-and-recommendations-to-human-dna-profiling-bill-2015&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Elonnai Hickok, Vipul Kharbanda and Vanya Rakesh</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>DNA Profiling</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2015-09-02T17:09:04Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/responsible-data-forum">
    <title>Responsible Data Forum: Discussion on the Risks and Mitigations of releasing Data</title>
    <link>https://cis-india.org/internet-governance/blog/responsible-data-forum</link>
    <description>
        &lt;p style="text-align: justify; "&gt;The &lt;a href="https://responsibledata.io/discussion-on-the-risks-and-mitigations-of-releasing-data/"&gt;Responsible Data Forum&lt;/a&gt; initiated a discussion on 26&lt;sup&gt;th&lt;/sup&gt; August 2015 to discuss the &lt;b&gt;risks and mitigations of releasing data&lt;/b&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The discussion was regarding the question of adoption of adequate measures to mitigate risks to people and communities when some data is prepared to be released or for sharing purposes.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The following concerns entailed the discussion:&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;What is risk- risks in releasing development data and PII&lt;/li&gt;
&lt;li&gt;What kinds of risks are there&lt;/li&gt;
&lt;li&gt;Risk to whom?&lt;/li&gt;
&lt;li&gt;Risks in dealing with PII, discussed by way of several examples&lt;/li&gt;
&lt;li&gt;What is missing from the world&lt;/li&gt;
&lt;/ul&gt;
&lt;p style="text-align: justify; "&gt;The first thing to be done is that if a dataset is made, then you have the responsibility that no harm is caused to the people who are connected to the dataset and a balance must be created between good use of the data on one hand and protecting data subjects, sources and managers on the other.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;To answer what is risk, it was defined to be the “probability of something happening multiplied by the resulting cost or benefit if it does” (Oxford English Dictionary). So it is based on cost/benefit, probability, and a subject. For probability, all possible risks must be considered and work in terms of how much harm would happen and how likely that is about to happen. These issues must be considered necessarily.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;An example in this context was that of the Syrian government where the bakeries were targeted as the bombers knew where the bakeries are, making them easy targets. It was discussed how in this backdrop of secure data release mechanism, local context is an important issue.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Another example of bad practice was the leak of information in the Ashley Madison case wherein several people have committed suicide.&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;Kinds of risk:&lt;/li&gt;
&lt;/ul&gt;
&lt;ol style="text-align: justify; "&gt;
&lt;li&gt;Physical harm:&lt;/li&gt;
&lt;/ol&gt;
&lt;p style="text-align: justify; "&gt;The next point of discussion was regarding kinds of the physical risks to data subjects when there is release/sharing of data related to them. Some of them were:&lt;/p&gt;
&lt;ol style="text-align: justify; "&gt;
&lt;li&gt;security issues&lt;/li&gt;
&lt;li&gt;hate speech&lt;/li&gt;
&lt;li&gt;voter issues&lt;/li&gt;
&lt;li&gt;police action&lt;/li&gt;
&lt;/ol&gt;
&lt;p style="text-align: justify; "&gt;Hence PII goes both ways- where some choose to run the risk of PII being identified; on the other hand some run the risk of being identified as the releaser of information.&lt;/p&gt;
&lt;ol style="text-align: justify; "&gt;
&lt;li&gt;Legal harms- to explain what can be legal harms posed in releasing or sharing data, an example was discussed of an image marking exercise of a military camp wherein people joined in, marked military equipment and discovered people who are from that country.&lt;/li&gt;
&lt;li&gt;Reputational harm as an organization primarily.&lt;/li&gt;
&lt;li&gt;Privacy breach- which can lead to all sorts of harms.&lt;/li&gt;
&lt;/ol&gt; 
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;Risk to whom?&lt;/li&gt;
&lt;/ul&gt;
&lt;p style="text-align: justify; "&gt;Data subjects – this includes:&lt;/p&gt;
&lt;ol style="text-align: justify; "&gt;
&lt;li&gt;Data collectors&lt;/li&gt;
&lt;li&gt;Data processing team&lt;/li&gt;
&lt;li&gt;Person releasing the data&lt;/li&gt;
&lt;li&gt;Person using the data&lt;/li&gt;
&lt;/ol&gt;
&lt;p style="text-align: justify; "&gt;Also, the likely hood of risk ranges from low, medium and high. We as a community are at a risk at worse.&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;PII: &lt;/li&gt;
&lt;/ul&gt;
&lt;p style="text-align: justify; "&gt;- Any data which can be used to identify any specific individual. Such information does not only include names, addresses or phone numbers but could also be data sets that don’t in themselves identify an individual.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;For example, in some places sharing of social security number is required for HIV+ status check-up; hence, one needs to be aware of the environment of data sets that go into it. In another situation where there is a small population and there is a need to identify people of a street, village or town for the purpose of religion, then even this data set can put them to risk.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Hence, awareness with respect to the demographics is important to ascertain how many people reside in that place, be aware of the environment and accordingly decide what data set must be made.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;- Another way to mitigate risks at the time of release/sharing of data is partial release only to some groups, like for the purpose of academics or to data subjects.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;- Different examples were discussed to identify how release of data irresponsibly has affected the data subjects and there is a need to work to mitigate harms caused in such cases.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Example- in the New York City taxi case data about every taxi ride was released-including pickup and drop locations, times, fares. Here it becomes more problematic if someone is visiting strip clubs, then re-identification takes place and this necessitates protection of people against such insinuation.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This shows how data sets can lead to re-identification, even when it is not required. Hence, the involved actors must understand the responsibilities when engaging in data collection or release and accordingly mitigate the risks so associated.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;- A concern was raised over collection and processing of the information of genetic diseases of a small population since practically it is not possible to guarantee that the information of data subjects to whom the data relates will not be released or exposed or it won’t be re-identifiable. Though best efforts would be made by experts, however, realistically, it is not possible to guarantee people that they will not be identified. So the question of informing people of such risks is highly crucial. It is suggested that one way of mitigating risks is involving the people and letting them know. Awareness regarding potential impact by breach of data or identification is very important.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;- Another factor for consideration is the context in which the information was collected. The context for collection of data seems to change over a period of time. For example, many human rights funders want information on their websites changed or removed in the backdrop of changing contexts, circumstances and situation. In this case also, the collection and release of data and the risks associated become important due to changing contexts.&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;What is missing from the world?&lt;/li&gt;
&lt;/ul&gt;
&lt;p style="text-align: justify; "&gt;Though recognition of risks has been done and is an ongoing process, what is missing from the world are uniform guidelines, rules or law. There are no policies for informed consent or for any means to mitigate risks collectively in a uniform manner. There must be adoption of principles of necessity, proportionality and informed consent.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/responsible-data-forum'&gt;https://cis-india.org/internet-governance/blog/responsible-data-forum&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>vanya</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    

   <dc:date>2015-09-06T14:29:14Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/the-times-of-india-sandhya-soman-august-23-2015-the-seedy-underbelly-of-revenge-porn">
    <title>The seedy underbelly of revenge porn</title>
    <link>https://cis-india.org/internet-governance/blog/the-times-of-india-sandhya-soman-august-23-2015-the-seedy-underbelly-of-revenge-porn</link>
    <description>
        &lt;b&gt;Intimate photos posted by angry exes are becoming part of an expanding online body of dirty work.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article by Sandhya Soman was published in the &lt;a class="external-link" href="http://timesofindia.indiatimes.com/home/sunday-times/deep-focus/The-seedy-underbelly-of-revenge-porn/articleshow/48627922.cms?from=mdr"&gt;Times of India&lt;/a&gt; on August 23, 2015.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;&lt;span id="advenueINTEXT" style="float:left; "&gt;Three  lakh 'Likes' aren't easy to come by. But Geeta isn't gloating. She's  livid, and waiting for the day a video-sharing site will take down the  popular clip of her having sex with her vengeful ex-husband. "Every  other day somebody calls or messages to say they've seen me," says  Geeta.&lt;br /&gt; &lt;br /&gt; She is not alone. Two weeks ago, law student Shrutanjaya  Bhardwaj Whatsapped women he knew asking if any of them had come across  cases of online sexual harassment. In a few hours, his phone was filled  with tales of harassment by ex-boyfriends and strangers. Instances  ranged from strangers publishing morphed photographs on Facebook, to  ex-husbands and boyfriends circulating intimate photos and videos on  porn sites. Of the 40 responses, around 25 were cases of abuse by former  partners. "I have heard friends talking about the problem, but never  realized it was this bad," says Bhardwaj.&lt;br /&gt; &lt;br /&gt; These days, revenge  is best served online - it travels faster and has potential for greater  damage. But despite the widespread nature of the crime, many targets  hesitate to complain for fear of being shamed and blamed. "A 15-year-old  girl is going to worry about how her parents will react if she talks  about it," says Chinmayi Arun, research director, Centre for  Communication Governance at Delhi National Law University. There is also  fear of harassment by the police, says Rohini Lakshane, researcher,  Centre for Internet and Society. Worst of all is the waiting. "Even if a  police complaint is filed, it takes ages to find out who shot it, who  uploaded it and where it is circulated. Such content is mirrored across  many sites," she says.&lt;br /&gt; &lt;br /&gt; Geeta is familiar with the routine. Her  harassment started with photographs sent to family, friends and  colleagues. After an acrimonious divorce, several videos were released  in 2013. 
"There were some 25-30 videos on various sites.&lt;br /&gt; &lt;br /&gt; After  an FIR was filed, the police wrote to websites and some of the links  were removed," says Geeta, who has been flagging content on a popular  site, which has not yet responded to her privacy violation report. "My  face is seen clearly on it. People even come up to me in restaurants  saying they've seen it. How do I get on with my life?" asks a distraught  Geeta. She also recently filed an affidavit supporting the  controversial porn ban PIL in a last-ditch effort to erase the abuse  that began after her divorce.&lt;br /&gt; &lt;br /&gt; The cyber cell officer in charge  of her case says he had got websites to shut down several URLs but was  thwarted by the repeal of section 66A of the IT Act that dealt with  offensive messages sent electronically. When asked why section 67 (cyber  pornography) of the same act and various sections in the criminal law  couldn't be used, the officer says that only 66A is applicable to the  evidence he has. "I asked for more links and she sent them to me. We'll  see if other sections can be applied," he says. Lawyers and activists,  argue that existing laws are good enough like sections 354A (sexual  harassment), 354C (voyeurism), 354D (stalking) and 509 (outraging  modesty) of the IPC.&lt;br /&gt; &lt;br /&gt; Though there are no official statistics  for what is popularly referred to as 'revenge' porn, there is a flood of  such images online. Lakshane, who studied consent in amateur  pornography for the NGO-run EroTICs India project in 2014, found  clandestinely shot clips to exhibitionist ones where faces are blurred  or cropped.&lt;br /&gt; &lt;br /&gt; Social activist Sunita Krishnan has raised the red  flag over several video clips, including two that show gang rape, which  were circulated on Whatsapp. Some of the content she came across showed  familiarity between the man and woman, indicating an existing  relationship. 
In one clip, the man says: "How dare you go with that  fellow. What you did it to him, do it to me."&lt;br /&gt; &lt;br /&gt; Most home-grown  clips end up on desi sites with servers abroad, making it difficult to  take down content. Some do have a policy of asking for consent of people  in the frame. But Lakshane, who wanted to test this policy, says when  she approached one website that has servers abroad saying that she had a  sexually explicit video, the reply was a one-liner asking her to send  it. "They didn't ask for any consent emails," she says. In lieu of  payment, they offered her a free account on another file-sharing site,  which seemed to partner with the site. With no financial links to those  submitting videos, sites like these make money out of subscriptions from  consumers, or ads.&lt;br /&gt; &lt;br /&gt; A few months ago, the CBI arrested a man  from Bengaluru for uploading porn clips, using high-end editing software  and cameras. Kaushik Kuonar allegedly headed a syndicate and was  supposed to be behind the rape clips reported by Krishnan. "I am  skeptical of the idea of amateur porn being randomly available across  the Internet. There seem to be people like the man in Bengaluru who are  apparently sourcing, distributing and making money out of it," says  Chinmayi Arun. "He had 474 clips, including some of rape," adds  Krishnan.&lt;br /&gt; &lt;br /&gt; Social media companies, meanwhile, say they're  working with authorities to prevent such violations. Facebook  spokesperson says the company removes content that violates its  community standards. It also works with the women and child development  ministry to help women stay safe online. Google, Microsoft, Twitter and  Reddit have promised to remove links to revenge porn on request, while  countries like Japan and Israel have made it illegal.&lt;br /&gt; &lt;br /&gt; In India,  the National Commission for Women started a consultation on online  harassment but is yet to submit a report. 
In the absence of clarity,  activists like Krishnan endorse the banning of porn sites. Not all agree  with sweeping solutions. Lakshane says sometimes a court order helps to  get tech companies to act faster on requests as in the case of a 2012  sex tape scandal where Google removed search results to 360 web pages.  Also, the term 'revenge' porn, she says, is a misnomer as the videos are  meant to shame women. "These are not movies where actors get paid.  Somebody else is making money off this gross violation of privacy." &lt;/span&gt;&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/the-times-of-india-sandhya-soman-august-23-2015-the-seedy-underbelly-of-revenge-porn'&gt;https://cis-india.org/internet-governance/blog/the-times-of-india-sandhya-soman-august-23-2015-the-seedy-underbelly-of-revenge-porn&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>praskrishna</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2015-09-27T14:25:43Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/hindustan-times-august-20-2015-aloke-tikku-stats-from-2014-reveal-horror-of-scrapped-section-66-a-of-it-act">
    <title>Stats from 2014 reveal horror of scrapped section 66A of IT Act </title>
    <link>https://cis-india.org/internet-governance/news/hindustan-times-august-20-2015-aloke-tikku-stats-from-2014-reveal-horror-of-scrapped-section-66-a-of-it-act</link>
    <description>
        &lt;b&gt;An average of six netizens were arrested every day in 2014 for posting offensive content online under section 66A of the Information Technology Act, a draconian and much abused law no longer in use.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article by Aloke Tikku was &lt;a class="external-link" href="http://www.hindustantimes.com/tech/stats-from-2014-reveal-horror-of-scrapped-section-66a-of-it-act/story-G2xCoELsNbxpl5dXvl0aFJ.html"&gt;published in the Hindustan Times&lt;/a&gt; on August 20, 2015. Pranesh Prakash gave inputs.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;A first-of-its-kind set of statistics compiled by the National Crime  Records Bureau reveals that 2,402 people, including 29 women, were  arrested in 4,192 cases under section 66A — which was struck down in  March by the Supreme Court that ruled that it violated the  constitutional freedom of speech.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;These arrests made up nearly 60% of all arrests under the IT Act, and  40% of arrests for cyber crimes in 2014. It was also a little less than  twice the number of people caught red-handed accepting bribes the same  year.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;“These statistics are shocking. I had assumed there may be a few  hundred cases, at worst,” said Shreya Singhal, on whose petition the top  court had scrapped the provision.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;“It validates the judgment even more than when it was delivered,” said Singhal, a law student.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Quite like Rinu Srinivasan – one of two Mumbai girls arrested in 2012  for a Facebook post regarding Shiv Sena chief Bal Thackrey’s death —  nearly half of those arrested (1,217) were in the 18-29 age group. This  included nine girls. Another 1,015 were in the 30-44 age group while 166  were between 45 and 59 years old.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The now-repealed section 66A prescribed a three-year jail term for  online content that could be construed to be offensive or false.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This is the first time the NCRB has collected detailed statistics on  cyber crimes, listing out the number of cases registered under each  section of the IT Act.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;A government official conceded that the large number of cases  registered under section 66A meant that the Centre’s guidelines — issued  after a public outcry in November 2012 against its misuse — had served  little purpose. In May 2013, the Supreme Court too put its weight behind  the guidelines and made it legally binding on them.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In these guidelines, the Centre had made prior approval of an  inspector general of police-rank officer mandatory for all arrests under  section 66A. “Either this rule wasn’t followed or the IGPs did not rise  to the occasion,” the official said.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The NCRB did not give a state-wise break-up of arrests under section 66A.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;But in terms of cases registered, Uttar Pradesh led the pack with  898, followed by Karnataka (603), Assam (377), Maharashtra (375),  Telangana (352), Rajasthan (291), Kerala (229), Punjab (123) and Delhi  (137).&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;“It was “unconscionable that 2,402 persons were arrested in 2014, and  many made to languish in jail, under a provision that we now know to  have been unconstitutional,” said Pranesh Prakash at the  Bengaluru-headquartered research and advocacy group, Centre for Internet  and Society.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;“Even after the Supreme Court laid down more stringent ad-hoc  guidelines on arrests under Section 66A, it is clear they were not  effective in the least: 860 charge-sheets were filed by the police under  Section 66A in 2014,” the policy director at CIS said.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/hindustan-times-august-20-2015-aloke-tikku-stats-from-2014-reveal-horror-of-scrapped-section-66-a-of-it-act'&gt;https://cis-india.org/internet-governance/news/hindustan-times-august-20-2015-aloke-tikku-stats-from-2014-reveal-horror-of-scrapped-section-66-a-of-it-act&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>praskrishna</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>IT Act</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Section 66A</dc:subject>
    

   <dc:date>2015-09-26T07:28:13Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/security-privacy-transparency-and-technology">
    <title>Security: Privacy, Transparency and Technology</title>
    <link>https://cis-india.org/internet-governance/blog/security-privacy-transparency-and-technology</link>
    <description>
        &lt;b&gt;The Centre for Internet and Society (CIS) has been involved in privacy and data protection research for the last five years. It has participated as a member of the Justice A.P. Shah Committee, which has influenced the draft Privacy Bill being authored by the Department of Personnel and Training. It has organised 11 multistakeholder roundtables across India over the last two years to discuss a shadow Privacy Bill drafted by CIS with the participation of privacy commissioners and data protection authorities from Europe and Canada.&lt;/b&gt;
        
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;p&gt;The article was co-authored by Sunil Abraham, Elonnai Hickok and Tarun Krishnakumar. It was published by Observer Research Foundation, &lt;a href="https://cis-india.org/internet-governance/blog/security-privacy-transparency-technology.pdf" class="internal-link"&gt;Digital Debates 2015: CyFy Journal Volume 2&lt;/a&gt;.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify;"&gt;Our centre’s work on privacy was considered incomplete by some stakeholders because of a lack of focus in the area of cyber security and therefore we have initiated research on it from this year onwards. In this article, we have undertaken a preliminary examination of the theoretical relationships between the national security imperative and privacy, transparency and technology.&lt;/p&gt;
&lt;h3 style="text-align: justify;"&gt;Security and Privacy&lt;/h3&gt;
&lt;p style="text-align: justify;"&gt;Daniel J. Solove has identified the tension between security and privacy as a false dichotomy: "Security and privacy often clash, but there need not be a zero-sum tradeoff." &lt;a name="fr1" href="#fn1"&gt;[1]&lt;/a&gt; Further unpacking this false dichotomy, Bruce Schneier says, "There is no security without privacy. And liberty requires both security and privacy." &lt;a name="fr2" href="#fn2"&gt;[2]&lt;/a&gt; Effectively, it could be said that privacy is a precondition for security, just as security is a precondition for privacy. A secure information system cannot be designed without guaranteeing the privacy of its authentication factors, and it is not possible to guarantee privacy of authentication factors without having confidence in the security of the system. Often policymakers talk about a balance between the privacy and security imperatives—in other words a zero-sum game. Balancing these imperatives is a foolhardy approach, as it simultaneously undermines both imperatives. Balancing privacy and security should instead be framed as an optimisation problem. Indeed, during a time when oversight mechanisms have failed even in so-called democratic states, the regulatory power of technology &lt;a name="fr3" href="#fn3"&gt;[3]&lt;/a&gt; should be seen as an increasingly key ingredient to the solution of that optimisation problem.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;Data retention is required in most jurisdictions for law enforcement, intelligence and military purposes. Here are three examples of how security and privacy can be optimised when it comes to Internet Service Provider (ISP) or telecom operator logs:&lt;/p&gt;
&lt;ol&gt;
&lt;li style="text-align: justify;"&gt;&lt;strong&gt;Data Retention&lt;/strong&gt;: We propose that the office of the Privacy Commissioner generate a cryptographic key pair for each internet user and give one key to the ISP / telecom operator. This key would be used to encrypt logs, thereby preventing unauthorised access. Once there is executive or judicial authorisation, the Privacy Commissioner could hand over the second key to the authorised agency. There could even be an emergency procedure and the keys could be automatically collected by concerned agencies from the Privacy Commissioner. This will need to be accompanied by a policy that criminalises the possession of unencrypted logs by ISP and telecom operators.&lt;br /&gt;&lt;br /&gt;&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;&lt;strong&gt;Privacy-Protective Surveillance&lt;/strong&gt;: Ann Cavoukian and Khaled El Emam &lt;a name="fr4" href="#fn4"&gt;[4]&lt;/a&gt; have proposed combining intelligent agents, homomorphic encryption and probabilistic graphical models to provide “a positive-sum, ‘win–win’ alternative to current counter-terrorism surveillance systems.” They propose limiting collection of data to “significant” transactions or events that could be associated with terrorist-related activities, limiting analysis to wholly encrypted data, which then does not just result in “discovering more patterns and relationships without an understanding of their context” but rather “intelligent information—information selectively gathered and placed into an appropriate context to produce actual knowledge.” Since fully homomorphic encryption may be unfeasible in real-world systems, they have proposed use of partially homomorphic encryption. But experts such as Prof. John Mallery from MIT are also working on solutions based on fully homomorphic encryption.&lt;br /&gt;&lt;br /&gt;&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;&lt;strong&gt;Fishing Expedition Design&lt;/strong&gt;: Madan Oberoi, Pramod Jagtap, Anupam Joshi, Tim Finin and Lalana Kagal have proposed a standard &lt;a name="fr5" href="#fn5"&gt;[5]&lt;/a&gt; that could be adopted by authorised agencies, telecom operators and ISPs. Instead of giving authorised agencies complete access to logs, they propose a format for database queries, which could be sent to the telecom operator or ISP by authorised agencies. The telecom operator or ISP would then process the query, and anonymise/obfuscate the result-set in an automated fashion based on applicable privacypolicies/regulation. Authorised agencies would then hone in on a subset of the result-set that they would like with personal identifiers intact; this smaller result set would then be shared with the authorised agencies.&lt;/li&gt;&lt;/ol&gt;
&lt;p style="text-align: justify;"&gt;An optimisation approach to resolving the false dichotomy between privacy and security will not allow for a total surveillance regime as pursued by the US administration. Total surveillance brings with it the ‘honey pot’ problem: If all the meta-data and payload data of citizens is being harvested and stored, then the data store will become a single point of failure and will become another target for attack. The next Snowden may not have honourable intentions and might decamp with this ‘honey pot’ itself, which would have disastrous consequences.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;If total surveillance will completely undermine the national security imperative, what then should be the optimal level of surveillance in a population? The answer depends upon the existing security situation. If this is represented on a graph with security on the y-axis and the proportion of the population under surveillance on the x-axis, the benefits of surveillance could be represented by an inverted hockey-stick curve. To begin with, there would already be some degree of security. As a small subset of the population is brought under surveillance, security would increase till an optimum level is reached, after which, enhancing the number of people under surveillance would not result in any security pay-off. Instead, unnecessary surveillance would diminish security as it would introduce all sorts of new vulnerabilities. Depending on the existing security situation, the head of the hockey-stick curve might be bigger or smaller. To use a gastronomic analogy, optimal surveillance is like salt in cooking—necessary in small quantities but counter-productive even if slightly in excess.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;In India the designers of surveillance projects have fortunately rejected the total surveillance paradigm. For example, the objective of the National Intelligence Grid (NATGRID) is to streamline and automate targeted surveillance; it is introducing technological safeguards that will allow express combinations of result-sets from 22 databases to be made available to 12 authorised agencies. This is not to say that the design of the NATGRID cannot be improved.&lt;/p&gt;
&lt;h3&gt;Security and Transparency&lt;/h3&gt;
&lt;p style="text-align: justify;"&gt;There are two views on security and transparency: One, security via obscurity as advocated by vendors of proprietary software, and two, security via transparency as advocated by free/open source software (FOSS) advocates and entrepreneurs. Over the last two decades, public and industry opinion has swung towards security via transparency. This is based on the Linus rule that “given enough eyeballs, all bugs are shallow.” But does this mean that transparency is a necessary and sufficient condition? Unfortunately not, and therefore it is not necessarily true that FOSS and open standards will be more secure than proprietary software and proprietary standards.&lt;/p&gt;
&lt;blockquote style="text-align: justify;" class="pullquote"&gt;Optimal surveillance is like salt in cooking—necessary in small quantities but counter-productive even if slightly in excess.&lt;/blockquote&gt;
&lt;p style="text-align: justify;"&gt;The recent detection of the Heartbleed &lt;a name="fr6" href="#fn6"&gt;[6]&lt;/a&gt; security bug in Open SSL, &lt;a name="fr7" href="#fn7"&gt;[7]&lt;/a&gt; causing situations where more data can be read than should be allowed, and Snowden’s revelations about the compromise of some open cryptographic standards (which depend on elliptic curves), developed by the US National Institute of Standards and Technology, are stark examples. &lt;a name="fr8" href="#fn8"&gt;[8]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;At the same time, however, open standards and FOSS are crucial to maintaining the balance of power in information societies, as civil society and the general public are able to resist the powers of authoritarian governments and rogue corporations using cryptographic technology. These technologies allow for anonymous speech, pseudonymous speech, private communication, online anonymity and circumvention of surveillance and censorship. For the media, these technologies enable anonymity of sources and the protection of whistle-blowers—all phenomena that are critical to the functioning of a robust and open democratic society. But these very same technologies are also required by states and by the private sector for a variety of purposes—national security, e-commerce, e-banking, protection of all forms of intellectual property, and services that depend on confidentiality, such as legal or medical services.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;In order words, all governments, with the exception of the US government, have common cause with civil society, media and the general public when it comes to increasing the security of open standards and FOSS. Unfortunately, this can be quite an expensive task because the re-securing of open cryptographic standards depends on mathematicians. Of late, mathematical research outputs that can be militarised are no longer available in the public domain because the biggest employers of mathematicians worldwide today are the US military and intelligence agencies. If other governments invest a few billion dollars through mechanisms like Knowledge Ecology International’s proposed World Trade Organization agreement on the supply of knowledge as a public good, we would be able to internationalise participation in standard-setting organisations and provide market incentives for greater scrutiny of cryptographic standards and patching of vulnerabilities of FOSS. This would go a long way in addressing the trust deficit that exists on the internet today.&lt;/p&gt;
&lt;h3 style="text-align: justify;"&gt;Security and Technology&lt;/h3&gt;
&lt;p style="text-align: justify;"&gt;A techno-utopian understanding of security assumes that more technology, more recent technology and more complex technology will necessarily lead to better security outcomes.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;This is because the security discourse is dominated by vendors with sales targets who do not present a balanced or accurate picture of the technologies that they are selling. This has resulted in state agencies and the general public having an exaggerated understanding of the capabilities of surveillance technologies that is more aligned with Hollywood movies than everyday reality.&lt;/p&gt;
&lt;h3 style="text-align: justify;"&gt;More Technology&lt;/h3&gt;
&lt;p style="text-align: justify;"&gt;Increasing the number of x-ray machines or full-body scanners at airports by a factor of ten or hundred will make the airport less secure unless human oversight is similarly increased. Even with increased human oversight, all that has been accomplished is an increase in the potential locations that can be compromised. The process of hardening a server usually involves stopping non-essential services and removing non-essential software. This reduces the software that should be subject to audit, continuously monitored for vulnerabilities and patched as soon as possible. Audits, ongoing monitoring and patching all cost time and money and therefore, for governments with limited budgets, any additional unnecessary technology should be seen as a drain on the security budget. Like with the airport example, even when it comes to a single server on the internet, it is clear that, from a security perspective, more technology without a proper functionality and security justification is counter-productive. To reiterate, throwing increasingly more technology at a problem does not make things more secure; rather, it results in a proliferation of vulnerabilities.&lt;/p&gt;
&lt;h3 style="text-align: justify;"&gt;Latest Technology&lt;/h3&gt;
&lt;p style="text-align: justify;"&gt;Reports that a number of state security agencies are contemplating returning to typewriters for sensitive communications in the wake of Snowden’s revelations makes it clear that some older technologies are harder to compromise in comparison to modern technology. &lt;a name="fr9" href="#fn9"&gt;[9]&lt;/a&gt; Between iris- and fingerprint-based biometric authentication, logically, it would be easier for a criminal to harvest images of irises or authentication factors in bulk fashion using a high resolution camera fitted with a zoom lens in a public location, in comparison to mass lifting of fingerprints.&lt;/p&gt;
&lt;h3 style="text-align: justify;"&gt;Complex Technology&lt;/h3&gt;
&lt;p style="text-align: justify;"&gt;Fifteen years ago, Bruce Schneier said, "The worst enemy of security is complexity. This has been true since the beginning of computers, and it’s likely to be true for the foreseeable future." &lt;a name="fr10" href="#fn10"&gt;[10]&lt;/a&gt; This is because complexity increases fragility; every feature is also a potential source of vulnerabilities and failures. The simpler Indian electronic machines used until the 2014 elections are far more secure than the Diebold voting machines used in the 2004 US presidential elections. Similarly when it comes to authentication, a pin number is harder to beat without user-conscious cooperation in comparison to iris- or fingerprint-based biometric authentication.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;In the following section of the paper we have identified five threat scenarios &lt;a name="fr11" href="#fn11"&gt;[11]&lt;/a&gt; relevant to India and identified solutions based on our theoretical framing above.&lt;/p&gt;
&lt;h3 style="text-align: justify;"&gt;Threat Scenarios and Possible Solutions&lt;/h3&gt;
&lt;p style="text-align: justify;"&gt;&lt;strong&gt;Hacking the NIC Certifying Authority&lt;/strong&gt;&lt;br /&gt;One of the critical functions served by the National Informatics Centre (NIC) is as a Certifying Authority (CA). &lt;a name="fr12" href="#fn12"&gt;[12]&lt;/a&gt; In this capacity, the NIC issues digital certificates that authenticate web services and allow for the secure exchange of information online. &lt;a name="fr13" href="#fn13"&gt;[13]&lt;/a&gt; Operating systems and browsers maintain lists of trusted CA root certificates as a means of easily verifying authentic certificates. India’s Controller of Certifying Authority’s certificates issued are included in the Microsoft Root list and recognised by the majority of programmes running on Windows, including Internet Explorer and Chrome. &lt;a name="fr14" href="#fn14"&gt;[14]&lt;/a&gt; In 2014, the NIC CA’s infrastructure was compromised, and digital certificates were issued in NIC’s name without its knowledge. &lt;a name="fr15" href="#fn15"&gt;[15]&lt;/a&gt; Reports indicate that NIC did not "have an appropriate monitoring and tracking system in place to detect such intrusions immediately." &lt;a name="fr16" href="#fn16"&gt;[16]&lt;/a&gt; The implication is that websites could masquerade as another domain using the fake certificates. Personal data of users can be intercepted or accessed by third parties by the masquerading website. The breach also rendered web servers and websites of government bodies vulnerable to attack, and end users were no longer sure that data on these websites was accurate and had not been tampered with. &lt;a name="fr17" href="#fn17"&gt;[17]&lt;/a&gt; The NIC CA was forced to revoke all 250,000 SSL Server Certificates issued until that date &lt;a name="fr18" href="#fn18"&gt;[18]&lt;/a&gt; and is no longer issuing digital certificates for the time being. 
&lt;a name="fr19" href="#fn19"&gt;[19]&lt;/a&gt;Public key pinning is a means through which websites can specify which certifying authorities have issued certificates for that site. Public key pinning can prevent man-in-the-middle attacks due to fake digital certificates. &lt;a name="fr20" href="#fn20"&gt;[20]&lt;/a&gt; Certificate Transparency allows anyone to check whether a certificate has been properly issued, seeing as certifying authorities must publicly publish information about the digital certificates that they have issued. Though this approach does not prevent fake digital certificates from being issued, it can allow for quick detection of misuse. &lt;a name="fr21" href="#fn21"&gt;[21]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;&lt;strong&gt;‘Logic Bomb’ against Airports&lt;/strong&gt;&lt;br /&gt;Passenger operations in New Delhi’s Indira Gandhi International Airport depend on a centralised operating system known as the Common User Passenger Processing System (CUPPS). The system integrates numerous critical functions such as the arrival and departure times of flights, and manages the reservation system and check-in schedules. &lt;a name="fr22" href="#fn22"&gt;[22]&lt;/a&gt; In 2011, a logic bomb attack was remotely launched against the system to introduce malicious code into the CUPPS software. The attack disabled the CUPPS operating system, forcing a number of check-in counters to shut down completely, while others reverted to manual check-in, resulting in over 50 delayed flights. Investigations revealed that the attack was launched by three disgruntled employees who had assisted in the installation of the CUPPS system at the New Delhi Airport. &lt;a name="fr23" href="#fn23"&gt;[23]&lt;/a&gt; Although in this case the impact of the attack was limited to flight delay, experts speculate that the attack was meant to take down the entire system. The disruption and damage resulting from the shutdown of an entire airport would be extensive.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;Adoption of open hardware and FOSS is one strategy to avoid and mitigate the risk of such vulnerabilities. The use of devices that embrace the concept of open hardware and software specifications must be encouraged, as this helps the FOSS community to be vigilant in detecting and reporting design deviations and investigate into probable vulnerabilities.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;&lt;strong&gt;Attack on Critical Infrastructure&lt;/strong&gt;&lt;br /&gt;The Nuclear Power Corporation of India encounters and prevents numerous cyber attacks every day. &lt;a name="fr24" href="#fn24"&gt;[24]&lt;/a&gt; The best known example of a successful nuclear plant hack is the Stuxnet worm that thwarted the operation of an Iranian nuclear enrichment complex and set back the country’s nuclear programme. &lt;a name="fr25" href="#fn25"&gt;[25] &lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;The worm had the ability to spread over the network and would activate when a specific configuration of systems was encountered &lt;a name="fr26" href="#fn26"&gt;[26]&lt;/a&gt; and connected to one or more Siemens programmable logic controllers. &lt;a name="fr27" href="#fn27"&gt;[27]&lt;/a&gt; The worm was suspected to have been initially introduced through an infected USB drive into one of the controller computers by an insider, thus crossing the air gap. &lt;a name="fr28" href="#fn28"&gt;[28]&lt;/a&gt; The worm used information that it gathered to take control of normal industrial processes (to discreetly speed up centrifuges, in the present case), leaving the operators of the plant unaware that they were being attacked. This incident demonstrates how an attack vector introduced into the general internet can be used to target specific system configurations. When the target of a successful attack is a sector as critical and secured as a nuclear complex, the implications for a country’s security and infrastructure are potentially grave.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;Security audits and other transparency measures to identify vulnerabilities are critical in sensitive sectors. Incentive schemes such as prizes, contracts and grants may be evolved for the private sector and academia to identify vulnerabilities in the infrastructure of critical resources to enable/promote security auditing of infrastructure.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;&lt;strong&gt;Micro Level: Chip Attacks&lt;/strong&gt;&lt;br /&gt;Semiconductor devices are ubiquitous in electronic devices. The US, Japan, Taiwan, Singapore, Korea and China are the primary countries hosting manufacturing hubs of these devices. India currently does not produce semiconductors, and depends on imported chips. This dependence on foreign semiconductor technology can result in the import and use of compromised or fraudulent chips by critical sectors in India. For example, hardware Trojans, which may be used to access personal information and content on a device, may be inserted into the chip. Such breaches/transgressions can render equipment in critical sectors vulnerable to attack and threaten national security. &lt;a name="fr29" href="#fn29"&gt;[29]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;Indigenous production of critical technologies and the development of manpower and infrastructure to support these activities are needed. The Government of India has taken a number of steps towards this. For example, in 2013, the Government of India approved the building of two Semiconductor Wafer Fabrication (FAB) manufacturing facilities &lt;a name="fr30" href="#fn30"&gt;[30]&lt;/a&gt; and as of January 2014, India was seeking to establish its first semiconductor characterisation lab in Bangalore. &lt;a name="fr31" href="#fn31"&gt;[31]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;&lt;strong&gt;Macro Level: Telecom and Network Switches&lt;/strong&gt;&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;The possibility of foreign equipment containing vulnerabilities and backdoors that are built into its software and hardware gives rise to concerns that India’s telecom and network infrastructure is vulnerable to being hacked and accessed by foreign governments (or non-state actors) through the use of spyware and malware that exploit such vulnerabilities. In 2013, some firms, including ZTE and Huawei, were barred by the Indian government from participating in a bid to supply technology for the development of its National Optic Network project due to security concerns. &lt;a name="fr32" href="#fn32"&gt;[32]&lt;/a&gt; Similar concerns have resulted in the Indian government holding back the conferment of ‘domestic manufacturer’ status on both these firms. &lt;a name="fr33" href="#fn33"&gt;[33]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;Following reports that Chinese firms were responsible for transnational cyber attacks designed to steal confidential data from overseas targets, there have been moves to establish laboratories to test imported telecom equipment in India. &lt;a name="fr34" href="#fn34"&gt;[34]&lt;/a&gt; Despite these steps, in a February 2014 incident the state-owned telecommunication company  Bharat Sanchar Nigam Ltd’s network was hacked, allegedly by Huawei. &lt;a name="fr35" href="#fn35"&gt;[35]&lt;/a&gt;&lt;/p&gt;
&lt;blockquote style="text-align: justify;" class="pullquote"&gt;Security practitioners and policymakers need to avoid the zero-sum framing prevalent in popular discourse regarding security VIS-A-VIS privacy, transparency and technology.&lt;/blockquote&gt;
&lt;p style="text-align: justify;"&gt;A successful hack of the telecom infrastructure could result in massive disruption in internet and telecommunications services. Large-scale surveillance and espionage by foreign actors would also become possible, placing, among others, both governmental secrets and individuals personal information at risk.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;While India cannot afford to impose a general ban on the import of foreign telecommunications equipment, a number of steps can be taken to address the risk of inbuilt security vulnerabilities. Common International Criteria for security audits could be evolved by states to ensure compliance of products with international norms and practices. While India has already established common criteria evaluation centres, &lt;a name="fr36" href="#fn36"&gt;[36]&lt;/a&gt; the government monopoly over the testing function has resulted in only three products being tested so far. A Code Escrow Regime could be set up where manufacturers would be asked to deposit source code with the Government of India for security audits and verification. The source code could be compared with the shipped software to detect inbuilt vulnerabilities.&lt;/p&gt;
&lt;h3 style="text-align: justify;"&gt;Conclusion&lt;/h3&gt;
&lt;p style="text-align: justify;"&gt;Cyber security cannot be enhanced without a proper understanding of the relationship between security and other national imperatives such as privacy, transparency and technology. This paper has provided an initial sketch of those relationships, but sustained theoretical and empirical research is required in India so that security practitioners and policymakers avoid the zero-sum framing prevalent in popular discourse and take on the hard task of solving the optimisation problem by shifting policy, market and technological levers simultaneously. These solutions must then be applied in multiple contexts or scenarios to determine how they should be customised to provide maximum security bang for the buck.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify;"&gt;[&lt;a name="fn1" href="#fr1"&gt;1&lt;/a&gt;]. Daniel J. Solove, Chapter 1 in Nothing to Hide: The False Tradeoff between Privacy and Security (Yale University Press: 2011), http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1827982.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;[&lt;a name="fn2" href="#fr2"&gt;2&lt;/a&gt;]. Bruce Schneier, “What our Top Spy doesn’t get: Security and Privacy aren’t Opposites,” Wired, January 24, 2008, http://archive.wired.com/politics/security commentary/security matters/2008/01/securitymatters_0124 and Bruce Schneier, “Security vs. Privacy,” Schneier on Security, January 29, 2008, https://www.schneier.com/blog/archives/2008/01/security_vs_pri.html.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;[&lt;a name="fn3" href="#fr3"&gt;3&lt;/a&gt;]. There are four sources of power in internet governance: Market power exerted by private sector organisations; regulatory power exerted by states; technical power exerted by anyone who has access to certain categories of technology, such as cryptography; and finally, the power of public pressure sporadically mobilised by civil society. A technically sound encryption standard, if employed by an ordinary citizen, cannot be compromised using the power of the market or the regulatory power of states or public pressure by civil society. In that sense, technology can be used to regulate state and market behaviour.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;[&lt;a name="fn4" href="#fr4"&gt;4&lt;/a&gt;]. Ann Cavoukian and Khaled El Emam, “Introducing Privacy-Protective Surveillance: Achieving Privacy and Effective Counter-Terrorism,” Information &amp;amp; Privacy Commisioner, September 2013, Ontario, Canada, http://www.privacybydesign.ca/content/uploads/2013/12/pps.pdf.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;[&lt;a name="fn5" href="#fr5"&gt;5&lt;/a&gt;]. Madan Oberoi, Pramod Jagtap, Anupam Joshi, Tim Finin and Lalana Kagal, “Information Integration and Analysis: A Semantic Approach to Privacy”(presented at the third IEEE International Conference on Information Privacy, Security, Risk and Trust, Boston, USA, October 2011), ebiquity.umbc.edu/_file_directory_/papers/578.pdf.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;[&lt;a name="fn6" href="#fr6"&gt;6&lt;/a&gt;]. Bruce Byfield, “Does Heartbleed disprove ‘Open Source is Safer’?,” Datamation, April 14, 2014, http://www.datamation.com/open-source/does-heartbleed-disprove-open-source-is-safer-1.html.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;[&lt;a name="fn7" href="#fr7"&gt;7&lt;/a&gt;]. “Cybersecurity Program should be more transparent, protect privacy,” Centre for Democracy and Technology Insights, March 20, 2009, https://cdt.org/insight/cybersecurity-program-should-be-more-transparent-protect-privacy/#1.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;[&lt;a name="fn8" href="#fr8"&gt;8&lt;/a&gt;]. “Cracked Credibility,” The Economist, September 14, 2013, http://www.economist.com/news/international/21586296-be-safe-internet-needs-reliable-encryption-standards-software-and.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;[&lt;a name="fn9" href="#fr9"&gt;9&lt;/a&gt;]. Miriam Elder, “Russian guard service reverts to typewriters after NSA leaks,” The Guardian, July 11, 2013, www.theguardian.com/world/2013/jul/11/russia-reverts-paper-nsa-leaks and Philip Oltermann, “Germany ‘may revert to typewriters’ to counter hi-tech espionage,” The Guardian, July 15, 2014, www.theguardian.com/world/2014/jul/15/germany-typewriters-espionage-nsa-spying-surveillance.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;[&lt;a name="fn10" href="#fr10"&gt;10&lt;/a&gt;]. Bruce Schneier, “A Plea for Simplicity,” Schneier on Security, November 19, 1999, https://www.schneier.com/essays/archives/1999/11/a_plea_for_simplicit.html.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;[&lt;a name="fn11" href="#fr11"&gt;11&lt;/a&gt;]. With inputs from Pranesh Prakash of the Centre for Internet and Society and Sharathchandra Ramakrishnan of Srishti School of Art, Technology and Design.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;[&lt;a name="fn12" href="#fr12"&gt;12&lt;/a&gt;]. “Frequently Asked Questions,” Controller of Certifying Authorities, Department of Electronics and Information Technology, Government of India, http://cca.gov.in/cca/index.php?q=faq-page#n41.&lt;/p&gt;
&lt;p&gt;[&lt;a name="fn13" href="#fr13"&gt;13&lt;/a&gt;]. National Informatics Centre Homepage, Government of India, http://www.nic.in/node/41.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;[&lt;a name="fn14" href="#fr14"&gt;14&lt;/a&gt;]. Adam Langley, “Maintaining Digital Certificate Security,” Google Security Blog, July 8, 2014, http://googleonlinesecurity.blogspot.in/2014/07/maintaining-digital-certificate-security.html.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;[&lt;a name="fn15" href="#fr15"&gt;15&lt;/a&gt;]. This is similar to the kind of attack carried out against DigiNotar, a Dutch certificate authority. See: http://scholarcommons.usf.edu/cgi/viewcontent.cgi?article=1246&amp;amp;context=jss.&lt;/p&gt;
&lt;p&gt;[&lt;a name="fn16" href="#fr16"&gt;16&lt;/a&gt;]. R. Ramachandran, “Digital Disaster,” Frontline, August 22, 2014, http://www.frontline.in/the-nation/digital-disaster/article6275366.ece.&lt;/p&gt;
&lt;p&gt;[&lt;a name="fn17" href="#fr17"&gt;17&lt;/a&gt;]. Ibid.&lt;/p&gt;
&lt;p&gt;[&lt;a name="fn18" href="#fr18"&gt;18&lt;/a&gt;]. “NIC’s digital certification unit hacked,” Deccan Herald, July 16, 2014, http://www.deccanherald.com/content/420148/archives.php.&lt;/p&gt;
&lt;p&gt;[&lt;a name="fn19" href="#fr19"&gt;19&lt;/a&gt;]. National Informatics Centre Certifying Authority Homepage, Government of India, http://nicca.nic.in//.&lt;/p&gt;
&lt;p&gt;[&lt;a name="fn20" href="#fr20"&gt;20&lt;/a&gt;]. Mozilla Wiki, “Public Key Pinning,” https://wiki.mozilla.org/SecurityEngineering/Public_Key_Pinning.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;[&lt;a name="fn21" href="#fr21"&gt;21&lt;/a&gt;]. “Certificate Transparency - The quick detection of fraudulent digital certificates,” Ascertia, August 11, 2014, http://www.ascertiaIndira.com/blogs/pki/2014/08/11/certificate-transparency-the-quick-detection-of-fraudulent-digital-certificates.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;[&lt;a name="fn22" href="#fr22"&gt;22&lt;/a&gt;]. “Indira Gandhi International Airport (DEL/VIDP) Terminal 3, India,” Airport Technology.com, http://www.airport-technology.com/projects/indira-gandhi-international-airport-terminal -3/.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;[&lt;a name="fn23" href="#fr23"&gt;23&lt;/a&gt;]. “How techies used logic bomb to cripple Delhi Airport,” Rediff, November 21, 2011, http://www.rediff.com/news/report/how-techies-used-logic-bomb-to-cripple-delhi-airport/20111121 htm.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;[&lt;a name="fn24" href="#fr24"&gt;24&lt;/a&gt;]. Manu Kaushik and Pierre Mario Fitter, “Beware of the bugs,” Business Today, February 17, 2013, http://businesstoday.intoday.in/story/india-cyber-security-at-risk/1/191786.html.&lt;/p&gt;
&lt;p&gt;[&lt;a name="fn25" href="#fr25"&gt;25&lt;/a&gt;]. “Stuxnet ‘hit’ Iran nuclear plants,” BBC, November 22, 2010, http://www.bbc.com/news/technology-11809827.&lt;/p&gt;
&lt;p&gt;[&lt;a name="fn26" href="#fr26"&gt;26&lt;/a&gt;]. In this case, systems using Microsoft Windows and running Siemens Step7 software were targeted.&lt;/p&gt;
&lt;p&gt;[&lt;a name="fn27" href="#fr27"&gt;27&lt;/a&gt;]. Jonathan Fildes, “Stuxnet worm ‘targeted high-value Iranian assets’,” BBC, September 23, 2010, http://www.bbc.com/news/technology-11388018.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;[&lt;a name="fn28" href="#fr28"&gt;28&lt;/a&gt;]. Farhad Manjoo, “Don’t Stick it in: The dangers of USB drives,” Slate, October 5, 2010, http://www.slate.com/articles/technology/technology/2010/10/dont_stick_it_in.html.&lt;/p&gt;
&lt;p&gt;[&lt;a name="fn29" href="#fr29"&gt;29&lt;/a&gt;]. Ibid.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;[&lt;a name="fn30" href="#fr30"&gt;30&lt;/a&gt;]. “IBM invests in new $5bn chip fab in India, so is chip sale off?,” ElectronicsWeekly, February 14, 2014, http://www.electronicsweekly.com/news/business/ibm-invests-new-5bn-chip-fab-india-chip-sale-2014-02/.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;[&lt;a name="fn31" href="#fr31"&gt;31&lt;/a&gt;]. NT Balanarayan, “Cabinet Approves Creation of Two Semiconductor Fabrication Units,” Medianama, February 17, 2014, http://articles.economictimes.indiatimes.com/2014-02-04/news/47004737_1_indian-electronics-special-incentive-package-scheme-semiconductor-association.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;[&lt;a name="fn32" href="#fr32"&gt;32&lt;/a&gt;]. Jamie Yap, “India bars foreign vendors from national broadband initiative,” ZD Net, January 21, 2013, http://www.zdnet.com/in/india-bars-foreign-vendors-from-national-broadband-initiative-7000010055/.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;[&lt;a name="fn33" href="#fr33"&gt;33&lt;/a&gt;]. Kevin Kwang, “India holds back domestic-maker status for Huawei, ZTE,” ZD Net, February 6, 2013, http://www.zdnet.com/in/india-holds-back-domestic-maker-status-for-huawei-zte-70 00010887/. Also see “Huawei, ZTE await domestic-maker tag,” The Hindu, February 5, 2013, http://www.thehindu.com/business/companies/huawei-zte-await-domesticmaker-tag/article4382888.ece.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;[&lt;a name="fn34" href="#fr34"&gt;34&lt;/a&gt;]. Ellyne Phneah, “Huawei, ZTE under probe by Indian government,” ZD Net, May 10, 2013, http://www.zdnet.com/in/huawei-zte-under-probe-by-indian-government-7000015185/.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;[&lt;a name="fn35" href="#fr35"&gt;35&lt;/a&gt;]. Devidutta Tripathy, “India investigates report of Huawei hacking state carrier network,” Reuters, February 6, 2014, http://www.reuters.com/article/2014/02/06/us-india-huawei-hacking-idUSBREA150QK20140206.&lt;/p&gt;
&lt;p&gt;[&lt;a name="fn36" href="#fr36"&gt;36&lt;/a&gt;]. “Products Certified,” Common Criteria Portal of India, http://www.commoncriteria-india.gov.in/Pages/ProductsCertified.aspx.&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/security-privacy-transparency-and-technology'&gt;https://cis-india.org/internet-governance/blog/security-privacy-transparency-and-technology&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>sunil</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Big Data</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Featured</dc:subject>
    
    
        <dc:subject>Homepage</dc:subject>
    

   <dc:date>2015-09-15T10:53:52Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/review-of-policy-debate-around-big-data-and-internet-of-things">
    <title>A Review of the Policy Debate around Big Data and Internet of Things</title>
    <link>https://cis-india.org/internet-governance/blog/review-of-policy-debate-around-big-data-and-internet-of-things</link>
    <description>
        &lt;b&gt;This blog post seeks to review and understand how regulators and experts across jurisdictions are reacting to Big Data and Internet of Things (IoT) from a policy perspective.&lt;/b&gt;
        &lt;h3&gt;Defining and Connecting Big Data and Internet of Things&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The Internet of Things is a term that refers to networked objects and systems that can connect to the internet and can transmit and receive data. Characteristics of IoT include the gathering of information through sensors, the automation of functions, and analysis of collected data.[1] For IoT devices, because of the &lt;i&gt;velocity&lt;/i&gt; at which data is generated, the &lt;i&gt;volume&lt;/i&gt; of data that is generated, and the &lt;i&gt;variety&lt;/i&gt; of data generated by different sources [2] - IoT devices can be understood as generating Big Data and/or relying on Big Data analytics. In this way IoT devices and Big Data are intrinsically interconnected.&lt;/p&gt;
&lt;h3&gt;General Implications of Big Data and Internet of Things&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Big Data paradigms are being adopted across countries, governments, and business sectors because of the potential insights and change that it can bring. From improving an organizations business model, facilitating urban development, allowing for targeted and individualized services, and enabling the prediction of certain events or actions - the application of Big Data has been recognized as having the potential to bring about dramatic and large scale changes.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;At the same time, experts have identified risks to the individual that can be associated with the generation, analysis, and use of Big Data. In May 2014, the White House of the United States completed a ninety day study of how big data will change everyday life. The Report highlights the potential of Big Data as well as identifying a number of concerns associated with Big Data. For example: the selling of personal data, identification or re-identification of individuals, profiling of individuals, creation and exacerbation of information asymmetries, unfair, discriminating, biased, and incorrect decisions based on Big Data analytics, and lack of or misinformed user consent.[3] Errors in Big Data analytics that experts have identified include statistical fallacies, human bias, translation errors, and data errors.[4] Experts have also discussed fundamental changes that Big Data can bring about. For example, Danah Boyd and Kate Crawford in the article &lt;i&gt;"Critical Questions for Big Data: Provocations for a cultural, technological, and scholarly phenomenon"&lt;/i&gt; propose that Big Data can change the definition of knowledge and shape the reality it measures.[5] Similarly, a BSC/Oxford Internet Institute conference report titled " &lt;i&gt;The Societal Impact of the Internet of Things&lt;/i&gt;" points out that often users of Big Data assume that information and conclusions based on digital data is reliable and in turn replace other forms of information with digital data.[6]&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Concerns that have been voiced by the Article 29 Working Party and others specifically about IoT devices have included insufficient security features built into devices such as encryption, the reliance of the devices on wireless communications, data loss from infection by malware or hacking, unauthorized access and use of personal data, function creep resulting from multiple IoT devices being used together, and unlawful surveillance.[7]&lt;/p&gt;
&lt;h3&gt;Regulation of Big Data and Internet of Things&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The regulation of Big Data and IoT is currently being debated in contexts such as the US and the EU. Academics, civil society, and regulators are exploring questions around the adequacy of present regulation and overseeing frameworks to address changes brought about Big Data, and if not - what forms of or changes in regulation are needed? For example, Kate Crawford and Jason Shultz in the article &lt;i&gt;"Big Data and Due Process: Towards a Framework to Redress Predictive Privacy Harms"&lt;/i&gt;stress the importance of bringing in 'data due process rights' i.e ensuring fairness in the analytics of Big Data and how personal information is used.[8] While Solon Barocas and Andrew Selbst in the article &lt;i&gt;"Big Data's Disparate Impact"&lt;/i&gt; explore if present anti-discrimination legislation and jurisprudence in the US is adequate to protect against discrimination arising from Big Data practices - specifically data mining.[9]&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;The Impact of Big Data and IoT on Data Protection Principles&lt;/strong&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In the context of data protection, various government bodies, including the Article 29 Data Protection Working Party set up under the Directive 95/46/EC of the European Parliament, the Council of Europe, the European Commission, and the Federal Trade Commission, as well as experts and academics in the field, have called out at least ten different data protection principles and concepts that Big Data impacts:&lt;/p&gt;
&lt;ol&gt;
&lt;li style="text-align: justify; "&gt;&lt;strong&gt;Collection Limitation:&lt;/strong&gt; As a result of the generation of Big Data as enabled by networked devices, increased capabilities to analyze Big Data, and the prevalent use of networked systems - the principle of collection limitation is changing.[10]&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Consent: &lt;/strong&gt;As a result of the use of data from a wide variety of sources and the re-use of data which is inherent in Big Data practices - notions of informed consent (initial and secondary) are changing.[11]&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Data Minimization:&lt;/strong&gt; As a result of Big Data practices inherently utilizing all data possible - the principle of data minimization is changing/obsolete.[12]&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Notice:&lt;/strong&gt; As a result of Big Data practices relying on vast amounts of data from numerous sources and the re-use of that data - the principle of notice is changing.[13]&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Purpose Limitation:&lt;/strong&gt; As a result of Big Data practices re-using data for multiple purposes - the principle of purpose limitation is changing/obsolete.[14]&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Necessity: &lt;/strong&gt;As a result of Big Data practices re-using data, the new use or re-analysis of data may not be pertinent to the purpose that was initially specified - thus the principle of necessity is changing.[15]&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Access and Correction:&lt;/strong&gt; As a result of Big Data being generated (and sometimes published) at scale and in real time - the principle of user access and correction is changing.[16]&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Opt In and Opt Out Choices: &lt;/strong&gt;Particularly in the context of smart cities and IoT which collect data on a real time basis, often without the knowledge of the individual, and for the provision of a service - it may not be easy or possible for individuals to opt in or out of the collection of their data.[17]&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Personal Information:&lt;/strong&gt; As a result of Big Data analytics using and analyzing a wide variety of data, new or unexpected forms of personal data may be generated - thus challenging and evolving beyond traditional or specified definitions of personal information.[18]&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Data Controller:&lt;/strong&gt; In the context of IoT, given the multitude of actors that can collect, use and process data generated by networked devices, the traditional understanding of what and who is a data controller is changing.[19]&lt;/li&gt;
&lt;/ol&gt;
&lt;h3 style="text-align: justify; "&gt;Possible Technical and Policy Solutions&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;In a Report titled "&lt;i&gt;Internet of Things: Privacy &amp;amp; Security in a Connected World&lt;/i&gt;" by the Federal Trade Commission in the United States it was noted that though IoT changes the application and understanding of certain privacy principles, it does not necessarily make them obsolete.[20] Indeed many possible solutions that have been suggested to address the challenges posed by IoT and Big Data are technical interventions at the device level rather than fundamental policy changes. For example it has been proposed that IoT devices can be programmed to:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Automatically delete data after a specified period of time [21] (addressing concerns of data retention)&lt;/li&gt;
&lt;li&gt;Ensure that personal data is not fed into centralized databases on an automatic basis [22] (addressing concerns of transfer and sharing without consent, function creep, and data breach)&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;Offer consumers combined choices for consent rather than requiring a one time blanket consent at the time of initiating a service or taking fresh consent for every change that takes place while a consumer is using a service. [23] (addressing concerns of informed and meaningful consent)&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;Categorize and tag data with accepted uses and programme automated processes to flag when data is misused. [24] (addressing concerns of misuse of data)&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;Apply 'sticky policies' - policies that are attached to data and define appropriate uses of the data as it 'changes hands' [25] (addressing concerns of user control of data)&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;Allow for features to only be turned on with consent from the user [26] (addressing concerns of informed consent and collection without the consent or knowledge of the user)&lt;/li&gt;
&lt;li&gt;Automatically convert raw personal data to aggregated data [27] (addressing concerns of misuse of personal data and function creep)&lt;/li&gt;
&lt;li&gt;Offer users the option to delete or turn off sensors [28] (addressing concerns of user choice, control, and consent)&lt;/li&gt;
&lt;/ul&gt;
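&lt;p style="text-align: justify; "&gt;As an illustrative sketch only (the class and function names below are this section's assumptions, not drawn from the cited reports), three of the device-level interventions above - sticky policies that travel with the data, automatic deletion after a retention period, and aggregation of raw records before any central transfer - could be prototyped as:&lt;/p&gt;

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta


@dataclass
class StickyRecord:
    """A record carrying its own 'sticky policy': permitted uses and a retention period."""
    value: str
    allowed_uses: set
    created: datetime = field(default_factory=datetime.utcnow)
    retention: timedelta = timedelta(days=30)

    def expired(self, now=None):
        """True once the retention period has elapsed (the automatic-deletion check)."""
        return (now or datetime.utcnow()) >= self.created + self.retention

    def use(self, purpose):
        """Release the value only for a permitted purpose; flag any other use as misuse."""
        if purpose not in self.allowed_uses:
            raise PermissionError(f"use '{purpose}' not permitted by sticky policy")
        return self.value


def purge_expired(store, now=None):
    """Automatically delete records whose specified retention period has passed."""
    return [r for r in store if not r.expired(now)]


def aggregate(store):
    """Convert raw personal records to an aggregate count before any central transfer."""
    return {"record_count": len(store)}
```

&lt;p style="text-align: justify; "&gt;In this sketch the policy is inseparable from the data itself, so a device (or any downstream recipient) that receives a record also receives its permitted uses and expiry - the property that 'sticky policy' proposals rely on.&lt;/p&gt;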
&lt;p style="text-align: justify; "&gt;Such solutions place the designers and manufacturers of IoT devices in a critical role. Yet some, such as Kate Crawford and Jason Shultz are not entirely optimistic about the possibility of effective technological solutions - noting in the context of automated decision making that it is difficult to build in privacy protections as it is unclear when an algorithm will predict personal information about an individual.[29]&lt;/p&gt;
&lt;p&gt;Experts have also suggested that more emphasis should be placed on the principles and practices of:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Transparency&lt;/li&gt;
&lt;li&gt;Access and correction&lt;/li&gt;
&lt;li&gt;Use/misuse&lt;/li&gt;
&lt;li&gt;Breach notification&lt;/li&gt;
&lt;li&gt;Remedy&lt;/li&gt;
&lt;li&gt;Ability to withdraw consent&lt;/li&gt;
&lt;/ul&gt;
&lt;p style="text-align: justify; "&gt;Others have recommended that certain privacy principles need to be adapted to the Big Data/IoT context. For example, the Article 29 Working Party has clarified that in the context of IoT, consent mechanisms need to include the types of data collected, the frequency of data collection, as well as conditions for data collection.[30] While the Federal Trade Commission has warned that adopting a pure "use" based model has its limitations as it requires a clear (and potentially changing) definition of what use is acceptable and what use is not acceptable, and it does not address concerns around the collection of sensitive personal information.[31] In addition to the above, the European Commission has stressed that the right of deletion, the right to be forgotten, and data portability also need to be foundations of IoT systems and devices.[32]&lt;/p&gt;
&lt;h3&gt;Possible Regulatory Frameworks&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;To the question - are current regulatory frameworks adequate and is additional legislation needed, the FTC has recommended that though a specific IoT legislation may not be necessary, a horizontal privacy legislation would be useful as sectoral legislation does not always account for the use, sharing, and reuse of data across sectors. The FTC also highlighted the usefulness of privacy impact assessments and self regulatory steps to ensure privacy.[33] The European Commission on the other hand has concluded that to ensure enforcement of any standard or protocol - hard legal instruments are necessary.[34] As mentioned earlier, Kate Crawford and Jason Shultz have argued that privacy regulation needs to move away from principles on collection, specific use, disclosure, notice etc. and focus on elements of due process around the use of Big Data - as they say "procedural data due process". Such due process should be based on values instead of defined procedures and should include at the minimum notice, hearing before an independent arbitrator, and the right to review. Crawford and Shultz more broadly note that there are conceptual differences between privacy law and big data that pose as serious challenges i.e privacy law is based on causality while big data is a tool of correlation. This difference raises questions about how effective regulation that identifies certain types of information and then seeks to control the use, collection, and disclosure of such information will be in the context of Big Data – something that is varied and dynamic. 
According to Crawford and Shultz many regulatory frameworks will struggle with this difference – including the FTC's Fair Information Privacy Principles and the EU regulation including the EU's right to be forgotten.[35] The European Data Protection Supervisor on the other hand looks at Big Data as spanning the policy areas of data protection, competition, and consumer protection – particularly in the context of 'free' services. The Supervisor argues that these three areas need to come together to develop ways in which the challenges of Big Data can be addressed. For example, remedy could take the form of data portability – ensuring users the ability to move their data to other service providers empowering individuals and promoting competitive market structures or adopting a 'compare and forget' approach to data retention of customer data. The Supervisor also stresses the need to promote and treat privacy as a competitive advantage, thus placing importance on consumer choice, consent, and transparency.[36] The European Data Protection reform has been under discussion and it is predicted to be enacted by the end of 2015. The reform will apply across European States and all companies operating in Europe. The reform proposes heavier penalties for data breaches, seeks to provide users with more control of their data.[37] Additionally, Europe is considering bringing digital platforms under the Network and Information Security Directive – thus treating companies like Google and Facebook as well as cloud providers and service providers as a critical sector. Such a move would require companies to adopt stronger security practices and report breaches to authorities.[38]&lt;/p&gt;
&lt;h3&gt;Conclusion&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;A review of the different opinions and reactions from experts and policy makers demonstrates the ways in which Big Data and IoT are changing traditional forms of protection that governments and societies have developed to protect personal data as it increases in value and importance. While some policy makers believe that big data needs strong legislative regulation and others believe that softer forms of regulation such as self or co-regulation are more appropriate, what is clear is that Big Data is either creating a regulatory dilemma– with policy makers searching for ways to control the unpredictable nature of big data through policy and technology through the merging of policy areas, the honing of existing policy mechanisms, or the broadening of existing policy mechanisms - while others are ignoring the change that Big Data brings with it and are forging ahead with its use.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Answering the 'how do we regulate Big Data” question requires &lt;strong&gt;re-conceptualization of data ownership and realities&lt;/strong&gt;. Governments need to first recognize the criticality of their data and the data of their citizens/residents, as well as the contribution to a country's economy and security that this data plays. With the technologies available now, and in the pipeline, data can be used or misused in ways that will have vast repercussions for individuals, society, and a nation. All data, but especially data directly or indirectly related to citizens and residents of a country, needs to be looked upon as owned by the citizens and the nation. In this way, data should be seen as a part of &lt;strong&gt;critical&lt;/strong&gt; &lt;strong&gt;national infrastructure of a nation, &lt;/strong&gt;and accorded the security, protections, and legal backing thereof to &lt;strong&gt;prevent the misuse of the resource by the private or public sectors, local or foreign governments&lt;/strong&gt;. This could allow for local data warehousing and bring physical and access security of data warehouses on par with other critical national infrastructure. Recognizing data as a critical resource answers in part the concern that experts have raised – that Big Data practices make it impossible for data to be categorized as personal and thus afforded specified forms of protection due to the unpredictable nature of big data. Instead – all data is now recognized as critical.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In addition to being able to generate personal data from anonymized or non-identifiable data, big data also challenges traditional divisions of public vs. private data. Indeed Big Data analytics can take many public data points and derive a private conclusion. The use of Big Data analytics on public data also raises questions of consent. For example, though a license plate is public information – should a company be allowed to harvest license plate numbers, combine this with location, and sell this information to different interested actors? This is currently happening in the United States.[39] Lastly, Big Data raises questions of ownership. A solution to the uncertainty of public vs. private data and associated consent and ownership could be the creation a &lt;strong&gt;National Data Archive&lt;/strong&gt; with such data. The archive could function with representation from the government, public and private companies, and civil society on the board. In such a framework, for example, companies like Airtel would provide mobile services, but the CDRs and customer data collected by the company would belong to the National Data Archive and be available to Airtel and all other companies within a certain scope for use. This 'open data' approach could enable innovation through the use of data but within the ambit of national security and concerns of citizens – a framework that could instill trust in consumers and citizens. Only when backed with strong security requirements, enforcement mechanisms and a proactive, responsive and responsible framework can governments begin to think about ways in which Big Data can be harnessed.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;[1] BCS - The Chartered Institute for IT. (2013). The Societal Impact of the Internet of Things. Retrieved May 17, 2015, from http://www.bcs.org/upload/pdf/societal-impact-report-feb13.pdf&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;i&gt;[2] Sicular, S. (2013, March 27). Gartner’s Big Data Definition Consists of Three Parts, Not to Be Confused with Three “V”s. Retrieved May 20, 2015, from http://www.forbes.com/sites/gartnergroup/2013/03/27/gartners-big-data-definition-consists-of-three-parts-not-to-be-confused-with-three-vs/&lt;/i&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;[3] Executive Office of the President. “Big Data: Seizing Opportunities, Preserving Values”. May 2014. Available at: &lt;a href="https://www.whitehouse.gov/sites/default/files/docs/big_data_privacy_report_5.1.14_final_print.pdf"&gt;https://www.whitehouse.gov/sites/default/files/docs/big_data_privacy_report_5.1.14_final_print.pdf&lt;/a&gt;. Accessed: July 2&lt;sup&gt;nd&lt;/sup&gt; 2015.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;[4] Moses, B., Lyria, &amp;amp; Chan, J. (2014). Using Big Data for Legal and Law Enforcement Decisions: Testing the New Tools (SSRN Scholarly Paper No. ID 2513564). Rochester, NY: Social Science Research Network. Retrieved from http://papers.ssrn.com/abstract=2513564&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;[5] Danah Boyd, Kate Crawford. &lt;a href="http://www.tandfonline.com/doi/abs/10.1080/1369118X.2012.678878"&gt;CRITICAL QUESTIONS FOR BIG DATA&lt;/a&gt;. In&lt;a href="http://www.tandfonline.com/toc/rics20/15/5"&gt;formation, Communication &amp;amp; Society &lt;/a&gt; Vol. 15, Iss. 5, 2012. Available at: &lt;a href="http://www.tandfonline.com/doi/full/10.1080/1369118X.2012.678878"&gt;http://www.tandfonline.com/doi/full/10.1080/1369118X.2012.678878&lt;/a&gt;. Accessed: July 2&lt;sup&gt;nd&lt;/sup&gt; 2015.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;[6]  The Chartered Institute for IT, Oxford Internet Institute, University of Oxford. “The Societal Impact of the Internet of Things” February 2013. Available at: &lt;a href="http://www.bcs.org/upload/pdf/societal-impact-report-feb13.pdf"&gt;http://www.bcs.org/upload/pdf/societal-impact-report-feb13.pdf&lt;/a&gt;. Accessed: July 2&lt;sup&gt;nd&lt;/sup&gt; 2015.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;[7] ARTICLE 29 Data Protection Working Party. (2014). &lt;i&gt;Opinion 8/2014 on the on Recent Developments on the Internet of Things.&lt;/i&gt; European Commission. Retrieved May 20, 2015, from http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp223_en.pdf&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;[8] Crawford, K., &amp;amp; Schultz, J. (2013). Big Data and Due Process: Toward a Framework to Redress Predictive Privacy Harms (SSRN Scholarly Paper No. ID 2325784). Rochester, NY: Social Science Research Network. Retrieved from http://papers.ssrn.com/abstract=2325784&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;[9] Barocas, S., &amp;amp; Selbst, A. D. (2015). Big Data’s Disparate Impact (SSRN Scholarly Paper No. ID 2477899). Rochester, NY: Social Science Research Network. Retrieved from http://papers.ssrn.com/abstract=2477899&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;[10] Barocas, S., &amp;amp; Selbst, A. D. (2015). Big Data’s Disparate Impact (SSRN Scholarly Paper No. ID 2477899). Rochester, NY: Social Science Research Network. Retrieved from http://papers.ssrn.com/abstract=2477899&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;[11] Article 29 Data Protection Working Party. “Opinion 8/2014 on the on Recent Developments on the Internet of Things”. September 16&lt;sup&gt;th&lt;/sup&gt; 2014. Available at: &lt;a href="http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp223_en.pdf"&gt;h&lt;/a&gt;&lt;a href="http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp223_en.pdf"&gt;ttp://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp223_en.pdf&lt;/a&gt;. Accessed: July 2&lt;sup&gt;nd&lt;/sup&gt; 2015.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;[12] Tene, O., &amp;amp; Polonetsky, J. (2013). Big Data for All: Privacy and User Control in the Age of Analytics. Northwestern Journal of Technology and Intellectual Property, 11(5), 239.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;[13]  Omer Tene and Jules Polonetsky, &lt;i&gt;Big Data for All: Privacy and User Control in the Age of Analytics&lt;/i&gt;, 11 Nw. J. Tech. &amp;amp; Intell. Prop. 239 (2013).&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;[14] Article 29 Data Protection Working Party. “Opinion 8/2014 on the on Recent Developments on the Internet of Things”. September 16&lt;sup&gt;th&lt;/sup&gt; 2014. Available at: &lt;a href="http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp223_en.pdf"&gt;h&lt;/a&gt;&lt;a href="http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp223_en.pdf"&gt;ttp://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp223_en.pdf&lt;/a&gt;. Accessed: July 2&lt;sup&gt;nd&lt;/sup&gt; 2015.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;[15] Information Commissioner's Office. (2014). Big Data and Data Protection. Infomation Commissioner's Office. Retrieved May 20, 2015, from https://ico.org.uk/media/for-organisations/documents/1541/big-data-and-data-protection.pdf&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;[16] Article 29 Data Protection Working Party. “Opinion 8/2014 on the on Recent Developments on the Internet of Things”. September 16&lt;sup&gt;th&lt;/sup&gt; 2014. Available at: &lt;a href="http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp223_en.pdf"&gt;h&lt;/a&gt;&lt;a href="http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp223_en.pdf"&gt;ttp://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp223_en.pdf&lt;/a&gt;. Accessed: July 2&lt;sup&gt;nd&lt;/sup&gt; 2015.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;[17] The Chartered Institute for IT and Oxford Internet Institute, University of Oxford. “The Societal Impact of the Internet of Things”. February 14&lt;sup&gt;th&lt;/sup&gt; 2013. Available at: &lt;a href="http://www.bcs.org/upload/pdf/societal-impact-report-feb13.pdf"&gt;http://www.bcs.org/upload/pdf/societal-impact-report-feb13.pdf&lt;/a&gt;. Accessed: July 2&lt;sup&gt;nd&lt;/sup&gt; 2015.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;[18] Kate Crawford and Jason Shultz, “Big Data and Due Process: Towards a Framework to Redress Predictive Privacy Harms”. Boston College Law Review, Volume 55, Issue 1, Article 4. January 1st 2014. Available at: &lt;a href="http://lawdigitalcommons.bc.edu/cgi/viewcontent.cgi?article=3351&amp;amp;context=bclr"&gt;http://lawdigitalcommons.bc.edu/cgi/viewcontent.cgi?article=3351&amp;amp;context=bclr&lt;/a&gt;. Accessed: July 2nd 2015.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;[19] Article 29 Data Protection Working Party “Opinion 8/2014 on the on Recent Developments on the Internet of Things” September 16th 2014. Available at: &lt;a href="http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp223_en.pdf"&gt;http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp223_en.pdf&lt;/a&gt;. Accessed: July 2nd 2015.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;[20] Federal Trade Commission. (2015). &lt;i&gt;Internet of Things: Privacy &amp;amp; Security in a Connected World.&lt;/i&gt; Federal Trade Commision. Retrieved May 20, 2015, from https://www.ftc.gov/system/files/documents/reports/federal-trade-commission-staff-report-november-2013-workshop-entitled-internet-things-privacy/150127iotrpt.pdf&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;[21] Federal Trade Commission. (2015). &lt;i&gt;Internet of Things: Privacy &amp;amp; Security in a Connected World.&lt;/i&gt; Federal Trade Commision. Retrieved May 20, 2015, from https://www.ftc.gov/system/files/documents/reports/federal-trade-commission-staff-report-november-2013-workshop-entitled-internet-things-privacy/150127iotrpt.pdf&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;[22] Federal Trade Commission. (2015). &lt;i&gt;Internet of Things: Privacy &amp;amp; Security in a Connected World.&lt;/i&gt; Federal Trade Commision. Retrieved May 20, 2015, from https://www.ftc.gov/system/files/documents/reports/federal-trade-commission-staff-report-november-2013-workshop-entitled-internet-things-privacy/150127iotrpt.pdf&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;[23] Federal Trade Commission. (2015). &lt;i&gt;Internet of Things: Privacy &amp;amp; Security in a Connected World.&lt;/i&gt; Federal Trade Commision. Retrieved May 20, 2015, from https://www.ftc.gov/system/files/documents/reports/federal-trade-commission-staff-report-november-2013-workshop-entitled-internet-things-privacy/150127iotrpt.pdf&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;[24] Federal Trade Commission. (2015). &lt;i&gt;Internet of Things: Privacy &amp;amp; Security in a Connected World.&lt;/i&gt; Federal Trade Commision. Retrieved May 20, 2015, from https://www.ftc.gov/system/files/documents/reports/federal-trade-commission-staff-report-november-2013-workshop-entitled-internet-things-privacy/150127iotrpt.pdf&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;[25] Article 29 Data Protection Working Party “Opinion 8/2014 on the on Recent Developments on the Internet of Things” September 16&lt;sup&gt;th&lt;/sup&gt; 2014. Available at: &lt;a href="http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp223_en.pdf"&gt;http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp223_en.pdf&lt;/a&gt;. Accessed: July 2&lt;sup&gt;nd&lt;/sup&gt; 2015.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;[26] Article 29 Data Protection Working Party “Opinion 8/2014 on the on Recent Developments on the Internet of Things” September 16&lt;sup&gt;th&lt;/sup&gt; 2014. Available at: &lt;a href="http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp223_en.pdf"&gt;http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp223_en.pdf&lt;/a&gt;. Accessed: July 2&lt;sup&gt;nd&lt;/sup&gt; 2015.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;[27] Article 29 Data Protection Working Party “Opinion 8/2014 on the on Recent Developments on the Internet of Things” September 16&lt;sup&gt;th&lt;/sup&gt; 2014. Available at: &lt;a href="http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp223_en.pdf"&gt;http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp223_en.pdf&lt;/a&gt;. Accessed: July 2&lt;sup&gt;nd&lt;/sup&gt; 2015.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;[28] Article 29 Data Protection Working Party “Opinion 8/2014 on the on Recent Developments on the Internet of Things” September 16&lt;sup&gt;th&lt;/sup&gt; 2014. Available at: &lt;a href="http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp223_en.pdf"&gt;http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp223_en.pdf&lt;/a&gt;. Accessed: July 2&lt;sup&gt;nd&lt;/sup&gt; 2015.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;[29]  Kate Crawford and Jason Shultz, “Big Data and Due Process: Towards a Framework to Redress Predictive Privacy Harms”. Boston College Law Review, Volume 55, Issue 1, Article 4. January 1st 2014. Available at: &lt;a href="http://lawdigitalcommons.bc.edu/cgi/viewcontent.cgi?article=3351&amp;amp;context=bclr"&gt;http://lawdigitalcommons.bc.edu/cgi/viewcontent.cgi?article=3351&amp;amp;context=bclr&lt;/a&gt;. Accessed: July 2nd 2015.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;[30]  Article 29 Data Protection Working Party “Opinion 8/2014 on the on Recent Developments on the Internet of Things” September 16&lt;sup&gt;th&lt;/sup&gt; 2014. Available at: &lt;a href="http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp223_en.pdf"&gt;http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp223_en.pdf&lt;/a&gt;. Accessed: July 2&lt;sup&gt;nd&lt;/sup&gt; 2015.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;[31] Federal Trade Commission. (2015). &lt;i&gt;Internet of Things: Privacy &amp;amp; Security in a Connected World.&lt;/i&gt; Federal Trade Commission. Retrieved May 20, 2015, from https://www.ftc.gov/system/files/documents/reports/federal-trade-commission-staff-report-november-2013-workshop-entitled-internet-things-privacy/150127iotrpt.pdf&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;[32] Article 29 Data Protection Working Party “Opinion 8/2014 on the on Recent Developments on the Internet of Things” September 16&lt;sup&gt;th&lt;/sup&gt; 2014. Available at: &lt;a href="http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp223_en.pdf"&gt;http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp223_en.pdf&lt;/a&gt;. Accessed: July 2&lt;sup&gt;nd&lt;/sup&gt; 2015.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;[33] Federal Trade Commission. (2015). &lt;i&gt;Internet of Things: Privacy &amp;amp; Security in a Connected World.&lt;/i&gt; Federal Trade Commission. Retrieved May 20, 2015, from https://www.ftc.gov/system/files/documents/reports/federal-trade-commission-staff-report-november-2013-workshop-entitled-internet-things-privacy/150127iotrpt.pdf&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;[34] Article 29 Data Protection Working Party “Opinion 8/2014 on the on Recent Developments on the Internet of Things” September 16&lt;sup&gt;th&lt;/sup&gt; 2014. Available at: &lt;a href="http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp223_en.pdf"&gt;http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp223_en.pdf&lt;/a&gt;. Accessed: July 2&lt;sup&gt;nd&lt;/sup&gt; 2015.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;[35] Kate Crawford and Jason Shultz, “Big Data and Due Process: Towards a Framework to Redress Predictive Privacy Harms”. Boston College Law Review, Volume 55, Issue 1, Article 4. January 1&lt;sup&gt;st&lt;/sup&gt; 2014. Available at: &lt;a href="http://lawdigitalcommons.bc.edu/cgi/viewcontent.cgi?article=3351&amp;amp;context=bclr"&gt;http://lawdigitalcommons.bc.edu/cgi/viewcontent.cgi?article=3351&amp;amp;context=bclr&lt;/a&gt;. Accessed: July 2&lt;sup&gt;nd&lt;/sup&gt; 2015.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;[36] European Data Protection Supervisor. Preliminary Opinion of the European Data Protection Supervisor, Privacy and competitiveness in the age of big data: the interplay between data protection, competition law and consumer protection in the Digital Economy. March 2014. Available at: https://secure.edps.europa.eu/EDPSWEB/webdav/site/mySite/shared/Documents/Consultation/Opinions/2014/14-03-26_competitition_law_big_data_EN.pdf&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;[37] SC Magazine. Harmonised EU data protection and fines by the end of the year. June 25&lt;sup&gt;th&lt;/sup&gt; 2015. Available at: &lt;a href="http://www.scmagazineuk.com/harmonised-eu-data-protection-and-fines-by-the-end-of-the-year/article/422740/"&gt;http://www.scmagazineuk.com/harmonised-eu-data-protection-and-fines-by-the-end-of-the-year/article/422740/&lt;/a&gt;. Accessed: August 8&lt;sup&gt;th&lt;/sup&gt; 2015.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;[38] Tom Jowitt, “Digital Platforms to be Included in EU Cybersecurity Law”. TechWeek Europe. August 7&lt;sup&gt;th&lt;/sup&gt; 2015. Available at: http://www.techweekeurope.co.uk/e-regulation/digital-platforms-eu-cybersecuity-law-174415&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;[39] Adam Tanner. Data Brokers are now Selling Your Car's Location for $10 Online. July 10&lt;sup&gt;th&lt;/sup&gt; 2013. Available at: http://www.forbes.com/sites/adamtanner/2013/07/10/data-broker-offers-new-service-showing-where-they-have-spotted-your-car/&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/review-of-policy-debate-around-big-data-and-internet-of-things'&gt;https://cis-india.org/internet-governance/blog/review-of-policy-debate-around-big-data-and-internet-of-things&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>elonnai</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Big Data</dc:subject>
    

   <dc:date>2015-08-17T08:36:18Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/right-to-privacy-in-peril">
    <title>Right to Privacy in Peril</title>
    <link>https://cis-india.org/internet-governance/blog/right-to-privacy-in-peril</link>
    <description>
        &lt;b&gt;It seems to have become quite a fad, especially amongst journalists, to use this headline and claim that the right to privacy which we consider so inherent to our being, is under attack. However, when I use this heading in this piece I am not referring to the rampant illegal surveillance being done by the government, or the widely reported recent raids on consenting (unmarried) adults who were staying in hotel rooms in Mumbai. I am talking about the fact that the Supreme Court of India has deemed it fit to refer the question of the very existence of a fundamental right to privacy to a Constitution Bench to finally decide the matter, and define the contours of such right if it does exist.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;In an order dated August 11, 2015 the Supreme Court finally gave in to the arguments advanced by the Attorney General and admitted that there is some “unresolved contradiction” regarding the existence of a constitutional “right to privacy” under the Indian Constitution and requested that a Constitutional Bench of appropriate strength.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Supreme Court was hearing a petition challenging the implementation of the Adhaar Card Scheme of the government, where one of the grounds to challenge the scheme was that it was violative of the right to privacy guaranteed to all citizens under the Constitution of India. However to counter this argument, the State (via the Attorney General) challenged the very concept that the Constitution of India guarantees a right to privacy by relying on an “unresolved contradiction” in judicial pronouncements on the issue, which so far had only been of academic interest. This “unresolved contradiction” arose because in the cases of &lt;b&gt;&lt;i&gt;M.P. Sharma &amp;amp; Others v. Satish Chandra &amp;amp; Others&lt;/i&gt;&lt;/b&gt;,&lt;a href="#_ftn1" name="_ftnref1"&gt;[1]&lt;/a&gt; and &lt;b&gt;&lt;i&gt;Kharak Singh &lt;/i&gt;&lt;/b&gt;&lt;i&gt;v. &lt;b&gt;State of U.P. &amp;amp; Others,&lt;a href="#_ftn2" name="_ftnref2"&gt;&lt;b&gt;[2]&lt;/b&gt;&lt;/a&gt; &lt;/b&gt;&lt;/i&gt;(decided by &lt;i&gt;Eight &lt;/i&gt;and &lt;i&gt;Six &lt;/i&gt;Judges respectively) the Supreme Court has categorically denied the existence of a right to privacy under the Indian Constitution.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;However somehow the later case of &lt;i&gt;Gobind&lt;/i&gt; v. &lt;i&gt;State of M.P. and another&lt;/i&gt;,&lt;a href="#_ftn3" name="_ftnref3"&gt;[3]&lt;/a&gt; (which was decided by a two Judge Bench of the Supreme Court) relied upon the opinion given by the minority of two judges in &lt;i&gt;Kharak Singh&lt;/i&gt; to hold that a right to privacy does exist and is guaranteed as a fundamental right under the Constitution of India.&lt;a href="#_ftn4" name="_ftnref4"&gt;[4]&lt;/a&gt; Thereafter a large number of cases have held the right to privacy to be a fundamental right, the most important of which are &lt;b&gt;&lt;i&gt;R. Rajagopal &amp;amp; Another &lt;/i&gt;&lt;/b&gt;&lt;i&gt;v. &lt;b&gt;State of Tamil Nadu &amp;amp; Others,&lt;a href="#_ftn5" name="_ftnref5"&gt;&lt;b&gt;[5]&lt;/b&gt;&lt;/a&gt; &lt;/b&gt;&lt;/i&gt;(popularly known as &lt;i&gt;Auto Shanker’s &lt;/i&gt;case) and &lt;b&gt;&lt;i&gt;People’s Union for Civil Liberties (PUCL) &lt;/i&gt;&lt;/b&gt;&lt;i&gt;v. &lt;b&gt;Union of India &amp;amp; Another&lt;/b&gt;&lt;/i&gt;.&lt;a href="#_ftn6" name="_ftnref6"&gt;[6]&lt;/a&gt; However, as was noticed by the Supreme Court in its August 11 order, all these judgments were decided by two or three Judges only.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The petitioners on the other hand made a number of arguments to counter those made by the Attorney General to the effect that the fundamental right to privacy is well established under Indian law and that there is no need to refer the matter to a Constitutional Bench. These arguments are:&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;(i) The observations made in &lt;b&gt;&lt;i&gt;M.P. Sharma &lt;/i&gt;&lt;/b&gt;regarding the absence of right to privacy are not part of the &lt;i&gt;ratio decidendi&lt;/i&gt; of that case and, therefore, do not bind the subsequent smaller Benches such as &lt;b&gt;&lt;i&gt;R. Rajagopal &lt;/i&gt;&lt;/b&gt;and &lt;b&gt;&lt;i&gt;PUCL&lt;/i&gt;&lt;/b&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;(ii) Even in &lt;b&gt;&lt;i&gt;Kharak Singh &lt;/i&gt;&lt;/b&gt;it was held that the right of a person not to be disturbed at his residence by the State is recognized to be a part of a fundamental right guaranteed under Article 21. It was argued that this is nothing but an aspect of privacy. The observation in para 20 of the majority judgment (quoted in footnote 2 above) at best can be construed only to mean that there is no fundamental right of privacy against the State’s authority to keep surveillance on the activities of a person. However, they argued that such a conclusion cannot be good law any more in view of the express declaration made by a seven-Judge bench decision of this Court in &lt;b&gt;&lt;i&gt;Maneka Gandhi &lt;/i&gt;&lt;/b&gt;&lt;i&gt;v. &lt;b&gt;Union of India &amp;amp; Another&lt;/b&gt;&lt;/i&gt;.&lt;a href="#_ftn7" name="_ftnref7"&gt;[7]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;(iii) Both &lt;b&gt;&lt;i&gt;M.P. Sharma &lt;/i&gt;&lt;/b&gt;&lt;i&gt;(supra) &lt;/i&gt;and &lt;b&gt;&lt;i&gt;Kharak Singh &lt;/i&gt;&lt;/b&gt;&lt;i&gt;(supra) &lt;/i&gt;were decided on an interpretation of the Constitution based on the principles expounded in &lt;b&gt;&lt;i&gt;A.K. Gopalan &lt;/i&gt;&lt;/b&gt;&lt;i&gt;v. &lt;b&gt;State of Madras&lt;/b&gt;&lt;/i&gt;,&lt;a href="#_ftn8" name="_ftnref8"&gt;[8]&lt;/a&gt; which have themselves been declared wrong by a larger Bench in &lt;b&gt;&lt;i&gt;Rustom Cavasjee Cooper &lt;/i&gt;&lt;/b&gt;&lt;i&gt;v. &lt;b&gt;Union of India&lt;/b&gt;&lt;/i&gt;.&lt;a href="#_ftn9" name="_ftnref9"&gt;[9]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Other than the points above, it was also argued that world over in all the countries where Anglo-Saxon jurisprudence is followed, ‘privacy’ is recognized as an important aspect of the liberty of human beings. The petitioners also submitted that it was too late in the day for the Union of India to argue that the Constitution of India does not recognize privacy as an aspect of the liberty under Article 21 of the Constitution of India.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;However these arguments of the petitioners were not enough to convince the Supreme Court that there is no doubt regarding the existence and contours of the right to privacy in India. The Court, swayed by the arguments presented by the Attorney General, admitted that questions of far reaching importance for the Constitution were at issue and needed to be decided by a Constitutional Bench.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Giving some insight into its reasoning to refer this issue to a Constitutional Bench, the Court did seem to suggest that its decision to refer the matter to a larger bench was more an exercise in judicial propriety than an action driven by some genuine contradiction in the law. The Court said that if the observations in &lt;b&gt;&lt;i&gt;M.P. Sharma &lt;/i&gt;&lt;/b&gt;&lt;i&gt;(supra) &lt;/i&gt;and &lt;b&gt;&lt;i&gt;Kharak Singh &lt;/i&gt;&lt;/b&gt;&lt;i&gt;(supra) &lt;/i&gt;were accepted as the law of the land, the fundamental rights guaranteed under the Constitution of India would get “denuded of vigour and vitality”. However the Court felt that institutional integrity and judicial discipline require that smaller benches of the Court follow the decisions of larger benches, unless they have very good reasons for not doing so, and since in this case it appears that the same was not done therefore the Court referred the matter to a larger bench to scrutinize the ratio of &lt;b&gt;&lt;i&gt;M.P. Sharma &lt;/i&gt;&lt;/b&gt;&lt;i&gt;(supra) &lt;/i&gt;and &lt;b&gt;&lt;i&gt;Kharak Singh &lt;/i&gt;&lt;/b&gt;&lt;i&gt;(supra)&lt;/i&gt; and decide the judicial correctness of subsequent two judge and three judge bench decisions which have asserted or referred to the right to privacy.&lt;/p&gt;
&lt;hr style="text-align: justify; " /&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref1" name="_ftn1"&gt;[1]&lt;/a&gt; AIR 1954 SC 300. In para 18 of the Judgment it was held: “A power of search and seizure is in any system of jurisprudence an overriding power of the State for the protection of social security and that power is necessarily regulated by law. When the Constitution makers have thought fit not to subject such regulation to constitutional limitations &lt;i&gt;by recognition of a fundamental right to privacy&lt;/i&gt;, analogous to the American Fourth Amendment, &lt;i&gt;we have no justification to import it, into a totally different fundamental right, by some process of strained construction&lt;/i&gt;.”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref2" name="_ftn2"&gt;[2]&lt;/a&gt; AIR 1963 SC 1295. In para 20 of the judgment it was held: “&lt;b&gt;… &lt;/b&gt;Nor do we consider that Art. 21 has any relevance in the context as was sought to be suggested by learned counsel for the petitioner. As already pointed out, &lt;i&gt;the right of privacy is not a guaranteed right under our Constitution&lt;/i&gt;and therefore the attempt to ascertain the movement of an individual which is merely a manner in which privacy is invaded is not an infringement of a fundamental right guaranteed by Part III.”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref3" name="_ftn3"&gt;[3]&lt;/a&gt; (1975) 2 SCC 148.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref4" name="_ftn4"&gt;[4]&lt;/a&gt; It is interesting to note that while the decisions in both &lt;i&gt;Kharak Singh&lt;/i&gt; and &lt;i&gt;Gobind&lt;/i&gt; were given in the context of similar facts (challenging the power of the police to make frequent domiciliary visits both during the day and night at the house of the petitioner) while the majority in &lt;i&gt;Kharak Singh&lt;/i&gt; specifically denied the existence of a fundamental right to privacy, however they held the conduct of the police to be violative of the right to personal liberty guaranteed under Article 21, since the Regulations under which the police actions were undertaken were themselves held invalid. On the other hand, while &lt;i&gt;Gobind&lt;/i&gt; held that a fundamental right to privacy does exist in Indian law, it may be interfered with by the State through procedure established by law and therefore upheld the actions of the police since they were acting under validly issued Regulations.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref5" name="_ftn5"&gt;[5]&lt;/a&gt; (1994) 6 SCC 632.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref6" name="_ftn6"&gt;[6]&lt;/a&gt; (1997) 1 SCC 301.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref7" name="_ftn7"&gt;[7]&lt;/a&gt; (1978) 1 SCC 248.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref8" name="_ftn8"&gt;[8]&lt;/a&gt; AIR 1950 SC 27.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref9" name="_ftn9"&gt;[9]&lt;/a&gt; (1970) 1 SCC 248.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/right-to-privacy-in-peril'&gt;https://cis-india.org/internet-governance/blog/right-to-privacy-in-peril&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>vipul</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2015-08-13T15:32:18Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/big-data-and-information-technology-rules-2011">
    <title>Big Data and the Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules 2011</title>
    <link>https://cis-india.org/internet-governance/blog/big-data-and-information-technology-rules-2011</link>
    <description>
        &lt;b&gt;Experts and regulators across jurisdictions are examining the impact of Big Data practices on traditional data protection standards and principles. This will be a useful and pertinent exercise for India to undertake as the government and the private and public sectors begin to incorporate and rely on the use of Big Data in decision making processes and organizational operations. This blog provides an initial evaluation of how Big Data could impact India's current data protection standards.&lt;/b&gt;
        &lt;p&gt;Experts and regulators across the globe are examining the impact of Big Data practices on traditional data protection standards and principles. This will be a useful and pertinent exercise for India to undertake as the government and the private and public sectors begin to incorporate and rely on the use of Big Data in decision making processes and organizational operations.&lt;/p&gt;
&lt;p&gt;Below is an initial evaluation of how Big Data could impact India's current data protection standards.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;India currently does not have comprehensive privacy legislation - but the Reasonable Security Practices and Procedures and Sensitive Personal Data or Information Rules 2011 formed under section 43A of the Information Technology Act 2000&lt;a href="#_ftn1" name="_ftnref1"&gt;[1]&lt;/a&gt; define a data protection framework for the processing of digital data by Body Corporate. Big Data practices will impact a number of the provisions found in the Rules:&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Scope of Rules: &lt;/b&gt;Currently the Rules apply to Body Corporate and digital data. As per the IT Act, Body Corporate is defined as &lt;i&gt;"Any company and includes a firm, sole proprietorship or other association of individuals engaged in commercial or professional activities."&lt;/i&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The present scope of the Rules excludes from its purview a number of actors that do or could have access to Big Data or use Big Data practices. The Rules would not apply to government bodies or individuals collecting and using Big Data. Yet, with technologies such as IoT and the rise of Smart Cities across India – a range of government, public, and private organizations and actors could have access to Big Data.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Definition of personal and sensitive personal data: &lt;/b&gt;Rule 2(i) defines personal information as &lt;i&gt;"information that relates to a natural person which either directly or indirectly, in combination with other information available or likely to be available with a body corporate, is capable of identifying such person."&lt;/i&gt;&lt;/p&gt;
&lt;p&gt;Rule 3 defines sensitive personal information as:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Password,&lt;/li&gt;
&lt;li&gt;Financial information,&lt;/li&gt;
&lt;li&gt;Physical/physiological/mental health condition,&lt;/li&gt;
&lt;li&gt;Sexual orientation,&lt;/li&gt;
&lt;li&gt;Medical records and history,&lt;/li&gt;
&lt;li&gt;Biometric information&lt;/li&gt;
&lt;/ul&gt;
&lt;p style="text-align: justify; "&gt;The present definition of personal data hinges on the factor of identification (data that is capable of identifying a person). Yet this definition does not encompass information that is associated to an already identified individual - such as habits, location, or activity.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The definition of personal data also addresses only the identification of 'such person' and does not address data that is related to a particular person but that also reveals identifying information about another person - either directly - or when combined with other data points.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;By listing specific categories of sensitive personal information, the Rules do not account for additional types of sensitive personal information that might be generated or correlated through the use of Big Data analytics.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Importantly, the definitions of sensitive personal information or personal information do not address how personal or sensitive personal information - when anonymized or aggregated – should be treated.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Consent&lt;/b&gt;: Rule 5(1) requires that Body Corporate must, prior to collection, obtain consent in writing through letter or fax or email from the provider of sensitive personal data regarding the use of that data.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In a context where services are delivered with little or no human interaction, data is collected through sensors, data is collected on a real time and regular basis, and data is used and re-used for multiple and differing purposes - it is not practical, and often not possible, for consent to be obtained through writing, letter, fax, or email for each instance of data collection and for each use.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Notice of Collection: &lt;/b&gt;Rule 5(3) requires Body Corporate to provide the individual with a notice during collection of information that details the fact that information is being collected, the purpose for which the information is being collected, the intended recipients of the information, the name and address of the agency that is collecting the information and the agency that will retain the information. Furthermore body corporate should not retain information for longer than is required to meet lawful purposes.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Though this provision acts as an important element of transparency, in the context of Big Data, communicating the purpose for which data is collected, the intended recipients of the information, the name and address of the agency that is collecting the information and the agency that will retain the information could prove to be difficult to communicate as they are likely to encompass numerous agencies and change depending upon the analysis being done.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Access and correction&lt;/b&gt;: Rule 5(6) provides individuals with the ability to access sensitive personal information held by the body corporate and correct any inaccurate information.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This provision would be difficult to implement effectively in the context of Big Data as vast amounts of data are being generated and collected on an ongoing and real time basis and often without the knowledge of the individual.&lt;/p&gt;
&lt;p&gt;&lt;b&gt;Purpose Limitation:&lt;/b&gt; Rule 5(5) requires that body corporate should use information only for the purpose for which it has been collected.&lt;/p&gt;
&lt;p&gt;In the context of Big Data this provision would overlook the re-use of data that is inherent in such practices.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Security:&lt;/b&gt; Rule 8 states that any Body Corporate or person on its behalf will be understood to have complied with reasonable security practices and procedures if they have implemented such practices and have in place codes that address managerial, technical, operational and physical security control measures. These codes could follow the IS/ISO/IEC 27001 standard or another government approved and audited standard.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This provision importantly requires that data controllers collecting and processing data have in place strong security practices. In the context of Big Data – the security of devices that might be generating or collecting data and algorithms processing and analysing data is critical. Once generated, it might be challenging to ensure the data is being transferred to or being analysed by organisations that comply with such security practices as listed.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Data Breach&lt;/b&gt; : Rule 8 requires that if a data breach occurs, Body Corporate would have to be able to demonstrate that they have implemented their documented information security codes.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Though this provision holds a company accountable for the implementation of security practices, it does not address how a company should be held accountable for a large scale data breach as in the context of Big Data the scope and impact of a data breach is on a much larger scale.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Opt in and out and ability to withdraw consent&lt;/b&gt; : Rule 5(7) requires Body Corporate or any person on its behalf, prior to the collection of information - including sensitive personal information - must give the individual the option of not providing information and must give the individual the option of withdrawing consent. Such withdrawal must be sent in writing to the body corporate.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The feasibility of such a provision in the context of Big Data is unclear, especially in light of the fact that Big Data practices draw upon large amounts of data, generated often in real time, and from a variety of sources.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Disclosure of Information&lt;/b&gt;: Rule 6 maintains that disclosure of sensitive personal data can only take place with permission from the provider of such information or as agreed to through a lawful contract.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This provision addresses disclosure and does not take into account the “sharing” of information that is enabled through networked devices, as well as the increasing practice of companies to share anonymized or aggregated data.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Privacy Policy&lt;/b&gt; : Rule 4 requires that body corporate have in place a privacy policy on their website that provides clear and accessible statements of its practices and policies, type of personal or sensitive personal information that is being collected, purpose of the collection, usage of the information, disclosure of the information, and the reasonable security practices and procedures that have been put in place to secure the information.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In the context of Big Data where data from a variety of sources is being collected, used, and re-used it is important for policies to 'follow data' and appear in a contextualized manner. The current requirement of having Body Corporate post a single overarching privacy policy on its website could prove to be inadequate.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Remedy&lt;/b&gt; : Section 43A of the Act holds that if a body corporate is negligent in implementing and maintain reasonable security practices and procedures which results in wrongful loss or wrongful gain to any person, the body corporate can be held liable to pay compensation to the affected person.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This provision will provide limited remedy for an affected individual in the context of Big Data. Though important to help prevent data breaches resulting from negligent data practices, implementation of reasonable security practices and procedures cannot be the only hinging point for determining liability of a Body Corporate for violations and many of the harms possible through Big Data are not in the form of wrongful loss or wrongful gain to another person. Indeed many harms possible through Big Data are non-economic in nature – including physical invasion of privacy, and discriminatory practices that can arise from decisions based on Big Data analytics. Nor does the provision address the potential for future damage that can result from a 'Big Data data breach'.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The safeguards noted in the above section are not the only legal provisions that speak to privacy in India. There are over fifty sectoral legislation that have provisions addressing privacy - for example provisions addressing confidentiality of health and banking information. The government of India is also in the process of drafting a privacy legislation. In 2012 the Report of the Group of Experts on Privacy provided recommendations for a privacy framework in India. The Report envisioned a framework of co-regulation - with sector level self regulatory organization developing privacy codes (that are not lower than the defined national privacy principles) and that are enforced by a privacy commissioner.&lt;a href="#_ftn2" name="_ftnref2"&gt;[2]&lt;/a&gt; Perhaps this method would be optimal for the regulation of Big Data- allowing for the needed flexibility and specificity in standards and device development. Though the Report notes that individuals can seek remedy from the court and the Privacy Commissioner can issue fines for a violation, the development of privacy legislation in India has yet to clearly integrate the importance of due process and remedy. With the onset of Big Data - this will become more important than ever.&lt;/p&gt;
&lt;h3&gt;Conclusion&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The use and generation of Big Data in India is growing. Plans such as free wifi zones in cities&lt;a href="#_ftn3" name="_ftnref3"&gt;[3]&lt;/a&gt;, city wide CCTV networks with facial recognition capabilities&lt;a href="#_ftn4" name="_ftnref4"&gt;[4]&lt;/a&gt;, and the implementation of an identity/authentication platform for public and private services&lt;a href="#_ftn5" name="_ftnref5"&gt;[5]&lt;/a&gt;, are indicators towards a move of data generation that is networked and centralized, and where the line between public and private is blurred through the vast amount of data that is collected.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In such developments and innovations what is privacy and what role does privacy play? Is it the archaic inhibitor - limiting the sharing and use of data for new and innovative purposes? Will it be defined purely by legislative norms or through device/platform design as well? Is it a notion that makes consumers think twice about using a product or service or is it a practice that enables consumer and citizen uptake and trust and allows for the growth and adoption of these services?&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;How privacy will be regulated and how it will be perceived is still evolving across jurisdictions, technologies, and cultures - but it is clear that privacy is not being and cannot be overlooked. Governments across the world are reforming and considering current and future privacy regulation targeted towards life in a quantified society. As the Indian government begins to roll out initiatives that create a "Digital India" indeed a "quantified India", taking privacy into consideration could facilitate the uptake, expansion, and success of these practices and services. As the Indian government pursues the opportunities possible through Big Data it will be useful to review existing privacy protections and deliberate on if, and in what form, future protections for privacy and other rights will be needed.&lt;/p&gt;
&lt;hr /&gt;
&lt;p&gt;&lt;a href="#_ftnref1" name="_ftn1"&gt;[1]&lt;/a&gt;Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information Rules 2011). Available at: http://deity.gov.in/sites/upload_files/dit/files/GSR313E_10511(1).pdf&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref2" name="_ftn2"&gt;[2]&lt;/a&gt;Group of Experts on Privacy. (2012). &lt;i&gt;Report of the Group of Experts on Privacy.&lt;/i&gt; New Delhi: Planning Commission, Government of India. Retrieved May 20, 2015, from http://planningcommission.nic.in/reports/genrep/rep_privacy.pdf&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref3" name="_ftn3"&gt;[3]&lt;/a&gt; NDTV. “Free Public Wi-Fi Facility in Delhi to Have Daily Data Limit. NDTV, May 25&lt;sup&gt;th&lt;/sup&gt; 2015, Available at: &lt;a href="http://gadgets.ndtv.com/internet/news/free-public-wi-fi-facility-in-delhi-to-have-daily-data-limit-695857"&gt;http://gadgets.ndtv.com/internet/news/free-public-wi-fi-facility-in-delhi-to-have-daily-data-limit-695857&lt;/a&gt;. Accessed: July 2&lt;sup&gt;nd&lt;/sup&gt; 2015.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref4" name="_ftn4"&gt;[4]&lt;/a&gt;FindBiometrics Global Identity Management. “Surat Police Get NEC Facial Recognition CCTV System”. July 21&lt;sup&gt;st&lt;/sup&gt; 2015. Available at: http://findbiometrics.com/surat-police-nec-facial-recognition-27214/&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref5" name="_ftn5"&gt;[5]&lt;/a&gt;UIDAI Official Website. Available at: https://uidai.gov.in/&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/big-data-and-information-technology-rules-2011'&gt;https://cis-india.org/internet-governance/blog/big-data-and-information-technology-rules-2011&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>elonnai</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Big Data</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2015-08-11T07:01:12Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/hardnewsmedia-august-10-2015-abeer-kapoor-net-neutrality-india-is-a-keybattle-ground">
    <title>Net Neutrality: India is a Keybattle Ground</title>
    <link>https://cis-india.org/internet-governance/news/hardnewsmedia-august-10-2015-abeer-kapoor-net-neutrality-india-is-a-keybattle-ground</link>
    <description>
        &lt;b&gt;Hardnews talks to Sunil Abraham, the executive director of the Centre for Internet and Society (CIS), about the future of the Internet in India.&lt;/b&gt;
        &lt;p id="stcpDiv" style="text-align: justify; "&gt;The article by Abeer Kapoor was &lt;a class="external-link" href="http://www.hardnewsmedia.com/2015/08/net-neutrality-india-keybattle-ground"&gt;published in Hardnews&lt;/a&gt; on August 10, 2015.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;&lt;b&gt;There are competing definitions of net neutrality. What do you think an Indian definition of net neutrality should be?&lt;/b&gt;&lt;br /&gt;It should be driven by an empirical  understanding of the harms and benefits for Indian consumers. Any  regulation should be based on evidence of harm. Forbearance should be  the first option for any regulator. The second option is mandating  transparency. The third option, as (Managing Director of the World  Dialogue on Regulation for Network Economies Programme) William Melody  says, should be raising competition before we consider other more  intrusive regulatory measures such as price regulation, mandatory  registration and licensing, etc. Telling network administrators how to  run their networks should be the very last option we consider. Ideally,  the Competition Commission of India should have started an investigation  into the competition harms emerging from network neutrality violations.  There are other harms emerging from network neutrality violations, such  as free speech harms, diversity harms, innovation harms and privacy  harms. These residual elements should have been the focus of the TRAI  (Telecom Regulatory Authority of India) consultation paper process, the  DoT (Department of Telecommunications) panel process and the  consultations of the parliamentary standing committee.&lt;/span&gt;&lt;/p&gt;
&lt;div id="stcpDiv"&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;&lt;b&gt;There  are certain rights that are essential, like privacy. How do you think  the right to privacy will play into the definition of Indian net  neutrality?&lt;/b&gt;&lt;br /&gt;Deep packet inspection – which is a  method that is used to manage Internet traffic and walled garden access  via mobile applications – causes significant privacy harms and gives  rise to a range of security vulnerabilities. These cannot be directly  addressed in network neutrality policy. On privacy and security, it is  not clear that the Indian situation is different from the global trend,  so it is unlikely that we will have an India-specific privacy language  in our network neutrality policy.&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Privacy harms caused by network  neutrality violations have to be addressed by enacting the privacy bill  into law. The Department of Personnel and Training (DoPT) has been  working on this Bill for the last five or six years. The latest draft  has implemented the recommendations of the Justice AP Shah Committee.  The last leak of the privacy Bill revealed that the DoPT has included  the nine principles identified by the &lt;span&gt;&lt;a href="http://planningcommission.nic.in/reports/genrep/rep_privacy.pdf"&gt;Shah Committee Report on Privacy&lt;/a&gt;&lt;/span&gt;.  We hope that the government will introduce this Bill at the earliest.  Section 43A of the IT Act may also need to be amended to address all the  nine privacy principles.&lt;/span&gt;&lt;/p&gt;
&lt;div id="stcpDiv"&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;&lt;b&gt;The  report drafted by DoT on net neutrality is ambiguous and almost  reluctant to take a stand. What are the key points of this report?&lt;/b&gt;&lt;br /&gt;The &lt;span&gt;&lt;a href="https://mygov.in/sites/default/files/master_image/Net_Neutrality_Committee_report.pdf"&gt;DoT panel report&lt;/a&gt; &lt;/span&gt;does  take a stand. It clearly identifies network neutrality as a policy  goal. Unfortunately, the panel did not provide its own definition of  network neutrality, but instead quoted a definition submitted by civil  society activists who testified before it without explicitly adopting  it. The panel report examines zero rating and legitimate traffic  management in quite a bit of detail and does prescribe some regulatory  decision trees to the policymakers. When it comes to specialised  services and walled gardens there could have been more detailed and  specific recommendations. The biggest disappointment in the report is  the call for licensing of those OTT (Over the Top) service providers  that provide equivalent services to those provided by telcos. While the  need to address regulatory arbitrage from the perspective of privacy and  surveillance law may be virtuous, it may not be technically feasible to  do so, especially if there is end-to-end encryption. Also, regulatory  arbitrage could be addressed by reducing regulations for telcos rather  than increasing them for &lt;/span&gt;&lt;span&gt;OTT providers.&lt;/span&gt;&lt;/p&gt;
&lt;div id="stcpDiv"&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;&lt;b&gt;Do you think licensing and regulation of OTT services such as Google and WhatsApp are a necessity?&lt;/b&gt;&lt;br /&gt;It is a myth that they exist in a  regulatory vacuum. Many regulations do apply to them and a few of them  do comply with Indian authorities on issues like speech regulation,  legal interception and also data access. With competition law and  taxation there is very little compliance. The trouble is not that there  are regulatory vacuums, but rather that these services operate from  foreign jurisdictions. Without offices, servers and human resources  within the Indian jurisdiction it is very difficult for the courts to  implement their orders, and for law enforcement to ensure compliance  with Indian laws. This jurisdictional challenge affects most developing  countries and not just India, and can only be solved by harmonising  procedural and substantive law across jurisdictions, through the spread  of soft norms, development of self-regulatory mechanisms using the  multi-stakeholder models and through the creation of international law  through various multilateral and pluri-lateral bodies.&lt;/span&gt;&lt;/p&gt;
&lt;div id="stcpDiv"&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;&lt;b&gt;The report reduces the neutrality debate to ‘access.’ Do you think this approach is reductive?&lt;/b&gt;&lt;br /&gt;Access is very important in the  Indian context so I don’t see how that is reductive. Many observers  believe that the next round in the war for network neutrality will  happen in the global South. India is a key battleground – what happens  here will have global impact and implications. Network neutrality  policies need to consider free speech, privacy, competition, diversity  and innovation goals of the markets they seek to regulate. If we are not  being doctrinaire about network neutrality we could adopt what  (Professor of Internet &amp;amp; Media Law at the University of  Sussex) Chris Marsden calls forward-looking “positive net neutrality”  wherein “higher QoS (Quality of Service) for higher prices should be  offered on fair, reasonable and non-discriminatory [FRAND] terms to all  comers”. FRAND, according to Prof. Marsden, is well understood by the  telcos and ISPs (Internet Service Providers) as it is the basis of  common carriage. This understanding of network neutrality allows for  technical and business model innovation by ISPs and telcos without the  associated harms. There are zero-rating services being launched  by Mozilla, Jaana, Mavin and others that are attempting to do this. I do  not believe that they violate network neutrality principles, unlike  Airtel Zero or Internet.org.&lt;/span&gt;&lt;/p&gt;
&lt;div id="stcpDiv"&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;&lt;b&gt;While  this report attempts to arrive at a middle ground between the TSPs and  the OTTs, how is this going to reflect in the government’s ‘Digital  India’ programme?&lt;/b&gt;&lt;br /&gt;We know we have a policy solution  when all stakeholders are equally unhappy. But we also need an elegant  solution that is easy to implement. Scholars like (Associate Professor  of Computer Science at Columbia University) Vishal Mishra have a  theoretical solution based on the Shapley Value, that assumes a  multi-sided market model, but this may not work in real life. Professor  V. Sridhar of the International Institute of Information Technology,  Bengaluru (IIITB) has a very elegant idea of setting a ceiling and floor  for price and speed and also for insisting on a minimum QoS of the  whole of the Internet. These ideas I have not heard in the American and  European debate around network neutrality. I remain hopeful that the  Indian middle ground will be qualitatively different, given that the  structure and constraints of the Indian telecom sector are very  different from that in developed countries. Ensuring network neutrality  is essential to the success of Digital India. Unfortunately, the Digital  India plans that we have heard so far don’t make this &lt;/span&gt;&lt;span&gt;explicitly clear.&lt;/span&gt;&lt;/p&gt;
&lt;div id="stcpDiv"&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;&lt;b&gt;The  Internet was never meant to be monetised. Do you think that private  players are eating into a public good that is absolutely necessary for  development?&lt;/b&gt;&lt;br /&gt;I have never heard that statement before. &lt;a href="http://www.hardnewsmedia.com/2011/06/3992"&gt;&lt;span&gt;The Internet&lt;/span&gt;, &lt;span&gt;after its early history, has been completely built using private capital&lt;/span&gt;&lt;/a&gt;.  The public Internet has always been monetised. Collectively, the  individual entrepreneurs and enterprises that build and run the  components of the Internet have created a common public good – which is  the globally interconnected network. But the motivation for private  capital behind maintaining and building their corner or component of  this network has also been profit maximisation.&lt;/span&gt;&lt;/p&gt;
&lt;div id="stcpDiv"&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;&lt;b&gt;What has contributed to the growing need to regulate and administer the Internet?&lt;/b&gt;&lt;br /&gt;Technical advancements and business  model innovations have resulted in both benefits and harms and therefore  there could be a rationale for regulation. But more regulation per se  is not a virtue and does not serve the interest of citizens and  consumers. Expanding the regulatory scope of government infinitely will  only result in failure, given the limited capacity and resources of the  State. Therefore, whenever the State enters a new area of regulation it  should ideally stop regulating in another area. In other words, there is  no clear case that the regulation of the Internet is needed to keep  growing exponentially – as evolving technologies may require specific  regulation – if the resultant harms cannot be addressed using existing  law. In most cases, traditional law is sufficient to deal with crimes  and offences online.&lt;/span&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;/div&gt;
&lt;/div&gt;
&lt;/div&gt;
&lt;/div&gt;
&lt;/div&gt;
&lt;/div&gt;
&lt;hr /&gt;
&lt;p&gt;This story is from the print issue of Hardnews: August 2015&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/hardnewsmedia-august-10-2015-abeer-kapoor-net-neutrality-india-is-a-keybattle-ground'&gt;https://cis-india.org/internet-governance/news/hardnewsmedia-august-10-2015-abeer-kapoor-net-neutrality-india-is-a-keybattle-ground&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>praskrishna</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Net Neutrality</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    

   <dc:date>2015-09-20T07:08:42Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/cis-submission-to-unga-wsis-review">
    <title>CIS submission to the UNGA WSIS+10 Review</title>
    <link>https://cis-india.org/internet-governance/blog/cis-submission-to-unga-wsis-review</link>
    <description>
        &lt;b&gt;The Centre for Internet &amp; Society (CIS) submitted its comments to the non-paper on the UNGA Overall Review of
the Implementation of the WSIS outcomes, evaluating the progress made and challenges ahead.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;&lt;b&gt;To what extent has progress been made on the vision of the peoplecentred, inclusive and development oriented Information Society in the ten years since the WSIS?&lt;/b&gt;&lt;br /&gt;The World Summit on the Information Society (WSIS) in 2003 and 2005 played an important role in encapsulating the potential of knowledge and information and communication technologies (ICT) to contribute to economic and social development. Over the past ten years, most countries have sought to foster the use of information and knowledge by creating enabling environment for innovation and through efforts to increase access. There have been interventions to develop ICT for development both at an international and national level through private sector investment, bilateral treaties and national strategies.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;However, much of the progress made in the past ten years in terms of getting people connected and reaping the benefits of ICT has not been sufficiently peoplecentred, nor have they been sufficiently inclusive.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;These developments have not been sufficiently peoplecentred, since governments across the world have been using the Internet as a monumental surveillance tool, invading people’s privacy without legitimate justifications, in an arbitrary manner without due care for reasonableness,  proportionality, or democratic accountability. These developments have not been sufficiently peoplecentred, since the largest and most profitable Internet businesses — businesses that have more users than most nationstates have citizens, yet have one-sided terms of service — have eschewed core principles like open standards and interoperability that helped create the Internet and the World Wide Web, and instead promote silos.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;We still reside in a world where development has been very lopsided, and ICTs have contributed to reducing some of these gulfs, while exacerbating others. For instance, persons with visual impairment are largely yet to reap the benefits of the Information Society due to a lack of attention paid to universal, while sighted persons have benefited far more; the ability of persons who don’t speak a language like English to contribute to global Internet governance discussions is severely limited; the spread of academic knowledge largely remains behind prohibitive paywalls.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;As ICTs have grown both in sophistication and reach, much work remains to achieve the peoplecentred, inclusive and developmentoriented information society envisaged in WSIS. While the diffusion of ICTs has created new opportunities for development, even today less than half the world has access to broadband (with only eleven per cent of the world’s population having access to fixed broadband). See &lt;a class="external-link" href="http://www.itu.int/en/ITUD/Statistics/Documents/facts/ICTFactsFigures2015.pdf"&gt;International Telecommunication Union, ICT Facts and Figures: The World in 2015&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Ninety per cent of people connected come from the industrialized countries — North America (thirty per cent), Europe (thirty per cent) and the AsiaPacific (thirty per cent). Four billion people from developing countries remain offline, representing two-thirds of the population residing in developing countries. Of the nine hundred and forty million people residing in Least Developed Countries (LDCs), only eighty-nine million use the Internet and only seven per cent of households have Internet access, compared with the world average of forty-six per cent. See &lt;a class="external-link" href="http://www.itu.int/en/ITUD/Statistics/Documents/facts/ICTFactsFigures2015.pdf"&gt;International Telecommunication Union, ICT Facts and Figures: The World in 2015&lt;/a&gt;. This digital divide is first and foremost a question of access to basic infrastructure (like electricity).&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Furthermore, there is a problem of affordability, all the more acute since in the South in comparison with countries of the North due to the high costs related to access to the connection. Further, linguistic, educational, cultural and content related barriers are also contributing to this digital divide. Growth of restrictive regimes around intellectual property, vision of the equal and connected society. Security of critical infrastructure with in light of ever growing vulnerabilities, the loss of trust following revelations around mass surveillance and a lack of consensus on how to tackle these concerns are proving to be a challenge to the vision of a connected information society. The WSIS+10 overall review is timely and a much needed intervention in assessing the progress made and planning for the challenges ahead.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;There were two bodies as major outcomes of the WSIS process: the Internet Governance Forum and the Digital Solidarity Fund, with both of these largely failing to achieve their intended goals. The Internet Governance Forum, which is meant to be a leading example of “multi-stakeholder governance” is also a leading example of what the Multi-stakeholder Advisory Group (MAG) noted in 2010 as “‘black box’ approach”, with the entire process around the nomination and selection of the MAG being opaque. Indeed, when CIS requested the IGF Secretariat to share information on the nominators, we were told that this information will not be made private. Five years since the MAG lamented its own blackbox nature, things have scarcely improved. Further, analysis of MAG membership since 2006 shows that 26 persons have served for 6 years or more, with the majority of them being from government, industry, or the technical community. Unsurprisingly, 36 per cent of the MAG membership has come from the WEOG group, highlighting both deficiencies in the nomination/selection&lt;br /&gt;process as well as the need for capacity building in this most important area. The Digital Solidarity Fund failed for a variety of reason, which we have analysed in a &lt;a class="external-link" href="https://docs.google.com/document/d/1E0HKY06744b6i2slR5HMk9Qd6I7zPFWJlKSmhsneAs/ edit"&gt;separate document&lt;/a&gt; annexed to this response.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;What are the challenges to the implementation of WSIS outcomes?&lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Some of the key areas that need attention going forward and need to be addressed include:&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Access to Infrastructure&lt;/p&gt;
&lt;ul&gt;
&lt;li style="text-align: justify; "&gt;Developing policies aimed at promoting innovation and increasing affordable access to hardware and software, and curbing the ill effects of the currentlyexcessive patent and copyright regimes.&lt;/li&gt;
&lt;li&gt;Focussing global energies on solutions for last-mile access to the Internet in a manner that is not decoupled from developmental ground realities.&lt;/li&gt;
&lt;li&gt;This would include policies on spectrum sharing, freeing up underutilized spectrum, and increasing unlicensed spectrum.&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;This would also include governmental policies on increasing competition among Internet providers at the last mile as well as at the backbone (both nationally and internationally), as well as commitments for investments in basic infrastructure such as an openaccess national fibreoptic backbone where the private sector investment is not sufficient.&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;Developing policies that encourage local Internet and communications infrastructure in the form of Internet exchange points, data centres, community broadcasting.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Access to Knowledge&lt;/p&gt;
&lt;ul&gt;
&lt;li style="text-align: justify; "&gt;As the Washington Declaration on IP and the Public Interest5 points out, the enclosure of the public domain and knowledge commons through expansive “intellectual property” laws and policies has only gotten worse with digital technologies, leading to an unjust allocation of information goods, and continuing royalty outflows from the global South to a handful of developing countries. This is not sustainable, and urgent action is needed to achieve more democratic IP laws, and prevent developments such as extra judicial enforcement mechanisms such as digital restrictions management systems from being incorporated within Web standards.&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;Aggressive development of policies and adoption of best practices to ensure that persons with disabilities are not treated as secondgrade citizens, but are able to fully and equally participate in and benefit from the Information Society.&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;Despite the rise of video content on the Internet, much of that has been in parts of the world with already high literacy, and language and illiteracy continue to pose barriers to full usage of the Internet.&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;While the Tunis Agenda highlighted the need to address communities marginalized in Information Society discourse, including youth, older persons, women, indigenous peoples, people with disabilities, and remote and rural communities, but not much progress has been seen on this front.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Rights, Trust, and Governance&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Ensuring effective and sustainable participation, especially from developing countries and marginalised communities. Developing governance mechanisms that are accountable and transparent, and that provide checks against both unaccountable commercial interests and governments.&lt;/li&gt;
&lt;li&gt;Building citizen trust through legitimate, accountable and transparent governance mechanisms.&lt;/li&gt;
&lt;li&gt;Ensuring cooperation between states as security is influenced by global foreign policy, and is of principal importance to citizens and consumers, and an enabler of other rights.&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;As the Manila Principles on Intermediary Liability show, uninformed intermediary liability policies, blunt and heavy handed regulatory measures, failing to meet the principles of necessity and proportionality, and a lack of consistency across these policies has resulted in censorship and other human rights abuses by governments and private parties, limiting individuals’ rights to free expression and creating an environment of uncertainty that also impedes innovation online. In developing, adopting, and reviewing legislation, policies and practices that govern the liability of intermediaries, interoperable and harmonized regimes that can promote innovation while respecting users’ rights in line with the Universal Declaration of Human Rights, the International Covenant on Civil and Political Rights and the United Nations Guiding Principles on Business and Human Rights are needed and should be encouraged.&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;An important challenge before the Information Society is that of the rise of “quantified society”, where enormous amounts of data are generated constantly, leading to great possibilities and grave concerns regarding privacy and data protection.&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;Reducing tensions arising from the differences between cultural and digital nationalism including on issues such as data sovereignty, data localisation, unfair trade and the need to have open markets.&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;Currently, there is a lack of internationally recognized venues accessible to all stakeholders for not only discussing but also acting upon many of these issues.&lt;/li&gt;
&lt;/ul&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;What should be the priorities in seeking to achieve WSIS outcomes and progress towards the Information Society, taking into account emerging trends?&lt;/b&gt;&lt;br /&gt;All the challenges mentioned above should be a priority in achieving WSIS outcomes and ensuring innovation to lead social and economic progress in society. Digital literacy, multilingualism and addressing privacy and user data related issues need urgent attention in the global agenda. Enabling increased citizen participation thus accounting for the diverse voices that make the Internet a unique medium should also be treated as priority. Renewing the IGF mandate and giving it teeth by adopting indicators for development and progress, periodic review and working towards tangible outcomes would be beneficial to achieving the goal of a connected information society.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;What are general expectations from the WSIS + 10 High Level Meeting of the United Nations General Assembly?&lt;/b&gt;&lt;br /&gt;We would expect the WSIS+10 High Level Meeting to endorse an outcome document that seeks to develop a comprehensive policy framework addressing the challenges highlighted above. It would also be beneficial if the outcome document could identify further steps to assess the progress made so far, and actions for overcoming the identified challenges. Importantly, this should be aimed not only at governments, but at all stakeholders. This would be useful as a future road map for regulation and would also allow us to understand the impact of the Internet on society.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;What shape should the outcome document take?&lt;/b&gt;&lt;br /&gt;The outcome document should be a resolution of the UN General Assembly, with high-level policy statements and adopted agreements to work towards identified indicators. It should stress the urgency of reforms needed for ICT governance that is democratic, respects human rights and social justice, and promotes participatory policymaking. The language should promote the use of technologies and institutional architectures of governance that ensure users’ rights over data and information and recognize the need to restrict abusive uses of technology, including mass surveillance. Further, the outcome document should underscore the relevance of the Universal Declaration of Human Rights, including civil, political, social, economic, and cultural rights, in the Information Society.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The outcome document should also acknowledge that certain issues, such as security, ensuring transnational rights, taxation, and other cross-jurisdictional matters, may need greater international cooperation, and should include concrete steps on how to proceed on these issues. The outcome document should acknowledge the limited progress made through outcome-less multi-stakeholder governance processes such as the Internet Governance Forum, which favour the status quo, and seek to enable the IGF to be bolder in achieving its original goals, which are still relevant. It should be frank in acknowledging the lack of consensus on issues such as “enhanced cooperation” and the “respective roles” of stakeholders in multi-stakeholder processes, as brushing these difficulties under the carpet will not magically build consensus. Further, the outcome document should recognize that there are varied approaches to multi-stakeholder governance.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/cis-submission-to-unga-wsis-review'&gt;https://cis-india.org/internet-governance/blog/cis-submission-to-unga-wsis-review&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>jyoti</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>WSIS+10</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    

   <dc:date>2015-08-09T16:24:04Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/desi-blitz-august-7-2015-nazhat-khan-india-partially-lifts-porn-ban">
    <title>India Partially Lifts Porn Ban?</title>
    <link>https://cis-india.org/internet-governance/news/desi-blitz-august-7-2015-nazhat-khan-india-partially-lifts-porn-ban</link>
    <description>
        &lt;b&gt;India is said to have partially removed the porn ban, but many internet service providers have refused to restore access due to a 'vague' government order.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The blog post by Nazhat Khan was &lt;a class="external-link" href="http://www.desiblitz.com/content/india-partially-lifts-porn-ban"&gt;published in DESI blitz&lt;/a&gt; on August 7, 2015.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;India has partially lifted the ban on online pornography, just days after blocking user access to 857 adult websites.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Indian government enforced the ban on July 30, 2015, only to reverse its decision on August 4, 2015. Ravi Shankar Prasad, the Communications and IT Minister, clarifies that the ban only targets websites promoting child pornography.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;He says: “A new notification will be issued shortly. The ban will be partially withdrawn. Sites that do not promote child porn will be unbanned.” Under the new order, internet service providers (ISPs) in India are allowed to unblock these 857 websites – except for those that contain child pornography.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This has caused another outrage. ISPs complain that it is not within their capability or responsibility to do so. The Internet Service Providers Association of India (ISPAI) explains: “ISPs have no way or mechanism to filter out child pornography from URLs, and the further unlimited sub-links.&lt;/p&gt;
&lt;table class="invisible" style="text-align: justify; "&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;th&gt;&lt;img src="https://cis-india.org/home-images/copy3_of_Pranesh.png" alt="Pranesh" class="image-inline" title="Pranesh" /&gt;&lt;/th&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;
&lt;p style="text-align: justify; "&gt;“Hence, we request your good self to advise us immediately on the future course of action in this regard.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;“Till your further directive, the ISPs are keeping the said 857 URLs disabled.”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;An executive at an Indian ISP tells the Wall Street Journal: “How can we go ahead? What if something comes up tomorrow [on one of these sites], which has child porn, or something else?” &lt;br /&gt;&lt;br /&gt;Pranesh Prakash, policy director at the Centre for Internet and Society, points out that it is not right for the government to pass the ball over to private companies. &lt;br /&gt;&lt;br /&gt;He says: “The onus cannot be put on the service providers. What the government is doing is inherently unfair; it is not what the law requires.” In effect, porn sites in India are still blocked. The Supreme Court and senior officials are yet to provide clearer directives for ISPs.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/desi-blitz-august-7-2015-nazhat-khan-india-partially-lifts-porn-ban'&gt;https://cis-india.org/internet-governance/news/desi-blitz-august-7-2015-nazhat-khan-india-partially-lifts-porn-ban&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>praskrishna</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Freedom of Speech and Expression</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Censorship</dc:subject>
    

   <dc:date>2015-09-20T06:30:34Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/open-magazine-august-7-2015-ullekh-np-genetic-profiling">
    <title>Genetic Profiling: Is It All in the DNA?</title>
    <link>https://cis-india.org/internet-governance/news/open-magazine-august-7-2015-ullekh-np-genetic-profiling</link>
    <description>
        &lt;b&gt;A Bill seeks to make genetic profiling mandatory for the fight against crime—and generates a debate about the clash of ethics, freedom, science and data.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article by Ullekh NP was &lt;a class="external-link" href="http://www.openthemagazine.com/article/nation/genetic-profiling-is-it-all-in-the-dna"&gt;published in Open Magazine&lt;/a&gt; on August 7, 2015. Sunil Abraham gave his inputs.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;When British geneticist Sir Alec Jeffreys first developed the DNA profiling test 31 years ago in his laboratory at Leicester University, he didn’t help the police prove a man guilty. His test—back then it took weeks to complete DNA profiling procedures as opposed to a few hours now—proved that a rape suspect in police custody was innocent. Details from the whole exercise also subsequently helped the local police nab the real criminal, who had killed his teenaged rape victim. Later, the police found that he was the one who had committed a similar crime three years earlier in a village nearby. Britain was destined to make great gains in solving crimes thanks to DNA identification, while the rest of the developed world, including the US, caught up later, but only after lagging initially thanks to the relentless—and sometimes ill-founded—opposition from civil liberties activists. In India, the Human DNA Profiling Bill, 2015, a proposed law that envisages collecting DNA fingerprints—which are unique to an individual—especially of criminals, has been in the making for the past 12 years. The draft bill, which will shortly be placed before the Union Cabinet for its nod, has been prepared by the Department of Biotechnology and the Centre for DNA Fingerprinting &amp;amp; Diagnostics (CDFD), a Hyderabad-based Central Government-run agency, after examining and reviewing submissions by a panel of experts, holding consultations with various stakeholders and getting responses from the public. Notwithstanding the claims of safeguards against any misuse of the intended DNA database, activists, lawyers, internet freedom fighters, civil liberty activists and columnists have been up in arms against the Government, arguing that the DNA profiling bill is ill-conceived and naïve—to the extent that it would destroy an individual’s right to privacy as it lacks provisions to check data tampering.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The international experience has proved otherwise. Ever since Sir Jeffreys extracted DNA from human muscle tissue, identified and processed genetic markers (which are unique to individuals except in the case of identical twins) from what was until then considered ‘seemingly purposeless segments of the human DNA’ in the words of writers Peter Reinharz and Howard Safir, more than 500,000 ‘otherwise unsolvable’ cases have been solved in the developed world thanks to DNA identification, note CDFD scientists. DNA is the hereditary material in the human body. It is found in blood, saliva, urine, strands of hair, semen, tears, skin, etcetera.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Dr Madhusudan Reddy Nandineni, staff scientist and group leader, laboratory of DNA fingerprinting services and laboratory of genomics and profiling applications, CDFD, is worried that opposition to the Bill is gaining momentum in India due to a raft of reasons. Of course, the West, too, has witnessed sharp protests against DNA profiling laws. One of the key reasons anti-profiling activists have an edge, says a senior Home Ministry official who asks not to be named, is that there is a “general public anxiety” over “anything to do with disclosing personal details”. He agrees that the tests are going to be intrusive, because muscle tissue may have to be collected from private parts. The procedure of DNA sample collection—as explained in the draft Bill submitted in January by a committee headed by TS Rao, senior adviser to the department of biotechnology—talks about obtaining intimate body samples of living persons (on pages 6-7 of the 48-page document) from ‘the genital or anal area, the buttocks and also breasts in the case of a female’. According to the draft Bill, it also involves external examination of private parts, taking samples from pubic hair or by swabs or washing or by vacuum suction, by scraping or by lifting by tape, and taking of a photograph or video recording of, or an impression or cast of, a wound in those areas. “But then, it is par for the course,” says the Home Ministry official by way of justification.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;American military historian and author Edward Luttwak agrees that DNA profiling is a significant intrusion into the “very body of a citizen”. That is the price one has to pay in the choice between liberty and equality before investigation, he posits. Luttwak is glad that in the US, as well as in other countries that have such profiling laws, DNA identification has yielded results. “It protects suspicious/low-status but innocent people from false accusations and helps to catch clever/high-status law-breakers,” he says.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;+++&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;For his part, Dr Nandineni says that every aspect of the Human  DNA Profiling Bill for India is based on similar legislation that has  already been implemented in the US, Canada, UK, Australia and  Continental Europe for more than 20 years. He also contends that the  benefits that have accrued there are enormous, which India has missed  out on for all these years. “In all these countries, the concerns of the  general public on privacy matters have been allayed in their  legislation,” he adds. He points out that the retention of DNA profiles  in a ‘DNA Data Bank’ is meant to apprehend repeat offenders and thus  serve a larger societal good. As regards privacy concerns, Dr Nandineni  says that consultations on the preparations of the Bill lasted for 2-3  years and took into account the views of an expert committee whose  members included representatives of NGOs.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Dr Nandineni is of the view that the opponents of the Bill have  managed to get an upper hand in a national debate thanks to their  media-savvy backgrounds. Agrees the Home Ministry official: “Perhaps the  drafters of the Bill have not been communicative enough in getting  their points across to the public and the media. Which might explain why  the Bill has come under tremendous attack in the media. Even otherwise,  global trends also show that civil liberty rights activists have had  great initial advantage in their campaign against DNA profiling.” After  all, the potential for misuse of DNA samples is not restricted to  biological material collected under the provisions of the DNA Bill  alone, Nandineni offers. “Any and every blood sample collected by a  clinical laboratory has the same potential for misuse,” he says.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;While Dr J Gowrishankar, director, CDFD, has been vocal about the  positives of the Bill, its opponents have been louder. Many of those  who oppose the Bill say the question is not one of being loud or feeble,  but about being naïve or not.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The likes of Sunil Abraham, executive director of Bangalore-based  internet research organisation Centre for Internet and Society (CIS),  have no argument against DNA profiling being the gold standard for all  forensic investigations. “There is nothing wrong with using DNA evidence  for forensic purposes,” says Abraham, “However, the draft Bill is  filled with techno-utopianism; it assumes that the people and machines  that leverage DNA technologies are infallible.” He goes on, “This is not  true. It is easier to tamper with DNA evidence than it is to tamper  with a video recording. Therefore, all we are asking for are process  checks that prevent compromised persons and machines from using DNA  evidence to convict or exonerate the wrong person.” His contention is  that if the DNA sample is sent to two different labs and both labs come  back with exactly the same result, then the courts can be convinced of  the veracity of the result. “Also the Bill says that DNA labs will give  courts ‘yes’ or ‘no’ answers to questions related to DNA matching. But  ideally, the lab must give the exact match percentage along with all the  detailed information that emerges from the match process so that the  court can fully appreciate the significance of the DNA evidence,” he  suggests.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Abraham and legal scholar Usha Ramanathan—both members of the  expert panel who filed notes of dissent and disagreed with various  aspects of the Bill—have a problem with the claim that the proposed DNA  data bank will cover only criminals and not the general public. Points  out Ramanathan: “The Bill does not restrict the data base to criminals  alone, not by a long shot. The provision in the proposed Bill reads:  ‘(Clause 31(4)) Every DNA Data Bank shall maintain following indices for  various categories of data, namely: (a) a crime scene index; (b) a  suspects’ index; (c) an offenders’ index; (d) a missing persons’ index;  (e) unknown deceased persons’ index; (f) a volunteers’ index; and (g)  such other DNA indices as may be specified by regulations.’ That is an  elaborate set of indices. There is certainly a lot of the ‘general  public’ in it.”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Supporters of the DNA Profiling Bill have maintained that a DNA  data bank is not for the public but only for a limited category of  individuals. The proposed law also provides for storing profiles with  the consent of relatives of missing children and grownups so that  relationship identities can be established.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Ramanathan is also worried that apart from purposes of criminal  justice, DNA profiling may be extended to parental disputes (maternity  or paternity), issues related to pedigree, those related to assisted  reproductive technologies (surrogacy, in vitro fertilisation or IVF,  intrauterine implantation or IUI, and so on), to transplantation of  human organs (donor and recipient) under the Transplantation of Human  Organs Act, 1994, and also related to immigration or emigration. She had  objected to the requirement of revealing a person’s caste in the  application form for offering blood samples. “This Bill is certainly not  a convict data base. The ambitions are much much vaster, and little to  do with crime control,” she alleges.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Abraham agrees that, under pressure from expert panel members such as him, some safeguards have been built into the proposed law to prevent any misuse of DNA data. However, he says, cyber security and privacy-related issues are not addressed in a comprehensive manner. “The Bill basically hopes that the Privacy Bill will address all of this when it becomes law. But unfortunately, a bill could take 7-10 years before it becomes law,” he says.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Dr Gowrishankar of CDFD and others have conceded that it was the  decision of the expert panel to include an enabling provision for the  privacy issues of DNA profiling to comply with the proposed Privacy  Bill.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Abraham says that various measures to prevent ‘privacy harms’ to  volunteers are missing in the latest draft of the Bill. “Given that  biometric technology works on probabilistic matching, the larger the  size of the database, the larger the incidence of mistaken  identification. Therefore it is important that the database remain as  small as necessary,” he asserts.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;+++&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The estimated cost of the Bill is Rs 20 crore—to create the  infrastructure for the DNA Profiling Board and the data bank, which  includes buildings, furniture, computer servers and so on. Among other  things, the DNA Profiling Board is tasked with the responsibility of  laying down and implementing standards for laboratories and proper  protocols for ‘Data Bank’ operations.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;CDFD scientists and government officials are keen to highlight the ‘under-hyped’ benefits of DNA profiling – similar to the Innocence Project in the US, which was aimed at securing the release of people who were erroneously convicted on the basis of other lines of evidence. Abraham has no patience for such comparisons. “DNA profiling for forensic purposes is very advanced and sophisticated, but technologies do not exist in a vacuum,” he says, “These advanced technologies have to work within traditional institutions with vulnerabilities and flaws. We need to, therefore, have non-technological procedural fixes that ensure that these technologies are not compromised by money and power. The choice is between the right to privacy and the rights and requirements of the criminal justice process.”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Ramanathan agrees with that view. “In the Indian context, the  state of investigation is so poor that we have been looking for ways of  circumventing our problems, not addressing them. That is how  narco-analysis began to be used, till the court struck it down. DNA may  be more reliable than most other scientific tools available to us today,  but it is not all about the science. We also have to worry about  contamination, what happens in the chain of custody, its potential for  being planted or otherwise abused, and the errors even in the  laboratory. You may remember the avowed mix-up of results in the Aarushi  [Talwar murder] case, something the lab said they noticed over two  years after they had given it to the investigators. The danger of  treating DNA as conclusive and not needing corroboration is exacerbated  in this kind of a vulnerable system. Which is why bringing this into a  DNA data base law and not putting any checks on criminal procedure is  less than wise,” she elaborates. She is least impressed with the ‘idea’  of ‘pedigree’ and of ‘population genetics’ in the Bill. “Institutions  like the CDFD have been collecting DNA from suspects and asking for the  caste of the person on the form. How does this seem innocent and  safeguarded?” she asks.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Meanwhile, columnist and author Salil Tripathi says that it is sheer hubris to think that technology will provide all the answers to crime-fighting. “Technology is enormously useful and powerful, but it is value-neutral; it can be used for good or bad ends… There have to be sufficient safeguards, overseen not only by technologists, law enforcement officers and bureaucrats, but also by lawyers and civil liberties experts, who can point out potential flaws and misuse and prevent those.”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Tripathi, too, is piqued that one of the markers sought is of  caste. “Why?” he asks, emphatic that the country’s people should be  concerned about allowing the state so much power over their lives. “And  it may not be only the state; given that the scope of its future  expansion is undefined, what guarantees are there that private actors  won’t have access to the data, and if so, what security protocols would  apply?”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Dr Gowrishankar and Dr Nandineni are right in saying that without DNA  fingerprinting, many international criminals would still be at liberty,  and the opponents of the Bill do not disagree with the efficacy of the  technique developed by Sir Jeffreys. Instead, they are placing the  spotlight on various objectionable aspects in the proposed law. In a  country which first needs—according to former RAW chief Vikram Sood—to  ensure access to Photofit (a technique to create an accurate image of a  person that gels with a witness’ description) for its ground-level  police operatives to combat crime, critics of the Bill seem to have won  the war of words.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/open-magazine-august-7-2015-ullekh-np-genetic-profiling'&gt;https://cis-india.org/internet-governance/news/open-magazine-august-7-2015-ullekh-np-genetic-profiling&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>praskrishna</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>DNA Profiling</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2015-09-13T09:47:17Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>




</rdf:RDF>
