<?xml version="1.0" encoding="utf-8" ?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:syn="http://purl.org/rss/1.0/modules/syndication/" xmlns="http://purl.org/rss/1.0/">
<channel rdf:about="https://cis-india.org/a2k/front-page/search_rss">
  <title>Access To Knowledge (A2K)</title>
  <link>https://cis-india.org</link>
  
  <description>These are the search results for the query, showing results 2071 to 2085.</description>
  
  
  
  
  <image rdf:resource="https://cis-india.org/logo.png"/>

  <items>
    <rdf:Seq>
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/the-statesman-smriti-sharma-vasudeva-march-14-2017-evms-how-transparent-is-the-indian-election-process"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/every-town-had-its-jio-dara"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/intermediary-liability-and-gender-based-violence"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/zack-whittaker-natasha-lomas-february-15-2019-tech-crunch-even-years-later-twitter-doesnt-delete-your-direct-messages"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/european-union-draft-report-admonishes-mass-surveillance"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/european-summer-school-on-internet-governance"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/vipul-kharbanda-december-23-2018-european-e-evidence-proposal-and-indian-law"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/ecj-rules-internet-search-engine-operator-responsible-for-processing-personal-data-published-by-third-parties"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/news/livemint-january-17-2014-moulishree-srivastava-elizabeth-roche-eu-parliament-slams-us-surveillance"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/about/policies/ethical-research-guidelines"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/ethical-issues-in-open-data"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/ethical-data-design-practices-in-the-ai-artificial-intelligence-age"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/events/essentials-of-building-internet-tools-for-inclusion"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/enlarging-the-small-print"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/dna-september-23-2015-amrita-madhukalya-encryption-policy-would-have-affected-emails-operating-systems-wifi"/>
        
    </rdf:Seq>
  </items>

</channel>


    <item rdf:about="https://cis-india.org/internet-governance/news/the-statesman-smriti-sharma-vasudeva-march-14-2017-evms-how-transparent-is-the-indian-election-process">
    <title>EVMs: How transparent is the Indian election process?</title>
    <link>https://cis-india.org/internet-governance/news/the-statesman-smriti-sharma-vasudeva-march-14-2017-evms-how-transparent-is-the-indian-election-process</link>
    <description>
        &lt;b&gt;Electronic Voting Machines (EVMs) have become a bone of contention after the results of the Assembly elections in five states were declared last Saturday and the BSP president Mayawati alleged tampering. The Congress party and the Aam Aadmi Party (AAP) have called for a probe into her allegation. Social media too is abuzz with messages and videos showing how the machines can be allegedly manipulated to sway the votes in favour of a particular candidate.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article by Smriti Sharma Vasudeva was &lt;a href="http://www.thestatesman.com/india/evms-how-transparent-is-the-indian-election-process-1489512231.html"&gt;published in the Statesman&lt;/a&gt; on March 14, 2017. Pranesh Prakash was quoted.&lt;/p&gt;
&lt;hr style="text-align: justify; " /&gt;
&lt;p style="text-align: justify; "&gt;Overnight, several videos surfaced on WhatsApp in which people can be seen explaining the "mechanism" for altering the votes polled for one candidate in another candidate's favour. Several similar posts and articles are doing the rounds on Facebook.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;BBC added fuel to the fire when it shared a 2010 article on how 'US "Scientists" hack India Electronic Machines'. The article details how scientists at a US university say they have developed a technique to hack into Indian electronic voting machines. While the article was posted on the BBC website a day after the election results were declared, it drew considerable flak from users on Facebook who criticised the website for its 'irresponsible' act of sharing an article with a "click bait" headline just to grab eyeballs.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Amid all this frenzy, the Election Commission of India has issued statements clarifying how the entire process is transparent and foolproof, and that tampering with the EVMs is far-fetched given the checks and balances in place. For instance, the EVMs undergo a process of randomisation, wherein which machine will go to which constituency and to which booth is not known to anyone till the last moment. Similarly, before polling starts, mock polling takes place in the presence of representatives of all the political parties; each of these machines is tested, a satisfactory report is generated, and only after that does polling begin.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;However, if experts are to be believed, all these checks and balances still do not ensure a foolproof system.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Pranesh Prakash, Policy Director for The Centre for Internet and Society, a non-profit organisation that undertakes interdisciplinary research on internet and digital technologies from policy and academic perspectives, said: "The Electronic Voting Machines used in India are the simplest, with no large operating system requirements and are not networked. Thus, from a software design perspective, these are really good and the chances of these being tampered with are bleaker. However it doesn't mean these are fool proof. Most of the developed countries do not trust these machines and these are definitely not secure enough for democratic elections.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;"While there are many advantages of using EVMs in the electoral process over the traditional ballot papers, still there are many ways in which one can tamper with these machines without any technical ingenuity. The best way is to make use of the EVMs and ensure that the Voter Verified Paper Audit Trail (VVPAT) are effectively utilised to make it an overall effective system".&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Recently, the Supreme Court had mandated that VVPAT machines should be used in all polls, and the Election Commission had accordingly installed VVPAT machines in several constituencies. However, unsure of the efficacy of this system, the Election Commission had itself raised apprehensions regarding the performance of the paper-trail machine, which gives a receipt to the voter verifying that the vote went in favour of the candidate against whose name the button was pressed on the electronic voting machine.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/the-statesman-smriti-sharma-vasudeva-march-14-2017-evms-how-transparent-is-the-indian-election-process'&gt;https://cis-india.org/internet-governance/news/the-statesman-smriti-sharma-vasudeva-march-14-2017-evms-how-transparent-is-the-indian-election-process&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>praskrishna</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Digital India</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2017-03-17T01:57:19Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/every-town-had-its-jio-dara">
    <title>Every Town had its Jio Dara</title>
    <link>https://cis-india.org/internet-governance/blog/every-town-had-its-jio-dara</link>
    <description>
        &lt;b&gt;In the hills of Darjeeling, residents facing an indefinite internet shutdown were thrown an unexpected lifeline in the form of 'Jio dara', a feeble signal from Sikkim towers that nevertheless kept a small line of communication open between the besieged towns in the region and the rest of the world.&lt;/b&gt;
        &lt;p class="normal" style="text-align: justify; "&gt;&lt;b&gt;Bangalore, Karnataka: &lt;/b&gt;Alvin Lama writes rock music in his downtime, and these days his songs are rather politically charged. The 100-day internet shutdown in Darjeeling during the Gorkhaland agitation in 2017 inspired his latest single, titled &lt;a href="https://www.facebook.com/Gsihm/videos/vb.1835066709/10207932050739205/?type=2&amp;amp;theater"&gt;Jio Dara&lt;/a&gt;. In Lama’s song, he tells his listeners, “Come let’s go to Jio Dara” where they can be free from the prison of the internet shutdown to send and receive messages from the outside world. “I am using that window of access to tell people about our struggle. It has a bit of an anti-administration message,” he says.&lt;/p&gt;
&lt;table class="plain"&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;th&gt;
&lt;p style="text-align: center; "&gt;&lt;img src="https://cis-india.org/home-images/WBJio.jpg/@@images/4adfc2eb-90c3-4660-8773-0787b2628ffe.jpeg" alt="WB Jio" class="image-inline" title="WB Jio" /&gt;&lt;/p&gt;
&lt;/th&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: center; "&gt;&lt;span class="discreet"&gt;View from Carmichael Ground, a Jio Dara spot (Picture Courtesy: Nisha Chettri, Caffeine and Copies)&lt;/span&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;
&lt;p class="normal" style="text-align: justify; "&gt;Jio Dara (‘dara’ meaning ‘hillock’), also alternatively called ‘Reliance gully’, was not always a specific place but a small window of opportunity during which a weak 2G signal could be accessed in the hills. Towns like Darjeeling and Kalimpong lie very close to the border of West Bengal, separated from their northern neighbour Sikkim by the river Rangeet; and often in the hills along the river bank, phones pick up faint signals from the mobile phone towers in Sikkim. For a population that was completely shut off from the outside world, even this thin, fragile lifeline was precious. “I was not here during the agitation but somehow would get information about what was happening in the hills from my family and friends through the Jio Dara,” Alvin says.&lt;/p&gt;
&lt;p class="normal" style="text-align: justify; "&gt;Alvin, also founder director &amp;amp; CEO of the Good Shepard Institute of Hospitality Management, is not the only musician to immortalise Jio Dara in song. Young student Saif Ali Khan and his friends also wrote and composed their own ode to this happy accident. “It was really born out of boredom,” he says. “My brother, my friends and I were sitting around the campus and chatting. Classes were cancelled due to the strike and our education was on hold. And we overheard a couple talking about where they were going to go for their date. Of course, we should go to Jio Dara, the girl said, and that led to an argument.”&lt;/p&gt;
&lt;p class="normal" style="text-align: justify; "&gt;This sparked off their &lt;a href="https://www.youtube.com/watch?v=ybewgPw_Ack"&gt;Jio Dara&lt;/a&gt; song, which was written, composed and recorded by Khan and his friends under their Firfiray Productions. A satirical take on the internet shutdown and how it has affected the lives of students in Darjeeling, the song plays out like a dialogue between two lovers and serves as a light-hearted look at a situation that was anything but.&lt;/p&gt;
&lt;p class="normal" style="text-align: justify; "&gt;For three months between June and September, the administration had shut down internet access in Darjeeling and in its surrounding hills. This prevented the outside world from hearing the voices of the Gorkhaland protesters but information still trickled out, as it is wont to do, through various sources, one of these being the Jio Dara.&lt;/p&gt;
&lt;p class="normal" style="text-align: justify; "&gt;How did this work? Reliance Jio had not long ago made a big splash in India’s telecom market with cheap unlimited data packs and lifetime validity deals, and many had switched to Jio to take advantage of this. This was what eventually gave Jio users the edge, helping them tap into the signal from the towers across the border. While it isn't clear whether signals from other networks were also available in these spots (accounts vary from there being no other networks at all to there being some that were even weaker than Jio), what's certain is that without the free internet that Jio subscribers enjoyed, access through other networks was not feasible after a point, because recharging your number at the local mobile shop wasn't an option anymore.&lt;/p&gt;
&lt;p class="normal" style="text-align: justify; "&gt;These hotspots used to vary, according to Lama. “The signal would be strong today, but the next day one might have to move a few hundred metres up or down till they connected with the network. So, you would go searching in the hills till you got a signal, and then the word would spread,” he says. People in Darjeeling were lucky in that their Jio Dara was inside town near the mall in Chowrasta, but it was not as convenient in Kalimpong. One had to travel a couple of kilometres from the city centre to Carmichael Ground, sometimes even further up the hill towards areas facing Sikkim. “People would get to know through word-of-mouth and the number of people there would snowball,” Lama tells us. People, young and old, would come to log in, even though the connection was patchy and slow, to talk about the events of the day, upload pictures, connect with family and friends and basically tell the world what really was happening in Darjeeling.&lt;/p&gt;
&lt;p class="normal" style="text-align: justify; "&gt;It became an unofficial symbol of resistance. Each town had its very own Jio Dara, and it transcended a mere physical location to become an idea. “Our habits changed after June 18, when the government undemocratically blocked the internet service in the hills,” writes Nisha Chettri, a journalist with the Statesman, in her blog ‘Caffeine and Copies’. Carmichael Ground in Kalimpong invariably became a meeting spot for all sorts of occasions – birthdays, dates, get-togethers. She says that some Jio users even shared their mobile hotspot with others so that everyone could use the internet.&lt;/p&gt;
&lt;p class="normal" style="text-align: justify; "&gt;Local journalists would file their stories and upload their pictures side by side with ordinary citizens updating their social media statuses. It helped journalists like the Telegraph’s Passan Yolmo to maintain a line of communication with his publishers. Most evenings he would connect to the Jio Dara to send across photographs from the day, as many as the feeble 2G connection would allow.&lt;/p&gt;
&lt;p class="normal" style="text-align: justify; "&gt;“I don’t know who first found this spot behind Chowrasta,” says Khan. Perched in the centre of the city and at a higher elevation than the rest, Chowrasta is a popular tourist destination in Darjeeling; so it couldn’t have been long before people stumbled onto this secret. “I accidentally discovered it one day when I walked past it and suddenly my phone started pinging and I received a bunch of texts on WhatsApp. I checked my phone and realised I was connected to Sikkim’s Jio network.”&lt;/p&gt;
&lt;p class="normal" style="text-align: justify; "&gt;Ayswarya Murthy is a Bangalore-based journalist and a member of &lt;a href="https://101reporters.com/"&gt;101Reporters.com&lt;/a&gt;, a pan-India network of grassroots reporters.&lt;/p&gt;
&lt;hr /&gt;
&lt;p&gt;Shutdown stories are the output of a collaboration between 101 Reporters and CIS with support from Facebook.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/every-town-had-its-jio-dara'&gt;https://cis-india.org/internet-governance/blog/every-town-had-its-jio-dara&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Ayswarya Murthy</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Shutdown</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    

   <dc:date>2017-12-21T16:24:52Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/intermediary-liability-and-gender-based-violence">
    <title>Event Report on Intermediary Liability and Gender Based Violence</title>
    <link>https://cis-india.org/internet-governance/blog/intermediary-liability-and-gender-based-violence</link>
    <description>
        &lt;b&gt;This report is a summary of the proceedings of the Roundtable Conference organized by the Centre for Internet and Society (CIS) at the Digital Citizen Summit, an annual summit organized by the Digital Empowerment Foundation. It was conducted at the India International Centre in New Delhi on November 1, 2018 from 11.30 a.m. to 12.30 p.m.&lt;/b&gt;
        
&lt;p&gt;With inputs from, and edited by, Ambika Tandon. Click here to download the &lt;a class="external-link" href="http://cis-india.org/internet-governance/files/intermediary-liability-and-gender-based-violence-report"&gt;PDF&lt;/a&gt;.&lt;/p&gt;
&lt;hr /&gt;
&lt;h2&gt;Introduction&lt;/h2&gt;
&lt;p&gt;&lt;strong&gt;Background&lt;/strong&gt;&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;The topic of discussion was intermediary liability and Gender Based Violence (GBV). The debate on GBV, globally and in India, has evolved in the past few years to include myriad forms of violence in online spaces. These range from violence native to the digital, such as identity theft, to extensions of traditional forms of violence, such as online harassment, cyberbullying, and cyberstalking&lt;a name="_ftnref1" href="#_ftn1"&gt;&lt;sup&gt;[1]&lt;/sup&gt;&lt;/a&gt;. Given the extent of personal data available online, cyber attacks have led to a variety of financial and personal harms.&lt;a name="_ftnref2" href="#_ftn2"&gt;&lt;sup&gt;[2]&lt;/sup&gt;&lt;/a&gt; Studies have explored the extent of psychological and even physical harm to victims, which has been found to have effects similar to violence in the physical world&lt;a name="_ftnref3" href="#_ftn3"&gt;&lt;sup&gt;[3]&lt;/sup&gt;&lt;/a&gt;. Despite this, technologically-facilitated violence is often ignored or trivialised. When present, redressal mechanisms are often inadequate, further exacerbating the effects of violence on victims.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;The Roundtable explored how intermediaries can help tackle gender based violence, and discussed attempts at making the Internet a safer place for women, which can ultimately help make it a gender equal environment. It also analyzed the key concerns of privacy and security, leading the conversation to how we can demand more from platforms for our protection and how best to regulate them.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;The roundtable had four female participants and one male participant from various civil society organisations working on rights in the digital space.&lt;/p&gt;
&lt;h2&gt;Roundtable Discussion&lt;/h2&gt;
&lt;h3&gt;Online Abuse&lt;/h3&gt;
&lt;p style="text-align: justify;"&gt;The discussion commenced with the acknowledgement that it is well documented that women and sexual minorities face a disproportionate level of violence in the digital space, as an extension/reproduction of physical space. GBV exists on a continuum from the physical and verbal to the technologically enabled, either partially or fully, with overflowing boundaries and deep interconnections between different kinds of violence. Some forms of traditional violence, such as harassment, stalking, bullying, and sex trafficking, extend themselves into the digital realm, while other forms are uniquely tech enabled, like doxxing and the morphing of imagery. Due to this, considerations of anonymity, privacy, and consent need to be re-thought in the context of tech enabled GBV. These come into play in a situation where the technological realm has largely been corporatised and functions under the imperative of treating the user and their data as the final product.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;It was noted early on that GBV online can be a misnomer because it can occur across a number of spaces, and the participants concentrated on laying down the specific contours of tech mediated or tech enabled violence. One of the discussants stated that the term GBV is not a useful one since it does not encompass everything that is talked about when referring to online abuse. The phenomenon that gets the most traction is trolling or abuse on social media. This is partly because it is the most visible people who are affected by it, and also because it is often the most difficult to treat under law. In a 2012 study by the Internet Democracy Project focusing on online verbal abuse on social media, every woman they interviewed started by asserting that she is not a victim. The challenge with using the GBV framework is that it positions the woman as a victim. Other incidents on social media, such as verbal abuse involving rape threats or death threats, especially when there is an indication that the perpetrator is aware of the physical location of the victim, need to be treated differently from, say, online trolling.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;Further, certain forms of violence, such as occurrences of ‘revenge porn’ or the non-consensual sharing of intimate images, including rape videos are easier to fit within the description of GBV. It is important to make these distinctions because the remedies then should be commensurate with perceived harm. It is not appropriate to club all of these together since the criminal threshold for each act is different. Whereas being called a “slut” or a “bitch” would not be enough for someone to be arrested, if a woman is called that repetitively by a large number of people the commensurate harm could be quite significant. Thus, using GBV as a broad term for all forms of violence ends up invisiblising certain forms of violence and prevents a more nuanced treatment of the discussion.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;In response to this, a participant highlighted the normalisation of gendered hate speech, to the extent that it is not even recognised as a form of hate speech. This lacuna in our law stems from the fact that we inherited our hate speech laws from a colonial era, where they were based on the grounds of incitement to violence, more so physical violence. As a result, we do not take the International Covenant on Civil and Political Rights (ICCPR) standard of incitement to discrimination. If the law were based on an incitement-to-discriminate point of view, then acts of trolling could come under hate speech. Even in the United Kingdom, where there is higher sentencing for gender based crime as compared to other markers of identity such as race, gender does not fall under the parameters of hate speech. This can also be attributed to the threshold at which criminalization kicks in for such acts.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;A significant aspect of online verbal abuse pointed out by a participant was that it does not affect all women equally. In a study, the Twitter accounts of 12 publicly visible women across the political spectrum were looked at for 2 weeks in early December, 2017. They were filtered against keywords and analyzed for abusive content. One Muslim woman in the study had extremely high levels of abuse, being consistently addressed as “Jihad man, Jihad didi or Jihad biwi”. According to the participant, she is also the least likely to get justice through the criminal system for such vitriol and as such, this disparity in the likelihood of facing online abuse and accessing official redressal mechanisms should be recognized. Another discussant reaffirmed the importance of making a distinction between online abuse against someone as opposed to gender based violence online where the threat itself is gendered.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;In a small ethnographic study with the Bangalore police undertaken by one of the participants, the police were asked for their opinion on the following situation: a woman voluntarily provides photos of herself during a relationship and, once the relationship is over, the man distributes them. Is there a cause for redressal?&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;Policemen responded that since she gave them voluntarily in the first instance, the burden of the consequences is now on her. So even in a feminist framework of consent and agency, where we have laws against acts of voyeurism and the publishing of photos of private parts, this is not being recognized by institutional response mechanisms.&lt;/p&gt;
&lt;h3&gt;Intermediary Liability&lt;/h3&gt;
&lt;p style="text-align: justify;"&gt;Private communications based intermediaries can be understood to be of two types: those that enable the carriage/transmission of communications and provide access to the internet, and those that host third party content. The latter have emerged as platforms that are central to the exercising of voice, the exchange of information and knowledge, and even the mobilisation of social movements. The norms and regulations around what constitutes gender based violence in this realm are then shaped not only by state regulations, but also by the content moderation standards of these intermediaries. Further, the kinds of preventive tools and tools providing redressal are controlled by these platforms. More than before, we are looking deeper into the role of these companies that function as intermediaries and control access to third party content without performing editorial functions.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;In the Intermediary Liability framework formulated in the United States in the 1990s, the intermediaries that were envisioned were not the intermediaries we have now. With time, the intermediary of today is able to access and possess your data while urging a certain kind of behaviour from you. There is then an intermediary design duty which is not currently accounted for by the law. Moreover, the law practices a one size fits all regime, whereas what could be more suitable is having approaches tailored to the offence. So for child pornography, a ‘removal when uploaded’ action using artificial intelligence or machine learning is appropriate, but a notice and takedown approach is better for other kinds of content takedown.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;Globally, another facet is that of safe harbour provisions for platforms. When intermediaries such as Google and Facebook were established, they were thought of as neutral pipes since they were not creating the content but only facilitating access. However, as they have scaled and as their role in the ecosystem has increased, they are now one of the intervention points for governments as gatekeepers of free speech. One needs to be careful in asking for an expansion of the role and responsibilities of platforms, because complementary to that we will also have to see that the frameworks regulating them are revisited. Additionally, would a similar standard be applicable to larger and smaller intermediaries, or do we need layers of distinction between their responsibilities? Internet platforms such as the GAFA (Google, Apple, Facebook and Amazon) wield exceptional power to dictate what discourse takes place, and this translates into the online and offline divide disappearing. Do we then hold these four intermediaries to a separate and higher standard? If not, then all small players will be held to stringent rules, disadvantaging their functioning and ultimately stifling innovation. Thus, regulation is definitely needed, but instead of a uniform one, one that’s layered and tailor-made to different situations and platform visibility levels could be more useful.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;Some participants shared the opinion that because these intermediaries are based in foreign countries and have their primary legal obligations there, the insulation works to the citizen’s benefit. It lends a layer of freedom of speech and expression that is not present in the substantive law, the rule of law framework, or the institutional culture in India.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;Child pornography is an area where platforms are taking a lot of responsibility. Google has spoken about how it has been using machine learning algorithms to block 40% of such content, and Microsoft is also working on a similar process. If we argue for more intervention from platforms, we simultaneously need to look at their machine learning algorithms. Concerns about how these algorithms are being deployed and, further, incorporated into the framework of controlling child pornography are relevant, since there is not much accountability and transparency regarding the same.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;Another fracture that has emerged from recent events is the divide between traditional forms of media and new media. Taking the example of rape victims and sexual harassment claims, there are strict rules regarding the kinds of details that can be disclosed and the manner in which this is to be done. In the Kathua rape case, for instance, the Delhi High Court sent a notice to Twitter and Facebook for revealing details, because there are norms around this even though they have not been applicable to platforms. Hence, there are certain regulations that apply to old media that have escaped the frameworks applicable to the new media, and at some level that gap needs to be bridged.&lt;/p&gt;
&lt;h3&gt;Role of Law&lt;/h3&gt;
&lt;p style="text-align: justify;"&gt;One of the participants brought up the question: what is the proper role of the law, and does it come first or last? In case of the latter, the burden then falls upon the kind of standard setting that we do as a society. The role of platforms as entities mediating the online environment was discussed, given the concerns that have been highlighted about this environment, especially for women. The third thing to be considered is whether we run the risk of enforcing patriarchal behaviour by doubling down on either of the two aforementioned factors. If legal standards are made too harsh, they may end up reinforcing a power structure that is essentially dominated by upper caste men, who comprise a majority of staff within law enforcement and the judiciary. Even though the subordinate judiciary does have mahila courts now, the application of the law seems to reify the position of the woman as the victim. This also brings up the question of who can become a victim within such frameworks, where selective bias, such as elements of chastity, comes into play as court functions are undertaken.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;An assessment of the way criminal law in India is used to stifle free speech was carried out in 2013 and repeated in 2018, illustrating how censorship law is used to stifle voices of minorities and people critical of the political establishment. Even though it is perhaps time to revisit the earlier conceptualizations of intermediaries as neutral pipes, it is concerning to look at the the court cases regarding safe harbour in India. Many of them are carried out with the ostensible objective of protecting women's rights. In &lt;em&gt;Kamlesh Vaswani V Union of India&lt;/em&gt;, the petition claims that porn is a threat to Indian women and culture, ignoring the reality that many women watch porn as well. Pornhub releases figures on viewership every year, and of the entirety of Indian subscribers one third are women. This is not taken into account in such petitions. In &lt;em&gt;Prajwala V Union of India,&lt;/em&gt; an NGO sent the Supreme Court a letter raising concerns about videos of sexual violence being distributed on the internet. The letter sought to bring attention to the existence of such videos, as well as their rampant circulation on online platforms. At some point in the proceedings, the Court wanted the intermediaries to use keywords to take down content and keeping aside poor implementation, the rationale behind such a move is problematic in itself. For instance, if you choose sex as one of those words then all sexual education will disappear from the Internet. There are many problems with court encouraged filtering systems like one where a system automatically tells you when a rape video goes up. The question arises of how will you distinguish between a video that was consensually made depicting sexual activities and a rape video. The narrow minded responses to the Sabu Mathew and Prajwala cases originate in the conservative culture regarding sexual activity prevalent in India.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;In a research project undertaken by one of the participants in the course of their work, they made a suggestion to include gender, sexuality and disability as grounds for hate speech while working with women’s rights activists and civil society organisations. This suggestion was not well received as they vehemently opposed more regulation. In their opinion, the laws that India has in place are not being upheld and creating new laws will not change if the implementation of legislation is flawed. For instance, even though the Supreme Court stuck down S.66A, Internet Freedom Foundation has earlier provided instances of its continued usage by police officers to file complaints.&lt;a name="_ftnref4" href="#_ftn4"&gt;&lt;sup&gt;&lt;sup&gt;[4]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Hate speech laws can be used to both ends, even though unlike in the US they do not determine whose speech they want to protect. Consequently, in the US a white supremacist gets as much protection as a Black Lives Matter activist but in India, that is not the case. The latest Law Commission Report on hate speech in India tries to make progress by incorporating the ICCPR view of incitement to discriminate and include dignity in the harms. It specifically speaks about hate speech against women saying that it does not always end up in violence but does result in a harm to dignity and standing in society. Often, protectionist forms of speech such as hate speech often end up hurting the people it aims to protect by reinforcing stereotypes.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;Point of View undertook a study where they looked at the use of S.67 in the Information Technology (IT) Act which criminalizes obscene speech when you use a medium covered by the IT, in which they found that the section was used to criminalize political speech. In many censorship cases, the people who those provisions benefit are the ones in power.&lt;a name="_ftnref5" href="#_ftn5"&gt;&lt;sup&gt;&lt;sup&gt;[5]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; For instance in S.67, obscenity provisions do not protect women's rights, they protect morality of society. Even though these are done in the name of protecting women, when a woman herself decides that she wants to publish a revealing picture of herself online, it is disallowed by the law. That kind of control of sexuality is part of a larger patriarchal framework which does not support women's rights or recognise her sexuality. However, under Indian law, there are quite a few robust provisions for image based abuse, and there is some recognition of women in particular being vulnerable to it. S.66A of the IT Act specifically recognizes that it is a criminal activity to share images of someone’s private parts without their consent. This then also encompasses instances of ‘revenge porn’. That provision has been in place in India since 2008, in contrast to the US where half the states still do not have such a provision. Certain kinds of vulnerability have adequate recognition in the law, thus one should be wary of calls of censorship and lowering the standards for criminalizing speech.&lt;/p&gt;
&lt;h3&gt;Non-legal interventions&lt;/h3&gt;
&lt;p style="text-align: justify;"&gt;This section centres around the discussions of redressal mechanisms that can be used to address some of the forms of violence which do not emanate from the law. All of the participants emphasized the importance of creating safe spaces through non-legal interventions. It was debated whether there is a need to always approach the law or if it is possible to categorize forms of online violence according to the gravity of the violation committed. These can be in the form of community solutions where law is treated as the last resort. For instance, there was support for using community tools such as ‘feminist trollback’ where humor can be used to troll the trolls. Trolls feed on the fear of being trolled, so the harm can be mitigated by using community initiatives wherein the target can respond to the trolls with the help of other people in the community. It was reiterated that non technical and legal interventions are needed not only from the perspective of power relations within these spaces but also access to the spaces in the first place. Accordingly, the government should work on initiatives that get more women online and focus on policies that makes smartphones and data services more accessible. This would also be a good method to increase the safety of women and benefit from the strength in numbers.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;In cases of the non-consensual sharing of intimate images, law can be the primary forum but in cases of trolling and other social media abuse, the question was raised - should we enhance the role of the intermediary platforms? Being the first point of intervention, their responsibility should be more than it currently is. However this would require them to act in the nature of police or judiciary and necessitate an examination of their algorithms. A large proportion of the designers of such algorithms are white males, which increases the possibility of their biases against women of colour for instance, to feed into the algorithms and reinforce a power structure that lacks accountability.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;Participants questioned the lack of privacy in design with the example in mind being of how registrars do not make domain owner details private by default. Users have to pay an additional fee for not exposing their details to public and the notion of having to pay for privacy is unsettling. There is no information being provided during the purchasing of the domain name about the privacy feature as well. It was acknowledged that for audit and law enforcement purposes it is imperative to have the information of the owner of a domain name and their details since in cases of websites selling fake medicines, arms or hosting child pornography. Thus, it boils down to the kind of information necessary for law enforcement. Global domain name rules also impact privacy on the national level. The process of ascertaining the suitability and necessity of different kinds of information excludes ordinary citizens since all the consultations take place between the regulatory authority and the state. This makes it difficult for citizens to participate and contribute to this space without government approval.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;Issues were flagged against community standards in that the violence that occurs to women is also because the harms are not equal for all. Further, some users are targeted specifically because of the community they come from or the views they have. Often also because, they represent a ‘type’ of a woman that does not adhere to the ‘ideal’ of a woman held by the perpetrator. Unfortunately community standards do not recognise differential harms towards certain communities in India or globally. Twitter, for example, regularly engages in shadow banning and targets people who do not conform to the moral views prevalent in that society where the platform is engaging in censorship. We know these instances occur only when our community members notice and notify us of the same. There is a certain amount of labor that the community has already put in flagging instances of these violations to the intermediary which also needs recognition. In this situation, Twitter is disproportionately handling how it engages with the two entities in question. Community standards could thus become a double edged sword without adding additional protections for certain disadvantaged communities.&lt;/p&gt;
&lt;h3&gt;Conclusion&lt;/h3&gt;
&lt;p style="text-align: justify;"&gt;Currently, intermediaries are considered neutral pipes through which content flows and hence have no liability as long as they do not perform editorial functions. This has also been useful in ensuring that the freedom of speech is not harmed. However, given their potential ability to remedy this problem, as well as the fact that intermediaries sometimes benefit financially from such activities, it is important to look at the intermediaries’ responsibility in addressing these instances of violence. Governments across the world have taken different approaches to this question&lt;a name="_ftnref6" href="#_ftn6"&gt;&lt;sup&gt;&lt;sup&gt;[6]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;. Models, such as in the US, where intermediaries have been solely responsible to institute redressal mechanisms have proven to be ineffectual. On the other hand, in Thailand, where intermediaries are held primarily liable for content, the monitoring of content has led to several free speech harms.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;People are increasingly looking at other forms of social intervention to combat online abuse since technological and legal ones do not completely address and resolve the myriad issues emanating from this umbrella term. There is also a need to make the law gender sensitive as well as improving the execution of laws at ground level, possibly through sensitisation of law enforcement authorities. Gender based violence as a catchall phrase does not do justice to the full spectrum of experiences that victims face, especially women and sexual minorities.&amp;nbsp; Often these do not attract criminal punishment given the restricted framework of the current law and need to be seen through the prism of hate speech to strengthen these provisions.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;Some actions within GBV receive more attention than others and as a consequence, these are the ones platforms and governments are most concerned with regulating. Considerations of free speech and censorship and the role of intermediaries in being the flag bearers of either has translated into growing calls for greater responsibility to be taken by these players. The roundtable raised some key concerns regarding revisiting intermediary liability within the context of the scale of the platforms, their content moderation policies and machine learning algorithms.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify;"&gt;&lt;a name="_ftn1" href="#_ftnref1"&gt;&lt;sup&gt;&lt;sup&gt;[1]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;em&gt;See &lt;/em&gt;Khalil Goga, “How to tackle gender-based violence online”, World Economic Forum, 18 February 2015, &amp;lt;&lt;a href="https://www.weforum.org/agenda/2015/02/how-to-tackle-gender-based-violence-online/"&gt;https://www.weforum.org/agenda/2015/02/how-to-tackle-gender-based-violence-online/&lt;/a&gt;&amp;gt;. &lt;em&gt;See also&lt;/em&gt; Shiromi Pinto, “What is online violence and abuse against women?”, 20 November 2017, Amnest International, &amp;lt;&lt;a href="https://www.amnesty.org/en/latest/campaigns/2017/11/what-is-online-violence-and-abuse-against-women/"&gt;https://www.amnesty.org/en/latest/campaigns/2017/11/what-&lt;/a&gt;&lt;a href="https://www.amnesty.org/en/latest/campaigns/2017/11/what-is-online-violence-and-abuse-against-women/"&gt;is-online-violence-and-abuse-against-women/&lt;/a&gt;&amp;gt;.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;&lt;a name="_ftn2" href="#_ftnref2"&gt;&lt;sup&gt;&lt;sup&gt;[2]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Nidhi Tandon, et. al., “Cyber Violence Against Women and Girls: A worldwide wake up call”, UN Broadband Commission for Digital Development Working Group on Broadband and Gender, &amp;lt;&lt;a href="http://www.unesco.org/new/fileadmin/MULTIMEDIA/HQ/CI/CI/images/wsis/GenderReport2015FINAL.pdf"&gt;http://www.unesco.org/new/fileadmin/MULTIMEDIA/HQ/CI/CI/images/wsis/GenderReport2015FINAL.pdf&lt;/a&gt;&amp;gt;&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;&lt;a name="_ftn3" href="#_ftnref3"&gt;&lt;sup&gt;&lt;sup&gt;[3]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;em&gt;See&lt;/em&gt; Azmina Dhrodia, “Unsocial Media: The Real Toll of Online Abuse against Women”, Amnesty Global Insights Blog, &amp;lt;&lt;a href="https://medium.com/amnesty-insights/unsocial-media-the-real-toll-of-online-abuse-against-women-37134ddab3f4"&gt;https://medium.com/amnesty-insights/unsocial-media-the-real-toll-of-online-abuse-against-women-37134ddab3f4&lt;/a&gt;&amp;gt;&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;&lt;a name="_ftn4" href="#_ftnref4"&gt;&lt;sup&gt;&lt;sup&gt;[4]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;em&gt;See&lt;/em&gt; Abhinav Sekhri and Apar Gupta, “Section 66A and other legal zombies”, Internet Freedom Foundation Blog, &amp;lt;https://internetfreedom.in/66a-zombie/?&lt;/p&gt;
&lt;p&gt;&lt;a name="_ftn5" href="#_ftnref5"&gt;&lt;sup&gt;&lt;sup&gt;[5]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; See Bishakha Datta “Guavas and Genitals”, Point of View &amp;lt;https://itforchange.net/e-vaw/wp-content/uploads/2018/01/Smita_Vanniyar.pdf&amp;gt;&lt;/p&gt;
&lt;p&gt;&lt;a name="_ftn6" href="#_ftnref6"&gt;&lt;sup&gt;&lt;sup&gt;[6]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; ‘Examining Technology-Mediated Violence Against Women Through a Feminist Framework: Towards appropriate legal-institutional responses in India’, Gurumurthy et al., January 2018.&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/intermediary-liability-and-gender-based-violence'&gt;https://cis-india.org/internet-governance/blog/intermediary-liability-and-gender-based-violence&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>akriti</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Gender</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    

   <dc:date>2018-12-21T07:16:41Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/zack-whittaker-natasha-lomas-february-15-2019-tech-crunch-even-years-later-twitter-doesnt-delete-your-direct-messages">
    <title>Even years later, Twitter doesn't delete your direct messages</title>
    <link>https://cis-india.org/internet-governance/news/zack-whittaker-natasha-lomas-february-15-2019-tech-crunch-even-years-later-twitter-doesnt-delete-your-direct-messages</link>
    <description>
&lt;b&gt;When does “delete” really mean delete? Not always, or even at all, if you’re Twitter.&lt;/b&gt;
        &lt;p&gt;The blog post by Zack Whittaker and Natasha Lomas was published in &lt;a class="external-link" href="https://techcrunch.com/2019/02/15/twitter-direct-messages/"&gt;Tech Crunch&lt;/a&gt; on February 15, 2019. Karan Saini was quoted.&lt;/p&gt;
&lt;hr /&gt;
&lt;p&gt;Twitter retains direct messages for years, including messages you and others have deleted, but also data sent to and from accounts that have been deactivated and suspended, according to security researcher Karan Saini.&lt;/p&gt;
&lt;p&gt;Saini found years-old messages in a file from an archive of his data obtained through the website from accounts that were no longer on Twitter. He also reported a similar bug, found a year earlier but not disclosed until now, that allowed him to use a since-deprecated API to retrieve direct messages even after a message was deleted from both the sender and the recipient — though, the bug wasn’t able to retrieve messages from suspended accounts.&lt;/p&gt;
&lt;p&gt;Saini told TechCrunch that he had “concerns” that the data was retained by Twitter for so long.&lt;/p&gt;
&lt;p&gt;Direct messages &lt;a href="https://www.cnet.com/how-to/how-to-unsend-twitter-direct-messages/"&gt;once let users “unsend” messages&lt;/a&gt; from someone else’s inbox, simply by deleting it from their own. Twitter changed this years ago, and now only allows a user to delete messages from their account. “Others in the conversation will still be able to see direct messages or conversations that you have deleted,” Twitter says in &lt;a href="https://help.twitter.com/en/using-twitter/direct-messages"&gt;a help page&lt;/a&gt;. Twitter also says in its &lt;a href="https://twitter.com/en/privacy"&gt;privacy policy&lt;/a&gt; that anyone wanting to leave the service can have their account “deactivated and then deleted.” After a 30-day grace period, the account disappears, along with its data.&lt;/p&gt;
&lt;p&gt;But, in our tests, we could recover direct messages from years ago — including old messages that had since been lost to suspended or deleted accounts. By downloading &lt;a href="https://twitter.com/settings/your_twitter_data"&gt;your account’s data&lt;/a&gt;, it’s possible to download all of the data Twitter stores on you.&lt;/p&gt;
&lt;p&gt;&lt;img src="https://cis-india.org/home-images/Twitter.png/@@images/40867bd2-2284-4c9c-b42f-fb7a500b1c92.png" alt="Twitter" class="image-inline" title="Twitter" /&gt;&lt;/p&gt;
&lt;p&gt;A conversation, dated March 2016, with a suspended Twitter account was still retrievable today (Image: TechCrunch)&lt;/p&gt;
&lt;p&gt;Saini says this is a “functional bug” rather than a security flaw, but argued that the bug allows anyone a “clear bypass” of Twitter mechanisms to prevent access to suspended or deactivated accounts.&lt;/p&gt;
&lt;p&gt;But it’s also a privacy matter, and a reminder that “delete” doesn’t mean delete — especially with your direct messages. That can open up users, particularly high-risk accounts like journalists and activists, to government data demands that call for data from years earlier.&lt;/p&gt;
&lt;p&gt;That’s despite &lt;a href="https://help.twitter.com/en/rules-and-policies/twitter-law-enforcement-support"&gt;Twitter’s claim&lt;/a&gt; that once an account has been deactivated, there is “a very brief period in which we may be able to access account information, including tweets,” to law enforcement.&lt;/p&gt;
&lt;p&gt;A Twitter spokesperson said the company was “looking into this further to ensure we have considered the entire scope of the issue.”&lt;/p&gt;
&lt;p&gt;Retaining direct messages for years may put the company in a legal grey area amid Europe’s new data protection laws, which allow users to demand that a company deletes their data.&lt;/p&gt;
&lt;p&gt;Neil Brown, a telecoms, tech and internet lawyer at &lt;a href="https://decoded.legal/"&gt;U.K. law firm Decoded Legal&lt;/a&gt;, said there’s “no formality at all” to how a user can ask for their data to be deleted. Any request from a user to delete their data that’s directly communicated to the company “is a valid exercise” of a user’s rights, he said.&lt;/p&gt;
&lt;p&gt;Companies can be fined up to four percent of their annual turnover for violating GDPR rules.&lt;/p&gt;
&lt;p&gt;“A delete button is perhaps a different matter, as it is not obvious that ‘delete’ means the same as ‘exercise my right of erasure’,” said Brown. Given that there’s no case law yet under the new General Data Protection Regulation regime, it will be up to the courts to decide, he said.&lt;/p&gt;
&lt;p&gt;When asked if Twitter thinks that consent to retain direct messages is withdrawn when a message or account is deleted, Twitter’s spokesperson had “nothing further” to add.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/zack-whittaker-natasha-lomas-february-15-2019-tech-crunch-even-years-later-twitter-doesnt-delete-your-direct-messages'&gt;https://cis-india.org/internet-governance/news/zack-whittaker-natasha-lomas-february-15-2019-tech-crunch-even-years-later-twitter-doesnt-delete-your-direct-messages&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Zack Whittaker and Natasha Lomas</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    

   <dc:date>2019-02-18T14:17:54Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/european-union-draft-report-admonishes-mass-surveillance">
    <title>European Union Draft Report Admonishes Mass Surveillance, Calls for Stricter Data Protection and Privacy Laws</title>
    <link>https://cis-india.org/internet-governance/blog/european-union-draft-report-admonishes-mass-surveillance</link>
    <description>
        &lt;b&gt;Ever since the release of the “Snowden files”, the secret documents evidencing the massive scale of surveillance undertaken by America’s National Security Agency and publically released by whistle-blower Edward Snowden, surveillance in the digital age has come to the fore of the global debate on internet governance and privacy.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The Committee on Civil Liberties, Justice and Home Affairs of the European Parliament in its draft report on global surveillance has issued a scathing indictment of the activities of the NSA and its counterparts in other member nations and is a welcome stance taken by an international body that is crucial to the fight against surveillance.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The "European Parliament &lt;a class="external-link" href="http://www.europarl.europa.eu/sides/getDoc.do?pubRef=-//EP//NONSGML%2BCOMPARL%2BPE-526.085%2B02%2BDOC%2BPDF%2BV0//EN"&gt;Draft Report&lt;/a&gt; on the US NSA surveillance programme, surveillance bodies in various Member States and their impact on EU citizens’ fundamental rights and on transatlantic cooperation in Justice and Home Affairs" released on the 8&lt;sup&gt;th&lt;/sup&gt; of January, 2014, comprehensively details and critiques the mass surveillance being undertaken by government agencies in the USA as well as within the EU, from a human rights and privacy perspective. The report examines the extent to which surveillance systems are employed by the USA and EU member-states, and declares these systems in their current avatars to be unlawful and in breach of international obligations and fundamental constitutional rights including &lt;i&gt;"the freedom of expression, of the press, of thought, of conscience, of religion and of association, private life, data protection, as well as the right to an effective remedy, the presumption of innocence and the right to a fair trial and non-discrimination"&lt;/i&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Furthermore, the report points to the erosion of trust between the EU and the US as well as amongst member states as an outcome of such secret surveillance, and criticises and calls for a suspension of the data-sharing and transfer agreements like the Terrorist Finance Tracking Program (TFTP), which share personal information about EU citizens with the United States, after examining the inadequacy of the US Safe Harbour Privacy principles in ensuring the security of such information.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;After considering the secret and unregulated nature of these programmes, the report points to the need of restricting surveillance systems and criticizes the lack of adequate data protection laws and privacy laws which adhere to basic principles such as necessity, proportionality and legality.. It also questions the underlying motives of these programmes as mere security-tools and points to the possible existence of political and economic motives behind their deployment. Recognizing the pitfalls of surveillance and the terrible potential for misuse, the report "&lt;i&gt;condemns in the strongest possible terms the vast, systemic, blanket collection of the personal data of innocent people, often comprising intimate personal information; emphasises that the systems of mass, indiscriminate surveillance by intelligence services constitute a serious interference with the fundamental rights of citizens; stresses that privacy is not a luxury right, but that it is the foundation stone of a free and democratic society; points out, furthermore, that mass surveillance has potentially severe effects on the freedom of the press, thought and speech, as well as a significant potential for abuse of the information gathered against political adversaries."&lt;/i&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Amongst the recommendations in the 51-page report are calls for a prohibition of mass surveillance and bulk data collection, and an overhaul of the existing systems of data-protection across the European Union and in the US to recognize and strengthen the right to privacy of their citizens, as well as the implementation of democratic oversight mechanisms to check security and intelligence agencies. It also calls for a review of data-transfer programmes and ensuring that standards of privacy and other fundamental rights under the European constitution are met. The committee sets out a 7-point plan of action, termed the European Digital Habeus Corpus for Protecting Privacy, including &lt;a class="external-link" href="http://www.europarl.europa.eu/news/en/news-room/content/20130502BKG07917/html/QA-on-EU-data-protection-reform"&gt;adopting the Data Protection Package&lt;/a&gt;, suspending data transfers to the US until a more comprehensive data protection regime is through an Umbrella Agreement, enhancing fundamental freedoms of expression and speech, particularly for whistleblowers, developing a European Strategy for IT independence and developing the EU as a reference player for democratic and neutral governance of the internet.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Though this draft report has no binding legal value as yet, the scathing criticism has assisted in calling to the attention of the global community the complex issues of internet governance and privacy and surveillance, and generated debate and discourse around the need for an overhaul of the current system. The recent decision of the US government to ‘democratize’ the internet by handing control of the DNS root zone to an international body, and thereby relinquishing a large part of its means of controlling the internet, is just one example of the systemic change &lt;a class="external-link" href="http://arstechnica.com/tech-policy/2014/03/in-sudden-announcement-us-to-give-up-control-of-dns-root-zone/"&gt;that this debate is generating&lt;/a&gt;.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/european-union-draft-report-admonishes-mass-surveillance'&gt;https://cis-india.org/internet-governance/blog/european-union-draft-report-admonishes-mass-surveillance&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>divij</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2014-09-30T08:52:45Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/european-summer-school-on-internet-governance">
    <title>European Summer School on Internet Governance</title>
    <link>https://cis-india.org/internet-governance/news/european-summer-school-on-internet-governance</link>
    <description>
        &lt;b&gt;The 13th European Summer School on Internet Governance was held at Meissen in Germany from 13 - 20 July 2019. Akriti Bopanna attended the school. The event was organized by EuroSSIG. &lt;/b&gt;
        &lt;p&gt;More information on the event can be &lt;a class="external-link" href="https://eurossig.eu/eurossig/2019-edition/programme-2019/"&gt;accessed on this page&lt;/a&gt;.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/european-summer-school-on-internet-governance'&gt;https://cis-india.org/internet-governance/news/european-summer-school-on-internet-governance&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Admin</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Cyber Security</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Internet Freedom</dc:subject>
    

   <dc:date>2019-07-23T00:30:15Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/vipul-kharbanda-december-23-2018-european-e-evidence-proposal-and-indian-law">
    <title>European E-Evidence Proposal and Indian Law</title>
    <link>https://cis-india.org/internet-governance/blog/vipul-kharbanda-december-23-2018-european-e-evidence-proposal-and-indian-law</link>
    <description>
        &lt;b&gt;In April of 2018, the European Union issued the proposal for a new regime dealing with cross border sharing of data and information by issuing two draft instruments, an E-evidence Regulation (“Regulation”) and an E-evidence Directive (“Directive”), (together the “E-evidence Proposal”). The Regulation is a direction to states to put in place the proper legislative and regulatory machinery for the implementation of this regime while the Directive requires the states to enact laws governing service providers so that they would comply with the proposed regime.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The main feature of the E-evidence Proposal is twofold: (i) establishment of a legal regime whereunder competent authorities can issue European Production Orders (&lt;b&gt;EPOs&lt;/b&gt;) and European Preservation Orders (&lt;b&gt;EPROs&lt;/b&gt;) to entities in any other EU member country (together the “&lt;b&gt;Data Orders&lt;/b&gt;”); and (ii) an obligation on service providers offering services in any of the EU member countries to designate legal representatives who will be responsible for receiving the Data Orders, irrespective of whether such entity has an actual physical establishment in any EU member country.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In this article we will briefly discuss the framework that has been proposed under the two instruments and then discuss how service providers based in India whose services are also available in Europe would be affected by these proposals. The authors would like to make it clear that this article is not intended to be an analysis of the E-evidence Proposal and therefore shall not attempt to bring out the shortcomings of the proposed European regime, except insofar as such shortcomings may affect the service providers located in India being discussed in the second part of the article.&lt;/p&gt;
&lt;p&gt;&lt;b&gt;Part I - E-evidence Directive and Regulation &lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The E-evidence Proposal introduces the concept of binding EPOs and EPROs. Both Data Orders need to be issued or validated by a judicial authority in the issuing EU member country. A Data Order can be issued to seek preservation or production of data that is stored by a service provider located in another jurisdiction and that is necessary as evidence in criminal investigations or a criminal proceeding. Such Data Orders may only be issued if a similar measure is available for the same criminal offence in a comparable domestic situation in the issuing country. Both Data Orders can be served on entities offering services such as electronic communication services, social networks, online marketplaces, other hosting service providers and providers of internet infrastructure such as IP address and domain name registries. Thus companies such as Big Rock (domain name registry), Ferns n Petals (online marketplace providing services in Europe), Hike (social networking and chatting), etc. or any website which has a subscription based model and allows access to subscribers in Europe would potentially be covered by the E-evidence Proposal. The EPRO, similarly to the EPO, is addressed to the legal representative outside of the issuing country’s jurisdiction to preserve the data in view of a subsequent request to produce such data, which request may be issued through MLA channels in case of third countries or via a European Investigation Order (EIO) between EU member countries. 
Unlike surveillance measures or data retention obligations set out by law, which are not provided for by this proposal, the EPRO is an order issued or validated by a judicial authority in a concrete criminal proceeding after an individual evaluation of the proportionality and necessity in every single case.&lt;a href="#_ftn1" name="_ftnref1"&gt;&lt;sup&gt;&lt;sup&gt;[1]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Like the EPO, it refers to the specific known or unknown perpetrators of a criminal offence that has already taken place. The EPRO only allows preserving data that is already stored at the time of receipt of the order, not the access to data at a future point in time after the receipt of the EPRO.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;While EPOs to produce subscriber data&lt;a href="#_ftn2" name="_ftnref2"&gt;&lt;sup&gt;&lt;sup&gt;[2]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; and access data&lt;a href="#_ftn3" name="_ftnref3"&gt;&lt;sup&gt;&lt;sup&gt;[3]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; can be issued for any criminal offence an EPO for content data&lt;a href="#_ftn4" name="_ftnref4"&gt;&lt;sup&gt;&lt;sup&gt;[4]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; and transactional data&lt;a href="#_ftn5" name="_ftnref5"&gt;&lt;sup&gt;&lt;sup&gt;[5]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; may only be issued by a judge, a court or an investigating judge competent in the case. In case the EPO is issued by any other authority (which is competent to issue such an order in the issuing country), such an EPO has to be validated by a judge, a court or an investigating judge. In case of an EPO for subscriber data and access data, the EPO may also be validated by a prosecutor in the issuing country.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;To reduce obstacles to the enforcement of the EPOs, the Directive makes it mandatory for service providers to designate a legal representative in the European Union to receive, comply with and enforce Data Orders. The obligation of designating a legal representative for all service providers that are operating in the European Union would ensure that there is always a clear addressee of orders aiming at gathering evidence in criminal proceedings. This would in turn make it easier for service providers to comply with those orders, as the legal representative would be responsible for receiving, complying with and enforcing those orders on behalf of the service provider.&lt;/p&gt;
&lt;p&gt;&lt;i&gt;&lt;span&gt;Grounds on which EPOs can be issued&lt;/span&gt;&lt;/i&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The grounds on which Data Orders may be issued are contained in Articles 5 and 6 of the Regulation which makes it very clear that a Data Order may only be issued in a case if it is necessary and proportionate for the purposes of a criminal proceeding. The Regulation further specifies that an EPO may only be issued by a member country if a similar domestic order could be issued by the issuing state in a comparable situation. By using this device of linking the grounds to domestic law, the Regulation tries to skirt around the thorny issue of when and on what basis an EPO may be issued. The Regulation also assigns greater weight (in terms of privacy) to transactional and content data as opposed to subscriber and access data and subjects the production and preservation of the former to stricter requirements. Therefore while Data Orders for access and subscriber data may be issued for any criminal offence, orders for transactional and content data can only be issued in case of criminal offences providing for a maximum punishment of atleast 3 years and above. In addition to that EPOs for producing transactional or content data can also be issued for offences specifically listed in Article 5(4) of the Regulation. These offences have been specifically provided for since evidence for such cases would typically be available mostly only in electronic form. This is the justification for the application of the Regulation also in cases where the maximum custodial sentence is less than three years, otherwise it would become extremely difficult to secure convictions in those offences.&lt;a href="#_ftn6" name="_ftnref6"&gt;&lt;sup&gt;&lt;sup&gt;[6]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Regulation also requires the issuing authority to take into account potential immunities and privileges under the law of the member country in which the service provider is being served the EPO, as well as any impact the EPO may have on fundamental interests of that member country such as national security and defence. The aim of this provision is to ensure that such immunities and privileges which protect the data sought are respected, in particular where they provide for a higher protection than the law of the issuing member country. In such situations the issuing authority “has to seek clarification before issuing the European Production Order, including by consulting the competent authorities of the Member State concerned, either directly or via Eurojust or the European Judicial Network.”&lt;/p&gt;
&lt;p&gt;&lt;i&gt;&lt;span&gt;Grounds to Challenge EPOs&lt;/span&gt;&lt;/i&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Service Providers have been given the option to object to Data Orders on certain limited grounds specified in the Regulation such as, if it was not issued by a proper issuing authority, if the provider cannot comply because of a &lt;i&gt;de facto&lt;/i&gt; impossibility or &lt;i&gt;force majeure&lt;/i&gt;, if the data requested is not stored with the service provider or pertains to a person who is not the customer of the service provider.&lt;a href="#_ftn7" name="_ftnref7"&gt;&lt;sup&gt;&lt;sup&gt;[7]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; In all such cases the service provider has to inform the issuing authority of the reasons for the inability to provide the information in the specified form. Further, in the event that the service provider refuses to provide the information on the grounds that it is apparent that the EPO “manifestly violates” the Charter of Fundamental Rights of the European Union or is “manifestly abusive”, the service provider shall send the information in specified Form to the competent authority in the member state in which the Order has been received. The competent authority shall then seek clarification from the issuing authority through Eurojust or via the European Judicial Network.&lt;a href="#_ftn8" name="_ftnref8"&gt;&lt;sup&gt;&lt;sup&gt;[8]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;If the issuing authority is not satisfied by the reasons given and the service provider still refuses to provide the information requested, the issuing authority may transfer the EPO Certificate along with the reasons given by the service provider for non compliance, to the enforcing authority in the addressee country. The enforcing authority shall then proceed to enforce the Order, unless it considers that the data concerned is protected by an immunity or privilege under its national law or its disclosure may impact its fundamental interests such as national security and defence; or the data cannot be provided due to one of the following reasons:&lt;/p&gt;
&lt;p&gt;(a) the European Production Order has not been issued or validated by an issuing authority as provided for in Article 4;&lt;/p&gt;
&lt;p&gt;(b) the European Production Order has not been issued for an offence provided for by Article 5(4);&lt;/p&gt;
&lt;p&gt;(c) the addressee could not comply with the EPOC because of de facto impossibility or force majeure, or because the EPOC contains manifest errors;&lt;/p&gt;
&lt;p&gt;(d) the European Production Order does not concern data stored by or on behalf of the service provider at the time of receipt of EPOC;&lt;/p&gt;
&lt;p&gt;(e) the service is not covered by this Regulation;&lt;/p&gt;
&lt;p&gt;(f) based on the sole information contained in the EPOC, it is apparent that it manifestly violates the Charter or that it is manifestly abusive.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In addition to the above mechanism the service provider may refuse to comply with an EPO on the ground that disclosure would force it to violate a third-country law that either protects “the fundamental rights of the individuals concerned” or “the fundamental interests of the third country related to national security or defence.” Where a provider raises such a challenge, issuing authorities can request a review of the order by a court in the member country. If the court concludes that a conflict as claimed by the service provider exists, the court shall notify authorities in the third-party country and if that third-party country objects to execution of the EPO, the court must set it aside.&lt;a href="#_ftn9" name="_ftnref9"&gt;&lt;sup&gt;&lt;sup&gt;[9]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;A service provider may also refuse to comply with an order because it would force the service provider to violate a third-country law that protects interests &lt;i&gt;other than&lt;/i&gt; fundamental rights or national security and defense. In such cases, the Regulation provides that the same procedure be followed as in case of law protecting fundamental rights or national security and defense, except that in this case the court, rather than notifying the foreign authorities, shall itself conduct a detailed analysis of the facts and circumstances to decide whether to enforce the order.&lt;a href="#_ftn10" name="_ftnref10"&gt;&lt;sup&gt;&lt;sup&gt;[10]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;&lt;i&gt;&lt;span&gt;Service Provider “Offering Services in the Union”&lt;/span&gt;&lt;/i&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;As is clear from the discussion above, the proposed regime puts an obligation on service providers offering services in the Union to designate a legal representative in the European Union, whether the service provider is physically located in the European Union or not. This appears to be a fairly onerous obligation for small technology companies which may involve a significant cost to appoint and maintain a legal representative in the European Union, especially if the service provider is not located in the EU. Therefore the question arises as to which service providers would be covered by this obligation and the answer to that question lies in the definitions of the terms “service provider” and “offering services in the Union”.&lt;/p&gt;
&lt;p&gt;The term service provider has been defined in Article 2(2) of the Directive as follows:&lt;/p&gt;
&lt;p&gt;“‘service provider’ means any natural or legal person that provides one or more of the following categories of services:&lt;/p&gt;
&lt;p&gt;(a) electronic communications service as defined in Article 2(4) of [Directive establishing the European Electronic Communications Code];&lt;a href="#_ftn11" name="_ftnref11"&gt;&lt;sup&gt;&lt;sup&gt;[11]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;(b) information society services as defined in point (b) of Article 1(1) of Directive (EU) 2015/1535 of the European Parliament and of the Council&lt;a href="#_ftn12" name="_ftnref12"&gt;&lt;sup&gt;&lt;sup&gt;[12]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; for which the storage of data is a defining component of the service provided to the user, including social networks, online marketplaces facilitating transactions between their users, and other hosting service providers;&lt;/p&gt;
&lt;p&gt;(c) internet domain name and IP numbering services such as IP address providers, domain name registries, domain name registrars and related privacy and proxy services;”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Thus broadly speaking the service providers covered by the Regulation would include providers of electronic communication services, social networks, online marketplaces, other hosting service providers and providers of internet infrastructure such as IP address and domain name registries, or on their legal representatives where they exist. An important qualification that has been added in the definition is that it covers only those services where “storage of data is a defining component of the service”. Therefore, services for which the storage of data is not a defining component are not covered by the proposal. The Regulation also recognizes that most services delivered by providers involve some kind of storage of data, especially where they are delivered online at a distance; and therefore it specifically provides that services for which the storage of data is not a &lt;i&gt;main characteristic&lt;/i&gt; and is thus only of an ancillary nature would not be covered, including legal, architectural, engineering and accounting services provided online at a distance.&lt;a href="#_ftn13" name="_ftnref13"&gt;&lt;sup&gt;&lt;sup&gt;[13]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This does not mean that all such service providers offering the type of services in which data storage is the main characteristic, in the EU, would be covered by the Directive. The term “offering services in the Union” has been defined in Article 2(3) of the Directive as follows:&lt;/p&gt;
&lt;p&gt;“‘offering services in the Union’ means:&lt;/p&gt;
&lt;p&gt;(a) enabling legal or natural persons in one or more Member State(s) to use the services listed under (3) above; and&lt;/p&gt;
&lt;p&gt;(b) having a substantial connection to the Member State(s) referred to in point (a);”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Clause (b) of the definition is the main qualifying factor which would ensure that only those entities whose offering of services has a “substantial connection” which the member countries of the EU would be covered by the Directive. The Regulation recognizes that mere accessibility of the service (which could also be achieved through mere accessibility of the service provider’s or an intermediary’s website in the EU) should not be a sufficient condition for the application of such an onerous condition and therefore the concept of a “substantial connection” was inserted to ascertain a sufficient relationship between the provider and the territory where it is offering its services. In the absence of a permanent establishment in an EU member country, such a “substantial connection” may be said to exist if there are a significant number of users in one or more EU member countries, or the “targeting of activities” towards one or more EU member countries. The “targeting of activities” may be determined based on various circumstances, such as the use of a language or a currency generally used in an EU member country, the availability of an app in the relevant national app store, providing local advertising or advertising in the language used in an EU member country, making use of any information originating from persons in EU member countries in the course of its activities, or from the handling of customer relations such as by providing customer service in the language generally used in EU member countries. A substantial connection can also be assumed where a service provider directs its activities towards one or more EU member countries as set out in Article 17(1)(c) of Regulation 1215/2012 on jurisdiction and the recognition and enforcement of judgments in civil and commercial matters.&lt;a href="#_ftn14" name="_ftnref14"&gt;&lt;sup&gt;&lt;sup&gt;[14]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;&lt;b&gt;Part II - EU Directive and Service Providers located in India&lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In this part of the article we will discuss how companies based in India and running websites providing any “service” such as social networking, subscription based video streaming, etc. such as Hike or AltBalaji, Hotstar, etc. and how such companies would be affected by the E-evidence Proposal. At first glance a website providing a video streaming service may not appear to be covered by the E-evidence Proposal since one would assume that there may not be any storage of data. But if it is a service which allows users to open personal accounts (with personal and possibly financial details such as in the case of TVF, AltBalaji or Hotstar) and uses their online behaviour to push relevant material and advertisements to their accounts, whether that would make the storage of data a defining component of the website’s services as contemplated under the proposal is a question that may not be easy to answer.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Even if it is assumed that the services of an Indian company can be classified as information society services for which the storage of data is a defining component, that by itself would not be sufficient to make the E-evidence Proposal applicable to it. The services of an Indian company would still need to have a “substantial connection” with an EU member country. As discussed above, this substantial connection may be said to exist based on the existence of (i) a significant number of users in one or more EU member countries, or (ii) the “targeting of activities” towards one or more EU member countries. The determination of whether a service provider is targeting its services towards an EU member country is to be made based on a number of factors listed above and is a subjective determination with certain guiding factors.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;There does not seem to be clarity however on what would constitute a significant number of users and whether this determination is to be based upon the total number of users in an EU member country as a proportion of the population of the country or is it to be considered as a proportion of the total number of customers the service provider has worldwide. To explain this further let us assume that an Indian company such as Hotstar has a total user base of 100 million customers.&lt;a href="#_ftn15" name="_ftnref15"&gt;&lt;sup&gt;&lt;sup&gt;[15]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; If there is a situation where 10 million of these 100 million subscribers are located in countries other than India, out of which there are about 40 thousand customers in France and another 40 thousand in Malta; then it would lead to some interesting analysis. Now 40 thousand customers in a customer base of 100 million is 0.04% of the total customer base of the service provider which generally speaking would not constitute a “significant number”. However if we reckon the 40 thousand customers from the point of view of the total population of the country of Malta, which is approximately 4.75 Lakh,&lt;a href="#_ftn16" name="_ftnref16"&gt;&lt;sup&gt;&lt;sup&gt;[16]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; it would mean approx. 8.4% of the total population of Malta. It is unlikely that any service affecting almost a tenth of the population of the entire country can be labeled as not having a significant number of users in Malta. If the same math is done on the population of a country such as France, which has a population of approx. 67.3 million,&lt;a href="#_ftn17" name="_ftnref17"&gt;&lt;sup&gt;&lt;sup&gt;[17]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; then the figure would be 0.05% of the total population; would that constitute a significant number as per the E-evidence Proposal.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The issues discussed above are very important for any service provider, specially a small or medium sized company since the determination of whether the E-evidence Proposal applies to them or not, apart from any potential legal implications, imposes a direct economic cost for designating a legal representative in an EU member country. Keeping in mind this economic burden and how it might affect the budget of smaller companies, the Explanatory Memorandum to the Regulation clarifies that this legal representative could be a third party, which could be shared between several service providers, and further the legal representative may accumulate different functions (e.g. the General Data Protection Regulation or e-Privacy representatives in addition to the legal representative provided for by the E-evidence Directive).&lt;a href="#_ftn18" name="_ftnref18"&gt;&lt;sup&gt;&lt;sup&gt;[18]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In case all the above issues are determined to be in favour of the E-evidence Directive being applicable to an Indian company and the company designates a legal representative in an EU member country, then it remains to be seen how Indian laws relating to data protection would interact with the obligations of the Indian company under the E-evidence Directive. As per Rule 6 of the Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011 (“&lt;b&gt;SPDI Rules&lt;/b&gt;”) service providers are not allowed to disclose sensitive personal data or information except with the prior permission of the except disclosure to mandated government agencies. The Rule provides that “the information shall be shared, without obtaining prior consent from provider of information, with &lt;i&gt;Government agencies mandated under the law&lt;/i&gt; to obtain information including sensitive personal data or information for the purpose of verification of identity, or for prevention, detection, investigation including cyber incidents, prosecution, and punishment of offences….”. Although the term “government agency mandated under law” has not been defined in the SPDI Rules, the term “law” has been defined in the Information Technology Act, 2000 (“&lt;b&gt;IT Act&lt;/b&gt;”) as under:&lt;/p&gt;
&lt;p&gt;“’law’ includes any Act of Parliament or of a State Legislature, Ordinances promulgated by the President or a Governor, as the case may be, Regulations made by the President under article 240, Bills enacted as President's Act under sub-clause (a) of clause (1) of article 357 of the Constitution and includes rules, regulations, byelaws and orders issued or made thereunder;”&lt;a href="#_ftn19" name="_ftnref19"&gt;&lt;sup&gt;&lt;sup&gt;[19]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Since the SPDI Rules are issued under the IT Act, therefore the term “law” referred as used in the would have to be read as defined in the IT Act (unless court holds to the contrary). This would mean that Rule 6 of the SPDI Rules only recognises government agencies mandated under Indian law and therefore information cannot be disclosed to agencies not recognised by Indian law. In such a scenario an Indian company may not have any option except to raise an objection and challenge an EPO issued to it on the grounds provided in Article 16 of the Regulation, which process itself could mean a significant expenditure on the part of such a company.&lt;/p&gt;
&lt;p&gt;&lt;b&gt;Conclusion&lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The framework sought to be established by the European Union through the E-evidence Proposal seeks to establish a regime different from those favoured by countries such as the United States which favours Mutual Agreements with (presumably) key nations or the push for data localisation being favoured by countries such as India, to streamline the process of access to digital data. Since the regime put forth by the EU is still only at the proposal stage, there may yet be changes which could clarify the regime significantly. However, as things stand Indian companies may be affected by the E-evidence Proposal in the following ways:&lt;/p&gt;
&lt;ul&gt;
&lt;li style="text-align: justify; "&gt;Companies offering services outside India may inadvertently trigger obligations under the E-evidence Proposal if their services have a substantial connection with any of the member states of the European Union;&lt;/li&gt;
&lt;li&gt;Indian companies offering services overseas will have to make an internal determination as to whether the E-evidence Proposal applies to them or not;&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;In case of Indian companies which come under the E-evidence Proposal, they would be obligated to designate a legal representative in an EU member state for receiving and executing Data Orders as per the E-evidence Proposal.&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;If a legal representative is designated by the Indian company they may have to incur significant costs on maintaining a legal representative especially in a situation where they have to object to the implementation of an EPO. The company would also have to coordinate with the legal representative to adequately put forth their (Indian law related) concerns before the competent authority so that they are not forced to fall foul of their legal obligations in either jurisdiction. It is also unclear the extent to which appointed legal representatives from Indian companies could challenge or push back against requests received.&lt;/li&gt;
&lt;/ul&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Disclaimer&lt;/span&gt;: The author of this Article is an Indian trained lawyer and not an expert on European law. The author would like to apologise for any incorrect analysis of European law that may have crept into this article despite best efforts.&lt;/p&gt;
&lt;hr /&gt;
&lt;p&gt;&lt;a href="#_ftnref1" name="_ftn1"&gt;&lt;sup&gt;&lt;sup&gt;[1]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Explanatory Memorandum to the Proposal for Regulation of the European Parliament and of the Council on European Production and Preservation Orders for Electronic Evidence in Criminal Matters, Pg. 4, available at &lt;a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52018PC0225&amp;amp;from=EN"&gt;https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52018PC0225&amp;amp;from=EN&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref2" name="_ftn2"&gt;&lt;sup&gt;&lt;sup&gt;[2]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Subscriber data means data which is used to identify the user and has been defined in Article 2 (7) as follows:&lt;/p&gt;
&lt;p&gt;“‘subscriber data’ means any data pertaining to:&lt;/p&gt;
&lt;p&gt;(a) the identity of a subscriber or customer such as the provided name, date of birth, postal or geographic address, billing and payment data, telephone, or email;&lt;/p&gt;
&lt;p&gt;(b) the type of service and its duration including technical data and data identifying related technical measures or interfaces used by or provided to the subscriber or customer, and data related to the validation of the use of service, excluding passwords or other authentication means used in lieu of a password that are provided by a user, or created at the request of a user;”&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref3" name="_ftn3"&gt;&lt;sup&gt;&lt;sup&gt;[3]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; The term access data has been defined in Article 2(8) as follows:&lt;/p&gt;
&lt;p&gt;“‘access data’ means data related to the commencement and termination of a user access session to a service, which is strictly necessary for the sole purpose of identifying the user of the service, such as the date and time of use, or the log-in to and log-off from the service, together with the IP address allocated by the internet access service provider to the user of a service, data identifying the interface used and the user ID. This includes electronic communications metadata as defined in point (g) of Article 4(3) of Regulation concerning the respect for private life and the protection of personal data in electronic communications;”&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref4" name="_ftn4"&gt;&lt;sup&gt;&lt;sup&gt;[4]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; The term content data has been defined in Article 2 (10) as follows:&lt;/p&gt;
&lt;p&gt;“‘content data’ means any stored data in a digital format such as text, voice, videos, images, and sound other than subscriber, access or transactional data;”&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref5" name="_ftn5"&gt;&lt;sup&gt;&lt;sup&gt;[5]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; The term transactional data has been defined in Article 2(9) as follows:&lt;/p&gt;
&lt;p&gt;“‘transactional data’ means data related to the provision of a service offered by a service provider that serves to provide context or additional information about such service and is generated or processed by an information system of the service provider, such as the source and destination of a message or another type of interaction, data on the location of the device, date, time, duration, size, route, format, the protocol used and the type of compression, unless such data constitues access data. This includes electronic communications metadata as defined in point (g) of Article 4(3) of [Regulation concerning the respect for private life and the protection of personal data in electronic communications];”&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref6" name="_ftn6"&gt;&lt;sup&gt;&lt;sup&gt;[6]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Explanatory Memorandum to the Proposal for Regulation of the European Parliament and of the Council on European Production and Preservation Orders for Electronic Evidence in Criminal Matters, Pg. 17, available at &lt;a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52018PC0225&amp;amp;from=EN"&gt;https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52018PC0225&amp;amp;from=EN&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref7" name="_ftn7"&gt;&lt;sup&gt;&lt;sup&gt;[7]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Articles 9(4) and 10(5) of the Regulation.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref8" name="_ftn8"&gt;&lt;sup&gt;&lt;sup&gt;[8]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Article 10(5) of the Regulation.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref9" name="_ftn9"&gt;&lt;sup&gt;&lt;sup&gt;[9]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Article 15 of the Regulation.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref10" name="_ftn10"&gt;&lt;sup&gt;&lt;sup&gt;[10]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Article 16 of the Regulation. Also see &lt;a href="https://www.insideprivacy.com/uncategorized/eu-releases-e-evidence-proposal-for-cross-border-data-access/"&gt;https://www.insideprivacy.com/uncategorized/eu-releases-e-evidence-proposal-for-cross-border-data-access/&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref11" name="_ftn11"&gt;&lt;sup&gt;&lt;sup&gt;[11]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Article 2(4) of the Directive establishing European Electronic Communications Code provides as under:&lt;/p&gt;
&lt;p&gt;“‘electronic communications service’ means a service normally provided for remuneration via electronic communications networks, which encompasses 'internet access service' as defined in Article 2(2) of Regulation (EU) 2015/2120; and/or 'interpersonal communications service'; and/or services consisting wholly or mainly in the conveyance of signals, such as transmission services used for the provision of machine-to-machine services and for broadcasting, but excludes services providing, or exercising editorial control over, content transmitted using electronic communications networks and services;”&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref12" name="_ftn12"&gt;&lt;sup&gt;&lt;sup&gt;[12]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Information Society Services have been defined in the Directive specified as “any Information Society service, that is to say, any service normally provided for remuneration, at a distance, by electronic means and at the individual request of a recipient of services.”&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref13" name="_ftn13"&gt;&lt;sup&gt;&lt;sup&gt;[13]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Proposal for a Directive of the European Parliament and of the Council Laying Down Harmonised Rules on the Appointment of Legal Representatives for the Purpose of Gathering Evidence in Criminal Proceedings, Pg 8, available at &lt;a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52018PC0226&amp;amp;from=EN"&gt;https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52018PC0226&amp;amp;from=EN&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref14" name="_ftn14"&gt;&lt;sup&gt;&lt;sup&gt;[14]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Proposal for a Directive of the European Parliament and of the Council Laying Down Harmonised Rules on the Appointment of Legal Representatives for the Purpose of Gathering Evidence in Criminal Proceedings, Pg 9, available at &lt;a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52018PC0226&amp;amp;from=EN"&gt;https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52018PC0226&amp;amp;from=EN&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref15" name="_ftn15"&gt;&lt;sup&gt;&lt;sup&gt;[15]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Hotstar already has an active customer base of 75 million, as of December, 2017; &lt;a href="https://telecom.economictimes.indiatimes.com/news/netflix-restricted-to-premium-subscribers-hotstar-leads-indian-ott-content-market/62351500"&gt;https://telecom.economictimes.indiatimes.com/news/netflix-restricted-to-premium-subscribers-hotstar-leads-indian-ott-content-market/62351500&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref16" name="_ftn16"&gt;&lt;sup&gt;&lt;sup&gt;[16]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;a href="https://en.wikipedia.org/wiki/Malta"&gt;https://en.wikipedia.org/wiki/Malta&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref17" name="_ftn17"&gt;&lt;sup&gt;&lt;sup&gt;[17]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;a href="https://en.wikipedia.org/wiki/France"&gt;https://en.wikipedia.org/wiki/France&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref18" name="_ftn18"&gt;&lt;sup&gt;&lt;sup&gt;[18]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Proposal for a Directive of the European Parliament and of the Council Laying Down Harmonised Rules on the Appointment of Legal Representatives for the Purpose of Gathering Evidence in Criminal Proceedings, Pg 5, available at &lt;a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52018PC0226&amp;amp;from=EN"&gt;https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52018PC0226&amp;amp;from=EN&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref19" name="_ftn19"&gt;&lt;sup&gt;&lt;sup&gt;[19]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Section 2(y) of the Information Technology Act, 2000.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/vipul-kharbanda-december-23-2018-european-e-evidence-proposal-and-indian-law'&gt;https://cis-india.org/internet-governance/blog/vipul-kharbanda-december-23-2018-european-e-evidence-proposal-and-indian-law&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>vipul</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Cyber Security</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    

   <dc:date>2018-12-23T16:45:02Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/ecj-rules-internet-search-engine-operator-responsible-for-processing-personal-data-published-by-third-parties">
    <title>European Court of Justice rules Internet Search Engine Operator responsible for Processing Personal Data Published by Third Parties</title>
    <link>https://cis-india.org/internet-governance/blog/ecj-rules-internet-search-engine-operator-responsible-for-processing-personal-data-published-by-third-parties</link>
    <description>
        &lt;b&gt;The Court of Justice of the European Union has ruled that "an internet search engine operator is responsible for the processing that it carries out of personal data which appear on web pages published by third parties." The decision adds to the conundrum of maintaining a balance between freedom of expression, the protection of personal data and intermediary liability.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The ruling is expected to have considerable impact on reputation and privacy related takedown requests as under the decision, data subjects may approach the operator directly seeking removal of links to web pages containing personal data. Currently, users prove whether data needs to be kept online—the new rules reverse the burden of proof, placing an obligation on companies, rather than users for content regulation.&lt;/p&gt;
&lt;h3&gt;A win for privacy?&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The ECJ ruling addresses Mario Costeja González complaint filed in 2010, against Google Spain and Google Inc., requesting that personal data relating to him appearing in search results be protected and that data which was no longer relevant be removed. Referring to &lt;a href="http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:31995L0046:en:HTML"&gt;the Directive 95/46/EC&lt;/a&gt; of the European Parliament, the court said, that Google and other search engine operators should be considered 'controllers' of personal data. Following the decision, Google will be required to consider takedown requests of personal data, regardless of the fact that processing of such data is carried out without distinction in respect of information other than the personal data.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The decision—which cannot be appealed—raises important of questions of how this ruling will be applied in practice and its impact on the information available online in countries outside the European Union.  The decree forces search engine operators such as Google, Yahoo and Microsoft's Bing to make judgement calls on the fairness of the information published through their services that reach over 500  million people across the twenty eight nation bloc of EU.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;ECJ rules that search engines 'as a general rule,' should place the right to privacy above the right to information by the public. Under the verdict, links to irrelevant and out of date data need to be erased upon request, placing search engines in the role of controllers of information—beyond the role of being an arbitrator that linked to data that already existed in the public domain. The verdict is directed at highlighting the power of search engines to retrieve controversial information while limiting their capacity to do so in the future.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The ruling calls for maintaining a balance in addressing the legitimate interest of internet users in accessing personal information and upholding the data subject’s fundamental rights, but does not directly address either issues. The court also recognised, that the data subject's rights override the interest of internet users, however, with exceptions pertaining to nature of information, its sensitivity for the data subject's private life and the role of the data subject in public life. Acknowledging that data belongs to the individual and is not the right of the company, European Commissioner Viviane Reding, &lt;a href="https://www.facebook.com/permalink.php?story_fbid=304206613078842&amp;amp;id=291423897690447&amp;amp;_ga=1.233872279.883261846.1397148393"&gt;hailed the verdict&lt;/a&gt;, "a clear victory for the protection of personal data of Europeans".&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Court stated that if data is deemed irrelevant at the time of the case, even if it has been lawfully processed initially, it must be removed and that the data subject has the right to approach the operator directly for the removal of such content. The liability issue is further complicated by the fact, that search engines such as Google do not publish the content rather they point to information that already exists in the public domain—raising questions of the degree of liability on account of third party content displayed on their services.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The ECJ ruling is based on the case originally filed against Google, Spain and it is important to note that, González argued that searching for his name linked to two pages originally published in 1998, on the website of the Spanish newspaper La Vanguardia. The Spanish Data Protection Agency did not require La Vanguardia to take down the pages, however, it did order Google to remove links to them. Google appealed this decision, following which the National  High Court of Spain sought advice from the European court. The definition of Google as the controller of information, raises important questions related to the distinction between liability of publishers and the liability of processors of information such as search engines.&lt;/p&gt;
&lt;h3&gt;The 'right to be forgotten'&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The decision also brings to the fore, the ongoing debate and &lt;a href="http://www.theguardian.com/technology/2013/apr/04/britain-opt-out-right-to-be-forgotten-law"&gt;fragmented opinions within the EU&lt;/a&gt;, on the right of the individual to be forgotten. The &lt;a href="http://www.bbc.com/news/technology-16677370"&gt;'right to be forgotten&lt;/a&gt;' has evolved from the European Commission's wide-ranging plans of an overhaul of the commission's 1995 Data Protection Directive. The plans for the law included allowing people to request removal of personal data with an obligation of compliance for service providers, unless there were 'legitimate' reasons to do otherwise. Technology firms rallying around issues of freedom of expression and censorship, have expressed concerns about the reach of the bill. Privacy-rights activist and European officials have upheld the notion of the right to be forgotten, highlighting the right of the individual to protect their honour and reputation.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;These issues have been controversial amidst EU member states with the UK's Ministry of Justice claiming the law 'raises unrealistic and unfair expectations' and  has &lt;a href="http://www.theguardian.com/technology/2013/apr/04/britain-opt-out-right-to-be-forgotten-law"&gt;sought to opt-out&lt;/a&gt; of the privacy laws. The Advocate General of the European Court &lt;a href="http://curia.europa.eu/juris/document/document.jsf?text=&amp;amp;docid=138782&amp;amp;pageIndex=0&amp;amp;doclang=EN&amp;amp;mode=req&amp;amp;dir=&amp;amp;occ=first&amp;amp;part=1&amp;amp;cid=362663#Footref91"&gt;Niilo Jääskinen's opinion&lt;/a&gt;, that the individual's right to seek removal of content should not be upheld if the information was published legally, contradicts the verdict of the ECJ ruling. The European Court of Justice's move is surprising for many and as Richard Cumbley, information-management and data protection partner at the law firm Linklaters &lt;a href="http://turnstylenews.com/2014/05/13/europe-union-high-court-establishes-the-right-to-be-forgotten/"&gt;puts it&lt;/a&gt;, “Given that the E.U. has spent two years debating this right as part of the reform of E.U. privacy legislation, it is ironic that the E.C.J. has found it already exists in such a striking manner."&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The economic implications of enforcing a liability regime where search engine operators censor legal content in their results aside, the decision might also have a chilling effect on freedom of expression and access to information. Google &lt;a href="http://www.theguardian.com/technology/2014/may/13/right-to-be-forgotten-eu-court-google-search-results"&gt;called the decision&lt;/a&gt; “a disappointing ruling for search engines and online publishers in general,” and that the company would take time to analyze the implications. While the implications of the decision are yet to be determined, it is important to bear in mind that while decisions like these are public, the refinements that Google and other search engines will have to make to its technology and the judgement calls on the fairness of the information available online are not public.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The ECJ press release is available &lt;a href="http://curia.europa.eu/jcms/upload/docs/application/pdf/2014-05/cp140070en.pdf"&gt;here&lt;/a&gt; and the actual judgement is available &lt;a href="http://curia.europa.eu/juris/documents.jsf?pro=&amp;amp;lgrec=en&amp;amp;nat=or&amp;amp;oqp=&amp;amp;lg=&amp;amp;dates=&amp;amp;language=en&amp;amp;jur=C%2CT%2CF&amp;amp;cit=none%252CC%252CCJ%252CR%252C2008E%252C%252C%252C%252C%252C%252C%252C%252C%252C%252Ctrue%252Cfalse%252Cfalse&amp;amp;num=C-131%252F12&amp;amp;td=%3BALL&amp;amp;pcs=Oor&amp;amp;avg"&gt;here&lt;/a&gt;.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/ecj-rules-internet-search-engine-operator-responsible-for-processing-personal-data-published-by-third-parties'&gt;https://cis-india.org/internet-governance/blog/ecj-rules-internet-search-engine-operator-responsible-for-processing-personal-data-published-by-third-parties&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>jyoti</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Freedom of Speech and Expression</dc:subject>
    
    
        <dc:subject>Social Media</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Intermediary Liability</dc:subject>
    

   <dc:date>2014-05-14T14:18:46Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/news/livemint-january-17-2014-moulishree-srivastava-elizabeth-roche-eu-parliament-slams-us-surveillance">
    <title>EU parliament report slams US surveillance</title>
    <link>https://cis-india.org/news/livemint-january-17-2014-moulishree-srivastava-elizabeth-roche-eu-parliament-slams-us-surveillance</link>
    <description>
        &lt;b&gt;A report that outlines the need for stringent laws to protect citizen privacy and democratize Internet governance holds lessons for India, say analysts.&lt;/b&gt;
        &lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;The article by Moulishree Srivastava and Elizabeth Roche quotes Sunil Abraham. It was &lt;a class="external-link" href="http://www.livemint.com/Home-Page/nYXiR4LEVJLiROfl95aFxH/EU-parliament-report-slams-US-surveillance.html"&gt;published in Livemint&lt;/a&gt; on January 17, 2014.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;A European Union (EU) parliament report that outlines the need for stringent laws for protecting citizen privacy, democratizing Internet governance and rebuilding trust between Europe and the US holds many lessons for India, analysts and policymakers say.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The US government listened into Indian communications as part of its massive global surveillance, which was exposed last year in leaks to the media. The embassies of France, Italy, Greece, Japan, Mexico, South Korea and Turkey were also subjected to the surveillance put in place after the September 2001 terrorist attacks. According to the external affairs ministry, India has registered its protest at least thrice over the issue with US authorities.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;A draft report on the US National Security Agency’s surveillance programme by the European parliament’s committee on civil liberties, justice and home affairs states that trust between the two transatlantic partners, trust among EU member-states, and trust between citizens and their governments were profoundly shaken because of the spying, and to rebuild trust in all these dimensions a comprehensive plan was urgently needed.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;"It is very doubtful that data collection of such magnitude is only guided by the fight against terrorism, as it involves the collection of all possible data of all citizens; points therefore to the possible existence of other power motives such as political and economic espionage," says the report.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The report recommends prohibiting blanket mass surveillance activities and bulk processing of personal data, and asks EU member-states, including the UK, Germany, France, Sweden and the Netherlands, to revise their national legislation and practices governing the activities of intelligence services to ensure that they are in line with the standards of the European Convention on Human Rights.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;It also calls on the US to revise its legislation without delay in order to bring it in line with international law, recognizing privacy and other rights as well as providing for judicial redress for EU citizens.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;"The American approach to privacy regulation has been deeply flawed. The US dominance over the Internet affects the structure and substance of Internet governance and among other human rights, the right to privacy," said Sunil Abraham, executive director of the Centre for Internet and Society, a Bangalore-based not-for-profit research organization. "The (EU) report, if implemented, may change the future of Internet governance by deepening the existing leadership provided by the EU in promoting their privacy standards globally."&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;On India’s rather restrained reaction to the spying, he said, “It is a tragedy that our politicians are not as proactive when it comes to protecting our rights. While India has only focused on changing its official email policy after the revelations of mass surveillance, it has done nothing as concrete and comprehensive as EU."&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;"There is neither the recognition of (the) pervasive nature of global mass surveillance, nor is there full appreciation (of) the damaging consequences," Abraham added.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;J. Satyanarayana, secretary in India’s department of electronics and information technology, said the concerns over privacy are the same for India as for the EU, but declined to comment on what preventive steps the government is implementing due to security reasons. The EU report called for concluding the EU-US umbrella pact, a framework agreement on data protection in the field of police and judicial cooperation, to ensure proper redress mechanisms for EU citizens in the event of data transfers from the EU to the US for law enforcement purposes. The report asks EU policymakers not to initiate any new sectoral agreements or arrangements for the transfer of personal data for law enforcement purposes and suggests suspending the terrorist finance tracking programme until the umbrella agreement negotiations are concluded.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;"EU wants to use EU-US umbrella agreement...to raise the US standards, to ensure the rights of EU citizens and perhaps all the citizens. All humans will need protection under US law as is currently the case in the EU,” said Abraham. “The prohibition of blanket surveillance that the report recommends will hopefully apply to all citizens regardless of their nationality."&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The draft report goes as far as suggesting suspending Safe Harbour, the legal instrument used for the transfer of EU personal data to the US through Google, Microsoft, Yahoo, Facebook, Apple and LinkedIn, until a full review has been conducted and current loopholes are plugged. The report’s proposals and recommendations are likely to be implemented after election to the European parliament in May.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In addition to reforms in the existing systems, the report outlines the importance of development of European clouds as it notes that trust in US cloud computing and cloud services providers has been affected by the surveillance practices.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;"Three of the major computerized reservation systems used by airlines worldwide are based in the US and that PNR (passenger name record) data are saved in cloud systems operating on US soil under US law...lacks data protection adequacy," states the report.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;C.U. Bhaskar, analyst with the South Asia Monitor think tank, was of the view that India had “adequately” responded to the US through quiet diplomacy.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;"It is unlikely that the US will give up cyber surveillance,” he said, adding, “We should acquire our own capacity to ensure adequate defensive and offensive firewalls and build up appropriate capacity for our cyber programmes."&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;"Given our expertise in the IT (information technology) sector, as an analyst my opinion is that we have a reasonable capacity to build up our capabilities," Bhaskar added.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/news/livemint-january-17-2014-moulishree-srivastava-elizabeth-roche-eu-parliament-slams-us-surveillance'&gt;https://cis-india.org/news/livemint-january-17-2014-moulishree-srivastava-elizabeth-roche-eu-parliament-slams-us-surveillance&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>praskrishna</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2014-02-03T06:13:55Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/about/policies/ethical-research-guidelines">
    <title>Ethical Research Guidelines</title>
    <link>https://cis-india.org/about/policies/ethical-research-guidelines</link>
    <description>
        &lt;b&gt;The Centre for Internet and Society will endeavour to protect the physical, social and psychological well-being of those who participate in their research. The guidelines below state the necessary steps to follow while doing research.&lt;/b&gt;
        
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;p&gt;These ethical research guidelines require CIS staff and consultants to consider and take the following steps while engaging in research.&lt;/p&gt;
&lt;ol&gt;
&lt;li style="text-align: justify;"&gt;Providing notice to the individual of the: Aims, methods, his/her right to abstain from participation in the research and his/her right to terminate at any time his/her participation; the confidential nature of his/her replies and any limits on such confidentiality.&lt;br /&gt;&lt;br /&gt;&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;Providing informants and other participants the right to remain anonymous.&lt;br /&gt;&lt;br /&gt;&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;Taking informed consent from the individual that he/she agrees to participate. If children are involved in the research, informed consent will be taken from the parents. Informed consent will entail communicating :&lt;br /&gt;
&lt;ul&gt;
&lt;li&gt;Purpose(s) of the study, and the anticipated consequences of the research;&lt;/li&gt;
&lt;li&gt;Identity of funders and sponsors;&lt;/li&gt;
&lt;li&gt;Anticipated uses of the data;&lt;/li&gt;
&lt;li&gt;The degree of anonymity and confidentiality which may be afforded to informants and subjects.&lt;br /&gt;&lt;br /&gt;&lt;/li&gt;&lt;/ul&gt;
&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;Ensuring that when audio/visual-recorders and photographic records are being used, participants that are being recorded will be made aware of the use of the devices, and have the option to request that they not be used.&lt;br /&gt;&lt;br /&gt;&lt;/li&gt;
&lt;li&gt;Ensuring that the identity and identifying information of the participant (if not already in the public domain) is destroyed at the end of the project, unless the individual has consented otherwise.&lt;br /&gt;&lt;br /&gt;&lt;/li&gt;
&lt;li&gt;At public events organized by CIS, it will be announced and publicly posted that the event is being recorded. Individuals will be given the choice to object to being recorded or to having their name and organization shared in conference reports, blogs, articles, etc. If the individual does not object, it will be considered that they have given their consent.&lt;br /&gt;&lt;br /&gt;&lt;/li&gt;
&lt;li&gt;The Centre for Internet and Society strictly follows a policy of &lt;strong&gt;No Plagiarism&lt;/strong&gt;.&lt;/li&gt;&lt;/ol&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/about/policies/ethical-research-guidelines'&gt;https://cis-india.org/about/policies/ethical-research-guidelines&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>praskrishna</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Research</dc:subject>
    
    
        <dc:subject>Policies</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2018-10-13T12:21:48Z</dc:date>
   <dc:type>Page</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/ethical-issues-in-open-data">
    <title>Ethical Issues in Open Data</title>
    <link>https://cis-india.org/internet-governance/blog/ethical-issues-in-open-data</link>
    <description>
        &lt;b&gt;On August 1, 2013, I took part in a web meeting, organized and hosted by Tim Davies of the World Wide Web foundation. The meeting, titled “Ethical issues in Open Data,” had an agenda focused around privacy considerations in the context of the open data movement.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The main panelists, Carly Nyst and Sam Smith from &lt;a class="external-link" href="http://https//www.privacyinternational.org/"&gt;Privacy International&lt;/a&gt;, as well as Steve Song from the &lt;a class="external-link" href="http://www.idrc.ca/EN/Pages/default.aspx"&gt;International  Development Research Centre&lt;/a&gt;, were joined by roughly a dozen other privacy and development researchers from around the globe in the hour long session.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The primary issue of the meeting was the concern over modern capabilities of cross-analytics for de-anonymizing data sets and revealing personally identifiable information (PII) in open data. Open data can constitute publicly available information such as budgets, infrastructures, and population statistics, as long as the data meets the three open data characteristics: accessibility, machine readability, and availability for re-use. “Historically,” said Tim Davies, “public registers have been protected through obscurity.” However, both the capabilities of data analysts and the definition of personal data have continued to expand in recent years. This concern thus presents a conflict between researchers who advocate governments releasing open data reports, and researchers who emphasize privacy in the developing world.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Steve Song, advisor to IDRC Information &amp;amp; Networks program, spoke of the potential collateral damage that comes with publishing more and more types of information. Song addressed the imperative of the meeting in saying, “privacy needs to be a core part of open data conversation.” In his presentation, he gave a particularly interesting example of the tensions between public and private information implications. Following the infamous &lt;a class="external-link" href="http://en.wikipedia.org/wiki/Sandy_Hook_Elementary_School_shooting"&gt;2012 school shooting in Newtown, Connecticut&lt;/a&gt;, the information on Newtown’s gun permit owning citizens (made publicly available through America’s &lt;a class="external-link" href="http://foia.state.gov/"&gt;Freedom of Information Act&lt;/a&gt;) was aggregated into an interactive map which revealed the citizens’ addresses. This obviously became problematic for the Newtown community, as the map not only singled out homes which exercised their right to bear arms but also indirectly revealed which homes were without firearm protection and thereby more vulnerable to theft and crime. The Newtown example clearly demonstrates the relationship (and conflict) between open data and privacy; it resolves to the conflict between the right to information and the right to privacy.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;An apparent issue surrounding open data is its perceived binary nature. Many advocates either view data as being open, or not; any intermediary boundaries are only forms of governments limiting data accessibility. Therefore, a point raised by meeting attendee Raed Sharif aptly presented an open data counter-argument. Sarif noted how, inversely, privacy conceptions may form a threat to open data. He mentioned how governments could take advantage of privacy arguments to justify their refusal to publish open reports. &lt;br /&gt;&lt;br /&gt;However, Carly Nyst summarized the privacy concern and argument in her remarks near the end of the meeting. Namely, she reasoned that the open data mission is viable, if only limited to generic data, i.e., data about infrastructure, or other information that is in no way personal. Doing so will avoid obstructions of individual privacy. Until more advanced anonymization techniques can be achieved, which can overcome modern re-identification methods, publicly publishing PII may prove too risky. It was generally agreed upon during the meeting that open data is not inherently bad, and in fact its analysis and availability can be beneficial, but the threat of its misuse makes it dangerous. For the future of open data, researchers and advocates should perhaps consider more nuanced approaches to the concept in order to respect considerations for other ethical issues, such as privacy.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/ethical-issues-in-open-data'&gt;https://cis-india.org/internet-governance/blog/ethical-issues-in-open-data&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>kovey</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Open Data</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2013-08-07T09:19:54Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/ethical-data-design-practices-in-the-ai-artificial-intelligence-age">
    <title>Ethical Data Design Practices in the AI (Artificial Intelligence) Age</title>
    <link>https://cis-india.org/internet-governance/news/ethical-data-design-practices-in-the-ai-artificial-intelligence-age</link>
    <description>
        &lt;b&gt;Shweta Mohandas was a panelist at discussion on Ethical Data Design Practices in the AI (Artificial Intelligence) Age, organised by Startup Grind, Bangalore on July 28, 2018 at NUMA Bangalore. &lt;/b&gt;
        &lt;h2&gt;Agenda&lt;/h2&gt;
&lt;p&gt;&lt;b&gt;Ethical Data Design Practices in the AI (Artificial Intelligence) Age&lt;/b&gt;&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;The panel discussion is intended to explore the challenges we face when designing the user experiences of the complex behavioral agents that increasingly run our lives.&lt;/p&gt;
&lt;p dir="ltr"&gt;Discussion centred around how to:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Understand current thinking by the AI community on ethics and morality in computing and the challenges it presents. &lt;/li&gt;
&lt;li&gt;Explore examples of the ethical choices that products make now and will make in the near future.&lt;/li&gt;
&lt;li&gt;Learn how designers might approach designing experiences that face moral dilemmas.&lt;/li&gt;
&lt;/ul&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/ethical-data-design-practices-in-the-ai-artificial-intelligence-age'&gt;https://cis-india.org/internet-governance/news/ethical-data-design-practices-in-the-ai-artificial-intelligence-age&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Admin</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Artificial Intelligence</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2018-08-01T23:14:21Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/events/essentials-of-building-internet-tools-for-inclusion">
    <title>Essentials of building internet tools for inclusion</title>
    <link>https://cis-india.org/internet-governance/events/essentials-of-building-internet-tools-for-inclusion</link>
    <description>
        &lt;b&gt;A talk jointly proposed by Chinmayi SK and Rohini Lakshané was selected for the Internet Freedom Festival held at Valencia, Spain from March 6 to 10, 2017. The talk held on March 6, 2017 was jointly organized by Random Hacks of Kindness, The Bachchao Project, and the Centre for Internet and Society. &lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;For the full schedule of Internet Freedom Festival, &lt;a class="external-link" href="https://internetfreedomfestival.org/schedule/"&gt;click here&lt;/a&gt;. A manual titled &lt;a class="external-link" href="https://github.com/thebachchaoproject/Manual-to-build-tech-for-diversity-and-Inclusion/blob/master/BuildingTechforDiversityandInclusion101.pdf"&gt;Building Tech for Diversity and Inclusion 101&lt;/a&gt; jointly authored by Chinmayi S.K., Rohini Lakshané and Willow Brugh was released during the event.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/events/essentials-of-building-internet-tools-for-inclusion'&gt;https://cis-india.org/internet-governance/events/essentials-of-building-internet-tools-for-inclusion&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>praskrishna</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    

   <dc:date>2017-03-14T14:38:23Z</dc:date>
   <dc:type>Event</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/enlarging-the-small-print">
    <title>Enlarging the Small Print: A Study on Designing Effective Privacy Notices for Mobile Applications</title>
    <link>https://cis-india.org/internet-governance/blog/enlarging-the-small-print</link>
    <description>
&lt;b&gt;The world's biggest modern lie is often said to be the sentence "I have read and agreed to the Terms and Conditions." It is a well-known fact, backed by empirical research, that consumers often skip reading cumbersome privacy notices, for reasons ranging from their length and complicated legal jargon to the inopportune moments at which they are displayed. This paper seeks to compile and analyse the different simplified designs of privacy notices that have been proposed for mobile applications to encourage consumers to make informed privacy decisions.&lt;/b&gt;
        &lt;h2 style="text-align: justify; "&gt;Introduction: Ideas of Privacy and Consent Linked with Notices&lt;/h2&gt;
&lt;h3 style="text-align: justify; "&gt;The Notice and Choice Model&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Most modern laws and data privacy principles focus on individual control. Alan Westin of Columbia University characterises privacy as "the claim of individuals, groups, or institutions to determine for themselves when, how, and to what extent information about them is communicated to others."&lt;a href="#_ftn1" name="_ftnref1"&gt;[1]&lt;/a&gt; Or, simply put, personal information privacy is "the ability of the individual to personally control information about himself."&lt;a href="#_ftn2" name="_ftnref2"&gt;[2]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The preferred mechanism that has emerged for protecting online privacy is that of Notice and Choice.&lt;a href="#_ftn3" name="_ftnref3"&gt;[3]&lt;/a&gt; The model, identified as "the most fundamental principle" in online privacy,&lt;a href="#_ftn4" name="_ftnref4"&gt;[4]&lt;/a&gt; refers to &lt;a href="http://itlaw.wikia.com/wiki/Post" title="Post"&gt;consumers&lt;/a&gt; consenting to privacy policies before availing of an online service.&lt;a href="#_ftn5" name="_ftnref5"&gt;[5]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The following 3 standards of expectations of privacy in electronic communications have emerged in the United States courts:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;KATZ TEST: Katz v. United States,&lt;a href="#_ftn6" name="_ftnref6"&gt;[6]&lt;/a&gt; a wiretap case, established the expectation of privacy as one that society is prepared to recognize as "reasonable".&lt;a href="#_ftn7" name="_ftnref7"&gt;[7]&lt;/a&gt; This concept is critical to a court's understanding of a new technology because there is no established precedent to guide its analysis.&lt;a href="#_ftn8" name="_ftnref8"&gt;[8]&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;KYLLO/KATZ HYBRID TEST: Society's reasonable expectation of privacy is higher when dealing with a new technology that is not "generally available to the public".&lt;a href="#_ftn9" name="_ftnref9"&gt;[9]&lt;/a&gt; This follows the logic that it is reasonable to expect common data collection practices to be used, but not rare ones.&lt;a href="#_ftn10" name="_ftnref10"&gt;[10]&lt;/a&gt; In Kyllo v. United States,&lt;a href="#_ftn11" name="_ftnref11"&gt;[11]&lt;/a&gt; law enforcement used a thermal imaging device to observe the relative heat levels inside a house. Though under Katz the publicly available thermal radiation technology was reasonable, the uncommon means of collection was not. This modification to the Katz standard is extremely important in the context of mobile privacy. Mobile communications may be subdivided into smaller parts: the audio of a phone call, e-mail, and data related to a user's current location. Under the hybrid Katz/Kyllo test, the reasonable expectation of privacy in each of those communications would be determined separately,&lt;a href="#_ftn12" name="_ftnref12"&gt;[12]&lt;/a&gt; by evaluating the general accessibility of the technology required to capture each stream.&lt;a href="#_ftn13" name="_ftnref13"&gt;[13]&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;DOUBLECLICK TEST: DoubleClick&lt;a href="#_ftn14" name="_ftnref14"&gt;[14]&lt;/a&gt; illustrates the potential problems of transferring consent to a third party, one to whom the user never provided direct consent or of whom the user is not even aware. The court held that for DoubleClick, an online advertising network, to collect information from a user it needed only to obtain permission from the website that the user accessed, and not from the user himself. The court reasoned that the information the user disclosed to the website was analogous to information one discloses to another person during a conversation. Just as the other party to the conversation would be free to tell his friends anything that was said, a website should be free to disclose any information it receives from a user's visit once the user has consented to use the website's services.&lt;/li&gt;
&lt;/ol&gt;
&lt;p style="text-align: justify; "&gt;These interpretations have weakened the standards of online privacy. The Katz test vaguely hinges on societal expectations; the Kyllo test to an extent strengthens privacy rights by disallowing uncommon methods of collection; but, as the DoubleClick test illustrates, once the user has consented to such practices he cannot object to them. There have been suggestions to treat personal information as property when it shares features of property, as location data does:&lt;a href="#_ftn15" name="_ftnref15"&gt;[15]&lt;/a&gt; it is fixed when in storage, it has a monetary value, and it is sold and traded on a regular basis. This would create a standard where consent is required for third-party access.&lt;a href="#_ftn16" name="_ftnref16"&gt;[16]&lt;/a&gt; Consent would then play a more pivotal role in affixing liability.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The notice and choice mechanism is designed to put individuals in charge of the collection and use of their personal information; in theory, the regime preserves user autonomy by leaving decisions about collection and use with the individual.&lt;a href="#_ftn17" name="_ftnref17"&gt;[17]&lt;/a&gt; Notice and choice is asserted as a substitute for regulation because it is thought to be more flexible, inexpensive to implement, and easy to enforce.&lt;a href="#_ftn18" name="_ftnref18"&gt;[18]&lt;/a&gt; Additionally, notice and choice can legitimize an information practice, whatever it may be, by obtaining an individual's consent, and can suit individual privacy preferences.&lt;a href="#_ftn19" name="_ftnref19"&gt;[19]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;However, the notice and choice mechanism is often criticized for leaving users uninformed, or at least misinformed, as people rarely see, read, or understand privacy notices.&lt;a href="#_ftn20" name="_ftnref20"&gt;[20]&lt;/a&gt; Moreover, few people opt out of the collection, use, or disclosure of their data when presented with the choice to do so.&lt;a href="#_ftn21" name="_ftnref21"&gt;[21]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Amber Sinha of the Centre for Internet and Society argues that consent in these scenarios is rarely meaningful, as consumers fail to read or access privacy policies and to understand the consequences, and developers do not give them the choice to opt out of a particular data practice while still being allowed to use their services.&lt;a href="#_ftn22" name="_ftnref22"&gt;[22]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Of particular concern is the use of software applications (apps) designed to work on mobile devices. Estimates place the current number of apps available for download at more than 1.5 million, and that number is growing daily.&lt;a href="#_ftn23" name="_ftnref23"&gt;[23]&lt;/a&gt; A 2011 Google study, "The Mobile Movement," found that mobile devices are viewed as extensions of ourselves with which we share deeply personal relations, raising fundamental questions of how apps and other mobile communications influence our privacy decision-making.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Recent research indicates that mobile device users have concerns about the privacy implications of using apps.&lt;a href="#_ftn24" name="_ftnref24"&gt;[24]&lt;/a&gt; The research finds that almost 60 percent of respondents ages 50 and older decided not to install an app because of privacy concerns (see figure 1).&lt;a href="#_ftn25" name="_ftnref25"&gt;[25]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;img src="https://cis-india.org/home-images/ConsumerReactions.png" alt="Consumer Reactions" class="image-inline" title="Consumer Reactions" /&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Because no standards currently exist for privacy notice disclosure in apps, consumers may find it difficult to understand what data an app collects, how those data will be used, and what rights users have to limit the collection and use of their data. Many apps do not provide users with privacy policy statements at all, making it impossible for app users to know the privacy implications of using a particular app.&lt;a href="#_ftn26" name="_ftnref26"&gt;[26]&lt;/a&gt; Apps can make use of any or all of the device's functions, including contact lists, calendars, phone and messaging logs, locational information, Internet searches and usage, video and photo galleries, and other possibly sensitive information. For example, an app that allows the device to function as a scientific calculator may be accessing contact lists, locational data, and phone records even though such access is unnecessary for the app to function properly.&lt;a href="#_ftn27" name="_ftnref27"&gt;[27]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Other apps may have privacy policies that are confusing or misleading. For example, an analysis of health and fitness apps found that more than 30 percent of the apps studied shared data with someone not disclosed in the app's privacy policy.&lt;a href="#_ftn28" name="_ftnref28"&gt;[28]&lt;/a&gt;&lt;/p&gt;
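&lt;p style="text-align: justify; "&gt;The over-collection pattern described above can be sketched as a simple audit that compares the permissions an app requests against those its core function requires. The permission names and the "needed" set below are illustrative assumptions, not drawn from any real app store policy:&lt;/p&gt;

```python
# Hypothetical audit: flag permissions an app requests beyond what
# its core function plausibly needs (all names are illustrative only).

def excess_permissions(requested, needed):
    """Return the set of requested permissions not justified by need."""
    return set(requested) - set(needed)

# A "scientific calculator" app should need essentially nothing sensitive.
calculator_requested = {"contacts", "location", "call_log", "storage"}
calculator_needed = {"storage"}  # e.g. to save calculation history

flagged = excess_permissions(calculator_requested, calculator_needed)
print(sorted(flagged))  # contacts, location and call_log are unjustified
```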
&lt;h2 style="text-align: justify; "&gt;Types of E-Contracts&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;Margaret Radin distinguishes two models of direct e-contracts based on consent: "contract-as-consent" and "contract-as-product".&lt;a href="#_ftn29" name="_ftnref29"&gt;[29]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The contract-as-consent model is the traditional picture of how a binding commitment is arrived at between two humans. It involves a meeting of the minds, which implies that terms be understood, that alternatives be available, and probably that bargaining be possible.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In the contract-as-product model, the terms are part of the product, not a conceptually separate bargain; the physical product plus its terms are a package deal. For example, the fact that a chip inside an electronics item will wear out after a year is an unseen contract, creating a take-it-or-leave-it choice whether to buy the package.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The contract-as-product model defies traditional ideas of consent and raises the question of whether consent is meaningful. Modern e-contracts such as click-wrap, shrink-wrap, viral contracts and machine-made contracts, which form the privacy policies of several apps, take a contract-as-product approach where consumers are given a take-it-or-leave-it option.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Mobile application privacy notices fall into the contract-as-product model. Consumers have to click "I agree" to all of the innumerable Terms and Conditions in order to install an app. For instance, a term stating that a fitness app will collect biometric data is a non-negotiable feature of the product. It is a classic take-it-or-leave-it approach in which consumers compromise on privacy to avail of services.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Contracts that facilitate these transactions are generally long and complicated and often agreed to by consumers without reading them.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Craswell strikes a balance in applying the liability rule: since explaining the meaning of extensive fine print would be very costly, it could be efficient to affix liability not to the written contract as such but to "reasonable" terms. This means that if a fitness app collects sensitive financial information, which is unreasonable given its core activities, then even if the user has consented to this in the privacy policy's fine print, the contract should be capable of being challenged.&lt;/p&gt;
&lt;h2&gt;The Concept of Privacy by Design&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;Privacy needs to be considered from the very beginning of system development. For this reason, Dr. Ann Cavoukian&lt;a href="#_ftn30" name="_ftnref30"&gt;[30]&lt;/a&gt; coined the term "Privacy by Design": privacy should be taken into account throughout the entire engineering process, from the earliest design stages to the operation of the production system. This holistic approach is promising, but it does not come with mechanisms to integrate privacy into a system's development processes. The privacy-by-design approach, i.e. building data protection safeguards into products and services from the earliest stage of development, has been addressed by the European Commission in its proposal for a General Data Protection Regulation, which uses the terms "privacy by design" and "data protection by design" synonymously.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The 7 Foundational Principles&lt;a href="#_ftn31" name="_ftnref31"&gt;[31]&lt;/a&gt; of Privacy by Design are:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;Proactive not Reactive; Preventative not Remedial&lt;/li&gt;
&lt;li&gt;Privacy as the Default Setting&lt;/li&gt;
&lt;li&gt;Privacy Embedded into Design&lt;/li&gt;
&lt;li&gt;Full Functionality - Positive-Sum, not Zero-Sum&lt;/li&gt;
&lt;li&gt;End-to-End Security - Full Lifecycle Protection&lt;/li&gt;
&lt;li&gt;Visibility and Transparency - Keep it Open&lt;/li&gt;
&lt;li&gt;Respect for User Privacy - Keep it User-Centric&lt;/li&gt;
&lt;/ol&gt;
&lt;p style="text-align: justify; "&gt;Several terms have been introduced to describe the types of data that need to be protected. A term very prominently used by industry is "personally identifiable information (PII)", i.e., data that can be related to an individual. Similarly, the European data protection framework centres on "personal data". However, some authors argue that this falls short, since data that is not related to a single individual might still have an impact on the privacy of groups; e.g., an entire group might be discriminated against with the help of certain information. For data of this category the term "privacy-relevant data" has been used.&lt;a href="#_ftn32" name="_ftnref32"&gt;[32]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;An essential part of Privacy by Design is that data subjects should be adequately informed whenever personal data is processed. Whenever data subjects use a system, they should be informed about which information is processed, for what purpose, by which means, and with whom it is shared. They should also be informed about their data access rights and how to exercise them.&lt;a href="#_ftn33" name="_ftnref33"&gt;[33]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Whereas system design very often barely considers end-users' interests, focusing primarily on the owners and operators of the system, it is essential to account for the privacy and security interests of all parties involved by informing them about the associated advantages (e.g. security gains) and disadvantages (e.g. costs, use of resources, less personalisation). In such a system of "multilateral security", the demands of all parties must be realized.&lt;a href="#_ftn34" name="_ftnref34"&gt;[34]&lt;/a&gt;&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;The Concept of Data Minimization&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The most basic privacy design strategy is MINIMISE, which states that the amount of personal data processed should be restricted to the minimum possible. By ensuring that no unnecessary data is collected, the possible privacy impact of a system is limited. Applying the MINIMISE strategy means answering whether the processing of personal data is proportional (with respect to the purpose) and whether less invasive means exist to achieve the same purpose. The decision to collect personal data can be made at design time or at run time, and can take various forms. For example, one can decide not to collect any information about a particular data subject at all. Alternatively, one can decide to collect only a limited set of attributes.&lt;a href="#_ftn35" name="_ftnref35"&gt;&lt;sup&gt;[35]&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;If a company collects and retains large amounts of data, there is an increased risk that the data will be used in a way that departs from consumers' reasonable expectations.&lt;a href="#_ftn36" name="_ftnref36"&gt;&lt;sup&gt;[36]&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
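&lt;p style="text-align: justify; "&gt;The "collect only a limited set of attributes" form of the MINIMISE strategy can be sketched in a few lines. The field names and the stated purpose below are illustrative assumptions, not taken from any real application:&lt;/p&gt;

```python
# Minimal sketch of the MINIMISE strategy: keep only the attributes
# justified by the stated purpose. Field names are illustrative only.

ALLOWED_FIELDS = {"step_count", "session_date"}  # assumed purpose: fitness stats

def minimise(record, allowed=ALLOWED_FIELDS):
    """Drop every attribute not needed for the stated purpose."""
    return {k: v for k, v in record.items() if k in allowed}

raw = {
    "step_count": 8042,
    "session_date": "2017-03-01",
    "location": "12.97N,77.59E",      # not needed for step statistics
    "contact_list": ["alice", "bob"],  # not needed at all
}
print(minimise(raw))  # only step_count and session_date survive
```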
&lt;p style="text-align: justify; "&gt;There are three privacy protection goals&lt;a href="#_ftn37" name="_ftnref37"&gt;&lt;sup&gt;&lt;sup&gt;[37]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; that data minimization and privacy by 	design seek to achieve. These privacy protection goals are:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Unlinkability - To prevent data being linked to an identifiable entity&lt;/li&gt;
&lt;li&gt;Transparency - The information has to be available before, during and after the processing takes place.&lt;/li&gt;
&lt;li&gt;Intervenability - Those who provide their data must have means of intervention into all ongoing or planned privacy-relevant data processing	&lt;b&gt;&lt;span&gt; &lt;/span&gt;&lt;/b&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p style="text-align: justify; "&gt;Spiekermann and Cranor raise an intriguing point in their paper: they argue that companies that employ privacy-by-design and data minimization practices in their applications should be allowed to skip privacy policies and forgo notice and choice features.&lt;a href="#_ftn38" name="_ftnref38"&gt;&lt;sup&gt;[38]&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;table style="text-align: justify; "&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;
&lt;div&gt;
&lt;p&gt;&lt;b&gt;To Summarise: &lt;i&gt;The emerging model and legal dialogue regulating online privacy is that of Notice and Choice, which has been severely criticised for not creating informed choice-making processes. E-contracts such as agreements to privacy notices follow the contract-as-product model. When there is extensive fine print, liability must be affixed on the basis of reasonable terms. Privacy notices must incorporate the concepts of Privacy by Design by providing complete information and collecting minimum data.&lt;/i&gt;&lt;/b&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;
&lt;h2 style="text-align: justify; "&gt;Features of Privacy Notices in the Current Mobile Ecosystem&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;A privacy notice informs a system's users or a company's customers of data practices involving personal information. Internal practices with regard to the collection, processing, retention, and sharing of personal information should be made transparent.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Each app a user chooses to install on his smartphone can access different information stored on that device. There is no automatic access to user information: each application has access only to the data that it pulls into its own 'sandbox'.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The sandbox is a set of fine-grained controls limiting an application's access to files, preferences, network resources, hardware, etc. Applications cannot access each other's sandboxes.&lt;a href="#_ftn39" name="_ftnref39"&gt;[39]&lt;/a&gt; The data that makes it into the sandbox is normally defined by user permissions.&lt;a href="#_ftn40" name="_ftnref40"&gt;[40]&lt;/a&gt; These are a set of user-defined controls&lt;a href="#_ftn41" name="_ftnref41"&gt;[41]&lt;/a&gt; and evidence that a user consents to the application accessing that data.&lt;a href="#_ftn42" name="_ftnref42"&gt;[42]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;To gain permission, mobile apps generally display privacy notices that explicitly seek consent. These can leverage different channels, including a privacy policy document posted on a website or linked to from mobile app stores or mobile apps. For example, Google Maps uses a traditional clickwrap structure that requires the user to agree to a list of terms and conditions when the program is initially launched.&lt;a href="#_ftn43" name="_ftnref43"&gt;[43]&lt;/a&gt; Foursquare, on the other hand, embeds its terms in a privacy policy posted on its website, and not within the app.&lt;a href="#_ftn44" name="_ftnref44"&gt;[44]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This section explains the features of current privacy notices along four parameters: the stage at which the notice is given, its content, its length and user comprehension. Under each of these parameters the associated problems are identified and alternatives are suggested.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;(1) &lt;/b&gt; &lt;b&gt;Timing and Frequency of Notice: &lt;br /&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;i&gt;This sub-section identifies the various stages at which notices are given, highlights their advantages and disadvantages, and makes recommendations. It concludes with the findings of a study on the ideal stage at which to provide notice, supplemented with two critical models to address the common problems of habituation and contextualization.&lt;/i&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Studies indicate that the timing of notices, or the stage at which they are given, impacts how consumers recall and comprehend them and make choices accordingly.&lt;/b&gt;&lt;a href="#_ftn45" name="_ftnref45"&gt;[45]&lt;/a&gt; Introducing only a 15-second delay between the presentation of privacy notices and privacy-relevant choices can be enough to render notices ineffective at driving user behaviour.&lt;a href="#_ftn46" name="_ftnref46"&gt;[46]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Google Android and Apple iOS provide notices at different times. At the time of writing, Android users are shown a list of requested permissions while the app is being installed, i.e., after the user has chosen to install the app. In contrast, iOS shows a dialog during app use, the first time a permission is requested by an app. This is also referred to as a "just-in-time" notification.&lt;a href="#_ftn47" name="_ftnref47"&gt;[47]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The following are the stages in which a notice can be given:&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;1) NOTICE AT SETUP: Notice can be provided when a system is used for the first time.&lt;a href="#_ftn48" name="_ftnref48"&gt;[48]&lt;/a&gt; For instance, as part of a software installation process users are shown and have to accept the system's terms of use.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;a) &lt;span&gt;Advantages&lt;/span&gt;: Users can inspect a system's data practices before using or purchasing it. The system developer benefits from the liability and transparency gains that build user trust. Setup notices also provide the opportunity to explain unexpected data practices that may have a benign purpose in the context of the system.&lt;a href="#_ftn49" name="_ftnref49"&gt;[49]&lt;/a&gt; They can even impact purchase decisions: Egelman et al. found that participants were more likely to pay a premium at a privacy-protective website when they saw privacy information in search results, as opposed to on the website after selecting a search result.&lt;a href="#_ftn50" name="_ftnref50"&gt;[50]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;b) Disadvantages: Users have become largely habituated to install-time notices and ignore them.&lt;a href="#_ftn51" name="_ftnref51"&gt;[51]&lt;/a&gt; Users may have difficulty making informed decisions because they have not used the system yet and cannot fully assess its utility or weigh privacy trade-offs. They may also be focused on the primary task, namely completing the setup process to be able to use the system, and fail to pay attention to notices.&lt;a href="#_ftn52" name="_ftnref52"&gt;[52]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;c) Recommendations: Privacy notices provided at setup time should be concise and focus on the data practices immediately relevant to the primary user, rather than presenting extensive terms of service. Integrating privacy information into other materials that explain the functionality of the system may further increase the chance that users do not ignore it.&lt;a href="#_ftn53" name="_ftnref53"&gt;[53]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;2) JUST-IN-TIME NOTICE: A privacy notice can be shown when a data practice is active, for example when information is being collected, used, or shared. Such notices are referred to as "contextualized" or "just-in-time" notices.&lt;a href="#_ftn54" name="_ftnref54"&gt;[54]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;a) Advantages: They enhance transparency and enable users to make privacy decisions in context. Users have also been shown to share information more freely when they are given relevant explanations at the time of data collection.&lt;a href="#_ftn55" name="_ftnref55"&gt;[55]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;b) Disadvantages: Habituation can occur if these notices are shown too frequently. Moreover, in apps such as games, users generally tend to ignore notices displayed during use.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;c) Recommendations: Consumers can be given notice the first time a particular type of information, such as email, is accessed, and then be given the option to opt out of further notifications. A consumer may then opt out of notices about email access but choose to view all notices about access to health information, depending on their privacy priorities.&lt;/p&gt;
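The first-access-then-opt-out recommendation above can be sketched as a small state machine. This is a minimal illustration, not a real platform API; the class and method names are invented for the example.

```python
# Sketch of the recommendation: always notify on the first access of a
# data category, then keep notifying unless the user opts that category out.

class JustInTimeNoticeManager:
    def __init__(self):
        self.seen = set()        # categories notified at least once
        self.opted_out = set()   # categories the user has muted

    def should_notify(self, category):
        """Notify on first access of a category, and on later accesses
        unless the user has opted out of that category."""
        if category not in self.seen:
            self.seen.add(category)
            return True          # first access: always notify
        return category not in self.opted_out

    def opt_out(self, category):
        """User chose to suppress further notices for this category."""
        self.opted_out.add(category)
```

A user could thus silence routine email-access notices after the first one while continuing to see every notice about health-data access.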
&lt;p style="text-align: justify; "&gt;3) CONTEXT-DEPENDENT NOTICES: The user's and system's context can also be considered to show additional notices or controls if deemed necessary	&lt;a href="#_ftn56" name="_ftnref56"&gt;[56]&lt;/a&gt;. Relevant context may be determined by a change of location, additional users included in or receiving 	the data, and other situational parameters. Some locations may be particularly sensitive, therefore users may appreciate being reminded that they are 	sharing their location when they are in a new place, or when they are sharing other information that may be sensitive in a specific context. Facebook introduced a privacy checkup message in 2014 that is displayed under certain conditions before posting publicly. It acts as a "nudge"	&lt;a href="#_ftn57" name="_ftnref57"&gt;[57]&lt;/a&gt; to make users aware that the post will be public and to help them manage who can see their posts.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;a) Advantages: It may help users make privacy decisions that are more aligned with their desired level of privacy in the respective situation and thus 	foster trust in the system.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;b) Disadvantages: Challenges in providing context-dependent notices are detecting relevant situations and context changes. Furthermore, determining whether a context is relevant to an individual's privacy concerns could in itself require access to that person's sensitive data and privacy preferences.	&lt;a href="#_ftn58" name="_ftnref58"&gt;[58]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;c) Recommendations: Standards need to be developed to determine a contextual model based on user preferences.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;4) PERIODIC NOTICES: These are shown the first couple of times a data practice occurs, or every time. The sensitivity of the data practice may determine 	the appropriate frequency.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;a) Advantages: They can further help users maintain awareness of privacy-sensitive information flows, especially when data practices are largely invisible&lt;a href="#_ftn59" name="_ftnref59"&gt;[59]&lt;/a&gt;, such as in patient monitoring apps. This helps provide better control options.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;b) Disadvantages: Repeating notices can lead to notice fatigue and habituation&lt;a href="#_ftn60" name="_ftnref60"&gt;[60]&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;c) Recommendations: The frequency of these notices needs to be balanced with user needs.&lt;a href="#_ftn61" name="_ftnref61"&gt;[61]&lt;/a&gt; Data practices that are reasonably expected as part of the system may require only a single notice, whereas practices falling outside the expected context of use, which the user is potentially unaware of, may warrant repeated notices. Periodic notices should be relevant to users so as not to be perceived as annoying. A combined notice can serve as a reminder about multiple ongoing data practices. Rotating warnings or changing their look can further reduce habituation effects&lt;a href="#_ftn62" name="_ftnref62"&gt;[62]&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;5) PERSISTENT NOTICES: A persistent indicator is typically non-blocking and may be shown whenever a data practice is active, for instance when information is being collected continuously or when information is being transmitted&lt;a href="#_ftn63" name="_ftnref63"&gt;[63]&lt;/a&gt;. When inactive or not shown, persistent notices also indicate that the respective data practice is currently not active. For instance, Android and iOS display a small icon in the status bar whenever an application accesses the user's location.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;a) Advantages: These are easy to understand and unobtrusive, which increases their usefulness.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;b) Disadvantages: These ambient indicators often go unnoticed.&lt;a href="#_ftn64" name="_ftnref64"&gt;[64]&lt;/a&gt; Most systems can only accommodate such 	indicators for a small number of data practices.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;c) Recommendations: Persistent indicators should be designed to be noticeable when they are active. A system should only provide a small set of persistent 	indicators to indicate activity of especially critical data practices which the user can also specify.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;6) NOTICE ON DEMAND: Users may also actively seek privacy information and request a privacy notice. A typical example is posting a privacy policy at a persistent location&lt;a href="#_ftn65" name="_ftnref65"&gt;[65]&lt;/a&gt; and providing links to it from the app.	&lt;a href="#_ftn66" name="_ftnref66"&gt;[66]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;a) Advantages: Privacy sensitive users are given the option to better explore policies and make informed decisions.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;b) Disadvantages: The current model of a link to a long privacy policy on a website discourages users from requesting information that they cannot fully understand and do not have time to read.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;c) Recommendations: Better options are privacy settings interfaces or privacy dashboards within the system that provide information about data practices; controls to manage consent; summary reports of what information has been collected, used, and shared by the system; as well as options to manage or delete collected information. Contact information for a privacy office should be provided to enable users to make written requests.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Which of these Stages is the Most Ideal?&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;In a series of experiments, Rebecca Balebako and others&lt;a href="#_ftn67" name="_ftnref67"&gt;[67]&lt;/a&gt; identified the impact of timing on smartphone privacy notices. The following five conditions were imposed on participants, who were later tested on their recall of the notices through questions:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Not Shown: The participants installed and used the app without being shown a privacy notice&lt;/li&gt;
&lt;li&gt;App Store: The notice was shown at the time of installation in the app store&lt;/li&gt;
&lt;li&gt;App Store Big: A large notice occupying more screen space was shown in the app store&lt;/li&gt;
&lt;li&gt;App Store Popup: A smaller popup was displayed in the app store&lt;/li&gt;
&lt;li&gt;During Use: The notice was shown during usage of the app&lt;/li&gt;
&lt;/ul&gt;
&lt;p style="text-align: justify; "&gt;The results (Figure) suggest that even if a notice contains information users care about, it is unlikely to be recalled if only shown in the app store and 	more effective when shown during app usage.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Seeing the notice during app usage resulted in better recall. Although participants remembered a notice shown after app use as well as notices shown at other points, they felt it was not a good point at which to make decisions about the app, because they had already used it; participants preferred the notice to be shown during or before app usage.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Hence, depending on the app, there are optimal times to show smartphone privacy notices to maximize attention and recall, with preference given to the beginning of or during app use.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;However, several of these stages as outlined above face the disadvantages of habituation and uncertainty around contextualization. The following two models have been proposed to address this:&lt;/p&gt;
&lt;h2&gt;Habituation&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;When notices are shown too frequently, users may become habituated. Habituation may lead to users disregarding warnings, often without reading or comprehending the notice&lt;a href="#_ftn68" name="_ftnref68"&gt;[68]&lt;/a&gt;. To reduce habituation from app permission notices, Felt et al. proposed and tested a method to determine which permission requests should be emphasized&lt;a href="#_ftn69" name="_ftnref69"&gt;[69]&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;They categorized actions on the basis of revertibility, severability, initiation, alterability, and approval nature (explained in figure) and applied the following permission-granting mechanisms:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Automatic Grant: The permission must be requested by the developer, but it is granted without user involvement.&lt;/li&gt;
&lt;li&gt;Trusted UI elements: They appear as part of an application's workflow, but clicking on them imbues the application with a new permission. To ensure 	that applications cannot trick users, trusted UI elements can be controlled only by the platform. For example, a user who is sending an SMS message from a 	third-party application will ultimately need to press a button; using trusted UI means the platform provides the button.&lt;/li&gt;
&lt;li&gt;Confirmation Dialog: Runtime consent dialogs interrupt the user's flow by prompting them to allow or deny a permission and often contain 	descriptions of the risk or an option to remember the decision.&lt;/li&gt;
&lt;li&gt;Install-time warning: These integrate permission granting into the installation flow. Installation screens list the application's requested 	permissions. In some platforms (e.g., Facebook), the user can reject some install-time permissions. In other platforms (e.g., Android and Windows 8 Metro), 	the user must approve all requested permissions or abort installation.&lt;a href="#_ftn70" name="_ftnref70"&gt;[70]&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
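The mapping from action properties to granting mechanisms can be sketched as a decision function. This is an illustrative simplification in the spirit of Felt et al.'s categorization, not their exact decision tree; the property names and ordering are assumptions.

```python
# Hypothetical simplification: pick a permission-granting mechanism from
# a few properties of the action the permission protects.

from dataclasses import dataclass

@dataclass
class Action:
    revertible: bool       # can the effect be undone?
    severe: bool           # could it cause lasting harm (cost, data loss)?
    user_initiated: bool   # does it occur as part of a user-driven workflow?

def choose_mechanism(action: Action) -> str:
    if action.revertible and not action.severe:
        return "automatic-grant"       # low risk: no user involvement needed
    if action.user_initiated:
        return "trusted-ui"            # platform-drawn button grants permission
    if action.severe:
        return "confirmation-dialog"   # interrupt and ask at runtime
    return "install-time-warning"      # fall back to listing at install
```

For example, sending an SMS as part of a user-driven workflow would route to a trusted UI element, while a severe background action would trigger a runtime confirmation dialog.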
&lt;p style="text-align: justify; "&gt;Based on these conditions, the following sequential model was proposed for the system to adopt in determining the frequency of displaying notices:&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;img src="https://cis-india.org/home-images/SequentialModel.png/@@images/6a94f50d-4bd0-4566-bc30-32d5ef3f53d3.png" alt="Sequential Model" class="image-inline" title="Sequential Model" /&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Initial tests have proven successful in reducing habituation effects, and this is an important step towards designing and displaying privacy notices.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Contextualization&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;Bastian Könings and others, in their paper "Towards Context Adaptive Privacy Decisions in Ubiquitous Computing"&lt;a href="#_ftn71" name="_ftnref71"&gt;[71]&lt;/a&gt;, propose a system for supporting a user's privacy decisions in situ, i.e., in the context in which they are required, following the notion of contextual integrity. It approximates the user's privacy preferences and adapts them to the current context. The system can then either recommend sharing decisions and actions or autonomously reconfigure privacy settings. It is divided into the following stages:&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;img src="https://cis-india.org/home-images/PrivacyDecisionProcess.png/@@images/4dd72aef-1bb1-42d9-ae59-9592b2a36b9f.png" alt="Privacy Decision Process" class="image-inline" title="Privacy Decision Process" /&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Context Model:&lt;/b&gt; A distinction is created between the decision level and system level. The system level enables context awareness but also filters context information and 	maps it to semantic concepts required for decisions. Semantic mappings can be derived from a pre-defined or learnt world model. On the decision level, the 	context model only contains components relevant for privacy decision making. For example: An activity involves the user, is assigned a type, i.e., a 	semantic label, such as home or work, based on system level input.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Privacy Decision Engine&lt;/b&gt;: The context model allows the system to reason about which context items are affected by a context transition. When a transition occurs, the privacy decision engine (PDE) evaluates which protection-worthy context items are affected. The protection worthiness (or privacy relevance) of context items in a given context is determined by the user's privacy preferences, which are approximated by the system from the knowledge base. This serves as a basis for adapting privacy preferences and is subsequently further adjusted to the user by learning from the user's explicit decisions, behaviour, and reactions to system actions.&lt;a href="#_ftn72" name="_ftnref72"&gt;[72]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;i&gt;The user's personality type is determined before initial system use&lt;/i&gt; to select a basic privacy profile.&lt;i&gt; &lt;/i&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;It may also be possible that the privacy preference cannot be realized in the current context. In that case, the privacy policy would suggest terminating 	the activity. For each privacy policy variant a confidence score is calculated based on how well it fits the adapted privacy preference. Based on the 	confidence scores, the PDE selects the most appropriate policy candidate or triggers user involvement if the confidence is below a certain threshold 	determined by the user's personality and previous privacy decisions.&lt;/p&gt;
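The confidence-scored policy selection described above can be sketched as follows. This is a minimal sketch under stated assumptions: the scoring of candidates and the threshold value would come from the PDE's preference model, which is not specified here.

```python
# Sketch of the PDE's selection step: pick the candidate policy with the
# highest confidence score, and flag that the user must be involved when
# even the best score falls below the personal threshold.

def select_policy(candidates, threshold):
    """candidates: list of (policy_name, confidence) pairs.
    Returns (chosen_policy, needs_user_involvement)."""
    if not candidates:
        return None, True                    # nothing to apply; ask the user
    best_policy, best_score = max(candidates, key=lambda c: c[1])
    needs_user = best_score < threshold      # low confidence -> involve user
    return best_policy, needs_user
```

For instance, with candidates `[("share-city-only", 0.9), ("share-exact", 0.4)]` and a threshold of 0.7, the engine would apply "share-city-only" autonomously; if no candidate reached 0.7, it would instead prompt the user.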
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Realization and Enforcement:&lt;/b&gt; The selected privacy policy must then be realized on the system level. This is done by combining territorial privacy and information privacy aspects. The private territory is defined by a territorial privacy boundary that separates desired and undesired entities.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Granularity adjustments for specific information items are also defined. For example, instead of the user's exact position, only the street address or city can be provided.&lt;/p&gt;
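The granularity adjustment above amounts to disclosing a coarser view of the same data item. A minimal sketch, assuming illustrative granularity levels (the level names and the location record layout are invented for the example):

```python
# Disclose only as much location detail as the selected policy allows.

def adjust_granularity(location, level):
    """location: dict with 'coords', 'street', 'city' keys.
    level: granularity setting chosen by the privacy policy."""
    if level == "exact":
        return location["coords"]    # full precision
    if level == "street":
        return location["street"]    # street address only
    if level == "city":
        return location["city"]      # coarsest disclosed level
    return None                      # "none": withhold location entirely
```

A policy could thus let a social app see only the city while a navigation app receives exact coordinates.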
&lt;p style="text-align: justify; "&gt;ADVANTAGES: The personalization to a specific user has the advantage of better emulating that user's privacy decision process. It also helps to decide when 	to involve the user in the decision process by providing recommendations only and when privacy decisions can be realized autonomously.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;DISADVANTAGES: The entire model hinges on the ability of the system to accurately determine the user's profile before the user starts using it, rather than after, when preferences could be determined more accurately. There is no provision for the user to pick their own privacy profile; it is all system-determined, taking away an element of consent at the very beginning. As all further preferences are adapted from this base, it is possible that the system may not deliver. The use of confidence scores is an approximation that can compromise privacy by a small numerical margin of difference.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;However, it is a useful insight into techniques of contextualization. Depending on the environment, different strategies for policy realization and varying degrees of enforcement are possible&lt;a href="#_ftn73" name="_ftnref73"&gt;[73]&lt;/a&gt;.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Length&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;The length of privacy policies is often cited as one reason they are so commonly ignored. Studies show privacy policies are hard to read, read infrequently, and do not support rational decision making.&lt;a href="#_ftn74" name="_ftnref74"&gt;[74]&lt;/a&gt; Aleecia M. McDonald and Lorrie Faith Cranor, in their seminal study "The Cost of Reading Privacy Policies", estimated that the average length of privacy policies is 2,500 words. At a reading speed of 250 words per minute, which is typical for those who have completed secondary education, the average policy would take 10 minutes to read.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The researchers also investigated how quickly people could read privacy policies when just skimming them for pertinent details. They timed 93 people as they skimmed a 934-word privacy policy and answered multiple-choice questions on its content.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Though some people took under a minute and others up to 42 minutes, the bulk of the subjects of the research took between three and six minutes to skim the 	policy, which itself was just over a third of the size of the average policy.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The researchers used their data to estimate how much it would cost to read the privacy policy of every site a person visits once a year if their time were charged for, and arrived at a mind-boggling figure of $652 billion.&lt;/p&gt;
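The per-policy arithmetic behind these estimates is straightforward: 2,500 words at 250 words per minute gives 10 minutes per policy, which can then be scaled by how many policies a person encounters and what their time is worth. The inputs to the cost function below (policies per year, hourly value of time) are illustrative placeholders, not the study's exact figures.

```python
# Reading-time arithmetic from the study's two point estimates.
AVG_POLICY_WORDS = 2500      # average privacy policy length (words)
READING_SPEED_WPM = 250      # typical reading speed (words per minute)

minutes_per_policy = AVG_POLICY_WORDS / READING_SPEED_WPM   # 10 minutes

def annual_reading_cost(policies_per_year, hourly_value):
    """Opportunity cost of reading every encountered policy in full."""
    hours = policies_per_year * minutes_per_policy / 60
    return hours * hourly_value
```

For example, a person who encountered 60 policies a year and valued their time at $20/hour would spend 10 hours and $200 of time; aggregated over an entire online population, such per-person costs produce the study's billions-of-dollars national estimate.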
&lt;p style="text-align: justify; "&gt;&lt;img src="https://cis-india.org/home-images/ProbabilityDensityFunction.png" alt="Probability Density Function" class="image-inline" title="Probability Density Function" /&gt;&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Problems&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;Though the figure of $652 billion has limited usefulness, because people rarely read whole policies and cannot charge anyone for the time it takes to do 	this, the researchers concluded that readers who do conduct a cost-benefit analysis might decide not to read any policies.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;"Preliminary work from a small pilot study in our laboratory revealed that some Internet users believe their only serious risk online is they may lose up to $50 if their credit card information is stolen. For people who think that is their primary risk, our point estimates show the value of their time to read policies far exceeds this risk. Even for our lower bound estimates of the value of time, it is not worth reading privacy policies though it may be worth skimming them," said the researchers. Seeing their only risk as credit card fraud suggests that Internet users likely do not understand the risks to their privacy. As an FTC report recently stated, "it is unclear whether consumers even understand that their information is being collected, aggregated, and used to deliver advertising."&lt;a href="#_ftn75" name="_ftnref75"&gt;[75]&lt;/a&gt;&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Recommendations&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;If the privacy community can find ways to reduce the time cost of reading policies, it may be easier to convince Internet users to do so. For example, if consumers can move from needing to read policies word-for-word to only skimming them, aided by useful headings, or by ways of hiding all but relevant information in a layered format, the effective length of policies would be reduced and more people may be willing to read them.&lt;a href="#_ftn76" name="_ftnref76"&gt;[76]&lt;/a&gt; Apps can also adopt short-form notices that summarize and link to the larger, more complete notice displayed elsewhere. These short-form notices need not be legally binding, but must indicate that they do not cover all types of data collection, only the most relevant ones.&lt;a href="#_ftn77" name="_ftnref77"&gt;[77]&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;Content&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;In an attempt to gain permission, most privacy policies inform users about: (1) the type of information collected; and (2) the purpose for collecting that information.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Standard privacy notices generally cover the points of:&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;&lt;b&gt;Methods Of Collection And Usage Of Personal Information&lt;/b&gt;&lt;/li&gt;
&lt;li&gt;&lt;b&gt;The Cookie Policy&lt;/b&gt;&lt;/li&gt;
&lt;li&gt;&lt;b&gt;Sharing Of Customer Information&lt;/b&gt;&lt;a href="#_ftn78" name="_ftnref78"&gt;&lt;b&gt;[78]&lt;/b&gt;&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p style="text-align: justify; "&gt;Certified Information Privacy Professionals divide notices into the following sequential sections&lt;a href="#_ftn79" name="_ftnref79"&gt;[79]&lt;/a&gt;:&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;i. &lt;b&gt;Policy Identification Details:&lt;/b&gt; Defines the policy name, version and description.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;ii. &lt;b&gt;P3P-Based Components: &lt;/b&gt;Defines policy attributes that would apply if the policy is exported to a P3P format.	&lt;a href="#_ftn80" name="_ftnref80"&gt;[80]&lt;/a&gt; Such attributes would include: policy URLs, organization information, P&lt;span&gt;II&lt;/span&gt; access and dispute 	resolution procedures.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;iii. &lt;b&gt;Policy Statements and Related Elements: Groups, Purposes and PII Types&lt;/b&gt; - Policy statements define the individuals able to access certain types of information, for certain pre-defined purposes.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Problems&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;Applications tend to define the type of data broadly in an attempt to strike a balance between providing enough information so that the application may gain consent to access a user's data and being broad enough to avoid ruling out specific information.&lt;a href="#_ftn81" name="_ftnref81"&gt;[81]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This leads to usage of vague terms like "information collected &lt;i&gt;may &lt;/i&gt;include."&lt;a href="#_ftn82" name="_ftnref82"&gt;[82]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Similarly, the purpose of the data acquisition is also very broad. For example, a privacy policy may state that user data can be collected for anything related to "improving the content of the Service." As the scope of "improving the content of the Service" is never defined, any usage could conceivably fall within that category.&lt;a href="#_ftn83" name="_ftnref83"&gt;[83]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Several apps create social profiles of users based on their online preferences to promote targeted marketing, which is cleverly concealed in phrases like "we may also draw upon this Personal Information in order to adapt the Services of our community to your needs".&lt;a href="#_ftn84" name="_ftnref84"&gt;[84]&lt;/a&gt; For instance, Bees &amp;amp; Pollen is a "predictive personalization" platform for games and apps that "uses advanced predictive algorithms to detect complex, non-trivial correlations between conversion patterns and users' DNA signatures, thus enabling it to automatically serve each user a personalized best-fit game options, in real-time." In reality it analyses over 100 user attributes, including activity on Facebook, spending behaviours, marital status, and location.&lt;a href="#_ftn85" name="_ftnref85"&gt;[85]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Notices also often mislead consumers into believing that their information will not be shared with third parties using the terms "unaffiliated third 	parties." Other affiliated companies within the corporate structure of the service provider may have access to user's data for marketing and other 	purposes. &lt;a href="#_ftn86" name="_ftnref86"&gt;[86]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;There are very few choices to opt out of certain practices, such as sharing data for marketing purposes. Thus, users are effectively left with a take-it-or-leave-it choice - give up your privacy or go elsewhere.&lt;a href="#_ftn87" name="_ftnref87"&gt;[87]&lt;/a&gt; Users almost always grant consent if it is required to receive the service they want, which raises the question of whether this consent is meaningful&lt;a href="#_ftn88" name="_ftnref88"&gt;[88]&lt;/a&gt;.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Recommendations&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;The following recommendations have emerged:&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt; &lt;b&gt;Notice&lt;/b&gt; - Companies should provide consumers with clear, conspicuous notices that accurately describe their information practices. &lt;/li&gt;
&lt;/ul&gt;
&lt;ul style="text-align: justify; " type="disc"&gt;
&lt;li&gt; &lt;b&gt;Consumer Choice&lt;/b&gt; - Companies should provide consumers with the opportunity to decide (in the form of opting out) whether the company may disclose personal information to unaffiliated third parties. &lt;/li&gt;
&lt;li&gt; &lt;b&gt;Access and Correction&lt;/b&gt; - Companies should provide consumers with the opportunity to access and correct personal information collected about the consumer. &lt;/li&gt;
&lt;li&gt; &lt;b&gt;Security&lt;/b&gt; - Companies must adopt reasonable security measures in order to protect the privacy of personal information. Possible security measures include: 		administrative security, physical security and technical security. &lt;/li&gt;
&lt;li&gt; &lt;b&gt;Enforcement&lt;/b&gt; - Companies should have systems through which they can enforce the privacy policy. This may be managed by the company, or an independent third party to ensure compliance. Examples of popular third parties include &lt;a href="https://www.cippguide.org/tag/bbbonline/"&gt;BBBOnLine&lt;/a&gt; and		&lt;a href="https://www.cippguide.org/tag/truste/"&gt;TRUSTe&lt;/a&gt;.&lt;a href="#_ftn89" name="_ftnref89"&gt;[89]&lt;/a&gt; &lt;/li&gt;
&lt;li&gt; &lt;b&gt;Standardization&lt;/b&gt; - Several researchers and organizations have recommended a standardized privacy notice format that covers certain essential points.&lt;a href="#_ftn90" name="_ftnref90"&gt;[90]&lt;/a&gt; However, as displaying a privacy notice is in itself voluntary, it is unpredictable whether companies would willingly adopt a standardized model. Moreover, with the app market burgeoning with innovations, a standard format may not cover all emergent data practices. &lt;/li&gt;
&lt;/ul&gt;
&lt;h2 style="text-align: justify; "&gt;Comprehension&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;The FTC states that&lt;/b&gt; "the notice-and-choice model, as implemented, has led to long, incomprehensible privacy policies that consumers typically do not read, let alone understand. The question is not whether consumers should be given a say over unexpected uses of their data; rather, the question is how to provide simplified notice and choice"&lt;a href="#_ftn91" name="_ftnref91"&gt;[91]&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Notably, in a survey conducted by Zogby International, 93% of adults - and 81% of teens - indicated they would take more time to read terms and conditions 	for websites if they were written in clearer language.&lt;a href="#_ftn92" name="_ftnref92"&gt;[92]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Most privacy policies are in natural language format: companies explain their practices in prose. One noted disadvantage to current natural language 	policies is that companies can choose which information to present, which does not necessarily solve the problem of information asymmetry between companies and consumers. Further, companies use what have been termed "weasel words" - legalistic, ambiguous, or slanted phrases - to describe their practices	&lt;a href="#_ftn93" name="_ftnref93"&gt;[93]&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In a study by Aleecia M. McDonald and others&lt;a href="#_ftn94" name="_ftnref94"&gt;[94]&lt;/a&gt;, it was found that the accuracy of what users comprehend spans a wide range. An average of 91% of participants answered correctly when asked about cookies, 61% answered correctly about opt-out links, 60% understood when their email address would be "shared" with a third party, and only 46% answered correctly regarding telemarketing. Participants found questions harder when vague or complicated terms were substituted for a practice, such as describing telemarketing as "the information you provide may be used for marketing services." Overall accuracy was a mere 33%.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Problems&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;Natural language policies are often long and require college-level reading skills. Furthermore, there are no standards for which information is disclosed, 	no standard place to find particular information, and data practices are not described using consistent language. These policies are "long, complicated, 	and full of jargon and change frequently."&lt;a href="#_ftn95" name="_ftnref95"&gt;[95]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Kent Walker lists five problems that privacy notices typically suffer from:&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;a) overkill - long and repetitive text in small print,&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;b) irrelevance - describing situations of little concern to most consumers,&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;c) opacity - broad terms that reflect the truth that it is impossible to track and control all the information collected and stored,&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;d) non-comparability - the simplification required to achieve comparability compromises accuracy, and&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;e) inflexibility - failure to keep pace with new business models. &lt;a href="#_ftn96" name="_ftnref96"&gt;[96]&lt;/a&gt;&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Recommendations&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;Researchers advocate a more succinct and simpler standard for privacy notices,&lt;a name="_ftnref34"&gt;&lt;/a&gt;&lt;a href="#_ftn97" name="_ftnref97"&gt;[97]&lt;/a&gt; such as representing the information in the form of a table.&lt;a href="#_ftn98" name="_ftnref98"&gt;[98]&lt;/a&gt; However, studies show only an insignificant improvement in consumer understanding when privacy policies are represented in graphic formats like tables and labels.&lt;a href="#_ftn99" name="_ftnref99"&gt;[99]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;There are also recommendations to adopt a multi-layered approach where the relevant information is summarized in a short notice.&lt;a href="#_ftn100" name="_ftnref100"&gt;[100]&lt;/a&gt; This is backed by studies showing that consumers find layered policies easier to understand.&lt;a href="#_ftn101" name="_ftnref101"&gt;[101]&lt;/a&gt; However, they were less accurate with the layered format, especially with parts that were not summarized. This suggests that participants did not continue to the full policy when the information they sought was not available in the short notice. Unless it is possible to identify all of the topics users care about and summarize them in one page, the layered notice effectively hides information and reduces transparency. It has also been pointed out that it is impossible to convey complex data policies in simple and clear language.&lt;a href="#_ftn102" name="_ftnref102"&gt;[102]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Consumers often struggle to map concepts such as third-party access to the terms used in policies. This is partly because companies with identical practices often convey different information, and these differences are reflected in consumers' ability to understand the policies. Such policies may need an educational component so that readers understand what it means for a site to engage in a given practice.&lt;a href="#_ftn103" name="_ftnref103"&gt;[103]&lt;/a&gt; However, readers who do not take the time to read the policy itself are unlikely to consult additional educational material.&lt;/p&gt;
&lt;div style="text-align: justify; "&gt;
&lt;hr /&gt;
&lt;div id="ftn1"&gt;
&lt;p&gt;&lt;a href="#_ftnref1" name="_ftn1"&gt;[1]&lt;/a&gt; Amber Sinha, A Critique of Consent in Information Privacy, http://cis-india.org/internet-governance/blog/a-critique-of-consent-in-information-privacy&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn2"&gt;
&lt;p&gt;&lt;a href="#_ftnref2" name="_ftn2"&gt;[2]&lt;/a&gt; Wang, &lt;i&gt;et al.&lt;/i&gt; (1998); Milberg, &lt;i&gt;et al.&lt;/i&gt; (1995)&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn3"&gt;
&lt;p&gt;&lt;a href="#_ftnref3" name="_ftn3"&gt;[3]&lt;/a&gt; See e.g., White House, Consumer Privacy Bill of Rights (2012) 			http://www.whitehouse.gov/the-pressoffice/2012/02/23/we-can-t-wait-obama-administration-unveils-blueprint-privacy-bill-rights; Fed. Trade Comm'n, 			Protecting Consumer Privacy in an Era of Rapid Change: Recommendations for Business and Policy Makers (2012) 			http://www.ftc.gov/sites/default/files/documents/reports/federal-trade-commissionreport-protecting-consumer-privacy-era-rapid-change-recommendations/120326privacyreport.pdf.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn4"&gt;
&lt;p&gt;&lt;a href="#_ftnref4" name="_ftn4"&gt;[4]&lt;/a&gt; Fed. Trade Comm'n, Privacy Online: A Report to Congress 7 (June 1998), available at www.ftc.gov/reports/privacy3/priv-23a.pdf.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn5"&gt;
&lt;p&gt;&lt;a href="#_ftnref5" name="_ftn5"&gt;[5]&lt;/a&gt; &lt;a href="http://itlaw.wikia.com/wiki/U.S._Department_of_Commerce" title="U.S. Department of Commerce"&gt;U.S. Department of Commerce&lt;/a&gt; , &lt;a href="http://itlaw.wikia.com/wiki/Internet_Policy_Task_Force" title="Internet Policy Task Force"&gt;Internet Policy Task Force&lt;/a&gt;, 			&lt;a href="http://itlaw.wikia.com/wiki/Commercial_Data_Privacy_and_Innovation_in_the_Internet_Economy:_A_Dynamic_Policy_Framework" title="Commercial Data Privacy and Innovation in the Internet Economy: A Dynamic Policy Framework"&gt; Commercial Data Privacy and Innovation in the Internet Economy: A Dynamic Policy Framework &lt;/a&gt; 20 (Dec. 16, 2010) (&lt;a href="http://www.ntia.doc.gov/reports/2010/IPTF_Privacy_GreenPaper_12162010.pdf"&gt;full-text&lt;/a&gt;).&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn6"&gt;
&lt;p&gt;&lt;a href="#_ftnref6" name="_ftn6"&gt;[6]&lt;/a&gt; 389 U.S. 347 (1967).&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn7"&gt;
&lt;p&gt;&lt;a href="#_ftnref7" name="_ftn7"&gt;[7]&lt;/a&gt; Dow Chem. Co. v. United States, 476 U.S. 227, 241 (1986)&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn8"&gt;
&lt;p&gt;&lt;a href="#_ftnref8" name="_ftn8"&gt;[8]&lt;/a&gt; http://ir.lawnet.fordham.edu/cgi/viewcontent.cgi?article=1600&amp;amp;context=iplj&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn9"&gt;
&lt;p&gt;&lt;a href="#_ftnref9" name="_ftn9"&gt;[9]&lt;/a&gt; Dow Chem. Co. v. United States, 476 U.S. 227, 241 (1986)&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn10"&gt;
&lt;p&gt;&lt;a href="#_ftnref10" name="_ftn10"&gt;[10]&lt;/a&gt; Kyllo, 533 U.S. at 34 ("[T]he technology enabling human flight has exposed to public view (and hence, we have said, to official observation) uncovered portions of the house and its curtilage that once were private.").&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn11"&gt;
&lt;p&gt;&lt;a href="#_ftnref11" name="_ftn11"&gt;[11]&lt;/a&gt; Kyllo v. United States, 533 U.S. 27&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn12"&gt;
&lt;p&gt;&lt;a href="#_ftnref12" name="_ftn12"&gt;[12]&lt;/a&gt; See Katz, 389 U.S. at 352 ("But what he sought to exclude when he entered the booth was not the intruding eye-it was the uninvited ear. He did not shed his right to do so simply because he made his calls from a place where he might be seen.").&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn13"&gt;
&lt;p&gt;&lt;a href="#_ftnref13" name="_ftn13"&gt;[13]&lt;/a&gt; See United States v. Ahrndt, No. 08-468-KI, 2010 WL 3773994, at *4 (D. Or. Jan. 8, 2010).&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn14"&gt;
&lt;p&gt;&lt;a href="#_ftnref14" name="_ftn14"&gt;[14]&lt;/a&gt; In re DoubleClick Inc. Privacy Litig., 154 F. Supp. 2d 497 (S.D.N.Y. 2001).&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn15"&gt;
&lt;p&gt;&lt;a href="#_ftnref15" name="_ftn15"&gt;[15]&lt;/a&gt; http://ir.lawnet.fordham.edu/cgi/viewcontent.cgi?article=1600&amp;amp;context=iplj&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn16"&gt;
&lt;p&gt;&lt;a href="#_ftnref16" name="_ftn16"&gt;[16]&lt;/a&gt; See Michael A. Carrier, Against Cyberproperty, 22 BERKELEY TECH. L.J. 1485, 1486 (2007) (arguing against creating a right to exclude users from 			making electronic contact to their network as one that exceeds traditional property notions).&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn17"&gt;
&lt;p&gt;&lt;a href="#_ftnref17" name="_ftn17"&gt;[17]&lt;/a&gt; See M. Ryan Calo, Against Notice Skepticism in Privacy (and Elsewhere), 87 NOTRE DAME L. REV. 1027, 1049 (2012) (citing Paula J. Dalley, The Use 			and Misuse of Disclosure as a Regulatory System, 34 FLA. ST. U. L. REV. 1089, 1093 (2007) ("[D]isclosure schemes comport with the prevailing 			political philosophy in that disclosure preserves individual choice while avoiding direct governmental interference.")).&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn18"&gt;
&lt;p&gt;&lt;a href="#_ftnref18" name="_ftn18"&gt;[18]&lt;/a&gt; See Calo, supra note 10, at 1048; see also Omri Ben-Shahar &amp;amp; Carl E. Schneider, The Failure of Mandated Disclosure, 159 U. PA. L. REV. 647, 682 			(noting that notice "looks cheap" and "looks easy").&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn19"&gt;
&lt;p&gt;&lt;a href="#_ftnref19" name="_ftn19"&gt;[19]&lt;/a&gt; Mark MacCarthy, New Directions in Privacy: Disclosure, Unfairness and Externalities, 6 I/S J. L. &amp;amp; POL'Y FOR INFO. SOC'Y 425, 440 (2011) 			(citing M. Ryan Calo, A Hybrid Conception of Privacy Harm Draft-Privacy Law Scholars Conference 2010, p. 28).&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn20"&gt;
&lt;p&gt;&lt;a href="#_ftnref20" name="_ftn20"&gt;[20]&lt;/a&gt; Daniel J. Solove, Introduction: Privacy Self-Management and the Consent Dilemma, 126 HARV. L. REV. 1879, 1885 (2013) (citing Jon Leibowitz, Fed. Trade Comm'n, So Private, So Public: Individuals, the Internet &amp;amp; the Paradox of Behavioral Marketing, Remarks at the FTC Town Hall Meeting on Behavioral Advertising: Tracking, Targeting, &amp;amp; Technology (Nov. 1, 2007), available at http://www.ftc.gov/speeches/leibowitz/071031ehavior/pdf). Paul Ohm refers to these issues as "information-quality problems." See Paul Ohm, Branding Privacy, 97 MINN. L. REV. 907, 930 (2013). Daniel J. Solove refers to this as "the problem of the uninformed individual." See Solove, supra note 17.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn21"&gt;
&lt;p&gt;&lt;a href="#_ftnref21" name="_ftn21"&gt;[21]&lt;/a&gt; See Edward J. Janger &amp;amp; Paul M. Schwartz, The Gramm-Leach-Bliley Act, Information Privacy, and the Limits of Default Rules, 86 MINN. L. REV. 			1219, 1230 (2002) (stating that according to one survey, "only 0.5% of banking customers had exercised their opt-out rights").&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn22"&gt;
&lt;p&gt;&lt;a href="#_ftnref22" name="_ftn22"&gt;[22]&lt;/a&gt; See Amber Sinha A Critique of Consent in Information Privacy 			http://cis-india.org/internet-governance/blog/a-critique-of-consent-in-information-privacy&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn23"&gt;
&lt;p&gt;&lt;a href="#_ftnref23" name="_ftn23"&gt;[23]&lt;/a&gt; Leigh Shevchik, "Mobile App Industry to Reach Record Revenue in 2013," New Relic (blog), April 1, 2013, 			http://blog.newrelic.com/2013/04/01/mobile-apps-industry-to-reach-record-revenue-in-2013/.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn24"&gt;
&lt;p&gt;&lt;a href="#_ftnref24" name="_ftn24"&gt;[24]&lt;/a&gt; Jan Lauren Boyles, Aaron Smith, and Mary Madden, "Privacy and Data Management on Mobile Devices," Pew Internet &amp;amp; American Life Project, 			Washington, DC, September 5, 2012.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn25"&gt;
&lt;p&gt;&lt;a href="#_ftnref25" name="_ftn25"&gt;[25]&lt;/a&gt; http://www.aarp.org/content/dam/aarp/research/public_policy_institute/cons_prot/2014/improving-mobile-device-privacy-disclosures-AARP-ppi-cons-prot.pdf&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn26"&gt;
&lt;p&gt;&lt;a href="#_ftnref26" name="_ftn26"&gt;[26]&lt;/a&gt; "Mobile Apps for Kids: Disclosures Still Not Making the Grade," Federal Trade Commission, Washington, DC, December 2012&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn27"&gt;
&lt;p&gt;&lt;a href="#_ftnref27" name="_ftn27"&gt;[27]&lt;/a&gt; http://www.aarp.org/content/dam/aarp/research/public_policy_institute/cons_prot/2014/improving-mobile-device-privacy-disclosures-AARP-ppi-cons-prot.pdf&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn28"&gt;
&lt;p&gt;&lt;a href="#_ftnref28" name="_ftn28"&gt;[28]&lt;/a&gt; Linda Ackerman, "Mobile Health and Fitness Applications and Information Privacy," Privacy Rights Clearinghouse, San Diego, CA, July 15, 2013.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn29"&gt;
&lt;p&gt;&lt;a href="#_ftnref29" name="_ftn29"&gt;[29]&lt;/a&gt; Margaret Jane Radin, Humans, Computers, and Binding Commitment, 75 IND. L.J. 1125, 1126 (1999). &lt;a href="http://www.repository.law.indiana.edu/cgi/viewcontent.cgi?article=2199&amp;amp;context=ilj"&gt;http://www.repository.law.indiana.edu/cgi/viewcontent.cgi?article=2199&amp;amp;context=ilj&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn30"&gt;
&lt;p&gt;&lt;a href="#_ftnref30" name="_ftn30"&gt;[30]&lt;/a&gt; William Aiello, Steven M. Bellovin, Matt Blaze, Ran Canetti, John Ioannidis, Angelos D. Keromytis, and Omer Reingold. Just fast keying: Key 			agreement in a hostile internet. ACM Trans. Inf. Syst. Secur., 7(2):242-273, 2004.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn31"&gt;
&lt;p&gt;&lt;a href="#_ftnref31" name="_ftn31"&gt;[31]&lt;/a&gt; Anne Cavoukian, Privacy by Design: The 7 Foundational Principles, https://www.ipc.on.ca/images/resources/7foundationalprinciples.pdf&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn32"&gt;
&lt;p&gt;&lt;a href="#_ftnref32" name="_ftn32"&gt;[32]&lt;/a&gt; G. Danezis, J. Domingo-Ferrer, M. Hansen, J.-H. Hoepman, D. Le Métayer, R. Tirtea, and S. Schiffner. Privacy and Data Protection by Design - from policy to engineering. Report, ENISA, Dec. 2014.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn33"&gt;
&lt;p&gt;&lt;a href="#_ftnref33" name="_ftn33"&gt;[33]&lt;/a&gt; G. Danezis, J. Domingo-Ferrer, M. Hansen, J.-H. Hoepman, D. Le Métayer, R. Tirtea, and S. Schiffner. Privacy and Data Protection by Design - from policy to engineering. Report, ENISA, Dec. 2014.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn34"&gt;
&lt;p&gt;&lt;a href="#_ftnref34" name="_ftn34"&gt;[34]&lt;/a&gt; G. Danezis, J. Domingo-Ferrer, M. Hansen, J.-H. Hoepman, D. Le Métayer, R. Tirtea, and S. Schiffner. Privacy and Data Protection by Design - from policy to engineering. Report, ENISA, Dec. 2014.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn35"&gt;
&lt;p&gt;&lt;a href="#_ftnref35" name="_ftn35"&gt;[35]&lt;/a&gt; John Frank Weaver, We Need to Pass Legislation on Artificial Intelligence Early and Often, SLATE FUTURE TENSE (Sept. 12, 2014), http://www.slate.com/blogs/future_tense/2014/09/12/we_need_to_pass_artificial_intelligence_laws_early_and_often.html&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn36"&gt;
&lt;p&gt;&lt;a href="#_ftnref36" name="_ftn36"&gt;[36]&lt;/a&gt; Margaret Jane Radin, Humans, Computers, and Binding Commitment, 75 IND. L.J. 1125, 1126 (1999).&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn37"&gt;
&lt;p&gt;&lt;a href="#_ftnref37" name="_ftn37"&gt;[37]&lt;/a&gt; Richard Warner &amp;amp; Robert Sloan, Beyond Notice and Choice: Privacy, Norms, and Consent, J. High Tech. L. (2013). Available at: http://scholarship.kentlaw.iit.edu/fac_schol/568&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn38"&gt;
&lt;p&gt;&lt;a href="#_ftnref38" name="_ftn38"&gt;[38]&lt;/a&gt; Sarah Spiekermann &amp;amp; Lorrie Faith Cranor, Engineering Privacy, &lt;a href="http://ssrn.com/abstract=1085333"&gt;http://ssrn.com/abstract=1085333&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn39"&gt;
&lt;p&gt;&lt;a href="#_ftnref39" name="_ftn39"&gt;[39]&lt;/a&gt; iOS Application Programming Guide: The Application Runtime Environment, APPLE, http://developer.apple.com/library/ios/#documentation/iphone/conceptual/iphoneosprogrammingguide/RuntimeEnvironment/RuntimeEnvironment.html (last updated Feb. 24, 2011)&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn40"&gt;
&lt;p&gt;&lt;a href="#_ftnref40" name="_ftn40"&gt;[40]&lt;/a&gt; Security and Permissions, ANDROID DEVELOPERS, http://developer.android.com/guide/topics/security/security.html (last updated Sept. 13, 2011).&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn41"&gt;
&lt;p&gt;&lt;a href="#_ftnref41" name="_ftn41"&gt;[41]&lt;/a&gt; iOS Application Programming Guide: The Application Runtime Environment, APPLE, http://developer.apple.com/library/ios/#documentation/iphone/conceptual/iphoneosprogrammingguide/RuntimeEnvironment/RuntimeEnvironment.html (last updated Feb. 24, 2011)&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn42"&gt;
&lt;p&gt;&lt;a href="#_ftnref42" name="_ftn42"&gt;[42]&lt;/a&gt; See Katherine Noyes, Why Android App Security is Better Than for the iPhone, PC WORLD BUS. CTR. (Aug. 6, 2010, 4:20 PM), http://www.pcworld.com/businesscenter/article/202758/why_android_app_security_is_better_than_for_the_iphone.html; see also About Permissions for Third-Party Applications, BLACKBERRY, http://docs.blackberry.com/en/smartphone_users/deliverables/22178/About_permissions_for_third-party_apps_50_778147_11.jsp (last visited Sept. 29, 2011); Security and Permissions, supra note 76.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn43"&gt;
&lt;p&gt;&lt;a href="#_ftnref43" name="_ftn43"&gt;[43]&lt;/a&gt; Peter S. Vogel, A Worrisome Truth: Internet Privacy is Impossible, TECHNEWSWORLD (June 8, 2011, 5:00 AM), http://www.technewsworld.com/story/72610.html.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn44"&gt;
&lt;p&gt;&lt;a href="#_ftnref44" name="_ftn44"&gt;[44]&lt;/a&gt; Privacy Policy, FOURSQUARE, http://foursquare.com/legal/privacy (last updated Jan. 12, 2011)&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn45"&gt;
&lt;p&gt;&lt;a href="#_ftnref45" name="_ftn45"&gt;[45]&lt;/a&gt; N. S. Good, J. Grossklags, D. K. Mulligan, and J. A. Konstan. Noticing Notice: A Large-scale Experiment on the Timing of Software License 			Agreements. In Proc. of CHI. ACM, 2007.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn46"&gt;
&lt;p&gt;&lt;a href="#_ftnref46" name="_ftn46"&gt;[46]&lt;/a&gt; I. Adjerid, A. Acquisti, L. Brandimarte, and G. Loewenstein. Sleights of Privacy: Framing, Disclosures, and the Limits of Transparency. In Proc. of 			SOUPS. ACM, 2013.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn47"&gt;
&lt;p&gt;&lt;a href="#_ftnref47" name="_ftn47"&gt;[47]&lt;/a&gt; http://delivery.acm.org/10.1145/2810000/2808119/p63-balebako.pdf?ip=106.51.36.200&amp;amp;id=2808119&amp;amp;acc=OA&amp;amp;key=4D4702B0C3E38B35%2E4D4702B0C3E38B35%2E4D4702B0C3E38B35%2E35B5BCE80D07AAD9&amp;amp;CFID=801296199&amp;amp;CFTOKEN=33661544&amp;amp;__acm__=1466052980_2f265a2442ea3394aa1ebab7e6449933&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn48"&gt;
&lt;p&gt;&lt;a href="#_ftnref48" name="_ftn48"&gt;[48]&lt;/a&gt; Microsoft. Privacy Guidelines for Developing Software Products and Services. Technical Report version 3.1, 2008.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn49"&gt;
&lt;p&gt;&lt;a href="#_ftnref49" name="_ftn49"&gt;[49]&lt;/a&gt; Microsoft. Privacy Guidelines for Developing Software Products and Services. Technical Report version 3.1, 2008.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn50"&gt;
&lt;p&gt;&lt;a href="#_ftnref50" name="_ftn50"&gt;[50]&lt;/a&gt; S. Egelman, J. Tsai, L. F. Cranor, and A. Acquisti. Timing is everything?: the effects of timing and placement of online privacy indicators. In 			Proc. CHI '09. ACM, 2009.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn51"&gt;
&lt;p&gt;&lt;a href="#_ftnref51" name="_ftn51"&gt;[51]&lt;/a&gt; R. Böhme and S. Köpsell. Trained to accept?: A field experiment on consent dialogs. In Proc. CHI '10. ACM, 2010.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn52"&gt;
&lt;p&gt;&lt;a href="#_ftnref52" name="_ftn52"&gt;[52]&lt;/a&gt; N. S. Good, J. Grossklags, D. K. Mulligan, and J. A. Konstan. Noticing notice: a large-scale experiment on the timing of software license 			agreements. In Proc. CHI '07. ACM, 2007.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn53"&gt;
&lt;p&gt;&lt;a href="#_ftnref53" name="_ftn53"&gt;[53]&lt;/a&gt; N. S. Good, J. Grossklags, D. K. Mulligan, and J. A. Konstan. Noticing notice: a large-scale experiment on the timing of software license 			agreements. In Proc. CHI '07. ACM, 2007.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn54"&gt;
&lt;p&gt;&lt;a href="#_ftnref54" name="_ftn54"&gt;[54]&lt;/a&gt; Microsoft. Privacy Guidelines for Developing Software Products and Services. Technical Report version 3.1, 2008.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn55"&gt;
&lt;p&gt;&lt;a href="#_ftnref55" name="_ftn55"&gt;[55]&lt;/a&gt; A. Kobsa and M. Teltzrow. Contextualized communication of privacy practices and personalization benefits: Impacts on users' data sharing and 			purchase behavior. In Proc. PETS '05. Springer, 2005.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn56"&gt;
&lt;p&gt;&lt;a href="#_ftnref56" name="_ftn56"&gt;[56]&lt;/a&gt; F. Schaub, B. Könings, and M. Weber. Context-adaptive privacy: Leveraging context awareness to support privacy decision making. IEEE Pervasive Computing, 14(1):34-43, 2015.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn57"&gt;
&lt;p&gt;&lt;a href="#_ftnref57" name="_ftn57"&gt;[57]&lt;/a&gt; E. Choe, J. Jung, B. Lee, and K. Fisher. Nudging people away from privacy-invasive mobile apps through visual framing. In Proc. INTERACT '13. 			Springer, 2013.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn58"&gt;
&lt;p&gt;&lt;a href="#_ftnref58" name="_ftn58"&gt;[58]&lt;/a&gt; F. Schaub, B. Könings, and M. Weber. Context-adaptive privacy: Leveraging context awareness to support privacy decision making. IEEE Pervasive Computing, 14(1):34-43, 2015.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn59"&gt;
&lt;p&gt;&lt;a href="#_ftnref59" name="_ftn59"&gt;[59]&lt;/a&gt; Article 29 Data Protection Working Party. Opinion 8/2014 on the Recent Developments on the Internet of Things. WP 223, Sept. 2014.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn60"&gt;
&lt;p&gt;&lt;a href="#_ftnref60" name="_ftn60"&gt;[60]&lt;/a&gt; B. Anderson, A. Vance, B. Kirwan, D. Eargle, and S. Howard. Users aren't (necessarily) lazy: Using NeuroIS to explain habituation to security warnings. In Proc. ICIS '14, 2014.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn61"&gt;
&lt;p&gt;&lt;a href="#_ftnref61" name="_ftn61"&gt;[61]&lt;/a&gt; B. Anderson, B. Kirwan, D. Eargle, S. Howard, and A. Vance. How polymorphic warnings reduce habituation in the brain - insights from an fMRI study. 			In Proc. CHI '15. ACM, 2015.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn62"&gt;
&lt;p&gt;&lt;a href="#_ftnref62" name="_ftn62"&gt;[62]&lt;/a&gt; M. S. Wogalter, V. C. Conzola, and T. L. Smith-Jackson. Research-based guidelines for warning design and evaluation. Applied Ergonomics, 33(3):219-230, 2002.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn63"&gt;
&lt;p&gt;&lt;a href="#_ftnref63" name="_ftn63"&gt;[63]&lt;/a&gt; L. F. Cranor, P. Guduru, and M. Arjula. User interfaces for privacy agents. ACM TOCHI, 13(2):135-178, 2006.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn64"&gt;
&lt;p&gt;&lt;a href="#_ftnref64" name="_ftn64"&gt;[64]&lt;/a&gt; R. S. Portnoff, L. N. Lee, S. Egelman, P. Mishra, D. Leung, and D. Wagner. Somebody's watching me? assessing the effectiveness of webcam indicator 			lights. In Proc. CHI '15, 2015&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn65"&gt;
&lt;p&gt;&lt;a href="#_ftnref65" name="_ftn65"&gt;[65]&lt;/a&gt; M. Langheinrich. Privacy by design - principles of privacy-aware ubiquitous systems. In Proc. UbiComp '01. Springer, 2001&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn66"&gt;
&lt;p&gt;&lt;a href="#_ftnref66" name="_ftn66"&gt;[66]&lt;/a&gt; Microsoft. Privacy Guidelines for Developing Software Products and Services. Technical Report version 3.1, 2008.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn67"&gt;
&lt;p&gt;&lt;a href="#_ftnref67" name="_ftn67"&gt;[67]&lt;/a&gt; Rebecca Balebako, Florian Schaub, Idris Adjerid, Alessandro Acquisti, and Lorrie Faith Cranor. The Impact of Timing on the Salience of Smartphone App Privacy Notices.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn68"&gt;
&lt;p&gt;&lt;a href="#_ftnref68" name="_ftn68"&gt;[68]&lt;/a&gt; R. Böhme and J. Grossklags. The Security Cost of Cheap User Interaction. In Workshop on New Security Paradigms, pages 67-82. ACM, 2011&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn69"&gt;
&lt;p&gt;&lt;a href="#_ftnref69" name="_ftn69"&gt;[69]&lt;/a&gt; A. Felt, S. Egelman, M. Finifter, D. Akhawe, and D. Wagner. How to Ask For Permission. HOTSEC 2012, 2012.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn70"&gt;
&lt;p&gt;&lt;a href="#_ftnref70" name="_ftn70"&gt;[70]&lt;/a&gt; A. Felt, S. Egelman, M. Finifter, D. Akhawe, and D. Wagner. How to Ask For Permission. HOTSEC 2012, 2012.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn71"&gt;
&lt;p&gt;&lt;a href="#_ftnref71" name="_ftn71"&gt;[71]&lt;/a&gt; Florian Schaub, Bastian Könings, Michael Weber, and Frank Kargl. Towards Context Adaptive Privacy Decisions in Ubiquitous Computing. Institute of Media Informatics, Ulm University, Germany.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn72"&gt;
&lt;p&gt;&lt;a href="#_ftnref72" name="_ftn72"&gt;[72]&lt;/a&gt; M. Korzaan and N. Brooks, "Demystifying Personality and Privacy: An Empirical Investigation into Antecedents of Concerns for Information Privacy," 			Journal of Behavioral Studies in Business, pp. 1-17, 2009.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn73"&gt;
&lt;p&gt;&lt;a href="#_ftnref73" name="_ftn73"&gt;[73]&lt;/a&gt; B. Könings and F. Schaub, "Territorial Privacy in Ubiquitous Computing," in WONS'11. IEEE, 2011, pp. 104-108.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn74"&gt;
&lt;p&gt;&lt;a href="#_ftnref74" name="_ftn74"&gt;[74]&lt;/a&gt; Aleecia M. McDonald and Lorrie Faith Cranor. The Cost of Reading Privacy Policies.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn75"&gt;
&lt;p&gt;&lt;a href="#_ftnref75" name="_ftn75"&gt;[75]&lt;/a&gt; Federal Trade Commission, "Protecting Consumers in the Next Tech-ade: A Report by the Staff of the Federal Trade Commission," March 2008, 11, http://www.ftc.gov/os/2008/03/P064101tech.pdf.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn76"&gt;
&lt;p&gt;&lt;a href="#_ftnref76" name="_ftn76"&gt;[76]&lt;/a&gt; Aleecia M. McDonald and Lorrie Faith Cranor. The Cost of Reading Privacy Policies. I/S: A Journal of Law and Policy for the Information Society, 2008 Privacy Year in Review issue, http://www.is-journal.org/&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn77"&gt;
&lt;p&gt;&lt;a href="#_ftnref77" name="_ftn77"&gt;[77]&lt;/a&gt; Rebecca Balebako, Richard Shay, and Lorrie Faith Cranor. Is Your Inseam Your Biometric? Evaluating the Understandability of Mobile Privacy Notice Categories. July 17, 2013, https://www.cylab.cmu.edu/files/pdfs/tech_reports/CMUCyLab13011.pdf&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn78"&gt;
&lt;p&gt;&lt;a href="#_ftnref78" name="_ftn78"&gt;[78]&lt;/a&gt; https://www.sba.gov/blogs/7-considerations-crafting-online-privacy-policy&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn79"&gt;
&lt;p&gt;&lt;a href="#_ftnref79" name="_ftn79"&gt;[79]&lt;/a&gt; https://www.cippguide.org&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn80"&gt;
&lt;p&gt;&lt;a href="#_ftnref80" name="_ftn80"&gt;[80]&lt;/a&gt; The Platform for Privacy Preferences Project, more commonly known as P3P, was designed by the World Wide Web Consortium (W3C) in response to the increased use of the Internet for sales transactions and the subsequent collection of personal information. P3P is a protocol that makes a website's policies machine readable, granting web users greater control over the use and disclosure of their information while browsing the Internet.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn81"&gt;
&lt;p&gt;&lt;a href="#_ftnref81" name="_ftn81"&gt;[81]&lt;/a&gt; Security and Permissions, ANDROID DEVELOPERS, http://developer.android.com/guide/topics/security/security.html (last updated Sept. 13, 2011).&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn82"&gt;
&lt;p&gt;&lt;a href="#_ftnref82" name="_ftn82"&gt;[82]&lt;/a&gt; See Foursquare Privacy Policy&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn83"&gt;
&lt;p&gt;&lt;a href="#_ftnref83" name="_ftn83"&gt;[83]&lt;/a&gt; http://ir.lawnet.fordham.edu/cgi/viewcontent.cgi?article=1600&amp;amp;context=iplj&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn84"&gt;
&lt;p&gt;&lt;a href="#_ftnref84" name="_ftn84"&gt;[84]&lt;/a&gt; Privacy Policy, FOURSQUARE, http://foursquare.com/legal/privacy (last updated Jan. 12, 2011)&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn85"&gt;
&lt;p&gt;&lt;a href="#_ftnref85" name="_ftn85"&gt;[85]&lt;/a&gt; Bees and Pollen, "Bees and Pollen Personalization Platform," http://www.beesandpollen.com/TheProduct.aspx; Bees and Pollen, "Sense6-Social Casino Games Personalization Solution," http://www.beesandpollen.com/sense6.aspx; Bees and Pollen, "About Us," http://www.beesandpollen.com/About.aspx.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn86"&gt;
&lt;p&gt;&lt;a href="#_ftnref86" name="_ftn86"&gt;[86]&lt;/a&gt; CFA on the NTIA Short Form Notice Code of Conduct to Promote Transparency in Mobile Applications, July 26, 2013, Press Release&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn87"&gt;
&lt;p&gt;&lt;a href="#_ftnref87" name="_ftn87"&gt;[87]&lt;/a&gt; P. M. Schwartz and D. Solove. Notice &amp;amp; Choice. In The Second NPLAN/BMSG Meeting on Digital Media and Marketing to Children, 2009.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn88"&gt;
&lt;p&gt;&lt;a href="#_ftnref88" name="_ftn88"&gt;[88]&lt;/a&gt; F. Cate. The Limits of Notice and Choice. IEEE Security Privacy, 8(2):59-62, Mar. 2010.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn89"&gt;
&lt;p&gt;&lt;a href="#_ftnref89" name="_ftn89"&gt;[89]&lt;/a&gt; https://www.cippguide.org/2011/08/09/components-of-a-privacy-policy/&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn90"&gt;
&lt;p&gt;&lt;a href="#_ftnref90" name="_ftn90"&gt;[90]&lt;/a&gt; https://www.ftc.gov/public-statements/2001/07/case-standardization-privacy-policy-formats&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn91"&gt;
&lt;p&gt;&lt;a href="#_ftnref91" name="_ftn91"&gt;[91]&lt;/a&gt; Protecting Consumer Privacy in an Era of Rapid Change. Preliminary FTC Staff Report, December 2010.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn92"&gt;
&lt;p&gt;&lt;a href="#_ftnref92" name="_ftn92"&gt;[92]&lt;/a&gt; See Comment of Common Sense Media, cmt. #00457, at 1.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn93"&gt;
&lt;p&gt;&lt;a href="#_ftnref93" name="_ftn93"&gt;[93]&lt;/a&gt; Pollach, I. What's wrong with online privacy policies? Communications of the ACM 50, 9 (September 2007), 103-108&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn94"&gt;
&lt;p&gt;&lt;a href="#_ftnref94" name="_ftn94"&gt;[94]&lt;/a&gt; Aleecia M. McDonald, Robert W. Reeder, Patrick Gage Kelley, and Lorrie Faith Cranor. A Comparative Study of Online Privacy Policies and Formats. Carnegie Mellon, Pittsburgh, PA; Microsoft, Redmond, WA. http://lorrie.cranor.org/pubs/authors-version-PETS-formats.pdf&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn95"&gt;
&lt;p&gt;&lt;a href="#_ftnref95" name="_ftn95"&gt;[95]&lt;/a&gt; Amber Sinha, A Critique of Consent in Information Privacy, http://cis-india.org/internet-governance/blog/a-critique-of-consent-in-information-privacy&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn96"&gt;
&lt;p&gt;&lt;a href="#_ftnref96" name="_ftn96"&gt;[96]&lt;/a&gt; Kent Walker, The Costs of Privacy, 2001, available at &lt;a href="https://www.questia.com/library/journal/1G1-84436409/the-costs-of-privacy"&gt;https://www.questia.com/library/journal/1G1-84436409/the-costs-of-privacy&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn97"&gt;
&lt;p&gt;&lt;a href="#_ftnref97" name="_ftn97"&gt;[97]&lt;/a&gt; Annie I. Anton et al., Financial Privacy Policies and the Need for Standardization, 2004 available at			&lt;a href="https://ssl.lu.usi.ch/entityws/Allegati/pdf_pub1430.pdf"&gt;https://ssl.lu.usi.ch/entityws/Allegati/pdf_pub1430.pdf&lt;/a&gt;; Florian Schaub, R. 			Balebako et al, "A Design Space for effective privacy notices" available at 			&lt;a href="https://www.usenix.org/system/files/conference/soups2015/soups15-paper-schaub.pdf"&gt; https://www.usenix.org/system/files/conference/soups2015/soups15-paper-schaub.pdf &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn98"&gt;
&lt;p&gt;&lt;a href="#_ftnref98" name="_ftn98"&gt;[98]&lt;/a&gt; Allen Levy and Manoj Hastak, Consumer Comprehension of Financial Privacy Notices, Interagency Notice Project, available at			&lt;a href="https://www.sec.gov/comments/s7-09-07/s70907-21-levy.pdf"&gt;https://www.sec.gov/comments/s7-09-07/s70907-21-levy.pdf&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn99"&gt;
&lt;p&gt;&lt;a href="#_ftnref99" name="_ftn99"&gt;[99]&lt;/a&gt; Patrick Gage Kelley et al., Standardizing Privacy Notices: An Online Study of the Nutrition Label Approach, available at &lt;a href="https://www.ftc.gov/sites/default/files/documents/public_comments/privacy-roundtables-comment-project-no.p095416-544506-00037/544506-00037.pdf"&gt;https://www.ftc.gov/sites/default/files/documents/public_comments/privacy-roundtables-comment-project-no.p095416-544506-00037/544506-00037.pdf&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn100"&gt;
&lt;p&gt;&lt;a href="#_ftnref100" name="_ftn100"&gt;[100]&lt;/a&gt; The Center for Information Policy Leadership, Hunton &amp;amp; Williams LLP, "Ten Steps To Develop A Multi-Layered Privacy Notice" available at 			&lt;a href="https://www.informationpolicycentre.com/files/Uploads/Documents/Centre/Ten_Steps_whitepaper.pdf"&gt; https://www.informationpolicycentre.com/files/Uploads/Documents/Centre/Ten_Steps_whitepaper.pdf &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn101"&gt;
&lt;p&gt;&lt;a href="#_ftnref101" name="_ftn101"&gt;[101]&lt;/a&gt; Aleecia M. McDonald, Robert W. Reeder, Patrick Gage Kelley, and Lorrie Faith Cranor. A Comparative Study of Online Privacy Policies and Formats. Carnegie Mellon, Pittsburgh, PA; Microsoft, Redmond, WA.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn102"&gt;
&lt;p&gt;&lt;a href="#_ftnref102" name="_ftn102"&gt;[102]&lt;/a&gt; Howard Latin, "Good" Warnings, Bad Products, and Cognitive Limitations, 41 UCLA Law Review available at 			&lt;a href="https://litigation-essentials.lexisnexis.com/webcd/app?action=DocumentDisplay&amp;amp;crawlid=1&amp;amp;srctype=smi&amp;amp;srcid=3B15&amp;amp;doctype=cite&amp;amp;docid=41+UCLA+L.+Rev.+1193&amp;amp;key=1c15e064a97759f3f03fb51db62a79a5"&gt; https://litigation-essentials.lexisnexis.com/webcd/app?action=DocumentDisplay&amp;amp;crawlid=1&amp;amp;srctype=smi&amp;amp;srcid=3B15&amp;amp;doctype=cite&amp;amp;docid=41+UCLA+L.+Rev.+1193&amp;amp;key=1c15e064a97759f3f03fb51db62a79a5 &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn103"&gt;
&lt;p&gt;&lt;a href="#_ftnref103" name="_ftn103"&gt;[103]&lt;/a&gt; Report by Kleimann Communication Group for the FTC. Evolution of a prototype financial privacy notice, 2006. http://www.ftc.gov/privacy/ 			privacyinitiatives/ftcfinalreport060228.pdf Accessed 2 Mar 2007&lt;/p&gt;
&lt;p&gt;http://lorrie.cranor.org/pubs/authors-version-PETS-formats.pdf&lt;/p&gt;
&lt;/div&gt;
&lt;/div&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/enlarging-the-small-print'&gt;https://cis-india.org/internet-governance/blog/enlarging-the-small-print&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Meera Manoj</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2016-12-14T16:27:54Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/dna-september-23-2015-amrita-madhukalya-encryption-policy-would-have-affected-emails-operating-systems-wifi">
    <title>Encryption policy would have affected emails, operating systems, WiFi</title>
    <link>https://cis-india.org/internet-governance/news/dna-september-23-2015-amrita-madhukalya-encryption-policy-would-have-affected-emails-operating-systems-wifi</link>
    <description>
        &lt;b&gt;"Our email data would have to be stored. If we connect to a WiFi, that data would have to be stored, and that's plain ridiculous. There is a problem when the government tries to target citizens to ensure national security," said Pranesh Prakash, policy director at the Bangalore-based Centre for Internet and Society.&lt;/b&gt;
        &lt;p&gt;The article by Amrita Madhukalya was published in &lt;a class="external-link" href="http://www.dnaindia.com/india/report-encryption-policy-would-have-affected-emails-operating-systems-wifi-2127715"&gt;DNA&lt;/a&gt; on September 23, 2015.&lt;/p&gt;
&lt;hr /&gt;
&lt;p&gt;The &lt;a href="http://www.dnaindia.com/topic/draft-national-policy"&gt;Draft National Policy&lt;/a&gt; on Encryption, withdrawn by the Department of Electronics and Information Technology (DeitY) after it created a furore over privacy issues, would have allowed the government access to any form of digital data that required encryption. Not limited to just WhatsApp or Viber data, it would have affected email services, WiFi, phone operating systems, etc.&lt;/p&gt;
&lt;p&gt;"Our email data would have to be stored. If we connect to a WiFi,  that data would have to be stored, and that's plain ridiculous. There is  a problem when the government tries to target citizens to ensure  national security," said Pranesh Prakash, policy director at the  Bangalore-based Centre for Internet and Society.&lt;/p&gt;
&lt;p&gt;The government, criticised heavily for the policy, withdrew it on Tuesday afternoon. It said that a new policy would be brought in its place.&lt;/p&gt;
&lt;p&gt;Nikhil Pahwa of internet watchdog Medianama said that data about normal day-to-day activities would have to be stored if the policy were implemented. "The policy would have affected everyday business to consumer data. This would mean that if a doctor or lawyer had your data digitised, they will be open to access, and would have to be kept for at least 90 days," said Pahwa.&lt;/p&gt;
&lt;p&gt;However, he added that robust encryption is needed. "It is believed that companies like Google, &lt;a href="http://www.dnaindia.com/topic/facebook"&gt;Facebook&lt;/a&gt; allow the NSA to access user data in the US, putting our personal security, and the national security largely, at risk," said Pahwa.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/dna-september-23-2015-amrita-madhukalya-encryption-policy-would-have-affected-emails-operating-systems-wifi'&gt;https://cis-india.org/internet-governance/news/dna-september-23-2015-amrita-madhukalya-encryption-policy-would-have-affected-emails-operating-systems-wifi&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>praskrishna</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>IT Act</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    

   <dc:date>2015-09-25T01:23:10Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>




</rdf:RDF>
