<?xml version="1.0" encoding="utf-8" ?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:syn="http://purl.org/rss/1.0/modules/syndication/" xmlns="http://purl.org/rss/1.0/">
<channel rdf:about="https://cis-india.org/search_rss">
  <title>Centre for Internet and Society</title>
  <link>https://cis-india.org</link>
  <description>These are the search results for the query, showing results 681 to 695.</description>
  <image rdf:resource="https://cis-india.org/logo.png"/>

  <items>
    <rdf:Seq>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/facebook-and-its-aversion-to-anonymous-and-pseudonymous-speech"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/mumbai-mirror-tariq-engineer-october-2-2016-eye-on-mumbai"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/first-post-august-23-2016-seetha-extending-aadhaar-to-more-areas-is-a-hare-brained-idea-it-should-be-dropped"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/livemint-june-8-2017-shaikh-zoaib-saleem-explore-money-apps-but-watch-your-data"/>
      <rdf:li rdf:resource="https://cis-india.org/news/dna-india-october-19-2012-saikat-datta-experts-committee-moots-law-to-protect-privacy"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/new-indian-express-may-6-2017-experts-stress-on-need-for-enhanced-security"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/the-statesman-smriti-sharma-vasudeva-march-14-2017-evms-how-transparent-is-the-indian-election-process"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/european-union-draft-report-admonishes-mass-surveillance"/>
      <rdf:li rdf:resource="https://cis-india.org/news/livemint-january-17-2014-moulishree-srivastava-elizabeth-roche-eu-parliament-slams-us-surveillance"/>
      <rdf:li rdf:resource="https://cis-india.org/about/policies/ethical-research-guidelines"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/ethical-issues-in-open-data"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/ethical-data-design-practices-in-the-ai-artificial-intelligence-age"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/enlarging-the-small-print"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/en-inde-le-biometrique-version-tres-grand-public"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/emerging-technologies-issues-way-forward"/>
    </rdf:Seq>
  </items>

</channel>


    <item rdf:about="https://cis-india.org/internet-governance/blog/facebook-and-its-aversion-to-anonymous-and-pseudonymous-speech">
    <title>Facebook and its Aversion to Anonymous and Pseudonymous Speech</title>
    <link>https://cis-india.org/internet-governance/blog/facebook-and-its-aversion-to-anonymous-and-pseudonymous-speech</link>
    <description>
        &lt;b&gt;Jessamine Mathew explores Facebook's "real name" policy and its implications for the right to free speech. &lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The power to be unidentifiable on the internet has been a major reason for its sheer number of users. Most of the internet can now be freely used by anybody under a pseudonym without the fear of being recognised by anybody else. These conditions allow for the furtherance of free expression and protection of privacy on the internet, which is particularly important for those who use the internet as a medium to communicate political dissent or engage in any other activity which would be deemed controversial in a society yet not illegal. For example, an internet forum for homosexuals in India, discussing various issues which surround homosexuality may prove far more fruitful if contributors are given the option of being undetectable, considering the stigma that surrounds homosexuality in India, and the recent setting-aside of the Delhi High Court decision reading down Section 377 of the Indian Penal Code. The possibility of being anonymous or pseudonymous exists on many internet fora but on Facebook, the world’s greatest internet space for building connections and free expression, there is no sanction given to pseudonymous accounts as Facebook follows a real name policy. And as the &lt;a href="http://www.nytimes.com/2014/06/27/technology/facebook-battles-manhattan-da-over-warrants-for-user-data.html?_r=0"&gt;recent decision&lt;/a&gt; of a New York judge, disallowing Facebook from contesting warrants on private information of over 300 of its users, shows, there are clear threats to freedom of expression and privacy.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;On the subject of using real names, Facebook’s Community Standards state, “Facebook is a community where people use their real identities. We require everyone to provide their real names, so you always know who you're connecting with. This helps keep our community safe.” Facebook’s Marketing Director, Randi Zuckerberg, &lt;a href="http://www.dailymail.co.uk/news/article-2019544/Facebook-director-Randi-Zuckerberg-calls-end-internet-anonymity.html"&gt;bluntly dismissed&lt;/a&gt; the idea of online anonymity as one that “has to go away”, claiming that people would “behave much better” if they were made to use their real names. Apart from being narrow-minded, her statement ignores the fact that there are many different kinds of expression on the internet, from stories of sexual abuse victims to the views of political commentators, or indeed, whistleblowers, many of whom may prefer to use the platform without being identified. Courts have held in many cases that individuals have a right to anonymity, as it furthers free speech without the fear of retaliation or humiliation (&lt;i&gt;see&lt;/i&gt; Talley v. California).&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;While Facebook’s rationale behind wanting users to register for accounts with their own names is based on the goal of maintaining the security of other users, it is still a serious infraction on users’ freedom of expression, particularly when anonymous speech has been protected by various countries. Facebook has evolved from a private space for college students to connect with each other to a very public platform where not just social connections but also discussions take place, often with a heavily political theme. Facebook has been described as &lt;a href="http://www.thenational.ae/news/uae-news/facebook-and-twitter-key-to-arab-spring-uprisings-report"&gt;instrumental&lt;/a&gt; in the facilitation of communication during the Arab Spring, providing a space for citizens to effectively communicate with each other and organise movements. Connections on Facebook are no longer of a purely social nature but have extended to the political and legal spheres as well, with the platform being used to promote movements across the country. Even in India, Facebook was the &lt;a href="http://timesofindia.indiatimes.com/home/news/Facebook-Twitter-Google-change-face-of-Indian-elections/articleshow/34721829.cms"&gt;most widely adopted medium&lt;/a&gt;, along with Twitter and Google, for discourse on the political future of the country before, during and after the 2014 elections. Earlier, in 2011, Facebook was &lt;a href="https://cis-india.org/news/web2.0-responds-to-hazare"&gt;used intensively&lt;/a&gt; during the India Against Corruption movement. Pages were created, pictures and videos were uploaded, and comments were posted by approximately 1.5 million people in India. 
In 2012, Facebook was also used to &lt;a href="http://timesofindia.indiatimes.com/tech/social-media/Delhi-gang-rape-case-FacebookTwitter-fuels-rally-at-India-Gate/articleshow/17741529.cms"&gt;protest against the Delhi gang rape&lt;/a&gt; with many coming forward with their own stories of sexual assault, providing support to the victim, organising rallies and marches and protesting about the poor level of safety of women in Delhi.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;As with its content policy, Facebook exhibits a number of discrepancies in the implementation of the anonymity ban. Salman Rushdie found that his Facebook account had been &lt;a href="http://www.nytimes.com/2011/11/15/technology/hiding-or-using-your-name-online-and-who-decides.html?pagewanted=all&amp;amp;_r=0"&gt;suspended&lt;/a&gt; and, when it was reinstated after he sent them proof of identity, Facebook changed his name to the name on his passport, Ahmed Rushdie, instead of the name he popularly goes by. Through a series of tweets, he criticised Facebook’s move to force him to display his birth name. Eventually Facebook changed his name back to Salman Rushdie, but not before serious questions were raised regarding Facebook’s policies. The Moroccan activist Najat Kessler’s account was also &lt;a href="https://www.google.co.in/url?sa=t&amp;amp;rct=j&amp;amp;q=&amp;amp;esrc=s&amp;amp;source=web&amp;amp;cd=5&amp;amp;cad=rja&amp;amp;uact=8&amp;amp;ved=0CD8QFjAE&amp;amp;url=http%3A%2F%2Fjilliancyork.com%2F2010%2F04%2F08%2Fon-facebook-deactivations%2F&amp;amp;ei=O1KxU-fwH8meugSZ74HgAg&amp;amp;usg=AFQjCNE7oUt2dyrSjpTskK7Oz3Q1OYXudg&amp;amp;sig2=bsOu46nmABTUhArhdjDCVw&amp;amp;bvm=bv.69837884,d.c2E"&gt;suspended&lt;/a&gt; as it was suspected that she was using a fake name. Facebook has not stopped at suspending individual user accounts; it has also removed pages and groups because the creators used pseudonyms to create and operate the pages in question. This was seen in the case of Wael Ghonim, who created a group which helped in mobilizing citizens in Egypt in 2011. Ghonim was a Google executive who did not want his online activism to affect his professional life and hence operated under a pseudonym. Facebook temporarily &lt;a href="http://www.newsweek.com/how-wael-ghonim-sparked-egypts-uprising-68727"&gt;removed&lt;/a&gt; the group due to his pseudonymity but later reinstated it.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;While Facebook performs its due diligence when it comes to some accounts, it has still done nothing about the overwhelmingly large number of obviously fake accounts, ranging from Santa Claus to Jack the Ripper. On my own Facebook friend list, there are people who have entered names of fictional characters as their own, clearly violating the real name policy. I once reported a pseudonymous account that used the real name of another person. Facebook thanked me for reporting the account but also said that I would “probably not hear back” from them. The account still exists with the same name. The redundancy of the requirement lies in the fact that Facebook does not request users to upload some form of identification when they register with the site, but only when it suspects them of using a pseudonym. Since Facebook also implements its policies largely on the basis of complaints by other users or the government, the real name policy makes many political dissidents and social activists the target of abuse on the internet.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Further, Articles 21 and 22 of the ICCPR guarantee the rights to peaceful assembly and freedom of association. As governments increasingly crack down on physical assemblies of people fighting for democracy or against legislation or conditions in a country, the internet has proved to be an extremely useful tool for facilitating this assembly without forcing people to endure the wrath of governmental authorities. A large factor which has promoted the popularity of internet gatherings is the way in which powerful opinions can be voiced without the fear of immediate detection. Facebook has become the coveted online space for this kind of assembly, but its policies and, more particularly, their faulty implementation, lead to reduced flows of communication on the site.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Of course, Facebook’s fears of cyberbullying and harassment are likely to materialise if there is absolutely no check on the identity of users. A possible solution to the conflict between requiring real names to keep the community safe and still allowing individuals to be present on the network without the fear of identification would be to ask users to register with their own names but to allow them to create a fictional name, which would be the name that other Facebook users can see. Under this model, Facebook can also deal with the issue of safety through its system of reporting against other users. If a pseudonymous user has been reported by a substantial number of people for harassment or any other cause, then Facebook may either suspend the account or remove the content that is offensive. If the victim of harassment chooses to approach a judicial body, then Facebook may reveal the real name of the user so that due process may be followed. At the same time, users who utilise the website to present their views and participate in the online process of protest or contribute to free expression in any other way can do so without the fear of being detected or targeted. Safety on the site can be maintained even without forcing users to reveal their real names to the world. The system that Facebook follows currently does not help curb the presence of fake accounts, nor does it promote completely free expression on the site.&lt;/p&gt;
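The model proposed above (a real name held privately, a chosen display name shown publicly, suspension only after a substantial number of reports, and disclosure of the real identity only through due process) can be sketched as follows. This is an illustrative sketch only: the class, the report threshold of 25, and the court-order flag are assumptions for the example, not Facebook’s actual system.

```python
# Sketch of the proposed pseudonymity model (hypothetical, not Facebook's API).
from dataclasses import dataclass

REPORT_THRESHOLD = 25  # assumed cutoff for "a substantial number" of reports


@dataclass
class Account:
    real_name: str       # registered privately, never shown to other users
    display_name: str    # the only name other users ever see
    reports: int = 0
    suspended: bool = False

    def report(self):
        # Safety is handled via user reports rather than forced real names:
        # enough reports trigger suspension (or content removal) for review.
        self.reports += 1
        if self.reports >= REPORT_THRESHOLD:
            self.suspended = True

    def disclose_real_name(self, court_order: bool):
        # The real identity is revealed only to a judicial body, so that
        # due process may be followed; otherwise it stays private.
        return self.real_name if court_order else None
```

Under this sketch, other users interact only with `display_name`, while abuse is curbed by the report threshold and legal accountability is preserved by the court-order path.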
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/facebook-and-its-aversion-to-anonymous-and-pseudonymous-speech'&gt;https://cis-india.org/internet-governance/blog/facebook-and-its-aversion-to-anonymous-and-pseudonymous-speech&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Jessamine Mathew</dc:creator>
    <dc:rights></dc:rights>

        <dc:subject>Social Media</dc:subject>
        <dc:subject>Privacy</dc:subject>
        <dc:subject>Freedom of Speech and Expression</dc:subject>
        <dc:subject>Facebook</dc:subject>
        <dc:subject>Chilling Effect</dc:subject>
        <dc:subject>Anonymity</dc:subject>
        <dc:subject>Pseudonymity</dc:subject>
        <dc:subject>Article 19(1)(a)</dc:subject>

   <dc:date>2014-07-04T07:53:07Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/mumbai-mirror-tariq-engineer-october-2-2016-eye-on-mumbai">
    <title>Eye on Mumbai</title>
    <link>https://cis-india.org/internet-governance/news/mumbai-mirror-tariq-engineer-october-2-2016-eye-on-mumbai</link>
    <description>
        &lt;b&gt;The feeds will be beamed to a video wall that stretches 21 feet across at the police’s command and control room.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article by Tariq Engineer was &lt;a href="http://www.mumbaimirror.indiatimes.com/mumbai/cover-story/Eye-on-Mumbai/articleshow/54634572.cms"&gt;published           in Mumbai Mirror&lt;/a&gt; today. Sunil Abraham was quoted.&lt;/p&gt;
&lt;hr style="text-align: justify; " /&gt;
&lt;p style="text-align: justify; "&gt;When seven bombs exploded on local trains between Khar and         Borivali killing 209 people and injuring 714 in 2006, the         Maharashtra police looked for CCTV footage but couldn’t find any         because no cameras existed at railway stations back then.&lt;br /&gt; &lt;br /&gt; When terrorists landed near Machimar colony in Cuffe Parade in         2008 and proceeded to slaughter hundreds of people in the city,         CCTV footage was found only at the Taj and Trident hotels,         Chhatrapati Shivaji Terminus and near the Times of India         building. Places like Cama Hospital, Nariman House and Leopold         Café were simply off the grid.&lt;br /&gt; &lt;br /&gt; When Mumbai journalist J Dey was gunned down in Powai in 2011,         the police obtained CCTV footage from a shopping centre nearby         but it was so blurry, it was useless.&lt;br /&gt; &lt;br /&gt; In each of these situations, a fully functioning high-definition         CCTV system could have altered the outcome or aided the         investigation in critical ways. That glaring gap in Mumbai’s         security has now been filled by the Mumbai City Surveillance         Project, which officially goes live today.&lt;br /&gt; &lt;br /&gt; Over the last 20 months, a total of 4697 cameras have been         installed at 1510 locations around Mumbai city. In addition to         these, another 146 will survey the Bandra Kurla Complex. The         tender for the project was issued in 2015 and won by a         consortium led by construction major Larsen &amp;amp; Toubro with         MTNL, CMS Computers and Infinova, which supplied the cameras, as         partners.&lt;br /&gt; &lt;br /&gt; The project is actually an outcome of the 26/11 attacks, having         been recommended by the Ram Pradhan Committee, which was         appointed to evaluate the city administration’s responses to the         terror strike. 
According to Additional Chief Secretary (Home) KP Bakshi, these cameras will ensure roughly 80 per cent of Mumbai will be watched 24 hours a day, seven days a week. The city’s inhabitants will now have to be on their best behaviour.&lt;br /&gt; &lt;br /&gt; “It was the police’s call to decide what they want to observe,” Bakshi said. “Do they want to look at the traffic or at a place where people gamble or do a lot of drinking?” The policeman in charge of selection of spots for installation of cameras was former additional commissioner of police Vasant Dhoble. Calling him a “game-changer”, one of the project managers said it was thanks to Dhoble that all the locations were surveyed in just two-and-a-half months. Dhoble was also instrumental in ensuring that the cameras were installed at the appropriate angles.&lt;br /&gt; &lt;br /&gt; While the initial estimate was for 6,000 cameras, it was eventually determined that 4,697 were sufficient at this stage. The cameras have been placed on poles similar to street lights — 2,290 of them — some with multiple cameras. “Let’s say there is a pole at Haji Ali Juice Center,” Bakshi said. “It may have three cameras — one looking towards Heera Panna, the other looking towards Mahalaxmi, the third looking towards Worli.”&lt;br /&gt; &lt;br /&gt; The vast majority of the cameras — roughly 4,200 — will be fixed and stare unblinkingly in one direction. The other 500 will be PTZ, or pan/tilt/zoom cameras, so those watching can scan an area or take a closer look at something that seems suspicious. All of the cameras can see in high definition, with visibility ranging from 50m to 120m. 
Some of them also have thermal imaging         and night vision.&lt;br /&gt; &lt;br /&gt; According to those involved in the project, the cameras have         been built to withstand the rigours of Mumbai’s weather —         specifically the heat and rain. Larsen &amp;amp; Toubro and CMS         Computers are responsible for the maintenance of the system.         Once the system is fully operational, the target is to have 99%         of the cameras live at all times barring accidents. The         responsibility for this lies with the service providers.&lt;br /&gt; &lt;br /&gt; &lt;b&gt;A           smart system&lt;/b&gt;&lt;br /&gt; &lt;br /&gt; The software that runs the cameras includes a Picture         Intelligence Unit (PIU) that will conduct facial recognition         analysis. If there is an image of a wanted person in the         database, the program will scan the footage for matches and send         a signal if it finds any. It will also send an alert if it         notices a suspicious object, say one that has been left         unattended for a pre-specified amount of time, so the cops can         check it out. Tracking police vehicles — like you can follow the         path of an Uber or Ola — is yet another feature, so if there is         trouble, the nearest vehicle can be dispatched.&lt;br /&gt; &lt;br /&gt; By Bakshi’s reckoning, if it is a small crime, then the police         should be on the scene in five to ten minutes. If it is         something like a bomb blast, then a Quick Response Team will be         deployed, which will take a little longer – say 10 to 15         minutes.&lt;br /&gt; &lt;br /&gt; &lt;b&gt;Who           will be watching you?&lt;/b&gt;&lt;br /&gt; &lt;br /&gt; The feeds from these cameras will be fed to a video wall that         stretches 21 feet across in a control room that has been set up         in the Commissioner of Police Headquarters at Crawford Market.         
The footage will be monitored by about 20 observers who have         been specially trained for the job.&lt;br /&gt; &lt;br /&gt; However, a project manager said, watching the wall for more than         eight minutes “would make anyone mad” because it is so chaotic.         Therefore, each observer has his own workstation with three         computer screens where he can only watch the feeds he has been         assigned.&lt;br /&gt; &lt;br /&gt; Entry to the control room is also strictly monitored. It         requires five fingerprint access just to get in the room and a         thumb print to turn individual workstations on. Mobile phones         and personal effects are banned and the computers have no USB         ports, so data can’t be copied.&lt;br /&gt; &lt;br /&gt; In addition, there are viewing screens in each of the additional         commissioner’s zonal offices and in all 23 police stations and         roughly 200 observers will eventually be required to operate         them. A project manager said he hoped to have a 60-40 or 50-50         split between male and female observers. The observers are         monitored by the police, who will decide what actions to take         depending on what alerts are generated.&lt;br /&gt; &lt;br /&gt; The manpower is being provided by CMS Computers, with applicants         having their resumes verified by the police. Observers will         spend anywhere from four to six weeks in training before they         get on the job, one of the project managers said.&lt;br /&gt; &lt;br /&gt; &lt;b&gt;Keeping           the data secure&lt;/b&gt;&lt;br /&gt; &lt;br /&gt; The images from the standard cameras will be stored for 90 days,         while those taken with PTZ cameras will be stored for 30 days.         “If you store for longer periods, it involves more cost,” Bakshi         said. 
“We feel that if something has to be reported to us, it will be reported within 90 days.”&lt;br /&gt; &lt;br /&gt; MTNL has set up a data centre in Worli and a disaster recovery centre in Belapur. If something goes wrong in Worli, there will still be connectivity via Belapur. Both centres have been “tied-up” to make the data as safe as possible. At the test lab at Larsen &amp;amp; Toubro’s project headquarters in Mallet Bunder, they even have a rodent detection device that broadcasts an ultrasonic frequency to drive away rats and stop them from chewing up the wires.&lt;br /&gt; &lt;br /&gt; &lt;b&gt;False starts&lt;/b&gt;&lt;br /&gt; &lt;br /&gt; The project took some time to get off the ground because getting the details worked out was a painstaking, elaborate process, former Maharashtra chief secretary (home) Amitabh Rajan told Mumbai Mirror. The committee wanted to make sure everything was transparent and that there were no allegations against the project. Control and security were also zealously guarded. “No compromise on security, not even cost,” Rajan said. “Like titration in chemistry, we eventually got the right concentration.”&lt;br /&gt; &lt;br /&gt; There was also a battle between a lobby that wanted the system to be set up using dedicated fibre optic cables, and a lobby of technology providers that wanted to use wireless technology. The cops backed cables, which are not only safer but make it easy to add additional bandwidth, whereas wireless networks have limited bandwidth. It was a battle the cops would eventually win, but at the cost of time.&lt;br /&gt; &lt;br /&gt; The tender process didn’t go smoothly either. Larsen and Toubro were actually the winners of the fourth tender the Maharashtra government put forward. 
The first tender had to be cancelled         because the winning consortium had not properly disclosed its         ownership structure — one of the companies turned out to be         controlled by a subsidiary of Reliance Industries. The second         was cancelled when the vendor’s bank guarantee cheque of Rs 2         crore bounced and the owner disappeared. He was eventually found         and arrested two years later.&lt;br /&gt; &lt;br /&gt; The third tender received no bidders because it did not offer         up-front payment for capital expenditure, according to then IT         secretary Rajesh Aggarwal, who was part of the committee. It was         finally on the fourth occasion, when the committee decided to         offer a certain percentage of the project cost at the start and         the rest over the remaining five years as maintenance fees, that         a deal could be sealed.&lt;br /&gt; &lt;br /&gt; &lt;b&gt;Coordination           headache&lt;/b&gt;&lt;br /&gt; &lt;br /&gt; The next hurdle was coordinating the work between all the         different organisations that populate Mumbai. The final total         was around 35 or 40 bodies, including the Municipal Corporation         of Greater Mumbai (MCGM), BEST and Reliance Power, the police,         MMRDA, the Government of India and the High Court. “To explain         to everyone that it is a security project and please don’t go by         normal rules, you have to give concessions for all these things,         all this co-ordination was a big job,” Bakshi said.&lt;br /&gt; &lt;br /&gt; It led to delays, which is why the project had to take the         extraordinary step of getting permission from the MCGM to dig up         roads during the monsoon to lay the fibre-optic cables. 
It was the only way the project could make its deadline.&lt;br /&gt; &lt;br /&gt; “If we had done it like a normal project, it would have taken five years,” an engineer said.&lt;br /&gt; &lt;br /&gt; &lt;b&gt;A question of privacy&lt;/b&gt;&lt;br /&gt; &lt;br /&gt; Two experts in privacy issues that Mirror spoke to said that such a system is in the public interest, but safeguards must be built in to prevent abuse. “If the data falls into the wrong hands, it can create havoc,” said Pavan Duggal, an expert in the field of cyber law. “Large scale surveillance of the public should not be the norm, it should be the exception to the norm,” he said. “It can create unease and lessen the enjoyment of living in a democratic society.”&lt;br /&gt; &lt;br /&gt; According to Sunil Abraham, director of the Centre for Internet and Society, the biggest problem is that India does not have an “omnibus privacy law”.&lt;br /&gt; &lt;br /&gt; Instead, it has about 50 different laws across sectors, and therefore privacy regulations are not consistent, which has created a legal thicket. “110 countries have passed privacy laws to European Union standards. India is really far behind,” he said.&lt;br /&gt; &lt;br /&gt; He also listed a number of principles that he hoped the project would abide by, such as the principles of notice (CCTV cameras should be advertised as such), of openness (details of the system should be made public), of security (“if you don’t have security, you can’t ensure privacy”) and of access (“we should have a right to get the footage of ourselves”). He also warned against the footage being shared between different security agencies without due process.&lt;br /&gt; &lt;br /&gt; Additional Chief Secretary (Home) Bakshi said most of these principles were part of the system. 
There would be boards demarcating the CCTV cameras, the system would be publicly launched, it was being made as secure as possible and footage could be handed over depending on the circumstances. “If it is your own, then no problem,” Bakshi said. “If it is someone else’s then there are privacy issues. Is it because of criminal intent or you want to track your girlfriend’s other boyfriend to see if he is following her? These are issues. If you want yours, on merit we can give. No issue.”&lt;br /&gt; &lt;br /&gt; Another concern Abraham raised is unique to India and the Aadhaar card, which uses biometric data as passwords, not identification. Since the CCTV cameras are high resolution, it raises the risk of someone recreating your iris or fingerprints from a captured image, and then “somebody could empty your Aadhaar-linked bank accounts,” Abraham said.&lt;br /&gt; &lt;br /&gt; This is not as far-fetched as it sounds. Abraham pointed out that in 2014 a member of the Chaos Computer Club, the largest association of hackers in Europe, recreated the fingerprint of a German minister from a photograph they took of her hand.&lt;br /&gt; &lt;br /&gt; “Other risks are smaller, a revealing photograph or someone trying to blackmail you,” Abraham said.&lt;br /&gt; &lt;br /&gt; &lt;b&gt;Not just for crime&lt;/b&gt;&lt;br /&gt; &lt;br /&gt; The camera feed has other applications too, beginning with traffic management. An automatic number plate recognition system will be installed as well. If you look around the corner, don’t see a cop and jump a light, you could still get in trouble. “6000 [sic] police in the sky are watching you and you will get a challan sitting at home,” Aggarwal said. 
Other uses include         tracking of encroachments by the Municipal Corporation of         Greater Mumbai which will have an additional viewing centre.         Also garbage disposal and other civic issues such as water         logging and a subject dear to Mumbai citizens — potholes.         “Somebody complains that this road has a pothole, immediately         you can zoom in and see that yes, there is a pothole on this         road,” Bakshi said.&lt;br /&gt; &lt;br /&gt; There is also a provision to allow a further 103 locations to         plug-in and play. For example, if the Taj Mahal Hotel wants the         police to survey the hotel for a period of time, the hotel’s         CCTV system can be hooked up to the main control room within 48         hours. The same goes for the airport or the railway stations.&lt;br /&gt; &lt;br /&gt; &lt;b&gt;Effect           of CCTV surveillance&lt;/b&gt;&lt;br /&gt; &lt;br /&gt; Worldwide the academic literature on CCTV surveillance suggests         its effectiveness, especially on crime prevention, is uncertain         or limited. “Post crime it really, really helps,” Aggarwal said,         “but for prevention, we have to wait and watch. If it reduces         sexual harassment for example, then that is priceless. Time will         tell how people try to beat the system and how the system tries         to catch up.”&lt;br /&gt; &lt;br /&gt; Joint Commissioner of Police, Law and Order, Deven Bharti said         he was already seeing an improvement in traffic management and         in prevention and detection of crimes thanks to the 3000-plus         cameras that were live when Mirror spoke to him two days ago,         though he said he could not provide details. “The system is         working to our satisfaction,” Bharti said.&lt;br /&gt; &lt;br /&gt; Bakshi said the effects of the system should start showing         roughly a month after the project is fully operational. 
“In         Pune, results started being seen within a month. Once all 4700         [cameras] are live, you will start seeing the results on traffic         violations, street crimes, and at general discipline level.         [First] Let the people know they are under surveillance, that         they are completely covered in Mumbai by CCTV.”&lt;br /&gt; &lt;br /&gt; The total cost of the project is Rs 1008 crore. Out of this,         about Rs 400 crore has already been spent. The balance will be         paid out in regular installments until October 2021. At that         point the Maharashtra government and Mumbai police will take         complete control of the project. “We presume that in five years’         time, we will have enough trained people to run it ourselves,”         Bakshi said.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/mumbai-mirror-tariq-engineer-october-2-2016-eye-on-mumbai'&gt;https://cis-india.org/internet-governance/news/mumbai-mirror-tariq-engineer-october-2-2016-eye-on-mumbai&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>praskrishna</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2016-10-02T10:22:20Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/first-post-august-23-2016-seetha-extending-aadhaar-to-more-areas-is-a-hare-brained-idea-it-should-be-dropped">
    <title>Extending Aadhaar to more areas is a hare-brained idea, it should be dropped</title>
    <link>https://cis-india.org/internet-governance/news/first-post-august-23-2016-seetha-extending-aadhaar-to-more-areas-is-a-hare-brained-idea-it-should-be-dropped</link>
    <description>
        &lt;b&gt;News reports that the mandatory use of Aadhaar could be extended to a host of new areas are extremely disturbing. According to these reports, the Unique Identification Authority of India (UIDAI) has identified 20 new areas for which Aadhaar can be made mandatory. This includes registration of companies and NGOs, insurance, competitive examinations and property and vehicle registration.
&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article by Seetha was published in &lt;a class="external-link" href="http://www.firstpost.com/business/extending-aadhaar-to-more-areas-is-a-hare-brained-idea-it-should-be-dropped-2972182.html"&gt;First Post&lt;/a&gt; on August 23, 2016. CIS article by Pranesh Prakash and Amber Sinha was quoted.&lt;/p&gt;
&lt;hr style="text-align: justify; " /&gt;
&lt;p style="text-align: justify; "&gt;If this happens, then it confirms the worst suspicions of all those who are opposed to Aadhaar – and this spans ideological divides – that it can be used to seriously compromise individual privacy.&lt;/p&gt;
&lt;div class="alignleft wp-caption" id="attachment_2972214" style="float: left; text-align: justify; "&gt;&lt;a href="http://s2.firstpost.in/wp-content/uploads/2016/08/Aadhaar-380.jpg"&gt;&lt;img alt="A villager scanning fingerprint for Aadhaar. Reuters file photo" class="wp-image-2972214 size-full" height="285" src="http://s2.firstpost.in/wp-content/uploads/2016/08/Aadhaar-380.jpg" width="380" /&gt;&lt;/a&gt;
&lt;p class="wp-caption-text"&gt;A villager scanning fingerprint for Aadhaar. Reuters file photo&lt;/p&gt;
&lt;/div&gt;
&lt;p style="text-align: justify; "&gt;The defenders of Aadhaar – mainly the previous and current governments, the UIDAI and Nandan Nilekani, the father of the Aadhaar – have always argued that these concerns are exaggerated. They have pointed out that Aadhaar does not take any details that are not already in the public domain – name, date of birth and permanent address – and that the biometric data is not shared with any of the authorities that seek verification by Aadhaar. That data remains with the UIDAI and it only confirms that a person with a particular Aadhaar number is who he claims he is.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;But Aadhaar’s opponents have argued that the extensive use of Aadhaar allows disparate bits of information to be linked and this could become a genuine concern if this hare-brained idea gets official approval.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Now, there is certainly no doubt that Aadhaar is, in the absence of anything better, the best technological tool for establishing identity. It is not entirely fool-proof – there are issues relating to the fingerprints of manual labourers and iris scan of aged people or those with cataract – a solution needs to be found for this. According to&lt;span class="Apple-converted-space"&gt; &lt;/span&gt;&lt;a href="http://cis-india.org/internet-governance/blog/hindustan-times-amber-sinha-pranesh-prakash-march-12-2016-privacy-concerns-overshadow-monetary-benefits-of-aadhaar-scheme" rel="nofollow" target="_blank"&gt;this report&lt;/a&gt;&lt;span class="Apple-converted-space"&gt; &lt;/span&gt;by the Centre for Internet and Society, there was fingerprint authentication failure in 290 of 790 ration card holders in Andhra Pradesh who did not lift rations, and there was an ID mismatch in 93 instances. These problems notwithstanding, there is no denying that Aadhaar has helped in significantly containing (perhaps not entirely eliminating) the problem of identity theft for diversion of government doles and other benefits.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;So making Aadhaar compulsory for such cases is perfectly justifiable. Indeed, the Act giving legal status to Aadhaar is called Aadhaar (Targeted Delivery of Financial and Other Subsidies, Benefits and Services) Act, 2016.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Mandatory quoting of Aadhaar can even be justified in the cases where duplication or falsification of identity can be used by criminals or those who fall foul of the law. Passports, for example, can be brought under the ambit of Aadhaar. Or even driving licences. A person whose licence has been suspended for repeated traffic violations should not be allowed to get another one under the same name or an assumed name.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;But why should it be mandatory for bank accounts, if an individual is not interested in getting government doles? The quoting of Aadhaar for property transactions also does not make sense. If the idea is to prevent fraudulent transactions, it will not be foolproof. A person intending to sell an already sold property or one he does not own can do so even with an Aadhaar number, since people are allowed to own more than one piece of property. What will prevent this from happening is compulsory registration and digitisation of records as well as mandatory property titling; there has been little progress on both.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;When filing of income tax returns is not possible without a PAN, there is little rationale for making Aadhaar mandatory for filing returns and even for PAN. It is not clear how quoting of Aadhaar is going to help in ensuring that fly-by-night companies and NGOs do not get established.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The insistence of Aadhaar on purchase of vehicles, landline and mobile phone connections and demat accounts is seriously violative of individual privacy and has enormous potential for misuse. The Act does give the government unbridled power to access data in the name of national security. This itself is worrying, since it can allow security agencies to go an random fishing expeditions to access personal financial transactions. Making it mandatory for even buying cars and phone connections (even though it is not illegal to own more than one vehicle or telephone connection) makes it even riskier – private agencies get access to one’s Aadhaar number. Forget security agencies, even unscrupulous private persons can track an individual’s personal activities, especially financial transactions.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;As it is, investigating agencies want to tap Aadhaar and biometric data at the drop of a hat. The UIDAI had to approach the Supreme Court in 2014 against a Goa High Court order ordering it to share biometric details of everyone enrolled in the state for solving a gang rape case. Even after the Supreme Court ruled in favour of UIDAI, a Kerala special investigation team wanted it to share biometric details to solve another rape case. If Aadhaar now becomes mandatory for a host of financial and other transactions, the points of potential privacy breaches only increase.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The move to extend the mandatory use of Aadhaar has to be stopped in its tracks. The mandatory use should be limited to delivery of government welfare benefits and doles (after ensuring that glitches are eliminated) and security-related services like passports. For everything else, it should be purely voluntary. There can be no compromise on this.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/first-post-august-23-2016-seetha-extending-aadhaar-to-more-areas-is-a-hare-brained-idea-it-should-be-dropped'&gt;https://cis-india.org/internet-governance/news/first-post-august-23-2016-seetha-extending-aadhaar-to-more-areas-is-a-hare-brained-idea-it-should-be-dropped&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>praskrishna</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Aadhaar</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2016-08-24T03:05:01Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/livemint-june-8-2017-shaikh-zoaib-saleem-explore-money-apps-but-watch-your-data">
    <title>Explore money apps but watch your data</title>
    <link>https://cis-india.org/internet-governance/news/livemint-june-8-2017-shaikh-zoaib-saleem-explore-money-apps-but-watch-your-data</link>
    <description>
        &lt;b&gt;Financial apps may appear to be free but before you install them, read their privacy policies to know what you may be signing away.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article by Shaikh Zoaib Saleem was published in &lt;a class="external-link" href="http://www.livemint.com/Money/qjtm4qje8GP4c9ENPKjP6M/Explore-money-apps-but-watch-your-data.html"&gt;Livemint&lt;/a&gt; on June 8, 2017. Pranesh Prakash was quoted.&lt;/p&gt;
&lt;hr /&gt;
&lt;p class="A5l" style="text-align: justify; "&gt;With  the increasing usage of smartphones and other smart devices, our use of  and dependence on mobile applications also increases. These apps, while  being installed on your device, ask for a lot of permissions. Most users  do not take a detailed look at all the permissions being granted to any  particular app’s publisher. Moreover, even fewer users look at the  privacy policies and terms of use of apps, which detail how the  publisher intends to utilize the data you share.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In most cases, the data collected is analysed and used for targeted  marketing campaigns by the apps’ publishers, based on the users’  profiles and habits. Read more about it here: &lt;i&gt;&lt;a href="http://bit.ly/2q3ByA3"&gt;bit.ly/2q3ByA3. &lt;/a&gt;&lt;/i&gt;While  this phenomena is spread across the board for all categories of apps,  we take a look at the privacy policies and terms of use of the top 10  Android financial apps in India (top 10 as of June 1, according to App  Annie, a mobile apps market research company based in California). The  10 apps are: PhonePe, BHIM, SBI Anywhere Personal, Kotak – 811 and  Mobile Banking, JioMoney Wallet, Money View Money Manager, State Bank  Buddy, Bank Balance Check, All Bank Balance Enquiry, iMobile by ICICI  Bank.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;&lt;b&gt;Collecting information&lt;/b&gt;&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;The common  theme across privacy policies of these apps is that the information is  collected to enhance customer experience while using an app, respond to  customer complaints and resolve disputes. Another theme is tracking  consumer behaviour. For instance, PhonePe, in its privacy policy states,  “We may automatically track certain information about you based upon  your behaviour on our app. We use this information to do internal  research on our users’ demographics, interests, and behaviour to better  understand, protect and serve our users. This information is compiled  and analysed on an aggregated basis.”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Similarly, the privacy  policy of BHIM app says, “…once you give us your personal information,  you are not anonymous to us. We may automatically track certain  information about you based upon your behaviour on our app to the extent  we deem fit.” It further adds that if you choose to transact on the  app, then “we collect information about your transaction behaviour.” All  the apps collect some or the other information like device IDs and  location.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;&lt;b&gt;Sharing Information&lt;/b&gt;&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;The  information gathered by the apps is not just used by these companies  themselves, but also shared with third parties, subsidiaries, parent  companies and agents of the companies. iMobile by ICICI Bank, for  instance, in its privacy policy states that the bank will limit the  collection and use of customer information only on a need-to-know basis  to deliver better service to the customers. “ICICI Bank may use and  share the information provided by the customers with its affiliates and  third parties for providing services and any service-related activities  such as collecting subscription fees for such services, and notifying or  contacting the customers regarding any problem with, or the expiration  of, such services. In this regard, it may be necessary to disclose the  customer information to one or more agents and contractors of ICICI Bank  and their sub-contractors, but such agents, contractors, and  sub-contractors will be required to agree to use the information  obtained from ICICI Bank only for these purposes,” the policy reads.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Similarly,  PhonePe in its privacy policy has said that the company may share  personal information with its other corporate entities and affiliates.  “We and our affiliates will share/sell some or all of your personal  information with another business entity should we (or our assets) plan  to merge with, or be acquired by that business entity, or  re-organization, amalgamation, restructuring of business. Should such a  transaction occur that other business entity (or the new combined  entity) will be required to follow this privacy policy with respect to  your personal information,” it reads.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;While installing, the Kotak  app seeks your “irrevocable consent” to its privacy policy, which, among  others things, states: “We may disclose the customer information to  third parties for following, among other purposes, and will make  reasonable efforts to bind them to obligation to keep the same secure  and confidential and an obligation to use the information for the  purpose for which the same is disclosed, and you hereby give your  irrevocable consent for the same.”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;JioMoney Wallet, while  disclosing upfront that the publishing company and its affiliates do not  sell or rent personal information to any third-party entities, also  adds that the company “engages a number of vendors, consultants,  contractors and takes support of our group companies or affiliates. We  may provide our partners access to or share your personal information to  enable them to provide the services subscribed by you.” Terms and  conditions of the BHIM app state: “For the protection of both the  parties, and as a tool to correct misunderstandings, the user  understands, agrees and authorises NPCI, at its discretion, and without  further prior notice to the user, to monitor and record any or all  telephone conversations between the user(s) and NPCI only.”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;It is  imperative to note that most of these apps announce it upfront in their  privacy policies that the policy could change anytime without prior  information to the users. At the same time, it should be noted that  sharing of some data is required for proper functioning of many apps.  While most app publishers may not misuse the data being gathered, you  should know exactly what data is being used.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Pranesh Prakash,  policy director at the Centre for Internet and Society said that their  research outputs show that laws to deal with misuse of personal data are  very weak in India. “We need a strong privacy law to address these  issues, of which we have proposed a citizens’ draft. Clearly, the  prevailing situation shows that the industry is not taking enough  initiative on self-regulation. At the same time, even the government  isn’t taking much interest in consumer protection.”&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/livemint-june-8-2017-shaikh-zoaib-saleem-explore-money-apps-but-watch-your-data'&gt;https://cis-india.org/internet-governance/news/livemint-june-8-2017-shaikh-zoaib-saleem-explore-money-apps-but-watch-your-data&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>praskrishna</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2017-06-08T12:46:11Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/news/dna-india-october-19-2012-saikat-datta-experts-committee-moots-law-to-protect-privacy">
    <title>Experts' committee moots law to protect privacy</title>
    <link>https://cis-india.org/news/dna-india-october-19-2012-saikat-datta-experts-committee-moots-law-to-protect-privacy</link>
    <description>
&lt;b&gt;In its report submitted to the Planning Commission on Thursday, the first-ever experts’ group set up to identify privacy issues and prepare a report to facilitate the authoring of a privacy bill has said that existing laws have created an ‘unclear regulatory regime’ which allows the state to be intrusive.&lt;/b&gt;
        &lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;Saikat Datta's article was &lt;a class="external-link" href="http://www.dnaindia.com/india/report_experts-committee-moots-law-to-protect-privacy_1753827"&gt;published&lt;/a&gt; in DNA on October 19, 2012&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;The report has been prepared by experts led by justice AP Shah, former chief justice of the Delhi high court.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In its exceptions to the proposed law on privacy, the experts’ group has recommended that national security, public order and disclosures made in ‘public interest’ will be exempted from the limitations of privacy. Several members of the group unsuccessfully argued to bring in the Intelligence agencies which are empowered to legally tap phones, intercept emails and conduct surveillance on citizens under the ambit of the Privacy Act.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The report, a copy of which is available with &lt;i&gt;DNA&lt;/i&gt;, recognises that there are major differences in the existing laws that permit intrusive phone-tapping or surveillance of private citizens by the government.The group feels that “these differences have created an unclear regulatory regime that is inconsistent, non-transparent, and prone to misuse and does not provide remedy or compensation to aggrieved individuals.”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Therefore, the group has recommended that when the government conducts any intrusive surveillance like phone tapping, it must adhere to the principles of proportionality, legality and remain within the boundaries of a democratic state.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;“The limitation (on tapping phones, etc) should be in proportion to the harm that has been caused or will be caused,” the report states.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Interestingly, the report also exempts the disclosure of personal or private information for journalistic or historical and scientific purposes from being curbed under the proposed Privacy Act. Interestingly, this will give journalists a legal cover from being hauled up under the proposed privacy laws when they file stories.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The government is keen to enact a privacy law quickly because of two major issues. The fallout of the leakage of the tapes of Niira Radia speaking to industry heads like Ratan Tata which led to a renewed clamour for a comprehensive Privacy Act. Ironically, anything related to phone-tapping has now been left out of the provisions of such an Act.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The other reason was the pressure from the industry that is keen to get business from abroad that deals with sensitive personal data. In the absence of any personal data protection laws, Indian companies were not getting any business from European or American firms. With this law, India can look forward to getting substantial business that involves personal data.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;With this framework in mind the experts’ group has recommended that notice be given to any individual from whom personal information will be sought. With intrusive government projects like the UID or the NATGRID, the group was worried that this kind of massive data in the hands of the government could turn this into a police state.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;It has also mandated that the choice and consent of the individual must be taken before collecting this information. Also, there has to be a limitation on collecting this information and anything that has been collected will use the data for only a limited purpose. A data controller should be appointed to collect, maintain and use the data under strict stipulations.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Therefore, the data controller will be made accountable for any lapse in handling or disclosure of the data. To ensure that this kind of control can be exercised, the group has suggested the appointment of privacy commissioners who will adjudicate on any matter of illegal disclosures and mete out server punishment.&lt;/p&gt;
&lt;p&gt;&lt;b&gt;Recommendations&lt;/b&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;National security, public order and disclosures made in ‘public interest’ will be exempted from the limitations of privacy&lt;/li&gt;
&lt;li&gt;The limitation (on tapping phones, etc) should be in proportion to the harm that has been or will be caused&lt;/li&gt;
&lt;li&gt;Disclosure of personal or private information for journalistic or historical and scientific purposes should be exempted from being curbed under the proposed Act&lt;/li&gt;
&lt;li&gt;Notice should be given to the individual from whom information is to be sought&lt;/li&gt;
&lt;li&gt;A data controller should be appointed to collect, maintain and use the data&lt;/li&gt;
&lt;li&gt;Privacy commissioners who will adjudicate on any matter of illegal disclosures should be appointed&lt;/li&gt;
&lt;/ul&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;Note: &lt;i&gt;The Centre for Internet &amp;amp; Society was part of the expert committee even though not explicitly mentioned&lt;/i&gt;.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/news/dna-india-october-19-2012-saikat-datta-experts-committee-moots-law-to-protect-privacy'&gt;https://cis-india.org/news/dna-india-october-19-2012-saikat-datta-experts-committee-moots-law-to-protect-privacy&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>praskrishna</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2012-10-22T10:18:34Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/new-indian-express-may-6-2017-experts-stress-on-need-for-enhanced-security">
    <title>Experts stress on need for enhanced security</title>
    <link>https://cis-india.org/internet-governance/news/new-indian-express-may-6-2017-experts-stress-on-need-for-enhanced-security</link>
    <description>
        &lt;b&gt;With more and more people falling prey to phishing scams, experts believe that lack of adequate security features in online payment systems will only increase the number of such cases in the coming days. While admitting that the rise in such crimes would be hard to stop or control, cyber security consultants also blame the lack of preparedness before taking the digital economy route as a cause for such problems.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article was &lt;a class="external-link" href="http://www.newindianexpress.com/cities/bengaluru/2017/may/06/experts-stress-on-need-for-enhanced-security-1601631.html"&gt;published in the New Indian Express&lt;/a&gt; on May 6, 2017. Pranesh Prakash was quoted.&lt;/p&gt;
&lt;hr style="text-align: justify; " /&gt;
&lt;p style="text-align: justify; "&gt;Speaking to Express, Dr A Nagarathna of  the Advanced Centre on Cyber Law and Forensics, National Law School of  India University, said that apart from the push for digital payment  solutions, the merger of various State Bank entities also provided  chances for criminals to exploit gullible people.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;“People tend to give away critical information since cyber criminals  seem so convincing. But they should remember that banks never collect  such information over phone,” she said.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The cyber security features of banks and e-wallets are also  questionable. Banks and e-wallet service providers should be held  accountable for such crimes, so that they make an effort to ensure  necessary safety measures, she said.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Pranesh Prakash, Policy Director at the Centre for Internet and Society,  noted that there were security concerns with e-wallets. “Many e-wallet  apps compromise on security in favour of convenience, but, at the same  time, have terms of service that hold customers liable for financial  losses.  There have been many reports of criminals working with rogue  telecom company employees to clone SIM cards and steal money via UPI and  BHIM,” he said.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;He also criticised the use of biometrics as the only factor for  authorising payments to merchants using Aadhaar Pay.  He noted, “Your  fingerprints cannot be changed, unlike a PIN. So, if a merchant clones  your fingerprint, you cannot revoke it or replace it the way you can  with a debit card and a PIN.”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Another activist said the recommendations of Watal Committee, which  looked into digital payments, should be implemented. “As of now, the law  does not focus on the need for consumer protection in digital payments.  The Payment and Settlement Systems Act, 2007, needs to be updated,” he  said.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/new-indian-express-may-6-2017-experts-stress-on-need-for-enhanced-security'&gt;https://cis-india.org/internet-governance/news/new-indian-express-may-6-2017-experts-stress-on-need-for-enhanced-security&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>praskrishna</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Cyber Security</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2017-05-20T06:13:19Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/the-statesman-smriti-sharma-vasudeva-march-14-2017-evms-how-transparent-is-the-indian-election-process">
    <title>EVMs: How transparent is the Indian election process?</title>
    <link>https://cis-india.org/internet-governance/news/the-statesman-smriti-sharma-vasudeva-march-14-2017-evms-how-transparent-is-the-indian-election-process</link>
    <description>
        &lt;b&gt;Electronic Voting Machines (EVMs) have become a bone of contention after the results of the Assembly elections in five states were declared last Saturday and the BSP president Mayawati alleged tampering. The Congress party and the Aam Aadmi Party (AAP) have called for a probe into her allegation. Social media too is abuzz with messages and videos showing how the machines can be allegedly manipulated to sway the votes in favour of a particular candidate.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article by Smriti Sharma Vasudeva was &lt;a href="http://www.thestatesman.com/india/evms-how-transparent-is-the-indian-election-process-1489512231.html"&gt;published         in the Statesman&lt;/a&gt; on March 14, 2017. Pranesh Prakash was quoted.&lt;/p&gt;
&lt;hr style="text-align: justify; " /&gt;
&lt;p style="text-align: justify; "&gt;Overnight, several videos on Whatsapp have surfaced wherein people can be seen explaining the "mechanism" on how to alter the votes polled for a candidate in another candidate's favour. Several similar posts and articles are doing the rounds on Facebook.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;BBC added fuel to the fire when it shared a 2010       article on how 'US "Scientists" hack India Electronic Machines' .       The article details how scientists at a US university say they       have developed a technique to hack into Indian electronic voting       machines. While the article was posted on the BBC website a day       after the election results were declared, it drew considerable       flak from users on Facebook who criticised the website for its       'irresponsible' act of sharing an article with a "click bait"       headline just to grab eyeballs.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Amid all this frenzy, the Election Commission of       India has issued statements clarifying how the entire process is       transparent and fool proof and tampering with the EVMs is a       far-fetched thing given the checks and balances in place. For       instance, the EVMs undergo the process of randomisation wherein       which machine will go to which constituency and to which booth is       not known to anyone till the last moment. Similarly, before the       polling starts, mock polling takes place in the presence of       representatives of all the political parties and then each of       these machines are tested and a satisfactory report is generated       and only after that polling begins.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;However, all these checks and balances still do       not ensure a fool proof system if experts are to be believed.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Pranesh Prakash, Policy Director for The Centre       for Internet and Society, a non-profit organisation that       undertakes interdisciplinary research on internet and digital       technologies from policy and academic perspectives, said: "The       Electronic Voting Machines used in India are the simplest, with no       large operating system requirements and are not networked. Thus,       from a software design perspective, these are really good and the       chances of these being tampered with are bleaker. However it       doesn't mean these are fool proof. Most of the developed countries       do not trust these machines and these are definitely not secure       enough for democratic elections.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;"While there are many advantages of using EVMs in       the electoral process over the traditional ballot papers, still       there are many ways in which one can tamper with these machines       without any technical ingenuity. The best way is to make use of       the EVMs and ensure that the Voter Verified Paper Audit Trail       (VVPAT) are effectively utilised to make it an overall effective       system".&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Recently, the Supreme Court had mandated that       VVPAT machines should be used in all the polls and thus the       Election Commission had installed VVPAT machines in several       constituencies. However, not sure of the efficacy of this system,       the Election Commission had itself raised apprehensions regarding       performance of the paper-trail machine, which gives a receipt to       the voter, verifying the vote went in favour of the candidate       against whose name the button was pressed on the electronic voting       machine.a&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/the-statesman-smriti-sharma-vasudeva-march-14-2017-evms-how-transparent-is-the-indian-election-process'&gt;https://cis-india.org/internet-governance/news/the-statesman-smriti-sharma-vasudeva-march-14-2017-evms-how-transparent-is-the-indian-election-process&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>praskrishna</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Digital India</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2017-03-17T01:57:19Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/european-union-draft-report-admonishes-mass-surveillance">
    <title>European Union Draft Report Admonishes Mass Surveillance, Calls for Stricter Data Protection and Privacy Laws</title>
    <link>https://cis-india.org/internet-governance/blog/european-union-draft-report-admonishes-mass-surveillance</link>
    <description>
        &lt;b&gt;Ever since the release of the “Snowden files”, the secret documents evidencing the massive scale of surveillance undertaken by America’s National Security Agency and publicly released by whistle-blower Edward Snowden, surveillance in the digital age has come to the fore of the global debate on internet governance and privacy.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The Committee on Civil Liberties, Justice and Home Affairs of the European Parliament in its draft report on global surveillance has issued a scathing indictment of the activities of the NSA and its counterparts in other member nations and is a welcome stance taken by an international body that is crucial to the fight against surveillance.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The "European Parliament &lt;a class="external-link" href="http://www.europarl.europa.eu/sides/getDoc.do?pubRef=-//EP//NONSGML%2BCOMPARL%2BPE-526.085%2B02%2BDOC%2BPDF%2BV0//EN"&gt;Draft Report&lt;/a&gt; on the US NSA surveillance programme, surveillance bodies in various Member States and their impact on EU citizens’ fundamental rights and on transatlantic cooperation in Justice and Home Affairs" released on the 8&lt;sup&gt;th&lt;/sup&gt; of January, 2014, comprehensively details and critiques the mass surveillance being undertaken by government agencies in the USA as well as within the EU, from a human rights and privacy perspective. The report examines the extent to which surveillance systems are employed by the USA and EU member-states, and declares these systems in their current avatars to be unlawful and in breach of international obligations and fundamental constitutional rights including &lt;i&gt;"the freedom of expression, of the press, of thought, of conscience, of religion and of association, private life, data protection, as well as the right to an effective remedy, the presumption of innocence and the right to a fair trial and non-discrimination"&lt;/i&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Furthermore, the report points to the erosion of trust between the EU and the US as well as amongst member states as an outcome of such secret surveillance, and criticises and calls for a suspension of the data-sharing and transfer agreements like the Terrorist Finance Tracking Program (TFTP), which share personal information about EU citizens with the United States, after examining the inadequacy of the US Safe Harbour Privacy principles in ensuring the security of such information.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;After considering the secret and unregulated nature of these programmes, the report points to the need of restricting surveillance systems and criticizes the lack of adequate data protection laws and privacy laws which adhere to basic principles such as necessity, proportionality and legality.. It also questions the underlying motives of these programmes as mere security-tools and points to the possible existence of political and economic motives behind their deployment. Recognizing the pitfalls of surveillance and the terrible potential for misuse, the report "&lt;i&gt;condemns in the strongest possible terms the vast, systemic, blanket collection of the personal data of innocent people, often comprising intimate personal information; emphasises that the systems of mass, indiscriminate surveillance by intelligence services constitute a serious interference with the fundamental rights of citizens; stresses that privacy is not a luxury right, but that it is the foundation stone of a free and democratic society; points out, furthermore, that mass surveillance has potentially severe effects on the freedom of the press, thought and speech, as well as a significant potential for abuse of the information gathered against political adversaries."&lt;/i&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Amongst the recommendations in the 51-page report are calls for a prohibition of mass surveillance and bulk data collection, and an overhaul of the existing systems of data-protection across the European Union and in the US to recognize and strengthen the right to privacy of their citizens, as well as the implementation of democratic oversight mechanisms to check security and intelligence agencies. It also calls for a review of data-transfer programmes and ensuring that standards of privacy and other fundamental rights under the European constitution are met. The committee sets out a 7-point plan of action, termed the European Digital Habeus Corpus for Protecting Privacy, including &lt;a class="external-link" href="http://www.europarl.europa.eu/news/en/news-room/content/20130502BKG07917/html/QA-on-EU-data-protection-reform"&gt;adopting the Data Protection Package&lt;/a&gt;, suspending data transfers to the US until a more comprehensive data protection regime is through an Umbrella Agreement, enhancing fundamental freedoms of expression and speech, particularly for whistleblowers, developing a European Strategy for IT independence and developing the EU as a reference player for democratic and neutral governance of the internet.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Though this draft report has no binding legal value as yet, the scathing criticism has assisted in calling to the attention of the global community the complex issues of internet governance and privacy and surveillance, and generated debate and discourse around the need for an overhaul of the current system. The recent decision of the US government to ‘democratize’ the internet by handing control of the DNS root zone to an international body, and thereby relinquishing a large part of its means of controlling the internet, is just one example of the systemic change &lt;a class="external-link" href="http://arstechnica.com/tech-policy/2014/03/in-sudden-announcement-us-to-give-up-control-of-dns-root-zone/"&gt;that this debate is generating&lt;/a&gt;.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/european-union-draft-report-admonishes-mass-surveillance'&gt;https://cis-india.org/internet-governance/blog/european-union-draft-report-admonishes-mass-surveillance&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>divij</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2014-09-30T08:52:45Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/news/livemint-january-17-2014-moulishree-srivastava-elizabeth-roche-eu-parliament-slams-us-surveillance">
    <title>EU parliament report slams US surveillance</title>
    <link>https://cis-india.org/news/livemint-january-17-2014-moulishree-srivastava-elizabeth-roche-eu-parliament-slams-us-surveillance</link>
    <description>
        &lt;b&gt;Report that outlines need for stringent laws for protecting citizen privacy, democratizing Internet governance holds lessons for India, say analysts.&lt;/b&gt;
        &lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;The article by Moulishree Srivastava and Elizabeth Roche quotes Sunil Abraham. It was &lt;a class="external-link" href="http://www.livemint.com/Home-Page/nYXiR4LEVJLiROfl95aFxH/EU-parliament-report-slams-US-surveillance.html"&gt;published in Livemint&lt;/a&gt; on January 17, 2014.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;A European Union (EU) parliament report that outlines the need for stringent laws for protecting citizen privacy, democratizing Internet governance and rebuilding trust between Europe and the US holds many lessons for India, analysts and policymakers say.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The US government listened into Indian communications as part of its massive global surveillance, which was exposed last year in leaks to the media. The embassies of France, Italy, Greece, Japan, Mexico, South Korea and Turkey were also subjected to the surveillance put in place after the September 2001 terrorist attacks. According to the external affairs ministry, India has registered its protest at least thrice over the issue with US authorities.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;A draft report on the US National Security Agency’s surveillance programme by the European parliament’s committee on civil liberties, justice and home affairs states that trust between the two transatlantic partners, trust among EU member-states, and trust between citizens and their governments were profoundly shaken because of the spying, and to rebuild trust in all these dimensions a comprehensive plan was urgently needed.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;"It is very doubtful that data collection of such magnitude is only guided by the fight against terrorism, as it involves the collection of all possible data of all citizens; points therefore to the possible existence of other power motives such as political and economic espionage," says the report.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The report recommends prohibiting blanket mass surveillance activities and bulk processing of personal data, and asks EU member-states, including the UK, Germany, France, Sweden and the Netherlands, to revise their national legislation and practices governing the activities of intelligence services to ensure that they are in line with the standards of the European Convention on Human Rights.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;It also calls on the US to revise its legislation without delay in order to bring it in line with international law, recognizing privacy and other rights as well as providing for judicial redress for EU citizens.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;"The American approach to privacy regulation has been deeply flawed. The US dominance over the Internet affects the structure and substance of Internet governance and among other human rights, the right to privacy," said Sunil Abraham, executive director of the Centre for Internet and Society, a Bangalore-based not-for-profit research organization. "The (EU) report, if implemented, may change the future of Internet governance by deepening the existing leadership provided by the EU in promoting their privacy standards globally."&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;On India’s rather restrained reaction to the spying, he said, “It is a tragedy that our politicians are not as proactive when it comes to protecting our rights. While India has only focused on changing its official email policy after the revelations of mass surveillance, it has done nothing as concrete and comprehensive as EU."&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;"There is neither the recognition of (the) pervasive nature of global mass surveillance, nor is there full appreciation (of) the damaging consequences," Abraham added.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;J. Satyanarayana, secretary in India’s department of electronics and information technology, said the concerns over privacy are the same for India as for the EU, but declined to comment on what preventive steps the government is implementing due to security reasons. The EU report called for concluding the EU-US umbrella pact, a framework agreement on data protection in the field of police and judicial cooperation, to ensure proper redress mechanisms for EU citizens in the event of data transfers from the EU to the US for law enforcement purposes. The report asks EU policymakers not to initiate any new sectoral agreements or arrangements for the transfer of personal data for law enforcement purposes and suggests suspending the terrorist finance tracking programme until the umbrella agreement negotiations are concluded.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;"EU wants to use EU-US umbrella agreement...to raise the US standards, to ensure the rights of EU citizens and perhaps all the citizens. All humans will need protection under US law as is currently the case in the EU,” said Abraham. “The prohibition of blanket surveillance that the report recommends will hopefully apply to all citizens regardless of their nationality."&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The draft report goes as far as suggesting suspending Safe Harbour, the legal instrument used for the transfer of EU personal data to the US through Google, Microsoft, Yahoo, Facebook, Apple and LinkedIn, until a full review has been conducted and current loopholes are plugged. The report’s proposals and recommendations are likely to be implemented after election to the European parliament in May.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In addition to reforms in the existing systems, the report outlines the importance of development of European clouds as it notes that trust in US cloud computing and cloud services providers has been affected by the surveillance practices.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;"Three of the major computerized reservation systems used by airlines worldwide are based in the US and that PNR (passenger name record) data are saved in cloud systems operating on US soil under US law...lacks data protection adequacy," states the report.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;C.U. Bhaskar, analyst with the South Asia Monitor think tank, was of the view that India had “adequately” responded to the US through quiet diplomacy.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;"It is unlikely that the US will give up cyber surveillance,” he said, adding, “We should acquire our own capacity to ensure adequate defensive and offensive firewalls and build up appropriate capacity for our cyber programmes."&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;"Given our expertise in the IT (information technology) sector, as an analyst my opinion is that we have a reasonable capacity to build up our capabilities," Bhaskar added.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/news/livemint-january-17-2014-moulishree-srivastava-elizabeth-roche-eu-parliament-slams-us-surveillance'&gt;https://cis-india.org/news/livemint-january-17-2014-moulishree-srivastava-elizabeth-roche-eu-parliament-slams-us-surveillance&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>praskrishna</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2014-02-03T06:13:55Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/about/policies/ethical-research-guidelines">
    <title>Ethical Research Guidelines</title>
    <link>https://cis-india.org/about/policies/ethical-research-guidelines</link>
    <description>
        &lt;b&gt;The Centre for Internet and Society will endeavour to protect the physical, social and psychological well-being of those who participate in its research. The guidelines below state the necessary steps to follow while doing research.&lt;/b&gt;
        
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;p&gt;The ethical research guidelines require CIS staff and consultants to take the following steps while engaging in research.&lt;/p&gt;
&lt;ol&gt;
&lt;li style="text-align: justify;"&gt;Providing notice to the individual of the: Aims, methods, his/her right to abstain from participation in the research and his/her right to terminate at any time his/her participation; the confidential nature of his/her replies and any limits on such confidentiality.&lt;br /&gt;&lt;br /&gt;&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;Providing informants and other participants the right to remain anonymous.&lt;br /&gt;&lt;br /&gt;&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;Taking informed consent from the individual that he/she agrees to participate. If children are involved in the research, informed consent will be taken from the parents. Informed consent will entail communicating :&lt;br /&gt;
&lt;ul&gt;
&lt;li&gt;Purpose(s) of the study, and the anticipated consequences of the research;&lt;/li&gt;
&lt;li&gt;Identity of funders and sponsors;&lt;/li&gt;
&lt;li&gt;Anticipated uses of the data;&lt;/li&gt;
&lt;li&gt;The degree of anonymity and confidentiality which may be afforded to informants and subjects.&lt;br /&gt;&lt;br /&gt;&lt;/li&gt;&lt;/ul&gt;
&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;Ensuring that when audio/visual-recorders and photographic records are being used, participants that are being recorded will be made aware of the use of the devices, and have the option to request that they not be used.&lt;br /&gt;&lt;br /&gt;&lt;/li&gt;
&lt;li&gt;Ensuring that the identity and identifying information of the participant (if not already in the public domain) is destroyed at the end of the project, unless the individual has consented otherwise.&lt;br /&gt;&lt;br /&gt;&lt;/li&gt;
&lt;li&gt;At public events organized by CIS, it will be announced and publicly posted that the event is being recorded. Individuals will be given the choice to object to being recorded or to having their name and organization shared in conference reports, blogs, articles, etc. If the individual does not object, it will be considered that they have given their consent.&lt;br /&gt;&lt;br /&gt;&lt;/li&gt;
&lt;li&gt;The Centre for Internet and Society strictly follows a policy of &lt;strong&gt;No Plagiarism&lt;/strong&gt;.&lt;/li&gt;&lt;/ol&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/about/policies/ethical-research-guidelines'&gt;https://cis-india.org/about/policies/ethical-research-guidelines&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>praskrishna</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Research</dc:subject>
    
    
        <dc:subject>Policies</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2018-10-13T12:21:48Z</dc:date>
   <dc:type>Page</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/ethical-issues-in-open-data">
    <title>Ethical Issues in Open Data</title>
    <link>https://cis-india.org/internet-governance/blog/ethical-issues-in-open-data</link>
    <description>
        &lt;b&gt;On August 1, 2013, I took part in a web meeting, organized and hosted by Tim Davies of the World Wide Web foundation. The meeting, titled “Ethical issues in Open Data,” had an agenda focused around privacy considerations in the context of the open data movement.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The main panelists, Carly Nyst and Sam Smith from &lt;a class="external-link" href="http://https//www.privacyinternational.org/"&gt;Privacy International&lt;/a&gt;, as well as Steve Song from the &lt;a class="external-link" href="http://www.idrc.ca/EN/Pages/default.aspx"&gt;International  Development Research Centre&lt;/a&gt;, were joined by roughly a dozen other privacy and development researchers from around the globe in the hour long session.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The primary issue of the meeting was the concern over modern capabilities of cross-analytics for de-anonymizing data sets and revealing personally identifiable information (PII) in open data. Open data can constitute publicly available information such as budgets, infrastructures, and population statistics, as long as the data meets the three open data characteristics: accessibility, machine readability, and availability for re-use. “Historically,” said Tim Davies, “public registers have been protected through obscurity.” However, both the capabilities of data analysts and the definition of personal data have continued to expand in recent years. This concern thus presents a conflict between researchers who advocate governments releasing open data reports, and researchers who emphasize privacy in the developing world.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Steve Song, advisor to IDRC Information &amp;amp; Networks program, spoke of the potential collateral damage that comes with publishing more and more types of information. Song addressed the imperative of the meeting in saying, “privacy needs to be a core part of open data conversation.” In his presentation, he gave a particularly interesting example of the tensions between public and private information implications. Following the infamous &lt;a class="external-link" href="http://en.wikipedia.org/wiki/Sandy_Hook_Elementary_School_shooting"&gt;2012 school shooting in Newtown, Connecticut&lt;/a&gt;, the information on Newtown’s gun permit owning citizens (made publicly available through America’s &lt;a class="external-link" href="http://foia.state.gov/"&gt;Freedom of Information Act&lt;/a&gt;) was aggregated into an interactive map which revealed the citizens’ addresses. This obviously became problematic for the Newtown community, as the map not only singled out homes which exercised their right to bear arms but also indirectly revealed which homes were without firearm protection and thereby more vulnerable to theft and crime. The Newtown example clearly demonstrates the relationship (and conflict) between open data and privacy; it resolves to the conflict between the right to information and the right to privacy.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;An apparent issue surrounding open data is its perceived binary nature. Many advocates either view data as being open, or not; any intermediary boundaries are only forms of governments limiting data accessibility. Therefore, a point raised by meeting attendee Raed Sharif aptly presented an open data counter-argument. Sarif noted how, inversely, privacy conceptions may form a threat to open data. He mentioned how governments could take advantage of privacy arguments to justify their refusal to publish open reports. &lt;br /&gt;&lt;br /&gt;However, Carly Nyst summarized the privacy concern and argument in her remarks near the end of the meeting. Namely, she reasoned that the open data mission is viable, if only limited to generic data, i.e., data about infrastructure, or other information that is in no way personal. Doing so will avoid obstructions of individual privacy. Until more advanced anonymization techniques can be achieved, which can overcome modern re-identification methods, publicly publishing PII may prove too risky. It was generally agreed upon during the meeting that open data is not inherently bad, and in fact its analysis and availability can be beneficial, but the threat of its misuse makes it dangerous. For the future of open data, researchers and advocates should perhaps consider more nuanced approaches to the concept in order to respect considerations for other ethical issues, such as privacy.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/ethical-issues-in-open-data'&gt;https://cis-india.org/internet-governance/blog/ethical-issues-in-open-data&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>kovey</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Open Data</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2013-08-07T09:19:54Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/ethical-data-design-practices-in-the-ai-artificial-intelligence-age">
    <title>Ethical Data Design Practices in the AI (Artificial Intelligence) Age</title>
    <link>https://cis-india.org/internet-governance/news/ethical-data-design-practices-in-the-ai-artificial-intelligence-age</link>
    <description>
        &lt;b&gt;Shweta Mohandas was a panelist at discussion on Ethical Data Design Practices in the AI (Artificial Intelligence) Age, organised by Startup Grind, Bangalore on July 28, 2018 at NUMA Bangalore. &lt;/b&gt;
        &lt;h2&gt;Agenda&lt;/h2&gt;
&lt;p&gt;&lt;b&gt;Ethical Data Design Practices in the AI (Artificial Intelligence) Age&lt;/b&gt;&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;The panel discussion is intended to explore the challenges we face when designing the user experiences of the complex behavioral agents that increasingly run our lives.&lt;/p&gt;
&lt;p dir="ltr"&gt;Discussion centred around how to:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Understand current thinking by the AI community on ethics and morality in computing and the challenges it presents. &lt;/li&gt;
&lt;li&gt;Explore examples of the ethical choices that products make now and will make in the near future.&lt;/li&gt;
&lt;li&gt;Learn how designers might approach designing experiences that face moral dilemmas.&lt;/li&gt;
&lt;/ul&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/ethical-data-design-practices-in-the-ai-artificial-intelligence-age'&gt;https://cis-india.org/internet-governance/news/ethical-data-design-practices-in-the-ai-artificial-intelligence-age&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Admin</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Artificial Intelligence</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2018-08-01T23:14:21Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/enlarging-the-small-print">
    <title>Enlarging the Small Print: A Study on Designing Effective Privacy Notices for Mobile Applications</title>
    <link>https://cis-india.org/internet-governance/blog/enlarging-the-small-print</link>
    <description>
        &lt;b&gt;The world’s biggest modern lie is often said to be the sentence “I have read and agreed to the Terms and Conditions.” It is a well-known fact, backed by empirical research, that consumers often skip reading cumbersome privacy notices. The reasons range from their length and complicated legal jargon to the inopportune moments at which these notices are displayed. This paper seeks to compile and analyse the different simplified designs of privacy notices that have been proposed for mobile applications, designs that encourage consumers to make informed privacy decisions.&lt;/b&gt;
        &lt;h2 style="text-align: justify; "&gt;Introduction: Ideas of Privacy and Consent Linked with Notices&lt;/h2&gt;
&lt;h3 style="text-align: justify; "&gt;The Notice and Choice Model&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Most modern laws and data privacy principles seek to focus on individual control. As Alan Westin of Columbia University characterises privacy, "it is the claim of individuals, groups, or institutions to determine for themselves when, how, and to what extent information about them is communicated to other,"	&lt;a href="#_ftn1" name="_ftnref1"&gt;[1]&lt;/a&gt; Or simply put, personal information privacy is "the ability of the individual to personally control 	information about himself."&lt;a href="#_ftn2" name="_ftnref2"&gt;[2]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The preferred mechanism for protecting online privacy that has emerged is that of Notice and Choice.&lt;a href="#_ftn3" name="_ftnref3"&gt;[3]&lt;/a&gt; The model, identified as "the most fundamental principle" in online privacy,&lt;a href="#_ftn4" name="_ftnref4"&gt;[4]&lt;/a&gt; refers to&lt;a href="http://itlaw.wikia.com/wiki/Post" title="Post"&gt;consumers&lt;/a&gt; consenting to privacy policies before availing of an online service.	&lt;a href="#_ftn5" name="_ftnref5"&gt;[5]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The following 3 standards of expectations of privacy in electronic communications have emerged in the United States courts:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;KATZ TEST: Katz v. United States,&lt;a href="#_ftn6" name="_ftnref6"&gt;[6]&lt;/a&gt; a wiretap case, established the expectation of privacy as one society is prepared to recognize as "reasonable".&lt;a href="#_ftn7" name="_ftnref7"&gt;[7]&lt;/a&gt; This concept is critical to a court's understanding of a new technology because there is no established precedent to guide its analysis.&lt;a href="#_ftn8" name="_ftnref8"&gt;[8]&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;KYLLO/KYLLO-KATZ HYBRID TEST: Society's reasonable expectation of privacy is higher when dealing with a new technology that is not "generally available to the public".&lt;a href="#_ftn9" name="_ftnref9"&gt;[9]&lt;/a&gt; This follows the logic that it is reasonable to expect common data collection practices to be used, but not rare ones.&lt;a href="#_ftn10" name="_ftnref10"&gt;[10]&lt;/a&gt; In Kyllo v. United States,&lt;a href="#_ftn11" name="_ftnref11"&gt;[11]&lt;/a&gt; law enforcement used a thermal imaging device to observe the relative heat levels inside a house. Though, as per Katz, the publicly available thermal radiation technology was reasonable, the uncommon means of collection was not. This modification to the Katz standard is extremely important in the context of mobile privacy. Mobile communications may be subdivided into smaller parts: the audio of a phone call, e-mail, and data related to a user's current location. Following an application of the hybrid Katz/Kyllo test, the reasonable expectation of privacy in each of those communications would be determined separately,&lt;a href="#_ftn12" name="_ftnref12"&gt;[12]&lt;/a&gt; by evaluating the general accessibility of the technology required to capture each stream.&lt;a href="#_ftn13" name="_ftnref13"&gt;[13]&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;DOUBLE CLICK TEST: DoubleClick&lt;a href="#_ftn14" name="_ftnref14"&gt;[14]&lt;/a&gt; illustrates the potential problems of transferring consent to a third party, one to whom the user never provided direct consent or of whom the user is not even aware. The court held that for DoubleClick, an online advertising network, to collect information from a user it needed only to obtain permission from the website that the user accessed, and not from the user himself. The court reasoned that the information the user disclosed to the website was analogous to information one discloses to another person during a conversation. Just as the other party to the conversation would be free to tell his friends anything that was said, a website should be free to disclose any information it receives from a user's visit after the user has consented to use the website's services.&lt;/li&gt;
&lt;/ol&gt;
&lt;p style="text-align: justify; "&gt;These interpretations have weakened the standards of online privacy. While the Katz test vaguely hinges on societal expectations, the Kyllo Test to an 	extent strengthens privacy rights by disallowing uncommon methods of collection, but as the DoubleClick Test illustrates, once the user has consented to 	such practices he cannot object to the same. There have been sugestions to consider personal information as property when it shares features of property 	like location data.&lt;a href="#_ftn15" name="_ftnref15"&gt;[15]&lt;/a&gt; It is fixed when it is in storage, it has a monetary value, and it is sold and traded on a regular basis. This would create a standard where consent is required for third-party access.	&lt;a href="#_ftn16" name="_ftnref16"&gt;[16]&lt;/a&gt; Consent will then play a more pivotal role in affixing liability.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The notice and choice mechanism is designed to put individuals in charge of the collection and use of their personal information. In theory, the regime preserves user autonomy by putting the individual in charge of decisions about the collection and use of personal information.	&lt;a href="#_ftn17" name="_ftnref17"&gt;[17]&lt;/a&gt; Notice and choice is asserted as a substitute for regulation because it is thought to be more 	flexible, inexpensive to implement, and easy to enforce.&lt;a href="#_ftn18" name="_ftnref18"&gt;[18]&lt;/a&gt; Additionally, notice and choice can legitimize an information practice, whatever it may be, by obtaining an individual's consent and suit individual privacy preferences.	&lt;a href="#_ftn19" name="_ftnref19"&gt;[19]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;However, the notice and choice mechanism is often criticized for leaving users uninformed-or misinformed, at least-as people rarely see, read, or understand 	privacy notices. &lt;a href="#_ftn20" name="_ftnref20"&gt;[20]&lt;/a&gt; Moreover, few people opt out of the collection, use, or disclosure of their data when 	presented with the choice to do so.&lt;a href="#_ftn21" name="_ftnref21"&gt;[21]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Amber Sinha of the Centre for Internet and Society argues that consent in these scenarios Is rarely meaningful as consumers fail to read/access privacy 	policies, understand the consequences and developers do not provide them the choice to opt out of a particular data practice while still being allowed to 	use their services. &lt;a href="#_ftn22" name="_ftnref22"&gt;[22]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Of particular concern is the use of software applications (apps) designed to work on mobile devices. Estimates place the current number of apps available 	for download at more than 1.5 million, and that number is growing daily.&lt;a href="#_ftn23" name="_ftnref23"&gt;[23]&lt;/a&gt; A 2011 Google study, "The 	Mobile Movement," identified that mobile devices are viewed as extensions of ourselves that we share with deeply personal relations with, raising 	fundamental questions of how apps and other mobile communications influence our privacy decision-making.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Recent research indicates that mobile device users have concerns about the privacy implications of using apps.	&lt;a href="#_ftn24" name="_ftnref24"&gt;[24]&lt;/a&gt; The research finds that almost 60 percent of respondents ages 50 and older decided not to install an 	app because of privacy concerns (see figure 1).&lt;a href="#_ftn25" name="_ftnref25"&gt;[25]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;img src="https://cis-india.org/home-images/ConsumerReactions.png" alt="Consumer Reactions" class="image-inline" title="Consumer Reactions" /&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Because no standards currently exist for providing privacy notice disclosure for apps, consumers may find it difficult to understand what data the app is 	collecting, how those data will be used, and what rights users have in limiting the collection and use of their data. Many apps do not provide users with privacy policy statements, making it impossible for app users to know the privacy implications of using a particular app.	&lt;a href="#_ftn26" name="_ftnref26"&gt;[26]&lt;/a&gt;Apps can make use of any or all of the device's functions, including contact lists, calendars, phone 	and messaging logs, locational information, Internet searches and usage, video and photo galleries, and other possibly sensitive information. For example, 	an app that allows the device to function as a scientific calculator may be accessing contact lists, locational data, and phone records even though such 	access is unnecessary for the app to function properly. &lt;a href="#_ftn27" name="_ftnref27"&gt;[27]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Other apps may have privacy policies that are confusing or misleading. For example, an analysis of health and fitness apps found that more than 30 percent 	of the apps studied shared data with someone not disclosed in the app's privacy policy.&lt;a href="#_ftn28" name="_ftnref28"&gt;[28]&lt;/a&gt;&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Types of E-Contracts&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;Margaret Radin distinguishes two models of direct e-contracts based on consent as -"contract-as-consent" and "contract-as-product."	&lt;a href="#_ftn29" name="_ftnref29"&gt;[29]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The contract-as-consent model is the traditional picture of how binding commitment is arrived at between two humans. It involves a meeting of the minds 	which implies that terms be understood, alternatives be available, and probably that bargaining be possible.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In the contract-as-product model, the terms are part of the product, not a conceptually separate bargain; physical product plus terms are a package deal. 	For example the fact that a chip inside an electronics item will wear out after a year is an unseen contract creating a take-it-or-leave-it choice not to 	buy the package.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The product-as-consent model defies traditional ideas of consent and raises questions of whether consent is meaningful. Modern day e-contracts such as 	click wrap, shrink wrap, viral contracts and machine-made contracts which form the privacy policy of several apps have a product-as-consent approach where 	consumers are given the take-it-or-leave-it option.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Mobile application privacy notices fall into the product-as-consent model. Consumers often have to click "I agree" to all the innumerable Terms and 	Conditions in order to install the app. For instance terms that the fitness app will collect biometric data is a feature of the product that is 	non-negotiable. It is a classic take-it-or-leave-it approach where consumers compromise on privacy to avail services.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Contracts that facilitate these transactions are generally long and complicated and often agreed to by consumers without reading them.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Craswell strikes a balance in applying the liability rule to point out that as explaining the meaning of extensive fine print would be very costly to point 	out it could be efficient to affix the liability rule not as a written contract but rather on "reasonable" terms. This means that if a fitness app collects 	sensitive financial information, which is unreasonable given its core activities, then even if the user has consented to the same in the privacy policy's 	fine print the contract should be capable of being challenged.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt; &lt;/b&gt;&lt;/p&gt;
&lt;h2&gt;The Concept of Privacy by Design&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;Privacy needs to be considered from the very beginning of system development. For this reason, Dr. Anne Cavoukian	&lt;a href="#_ftn30" name="_ftnref30"&gt;[30]&lt;/a&gt; coined the term "Privacy by Design", that is, privacy should be taken into account throughout the 	entire engineering process from the earliest design stages to the operation of the productive system. This holistic approach is promising, but it does not 	come with mechanisms to integrate privacy in the development processes of a system. The privacy-by-design approach, i.e. that data protection safeguards 	should be built into products and services from the earliest stage of development, has been addressed by the European Commission in their proposal for a 	General Data Protection Regulation. This proposal uses the terms "privacy by design" and "data protection by design" synonymously.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The 7 Foundational Principles&lt;a href="#_ftn31" name="_ftnref31"&gt;[31]&lt;/a&gt; of Privacy by Design are:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;Proactive not Reactive; Preventative not Remedial&lt;/li&gt;
&lt;li&gt;Privacy as the Default Setting&lt;/li&gt;
&lt;li&gt;Privacy Embedded into Design&lt;/li&gt;
&lt;li&gt;Full Functionality - Positive-Sum, not Zero-Sum&lt;/li&gt;
&lt;li&gt;End-to-End Security - Full Lifecycle Protection&lt;/li&gt;
&lt;li&gt;Visibility and Transparency - Keep it Open&lt;/li&gt;
&lt;li&gt;Respect for User Privacy - Keep it User-Centric&lt;/li&gt;
&lt;/ol&gt;
&lt;p style="text-align: justify; "&gt;Several terms have been introduced to describe types of data that need to be protected. A term very prominently used by industry is "personally 	identifiable information (PII)", i.e., data that can be related to an individual. Similarly, the European data protection framework centres on "personal 	data". However, some authors argue that this falls short since also data that is not related to a single individual might still have an impact on the 	privacy of groups, e.g., an entire group might be discriminated with the help of certain information. For data of this category the term "privacy-relevant 	data" has been used. &lt;a href="#_ftn32" name="_ftnref32"&gt;[32]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;An essential part of Privacy by Design is that data subjects should be adequately informed whenever personal data is processed. Whenever data subjects use 	a system, they should be informed about which information is processed, for what purpose, by which means and who it is shared is with. They should be 	informed about their data access rights and how to exercise them.&lt;a href="#_ftn33" name="_ftnref33"&gt;[33]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Whereas system design very often does not or barely consider the end-users' interests, but primarily focuses on owners and operators of the system, it is 	essential to account the privacy and security interests of all parties involved by informing them about associated advantages (e.g. security gains) and 	disadvantages (e.g. costs, use of resources, less personalisation). By creating this system of "multilateral security" the demands of all parties must be 	realized.&lt;a href="#_ftn34" name="_ftnref34"&gt;[34]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;&lt;span&gt; &lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;The Concept of Data Minimization&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The most basic privacy design strategy is MINIMISE, which states that the amount of personal data that is processed should be restricted to the minimal 	amount possible. By ensuring that no, or no unnecessary, data is collected, the possible privacy impact of a system is limited. Applying the MINIMISE 	strategy means one has to answer whether the processing of personal data is proportional (with respect to the purpose) and whether no other, less invasive, 	means exist to achieve the same purpose. The decision to collect personal data can be made at design time and at run time, and can take various forms. For 	example, one can decide not to collect any information about a particular data subject at all. Alternatively, one can decide to collect only a limited set 	of attributes.&lt;a href="#_ftn35" name="_ftnref35"&gt;&lt;sup&gt;&lt;sup&gt;[35]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;If a company collects and retains large amounts of data, there is an increased risk that the data will be used in a way that departs from consumers' 	reasonable expectations.&lt;a href="#_ftn36" name="_ftnref36"&gt;&lt;sup&gt;&lt;sup&gt;[36]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;There are three privacy protection goals&lt;a href="#_ftn37" name="_ftnref37"&gt;&lt;sup&gt;&lt;sup&gt;[37]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; that data minimization and privacy by 	design seek to achieve. These privacy protection goals are:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Unlinkability - To prevent data being linked to an identifiable entity&lt;/li&gt;
&lt;li&gt;Transparency - The information has to be available before, during and after the processing takes place.&lt;/li&gt;
&lt;li&gt;Intervenability - Those who provide their data must have means of intervening in all ongoing or planned privacy-relevant data processing.&lt;/li&gt;
&lt;/ul&gt;
&lt;p style="text-align: justify; "&gt;Spiekermann and Cranor raised an intriguing point in their paper, they argued that those companies that employ privacy by design and data minimization practices in their applications should be allowed to skip the need for privacy policies and forgo need for notice and choice features.	&lt;a href="#_ftn38" name="_ftnref38"&gt;&lt;sup&gt;&lt;sup&gt;[38]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;b&gt;&lt;span&gt; &lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;table style="text-align: justify; "&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;
&lt;div&gt;
&lt;p&gt;&lt;b&gt;To Summarise: &lt;i&gt;The emerging model and legal dialogue that regulates online privacy is that of Notice and Choice, which has been severely criticised for not creating informed choice-making processes. E-contracts such as agreements to privacy notices follow the contract-as-product model. When there is extensive fine print, liability must be affixed on the basis of reasonable terms. Privacy notices must incorporate the concepts of Privacy by Design by providing complete information and collecting minimum data.&lt;/i&gt;&lt;/b&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;
&lt;h2 style="text-align: justify; "&gt;Features of Privacy Notices in the Current Mobile Ecosystem&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;A privacy notice inform a system's users or a company's customers of data practices involving personal information. Internal practices with regard to the 	collection, processing, retention, and sharing of personal information should be made transparent.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Each app a user chooses to install on his smartphone can access different information stored on that device. There is no automatic access to user 	information. Each application has access only to the data that it pulls into its own 'sandbox'.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The sandbox is a set of fine-grained controls limiting an application's access to files, preferences, network resources, hardware etc. Applications cannot 	access each other's sandboxes.&lt;a href="#_ftn39" name="_ftnref39"&gt;[39]&lt;/a&gt; The data that makes it into the sandbox is normally defined by user permissions.&lt;a href="#_ftn40" name="_ftnref40"&gt;[40]&lt;/a&gt; These are a set of user defined controls&lt;a href="#_ftn41" name="_ftnref41"&gt;[41]&lt;/a&gt;and evidence that a user consents to the application accessing that data.	&lt;a href="#_ftn42" name="_ftnref42"&gt;[42]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;To gain permission mobile apps generally display privacy notices that explicitly seek consent. These can leverage different channels, including a privacy 	policy document posted on a website or linked to from mobile app stores or mobile apps. For example, Google Maps uses a traditional clickwrap structure that requires the user to agree to a list of terms and conditions when the program is initially launched.	&lt;a href="#_ftn43" name="_ftnref43"&gt;[43]&lt;/a&gt; Foursquare, on the other hand, embeds its terms in a privacy policy posted on its website, and not 	within the app. &lt;a href="#_ftn44" name="_ftnref44"&gt;[44]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This section explains the features of current privacy notices on the 4 parameters of stage (at which the notice is given), content, length and user 	comprehension. Under each of these parameters the associated problems are identified and alternatives are suggested.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt; &lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;(1) &lt;/b&gt; &lt;b&gt;Timing and Frequency of Notice: &lt;br /&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;i&gt; This sub-section identifies the various stages that notices are given and highlights their advantages, disadvantages and makes recommendations. It 		concludes with the findings of a study on what the ideal stage to provide notice is. This is supplemented with 2 critical models to address the common 		problems of habituation and contextualization. &lt;/i&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt; &lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt; Studies indicate that timing of notices or the stage at which they are given impact how consumer's recall and comprehend them and make choices 		accordingly. &lt;/b&gt; &lt;a href="#_ftn45" name="_ftnref45"&gt;[45]&lt;/a&gt; &lt;b&gt; I&lt;/b&gt; ntroducing only a 15-second delay between the presentation of privacy notices and privacy relevant choices can be enough to render notices ineffective at 	driving user behaviour.&lt;a href="#_ftn46" name="_ftnref46"&gt;[46]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Google Android and Apple iOS provide notices at different times. At the time of writing, Android users are shown a list of requested permissions while the 	app is being installed, i.e., after the user has chosen to install the app. In contrast, iOS shows a dialog during app use, the first time a permission is 	requested by an app. This is also referred to as a "just-in-time" notification. &lt;a href="#_ftn47" name="_ftnref47"&gt;[47]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The following are the stages in which a notice can be given:&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;1) NOTICE AT SETUP: Notice can be provided when a system is used for the first time&lt;a href="#_ftn48" name="_ftnref48"&gt;[48]&lt;/a&gt;. For instance, as 	part of a software installation process users are shown and have to accept the system's terms of use.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;a) &lt;span&gt;Advantages&lt;/span&gt;: Users can inspect a system's data practices before using or purchasing it. The system developer is benefitted due to liability and 	transparency reasons that gain user trust. It provides the opportunity to explain unexpected data practices that may have a benign purpose in the context 	of the system&lt;a href="#_ftn49" name="_ftnref49"&gt;[49]&lt;/a&gt;. It can even impact purchase decisions. Egelman et al. found that participants were more 	likely to pay a premium at a privacy-protective website when they saw privacy information in search results, as opposed to on the website after selecting a 	search result&lt;a href="#_ftn50" name="_ftnref50"&gt;[50]&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;b) Disadvantages: Users have become largely habituated to install time notices and ignore them&lt;a href="#_ftn51" name="_ftnref51"&gt;[51]&lt;/a&gt;. Users 	may have difficulty making informed decisions because they have not used the system yet and cannot fully assess its utility or weigh privacy trade-offs. They may also be focused on the primary task, namely completing the setup process to be able to use the system, and fail to pay attention to notices	&lt;a href="#_ftn52" name="_ftnref52"&gt;[52]&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;c) Recommendations: Privacy notices provided at setup time should be concise and focus on data practices immediately relevant to the primary user rather 	than presenting extensive terms of service. Integrating privacy information into other materials that explain the functionality of the system may further 	increase the chance that users do not ignore it.&lt;a href="#_ftn53" name="_ftnref53"&gt;[53]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;2) JUST IN TIME NOTICE: A privacy notice can be shown when a data practice is active, for example when information is being collected, used, or shared. 	Such notices are referred to as "contextualized" or "just-in-time" notices&lt;a href="#_ftn54" name="_ftnref54"&gt;[54]&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;a) Advantages: They enhance transparency and enable users to make privacy decisions in context. Users have also been shown to more freely share information 	if they are given relevant explanations at the time of data collection&lt;a href="#_ftn55" name="_ftnref55"&gt;[55]&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;b) Disadvantages: Habituation can occur if these are shown too frequently. Moreover in apps such as gaming apps users generally tend to ignore notices 	displayed during usage.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;c) Recommendations: Consumers can be given notice the first time a particular type of information is accessed such as email and then be given the option to 	opt out of further notifications. A Consumer may then seek to opt out of notices on email but choose to view all notices on health information that is 	accessed depending on his privacy priorities.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;3) CONTEXT-DEPENDENT NOTICES: The user's and system's context can also be considered to show additional notices or controls if deemed necessary	&lt;a href="#_ftn56" name="_ftnref56"&gt;[56]&lt;/a&gt;. Relevant context may be determined by a change of location, additional users included in or receiving 	the data, and other situational parameters. Some locations may be particularly sensitive, therefore users may appreciate being reminded that they are 	sharing their location when they are in a new place, or when they are sharing other information that may be sensitive in a specific context. Facebook introduced a privacy checkup message in 2014 that is displayed under certain conditions before posting publicly. It acts as a "nudge"	&lt;a href="#_ftn57" name="_ftnref57"&gt;[57]&lt;/a&gt; to make users aware that the post will be public and to help them manage who can see their posts.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;a) Advantages: It may help users make privacy decisions that are more aligned with their desired level of privacy in the respective situation and thus 	foster trust in the system.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;b) Disadvantages: Challenges in providing context-dependent notices are detecting relevant situations and context changes. Furthermore, determining whether a context is relevant to an individual's privacy concerns could in itself require access to that person's sensitive data and privacy preferences.	&lt;a href="#_ftn58" name="_ftnref58"&gt;[58]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;c) Recommendations: Standards must be evolved to determine a contextual model based on user preferences.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;4) PERIODIC NOTICES: These are shown the first couple of times a data practice occurs, or every time. The sensitivity of the data practice may determine 	the appropriate frequency.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;a) Advantages: It can further help users maintain awareness of privacy-sensitive information flows especially when data practices are largely invisible	&lt;a href="#_ftn59" name="_ftnref59"&gt;[59]&lt;/a&gt;such as in patient monitoring apps. This helps provide better control options.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;b) Disadvantages: Repeating notices can lead to notice fatigue and habituation&lt;a href="#_ftn60" name="_ftnref60"&gt;[60]&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;c) Recommendations: Frequency of these notices needs to be balanced with user needs. &lt;a href="#_ftn61" name="_ftnref61"&gt;[61]&lt;/a&gt; Data practices 	that are reasonably expected as part of the system may require only a single notice, whereas practices falling outside the expected context of use which 	the user is potentially unaware of may warrant repeated notices. Periodic notices should be relevant to users in order to be not perceived as annoying. A combined notice can remind about multiple ongoing data practices. Rotating warnings or changing their look can also further reduce habituation effects	&lt;a href="#_ftn62" name="_ftnref62"&gt;[62]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;5) PERSISTENT NOTICES: A persistent indicator is typically non-blocking and may be shown whenever a data practices is active, for instance when information 	is being collected continuously or when information is being transmitted&lt;a href="#_ftn63" name="_ftnref63"&gt;[63]&lt;/a&gt;. When inactive or not shown, 	persistent notices also indicate that the respective data practice is currently not active. For instance, Android and iOS display a small icon in the 	status bar whenever an application accesses the user's location.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;a) Advantages: These are easy to understand and not annoying increasing their functionality.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;b) Disadvantages: These ambient indicators often go unnoticed.&lt;a href="#_ftn64" name="_ftnref64"&gt;[64]&lt;/a&gt; Most systems can only accommodate such 	indicators for a small number of data practices.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;c) Recommendations: Persistent indicators should be designed to be noticeable when they are active. A system should only provide a small set of persistent 	indicators to indicate activity of especially critical data practices which the user can also specify.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;6) NOTICE ON DEMAND: Users may also actively seek privacy information and request a privacy notice. A typical example is posting a privacy policy at a persistent location&lt;a href="#_ftn65" name="_ftnref65"&gt;[65]&lt;/a&gt; and providing links to it from the app.	&lt;a href="#_ftn66" name="_ftnref66"&gt;[66]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;a) Advantages: Privacy sensitive users are given the option to better explore policies and make informed decisions.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;b) Disadvantages: The current model of a link to a long privacy policy on a website will discourage users from requesting for information that they cannot 	fully understand and do not have time to read.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;c) Recommendations: Better option are privacy settings interfaces or privacy dashboards within the system that provide information about data practices; 	controls to manage consent; summary reports of what information has been collected, used, and shared by the system; as well as options to manage or delete 	collected information. Contact information for a privacy office should be provided to enable users to make written requests.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Which of these Stages is the Most Ideal?&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;In a series of experiments, Rebecca Balekabo and others &lt;a href="#_ftn67" name="_ftnref67"&gt;[67]&lt;/a&gt; have identified the impact of timing on 	smartphone privacy notices. The following 5 conditions were imposed on participants who were later tested on their levels of recall of the notices through 	questions:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt; Not Shown: The participants installed and used the app without being shown a privacy notice&lt;/li&gt;
&lt;li&gt;App Store: Notice was shown at the time of installation at the app store&lt;/li&gt;
&lt;li&gt;App store Big: A large notice occupying more screen space was shown at the app store&lt;/li&gt;
&lt;li&gt;App Store Popup: A smaller popup was displayed at the app Store&lt;/li&gt;
&lt;li&gt;During use: Notice was shown during usage of the app&lt;/li&gt;
&lt;/ul&gt;
&lt;p style="text-align: justify; "&gt;The results (Figure) suggest that even if a notice contains information users care about, it is unlikely to be recalled if only shown in the app store and 	more effective when shown during app usage.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Seeing the app notice during app usage resulted in better recall. Although participants remembered the notice shown after app use as well as in other 	points of app use, they found that it was not a good point for them to make decisions about the app because they had already used it, and participants 	preferred when the notice was shown during or before app usage.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Hence depending on the app there are optimal times to show smartphone privacy notices to maximize attention and recall with preference being given to the 	beginning of or during app use.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;However several of these stages as outlined baove face the disadvantages of habituation and uncertainty on contextualization. The following 2 models have 	been proposed to address this:&lt;/p&gt;
&lt;h2&gt;Habituation&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;When notices are shown too frequently, users may become habituated. Habituation may lead users to disregard warnings, often without reading or comprehending the notice&lt;a href="#_ftn68" name="_ftnref68"&gt;[68]&lt;/a&gt;. To reduce habituation from app permission notices, Felt et al. identified and tested a method to determine which permission requests should be emphasized.&lt;a href="#_ftn69" name="_ftnref69"&gt;[69]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;They categorized actions on the basis of revertibility, severity, initiation, alterability and the nature of approval (explained in the figure) and applied the following permission-granting mechanisms:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Automatic Grant: The permission must be requested by the developer, but it is granted without user involvement.&lt;/li&gt;
&lt;li&gt;Trusted UI elements: They appear as part of an application's workflow, but clicking on them imbues the application with a new permission. To ensure 	that applications cannot trick users, trusted UI elements can be controlled only by the platform. For example, a user who is sending an SMS message from a 	third-party application will ultimately need to press a button; using trusted UI means the platform provides the button.&lt;/li&gt;
&lt;li&gt;Confirmation Dialog: Runtime consent dialogs interrupt the user's flow by prompting them to allow or deny a permission and often contain 	descriptions of the risk or an option to remember the decision.&lt;/li&gt;
&lt;li&gt;Install-time warning: These integrate permission granting into the installation flow. Installation screens list the application's requested 	permissions. In some platforms (e.g., Facebook), the user can reject some install-time permissions. In other platforms (e.g., Android and Windows 8 Metro), 	the user must approve all requested permissions or abort installation.&lt;a href="#_ftn70" name="_ftnref70"&gt;[70]&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p style="text-align: justify; "&gt;Based on these categories, the following sequential model was proposed for the system to determine the frequency of displaying notices:&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;img src="https://cis-india.org/home-images/SequentialModel.png/@@images/6a94f50d-4bd0-4566-bc30-32d5ef3f53d3.png" alt="Sequential Model" class="image-inline" title="Sequential Model" /&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Initial tests have proven successful in reducing habituation effects, and this is an important step towards designing and displaying privacy notices.&lt;/p&gt;
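&lt;p style="text-align: justify; "&gt;The mapping from action characteristics to granting mechanisms can be sketched in code. This is a minimal illustration under simplified, assumed attributes; the field names are illustrative rather than Felt et al.'s exact taxonomy:&lt;/p&gt;

```python
from dataclasses import dataclass

@dataclass
class Action:
    # Illustrative, simplified attributes of a permission-protected action
    reversible: bool       # can the effect be undone after the fact?
    severe: bool           # could the action cause lasting harm or cost?
    user_initiated: bool   # does the action follow a clear user gesture?

def choose_mechanism(action: Action) -> str:
    """Pick the least interruptive granting mechanism that still protects
    the user, reserving interruptions for severe or irreversible actions
    so that users are not habituated to warnings."""
    if action.reversible and not action.severe:
        return "automatic grant"       # no user involvement needed
    if action.user_initiated:
        return "trusted UI element"    # the platform-drawn button doubles as consent
    if action.severe:
        return "confirmation dialog"   # interrupt only for high-risk requests
    return "install-time warning"      # otherwise, list at installation
```

&lt;p style="text-align: justify; "&gt;On this sketch, a reversible, low-severity action would be granted automatically, while a severe action with no clear user gesture would warrant an interruptive dialog.&lt;/p&gt;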
&lt;h2 style="text-align: justify; "&gt;Contextualization&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;Bastian Könings and others, in their paper "Towards Context Adaptive Privacy Decisions in Ubiquitous Computing"&lt;a href="#_ftn71" name="_ftnref71"&gt;[71]&lt;/a&gt;, propose a system for supporting a user's privacy decisions in situ, i.e., in the context in which they are required, following the notion of contextual integrity. It approximates the user's privacy preferences and adapts them to the current context. The system can then either recommend sharing decisions and actions or autonomously reconfigure privacy settings. It is divided into the following stages:&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;img src="https://cis-india.org/home-images/PrivacyDecisionProcess.png/@@images/4dd72aef-1bb1-42d9-ae59-9592b2a36b9f.png" alt="Privacy Decision Process" class="image-inline" title="Privacy Decision Process" /&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Context Model:&lt;/b&gt; A distinction is made between the decision level and the system level. The system level enables context awareness but also filters context information and maps it to the semantic concepts required for decisions. Semantic mappings can be derived from a pre-defined or learnt world model. On the decision level, the context model contains only the components relevant for privacy decision making. For example, an activity involves the user and is assigned a type, i.e., a semantic label such as home or work, based on system-level input.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Privacy Decision Engine:&lt;/b&gt; The context model allows the system to reason about which context items are affected by a context transition. When a transition occurs, the privacy decision engine (PDE) evaluates which protection-worthy context items are affected. The protection worthiness (or privacy relevance) of context items in a given context is determined by the user's privacy preferences, which are approximated by the system from the knowledge base. This serves as a basis for adapting privacy preferences and is subsequently further adjusted to the user by learning from the user's explicit decisions, behaviour, and reaction to system actions.&lt;a href="#_ftn72" name="_ftnref72"&gt;[72]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;i&gt;The user's personality type is determined before initial system use&lt;/i&gt; to select a basic privacy profile.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;It may also be the case that the privacy preference cannot be realized in the current context; in that case, the privacy policy would suggest terminating the activity. For each privacy policy variant, a confidence score is calculated based on how well it fits the adapted privacy preference. Based on these confidence scores, the PDE selects the most appropriate policy candidate, or triggers user involvement if the confidence is below a threshold determined by the user's personality and previous privacy decisions.&lt;/p&gt;
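&lt;p style="text-align: justify; "&gt;The PDE's selection step can be sketched as follows. This is a hedged illustration rather than the paper's implementation; the policy names and threshold value are hypothetical:&lt;/p&gt;

```python
def select_policy(candidates: dict, threshold: float):
    """Return the policy variant with the highest confidence score, or
    None to signal that the user must be involved because no candidate
    fits the adapted privacy preference well enough.

    candidates maps policy names to confidence scores in [0, 1]; the
    threshold stands in for the value derived from the user's personality
    and previous privacy decisions."""
    best = max(candidates, key=candidates.get)
    if candidates[best] >= threshold:
        return best    # realize this policy autonomously
    return None        # trigger user involvement

# Hypothetical policy variants for a location-sharing activity
scores = {"share city only": 0.9, "terminate activity": 0.2}
chosen = select_policy(scores, threshold=0.6)   # "share city only"
```

&lt;p style="text-align: justify; "&gt;Returning a sentinel for low-confidence cases captures the design choice that autonomy is conditional: the system acts alone only when it is sufficiently sure of the user's preference.&lt;/p&gt;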
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Realization and Enforcement:&lt;/b&gt; The selected privacy policy must be realized on the system level. This is done by combining territorial privacy and information privacy aspects. The private territory is defined by a territorial privacy boundary that separates desired and undesired entities.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Granularity adjustments for specific information items are also defined. For example, instead of the user's exact position, only the street address or city can be provided.&lt;/p&gt;
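&lt;p style="text-align: justify; "&gt;A granularity adjustment of this kind can be sketched as a simple lookup; the record fields and granularity levels below are illustrative assumptions, not part of the proposed system:&lt;/p&gt;

```python
def adjust_granularity(location: dict, level: str) -> str:
    """Release only the value matching the granularity level the policy
    allows, rather than the most precise data available."""
    field_for_level = {"exact": "position", "street": "street", "city": "city"}
    return location[field_for_level[level]]

# Illustrative location record for one user
profile = {"position": "12.9716,77.5946", "street": "MG Road", "city": "Bengaluru"}

coarse = adjust_granularity(profile, "city")   # discloses only the city
```

&lt;p style="text-align: justify; "&gt;Under a restrictive policy the same record yields only "Bengaluru"; under a permissive one, the exact coordinates.&lt;/p&gt;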
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Advantages:&lt;/b&gt; Personalization to a specific user has the advantage of better emulating that user's privacy decision process. It also helps decide when to involve the user in the decision process by providing recommendations only, and when privacy decisions can be realized autonomously.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Disadvantages:&lt;/b&gt; The entire model hinges on the system's ability to accurately determine the user's profile before the user starts using it, rather than after, when preferences could be determined more accurately. There is no provision for the user to pick his or her own privacy profile; it is entirely system-determined, taking away an element of consent at the very beginning. As all further preferences are adapted from this base, the system may fail to deliver. The use of confidence scores is itself an approximation, so a policy that compromises privacy may be selected by a small numerical margin.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;However, it is a useful insight into techniques of contextualization. Depending on the environment, different strategies for policy realization and varying degrees of enforcement are possible&lt;a href="#_ftn73" name="_ftnref73"&gt;[73]&lt;/a&gt;.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Length&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;The length of privacy policies is often cited as one reason they are so commonly ignored. Studies show privacy policies are hard to read, read infrequently, and do not support rational decision making.&lt;a href="#_ftn74" name="_ftnref74"&gt;[74]&lt;/a&gt; Aleecia M. McDonald and Lorrie Faith Cranor, in their seminal study "The Cost of Reading Privacy Policies", estimated that the average length of privacy policies is 2,500 words. At a reading speed of 250 words per minute, typical for those who have completed secondary education, the average policy would take 10 minutes to read.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The researchers also investigated how quickly people could read privacy policies when merely skimming them for pertinent details. They timed 93 people as they skimmed a 934-word privacy policy and answered multiple-choice questions on its content.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Though some people took under a minute and others up to 42 minutes, most participants took between three and six minutes to skim the policy, which itself was just over a third of the length of the average policy.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The researchers used their data to estimate what it would cost, if people's time were charged for, for users to read the privacy policy of every site they visit once a year, and arrived at a mind-boggling figure of $652 billion.&lt;/p&gt;
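&lt;p style="text-align: justify; "&gt;The per-policy arithmetic behind the estimate is straightforward to reproduce; the sample inputs below (sites visited per year, hourly wage) are illustrative assumptions, not the study's exact national-cost parameters:&lt;/p&gt;

```python
WORDS_PER_POLICY = 2500   # average policy length reported by McDonald and Cranor
WORDS_PER_MINUTE = 250    # typical reading speed after secondary education

minutes_per_policy = WORDS_PER_POLICY / WORDS_PER_MINUTE   # 10 minutes per policy

def annual_reading_cost(sites_per_year: int, hourly_wage: float) -> float:
    """Value of the time spent reading one policy per site visited in a
    year, priced at an hourly wage. Inputs are illustrative only."""
    hours = sites_per_year * minutes_per_policy / 60
    return hours * hourly_wage

# e.g. 60 unique sites a year at an assumed $18/hour:
cost = annual_reading_cost(60, 18.0)   # 10 hours of reading, valued at $180
```

&lt;p style="text-align: justify; "&gt;Scaling such per-person figures across the entire Internet-using population is how the researchers reached their aggregate national estimate.&lt;/p&gt;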
&lt;p style="text-align: justify; "&gt;&lt;img src="https://cis-india.org/home-images/ProbabilityDensityFunction.png" alt="Probability Density Function" class="image-inline" title="Probability Density Function" /&gt;&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Problems&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;Though the figure of $652 billion has limited usefulness, since people rarely read whole policies and cannot charge anyone for the time it takes to do so, the researchers concluded that readers who do conduct a cost-benefit analysis might decide not to read any policies.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;"Preliminary work from a small pilot study in our laboratory revealed that some Internet users believe their only serious risk online is they may lose up to $50 if their credit card information is stolen. For people who think that is their primary risk, our point estimates show the value of their time to read policies far exceeds this risk. Even for our lower bound estimates of the value of time, it is not worth reading privacy policies though it may be worth skimming them," the researchers said. That users see their only risk as credit card fraud suggests they likely do not understand the risks to their privacy. As an FTC report recently stated, "it is unclear whether consumers even understand that their information is being collected, aggregated, and used to deliver advertising."&lt;a href="#_ftn75" name="_ftnref75"&gt;[75]&lt;/a&gt;&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Recommendations&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;If the privacy community can find ways to reduce the time cost of reading policies, it may be easier to convince Internet users to read them. For example, if useful headings allow consumers to skim policies rather than read them word for word, or if all but the relevant information can be hidden in a layered format, thus reducing the effective length of the policies, more people may be willing to read them.&lt;a href="#_ftn76" name="_ftnref76"&gt;[76]&lt;/a&gt; Apps can also adopt short-form notices that summarize, and link to, the larger, more complete notice displayed elsewhere. These short-form notices need not be legally binding, but must indicate that they do not cover all types of data collection, only the most relevant ones.&lt;a href="#_ftn77" name="_ftnref77"&gt;[77]&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;Content&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;In an attempt to gain permission, most privacy policies inform users about: (1) the type of information collected; and (2) the purpose for collecting that information.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Standard privacy notices generally cover the points of:&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;&lt;b&gt;Methods of Collection and Usage of Personal Information&lt;/b&gt;&lt;/li&gt;
&lt;li&gt;&lt;b&gt;The Cookie Policy&lt;/b&gt;&lt;/li&gt;
&lt;li&gt;&lt;b&gt;Sharing of Customer Information&lt;/b&gt; &lt;a href="#_ftn78" name="_ftnref78"&gt;&lt;b&gt;[78]&lt;/b&gt;&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p style="text-align: justify; "&gt;Certified Information Privacy Professionals divide notices into the following sequential sections&lt;a href="#_ftn79" name="_ftnref79"&gt;[79]&lt;/a&gt;:&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;i. &lt;b&gt;Policy Identification Details:&lt;/b&gt; Defines the policy name, version and description.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;ii. &lt;b&gt;P3P-Based Components:&lt;/b&gt; Defines policy attributes that would apply if the policy is exported to a P3P format.&lt;a href="#_ftn80" name="_ftnref80"&gt;[80]&lt;/a&gt; Such attributes would include: policy URLs, organization information, PII access and dispute resolution procedures.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;iii. &lt;b&gt;Policy Statements and Related Elements: Groups, Purposes and PII Types:&lt;/b&gt; Policy statements define the individuals able to access certain types of information, for certain pre-defined purposes.&lt;/p&gt;
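&lt;p style="text-align: justify; "&gt;These sequential sections can be expressed as a simple data structure. The field names below are an assumption for illustration, not a standard schema:&lt;/p&gt;

```python
from dataclasses import dataclass, field

@dataclass
class PolicyStatement:
    # iii. who may access which PII, and for which pre-defined purposes
    groups: list      # individuals or roles able to access the data
    purposes: list    # pre-defined purposes for the access
    pii_types: list   # categories of personal information covered

@dataclass
class PrivacyNotice:
    # i. policy identification details
    name: str
    version: str
    description: str
    # ii. P3P-based components (attributes used if exported to P3P)
    policy_url: str
    organization: str
    # iii. policy statements and related elements
    statements: list = field(default_factory=list)

notice = PrivacyNotice(name="Example Policy", version="1.0",
                       description="Illustrative notice",
                       policy_url="https://example.com/privacy",
                       organization="Example Org")
notice.statements.append(PolicyStatement(groups=["marketing team"],
                                         purposes=["service improvement"],
                                         pii_types=["email address"]))
```

&lt;p style="text-align: justify; "&gt;Structuring the notice this way makes each statement a discrete, machine-readable triple of who, why and what, mirroring the exportable P3P attributes.&lt;/p&gt;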
&lt;h2 style="text-align: justify; "&gt;Problems&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;Applications tend to define the type of data broadly, attempting to strike a balance between providing enough information that the application may gain consent to access a user's data and being broad enough to avoid ruling out specific information.&lt;a href="#_ftn81" name="_ftnref81"&gt;[81]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This leads to usage of vague terms like "information collected &lt;i&gt;may &lt;/i&gt;include."&lt;a href="#_ftn82" name="_ftnref82"&gt;[82]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Similarly, the purpose of the data acquisition is also very broad. For example, a privacy policy may state that user data can be collected for anything related to "improving the content of the Service." As the scope of "improving the content of the Service" is never defined, any usage could conceivably fall within that category.&lt;a href="#_ftn83" name="_ftnref83"&gt;[83]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Several apps create social profiles of users based on their online preferences to promote targeted marketing, which is cleverly concealed in phrases like "we may also draw upon this Personal Information in order to adapt the Services of our community to your needs".&lt;a href="#_ftn84" name="_ftnref84"&gt;[84]&lt;/a&gt; For instance, Bees &amp;amp; Pollen is a "predictive personalization" platform for games and apps that "uses advanced predictive algorithms to detect complex, non-trivial correlations between conversion patterns and users' DNA signatures, thus enabling it to automatically serve each user a personalized best-fit game options, in real-time." In reality, it analyses over 100 user attributes, including activity on Facebook, spending behaviours, marital status, and location.&lt;a href="#_ftn85" name="_ftnref85"&gt;[85]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Notices also often mislead consumers into believing that their information will not be shared with third parties by using the term "unaffiliated third parties." Other affiliated companies within the corporate structure of the service provider may still have access to users' data for marketing and other purposes.&lt;a href="#_ftn86" name="_ftnref86"&gt;[86]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;There are very few choices to opt out of certain practices, such as sharing data for marketing purposes. Thus, users are effectively left with a take-it-or-leave-it choice: give up your privacy or go elsewhere.&lt;a href="#_ftn87" name="_ftnref87"&gt;[87]&lt;/a&gt; Users almost always grant consent if it is required to receive the service they want, which raises the question of whether this consent is meaningful&lt;a href="#_ftn88" name="_ftnref88"&gt;[88]&lt;/a&gt;.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Recommendations&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;The following recommendations have emerged:&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;&lt;b&gt;Notice&lt;/b&gt; - Companies should provide consumers with clear, conspicuous notices that accurately describe their information practices.&lt;/li&gt;
&lt;/ul&gt;
&lt;ul style="text-align: justify; " type="disc"&gt;
&lt;li&gt;&lt;b&gt;Consumer Choice&lt;/b&gt; - Companies should provide consumers with the opportunity to decide (in the form of opting out) whether the company may disclose personal information to unaffiliated third parties.&lt;/li&gt;
&lt;li&gt; &lt;b&gt;Access and Correction&lt;/b&gt; - Companies should provide consumers with the opportunity to access and correct personal information collected about the consumer. &lt;/li&gt;
&lt;li&gt; &lt;b&gt;Security&lt;/b&gt; - Companies must adopt reasonable security measures in order to protect the privacy of personal information. Possible security measures include: 		administrative security, physical security and technical security. &lt;/li&gt;
&lt;li&gt; &lt;b&gt;Enforcement&lt;/b&gt; - Companies should have systems through which they can enforce the privacy policy. This may be managed by the company, or an independent third party to ensure compliance. Examples of popular third parties include &lt;a href="https://www.cippguide.org/tag/bbbonline/"&gt;BBBOnLine&lt;/a&gt; and		&lt;a href="https://www.cippguide.org/tag/truste/"&gt;TRUSTe&lt;/a&gt;.&lt;a href="#_ftn89" name="_ftnref89"&gt;[89]&lt;/a&gt; &lt;/li&gt;
&lt;li&gt;&lt;b&gt;Standardization&lt;/b&gt; - Several researchers and organizations have recommended a standardized privacy notice format that covers certain essential points.&lt;a href="#_ftn90" name="_ftnref90"&gt;[90]&lt;/a&gt; However, as displaying a privacy notice is itself voluntary, it is unpredictable whether companies would willingly adopt a standardized model. Moreover, with the app market burgeoning with innovation, a standard format may not cover all emergent data practices.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 style="text-align: justify; "&gt;Comprehension&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;The FTC states that "the notice-and-choice model, as implemented, has led to long, incomprehensible privacy policies that consumers typically do not read, let alone understand. The question is not whether consumers should be given a say over unexpected uses of their data; rather, the question is how to provide simplified notice and choice"&lt;a href="#_ftn91" name="_ftnref91"&gt;[91]&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Notably, in a survey conducted by Zogby International, 93% of adults - and 81% of teens - indicated they would take more time to read terms and conditions 	for websites if they were written in clearer language.&lt;a href="#_ftn92" name="_ftnref92"&gt;[92]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Most privacy policies are in natural language format: companies explain their practices in prose. One noted disadvantage to current natural language 	policies is that companies can choose which information to present, which does not necessarily solve the problem of information asymmetry between companies and consumers. Further, companies use what have been termed "weasel words" - legalistic, ambiguous, or slanted phrases - to describe their practices	&lt;a href="#_ftn93" name="_ftnref93"&gt;[93]&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In a study by Aleecia M. McDonald and others&lt;a href="#_ftn94" name="_ftnref94"&gt;[94]&lt;/a&gt;, it was found that the accuracy of users' comprehension spans a wide range. An average of 91% of participants answered correctly when asked about cookies, 61% answered correctly about opt-out links, 60% understood when their email address would be "shared" with a third party, and only 46% answered correctly regarding telemarketing. Participants found questions harder when vague or complicated terms were substituted for a practice, such as referring to telemarketing as "the information you provide may be used for marketing services." Accuracy on such questions was a mere 33%.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Problems&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;Natural language policies are often long and require college-level reading skills. Furthermore, there are no standards for which information is disclosed, 	no standard place to find particular information, and data practices are not described using consistent language. These policies are "long, complicated, 	and full of jargon and change frequently."&lt;a href="#_ftn95" name="_ftnref95"&gt;[95]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Kent Walker lists five problems that privacy notices typically suffer from:&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;a) overkill - long and repetitive text in small print,&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;b) irrelevance - describing situations of little concern to most consumers,&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;c) opacity - broad terms that reflect the truth that it is impossible to track and control all the information collected and stored,&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;d) non-comparability - the simplification required to achieve comparability will compromise accuracy, and&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;e) inflexibility - failure to keep pace with new business models. &lt;a href="#_ftn96" name="_ftnref96"&gt;[96]&lt;/a&gt;&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Recommendations&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;Researchers advocate a more succinct and simpler standard for privacy notices,&lt;a href="#_ftn97" name="_ftnref97"&gt;[97]&lt;/a&gt; such as representing the information in the form of a table.&lt;a href="#_ftn98" name="_ftnref98"&gt;[98]&lt;/a&gt; However, studies show only an insignificant improvement in consumer understanding when privacy policies are represented in graphic formats like tables and labels.&lt;a href="#_ftn99" name="_ftnref99"&gt;[99]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;There are also recommendations to adopt a multi-layered approach in which the relevant information is summarized in a short notice.&lt;a href="#_ftn100" name="_ftnref100"&gt;[100]&lt;/a&gt; This is backed by studies showing that consumers find layered policies easier to understand.&lt;a href="#_ftn101" name="_ftnref101"&gt;[101]&lt;/a&gt; However, they were less accurate with the layered format, especially on parts that were not summarized. This suggests participants did not continue to the full policy when the information they sought was not available in the short notice. Unless it is possible to identify all of the topics users care about and summarize them in one page, the layered notice effectively hides information and reduces transparency. It has also been pointed out that it is impossible to convey complex data policies in simple and clear language.&lt;a href="#_ftn102" name="_ftnref102"&gt;[102]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Consumers often struggle to map concepts such as third-party access to the terms used in policies. This is also because companies with identical practices often convey different information, and these differences are reflected in consumers' ability to understand the policies. These policies may need an educational component so that readers understand what it means for a site to engage in a given practice&lt;a href="#_ftn103" name="_ftnref103"&gt;[103]&lt;/a&gt;. However, readers who fail to take the time to read the policy are unlikely to read additional educational material.&lt;/p&gt;
&lt;div style="text-align: justify; "&gt;
&lt;hr /&gt;
&lt;div id="ftn1"&gt;
&lt;p&gt;&lt;a href="#_ftnref1" name="_ftn1"&gt;[1]&lt;/a&gt; Amber Sinha http://cis-india.org/internet-governance/blog/a-critique-of-consent-in-information-privacy&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn2"&gt;
&lt;p&gt;&lt;a href="#_ftnref2" name="_ftn2"&gt;[2]&lt;/a&gt; Wang, &lt;i&gt;et al.&lt;/i&gt;, 1998) Milberg, &lt;i&gt;et al.&lt;/i&gt; (1995)&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn3"&gt;
&lt;p&gt;&lt;a href="#_ftnref3" name="_ftn3"&gt;[3]&lt;/a&gt; See e.g., White House, Consumer Privacy Bill of Rights (2012) 			http://www.whitehouse.gov/the-pressoffice/2012/02/23/we-can-t-wait-obama-administration-unveils-blueprint-privacy-bill-rights; Fed. Trade Comm'n, 			Protecting Consumer Privacy in an Era of Rapid Change: Recommendations for Business and Policy Makers (2012) 			http://www.ftc.gov/sites/default/files/documents/reports/federal-trade-commissionreport-protecting-consumer-privacy-era-rapid-change-recommendations/120326privacyreport.pdf.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn4"&gt;
&lt;p&gt;&lt;a href="#_ftnref4" name="_ftn4"&gt;[4]&lt;/a&gt; Fed. Trade Comm'n, Privacy Online: A Report to Congress 7 (June 1998), available at www.ftc.gov/reports/privacy3/priv-23a.pdf.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn5"&gt;
&lt;p&gt;&lt;a href="#_ftnref5" name="_ftn5"&gt;[5]&lt;/a&gt; &lt;a href="http://itlaw.wikia.com/wiki/U.S._Department_of_Commerce" title="U.S. Department of Commerce"&gt;U.S. Department of Commerce&lt;/a&gt; , &lt;a href="http://itlaw.wikia.com/wiki/Internet_Policy_Task_Force" title="Internet Policy Task Force"&gt;Internet Policy Task Force&lt;/a&gt;, 			&lt;a href="http://itlaw.wikia.com/wiki/Commercial_Data_Privacy_and_Innovation_in_the_Internet_Economy:_A_Dynamic_Policy_Framework" title="Commercial Data Privacy and Innovation in the Internet Economy: A Dynamic Policy Framework"&gt; Commercial Data Privacy and Innovation in the Internet Economy: A Dynamic Policy Framework &lt;/a&gt; 20 (Dec. 16, 2010) (&lt;a href="http://www.ntia.doc.gov/reports/2010/IPTF_Privacy_GreenPaper_12162010.pdf"&gt;full-text&lt;/a&gt;).&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn6"&gt;
&lt;p&gt;&lt;a href="#_ftnref6" name="_ftn6"&gt;[6]&lt;/a&gt; 389 U.S. 347 (1967).&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn7"&gt;
&lt;p&gt;&lt;a href="#_ftnref7" name="_ftn7"&gt;[7]&lt;/a&gt; Dow Chem. Co. v. United States, 476 U.S. 227, 241 (1986)&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn8"&gt;
&lt;p&gt;&lt;a href="#_ftnref8" name="_ftn8"&gt;[8]&lt;/a&gt; http://ir.lawnet.fordham.edu/cgi/viewcontent.cgi?article=1600&amp;amp;context=iplj&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn9"&gt;
&lt;p&gt;&lt;a href="#_ftnref9" name="_ftn9"&gt;[9]&lt;/a&gt; Dow Chem. Co. v. United States, 476 U.S. 227, 241 (1986)&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn10"&gt;
&lt;p&gt;&lt;a href="#_ftnref10" name="_ftn10"&gt;[10]&lt;/a&gt; Kyllo, 533 U.S. at 34 (―[T]he technology enabling human flight has exposed to public view (and hence, we have said, to official observation) 			uncovered portions of the house and its curtilage that once were private.‖).&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn11"&gt;
&lt;p&gt;&lt;a href="#_ftnref11" name="_ftn11"&gt;[11]&lt;/a&gt; Kyllo v. United States, 533 U.S. 27&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn12"&gt;
&lt;p&gt;&lt;a href="#_ftnref12" name="_ftn12"&gt;[12]&lt;/a&gt; See Katz, 389 U.S. at 352 (―But what he sought to exclude when he entered the booth was not the intruding eye-it was the uninvited ear. He 			did not shed his right to do so simply because he made his calls from a place where he might be seen.‖).&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn13"&gt;
&lt;p&gt;&lt;a href="#_ftnref13" name="_ftn13"&gt;[13]&lt;/a&gt; See United States v. Ahrndt, No. 08-468-KI, 2010 WL 3773994, at *4 (D. Or. Jan. 8, 2010).&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn14"&gt;
&lt;p&gt;&lt;a href="#_ftnref14" name="_ftn14"&gt;[14]&lt;/a&gt; In re DoubleClick Inc. Privacy Litig., 154 F. Supp. 2d 497 (S.D.N.Y. 2001).&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn15"&gt;
&lt;p&gt;&lt;a href="#_ftnref15" name="_ftn15"&gt;[15]&lt;/a&gt; http://ir.lawnet.fordham.edu/cgi/viewcontent.cgi?article=1600&amp;amp;context=iplj&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn16"&gt;
&lt;p&gt;&lt;a href="#_ftnref16" name="_ftn16"&gt;[16]&lt;/a&gt; See Michael A. Carrier, Against Cyberproperty, 22 BERKELEY TECH. L.J. 1485, 1486 (2007) (arguing against creating a right to exclude users from 			making electronic contact to their network as one that exceeds traditional property notions).&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn17"&gt;
&lt;p&gt;&lt;a href="#_ftnref17" name="_ftn17"&gt;[17]&lt;/a&gt; See M. Ryan Calo, Against Notice Skepticism in Privacy (and Elsewhere), 87 NOTRE DAME L. REV. 1027, 1049 (2012) (citing Paula J. Dalley, The Use 			and Misuse of Disclosure as a Regulatory System, 34 FLA. ST. U. L. REV. 1089, 1093 (2007) ("[D]isclosure schemes comport with the prevailing 			political philosophy in that disclosure preserves individual choice while avoiding direct governmental interference.")).&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn18"&gt;
&lt;p&gt;&lt;a href="#_ftnref18" name="_ftn18"&gt;[18]&lt;/a&gt; See Calo, supra note 10, at 1048; see also Omri Ben-Shahar &amp;amp; Carl E. Schneider, The Failure of Mandated Disclosure, 159 U. PA. L. REV. 647, 682 			(noting that notice "looks cheap" and "looks easy").&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn19"&gt;
&lt;p&gt;&lt;a href="#_ftnref19" name="_ftn19"&gt;[19]&lt;/a&gt; Mark MacCarthy, New Directions in Privacy: Disclosure, Unfairness and Externalities, 6 I/S J. L. &amp;amp; POL'Y FOR INFO. SOC'Y 425, 440 (2011) 			(citing M. Ryan Calo, A Hybrid Conception of Privacy Harm Draft-Privacy Law Scholars Conference 2010, p. 28).&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn20"&gt;
&lt;p&gt;&lt;a href="#_ftnref20" name="_ftn20"&gt;[20]&lt;/a&gt; Daniel J. Solove, Introduction: Privacy Self-Management and the Consent Dilemma, 126 HARV. L. REV. 1879, 1885 (2013) (citing Jon Leibowitz, Fed. 			Trade Comm'n, So Private, So Public: Individuals, the Internet &amp;amp; the Paradox of Behavioral Marketing, Remarks at the FTC Town Hall Meeting on 			Behavioral Advertising: Tracking, Targeting, &amp;amp; Technology (Nov. 1, 2007), available at 			http://www.ftc.gov/speeches/leibowitz/071031ehavior/pdf). Paul Ohm refers to these issues as "information-quality problems." See Paul Ohm, Branding 			Privacy, 97 MINN. L. REV. 907, 930 (2013). Daniel J. Solove refers to this as "the problem of the uninformed individual." See Solove, supra note 17&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn21"&gt;
&lt;p&gt;&lt;a href="#_ftnref21" name="_ftn21"&gt;[21]&lt;/a&gt; See Edward J. Janger &amp;amp; Paul M. Schwartz, The Gramm-Leach-Bliley Act, Information Privacy, and the Limits of Default Rules, 86 MINN. L. REV. 			1219, 1230 (2002) (stating that according to one survey, "only 0.5% of banking customers had exercised their opt-out rights").&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn22"&gt;
&lt;p&gt;&lt;a href="#_ftnref22" name="_ftn22"&gt;[22]&lt;/a&gt; See Amber Sinha A Critique of Consent in Information Privacy 			http://cis-india.org/internet-governance/blog/a-critique-of-consent-in-information-privacy&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn23"&gt;
&lt;p&gt;&lt;a href="#_ftnref23" name="_ftn23"&gt;[23]&lt;/a&gt; Leigh Shevchik, "Mobile App Industry to Reach Record Revenue in 2013," New Relic (blog), April 1, 2013, 			http://blog.newrelic.com/2013/04/01/mobile-apps-industry-to-reach-record-revenue-in-2013/.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn24"&gt;
&lt;p&gt;&lt;a href="#_ftnref24" name="_ftn24"&gt;[24]&lt;/a&gt; Jan Lauren Boyles, Aaron Smith, and Mary Madden, "Privacy and Data Management on Mobile Devices," Pew Internet &amp;amp; American Life Project, 			Washington, DC, September 5, 2012.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn25"&gt;
&lt;p&gt;&lt;a href="#_ftnref25" name="_ftn25"&gt;[25]&lt;/a&gt; http://www.aarp.org/content/dam/aarp/research/public_policy_institute/cons_prot/2014/improving-mobile-device-privacy-disclosures-AARP-ppi-cons-prot.pdf&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn26"&gt;
&lt;p&gt;&lt;a href="#_ftnref26" name="_ftn26"&gt;[26]&lt;/a&gt; "Mobile Apps for Kids: Disclosures Still Not Making the Grade," Federal Trade Commission, Washington, DC, December 2012&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn27"&gt;
&lt;p&gt;&lt;a href="#_ftnref27" name="_ftn27"&gt;[27]&lt;/a&gt; http://www.aarp.org/content/dam/aarp/research/public_policy_institute/cons_prot/2014/improving-mobile-device-privacy-disclosures-AARP-ppi-cons-prot.pdf&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn28"&gt;
&lt;p&gt;&lt;a href="#_ftnref28" name="_ftn28"&gt;[28]&lt;/a&gt; Linda Ackerman, "Mobile Health and Fitness Applications and Information Privacy," Privacy Rights Clearinghouse, San Diego, CA, July 15, 2013.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn29"&gt;
&lt;p&gt;&lt;a href="#_ftnref29" name="_ftn29"&gt;[29]&lt;/a&gt; Margaret Jane Radin, Humans, Computers, and Binding Commitment, 75 IND. L.J. 1125, 1126 (1999). 			&lt;a href="http://www.repository.law.indiana.edu/cgi/viewcontent.cgi?article=2199&amp;amp;context=ilj"&gt; http://www.repository.law.indiana.edu/cgi/viewcontent.cgi?article=2199&amp;amp;context=ilj &lt;/a&gt; &lt;b&gt; &lt;/b&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn30"&gt;
&lt;p&gt;&lt;a href="#_ftnref30" name="_ftn30"&gt;[30]&lt;/a&gt; William Aiello, Steven M. Bellovin, Matt Blaze, Ran Canetti, John Ioannidis, Angelos D. Keromytis, and Omer Reingold. Just fast keying: Key 			agreement in a hostile internet. ACM Trans. Inf. Syst. Secur., 7(2):242-273, 2004.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn31"&gt;
&lt;p&gt;&lt;a href="#_ftnref31" name="_ftn31"&gt;[31]&lt;/a&gt; Privacy By Design The 7 Foundational Principles by Anne Cavoukian https://www.ipc.on.ca/images/resources/7foundationalprinciples.pdf&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn32"&gt;
&lt;p&gt;&lt;a href="#_ftnref32" name="_ftn32"&gt;[32]&lt;/a&gt; G. Danezis, J. Domingo-Ferrer, M. Hansen, J.-H. Hoepman, D. Le M´etayer, R. Tirtea, and S. Schiffner. Privacy and Data Protection by Design - 			from policy to engineering. report, ENISA, Dec. 2014.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn33"&gt;
&lt;p&gt;&lt;a href="#_ftnref33" name="_ftn33"&gt;[33]&lt;/a&gt; G. Danezis, J. Domingo-Ferrer, M. Hansen, J.-H. Hoepman, D. Le M´etayer, R. Tirtea, and S. Schiffner. Privacy and Data Protection by Design - 			from policy to engineering. report, ENISA, Dec. 2014.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn34"&gt;
&lt;p&gt;&lt;a href="#_ftnref34" name="_ftn34"&gt;[34]&lt;/a&gt; G. Danezis, J. Domingo-Ferrer, M. Hansen, J.-H. Hoepman, D. Le M´etayer, R. Tirtea, and S. Schiffner. Privacy and Data Protection by Design - 			from policy to engineering. report, ENISA, Dec. 2014.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn35"&gt;
&lt;p&gt;&lt;a href="#_ftnref35" name="_ftn35"&gt;&lt;sup&gt;&lt;sup&gt;[35]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; John Frank Weaver, We Need to Pass Legislation on Artificial Intelligence Early and Often, SLATE FUTURE TENSE (Sept. 12, 			2014),http://www.slate.com/blogs/future_tense/2014/09/12/we_need_to_pass_artificial_intelligence_laws_early_and_often.html&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn36"&gt;
&lt;p&gt;&lt;a href="#_ftnref36" name="_ftn36"&gt;&lt;sup&gt;&lt;sup&gt;[36]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Margaret Jane Radin, Humans, Computers, and Binding Commitment, 75 IND. L.J. 1125, 1126 (1999).&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn37"&gt;
&lt;p&gt;&lt;a href="#_ftnref37" name="_ftn37"&gt;&lt;sup&gt;&lt;sup&gt;[37]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Richard Warner &amp;amp; Robert Sloan, Beyond Notice and Choice: Privacy, Norms, and Consent, J. High Tech. L. (2013). Available at: 			http://scholarship.kentlaw.iit.edu/fac_schol/568&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn38"&gt;
&lt;p&gt;&lt;a href="#_ftnref38" name="_ftn38"&gt;&lt;b&gt;&lt;sup&gt;&lt;b&gt;&lt;sup&gt;[38]&lt;/sup&gt;&lt;/b&gt;&lt;/sup&gt;&lt;/b&gt;&lt;/a&gt; &lt;a href="http://ssrn.com/abstract=1085333"&gt;&lt;b&gt;Engineering Privacy by Sarah Spiekermann, Lorrie Faith Cranor :: SSRN&lt;/b&gt;&lt;/a&gt; &lt;b&gt; &lt;/b&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn39"&gt;
&lt;p&gt;&lt;a href="#_ftnref39" name="_ftn39"&gt;[39]&lt;/a&gt; iOS Application Programming Guide: The Application Runtime Environment, APPLE, http://developer.apple.com/library/ 			ios/#documentation/iphone/conceptual/iphoneosprogrammingguide/RuntimeEnvironment /RuntimeEnvironment.html (last updated Feb. 24, 2011)&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn40"&gt;
&lt;p&gt;&lt;a href="#_ftnref40" name="_ftn40"&gt;[40]&lt;/a&gt; Security and Permissions, ANDROID DEVELOPERS, http://developer.android.com/guide/topics/security/security.html (last updated Sept. 13, 2011).&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn41"&gt;
&lt;p&gt;&lt;a href="#_ftnref41" name="_ftn41"&gt;[41]&lt;/a&gt; iOS Application Programming Guide: The Application Runtime Environment, APPLE, http://developer.apple.com/library/ 			ios/#documentation/iphone/conceptual/iphoneosprogrammingguide/RuntimeEnvironment /RuntimeEnvironment.html (last updated Feb. 24, 2011)&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn42"&gt;
&lt;p&gt;&lt;a href="#_ftnref42" name="_ftn42"&gt;[42]&lt;/a&gt; See Katherine Noyes, Why Android App Security is Better Than for the iPhone, PC WORLD BUS. CTR. (Aug. 6, 2010, 4:20 PM), 			http://www.pcworld.com/businesscenter/article/202758/why_android_app_security_is_be tter_than_for_the_iphone.html; see also About Permissions for 			Third-Party Applications, BLACKBERRY, http://docs.blackberry.com/en/smartphone_users/deliverables/22178/ 			About_permissions_for_third-party_apps_50_778147_11.jsp (last visited Sept. 29, 2011); Security and Permissions, supra note 76.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn43"&gt;
&lt;p&gt;&lt;a href="#_ftnref43" name="_ftn43"&gt;[43]&lt;/a&gt; Peter S. Vogel, A Worrisome Truth: Internet Privacy is Impossible, TECHNEWSWORLD (June 8, 2011, 5:00 AM), http://www.technewsworld.com/ 			story/72610.html.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn44"&gt;
&lt;p&gt;&lt;a href="#_ftnref44" name="_ftn44"&gt;[44]&lt;/a&gt; Privacy Policy, FOURSQUARE, http://foursquare.com/legal/privacy (last updated Jan. 12, 2011)&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn45"&gt;
&lt;p&gt;&lt;a href="#_ftnref45" name="_ftn45"&gt;[45]&lt;/a&gt; N. S. Good, J. Grossklags, D. K. Mulligan, and J. A. Konstan. Noticing Notice: A Large-scale Experiment on the Timing of Software License 			Agreements. In Proc. of CHI. ACM, 2007.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn46"&gt;
&lt;p&gt;&lt;a href="#_ftnref46" name="_ftn46"&gt;[46]&lt;/a&gt; I. Adjerid, A. Acquisti, L. Brandimarte, and G. Loewenstein. Sleights of Privacy: Framing, Disclosures, and the Limits of Transparency. In Proc. of 			SOUPS. ACM, 2013.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn47"&gt;
&lt;p&gt;&lt;a href="#_ftnref47" name="_ftn47"&gt;[47]&lt;/a&gt; http://delivery.acm.org/10.1145/2810000/2808119/p63-balebako.pdf?ip=106.51.36.200&amp;amp;id=2808119&amp;amp;acc=OA&amp;amp;key=4D4702B0C3E38B35%2E4D4702B0C3E38B35%2E4D4702B0C3E38B35%2E35B5BCE80D07AAD9&amp;amp;CFID=801296199&amp;amp;CFTOKEN=33661544&amp;amp;__acm__=1466052980_2f265a2442ea3394aa1ebab7e6449933&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn48"&gt;
&lt;p&gt;&lt;a href="#_ftnref48" name="_ftn48"&gt;[48]&lt;/a&gt; Microsoft. Privacy Guidelines for Developing Software Products and Services. Technical Report version 3.1, 2008.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn49"&gt;
&lt;p&gt;&lt;a href="#_ftnref49" name="_ftn49"&gt;[49]&lt;/a&gt; Microsoft. Privacy Guidelines for Developing Software Products and Services. Technical Report version 3.1, 2008.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn50"&gt;
&lt;p&gt;&lt;a href="#_ftnref50" name="_ftn50"&gt;[50]&lt;/a&gt; S. Egelman, J. Tsai, L. F. Cranor, and A. Acquisti. Timing is everything?: the effects of timing and placement of online privacy indicators. In 			Proc. CHI '09. ACM, 2009.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn51"&gt;
&lt;p&gt;&lt;a href="#_ftnref51" name="_ftn51"&gt;[51]&lt;/a&gt; R. B¨ohme and S. K¨opsell. Trained to accept?: A field experiment on consent dialogs. In Proc. CHI '10. ACM, 2010&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn52"&gt;
&lt;p&gt;&lt;a href="#_ftnref52" name="_ftn52"&gt;[52]&lt;/a&gt; N. S. Good, J. Grossklags, D. K. Mulligan, and J. A. Konstan. Noticing notice: a large-scale experiment on the timing of software license 			agreements. In Proc. CHI '07. ACM, 2007.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn53"&gt;
&lt;p&gt;&lt;a href="#_ftnref53" name="_ftn53"&gt;[53]&lt;/a&gt; N. S. Good, J. Grossklags, D. K. Mulligan, and J. A. Konstan. Noticing notice: a large-scale experiment on the timing of software license 			agreements. In Proc. CHI '07. ACM, 2007.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn54"&gt;
&lt;p&gt;&lt;a href="#_ftnref54" name="_ftn54"&gt;[54]&lt;/a&gt; Microsoft. Privacy Guidelines for Developing Software Products and Services. Technical Report version 3.1, 2008.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn55"&gt;
&lt;p&gt;&lt;a href="#_ftnref55" name="_ftn55"&gt;[55]&lt;/a&gt; A. Kobsa and M. Teltzrow. Contextualized communication of privacy practices and personalization benefits: Impacts on users' data sharing and 			purchase behavior. In Proc. PETS '05. Springer, 2005.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn56"&gt;
&lt;p&gt;&lt;a href="#_ftnref56" name="_ftn56"&gt;[56]&lt;/a&gt; F. Schaub, B. K¨onings, and M. Weber. Context-adaptive privacy: Leveraging context awareness to support privacy decision making. IEEE 			Pervasive Computing, 14(1):34-43, 2015.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn57"&gt;
&lt;p&gt;&lt;a href="#_ftnref57" name="_ftn57"&gt;[57]&lt;/a&gt; E. Choe, J. Jung, B. Lee, and K. Fisher. Nudging people away from privacy-invasive mobile apps through visual framing. In Proc. INTERACT '13. 			Springer, 2013.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn58"&gt;
&lt;p&gt;&lt;a href="#_ftnref58" name="_ftn58"&gt;[58]&lt;/a&gt; F. Schaub, B. K¨onings, and M. Weber. Context-adaptive privacy: Leveraging context awareness to support privacy decision making. IEEE 			Pervasive Computing, 14(1):34-43, 2015.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn59"&gt;
&lt;p&gt;&lt;a href="#_ftnref59" name="_ftn59"&gt;[59]&lt;/a&gt; Article 29 Data Protection Working Party. Opinion 8/2014 on the Recent Developments on the Internet of Things. WP 223, Sept. 2014.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn60"&gt;
&lt;p&gt;&lt;a href="#_ftnref60" name="_ftn60"&gt;[60]&lt;/a&gt; B. Anderson, A. Vance, B. Kirwan, E. D., and S. Howard. Users aren't (necessarily) lazy: Using NeuroIS to explain habituation to security warnings. 			In Proc. ICIS '14, 2014.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn61"&gt;
&lt;p&gt;&lt;a href="#_ftnref61" name="_ftn61"&gt;[61]&lt;/a&gt; B. Anderson, B. Kirwan, D. Eargle, S. Howard, and A. Vance. How polymorphic warnings reduce habituation in the brain - insights from an fMRI study. 			In Proc. CHI '15. ACM, 2015.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn62"&gt;
&lt;p&gt;&lt;a href="#_ftnref62" name="_ftn62"&gt;[62]&lt;/a&gt; M. S. Wogalter, V. C. Conzola, and T. L. Smith-Jackson. Research-based guidelines for warning design and evaluation. Applied Ergonomics, 16 USENIX 			Association 2015 Symposium on Usable Privacy and Security 17 33(3):219-230, 2002.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn63"&gt;
&lt;p&gt;&lt;a href="#_ftnref63" name="_ftn63"&gt;[63]&lt;/a&gt; L. F. Cranor, P. Guduru, and M. Arjula. User interfaces for privacy agents. ACM TOCHI, 13(2):135-178, 2006.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn64"&gt;
&lt;p&gt;&lt;a href="#_ftnref64" name="_ftn64"&gt;[64]&lt;/a&gt; R. S. Portnoff, L. N. Lee, S. Egelman, P. Mishra, D. Leung, and D. Wagner. Somebody's watching me? assessing the effectiveness of webcam indicator 			lights. In Proc. CHI '15, 2015&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn65"&gt;
&lt;p&gt;&lt;a href="#_ftnref65" name="_ftn65"&gt;[65]&lt;/a&gt; M. Langheinrich. Privacy by design - principles of privacy-aware ubiquitous systems. In Proc. UbiComp '01. Springer, 2001&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn66"&gt;
&lt;p&gt;&lt;a href="#_ftnref66" name="_ftn66"&gt;[66]&lt;/a&gt; Microsoft. Privacy Guidelines for Developing Software Products and Services. Technical Report version 3.1, 2008.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn67"&gt;
&lt;p&gt;&lt;a href="#_ftnref67" name="_ftn67"&gt;[67]&lt;/a&gt; The Impact of Timing on the Salience of Smartphone App Privacy Notices, Rebecca Balebako , Florian Schaub, Idris Adjerid , Alessandro Acquist 			,Lorrie Faith Cranor&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn68"&gt;
&lt;p&gt;&lt;a href="#_ftnref68" name="_ftn68"&gt;[68]&lt;/a&gt; R. Böhme and J. Grossklags. The Security Cost of Cheap User Interaction. In Workshop on New Security Paradigms, pages 67-82. ACM, 2011&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn69"&gt;
&lt;p&gt;&lt;a href="#_ftnref69" name="_ftn69"&gt;[69]&lt;/a&gt; A. Felt, S. Egelman, M. Finifter, D. Akhawe, and D. Wagner. How to Ask For Permission. HOTSEC 2012, 2012.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn70"&gt;
&lt;p&gt;&lt;a href="#_ftnref70" name="_ftn70"&gt;[70]&lt;/a&gt; A. Felt, S. Egelman, M. Finifter, D. Akhawe, and D. Wagner. How to Ask For Permission. HOTSEC 2012, 2012.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn71"&gt;
&lt;p&gt;&lt;a href="#_ftnref71" name="_ftn71"&gt;[71]&lt;/a&gt; Towards Context Adaptive Privacy Decisions in Ubiquitous Computing Florian Schaub∗ , Bastian Könings∗ , Michael Weber∗ , 			Frank Kargl† ∗ Institute of Media Informatics, Ulm University, Germany Email: { florian.schaub | bastian.koenings | michael.weber 			}@uni-ulm.d&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn72"&gt;
&lt;p&gt;&lt;a href="#_ftnref72" name="_ftn72"&gt;[72]&lt;/a&gt; M. Korzaan and N. Brooks, "Demystifying Personality and Privacy: An Empirical Investigation into Antecedents of Concerns for Information Privacy," 			Journal of Behavioral Studies in Business, pp. 1-17, 2009.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn73"&gt;
&lt;p&gt;&lt;a href="#_ftnref73" name="_ftn73"&gt;[73]&lt;/a&gt; B. Könings and F. Schaub, "Territorial Privacy in Ubiquitous Computing," in WONS'11. IEEE, 2011, pp. 104-108.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn74"&gt;
&lt;p&gt;&lt;a href="#_ftnref74" name="_ftn74"&gt;[74]&lt;/a&gt; The Cost of Reading Privacy Policies Aleecia M. McDonald and Lorrie Faith Cranor&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn75"&gt;
&lt;p&gt;&lt;a href="#_ftnref75" name="_ftn75"&gt;[75]&lt;/a&gt; 5 Federal Trade Commission, "Protecting Consumers in the Next Tech-ade: A Report by the Staff of the Federal Trade Commission," March 2008, 11, 			http://www.ftc.gov/os/2008/03/P064101tech.pdf.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn76"&gt;
&lt;p&gt;&lt;a href="#_ftnref76" name="_ftn76"&gt;[76]&lt;/a&gt; The Cost of Reading Privacy Policies Aleecia M. McDonald and Lorrie Faith Cranor&lt;/p&gt;
&lt;p&gt;I/S: A Journal of Law and Policy for the Information Society 2008 Privacy Year in Review issue http://www.is-journal.org/&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn77"&gt;
&lt;p&gt;&lt;a href="#_ftnref77" name="_ftn77"&gt;[77]&lt;/a&gt; IS YOUR INSEAM YOUR BIOMETRIC? Evaluating the Understandability of Mobile Privacy Notice Categories Rebecca Balebako, Richard Shay, and Lorrie 			Faith Cranor July 17, 2013 https://www.cylab.cmu.edu/files/pdfs/tech_reports/CMUCyLab13011.pdf&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn78"&gt;
&lt;p&gt;&lt;a href="#_ftnref78" name="_ftn78"&gt;[78]&lt;/a&gt; https://www.sba.gov/blogs/7-considerations-crafting-online-privacy-policy&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn79"&gt;
&lt;p&gt;&lt;a href="#_ftnref79" name="_ftn79"&gt;[79]&lt;/a&gt; https://www.cippguide.org&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn80"&gt;
&lt;p&gt;&lt;a href="#_ftnref80" name="_ftn80"&gt;[80]&lt;/a&gt; The Platform for Privacy Preferences Project, more commonly known as P3P was designed by the World Wide Web Consortium aka W3C in response to the 			increased use of the Internet for sales transactions and subsequent collection of personal information. P3P is a special protocol that allows a 			website's policies to be machine readable, granting web users' greater control over the use and disclosure of their information while browsing the 			internet.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn81"&gt;
&lt;p&gt;&lt;a href="#_ftnref81" name="_ftn81"&gt;[81]&lt;/a&gt; Security and Permissions, ANDROID DEVELOPERS, http://developer.android.com/guide/topics/security/security.html (last updated Sept. 13, 2011).&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn82"&gt;
&lt;p&gt;&lt;a href="#_ftnref82" name="_ftn82"&gt;[82]&lt;/a&gt; See Foursqaure Privacy Policy&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn83"&gt;
&lt;p&gt;&lt;a href="#_ftnref83" name="_ftn83"&gt;[83]&lt;/a&gt; http://ir.lawnet.fordham.edu/cgi/viewcontent.cgi?article=1600&amp;amp;context=iplj&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn84"&gt;
&lt;p&gt;&lt;a href="#_ftnref84" name="_ftn84"&gt;[84]&lt;/a&gt; Privacy Policy, FOURSQUARE, http://foursquare.com/legal/privacy (last updated Jan. 12, 2011)&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn85"&gt;
&lt;p&gt;&lt;a href="#_ftnref85" name="_ftn85"&gt;[85]&lt;/a&gt; Bees and Pollen, "Bees and Pollen Personalization Platform," http://www.beesandpollen.com/TheProduct. aspx; Bees and Pollen, "Sense6-Social Casino 			Games Personalization Solution," http://www.beesandpollen. com/sense6.aspx; Bees and Pollen, "About Us," http://www.beesandpollen.com/About.aspx.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn86"&gt;
&lt;p&gt;&lt;a href="#_ftnref86" name="_ftn86"&gt;[86]&lt;/a&gt; CFA on the NTIA Short Form Notice Code of Conduct to Promote Transparency in Mobile Applications July 26, 2013 | Press Release&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn87"&gt;
&lt;p&gt;&lt;a href="#_ftnref87" name="_ftn87"&gt;[87]&lt;/a&gt; P. M. Schwartz and D. Solove. Notice &amp;amp; Choice. In The Second NPLAN/BMSG Meeting on Digital Media and Marketing to Children, 2009.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn88"&gt;
&lt;p&gt;&lt;a href="#_ftnref88" name="_ftn88"&gt;[88]&lt;/a&gt; F. Cate. The Limits of Notice and Choice. IEEE Security Privacy, 8(2):59-62, Mar. 2010.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn89"&gt;
&lt;p&gt;&lt;a href="#_ftnref89" name="_ftn89"&gt;[89]&lt;/a&gt; https://www.cippguide.org/2011/08/09/components-of-a-privacy-policy/&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn90"&gt;
&lt;p&gt;&lt;a href="#_ftnref90" name="_ftn90"&gt;[90]&lt;/a&gt; https://www.ftc.gov/public-statements/2001/07/case-standardization-privacy-policy-formats&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn91"&gt;
&lt;p&gt;&lt;a href="#_ftnref91" name="_ftn91"&gt;[91]&lt;/a&gt; Protecting Consumer Privacy in an Era of Rapid Change. Preliminary FTC Staff Report.December 2010&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn92"&gt;
&lt;p&gt;&lt;a href="#_ftnref92" name="_ftn92"&gt;[92]&lt;/a&gt; . See Comment of Common Sense Media, cmt. #00457, at 1.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn93"&gt;
&lt;p&gt;&lt;a href="#_ftnref93" name="_ftn93"&gt;[93]&lt;/a&gt; Pollach, I. What's wrong with online privacy policies? Communications of the ACM 30, 5 (September 2007), 103-108&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn94"&gt;
&lt;p&gt;&lt;a href="#_ftnref94" name="_ftn94"&gt;[94]&lt;/a&gt; A Comparative Study of Online Privacy Policies and Formats Aleecia M. McDonald,1 Robert W. Reeder,2 Patrick Gage Kelley, 1 Lorrie Faith Cranor1 1 			Carnegie Mellon, Pittsburgh, PA 2 Microsoft, Redmond, WA&lt;/p&gt;
&lt;p&gt;http://lorrie.cranor.org/pubs/authors-version-PETS-formats.pdf&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn95"&gt;
&lt;p&gt;&lt;a href="#_ftnref95" name="_ftn95"&gt;[95]&lt;/a&gt; Amber Sinha Critique&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn96"&gt;
&lt;p&gt;&lt;a href="#_ftnref96" name="_ftn96"&gt;[96]&lt;/a&gt; Kent Walker, The Costs of Privacy, 2001 available at 			&lt;a href="https://www.questia.com/library/journal/1G1-84436409/the-costs-of-privacy"&gt; https://www.questia.com/library/journal/1G1-84436409/the-costs-of-privacy &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn97"&gt;
&lt;p&gt;&lt;a href="#_ftnref97" name="_ftn97"&gt;[97]&lt;/a&gt; Annie I. Anton et al., Financial Privacy Policies and the Need for Standardization, 2004 available at			&lt;a href="https://ssl.lu.usi.ch/entityws/Allegati/pdf_pub1430.pdf"&gt;https://ssl.lu.usi.ch/entityws/Allegati/pdf_pub1430.pdf&lt;/a&gt;; Florian Schaub, R. 			Balebako et al, "A Design Space for effective privacy notices" available at 			&lt;a href="https://www.usenix.org/system/files/conference/soups2015/soups15-paper-schaub.pdf"&gt; https://www.usenix.org/system/files/conference/soups2015/soups15-paper-schaub.pdf &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn98"&gt;
&lt;p&gt;&lt;a href="#_ftnref98" name="_ftn98"&gt;[98]&lt;/a&gt; Allen Levy and Manoj Hastak, Consumer Comprehension of Financial Privacy Notices, Interagency Notice Project, available at			&lt;a href="https://www.sec.gov/comments/s7-09-07/s70907-21-levy.pdf"&gt;https://www.sec.gov/comments/s7-09-07/s70907-21-levy.pdf&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn99"&gt;
&lt;p&gt;&lt;a href="#_ftnref99" name="_ftn99"&gt;[99]&lt;/a&gt; Patrick Gage Kelly et al., Standardizing Privacy Notices: An Online Study of the Nutrition Label Approach available at 			&lt;a href="https://www.ftc.gov/sites/default/files/documents/public_comments/privacy-roundtables-comment-project-no.p095416-544506-00037/544506-00037.pdf"&gt; https://www.ftc.gov/sites/default/files/documents/public_comments/privacy-roundtables-comment-project-no.p095416-544506-00037/544506-00037.pdf &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn100"&gt;
&lt;p&gt;&lt;a href="#_ftnref100" name="_ftn100"&gt;[100]&lt;/a&gt; The Center for Information Policy Leadership, Hunton &amp;amp; Williams LLP, "Ten Steps To Develop A Multi-Layered Privacy Notice" available at 			&lt;a href="https://www.informationpolicycentre.com/files/Uploads/Documents/Centre/Ten_Steps_whitepaper.pdf"&gt; https://www.informationpolicycentre.com/files/Uploads/Documents/Centre/Ten_Steps_whitepaper.pdf &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn101"&gt;
&lt;p&gt;&lt;a href="#_ftnref101" name="_ftn101"&gt;[101]&lt;/a&gt; A Comparative Study of Online Privacy Policies and Formats Aleecia M. McDonald,1 Robert W. Reeder,2 Patrick Gage Kelley, 1 Lorrie Faith Cranor1 1 			Carnegie Mellon, Pittsburgh, PA 2 Microsoft, Redmond, WA&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn102"&gt;
&lt;p&gt;&lt;a href="#_ftnref102" name="_ftn102"&gt;[102]&lt;/a&gt; Howard Latin, "Good" Warnings, Bad Products, and Cognitive Limitations, 41 UCLA Law Review available at 			&lt;a href="https://litigation-essentials.lexisnexis.com/webcd/app?action=DocumentDisplay&amp;amp;crawlid=1&amp;amp;srctype=smi&amp;amp;srcid=3B15&amp;amp;doctype=cite&amp;amp;docid=41+UCLA+L.+Rev.+1193&amp;amp;key=1c15e064a97759f3f03fb51db62a79a5"&gt; https://litigation-essentials.lexisnexis.com/webcd/app?action=DocumentDisplay&amp;amp;crawlid=1&amp;amp;srctype=smi&amp;amp;srcid=3B15&amp;amp;doctype=cite&amp;amp;docid=41+UCLA+L.+Rev.+1193&amp;amp;key=1c15e064a97759f3f03fb51db62a79a5 &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn103"&gt;
&lt;p&gt;&lt;a href="#_ftnref103" name="_ftn103"&gt;[103]&lt;/a&gt; Report by Kleimann Communication Group for the FTC. Evolution of a prototype financial privacy notice, 2006. http://www.ftc.gov/privacy/ 			privacyinitiatives/ftcfinalreport060228.pdf Accessed 2 Mar 2007&lt;/p&gt;
&lt;/div&gt;
&lt;/div&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/enlarging-the-small-print'&gt;https://cis-india.org/internet-governance/blog/enlarging-the-small-print&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Meera Manoj</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2016-12-14T16:27:54Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/en-inde-le-biometrique-version-tres-grand-public">
    <title>En Inde, le biométrique version très grand public </title>
    <link>https://cis-india.org/internet-governance/news/en-inde-le-biometrique-version-tres-grand-public</link>
    <description>
        &lt;b&gt;Launched in 2010, Aadhaar is now the largest fingerprint and iris database in the world. An identity card intended for India's 1.25 billion people, it also serves as a means of payment. But the security of the system and its use for surveillance purposes raise questions.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article was &lt;a class="external-link" href="http://www.liberation.fr/futurs/2017/04/27/en-inde-le-biometrique-version-tres-grand-public_1565815"&gt;published by Liberation&lt;/a&gt; on April 27, 2017. Sunil Abraham was quoted.&lt;/p&gt;
&lt;hr style="text-align: justify; " /&gt;
&lt;p style="text-align: justify; "&gt;Le front barré d’un signe religieux hindou rouge, Vivek  Kumar se tient droit derrière le comptoir de son étroite papeterie  située dans une allée obscure d’un quartier populaire du sud-est de New  Delhi. Sous le regard bienveillant d’une idole de Ganesh - le dieu qui  efface les obstacles -, le commerçant à la fine moustache et à la  chemise bleu-gris au col Nehru réalise des photocopies, fournit des  tampons ou des stylos à des dizaines de chalands.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Gaurav, un vendeur de légumes de la halle d’à côté, entre  acheter du crédit de communication mobile. Au moment de payer, il sort  son portefeuille, mais pas pour chercher de la monnaie. Il y prend sa  carte d’identité Aadhaar et fournit ses douze chiffres au commerçant.  Qui les entre dans un smartphone, sélectionne la banque de Gaurav et  indique le montant de l’achat. Le client n’a plus qu’à poser son pouce  sur un lecteur biométrique relié au combiné, connecté à Internet. Une  lumière rouge s’allume et un son retentit : la transaction est bien  passée.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Depuis mars, 32 banques indiennes fournissent ce service  novateur de paiement par empreinte digitale. Appelé Aadhaar Pay, il  utilise les informations biométriques, à savoir les dix empreintes  digitales et celle de l’iris, recueillies par le gouvernement depuis  septembre 2010 pour créer la première carte d’identité du pays. Toute  personne résidant en Inde depuis plus de six mois, y compris les  étrangers, peut s’inscrire et l’obtenir gratuitement.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;«Renverser le système»&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;L’Aadhaar («la fondation» en hindi) représente aujourd’hui  la plus grande base de données biométriques au monde, avec 1,13 milliard  de personnes enregistrées sur 1,25 milliard, soit 99 % de la population  adulte indienne.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;L’objectif initial était double : identifier la population -  10% des Indiens n’avaient jusqu’ici aucun papier, et donc aucun droit -  et se servir de ces moyens biométriques pour sécuriser l’attribution de  nombreuses subventions alimentaires ou énergétiques, dont le  détournement coûte plusieurs milliards d’euros chaque année à l’Etat  fédéral.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;A partir de 2014, la nouvelle majorité nationaliste hindoue  du BJP a étendu les usages de l’Aadhaar pour transformer cet outil de  reconnaissance en un vrai «passe-partout» de la vie quotidienne indienne  : depuis l’ouverture d’une ligne téléphonique à la déclaration de ses  impôts, en passant surtout par la création d’un compte en banque, le  numéro Aadhaar sera à présent requis. Dans ce dernier cas, l’Aadhaar  permet en prime d’utiliser le paiement bancaire par biométrie pour  réduire le recours au liquide, qui représente encore plus de 90 % des  transactions dans le pays.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Le Premier ministre, Narendra Modi, a fait de cette  inclusion financière l’un de ses principaux chevaux de bataille :  en 2014, son gouvernement a lancé un énorme programme qui a permis la  création de 213 millions de comptes bancaires en deux ans - aujourd’hui,  quasiment tous les foyers en possèdent au moins un. Il a continué dans  cette voie énergique en démonétisant, en novembre, les principales  coupures. But de la manœuvre : convaincre les Indiens de se défaire, au  moins temporairement, de leur dépendance aux billets marqués de la tête  de Gandhi.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;i&gt;«Le liquide est gratuit, donc il est difficile de pousser les gens à utiliser d’autres moyens de paiement,&lt;/i&gt; explique Ragavan Venkatesan, responsable des paiements numériques à la  banque IDFC, pionnière dans l’utilisation de l’Aadhaar Pay. &lt;i&gt;Nous avons donc renversé le système pour que le commerçant soit incité à utiliser les moyens numériques.»&lt;/i&gt; L’établissement financier a d’abord développé le &lt;i&gt;«microdistributeur de billets»&lt;/i&gt; : une tablette que le vendeur peut utiliser pour créer des comptes,  recevoir des petits dépôts ou fournir du liquide aux clients au nom de  la banque, contre une commission. Comme l’Aadhaar Pay, cette tablette se  connecte au lecteur biométrique - fourni par l’entreprise française  Safran - pour l’identification et l’authentification. Dans les deux cas,  et à la différence des paiements par carte, ni le marchand ni le client  ne paient pour l’utilisation de ce réseau. &lt;i&gt;«Le mode traditionnel de paiement par carte va progressivement disparaître»,&lt;/i&gt; prédit Ragavan Venkatesan.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Défi&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Pour l’instant, le système n’en est toutefois qu’à ses  débuts. Environ 70 banques - une minorité du réseau indien - sont  reliées à l’Aadhaar Pay, et lors de nos visites dans différents magasins  de New Delhi, une transaction a été bloquée pendant dix minutes à cause  d’un problème de serveur. La connectivité est d’ailleurs un défi dans  un pays dont la population est en majorité rurale : le système nécessite  au minimum le réseau 2G, dont sont dépourvus environ 8 % des villages,  selon le ministère des Télécommunications.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Mais c’est la protection du système qui est surtout en question : &lt;i&gt;«La  biométrie réduit fortement le niveau de sécurité, car c’est facile de  voler ces données et de les utiliser sans votre accord,&lt;/i&gt; explique Sunil Abraham, directeur du Centre pour l’Internet et la société de Bangalore. &lt;i&gt;Il  existe maintenant des appareils photo de haute résolution qui  permettent de capturer et de répliquer les empreintes ou l’iris»&lt;/i&gt;, affirme ce spécialiste en cybersécurité.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Le problème tient au caractère irrévocable de ces données  biométriques. A la différence d’une carte bancaire qu’on peut annuler et  remplacer, on ne peut changer d’empreinte ou d’iris. L’Autorité  indienne d’identification unique (UIDAI), qui gère l’Aadhaar, prévoit  bien que l’on puisse bloquer l’utilisation de ses propres données  biométriques sur demande, ce qui offre une solution de sécurisation  temporaire. &lt;i&gt;«Si un fraudeur essaie de les utiliser, on peut le repérer&lt;/i&gt; [grâce au réseau internet, ndlr] &lt;i&gt;et l’arrêter»,&lt;/i&gt; défend Ragavan Venkatesan, de la banque IDFC.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Mais cela risque de ne pas suffire en cas de recel de ces  informations : la police vient d’interpeller un groupe de trafiquants  qui étaient en possession des données bancaires de 10 millions  d’Indiens, récupérées à travers des employés et sous-traitants, données  qu’ils revendaient par paquets. Une femme âgée s’était déjà fait dérober  146 000 roupies (un peu plus de 2 000 euros) à cause de cette fraude.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Outil idéal&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Le directeur de l’UIDAI assure qu’aucune fuite ni vol de  données n’ont été rapportés à ce jour depuis leurs serveurs - ce qui ne  garantit pas que cette confidentialité sera respectée par tous les  autres acteurs qui y ont accès. En février, un chercheur en  cybersécurité a alerté la police sur le fait que 500 000 numéros Aadhaar  ainsi que les détails personnels de leurs propriétaires - exclusivement  des mineurs - avaient été publiés en ligne. La loi sur l’Aadhaar punit  de trois ans de prison le vol ou le recel de ces données. Ce texte  adopté l’année dernière - soit six ans après le début de la collecte -  empêche également leur utilisation à d’autres fins que  l’authentification pour l’attribution de subventions et de services. Et  l’UIDAI ne peut y accéder pleinement qu’en cas de risque pour la  sécurité nationale, et selon une procédure spéciale.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Reste qu’il n’existe pas d’autorité, comme la Cnil en France&lt;i&gt;,&lt;/i&gt; chargée de veiller de manière indépendante à ce que ces lignes rouges  ne soient pas franchies par un Etat à la recherche de nouveaux moyens de  renseignement. Car les experts s’accordent sur ce point : le  biométrique est un outil idéal pour surveiller une population.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;En 2010, le gouvernement britannique avait d’ailleurs mis  fin à son projet de carte d’identité biométrique, estimant que le taux  d’erreurs dans l’authentification était trop élevé et le risque  d’atteinte aux libertés trop important. Les Indiens, souvent subjugués  par les nouvelles technologies pour résoudre leurs problèmes sociaux, ne  semblent pas prêts de revenir en arrière. Surtout si cela peut en plus  servir à mieux ficher un pays menacé par un terrorisme régional et  local.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/en-inde-le-biometrique-version-tres-grand-public'&gt;https://cis-india.org/internet-governance/news/en-inde-le-biometrique-version-tres-grand-public&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>praskrishna</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Aadhaar</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2017-05-03T16:27:23Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/emerging-technologies-issues-way-forward">
    <title>Emerging Technologies: Issues &amp; Way Forward</title>
    <link>https://cis-india.org/internet-governance/news/emerging-technologies-issues-way-forward</link>
    <description>
        &lt;b&gt;Aayush Rathi and Gurshabad Grover attended a two-day conference on 'Emerging Technologies: Issues &amp; Way Forward' organised by the Technology Policy team at the National Institute of Public Finance and Policy (NIPFP), held on 23 and 24 May in Bangalore.&lt;/b&gt;
        &lt;p&gt;The themes for discussion included:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Privacy, surveillance and data protection&lt;/li&gt;
&lt;li&gt;Regulation of emerging technologies&lt;/li&gt;
&lt;li&gt;Building sound regulators for technology policy, and&lt;/li&gt;
&lt;li&gt;Fintech regulation&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;a class="external-link" href="http://cis-india.org/internet-governance/nipfp-bangalore-agenda"&gt;Click here&lt;/a&gt; to read the agenda&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/emerging-technologies-issues-way-forward'&gt;https://cis-india.org/internet-governance/news/emerging-technologies-issues-way-forward&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Admin</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2018-05-26T00:39:11Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>




</rdf:RDF>
