<?xml version="1.0" encoding="utf-8" ?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:syn="http://purl.org/rss/1.0/modules/syndication/" xmlns="http://purl.org/rss/1.0/">
<channel rdf:about="https://cis-india.org/internet-governance/blog/online-anonymity/search_rss">
  <title>We are anonymous, we are legion</title>
  <link>https://cis-india.org</link>
  
  <description>These are the search results for the query, showing results 2076 to 2090.</description>
  
  <image rdf:resource="https://cis-india.org/logo.png"/>

  <items>
    <rdf:Seq>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/european-summer-school-on-internet-governance"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/vipul-kharbanda-december-23-2018-european-e-evidence-proposal-and-indian-law"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/ecj-rules-internet-search-engine-operator-responsible-for-processing-personal-data-published-by-third-parties"/>
      <rdf:li rdf:resource="https://cis-india.org/news/livemint-january-17-2014-moulishree-srivastava-elizabeth-roche-eu-parliament-slams-us-surveillance"/>
      <rdf:li rdf:resource="https://cis-india.org/about/policies/ethical-research-guidelines"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/ethical-issues-in-open-data"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/ethical-data-design-practices-in-the-ai-artificial-intelligence-age"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/events/essentials-of-building-internet-tools-for-inclusion"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/enlarging-the-small-print"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/dna-september-23-2015-amrita-madhukalya-encryption-policy-would-have-affected-emails-operating-systems-wifi"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/encryption-and-anonymity-rights-and-risks"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/enabling-multi-stakeholder-cooperation-towards-a-transnational-framework-for-due-process"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/en-inde-le-biometrique-version-tres-grand-public"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/empowering-the-next-billion-by-improving-accessibility"/>
      <rdf:li rdf:resource="https://cis-india.org/news/individuals-in-search-of-society"/>
    </rdf:Seq>
  </items>

</channel>


    <item rdf:about="https://cis-india.org/internet-governance/news/european-summer-school-on-internet-governance">
    <title>European Summer School on Internet Governance</title>
    <link>https://cis-india.org/internet-governance/news/european-summer-school-on-internet-governance</link>
    <description>
        &lt;b&gt;The 13th European Summer School on Internet Governance was held at Meissen, Germany, from 13 to 20 July 2019. Akriti Bopanna attended the school. The event was organized by EuroSSIG.&lt;/b&gt;
        &lt;p&gt;More information on the event can be &lt;a class="external-link" href="https://eurossig.eu/eurossig/2019-edition/programme-2019/"&gt;accessed on this page&lt;/a&gt;.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/european-summer-school-on-internet-governance'&gt;https://cis-india.org/internet-governance/news/european-summer-school-on-internet-governance&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Admin</dc:creator>
    <dc:rights></dc:rights>

    <dc:subject>Cyber Security</dc:subject>
    <dc:subject>Internet Governance</dc:subject>
    <dc:subject>Internet Freedom</dc:subject>

   <dc:date>2019-07-23T00:30:15Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/vipul-kharbanda-december-23-2018-european-e-evidence-proposal-and-indian-law">
    <title>European E-Evidence Proposal and Indian Law</title>
    <link>https://cis-india.org/internet-governance/blog/vipul-kharbanda-december-23-2018-european-e-evidence-proposal-and-indian-law</link>
    <description>
        &lt;b&gt;In April 2018, the European Union proposed a new regime for cross-border sharing of data and information by issuing two draft instruments: an E-evidence Regulation (“Regulation”) and an E-evidence Directive (“Directive”) (together, the “E-evidence Proposal”). The Regulation establishes the legal regime under which cross-border data orders may be issued, while the Directive requires member states to enact laws obliging service providers to designate legal representatives so that they comply with the proposed regime.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The main feature of the E-evidence Proposal is twofold: (i) establishment of a legal regime whereunder competent authorities can issue European Production Orders (&lt;b&gt;EPOs&lt;/b&gt;) and European Preservation Orders (&lt;b&gt;EPROs&lt;/b&gt;) to entities in any other EU member country (together the “&lt;b&gt;Data Orders&lt;/b&gt;”); and (ii) an obligation on service providers offering services in any of the EU member countries to designate legal representatives who will be responsible for receiving the Data Orders, irrespective of whether such entity has an actual physical establishment in any EU member country.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In this article we will briefly discuss the framework proposed under the two instruments and then discuss how service providers based in India whose services are also available in Europe would be affected by these proposals. The author would like to make it clear that this article is not intended as a critical analysis of the E-evidence Proposal and therefore does not attempt to bring out the shortcomings of the proposed European regime, except insofar as such shortcomings may affect the service providers located in India discussed in the second part of the article.&lt;/p&gt;
&lt;p&gt;&lt;b&gt;Part I - E-evidence Directive and Regulation &lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The E-evidence Proposal introduces the concept of binding EPOs and EPROs. Both Data Orders need to be issued or validated by a judicial authority in the issuing EU member country. A Data Order can be issued to seek preservation or production of data that is stored by a service provider located in another jurisdiction and that is necessary as evidence in a criminal investigation or criminal proceeding. Such Data Orders may only be issued if a similar measure is available for the same criminal offence in a comparable domestic situation in the issuing country. Both Data Orders can be served on entities offering services such as electronic communication services, social networks, online marketplaces, other hosting service providers and providers of internet infrastructure such as IP address and domain name registries. Thus companies such as Big Rock (domain name registry), Ferns n Petals (online marketplace providing services in Europe), Hike (social networking and chatting), etc., or any website with a subscription-based model that allows access to subscribers in Europe, would potentially be covered by the E-evidence Proposal. The EPRO, like the EPO, is addressed to the legal representative outside the issuing country’s jurisdiction to preserve the data in view of a subsequent request to produce it, which request may be issued through mutual legal assistance (MLA) channels in the case of third countries or via a European Investigation Order (EIO) between EU member countries.
Unlike surveillance measures or data retention obligations set out by law, which are not provided for by this proposal, the EPRO is an order issued or validated by a judicial authority in a concrete criminal proceeding after an individual evaluation of the proportionality and necessity in every single case.&lt;a href="#_ftn1" name="_ftnref1"&gt;&lt;sup&gt;&lt;sup&gt;[1]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Like the EPO, it refers to the specific known or unknown perpetrators of a criminal offence that has already taken place. The EPRO only allows preserving data that is already stored at the time of receipt of the order, not the access to data at a future point in time after the receipt of the EPRO.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;While EPOs to produce subscriber data&lt;a href="#_ftn2" name="_ftnref2"&gt;&lt;sup&gt;&lt;sup&gt;[2]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; and access data&lt;a href="#_ftn3" name="_ftnref3"&gt;&lt;sup&gt;&lt;sup&gt;[3]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; can be issued for any criminal offence, an EPO for content data&lt;a href="#_ftn4" name="_ftnref4"&gt;&lt;sup&gt;&lt;sup&gt;[4]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; and transactional data&lt;a href="#_ftn5" name="_ftnref5"&gt;&lt;sup&gt;&lt;sup&gt;[5]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; may only be issued by a judge, a court or an investigating judge competent in the case. If the EPO is issued by any other authority (which is competent to issue such an order in the issuing country), it has to be validated by a judge, a court or an investigating judge. An EPO for subscriber data and access data may also be validated by a prosecutor in the issuing country.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;To reduce obstacles to the enforcement of the EPOs, the Directive makes it mandatory for service providers to designate a legal representative in the European Union to receive, comply with and enforce Data Orders. The obligation of designating a legal representative for all service providers that are operating in the European Union would ensure that there is always a clear addressee of orders aiming at gathering evidence in criminal proceedings. This would in turn make it easier for service providers to comply with those orders, as the legal representative would be responsible for receiving, complying with and enforcing those orders on behalf of the service provider.&lt;/p&gt;
&lt;p&gt;&lt;i&gt;&lt;span&gt;Grounds on which EPOs can be issued&lt;/span&gt;&lt;/i&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The grounds on which Data Orders may be issued are contained in Articles 5 and 6 of the Regulation, which make it very clear that a Data Order may only be issued if it is necessary and proportionate for the purposes of a criminal proceeding. The Regulation further specifies that an EPO may only be issued by a member country if a similar domestic order could be issued by the issuing state in a comparable situation. By linking the grounds to domestic law, the Regulation tries to skirt around the thorny issue of when and on what basis an EPO may be issued. The Regulation also assigns greater weight (in terms of privacy) to transactional and content data as opposed to subscriber and access data, and subjects the production and preservation of the former to stricter requirements. Therefore, while Data Orders for access and subscriber data may be issued for any criminal offence, orders for transactional and content data can only be issued for criminal offences carrying a maximum custodial sentence of at least three years. In addition, EPOs for producing transactional or content data can also be issued for offences specifically listed in Article 5(4) of the Regulation. These offences have been specifically provided for because evidence in such cases would typically be available only in electronic form; this justifies applying the Regulation even where the maximum custodial sentence is less than three years, since it would otherwise be extremely difficult to secure convictions for those offences.&lt;a href="#_ftn6" name="_ftnref6"&gt;&lt;sup&gt;&lt;sup&gt;[6]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Regulation also requires the issuing authority to take into account potential immunities and privileges under the law of the member country in which the service provider is being served the EPO, as well as any impact the EPO may have on fundamental interests of that member country such as national security and defence. The aim of this provision is to ensure that such immunities and privileges which protect the data sought are respected, in particular where they provide for a higher protection than the law of the issuing member country. In such situations the issuing authority “has to seek clarification before issuing the European Production Order, including by consulting the competent authorities of the Member State concerned, either directly or via Eurojust or the European Judicial Network.”&lt;/p&gt;
&lt;p&gt;&lt;i&gt;&lt;span&gt;Grounds to Challenge EPOs&lt;/span&gt;&lt;/i&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Service providers have been given the option to object to Data Orders on certain limited grounds specified in the Regulation, such as if the order was not issued by a proper issuing authority, if the provider cannot comply because of a &lt;i&gt;de facto&lt;/i&gt; impossibility or &lt;i&gt;force majeure&lt;/i&gt;, or if the data requested is not stored with the service provider or pertains to a person who is not a customer of the service provider.&lt;a href="#_ftn7" name="_ftnref7"&gt;&lt;sup&gt;&lt;sup&gt;[7]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; In all such cases the service provider has to inform the issuing authority of the reasons for its inability to provide the information in the specified form. Further, if the service provider refuses to provide the information on the ground that it is apparent that the EPO “manifestly violates” the Charter of Fundamental Rights of the European Union or is “manifestly abusive”, the service provider shall send the information in the specified form to the competent authority in the member state in which the Order has been received. The competent authority shall then seek clarification from the issuing authority through Eurojust or via the European Judicial Network.&lt;a href="#_ftn8" name="_ftnref8"&gt;&lt;sup&gt;&lt;sup&gt;[8]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;If the issuing authority is not satisfied by the reasons given and the service provider still refuses to provide the information requested, the issuing authority may transfer the EPO Certificate, along with the reasons given by the service provider for non-compliance, to the enforcing authority in the addressee country. The enforcing authority shall then proceed to enforce the Order, unless it considers that the data concerned is protected by an immunity or privilege under its national law, that its disclosure may impact its fundamental interests such as national security and defence, or that the data cannot be provided due to one of the following reasons:&lt;/p&gt;
&lt;p&gt;(a) the European Production Order has not been issued or validated by an issuing authority as provided for in Article 4;&lt;/p&gt;
&lt;p&gt;(b) the European Production Order has not been issued for an offence provided for by Article 5(4);&lt;/p&gt;
&lt;p&gt;(c) the addressee could not comply with the EPOC because of de facto impossibility or force majeure, or because the EPOC contains manifest errors;&lt;/p&gt;
&lt;p&gt;(d) the European Production Order does not concern data stored by or on behalf of the service provider at the time of receipt of EPOC;&lt;/p&gt;
&lt;p&gt;(e) the service is not covered by this Regulation;&lt;/p&gt;
&lt;p&gt;(f) based on the sole information contained in the EPOC, it is apparent that it manifestly violates the Charter or that it is manifestly abusive.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In addition to the above mechanism, the service provider may refuse to comply with an EPO on the ground that disclosure would force it to violate a third-country law that either protects “the fundamental rights of the individuals concerned” or “the fundamental interests of the third country related to national security or defence.” Where a provider raises such a challenge, issuing authorities can request a review of the order by a court in the member country. If the court concludes that a conflict as claimed by the service provider exists, the court shall notify the authorities in the third country, and if that third country objects to the execution of the EPO, the court must set it aside.&lt;a href="#_ftn9" name="_ftnref9"&gt;&lt;sup&gt;&lt;sup&gt;[9]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;A service provider may also refuse to comply with an order because it would force the service provider to violate a third-country law that protects interests &lt;i&gt;other than&lt;/i&gt; fundamental rights or national security and defense. In such cases, the Regulation provides that the same procedure be followed as in case of law protecting fundamental rights or national security and defense, except that in this case the court, rather than notifying the foreign authorities, shall itself conduct a detailed analysis of the facts and circumstances to decide whether to enforce the order.&lt;a href="#_ftn10" name="_ftnref10"&gt;&lt;sup&gt;&lt;sup&gt;[10]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;&lt;i&gt;&lt;span&gt;Service Provider “Offering Services in the Union”&lt;/span&gt;&lt;/i&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;As is clear from the discussion above, the proposed regime puts an obligation on service providers offering services in the Union to designate a legal representative in the European Union, whether or not the service provider is physically located there. This appears to be a fairly onerous obligation for small technology companies, as appointing and maintaining a legal representative in the European Union may involve significant cost, especially if the service provider is not located in the EU. The question therefore arises as to which service providers would be covered by this obligation, and the answer lies in the definitions of the terms “service provider” and “offering services in the Union”.&lt;/p&gt;
&lt;p&gt;The term service provider has been defined in Article 2(2) of the Directive as follows:&lt;/p&gt;
&lt;p&gt;“‘service provider’ means any natural or legal person that provides one or more of the following categories of services:&lt;/p&gt;
&lt;p&gt;(a) electronic communications service as defined in Article 2(4) of [Directive establishing the European Electronic Communications Code];&lt;a href="#_ftn11" name="_ftnref11"&gt;&lt;sup&gt;&lt;sup&gt;[11]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;(b) information society services as defined in point (b) of Article 1(1) of Directive (EU) 2015/1535 of the European Parliament and of the Council&lt;a href="#_ftn12" name="_ftnref12"&gt;&lt;sup&gt;&lt;sup&gt;[12]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; for which the storage of data is a defining component of the service provided to the user, including social networks, online marketplaces facilitating transactions between their users, and other hosting service providers;&lt;/p&gt;
&lt;p&gt;(c) internet domain name and IP numbering services such as IP address providers, domain name registries, domain name registrars and related privacy and proxy services;”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Thus, broadly speaking, the service providers covered by the Regulation would include providers of electronic communication services, social networks, online marketplaces, other hosting service providers and providers of internet infrastructure such as IP address and domain name registries. An important qualification added in the definition is that it covers only those services where “storage of data is a defining component of the service”. Therefore, services for which the storage of data is not a defining component are not covered by the proposal. The Regulation also recognizes that most services delivered by providers involve some kind of storage of data, especially where they are delivered online at a distance; it therefore specifically provides that services for which the storage of data is not a &lt;i&gt;main characteristic&lt;/i&gt; and is thus only of an ancillary nature would not be covered, including legal, architectural, engineering and accounting services provided online at a distance.&lt;a href="#_ftn13" name="_ftnref13"&gt;&lt;sup&gt;&lt;sup&gt;[13]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;However, not every provider of services in which data storage is the main characteristic would be covered by the Directive merely because its services can be accessed in the EU. The term “offering services in the Union” has been defined in Article 2(3) of the Directive as follows:&lt;/p&gt;
&lt;p&gt;“‘offering services in the Union’ means:&lt;/p&gt;
&lt;p&gt;(a) enabling legal or natural persons in one or more Member State(s) to use the services listed under (3) above; and&lt;/p&gt;
&lt;p&gt;(b) having a substantial connection to the Member State(s) referred to in point (a);”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Clause (b) of the definition is the main qualifying factor, which would ensure that only those entities whose offering of services has a “substantial connection” with the member countries of the EU would be covered by the Directive. The Regulation recognizes that mere accessibility of the service (which could also be achieved through mere accessibility of the service provider’s or an intermediary’s website in the EU) should not be a sufficient condition for the application of such an onerous obligation, and therefore the concept of a “substantial connection” was inserted to ascertain a sufficient relationship between the provider and the territory where it is offering its services. In the absence of a permanent establishment in an EU member country, such a “substantial connection” may be said to exist if there are a significant number of users in one or more EU member countries, or on the basis of the “targeting of activities” towards one or more EU member countries. The “targeting of activities” may be determined based on various circumstances, such as the use of a language or a currency generally used in an EU member country, the availability of an app in the relevant national app store, providing local advertising or advertising in the language used in an EU member country, making use of any information originating from persons in EU member countries in the course of its activities, or from the handling of customer relations such as by providing customer service in the language generally used in EU member countries. A substantial connection can also be assumed where a service provider directs its activities towards one or more EU member countries as set out in Article 17(1)(c) of Regulation 1215/2012 on jurisdiction and the recognition and enforcement of judgments in civil and commercial matters.&lt;a href="#_ftn14" name="_ftnref14"&gt;&lt;sup&gt;&lt;sup&gt;[14]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;&lt;b&gt;Part II - EU Directive and Service Providers located in India&lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In this part of the article we will discuss how companies based in India that run websites providing a “service” such as social networking or subscription-based video streaming (for example Hike, AltBalaji or Hotstar) would be affected by the E-evidence Proposal. At first glance a website providing a video streaming service may not appear to be covered by the E-evidence Proposal, since one would assume that there may not be any storage of data. But if the service allows users to open personal accounts (with personal and possibly financial details, as in the case of TVF, AltBalaji or Hotstar) and uses their online behaviour to push relevant material and advertisements to their accounts, whether that makes the storage of data a defining component of the website’s services as contemplated under the proposal is a question that may not be easy to answer.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Even if it is assumed that the services of an Indian company can be classified as information society services for which the storage of data is a defining component, that by itself would not be sufficient to make the E-evidence Proposal applicable to it. The services of an Indian company would still need to have a “substantial connection” with an EU member country. As discussed above, this substantial connection may be said to exist based on the existence of (i) a significant number of users in one or more EU member countries, or (ii) the “targeting of activities” towards one or more EU member countries. The determination of whether a service provider is targeting its services towards an EU member country is to be made based on a number of factors listed above and is a subjective determination with certain guiding factors.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;There does not seem to be clarity, however, on what would constitute a significant number of users, and whether this determination is to be based on the total number of users in an EU member country as a proportion of that country’s population, or as a proportion of the total number of customers the service provider has worldwide. To explain this further, let us assume that an Indian company such as Hotstar has a total user base of 100 million customers.&lt;a href="#_ftn15" name="_ftnref15"&gt;&lt;sup&gt;&lt;sup&gt;[15]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Suppose 10 million of these 100 million subscribers are located in countries other than India, of which about 40 thousand customers are in France and another 40 thousand in Malta; this leads to some interesting analysis. Now, 40 thousand customers in a customer base of 100 million is 0.04% of the service provider’s total customer base, which generally speaking would not constitute a “significant number”. However, if we reckon the 40 thousand customers against the total population of Malta, which is approximately 4.75 lakh,&lt;a href="#_ftn16" name="_ftnref16"&gt;&lt;sup&gt;&lt;sup&gt;[16]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; they amount to approximately 8.4% of the population of Malta. It is unlikely that a service reaching almost a tenth of the population of an entire country could be said not to have a significant number of users in Malta. If the same math is done for a country such as France, which has a population of approximately 67.3 million,&lt;a href="#_ftn17" name="_ftnref17"&gt;&lt;sup&gt;&lt;sup&gt;[17]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; the figure would be about 0.06% of the total population; would that constitute a significant number under the E-evidence Proposal?&lt;/p&gt;
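The population-share arithmetic in the hypothetical above can be checked with a quick sketch (the user counts are the article's assumed figures, not real data; the populations are the approximations it cites):

```python
# Hypothetical figures from the article's example (assumptions, not real data)
total_user_base = 100_000_000   # assumed worldwide customer base
users_in_malta = 40_000         # assumed customers in Malta
users_in_france = 40_000        # assumed customers in France
population_malta = 475_000      # approx. 4.75 lakh, as cited in the article
population_france = 67_300_000  # approx. 67.3 million, as cited in the article

# Share of the provider's own customer base vs. share of each country's population
share_of_user_base = users_in_malta / total_user_base * 100
share_of_malta_population = users_in_malta / population_malta * 100
share_of_france_population = users_in_france / population_france * 100

print(f"{share_of_user_base:.2f}% of the provider's worldwide user base")
print(f"{share_of_malta_population:.1f}% of Malta's population")
print(f"{share_of_france_population:.2f}% of France's population")
```

The same 40,000 users thus look negligible against the provider's global base (0.04%) yet substantial against Malta's population (about 8.4%), which is exactly the ambiguity in "significant number" that the paragraph above describes.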
&lt;p style="text-align: justify; "&gt;The issues discussed above are very important for any service provider, especially a small or medium-sized company, since the determination of whether the E-evidence Proposal applies to it, apart from any potential legal implications, imposes a direct economic cost of designating a legal representative in an EU member country. Keeping in mind this economic burden and how it might affect the budget of smaller companies, the Explanatory Memorandum to the Regulation clarifies that this legal representative could be a third party, which could be shared between several service providers, and further that the legal representative may accumulate different functions (e.g. the General Data Protection Regulation or e-Privacy representatives in addition to the legal representative provided for by the E-evidence Directive).&lt;a href="#_ftn18" name="_ftnref18"&gt;&lt;sup&gt;&lt;sup&gt;[18]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;If all the above issues are determined in favour of the E-evidence Directive being applicable to an Indian company and the company designates a legal representative in an EU member country, it remains to be seen how Indian laws relating to data protection would interact with the obligations of the Indian company under the E-evidence Directive. As per Rule 6 of the Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011 (“&lt;b&gt;SPDI Rules&lt;/b&gt;”), service providers are not allowed to disclose sensitive personal data or information except with the prior permission of the provider of the information, save for disclosure to mandated government agencies. The Rule provides that “the information shall be shared, without obtaining prior consent from provider of information, with &lt;i&gt;Government agencies mandated under the law&lt;/i&gt; to obtain information including sensitive personal data or information for the purpose of verification of identity, or for prevention, detection, investigation including cyber incidents, prosecution, and punishment of offences….”. Although the term “government agency mandated under law” has not been defined in the SPDI Rules, the term “law” has been defined in the Information Technology Act, 2000 (“&lt;b&gt;IT Act&lt;/b&gt;”) as under:&lt;/p&gt;
&lt;p&gt;“’law’ includes any Act of Parliament or of a State Legislature, Ordinances promulgated by the President or a Governor, as the case may be, Regulations made by the President under article 240, Bills enacted as President's Act under sub-clause (a) of clause (1) of article 357 of the Constitution and includes rules, regulations, byelaws and orders issued or made thereunder;”&lt;a href="#_ftn19" name="_ftnref19"&gt;&lt;sup&gt;&lt;sup&gt;[19]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Since the SPDI Rules are issued under the IT Act, the term “law” as used in the Rules would have to be read as defined in the IT Act (unless a court holds to the contrary). This would mean that Rule 6 of the SPDI Rules only recognises government agencies mandated under Indian law, and therefore information cannot be disclosed to agencies not recognised by Indian law. In such a scenario an Indian company may not have any option except to raise an objection and challenge an EPO issued to it on the grounds provided in Article 16 of the Regulation, a process which could itself mean a significant expenditure on the part of such a company.&lt;/p&gt;
&lt;p&gt;&lt;b&gt;Conclusion&lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Through the E-evidence Proposal, the European Union seeks to establish a regime for streamlining access to digital data that differs both from the approach favoured by countries such as the United States, which prefers mutual agreements with (presumably) key nations, and from the push for data localisation favoured by countries such as India. Since the regime put forth by the EU is still only at the proposal stage, there may yet be changes which could clarify it significantly. However, as things stand, Indian companies may be affected by the E-evidence Proposal in the following ways:&lt;/p&gt;
&lt;ul&gt;
&lt;li style="text-align: justify; "&gt;Companies offering services outside India may inadvertently trigger obligations under the E-evidence Proposal if their services have a substantial connection with any of the member states of the European Union;&lt;/li&gt;
&lt;li&gt;Indian companies offering services overseas will have to make an internal determination as to whether the E-evidence Proposal applies to them or not;&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;In case of Indian companies which come under the E-evidence Proposal, they would be obligated to designate a legal representative in an EU member state for receiving and executing Data Orders as per the E-evidence Proposal.&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;If a legal representative is designated by the Indian company they may have to incur significant costs on maintaining a legal representative especially in a situation where they have to object to the implementation of an EPO. The company would also have to coordinate with the legal representative to adequately put forth their (Indian law related) concerns before the competent authority so that they are not forced to fall foul of their legal obligations in either jurisdiction. It is also unclear the extent to which appointed legal representatives from Indian companies could challenge or push back against requests received.&lt;/li&gt;
&lt;/ul&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Disclaimer&lt;/span&gt;: The author of this Article is an Indian trained lawyer and not an expert on European law. The author would like to apologise for any incorrect analysis of European law that may have crept into this article despite best efforts.&lt;/p&gt;
&lt;hr /&gt;
&lt;p&gt;&lt;a href="#_ftnref1" name="_ftn1"&gt;&lt;sup&gt;&lt;sup&gt;[1]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Explanatory Memorandum to the Proposal for Regulation of the European Parliament and of the Council on European Production and Preservation Orders for Electronic Evidence in Criminal Matters, Pg. 4, available at &lt;a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52018PC0225&amp;amp;from=EN"&gt;https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52018PC0225&amp;amp;from=EN&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref2" name="_ftn2"&gt;&lt;sup&gt;&lt;sup&gt;[2]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Subscriber data means data which is used to identify the user and has been defined in Article 2 (7) as follows:&lt;/p&gt;
&lt;p&gt;“‘subscriber data’ means any data pertaining to:&lt;/p&gt;
&lt;p&gt;(a) the identity of a subscriber or customer such as the provided name, date of birth, postal or geographic address, billing and payment data, telephone, or email;&lt;/p&gt;
&lt;p&gt;(b) the type of service and its duration including technical data and data identifying related technical measures or interfaces used by or provided to the subscriber or customer, and data related to the validation of the use of service, excluding passwords or other authentication means used in lieu of a password that are provided by a user, or created at the request of a user;”&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref3" name="_ftn3"&gt;&lt;sup&gt;&lt;sup&gt;[3]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; The term access data has been defined in Article 2(8) as follows:&lt;/p&gt;
&lt;p&gt;“‘access data’ means data related to the commencement and termination of a user access session to a service, which is strictly necessary for the sole purpose of identifying the user of the service, such as the date and time of use, or the log-in to and log-off from the service, together with the IP address allocated by the internet access service provider to the user of a service, data identifying the interface used and the user ID. This includes electronic communications metadata as defined in point (g) of Article 4(3) of Regulation concerning the respect for private life and the protection of personal data in electronic communications;”&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref4" name="_ftn4"&gt;&lt;sup&gt;&lt;sup&gt;[4]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; The term content data has been defined in Article 2 (10) as follows:&lt;/p&gt;
&lt;p&gt;“‘content data’ means any stored data in a digital format such as text, voice, videos, images, and sound other than subscriber, access or transactional data;”&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref5" name="_ftn5"&gt;&lt;sup&gt;&lt;sup&gt;[5]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; The term transactional data has been defined in Article 2(9) as follows:&lt;/p&gt;
&lt;p&gt;“‘transactional data’ means data related to the provision of a service offered by a service provider that serves to provide context or additional information about such service and is generated or processed by an information system of the service provider, such as the source and destination of a message or another type of interaction, data on the location of the device, date, time, duration, size, route, format, the protocol used and the type of compression, unless such data constitutes access data. This includes electronic communications metadata as defined in point (g) of Article 4(3) of [Regulation concerning the respect for private life and the protection of personal data in electronic communications];”&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref6" name="_ftn6"&gt;&lt;sup&gt;&lt;sup&gt;[6]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Explanatory Memorandum to the Proposal for Regulation of the European Parliament and of the Council on European Production and Preservation Orders for Electronic Evidence in Criminal Matters, Pg. 17, available at &lt;a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52018PC0225&amp;amp;from=EN"&gt;https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52018PC0225&amp;amp;from=EN&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref7" name="_ftn7"&gt;&lt;sup&gt;&lt;sup&gt;[7]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Articles 9(4) and 10(5) of the Regulation.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref8" name="_ftn8"&gt;&lt;sup&gt;&lt;sup&gt;[8]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Article 10(5) of the Regulation.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref9" name="_ftn9"&gt;&lt;sup&gt;&lt;sup&gt;[9]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Article 15 of the Regulation.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref10" name="_ftn10"&gt;&lt;sup&gt;&lt;sup&gt;[10]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Article 16 of the Regulation. Also see &lt;a href="https://www.insideprivacy.com/uncategorized/eu-releases-e-evidence-proposal-for-cross-border-data-access/"&gt;https://www.insideprivacy.com/uncategorized/eu-releases-e-evidence-proposal-for-cross-border-data-access/&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref11" name="_ftn11"&gt;&lt;sup&gt;&lt;sup&gt;[11]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Article 2(4) of the Directive establishing European Electronic Communications Code provides as under:&lt;/p&gt;
&lt;p&gt;“‘electronic communications service’ means a service normally provided for remuneration via electronic communications networks, which encompasses 'internet access service' as defined in Article 2(2) of Regulation (EU) 2015/2120; and/or 'interpersonal communications service'; and/or services consisting wholly or mainly in the conveyance of signals, such as transmission services used for the provision of machine-to-machine services and for broadcasting, but excludes services providing, or exercising editorial control over, content transmitted using electronic communications networks and services;”&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref12" name="_ftn12"&gt;&lt;sup&gt;&lt;sup&gt;[12]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Information Society Services have been defined in the Directive specified as “any Information Society service, that is to say, any service normally provided for remuneration, at a distance, by electronic means and at the individual request of a recipient of services.”&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref13" name="_ftn13"&gt;&lt;sup&gt;&lt;sup&gt;[13]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Proposal for a Directive of the European Parliament and of the Council Laying Down Harmonised Rules on the Appointment of Legal Representatives for the Purpose of Gathering Evidence in Criminal Proceedings, Pg 8, available at &lt;a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52018PC0226&amp;amp;from=EN"&gt;https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52018PC0226&amp;amp;from=EN&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref14" name="_ftn14"&gt;&lt;sup&gt;&lt;sup&gt;[14]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Proposal for a Directive of the European Parliament and of the Council Laying Down Harmonised Rules on the Appointment of Legal Representatives for the Purpose of Gathering Evidence in Criminal Proceedings, Pg 9, available at &lt;a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52018PC0226&amp;amp;from=EN"&gt;https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52018PC0226&amp;amp;from=EN&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref15" name="_ftn15"&gt;&lt;sup&gt;&lt;sup&gt;[15]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Hotstar already has an active customer base of 75 million, as of December, 2017; &lt;a href="https://telecom.economictimes.indiatimes.com/news/netflix-restricted-to-premium-subscribers-hotstar-leads-indian-ott-content-market/62351500"&gt;https://telecom.economictimes.indiatimes.com/news/netflix-restricted-to-premium-subscribers-hotstar-leads-indian-ott-content-market/62351500&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref16" name="_ftn16"&gt;&lt;sup&gt;&lt;sup&gt;[16]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;a href="https://en.wikipedia.org/wiki/Malta"&gt;https://en.wikipedia.org/wiki/Malta&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref17" name="_ftn17"&gt;&lt;sup&gt;&lt;sup&gt;[17]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;a href="https://en.wikipedia.org/wiki/France"&gt;https://en.wikipedia.org/wiki/France&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref18" name="_ftn18"&gt;&lt;sup&gt;&lt;sup&gt;[18]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Proposal for a Directive of the European Parliament and of the Council Laying Down Harmonised Rules on the Appointment of Legal Representatives for the Purpose of Gathering Evidence in Criminal Proceedings, Pg 5, available at &lt;a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52018PC0226&amp;amp;from=EN"&gt;https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52018PC0226&amp;amp;from=EN&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="#_ftnref19" name="_ftn19"&gt;&lt;sup&gt;&lt;sup&gt;[19]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Section 2(y) of the Information Technology Act, 2000.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/vipul-kharbanda-december-23-2018-european-e-evidence-proposal-and-indian-law'&gt;https://cis-india.org/internet-governance/blog/vipul-kharbanda-december-23-2018-european-e-evidence-proposal-and-indian-law&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>vipul</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Cyber Security</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    

   <dc:date>2018-12-23T16:45:02Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/ecj-rules-internet-search-engine-operator-responsible-for-processing-personal-data-published-by-third-parties">
    <title>European Court of Justice rules Internet Search Engine Operator responsible for Processing Personal Data Published by Third Parties</title>
    <link>https://cis-india.org/internet-governance/blog/ecj-rules-internet-search-engine-operator-responsible-for-processing-personal-data-published-by-third-parties</link>
    <description>
        &lt;b&gt;The Court of Justice of the European Union has ruled that "an internet search engine operator is responsible for the processing that it carries out of personal data which appear on web pages published by third parties.” The decision adds to the conundrum of maintaining a balance among freedom of expression, the protection of personal data and intermediary liability.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The ruling is expected to have considerable impact on reputation and privacy related takedown requests as under the decision, data subjects may approach the operator directly seeking removal of links to web pages containing personal data. Currently, users prove whether data needs to be kept online—the new rules reverse the burden of proof, placing an obligation on companies, rather than users for content regulation.&lt;/p&gt;
&lt;h3&gt;A win for privacy?&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The ECJ ruling addresses Mario Costeja González complaint filed in 2010, against Google Spain and Google Inc., requesting that personal data relating to him appearing in search results be protected and that data which was no longer relevant be removed. Referring to &lt;a href="http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:31995L0046:en:HTML"&gt;the Directive 95/46/EC&lt;/a&gt; of the European Parliament, the court said, that Google and other search engine operators should be considered 'controllers' of personal data. Following the decision, Google will be required to consider takedown requests of personal data, regardless of the fact that processing of such data is carried out without distinction in respect of information other than the personal data.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The decision—which cannot be appealed—raises important of questions of how this ruling will be applied in practice and its impact on the information available online in countries outside the European Union.  The decree forces search engine operators such as Google, Yahoo and Microsoft's Bing to make judgement calls on the fairness of the information published through their services that reach over 500  million people across the twenty eight nation bloc of EU.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;ECJ rules that search engines 'as a general rule,' should place the right to privacy above the right to information by the public. Under the verdict, links to irrelevant and out of date data need to be erased upon request, placing search engines in the role of controllers of information—beyond the role of being an arbitrator that linked to data that already existed in the public domain. The verdict is directed at highlighting the power of search engines to retrieve controversial information while limiting their capacity to do so in the future.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The ruling calls for maintaining a balance in addressing the legitimate interest of internet users in accessing personal information and upholding the data subject’s fundamental rights, but does not directly address either issues. The court also recognised, that the data subject's rights override the interest of internet users, however, with exceptions pertaining to nature of information, its sensitivity for the data subject's private life and the role of the data subject in public life. Acknowledging that data belongs to the individual and is not the right of the company, European Commissioner Viviane Reding, &lt;a href="https://www.facebook.com/permalink.php?story_fbid=304206613078842&amp;amp;id=291423897690447&amp;amp;_ga=1.233872279.883261846.1397148393"&gt;hailed the verdict&lt;/a&gt;, "a clear victory for the protection of personal data of Europeans".&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Court stated that if data is deemed irrelevant at the time of the case, even if it has been lawfully processed initially, it must be removed and that the data subject has the right to approach the operator directly for the removal of such content. The liability issue is further complicated by the fact, that search engines such as Google do not publish the content rather they point to information that already exists in the public domain—raising questions of the degree of liability on account of third party content displayed on their services.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The ECJ ruling is based on the case originally filed against Google, Spain and it is important to note that, González argued that searching for his name linked to two pages originally published in 1998, on the website of the Spanish newspaper La Vanguardia. The Spanish Data Protection Agency did not require La Vanguardia to take down the pages, however, it did order Google to remove links to them. Google appealed this decision, following which the National  High Court of Spain sought advice from the European court. The definition of Google as the controller of information, raises important questions related to the distinction between liability of publishers and the liability of processors of information such as search engines.&lt;/p&gt;
&lt;h3&gt;The 'right to be forgotten'&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The decision also brings to the fore, the ongoing debate and &lt;a href="http://www.theguardian.com/technology/2013/apr/04/britain-opt-out-right-to-be-forgotten-law"&gt;fragmented opinions within the EU&lt;/a&gt;, on the right of the individual to be forgotten. The &lt;a href="http://www.bbc.com/news/technology-16677370"&gt;'right to be forgotten&lt;/a&gt;' has evolved from the European Commission's wide-ranging plans of an overhaul of the commission's 1995 Data Protection Directive. The plans for the law included allowing people to request removal of personal data with an obligation of compliance for service providers, unless there were 'legitimate' reasons to do otherwise. Technology firms rallying around issues of freedom of expression and censorship, have expressed concerns about the reach of the bill. Privacy-rights activist and European officials have upheld the notion of the right to be forgotten, highlighting the right of the individual to protect their honour and reputation.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;These issues have been controversial amidst EU member states with the UK's Ministry of Justice claiming the law 'raises unrealistic and unfair expectations' and  has &lt;a href="http://www.theguardian.com/technology/2013/apr/04/britain-opt-out-right-to-be-forgotten-law"&gt;sought to opt-out&lt;/a&gt; of the privacy laws. The Advocate General of the European Court &lt;a href="http://curia.europa.eu/juris/document/document.jsf?text=&amp;amp;docid=138782&amp;amp;pageIndex=0&amp;amp;doclang=EN&amp;amp;mode=req&amp;amp;dir=&amp;amp;occ=first&amp;amp;part=1&amp;amp;cid=362663#Footref91"&gt;Niilo Jääskinen's opinion&lt;/a&gt;, that the individual's right to seek removal of content should not be upheld if the information was published legally, contradicts the verdict of the ECJ ruling. The European Court of Justice's move is surprising for many and as Richard Cumbley, information-management and data protection partner at the law firm Linklaters &lt;a href="http://turnstylenews.com/2014/05/13/europe-union-high-court-establishes-the-right-to-be-forgotten/"&gt;puts it&lt;/a&gt;, “Given that the E.U. has spent two years debating this right as part of the reform of E.U. privacy legislation, it is ironic that the E.C.J. has found it already exists in such a striking manner."&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The economic implications of enforcing a liability regime where search engine operators censor legal content in their results aside, the decision might also have a chilling effect on freedom of expression and access to information. Google &lt;a href="http://www.theguardian.com/technology/2014/may/13/right-to-be-forgotten-eu-court-google-search-results"&gt;called the decision&lt;/a&gt; “a disappointing ruling for search engines and online publishers in general,” and that the company would take time to analyze the implications. While the implications of the decision are yet to be determined, it is important to bear in mind that while decisions like these are public, the refinements that Google and other search engines will have to make to its technology and the judgement calls on the fairness of the information available online are not public.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The ECJ press release is available &lt;a href="http://curia.europa.eu/jcms/upload/docs/application/pdf/2014-05/cp140070en.pdf"&gt;here&lt;/a&gt; and the actual judgement is available &lt;a href="http://curia.europa.eu/juris/documents.jsf?pro=&amp;amp;lgrec=en&amp;amp;nat=or&amp;amp;oqp=&amp;amp;lg=&amp;amp;dates=&amp;amp;language=en&amp;amp;jur=C%2CT%2CF&amp;amp;cit=none%252CC%252CCJ%252CR%252C2008E%252C%252C%252C%252C%252C%252C%252C%252C%252C%252Ctrue%252Cfalse%252Cfalse&amp;amp;num=C-131%252F12&amp;amp;td=%3BALL&amp;amp;pcs=Oor&amp;amp;avg"&gt;here&lt;/a&gt;.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/ecj-rules-internet-search-engine-operator-responsible-for-processing-personal-data-published-by-third-parties'&gt;https://cis-india.org/internet-governance/blog/ecj-rules-internet-search-engine-operator-responsible-for-processing-personal-data-published-by-third-parties&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>jyoti</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Freedom of Speech and Expression</dc:subject>
    
    
        <dc:subject>Social Media</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Intermediary Liability</dc:subject>
    

   <dc:date>2014-05-14T14:18:46Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/news/livemint-january-17-2014-moulishree-srivastava-elizabeth-roche-eu-parliament-slams-us-surveillance">
    <title>EU parliament report slams US surveillance</title>
    <link>https://cis-india.org/news/livemint-january-17-2014-moulishree-srivastava-elizabeth-roche-eu-parliament-slams-us-surveillance</link>
    <description>
        &lt;b&gt;Report that outlines need for stringent laws for protecting citizen privacy, democratizing Internet governance holds lessons for India, say analysts.&lt;/b&gt;
        &lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;The article by Moulishree Srivastava and Elizabeth Roche quotes Sunil Abraham. It was &lt;a class="external-link" href="http://www.livemint.com/Home-Page/nYXiR4LEVJLiROfl95aFxH/EU-parliament-report-slams-US-surveillance.html"&gt;published in Livemint&lt;/a&gt; on January 17, 2014.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;A European Union (EU) parliament report that outlines the need for stringent laws for protecting citizen privacy, democratizing Internet governance and rebuilding trust between Europe and the US holds many lessons for India, analysts and policymakers say.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The US government listened into Indian communications as part of its massive global surveillance, which was exposed last year in leaks to the media. The embassies of France, Italy, Greece, Japan, Mexico, South Korea and Turkey were also subjected to the surveillance put in place after the September 2001 terrorist attacks. According to the external affairs ministry, India has registered its protest at least thrice over the issue with US authorities.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;A draft report on the US National Security Agency’s surveillance programme by the European parliament’s committee on civil liberties, justice and home affairs states that trust between the two transatlantic partners, trust among EU member-states, and trust between citizens and their governments were profoundly shaken because of the spying, and to rebuild trust in all these dimensions a comprehensive plan was urgently needed.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;"It is very doubtful that data collection of such magnitude is only guided by the fight against terrorism, as it involves the collection of all possible data of all citizens; points therefore to the possible existence of other power motives such as political and economic espionage," says the report.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The report recommends prohibiting blanket mass surveillance activities and bulk processing of personal data, and asks EU member-states, including the UK, Germany, France, Sweden and the Netherlands, to revise their national legislation and practices governing the activities of intelligence services to ensure that they are in line with the standards of the European Convention on Human Rights.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;It also calls on the US to revise its legislation without delay in order to bring it in line with international law, recognizing privacy and other rights as well as providing for judicial redress for EU citizens.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;"The American approach to privacy regulation has been deeply flawed. The US dominance over the Internet affects the structure and substance of Internet governance and among other human rights, the right to privacy," said Sunil Abraham, executive director of the Centre for Internet and Society, a Bangalore-based not-for-profit research organization. "The (EU) report, if implemented, may change the future of Internet governance by deepening the existing leadership provided by the EU in promoting their privacy standards globally."&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;On India’s rather restrained reaction to the spying, he said, “It is a tragedy that our politicians are not as proactive when it comes to protecting our rights. While India has only focused on changing its official email policy after the revelations of mass surveillance, it has done nothing as concrete and comprehensive as EU."&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;"There is neither the recognition of (the) pervasive nature of global mass surveillance, nor is there full appreciation (of) the damaging consequences," Abraham added.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;J. Satyanarayana, secretary in India’s department of electronics and information technology, said the concerns over privacy are the same for India as for the EU, but declined to comment on what preventive steps the government is implementing due to security reasons. The EU report called for concluding the EU-US umbrella pact, a framework agreement on data protection in the field of police and judicial cooperation, to ensure proper redress mechanisms for EU citizens in the event of data transfers from the EU to the US for law enforcement purposes. The report asks EU policymakers not to initiate any new sectoral agreements or arrangements for the transfer of personal data for law enforcement purposes and suggests suspending the terrorist finance tracking programme until the umbrella agreement negotiations are concluded.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;"EU wants to use EU-US umbrella agreement...to raise the US standards, to ensure the rights of EU citizens and perhaps all the citizens. All humans will need protection under US law as is currently the case in the EU,” said Abraham. “The prohibition of blanket surveillance that the report recommends will hopefully apply to all citizens regardless of their nationality."&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The draft report goes as far as suggesting suspending Safe Harbour, the legal instrument used for the transfer of EU personal data to the US through Google, Microsoft, Yahoo, Facebook, Apple and LinkedIn, until a full review has been conducted and current loopholes are plugged. The report’s proposals and recommendations are likely to be implemented after election to the European parliament in May.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In addition to reforms in the existing systems, the report outlines the importance of development of European clouds as it notes that trust in US cloud computing and cloud services providers has been affected by the surveillance practices.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;"Three of the major computerized reservation systems used by airlines worldwide are based in the US and that PNR (passenger name record) data are saved in cloud systems operating on US soil under US law...lacks data protection adequacy," states the report.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;C.U. Bhaskar, analyst with the South Asia Monitor think tank, was of the view that India had “adequately” responded to the US through quiet diplomacy.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;"It is unlikely that the US will give up cyber surveillance,” he said, adding, “We should acquire our own capacity to ensure adequate defensive and offensive firewalls and build up appropriate capacity for our cyber programmes."&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;"Given our expertise in the IT (information technology) sector, as an analyst my opinion is that we have a reasonable capacity to build up our capabilities," Bhaskar added.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/news/livemint-january-17-2014-moulishree-srivastava-elizabeth-roche-eu-parliament-slams-us-surveillance'&gt;https://cis-india.org/news/livemint-january-17-2014-moulishree-srivastava-elizabeth-roche-eu-parliament-slams-us-surveillance&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>praskrishna</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2014-02-03T06:13:55Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/about/policies/ethical-research-guidelines">
    <title>Ethical Research Guidelines</title>
    <link>https://cis-india.org/about/policies/ethical-research-guidelines</link>
    <description>
        &lt;b&gt;The Centre for Internet and Society will endeavour to protect the physical, social and psychological well-being of those who participate in their research. The guidelines below state the necessary steps to follow while doing research.&lt;/b&gt;
        
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;p&gt;The ethical research guidelines require CIS staff and consultants to consider and take the following steps while engaging in research.&lt;/p&gt;
&lt;ol&gt;
&lt;li style="text-align: justify;"&gt;Providing notice to the individual of the: Aims, methods, his/her right to abstain from participation in the research and his/her right to terminate at any time his/her participation; the confidential nature of his/her replies and any limits on such confidentiality.&lt;br /&gt;&lt;br /&gt;&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;Providing informants and other participants the right to remain anonymous.&lt;br /&gt;&lt;br /&gt;&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;Taking informed consent from the individual that he/she agrees to participate. If children are involved in the research, informed consent will be taken from the parents. Informed consent will entail communicating :&lt;br /&gt;
&lt;ul&gt;
&lt;li&gt;Purpose(s) of the study, and the anticipated consequences of the research;&lt;/li&gt;
&lt;li&gt;Identity of funders and sponsors&lt;/li&gt;
&lt;li&gt;Anticipated uses of the data&lt;/li&gt;
&lt;li&gt;The degree of anonymity and confidentiality which may be afforded to informants and subjects.&lt;br /&gt;&lt;br /&gt;&lt;/li&gt;&lt;/ul&gt;
&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;Ensuring that when audio/visual-recorders and photographic records are being used, participants that are being recorded will be made aware of the use of the devices, and have the option to request that they not be used.&lt;br /&gt;&lt;br /&gt;&lt;/li&gt;
&lt;li&gt;Ensuring that the identity and identifying information of the participant (if not already in the public domain) is destroyed at the end of the project, unless the individual has consented otherwise.&lt;br /&gt;&lt;br /&gt;&lt;/li&gt;
&lt;li&gt;At public events organized by CIS, it will be announced and publicly posted that the event is being recorded. Individuals will be given the choice to object to being recorded or to having their name and organization shared in conference reports, blogs, articles, etc. If the individual does not object, they will be considered to have given their consent.&lt;br /&gt;&lt;br /&gt;&lt;/li&gt;
&lt;li&gt;The Centre for Internet and Society strictly follows a policy of &lt;strong&gt;No Plagiarism&lt;/strong&gt;.&lt;/li&gt;&lt;/ol&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/about/policies/ethical-research-guidelines'&gt;https://cis-india.org/about/policies/ethical-research-guidelines&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>praskrishna</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Research</dc:subject>
    
    
        <dc:subject>Policies</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2018-10-13T12:21:48Z</dc:date>
   <dc:type>Page</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/ethical-issues-in-open-data">
    <title>Ethical Issues in Open Data</title>
    <link>https://cis-india.org/internet-governance/blog/ethical-issues-in-open-data</link>
    <description>
&lt;b&gt;On August 1, 2013, I took part in a web meeting, organized and hosted by Tim Davies of the World Wide Web Foundation. The meeting, titled “Ethical issues in Open Data,” had an agenda focused on privacy considerations in the context of the open data movement.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The main panelists, Carly Nyst and Sam Smith from &lt;a class="external-link" href="http://https//www.privacyinternational.org/"&gt;Privacy International&lt;/a&gt;, as well as Steve Song from the &lt;a class="external-link" href="http://www.idrc.ca/EN/Pages/default.aspx"&gt;International  Development Research Centre&lt;/a&gt;, were joined by roughly a dozen other privacy and development researchers from around the globe in the hour long session.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The primary issue of the meeting was the concern over modern capabilities of cross-analytics for de-anonymizing data sets and revealing personally identifiable information (PII) in open data. Open data can constitute publicly available information such as budgets, infrastructures, and population statistics, as long as the data meets the three open data characteristics: accessibility, machine readability, and availability for re-use. “Historically,” said Tim Davies, “public registers have been protected through obscurity.” However, both the capabilities of data analysts and the definition of personal data have continued to expand in recent years. This concern thus presents a conflict between researchers who advocate governments releasing open data reports, and researchers who emphasize privacy in the developing world.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Steve Song, advisor to IDRC Information &amp;amp; Networks program, spoke of the potential collateral damage that comes with publishing more and more types of information. Song addressed the imperative of the meeting in saying, “privacy needs to be a core part of open data conversation.” In his presentation, he gave a particularly interesting example of the tensions between public and private information implications. Following the infamous &lt;a class="external-link" href="http://en.wikipedia.org/wiki/Sandy_Hook_Elementary_School_shooting"&gt;2012 school shooting in Newtown, Connecticut&lt;/a&gt;, the information on Newtown’s gun permit owning citizens (made publicly available through America’s &lt;a class="external-link" href="http://foia.state.gov/"&gt;Freedom of Information Act&lt;/a&gt;) was aggregated into an interactive map which revealed the citizens’ addresses. This obviously became problematic for the Newtown community, as the map not only singled out homes which exercised their right to bear arms but also indirectly revealed which homes were without firearm protection and thereby more vulnerable to theft and crime. The Newtown example clearly demonstrates the relationship (and conflict) between open data and privacy; it resolves to the conflict between the right to information and the right to privacy.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;An apparent issue surrounding open data is its perceived binary nature. Many advocates either view data as being open, or not; any intermediary boundaries are only forms of governments limiting data accessibility. Therefore, a point raised by meeting attendee Raed Sharif aptly presented an open data counter-argument. Sarif noted how, inversely, privacy conceptions may form a threat to open data. He mentioned how governments could take advantage of privacy arguments to justify their refusal to publish open reports. &lt;br /&gt;&lt;br /&gt;However, Carly Nyst summarized the privacy concern and argument in her remarks near the end of the meeting. Namely, she reasoned that the open data mission is viable, if only limited to generic data, i.e., data about infrastructure, or other information that is in no way personal. Doing so will avoid obstructions of individual privacy. Until more advanced anonymization techniques can be achieved, which can overcome modern re-identification methods, publicly publishing PII may prove too risky. It was generally agreed upon during the meeting that open data is not inherently bad, and in fact its analysis and availability can be beneficial, but the threat of its misuse makes it dangerous. For the future of open data, researchers and advocates should perhaps consider more nuanced approaches to the concept in order to respect considerations for other ethical issues, such as privacy.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/ethical-issues-in-open-data'&gt;https://cis-india.org/internet-governance/blog/ethical-issues-in-open-data&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>kovey</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Open Data</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2013-08-07T09:19:54Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/ethical-data-design-practices-in-the-ai-artificial-intelligence-age">
    <title>Ethical Data Design Practices in the AI (Artificial Intelligence) Age</title>
    <link>https://cis-india.org/internet-governance/news/ethical-data-design-practices-in-the-ai-artificial-intelligence-age</link>
    <description>
&lt;b&gt;Shweta Mohandas was a panelist at a discussion on Ethical Data Design Practices in the AI (Artificial Intelligence) Age, organised by Startup Grind, Bangalore on July 28, 2018 at NUMA Bangalore.&lt;/b&gt;
        &lt;h2&gt;Agenda&lt;/h2&gt;
&lt;p&gt;&lt;b&gt;Ethical Data Design Practices in the AI Age&lt;/b&gt;&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;The panel discussion is intended to explore the challenges we face when designing the user experiences of the complex behavioral agents that increasingly run our lives.&lt;/p&gt;
&lt;p dir="ltr"&gt;Discussion centred around how to:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Understand current thinking by the AI community on ethics and morality in computing and the challenges it presents. &lt;/li&gt;
&lt;li&gt;Explore examples of the ethical choices that products make now and will make in the near future.&lt;/li&gt;
&lt;li&gt;Learn how designers might approach designing experiences that face moral dilemmas.&lt;/li&gt;
&lt;/ul&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/ethical-data-design-practices-in-the-ai-artificial-intelligence-age'&gt;https://cis-india.org/internet-governance/news/ethical-data-design-practices-in-the-ai-artificial-intelligence-age&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Admin</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Artificial Intelligence</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2018-08-01T23:14:21Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/events/essentials-of-building-internet-tools-for-inclusion">
    <title>Essentials of building internet tools for inclusion</title>
    <link>https://cis-india.org/internet-governance/events/essentials-of-building-internet-tools-for-inclusion</link>
    <description>
&lt;b&gt;A talk jointly proposed by Chinmayi SK and Rohini Lakshané was selected for the Internet Freedom Festival held at Valencia, Spain, from March 6 to 10, 2017. The talk, held on March 6, 2017, was jointly organized by Random Hacks of Kindness, The Bachchao Project, and the Centre for Internet and Society.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;For the full schedule of Internet Freedom Festival, &lt;a class="external-link" href="https://internetfreedomfestival.org/schedule/"&gt;click here&lt;/a&gt;. A manual titled &lt;a class="external-link" href="https://github.com/thebachchaoproject/Manual-to-build-tech-for-diversity-and-Inclusion/blob/master/BuildingTechforDiversityandInclusion101.pdf"&gt;Building Tech for Diversity and Inclusion 101&lt;/a&gt; jointly authored by Chinmayi S.K., Rohini Lakshané and Willow Brugh was released during the event.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/events/essentials-of-building-internet-tools-for-inclusion'&gt;https://cis-india.org/internet-governance/events/essentials-of-building-internet-tools-for-inclusion&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>praskrishna</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    

   <dc:date>2017-03-14T14:38:23Z</dc:date>
   <dc:type>Event</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/enlarging-the-small-print">
    <title>Enlarging the Small Print: A Study on Designing Effective Privacy Notices for Mobile Applications</title>
    <link>https://cis-india.org/internet-governance/blog/enlarging-the-small-print</link>
    <description>
&lt;b&gt;The world’s biggest modern lie is often said to be the sentence “I have read and agreed to the Terms and Conditions.” It is a well-known fact, backed by empirical research, that consumers often skip reading cumbersome privacy notices. The reasons range from their length and complicated legal jargon to the inopportune moments at which these notices are displayed. This paper seeks to compile and analyse the different simplified designs of privacy notices that have been proposed for mobile applications to encourage consumers to make informed privacy decisions.&lt;/b&gt;
        &lt;h2 style="text-align: justify; "&gt;Introduction: Ideas of Privacy and Consent Linked with Notices&lt;/h2&gt;
&lt;h3 style="text-align: justify; "&gt;The Notice and Choice Model&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Most modern laws and data privacy principles seek to focus on individual control. As Alan Westin of Columbia University characterises privacy, "it is the claim of individuals, groups, or institutions to determine for themselves when, how, and to what extent information about them is communicated to other,"	&lt;a href="#_ftn1" name="_ftnref1"&gt;[1]&lt;/a&gt; Or simply put, personal information privacy is "the ability of the individual to personally control 	information about himself."&lt;a href="#_ftn2" name="_ftnref2"&gt;[2]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The preferred mechanism for protecting online privacy that has emerged is that of Notice and Choice.&lt;a href="#_ftn3" name="_ftnref3"&gt;[3]&lt;/a&gt; The model, identified as "the most fundamental principle" in online privacy,&lt;a href="#_ftn4" name="_ftnref4"&gt;[4]&lt;/a&gt; refers to&lt;a href="http://itlaw.wikia.com/wiki/Post" title="Post"&gt;consumers&lt;/a&gt; consenting to privacy policies before availing of an online service.	&lt;a href="#_ftn5" name="_ftnref5"&gt;[5]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The following 3 standards of expectations of privacy in electronic communications have emerged in the United States courts:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;KATZ TEST: Katz v. United States,&lt;a href="#_ftn6" name="_ftnref6"&gt;[6]&lt;/a&gt; a wiretap case, established the expectation of privacy as one that society is prepared to recognize as "reasonable".&lt;a href="#_ftn7" name="_ftnref7"&gt;[7]&lt;/a&gt; This concept is critical to a court's understanding of a new technology because there is no established precedent to guide its analysis.&lt;a href="#_ftn8" name="_ftnref8"&gt;[8]&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;KYLLO/KATZ HYBRID TEST: Society's reasonable expectation of privacy is higher when dealing with a new technology that is not "generally available to the public".&lt;a href="#_ftn9" name="_ftnref9"&gt;[9]&lt;/a&gt; This follows the logic that it is reasonable to expect common data collection practices to be used, but not rare ones.&lt;a href="#_ftn10" name="_ftnref10"&gt;[10]&lt;/a&gt; In Kyllo v. United States,&lt;a href="#_ftn11" name="_ftnref11"&gt;[11]&lt;/a&gt; law enforcement used a thermal imaging device to observe the relative heat levels inside a house. Though, as per Katz, using publicly available technology to capture thermal radiation would be reasonable, the uncommon means of collection was not. This modification to the Katz standard is extremely important in the context of mobile privacy. Mobile communications may be subdivided into smaller parts: the audio from a phone call, e-mail, and data related to a user's current location. Following an application of the hybrid Katz/Kyllo test, the reasonable expectation of privacy in each of those communications would be determined separately,&lt;a href="#_ftn12" name="_ftnref12"&gt;[12]&lt;/a&gt; by evaluating the general accessibility of the technology required to capture each stream.&lt;a href="#_ftn13" name="_ftnref13"&gt;[13]&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;DOUBLECLICK TEST: DoubleClick&lt;a href="#_ftn14" name="_ftnref14"&gt;[14]&lt;/a&gt; illustrates the potential problems of transferring consent to a third party, one to whom the user never provided direct consent, or of whom the user is not even aware. The court held that for DoubleClick, an online advertising network, to collect information from a user it needed only to obtain permission from the website that the user accessed, and not from the user himself. The court reasoned that the information the user disclosed to the website was analogous to information one discloses to another person during a conversation. Just as the other party to the conversation would be free to tell his friends about anything that was said, a website should be free to disclose any information it receives from a user's visit after the user has consented to use the website's services.&lt;/li&gt;
&lt;/ol&gt;
&lt;p style="text-align: justify; "&gt;These interpretations have weakened the standards of online privacy. While the Katz test vaguely hinges on societal expectations, the Kyllo Test to an 	extent strengthens privacy rights by disallowing uncommon methods of collection, but as the DoubleClick Test illustrates, once the user has consented to 	such practices he cannot object to the same. There have been sugestions to consider personal information as property when it shares features of property 	like location data.&lt;a href="#_ftn15" name="_ftnref15"&gt;[15]&lt;/a&gt; It is fixed when it is in storage, it has a monetary value, and it is sold and traded on a regular basis. This would create a standard where consent is required for third-party access.	&lt;a href="#_ftn16" name="_ftnref16"&gt;[16]&lt;/a&gt; Consent will then play a more pivotal role in affixing liability.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The notice and choice mechanism is designed to put individuals in charge of the collection and use of their personal information. In theory, the regime preserves user autonomy by putting the individual in charge of decisions about the collection and use of personal information.	&lt;a href="#_ftn17" name="_ftnref17"&gt;[17]&lt;/a&gt; Notice and choice is asserted as a substitute for regulation because it is thought to be more 	flexible, inexpensive to implement, and easy to enforce.&lt;a href="#_ftn18" name="_ftnref18"&gt;[18]&lt;/a&gt; Additionally, notice and choice can legitimize an information practice, whatever it may be, by obtaining an individual's consent and suit individual privacy preferences.	&lt;a href="#_ftn19" name="_ftnref19"&gt;[19]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;However, the notice and choice mechanism is often criticized for leaving users uninformed-or misinformed, at least-as people rarely see, read, or understand 	privacy notices. &lt;a href="#_ftn20" name="_ftnref20"&gt;[20]&lt;/a&gt; Moreover, few people opt out of the collection, use, or disclosure of their data when 	presented with the choice to do so.&lt;a href="#_ftn21" name="_ftnref21"&gt;[21]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Amber Sinha of the Centre for Internet and Society argues that consent in these scenarios Is rarely meaningful as consumers fail to read/access privacy 	policies, understand the consequences and developers do not provide them the choice to opt out of a particular data practice while still being allowed to 	use their services. &lt;a href="#_ftn22" name="_ftnref22"&gt;[22]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Of particular concern is the use of software applications (apps) designed to work on mobile devices. Estimates place the current number of apps available 	for download at more than 1.5 million, and that number is growing daily.&lt;a href="#_ftn23" name="_ftnref23"&gt;[23]&lt;/a&gt; A 2011 Google study, "The 	Mobile Movement," identified that mobile devices are viewed as extensions of ourselves that we share with deeply personal relations with, raising 	fundamental questions of how apps and other mobile communications influence our privacy decision-making.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Recent research indicates that mobile device users have concerns about the privacy implications of using apps.	&lt;a href="#_ftn24" name="_ftnref24"&gt;[24]&lt;/a&gt; The research finds that almost 60 percent of respondents ages 50 and older decided not to install an 	app because of privacy concerns (see figure 1).&lt;a href="#_ftn25" name="_ftnref25"&gt;[25]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;img src="https://cis-india.org/home-images/ConsumerReactions.png" alt="Consumer Reactions" class="image-inline" title="Consumer Reactions" /&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Because no standards currently exist for providing privacy notice disclosure for apps, consumers may find it difficult to understand what data the app is 	collecting, how those data will be used, and what rights users have in limiting the collection and use of their data. Many apps do not provide users with privacy policy statements, making it impossible for app users to know the privacy implications of using a particular app.	&lt;a href="#_ftn26" name="_ftnref26"&gt;[26]&lt;/a&gt;Apps can make use of any or all of the device's functions, including contact lists, calendars, phone 	and messaging logs, locational information, Internet searches and usage, video and photo galleries, and other possibly sensitive information. For example, 	an app that allows the device to function as a scientific calculator may be accessing contact lists, locational data, and phone records even though such 	access is unnecessary for the app to function properly. &lt;a href="#_ftn27" name="_ftnref27"&gt;[27]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Other apps may have privacy policies that are confusing or misleading. For example, an analysis of health and fitness apps found that more than 30 percent 	of the apps studied shared data with someone not disclosed in the app's privacy policy.&lt;a href="#_ftn28" name="_ftnref28"&gt;[28]&lt;/a&gt;&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Types of E-Contracts&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;Margaret Radin distinguishes two models of direct e-contracts based on consent as -"contract-as-consent" and "contract-as-product."	&lt;a href="#_ftn29" name="_ftnref29"&gt;[29]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The contract-as-consent model is the traditional picture of how binding commitment is arrived at between two humans. It involves a meeting of the minds 	which implies that terms be understood, alternatives be available, and probably that bargaining be possible.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In the contract-as-product model, the terms are part of the product, not a conceptually separate bargain; physical product plus terms are a package deal. 	For example the fact that a chip inside an electronics item will wear out after a year is an unseen contract creating a take-it-or-leave-it choice not to 	buy the package.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The product-as-consent model defies traditional ideas of consent and raises questions of whether consent is meaningful. Modern day e-contracts such as 	click wrap, shrink wrap, viral contracts and machine-made contracts which form the privacy policy of several apps have a product-as-consent approach where 	consumers are given the take-it-or-leave-it option.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Mobile application privacy notices fall into the product-as-consent model. Consumers often have to click "I agree" to all the innumerable Terms and 	Conditions in order to install the app. For instance terms that the fitness app will collect biometric data is a feature of the product that is 	non-negotiable. It is a classic take-it-or-leave-it approach where consumers compromise on privacy to avail services.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Contracts that facilitate these transactions are generally long and complicated and often agreed to by consumers without reading them.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Craswell strikes a balance in applying the liability rule to point out that as explaining the meaning of extensive fine print would be very costly to point 	out it could be efficient to affix the liability rule not as a written contract but rather on "reasonable" terms. This means that if a fitness app collects 	sensitive financial information, which is unreasonable given its core activities, then even if the user has consented to the same in the privacy policy's 	fine print the contract should be capable of being challenged.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt; &lt;/b&gt;&lt;/p&gt;
&lt;h2&gt;The Concept of Privacy by Design&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;Privacy needs to be considered from the very beginning of system development. For this reason, Dr. Anne Cavoukian	&lt;a href="#_ftn30" name="_ftnref30"&gt;[30]&lt;/a&gt; coined the term "Privacy by Design", that is, privacy should be taken into account throughout the 	entire engineering process from the earliest design stages to the operation of the productive system. This holistic approach is promising, but it does not 	come with mechanisms to integrate privacy in the development processes of a system. The privacy-by-design approach, i.e. that data protection safeguards 	should be built into products and services from the earliest stage of development, has been addressed by the European Commission in their proposal for a 	General Data Protection Regulation. This proposal uses the terms "privacy by design" and "data protection by design" synonymously.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The 7 Foundational Principles&lt;a href="#_ftn31" name="_ftnref31"&gt;[31]&lt;/a&gt; of Privacy by Design are:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;Proactive not Reactive; Preventative not Remedial&lt;/li&gt;
&lt;li&gt;Privacy as the Default Setting&lt;/li&gt;
&lt;li&gt;Privacy Embedded into Design&lt;/li&gt;
&lt;li&gt;Full Functionality - Positive-Sum, not Zero-Sum&lt;/li&gt;
&lt;li&gt;End-to-End Security - Full Lifecycle Protection&lt;/li&gt;
&lt;li&gt;Visibility and Transparency - Keep it Open&lt;/li&gt;
&lt;li&gt;Respect for User Privacy - Keep it User-Centric&lt;/li&gt;
&lt;/ol&gt;
&lt;p style="text-align: justify; "&gt;Several terms have been introduced to describe types of data that need to be protected. A term very prominently used by industry is "personally 	identifiable information (PII)", i.e., data that can be related to an individual. Similarly, the European data protection framework centres on "personal 	data". However, some authors argue that this falls short since also data that is not related to a single individual might still have an impact on the 	privacy of groups, e.g., an entire group might be discriminated with the help of certain information. For data of this category the term "privacy-relevant 	data" has been used. &lt;a href="#_ftn32" name="_ftnref32"&gt;[32]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;An essential part of Privacy by Design is that data subjects should be adequately informed whenever personal data is processed. Whenever data subjects use 	a system, they should be informed about which information is processed, for what purpose, by which means and who it is shared is with. They should be 	informed about their data access rights and how to exercise them.&lt;a href="#_ftn33" name="_ftnref33"&gt;[33]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Whereas system design very often does not or barely consider the end-users' interests, but primarily focuses on owners and operators of the system, it is 	essential to account the privacy and security interests of all parties involved by informing them about associated advantages (e.g. security gains) and 	disadvantages (e.g. costs, use of resources, less personalisation). By creating this system of "multilateral security" the demands of all parties must be 	realized.&lt;a href="#_ftn34" name="_ftnref34"&gt;[34]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;&lt;span&gt; &lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;The Concept of Data Minimization&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The most basic privacy design strategy is MINIMISE, which states that the amount of personal data that is processed should be restricted to the minimal 	amount possible. By ensuring that no, or no unnecessary, data is collected, the possible privacy impact of a system is limited. Applying the MINIMISE 	strategy means one has to answer whether the processing of personal data is proportional (with respect to the purpose) and whether no other, less invasive, 	means exist to achieve the same purpose. The decision to collect personal data can be made at design time and at run time, and can take various forms. For 	example, one can decide not to collect any information about a particular data subject at all. Alternatively, one can decide to collect only a limited set 	of attributes.&lt;a href="#_ftn35" name="_ftnref35"&gt;&lt;sup&gt;&lt;sup&gt;[35]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;If a company collects and retains large amounts of data, there is an increased risk that the data will be used in a way that departs from consumers' 	reasonable expectations.&lt;a href="#_ftn36" name="_ftnref36"&gt;&lt;sup&gt;&lt;sup&gt;[36]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;There are three privacy protection goals&lt;a href="#_ftn37" name="_ftnref37"&gt;&lt;sup&gt;&lt;sup&gt;[37]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; that data minimization and privacy by 	design seek to achieve. These privacy protection goals are:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Unlinkability - to prevent data from being linked to an identifiable entity.&lt;/li&gt;
&lt;li&gt;Transparency - the information has to be available before, during and after the processing takes place.&lt;/li&gt;
&lt;li&gt;Intervenability - those who provide their data must have means of intervention into all ongoing or planned privacy-relevant data processing.&lt;/li&gt;
&lt;/ul&gt;
&lt;p style="text-align: justify; "&gt;Spiekermann and Cranor raised an intriguing point in their paper, they argued that those companies that employ privacy by design and data minimization practices in their applications should be allowed to skip the need for privacy policies and forgo need for notice and choice features.	&lt;a href="#_ftn38" name="_ftnref38"&gt;&lt;sup&gt;&lt;sup&gt;[38]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;b&gt;&lt;span&gt; &lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;table style="text-align: justify; "&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;
&lt;div&gt;
&lt;p&gt;&lt;b&gt;To Summarise: &lt;i&gt;The emerging model and legal dialogue that regulates online privacy is that of Notice and Choice, which has been severely criticised for not creating informed choice-making processes. E-contracts such as agreeing to privacy notices follow the contract-as-product model. Where there is extensive fine print, liability must be affixed on the basis of reasonable terms. Privacy notices must incorporate the concepts of Privacy by Design by providing complete information and collecting minimum data.&lt;/i&gt;&lt;/b&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;
&lt;h2 style="text-align: justify; "&gt;Features of Privacy Notices in the Current Mobile Ecosystem&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;A privacy notice inform a system's users or a company's customers of data practices involving personal information. Internal practices with regard to the 	collection, processing, retention, and sharing of personal information should be made transparent.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Each app a user chooses to install on his smartphone can access different information stored on that device. There is no automatic access to user 	information. Each application has access only to the data that it pulls into its own 'sandbox'.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The sandbox is a set of fine-grained controls limiting an application's access to files, preferences, network resources, hardware etc. Applications cannot 	access each other's sandboxes.&lt;a href="#_ftn39" name="_ftnref39"&gt;[39]&lt;/a&gt; The data that makes it into the sandbox is normally defined by user permissions.&lt;a href="#_ftn40" name="_ftnref40"&gt;[40]&lt;/a&gt; These are a set of user defined controls&lt;a href="#_ftn41" name="_ftnref41"&gt;[41]&lt;/a&gt;and evidence that a user consents to the application accessing that data.	&lt;a href="#_ftn42" name="_ftnref42"&gt;[42]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;To gain permission mobile apps generally display privacy notices that explicitly seek consent. These can leverage different channels, including a privacy 	policy document posted on a website or linked to from mobile app stores or mobile apps. For example, Google Maps uses a traditional clickwrap structure that requires the user to agree to a list of terms and conditions when the program is initially launched.	&lt;a href="#_ftn43" name="_ftnref43"&gt;[43]&lt;/a&gt; Foursquare, on the other hand, embeds its terms in a privacy policy posted on its website, and not 	within the app. &lt;a href="#_ftn44" name="_ftnref44"&gt;[44]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This section explains the features of current privacy notices on the 4 parameters of stage (at which the notice is given), content, length and user 	comprehension. Under each of these parameters the associated problems are identified and alternatives are suggested.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt; &lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;(1) &lt;/b&gt; &lt;b&gt;Timing and Frequency of Notice: &lt;br /&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;i&gt; This sub-section identifies the various stages that notices are given and highlights their advantages, disadvantages and makes recommendations. It 		concludes with the findings of a study on what the ideal stage to provide notice is. This is supplemented with 2 critical models to address the common 		problems of habituation and contextualization. &lt;/i&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt; &lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt; Studies indicate that timing of notices or the stage at which they are given impact how consumer's recall and comprehend them and make choices 		accordingly. &lt;/b&gt; &lt;a href="#_ftn45" name="_ftnref45"&gt;[45]&lt;/a&gt; &lt;b&gt; I&lt;/b&gt; ntroducing only a 15-second delay between the presentation of privacy notices and privacy relevant choices can be enough to render notices ineffective at 	driving user behaviour.&lt;a href="#_ftn46" name="_ftnref46"&gt;[46]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Google Android and Apple iOS provide notices at different times. At the time of writing, Android users are shown a list of requested permissions while the 	app is being installed, i.e., after the user has chosen to install the app. In contrast, iOS shows a dialog during app use, the first time a permission is 	requested by an app. This is also referred to as a "just-in-time" notification. &lt;a href="#_ftn47" name="_ftnref47"&gt;[47]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The following are the stages in which a notice can be given:&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;1) NOTICE AT SETUP: Notice can be provided when a system is used for the first time&lt;a href="#_ftn48" name="_ftnref48"&gt;[48]&lt;/a&gt;. For instance, as 	part of a software installation process users are shown and have to accept the system's terms of use.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;a) &lt;span&gt;Advantages&lt;/span&gt;: Users can inspect a system's data practices before using or purchasing it. The system developer is benefitted due to liability and 	transparency reasons that gain user trust. It provides the opportunity to explain unexpected data practices that may have a benign purpose in the context 	of the system&lt;a href="#_ftn49" name="_ftnref49"&gt;[49]&lt;/a&gt;. It can even impact purchase decisions. Egelman et al. found that participants were more 	likely to pay a premium at a privacy-protective website when they saw privacy information in search results, as opposed to on the website after selecting a 	search result&lt;a href="#_ftn50" name="_ftnref50"&gt;[50]&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;b) Disadvantages: Users have become largely habituated to install time notices and ignore them&lt;a href="#_ftn51" name="_ftnref51"&gt;[51]&lt;/a&gt;. Users 	may have difficulty making informed decisions because they have not used the system yet and cannot fully assess its utility or weigh privacy trade-offs. They may also be focused on the primary task, namely completing the setup process to be able to use the system, and fail to pay attention to notices	&lt;a href="#_ftn52" name="_ftnref52"&gt;[52]&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;c) Recommendations: Privacy notices provided at setup time should be concise and focus on data practices immediately relevant to the primary user rather 	than presenting extensive terms of service. Integrating privacy information into other materials that explain the functionality of the system may further 	increase the chance that users do not ignore it.&lt;a href="#_ftn53" name="_ftnref53"&gt;[53]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;2) JUST IN TIME NOTICE: A privacy notice can be shown when a data practice is active, for example when information is being collected, used, or shared. 	Such notices are referred to as "contextualized" or "just-in-time" notices&lt;a href="#_ftn54" name="_ftnref54"&gt;[54]&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;a) Advantages: They enhance transparency and enable users to make privacy decisions in context. Users have also been shown to more freely share information 	if they are given relevant explanations at the time of data collection&lt;a href="#_ftn55" name="_ftnref55"&gt;[55]&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;b) Disadvantages: Habituation can occur if these are shown too frequently. Moreover in apps such as gaming apps users generally tend to ignore notices 	displayed during usage.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;c) Recommendations: Consumers can be given notice the first time a particular type of information is accessed such as email and then be given the option to 	opt out of further notifications. A Consumer may then seek to opt out of notices on email but choose to view all notices on health information that is 	accessed depending on his privacy priorities.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;3) CONTEXT-DEPENDENT NOTICES: The user's and system's context can also be considered to show additional notices or controls if deemed necessary	&lt;a href="#_ftn56" name="_ftnref56"&gt;[56]&lt;/a&gt;. Relevant context may be determined by a change of location, additional users included in or receiving 	the data, and other situational parameters. Some locations may be particularly sensitive, therefore users may appreciate being reminded that they are 	sharing their location when they are in a new place, or when they are sharing other information that may be sensitive in a specific context. Facebook introduced a privacy checkup message in 2014 that is displayed under certain conditions before posting publicly. It acts as a "nudge"	&lt;a href="#_ftn57" name="_ftnref57"&gt;[57]&lt;/a&gt; to make users aware that the post will be public and to help them manage who can see their posts.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;a) Advantages: It may help users make privacy decisions that are more aligned with their desired level of privacy in the respective situation and thus 	foster trust in the system.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;b) Disadvantages: Challenges in providing context-dependent notices are detecting relevant situations and context changes. Furthermore, determining whether a context is relevant to an individual's privacy concerns could in itself require access to that person's sensitive data and privacy preferences.	&lt;a href="#_ftn58" name="_ftnref58"&gt;[58]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;c) Recommendations: Standards must be evolved to determine a contextual model based on user preferences.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;4) PERIODIC NOTICES: These are shown the first couple of times a data practice occurs, or every time. The sensitivity of the data practice may determine 	the appropriate frequency.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;a) Advantages: It can further help users maintain awareness of privacy-sensitive information flows especially when data practices are largely invisible	&lt;a href="#_ftn59" name="_ftnref59"&gt;[59]&lt;/a&gt;such as in patient monitoring apps. This helps provide better control options.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;b) Disadvantages: Repeating notices can lead to notice fatigue and habituation&lt;a href="#_ftn60" name="_ftnref60"&gt;[60]&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;c) Recommendations: Frequency of these notices needs to be balanced with user needs. &lt;a href="#_ftn61" name="_ftnref61"&gt;[61]&lt;/a&gt; Data practices 	that are reasonably expected as part of the system may require only a single notice, whereas practices falling outside the expected context of use which 	the user is potentially unaware of may warrant repeated notices. Periodic notices should be relevant to users in order to be not perceived as annoying. A combined notice can remind about multiple ongoing data practices. Rotating warnings or changing their look can also further reduce habituation effects	&lt;a href="#_ftn62" name="_ftnref62"&gt;[62]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;5) PERSISTENT NOTICES: A persistent indicator is typically non-blocking and may be shown whenever a data practices is active, for instance when information 	is being collected continuously or when information is being transmitted&lt;a href="#_ftn63" name="_ftnref63"&gt;[63]&lt;/a&gt;. When inactive or not shown, 	persistent notices also indicate that the respective data practice is currently not active. For instance, Android and iOS display a small icon in the 	status bar whenever an application accesses the user's location.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;a) Advantages: These are easy to understand and not annoying increasing their functionality.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;b) Disadvantages: These ambient indicators often go unnoticed.&lt;a href="#_ftn64" name="_ftnref64"&gt;[64]&lt;/a&gt; Most systems can only accommodate such 	indicators for a small number of data practices.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;c) Recommendations: Persistent indicators should be designed to be noticeable when they are active. A system should only provide a small set of persistent 	indicators to indicate activity of especially critical data practices which the user can also specify.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;6) NOTICE ON DEMAND: Users may also actively seek privacy information and request a privacy notice. A typical example is posting a privacy policy at a persistent location&lt;a href="#_ftn65" name="_ftnref65"&gt;[65]&lt;/a&gt; and providing links to it from the app.	&lt;a href="#_ftn66" name="_ftnref66"&gt;[66]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;a) Advantages: Privacy sensitive users are given the option to better explore policies and make informed decisions.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;b) Disadvantages: The current model of a link to a long privacy policy on a website will discourage users from requesting for information that they cannot 	fully understand and do not have time to read.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;c) Recommendations: Better option are privacy settings interfaces or privacy dashboards within the system that provide information about data practices; 	controls to manage consent; summary reports of what information has been collected, used, and shared by the system; as well as options to manage or delete 	collected information. Contact information for a privacy office should be provided to enable users to make written requests.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Which of these Stages is the Most Ideal?&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;In a series of experiments, Rebecca Balekabo and others &lt;a href="#_ftn67" name="_ftnref67"&gt;[67]&lt;/a&gt; have identified the impact of timing on 	smartphone privacy notices. The following 5 conditions were imposed on participants who were later tested on their levels of recall of the notices through 	questions:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt; Not Shown: The participants installed and used the app without being shown a privacy notice&lt;/li&gt;
&lt;li&gt;App Store: Notice was shown at the time of installation at the app store&lt;/li&gt;
&lt;li&gt;App store Big: A large notice occupying more screen space was shown at the app store&lt;/li&gt;
&lt;li&gt;App Store Popup: A smaller popup was displayed at the app Store&lt;/li&gt;
&lt;li&gt;During use: Notice was shown during usage of the app&lt;/li&gt;
&lt;/ul&gt;
&lt;p style="text-align: justify; "&gt;The results (Figure) suggest that even if a notice contains information users care about, it is unlikely to be recalled if only shown in the app store and 	more effective when shown during app usage.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Seeing the app notice during app usage resulted in better recall. Although participants remembered the notice shown after app use as well as in other 	points of app use, they found that it was not a good point for them to make decisions about the app because they had already used it, and participants 	preferred when the notice was shown during or before app usage.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Hence depending on the app there are optimal times to show smartphone privacy notices to maximize attention and recall with preference being given to the 	beginning of or during app use.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;However several of these stages as outlined baove face the disadvantages of habituation and uncertainty on contextualization. The following 2 models have 	been proposed to address this:&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt; &lt;/b&gt;&lt;/p&gt;
&lt;h2&gt;Habituation&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;When notices are shown too frequently, users may become habituated. Habituation may lead to users disregarding warnings, often without reading or 	comprehending the notice&lt;a href="#_ftn68" name="_ftnref68"&gt;[68]&lt;/a&gt;. To reduce habituation from app permission notices, Felt et al. identified a 	tested method to determine which permission requests should be emphasized &lt;a href="#_ftn69" name="_ftnref69"&gt;[69]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;They categorized actions on the basis of revertibility, severability, initiation, alterable and approval nature (Explained in figure) and applied the 	following permission granting mechanisms :&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt; Automatic Grant: It must be requested by the developer, but it is granted without user involvement.&lt;/li&gt;
&lt;li&gt;Trusted UI elements: They appear as part of an application's workflow, but clicking on them imbues the application with a new permission. To ensure 	that applications cannot trick users, trusted UI elements can be controlled only by the platform. For example, a user who is sending an SMS message from a 	third-party application will ultimately need to press a button; using trusted UI means the platform provides the button.&lt;/li&gt;
&lt;li&gt;Confirmation Dialog: Runtime consent dialogs interrupt the user's flow by prompting them to allow or deny a permission and often contain 	descriptions of the risk or an option to remember the decision.&lt;/li&gt;
&lt;li&gt;Install-time warning: These integrate permission granting into the installation flow. Installation screens list the application's requested 	permissions. In some platforms (e.g., Facebook), the user can reject some install-time permissions. In other platforms (e.g., Android and Windows 8 Metro), 	the user must approve all requested permissions or abort installation.&lt;a href="#_ftn70" name="_ftnref70"&gt;[70]&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p style="text-align: justify; "&gt;Based on these conditions the following sequential model that the system must adopt was proposed to determine frequency of displaying notices:&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;img src="https://cis-india.org/home-images/SequentialModel.png/@@images/6a94f50d-4bd0-4566-bc30-32d5ef3f53d3.png" alt="Sequential Model" class="image-inline" title="Sequential Model" /&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Initial tests have proven to be successful in reducing habituation effects and it is an important step towards designing and displaying privacy notices.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Contextualization&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;Bastian Koning and others, in their paper "Towards Context Adaptive Privacy Decisions in Ubiquitous Computing"	&lt;b&gt; &lt;a href="#_ftn71" name="_ftnref71"&gt;&lt;b&gt;[71]&lt;/b&gt;&lt;/a&gt;&lt;/b&gt; propose a system for supporting a user's privacy decisions in situ, 	i.e., in the context they are required in, following the notion of contextual integrity. It approximates the user's privacy preferences and adapts them to 	the current context. The system can then either recommend sharing decisions and actions or autonomously reconfigure privacy settings. It is divided into 	the following stages:&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;img src="https://cis-india.org/home-images/PrivacyDecisionProcess.png/@@images/4dd72aef-1bb1-42d9-ae59-9592b2a36b9f.png" alt="Privacy Decision Process" class="image-inline" title="Privacy Decision Process" /&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt; &lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Context Model:&lt;/b&gt; A distinction is created between the decision level and system level. The system level enables context awareness but also filters context information and 	maps it to semantic concepts required for decisions. Semantic mappings can be derived from a pre-defined or learnt world model. On the decision level, the 	context model only contains components relevant for privacy decision making. For example: An activity involves the user, is assigned a type, i.e., a 	semantic label, such as home or work, based on system level input.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Privacy Decision Engine&lt;/b&gt; : The context model allows to reason about which context items are affected by a context transition. When a transition occurs, the privacy decision engine 	(PDE) evaluates which protection worthy context items are affected. Protection worthiness (or privacy relevance) of context items for a given context are 	determined by the user's privacy preferences that are This serves as a basis for adapting privacy preferences and is subsequently further adjusted to the 	user by learning from the user's explicit decisions, behaviour, and reaction to system actions. &lt;a href="#_ftn72" name="_ftnref72"&gt;[72]&lt;/a&gt; approximated by the system from the knowledge base.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;i&gt;The user's personality type is determined before initial system use&lt;/i&gt; to select a basic privacy profile.&lt;i&gt; &lt;/i&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;It may also be possible that the privacy preference cannot be realized in the current context. In that case, the privacy policy would suggest terminating 	the activity. For each privacy policy variant a confidence score is calculated based on how well it fits the adapted privacy preference. Based on the 	confidence scores, the PDE selects the most appropriate policy candidate or triggers user involvement if the confidence is below a certain threshold 	determined by the user's personality and previous privacy decisions.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Realization and Enforcement:&lt;/b&gt; The selected privacy policy must be realized on the system level. This is by combining territorial privacy and information privacy aspects. The private 	territory is defined by a territorial privacy boundary that separates desired and undesired entities.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Granularity adjustments for specific Information items is defined. For example, instead of the user's exact position only the street address or city can be 	provided.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;ADVANTAGES: The personalization to a specific user has the advantage of better emulating that user's privacy decision process. It also helps to decide when 	to involve the user in the decision process by providing recommendations only and when privacy decisions can be realized autonomously.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;DISADVANTAGES: The entire model hinges on the ability of the system to accurately determine user profile before the user starts using it and not after, 	when preferences can be more accurately determined. There is no provision for the user to pick his own privacy profile, it is all system determined taking 	away an element of consent in the very beginning. As all further preferences are adapted on this base, it is possible that the system may not deliver. The 	use of confident scores is an approximation that can compromise privacy by a small numerical margin of difference.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;However it is a useful insight on techniques of contextualization. Depending on the environment, different strategies for policy realization and varying 	degrees of enforcement are possible&lt;a href="#_ftn73" name="_ftnref73"&gt;[73]&lt;/a&gt;.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Length&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;The length of privacy policies is often cited as one reason they are so commonly ignored. Studies show privacy policies are hard to read, read 	infrequently, and do not support rational decision making. &lt;a href="#_ftn74" name="_ftnref74"&gt;[74]&lt;/a&gt; Aleecia M. McDonald and Lorrie Faith Cranor 	in their seminal study, "The Cost of Reading Privacy Policies" estimated that the the average length of privacy policies is 2,500 words. Using the reading 	speed of 250 words per minute which is typical for those who have completed secondary education, the average policy would take 10 minutes to read.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The researchers also investigated how quickly people could read privacy policies when they were just skimming it for pertinent details. They timed 93 	people as they skimmed a 934-word privacy policy and answered multiple choice questions on its content.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Though some people took under a minute and others up to 42 minutes, the bulk of the subjects of the research took between three and six minutes to skim the 	policy, which itself was just over a third of the size of the average policy.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The researchers used their data to estimate how much it costs to read the privacy policy of every site they visit once a year if their time was charged for 	and arrived at a mind boggling figure of $652 billion.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;img src="https://cis-india.org/home-images/ProbabilityDensityFunction.png" alt="Probability Density Function" class="image-inline" title="Probability Density Function" /&gt;&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Problems&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;Though the figure of $652 billion has limited usefulness, because people rarely read whole policies and cannot charge anyone for the time it takes to do 	this, the researchers concluded that readers who do conduct a cost-benefit analysis might decide not to read any policies.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;"Preliminary work from a small pilot study in our laboratory revealed that some Internet users believe their only serious risk online is they may lose up 	to $50 if their credit card information is stolen. For people who think that is their primary risk, our point estimates show the value of their time to 	read policies far exceeds this risk. Even for our lower bound estimates of the value of time, it is not worth reading privacy policies though it may be 	worth skimming them," said the research. This implies that seeing their only risk as credit card fraud suggests Internet users likely do not understand the 	risks to their privacy. As an FTC report recently stated, "it is unclear whether consumers even understand that their information is being collected, 	aggregated, and used to deliver advertising."&lt;a href="#_ftn75" name="_ftnref75"&gt;[75]&lt;/a&gt;"&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Recommendations&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;If the privacy community can find ways to reduce the time cost of reading policies, it may be easier to convince Internet users to do so. For example, if 	consumers can move from needing to read policies word-for-word and only skim policies by providing useful headings, or with ways to hide all but relevant information in a layered format and thus reduce the effective length of the policies, more people may be willing to read them.	&lt;a href="#_ftn76" name="_ftnref76"&gt;[76]&lt;/a&gt; Apps can also adopt short form notices that summarize and link to the larger more complete notice 	displayed elsewhere. These short form notices need not be legally binding and must candidate that it does not cover all types of data collection but only 	the most relevant ones. &lt;a href="#_ftn77" name="_ftnref77"&gt;[77]&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;Content&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;In an attempt to gain permission most privacy policies inform users about: (1) the type of information collected; and (2) the purpose for collecting that 	information.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Standard privacy notices generally cover the points of:&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;&lt;b&gt;Methods Of Collection And Usage Of Personal Information&lt;/b&gt;&lt;/li&gt;
&lt;li&gt;&lt;b&gt;The Cookie Policy&lt;/b&gt;&lt;/li&gt;
&lt;li&gt;&lt;b&gt;Sharing Of Customer Information&lt;/b&gt;&lt;a href="#_ftn78" name="_ftnref78"&gt;&lt;b&gt;[78]&lt;/b&gt;&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p style="text-align: justify; "&gt;Certified Information Privacy Professionals divide notices into the following sequential sections&lt;a href="#_ftn79" name="_ftnref79"&gt;[79]&lt;/a&gt;:&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;i. &lt;b&gt;Policy Identification Details: D&lt;/b&gt;efines the policy name, version and description.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;ii. &lt;b&gt;P3P-Based Components: &lt;/b&gt;Defines policy attributes that would apply if the policy is exported to a P3P format.	&lt;a href="#_ftn80" name="_ftnref80"&gt;[80]&lt;/a&gt; Such attributes would include: policy URLs, organization information, P&lt;span&gt;II&lt;/span&gt; access and dispute 	resolution procedures.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;iii. &lt;b&gt;Policy Statements and Related Elements: Groups, Purposes and PII Types-&lt;/b&gt;Policy statements define the individuals able to access 	certain types of information, for certain pre-defined purposes.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Problems&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;Applications tend to define the type of data broadly in an attempt to strike a balance between providing enough information so that application may gain 	consent to access a user's data and being broad enough to avoid ruling out specific information.&lt;a href="#_ftn81" name="_ftnref81"&gt;[81]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This leads to usage of vague terms like "information collected &lt;i&gt;may &lt;/i&gt;include."&lt;a href="#_ftn82" name="_ftnref82"&gt;[82]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Similarly the purpose of the data acquisition is also very broad. For example, a privacy policy may state that user data can be collected for anything 	related to ―"improving the content of the Service." As the scope of ―improving the content of the Service is never defined, any usage could 	conceivably fall within that category.&lt;a href="#_ftn83" name="_ftnref83"&gt;[83]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Several apps create user social profiles based on their online preferences to promote targeted marketing which is cleverly concealed in phrases like "we may also draw upon this Personal Information in order to adapt the Services of our community to your needs".	&lt;a href="#_ftn84" name="_ftnref84"&gt;[84]&lt;/a&gt; For instance Bees &amp;amp; Pollen is a "predictive personalization" platform for games and apps that 	"uses advanced predictive algorithms to detect complex, non-trivial correlations between conversion patterns and users' DNA signatures, thus enabling it to 	automatically serve each user a personalized best-fit game options, in real-time." In reality it analyses over 100 user attributes, including activity on 	Facebook, spending behaviours, marital status, and location.&lt;a href="#_ftn85" name="_ftnref85"&gt;[85]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Notices also often mislead consumers into believing that their information will not be shared with third parties using the terms "unaffiliated third 	parties." Other affiliated companies within the corporate structure of the service provider may have access to user's data for marketing and other 	purposes. &lt;a href="#_ftn86" name="_ftnref86"&gt;[86]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;There are very few choices to opt-out of certain practices, such as sharing data for marketing purposes. Thus, users are effectively left with a 	take-it-or-leave-it choice - give up your privacy or go elsewhere.&lt;a href="#_ftn87" name="_ftnref87"&gt;[87]&lt;/a&gt;Users almost always grant consent if 	it is required to receive the service they want which raises the query if this consent is meaningful&lt;a href="#_ftn88" name="_ftnref88"&gt;[88]&lt;/a&gt;.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Recommendations&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;The following recommendations have emerged:&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt; &lt;b&gt;Notice&lt;/b&gt; - Companies should provide consumers with clear, conspicuous notice that accurately describes their information practices. &lt;/li&gt;
&lt;/ul&gt;
&lt;ul style="text-align: justify; " type="disc"&gt;
&lt;li&gt; &lt;b&gt;Consumer Choice&lt;/b&gt; - Companies should provide consumers with the opportunity to decide (in the form of opting out) whether the company may disclose personal information to unaffiliated third parties. &lt;/li&gt;
&lt;li&gt; &lt;b&gt;Access and Correction&lt;/b&gt; - Companies should provide consumers with the opportunity to access and correct personal information collected about the consumer. &lt;/li&gt;
&lt;li&gt; &lt;b&gt;Security&lt;/b&gt; - Companies must adopt reasonable security measures to protect the privacy of personal information, including administrative, physical, and technical security. &lt;/li&gt;
&lt;li&gt; &lt;b&gt;Enforcement&lt;/b&gt; - Companies should have systems through which they can enforce the privacy policy. This may be managed by the company or by an independent third party to ensure compliance. Examples of popular third parties include &lt;a href="https://www.cippguide.org/tag/bbbonline/"&gt;BBBOnLine&lt;/a&gt; and &lt;a href="https://www.cippguide.org/tag/truste/"&gt;TRUSTe&lt;/a&gt;.&lt;a href="#_ftn89" name="_ftnref89"&gt;[89]&lt;/a&gt; &lt;/li&gt;
&lt;li&gt; &lt;b&gt;Standardization&lt;/b&gt; - Several researchers and organizations have recommended a standardized privacy notice format that covers certain essential points.&lt;a href="#_ftn90" name="_ftnref90"&gt;[90]&lt;/a&gt; However, as displaying a privacy notice is itself voluntary, it is uncertain whether companies would willingly adopt a standardized model. Moreover, with the app market burgeoning with innovation, a standard format may not cover all emergent data practices. &lt;/li&gt;
&lt;/ul&gt;
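&lt;p style="text-align: justify; "&gt;To illustrate the standardization point, the following Python sketch shows what a standardized, machine-readable notice covering these essential points might look like. The schema, field names, and URLs are hypothetical, invented for illustration; they do not reproduce P3P or any existing standard.&lt;/p&gt;

```python
import json

# Hypothetical sketch only: the schema, field names, and URLs below are
# invented for illustration and do not reproduce any actual standard.
standard_notice = {
    "data_collected": ["email", "location", "device_id"],
    "purposes": ["service_delivery", "analytics"],
    "third_party_sharing": {"unaffiliated": False, "affiliated": True},
    "consumer_choice": {"opt_out_url": "https://example.com/opt-out"},
    "access_and_correction": "https://example.com/my-data",
    "security_measures": ["administrative", "physical", "technical"],
    "enforcement": "independent third-party audit",
}

# Because every notice uses the same fields, it can be serialized for
# machine consumption and compared across companies field by field.
machine_readable = json.dumps(standard_notice, indent=2)

def summarize(notice):
    """Render a one-line short notice from the standardized fields."""
    shared = "shares" if notice["third_party_sharing"]["affiliated"] else "does not share"
    return (
        f"Collects {', '.join(notice['data_collected'])}; "
        f"{shared} data with affiliated companies; "
        f"opt out at {notice['consumer_choice']['opt_out_url']}."
    )

print(summarize(standard_notice))
```

&lt;p style="text-align: justify; "&gt;Because the fields are fixed, such notices could be compared across providers or summarized automatically, though, as noted above, a fixed schema may not cover all emergent data practices.&lt;/p&gt;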
&lt;h2 style="text-align: justify; "&gt;Comprehension&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;The FTC states that &lt;/b&gt; "the notice-and-choice model, as implemented, has led to long, incomprehensible privacy policies that consumers typically do not read, let alone 	understand. the question is not whether consumers should be given a say over unexpected uses of their data; rather, the question is how to provide 	simplified notice and choice"&lt;a href="#_ftn91" name="_ftnref91"&gt;[91]&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Notably, in a survey conducted by Zogby International, 93% of adults - and 81% of teens - indicated they would take more time to read terms and conditions 	for websites if they were written in clearer language.&lt;a href="#_ftn92" name="_ftnref92"&gt;[92]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Most privacy policies are in natural language format: companies explain their practices in prose. One noted disadvantage to current natural language 	policies is that companies can choose which information to present, which does not necessarily solve the problem of information asymmetry between companies and consumers. Further, companies use what have been termed "weasel words" - legalistic, ambiguous, or slanted phrases - to describe their practices	&lt;a href="#_ftn93" name="_ftnref93"&gt;[93]&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In a study by Aleecia M. McDonald and others&lt;a href="#_ftn94" name="_ftnref94"&gt;[94]&lt;/a&gt;, it was found that accuracy in what users comprehend span 	a wide range. An average of 91% of participants answered correctly when asked about cookies, 61% answered correctly about opt out links, 60% understood 	when their email address would be "shared" with a third party, and only 46% answered correctly regarding telemarketing. Participants found those questions 	harder which substituted vague or complicated terms to refer to practices such as telemarketing by "the information you provide may be used for marketing 	services." Overall accuracy was a mere 33%.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Problems&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;Natural language policies are often long and require college-level reading skills. Furthermore, there are no standards for which information is disclosed, 	no standard place to find particular information, and data practices are not described using consistent language. These policies are "long, complicated, 	and full of jargon and change frequently."&lt;a href="#_ftn95" name="_ftnref95"&gt;[95]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Kent Walker list five problems that privacy notices typically suffer from -&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;a) overkill - long and repetitive text in small print,&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;b) irrelevance - describing situations of little concern to most consumers,&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;c) opacity - broad terms the reflect the truth that is impossible to track and control all the information collected and stored,&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;d) non-comparability - simplification required to achieve comparability will lead to compromising accuracy, and&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;e) inflexibility - failure to keep pace with new business models. &lt;a href="#_ftn96" name="_ftnref96"&gt;[96]&lt;/a&gt;&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Recommendations&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;Researchers advocate a more succinct and simpler standard for privacy notices,&lt;a name="_ftnref34"&gt;&lt;/a&gt;&lt;a href="#_ftn97" name="_ftnref97"&gt;[97]&lt;/a&gt; such as representing the information in the form of a table. &lt;a href="#_ftn98" name="_ftnref98"&gt;[98]&lt;/a&gt; However, studies show only an insignificant improvement in the understanding by consumers when privacy policies are represented in graphic formats like tables and labels.	&lt;a href="#_ftn99" name="_ftnref99"&gt;[99]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;There are also recommendations to adopt a multi-layered approach where the relevant information is summarized through a short notice.&lt;a href="#_ftn100" name="_ftnref100"&gt;[100]&lt;/a&gt; This is backed by studies that consumers find layered policies easier to understand.	&lt;a href="#_ftn101" name="_ftnref101"&gt;[101]&lt;/a&gt; However they were less accurate in the layered format especially with parts that were not 	summarized. This suggests participants that did not continue to the full policy when the information they sought was not available on the short notice. 	Unless it is possible to identify all of the topics users care about and summarize to one page, the layered notice effectively hides information and reduces transparency. It has also been pointed out that it is impossible to convey complex data policies in simple and clear language.	&lt;a href="#_ftn102" name="_ftnref102"&gt;[102]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Consumers often struggle to map concepts such as third party access to the terms used in policies. This is also because companies with identical practices 	often convey different information, and these differences reflected in consumer's ability to understand the policies. These policies may need an 	educational component so readers understand what it means for a site to engage in a given practice&lt;a href="#_ftn103" name="_ftnref103"&gt;[103]&lt;/a&gt;. 	However it is unlikely that when readers fail to take time to read the policy that they will read up on additional educational components.&lt;/p&gt;
&lt;div style="text-align: justify; "&gt;
&lt;hr /&gt;
&lt;div id="ftn1"&gt;
&lt;p&gt;&lt;a href="#_ftnref1" name="_ftn1"&gt;[1]&lt;/a&gt; Amber Sinha http://cis-india.org/internet-governance/blog/a-critique-of-consent-in-information-privacy&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn2"&gt;
&lt;p&gt;&lt;a href="#_ftnref2" name="_ftn2"&gt;[2]&lt;/a&gt; Wang, &lt;i&gt;et al.&lt;/i&gt;, 1998) Milberg, &lt;i&gt;et al.&lt;/i&gt; (1995)&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn3"&gt;
&lt;p&gt;&lt;a href="#_ftnref3" name="_ftn3"&gt;[3]&lt;/a&gt; See e.g., White House, Consumer Privacy Bill of Rights (2012) 			http://www.whitehouse.gov/the-pressoffice/2012/02/23/we-can-t-wait-obama-administration-unveils-blueprint-privacy-bill-rights; Fed. Trade Comm'n, 			Protecting Consumer Privacy in an Era of Rapid Change: Recommendations for Business and Policy Makers (2012) 			http://www.ftc.gov/sites/default/files/documents/reports/federal-trade-commissionreport-protecting-consumer-privacy-era-rapid-change-recommendations/120326privacyreport.pdf.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn4"&gt;
&lt;p&gt;&lt;a href="#_ftnref4" name="_ftn4"&gt;[4]&lt;/a&gt; Fed. Trade Comm'n, Privacy Online: A Report to Congress 7 (June 1998), available at www.ftc.gov/reports/privacy3/priv-23a.pdf.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn5"&gt;
&lt;p&gt;&lt;a href="#_ftnref5" name="_ftn5"&gt;[5]&lt;/a&gt; &lt;a href="http://itlaw.wikia.com/wiki/U.S._Department_of_Commerce" title="U.S. Department of Commerce"&gt;U.S. Department of Commerce&lt;/a&gt; , &lt;a href="http://itlaw.wikia.com/wiki/Internet_Policy_Task_Force" title="Internet Policy Task Force"&gt;Internet Policy Task Force&lt;/a&gt;, 			&lt;a href="http://itlaw.wikia.com/wiki/Commercial_Data_Privacy_and_Innovation_in_the_Internet_Economy:_A_Dynamic_Policy_Framework" title="Commercial Data Privacy and Innovation in the Internet Economy: A Dynamic Policy Framework"&gt; Commercial Data Privacy and Innovation in the Internet Economy: A Dynamic Policy Framework &lt;/a&gt; 20 (Dec. 16, 2010) (&lt;a href="http://www.ntia.doc.gov/reports/2010/IPTF_Privacy_GreenPaper_12162010.pdf"&gt;full-text&lt;/a&gt;).&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn6"&gt;
&lt;p&gt;&lt;a href="#_ftnref6" name="_ftn6"&gt;[6]&lt;/a&gt; 389 U.S. 347 (1967).&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn7"&gt;
&lt;p&gt;&lt;a href="#_ftnref7" name="_ftn7"&gt;[7]&lt;/a&gt; Dow Chem. Co. v. United States, 476 U.S. 227, 241 (1986)&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn8"&gt;
&lt;p&gt;&lt;a href="#_ftnref8" name="_ftn8"&gt;[8]&lt;/a&gt; http://ir.lawnet.fordham.edu/cgi/viewcontent.cgi?article=1600&amp;amp;context=iplj&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn9"&gt;
&lt;p&gt;&lt;a href="#_ftnref9" name="_ftn9"&gt;[9]&lt;/a&gt; Dow Chem. Co. v. United States, 476 U.S. 227, 241 (1986)&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn10"&gt;
&lt;p&gt;&lt;a href="#_ftnref10" name="_ftn10"&gt;[10]&lt;/a&gt; Kyllo, 533 U.S. at 34 (―[T]he technology enabling human flight has exposed to public view (and hence, we have said, to official observation) 			uncovered portions of the house and its curtilage that once were private.‖).&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn11"&gt;
&lt;p&gt;&lt;a href="#_ftnref11" name="_ftn11"&gt;[11]&lt;/a&gt; Kyllo v. United States, 533 U.S. 27&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn12"&gt;
&lt;p&gt;&lt;a href="#_ftnref12" name="_ftn12"&gt;[12]&lt;/a&gt; See Katz, 389 U.S. at 352 (―But what he sought to exclude when he entered the booth was not the intruding eye-it was the uninvited ear. He 			did not shed his right to do so simply because he made his calls from a place where he might be seen.‖).&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn13"&gt;
&lt;p&gt;&lt;a href="#_ftnref13" name="_ftn13"&gt;[13]&lt;/a&gt; See United States v. Ahrndt, No. 08-468-KI, 2010 WL 3773994, at *4 (D. Or. Jan. 8, 2010).&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn14"&gt;
&lt;p&gt;&lt;a href="#_ftnref14" name="_ftn14"&gt;[14]&lt;/a&gt; In re DoubleClick Inc. Privacy Litig., 154 F. Supp. 2d 497 (S.D.N.Y. 2001).&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn15"&gt;
&lt;p&gt;&lt;a href="#_ftnref15" name="_ftn15"&gt;[15]&lt;/a&gt; http://ir.lawnet.fordham.edu/cgi/viewcontent.cgi?article=1600&amp;amp;context=iplj&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn16"&gt;
&lt;p&gt;&lt;a href="#_ftnref16" name="_ftn16"&gt;[16]&lt;/a&gt; See Michael A. Carrier, Against Cyberproperty, 22 BERKELEY TECH. L.J. 1485, 1486 (2007) (arguing against creating a right to exclude users from 			making electronic contact to their network as one that exceeds traditional property notions).&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn17"&gt;
&lt;p&gt;&lt;a href="#_ftnref17" name="_ftn17"&gt;[17]&lt;/a&gt; See M. Ryan Calo, Against Notice Skepticism in Privacy (and Elsewhere), 87 NOTRE DAME L. REV. 1027, 1049 (2012) (citing Paula J. Dalley, The Use 			and Misuse of Disclosure as a Regulatory System, 34 FLA. ST. U. L. REV. 1089, 1093 (2007) ("[D]isclosure schemes comport with the prevailing 			political philosophy in that disclosure preserves individual choice while avoiding direct governmental interference.")).&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn18"&gt;
&lt;p&gt;&lt;a href="#_ftnref18" name="_ftn18"&gt;[18]&lt;/a&gt; See Calo, supra note 10, at 1048; see also Omri Ben-Shahar &amp;amp; Carl E. Schneider, The Failure of Mandated Disclosure, 159 U. PA. L. REV. 647, 682 			(noting that notice "looks cheap" and "looks easy").&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn19"&gt;
&lt;p&gt;&lt;a href="#_ftnref19" name="_ftn19"&gt;[19]&lt;/a&gt; Mark MacCarthy, New Directions in Privacy: Disclosure, Unfairness and Externalities, 6 I/S J. L. &amp;amp; POL'Y FOR INFO. SOC'Y 425, 440 (2011) 			(citing M. Ryan Calo, A Hybrid Conception of Privacy Harm Draft-Privacy Law Scholars Conference 2010, p. 28).&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn20"&gt;
&lt;p&gt;&lt;a href="#_ftnref20" name="_ftn20"&gt;[20]&lt;/a&gt; Daniel J. Solove, Introduction: Privacy Self-Management and the Consent Dilemma, 126 HARV. L. REV. 1879, 1885 (2013) (citing Jon Leibowitz, Fed. 			Trade Comm'n, So Private, So Public: Individuals, the Internet &amp;amp; the Paradox of Behavioral Marketing, Remarks at the FTC Town Hall Meeting on 			Behavioral Advertising: Tracking, Targeting, &amp;amp; Technology (Nov. 1, 2007), available at 			http://www.ftc.gov/speeches/leibowitz/071031ehavior/pdf). Paul Ohm refers to these issues as "information-quality problems." See Paul Ohm, Branding 			Privacy, 97 MINN. L. REV. 907, 930 (2013). Daniel J. Solove refers to this as "the problem of the uninformed individual." See Solove, supra note 17&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn21"&gt;
&lt;p&gt;&lt;a href="#_ftnref21" name="_ftn21"&gt;[21]&lt;/a&gt; See Edward J. Janger &amp;amp; Paul M. Schwartz, The Gramm-Leach-Bliley Act, Information Privacy, and the Limits of Default Rules, 86 MINN. L. REV. 			1219, 1230 (2002) (stating that according to one survey, "only 0.5% of banking customers had exercised their opt-out rights").&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn22"&gt;
&lt;p&gt;&lt;a href="#_ftnref22" name="_ftn22"&gt;[22]&lt;/a&gt; See Amber Sinha A Critique of Consent in Information Privacy 			http://cis-india.org/internet-governance/blog/a-critique-of-consent-in-information-privacy&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn23"&gt;
&lt;p&gt;&lt;a href="#_ftnref23" name="_ftn23"&gt;[23]&lt;/a&gt; Leigh Shevchik, "Mobile App Industry to Reach Record Revenue in 2013," New Relic (blog), April 1, 2013, 			http://blog.newrelic.com/2013/04/01/mobile-apps-industry-to-reach-record-revenue-in-2013/.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn24"&gt;
&lt;p&gt;&lt;a href="#_ftnref24" name="_ftn24"&gt;[24]&lt;/a&gt; Jan Lauren Boyles, Aaron Smith, and Mary Madden, "Privacy and Data Management on Mobile Devices," Pew Internet &amp;amp; American Life Project, 			Washington, DC, September 5, 2012.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn25"&gt;
&lt;p&gt;&lt;a href="#_ftnref25" name="_ftn25"&gt;[25]&lt;/a&gt; http://www.aarp.org/content/dam/aarp/research/public_policy_institute/cons_prot/2014/improving-mobile-device-privacy-disclosures-AARP-ppi-cons-prot.pdf&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn26"&gt;
&lt;p&gt;&lt;a href="#_ftnref26" name="_ftn26"&gt;[26]&lt;/a&gt; "Mobile Apps for Kids: Disclosures Still Not Making the Grade," Federal Trade Commission, Washington, DC, December 2012&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn27"&gt;
&lt;p&gt;&lt;a href="#_ftnref27" name="_ftn27"&gt;[27]&lt;/a&gt; http://www.aarp.org/content/dam/aarp/research/public_policy_institute/cons_prot/2014/improving-mobile-device-privacy-disclosures-AARP-ppi-cons-prot.pdf&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn28"&gt;
&lt;p&gt;&lt;a href="#_ftnref28" name="_ftn28"&gt;[28]&lt;/a&gt; Linda Ackerman, "Mobile Health and Fitness Applications and Information Privacy," Privacy Rights Clearinghouse, San Diego, CA, July 15, 2013.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn29"&gt;
&lt;p&gt;&lt;a href="#_ftnref29" name="_ftn29"&gt;[29]&lt;/a&gt; Margaret Jane Radin, Humans, Computers, and Binding Commitment, 75 IND. L.J. 1125, 1126 (1999). 			&lt;a href="http://www.repository.law.indiana.edu/cgi/viewcontent.cgi?article=2199&amp;amp;context=ilj"&gt; http://www.repository.law.indiana.edu/cgi/viewcontent.cgi?article=2199&amp;amp;context=ilj &lt;/a&gt; &lt;b&gt; &lt;/b&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn30"&gt;
&lt;p&gt;&lt;a href="#_ftnref30" name="_ftn30"&gt;[30]&lt;/a&gt; William Aiello, Steven M. Bellovin, Matt Blaze, Ran Canetti, John Ioannidis, Angelos D. Keromytis, and Omer Reingold. Just fast keying: Key 			agreement in a hostile internet. ACM Trans. Inf. Syst. Secur., 7(2):242-273, 2004.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn31"&gt;
&lt;p&gt;&lt;a href="#_ftnref31" name="_ftn31"&gt;[31]&lt;/a&gt; Privacy By Design The 7 Foundational Principles by Anne Cavoukian https://www.ipc.on.ca/images/resources/7foundationalprinciples.pdf&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn32"&gt;
&lt;p&gt;&lt;a href="#_ftnref32" name="_ftn32"&gt;[32]&lt;/a&gt; G. Danezis, J. Domingo-Ferrer, M. Hansen, J.-H. Hoepman, D. Le M´etayer, R. Tirtea, and S. Schiffner. Privacy and Data Protection by Design - 			from policy to engineering. report, ENISA, Dec. 2014.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn33"&gt;
&lt;p&gt;&lt;a href="#_ftnref33" name="_ftn33"&gt;[33]&lt;/a&gt; G. Danezis, J. Domingo-Ferrer, M. Hansen, J.-H. Hoepman, D. Le M´etayer, R. Tirtea, and S. Schiffner. Privacy and Data Protection by Design - 			from policy to engineering. report, ENISA, Dec. 2014.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn34"&gt;
&lt;p&gt;&lt;a href="#_ftnref34" name="_ftn34"&gt;[34]&lt;/a&gt; G. Danezis, J. Domingo-Ferrer, M. Hansen, J.-H. Hoepman, D. Le M´etayer, R. Tirtea, and S. Schiffner. Privacy and Data Protection by Design - 			from policy to engineering. report, ENISA, Dec. 2014.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn35"&gt;
&lt;p&gt;&lt;a href="#_ftnref35" name="_ftn35"&gt;&lt;sup&gt;&lt;sup&gt;[35]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; John Frank Weaver, We Need to Pass Legislation on Artificial Intelligence Early and Often, SLATE FUTURE TENSE (Sept. 12, 			2014),http://www.slate.com/blogs/future_tense/2014/09/12/we_need_to_pass_artificial_intelligence_laws_early_and_often.html&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn36"&gt;
&lt;p&gt;&lt;a href="#_ftnref36" name="_ftn36"&gt;&lt;sup&gt;&lt;sup&gt;[36]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Margaret Jane Radin, Humans, Computers, and Binding Commitment, 75 IND. L.J. 1125, 1126 (1999).&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn37"&gt;
&lt;p&gt;&lt;a href="#_ftnref37" name="_ftn37"&gt;&lt;sup&gt;&lt;sup&gt;[37]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Richard Warner &amp;amp; Robert Sloan, Beyond Notice and Choice: Privacy, Norms, and Consent, J. High Tech. L. (2013). Available at: 			http://scholarship.kentlaw.iit.edu/fac_schol/568&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn38"&gt;
&lt;p&gt;&lt;a href="#_ftnref38" name="_ftn38"&gt;&lt;b&gt;&lt;sup&gt;&lt;b&gt;&lt;sup&gt;[38]&lt;/sup&gt;&lt;/b&gt;&lt;/sup&gt;&lt;/b&gt;&lt;/a&gt; &lt;a href="http://ssrn.com/abstract=1085333"&gt;&lt;b&gt;Engineering Privacy by Sarah Spiekermann, Lorrie Faith Cranor :: SSRN&lt;/b&gt;&lt;/a&gt; &lt;b&gt; &lt;/b&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn39"&gt;
&lt;p&gt;&lt;a href="#_ftnref39" name="_ftn39"&gt;[39]&lt;/a&gt; iOS Application Programming Guide: The Application Runtime Environment, APPLE, http://developer.apple.com/library/ 			ios/#documentation/iphone/conceptual/iphoneosprogrammingguide/RuntimeEnvironment /RuntimeEnvironment.html (last updated Feb. 24, 2011)&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn40"&gt;
&lt;p&gt;&lt;a href="#_ftnref40" name="_ftn40"&gt;[40]&lt;/a&gt; Security and Permissions, ANDROID DEVELOPERS, http://developer.android.com/guide/topics/security/security.html (last updated Sept. 13, 2011).&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn41"&gt;
&lt;p&gt;&lt;a href="#_ftnref41" name="_ftn41"&gt;[41]&lt;/a&gt; iOS Application Programming Guide: The Application Runtime Environment, APPLE, http://developer.apple.com/library/ 			ios/#documentation/iphone/conceptual/iphoneosprogrammingguide/RuntimeEnvironment /RuntimeEnvironment.html (last updated Feb. 24, 2011)&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn42"&gt;
&lt;p&gt;&lt;a href="#_ftnref42" name="_ftn42"&gt;[42]&lt;/a&gt; See Katherine Noyes, Why Android App Security is Better Than for the iPhone, PC WORLD BUS. CTR. (Aug. 6, 2010, 4:20 PM), 			http://www.pcworld.com/businesscenter/article/202758/why_android_app_security_is_be tter_than_for_the_iphone.html; see also About Permissions for 			Third-Party Applications, BLACKBERRY, http://docs.blackberry.com/en/smartphone_users/deliverables/22178/ 			About_permissions_for_third-party_apps_50_778147_11.jsp (last visited Sept. 29, 2011); Security and Permissions, supra note 76.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn43"&gt;
&lt;p&gt;&lt;a href="#_ftnref43" name="_ftn43"&gt;[43]&lt;/a&gt; Peter S. Vogel, A Worrisome Truth: Internet Privacy is Impossible, TECHNEWSWORLD (June 8, 2011, 5:00 AM), http://www.technewsworld.com/ 			story/72610.html.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn44"&gt;
&lt;p&gt;&lt;a href="#_ftnref44" name="_ftn44"&gt;[44]&lt;/a&gt; Privacy Policy, FOURSQUARE, http://foursquare.com/legal/privacy (last updated Jan. 12, 2011)&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn45"&gt;
&lt;p&gt;&lt;a href="#_ftnref45" name="_ftn45"&gt;[45]&lt;/a&gt; N. S. Good, J. Grossklags, D. K. Mulligan, and J. A. Konstan. Noticing Notice: A Large-scale Experiment on the Timing of Software License 			Agreements. In Proc. of CHI. ACM, 2007.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn46"&gt;
&lt;p&gt;&lt;a href="#_ftnref46" name="_ftn46"&gt;[46]&lt;/a&gt; I. Adjerid, A. Acquisti, L. Brandimarte, and G. Loewenstein. Sleights of Privacy: Framing, Disclosures, and the Limits of Transparency. In Proc. of 			SOUPS. ACM, 2013.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn47"&gt;
&lt;p&gt;&lt;a href="#_ftnref47" name="_ftn47"&gt;[47]&lt;/a&gt; http://delivery.acm.org/10.1145/2810000/2808119/p63-balebako.pdf?ip=106.51.36.200&amp;amp;id=2808119&amp;amp;acc=OA&amp;amp;key=4D4702B0C3E38B35%2E4D4702B0C3E38B35%2E4D4702B0C3E38B35%2E35B5BCE80D07AAD9&amp;amp;CFID=801296199&amp;amp;CFTOKEN=33661544&amp;amp;__acm__=1466052980_2f265a2442ea3394aa1ebab7e6449933&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn48"&gt;
&lt;p&gt;&lt;a href="#_ftnref48" name="_ftn48"&gt;[48]&lt;/a&gt; Microsoft. Privacy Guidelines for Developing Software Products and Services. Technical Report version 3.1, 2008.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn49"&gt;
&lt;p&gt;&lt;a href="#_ftnref49" name="_ftn49"&gt;[49]&lt;/a&gt; Microsoft. Privacy Guidelines for Developing Software Products and Services. Technical Report version 3.1, 2008.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn50"&gt;
&lt;p&gt;&lt;a href="#_ftnref50" name="_ftn50"&gt;[50]&lt;/a&gt; S. Egelman, J. Tsai, L. F. Cranor, and A. Acquisti. Timing is everything?: the effects of timing and placement of online privacy indicators. In 			Proc. CHI '09. ACM, 2009.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn51"&gt;
&lt;p&gt;&lt;a href="#_ftnref51" name="_ftn51"&gt;[51]&lt;/a&gt; R. B¨ohme and S. K¨opsell. Trained to accept?: A field experiment on consent dialogs. In Proc. CHI '10. ACM, 2010&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn52"&gt;
&lt;p&gt;&lt;a href="#_ftnref52" name="_ftn52"&gt;[52]&lt;/a&gt; N. S. Good, J. Grossklags, D. K. Mulligan, and J. A. Konstan. Noticing notice: a large-scale experiment on the timing of software license 			agreements. In Proc. CHI '07. ACM, 2007.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn53"&gt;
&lt;p&gt;&lt;a href="#_ftnref53" name="_ftn53"&gt;[53]&lt;/a&gt; N. S. Good, J. Grossklags, D. K. Mulligan, and J. A. Konstan. Noticing notice: a large-scale experiment on the timing of software license 			agreements. In Proc. CHI '07. ACM, 2007.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn54"&gt;
&lt;p&gt;&lt;a href="#_ftnref54" name="_ftn54"&gt;[54]&lt;/a&gt; Microsoft. Privacy Guidelines for Developing Software Products and Services. Technical Report version 3.1, 2008.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn55"&gt;
&lt;p&gt;&lt;a href="#_ftnref55" name="_ftn55"&gt;[55]&lt;/a&gt; A. Kobsa and M. Teltzrow. Contextualized communication of privacy practices and personalization benefits: Impacts on users' data sharing and 			purchase behavior. In Proc. PETS '05. Springer, 2005.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn56"&gt;
&lt;p&gt;&lt;a href="#_ftnref56" name="_ftn56"&gt;[56]&lt;/a&gt; F. Schaub, B. K¨onings, and M. Weber. Context-adaptive privacy: Leveraging context awareness to support privacy decision making. IEEE 			Pervasive Computing, 14(1):34-43, 2015.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn57"&gt;
&lt;p&gt;&lt;a href="#_ftnref57" name="_ftn57"&gt;[57]&lt;/a&gt; E. Choe, J. Jung, B. Lee, and K. Fisher. Nudging people away from privacy-invasive mobile apps through visual framing. In Proc. INTERACT '13. 			Springer, 2013.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn58"&gt;
&lt;p&gt;&lt;a href="#_ftnref58" name="_ftn58"&gt;[58]&lt;/a&gt; F. Schaub, B. K¨onings, and M. Weber. Context-adaptive privacy: Leveraging context awareness to support privacy decision making. IEEE 			Pervasive Computing, 14(1):34-43, 2015.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn59"&gt;
&lt;p&gt;&lt;a href="#_ftnref59" name="_ftn59"&gt;[59]&lt;/a&gt; Article 29 Data Protection Working Party. Opinion 8/2014 on the Recent Developments on the Internet of Things. WP 223, Sept. 2014.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn60"&gt;
&lt;p&gt;&lt;a href="#_ftnref60" name="_ftn60"&gt;[60]&lt;/a&gt; B. Anderson, A. Vance, B. Kirwan, E. D., and S. Howard. Users aren't (necessarily) lazy: Using NeuroIS to explain habituation to security warnings. 			In Proc. ICIS '14, 2014.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn61"&gt;
&lt;p&gt;&lt;a href="#_ftnref61" name="_ftn61"&gt;[61]&lt;/a&gt; B. Anderson, B. Kirwan, D. Eargle, S. Howard, and A. Vance. How polymorphic warnings reduce habituation in the brain - insights from an fMRI study. 			In Proc. CHI '15. ACM, 2015.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn62"&gt;
&lt;p&gt;&lt;a href="#_ftnref62" name="_ftn62"&gt;[62]&lt;/a&gt; M. S. Wogalter, V. C. Conzola, and T. L. Smith-Jackson. Research-based guidelines for warning design and evaluation. Applied Ergonomics, 16 USENIX 			Association 2015 Symposium on Usable Privacy and Security 17 33(3):219-230, 2002.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn63"&gt;
&lt;p&gt;&lt;a href="#_ftnref63" name="_ftn63"&gt;[63]&lt;/a&gt; L. F. Cranor, P. Guduru, and M. Arjula. User interfaces for privacy agents. ACM TOCHI, 13(2):135-178, 2006.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn64"&gt;
&lt;p&gt;&lt;a href="#_ftnref64" name="_ftn64"&gt;[64]&lt;/a&gt; R. S. Portnoff, L. N. Lee, S. Egelman, P. Mishra, D. Leung, and D. Wagner. Somebody's watching me? assessing the effectiveness of webcam indicator 			lights. In Proc. CHI '15, 2015&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn65"&gt;
&lt;p&gt;&lt;a href="#_ftnref65" name="_ftn65"&gt;[65]&lt;/a&gt; M. Langheinrich. Privacy by design - principles of privacy-aware ubiquitous systems. In Proc. UbiComp '01. Springer, 2001&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn66"&gt;
&lt;p&gt;&lt;a href="#_ftnref66" name="_ftn66"&gt;[66]&lt;/a&gt; Microsoft. Privacy Guidelines for Developing Software Products and Services. Technical Report version 3.1, 2008.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn67"&gt;
&lt;p&gt;&lt;a href="#_ftnref67" name="_ftn67"&gt;[67]&lt;/a&gt; The Impact of Timing on the Salience of Smartphone App Privacy Notices, Rebecca Balebako , Florian Schaub, Idris Adjerid , Alessandro Acquist 			,Lorrie Faith Cranor&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn68"&gt;
&lt;p&gt;&lt;a href="#_ftnref68" name="_ftn68"&gt;[68]&lt;/a&gt; R. Böhme and J. Grossklags. The Security Cost of Cheap User Interaction. In Workshop on New Security Paradigms, pages 67-82. ACM, 2011&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn69"&gt;
&lt;p&gt;&lt;a href="#_ftnref69" name="_ftn69"&gt;[69]&lt;/a&gt; A. Felt, S. Egelman, M. Finifter, D. Akhawe, and D. Wagner. How to Ask For Permission. HOTSEC 2012, 2012.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn70"&gt;
&lt;p&gt;&lt;a href="#_ftnref70" name="_ftn70"&gt;[70]&lt;/a&gt; A. Felt, S. Egelman, M. Finifter, D. Akhawe, and D. Wagner. How to Ask For Permission. HOTSEC 2012, 2012.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn71"&gt;
&lt;p&gt;&lt;a href="#_ftnref71" name="_ftn71"&gt;[71]&lt;/a&gt; Towards Context Adaptive Privacy Decisions in Ubiquitous Computing Florian Schaub∗ , Bastian Könings∗ , Michael Weber∗ , 			Frank Kargl† ∗ Institute of Media Informatics, Ulm University, Germany Email: { florian.schaub | bastian.koenings | michael.weber 			}@uni-ulm.d&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn72"&gt;
&lt;p&gt;&lt;a href="#_ftnref72" name="_ftn72"&gt;[72]&lt;/a&gt; M. Korzaan and N. Brooks, "Demystifying Personality and Privacy: An Empirical Investigation into Antecedents of Concerns for Information Privacy," 			Journal of Behavioral Studies in Business, pp. 1-17, 2009.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn73"&gt;
&lt;p&gt;&lt;a href="#_ftnref73" name="_ftn73"&gt;[73]&lt;/a&gt; B. Könings and F. Schaub, "Territorial Privacy in Ubiquitous Computing," in WONS'11. IEEE, 2011, pp. 104-108.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn74"&gt;
&lt;p&gt;&lt;a href="#_ftnref74" name="_ftn74"&gt;[74]&lt;/a&gt; The Cost of Reading Privacy Policies Aleecia M. McDonald and Lorrie Faith Cranor&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn75"&gt;
&lt;p&gt;&lt;a href="#_ftnref75" name="_ftn75"&gt;[75]&lt;/a&gt; 5 Federal Trade Commission, "Protecting Consumers in the Next Tech-ade: A Report by the Staff of the Federal Trade Commission," March 2008, 11, 			http://www.ftc.gov/os/2008/03/P064101tech.pdf.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn76"&gt;
&lt;p&gt;&lt;a href="#_ftnref76" name="_ftn76"&gt;[76]&lt;/a&gt; The Cost of Reading Privacy Policies Aleecia M. McDonald and Lorrie Faith Cranor&lt;/p&gt;
&lt;p&gt;I/S: A Journal of Law and Policy for the Information Society 2008 Privacy Year in Review issue http://www.is-journal.org/&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn77"&gt;
&lt;p&gt;&lt;a href="#_ftnref77" name="_ftn77"&gt;[77]&lt;/a&gt; IS YOUR INSEAM YOUR BIOMETRIC? Evaluating the Understandability of Mobile Privacy Notice Categories Rebecca Balebako, Richard Shay, and Lorrie 			Faith Cranor July 17, 2013 https://www.cylab.cmu.edu/files/pdfs/tech_reports/CMUCyLab13011.pdf&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn78"&gt;
&lt;p&gt;&lt;a href="#_ftnref78" name="_ftn78"&gt;[78]&lt;/a&gt; https://www.sba.gov/blogs/7-considerations-crafting-online-privacy-policy&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn79"&gt;
&lt;p&gt;&lt;a href="#_ftnref79" name="_ftn79"&gt;[79]&lt;/a&gt; https://www.cippguide.org&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn80"&gt;
&lt;p&gt;&lt;a href="#_ftnref80" name="_ftn80"&gt;[80]&lt;/a&gt; The Platform for Privacy Preferences Project, more commonly known as P3P was designed by the World Wide Web Consortium aka W3C in response to the 			increased use of the Internet for sales transactions and subsequent collection of personal information. P3P is a special protocol that allows a 			website's policies to be machine readable, granting web users' greater control over the use and disclosure of their information while browsing the 			internet.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn81"&gt;
&lt;p&gt;&lt;a href="#_ftnref81" name="_ftn81"&gt;[81]&lt;/a&gt; Security and Permissions, ANDROID DEVELOPERS, http://developer.android.com/guide/topics/security/security.html (last updated Sept. 13, 2011).&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn82"&gt;
&lt;p&gt;&lt;a href="#_ftnref82" name="_ftn82"&gt;[82]&lt;/a&gt; See Foursqaure Privacy Policy&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn83"&gt;
&lt;p&gt;&lt;a href="#_ftnref83" name="_ftn83"&gt;[83]&lt;/a&gt; http://ir.lawnet.fordham.edu/cgi/viewcontent.cgi?article=1600&amp;amp;context=iplj&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn84"&gt;
&lt;p&gt;&lt;a href="#_ftnref84" name="_ftn84"&gt;[84]&lt;/a&gt; Privacy Policy, FOURSQUARE, http://foursquare.com/legal/privacy (last updated Jan. 12, 2011)&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn85"&gt;
&lt;p&gt;&lt;a href="#_ftnref85" name="_ftn85"&gt;[85]&lt;/a&gt; Bees and Pollen, "Bees and Pollen Personalization Platform," http://www.beesandpollen.com/TheProduct. aspx; Bees and Pollen, "Sense6-Social Casino 			Games Personalization Solution," http://www.beesandpollen. com/sense6.aspx; Bees and Pollen, "About Us," http://www.beesandpollen.com/About.aspx.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn86"&gt;
&lt;p&gt;&lt;a href="#_ftnref86" name="_ftn86"&gt;[86]&lt;/a&gt; CFA on the NTIA Short Form Notice Code of Conduct to Promote Transparency in Mobile Applications July 26, 2013 | Press Release&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn87"&gt;
&lt;p&gt;&lt;a href="#_ftnref87" name="_ftn87"&gt;[87]&lt;/a&gt; P. M. Schwartz and D. Solove. Notice &amp;amp; Choice. In The Second NPLAN/BMSG Meeting on Digital Media and Marketing to Children, 2009.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn88"&gt;
&lt;p&gt;&lt;a href="#_ftnref88" name="_ftn88"&gt;[88]&lt;/a&gt; F. Cate. The Limits of Notice and Choice. IEEE Security Privacy, 8(2):59-62, Mar. 2010.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn89"&gt;
&lt;p&gt;&lt;a href="#_ftnref89" name="_ftn89"&gt;[89]&lt;/a&gt; https://www.cippguide.org/2011/08/09/components-of-a-privacy-policy/&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn90"&gt;
&lt;p&gt;&lt;a href="#_ftnref90" name="_ftn90"&gt;[90]&lt;/a&gt; https://www.ftc.gov/public-statements/2001/07/case-standardization-privacy-policy-formats&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn91"&gt;
&lt;p&gt;&lt;a href="#_ftnref91" name="_ftn91"&gt;[91]&lt;/a&gt; Protecting Consumer Privacy in an Era of Rapid Change. Preliminary FTC Staff Report.December 2010&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn92"&gt;
&lt;p&gt;&lt;a href="#_ftnref92" name="_ftn92"&gt;[92]&lt;/a&gt; . See Comment of Common Sense Media, cmt. #00457, at 1.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn93"&gt;
&lt;p&gt;&lt;a href="#_ftnref93" name="_ftn93"&gt;[93]&lt;/a&gt; Pollach, I. What's wrong with online privacy policies? Communications of the ACM 30, 5 (September 2007), 103-108&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn94"&gt;
&lt;p&gt;&lt;a href="#_ftnref94" name="_ftn94"&gt;[94]&lt;/a&gt; A Comparative Study of Online Privacy Policies and Formats Aleecia M. McDonald,1 Robert W. Reeder,2 Patrick Gage Kelley, 1 Lorrie Faith Cranor1 1 			Carnegie Mellon, Pittsburgh, PA 2 Microsoft, Redmond, WA&lt;/p&gt;
&lt;p&gt;http://lorrie.cranor.org/pubs/authors-version-PETS-formats.pdf&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn95"&gt;
&lt;p&gt;&lt;a href="#_ftnref95" name="_ftn95"&gt;[95]&lt;/a&gt; Amber Sinha Critique&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn96"&gt;
&lt;p&gt;&lt;a href="#_ftnref96" name="_ftn96"&gt;[96]&lt;/a&gt; Kent Walker, The Costs of Privacy, 2001 available at 			&lt;a href="https://www.questia.com/library/journal/1G1-84436409/the-costs-of-privacy"&gt; https://www.questia.com/library/journal/1G1-84436409/the-costs-of-privacy &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn97"&gt;
&lt;p&gt;&lt;a href="#_ftnref97" name="_ftn97"&gt;[97]&lt;/a&gt; Annie I. Anton et al., Financial Privacy Policies and the Need for Standardization, 2004 available at			&lt;a href="https://ssl.lu.usi.ch/entityws/Allegati/pdf_pub1430.pdf"&gt;https://ssl.lu.usi.ch/entityws/Allegati/pdf_pub1430.pdf&lt;/a&gt;; Florian Schaub, R. 			Balebako et al, "A Design Space for effective privacy notices" available at 			&lt;a href="https://www.usenix.org/system/files/conference/soups2015/soups15-paper-schaub.pdf"&gt; https://www.usenix.org/system/files/conference/soups2015/soups15-paper-schaub.pdf &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn98"&gt;
&lt;p&gt;&lt;a href="#_ftnref98" name="_ftn98"&gt;[98]&lt;/a&gt; Allen Levy and Manoj Hastak, Consumer Comprehension of Financial Privacy Notices, Interagency Notice Project, available at			&lt;a href="https://www.sec.gov/comments/s7-09-07/s70907-21-levy.pdf"&gt;https://www.sec.gov/comments/s7-09-07/s70907-21-levy.pdf&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn99"&gt;
&lt;p&gt;&lt;a href="#_ftnref99" name="_ftn99"&gt;[99]&lt;/a&gt; Patrick Gage Kelly et al., Standardizing Privacy Notices: An Online Study of the Nutrition Label Approach available at 			&lt;a href="https://www.ftc.gov/sites/default/files/documents/public_comments/privacy-roundtables-comment-project-no.p095416-544506-00037/544506-00037.pdf"&gt; https://www.ftc.gov/sites/default/files/documents/public_comments/privacy-roundtables-comment-project-no.p095416-544506-00037/544506-00037.pdf &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn100"&gt;
&lt;p&gt;&lt;a href="#_ftnref100" name="_ftn100"&gt;[100]&lt;/a&gt; The Center for Information Policy Leadership, Hunton &amp;amp; Williams LLP, "Ten Steps To Develop A Multi-Layered Privacy Notice" available at 			&lt;a href="https://www.informationpolicycentre.com/files/Uploads/Documents/Centre/Ten_Steps_whitepaper.pdf"&gt; https://www.informationpolicycentre.com/files/Uploads/Documents/Centre/Ten_Steps_whitepaper.pdf &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn101"&gt;
&lt;p&gt;&lt;a href="#_ftnref101" name="_ftn101"&gt;[101]&lt;/a&gt; A Comparative Study of Online Privacy Policies and Formats Aleecia M. McDonald,1 Robert W. Reeder,2 Patrick Gage Kelley, 1 Lorrie Faith Cranor1 1 			Carnegie Mellon, Pittsburgh, PA 2 Microsoft, Redmond, WA&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn102"&gt;
&lt;p&gt;&lt;a href="#_ftnref102" name="_ftn102"&gt;[102]&lt;/a&gt; Howard Latin, "Good" Warnings, Bad Products, and Cognitive Limitations, 41 UCLA Law Review available at 			&lt;a href="https://litigation-essentials.lexisnexis.com/webcd/app?action=DocumentDisplay&amp;amp;crawlid=1&amp;amp;srctype=smi&amp;amp;srcid=3B15&amp;amp;doctype=cite&amp;amp;docid=41+UCLA+L.+Rev.+1193&amp;amp;key=1c15e064a97759f3f03fb51db62a79a5"&gt; https://litigation-essentials.lexisnexis.com/webcd/app?action=DocumentDisplay&amp;amp;crawlid=1&amp;amp;srctype=smi&amp;amp;srcid=3B15&amp;amp;doctype=cite&amp;amp;docid=41+UCLA+L.+Rev.+1193&amp;amp;key=1c15e064a97759f3f03fb51db62a79a5 &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn103"&gt;
&lt;p&gt;&lt;a href="#_ftnref103" name="_ftn103"&gt;[103]&lt;/a&gt; Report by Kleimann Communication Group for the FTC. Evolution of a prototype financial privacy notice, 2006. http://www.ftc.gov/privacy/ 			privacyinitiatives/ftcfinalreport060228.pdf Accessed 2 Mar 2007&lt;/p&gt;
&lt;/div&gt;
&lt;/div&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/enlarging-the-small-print'&gt;https://cis-india.org/internet-governance/blog/enlarging-the-small-print&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Meera Manoj</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2016-12-14T16:27:54Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/dna-september-23-2015-amrita-madhukalya-encryption-policy-would-have-affected-emails-operating-systems-wifi">
    <title>Encryption policy would have affected emails, operating systems, WiFi</title>
    <link>https://cis-india.org/internet-governance/news/dna-september-23-2015-amrita-madhukalya-encryption-policy-would-have-affected-emails-operating-systems-wifi</link>
    <description>
        &lt;b&gt;"Our email data would have to be stored. If we connect to a WiFi, that data would have to be stored, and that's plain ridiculous. There is a problem when the government tries to target citizens to ensure national security," said Pranesh Prakash, policy director at the Bangalore-based Centre for Internet and Society.&lt;/b&gt;
        &lt;p&gt;The article by Amrita Madhukalya was published in &lt;a class="external-link" href="http://www.dnaindia.com/india/report-encryption-policy-would-have-affected-emails-operating-systems-wifi-2127715"&gt;DNA&lt;/a&gt; on September 23, 2015.&lt;/p&gt;
&lt;hr /&gt;
&lt;p&gt;The &lt;a href="http://www.dnaindia.com/topic/draft-national-policy"&gt;Draft National Policy&lt;/a&gt; on Encryption, withdrawn by the Department of Electronics and  Information Technology (DeiTY) after it created a furore on privacy  issues, would have had allowed the government access to any form of  digital data that required encryption. Not limited to just WhatsApp or  Viber data, it would have affected email services, WiFi, phone operating  systems, etc.&lt;/p&gt;
&lt;p&gt;"Our email data would have to be stored. If we connect to a WiFi,  that data would have to be stored, and that's plain ridiculous. There is  a problem when the government tries to target citizens to ensure  national security," said Pranesh Prakash, policy director at the  Bangalore-based Centre for Internet and Society.&lt;/p&gt;
&lt;p&gt;The government, criticised heavily for the policy, withdrew it on  Tuesday afternoon. It said that a new policy will be brought in its  place.&lt;/p&gt;
&lt;p&gt;Nikhil Pahwa of internet watchdog Medianama said that data about normal day-to-day activities would have to be stored if the policy were implemented. "The policy would have affected everyday business-to-consumer data.&lt;br /&gt; This would mean that if a doctor or lawyer had your data digitised, it would be open to access, and would have to be kept for at least 90 days," said Pahwa.&lt;/p&gt;
&lt;p&gt;However, he added that robust encryption is needed. "It is believed that companies like Google and &lt;a href="http://www.dnaindia.com/topic/facebook"&gt;Facebook&lt;/a&gt; allow the NSA to access user data in the US, putting our personal security, and national security at large, at risk," said Pahwa.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/dna-september-23-2015-amrita-madhukalya-encryption-policy-would-have-affected-emails-operating-systems-wifi'&gt;https://cis-india.org/internet-governance/news/dna-september-23-2015-amrita-madhukalya-encryption-policy-would-have-affected-emails-operating-systems-wifi&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>praskrishna</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>IT Act</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    

   <dc:date>2015-09-25T01:23:10Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/encryption-and-anonymity-rights-and-risks">
    <title>Encryption and Anonymity: Rights and Risks</title>
    <link>https://cis-india.org/internet-governance/news/encryption-and-anonymity-rights-and-risks</link>
    <description>
        &lt;b&gt;Internet Governance Forum (IGF) 2015 will be held in João Pessoa, Brazil from November 10 to 13, 2015. The theme of IGF 2015 is Evolution of Internet Governance: Empowering Sustainable Development. ARTICLE 19 and Privacy International are organizing a workshop on Encryption and Anonymity on November 12, 2015. Pranesh Prakash is a speaker.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;This was published on the &lt;a class="external-link" href="https://www.intgovforum.org/cms/wks2015/index.php/proposal/view_public/155"&gt;IGF website&lt;/a&gt;.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;Encryption and anonymity are two key aspects of the right to privacy and  free expression online. From real-name registration in Iran to the UK  Prime Minister's calls for Internet backdoors to encrypted  communications, however, the protection of encrypted and anonymous  speech is increasingly under threat. Recognising these challenges, the  UN Special Rapporteur on freedom of expression, David Kaye, presented a  report to the Human Rights Council in June 2015 which highlighted the  need for greater protection of encryption and anonymity.&lt;br /&gt; &lt;br /&gt; Five months on from the Special Rapporteur’s report, the participants in  this roundtable will discuss his recommendations and the latest  challenges to the protection of anonymity and encryption. For example,  how can law enforcement demands be met while ensuring that individuals  still enjoy strong encryption and unfettered access to anonymity tools?  What steps should governments, civil society, individuals and the  private sector take to avoid the legal and technological fragmentation  of a tool now vital to expression and communication? How can individuals  protect themselves from mass surveillance in the digital age?&lt;br /&gt; &lt;br /&gt; At the end of the session, the participants should have identified areas  for future advocacy both at the international and domestic levels as  well as areas for further research for the protection of anonymity and  encryption on the Internet.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Agenda&lt;/h3&gt;
&lt;ol&gt;
&lt;li&gt;Moderator welcomes speakers and audience.&lt;/li&gt;
&lt;li&gt;Outline of key issues on encryption and anonymity, including summary of the UN Special Rapporteur's report.&lt;/li&gt;
&lt;li&gt;Each speaker speaks for 5-7 minutes, giving their perspective on the issues.&lt;/li&gt;
&lt;li&gt;Questions from participants, including remote participation via Twitter.&lt;/li&gt;
&lt;li&gt;Conclusion and steps for further action.&lt;/li&gt;
&lt;/ol&gt; 
&lt;hr /&gt;
&lt;h2&gt;About IGF 2015&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;Internet Governance Forum (IGF) is a multistakeholder, democratic and transparent forum which facilitates discussions on public policy issues related to key elements of Internet governance. IGF provides enabling platform for discussions among all stakeholders in the Internet governance ecosystem, including all entities accredited by the World Summit on the Information Society (WSIS), as well as other institutions and individuals with proven expertise and experience in all matters related to Internet governance.&lt;br /&gt;&lt;br /&gt;After consulting the wider Internet community and discussing the overarching theme of the 2015 IGF meeting, the Multistakeholder Advisory Group decided to retain the title “Evolution of Internet Governance: Empowering Sustainable Development”. This theme will be supported by eight sub-themes that will frame the discussions at the João Pessoa meeting.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/encryption-and-anonymity-rights-and-risks'&gt;https://cis-india.org/internet-governance/news/encryption-and-anonymity-rights-and-risks&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>praskrishna</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance Forum</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    

   <dc:date>2015-10-27T02:37:45Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/enabling-multi-stakeholder-cooperation-towards-a-transnational-framework-for-due-process">
    <title>Enabling Multi-stakeholder Cooperation - Towards a Transnational Framework for Due Process</title>
    <link>https://cis-india.org/internet-governance/news/enabling-multi-stakeholder-cooperation-towards-a-transnational-framework-for-due-process</link>
    <description>
        &lt;b&gt;The Internet &amp;amp; Jurisdiction Project organized a meeting of its global multi-stakeholder dialogue process on October 8-9, 2015 in Berlin, Germany. Sunil Abraham participated in this meeting.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;Since 2012, the Internet &amp;amp;  Jurisdiction Project has facilitated a global multi-stakeholder dialogue  process to address the tension between the cross-border nature of the  Internet and geographically defined national jurisdictions. It provides a  neutral platform for states, business, civil society and international  organizations to discuss the elaboration of a transnational due process  framework to handle the digital coexistence of diverse national laws in  shared cross-border online spaces. This pioneering multi-stakeholder  cooperation effort seeks to develop a “policy standard” for  transnational requests for domain seizures, content takedowns and access  to subscriber information.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;2015 is an important moment that  determines the future of the multi-stakeholder model in global Internet  Governance. The Internet &amp;amp; Jurisdiction Project hopes it can provide  an opportunity to demonstrate that multi-stakeholder cooperation can  produce operational solutions to concrete policy challenges that no  stakeholder group can solve on its own.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The meeting will gather key actors from  states, Internet companies, technical operators, civil society, academia  and international organizations.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;&lt;a class="external-link" href="http://www.internetjurisdiction.net/ij-project-multi-stakeholder-meeting-2015/"&gt;This was published on the website of Internet &amp;amp; Jurisdiction Project&lt;/a&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a href="https://cis-india.org/internet-governance/blog/agenda-of-i-j-meeting-in-berlin" class="external-link"&gt;Agenda of the I&amp;amp;J Meeting&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://cis-india.org/internet-governance/blog/participants-of-i-j-meeting-in-berlin" class="external-link"&gt;List of Participants&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/enabling-multi-stakeholder-cooperation-towards-a-transnational-framework-for-due-process'&gt;https://cis-india.org/internet-governance/news/enabling-multi-stakeholder-cooperation-towards-a-transnational-framework-for-due-process&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>praskrishna</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    

   <dc:date>2015-10-14T02:53:07Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/en-inde-le-biometrique-version-tres-grand-public">
    <title>En Inde, le biométrique version très grand public </title>
    <link>https://cis-india.org/internet-governance/news/en-inde-le-biometrique-version-tres-grand-public</link>
    <description>
        &lt;b&gt;Launched in 2010, Aadhaar is now the largest fingerprint and iris database in the world. An identity card intended for India's 1.25 billion people, it also serves as a means of payment. But the security of the system and its use for surveillance purposes raise questions.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article was &lt;a class="external-link" href="http://www.liberation.fr/futurs/2017/04/27/en-inde-le-biometrique-version-tres-grand-public_1565815"&gt;published by Liberation&lt;/a&gt; on April 27, 2017. Sunil Abraham was quoted.&lt;/p&gt;
&lt;hr style="text-align: justify; " /&gt;
&lt;p style="text-align: justify; "&gt;Le front barré d’un signe religieux hindou rouge, Vivek  Kumar se tient droit derrière le comptoir de son étroite papeterie  située dans une allée obscure d’un quartier populaire du sud-est de New  Delhi. Sous le regard bienveillant d’une idole de Ganesh - le dieu qui  efface les obstacles -, le commerçant à la fine moustache et à la  chemise bleu-gris au col Nehru réalise des photocopies, fournit des  tampons ou des stylos à des dizaines de chalands.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Gaurav, un vendeur de légumes de la halle d’à côté, entre  acheter du crédit de communication mobile. Au moment de payer, il sort  son portefeuille, mais pas pour chercher de la monnaie. Il y prend sa  carte d’identité Aadhaar et fournit ses douze chiffres au commerçant.  Qui les entre dans un smartphone, sélectionne la banque de Gaurav et  indique le montant de l’achat. Le client n’a plus qu’à poser son pouce  sur un lecteur biométrique relié au combiné, connecté à Internet. Une  lumière rouge s’allume et un son retentit : la transaction est bien  passée.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Depuis mars, 32 banques indiennes fournissent ce service  novateur de paiement par empreinte digitale. Appelé Aadhaar Pay, il  utilise les informations biométriques, à savoir les dix empreintes  digitales et celle de l’iris, recueillies par le gouvernement depuis  septembre 2010 pour créer la première carte d’identité du pays. Toute  personne résidant en Inde depuis plus de six mois, y compris les  étrangers, peut s’inscrire et l’obtenir gratuitement.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;«Renverser le système»&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;L’Aadhaar («la fondation» en hindi) représente aujourd’hui  la plus grande base de données biométriques au monde, avec 1,13 milliard  de personnes enregistrées sur 1,25 milliard, soit 99 % de la population  adulte indienne.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;L’objectif initial était double : identifier la population -  10% des Indiens n’avaient jusqu’ici aucun papier, et donc aucun droit -  et se servir de ces moyens biométriques pour sécuriser l’attribution de  nombreuses subventions alimentaires ou énergétiques, dont le  détournement coûte plusieurs milliards d’euros chaque année à l’Etat  fédéral.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;A partir de 2014, la nouvelle majorité nationaliste hindoue  du BJP a étendu les usages de l’Aadhaar pour transformer cet outil de  reconnaissance en un vrai «passe-partout» de la vie quotidienne indienne  : depuis l’ouverture d’une ligne téléphonique à la déclaration de ses  impôts, en passant surtout par la création d’un compte en banque, le  numéro Aadhaar sera à présent requis. Dans ce dernier cas, l’Aadhaar  permet en prime d’utiliser le paiement bancaire par biométrie pour  réduire le recours au liquide, qui représente encore plus de 90 % des  transactions dans le pays.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Le Premier ministre, Narendra Modi, a fait de cette  inclusion financière l’un de ses principaux chevaux de bataille :  en 2014, son gouvernement a lancé un énorme programme qui a permis la  création de 213 millions de comptes bancaires en deux ans - aujourd’hui,  quasiment tous les foyers en possèdent au moins un. Il a continué dans  cette voie énergique en démonétisant, en novembre, les principales  coupures. But de la manœuvre : convaincre les Indiens de se défaire, au  moins temporairement, de leur dépendance aux billets marqués de la tête  de Gandhi.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;i&gt;«Le liquide est gratuit, donc il est difficile de pousser les gens à utiliser d’autres moyens de paiement,&lt;/i&gt; explique Ragavan Venkatesan, responsable des paiements numériques à la  banque IDFC, pionnière dans l’utilisation de l’Aadhaar Pay. &lt;i&gt;Nous avons donc renversé le système pour que le commerçant soit incité à utiliser les moyens numériques.»&lt;/i&gt; L’établissement financier a d’abord développé le &lt;i&gt;«microdistributeur de billets»&lt;/i&gt; : une tablette que le vendeur peut utiliser pour créer des comptes,  recevoir des petits dépôts ou fournir du liquide aux clients au nom de  la banque, contre une commission. Comme l’Aadhaar Pay, cette tablette se  connecte au lecteur biométrique - fourni par l’entreprise française  Safran - pour l’identification et l’authentification. Dans les deux cas,  et à la différence des paiements par carte, ni le marchand ni le client  ne paient pour l’utilisation de ce réseau. &lt;i&gt;«Le mode traditionnel de paiement par carte va progressivement disparaître»,&lt;/i&gt; prédit Ragavan Venkatesan.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Défi&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Pour l’instant, le système n’en est toutefois qu’à ses  débuts. Environ 70 banques - une minorité du réseau indien - sont  reliées à l’Aadhaar Pay, et lors de nos visites dans différents magasins  de New Delhi, une transaction a été bloquée pendant dix minutes à cause  d’un problème de serveur. La connectivité est d’ailleurs un défi dans  un pays dont la population est en majorité rurale : le système nécessite  au minimum le réseau 2G, dont sont dépourvus environ 8 % des villages,  selon le ministère des Télécommunications.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Mais c’est la protection du système qui est surtout en question : &lt;i&gt;«La  biométrie réduit fortement le niveau de sécurité, car c’est facile de  voler ces données et de les utiliser sans votre accord,&lt;/i&gt; explique Sunil Abraham, directeur du Centre pour l’Internet et la société de Bangalore. &lt;i&gt;Il  existe maintenant des appareils photo de haute résolution qui  permettent de capturer et de répliquer les empreintes ou l’iris»&lt;/i&gt;, affirme ce spécialiste en cybersécurité.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Le problème tient au caractère irrévocable de ces données  biométriques. A la différence d’une carte bancaire qu’on peut annuler et  remplacer, on ne peut changer d’empreinte ou d’iris. L’Autorité  indienne d’identification unique (UIDAI), qui gère l’Aadhaar, prévoit  bien que l’on puisse bloquer l’utilisation de ses propres données  biométriques sur demande, ce qui offre une solution de sécurisation  temporaire. &lt;i&gt;«Si un fraudeur essaie de les utiliser, on peut le repérer&lt;/i&gt; [grâce au réseau internet, ndlr] &lt;i&gt;et l’arrêter»,&lt;/i&gt; défend Ragavan Venkatesan, de la banque IDFC.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Mais cela risque de ne pas suffire en cas de recel de ces  informations : la police vient d’interpeller un groupe de trafiquants  qui étaient en possession des données bancaires de 10 millions  d’Indiens, récupérées à travers des employés et sous-traitants, données  qu’ils revendaient par paquets. Une femme âgée s’était déjà fait dérober  146 000 roupies (un peu plus de 2 000 euros) à cause de cette fraude.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Outil idéal&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Le directeur de l’UIDAI assure qu’aucune fuite ni vol de  données n’ont été rapportés à ce jour depuis leurs serveurs - ce qui ne  garantit pas que cette confidentialité sera respectée par tous les  autres acteurs qui y ont accès. En février, un chercheur en  cybersécurité a alerté la police sur le fait que 500 000 numéros Aadhaar  ainsi que les détails personnels de leurs propriétaires - exclusivement  des mineurs - avaient été publiés en ligne. La loi sur l’Aadhaar punit  de trois ans de prison le vol ou le recel de ces données. Ce texte  adopté l’année dernière - soit six ans après le début de la collecte -  empêche également leur utilisation à d’autres fins que  l’authentification pour l’attribution de subventions et de services. Et  l’UIDAI ne peut y accéder pleinement qu’en cas de risque pour la  sécurité nationale, et selon une procédure spéciale.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Reste qu’il n’existe pas d’autorité, comme la Cnil en France&lt;i&gt;,&lt;/i&gt; chargée de veiller de manière indépendante à ce que ces lignes rouges  ne soient pas franchies par un Etat à la recherche de nouveaux moyens de  renseignement. Car les experts s’accordent sur ce point : le  biométrique est un outil idéal pour surveiller une population.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;En 2010, le gouvernement britannique avait d’ailleurs mis  fin à son projet de carte d’identité biométrique, estimant que le taux  d’erreurs dans l’authentification était trop élevé et le risque  d’atteinte aux libertés trop important. Les Indiens, souvent subjugués  par les nouvelles technologies pour résoudre leurs problèmes sociaux, ne  semblent pas prêts de revenir en arrière. Surtout si cela peut en plus  servir à mieux ficher un pays menacé par un terrorisme régional et  local.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/en-inde-le-biometrique-version-tres-grand-public'&gt;https://cis-india.org/internet-governance/news/en-inde-le-biometrique-version-tres-grand-public&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>praskrishna</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Aadhaar</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2017-05-03T16:27:23Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/empowering-the-next-billion-by-improving-accessibility">
    <title>Empowering the next billion by improving accessibility</title>
    <link>https://cis-india.org/internet-governance/news/empowering-the-next-billion-by-improving-accessibility</link>
    <description>
        &lt;b&gt;Internet Governance Forum (IGF) 2015 will be held in João Pessoa, Brazil from November 10 to 13, 2015. The theme of IGF 2015 is Evolution of Internet Governance: Empowering Sustainable Development. On Friday, November 13, 2015, the Dynamic Coalition on Accessibility and Disability and the Global Initiative for Inclusive ICTs (G3ICT) are organizing this workshop. Sunil Abraham is a panelist. Pranesh Prakash will be taking part in the discussions.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;While considerable attention is given to the availability of the  communication infrastructure to expand usage of the Internet, little  attention has been given to the accessibility barriers which prevent  over one billion potential users to benefit from the Internet, including  for essential services. Those barriers affect persons living with a  variety of sensorial or physical disabilities as well as illiterate  individuals who may benefit from the same solutions designed for persons  with disabilities. &lt;br /&gt;&lt;br /&gt;This session will examine the technological  and programmatic solutions available today for an effective removal of  such barriers, potentially bringing a considerable number of new users  to the Internet. Examples in Education, Emergency services, Assistive  Technologies for work and independent living in a variety of economic  and geographic environments will be covered. The session will also  provide a detailed benchmark and statistical overview of the progress  made by countries around the world in implementing those solutions. A  general discussion with government, industry and persons with  disabilities representatives will ensue.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Read more on the &lt;a class="external-link" href="https://www.intgovforum.org/cms/wks2015/index.php/proposal/view_public/253"&gt;IGF website here&lt;/a&gt;. List of attendees &lt;a class="external-link" href="https://igf2015.sched.org/directory/attendees/2#.Vj4EjV58hQo"&gt;here&lt;/a&gt;.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/empowering-the-next-billion-by-improving-accessibility'&gt;https://cis-india.org/internet-governance/news/empowering-the-next-billion-by-improving-accessibility&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>praskrishna</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance Forum</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    

   <dc:date>2015-11-07T14:04:57Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/news/individuals-in-search-of-society">
    <title>Empires: Individuals in Search of Society</title>
    <link>https://cis-india.org/news/individuals-in-search-of-society</link>
    <description>
        &lt;b&gt;In their 2000 bestseller Empire, Michael Hardt and Toni Negri announced a new international condition, no longer built on the imperialist model of the superpowers of old but on the new condition of globalization. This new and emerging networked world held within it the opportunity for politics to bring forward a 21st century of interconnectedness, openness, and a shared sense of planetary responsibility.&lt;/b&gt;
        
&lt;p&gt;&lt;a class="external-link" href="http://huff.to/MrvSbG"&gt;This article by Marc Lafia was published in Huffington Post on May 18, 2012&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;What we've discovered since is that the new empire still plays by the games of the old empires: of nation states, of divisiveness, of scarcity, of might, control and fear, even while we have never had such enormous abundance and innovation.&lt;br /&gt;&lt;br /&gt;It is this paradox that Empires -- our documentary film and online project, currently raising funds through Kickstarter -- sets out to unravel. The title works on multiple levels. It says that the nationalist empires are back. It also suggests that the empires of law, money, science, speed, nation states, and food are, in fact, complex networks that are inter-related and interdependent.&lt;br /&gt;&lt;br /&gt;It is said that you know there is a network when you're excluded from it.&lt;br /&gt;&lt;br /&gt;To be included is to have a voice, to participate, to have agency. These things drive the histories of political and philosophical thought. They are not abstract concepts but the very real struggles of networked relations, of powers, peoples, flows of energies and technologies.&lt;br /&gt;&lt;br /&gt;How these networks work and how they interact is what Empires sets out to explicate.&lt;br /&gt;&lt;br /&gt;We've sat down with an extraordinary group of historians, scientists, network technologists, sociologists, political organizers and artists to construct a conversation that describes the forces that shape our contemporary world. The list includes Manuel Delanda, Saskia Sassen, Florian Cramer, Natalie Jeremijenko, Kazys Varnelis, Geert Lovink, Alex Galloway, Michael Hardt, Anthony Pagden, Cathy Davidson, Greg Lindsay, Nishant Shah, James Delbourgo, Jon Protevi, Wendy Hui Kyong Chun, and soon Paul D. Miller and Douglas Rushkoff.&lt;br /&gt;&lt;br /&gt;What we've heard is that our managerial and government elites are dysfunctional and that the new order of things is every man for himself, that things find their own order, from the ground up. Our desires are expressed in our purchasing power. Money is how we vote, and the market will continually adjust to accommodate the desires we express. We can all be winners, using network effects to scale up to success, a success each of us has the agency to produce. There are no larger structures to trump agency. If you can make it, you will make it.&lt;br /&gt;&lt;br /&gt;In this ethos of the elevation of our uniqueness to the exclusion of our commonalities, we have become blind to any possible collective power. We in the West are now a society of individuals in search of society.&lt;br /&gt;&lt;br /&gt;With reluctance today to accept such universalisms as global citizenship, rights to a living wage, to mobility, to social ownership of information channels and planetary resources, we are left with a notion that society, like nature, will be chaotic and disruptive, and that through this new 'natural law' of volatility, of self-organization, a new politics will emerge and find its shape.&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/news/individuals-in-search-of-society'&gt;https://cis-india.org/news/individuals-in-search-of-society&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>praskrishna</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    

   <dc:date>2012-05-24T08:35:46Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>




</rdf:RDF>
