<?xml version="1.0" encoding="utf-8" ?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:syn="http://purl.org/rss/1.0/modules/syndication/" xmlns="http://purl.org/rss/1.0/">




    



<channel rdf:about="https://cis-india.org/search_rss">
  <title>Centre for Internet and Society</title>
  <link>https://cis-india.org</link>
  
  <description>These are the search results for the query, showing results 31 to 37.</description>
  
  
  
  
  <image rdf:resource="https://cis-india.org/logo.png"/>

  <items>
    <rdf:Seq>
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/cis-cybersecurity-series-part-3-eva-galperin"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/cis-cybersecurity-series-part-2-ram-mohan"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/cis-cybersecurity-series-part-1-christopher-soghoian"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/call-for-comments-model-security-standards-for-the-indian-fintech-industry"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/automated-facial-recognition-systems-and-the-mosaic-theory-of-privacy-the-way-forward"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/automated-facial-recognition-systems-afrs-responding-to-related-privacy-concerns"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/events/firstfridayatcis-amutha-arunachalam-stand-shielded-of-digital-rights-may-05"/>
        
    </rdf:Seq>
  </items>

</channel>


    <item rdf:about="https://cis-india.org/internet-governance/cis-cybersecurity-series-part-3-eva-galperin">
    <title>CIS Cybersecurity Series (Part 3) - Eva Galperin</title>
    <link>https://cis-india.org/internet-governance/cis-cybersecurity-series-part-3-eva-galperin</link>
    <description>
        &lt;b&gt;CIS interviews Eva Galperin, Global Policy Analyst at the Electronic Frontier Foundation (EFF).&lt;/b&gt;
        &lt;p&gt;&lt;i&gt;"It is a vital tool for speaking truth to power. Unless you are able to speak anonymously, you are not really free to espouse unpopular ideas to people who have the power to do bad things to do... I think the value of anonymous speech vastly outweighs the difficulties that you can sometimes get into because people can speak anonymously. And on the whole, I think anonymity is worth protecting." - Eva Galperin, Global Policy Analyst at EFF. &lt;/i&gt;&lt;/p&gt;
&lt;p&gt;Centre for Internet and Society presents its third installment of the CIS Cybersecurity Series.&lt;/p&gt;
&lt;p&gt;The CIS Cybersecurity Series seeks to address hotly debated aspects of cybersecurity and hopes to encourage wider public discourse around the topic.&lt;/p&gt;
&lt;p&gt;In this installment, CIS speaks to Eva Galperin, the Global Policy Analyst at the Electronic Frontier Foundation (EFF). She has worked for the EFF in various capacities for the last five years, applying the combination of her political science and technical background to organizing activism campaigns, and doing education and outreach on intellectual property, privacy, and security issues.&lt;/p&gt;
&lt;p&gt;EFF homepage: &lt;a href="https://www.eff.org/"&gt;https://www.eff.org/&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;&lt;iframe frameborder="0" height="315" src="http://www.youtube.com/embed/BLtiuVX0nEM" width="560"&gt;&lt;/iframe&gt;&lt;/p&gt;
&lt;div&gt;
&lt;p&gt;&lt;b&gt;&lt;i&gt;This work was carried out as part of the Cyber Stewards Network with aid of a grant from the International Development Research Centre, Ottawa, Canada.&lt;/i&gt;&lt;/b&gt;&lt;/p&gt;
&lt;/div&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/cis-cybersecurity-series-part-3-eva-galperin'&gt;https://cis-india.org/internet-governance/cis-cybersecurity-series-part-3-eva-galperin&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>purba</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Cyberspace</dc:subject>
    
    
        <dc:subject>Cybersecurity</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Cybercultures</dc:subject>
    
    
        <dc:subject>Cyber Security Interview</dc:subject>
    

   <dc:date>2013-08-01T09:55:23Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/cis-cybersecurity-series-part-2-ram-mohan">
    <title>CIS Cybersecurity Series (Part 2) - Ram Mohan</title>
    <link>https://cis-india.org/internet-governance/cis-cybersecurity-series-part-2-ram-mohan</link>
    <description>
        &lt;b&gt;CIS interviews Ram Mohan, a pioneer in the field of Internet security and internationalization, as part of the Cybersecurity Series.&lt;/b&gt;
        
&lt;p&gt;&lt;em&gt;"In the Indian context, I think the government does have&amp;nbsp;a significant&amp;nbsp;responsibility&amp;nbsp;to protect its citizenry&amp;nbsp;from cybercrime. There is a greater need for the&amp;nbsp;government to work with private industries as well as academic institutions to ensure a strong understanding of the threats unique to India. After all there are many&amp;nbsp;threats that either originate in the context of the Indian sub-continent and are specific to India." - Ram&amp;nbsp;Mohan, Executive Vice President, &amp;amp; Chief Technology &lt;/em&gt;&lt;em&gt;Officer of Afilias Limited.&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;Centre for Internet and Society presents its second&amp;nbsp;installment of the CIS Cybersecurity Series.&lt;/p&gt;
&lt;p&gt;The CIS Cybersecurity Series seeks to address hotly&amp;nbsp;debated aspects of cybersecurity and hopes to encourage&amp;nbsp;wider public discourse around the topic.&lt;/p&gt;
&lt;p&gt;In this installment, CIS speaks to Ram Mohan, a pioneer&amp;nbsp;in the field of Internet security and&amp;nbsp;internationalization. Ram Mohan is Executive Vice&amp;nbsp;President, &amp;amp; Chief Technology Officer of Afilias&amp;nbsp;Limited. He also serves on the Board of Directors of the&amp;nbsp;Internet Corporation for Assigned Names and Numbers&amp;nbsp;(ICANN).&lt;/p&gt;
&lt;p&gt;&lt;iframe src="http://www.youtube.com/embed/Riub6EIwCgk" frameborder="0" height="315" width="560"&gt;&lt;/iframe&gt;&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;&lt;em&gt;This work was carried out as part of the Cyber Stewards Network with aid of a grant from the International Development Research Centre, Ottawa, Canada.&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/cis-cybersecurity-series-part-2-ram-mohan'&gt;https://cis-india.org/internet-governance/cis-cybersecurity-series-part-2-ram-mohan&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>purba</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Cybersecurity</dc:subject>
    
    
        <dc:subject>Cyberspace</dc:subject>
    
    
        <dc:subject>Cybercultures</dc:subject>
    
    
        <dc:subject>Cyber Security Interview</dc:subject>
    

   <dc:date>2013-07-12T10:27:26Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/cis-cybersecurity-series-part-1-christopher-soghoian">
    <title>CIS Cybersecurity Series (Part 1) - Christopher Soghoian</title>
    <link>https://cis-india.org/internet-governance/blog/cis-cybersecurity-series-part-1-christopher-soghoian</link>
    <description>
        &lt;b&gt;CIS interviews Christopher Soghoian, cybersecurity researcher and activist, as part of the Cybersecurity Series.&lt;/b&gt;
        
&lt;p&gt;&lt;em&gt;"We live in a surveillance state. The government can find out who we communicate with, who we talk to, who we are near, when we are at a protest, which stores we go to, where we travel to... they can find out all of these things. And it's unlikely it's going to get rolled back, but the best we can hope for is a system of law where the government gets to use its powers only in the right situation." – Christopher Soghoian, American Civil Liberties Union.&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;Centre for Internet and Society presents its first installment of the CIS Cybersecurity Series.&lt;/p&gt;
&lt;p&gt;The CIS Cybersecurity Series seeks to address hotly debated aspects of cybersecurity and hopes to encourage wider public discourse around the topic.&lt;/p&gt;
&lt;p&gt;In this installment, CIS interviews Christopher Soghoian, a privacy researcher and activist, working at the intersection of technology, law and policy. Christopher is the Principal Technologist and a Senior Policy Analyst with the Speech, Privacy and Technology Project at the American Civil Liberties Union (ACLU).&lt;/p&gt;
&lt;p&gt;Christopher is based in Washington, D.C. His website is http://www.dubfire.net/&lt;/p&gt;
&lt;p&gt;&lt;iframe src="http://www.youtube.com/embed/SQo4b-jTAWM" frameborder="0" height="315" width="320"&gt;&lt;/iframe&gt;&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;&lt;em&gt;This work was carried out as part of the Cyber Stewards Network with aid of a grant from the International Development Research Centre, Ottawa, Canada.&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/cis-cybersecurity-series-part-1-christopher-soghoian'&gt;https://cis-india.org/internet-governance/blog/cis-cybersecurity-series-part-1-christopher-soghoian&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>purba</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Cybersecurity</dc:subject>
    
    
        <dc:subject>Cyberspace</dc:subject>
    
    
        <dc:subject>Cybercultures</dc:subject>
    
    
        <dc:subject>Cyber Security Interview</dc:subject>
    

   <dc:date>2013-07-12T10:26:59Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/call-for-comments-model-security-standards-for-the-indian-fintech-industry">
    <title>Call for Comments: Model Security Standards for the Indian Fintech Industry</title>
    <link>https://cis-india.org/internet-governance/call-for-comments-model-security-standards-for-the-indian-fintech-industry</link>
    <description>
        
&lt;p&gt;The Centre for Internet and Society is pleased to make available the draft document of Model Security Standards for the Indian Fintech Industry, for feedback and comments from all stakeholders. The objective of this document, which was first published in November 2019, is to ensure that the data of users is dealt with in a secure and safe manner by the Fintech industry, and that smaller businesses in the Fintech industry have a specific standard to look to in order to limit their liabilities for any future breaches. &lt;br /&gt;&lt;br /&gt;We invite any parties interested in the field of technology policy, including but not limited to lawyers, policy researchers, and engineers, to send in feedback/comments on the draft document by the 16th of January 2020. We intend to publish our final draft by the end of January 2020. We look forward to receiving your contributions to make this document more comprehensive and effective. Please find a copy of the draft document &lt;a href="https://cis-india.org/internet-governance/resources/security-standards-for-the-financial-technology-sector-in-india" class="internal-link" title="Security Standards for the Financial Technology Sector in India"&gt;here&lt;/a&gt;.&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/call-for-comments-model-security-standards-for-the-indian-fintech-industry'&gt;https://cis-india.org/internet-governance/call-for-comments-model-security-standards-for-the-indian-fintech-industry&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>pranav</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Financial Technology</dc:subject>
    
    
        <dc:subject>Cybersecurity</dc:subject>
    
    
        <dc:subject>internet governance</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Cyber Security</dc:subject>
    

   <dc:date>2019-12-16T13:16:25Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/automated-facial-recognition-systems-and-the-mosaic-theory-of-privacy-the-way-forward">
    <title>Automated Facial Recognition Systems and the Mosaic Theory of Privacy: The Way Forward</title>
    <link>https://cis-india.org/internet-governance/automated-facial-recognition-systems-and-the-mosaic-theory-of-privacy-the-way-forward</link>
    <description>
        &lt;b&gt; Arindrajit Basu and Siddharth Sonkar have co-written this blog as the third of their three-part blog series on AI Policy Exchange under the parent title: Is there a Reasonable Expectation of Privacy from Data Aggregation by Automated Facial Recognition Systems? &lt;/b&gt;
        
&lt;p&gt;&lt;strong&gt;The Mosaic Theory of Privacy&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;Whether the data collected by the AFRS should be treated similarly to 
face photographs taken for the purposes of ABBA is not clear in the 
absence of judicial opinion. The AFRS would ordinarily collect 
significantly more data than facial photographs do during authentication. 
This can be explained with the help of the &lt;em&gt;&lt;a href="https://www.lawfareblog.com/defense-mosaic-theory" rel="noreferrer noopener" target="_blank"&gt;mosaic theory of privacy&lt;/a&gt;&lt;/em&gt;.&lt;/p&gt;
&lt;p&gt;The mosaic theory of privacy suggests that data collected about an 
individual over long durations can be qualitatively different from single 
instances of observation. It argues that aggregating data from different 
instances can create a picture of an individual which affects her 
reasonable expectation of privacy. This is because a mere slice of 
information reveals a lot less than the same information contextualised 
in a broad pattern — a mosaic.&lt;/p&gt;
&lt;p&gt;The mosaic theory of privacy does not find explicit reference in 
Puttaswamy II. The petitioners had argued that seeding of Aadhaar data 
into existing databases would bridge information across silos so as to 
make real time surveillance possible. This is because information when 
integrated from different silos becomes more than the sum of its parts.&lt;/p&gt;
&lt;p&gt;The Court, however, dismissed this argument, accepting UIDAI’s 
submission that the data collected remains in different silos and 
merging is not permitted within the Aadhaar framework. Therefore, the 
Court did not examine whether it is constitutionally permissible to 
integrate data from different silos; it simply rejected the possibility 
of surveillance as a result of Aadhaar authentication.&lt;/p&gt;
&lt;p&gt;Jurisprudence in other jurisdictions is more advanced. In &lt;em&gt;United States v. Jones&lt;/em&gt;, 
the United States Supreme Court observed that the insertion of a 
global positioning system into Antoine Jones’ Jeep, in the absence of a 
warrant and without his consent, invaded his privacy, entitling him to 
Fourth Amendment protection. In this case, the movement of Jones’ 
vehicle was monitored for a period of twenty-eight days. Five concurring 
justices in Jones acknowledged that aggregated and extensive 
surveillance is capable of violating the reasonable expectation of 
privacy irrespective of whether or not the surveillance takes place in 
public.&lt;/p&gt;
&lt;p&gt;The Court distinguished between prolonged surveillance and short-term 
surveillance. Surveillance in the short run does not reveal what a 
person repeatedly does, as opposed to sustained surveillance, which can 
reveal significantly more about a person. The Court took the example of 
how a sequence of trips to a bar, a bookie, a gym or a church can tell a 
lot more about a person than the story of any single visit viewed in 
isolation.&lt;/p&gt;
&lt;p&gt;Most recently, in &lt;a href="https://www.supremecourt.gov/opinions/17pdf/16-402_h315.pdf" rel="noreferrer noopener" target="_blank"&gt;&lt;em&gt;Carpenter v. United States&lt;/em&gt;&lt;/a&gt;, 
the Supreme Court of the United States held that the collection of 
historical cell-site data by the government exposes the physical movements 
of an individual to potential surveillance, and that an individual holds a 
reasonable expectation of privacy against such collection. The Court 
admitted that historical cell-site information allows the government to 
go back in time in order to retrace the exact whereabouts of a person.&lt;/p&gt;
&lt;p&gt;Judicial decisions have not specifically addressed whether facial 
recognition by law enforcement constitutes a search under the 
Fourth Amendment or a “mere visual observation”.&lt;/p&gt;
&lt;p&gt;The common thread linking CCTV footage and cellular data is the 
unique ability to track the movement of an individual from one place to 
another, enabling extreme forms of surveillance. It is perhaps this 
crucial link that would make AFRS-enabled CCTVs prejudicial to 
individual privacy.&lt;/p&gt;
&lt;p&gt;The mosaic theory as understood in &lt;em&gt;Carpenter&lt;/em&gt; helps one 
understand the extent to which an AFRS can augment the capacities of law 
enforcement in India. This in turn can help in understanding whether it 
is constitutionally permissible to install such systems across the 
country.&lt;/p&gt;
&lt;p&gt;AFRS-enabled CCTV footage from different CCTVs, if viewed in 
conjunction, could reveal a sequence of movements of an individual, 
enabling long-term surveillance of a nature that is qualitatively 
distinct from isolated observations across unrelated CCTV 
footage.&lt;/p&gt;
&lt;p&gt;Subsequent to &lt;em&gt;Carpenter&lt;/em&gt;, &lt;a href="https://www.lawfareblog.com/four-months-later-how-are-courts-interpreting-carpenter" rel="noreferrer noopener" target="_blank"&gt;federal district courts&lt;/a&gt;
 in the United States have declined to apply Carpenter to video 
surveillance cases since the judgement did not “call into question 
conventional surveillance techniques and tools, such as security 
cameras.”&lt;/p&gt;
&lt;p&gt;The extent of processing that an AFRS-enabled CCTV exposes an 
individual to would be significantly greater. This is because every time 
an individual is in the zone of an AFRS-enabled CCTV, the facial image 
will be compared to a common database. Snippets from different CCTVs 
capturing the individual’s physical presence in two different locations 
may not be meaningful per se. When observed together, the AFRS will make 
it possible to identify the individual’s movement from one place to 
another.&lt;/p&gt;
&lt;p&gt;For instance, the AFRS will be able to identify the person when they 
are on Street A at a particular time and when they are on Street B in the 
immediately subsequent hour, as recorded by the respective CCTV cameras, 
indicating the person’s physical movement from A to B. While a CCTV 
camera only records the movement of an individual in video format, the AFRS 
translates that digital information into individualised data with the 
help of a comparison of facial features with a pre-existing database.&lt;/p&gt;
&lt;p&gt;Through data aggregation, which appears to be the aim of the Indian 
government in its tender linking three databases, it is apparent 
that the right to privacy is in danger. Yet, at present, there exists 
no case law or legislation that can render such efforts illegal.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Conclusions and The Way Forward&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;Despite the lack of judicial recognition of the potential 
unconstitutionality of deploying AFRS, it is clear that the introduction 
of these systems poses a clear and present danger to civil rights and 
human dignity. Algorithmic surveillance alters a human being’s life in 
ways that even the subject of this surveillance cannot fully comprehend. 
As an individual’s data is manipulated and aggregated to derive a 
pattern about that individual’s world, the individual and his data no 
longer exist for themselves but are massaged into various categories.&lt;/p&gt;
&lt;p&gt;Louise Amoore terms this a ‘&lt;a href="https://journals.sagepub.com/doi/abs/10.1177/0263276411417430?journalCode=tcsa" rel="noreferrer noopener" target="_blank"&gt;data-derivative&lt;/a&gt;’, 
an abstract conglomeration of data that continuously shapes 
our futures without us having a say in its framing. The branding of an 
individual as a criminal, and the aggregation of their data, causes 
emotional distress as individuals move about in fear of the state gaze 
and of association with activities that are branded as potentially 
dangerous — thereby suppressing the right to dissent — as exemplified by 
the reported use of these systems during the recent protests in Hong Kong.&lt;/p&gt;
&lt;p&gt;Case law both in India and abroad has clearly suggested that a right 
to privacy is contextual and is not surrendered merely because an 
individual is in a public place. However, the jurisprudence protecting 
public photography or videography under the umbrella of privacy remains 
less clear globally and non-existent in India.&lt;/p&gt;
&lt;p&gt;The mosaic theory of privacy is useful in this regard as it guards 
against mass ‘data-veillance’ of individual behaviour and accurately identifies 
the unique power that the volume, velocity and variety of Big Data 
provide to the state. Therefore, it is imperative that the judiciary 
recognise safeguards from data aggregation as an essential component of a 
reasonable expectation of privacy. At the same time, legislation could 
also provide the required safeguards.&lt;/p&gt;
&lt;p&gt;In the US, Senators Coons and Lee recently introduced a draft Bill titled ‘&lt;a href="https://www.coons.senate.gov/imo/media/doc/ALB19A70.pdf" rel="noreferrer noopener" target="_blank"&gt;The Facial Recognition Technology Warrant Act of 2019’&lt;/a&gt;. 
The Bill aims to impose reasonable restrictions on the use of facial 
recognition technology by law enforcement. It creates safeguards 
against sustained tracking of the physical movements of an individual in 
public spaces. The Bill terms such tracking ‘ongoing surveillance’ when 
it occurs over a period of more than 72 hours, whether in real time or 
through the application of technology to historical records. The Bill requires that 
ongoing surveillance only be conducted for law enforcement purposes &lt;em&gt;and&lt;/em&gt; in pursuance of a Court Order (unless it is impractical to do so).&lt;/p&gt;
&lt;p&gt;While the Bill has its textual problems, it is definitely worth 
considering as a model going forward to ensure that AFR systems are 
deployed in line with a rights-respecting reading of a reasonable 
expectation of privacy. &lt;a href="http://datagovernance.org/report/adoption-and-regulation-of-facial-recognition-technologies-in-india" rel="noreferrer noopener" target="_blank"&gt;Parsheera&lt;/a&gt; 
suggests that legislation should include narrow tailoring of the objects 
and purposes for the deployment of AFRS, restrictions on the persons whose 
images may be scanned from the databases, judicial approval for its use 
on a case-by-case basis, and effective mechanisms of oversight, analysis 
and verification.&lt;/p&gt;
&lt;p&gt;Appropriate legal intervention is crucial. A failure to implement 
this effectively jeopardizes the expression of our true selves and the 
core tenets of our democracy.&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/automated-facial-recognition-systems-and-the-mosaic-theory-of-privacy-the-way-forward'&gt;https://cis-india.org/internet-governance/automated-facial-recognition-systems-and-the-mosaic-theory-of-privacy-the-way-forward&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Arindrajit Basu, Siddharth Sonkar</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Cybersecurity</dc:subject>
    
    
        <dc:subject>Cyber Security</dc:subject>
    
    
        <dc:subject>internet governance</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    

   <dc:date>2020-01-02T14:12:38Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/automated-facial-recognition-systems-afrs-responding-to-related-privacy-concerns">
    <title>Automated Facial Recognition Systems (AFRS): Responding to Related Privacy Concerns</title>
    <link>https://cis-india.org/internet-governance/automated-facial-recognition-systems-afrs-responding-to-related-privacy-concerns</link>
    <description>
        &lt;b&gt;Arindrajit Basu and Siddharth Sonkar have co-written this blog as the second of their three-part blog series on AI Policy Exchange under the parent title: Is there a Reasonable Expectation of Privacy from Data Aggregation by Automated Facial Recognition Systems? &lt;/b&gt;
        
&lt;p&gt;The Supreme Court of India, in &lt;a href="https://indiankanoon.org/doc/91938676/"&gt;Puttaswamy I&lt;/a&gt;, recognized that 
the right to privacy is not surrendered merely because the individual 
is in a public place. Privacy is linked to the individual as it is an 
essential facet of human dignity. Justice Chelameswar further clarified 
that privacy is contextual. Even in a public setting, people trying to 
converse in whispers would signal a claim to the right to privacy. 
Speaking on a loudspeaker would naturally not signal the same claim.&lt;/p&gt;
&lt;p&gt;The Supreme Court of Canada has also affirmed the notion of 
contextual privacy. As recently as 7 March 2019, the Supreme Court 
of Canada &lt;a href="http://www.thecourt.ca/r-v-jarvis-carving-out-a-contextual-approach-to-privacy/" rel="noreferrer noopener" target="_blank"&gt;in a landmark decision&lt;/a&gt; defined privacy rights in public areas, implicitly applying &lt;a href="https://crypto.stanford.edu/portia/papers/RevnissenbaumDTP31.pdf"&gt;Helen Nissenbaum’s theory of contextual integrity&lt;/a&gt;. 
Nissenbaum’s theory of contextual integrity explains the extent to which 
the right to privacy is eroded in public spaces.&lt;/p&gt;
&lt;p&gt;Nissenbaum suggests that labelling information as exclusively public 
or private fails to take into account the context which rationalises the
 desire of the individual to exercise her privacy in public. To explain 
this with an illustration, there exists a reasonable expectation of 
privacy in the restroom of a restaurant, even though it is in a public 
space.&lt;/p&gt;
&lt;p&gt;In &lt;a href="http://www.thecourt.ca/r-v-jarvis-carving-out-a-contextual-approach-to-privacy/"&gt;&lt;em&gt;R v Jarvis&lt;/em&gt;&lt;/a&gt; (Jarvis), the Court overruled a Court of Appeal for Ontario &lt;a href="https://www.canlii.org/en/on/onca/doc/2017/2017onca778/2017onca778.pdf"&gt;decision&lt;/a&gt; 
to hold that people can have a reasonable expectation of privacy even 
in public spaces. In this case, Jarvis was charged with the offence of 
voyeurism for secretly recording his students. The primary issue that 
the Supreme Court of Canada was concerned with was whether the students 
filmed by Mr. Jarvis enjoyed a reasonable expectation of privacy at 
their school.&lt;/p&gt;
&lt;p&gt;The Court in this case unanimously held that the students did indeed have 
a reasonable expectation of privacy. The Court identified nine 
contextual factors relevant in determining whether a person has a 
reasonable expectation of privacy. The listed factors were:&lt;/p&gt;
&lt;p&gt;“1. The location the person was in when he or she was observed or recorded,&lt;/p&gt;
&lt;p&gt;2. The nature of the impugned conduct (whether it consisted of observation or recording),&lt;/p&gt;
&lt;p&gt;3. Awareness of or consent to potential observation or recording,&lt;/p&gt;
&lt;p&gt;4. The manner in which the observation or recording was done,&lt;/p&gt;
&lt;p&gt;5. The subject matter or content of the observation or recording,&lt;/p&gt;
&lt;p&gt;6. Any rules, regulations or policies that governed the observation or recording in question,&lt;/p&gt;
&lt;p&gt;7. The relationship between the person who was observed or recorded and the person who did the observing or recording,&lt;/p&gt;
&lt;p&gt;8. The purpose for which the observation or recording was done, and&lt;/p&gt;
&lt;p&gt;9. The personal attributes of the person who was observed or recorded.” (paragraph 29 of the judgement).&lt;/p&gt;
&lt;p&gt;The Court emphasized that the factors are not an exhaustive list, but
 rather were meant to be a guiding tool in determining whether a 
reasonable expectation of privacy existed in a given context. It is not 
necessary that each of these factors is present in a given situation to 
give rise to an expectation of privacy.&lt;/p&gt;
&lt;p&gt;Compared to the above-mentioned factors in Jarvis, the Indian Supreme Court in &lt;a href="https://indiankanoon.org/doc/127517806/"&gt;Justice K.S. Puttaswamy (Retd.) v. Union of India&lt;/a&gt; per Justice Sikri (Puttaswamy II) &lt;strong&gt;—&lt;/strong&gt; 
the case which upheld the constitutionality of the Aadhaar project &lt;strong&gt;—&lt;/strong&gt; 
relied on the following factors to determine a reasonable expectation of 
privacy in a given context:&lt;/p&gt;
&lt;p&gt;“(i) What is the context in which a privacy claim is set up?&lt;/p&gt;
&lt;p&gt;(ii) Does the claim relate to private or family life, or a confidential relationship?&lt;/p&gt;
&lt;p&gt;(iii) Is the claim a serious one or is it trivial?&lt;/p&gt;
&lt;p&gt;(iv) Is the disclosure likely to result in any serious or significant injury and the nature and extent of disclosure?&lt;/p&gt;
&lt;p&gt;(v) Is disclosure relates to personal and sensitive information of an identified person?&lt;/p&gt;
&lt;p&gt;(vi) Does disclosure relate to information already disclosed publicly? If so, its implication?”&lt;/p&gt;
&lt;p&gt;These factors (acknowledged in Puttaswamy II in paragraph 292) seem 
very similar to the ones laid down in Jarvis, i.e., there is a 
strong reliance on context in both cases. While there is no explicit 
mention of the personal attributes of the individual claiming a 
reasonable expectation of privacy, the holding that children should be 
given an opt-out indicates that the Court implicitly takes into account 
personal attributes (e.g. age) as well.&lt;/p&gt;
&lt;p&gt;The Court in Jarvis further (in paragraph 39) took the example of a woman in a communal change room at a public pool. She may expect other users to incidentally observe her undressing, but she would expect only the other women in the change room to observe her, and would reserve her rights against the general public. She would also expect not to be video recorded or photographed while undressing, whether by other users of the pool or by the general public.&lt;/p&gt;
&lt;p&gt;If it were later discovered that the change room had one-way glass that allowed pool staff to watch users change, or that a concealed camera had recorded persons while they were changing, she could claim a breach of her reasonable expectation of privacy, and this would constitute an invasion of privacy.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;So, in the context of an AFRS, an individual walking down a public road may still signal that they wish to avail themselves of their right to privacy. In such contexts, a concerted surveillance mechanism may run up against constitutional roadblocks.&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;What is the nature of information being collected?&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;The second big question is the nature of the information being collected, which plays a role in determining the extent to which a person can exercise their reasonable expectation of privacy. Puttaswamy II laid down that the collection of core biometric information, such as fingerprints and iris scans, in the context of Aadhaar-Based Biometric Authentication (‘ABBA’) is constitutionally permissible. The basis of this conclusion is that the Aadhaar Act does not deal with the individual’s intimate or private sphere.&lt;/p&gt;
&lt;p&gt;The judgement of the Supreme Court in Puttaswamy II arises in a very specific context (i.e. the ABBA). It does not explain or identify the contextual factors that determine the extent to which privacy may reasonably be expected over biometrics generally. In this judgement, the Court observed that demographic information and photographs do not raise a reasonable expectation of privacy under Article 21 unless special circumstances exist, such as the disclosure of the identity of a juvenile in conflict with the law or of a rape victim.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Most importantly, the Court held that face photographs taken for the purpose of identification are not covered by a reasonable expectation of privacy. The Court distinguished face photographs from intimate photographs or photographs concerning confidential situations.&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Face photographs, according to the Court, are shared by individuals in the ordinary course of conduct for the purpose of obtaining a driving license, voter ID, passport, examination admit cards, employment cards, and so on. Face photographs by themselves reveal no information.&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;Naturally, this pronouncement of the Apex Court is a significant boost for the introduction of AFRS in India.&lt;/p&gt;
&lt;p&gt;Abroad, however, on 4 September 2019, in &lt;a href="https://www.judiciary.uk/wp-content/uploads/2019/09/bridges-swp-judgment-Final03-09-19-1.pdf"&gt;Edward Bridges v. Chief Constable of South Wales Police&lt;/a&gt;, a Division Bench of the High Court of England and Wales heard a challenge against an AFRS introduced by law enforcement (&lt;em&gt;see&lt;/em&gt; Endnote 1). The High Court rejected a claim for judicial review, holding that the AFRS in question does not violate, inter alia, the right to privacy under Article 8 of the European Convention on Human Rights (‘ECHR’).&lt;/p&gt;
&lt;p&gt;According to the Court, the AFRS was used for specific and limited purposes, i.e., only when the image of a member of the public matched a person on an existing watchlist. The use of the AFRS was therefore considered a lawful and fair restriction.&lt;/p&gt;
&lt;p&gt;The Court, however, acknowledged that extracting biometric data 
through AFRS is “well beyond the expected and unsurprising”. This seems 
to be a departure from the Indian Supreme Court’s observation in 
Puttaswamy II that there is no reasonable expectation of privacy over 
biometric data in the context of ABBA, and may be a wiser approach for 
the Indian courts to adopt.&lt;/p&gt;
&lt;h6&gt;&lt;strong&gt;Endnote&lt;/strong&gt;&lt;/h6&gt;
&lt;p&gt;1. The challenge was brought by Edward Bridges, a civil liberties campaigner from Cardiff, who was caught on camera in two particular deployments of the AFRS: a) at Queen Street, a busy shopping area in Cardiff, and b) at the Defence Procurement, Research, Technology and Exportability Exhibition held at the Motorpoint Arena.&lt;/p&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;p&gt;This was published by &lt;a class="external-link" href="https://aipolicyexchange.org/2019/12/28/automated-facial-recognition-systems-afrs-responding-to-related-privacy-concerns/"&gt;AI Policy Exchange&lt;/a&gt;.&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/automated-facial-recognition-systems-afrs-responding-to-related-privacy-concerns'&gt;https://cis-india.org/internet-governance/automated-facial-recognition-systems-afrs-responding-to-related-privacy-concerns&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Arindrajit Basu, Siddharth Sonkar</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Cybersecurity</dc:subject>
    
    
        <dc:subject>Cyber Security</dc:subject>
    
    
        <dc:subject>internet governance</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    

   <dc:date>2020-01-02T14:09:14Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/events/firstfridayatcis-amutha-arunachalam-stand-shielded-of-digital-rights-may-05">
    <title>Amutha Arunachalam - Stand Shielded of Digital Rights (Delhi, May 05, 4 pm)</title>
    <link>https://cis-india.org/internet-governance/events/firstfridayatcis-amutha-arunachalam-stand-shielded-of-digital-rights-may-05</link>
    <description>
&lt;b&gt;We are proud to announce that Amutha Arunachalam will be the speaker at the May #FirstFriday event at the CIS Delhi office. Amutha is a Principal Technical Officer in the Council of Scientific and Industrial Research. The talk will be on digital signatures, traceability of time-stamps, and setting up an Indian Standard (Digital) Time. If you are joining us, please RSVP at the earliest, as we have only limited space in our office.&lt;/b&gt;
        
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;h3&gt;&lt;strong&gt;Amutha Arunachalam&lt;/strong&gt;&lt;/h3&gt;
&lt;h4&gt;Principal Technical Officer, Council of Scientific and Industrial Research&lt;/h4&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;p&gt;&lt;img src="https://cis-india.org/internet-governance/files/amutha-arunachalam/image" alt="Amutha Arunachalam" class="image-inline" title="Amutha Arunachalam" /&gt;&lt;/p&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;p&gt;Amutha Arunachalam entered Indian Government service as an Intelligence Officer in the Ministry of Home Affairs in 1988, after working in the Fibre Optic Communication Laboratory at the Indian Institute of Technology Madras. She later moved to the Council of Scientific and Industrial Research (CSIR) to work in the field of Information Technology. She managed the IT infrastructure of a CSIR lab (the Central Road Research Institute) until 2006, and then moved to CSIR Headquarters, where she contributed to the ICT refurbishment drive, with major contributions in establishing the data centre, implementing network security, and linking CSIR HQ to the National Knowledge Network facility extended by the National Informatics Centre (NIC), before joining the UIDAI.&lt;/p&gt;
&lt;p&gt;At the UIDAI (the national identity project), she managed data centre operations, including the critical CIDR (Central Identities Data Repository), and was responsible for setting up the infrastructure to roll out the disaster recovery centre and the Aadhaar enrolment service, benchmarking the UIDAI enrolment and authentication applications, and setting up the backend infrastructure for rolling out the authentication service to citizens. Since completing her five-year deputation at the UIDAI in February 2016, she has been posted at the Council of Scientific and Industrial Research, working on cyber security policy for CSIR, enhancing research through collaboration and networking, and building a unified CSIR ecosystem on an enterprise platform.&lt;/p&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;h3&gt;&lt;strong&gt;RSVP&lt;/strong&gt;&lt;/h3&gt;
&lt;iframe src="https://docs.google.com/forms/d/e/1FAIpQLSfWGNDezfJOi3UU7GpAWkrKn0uOMlCsV2P_6QEHqPWCb6JSqA/viewform?embedded=true" frameborder="0" marginwidth="0" marginheight="0" height="666" width="600"&gt;Loading...&lt;/iframe&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;h3&gt;&lt;strong&gt;Location&lt;/strong&gt;&lt;/h3&gt;
&lt;iframe src="https://www.google.com/maps/embed?pb=!1m18!1m12!1m3!1d876.157470894426!2d77.20553462919722!3d28.550842498903158!2m3!1f0!2f0!3f0!3m2!1i1024!2i768!4f13.1!3m3!1m2!1s0x0%3A0x834072df81ffcb39!2sCentre+for+Internet+and+Society!5e0!3m2!1sen!2sin!4v1493818109951" frameborder="0" height="450" width="600"&gt;&lt;/iframe&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/events/firstfridayatcis-amutha-arunachalam-stand-shielded-of-digital-rights-may-05'&gt;https://cis-india.org/internet-governance/events/firstfridayatcis-amutha-arunachalam-stand-shielded-of-digital-rights-may-05&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>sumandro</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Cybersecurity</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Digital India</dc:subject>
    
    
        <dc:subject>#FirstFridayAtCIS</dc:subject>
    
    
        <dc:subject>E-Governance</dc:subject>
    

   <dc:date>2017-05-03T13:30:32Z</dc:date>
   <dc:type>Event</dc:type>
   </item>




</rdf:RDF>
