<?xml version="1.0" encoding="utf-8" ?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:syn="http://purl.org/rss/1.0/modules/syndication/" xmlns="http://purl.org/rss/1.0/">




    



<channel rdf:about="https://cis-india.org/search_rss">
  <title>Centre for Internet and Society</title>
  <link>https://cis-india.org</link>
  
  <description>These are the search results for the query, showing results 711 to 725.</description>
  
  
  
  
  <image rdf:resource="https://cis-india.org/logo.png"/>

  <items>
    <rdf:Seq>
        
            <rdf:li rdf:resource="https://cis-india.org/openness/blog-old/does-the-social-web-need-a-googopoly"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/does-the-safe-harbor-program-adequately-address-third-parties-online"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/news/harvard-university-may-13-2014-does-size-matter"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/do-we-really-need-an-app-for-that-examining-the-utility-and-privacy-implications-of-india2019s-digital-vaccine-certificates"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/do-we-need-the-aadhar-scheme"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/dna-research"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/news/report-dna-july-7-2013-joanna-lobo-geeks-have-a-solution-to-digital-surveillance-in-india-cryptography"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/dna-databases-and-human-rights.pdf"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/dna-database-for-missing-persons-and-unidentified-dead-bodies"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/bloomberg-quint-elonnai-hickok-and-murali-neelakantan-august-20-2018-dna-evidence-only-opinion-not-science-and-definitely-not-proof-of-crime"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/divergence-between-the-general-data-protection-regulation-and-the-personal-data-protection-bill-2019"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/events/discussion-on-ranking-digital-rights-in-india-delhi-jan-07-2017"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/oxford-human-rights-hub-arindrajit-basu-october-23-2018-discrimination-in-the-age-of-artificial-intelligence"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/raw/indian-express-nishant-shah-april-8-2018-digital-native-delete-facebook"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/frontline-v-sridhar-march-3-2017-digital-illusions"/>
        
    </rdf:Seq>
  </items>

</channel>


    <item rdf:about="https://cis-india.org/openness/blog-old/does-the-social-web-need-a-googopoly">
    <title>Does the Social Web need a Googopoly?</title>
    <link>https://cis-india.org/openness/blog-old/does-the-social-web-need-a-googopoly</link>
    <description>
        &lt;b&gt;While the utility of the new social tool Buzz is still under question, the bold move into social space taken last week by the Google Buzz team has Gmail users questioning the privacy implications of the new feature. In this post, I posit that Buzz highlights two privacy challenges of the social web. First, the application has sidestepped the consensual and contextual qualities desirable of social spaces. Second, Google’s move highlights the increasingly competitive and convergent nature of the social media landscape.&lt;/b&gt;
        
&lt;p&gt;Last week, to the surprise of many, Google launched its new
social networking platform, Buzz.&amp;nbsp; The new service is Google’s effort to
amplify the “social nature” of its services by integrating them under one
platform and adding some extra social utility.&amp;nbsp; The application runs from
the Gmail interface, but also links other Google accounts a user may have,
including albums on Picasa and Google Reader.&amp;nbsp; It also allows sharing from
external sources, such as photos on Flickr and videos from YouTube, and lets
users post, like, or dislike the status updates of others, which may be
publicly searchable if the user opts in.&amp;nbsp; Before a Gmail user may fully
participate in the Buzz service, a unique Google Personal Profile must be
created.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;User Consent&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;Much of the buzz surrounding the new social networking
service last week had little to do with the application itself.&amp;nbsp; Instead,
an uproar of privacy concerns continued to dominate the Buzz scene, with many
critics quickly labeling Buzz a “&lt;a href="http://news.cnet.com/8301-31322_3-10451428-256.html"&gt;privacy nightmare&lt;/a&gt;”.&amp;nbsp; A &lt;a href="http://digitaldaily.allthingsd.com/20100216/epic-files-ftc-complaint-over-google-buzz/?mod=ATD_rss"&gt;formal
complaint&lt;/a&gt; has already been filed with the US Federal Trade Commission in
response to Google’s new privacy-violating service.&amp;nbsp; A second-year Harvard
Law student has also filed a &lt;a href="http://abcnews.go.com/Technology/google-buzz-draws-class-action-suit-harvard-student/story?id=9875095&amp;amp;page=1"&gt;class-action
suit&lt;/a&gt; against the company for its privacy malpractices.&lt;/p&gt;
&lt;p&gt;Much of the privacy talk thus far has focused on issues of
consent, or the lack thereof, in this case.&amp;nbsp; Upon Buzz’s launch, Gmail
users were automatically “opted in” to the service.&amp;nbsp; Google has used the
private address books of millions of Gmail accounts to build social networks
from the contacts users email and chat with most.&amp;nbsp; To entice users into
using the service, Gmail users were set to auto-follow all of their contacts,
and in turn, to be followed by them, too.&amp;nbsp; Furthermore, all new Buzz users
had been set to automatically share all public Picasa albums and Google Reader
items with their new social graph.&amp;nbsp; It is argued that social network
services should be &lt;a href="http://jonoscript.wordpress.com/2010/02/20/buzz-off-google-social-networks-should-always-be-opt-in-not-opt-out/"&gt;opt-in,
rather than opt-out&lt;/a&gt;, and that Buzz has violated the consensual nature of
the social web.&lt;/p&gt;
&lt;p&gt;Illuminating the complications of building a social graph
from one’s inbox is the story of an Australian woman, who remains
anonymous.&amp;nbsp; As she claims, most of the emails currently received through
her Gmail account are those from her abusive ex-boyfriend.&amp;nbsp; Due to
Google’s assumption that Gmail users would like to be “auto-followed” by their
Gmail contacts (mirroring Twitter’s friendship protocol), items shared between
herself and her new boyfriend through her Google Reader account had become
public to her broader social graph, including her ex-boyfriend and his
harassing friends.&lt;/p&gt;
&lt;p&gt;In a &lt;a href="http://www.gizmodo.com.au/2010/02/fck-you-google/"&gt;blog response&lt;/a&gt;
directed to Google’s Buzz team, the woman scornfully wrote: “&lt;em&gt;F*ck you, Google. My privacy concerns are
not trite. They are linked to my actual physical safety, and I will now have to
spend the next few days maintaining that safety by continually knocking down
followers as they pop up. A few days is how long I expect it will take before
you either knock this shit off, or I delete every Google account I have ever
had and use Bing out of f*cking spite&lt;/em&gt;”.&amp;nbsp;
As this case demonstrates, the people we email most often may not be our
closest friends.&amp;nbsp; As email has replaced the telephone for many as the
dominant mode of communication, some contacts may be friends, but many others
may not be.&lt;/p&gt;
&lt;p&gt;In response to the uproar, tweaks to Buzz’s privacy features
have since been made.&amp;nbsp; Todd Jackson, Buzz’s product manager, also posted a
&lt;a href="http://gmailblog.blogspot.com/2010/02/millions-of-buzz-users-and-improvements.html"&gt;public
apology&lt;/a&gt; to the official Gmail Blog late last week for not “getting
everything quite right”.&amp;nbsp; The service will now assume the more
user-centric “auto-suggest” model, allowing users to selectively choose the
contacts they wish to follow, and will no longer auto-link Picasa and Reader
content.&amp;nbsp; However, as &lt;a href="http://digitaldaily.allthingsd.com/20100216/epic-files-ftc-complaint-over-google-buzz/?mod=ATD_rss"&gt;EPIC’s
complaint notes&lt;/a&gt;, many remain unsatisfied with the opt-out nature of the
service, arguing that users should be able to opt into the service if they so
choose, rather than having to delist themselves from a service they didn’t
necessarily sign up for.&amp;nbsp; Ethical quandaries also still loom over
Google’s misuse of users’ private contact lists to jumpstart its new
service.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Contextual Integrity &lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;The attacks on personal privacy resulting from Google’s model
are vast.&amp;nbsp; As the case of the Australian woman illuminates, the concept of
the “online friend” was completely taken out of context by Buzz’s initial
auto-follow model.&amp;nbsp; Many of the contacts we make on a daily basis need not
be made public through the Google profile.&amp;nbsp; For most, Buzz’s privacy
breach may be benign, or annoying at most. However, for those engaged in
sensitive social or political relationships via their Gmail chat or email
accounts, the revelation of a common contact could have been potentially
damaging.&amp;nbsp; A reporter from CNET has cleverly labeled Buzz “&lt;a href="http://news.cnet.com/8301-17939_109-10451703-2.html"&gt;socially
awkward networking&lt;/a&gt;”, as bringing diverse contacts under one umbrella
doesn’t exactly make the most social sense. In response, Gmail users are
required to sort through and filter their Buzz followers accordingly, or choose
to disable the service altogether.&lt;/p&gt;
&lt;p&gt;Besides questions of who is stalking whom, the assumptive
and public nature of Google’s new move has cast a shadow of doubt among Gmail
users regarding Google’s ability to maintain the privacy and contextual
integrity of the Gmail account.&amp;nbsp; Should one account be the place to
socialize and “do business”?&amp;nbsp; Gmail is, and should remain, an email
service.&amp;nbsp; However, Buzz takes the email experience into new and
questionable grounds.&amp;nbsp; Do Gmail users feel entirely comfortable having
their personal email, social graph, and chat functions all come under the
auspices of one platform?&amp;nbsp; Many users felt they had been lured into using
a social networking service that they didn’t sign up for in the first
place.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Social Media Competition&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;In addition to Google’s attempt to integrate its various
service offerings, Buzz is seen as an obvious attempt to bolster
competitiveness in the social media market.&amp;nbsp; In 2004, Google released
Orkut. While the service has become big in countries such as Brazil and India,
it has been overshadowed by sites such as Facebook in other jurisdictions, and
has not been able to prove itself as a mainstream space for networking.&amp;nbsp;
In the past year, Google also launched Google Wave, a tool that mixes email
with instant messaging and the ability for several people to collaborate on
documents.&amp;nbsp; However, the application failed to completely win over
audiences, and was considered one of the &lt;a href="http://www.readwriteweb.com/archives/top_10_failures_of_2009.php"&gt;top
failures of 2009&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;With Google unable to effectively saturate the social media
ecosystem, Buzz is an attempt to compete with the searchable and real-time
experiences provided by the social media giants, Facebook and Twitter.&amp;nbsp;
Increased competition within the social media market could be a positive
development for privacy, as social media companies could arguably compete on
their ability to provide users with preferable privacy architectures.&amp;nbsp;
To the contrary, however, such competition has thus far had negative
ramifications for user privacy, as the recent Buzz and Facebook moves
illustrate.&amp;nbsp; Facebook’s loosened privacy settings were a &lt;a href="http://www.economist.com/specialreports/displaystory.cfm?story_id=15350984"&gt;competitive
knee-jerk&lt;/a&gt; to Twitter’s searchable and real-time experience.&amp;nbsp;
Through a Twitter search, individuals can come to know what people are saying
about a certain topic, event, or product, and as a result, the service has
received a great deal of attention from users and non-users, such as
advertisers, alike.&lt;/p&gt;
&lt;p&gt;In an attempt to one-up their competition, the “Twitterization”
of Facebook followed in two distinct stages.&amp;nbsp; First came the
implementation of the Facebook News Feed, which gave users a real-time account
of the actions of their friends on the site.&amp;nbsp; Many argued that this
feature invaded user privacy.&amp;nbsp; Facebook, however, argued that it was only
making available information that was already accessible through individual
profile pages.&amp;nbsp; The News Feed, as it happens, effectively took user
information and actions on the site out of their original context by streaming
this information live for others’ easy viewing.&amp;nbsp; Information users once
had to rummage for had become accessible in real time on the homepage of the
service.&lt;/p&gt;
&lt;p&gt;Secondly, Facebook’s recent &lt;a href="http://www.eff.org/deeplinks/2009/12/facebooks-new-privacy-changes-good-bad-and-ugly"&gt;privacy
scandal&lt;/a&gt; was a step towards making profile information more searchable and
accessible to third parties, as is most often the case with the more public
feeds on Twitter.&amp;nbsp; As &lt;a href="http://www.simplyzesty.com/twitter/unrelenting-twitterization-facebook-continues/"&gt;one
commentator notes&lt;/a&gt;, “&lt;em&gt;Facebook used to be very private but private
is not great for search, to have great search you need all of the data to be
publicly available as it mostly is on Twitter. Facebook have not quite nailed
real time search yet but they are getting there and it will soon be a great way
of examining sentiment across different demographics&lt;/em&gt;”.&amp;nbsp; As a
result, information on Facebook, such as name, profile picture, friends list,
location, and fan pages, has become open-access information.&amp;nbsp; In
addition, users on Facebook have been subjected to a new privacy regime without
notice, leaving their profile pages generally more open, and searchable through
Google.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Converging the Online
Self&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;The impact Buzz alone can make on the social media landscape
remains questionable (Gmail boasts only 140 million accounts, a far cry from
Facebook’s 400+ million dedicated users).&amp;nbsp; However, regardless of
Google’s ability to claim hegemony over the social web landscape, the abuse of
private information to launch a new service has raised serious debate over
privacy and the future of social networking.&amp;nbsp; The Buzz service marks more
than yet another new social networking service that brushes aside the privacy
of users.&amp;nbsp; As user control and privacy become an increasingly peripheral
concern, Google’s shift away from privacy controls also signifies a worrisome
supply-side shift towards the “convergence” of online identity.&lt;/p&gt;
&lt;p&gt;Within this new dominant paradigm, privacy concerns are
often interpreted as antithetical to competitiveness in the social media
marketplace.&amp;nbsp; Instead of an imagined ecosystem based on user control and
privacy preference, it can now be inferred that the competitiveness of social
networking services will continue to disrupt the delicate balance between the
public and private online. Even though greater visibility and searchability of
the social profile may not be in the public interest, Google’s recent move
works to reinforce the new status quo of “openness”.&amp;nbsp; Furthermore, it is
questionable how concentrated and integrated a user may want their online
activities to become.&amp;nbsp; A critical discourse of online privacy must,
therefore, take into account the ways in which the social web renders the user
increasingly transparent through networks of networking services.&lt;/p&gt;
&lt;p&gt;Google’s Buzz illustrates this point quite well.&amp;nbsp;
Initially, Gmail was a straightforward email service.&amp;nbsp; Next, the AdWords
advertising service and Gmail chat became integrated into the Gmail
experience.&amp;nbsp; Because Google was using the confidential emails of its
Gmail users, privacy concerns began to mount upon the launch of the AdWords
service.&amp;nbsp; However, the turmoil surrounding AdWords died down, notably as
Google continues to reassert that it is bots, not humans, that scan the emails
in order to provide the AdWords service.&amp;nbsp; Next, there gradually occurred
a convergence of Google services under a single social profile, or “email
address”.&amp;nbsp; A single Gmail account potentially includes use of Google
Reader, Calendar, chat, groups, and an Orkut account.&amp;nbsp; In terms of
behaviorally targeted advertising, Google has recently announced that it will
provide personalized search results even to users who have not signed up for
Google services.&amp;nbsp; This will be done through the placement of a cookie on
all machines to provide targeted advertising seamlessly through each Google
search and browsing session.&lt;/p&gt;
&lt;p&gt;While many argue that the collection of non-personally
identifiable information poses no privacy harm, this assumption needs
reassessment.&amp;nbsp; As Google comes to offer us more, it also comes to learn
more, and Buzz signifies this trend towards a Googopolized social web.&amp;nbsp;
To add another layer of complexity to Google’s hegemony, users of the Buzz
service are also required to create a “Google Profile”, which is searchable
online and displays real-time status updates, comments, and connections from
other social network services, such as Facebook and Twitter.&amp;nbsp; As Google
recently launched the beta version of its new &lt;a href="http://googleblog.blogspot.com/2009/10/introducing-google-social-search-i.html"&gt;Social
Search&lt;/a&gt;, Buzz was just the service required to increase the relevance of
the new service by encouraging Gmail users to publish even more personal
information.&amp;nbsp; The creation of a personal Google profile, which is indexed
and searchable, raises many concerns about privacy and identity, and doubts are
continually raised over &lt;a href="http://www.businessinsider.com/hey-google-thi-i-why-privacy-matter-2010-2"&gt;how
much Google should come to know&lt;/a&gt; about us.&lt;/p&gt;
&lt;p&gt;While Google’s services have arguably made the online social
experience more seamless and tailored, it is questionable how relevant, or even
desirable, such a shift may be.&amp;nbsp; At present, it may appear that Google is
wearing far too many hats, and users should be wary of placing all their eggs
in one basket.&amp;nbsp; As the launch of Buzz has shown us, user consent and the
contextual integrity of private personal information can be compromised when a
diverse number of online services are integrated and given a social spin.&amp;nbsp;
When competition among social web providers drives users to lose control of
private information which is inherently theirs, critical questions surrounding
competition, convergence, and privacy require exploration.&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/openness/blog-old/does-the-social-web-need-a-googopoly'&gt;https://cis-india.org/openness/blog-old/does-the-social-web-need-a-googopoly&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>rebecca</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Social Networking</dc:subject>
    
    
        <dc:subject>Competition</dc:subject>
    
    
        <dc:subject>Google Buzz</dc:subject>
    

   <dc:date>2011-08-18T05:06:37Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/does-the-safe-harbor-program-adequately-address-third-parties-online">
    <title>Does the Safe-Harbor Program Adequately Address Third Parties Online?</title>
    <link>https://cis-india.org/internet-governance/blog/does-the-safe-harbor-program-adequately-address-third-parties-online</link>
    <description>
        &lt;b&gt;While many citizens outside of the US and EU benefit from the data privacy provisions of the Safe Harbor Program, it remains unclear how successfully the program can govern privacy practices when third parties continue to gain more rights over personal data. Using Facebook as a site of analysis, I will attempt to shed light on the deficiencies of the framework for addressing the complexity of data flows in the online ecosystem.&lt;/b&gt;
        
&lt;p&gt;To date, the EU-US Safe Harbor Program leads in governing
the complex and multi-directional flows of personal information online.&amp;nbsp;
As commerce began to thrive in the online context, the European Union was faced
with the challenge of ensuring that personal information exchanged through
online services was granted levels of protection on par with provisions set out
in EU privacy law.&amp;nbsp; This was important, notably as the piecemeal and
sectoral approach to privacy legislation in the United States was deemed
incompatible with the EU approach.&amp;nbsp; While the Safe Harbor program did not
aim to protect the privacy of citizens outside of the European Union per se,
the program has in practice set minimum standards for online data privacy due
to the international success of American online services.&lt;/p&gt;

&lt;p&gt;While many citizens outside of the US and EU benefit from
the Safe Harbor Program, it remains unclear how successful the program will be
in an online ecosystem where third parties are being granted increasingly more
rights over the data they receive from first parties.&amp;nbsp; Using Facebook as
a site of analysis, I will attempt to shed light on the deficiencies of the
framework for addressing the complexity of data flows in the online
ecosystem.&amp;nbsp; First, I will argue that the Safe Harbor program does not do
enough to ensure that participants are held reasonably responsible for third-party
privacy practices.&amp;nbsp; Second, I will argue that the information asymmetries
created between first-party sites, citizens, and governance bodies vis-à-vis
third parties obscure the application of the Safe Harbor model.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;The EU-US
Safe-Harbor Agreement&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;In 1995, and based on earlier &lt;a href="http://www.oecd.org/document/18/0,3343,en_2649_34255_1815186_1_1_1_1,00.html"&gt;OECD
guidelines&lt;/a&gt;, the EU Data Directive on the “protection of individuals with
regard to the processing of personal data and the free movement of such data”
was passed&lt;a name="_ednref1" href="#_edn1"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; [1].&amp;nbsp; The original purpose of the EU Privacy
Directive was not only to increase privacy protection within the European
Union, but to also promote trade liberalization and a single integrated market
in the EU.&amp;nbsp; After the Data Directive was
passed, each member state of the EU incorporated the principles of
the directive into national laws accordingly.&amp;nbsp;&lt;/p&gt;
&lt;p&gt;While the Directive was successful in harmonizing data
privacy in the European Union, it also embodied extraterritorial provisions,
giving it reach&lt;a name="_ednref2" href="#_edn2"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; beyond the EU.&amp;nbsp; Article 25 of the Directive states that the
EU Commission may ban data transfers to third countries that do not ensure “an
adequate level of protection” of data privacy rights&lt;a name="_ednref3" href="#_edn3"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; [2].&amp;nbsp; Article 26 of the Directive, expanding
on Article 25, states that personal data cannot be &lt;em&gt;transferred&lt;/em&gt; to a country that “does not ensure an adequate level of
protection” if the data controller does not enter into a contract that adduces
adequate privacy safeguards&lt;a name="_ednref4" href="#_edn4"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; [3].&lt;/p&gt;
&lt;p&gt;In light of the increased occurrence of cross-border
information flows, the Data Directive itself was not effective enough to ensure
that privacy principles were enforced outside of the EU.&amp;nbsp; Articles 25 and
26 of the Directive had essentially deemed all cross-border data flows to the
US in contravention of EU privacy law.&amp;nbsp; Therefore, the EU-US Safe Harbor
was established by the EU Council and the US Department of Commerce as a way of
reconciling the variant levels of privacy protection set out in these
jurisdictions, while also promoting online commerce.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Social Networking
Sites and the Safe-Harbor Principles&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;The case of social networking sites exemplifies the ease
with which data is transferred, processed, and stored between
jurisdictions.&amp;nbsp; While many of the top social networking sites are
registered American entities, they continue to attract users not only from the
EU, but also internationally.&amp;nbsp; In accordance with EU law, many social
networking sites, including LinkedIn, Facebook, Myspace, and Bebo, now adhere
to the principles of the program.&amp;nbsp; The enforcement of the Safe Harbor
takes place in the United States in accordance with U.S. law and relies, to a
great degree, on enforcement by the private sector.&amp;nbsp; TRUSTe, an
independent certification program and dispute mechanism, has become the most
popular governance mechanism for the Safe Harbor program among social
networking sites.&lt;/p&gt;
&lt;p&gt;Drawing broadly on the principles embodied within the EU
Data Directive and the OECD Guidelines, the seven principles of the Safe Harbor
were developed: Notice, Choice, Onward Transfer, Access and Accuracy, Security,
Data Integrity, and Enforcement.&amp;nbsp; The principle of “Notice” sets out that
organizations must inform individuals about the purposes for which they collect
and use information about them, how to contact the organization with any
inquiries or complaints, the types of third parties to which they disclose the
information, and the choices and means the organization offers individuals for
limiting its use and disclosure.&lt;/p&gt;
&lt;p&gt;“Choice” ensures that individuals have the opportunity to
opt out of having their personal information disclosed to a third party, and
that information is not used for purposes incompatible with those for which it
was originally collected.&amp;nbsp; The “Onward Transfer” principle ensures that
third parties receiving information subscribe to the Safe Harbor principles,
are subject to the Directive, or enter into a written agreement requiring that
the third party provide at least the same level of privacy protection as is
required by the relevant principles.&lt;/p&gt;
&lt;p&gt;The principles of “Security” and “Data Integrity” seek to
ensure that reasonable precautions are taken to protect against the loss or
misuse of data, and that information is not used in a manner incompatible with
the purposes for which it has been collected, minimizing the risk that personal
information would be misused or abused.&amp;nbsp; Individuals are also granted the
right, through the “Access” principle, to view the personal information about
them that an organization holds, and to ensure that it is up to date and
accurate.&amp;nbsp; The “Enforcement” principle works to ensure that an effective
mechanism for assuring compliance with the principles exists, and that there
are consequences for the organization when the principles are not
followed.&lt;/p&gt;
&lt;p&gt;The principles of the program are quite clear and
enforceable in the first-party context, despite some prevailing
ambiguities.&amp;nbsp; The privacy policies of most social networking services
have become increasingly clear and straightforward since their
inception.&amp;nbsp; Facebook, for example, has revamped its &lt;a href="http://www.facebook.com/privacy/explanation.php"&gt;privacy
regime&lt;/a&gt; several times, and gives explicit notice to users of how their
information is being used.&amp;nbsp; The privacy policy also explains the
relationship between third parties and users’ personal information, including
how it may be used by advertisers, search engines, and fellow members.&lt;/p&gt;
&lt;p&gt;With respect to third-party advertisers, principles of
“choice” are clearly granted by most social networking services.&amp;nbsp; For
example, the &lt;a href="http://www.networkadvertising.org/"&gt;Network Advertising Initiative&lt;/a&gt;, a
self-regulatory initiative of the online advertising industry, clearly lists
its member websites and allows individuals to opt out of any targeted
advertising conducted by its members.&amp;nbsp; In Facebook’s description of
“cookies” in its privacy policy, a direct link to the NAI’s opt-out features is
given, allowing individuals to make somewhat informed choices about their
participation in such programs.&amp;nbsp; This is, of course, tempered by the fact
that most users do not read or understand the privacy policies provided by
social networking sites&lt;a name="_ednref5" href="#_edn5"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; [4].
It is also important to note that Google, a major player in the online
advertising business, does not grant users of Buzz and Orkut the same opt-out
options as sites such as Facebook and Bebo.&lt;/p&gt;
&lt;p&gt;Under the auspices of the US Federal Trade Commission, the
Safe Harbor Program has also successfully investigated and settled several
privacy-related breaches on social networking sites. Among the most famous cases is &lt;a href="http://www.beaconclasssettlement.com/"&gt;Lane et al. v. Facebook et al.&lt;/a&gt;,
a class action suit brought against Facebook’s Beacon advertising
program. The US Federal Trade Commission
was quick to initiate an investigation of the program after many privacy groups
and individuals became critical of its questionable advertising practices. The Beacon program was designed to allow
Facebook users to share information with their friends about actions taken on
affiliated third-party sites. This included,
for example, the movie rentals a user had made through the Blockbuster website.&lt;/p&gt;
&lt;p&gt;The plaintiffs filed suit, alleging that Facebook and its
affiliates did not give users adequate notice and choice about Beacon and the
collection and use of users’ personal information. The Beacon program was ultimately found to
be in breach of US law, including the &lt;a href="http://epic.org/privacy/vppa/"&gt;Video
Privacy Protection Act&lt;/a&gt;, which bans the disclosure of personally identifiable
rental information. Facebook announced a
settlement of the lawsuit that brought no individual payouts, but did mark
the end of the program and established a US$9.5 million &lt;a href="http://www.p2pnet.net/story/37119"&gt;Facebook Privacy Fund&lt;/a&gt; dedicated to
privacy and data-related issues. Other privacy-related
investigations of social networking sites launched by the FTC under the
Safe Harbor Program include Facebook’s &lt;a href="http://www.eff.org/deeplinks/2009/12/facebooks-new-privacy-changes-good-bad-and-ugly"&gt;privacy
changes&lt;/a&gt; in late 2009, and Google’s recently released &lt;a href="http://www.networkworld.com/news/2010/032910-lawmakers-ask-for-ftc-investigation.html"&gt;Buzz
application&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Despite the headway the Safe Harbor is making, many
privacy-related questions remain ambiguous with respect to the responsibilities of social networking
sites under the program. For example,
Bebo &lt;a href="http://www.bebo.com/Privacy2.jsp"&gt;reserves the right&lt;/a&gt; to
supplement a social profile with additional information collected from publicly
available sources and from other companies. Bebo does adhere to the “notice” principle,
as it makes known to users through its privacy
policy how their information will be used. However, it remains unclear whether Bebo gives the appropriate disclosures
required by the Safe Harbor Framework, notably because “publicly
available information” remains a broad and obscure concept in the privacy policy. It is also unclear whether Bebo users
are able, under the “choice” principle, to refuse to have their profiles
supplemented by other information sources. And under the “access”
principle, do individuals have the right to review all information held about them as “Bebo
users”? The right to review information
held by a social networking site is an important one that should be upheld, most notably because supplementary information
from outside social networking services is employed to profile individual users in ways that may
categorize them undesirably.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;The Third Party Problem&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;Cooperation between social networking sites and the Safe
Harbor has improved, and most of these sites now have privacy policies that
explicitly address the principles of the program. It should also be noted that public interest
groups, such as EPIC, the Center for Digital Democracy, and the Electronic
Frontier Foundation, have played a key role in ensuring that data privacy
breaches are brought to the attention of the FTC under the program. While the program has somewhat adequately
addressed the privacy practices of first-party participants, the number of
third parties on social networking sites calls into question the
comprehensiveness and effectiveness of the Safe Harbor program. Facebook itself, as a first-party site, may adhere
to the Safe Harbor Program. However, its
growing number of third-party platform members may not always adhere to best practices
in the field, nor can Facebook or the Safe Harbor Program guarantee that they
do so.&lt;/p&gt;
&lt;p&gt;The Safe Harbor Program does require that all participants
take certain security measures when transferring data to a third party. Third parties must either subscribe to the
safe harbor principles or be subject to the EU Data Protection Directive. Alternatively, an organization may also
enter into a written agreement with a third party requiring that it provide
at least the same level of privacy protection as is required by the program’s
principles. Therefore, third parties of
participating program sites are, de facto, bound by the safe harbor principles by
way of entering into an agreement with a first-party participant of the
program. This is the approach taken by
most social networking sites and their third parties.&lt;/p&gt;
&lt;p&gt;It is important to note, however, that third parties are not
governed directly by regulatory bodies such as the FTC. The safe harbor website also &lt;a href="http://www.export.gov/safeharbor/eu/eg_main_018476.asp"&gt;explicitly notes&lt;/a&gt;
that the program does not apply to third parties. Therefore, as per these provisions, Facebook must
adhere to the principles of the program, while its third-party platform members
(such as social gaming companies) must do so only indirectly, as per a separate
contract with Facebook. The
effectiveness of this indirect mode of governing third-party privacy
practices is questionable for numerous reasons.&lt;/p&gt;
&lt;p&gt;Firstly, while Facebook does take steps to ensure that
third parties use information from Facebook in a manner consistent with
the safe harbor principles, the company explicitly &lt;a href="http://www.facebook.com/policy.php"&gt;waives any guarantee&lt;/a&gt; that third
parties will “follow their rules”. Before allowing third parties to access any
information about users, Facebook requires them to &lt;a href="http://www.facebook.com/terms.php"&gt;agree to terms&lt;/a&gt; that limit their
use of information, and also uses technical measures to ensure that they only
obtain authorized information. Facebook
also warns users to “always review the policies of third party applications and
websites to make sure you are comfortable with the ways in which they use
information”. Not only are users
expected to read the privacy policies of every third-party application, they are
also expected to report applications that may be in violation of privacy
principles. In this sense, Facebook not
only waives responsibility for third-party privacy breaches, but also places further
regulatory onus upon the user.&lt;/p&gt;
&lt;p&gt;As the program guidelines express, the safe harbor relies to
a great degree on enforcement by the private sector. However, such a self-regulatory
framework may lead the industry into a state of regulatory malaise. Under the safe harbor program, Facebook must
ensure that the privacy practices of third parties are adequate. At the same time, however, the company may
simultaneously waive its responsibility for third-party compliance with safe
harbor principles. Therefore, it remains
questionable where responsibility for third parties exactly lies. When third parties are not directly
answerable to the governing bodies of the safe harbor program, and when first parties
can waive responsibility for their practices, where does the incentive to
effectively regulate third parties come from?&lt;/p&gt;
&lt;p&gt;While Facebook may in fact take reasonable legal and technical
measures to ensure third-party compliance, the room for potential dissonance
between speech and deed is worrisome. Facebook is required to ensure that third
parties provide “&lt;a href="http://www.export.gov/safeharbor/eu/eg_main_018476.asp"&gt;at least the same
level of privacy protection&lt;/a&gt;” as it does.
However, in practice, this has yet to become the case. A quick survey of twelve of the most popular
platform applications in the gaming category showed&lt;a name="_ednref6" href="#_edn6"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;/span&gt;&lt;/span&gt;&lt;/a&gt;
that third parties are not granting their users the “same level of privacy
protection” [5]. For example, section 9.2.3
of Facebook’s “&lt;a href="http://www.facebook.com/terms.php"&gt;Rights and
Responsibilities&lt;/a&gt;” for developers/operators of applications/sites states
that they must “have a privacy policy or otherwise make it clear to users what
user data you are going to use and how you will use, display, or share that
data”.&lt;/p&gt;
&lt;p&gt;However, of the twelve gaming applications surveyed, four
companies failed to make privacy policies available to users &lt;em&gt;before&lt;/em&gt; users granted the application
access to their personal information, including that of their friends&lt;a name="_ednref7" href="#_edn7"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; [6]. After searching for privacy policies on
the websites of each of these four social gaming companies, two were found to have
no privacy policies posted on their central websites at all. This practice is in direct breach of the
contract made between these companies and Facebook, as mentioned above. In addition to many applications failing to clearly
post privacy policies, many of the provisions set out in these policies were
questionable vis-à-vis safe harbor principles.&lt;/p&gt;
&lt;p&gt;For example, Zynga, maker of the popular games Mafia Wars and
FarmVille, reserves the right to “maintain copies of your content
indefinitely”. This practice is contrary
to the Safe Harbor principle that information should not be kept for
longer than required to run a service.
Electronic Arts maintains similar provisions for data retention in
its privacy policy. Such practices are
all the more worrisome in light of the fact that both companies also reserve the
right to collect information on users from other sources to supplement the profiles they
hold. These sources include (but are not limited
to) newspapers and Internet sources such as blogs, instant messaging services, and
other games. It is also notable that
only one of the twelve social gaming companies surveyed directly
participates in the safe harbor program.&lt;/p&gt;
&lt;p&gt;In addition to the difficulty of ensuring that safe harbor
principles are adhered to by third parties, the information asymmetries that
exist between first-party sites, citizens, and governance bodies vis-à-vis
third parties complicate this model. Foremost,
it is clear that Facebook, despite its resources, cannot keep tabs on the
practices of all of its applications.
This calls into question whether industry self-regulation can really guarantee
that privacy is respected by third parties in this context. Furthermore, the lack of knowledge or
understanding held by citizens about how third parties use their information
is particularly problematic when a system relies so heavily on users to report
suspected privacy breaches. The same is
likely to be true for governments, too. As
one legal scholar, promoting a more laissez-faire approach to third-party
regulation, notes, multiple and invisible third-party relationships present
challenges to traditional forms of legal regulation&lt;a name="_ednref8" href="#_edn8"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; [7].&lt;/p&gt;
&lt;p&gt;In an “open” social ecosystem, the sheer volume of data
flowing between users of social networking sites and third-party players appears
to have become increasingly difficult to regulate effectively. While the safe harbor program has been
successful in establishing best practices and minimum standards for data
privacy, it is also clear that governance bodies and public interest groups
have focused most of their attention on large industry players such as Facebook. This has left smaller third-party players on
social networking sites in the shadows of any substantive regulatory concern. If
one thing has become clear, it is that governments may no longer be
able to effectively govern the flows of data in the burgeoning context of “open
data”.&lt;/p&gt;
&lt;p&gt;As I have demonstrated, it remains questionable whether
Facebook can regulate third parties’ data collection practices
effectively. Imposing more stringent
responsibilities on safe harbor participants could be a positive step. It may well be
undue to impose liability on social networking sites for the data breaches of
third parties. However, it is not
unreasonable to require that sites like Facebook go beyond setting “minimum
standards” for data privacy, towards taking a more active enforcement role, even if
through TRUSTe or another regulatory body.
If the safe harbor is to be effective, it cannot allow program participants
to simply waive liability for third-party privacy practices. The indemnity granted to third parties on social
networking sites may render the safe harbor program better at sustaining
the non-liability of third parties than at protecting the data privacy of
citizens.&lt;/p&gt;
&lt;div&gt;
&lt;hr align="left" size="1" width="33%" /&gt;

&lt;/div&gt;
&lt;p class="discreet"&gt;&lt;a name="_edn1" href="#_ednref1"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;/span&gt;&lt;/span&gt;&lt;/a&gt;[1] Official Directive 95/46/EC&lt;/p&gt;
&lt;p class="discreet"&gt;&lt;a name="_edn2" href="#_ednref2"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;/span&gt;&lt;/span&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p class="discreet"&gt;&lt;a name="_edn3" href="#_ednref3"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;/span&gt;&lt;/span&gt;&lt;/a&gt;[2] 95/46/EC&lt;/p&gt;
&lt;p class="discreet"&gt;[3] Ibid&lt;/p&gt;
&lt;p class="discreet"&gt;&lt;a name="_edn4" href="#_ednref4"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;/span&gt;&lt;/span&gt;&lt;/a&gt;&lt;a name="_edn5" href="#_ednref5"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;/span&gt;&lt;/a&gt;[4] See Acquisit,
A. a. (n.d.). Imagined Communities: Awareness, Information Sharing, and Privacy
on Facebook. &lt;em&gt;PET 2006&lt;/em&gt;&lt;/p&gt;
&lt;p class="discreet"&gt;&lt;a name="_edn6" href="#_ednref6"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;/span&gt;&lt;/span&gt;&lt;/a&gt;[5] Of the Privacy Policy browsed include, Zynga, Rock
You!, Crowdstar, Mind Jolt, Electronic Arts, Pop Cap Games, Slash Key, Playdom,
Meteor Games, Broken Bulb Studios, Wooga, and American Global Network.&lt;/p&gt;
&lt;p class="discreet"&gt;&lt;a name="_edn7" href="#_ednref7"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;/span&gt;&lt;/span&gt;&lt;/a&gt;[6] By adding an application, users are also sharing with
third parties the information of their friends if they do not specifically &amp;nbsp;opt out of this practice.&lt;/p&gt;
&lt;p class="discreet"&gt;[7]See&lt;strong&gt;
&lt;/strong&gt;&amp;nbsp;Milina, S. (2003).
Let the Market Do its Job: Advocating an Integrated Laissez-Faire Approach to
Online Profiling. &lt;em&gt;Cardozo Arts and Entertainment Law Journal&lt;/em&gt; .&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/does-the-safe-harbor-program-adequately-address-third-parties-online'&gt;https://cis-india.org/internet-governance/blog/does-the-safe-harbor-program-adequately-address-third-parties-online&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>rebecca</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Facebook</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Social Networking</dc:subject>
    

   <dc:date>2011-08-02T07:19:34Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/news/harvard-university-may-13-2014-does-size-matter">
    <title>Does Size Matter? A Tale of Performing Welfare, Producing Bodies and Faking Identity</title>
    <link>https://cis-india.org/news/harvard-university-may-13-2014-does-size-matter</link>
    <description>
        &lt;b&gt;Malavika Jayaram gave a talk.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;&lt;a class="external-link" href="http://cyber.law.harvard.edu/events/luncheon/2014/05/jayaram"&gt;This was published by the website of Berkman Center for Internet and Society&lt;/a&gt;&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;Big Data doesn’t get much bigger than India’s identity project. The world’s largest biometric database - currently consisting of almost 600 million enrolled - seduces with promises of inclusion, legitimacy and visibility. By locating this techno-utopian vision within the larger surveillance state that a unique identifier facilitates, Malavika will describe the ‘welfare industrial complex’ that imagines the poor as the next emerging market. She will highlight the risks of the body as password, of implementing e-governance in a legal vacuum, and of digitization reinforcing existing inequalities. The export of technologies of control - once they have been tested on a massive population that has little agency and limited ability to withhold consent - transforms this project from a site of local activism to one with global repercussions. By offering a perspective that is somewhat different from the traditional western focus of privacy, she hopes to generate a more inclusive discourse about what it means to be autonomous and empowered in the face of paternalistic development projects. She will highlight, in particular, the varied ways in which the project is already being subverted and re-purposed, in ways that are humorous and poignant.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;About Malavika&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Malavika is a Fellow at the Berkman Center for Internet and Society at  Harvard University, focusing on privacy, identity and free expression.  She is also a Fellow at the Centre for Internet and Society, Bangalore,  and the author of the India chapter for the Data Protection &amp;amp;  Privacy volume in the Getting the Deal Done series. Malavika is one of  10 Indian lawyers in The International Who's Who of Internet e-Commerce  &amp;amp; Data Protection Lawyers directory. In August 2013, she was voted  one of India’s leading lawyers - one of only 8 women to be featured in  the “40 under 45” survey conducted by Law Business Research, London. In a  different life, she spent 8 years in London, practicing law with global  firm Allen &amp;amp; Overy in the Communications, Media &amp;amp; Technology  group, and as VP and Technology Counsel at Citigroup. She is working on a  PhD about the development of a privacy jurisprudence and discourse in  India, viewed partly through the lens of the Indian biometric ID  project.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Podcast&lt;/h3&gt;
&lt;p&gt;Watch the podcast &lt;a class="external-link" href="http://castroller.com/podcasts/BerkmanCenterFor/4060529"&gt;at this link&lt;/a&gt;.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/news/harvard-university-may-13-2014-does-size-matter'&gt;https://cis-india.org/news/harvard-university-may-13-2014-does-size-matter&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>praskrishna</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2014-06-04T09:45:49Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/do-we-really-need-an-app-for-that-examining-the-utility-and-privacy-implications-of-india2019s-digital-vaccine-certificates">
    <title>Do We Really Need an App for That? Examining the Utility and Privacy Implications of India’s Digital Vaccine Certificates</title>
    <link>https://cis-india.org/internet-governance/blog/do-we-really-need-an-app-for-that-examining-the-utility-and-privacy-implications-of-india2019s-digital-vaccine-certificates</link>
    <description>
        &lt;b&gt;We examine the purported benefits of digital vaccine certificates over regular paper-based ones and analyse the privacy implications of their use.&lt;/b&gt;
        
&lt;p&gt;&lt;em&gt;This blogpost was edited by Gurshabad Grover, Yesha Tshering Paul, and Amber Sinha.&lt;br /&gt;It was originally published on &lt;a href="https://digitalid.design/vaccine-certificates.html"&gt;Digital Identities: Design and Uses&lt;/a&gt; and is cross-posted here.&lt;br /&gt;&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;In an experiment to streamline its COVID-19 immunisation drive, India has adopted a centralised vaccine administration system called CoWIN (or COVID Vaccine Intelligence Network). In addition to facilitating registration for both online and walk-in vaccine appointments, the system also allows for the &lt;a href="https://verify.cowin.gov.in/" target="_blank"&gt;digital verification&lt;/a&gt; of vaccine certificates, which it issues to people who have received a dose. This development aligns with a global trend, as many countries have adopted or are in the process of adopting “vaccine passports” to facilitate safe movement of people while resuming commercial activity.
    &lt;br /&gt;&lt;br /&gt;Some places, such as the &lt;a href="https://www.schengenvisainfo.com/news/all-your-questions-on-eus-covid-19-vaccine-certificate-answered/" target="_blank"&gt;EU&lt;/a&gt;, have constrained the scope of use of their vaccine certificates to international travel. The Indian government, however, has so far &lt;a href="https://www.livemint.com/opinion/columns/vaccination-certificates-need-a-framework-to-govern-their-use-11618160385602.html" target="_blank"&gt;skirted&lt;/a&gt; important questions around where and when this technology should be used. By allowing &lt;a href="https://verify.cowin.gov.in/" target="_blank"&gt;anyone&lt;/a&gt; to use the online CoWIN portal to scan and verify certificates, and even providing a way for the private-sector to incorporate this functionality into their applications, the government has opened up the possibility of these digital certificates being used, and even mandated, for domestic everyday use such as going to a grocery shop, a crowded venue, or a workplace.
    &lt;br /&gt;&lt;br /&gt;In this blog post, we examine the purported benefits of digital vaccine certificates over regular paper-based ones, analyse the privacy implications of their use, and present recommendations to make them more privacy respecting. We hope that such an analysis can help inform policy on appropriate use of this technology and improve its privacy properties in cases where its use is warranted.
    &lt;br /&gt;&lt;br /&gt;We also note that while this post only examines the merits of a technological solution put out by the government, it is more important to &lt;a href="https://www.accessnow.org/cms/assets/uploads/2021/04/Covid-Vaccine-Passports-Threaten-Human-Rights.pdf" target="_blank"&gt;consider&lt;/a&gt; the effects that placing restrictions on the movement of unvaccinated people has on their civil liberties in the face of a vaccine rollout that is inequitable along many lines, including &lt;a href="https://thewire.in/gender/women-falling-behind-in-indias-covid-19-vaccination-drive" target="_blank"&gt;gender&lt;/a&gt;, &lt;a href="https://www.thehindu.com/sci-tech/science/will-25-covid-19-vaccines-for-private-hospitals-aggravate-inequity/article34799098.ece" target="_blank"&gt;caste-class&lt;/a&gt;, and &lt;a href="https://scroll.in/article/994871/tech-savvy-indians-drive-to-villages-for-covid-19-vaccinations-those-without-smartphones-lose-out" target="_blank"&gt;access to technology&lt;/a&gt;.&lt;/p&gt;
&lt;h4&gt;How do digital vaccine certificates work?&lt;/h4&gt;
&lt;p&gt;Every vaccine recipient in the country is required to be registered on the CoWIN platform using one of &lt;a href="https://www.cowin.gov.in/faq" target="_blank"&gt;seven&lt;/a&gt; existing identity documents. [1] &lt;a name="ref1"&gt;&lt;/a&gt; Once a vaccine is administered, CoWIN generates a vaccine certificate which the recipient can access on the CoWIN website. The certificate is a single page document that contains the recipient’s personal information — their name, age, gender, identity document details, unique health ID, a reference ID — and some details about the vaccine given.&lt;a name="ref2"&gt;&lt;/a&gt; [2] It also includes a “secure QR code” and a link to CoWIN’s verification &lt;a href="https://verify.cowin.gov.in/" target="_blank"&gt;portal&lt;/a&gt;.
  &lt;br /&gt;&lt;br /&gt;The verification portal allows for the verification of a certificate by scanning the attached QR code. Upon completion, the portal displays a success message along with some of the information printed on the certificate.
  &lt;br /&gt;&lt;br /&gt;Verification is done using a cryptographic mechanism known as &lt;a href="https://en.wikipedia.org/wiki/Digital_signature" target="_blank"&gt;digital signatures&lt;/a&gt;, which are encoded into the QR code attached to a vaccine certificate. This mechanism allows “offline verification”, which means that the CoWIN verification portal or any private sector app attempting to verify a certificate does not need to contact the CoWIN servers to establish its authenticity. It instead uses a “public key” issued by CoWIN beforehand to verify the digital signature attached to the certificate.
  &lt;br /&gt;&lt;br /&gt;The benefit of this convoluted design is that it protects user privacy. Performing verification offline and not contacting the CoWIN servers precludes CoWIN from gleaning sensitive metadata about usage of the vaccine certificate. This means that CoWIN does not learn about where and when an individual uses their vaccine certificate, and who is verifying it. This closes off a potential avenue for mass surveillance. [3] However, given how certificate revocation checks are being implemented (detailed in the privacy implications section below), CoWIN ends up learning this information anyway.&lt;/p&gt;
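&lt;p&gt;The offline-verification property described above can be illustrated with a toy textbook-RSA sketch in Python. This is only an illustration of the general mechanism, not CoWIN’s actual scheme: the key sizes, payload fields, and function names here are made up, and a real deployment would use 2048-bit-plus keys with proper padding via a vetted cryptography library.&lt;/p&gt;

```python
import hashlib

# Toy textbook-RSA keypair, for illustration only.
P, Q = 61, 53
N = P * Q        # public modulus (3233)
E = 17           # public exponent, distributed to verifiers in advance
D = 2753         # private exponent, held only by the issuer

def digest(payload: bytes) -> int:
    # Hash the certificate payload and reduce it into the RSA modulus.
    return int.from_bytes(hashlib.sha256(payload).digest(), "big") % N

def issue(payload: bytes) -> int:
    # Issuer signs with its private key when the certificate is generated.
    return pow(digest(payload), D, N)

def verify_offline(payload: bytes, signature: int) -> bool:
    # Verifier needs only the pre-shared public key (E, N): there is no
    # network round trip to the issuer, so no usage metadata leaks per scan.
    return pow(signature, E, N) == digest(payload)

cert = b'{"name":"A. Sharma","dose":2,"vaccine":"Covishield"}'
sig = issue(cert)
assert verify_offline(cert, sig)
```

Because verification is a purely local computation, the issuer never observes when, where, or by whom a certificate is checked, which is exactly the privacy property the design aims for.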
&lt;h4&gt;Where is digital verification useful?&lt;/h4&gt;
&lt;p&gt;The primary argument for the adoption of digital verification of vaccine certificates over visual examination of regular paper-based ones is security. In the face of vaccine hesitancy, there are concerns that people may forge vaccine certificates to get around any restrictions that may be put in place on the movement of unvaccinated people. The use of digital signatures serves to allay these fears.
&lt;br /&gt;&lt;br /&gt;In its current form, however, digital verification of vaccine certificates is no more secure than visually inspecting paper-based ones. While the “secure QR code” attached to digital certificates can be used to verify the authenticity of the certificate itself, the CoWIN verification portal does not provide any mechanism nor does it instruct verifiers to authenticate the identity of the person presenting the certificate. This means that unless an accompanying identity document is also checked, an individual can simply present someone else’s certificate.
&lt;br /&gt;&lt;br /&gt;There are no simple solutions to this limitation; adding a requirement to inspect identity documents in addition to digital verification of the vaccine certificate would not be a strong enough security measure to prevent the use of duplicate vaccine certificates. People who are motivated enough to forge a vaccine certificate can also duplicate one of the seven ID documents which can be used to register on CoWIN, some of which are simple paper-based documents. [4] Requiring even stronger identity checks, such as the use of Aadhaar-based biometrics, would make digital verification of vaccine certificates more secure. However, this would be a wildly disproportionate incursion on user privacy — allowing for the mass collection of metadata like when and where a certificate is used — something that digital vaccine certificates were explicitly designed to prevent. Additionally, in Russia, people were &lt;a href="https://www.washingtonpost.com/world/europe/moscow-fake-vaccine-coronavirus/2021/06/26/0881e1e4-cf98-11eb-a224-bd59bd22197c_story.html" target="_blank"&gt;found&lt;/a&gt; issuing fake certificates by discarding real vaccine doses instead of administering them. No technological solution can prevent such fraud.
&lt;br /&gt;&lt;br /&gt;As such, the utility of digital certificates is limited to uses such as international travel, where border control agencies already have strong identity checks in place for travellers. Any everyday usage of the digital verification functionality on vaccine certificates would not present any benefit over visually examining a piece of paper or a screen.&lt;/p&gt;
&lt;h4&gt;Privacy implications of digital certificates&lt;/h4&gt;
&lt;p&gt;In addition to providing little security utility over manual inspection of certificates, digital certificates also present privacy issues; these are listed below along with recommendations to mitigate them:
&lt;br /&gt;&lt;br /&gt;&lt;em&gt;(i) The verification portal leaks sensitive metadata to CoWIN’s servers:&lt;/em&gt; An analysis of network requests made by the CoWin verification portal reveals that it conducts a ‘revocation check’ each time a certificate is verified. This check was also found in the source &lt;a href="https://github.com/egovernments/DIVOC/blob/e667697b47a50a552b8d0a8c89a950180217b945/interfaces/vaccination-api.yaml#L385" target="_blank"&gt;code&lt;/a&gt;, which is made openly available&lt;a name="ref5"&gt;&lt;/a&gt;.
[5]&lt;/p&gt;
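&lt;p&gt;For illustration, a verifier could avoid this per-scan network call by downloading the issuer’s revocation list in bulk on a schedule and checking certificates against a local copy. A minimal Python sketch follows; the class, methods, and certificate-ID format are hypothetical, not CoWIN’s API:&lt;/p&gt;

```python
import time

class LocalRevocationList:
    """Verifier-side cache of revoked certificate IDs (hypothetical).

    The list is fetched in bulk on a schedule, so each scan is a pure
    local set lookup and leaks no metadata about when, where, or by
    whom a particular certificate is being verified."""

    def __init__(self, refresh_seconds: int = 3600):
        self.revoked = set()
        self.refresh_seconds = refresh_seconds
        self.last_sync = 0.0

    def sync(self, fetched_ids) -> None:
        # In practice fetched_ids would come from one bulk download of
        # the issuer's revocation list, not a per-certificate query.
        self.revoked = set(fetched_ids)
        self.last_sync = time.time()

    def needs_sync(self) -> bool:
        # Staleness check drives the periodic bulk refresh.
        return time.time() - self.last_sync > self.refresh_seconds

    def is_revoked(self, certificate_id: str) -> bool:
        return certificate_id in self.revoked   # no network call here

crl = LocalRevocationList()
crl.sync(["CERT-001", "CERT-042"])
assert crl.is_revoked("CERT-042")
assert not crl.is_revoked("CERT-777")
```

The tradeoff is bandwidth and storage on the verifier’s side, and a revocation only takes effect at the next sync rather than immediately.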
&lt;p&gt;Revocation checks are an important security consideration while using digital signatures. They allow the issuing authority (CoWIN, in this case) to revoke a certificate in case the account associated with it is lost or stolen, or if a certificate requires correction. However, the way they have been implemented here presents a significant privacy issue. Sending certificate details to the server on every verification attempt allows it to learn about where and when an individual is using their vaccine certificate.
&lt;br /&gt;&lt;br /&gt;We note that the revocation check performed by the CoWIN portal does not necessarily mean that it is storing this information. Nevertheless, sending certificate information to the server directly contradicts claims of an “offline verification” process, which is the basis of the design of these digital certificates.
&lt;br /&gt;&lt;br /&gt;&lt;strong&gt;Recommendations:&lt;/strong&gt; Implementing privacy-respecting revocation checks such as Certificate Revocation Lists, [6] or Range Queries [7] would mitigate this issue. However, these solutions are either complex or present bandwidth and storage tradeoffs for the verifier.
&lt;br /&gt;&lt;br /&gt;&lt;em&gt;(ii) Oversharing of personally identifiable information:&lt;/em&gt; CoWIN’s vaccine certificates include more personally identifiable information (name, age, gender, identity document details and unique health ID) than is required for the purpose of verifying the certificate. An examination of the vaccine certificates available to us revealed that while the Aadhaar number is appropriately masked, other personal identifiers such as passport number and unique health ID were not. Additionally, the inclusion of demographic details such as age and gender provides only a marginal security benefit (it narrows the pool of duplicate certificates that could be used) and is not required in light of the security analysis above.
&lt;br /&gt;&lt;br /&gt;&lt;strong&gt;Recommendation:&lt;/strong&gt; Personal identifiers (such as passport number and unique health ID) should be appropriately masked and demographic details (age, gender) can be removed.
&lt;br /&gt;&lt;br /&gt;The minimal set of data required for identity-linked digital verification, as described above, is a full name and masked ID document details. All other personally identifying information can be removed. In the case of paper-based certificates, which are suggested for domestic usage, details about vaccine validity alone would suffice; no personal information is required.
&lt;br /&gt;&lt;br /&gt;&lt;em&gt;(iii) Making information available digitally increases the likelihood of collection:&lt;/em&gt; All of the personal information printed on the certificate is also encoded into the QR code. This is &lt;a href="https://www.bbc.com/news/uk-scotland-57208607" target="_blank"&gt;necessary&lt;/a&gt; because the digital signature verification process also verifies the integrity of this information (i.e., that it was not modified). A side effect is that the personal information is made readily available in digital form to verifiers when the code is scanned, making it easy for them to store. This is especially likely in private-sector apps, whose operators may be interested in collecting demographic information and personal identifiers to track customer behaviour.
&lt;br /&gt;&lt;br /&gt;&lt;strong&gt;Recommendation:&lt;/strong&gt; Removing extraneous information from the certificate, as suggested above, mitigates this risk as well.&lt;/p&gt;
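The masking recommended above can be sketched in a few lines. The following is an illustrative function, not CoWIN's actual implementation, that hides all but the last few characters of an identifier, the way the Aadhaar number already appears on certificates:

```python
# Illustrative sketch only (not CoWIN's actual implementation): mask a personal
# identifier so that only its last few characters remain visible.
def mask_identifier(value: str, visible: int = 4, mask_char: str = "X") -> str:
    """Replace all but the last `visible` characters with `mask_char`."""
    if len(value) <= visible:
        return value  # too short to mask meaningfully
    return mask_char * (len(value) - visible) + value[-visible:]

print(mask_identifier("123456789012"))  # Aadhaar-style number -> XXXXXXXX9012
print(mask_identifier("Z1234567"))      # passport-style number -> XXXX4567
```

Applying such a function to all personal identifiers before they are printed or encoded into the QR code would leave enough for a holder to recognise their own document, without exposing the full identifier to verifiers.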
&lt;h4&gt;Conclusion&lt;/h4&gt;
&lt;p&gt;Our analysis reveals that without incorporating strong, privacy-invasive identity checks, digital verification of vaccine certificates does not provide any security benefit over manually inspecting a piece of paper. The utility of digital verification is limited to purposes that already conduct strong identity checks.
&lt;br /&gt;&lt;br /&gt;In addition to their limited applicability, in their current form, these digital certificates also generate a trail of data and metadata, giving both government and industry an opportunity to infringe upon the privacy of the individuals using them.
&lt;br /&gt;&lt;br /&gt;Keeping this in mind, the adoption of this technology should be discouraged for everyday use.&lt;/p&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;h4&gt;References&lt;/h4&gt;
&lt;p&gt;[1] Exceptions &lt;a href="https://web.archive.org/web/20210511045921/https://www.mohfw.gov.in/pdf/SOPforCOVID19VaccinationofPersonswithoutPrescribedIdentityCards.pdf" target="_blank"&gt;exist&lt;/a&gt; for people without state-issued identity documents.&lt;/p&gt;
&lt;p&gt;[2] This information was gathered by inspecting three vaccine certificates linked to the author’s CoWIN account, which they were authorised to view, and may not be fully accurate.&lt;/p&gt;
&lt;p&gt;[3] This design is similar to Aadhaar’s “&lt;a href="https://resident.uidai.gov.in/offline-kyc" target="_blank"&gt;offline KYC&lt;/a&gt;” process.&lt;/p&gt;
&lt;p&gt;[4] “Aadhaar Card: UIDAI says downloaded versions on ordinary paper, mAadhaar perfectly valid”, &lt;em&gt;Zee Business&lt;/em&gt;, April 29 2019, &lt;em&gt;https://www.zeebiz.com/india/news-aadhaar-card-uidai-says-downloaded-versions-on-ordinary-paper-maadhaar-perfectly-valid-96790&lt;/em&gt;.&lt;/p&gt;
&lt;p&gt;[5] This check was also verified to be present in the reference &lt;a href="https://github.com/egovernments/DIVOC/blob/261a61093b89990fe34698f9ba17367d4cb74c34/public_app/src/components/CertificateStatus/index.js#L125" target="_blank"&gt;code&lt;/a&gt; made available for private-sector applications incorporating this functionality, suggesting that private sector apps will also be affected by this.&lt;/p&gt;
&lt;p&gt;[6] &lt;a href="https://en.wikipedia.org/wiki/Certificate_revocation_list" target="_blank"&gt;Certificate Revocation Lists&lt;/a&gt; allow the server to provide a list of revoked certificates to the verifier, instead of the verifier querying the server each time. This, however, can place heavy bandwidth and storage requirements on the verifying app as this list can potentially grow long.&lt;/p&gt;
&lt;p&gt;[7] Range Queries are described in this &lt;a href="https://www.ics.uci.edu/~gts/paps/st06.pdf" target="_blank"&gt;paper&lt;/a&gt;. In this method, the verifier requests revocation status from the server by specifying a range of certificate identifiers within which the certificate being verified lies. If there are any revoked certificates within this range, the server sends their identifiers to the verifier, who can then check whether the certificate in question is on the list. For this to work, the range selected must be large enough to include enough potential candidates to keep the server from guessing which one is in use.&lt;/p&gt;
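As a rough sketch of the range-query idea described in [7], with made-up certificate IDs and simple functions standing in for the server:

```python
# Sketch of a range-query revocation check. The REVOKED set and the server
# function are hypothetical stand-ins, not CoWIN's infrastructure.
REVOKED = {102, 345, 901}  # hypothetical set of revoked certificate IDs

def server_revoked_in_range(lo: int, hi: int) -> list:
    """Server side: return every revoked ID that falls within [lo, hi]."""
    return sorted(cid for cid in REVOKED if lo <= cid <= hi)

def verifier_is_revoked(cert_id: int, range_width: int = 1000) -> bool:
    """Verifier side: ask about a wide range so the server cannot tell
    which certificate within that range is actually being checked."""
    lo = (cert_id // range_width) * range_width
    revoked_in_range = server_revoked_in_range(lo, lo + range_width - 1)
    return cert_id in revoked_in_range  # membership test happens locally

print(verifier_is_revoked(345))  # True: 345 is on the revocation list
print(verifier_is_revoked(346))  # False
```

The server only learns that the verifier is interested in some certificate within a block of a thousand IDs; the exact certificate, and hence the individual's movements, stays hidden.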

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/do-we-really-need-an-app-for-that-examining-the-utility-and-privacy-implications-of-india2019s-digital-vaccine-certificates'&gt;https://cis-india.org/internet-governance/blog/do-we-really-need-an-app-for-that-examining-the-utility-and-privacy-implications-of-india2019s-digital-vaccine-certificates&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>divyank</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Digital ID</dc:subject>
    
    
        <dc:subject>Covid19</dc:subject>
    
    
        <dc:subject>Appropriate Use of Digital ID</dc:subject>
    

   <dc:date>2021-08-03T05:13:28Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/do-we-need-the-aadhar-scheme">
    <title>Do we need the Aadhar scheme?</title>
    <link>https://cis-india.org/internet-governance/do-we-need-the-aadhar-scheme</link>
    <description>
        &lt;b&gt;"Decentralisation and privacy are preconditions for security. Digital signatures don’t require centralised storage and are much more resilient in terms of security", Sunil Abraham in the Business Standard on 1 February 2012.&lt;/b&gt;
        
&lt;p&gt;We don’t need Aadhar because we already have a much more robust identity management and authentication system based on digital signatures that has a proven track record of working at a “billions-of-users” scale on the internet with reasonable security. The Unique Identification (UID) project based on the so-called “infallibility of biometrics” is deeply flawed in design. These design disasters waiting to happen cannot be permanently thwarted by band-aid policies.&lt;/p&gt;
&lt;p&gt;Biometrics are poor authentication factors because, unlike digital signatures, they cannot be re-secured once compromised. Additionally, an individual’s biometrics can be harvested remotely without his or her conscious cooperation; the iris, for example, can be captured without a person’s knowledge using a high-resolution digital camera.&lt;/p&gt;
&lt;p&gt;Biometrics are poor identification factors in a country where the registrars have commercial motivation to create ghost identities: bank managers, for example, trying to achieve deposit targets by opening benami accounts. Biometrics for these ghost identities can be imported from other countries or generated endlessly using image-processing software. The de-duplication engine at the Unique Identification Authority of India (UIDAI) will be fooled into thinking that these are unique residents.&lt;/p&gt;
&lt;p&gt;An authentication system does not require a centralised database of authentication factors and transaction details. This is like arguing that the global system of e-commerce needs a centralised database of passwords and logs or, to use an example from the real world, to secure New Delhi, all citizens must deposit duplicate keys to their private property with the police.&lt;/p&gt;
&lt;p&gt;Decentralisation and privacy are preconditions for security. The “end-to-end principle” used to design internet security is also in compliance with Gandhian principles of Panchayat Raj. Digital signatures don’t require centralised storage of private keys and are, therefore, much more resilient in terms of security.&lt;/p&gt;
&lt;p&gt;Biometrics as authentication factors require the government to store the biometrics of all citizens, but citizens are not allowed to store the biometrics of politicians and bureaucrats. The state authenticates the citizen, but the citizen cannot conversely authenticate the state. Digital signatures as an authentication factor, on the other hand, do not require this asymmetry, since citizens can store the public keys of state actors and authenticate them. The equitable power relationship thus established allows both parties to store a legally non-repudiable audit trail for critical transactions like the delivery of welfare services. Biometrics exacerbate the existing power asymmetry between citizens and state, unlike digital signatures, which are a peer-authentication technology.&lt;/p&gt;
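The symmetric verification described above can be illustrated with a toy, deliberately insecure textbook RSA signature in Python. Real systems use vetted libraries and much larger keys, but the role asymmetry is the same: whoever holds the private exponent signs, and anyone holding the public key can verify.

```python
import hashlib

# Toy textbook RSA signature with tiny fixed primes: insecure, for
# illustration only. A citizen can authenticate a state-signed record
# just as the state authenticates the citizen.
p, q = 61, 53
n = p * q   # modulus (3233)
e = 17      # public exponent, published
d = 2753    # private exponent, kept secret (e*d % lcm(p-1, q-1) == 1)

def sign(message: bytes) -> int:
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(h, d, n)  # only the private-key holder can compute this

def verify(message: bytes, signature: int) -> bool:
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(signature, e, n) == h  # anyone with (e, n) can check

record = b"ration delivered to household 42"  # hypothetical welfare record
sig = sign(record)
print(verify(record, sig))            # True: signature matches the record
print(verify(record, (sig + 1) % n))  # False: a corrupted signature fails
```

Nothing in the verification step requires the citizen to surrender anything to a central database; holding the published pair (e, n) is enough to audit each step of a resource's movement.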
&lt;p&gt;Privacy protections should be inversely proportional to power. The transparency demanded of politicians, bureaucrats and large corporations cannot be made mandatory for ordinary citizens. Surveillance must be directed at big-ticket corruption at the top of the pyramid, not retail fraud at the bottom. Even for retail fraud, the power asymmetry will result in corruption innovating to circumvent technical safeguards. Government officials should be required by law to digitally sign the movement of resources each step of the way till it reaches a citizen. Open data initiatives should make such records available for public scrutiny. With support from civil society and the media, citizens will themselves address retail fraud. To solve corruption, the state should become more transparent to the citizen, and not vice versa.&lt;/p&gt;
&lt;p&gt;UIDAI’s latest 23-page biometrics report is supposed to dispel the home ministry’s security anxieties. It says “biometric data is collected by software provided by the UIDAI, which immediately encrypts and applies a digital signature.” Surely, what works for UIDAI, that is digital signatures, should work for citizens too. The report does not cover even the most basic attack — for example, the registrar could pretend that UIDAI software is faulty and harvest biometrics again using a parallel set-up. If biometrics are infallible, as the report proclaims, then sections in the draft UID Bill that criminalise attempts to defraud the system should be deleted.&lt;/p&gt;
&lt;p&gt;The compromise between UIDAI and the home ministry appears to be a turf battle for states where security concerns trump developmental aspirations. This compromise does nothing to address the issues raised by the Parliamentary Standing Committee on Finance, headed by the Bharatiya Janata Party’s Yashwant Sinha.&lt;/p&gt;
&lt;p&gt;Read the &lt;a class="external-link" href="http://www.business-standard.com/india/news/do-we-needaadhar-scheme/463324/"&gt;original published in the Business Standard&lt;/a&gt; on 1 February 2012&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/do-we-need-the-aadhar-scheme'&gt;https://cis-india.org/internet-governance/do-we-need-the-aadhar-scheme&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>sunil</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2012-02-03T10:11:24Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/dna-research">
    <title>DNA Research</title>
    <link>https://cis-india.org/internet-governance/blog/dna-research</link>
    <description>
&lt;b&gt;In 2006, the Department of Biotechnology drafted the Human DNA Profiling Bill. In 2012, a revised Bill was released and a group of experts was constituted to finalize the Bill. In 2014, another version was released, the approval of which is pending before the Parliament. This legislation will allow the government of India to create a National DNA Data Bank and a DNA Profiling Board for the purposes of forensic research and analysis. Here is a collection of our research on privacy and security concerns related to the Bill.&lt;/b&gt;
        
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;p&gt;The Centre for Internet and Society, India has been researching privacy in India since the year 2010, with special focus on the following issues related to the DNA Bill:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;Validity and legality of collection, usage and storage of DNA samples and information derived from the same.&lt;/li&gt;
&lt;li&gt;Monitoring projects and policies around Human DNA Profiling.&lt;/li&gt;
&lt;li&gt;Raising public awareness around issues concerning biometrics.&lt;/li&gt;&lt;/ol&gt;
&lt;p style="text-align: justify;"&gt;In 2006, the Department of Biotechnology drafted the Human DNA Profiling Bill. In 2012 a revised Bill was released and a group of Experts was constituted to finalize the Bill. In 2014, another version was released, the approval of which is pending before the Parliament.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;The Bill seeks to establish DNA Databases at the state and regional level and a national level database. The databases would store DNA profiles of suspects, offenders, missing persons, and deceased persons. The database could be used by courts, law enforcement (national and international) agencies, and other authorized persons for criminal and civil purposes. The Bill will also regulate DNA laboratories collecting DNA samples. Lack of adequate consent, the broad powers of the board, and the deletion of innocent persons profiles are just a few of the concerns voiced about the Bill.&lt;/p&gt;
&lt;img src="https://github.com/cis-india/website/raw/master/img/CIS_DNA-Profiling-Bill_Web.jpg" alt="DNA Profiling Bill - Infographic" /&gt;
&lt;h6&gt;&lt;a href="https://github.com/cis-india/website/raw/master/img/CIS_DNA-Profiling-Bill_Web.jpg" target="_blank"&gt;Download the infographic.&lt;/a&gt; Credit: &lt;a href="https://twitter.com/Scott_Mason88" target="_blank"&gt;Scott Mason&lt;/a&gt; and CIS team.&lt;/h6&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;h2&gt;1. DNA Bill&lt;/h2&gt;
&lt;p style="text-align: justify;"&gt;The Human DNA Profiling bill is a legislation that will allow the government of India to Create a National DNA Data Bank and a DNA Profiling Board for the 	purposes of forensic research and analysis. There have been many concerns raised about the infringement of privacy and the power that the government will 	have with such information raised by Human Rights Groups, individuals and NGOs. The bill proposes to profile people through their fingerprints and retinal 	scans which allow the government to create different unique profiles for individuals. Some of the concerns raised include the loss of privacy by such 	profiling and the manner in which they are conducted. Unless strictly controlled, monitored and protected, such a database of the citizens' fingerprints 	and retinal scans could lead to huge blowbacks in the form of security risks and privacy invasions. The following articles elaborate upon these matters.&lt;/p&gt;
&lt;ol type="1"&gt;&lt;/ol&gt;
&lt;ul type="disc"&gt;
&lt;li&gt;&lt;a href="http://cis-india.org/internet-governance/blog/biometrics-an-angootha-chaap-nation"&gt;Biometrics - An 'Angootha Chaap' Nation&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="http://cis-india.org/internet-governance/blog/re-the-human-dna-profiling-bill-2012"&gt;Re: The Human DNA Profiling Bill, 2012&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="http://cis-india.org/internet-governance/blog/human-dna-profiling-bill-analysis"&gt;Human DNA Profiling Bill 2012 Analysis&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="http://cis-india.org/internet-governance/indian-draft-dna-profiling-act"&gt;Overview and Concerns Regarding the Indian Draft DNA Act&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="http://cis-india.org/internet-governance/blog/indias-biometric-identification-programs-and-privacy-concerns"&gt;India's Biometric Identification Programs and Privacy Concerns&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="http://cis-india.org/internet-governance/blog/dna-dissent"&gt;A Dissent note to the Expert Committee for DNA Profiling&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="http://cis-india.org/internet-governance/blog/cis-comments-and-recommendations-to-human-dna-profiling-bill-2015"&gt;CIS Comments and Recommendations to the Human DNA Profiling Bill, June 2015&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="http://cis-india.org/internet-governance/blog/concerns-regarding-dna-law"&gt;Concerns regarding DNA Law&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="http://cis-india.org/internet-governance/blog/human-dna-profiling-bill-2012-vs-2015"&gt;Human DNA Profiling Bill 2012 v/s 2015 Bill&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="http://cis-india.org/internet-governance/news/the-scariest-bill-in-parliament-is-getting-no-attention-2013-here2019s-what-you-need-to-know-about-it"&gt;The scariest Bill in the Parliament is getting no attention - Here's what you need to know about it&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="http://cis-india.org/internet-governance/news/business-standard-kanika-datta-august-1-2015-why-the-dna-bill-is-open-to-misuse-sunil-abraham"&gt;Why the DNA Bill is open to misuse&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="http://cis-india.org/internet-governance/news/livemint-nikita-mehta-july-29-2015-regulation-misuse-concerns-still-dog-dna-profiling-bill"&gt;Regulation, misuse concerns still dog DNA Profiling Bill&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="http://cis-india.org/internet-governance/news/open-magazine-august-7-2015-ullekh-np-genetic-profiling"&gt;Genetic profiling - Is it all in the DNA?&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="http://cis-india.org/internet-governance/blog/comparison-of-the-human-dna-profiling-bill-2012-with-cis-recommendations-sub-committee-recommendations-expert-committee-recommendations-and-the-human-dna-profiling-bill-2015"&gt;Comparison of the Human DNA Profiling Bill 2012 with - CIS Recommendations, Sub-Committee Recommendations, Expert Committee Recommendations, and the Human DNA Profiling Bill 2015&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="http://cis-india.org/internet-governance/blog/council-for-responsible-genetics-april-2014-sunil-abraham-very-big-brother"&gt;Very Big Brother&lt;/a&gt;&lt;/li&gt;&lt;/ul&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;h2&gt;2. Comparative Analysis with other Legislatures&lt;/h2&gt;
&lt;p&gt;Human DNA Profiling is not a system proposed only in India; it has been proposed and implemented in many nations. Each of these systems differs from the others depending on the nation's and society's needs. The risks and criticisms that DNA profiling has faced may be the same, but the solutions devised for these issues vary. The following articles look into the systems in place in different countries and compare them with the proposed system in India, to give us a better understanding of the risks and implications of implementing such a system.&lt;/p&gt;
&lt;ul&gt;&lt;li&gt;&lt;a href="http://cis-india.org/internet-governance/blog/comparative-analysis-of-dna-profiling-legislations-across-the-world"&gt;Comparative Analysis of DNA Profiling Legislations from Across the World&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="http://cis-india.org/internet-governance/blog/comparision-of-draft-human-dna-profiling-bill-and-identification-act-revised-statute-of-canada-provisions"&gt;Comparison of Section 35(1) of the Draft Human DNA Profiling Bill and Section 4 of the Identification Act Revised Statute of Canada&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="http://cis-india.org/internet-governance/blog/comparison-of-draft-dna-profiling-bills"&gt;A Comparison of the Draft DNA Profiling Bill 2007 and the Draft Human DNA Profiling Bill 2012&lt;/a&gt;&lt;/li&gt;&lt;/ul&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/dna-research'&gt;https://cis-india.org/internet-governance/blog/dna-research&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>vanya</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2016-07-21T11:02:29Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/news/report-dna-july-7-2013-joanna-lobo-geeks-have-a-solution-to-digital-surveillance-in-india-cryptography">
    <title>dna exclusive: Geeks have a solution to digital surveillance in India: Cryptography</title>
    <link>https://cis-india.org/news/report-dna-july-7-2013-joanna-lobo-geeks-have-a-solution-to-digital-surveillance-in-india-cryptography</link>
    <description>
        &lt;b&gt;While you were thinking of what next to post on Twitter, the government has stealthily put an ambitious surveillance programme in place that tracks your every move in the digital world — through voice calls, SMS and MMS, GPRS, fax communications on landlines, video calls and emails.&lt;/b&gt;
        &lt;hr /&gt;
&lt;p&gt;The article by Joanna Lobo was &lt;a class="external-link" href="http://www.dnaindia.com/scitech/1857945/report-dna-exclusive-geeks-have-a-solution-to-digital-surveillance-in-india-cryptography"&gt;published in DNA&lt;/a&gt; on July 7, 2013. Pranesh Prakash is quoted.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;The programme, conceived in 2011, has now been brought under one umbrella referred to as the centralised monitoring system (CMS). It is the death of privacy.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;But as concerned citizens argue for the need to formulate policies and laws to protect privacy, there's a simpler solution in sight for now: a CryptoParty.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;At this 'party', an informal gathering of people, non-geeks can learn how to legally encrypt their digital communications and how to store data without the fear of anyone snooping in. Encryption is a process of encoding messages so that it can only be read by authorised parties.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;What is it?&lt;/b&gt;&lt;br /&gt; "A CryptoParty educates people in the domain of cryptography. It's  usually about the basics: how to send encrypted email, how to protect  your hardware and how to use free and open source software," says  Satyakam Goswami, a free software consultant associated with the  Software Freedom Law Centre (SFLC), Delhi (remove this). Goswami was one  of the 72 participants at the CryptoParty organised on Saturday at  Institute of Informatics &amp;amp; Communication (IIC), Delhi University  South Campus  	On June 30, a CryptoParty organised at the Centre for Internet and  Society (CIS) in Bangalore had 30 people in attendance. "We were taught  about the what, how and who is watching us. We were also taught how to  encrypt emails, chat, video calls or instant messaging,” says Siddhart  Prakash Rao, a computer science graduate and a free software and open  source enthusiast who is about to pursue a Masters in Cryptography.&lt;br /&gt; &lt;br /&gt; The topics may be a mouthful for non-geeks but CryptoParty advocates  maintain that all this is taught in the simplest way possible. The  choice of subject depends on the composition of the group — if it is a  gathering of geeks, like at the Bangalore event, then the topics are  more technical.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;How can it help?&lt;/b&gt;&lt;br /&gt; CryptoParties started in August 2012 by an Australian woman (who goes  by the pseudonym Asher Wolf) after a conversation on Twitter about The  Australian Parliament's new cybercrime bill that allowed law enforcement  to ask Internet Service Providers to monitor and store data. &lt;br /&gt; Attending a CryptoParty is a good way to learn how to overcome government snooping legally.&lt;br /&gt; &lt;br /&gt; “Citizens should use encryption to safeguard their private  communications against both corporations and the government. Encryption  is one of the best ways to react to CMS along with increased civic  vigilance and democratic questioning of our government and  parliamentarians,” says Pranesh Prakash, policy director, CIS, and one  of the frontrunners in the fight to formulate a policy to safeguard  privacy in India.&lt;br /&gt; &lt;br /&gt; "In India, people tend to be rather ignorant. They are not aware of the  kind of surveillance they are subjected to once online. It's a lack of  understanding," says Sumandro Chattapadhyay, a researcher with Sarai, a  programme of the Centre for the Study of Developing Societies, Delhi.&lt;br /&gt; &lt;br /&gt; Bernadette Langle, who also works at CIS has been instrumental in  organising the handful of CryptoParties in the country. When dna spoke  to her, she was on her way to Delhi after participating in the Bangalore  event. Langle will also be part of a CryptoParty being planned for  October in Mumbai. "Ten years ago, you had to be a geek to be able to  encrypt and protect yourself online. Now, you need software and it's  much easier," she says.&lt;br /&gt; &lt;br /&gt; The advantage is that the privacy tactics taught at such parties is  completely legal. All knowledge is in the public domain. “A government  will only deny its citizens basic communications privacy if it is  authoritarian,” says Pranesh. 
“So while it can try social engineering and other means to gain access to what you've encrypted, it simply cannot 'decode' it as long as you have chosen a strong passphrase and keep it protected, unless they create quantum computers capable of breaking your encryption.”&lt;br /&gt; &lt;br /&gt; The CIS is currently working on revisions of the Privacy (Protection) Bill 2013 with the objective of contributing to privacy legislation in India. Till that bill becomes an Act, and till there's a better way to overcome needless government surveillance, attending a CryptoParty could possibly be the wisest solution for those concerned about privacy.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;(For more details on CryptoParties, visit www.cryptoparty.in)&lt;br /&gt; &lt;br /&gt; &lt;b&gt;How to encrypt:&lt;/b&gt;&lt;br /&gt; SMS: Make content secure by using software like TextSecure (Android) or  CryptoSMS (Symbian). However, SMS metadata (who you are sending the  message to and at what time) can still be tracked.&lt;br /&gt; &lt;br /&gt; Instead of Whatsapp, install Jabbir and add off the record encryption.&lt;br /&gt; &lt;br /&gt; For email, you can use OpenPGP in conjunction with Thunderbird to  encrypt mails you send from Gmail/Yahoo Mail/Live Mail accounts so that  even Google, Yahoo and Microsoft can't read them&lt;br /&gt; &lt;br /&gt; For web browsing, use a VPN (which will hide your traffic from your  ISP), or Tor (which will help anonymise your traffic, but will slow down  your connection slower).&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/news/report-dna-july-7-2013-joanna-lobo-geeks-have-a-solution-to-digital-surveillance-in-india-cryptography'&gt;https://cis-india.org/news/report-dna-july-7-2013-joanna-lobo-geeks-have-a-solution-to-digital-surveillance-in-india-cryptography&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>praskrishna</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2013-07-15T06:24:40Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/dna-databases-and-human-rights.pdf">
    <title>DNA Databases and Human Rights</title>
    <link>https://cis-india.org/internet-governance/dna-databases-and-human-rights.pdf</link>
    <description>
        &lt;b&gt;Using DNA to trace people who are suspected of committing a crime has been a major advance in policing.&lt;/b&gt;
        
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/dna-databases-and-human-rights.pdf'&gt;https://cis-india.org/internet-governance/dna-databases-and-human-rights.pdf&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>praskrishna</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2012-09-17T05:39:06Z</dc:date>
   <dc:type>File</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/dna-database-for-missing-persons-and-unidentified-dead-bodies">
    <title>DNA Database for Missing Persons and Unidentified Dead Bodies</title>
    <link>https://cis-india.org/internet-governance/blog/dna-database-for-missing-persons-and-unidentified-dead-bodies</link>
    <description>
&lt;b&gt;This blog discusses the possible implications of the public interest litigation placed before the Supreme Court petitioning for the establishment of a DNA database in respect of unidentified bodies.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;In the year 2012 Lokniti, a Non Governmental Organization filed a public interest litigation in the Supreme Court of India asking the government to 	establish a DNA database in respect of unidentified dead bodies as well as for those individuals for whom missing persons reports have been filed so that 	DNA of unidentified dead bodies can be matched against missing persons - arguing that the right to be identified is a part of the right to dignity, and 	that such systems have been adopted across the globe.&lt;a name="_ftnref1"&gt;&lt;/a&gt; The case has come up a few times since 2012 and 	parties have been given time to file their replies in these instances.&lt;a name="_ftnref2"&gt;&lt;/a&gt; Prior to the 2012 Public Interest 	Litigation filed by Lokniti, in 2009 a Public Interest Litigation was filed by a Haryana based doctor. The PIL petitioned for the DNA profiling of unidentified bodies to be made mandatory - arguing that thousands of individuals die with their identity being unknown.	&lt;a name="_ftnref3"&gt;&lt;/a&gt; During the hearing the Bench asked a number of questions including why the Ministry of Health was not 	brought into the case, given the fact that a number of labs that conduct DNA profiling function under the ministry.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;While the case is still pending, the Supreme Court on 22&lt;sup&gt;nd&lt;/sup&gt; September 2014 gave another interim order which was a little more detailed.	&lt;a name="_ftnref4"&gt;&lt;/a&gt; On this date the Ministry of Science and Technology of the Government of India, through the Department of 	Biotechnology stated that they are piloting a DNA profiling Bill that would establish a DNA Profiling Board and a National DNA Data Bank. The National DNA 	Data Bank is envisaged to maintain the following indices for various categories of data:&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;I. a crime scene index;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;II. a suspects' index;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;III. an offenders' index;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;IV. a missing persons' index;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;V. unknown deceased persons' index&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;VI. a volunteers' index; and&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;VII. such other DNA indices as may be specified by regulations made by the Board.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;One of the Ministry's plans under this Bill is to create DNA profiles of individuals whose relatives have gone missing, on a voluntary basis to help the 	relatives identify missing persons and unidentified dead bodies. They also stated that cross-matching of DNA profiling data in the database would require 	specialized software and the CDFB, Hyderabad is in the process of acquiring the same from the Federal Bureau of investigation, USA.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The advocate for Lokniti responded to this saying that the DNA profiling Bill has been pending for a long time and has not seen the light of day for the 	last seven years. To this the response of the government was that it was a complex Bill involving a number of issues which take a long time to resolve.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;At this point the Supreme Court, without going into the details of the Bill asked the advocate for the Union of India to obtain instructions regarding the 	following two aspects:&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;(1) Whether pending the Bill coming into force the concerned Department can constitute a Data Bank in respect of dead persons who are not identifiable; and&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;(2) when there are missing reports in respect of persons to collect the DNA from the permissible sources like siblings or others so that in case any 	unidentified dead body is found to match the DNA to arrive at the conclusion about the missing persons who are dead; or as an ancillary the missing person 	who is a victim of the crime of kidnapping or where any child, who is not able to find out his parents, can be in a position to find out through the DNA.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Thus it seems that the Supreme Court, recognizing its limitations in directing the legislature to pass a law and the fact that the passing of the DNA 	profiling Bill may take a long time to become law, has tried to find a way out in which the concerns of the petitioner regarding a DNA Databank for missing 	persons and unidentified dead bodies could be addressed without the passage of the DNA profiling Bill. However since the case is still pending in the 	Supreme Court no final directions have been given in this regard. Thus, the Court has left the government with the responsibility to address the question 	of whether a DNA Databank can be established without the passing of a legislation providing legal basis for the collection, profiling, databasing, and use 	of DNA samples.&lt;/p&gt;
&lt;div style="text-align: justify; "&gt;
&lt;hr align="left" size="1" width="100%" /&gt;
&lt;div id="ftn1"&gt;
&lt;p&gt;&lt;a name="_ftn1"&gt;&lt;/a&gt; http://indianexpress.com/article/india/india-others/sc-wants-centre-to-create-dna-data-bank/#sthash.7zqU0Ill.dpuf&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn2"&gt;
&lt;p&gt;&lt;a name="_ftn2"&gt;&lt;/a&gt; All the orders between 2012 and 2014 giving time to the parties can be accessed at 			&lt;a href="http://courtnic.nic.in/supremecourt/caseno_listed_1.asp?cno=491%20%20%20&amp;amp;ctype=3&amp;amp;cyear=2012&amp;amp;frmname=causedisp&amp;amp;petname=LOKNITI%20FOUNDATION%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20&amp;amp;resname=U.O.I.%20&amp;amp;%20ORS"&gt; http://courtnic.nic.in/supremecourt/caseno_listed_1.asp?cno=491%20%20%20&amp;amp;ctype=3&amp;amp;cyear=2012&amp;amp;frmname=causedisp&amp;amp;petname=LOKNITI%20FOUNDATION%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20&amp;amp;resname=U.O.I.%20&amp;amp;%20ORS &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn3"&gt;
&lt;p&gt;&lt;a name="_ftn3"&gt;&lt;/a&gt; http://indianexpress.com/article/india/india-others/sc-seeks-govt-response-on-making-dna-profiling-mandatory/&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn4"&gt;
&lt;p&gt;&lt;a name="_ftn4"&gt;&lt;/a&gt; The order dated September 22, 2014 can be found at			&lt;a href="http://courtnic.nic.in/supremecourt/temp/wc%2049112p.txt"&gt;http://courtnic.nic.in/supremecourt/temp/wc%2049112p.txt&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;/div&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/dna-database-for-missing-persons-and-unidentified-dead-bodies'&gt;https://cis-india.org/internet-governance/blog/dna-database-for-missing-persons-and-unidentified-dead-bodies&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>vipul</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2014-11-04T15:46:29Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/bloomberg-quint-elonnai-hickok-and-murali-neelakantan-august-20-2018-dna-evidence-only-opinion-not-science-and-definitely-not-proof-of-crime">
    <title>DNA ‘Evidence’: Only Opinion, Not Science, And Definitely Not Proof Of Crime!</title>
    <link>https://cis-india.org/internet-governance/blog/bloomberg-quint-elonnai-hickok-and-murali-neelakantan-august-20-2018-dna-evidence-only-opinion-not-science-and-definitely-not-proof-of-crime</link>
    <description>
        &lt;b&gt;On August 9, 2018, the DNA Technology (Use and Application) Regulation Bill, 2018 was introduced in the Lok Sabha and we commented on some key aspects of it earlier. &lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article was published in &lt;a class="external-link" href="https://www.bloombergquint.com/opinion/2018/08/20/dna-evidence-only-opinion-not-science-and-definitely-not-proof-of-crime#gs.nyAe84A"&gt;Bloomberg Quint&lt;/a&gt; on August 20, 2018.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;Though taking some steps in the right direction such as formalising the process for lab accreditation, the Bill ignores many potential cases of ‘harm’ that may arise out of the collection, databasing, and using DNA evidence for criminal and civil purposes.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;DNA evidence is widely touted as the most accurate forensic tool, but what is not widely publicised is it is not infallible. From crime scene to database, it is extremely vulnerable to a number of different unknown variables and outcomes. These variables are only increasing as the technology becomes more precise – profiles can be developed from only a few cells and technology now exists that generates a profile in 90 minutes. Primary and secondary transfer, contamination, incomplete samples, too many mixed samples, and inaccurate or outdated methods of analysis and statistical methodologies that may be used, are all serious reasons as to why DNA evidence may paint an innocent person guilty.&lt;/p&gt;
&lt;blockquote class="quoted" style="text-align: justify; "&gt;Importantly, DNA itself is not static and predicting how it may have changed over time is virtually impossible.&lt;/blockquote&gt;
&lt;h3 style="text-align: justify; "&gt;Innocent, But Charged&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;In April 2018, &lt;a href="https://www.wired.com/story/dna-transfer-framed-murder/" target="_blank"&gt;WIRED carried a story &lt;/a&gt;of  Lukis Anderson who was charged with the first-degree murder of Raveesh  Kumra, a Silicon Valley investor after investigators found Anderson’s  DNA on Kumra’s nails. Long story short – Anderson earlier that day had  been intoxicated in public and had been attended by paramedics. The same  paramedics handled Kumra’s body and inadvertently transferred  Anderson’s DNA to Kumra’s body. The story quotes some sobering facts  that research has found about DNA:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;Direct contact is not necessary for DNA to be transferred. In an experiment with a group of individuals sharing a bottle of juice, 50 percent had another person’s DNA on their hands, and a third of the glasses contained DNA from individuals who had not had direct contact with them.&lt;/li&gt;
&lt;li&gt;An average person sheds 50 million skin cells a day.&lt;/li&gt;
&lt;li&gt;Even while we stand still, our DNA can travel over a yard away, and it can easily be carried for miles on others’ clothing or hair – not unlike pollen.&lt;/li&gt;
&lt;li&gt;In an experiment that tested public items, it was found that a single item can contain DNA from half a dozen people.&lt;/li&gt;
&lt;li&gt;A friendly or inadvertent contact can transfer DNA to private regions or clothing.&lt;/li&gt;
&lt;li&gt;Different people shed DNA-containing detritus at different rates.&lt;/li&gt;
&lt;li&gt;One in five people has another person’s DNA under their fingernails on a continuous basis.&lt;/li&gt;
&lt;/ol&gt;
&lt;div style="text-align: center; "&gt;&lt;img src="https://cis-india.org/home-images/BloombergPic.png/@@images/6eed536e-0142-44b7-a710-60d812d3bc1e.png" alt="Crime Scene Tape in Alexandria" class="image-inline" title="Crime Scene Tape in Alexandria" /&gt;&lt;/div&gt;
&lt;div style="text-align: center; "&gt;A police office carries crime scene tape in Alexandria, Virginia, U.S. (Photographer: Andrew Harrer/Bloomberg)&lt;/div&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="https://www.wired.com/2015/10/familial-dna-evidence-turns-innocent-people-into-crime-suspects/" target="_blank"&gt;In another case&lt;/a&gt;,  the police in Idaho, USA, used a public DNA database to run a familial  DNA search – a technique used to identify suspects whose DNA is not  recorded in a law enforcement database, but whose close relatives have  had their genetic profiles cataloged, just as India's DNA Bill seeks to  do. The partial match that resulted implicated Michael Usry, the son of  the man whose DNA was in the public database. It took 33 days for  Michael to be cleared of the crime. That an innocent man only spent 33  days under suspicion could be considered a positive outcome when  compared to the case of &lt;a href="https://www.theatlantic.com/magazine/archive/2016/06/a-reasonable-doubt/480747/" target="_blank"&gt;Josiah Sutton&lt;/a&gt; who spent four years convicted of rape in prison due to  misinterpretation of DNA samples by the Houston Police Department Crime  Laboratory, which is among the largest public forensic centers in Texas.  The Atlantic called this out as “The False Promise of DNA Testing – the  forensic technique is becoming ever more common and ever less  reliable”.&lt;/p&gt;
&lt;blockquote class="quoted" style="text-align: justify; "&gt;Presently, there is little confidence that such safeguards exist – prosecutors do not share any exculpatory evidence with the accused and India does not even follow the ‘fruit of a poisonous tree’ doctrine with respect to the admissibility of evidence and India has yet to develop a robust jurisprudence for evaluating scientific evidence.&lt;/blockquote&gt;
&lt;p style="text-align: justify; "&gt;The 2015 Law  Commission Report cites four cases that speak to the role and reliance  on expert opinion as evidence. Though these cases point to the  importance of expert opinion they differ on the weight that should be  given to the same.&lt;a href="http://www.genewatch.org/uploads/f03c6d66a9b354535738483c1c3d49e4/BestPractice_Report_plus_cover_final.pdf" target="_blank"&gt; International best practice&lt;/a&gt; requires the submission of corroborating evidence, training law  enforcement, and court officers, and ensuring that prosecution and  defence have equal access to forensic evidence.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Consider India with a population of 1.3 billion people – 70 percent mostly residing in rural areas and less educated and a&lt;a href="https://www.weforum.org/agenda/2017/10/india-has-139-million-internal-migrants-we-must-not-forget-them/" target="_blank"&gt; heavy migrant population&lt;/a&gt; in urban centres, an overwhelmed police force in nascent stages of  forensic training, and an overburdened judiciary and no concrete laws to  govern issues of the &lt;a href="http://jlsr.thelawbrigade.com/index.php/2017/06/16/admissibility-of-dna-in-indian-legal-system/" target="_blank"&gt;admissibility of forensic techniques&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In such circumstances, the question is not only how many criminals can be convicted but also how many innocents could be convicted.&lt;/p&gt;
&lt;p style="text-align: center; "&gt;&lt;img src="https://cis-india.org/home-images/Handcuffs.png/@@images/ada66bb0-965f-404f-b434-bb8d36110544.png" alt="Handcuffs" class="image-inline" title="Handcuffs" /&gt;&lt;/p&gt;
&lt;p style="text-align: center; "&gt;A pair of standard issue handcuffs sits on a table. (Photographer: Jerome Favre/Bloomberg)&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The DNA Bill seeks to establish DNA databanks at the regional and national level but how this will be operationalised is not quite clear. The Bill enables the DNA Regulatory Board to accredit DNA labs. Will databases be built from scratch? Will they begin by pulling in existing databases?&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The question is not if the DNA samples match but how they came to match. The greater power that comes from the use of DNA databases requires greater responsibility in ensuring adequate information, process, training, and laws are in place for everyone – those who give DNA, collect DNA, store DNA, process DNA, present DNA, and eventually decide on the use of the DNA. As India matures in its use of DNA evidence for forensic purposes it is important that it keeps at the forefront what is necessary to ensure and protect the rights of the individual.&lt;/p&gt;
&lt;div class="story-element-text story-element"&gt;
&lt;div&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;&lt;i&gt;Elonnai Hickok  Chief Operating Officer at The Centre for Internet and Society. Murali  Neelakantan is an expert in healthcare laws, and the author of&lt;/i&gt; ‘&lt;i&gt;DNA Testing as Evidence - A Judge&lt;/i&gt;’&lt;i&gt;s Nightmare&lt;/i&gt;’ &lt;i&gt;in the Journal of Law and Medicine.&lt;/i&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;/div&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/bloomberg-quint-elonnai-hickok-and-murali-neelakantan-august-20-2018-dna-evidence-only-opinion-not-science-and-definitely-not-proof-of-crime'&gt;https://cis-india.org/internet-governance/blog/bloomberg-quint-elonnai-hickok-and-murali-neelakantan-august-20-2018-dna-evidence-only-opinion-not-science-and-definitely-not-proof-of-crime&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Elonnai Hickok and Murali Neelakantan</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>DNA Profiling</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2018-08-22T00:43:54Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/divergence-between-the-general-data-protection-regulation-and-the-personal-data-protection-bill-2019">
    <title>Divergence between the General Data Protection Regulation and the Personal Data Protection Bill, 2019</title>
    <link>https://cis-india.org/internet-governance/blog/divergence-between-the-general-data-protection-regulation-and-the-personal-data-protection-bill-2019</link>
    <description>
        &lt;b&gt;&lt;/b&gt;
        
&lt;p&gt;Our note on the divergence between the General Data Protection Regulation and the Personal Data Protection Bill can be downloaded as a PDF &lt;a href="https://cis-india.org/internet-governance/divergence-between-the-gdpr-and-pdp-bill-2019" class="internal-link" title="Divergence between the GDPR and PDP Bill 2019"&gt;here&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;The European Union’s General Data Protection Regulation (GDPR), replacing the 1995 EU Data Protection Directive, came into effect in May 2018. It harmonises data protection regulations across the European Union. In India, the Ministry of Electronics and Information Technology had constituted a Committee of Experts (chaired by Justice Srikrishna) to frame recommendations for a data protection framework in India. The Committee submitted its report and a draft Personal Data Protection Bill in July 2018 (the 2018 Bill). Public comments were sought on the bill until October 2018. The Central Government revised the Bill and introduced the revised version of the Personal Data Protection Bill (PDP Bill) in the Lok Sabha on December 11, 2019.&lt;/p&gt;
&lt;p&gt;The PDP Bill has incorporated certain aspects of the GDPR, such as the requirements for notice to be given to the data principal, consent for the processing of data, the establishment of a data protection authority, etc. However, there are some differences, and in this note we have highlighted the areas of divergence between the two. The note covers only provisions that are common to the GDPR and the PDP Bill; it does not cover the provisions on (i) the Appellate Tribunal, (ii) Finance, Accounts and Audit; and (iii) Non-Personal Data.&amp;nbsp;&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/divergence-between-the-general-data-protection-regulation-and-the-personal-data-protection-bill-2019'&gt;https://cis-india.org/internet-governance/blog/divergence-between-the-general-data-protection-regulation-and-the-personal-data-protection-bill-2019&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Pallavi Bedi</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2020-02-21T11:08:50Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/events/discussion-on-ranking-digital-rights-in-india-delhi-jan-07-2017">
    <title>Discussion on Ranking Digital Rights in India (Delhi, January 07)</title>
    <link>https://cis-india.org/internet-governance/events/discussion-on-ranking-digital-rights-in-india-delhi-jan-07-2017</link>
    <description>
        &lt;b&gt;Towards developing an understanding of how Indian ICT companies are recognising and upholding digital rights of their users, and to raise public awareness about the same, the Center for Internet and Society (CIS), with the support of Privacy International, has studied 8 Indian ICT companies, using the same methodology as the 2015 Corporate Accountability Index, to gain greater insight into company practices and initiate public dialogues. Please join us on Saturday, January 07, at the India Islamic Cultural Centre, New Delhi, for a presentation of our findings followed by an open structured discussion on the methodology and implications of the study.&lt;/b&gt;
        
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;h4&gt;Download: &lt;a href="https://github.com/cis-india/website/raw/master/docs/CIS_RDRIndia-Discussion_07012017_Invitation.pdf"&gt;Invitation and agenda&lt;/a&gt; (PDF)&lt;/h4&gt;
&lt;hr /&gt;
&lt;p&gt;The &lt;a href="https://rankingdigitalrights.org/"&gt;Ranking Digital Rights Corporate Responsibility Index&lt;/a&gt; is a project hosted by the Open Technology Institute at New America Foundation that aims to rank Information and Communications Technology (ICTs) companies with respect to their Governance, Freedom of Expression, and Privacy practices. The inaugural Corporate Accountability Index, released in November 2015, evaluated 16 companies based on the project’s methodology that included 31 indicators in total.&lt;/p&gt;
&lt;p&gt;Towards developing an understanding of how Indian ICT companies are recognising and upholding digital rights of their users, and to raise public awareness about the same, the Center for Internet and Society (CIS), with the support of &lt;a href="https://privacyinternational.org/"&gt;Privacy International&lt;/a&gt;, has studied 8 Indian ICT companies, using the same methodology as the 2015 Corporate Accountability Index, to gain greater insight into company practices and initiate public dialogues.&lt;/p&gt;
&lt;p&gt;Please join us on Saturday, January 07, at the India Islamic Cultural Centre, New Delhi, for a presentation of our findings followed by an open structured discussion on the methodology and implications of the Ranking Digital Rights study. We will begin at 10:30 am with a round of tea and coffee.&lt;/p&gt;
&lt;p&gt;The event is open to all, but the venue has limited space. Participants are requested to RSVP by sending an email to &lt;a href="mailto:nisha@cis-india.org?subject=RSVP: Ranking Digital Rights Discussion"&gt;nisha@cis-india.org&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;To further encourage programmers, researchers, journalists, students, and users in general to use and contribute to the findings of the Ranking Digital Rights study, and critique the underlying methodology, we are also organising a “rankathon” on Sunday, January 08, at the CIS office in Delhi. More details can be found &lt;a href="http://cis-india.org/internet-governance/events/rankathon-on-digital-rights-delhi-jan-08-2017"&gt;here&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;We look forward to your participation and contribution to the discussion. Please support us by sharing this invitation with your colleagues and networks.&lt;/p&gt;
&lt;h2&gt;Agenda&lt;/h2&gt;
&lt;table class="plain"&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;10:30-11:00&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Coffee and Tea&lt;/strong&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;11:00-11:15&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Introduction&lt;/strong&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;11:15-13:00&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Presentation of the Findings and Discussion&lt;/strong&gt; &lt;em&gt;Divij Joshi and Aditya Singh Chawla&lt;/em&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;13:00-14:00&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Lunch&lt;/strong&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;14:00-15:00&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Open Discussion #1: Parameters of Evaluation&lt;/strong&gt;&lt;br /&gt;The RDR methodology was based upon evaluating companies’ commitments to uphold human rights through their services – in particular their commitment to users’ freedom of expression and privacy. Are there other parameters that may be considered in the Indian context?&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;15:00-16:00&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Open Discussion #2: Towards Protecting Digital Rights&lt;/strong&gt;&lt;br /&gt;What steps can be taken by the government, civil society, and industry in India to create an environment that recognizes and protects users’ digital rights? What are the relevant legal, political, and economic factors to take into consideration towards this? What steps have other, multinational ICT companies taken? Would these be realistic for Indian companies to implement?&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;16:00-16:30&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Conclusion&lt;/strong&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;16:30-17:00&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Coffee and Tea&lt;/strong&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/events/discussion-on-ranking-digital-rights-in-india-delhi-jan-07-2017'&gt;https://cis-india.org/internet-governance/events/discussion-on-ranking-digital-rights-in-india-delhi-jan-07-2017&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Freedom of Speech and Expression</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Ranking Digital Rights</dc:subject>
    
    
        <dc:subject>Digital Rights</dc:subject>
    

   <dc:date>2016-12-29T07:07:34Z</dc:date>
   <dc:type>Event</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/oxford-human-rights-hub-arindrajit-basu-october-23-2018-discrimination-in-the-age-of-artificial-intelligence">
    <title>Discrimination in the Age of Artificial Intelligence </title>
    <link>https://cis-india.org/internet-governance/blog/oxford-human-rights-hub-arindrajit-basu-october-23-2018-discrimination-in-the-age-of-artificial-intelligence</link>
    <description>
        &lt;b&gt;The dawn of Artificial Intelligence (AI) has been celebrated by both government and industry across the globe. AI offers the potential to augment many existing bureaucratic processes and improve human capacity, if implemented in accordance with principles of the rule of law and international human rights norms. Unfortunately, AI-powered solutions have often been implemented in ways that have resulted in the automation, rather than the mitigation, of existing societal inequalities.&lt;/b&gt;
        &lt;p&gt;This was originally published by &lt;a class="external-link" href="http://ohrh.law.ox.ac.uk/discrimination-in-the-age-of-artificial-intelligence/"&gt;Oxford Human Rights Hub&lt;/a&gt; on October 23, 2018.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;&lt;img src="https://cis-india.org/home-images/ArtificialIntelligence.jpg/@@images/3b551d39-e419-442c-8c9d-7916a2d39378.jpeg" alt="Artificial Intelligence" class="image-inline" title="Artificial Intelligence" /&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Image Credit: Sarla Catt via Flickr, used under a Creative Commons license available at https://creativecommons.org/licenses/by/2.0/&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In the international human rights law context, AI solutions pose a  threat to norms which prohibit discrimination. International Human  Rights Law &lt;a href="https://books.google.co.in/books/about/International_Human_Rights_Law.html?id=YkcXAgAAQBAJ&amp;amp;redir_esc=y"&gt;recognizes that discrimination&lt;/a&gt; may take place in two possible ways, directly or indirectly. Direct  discrimination occurs when an individual is treated less favourably than  someone else similarly situated on one of the grounds prohibited in  international law, which, as per the &lt;a href="http://www.equalrightstrust.org/ertdocumentbank/Human%20Rights%20Committee,%20General%20Comment%2018.pdf"&gt;Human Rights Committee,&lt;/a&gt; includes race, colour, sex, language, religion, political or other  opinion, national or social origin, property, birth or other status.  Indirect discrimination occurs when a policy, rule or requirement is  ‘outwardly neutral’ but has a disproportionate impact on certain groups  that are meant to be protected by one of the prohibited grounds of  discrimination. A clear example of indirect discrimination recognized by  the European Court of Human Rights arose in the case of &lt;a href="http://www.errc.org/cikk.php?cikk=3559"&gt;&lt;i&gt;DH&amp;amp;Ors v Czech Republic&lt;/i&gt;&lt;/a&gt;.  The ECtHR struck down an apparently neutral set of statutory rules,  which implemented a set of tests designed to evaluate the intellectual  capability of children but which resulted in an excessively high  proportion of minority Roma children scoring poorly and consequently  being sent to special schools, possibly because the tests were blind to  cultural and linguistic differences. This case acts as a useful analogy  for the potential disparate impacts of AI and should serve as useful  precedent for future litigation against AI-driven solutions.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Indirect discrimination by AI may occur &lt;a href="https://cis-india.org/internet-governance/ai-and-governance-case-study-pdf"&gt;at two stages&lt;/a&gt;. First is the &lt;b&gt;usage of incomplete or inaccurate training data&lt;/b&gt; that results in the algorithm processing data that may not accurately reflect reality. Cathy O’Neil explains this &lt;a href="https://weaponsofmathdestructionbook.com/"&gt;using a simple example&lt;/a&gt;.  There are two types of crimes-those that are ‘reported’ and others that  are only ‘found’ if a policeman is patrolling the area. The first  category includes serious crimes such as murder or rape while the second  includes petty crimes such as vandalism or possession of illicit drugs  in small quantities. Increased police surveillance in areas in US cities  where Black or Hispanic people reside lead to more crimes being ‘found’  there. Thus, data is likely to suggest that these communities commit a  higher proportion of crimes than they actually do – indirect  discrimination that has been empirically been shown through research  published by &lt;a href="https://www.propublica.org/article/bias-in-criminal-risk-scores-is-mathematically-inevitable-researchers-say"&gt;Pro Publica&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Discrimination may also occur at the stage of &lt;b&gt;data processing&lt;/b&gt;, which is done through a metaphorical &lt;a href="https://www.sentient.ai/blog/understanding-black-box-artificial-intelligence/"&gt;‘black-box’&lt;/a&gt; that accepts inputs and generates outputs without revealing to the  human developer how the data was processed. This conundrum is compounded  by the fact that the algorithms are often utilised to solve an  amorphous problem-which attempts to break down a complex question into a  simple answer. An example is the development of ‘risk profiles’ of  individuals for the  &lt;a href="http://fortune.com/longform/ai-bias-problem/"&gt;determination of insurance premiums.&lt;/a&gt; Data might show that an accident is more likely to take place in inner  cities due  to more densely packed populations in these areas. Racial  and ethnic minorities tend to reside more in these areas, which means  that algorithms could learn that minorities are more likely to get into  accidents, thereby generating an outcome (‘risk profile’) that  indirectly discriminates on grounds of race or ethnicity.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;It would be wrong to ignore discrimination, both direct and indirect,  that occurs as a result of human prejudice. The key difference between  that and discrimination by AI lies in the ability of other individuals  to compel the decision-maker to explain the factors that lead to the  outcome in question and testing its validity against principles of human  rights. The increasing amounts of discretion and, consequently, power  being delegated to autonomous systems mean that principles of  accountability which audit and check indirect discrimination need to be  built into the design of these systems. In the absence of these  principles, we risk surrendering core tenets of human rights law to the  whims of an algorithmically crafted reality.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/oxford-human-rights-hub-arindrajit-basu-october-23-2018-discrimination-in-the-age-of-artificial-intelligence'&gt;https://cis-india.org/internet-governance/blog/oxford-human-rights-hub-arindrajit-basu-october-23-2018-discrimination-in-the-age-of-artificial-intelligence&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Arindrajit Basu</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Artificial Intelligence</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2018-10-26T14:47:57Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/raw/indian-express-nishant-shah-april-8-2018-digital-native-delete-facebook">
    <title>Digital Native: Delete Facebook?</title>
    <link>https://cis-india.org/raw/indian-express-nishant-shah-april-8-2018-digital-native-delete-facebook</link>
    <description>
        &lt;b&gt;You can check out any time you like, but you can never leave.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article was &lt;a class="external-link" href="http://indianexpress.com/article/technology/social/digital-native-delete-facebook-5127198/"&gt;published in Indian Express&lt;/a&gt; on April 8, 2018.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;One fine day, we all woke up and were told that &lt;/span&gt;&lt;a href="http://indianexpress.com/about/facebook/"&gt;Facebook&lt;/a&gt;&lt;span&gt; sold our data to Cambridge Analytica and then they made dastardly profiles of us to target us with advertisement and political propaganda, so, we made a beeline for #DeleteFacebook. The most surprising part about the expose is how much of a non-event it is. We have been warned, at least since the Edward Snowden revelations, if not earlier, that our data is the new oil, coal and gold. It is being used as a resource, it is being mined from our everyday digital transactions, and it is precious because it can result in a massive social engineering without our consent or knowledge. Ever since Facebook started expanding its domain from being a friends-poke-friends-with-livestock website, we have been warned that the ambition of Facebook was never to connect you with your friends but to be your friend.&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;&lt;span&gt;Time and again, we have been told that the sapient Facebook algorithm remembers everything you say and do, anticipates all your future needs, and listens to the most banal litany of your life. More than your mom, your partner or your shrink, it’s the Facebook algorithm which is interested in all your quotidian uselessness. It is not the stranger who accesses your post that should worry you. The biggest perpetrator of privacy violations on Facebook is Facebook itself. There is good reason why a company that offers its prime products for free is valuated as one of the richest corporations in the world. The product of Facebook – it has always been known – is us.&lt;/span&gt;&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;&lt;span&gt;&lt;span&gt;Why, then, are we suddenly taken aback at the fact that Facebook sold us? And while we are sharing our thoughts (ironically on Facebook) about deleting our profiles, the question that remains is this: How much of your digital life are you willing to erase? Because, and I am sorry if this pricks your filter bubble, Facebook’s problem is not really a Facebook problem. It is almost the entire World Wide Web, where we lost the battle for data ownership and platform openness more than two decades ago. Name one privately owned free service that you use on the internet and I will show you the section in its “terms and services” where you have surrendered your data. In fact, you can’t even find government services, tied up with their private partners, where your data is safe and stored in privacy vaults where it won’t be abused.&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;&lt;span&gt;&lt;span&gt;&lt;span&gt;It is time to realise that the popular ’90s meme “All your base are belong to us” is the lived reality of our digital lives. As we forego ownership for convenience, as our governments sold our sovereignty for profits, and as digital corporations became behemoths that now have the capacity to challenge and write our constitutional and fundamental rights, we are waking up to a battle that has already been fought and resolved. A large part of our physical hardware to access the internet is privately owned. This means that almost all our PCs, tablets, phones, servers are owned and open to exploitation by private companies. Every time your phone does an automatic update or your PC goes into house-cleaning mode, you have to realise that you are being stored, somewhere in the cloud in ways that you cannot imagine.&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;&lt;span&gt;&lt;span&gt;&lt;span&gt;&lt;span&gt;It is tiring to hear this alarm and panic around Facebook’s data trading. Not only is it legal, it is something that has been happening for a while, most of us have been aware of it, and we have resolutely ignored it because, you know, cute cats. If somebody tells you that they are against privately owned physical property and are going to start a revolution to take away all private property and make it equally shared with the public, you would laugh at them because they are arriving at the battle scene after the war is over. This digital wokeness trend to #DeleteFacebook is the digital equivalent of that moment. If you want to fight, fight the governments and nations who can still protect us. Participate in conversations around Internet governance. Take responsibility to educate yourself about the politics of how the digital world operates. But stop trying to feel virtuous because you pulled out of a social media network, pretending that that is the end of the problem.&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/raw/indian-express-nishant-shah-april-8-2018-digital-native-delete-facebook'&gt;https://cis-india.org/raw/indian-express-nishant-shah-april-8-2018-digital-native-delete-facebook&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>nishant</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Social Media</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Facebook</dc:subject>
    
    
        <dc:subject>Researchers at Work</dc:subject>
    

   <dc:date>2018-05-06T03:08:25Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/frontline-v-sridhar-march-3-2017-digital-illusions">
    <title>Digital illusions</title>
    <link>https://cis-india.org/internet-governance/news/frontline-v-sridhar-march-3-2017-digital-illusions</link>
    <description>
        &lt;b&gt;The Watal Committee’s report presents the government with an impossible road map to a cashless nirvana. &lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article by V. Sridhar was &lt;a class="external-link" href="http://www.frontline.in/the-nation/digital-illusions/article9541506.ece?homepage=true"&gt;published in Frontline&lt;/a&gt;, Print edition: March 3, 2017&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;MORE than two months after demonetising an overwhelming proportion of the currency in circulation, the Narendra Modi government now appears to have settled on its key objective for setting out on the unprecedented economic adventure. After shifting the goalposts several times—initially it was a means of combating terrorism and fake currency, later it was a war on black money and still later it was to forcibly march the country towards a “cashless” future, which was then modified to a more reasonable “less cash” society—the government now ostensibly has the road map to undertake the hazardous journey to an age when cash will no longer be king.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;There is no better and time-tested means for a government bent on carrying out its whims than to appoint a committee headed by a former bureaucrat to give it the report that would justify what it has already decided to do. In August 2016, months before demonetisation, it constituted the Committee on Digital Payments, chaired by Ratan P. Watal, Principal Adviser, NITI Aayog, and former Secretary, Ministry of Finance. The committee dutifully submitted its report in double quick time on December 9, which was approved by the Finance Ministry on December 27.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The haste with which the committee has gone about its business is evident throughout the report. The committee’s slant is also evident in its approach, especially the reverence with which it welcomes the demonetisation move, even though it was commissioned before November 8, and its recourse to suspect data from private industry and multinational companies even when better quality data were available from official sources such as the Reserve Bank of India (RBI). The report’s lack of rigour, especially in tackling the substantive issues pertaining to monetary policy, was also hindered by the fact that not a single economist of worth, not even a specialist in monetary economics, was present in the committee.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Reckless rush&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;However, to blame the committee alone would be futile. The government, by pursuing an ambitious and reckless push towards “less cash” before setting out a regulatory framework governing digital payments, in effect, placed the cart before the horse.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The report reveals not just the haste with which the Watal Committee has pursued its mission with evangelical zeal but its utter lack of respect for conceptual issues. Nowhere is this more evident than in its recommendation that the regulatory responsibilities for governing the digital payments system be distanced from the RBI. This not only is out of tune with global practices, but it reveals the committee’s sheer inability to understand the fact that although payments account for just a small fraction of what a banking system does, they impinge on modern banking and monetary policy in crucial ways.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In a modern economy, currency creation by the central bank through fiat money is not the only means by which money is created. Deposits with banks, for instance, which provide the base for credit creation, are a means by which banks “create” money. From this perspective, a mobile wallet service provider also acts like a bank; even the users’ monies are held only for a brief period until transactions happen.&lt;br /&gt;&lt;br /&gt;Thus, it appears fit and proper that such services are also governed by the central bank. However, the Watal Committee has recommended that they be supervised by an entity that has a measure of independence from the RBI. This suggestion is dangerous because such entities can potentially pose a systemic risk, which is a key responsibility of a central bank. There is also the risk of regulatory capture of the suggested body, the Payments Regulatory Board (PRB), if sections of the payments industry exercise their newly acquired clout.&lt;br /&gt;&lt;br /&gt;The committee’s enthusiastic acceptance of the “go cashless” mantra is also evident in the data it has sourced. A good example of how it cherry-picked data is its use of a highly dubious (or at the very least, utterly misplaced) dataset to make the point that India is far too dependent on cash. It points to data sourced from the International Monetary Fund (IMF) and other sources to claim that India’s cash-GDP (gross domestic product) ratio is 12.04 per cent, much higher than countries such as Brazil, Mexico and South Africa.&lt;br /&gt;&lt;br /&gt;However, this much-abused dataset, quoted widely by advocates of demonetisation, is an inaccurate measure because it only captures the extent of physical currency in circulation and ignores short-term deposits, which are defined as “broad money”. Logically, these deposits must be included because they are virtually on call by depositors and are, therefore, liquid. 
Secondly, the fact that such deposits have been increasing as a proportion of the currency in circulation, aided by the spread of banking in India, makes them particularly relevant in the Indian context. The committee, in its bid to justify sending the nation on a cashless path, proceeds to evaluate the “high” costs that cash imposes on the Indian economy. It quotes from McKinsey and Visa, both of which may have a vested interest in India’s mission to go cashless, to drive home the point that going digital would result in huge savings. It quotes McKinsey to claim that “transitioning to an electronic platform for government payments itself could save approximately Rs.100,000 crore annually, with the cost of the transition being estimated at Rs.60,000-70,000 crore” and a Visa report that claims a total investment of Rs.60,000 crore over five years towards creating a digital payments ecosystem could reduce the country’s cost of cash from 1.7 per cent of the GDP to 1.3 per cent.&lt;br /&gt;&lt;br /&gt;Even while pushing the benefits of going cashless, the committee does admit that the transition to digital payments “cannot be agnostic to the actual costs incurred by the end customers, the reasons for preferring cash, and the factors inhibiting the uptake of existent channels of digital payments”.&lt;br /&gt;&lt;br /&gt;A large part of the Indian economy is its “black” counterpart, estimated at about 60 per cent of the legitimate part of India’s national income. Since a significant portion of the currency in circulation caters to the demand from the shadow economy, apart from the huge segment that is engaged in legitimate but informal economic activity, these estimates miss a significant chunk of the economy and its need for cash. 
Conceptually, to that extent, they significantly overstate the amount of cash relative to real GDP, including the portion missing from official data.&lt;br /&gt;&lt;br /&gt;The naive assumption that digitalised financial transactions are scale-neutral and costless, painless and efficient lies at the heart of the Watal Committee’s report. This has obvious implications for India’s large informal economy, which the Modi government is pushing, under pain of death, towards formality through digital channels. For instance, basic data on the usage of debit cards show how skewed the demand for cards is in India. In August 2016, cash withdrawals at ATMs accounted for 92.28 per cent of the value of all debit card transactions in the country. Thus, less than 8 per cent of the total value was transacted at point-of-sale (PoS) terminals.&lt;br /&gt;&lt;br /&gt;This statistic is a clear indication of a divide that mirrors the income and consumption divide in Indian society. When banks issue cards (debit, credit or any other), card payment system companies such as Mastercard and Visa provide an interface with the customer, for which the issuer pays a fee, which is, in any case, recovered from customers. According to a recent study by Visa, the penetration of PoS terminals has slowed down significantly since 2012, when the RBI set limits on what the card companies could charge as the merchant discount rate (MDR), the amount charged from sellers. This suggests that card companies may have been slowing down penetration in order to bargain for a bigger slice of the transaction fee. Although the rates apply not just to card-based purchases but to cash withdrawals too (and have been waived or lowered in the wake of demonetisation on a purely temporary basis), there is no guarantee that they will not increase once the situation returns to normal.
This is aggravated by the fact that the government may have neither the power nor the will to prevent banks and card issuers from charging higher rates later. This has been demonstrated in the past with, for example, ATM-based withdrawals, for which customers have to pay a fee after a minimum number of transactions.&lt;br /&gt;&lt;br /&gt;The flat percentage fee is regressive, especially because it punishes smaller sellers. It is in this sense that finance, digital or otherwise, is never scale-neutral. The fact that the immediate victims of demonetisation are small-scale producers and retailers implies that the balance has been tilted against them and in favour of larger producers and retailers after November 8. By skewing the field against small and tiny enterprises, demonetisation has been the vehicle for a massive and unprecedented transfer of incomes and wealth from the poor to the rich.&lt;br /&gt;&lt;br /&gt;There is also a fundamental asymmetry in the use of technology in the financial services industry. ATMs, which have been around for decades, were originally touted as a technology that increases efficiency in the use of cash; you only need to withdraw as much as you need, so there is no motive to hoard cash. But that was not the motive for introducing ATMs; the real reason was that they enabled banks to reduce their workforce and cut costs. As ATMs became more ubiquitous, banks started moving from cost-cutting to profit-seeking by levying a fee for every transaction above a minimum threshold.
In effect, the gains from technology are boosting the profitability of banks while the wider systemic benefits made possible by the same technology have been sacrificed, as the imposition of fees above a minimum threshold actually drives people to hoard cash.&lt;br /&gt;&lt;br /&gt;A study by Visa in October 2016, titled Accelerating The Growth of Digital Payments in India: A Five-Year Outlook, reveals that a one percentage point reduction in cash in circulation as a percentage of GDP would require digital transactions as a share of personal consumption expenditure to multiply ninefold. In other words, Visa suggested that digital transactions as a percentage of personal consumption expenditure would need to increase from 4 per cent to 36 per cent if the cash-GDP ratio is to fall from 11 per cent to 10 per cent.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Security concerns&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Apart from these weighty economic issues, which are central to the move towards digital financial transactions, there are other critically important issues that the committee has either ignored or swept under the carpet. The question of privacy and security was a central issue at a recent conference on digital payments organised by HasGeek, a platform for software developers, in Bengaluru. Several experts, including some from the payments industry, pointed out the serious security and privacy issues that are being ignored in the rush to go digital. For example, an expert on data security warned that the mindless rush to mobile-based transactions was especially scary because most Android phones are vulnerable because they leak data. In fact, he noted that it may be safer for Android mobile users to perform digital transactions using desktop browsers.&lt;br /&gt;&lt;br /&gt;But what is more scary is the manner in which Aadhaar is being touted by the committee as the magic wand by which the digital era can be ushered in quickly. It recommends that mobile number-based and Aadhaar-based “fully interoperable payments” be prioritised within 60 days and that the National Payments Corporation of India (NPCI) be responsible for ensuring this.&lt;br /&gt;&lt;br /&gt;There has been significant resistance to the idea of an Aadhaar-enabled service for digital transactions, primarily because of security and privacy concerns. Entities such as the Centre for Internet and Society have warned against linking Aadhaar to the financial inclusion project because it violates the Supreme Court stricture against making Aadhaar mandatory. Kiran Jonnalagadda of HasGeek pointed out that the Aadhaar system offered only “single factor authorisation”. 
He said in a recent tweet that Aadhaar involved only a permanent login ID without “a changeable password”, which, from a systemic point of view, made it open to abuse.&lt;br /&gt;&lt;br /&gt;Longstanding critics of the Aadhaar project have pointed out that the launch of such a countrywide programme, at a time when a regulatory regime is not even in place and India does not have privacy protection laws, is dangerously misplaced. They have pointed to the fact that unlike a debit or credit card, which can be replaced when its integrity has been compromised, the biometric characteristics of a user, once stolen, are compromised forever. This is not science fiction but a very real possibility, as has been demonstrated across the world.&lt;br /&gt;&lt;br /&gt;There are also serious worries that the high failure rate of biometric verification would hurt the poor, supposedly the main target group of the Aadhaar project; the large-scale denial of services such as access to the public distribution system has already been documented across the country. Extending a failed system to real-time financial transactions thus appears dangerously misplaced. The fundamental issue is this: can a digital mode of payment effectively provide the same level of trust between the transacting parties that is central to a cash-based transaction? The answer depends critically on whether the digital mode provides the same level of convenience, cost, predictability and certainty.&lt;br /&gt;&lt;br /&gt;The Watal Committee has produced the report that its political masters sought. Its lack of appreciation of the economic issues underpinning financial transactions and of the wider economic processes in the Indian economy is obvious. Effectively, it has delivered what the Modi government asked for—an impossible road map to a cashless nirvana for a people already suffering the effects of demonetisation.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/frontline-v-sridhar-march-3-2017-digital-illusions'&gt;https://cis-india.org/internet-governance/news/frontline-v-sridhar-march-3-2017-digital-illusions&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>praskrishna</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Aadhaar</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2017-02-16T14:53:39Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>




</rdf:RDF>
