<?xml version="1.0" encoding="utf-8" ?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:syn="http://purl.org/rss/1.0/modules/syndication/" xmlns="http://purl.org/rss/1.0/">
<channel rdf:about="https://cis-india.org/search_rss">
  <title>Centre for Internet and Society</title>
  <link>https://cis-india.org</link>
  
  <description>These are the search results for the query, showing results 241 to 255.</description>
  <image rdf:resource="https://cis-india.org/logo.png"/>

  <items>
    <rdf:Seq>
      <rdf:li rdf:resource="https://cis-india.org/raw/indian-express-august-12-2018-nishant-shah-digital-native-double-speak"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/huffington-post-august-25-2018-paul-bluementhal-and-gopal-sathe-indias-biometric-database-is-creating-a-perfect-surveillance-state"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/the-srikrishna-committee-data-protection-bill-and-artificial-intelligence-in-india"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/files/analysis-of-cloud-act-and-implications-for-india"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/factor-daily-anand-murali-august-13-2018-the-big-eye"/>
      <rdf:li rdf:resource="https://cis-india.org/raw/call-for-essays-offline"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/ai-and-governance-case-study-pdf"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/files/normative-regulation-of-cyber-space-report"/>
      <rdf:li rdf:resource="https://cis-india.org/about/newsletters/july-2018-newsletter"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/business-standard-july-31-2018-sunil-abraham-spreading-unhappiness-equally-around"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/economic-times-july-30-2018-sunil-abraham-lining-up-data-on-srikrishna-privacy-draft-bill"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/huffington-post-gopal-sathe-july-16-2018-after-securing-net-neutrality-in-india-trai-goes-to-bat-for-data-privacy"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/livemint-july-26-2018-mihir-dalal-and-anirban-sen-byte-by-byte-protecting-her-privacy"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/the-centre-for-internet-and-society2019s-comments-and-recommendations-to-the-indian-privacy-code-2018"/>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/files/indian-privacy-code"/>
    </rdf:Seq>
  </items>
  </items>

</channel>


    <item rdf:about="https://cis-india.org/raw/indian-express-august-12-2018-nishant-shah-digital-native-double-speak">
    <title>Digital Native: Double Speak</title>
    <link>https://cis-india.org/raw/indian-express-august-12-2018-nishant-shah-digital-native-double-speak</link>
    <description>
        &lt;b&gt;Aadhaar’s danger has always been that it opens up individuals to high levels of vulnerability without providing safeguards.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article was published in &lt;a class="external-link" href="https://indianexpress.com/article/express-sunday-eye/digital-native-aadhaar-double-speak-5300540/"&gt;Indian Express&lt;/a&gt; on August 12, 2018.&lt;/p&gt;
&lt;hr style="text-align: justify; " /&gt;
&lt;p style="text-align: justify; "&gt;This has been a month of Twitter drama. In the latest episode,  Twitter exploded once again with RS Sharma, the chief of the Telecom  Regulatory Authority of India (TRAI). Sharma revealed his &lt;a href="https://indianexpress.com/article/what-is/what-is-aadhaar-card-and-where-is-it-mandatory-4587547/"&gt;Aadhaar&lt;/a&gt; number on Twitter and challenged the world (#facepalm) to do their  worst. The Twitterati moved quickly and decided to go 50 Shades of Grey  on Sharma.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In less than 24 hours, French security researcher Elliot Alderson,  who has been systematically showing vulnerabilities in Aadhaar’s  technical infrastructure, fished out Sharma’s personal address, birth  date, email, alternate phone number, and PAN number. A few other ethical  hackers got hold of his bank account details and used &lt;a href="https://indianexpress.com/about/paytm/"&gt;Paytm&lt;/a&gt; apps to transfer money to one of his bank accounts. Sharma made a  grandstand of how this information is not “state secret” and that this  was already peppered across the internet for anybody to find. The UIDAI,  while calling his tactics a cheap hack, announced that the Aadhaar  database was not “hacked” to retrieve this information and that our  precious private data is safe in those hands.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;What remains really bizarre, in both the responses from Sharma and the UIDAI, however, is their willing blindness to what networked information systems do and look like. There are three main points to consider here. Sharma, marked by privilege, protected by power, and confident in his ability to protect himself in case of threat, might dismiss this private information as non-critical. However, what he fails to realise is that the same data, for somebody in a precarious condition, might be sensitive enough to have their life collapse on them. On the nefarious digital worlds of the Indian web, where women are regularly threatened with rape and death as a form of silencing them, where queer people are stalked and followed in real life for blackmail and abuse, where resistant actors find their families threatened, this information in the public domain could literally be a matter of life and death. In the past, with much less information available, we have seen how specific communities could be targeted in times of communal tension and violence. The fact that the head of TRAI cannot look beyond his gilded privilege to the conditions of precariousness that data leaks like these could lead to is shameful.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Perhaps even more alarming is the UIDAI’s consistent myopic focus on what constitutes safe data. While I have no doubt that the incredible engineers and security experts are working hard to keep the Aadhaar data secure, the Twitter ethical hackers were not making claims of hacking a database at all. They were merely demonstrating why centralised unique ids, which perform acts of causative correlation, have the capacity to build surveillance states without even meaning to. Their data exposure is indicative of the fact that while Aadhaar itself does not carry much information, the linkages it makes with multiple other databases — tax offices, bank accounts, public services, emails, phone numbers, etc. — can expose information profiles without our consent. In fact, the danger of Aadhaar has never been that as a technical system it doesn’t work. The threat that it poses is that as a social and a cultural transaction system it opens up individuals to high levels of precariousness without building privacy safeguards for those who might fall through the cracks.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;What remains the most disappointing in this entire piece of melodrama  is that the conversations keep on unfolding at two different registers.  The Aadhaar activists have been asking not for a dismantling of the  system but to build ethical, compassionate, flexible and constitutional  checks and balances at the core of the system. Ever since its inception,  the demand has been clear: build privacy, security, safety, and human  care into the DNA of the system, and not in its afterthought. The UIDAI  has persistently neglected and willfully dismissed these demands, thus  privileging the security of their infrastructure and data over the  safety of their citizens.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/raw/indian-express-august-12-2018-nishant-shah-digital-native-double-speak'&gt;https://cis-india.org/raw/indian-express-august-12-2018-nishant-shah-digital-native-double-speak&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>nishant</dc:creator>
    <dc:rights></dc:rights>

    <dc:subject>Researchers at Work</dc:subject>
    <dc:subject>Digital Natives</dc:subject>

   <dc:date>2018-09-04T15:22:59Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/huffington-post-august-25-2018-paul-bluementhal-and-gopal-sathe-indias-biometric-database-is-creating-a-perfect-surveillance-state">
    <title>India’s Biometric Database Is Creating A Perfect Surveillance State — And U.S. Tech Companies Are On Board</title>
    <link>https://cis-india.org/internet-governance/news/huffington-post-august-25-2018-paul-bluementhal-and-gopal-sathe-indias-biometric-database-is-creating-a-perfect-surveillance-state</link>
    <description>
        &lt;b&gt;The Aadhaar program offers a glimpse of the tech world's latest quest to control our lives, where dystopias are created in the name of helping the impoverished.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article by Paul Blumenthal and Gopal Sathe was published in &lt;a class="external-link" href="https://www.huffingtonpost.in/entry/india-aadhuar-tech-companies_us_5b7ebc53e4b0729515109fd0"&gt;Huffington Post&lt;/a&gt; on August 25, 2018. Sunil Abraham was quoted.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;Big U.S. technology  companies are involved in the construction of one of the most intrusive  citizen surveillance programs in history.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;For the past nine years, India has  been building the world’s biggest biometric database by collecting the  fingerprints, iris scans and photos of nearly 1.3 billion people. For  U.S. tech companies like Microsoft, Amazon and Facebook, the project,  called Aadhaar (which means “proof” or “basis” in Hindi), could be a  gold mine.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The CEO of Microsoft has repeatedly praised the project, and local media have carried frequent reports on &lt;a href="https://m.economictimes.com/tech/hardware/uidai-wants-to-make-mobile-phones-aadhaar-enabled-holds-discussion-with-smartphone-makers/amp_articleshow/53441186.cms?__twitter_impression=true" rel="noopener noreferrer" target="_blank"&gt;consultations between the Indian government and senior executives&lt;/a&gt; from companies like Apple and Google (in addition to South Korean-based  Samsung) on how to make tech products Aadhaar-enabled. But when  reporters of HuffPost and HuffPost India asked these companies in the  past weeks to confirm they were integrating Aadhaar into their products,  only one company ― Google ― gave a definitive response.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;That’s because Aadhaar has become deeply controversial, and the subject of a major Supreme Court of India case that will decide the future of the program as early as this month. Launched nine years ago as a simple and revolutionary way to streamline access to welfare programs for India’s poor, the database has become Indians’ gateway to nearly any type of service ― from food stamps to a passport or a cell phone connection. Practical errors in the system have caused &lt;a href="https://stateofaadhaar.in/report_pages/state-of-aadhaar-report-2017-18/" rel="noopener noreferrer" target="_blank"&gt;millions&lt;/a&gt; of poor Indians to lose out on aid. And the exponential growth of the project has sparked concerns among security researchers and academics that Aadhaar is the first step toward setting up a surveillance society to rival China’s.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;&lt;b&gt;A Scheme Born In The U.S.&lt;/b&gt;&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Tapping into Aadhaar would help big  tech companies access the data and transactions of millions of users in  the second most populous country on earth, explained &lt;a href="https://www.huffingtonpost.in/2018/06/06/after-beta-testing-on-a-billion-indians-the-tech-behind-aadhaar-is-going-global_a_23452248/" rel="noopener noreferrer" target="_blank"&gt;Usha Ramanathan&lt;/a&gt;, a Delhi-based lawyer, legal researcher and one of Aadhaar’s most vocal critics.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The idea for India’s national biometric identification system wasn’t unprecedented, and in fact, it has strong parallels with a system proposed for the United States. Following the Sept. 11, 2001, attacks, the CEO of Oracle, Larry Ellison, offered to build the &lt;a href="https://www.computerworld.com/article/2583197/data-privacy/ellison-offers-free-software-for-national-id.html" rel="noopener noreferrer" target="_blank"&gt;U.S. government software&lt;/a&gt; for a national identification system that would include a centralized computer database of all U.S. citizens. The program never got off the ground amid objections from privacy and civil liberties advocates, but India’s own Ellison figure, Nandan Nilekani, had a similar idea. The billionaire founder of IT consulting giant Infosys, Nilekani conceptualized Aadhaar as a way to eliminate waste and corruption in India’s social welfare programs. He lobbied the government to bring in Aadhaar, and went on to run the project under the administration of Manmohan Singh. Nilekani gained even more influence under current Prime Minister Narendra Modi, who moved to make Aadhaar necessary for almost any kind of business in India.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The first 12-digit Aadhaar ID was  issued in 2010. Today, over a billion people (around 89 percent of  India’s population) have been included in the system ― from India’s  unimaginably wealthy billionaires to the homeless, from residents of the  country’s sprawling cities to remote inaccessible villages. While  initially a voluntary program, the database is now linked to just about  all government programs. You need an Aadhaar ID to get a &lt;a href="https://www.businesstoday.in/current/economy-politics/uidai-aadhaar-tatkal-passports-deadline-extension-order/story/272576.html" rel="noopener noreferrer" target="_blank"&gt;passport issued or renewed&lt;/a&gt;. Aadhaar was made mandatory for operating a bank account, using a cell phone or investing in mutual funds, only for the proposals to be rolled back pending the Supreme Court verdict on the constitutionality of the project.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;As Aadhaar identification became  integrated into other systems like banking, cell phones and government  programs, tech companies can use the program to cross-reference their  datasets against other&lt;a href="https://www.hindustantimes.com/india-news/why-state-data-hubs-pose-a-risk-to-aadhaar-security/story-Klyl3yT5MkFk6Szg2yGg9N.html" rel="noopener noreferrer" target="_blank"&gt; databases&lt;/a&gt; and assemble a far more detailed and intrusive picture of Indians’  lives. That would allow them, for example, to better target products or  advertising to the vast Indian population. “You can take a unique  identifying number and use it to find data in different sectors,”  explained &lt;a href="https://www.huffingtonpost.in/2018/04/25/aadhaar-seeding-fiasco-how-to-geo-locate-every-minority-family-in-ap-with-one-click_a_23419643/" rel="noopener noreferrer" target="_blank"&gt;Pam Dixon&lt;/a&gt;,  executive director of the World Privacy Forum, an American public  interest research group. “That number can be cross-walked across all the  different parts of their life.”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Microsoft, which uses Aadhaar in a new version of Skype to verify users, declined to talk about its work integrating products with the Aadhaar database. But Bill Gates, Microsoft’s founder, &lt;a href="https://timesofindia.indiatimes.com/business/india-business/aadhaar-doesnt-pose-any-privacy-issue-gates/articleshow/64012833.cms" rel="noopener noreferrer" target="_blank"&gt;has publicly endorsed Aadhaar&lt;/a&gt; and his foundation is funding a World Bank program to bring Aadhaar-like ID programs to other countries. Gates has also argued that ID verification schemes like Aadhaar in themselves don’t pose privacy issues. Microsoft CEO Satya Nadella has repeatedly praised Aadhaar in both his recent book and a &lt;a href="https://gadgets.ndtv.com/internet/features/satya-nadella-and-nandan-nilekani-talk-aadhaar-india-stack-ai-and-ar-1661798" rel="noopener noreferrer" target="_blank"&gt;tour across India&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Amazon did not respond to a request for comment, but according to a &lt;a href="https://www.buzzfeednews.com/article/pranavdixit/amazon-is-asking-indians-to-hand-over-their-aadhaar-indias" rel="noopener noreferrer" target="_blank"&gt;BuzzFeed report&lt;/a&gt;, the company told Indian customers not  uploading a copy of Aadhaar “might result in a delay in the resolution  or no resolution” of cases where packages were missing.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Facebook, too, failed to respond to repeated requests for comment, though the platform’s prompts for users to log in with the same name as their Aadhaar card raised suspicions from &lt;a href="https://gadgets.ndtv.com/social-networking/news/facebook-aadhaar-real-name-new-user-sign-up-onboarding-process-test-1792648" rel="noopener noreferrer" target="_blank"&gt;users&lt;/a&gt; that it wanted everyone to use their Aadhaar-verified names and spellings so it could later build in Aadhaar functionality with minimal problems.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;A spokesman for Google, which has its own payments platform in India called Tez, told HuffPost that the company has not integrated any of its products with Aadhaar. But there was outrage earlier in August when the Aadhaar helpline was added &lt;a href="https://www.indiatoday.in/technology/news/story/aadhaar-number-in-phones-uidai-google-clarification-1306344-2018-08-06" rel="noopener noreferrer" target="_blank"&gt;to Android phones without informing users&lt;/a&gt;. Google claimed in a statement to the &lt;a href="https://economictimes.indiatimes.com/news/politics-and-nation/uidai-row-google-says-it-inadvertently-coded-the-number/articleshow/65264353.cms" rel="noopener noreferrer" target="_blank"&gt;&lt;i&gt;Economic Times&lt;/i&gt;&lt;/a&gt; that this happened “inadvertently”.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;&lt;b&gt;Privacy Jeopardized For Millions&lt;/b&gt;&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;But the same features that are set to make tech companies millions are also the ones that threaten the privacy and security of millions of Indians.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;“As long as [the data] is being  shared with so many people and services and companies, without knowing  who has what data, it will always be an issue,” said Srinivas Kodali, an  independent security researcher. “They can’t protect it until they  encrypt it and stop sharing data.”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;One government website allowed users to search and geolocate homes on the basis of &lt;a href="https://www.huffingtonpost.in/2018/04/25/aadhaar-seeding-fiasco-how-to-geo-locate-every-minority-family-in-ap-with-one-click_a_23419643/" rel="noopener noreferrer" target="_blank"&gt;caste and religion&lt;/a&gt; ― sparking fears of ethnic and religious violence in a country where  lynchings, beatings and mob violence are commonplace. Another website  broadcast the names, phone numbers and medical purchases — like generic  Viagra and HIV medication — of &lt;a href="https://www.huffingtonpost.in/2018/06/17/andhra-pradesh-tracked-you-as-you-bought-viagra-then-put-your-name-and-phone-number-on-the-internet-for-the-world-to-see_a_23459943/" rel="noopener noreferrer" target="_blank"&gt;anyone who buys medicines&lt;/a&gt; from government stores. &lt;a href="https://www.huffingtonpost.in/2018/07/11/indias-latest-data-leak-is-so-basic-that-peoples-aadhaar-number-bank-account-and-fathers-name-are-just-one-google-search-away_a_23479694/" rel="noopener noreferrer" target="_blank"&gt;In another leak&lt;/a&gt;, a Google search for phone numbers of farmers in Andhra Pradesh would reveal their Aadhaar numbers, address, fathers’ names and bank account numbers.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The leaks are aggravated by “a Star Trek-type obsession” with data dashboards, said Sunil Abraham, executive director of the Centre for Internet and Society. Many government departments each created an online data dashboard with detailed personal records on individuals, he explained. The massive centralization of personal data, he said, &lt;a href="https://www.huffingtonpost.in/2018/07/23/how-andhra-pradesh-built-indias-first-police-state-using-aadhaar-and-a-census_a_23487838/" rel="noopener noreferrer" target="_blank"&gt;created a huge security risk&lt;/a&gt; as these dashboards were accessible to any government official and, in many cases, were even left open to the public.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Authentication failures have led to deaths among the poorest sections of Indian society &lt;a href="https://timesofindia.indiatimes.com/city/ranchi/7-hunger-deaths-related-to-aadhaar/articleshow/64695700.cms" rel="noopener noreferrer" target="_blank"&gt;when people were denied government food rations&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;And much like the tech companies,  some local governments are using the system to connect data sets and  build expansive surveillance. In the state of Andhra Pradesh in India,  there’s a &lt;a href="https://www.huffingtonpost.in/2018/07/23/how-andhra-pradesh-built-indias-first-police-state-using-aadhaar-and-a-census_a_23487838/" rel="noopener noreferrer" target="_blank"&gt;war room next to the state chief minister’s office&lt;/a&gt;,  where a wall of screens shows details from databases that collect  information from every department. There are security cameras and  dashboards that track every mention of the chief minister on the news.  There’s a separate team watching what’s being said about him on social  media and there are also dashboards that collect information from IoT  [Internet of Things] sensors across the state.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;&lt;b&gt;Court Ruling Could Halt Rollout&lt;/b&gt;&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Those issues around privacy are why the dreams of government bureaucrats and large tech companies to build a perfect surveillance apparatus around Aadhaar may ultimately fall apart. The Supreme Court of India is set to rule on a case that will determine the future of the program.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The court is set to review 27 petitions, including whether requiring  an Aadhaar for government subsidies and benefits makes access to these  programs conditional, even though the state is constitutionally bound to  deliver them. The petitioners include lawyers, academics and a  92-year-old retired judge whose petition also secured the right to  privacy as a fundamental right in August 2017. Petitioners also argue  that the ability for Aadhaar to be used to track and profile people is  unconstitutional.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In its judgment, due any day now, the court will rule on all 27  petitions together. It will decide not only the fate of the Aadhaar Act  of 2016, but likely the future involvement of some of tech’s biggest  companies in one of the world’s most ambitious and divisive IT projects.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/huffington-post-august-25-2018-paul-bluementhal-and-gopal-sathe-indias-biometric-database-is-creating-a-perfect-surveillance-state'&gt;https://cis-india.org/internet-governance/news/huffington-post-august-25-2018-paul-bluementhal-and-gopal-sathe-indias-biometric-database-is-creating-a-perfect-surveillance-state&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Admin</dc:creator>
    <dc:rights></dc:rights>

    <dc:subject>Internet Governance</dc:subject>

   <dc:date>2018-09-04T14:40:51Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/the-srikrishna-committee-data-protection-bill-and-artificial-intelligence-in-india">
    <title>The Srikrishna Committee Data Protection Bill and Artificial Intelligence in India</title>
    <link>https://cis-india.org/internet-governance/blog/the-srikrishna-committee-data-protection-bill-and-artificial-intelligence-in-india</link>
    <description>
        &lt;b&gt;Artificial Intelligence in many ways is in direct conflict with traditional data protection principles and requirements including consent, purpose limitation, data minimization, retention and deletion, accountability, and transparency.&lt;/b&gt;
        &lt;h3 style="text-align: justify; "&gt;Privacy Considerations in AI&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Other related privacy concerns in the context of AI center around re-identification and de-anonymisation, discrimination, unfairness, inaccuracies, bias, opacity, profiling, misuse of data, and embedded power dynamics.&lt;a href="#_ftn1" name="_ftnref1"&gt;&lt;sup&gt;[1]&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The need for large amounts of data to improve accuracy, the ability to process vast amounts of granular data, and the present relationship between the explainability and the results of AI systems&lt;a href="#_ftn2" name="_ftnref2"&gt;&lt;sup&gt;[2]&lt;/sup&gt;&lt;/a&gt; have raised many concerns on both sides of the fence. On one hand, there is concern that heavy-handed or inappropriate regulation will stifle innovation: if developers can only use data for a pre-defined purpose, the prospects of AI are limited. On the other hand, individuals are concerned that privacy will be significantly undermined in light of AI systems that collect and process data in real time and at a personal level not previously possible. Chatbots, house assistants, wearable devices, robot caregivers, facial recognition technology, etc. have the ability to collect data from a person at an intimate level. At the same time, some have argued that AI can work towards protecting privacy by limiting the access that humans working at the respective companies have to personal data.&lt;a href="#_ftn3" name="_ftnref3"&gt;&lt;sup&gt;[3]&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;India is embracing AI. Two national roadmaps for AI were released in 2018 respectively by the Ministry of Commerce and Industry and Niti Aayog. Both roadmaps emphasized the importance of addressing privacy concerns in the context of AI and ensuring that a robust privacy legislation is enacted. In August 2018, the Srikrishna Committee released a draft Personal Data Protection Bill 2018 and the associated report that outlines and justifies a framework for privacy in India. As the development and use of AI in India continues to grow, it is important that India simultaneously moves forward with a privacy framework that addresses the privacy dimensions of AI.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In this article we attempt to analyse if and how the Srikrishna committee draft Bill  and report has addressed AI, contrast this with developments in the EU and the passing of the GDPR, and identify solutions that are being explored towards finding a way to develop AI while upholding and safeguarding privacy.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;The GDPR and Artificial Intelligence&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The General Data Protection Regulation became enforceable in May 2018 and establishes a framework for the processing of personal data of individuals within the European Union. The GDPR has been described by the IAPP as taking a ‘risk based’ approach to data protection that pushes data controllers to engage in risk analysis and adopt ‘risk measured responses’.&lt;a href="#_ftn4" name="_ftnref4"&gt;&lt;sup&gt;[4]&lt;/sup&gt;&lt;/a&gt; Though the GDPR does not explicitly address artificial intelligence, it has a number of provisions that address automated decision making and profiling, and a number of provisions that will impact companies using artificial intelligence in their business activities. These are outlined below:&lt;/p&gt;
&lt;ol style="text-align: justify; "&gt;
&lt;li&gt;&lt;b&gt;Data rights:&lt;/b&gt; The GDPR grants individuals a number of data rights: the right to be informed, right of access, right to rectification, right to erasure, right to restrict processing, right to data portability, right to object, and rights related to automated decision making, including profiling. The last of these seeks to address concerns arising out of automated decision making by giving the individual the right to request not to be subject to a decision based solely on automated processing, including profiling, if the decision would produce legal effects or similarly significantly affect them. There are three exceptions to this right: where the automated decision making is (a) necessary for the performance of a contract, (b) authorised by Union or Member State law, or (c) based on explicit consent.&lt;a href="#_ftn5" name="_ftnref5"&gt;&lt;sup&gt;[5]&lt;/sup&gt;&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;b&gt;Transparency:&lt;/b&gt; Under Article 14, data controllers must notify individuals of the existence of automated decision making, including profiling, and provide meaningful information about the logic involved as well as the potential consequences of such processing.&lt;a href="#_ftn6" name="_ftnref6"&gt;&lt;sup&gt;[6]&lt;/sup&gt;&lt;/a&gt; Importantly, this requirement has the potential of ensuring that companies do not operate complete ‘black box’ algorithms within their business processes.&lt;/li&gt;
&lt;li&gt;&lt;b&gt;Fairness:&lt;/b&gt; The principle of fairness found under Article 5(1) will also apply to the processing of personal data by AI. The principle requires that personal data be processed lawfully, fairly, and in a transparent manner in relation to the data subject. Recital 71 further clarifies that this will include implementing appropriate mathematical and statistical measures for profiling, ensuring that inaccuracies are corrected, and ensuring that processing does not result in discriminatory effects.&lt;a href="#_ftn7" name="_ftnref7"&gt;&lt;sup&gt;[7]&lt;/sup&gt;&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;b&gt;Purpose Limitation:&lt;/b&gt; The principle of purpose limitation (Article 5(1)(b)) requires that personal data be collected for specified, explicit, and legitimate purposes and not be further processed in a manner incompatible with those purposes. Processing for archiving purposes in the public interest, scientific or historical research purposes, or statistical purposes is not considered incompatible with the initial purposes. It has been noted that it is unclear whether research carried out through artificial intelligence would fall under this exception, as the GDPR does not define ‘scientific purposes’.&lt;a href="#_ftn8" name="_ftnref8"&gt;&lt;sup&gt;[8]&lt;/sup&gt;&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;b&gt;Privacy by Design and Default:&lt;/b&gt; Article 25 requires all data controllers to implement technical and organizational measures to meet the requirements of the regulation. This could include techniques like pseudonymisation. Data controllers are also required to implement appropriate technical and organizational measures to ensure that, by default, only personal data which are necessary for a specific purpose are processed.&lt;a href="#_ftn9" name="_ftnref9"&gt;&lt;sup&gt;&lt;sup&gt;[9]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;b&gt;Data Protection Impact Assessments:&lt;/b&gt; Article 35 requires data controllers to undertake impact assessments if they are undertaking processing that is likely to result in a high risk to individuals. This includes systematic and extensive profiling, large-scale processing of special categories of data or of criminal offence data, and large-scale systematic monitoring of publicly accessible places. In implementation, some jurisdictions like the UK require impact assessments on additional conditions, including if the data controller: uses new technologies; uses profiling or special category data to decide on access to services; profiles individuals on a large scale; processes biometric data; processes genetic data; matches data or combines datasets from different sources; collects personal data from a source other than the individual without providing them with a privacy notice; tracks individuals’ location or behaviour; profiles children or targets marketing or online services at them; or processes data that might endanger the individual’s physical health or safety in the event of a security breach.&lt;a href="#_ftn10" name="_ftnref10"&gt;&lt;sup&gt;&lt;sup&gt;[10]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;b&gt;Security:&lt;/b&gt; Article 32 requires data controllers to ensure a level of security appropriate to the risk, including employing methods like encryption and pseudonymisation.&lt;/li&gt;
&lt;/ol&gt;
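&lt;p style="text-align: justify; "&gt;The impact assessment triggers above reduce to an “any condition met” screening check. The sketch below is purely illustrative - the criterion names are our own shorthand, not statutory text - but shows how a controller might encode that screening logic:&lt;/p&gt;

```python
# Illustrative DPIA screening checklist paraphrasing the triggers discussed
# above. The criterion names are assumed shorthand, not legal text.
DPIA_TRIGGERS = [
    "uses_new_technologies",
    "large_scale_profiling",
    "processes_biometric_or_genetic_data",
    "matches_or_combines_datasets",
    "collects_data_without_privacy_notice",
    "tracks_location_or_behaviour",
    "profiles_or_targets_children",
    "breach_could_endanger_physical_safety",
]

def dpia_required(processing: dict) -> bool:
    """A DPIA is indicated if any single trigger applies to the operation."""
    return any(processing.get(trigger, False) for trigger in DPIA_TRIGGERS)

# An AI deployment typically trips "uses_new_technologies" on its own.
ai_rollout = {"uses_new_technologies": True, "large_scale_profiling": True}
# dpia_required(ai_rollout) -> True: at least one trigger applies
```

&lt;p style="text-align: justify; "&gt;The disjunctive shape of the test matters in practice: a controller cannot offset one trigger against the absence of others.&lt;/p&gt;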
&lt;h3 style="text-align: justify; "&gt;Srikrishna Committee Bill and AI&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The Draft Data Protection Bill and the associated report by the Srikrishna Committee were published in August 2018 and recommend a privacy framework for India. The Bill contains a number of provisions that will directly impact data fiduciaries using AI and that try to account for the unintended consequences of emerging technologies like AI. These include:&lt;/p&gt;
&lt;ol style="text-align: justify; "&gt;
&lt;li&gt;&lt;b&gt;Definition of Harm:&lt;/b&gt; The Bill defines harm as including: bodily or mental injury; loss, distortion or theft of identity; financial loss or loss of property; loss of reputation or humiliation; loss of employment; any discriminatory treatment; any subjection to blackmail or extortion; any denial or withdrawal of a service, benefit or good resulting from an evaluative decision about the data principal; any restriction placed or suffered directly or indirectly on speech, movement or any other action arising out of a fear of being observed or surveilled; and any observation or surveillance that is not reasonably expected by the data principal. The Bill also allows for categories of significant harm to be further defined by the data protection authority.&lt;/li&gt;
&lt;/ol&gt;
&lt;p style="text-align: justify; "&gt;Many of the above are harms that have been associated with artificial intelligence - specifically loss of employment, discriminatory treatment, and denial of service. Enabling the data protection authority to further define categories of significant harm could allow unexpected harms arising from the use of AI to come under the ambit of the Bill.&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;&lt;b&gt;Data Rights:&lt;/b&gt; Like the GDPR, the Bill creates a set of data rights for the individual, including the rights to confirmation and access, correction, data portability, and to be forgotten. At the same time, the Bill is intentionally silent on the rights and obligations that have been incorporated into the GDPR to address automated decision making: the right to object to processing,&lt;a href="#_ftn11" name="_ftnref11"&gt;&lt;sup&gt;&lt;sup&gt;[11]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; the right to opt out of automated decision making&lt;a href="#_ftn12" name="_ftnref12"&gt;&lt;sup&gt;&lt;sup&gt;[12]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;, and the obligation on the data controller to inform the individual about the use of automated decision making and basic information regarding the logic and impact of the same.&lt;a href="#_ftn13" name="_ftnref13"&gt;&lt;sup&gt;&lt;sup&gt;[13]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; As justification, the Committee noted the following in its report: the right to restrict processing may be unnecessary in India, as it provides only interim remedies around issues such as inaccuracy of data, and the same result can be achieved by a data principal approaching the DPA or the courts for a stay on processing, or by simply withdrawing consent. The objective of protecting against discrimination, bias, and opaque decisions - which the right to object to automated processing and to receive information about the processing of data seeks to fulfill - would, in the Indian context, be better achieved through an accountability framework requiring specific data fiduciaries that make evaluative decisions through automated means to set up processes that ‘weed out’ discrimination. At the same time, if discrimination has taken place, individuals can seek remedy through the courts.&lt;/li&gt;
&lt;/ul&gt;
&lt;p style="text-align: justify; "&gt;By taking this approach, the Bill creates a framework to address harms arising out of AI, but does not empower the individual to decide how their data is processed and remains silent on the issue of ‘black box’ algorithms.&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;&lt;b&gt;Data Quality&lt;/b&gt;: Requires data fiduciaries to ensure that personal data that is processed is complete, accurate, not misleading and updated with respect to the purposes for which it is processed. In taking steps to comply, data fiduciaries must consider whether the personal data is likely to be used to make a decision about the data principal, whether it is likely to be disclosed to other individuals, and whether the personal data is kept in a form that distinguishes personal data based on facts from personal data based on opinions or personal assessments.&lt;a href="#_ftn14" name="_ftnref14"&gt;&lt;sup&gt;&lt;sup&gt;[14]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p style="text-align: justify; "&gt;This principle, while not mandating that data fiduciaries take into account considerations such as biases in datasets, could potentially be interpreted by the data protection authority to include within its scope means of ensuring that data does not contain or result in bias.&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;&lt;b&gt;Principle of Privacy by Design:&lt;/b&gt; Requires significant data fiduciaries to have in place a number of policies and measures around several aspects of privacy. These include: (a) measures to ensure that managerial, organizational, business practices and technical systems are designed in a manner to anticipate, identify, and avoid harm to the data principal; (b) that the obligations mentioned in Chapter II are embedded in organisational and business practices; (c) that technology used in the processing of personal data is in accordance with commercially accepted or certified standards; (d) that legitimate interests of business, including any innovation, are achieved without compromising privacy interests; (e) that privacy is protected throughout processing, from the point of collection to the deletion of personal data; (f) that processing of personal data is carried out in a transparent manner; and (g) that the interest of the data principal is accounted for at every stage of processing of personal data.&lt;/li&gt;
&lt;/ul&gt;
&lt;p style="text-align: justify; "&gt;A number of these (a, d, e, and g) require that the interest of the data principal is accounted for throughout the processing of personal data. This will be significant for systems driven by artificial intelligence, as several of the harms that have arisen from the use of AI - discrimination, denial of service, and loss of employment - have been brought under the definition of harm within the Bill. Placing the interest of the data principal first is also important in protecting against unintended consequences or harms that may arise from AI.&lt;a href="#_ftn15" name="_ftnref15"&gt;&lt;sup&gt;&lt;sup&gt;[15]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; If enacted, it will be important to see what policies and measures emerge in the context of AI to comply with this principle, and what commercially accepted or certified standards companies rely on to comply with (c).&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;&lt;b&gt;Data Protection Impact Assessment:&lt;/b&gt; Requires data fiduciaries to undertake a data protection impact assessment when implementing new technologies, undertaking large-scale profiling, or using sensitive personal data. Such assessments need to include a detailed description of the proposed processing operation, the purpose of the processing and the nature of the personal data being processed, an assessment of the potential harm that may be caused to the data principals whose personal data is proposed to be processed, and measures for managing, minimising, mitigating or removing such risk of harm. If the Authority finds that the processing is likely to cause harm to the data principals, it may direct the data fiduciary to cease such processing or to undertake it subject to conditions. This requirement applies to all significant data fiduciaries and to other data fiduciaries as required by the DPA.&lt;a href="#_ftn16" name="_ftnref16"&gt;&lt;sup&gt;&lt;sup&gt;[16]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
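&lt;p style="text-align: justify; "&gt;The four elements the Bill requires in an impact assessment - a description of the processing, the purpose and nature of the data, an assessment of potential harms, and mitigation measures - can be pictured as a simple record. The sketch below is a hypothetical structure of ours; the field and method names are assumptions, not anything prescribed by the Bill:&lt;/p&gt;

```python
from dataclasses import dataclass, field

# Hypothetical record mirroring the four elements the Bill requires in a
# data protection impact assessment; names are illustrative, not statutory.
@dataclass
class ImpactAssessment:
    processing_description: str      # detailed description of the operation
    purpose_and_nature_of_data: str  # purpose of processing, nature of data
    potential_harms: list = field(default_factory=list)
    mitigation_measures: list = field(default_factory=list)

    def is_complete(self) -> bool:
        """A filed assessment should describe the processing, identify
        harms, and record measures for managing or mitigating them."""
        return bool(self.processing_description
                    and self.potential_harms
                    and self.mitigation_measures)

dpia = ImpactAssessment(
    processing_description="Retraining a credit-scoring model",
    purpose_and_nature_of_data="Financial records of loan applicants",
    potential_harms=["denial of a service or benefit", "discriminatory treatment"],
    mitigation_measures=["pre-deployment bias testing", "human review of rejections"],
)
```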
&lt;p style="text-align: justify; "&gt;This principle will apply to companies implementing AI systems. For AI systems, it will be important to see how much information the DPA will require under the requirement of data fiduciaries providing detailed descriptions of the proposed processing operation and purpose of processing.&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;&lt;b&gt;Classification of data fiduciaries as significant data fiduciaries&lt;/b&gt;: The Authority has the ability to notify certain categories of data fiduciaries as significant data fiduciaries based on the volume of personal data processed, the sensitivity of the personal data processed, the turnover of the data fiduciary, the risk of harm resulting from any processing being undertaken by the fiduciary, the use of new technologies for processing, and any other factor relevant to causing harm to any data principal. If a data fiduciary falls under the ambit of any of these conditions, it is required to register with the Authority. All significant data fiduciaries must undertake data protection impact assessments, maintain records as per the Bill, undergo data audits, and have in place a data protection officer.&lt;/li&gt;
&lt;/ul&gt;
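&lt;p style="text-align: justify; "&gt;The classification factors - volume, sensitivity, turnover, risk of harm, and use of new technologies - amount to a disjunctive test: meeting any one factor is enough. A minimal sketch, with entirely assumed thresholds (the Bill leaves the actual criteria and values to be notified by the Authority):&lt;/p&gt;

```python
# Hypothetical thresholds: the Bill leaves the actual criteria and values
# to be notified by the Data Protection Authority.
VOLUME_THRESHOLD = 1_000_000        # records (assumed)
TURNOVER_THRESHOLD = 500_000_000    # INR (assumed)

def is_significant_fiduciary(volume_records: int,
                             processes_sensitive_data: bool,
                             annual_turnover_inr: int,
                             uses_new_technologies: bool) -> bool:
    """Meeting any one factor is enough to attract the classification."""
    return (volume_records >= VOLUME_THRESHOLD
            or processes_sensitive_data
            or annual_turnover_inr >= TURNOVER_THRESHOLD
            or uses_new_technologies)

# A small AI start-up still qualifies via the "new technologies" factor.
small_ai_firm = is_significant_fiduciary(10_000, False, 0, True)
```

&lt;p style="text-align: justify; "&gt;On this reading, registration, impact assessments, record-keeping, audits, and a data protection officer would all follow from a single factor being met.&lt;/p&gt;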
&lt;p style="text-align: justify; "&gt;Under this provision, companies deploying artificial intelligence would likely come under the definition of a significant data fiduciary and be subject to the requirements, such as privacy by design, articulated in the chapter. The exception to this will be if the data fiduciary comes under the definition of ‘small entity’ found in section 48.&lt;a href="#_ftn17" name="_ftnref17"&gt;&lt;sup&gt;&lt;sup&gt;[17]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;&lt;b&gt;Restrictions on cross border transfer of personal data: &lt;/b&gt;Requires that all data fiduciaries must store a copy of personal data on a server or data centre located in India and notified categories of critical personal data must be processed in servers located in India.&lt;/li&gt;
&lt;/ul&gt;
&lt;p style="text-align: justify; "&gt;It is interesting to note that, in the context of cross-border sharing of data, the Bill creates a new category of data that can be further defined beyond personal and sensitive personal data. For companies implementing artificial intelligence, this provision may prove cumbersome to comply with, as many utilize cloud storage and facilities located outside of India for the processing of large amounts of data.&lt;a href="#_ftn18" name="_ftnref18"&gt;&lt;sup&gt;&lt;sup&gt;[18]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;&lt;b&gt;Powers and functions of the Authority&lt;/b&gt;: The Bill lays down a number of functions of the Authority one being to monitor technological developments and commercial practices that may affect protection of personal data.&lt;/li&gt;
&lt;/ul&gt;
&lt;p style="text-align: justify; "&gt;Presumably, this will include monitoring technological developments in the field of artificial intelligence.&lt;a href="#_ftn19" name="_ftnref19"&gt;&lt;sup&gt;&lt;sup&gt;[19]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;&lt;b&gt;Fair and reasonable processing: &lt;/b&gt;Requires that any person processing personal data owes a duty to the data principal to process such personal data in a fair and reasonable manner that respects the privacy of the data principal. In its report, the Srikrishna Committee explains that the fair and reasonable processing requirement is meant to address: power asymmetries between data subjects and data fiduciaries, recognizing that data fiduciaries have a responsibility to act in the best interest of the data principal; situations where processing may be legal but not necessarily fair or in the best interest of the data principal; and the development of trust between the data principal and the data fiduciary.&lt;a href="#_ftn20" name="_ftnref20"&gt;&lt;sup&gt;&lt;sup&gt;[20]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p style="text-align: justify; "&gt;This is in contrast to the GDPR which requires processing to simultaneously meet the three conditions of fairness, lawfulness, and transparency.&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;&lt;b&gt;Purpose Limitation: &lt;/b&gt;Personal data can only be processed for the purposes specified or any other purpose that the data principal would reasonably expect.&lt;/li&gt;
&lt;/ul&gt;
&lt;p style="text-align: justify; "&gt;As a note, the Srikrishna Committee Bill does not include ‘scientific purposes’ as an exception to the principle of purpose limitation as found in the GDPR,&lt;a href="#_ftn21" name="_ftnref21"&gt;&lt;sup&gt;&lt;sup&gt;[21]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; and instead creates an exception for research, archiving, or statistical purposes.&lt;a href="#_ftn22" name="_ftnref22"&gt;&lt;sup&gt;&lt;sup&gt;[22]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; The DPA has the responsibility of developing codes defining research purposes under the act.&lt;a href="#_ftn23" name="_ftnref23"&gt;&lt;sup&gt;&lt;sup&gt;[23]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;ol style="text-align: justify; "&gt;
&lt;li&gt;&lt;b&gt;Security Safeguards:&lt;/b&gt; Every data fiduciary must implement appropriate security safeguards including the use of methods such as de-identification and encryption, steps to protect the integrity of personal data, and steps necessary to prevent misuse, unauthorised access to, modification, and disclosure or destruction of personal data.&lt;a href="#_ftn24" name="_ftnref24"&gt;&lt;sup&gt;&lt;sup&gt;[24]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/li&gt;
&lt;/ol&gt;
&lt;p style="text-align: justify; "&gt;Unlike the GDPR, which explicitly refers to the technique of pseudonymization, the Srikrishna Bill uses the term de-identification. The Srikrishna Report clarifies that this includes techniques like pseudonymization and masking, and further clarifies that, because of the risk of re-identification, de-identified personal data should still receive the same level of protection as personal data. The Bill further gives the DPA the authority to define appropriate levels of anonymization.&lt;a href="#_ftn25" name="_ftnref25"&gt;&lt;sup&gt;&lt;sup&gt;[25]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
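&lt;p style="text-align: justify; "&gt;De-identification of the kind the Report describes can be illustrated with two common techniques: keyed pseudonymization, which replaces a direct identifier with a value that can only be linked back using a separately held key, and masking. This is a minimal sketch of ours - the key handling and field names are assumptions, not guidance from the Bill:&lt;/p&gt;

```python
import hashlib
import hmac

# Illustrative only: real key management (generation, rotation, storage
# separate from the data) is assumed, and the field names are hypothetical.
SECRET_KEY = b"store-this-key-separately-from-the-data"

def pseudonymise(identifier: str) -> str:
    """Keyed pseudonymization: replace a direct identifier with an HMAC.
    Re-identification requires the separately held key."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

def mask(value: str, visible: int = 4) -> str:
    """Masking: retain only the last few characters of a value."""
    return "*" * (len(value) - visible) + value[-visible:]

record = {"id_number": "123412341234", "email": "user@example.com"}
deidentified = {
    "id_number": mask(record["id_number"]),       # "********1234"
    "subject_id": pseudonymise(record["email"]),  # stable keyed hash
}
```

&lt;p style="text-align: justify; "&gt;The Report’s caution about re-identification applies directly here: anyone holding the key (or able to enumerate likely inputs) can reverse the pseudonym, which is why de-identified data still warrants full protection.&lt;/p&gt;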
&lt;h3 style="text-align: justify; "&gt;Technical perspectives of Privacy and AI&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;There is an emerging body of work looking at solutions to the dilemma of maintaining privacy while employing artificial intelligence, and at ways in which artificial intelligence can support and strengthen privacy. For example, there are AI-driven platforms that leverage the technology to help a business meet regulatory compliance with data protection laws&lt;a href="#_ftn26" name="_ftnref26"&gt;&lt;sup&gt;&lt;sup&gt;[26]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;, as well as research into AI privacy enhancing technologies.&lt;a href="#_ftn27" name="_ftnref27"&gt;&lt;sup&gt;&lt;sup&gt;[27]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Standards-setting bodies like the IEEE have undertaken work on the ethical considerations in the collection and use of personal data when designing, developing, and/or deploying AI, through the standard ‘Ethically Aligned Design’.&lt;a href="#_ftn28" name="_ftnref28"&gt;&lt;sup&gt;&lt;sup&gt;[28]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; In the article &lt;i&gt;Artificial Intelligence and Privacy&lt;/i&gt;, Datatilsynet - the Norwegian Data Protection Authority&lt;a href="#_ftn29" name="_ftnref29"&gt;&lt;sup&gt;&lt;sup&gt;[29]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; - breaks such methods into three categories:&lt;/p&gt;
&lt;ol style="text-align: justify; "&gt;
&lt;li&gt;Techniques for reducing the need for large amounts of training data. These can include:&lt;/li&gt;
&lt;ol&gt;
&lt;li&gt;&lt;b&gt;Generative adversarial networks (GANs):&lt;/b&gt; GANs are used to create synthetic data and can address the need for large volumes of labelled data without relying on real data containing personal data. GANs could potentially be useful from a research and development perspective in sectors like healthcare, where most data would qualify as sensitive personal data.&lt;/li&gt;
&lt;li&gt;&lt;b&gt;Federated Learning:&lt;/b&gt; Federated learning allows models to be trained and improved on data from a large pool of users without directly collecting user data. This is achieved by distributing a centralized model to client units, where it is improved using local data. The changes from these improvements are shared back with the centralized server, and an average of the changes from multiple individual client units becomes the basis for improving the centralized model.&lt;/li&gt;
&lt;li&gt;&lt;b&gt;Matrix Capsules&lt;/b&gt;: Proposed by Google researcher Geoff Hinton, Matrix Capsules improve the accuracy of existing neural networks while requiring less data.&lt;a href="#_ftn30" name="_ftnref30"&gt;&lt;sup&gt;&lt;sup&gt;[30]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/li&gt;
&lt;/ol&gt;
&lt;li&gt;Techniques that uphold data protection without reducing the basic data set&lt;/li&gt;
&lt;ol&gt;
&lt;li&gt;&lt;b&gt;Differential Privacy&lt;/b&gt;: Differential privacy intentionally adds ‘noise’ to data when it is accessed. This allows aggregate results over personal data to be released without revealing identifying information.&lt;/li&gt;
&lt;li&gt;&lt;b&gt;Homomorphic Encryption:&lt;/b&gt; Homomorphic encryption allows for the processing of data while it is still encrypted. This addresses the need to access and use large amounts of personal data for multiple purposes.&lt;/li&gt;
&lt;li&gt;&lt;b&gt;Transfer Learning&lt;/b&gt;: Instead of building a new model, transfer learning builds upon existing models that are applied to new, related purposes or tasks. This has the potential to reduce the amount of training data needed.&lt;/li&gt;
&lt;li&gt;&lt;b&gt;RAIRD&lt;/b&gt;: Developed by Statistics Norway and the Norwegian Centre for Research Data, RAIRD is a national research infrastructure that allows for access to large amounts of statistical data for research while managing statistical confidentiality. This is achieved by allowing researchers access to metadata. The metadata is used to build analyses which are then run against detailed data without giving access to actual data.&lt;a href="#_ftn31" name="_ftnref31"&gt;&lt;sup&gt;&lt;sup&gt;[31]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/li&gt;
&lt;/ol&gt;
&lt;li&gt;Techniques to move beyond opaque algorithms&lt;/li&gt;
&lt;ol&gt;
&lt;li&gt;&lt;b&gt;Explainable AI (XAI): &lt;/b&gt;DARPA, in collaboration with Oregon State University, is researching how to create explainable models and explanation interfaces while ensuring a high level of learning performance, in order to enable individuals to interact with, trust, and manage artificial intelligence.&lt;a href="#_ftn32" name="_ftnref32"&gt;&lt;sup&gt;&lt;sup&gt;[32]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; DARPA identifies a number of entities working on different models and interfaces for analytics and autonomy AI.&lt;a href="#_ftn33" name="_ftnref33"&gt;&lt;sup&gt;&lt;sup&gt;[33]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;b&gt;Local Interpretable Model Agnostic Explanations&lt;/b&gt;: Developed to enable trust between AI models and humans by generating explainers to highlight key aspects that were important to the model and its decision - thus providing insight into the rationale behind a model.&lt;a href="#_ftn34" name="_ftnref34"&gt;&lt;sup&gt;&lt;sup&gt;[34]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/li&gt;
&lt;/ol&gt; &lt;/ol&gt;
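&lt;p style="text-align: justify; "&gt;Of the techniques above, differential privacy is the most readily sketched in code. The example below implements the standard Laplace mechanism for a counting query: a count changes by at most one when a single record changes (sensitivity 1), so Laplace noise of scale 1/epsilon gives epsilon-differential privacy. The data and epsilon values are illustrative:&lt;/p&gt;

```python
import random

# Minimal sketch of the Laplace mechanism; dataset and epsilon are illustrative.
def laplace_noise(scale: float) -> float:
    # The difference of two i.i.d. exponential variables is Laplace-distributed.
    return random.expovariate(1 / scale) - random.expovariate(1 / scale)

def private_count(records, predicate, epsilon: float = 1.0) -> float:
    """Counting query under epsilon-differential privacy: a count has
    sensitivity 1, so Laplace noise of scale 1/epsilon suffices."""
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1 / epsilon)

ages = [23, 37, 41, 29, 55, 62, 38]
# Each query returns the true count (3) perturbed by fresh random noise;
# a smaller epsilon means more noise and stronger privacy.
noisy = private_count(ages, lambda a: a > 40, epsilon=0.5)
```

&lt;p style="text-align: justify; "&gt;The noise is re-drawn on every query, which is what prevents an analyst from learning whether any one individual is in the dataset, at the cost of some accuracy in each released statistic.&lt;/p&gt;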
&lt;h3 style="text-align: justify; "&gt;Public Sector use of AI and Privacy&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The role of AI in public sector decision making has been gradually growing globally across sectors such as law enforcement, education, transportation, judicial decision making and healthcare. In India too, the use of automated processing - in electronic governance under the Digital India mission, in domestic law enforcement agencies’ monitoring of social media content, and in educational schemes - is being discussed and gradually implemented. Much like the potential applications of AI across sub-sectors, the nature of the regulatory issues is also diverse.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Aside from the accountability framework discussed in the Srikrishna Committee report, the Puttaswamy judgment also provides a basis for the governance of AI with respect to its concerns for privacy, in limited contexts. The sources of the right to privacy as articulated in the Puttaswamy judgments included ‘personal liberty’ under Article 21 of the Constitution. In order to fully appreciate how constitutional principles could apply to automated processing in India, we need to look closely at the origins of privacy under liberty. In the famous case of &lt;i&gt;AK Gopalan&lt;/i&gt; there is a protracted discussion on the contents of the rights under Article 21. The majority opinions themselves were divided: while Sastri J. and Mukherjea J. took the restrictive view, limiting the protections to bodily restraint and detention, Kania J. and Das J. took a broader view, holding them to include rights such as the right to sleep and play. Through &lt;i&gt;RC Cooper&lt;/i&gt;&lt;a href="#_ftn35" name="_ftnref35"&gt;&lt;sup&gt;&lt;sup&gt;[35]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; and &lt;i&gt;Maneka&lt;/i&gt;&lt;a href="#_ftn36" name="_ftnref36"&gt;&lt;sup&gt;&lt;sup&gt;[36]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;, the Supreme Court took steps to reverse the majority opinion in &lt;i&gt;Gopalan&lt;/i&gt;, and it was established that the freedoms and rights in Part III could be addressed by more than one provision. The expansion of ‘personal liberty’ began in &lt;i&gt;Kharak Singh&lt;/i&gt;, where unjustified interference with a person’s right to live in his house was held to be violative of Article 21. The reasoning in &lt;i&gt;Kharak Singh&lt;/i&gt; draws heavily from&lt;i&gt; Munn&lt;/i&gt; v. 
&lt;i&gt;Illinois&lt;/i&gt;&lt;a href="#_ftn37" name="_ftnref37"&gt;&lt;sup&gt;&lt;sup&gt;[37]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; which held life to be “more than mere animal existence.” Curiously, after taking this position, &lt;i&gt;Kharak Singh&lt;/i&gt; fails to recognise a fundamental right to privacy (analogous to the Fourth Amendment protection in the US) under Article 21. The position taken in &lt;i&gt;Kharak Singh&lt;/i&gt; was to extrapolate to ‘personal liberty’ the same method of wide interpretation as was accorded to ‘life’. &lt;i&gt;Maneka&lt;/i&gt;, which evolved the test for unenumerated rights within Part III, says that the claimed right must be an integral part of, or of the same nature as, the named right: it must be ‘in reality and substance nothing but an instance of the exercise of the named fundamental right’. The clear reading of privacy into ‘personal liberty’ in this judgment is effectively a correction of the inherent inconsistencies in the positions taken by the majority in &lt;i&gt;Kharak Singh&lt;/i&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The other significant change in constitutional interpretation that occurred in Maneka was with respect to the phrase ‘procedure established by law’ in Article 21. In Gopalan, the majority held that the phrase ‘procedure established by law’ does not mean procedural due process or natural justice. What this meant was that, once a ‘procedure’ was ‘established by law’, Article 21 could not be said to have been infringed. This position was entirely reversed in Maneka. The ratio in Maneka said that ‘procedure established by law’ must be fair, just and reasonable, and cannot be arbitrary and fanciful. Therefore, any infringement of the right to privacy must be through a law which follows the principles of natural justice, and is not arbitrary or unfair. It follows that any instances of automated processing for public functioning by state actors or others, must meet this standard of ‘fair, just and reasonable’.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;While there is a lot of focus internationally on what ethical AI must be, it is important that when we consider the use of AI by the state, we pay heed to the existing constitutional principles which determine how AI must be evaluated against these standards. These principles, however, extend only to limited circumstances, for the protections under Article 21 are not horizontal in nature but applicable only against the state. Whether a party is the state or not is a question that has been considered several times by the Supreme Court and must be determined by functional tests. In our submission to the Justice Srikrishna Committee, we clearly recommended that where automated decision making is used for the discharge of public functions, the data protection law must state that such actions are subject to the constitutional standards of ‘just, fair and reasonable’ and satisfy the tests for both procedural and substantive due process. To a limited extent, the committee seems to have picked up the standards of ‘fair’ and ‘reasonable’ and made them applicable to all forms of processing, whether public or private. It is as yet unclear whether fairness and reasonableness as inserted in the Bill would draw from the constitutional standard under Article 21. The report makes a reference to the twin principles of acting in a manner that upholds the best interest of the privacy of the individual, and processing within the reasonable expectations of the individual, which do not seem to cover the fullest essence of the legal standard under Article 21.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Conclusion&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The Srikrishna Committee Bill attempts to create an accountability framework for the use of emerging technologies, including AI, that is focused on placing the responsibility on companies to prevent harm. Though not as robust as those found in the GDPR, protections have been enabled through requirements such as fair and reasonable processing, ensuring data quality, and implementing the principle of privacy by design. At the same time, the Srikrishna Bill does not include provisions that can begin to address the consumer-facing ‘black box’ of AI by ensuring that individuals have information about the potential impact of decisions taken by automated means. In contrast, the GDPR has already taken important steps to tackle this by requiring companies to explain the logic and potential impact of decisions taken by automated means.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Most importantly, the Bill gives the Data Protection Authority the necessary tools to hold companies accountable for the use of AI through the requirement of data protection audits. If enacted, it will have to be seen how these audits and the principle of privacy by design are implemented and enforced in the context of companies using AI. Though the Bill creates a Data Protection Authority whose members have significant experience in data protection, information technology, data management, data science, cyber and internet laws, and related subjects, these requirements could be further strengthened by including someone with a background in ethics and human rights.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;One of the responsibilities of the DPA under the Srikrishna Bill will be to monitor technological developments and commercial practices that may affect the protection of personal data, and to promote measures and undertake research for innovation in the field of the protection of personal data. If enacted, we hope that AI, and solutions for enhancing privacy in the context of AI like those described above, will be one of the focus areas of the DPA. It will also be important to see how the DPA develops impact assessments related to AI and what tools associated with the principle of privacy by design emerge to address AI.&lt;/p&gt;
&lt;hr style="text-align: justify; " /&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref1" name="_ftn1"&gt;&lt;sup&gt;&lt;sup&gt;[1]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; https://privacyinternational.org/topics/artificial-intelligence&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref2" name="_ftn2"&gt;&lt;sup&gt;&lt;sup&gt;[2]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; https://www.wired.com/story/our-machines-now-have-knowledge-well-never-understand/&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref3" name="_ftn3"&gt;&lt;sup&gt;&lt;sup&gt;[3]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; https://iapp.org/news/a/ai-offers-opportunity-to-increase-privacy-for-users/&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref4" name="_ftn4"&gt;&lt;sup&gt;&lt;sup&gt;[4]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; https://iapp.org/media/pdf/resource_center/GDPR_Study_Maldoff.pdf&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref5" name="_ftn5"&gt;&lt;sup&gt;&lt;sup&gt;[5]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; https://gdpr-info.eu/art-22-gdpr/&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref6" name="_ftn6"&gt;&lt;sup&gt;&lt;sup&gt;[6]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; https://gdpr-info.eu/art-14-gdpr/&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref7" name="_ftn7"&gt;&lt;sup&gt;&lt;sup&gt;[7]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; https://www.datatilsynet.no/globalassets/global/english/ai-and-privacy.pdf&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref8" name="_ftn8"&gt;&lt;sup&gt;&lt;sup&gt;[8]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; https://www.datatilsynet.no/globalassets/global/english/ai-and-privacy.pdf&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref9" name="_ftn9"&gt;&lt;sup&gt;&lt;sup&gt;[9]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; https://gdpr-info.eu/art-25-gdpr/&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref10" name="_ftn10"&gt;&lt;sup&gt;&lt;sup&gt;[10]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; https://ico.org.uk/for-organisations/guide-to-the-general-data-protection-regulation-gdpr/accountability-and-governance/data-protection-impact-assessments/&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref11" name="_ftn11"&gt;&lt;sup&gt;&lt;sup&gt;[11]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; https://gdpr-info.eu/art-21-gdpr/&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref12" name="_ftn12"&gt;&lt;sup&gt;&lt;sup&gt;[12]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; https://gdpr-info.eu/art-22-gdpr/&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref13" name="_ftn13"&gt;&lt;sup&gt;&lt;sup&gt;[13]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; https://gdpr-info.eu/art-14-gdpr/&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref14" name="_ftn14"&gt;&lt;sup&gt;&lt;sup&gt;[14]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Draft Data Protection Bill 2018 - Chapter II section 9&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref15" name="_ftn15"&gt;&lt;sup&gt;&lt;sup&gt;[15]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Draft Data Protection Bill 2018 -  Chapter VII section 29&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref16" name="_ftn16"&gt;&lt;sup&gt;&lt;sup&gt;[16]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Draft Data Protection Bill 2018 -  Chapter VII section 33&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref17" name="_ftn17"&gt;&lt;sup&gt;&lt;sup&gt;[17]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Draft Data Protection Bill 2018 -  Chapter VII section 38&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref18" name="_ftn18"&gt;&lt;sup&gt;&lt;sup&gt;[18]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Draft Data Protection Bill 2018 -  Chapter VIII section 40&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref19" name="_ftn19"&gt;&lt;sup&gt;&lt;sup&gt;[19]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Draft Data Protection Bill 2018 -  Chapter X section 60&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref20" name="_ftn20"&gt;&lt;sup&gt;&lt;sup&gt;[20]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Draft Data Protection Bill 2018 -  Chapter II section 4&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref21" name="_ftn21"&gt;&lt;sup&gt;&lt;sup&gt;[21]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Draft Data Protection Bill 2018 - Chapter II section 5&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref22" name="_ftn22"&gt;&lt;sup&gt;&lt;sup&gt;[22]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Draft Data Protection Bill 2018 -  Chapter IX Section 45&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref23" name="_ftn23"&gt;&lt;sup&gt;&lt;sup&gt;[23]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Draft Data Protection Bill 2018 - Chapter XIV section 97&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref24" name="_ftn24"&gt;&lt;sup&gt;&lt;sup&gt;[24]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Draft Data Protection Bill 2018 - Chapter VII section 31&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref25" name="_ftn25"&gt;&lt;sup&gt;&lt;sup&gt;[25]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Srikrishna Committee Report on Data Protection pg. 36 and 37. Available at: http://www.prsindia.org/uploads/media/Data%20Protection/Committee%20Report%20on%20Draft%20Personal%20Data%20Protection%20Bill,%202018.pdf&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref26" name="_ftn26"&gt;&lt;sup&gt;&lt;sup&gt;[26]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; https://www.ciosummits.com/Online_Assets_DocAuthority_Whitepaper_-_Guide_to_Intelligent_GDPR_Compliance.pdf&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref27" name="_ftn27"&gt;&lt;sup&gt;&lt;sup&gt;[27]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; https://jolt.law.harvard.edu/assets/articlePDFs/v31/31HarvJLTech217.pdf&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref28" name="_ftn28"&gt;&lt;sup&gt;&lt;sup&gt;[28]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; https://standards.ieee.org/content/dam/ieee-standards/standards/web/documents/other/ead_personal_data_v2.pdf&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref29" name="_ftn29"&gt;&lt;sup&gt;&lt;sup&gt;[29]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; https://www.datatilsynet.no/globalassets/global/english/ai-and-privacy.pdf&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref30" name="_ftn30"&gt;&lt;sup&gt;&lt;sup&gt;[30]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; https://www.artificial-intelligence.blog/news/capsule-networks&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref31" name="_ftn31"&gt;&lt;sup&gt;&lt;sup&gt;[31]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; http://raird.no/about/factsheet.html&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref32" name="_ftn32"&gt;&lt;sup&gt;&lt;sup&gt;[32]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; https://www.darpa.mil/attachments/XAIProgramUpdate.pdf&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref33" name="_ftn33"&gt;&lt;sup&gt;&lt;sup&gt;[33]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; https://www.darpa.mil/attachments/XAIProgramUpdate.pdf&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref34" name="_ftn34"&gt;&lt;sup&gt;&lt;sup&gt;[34]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; https://www.oreilly.com/learning/introduction-to-local-interpretable-model-agnostic-explanations-lime&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref35" name="_ftn35"&gt;&lt;sup&gt;&lt;sup&gt;[35]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;i&gt;R C Cooper&lt;/i&gt; v. &lt;i&gt;Union of India&lt;/i&gt;, 1970 SCR (3) 530.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref36" name="_ftn36"&gt;&lt;sup&gt;&lt;sup&gt;[36]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;i&gt;Maneka Gandhi&lt;/i&gt; v. &lt;i&gt;Union of India&lt;/i&gt;, 1978 SCR (2) 621.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref37" name="_ftn37"&gt;&lt;sup&gt;&lt;sup&gt;[37]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; 94 US 113 (1877).&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/the-srikrishna-committee-data-protection-bill-and-artificial-intelligence-in-india'&gt;https://cis-india.org/internet-governance/blog/the-srikrishna-committee-data-protection-bill-and-artificial-intelligence-in-india&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Amber Sinha and Elonnai Hickok</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Artificial Intelligence</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2018-09-03T13:29:12Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/files/analysis-of-cloud-act-and-implications-for-india">
    <title>Analysis of CLOUD Act and Implications for India</title>
    <link>https://cis-india.org/internet-governance/files/analysis-of-cloud-act-and-implications-for-india</link>
    <description>
        &lt;b&gt;&lt;/b&gt;
        
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/files/analysis-of-cloud-act-and-implications-for-india'&gt;https://cis-india.org/internet-governance/files/analysis-of-cloud-act-and-implications-for-india&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>elonnai</dc:creator>
    <dc:rights></dc:rights>


   <dc:date>2018-08-22T14:53:50Z</dc:date>
   <dc:type>File</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/factor-daily-anand-murali-august-13-2018-the-big-eye">
    <title>The Big Eye: The tech is all ready for mass surveillance in India</title>
    <link>https://cis-india.org/internet-governance/news/factor-daily-anand-murali-august-13-2018-the-big-eye</link>
    <description>
        &lt;b&gt;Chennai’s T. Nagar, arguably India’s biggest shopping district by revenues and crowded on any given day, gets even more packed in festival seasons as thousands throng its saree and jewellery stores.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The blog post by Anand Murali was published in &lt;a class="external-link" href="https://factordaily.com/face-recognition-mass-surveillance-in-india/"&gt;Factor Daily&lt;/a&gt; on August 13, 2018. Sunil Abraham was quoted.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;Every year, Deepavali, less than three months away this year, presents the perfect hunting ground for pickpockets and other petty thieves — and a headache for the local police.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This time, however, the city police have reason to believe they have a handle on things. They have a technology that analyses CCTV footage to spot, in real time, people with a criminal history visiting the T. Nagar area. “We are matching real-time CCTV video footage with our criminal database using the FaceTagr system and if any criminals are identified in that area, we get an immediate alert and we can further investigate,” says P Aravindan, deputy commissioner of police. Last year, FaceTagr, a face recognition software developed by an eponymous Chennai company, was used in a few areas with results that convinced the police to extend it to all of the T Nagar area, he adds.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Aravindan’s counterparts in Punjab are as big fans of real-time surveillance as he is. Amritsar Police used something the state’s police calls Punjab Artificial Intelligence System, or PAIS, developed by Gurugram AI company Staqu Technologies, to solve a murder case within 24 hours — again, using CCTV footage and facial recognition technology. The company has &lt;a href="https://tech.economictimes.indiatimes.com/news/startups/staqu-builds-an-android-smart-glass-platform-to-help-police-identify-criminals/63239706" rel="noopener nofollow external noreferrer" target="_blank"&gt;piloted&lt;/a&gt; a camera mounted on a pair of smart glasses to capture a real-time feed and analyse it for facial matches with a database.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Elsewhere, the Surat Police has a picture intelligence unit that  relies on NEC’s proprietary NeoFace technology for facial recognition,  as also vehicle number plate recognition, to &lt;a href="https://in.nec.com/en_IN/press/201507/global_20150719_2.html" rel="noopener nofollow external noreferrer" target="_blank"&gt;track persons of interest&lt;/a&gt;.  The result is alerts that the police can proactively act upon and  faster turnaround in solving cases. Surat can claim to be a step ahead  of Tokyo: NEC plans to use the latest version of its NeoFace technology  at the 2020 Tokyo Olympics to &lt;a href="https://www.sunherald.com/news/business/article216218290.html" rel="noopener nofollow external noreferrer" target="_blank"&gt;track accredited persons&lt;/a&gt; – athletes, officials, media, and others – at multiple venues.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Welcome to the Big Eye helping law keepers and administrators in  India to instantly recognise faces and use the information in multiple  use cases.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Facial recognition and image cognition tech is nothing new, to be sure. We have seen it in movies for some time now – be it the Jason Bourne series, in which the CIA uses complex surveillance tech to track the agent, or the &lt;i&gt;Mission Impossible&lt;/i&gt; movies, where the protagonist uses facial recognition to get access to secure areas. Or the recent Steven Spielberg movie, &lt;i&gt;Ready Player One&lt;/i&gt;, in which the villain uses camera drones. This kind of advanced – and even futuristic – image recognition-based surveillance is all set to go mainstream in India with the rapid proliferation of cameras: from public and private CCTVs to ubiquitous mobile phone cameras.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Investigation on steroids&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;Chennai-based FaceTagr has been working with Indian Railways since  last year to prevent human trafficking. “Finding missing children and  the prevention of human trafficking was one of the first use cases that  we developed. We work with the Indian Railways, state police  departments, and CBI to prevent human trafficking,” says Vijay  Gnanadesikan, CEO and co-founder, FaceTagr.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;His moment of epiphany that led to the idea for developing FaceTagr  was on a morning drive to work in Chennai traffic and watching children  begging at his window. “I reached the office and discussed with my  cofounder. We realised that there is an existing database of missing  children with photographs and, with face recognition technology, we  could develop a solution that could help solve the problem and in a way  also prevent human trafficking,” says Gnanadesikan. Cut to today: the  tool has been deployed at the India-Nepal and India-Bangladesh borders  at nearly 24 checkpoints to monitor human trafficking.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;FaceTagr is a face recognition technology that works on both static  images and video footage. The same technology is being used in a  solution for the Chennai police to identify criminals. “Earlier a  suspect had to be taken to the police station, fingerprinted, and then  his details were verified. Imagine a guy walking on the road at 2 am who  is looking suspicious. A police patrol can take the suspect’s  photograph with our app and, within a second, receive details about his  crime history,” says Gnanadesikan.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The T. Nagar deployment runs on real-time CCTV footage. In the areas  it was deployed last year, the system helped reduce the number of crimes  “from three digits to a single digit” during last year’s Deepavali  season, claims the FaceTagr CEO.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The system compares the real-time CCTV footage of the crowd with the  police criminal database for facial matches. “Once someone from the  database is identified among the crowd, the picture shows up, which is  then re-verified by the police personnel monitoring the system for a  reconfirmation,” says Gnanadesikan, adding that an ID match does not  mean a crime is committed. “Someone might also be there for shopping and  we and the police team are very mindful of that, but it will give the  police a notification about the person’s whereabouts in the area.”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;One of the clever outcomes of the deployment is that the system helps  identify criminals from other cities or areas. According to DCP  Aravindan, a police officer in Chennai city will likely not know of a  criminal from, say, Tirunelveli, Kanyakumari or other far off places.  This is where the face recognition system comes in handy, he says.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;“Traditionally, we have data of all criminals station-wise and there  is also a crime team which is familiar with the criminals and can  recognise them. But, of late, with the improvement in connectivity and  communication, people from far-off places come and commit a crime and  this has made it challenging to identify them,” he says. The state’s  crime database currently has over 60,000 photographs with more  photographs being added daily. Every week, the department nabs two or  three criminals with the help of the face recognition system, Aravindan  adds.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Are there any privacy concerns? “To avoid misuse we have conducted  multiple training programs for all the police personnel who are using  this application and we have instructed them that unless they find a  person suspicious, they should not take a photograph. We have designed  an SOP (standard operating procedure) for using the system to avoid  misuse,” adds the deputy commissioner.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Surveillance on smart glass&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;The face recognition system of Staqu, the Gurgaon AI startup, has  been deployed in the states of Uttarakhand, Punjab and Rajasthan.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;According to Atul Rai, Staqu’s CEO and co-founder, different law  enforcement jurisdictions or agencies, even within a state, often have  their own sets of data and it becomes difficult to sift through them and  find links or patterns. Staqu’s answer to that problem was ABHED, short  for Artificial Intelligence Based Human Efface Detection, which formed  the base software for a mobile application and is connected to a backend  database processing system. “This system accumulates images, speech and  text, and using all this information, it develops intelligence for  these agencies,” says Rai.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The company has also developed a real-time video surveillance-based  face recognition technology that works via a camera mounted on a smart  glass. The system was piloted with the Punjab Police and the company is  now in the process of deploying with &lt;a href="https://tech.economictimes.indiatimes.com/news/startups/ai-startup-staqu-signs-mou-to-assist-dubai-police/64271484" rel="noopener nofollow external noreferrer" target="_blank"&gt;the Dubai Police&lt;/a&gt;, says Rai.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Most CCTVs today have a limited view and, in comparison, an officer  wearing the smart glass and moving in a crowd will have a better field  of view, says Rai. “In real time, the glass will stream the video  footage to the server, which will then match the footage and give the  report if any person from the database is detected,” he adds.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Staqu-developed PAIS, or Punjab Artificial Intelligence System,  can image match with an accuracy of 98% if the database has five images  of the person, claims Rai.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Another use case for face recognition technology that has been coming  up in India is in the corporate sector for attendance and security.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;“In many of the enterprise use cases, the technology is used in  controlled spaces – for example, conferences where most attendees  pre-register or employees access systems in companies,” says Uday  Chinta, managing director of American technology service company IPSoft,  which has also developed and deployed an AI-based personal assistant  called Amelia in the US. “Amelia is able to recognise a person using his  facial features and able to assist them and give personalised service  based on their identity,” says Chinta.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Software services company Tech Mahindra has launched a facial  recognition system for employee attendance at its Noida office.  According to &lt;a href="https://economictimes.indiatimes.com/news/company/corporate-trends/tech-mahindra-adopts-facial-recognition-to-mark-attendance/articleshow/65300255.cms" rel="noopener nofollow external noreferrer" target="_blank"&gt;one report&lt;/a&gt;,  the system also comes with a “moodometer” that will track the mood and  emotions of employees and give additional analytics to the company.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Beyond face analytics, image recognition technology is also being  used to identify vehicles. The National Highways Authority of India has  been using AI-based image recognition systems to tag and identify  vehicles across its infrastructure in the country.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Underlying digital layer: databases&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;The scarier part of the tech is its dark side: mass surveillance covering everyone. Countries like China have already deployed &lt;a href="https://www.theatlantic.com/international/archive/2018/02/china-surveillance/552203/" rel="noopener nofollow external noreferrer" target="_blank"&gt;mass surveillance on their citizens&lt;/a&gt;. Chinese citizens today have a score assigned to them by the government based on various factors, including data captured through the surveillance program, which gives them preferential access to services like fast internet access.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In the case of India, to facilitate proper surveillance in a state,  one of the first requirements is a digital database which already exists  in many forms across central and state governments. With or without a  double take, the answer is obvious: Aadhaar, India’s citizen ID  database. With a population of 135 crore and Aadhaar covering over 90%  of this population, it is India’s most extensive database.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Notwithstanding the use cases detailed earlier in this story and the huge interest among state police and law enforcement agencies in India, collecting data and using it – even if it is to bust crime – falls into grey areas. In June this year, &lt;a href="https://indianexpress.com/article/india/ncrb-pitches-for-giving-police-limited-access-to-aadhaar-data-to-crack-crimes-5227541/" rel="noopener nofollow external noreferrer" target="_blank"&gt;news reports&lt;/a&gt; had National Crime Records Bureau director Ish Kumar saying that investigators need to be given limited access to Aadhaar. Reacting to this, the Unique Identification Authority of India (UIDAI) issued a &lt;a href="https://www.uidai.gov.in/images/news/Press-Note-on-rejecting-demand-of-access-to-Aadhaar-data-25062018.pdf" rel="noopener nofollow external noreferrer" target="_blank"&gt;statement&lt;/a&gt; saying that access to Aadhaar biometric data for criminal investigation is not permissible under Section 29 of the Aadhaar Act, 2016 — which perhaps explains why the Punjab Police declined requests for interviews for this story.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Longtime Aadhaar critic Sunil Abraham, executive director of  Bengaluru’s Centre for Internet and Society (CIS), calls Aadhaar “the  perfect tool for surveillance”.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;“The main database is the Aadhaar database. It’s got your iris and biometrics information already and they have said that they will strengthen the fingerprint authentication with facial recognition. So now, they have the full surveillance infrastructure that they need. The collection devices (CCTVs) are just there to collect the data but the actual recognition engine is Aadhaar only,” says Abraham, who is leaving CIS to join non-profit Mozilla Foundation as a vice president in January.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;According to him, all three types of biometrics – fingerprint data,  iris information data, and facial data – can be used in a remote and  covert fashion and, therefore, in a non-consensual fashion. (&lt;i&gt;Editor’s note&lt;/i&gt;: There is no public incident, to date, that proves such a use.)&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Abraham is “100% sure” where we are headed. “The reason why I call  Aadhaar a surveillance project is not that there is metadata stored, I  call it a surveillance project because the biometrics are being stored.  Metadata is one of the problems, that is the profiling risk but the  surveillance risk primarily comes from the biometric data that they  have,” he says. By metadata, he is referring to a citizen’s information  such as phone number, age, sex, address, and other details.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;There are also other databases in the works that could provide the  basis for surveillance. Like: the Crime and Criminal Tracking Network  &amp;amp; Systems (CCTNS) across police stations in India. &lt;a href="http://ncrb.gov.in/BureauDivisions/cctnsnew/index.html" rel="noopener nofollow external noreferrer" target="_blank"&gt;According&lt;/a&gt; to the CCTNS website, as of May 2018, the CCTNS hardware and software  deployment has covered nearly 94% of the police stations across India.  There have been &lt;a href="https://thewire.in/government/hyderabad-smart-policing-surveillance" rel="noopener nofollow external noreferrer" target="_blank"&gt;reports&lt;/a&gt; of the CCTNS system being used as a mass surveillance system in the guise of e-policing by authorities in Hyderabad.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Early in 2016, the Hyderabad Police had launched a &lt;a href="http://www.hyderabadpolice.gov.in/assets/tender/Integrated%20Information%20Hub(IIH).pdf" rel="noopener nofollow external noreferrer" target="_blank"&gt;tender&lt;/a&gt; looking for companies to set up a citizen profiling and monitoring system. According to a report in &lt;i&gt;Telangana Today&lt;/i&gt;, the Integrated People Information Hub (IPIH) gives the police access to personal information of citizens including names, family details, addresses and other related information by sourcing them from documents like police records, FIRs and other external sources like utility connections, tax payments, voter identification, passports etc.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;During Israeli Prime Minister Benjamin Netanyahu’s visit to India in January, Tel Aviv-based AI company Cortica had &lt;a href="https://www.prnewswire.com/news-releases/prime-ministers-narendra-modi-and-benjamin-netanyahu-welcome-new-age-of-collaboration-for-israel-and-india-300589299.html" rel="nofollow external noopener noreferrer"&gt;announced&lt;/a&gt; a partnership with India’s Best Group to develop solutions for combing  through data captured daily by drones, surveillance cameras, and  satellites. The aim is to develop an AI-based real-time identification  of patterns, concepts and situational anomalies to identify potential  problems, flag them and improve safety in the process. More details such  as scale and scope of this partnership are not available at this point  in time.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Mass surveillance: Easier said than done&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;Take a step back. India already has multiple digital surveillance programs – even if not mass, real-time facial recognition – in place to keep track of its citizens. E.g.: the Telecom Enforcement Resource and Monitoring (TERM) and NETRA (NEtwork TRaffic Analysis) surveillance software developed by the Centre for Artificial Intelligence and Robotics (CAIR). These are just some of the surveillance programs operated by the government.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;But when it comes to mass surveillance in real time, even with the AI-based tech available today, the currently installed infrastructure might not be ready. “Countries like China are good at setting up infrastructure, which is very essential for mass surveillance systems to be in place,” says Kedar Kulkarni of Bengaluru-based deep learning startup Hyperverge, who also insists that not all CCTVs out there today might be fit to conduct facial recognition.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;According to Kulkarni, for a mass surveillance system to be in place,  you either need cameras that can capture and do computing for face  recognition within its hardware or you need a robust network which can  transmit live feeds from multiple cameras to processing centres, which  is very bandwidth intensive.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Most public spaces in India, including railway stations, bus depots, metro stations and marketplaces, are often under CCTV surveillance. New Delhi is all set to have one of the largest CCTV deployments in the country, with the state government announcing plans to install 1.4 lakh CCTVs across Delhi. The Indian Railways is also setting aside Rs 3,000 crore in its 2018-19 budget to install CCTV systems across 11,000 trains and 8,500 stations, according to a news report.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In comparison, China is said to have 170 million CCTV cameras  installed across the country currently and this number is estimated to  go up by 400 million in the next three years, says a BBC news report.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Even the staunchest privacy activists acknowledge what surveillance  can deliver if used carefully. “Overall, it is a very powerful  technology. It should be used for law enforcement, it should be used for  national security. That is the correct domain of application,” says  Abraham. He hastens to add the caveats: “When we use it, we have to use  it with lots of safeguards and it should be used only on a very small  subset of the population. It shouldn’t be a technology that is broadly  deployed in the population because it is not necessary, it is not  proportionate, and the risks are very high.”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The flip and funny side of facial recognition-based surveillance is  that the government does not need the technology to actually work. Just  the threat of surveillance – that big brother is watching you – is  enough to reduce crime. According to Gnanadesikan, the Chennai CEO of  FaceTagr, one reason for the drop in crime rate in last year’s T. Nagar  trials was that criminals knew that they were being watched.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/factor-daily-anand-murali-august-13-2018-the-big-eye'&gt;https://cis-india.org/internet-governance/news/factor-daily-anand-murali-august-13-2018-the-big-eye&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Admin</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    

   <dc:date>2018-08-13T14:54:14Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/raw/call-for-essays-offline">
    <title>Call for Essays: Offline</title>
    <link>https://cis-india.org/raw/call-for-essays-offline</link>
    <description>
&lt;b&gt;Who is offline, and is it a choice? The global project of bringing people online has spurred several commendable initiatives in expanding access to digital devices, networks, and content, and often contentious ones such as Free Basics / internet.org, which illustrate the intersectionalities of scale, privilege, and rights that we need to be mindful of when we imagine the offline. Further, the experience of the internet, for a large section of people, is often mediated through prior and ongoing experiences of traditional media, and through cultural metaphors and cognitive frames that transcend more practical registers such as consumption and facilitation. How do we approach, study, and represent this disembodied internet – devoid of its hypertext, platforms, devices, its nuts and bolts, but still tangible through engagement in myriad, personal and often indiscernible ways? The researchers@work programme invites abstracts for essays that explore dimensions of offline lives.&lt;/b&gt;
        
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;h3&gt;&lt;strong&gt;Offline&lt;/strong&gt;&lt;/h3&gt;
&lt;p&gt;Does being offline necessarily mean being disconnected? Beyond anxieties such as FOMO, being offline is also seen as disengagement from a certain milieu of the digital (read: capital), an impediment to the way life is organised by and around technologies in general. However, being offline is not the exception, as internet shutdowns and acts of online censorship illustrate the persistence and often alarming regularity of the offline even for the ‘connected’ sections of the population.&lt;/p&gt;
&lt;p&gt;State and commercial providers of internet and telecommunication services work in tandem to produce both the “online” and the “offline” - through content censorship, internet regulation, generalised service provision failures, and so on. Further, efforts to prioritise the use of digital technologies for financial transactions, especially since demonetisation, have led to a not-so-subtle conflation of the ‘online economy’ with the ‘formal economy’, thus recognising the offline as zones of informality, corruption, and piracy. This contributes to the offline becoming invisible, and in many cases, illegal, rather than being recognised as a condition that necessarily informs what it means to be digital.&lt;/p&gt;
&lt;p&gt;Who is offline, and is it a choice? The global project of bringing people online has spurred several commendable initiatives in expanding access to digital devices, networks, and content, and often contentious ones such as Free Basics / internet.org, which illustrate the intersectionalities of scale, privilege, and rights that we need to be mindful of when we imagine the offline. Further, the experience of the internet, for a large section of people, is often mediated through prior and ongoing experiences of traditional media, and through cultural metaphors and cognitive frames that transcend more practical registers such as consumption and facilitation. How do we approach, study, and represent this disembodied internet – devoid of its hypertext, platforms, devices, its nuts and bolts, but still tangible through engagement in myriad, personal and often indiscernible ways?&lt;/p&gt;
&lt;h3&gt;&lt;strong&gt;Call for Essays&lt;/strong&gt;&lt;/h3&gt;
&lt;h4&gt;We invite abstracts for essays that explore social, economic, cultural, political, infrastructural, or aesthetic dimensions of the "offline". Please submit the abstracts by Sunday, September 02.&lt;/h4&gt;
&lt;p&gt;We will select 10 abstracts and announce them on &lt;strong&gt;Wednesday, September 05&lt;/strong&gt;. The selected authors are expected to submit the first draft of the essay (2000-4000 words) by &lt;strong&gt;Friday, October 05&lt;/strong&gt;. We will share editorial suggestions with the authors, and the final versions of the essays will be published on the researchers@work blog from November onwards. We will offer Rs. 5,000 as honorarium to all selected authors.&lt;/p&gt;
&lt;p&gt;Please submit the abstracts (300-500 words) as a text file via email sent to &lt;strong&gt;raw@cis-india.org&lt;/strong&gt;, with the subject line of "Offline".&lt;/p&gt;
&lt;p&gt;The essays, for example, may explore one or more of the following themes:&lt;/p&gt;
&lt;ul&gt;&lt;li&gt;Geographies of internet access: Infrastructural, socio-political, and discursive forces and contradictions&lt;/li&gt;
&lt;li&gt;Terms, objects, metaphors, and events of the internet and their offline remediation and circulation&lt;/li&gt;
&lt;li&gt;Minimal computing, maker cultures, and digital collaboration and creativity in the offline&lt;/li&gt;
&lt;li&gt;Offline economic cultures and transition towards less-cash economy&lt;/li&gt;
&lt;li&gt;Offline as democratic choice: the right to offline lives in the context of global debates on privacy, surveillance, and data justice&lt;/li&gt;
&lt;li&gt;Methods of studying the "offline" at the intersections of offline and online lives&lt;/li&gt;&lt;/ul&gt;
&lt;p&gt;&lt;strong&gt;Please note that the scope of essays need not be limited to the topics mentioned above but may address other dimensions of offline lives.&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/raw/call-for-essays-offline'&gt;https://cis-india.org/raw/call-for-essays-offline&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>sneha-pp</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Studies</dc:subject>
    
    
        <dc:subject>RAW Blog</dc:subject>
    
    
        <dc:subject>Call for Essays</dc:subject>
    
    
        <dc:subject>Offline</dc:subject>
    
    
        <dc:subject>Researchers at Work</dc:subject>
    

   <dc:date>2018-08-20T06:58:05Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/ai-and-governance-case-study-pdf">
    <title>AI and Governance Case Study pdf</title>
    <link>https://cis-india.org/internet-governance/ai-and-governance-case-study-pdf</link>
    <description>
        &lt;b&gt;&lt;/b&gt;
        
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/ai-and-governance-case-study-pdf'&gt;https://cis-india.org/internet-governance/ai-and-governance-case-study-pdf&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>pranav</dc:creator>
    <dc:rights></dc:rights>


   <dc:date>2018-08-01T02:06:47Z</dc:date>
   <dc:type>File</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/files/normative-regulation-of-cyber-space-report">
    <title>Normative Regulation of Cyber Space Report</title>
    <link>https://cis-india.org/internet-governance/files/normative-regulation-of-cyber-space-report</link>
    <description>
        &lt;b&gt;&lt;/b&gt;
        
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/files/normative-regulation-of-cyber-space-report'&gt;https://cis-india.org/internet-governance/files/normative-regulation-of-cyber-space-report&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>pranav</dc:creator>
    <dc:rights></dc:rights>


   <dc:date>2018-07-31T23:42:42Z</dc:date>
   <dc:type>File</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/about/newsletters/july-2018-newsletter">
    <title>July 2018 Newsletter</title>
    <link>https://cis-india.org/about/newsletters/july-2018-newsletter</link>
    <description>
        &lt;b&gt;CIS July 2018 newsletter.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;&lt;span&gt;Dear readers,&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Previous issues of the newsletters can be &lt;a class="external-link" href="http://cis-india.org/about/newsletters"&gt;accessed here&lt;/a&gt;.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Highlights&lt;/h2&gt;
&lt;ul&gt;
&lt;li style="text-align: justify; "&gt;Paul Kurien and Akriti Bopanna carried out an &lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/icann-diversity-analysis"&gt;analysis of the diversity of participation&lt;/a&gt; at the ICANN processes by taking a close look at their mailing lists. &lt;/li&gt;
&lt;li style="text-align: justify; "&gt;&lt;a class="external-link" href="https://meta.wikimedia.org/wiki/CIS-A2K/Events/2018#July"&gt;CIS-A2K organized 6 events&lt;/a&gt;: partnership discussions with Misimi Telugu monthly magazine; partnership activity in Annamayya Library, Guntur, a workshop in Tumakur University; a workshop of river activists for building Jal Bodh; a workshop of publishers and writers on unicode, open source and wikimedia projects; and a Telugu literary conference.&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;CIS had worked with the Research and Advisory Group (RAG) of the Global Commission on the Stability of Cyberspace (GCSC). The work looked at the negotiation processes and strategies that various players may adopt as they drive the cyber norms agenda. In continuation &lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/the-potential-for-the-normative-regulation-of-cyberspace-implications-for-india"&gt;CIS has brought out a report&lt;/a&gt; which focuses more extensively on the substantive law and principles at play and looks closely at what the global state of the debate means for India.&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;The debate surrounding privacy has in recent times gained momentum due to the Aadhaar judgement and the growing concerns around the use of personal data by corporations and governments. In this light &lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/the-centre-for-internet-and-society2019s-comments-and-recommendations-to-the-indian-privacy-code-2018"&gt;CIS has made comments and recommendations to the India Privacy Code, 2018&lt;/a&gt;. &lt;/li&gt;
&lt;li style="text-align: justify; "&gt;CIS &lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/cis-submitted-a-response-to-a-notice-of-enquiry-by-the-us-government-on-international-internet-policy-priorities"&gt;drafted a response&lt;/a&gt; to a Notice of Inquiry (NOI) issued by the U.S. Commerce Department's National Telecommunications and Information Administration (NTIA) on "International Internet Policy Priorities." CIS commented on the free flow of information and jurisdiction, mult-stakeholder approach to internet governance, privacy and security.&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;Elonnai Hickok, Shweta Mohandas and Swaraj Paul Barooah &lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/the-ai-task-force-report-the-first-steps-towards-indias-ai-framework"&gt;compiled the AI Task Force Report&lt;/a&gt;, India's first step towards an AI framework. The Task Force on Artificial Intelligence was established by the Ministry of Commerce and Industry to leverage AI for economic benefits, and provide policy recommendations on the deployment of AI for India. &lt;/li&gt;
&lt;li style="text-align: justify; "&gt;Paul Kurian and Akriti Bopanna &lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/icann-diversity-analysis"&gt;carried out an analysis&lt;/a&gt; of the diversity of participation at the ICANN processes by taking a close look at their mailing lists. &lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;Articles&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="external-link" href="http://cis-india.org/raw/indian-express-july-1-2018-nishant-shah-digital-native-bigger-picture"&gt;Digital Native: The bigger picture&lt;/a&gt; (Nishant Shah; Indian Express; July 1, 2018).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/telecom/blog/organizing-india-blogspot-shyam-ponappa-july-6-2018-problems-that-should-occupy-our-electioneers"&gt;The Problems That Should Occupy Our Electioneers&lt;/a&gt; (Shyam Ponappa; Business Standard; July 6, 2018).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="http://cis-india.org/raw/indian-express-july-15-2018-nishant-shah-digital-native-the-citys-watching"&gt;Digital Native: How smart cities can make criminals out of denizens&lt;/a&gt; (Nishant Shah; Indian Express; July 15, 2018).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/livemint-july-24-2018-swaraj-barooah-and-gurshabad-grover-anti-trafficking-bill-may-lead-to-censorship"&gt;Anti-trafficking Bill may lead to censorship&lt;/a&gt; (Swaraj Barooah and Gurshabad Grover; Livemint; July 24, 2018).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="http://cis-india.org/raw/digital-native-hashtag-along-with-me"&gt;Digital Native: Hashtag Along With Me&lt;/a&gt; (Nishant Shah; Indian Express; July 29, 2018).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/economic-times-july-30-2018-sunil-abraham-lining-up-data-on-srikrishna-privacy-draft-bill"&gt;Lining up the data on the Srikrishna Privacy Draft Bill&lt;/a&gt; (Sunil Abraham; Economic Times; July 30, 2018).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/business-standard-july-31-2018-sunil-abraham-spreading-unhappiness-equally-around"&gt;Spreading unhappiness equally around&lt;/a&gt; (Business Standard; July 31, 2018).&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;CIS in the News&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/news/the-national-july-2-2018-samanth-subramanian-smartphone-rumours-spark-series-of-mob-killings-in-india"&gt;Smartphone rumours spark series of mob killings in India&lt;/a&gt; (Samanth Subramanian; The National; July 2, 2018).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/news/huffington-post-july-5-2018-government-gives-nod-to-bill-for-building-dna-databases-in-india-for-criminal-investigation-and-justice-delivery"&gt;Government Gives Nod To Bill For Building DNA Databases In India, For 'Criminal Investigation And Justice Delivery'&lt;/a&gt; (Huffington Post; July 5, 2018).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/news/the-times-of-india-july-6-2018-hope-for-such-swift-crackdowns-for-everyone"&gt;'Hope for such swift crackdowns for everyone&lt;/a&gt;' (Times of India; July 6, 2018).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/news/business-standard-july-9-2018-69-mob-attacks-on-child-lifting-rumours-since-jan-17-only-one-before-that"&gt;Child-lifting rumours caused 69 mob attacks, 33 deaths in last 18 months&lt;/a&gt; (Business Standard; July 9, 2018).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/news/death-by-social-media"&gt;Death by Social Media&lt;/a&gt; (Pretika Khanna, Abhiram Ghadyalpatil and Shaswati Das; Livemint; July 9, 2018).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/news/huffington-post-gopal-sathe-july-12-2018-indias-latest-data-leak-is-so-basic-that-peoples-aadhaar-number-bank-account-and-fathers-name-are-just-one-google-search-away"&gt;India's Latest Data Leak: People's Aadhaar Number And Bank Account Are Just One Google Search Away&lt;/a&gt; (Gopal Sathe; Huffington Post; July 12, 2018).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/news/bloomberg-quint-july-16-2018-people-should-have-right-to-their-data-not-companies-says-trai"&gt;People Should Have Right To Their Data, Not Companies, Says TRAI&lt;/a&gt; (Bloomberg Quint; July 16, 2018).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/news/huffington-post-gopal-sathe-july-16-2018-after-securing-net-neutrality-in-india-trai-goes-to-bat-for-data-privacy"&gt;After Securing Net Neutrality In India, TRAI Goes To Bat For Data Privacy&lt;/a&gt; (Gopal Sathe; Huffington Post; July 16, 2018).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/news/economic-times-july-18-2018-surabhi-agarwal-and-gulveen-aulakh-trai-recommendations-on-data-privacy-raises-eyebrows"&gt;TRAI recommendations on data privacy raises eyebrows &lt;/a&gt;(Surabhi Agarwal and Gulveen Aulakh; Economic Times; July 18, 2018).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/news/economic-times-megha-mandavia-july-19-2018-srikrishna-panel-upset-at-timing-of-trai-suggestions"&gt;Srikrishna panel upset at timing of Trai suggestion&lt;/a&gt;s (Megha Mandavia; Economic Times; July 19, 2018).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/news/deccan-herald-july-20-2018-rajitha-menon-firms-find-wealth-in-your-data"&gt;Firms find wealth in your data&lt;/a&gt; (Rajitha Menon; Deccan Herald; July 20, 2018).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/news/economic-times-venkat-ananth-july-24-2018-whatsapp-races-against-time-to-fix-fake-news-mess-ahead-of-2019-general-elections"&gt;WhatsApp races against time to fix fake news mess ahead of 2019 general elections&lt;/a&gt; (Venkat Ananth; Economic Times; July 24, 2018).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/news/factor-daily-sunny-sen-and-jayadevan-pk-july-25-2018-the-crown-of-thorns-that-awaits-facebook-india-md-hire"&gt;The crown of thorns that awaits Facebook’s India MD hire&lt;/a&gt; (Sunny Sen and Jayadevan PK; Factory Daily; July 25, 2018).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/news/livemint-july-26-2018-mihir-dalal-and-anirban-sen-byte-by-byte-protecting-her-privacy"&gt;Bit by byte protecting her privacy&lt;/a&gt; (Mihir Dalal and Anirban Sen; Livemint; July 26, 2018).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/news/livemint-july-27-2018-komal-gupta-govt-asks-cbi-to-probe-cambridge-analytica-in-data-breach-case"&gt;Govt asks CBI to probe Cambridge Analytica in data breach case&lt;/a&gt; (Komal Gupta; Livemint; July 27, 2018).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/news/economic-times-july-28-2018-mugdha-variyar-and-pratik-bhakta-data-localisation-may-pinch-startups-payments-firms"&gt;Data localisation may pinch startups, payments firms&lt;/a&gt; (Mugdha Variyar and Pratik Bhakta; Economic Times; July 28, 2018).&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;&lt;a href="http://cis-india.org/a2k"&gt;Access to Knowledge&lt;/a&gt;&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;Our Access to Knowledge programme currently consists of two projects.  The Pervasive Technologies project, conducted under a grant from the  International Development Research Centre (IDRC), aims to conduct  research on the complex interplay between low-cost pervasive  technologies and intellectual property, in order to encourage the  proliferation and development of such technologies as a social good. The  Wikipedia project, which is under a grant from the Wikimedia  Foundation, is for the growth of Indic language communities and projects  by designing community collaborations and partnerships that recruit and  cultivate new editors and explore innovative approaches to building  projects.&lt;/p&gt;
&lt;h3&gt;Wikipedia&lt;/h3&gt;
&lt;p&gt;&lt;b&gt;Blog Entries&lt;/b&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/a2k/blogs/cb5cbfc95cbfcaaca1cbfcaf-ca4cb0cacca4cbf-ce8ce6ce7cee-cb0cbec82c9acbf-1"&gt;ವಿಕಿಪೀಡಿಯ ತರಬೇತಿ ೨೦೧೮ @ ರಾಂಚಿ&lt;/a&gt; (Vikas Hegde; July 4, 2018).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/a2k/blogs/how-to-write-differently-for-different-telugu-digital-platforms-awareness-session-to-indu-gnana-vedika"&gt;How to write differently for different Telugu digital platforms - awareness session to Indu Gnana Vedika&lt;/a&gt; (Pavan Santosh; July 19, 2018).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/a2k/blogs/c35c3ec1fc4dc38c3ec2ac4d-c38c3ec39c3fc24c4dc2f-c35c47c26c3fc15-c28c41c02c1ac3f-c35c3fc15c40c38c4bc30c4dc38c41c15c41"&gt;వాట్సాప్ సాహిత్య వేదిక నుంచి వికీసోర్సుకు&lt;/a&gt; (Pavan Santosh; July 31, 2018).&lt;/li&gt;
&lt;/ul&gt;
&lt;div&gt;&lt;b&gt;Events Organized&lt;/b&gt;&lt;/div&gt;
&lt;div&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/a2k/blogs/partnership-activity-in-annamayya-library-guntur"&gt;Partnership activity in Annamayya Library&lt;/a&gt; (Guntur; July 10, 2014).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/a2k/blogs/partnership-discussions-with-misimi-telugu-monthly-magazine"&gt;Partnership discussions with Misimi Telugu Monthly Magazine&lt;/a&gt; (July 24, 2018).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/a2k/blogs/tumakur%20university-workshop"&gt;Tumakur University Workshop&lt;/a&gt; (Tumkur; July 25, 2018).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/a2k/blogs/workshop-of-river-activists-for-building-jal-bodh-knowledge-resource-on-water"&gt;Workshop of River activists for building Jal Bodh - Knowledge resource on Water&lt;/a&gt; (Pune; July 25, 2018).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/a2k/blogs/workshop-of-publishers-and-writers-on-unicode-open-source-and-wikimedia-projects"&gt;Workshop of Publishers and Writers on Unicode, Open Source and Wikimedia Projects&lt;/a&gt; (Pune; July 25, 2018).&lt;/li&gt;
&lt;/ul&gt;
&lt;/div&gt;
&lt;div&gt;&lt;/div&gt;
&lt;ul&gt;
&lt;/ul&gt;
&lt;h2&gt;&lt;a href="http://cis-india.org/internet-governance"&gt;Internet Governance&lt;/a&gt;&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;As part of its research on privacy and free speech, CIS is engaged with  two different projects. The first one (under a grant from Privacy  International and IDRC) is on surveillance and freedom of expression  (SAFEGUARDS). The second one (under a grant from MacArthur Foundation)  is on restrictions that the Indian government has placed on freedom of  expression online.&lt;/p&gt;
&lt;h3&gt;Privacy&lt;/h3&gt;
&lt;p&gt;&lt;b&gt;Submissions&lt;/b&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/cis-submitted-a-response-to-a-notice-of-enquiry-by-the-us-government-on-international-internet-policy-priorities"&gt;Response to a Notice of Enquiry by the US Government on International Internet Policy Priorities&lt;/a&gt; (Swagam Dasgupta; July 18, 2018).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/the-centre-for-internet-and-society2019s-comments-and-recommendations-to-the-indian-privacy-code-2018"&gt;The Centre for Internet and Society’s Comments and Recommendations to the: Indian Privacy Code, 2018&lt;/a&gt; (Shweta Mohandas, Elonnai Hickok, Amber Sinha and Shruti Trikanand; July 20, 2018).&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;b&gt;Blog Entry&lt;/b&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/the-ai-task-force-report-the-first-steps-towards-indias-ai-framework"&gt;The AI Task Force Report - The first steps towards India’s AI framework&lt;/a&gt; (Elonnai Hickok, Shweta Mohandas and Swaraj Paul Barooah; June 27, 2018). The blog post was edited by Swagam Dasgupta.&lt;/li&gt;
&lt;/ul&gt;
&lt;ul&gt;
&lt;/ul&gt;
&lt;div&gt;&lt;b&gt;Participation in Events&lt;/b&gt;&lt;/div&gt;
&lt;div&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="external-link" href="http://cis-india.org/internet-governance/news/ietf-102-montreal"&gt;IETF 102 Montreal&lt;/a&gt; (Organized by Internet Engineering Task Force; Fairmont Queen Elizabeth Montreal in Canada; July 14 - 20, 2018). Gurshabad Grover presented a review of the human rights considerations in the drafts of the Software Update for IoT Devices (SUIT) Working Group in the meeting of the HRPC research group. &lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="http://cis-india.org/internet-governance/news/ethical-data-design-practices-in-the-ai-artificial-intelligence-age"&gt;Ethical Data Design Practices in the AI (Artificial Intelligence) Age&lt;/a&gt; (Organized by Startup Grind, Bangalore at NUMA Bangalore; July 28, 2018). Shweta Mohandas was a panelist.&lt;/li&gt;
&lt;/ul&gt;
&lt;/div&gt;
&lt;div&gt;&lt;/div&gt;
&lt;ul&gt;
&lt;/ul&gt;
&lt;h3&gt;Cyberspace and Cyber Security&lt;/h3&gt;
&lt;p&gt;&lt;b&gt;Analysis&lt;/b&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/the-potential-for-the-normative-regulation-of-cyberspace-implications-for-india"&gt;The Potential for the Normative Regulation of Cyberspace: Implications for India&lt;/a&gt; (Arindrajit Basu; July 30, 2018). The report was edited by Elonnai Hickok, Sunil Abraham and Udbhav Tiwari with research assistance from Tejas Bharadwaj.&lt;/li&gt;
&lt;/ul&gt;
&lt;div&gt;&lt;b&gt;Blog Entry&lt;/b&gt;&lt;/div&gt;
&lt;div&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/cis-contributes-to-the-research-and-advisory-group-of-the-global-commission-on-the-stability-of-cyberspace-gcsc"&gt;CIS contributes to the Research and Advisory Group of the Global Commission on the Stability of Cyberspace&lt;/a&gt; (GCSC) (Arindrajit Basu; July 5, 2018). &lt;/li&gt;
&lt;/ul&gt;
&lt;/div&gt;
&lt;p&gt;&lt;b&gt;Participation in Event&lt;/b&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li style="text-align: justify; "&gt;&lt;a class="external-link" href="http://cis-india.org/internet-governance/news/ieee-sa-indita-conference-2018"&gt;IEEE-SA InDITA Conference 2018&lt;/a&gt; (Organized by IEEE Standards Association; IIIT-Bangalore; July 10 - 11, 2018). Gurshabad Grover gave a brief presentation on how we could apply or reject 'Trust Through Technology' principles in the design of public biometric authentication. &lt;/li&gt;
&lt;/ul&gt;
&lt;h3&gt;Free Speech &amp;amp; Expression&lt;/h3&gt;
&lt;p&gt;&lt;b&gt;Blog Entries&lt;/b&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/icann-diversity-analysis"&gt;ICANN Diversity Analysis&lt;/a&gt; (Paul Kurian and Akriti Bopanna; July 16, 2018).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/didp-31-diversity-of-employees-at-icann"&gt;DIDP #31 Diversity of employees at ICANN&lt;/a&gt; (Akash Sriram; July 19, 2018).&lt;/li&gt;
&lt;/ul&gt;
&lt;div&gt;&lt;b&gt;&lt;br /&gt;Participation in Event&lt;/b&gt;&lt;/div&gt;
&lt;div&gt;
&lt;ul&gt;
&lt;li style="text-align: justify; "&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/26th-amic-annual-conference-2013-india-2018"&gt;26th AMIC Annual Conference – India 2018&lt;/a&gt; (Organized by Manipal Academy of Higher Education; Fortune Inn Valley View, Manipal, Karnataka; June 7 - 9, 2018). Swaraj Paul Barooah was a speaker. &lt;span&gt;An article announcing the event by Kevin Mendonsa was published in the &lt;/span&gt;&lt;a class="external-link" href="https://timesofindia.indiatimes.com/home/education/news/mahe-to-host-26th-annual-conference-of-amic/articleshow/64468351.cms"&gt;Times of India&lt;/a&gt;&lt;span&gt; on June 5, 2018.&lt;/span&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;/div&gt;
&lt;div&gt;
&lt;ul&gt;
&lt;/ul&gt;
&lt;/div&gt;
&lt;div&gt;&lt;/div&gt;
&lt;ul&gt;
&lt;/ul&gt;
&lt;h2&gt;&lt;span style="text-align: justify; "&gt;&lt;a class="external-link" href="http://cis-india.org/telecom"&gt;Telecom&lt;/a&gt;&lt;/span&gt;&lt;/h2&gt;
&lt;p&gt;&lt;span style="text-align: justify; "&gt;CIS is involved in promoting access and accessibility to telecommunications services and resources, and has provided inputs to ongoing policy discussions and consultation papers published by TRAI. It has prepared reports on unlicensed spectrum and accessibility of mobile phones for persons with disabilities and also works with the USOF to include funding projects for persons with disabilities in its mandate:&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;b&gt;&lt;span style="text-align: justify; "&gt;Newspaper Column&lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;span style="text-align: justify; "&gt;&lt;a class="external-link" href="https://cis-india.org/telecom/blog/organizing-india-blogspot-shyam-ponappa-july-6-2018-problems-that-should-occupy-our-electioneers"&gt;The Problems That Should Occupy Our Electioneers&lt;/a&gt; (Shyam Ponappa; Business Standard; July 5, 2018 and Organizing India Blogspot; July 6, 2018).&lt;/span&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;hr /&gt;
&lt;h2&gt;&lt;span style="text-align: justify; "&gt;&lt;span style="text-align: justify; "&gt;&lt;span style="text-align: justify; "&gt;&lt;a href="http://cis-india.org/"&gt;About CIS&lt;/a&gt;&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;The Centre for Internet and  Society (CIS) is a non-profit organisation that undertakes  interdisciplinary research on internet and digital technologies from  policy and academic perspectives. The areas of focus include digital  accessibility for persons with disabilities, access to knowledge,  intellectual property rights, openness (including open data, free and  open source software, open standards, open access, open educational  resources, and open video), internet governance, telecommunication  reform, digital privacy, and cyber-security. The academic research at  CIS seeks to understand the reconfigurations of social and cultural  processes and structures as mediated through the internet and digital  media technologies.&lt;/p&gt;
&lt;p&gt;► Follow us elsewhere&lt;/p&gt;
&lt;div&gt;
&lt;ul&gt;
&lt;li&gt;Twitter:&lt;a href="http://twitter.com/cis_india"&gt; http://twitter.com/cis_india&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;Twitter - Access to Knowledge: &lt;a href="https://twitter.com/CISA2K"&gt;https://twitter.com/CISA2K&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;Twitter - Information Policy: &lt;a href="https://twitter.com/CIS_InfoPolicy"&gt;https://twitter.com/CIS_InfoPolicy&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;Facebook - Access to Knowledge:&lt;a href="https://www.facebook.com/cisa2k"&gt; https://www.facebook.com/cisa2k&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;E-Mail - Access to Knowledge: &lt;a&gt;a2k@cis-india.org&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;E-Mail - Researchers at Work: &lt;a&gt;raw@cis-india.org&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;List - Researchers at Work: &lt;a href="https://lists.ghserv.net/mailman/listinfo/researchers"&gt;https://lists.ghserv.net/mailman/listinfo/researchers&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;/div&gt;
&lt;p&gt;► Support Us&lt;/p&gt;
&lt;div&gt;Please help us defend consumer and citizen rights on the Internet! Write a cheque in favour of 'The Centre for Internet and Society' and mail it to us at No. 194, 2nd 'C' Cross, Domlur, 2nd Stage, Bengaluru - 560071.&lt;/div&gt;
&lt;p&gt;► Request for Collaboration&lt;/p&gt;
&lt;div&gt;
&lt;p style="text-align: justify; "&gt;We invite researchers, practitioners, artists, and theoreticians,  both organisationally and as individuals, to engage with us on topics  related internet and society, and improve our collective understanding  of this field. To discuss such possibilities, please write to Sunil  Abraham, Executive Director, at sunil@cis-india.org (for policy research), or Sumandro Chattapadhyay, Research Director, at sumandro@cis-india.org (for  academic research), with an indication of the form and the content of  the collaboration you might be interested in. To discuss collaborations  on Indic language Wikipedia projects, write to Tanveer Hasan, Programme  Officer, at &lt;a&gt;tanveer@cis-india.org&lt;/a&gt;.&lt;/p&gt;
&lt;div style="text-align: justify; "&gt;&lt;i&gt;CIS is grateful to its primary donor the Kusuma Trust founded  by Anurag Dikshit and Soma Pujari, philanthropists of Indian origin for  its core funding and support for most of its projects. CIS is also  grateful to its other donors, Wikimedia Foundation, Ford Foundation,  Privacy International, UK, Hans Foundation, MacArthur Foundation, and  IDRC for funding its various projects&lt;/i&gt;.&lt;/div&gt;
&lt;/div&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/about/newsletters/july-2018-newsletter'&gt;https://cis-india.org/about/newsletters/july-2018-newsletter&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>praskrishna</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Telecom</dc:subject>
    
    
        <dc:subject>Researchers at Work</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Access to Knowledge</dc:subject>
    

   <dc:date>2018-08-11T02:50:52Z</dc:date>
   <dc:type>Page</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/business-standard-july-31-2018-sunil-abraham-spreading-unhappiness-equally-around">
    <title>Spreading unhappiness equally around</title>
    <link>https://cis-india.org/internet-governance/blog/business-standard-july-31-2018-sunil-abraham-spreading-unhappiness-equally-around</link>
    <description>
        &lt;b&gt;The section of civil society opposed to Aadhaar is unhappy because the UIDAI and all other state agencies that wish to can process data non-consensually.&lt;/b&gt;
        &lt;p&gt;The article was published in &lt;a class="external-link" href="https://www.business-standard.com/article/opinion/spreading-unhappiness-equally-around-118073100008_1.html"&gt;Business Standard&lt;/a&gt; on July 31, 2018.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;There is a joke in policy-making circles — you know you have reached a good compromise if all the relevant stakeholders are equally unhappy. By that measure, the B N Srikrishna committee has done a commendable job since there are many with complaints.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Some in the private sector are unhappy because their demonisation of the European Union’s General Data Protection Regulation (GDPR) has failed. The committee’s draft data protection Bill is closely modelled upon the GDPR in terms of rights, principles, design of the regulator and the design of regulatory tools like impact assessments. With a maximum fine of 4 per cent of global turnover, there is a clear signal that privacy infringements by transnational corporations will be reined in by the regulator. Getting a law that has copied many elements of the European regulation is good news for us because the GDPR is recognised by leading human rights organisations as the global gold standard. But the bad news for us is that the Bill also has unnecessarily broad data localisation mandates for the private sector.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Some in the fintech sector are unhappy because the committee rejected the suggestion that privacy be regulated as a property right. This is a positive from the human rights perspective, especially because this approach has been rejected across the globe, including in the European Union. Property rights are inappropriate because a natural law framing of the enclosure of the commons into private property through labour does not translate to personal data. Also, in comparison to patents — or “intellectual property” — the scale of possible discrete property holdings in personal information is several orders of magnitude higher, posing unimaginable complexity for regulation and possibly creating a gridlock economy.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The section of civil society opposed to Aadhaar is unhappy because the UIDAI and all other state agencies that wish to can process data non-consensually. A similar loophole exists in the GDPR. Remember, the definition of processing includes “operations such as collection, recording, organisation, structuring, storage, adaptation, alteration, retrieval, use, alignment or combination, indexing, disclosure by transmission, dissemination or otherwise making available, restriction, erasure or destruction”. This means the UIDAI can collect data from you without your consent and does not have to establish consent for the data it has collected in the past. There is a “necessary” test which is supposed to constrain data collection. But for the last 10-odd years, the UIDAI has deemed it “necessary” to collect biometrics to give the poor subsidised grain. Will those forms of disproportionate non-consensual data collection continue? Most probably, because the report recommends that the UIDAI continue to play the role of the regulator with heightened powers, which is like trusting the fox with the henhouse.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Employees should be unhappy because the Bill has an expansive ground under which employers can non-consensually harvest their data. The Bill allows for non-consensual processing of any data “necessary” for “recruitment, termination, providing any benefit or service, verifying the attendance or any other activity related to the assessment of the performance”. This is permitted when consent is not an appropriate basis or would involve disproportionate effort on the part of the employer. This is basically a surveillance provision for employers. Either this ground should be removed, as in the GDPR, or a “proportionate” test should also be introduced; otherwise disproportionate mechanisms like spyware on work computers will be installed by employers without providing notice.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Some free speech activists are unhappy because the law contains a “right to be forgotten” provision. They are concerned that this will be used by the rich and powerful to censor mainstream and alternative media. On the face of it, the “right to be forgotten” in the GDPR is a much more expansive “right to erasure”, whilst the Bill only provides for a more limited “right to restrict or prevent continuing disclosure”. However, the GDPR has a clear exception for “archiving purposes in the public interest, scientific or historical research purposes or statistical purposes”. The Bill, like the GDPR, does identify the two competing human rights imperatives — freedom of expression and the right to information. However, by missing the “public interest” test, it does not sufficiently address social power asymmetries.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Privacy and security researchers are unhappy because re-identification has been made an offence without a public interest or research exception. It is indeed a positive that the committee has made re-identification a criminal offence, because the de-identification standards notified by the regulator would always be catching up with the latest mathematical developments. However, in order to protect the very research that the regulator needs to protect the rights of individuals, the Bill should have granted the formal and non-formal academic community immunity from liability and criminal prosecution.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Lastly, but most importantly, human rights activists are unhappy because the committee, again like the GDPR, did not include sufficiently specific surveillance law fixes. The European Union has historically handled this separately in the ePrivacy Regulation. Maybe that is the approach we must also follow, or maybe this was a missed opportunity. Overall, the B N Srikrishna committee must be commended for producing a good data protection Bill. The task before us is to make it great and to have it enacted by Parliament at the earliest.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/business-standard-july-31-2018-sunil-abraham-spreading-unhappiness-equally-around'&gt;https://cis-india.org/internet-governance/blog/business-standard-july-31-2018-sunil-abraham-spreading-unhappiness-equally-around&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>sunil</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Aadhaar</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2018-07-31T14:49:52Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/economic-times-july-30-2018-sunil-abraham-lining-up-data-on-srikrishna-privacy-draft-bill">
    <title>Lining up the data on the Srikrishna Privacy Draft Bill</title>
    <link>https://cis-india.org/internet-governance/blog/economic-times-july-30-2018-sunil-abraham-lining-up-data-on-srikrishna-privacy-draft-bill</link>
    <description>
&lt;b&gt;In the run-up to the Justice BN Srikrishna committee report, some stakeholders advocated that consent be eliminated and replaced with stronger accountability obligations. This was rejected, and the committee has released a draft bill that has consent as its bedrock, just like the GDPR. And like the GDPR, there exists a legal basis for non-consensual processing of data for the “functions of the state”. What does this mean for law-abiding persons?&lt;/b&gt;
&lt;p&gt;The article was published in &lt;a class="external-link" href="https://economictimes.indiatimes.com/small-biz/startups/newsbuzz/lining-up-the-data-on-the-srikrishna-privacy-draft-bill/articleshow/65192296.cms"&gt;Economic Times&lt;/a&gt; on July 30, 2018.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;Non-consensual processing is permitted in the bill as long as it is “necessary for any function of the” Parliament or any state legislature. These functions need not be authorised by law.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Or, alternatively, “necessary for any function of the state authorised by law” for the provision of a service or benefit, or the issuance of any certification, licence or permit.&lt;br /&gt;Fortunately, however, the state remains bound by the eight obligations in chapter two, i.e., fair and reasonable processing, purpose limitation, collection limitation, lawful processing, notice, data quality, data storage limitation and accountability. This ground in the GDPR has two sub-clauses: one requiring that the task pass the public interest test, and one that, like the Indian bill, contains a loophole possibly covering all interactions the state has with all persons.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The “necessary” test appears both on the grounds for non-consensual processing, and in the “collection limitation” obligation in chapter two of the bill. For sensitive personal data, the test is raised to “strictly necessary”. But the difference is not clarified and the word “necessary” is used in multiple senses.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Under the “collection limitation” obligation the bill says “necessary for the purposes of processing” which indicates a connection to the “purpose limitation” obligation. The “purpose limitation” obligation, however, only requires the state to have a purpose that is “clear, specific and lawful” and processing limited to the “specific purpose” and “any other incidental purpose that the data principal would reasonably expect the personal data to be used for”. It is perhaps important at this point to note that the phrase “data minimisation” does not appear anywhere in the bill.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Therefore “necessary” could be broadly understood to mean data that Parliament or the state legislature requires to perform some function unauthorised by law, and data that the citizen might reasonably expect a state authority to consider incidental to the provision of a service or benefit, or the issuance of a certificate, licence or permit.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Or, alternatively, more conservatively understood to mean data without which it would be impossible for Parliament and state legislatures to carry out functions mandated by law, and data without which it would be impossible for the state to provide the specific service or benefit or issue certificates, licences and permits. It is completely unclear, as with the GDPR, why an additional test of “strictly necessary” is — if you will forgive the redundancy — necessary.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;After 10 years of Aadhaar, the average citizen “reasonably expects” the state to ask for biometric data to provide subsidised grain. But it is not impossible to provide subsidised grain in a corruption-free manner without using surveillance technology that can be used to remotely, covertly and non-consensually identify persons. Smart cards, for example, implement privacy by design. Therefore a “reasonable expectation” test is inappropriate, since this is not a question of changing social mores.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;When it comes to persons that are not law abiding the bill has two exceptions — “security of the state” and “prevention, detection, investigation and prosecution of contraventions of law”. Here the “necessary” test is combined with the “proportionate” test.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The proportionate test further constrains processing. For example, GPS data may be necessary for detecting that someone has jumped a traffic signal, but it might not be a proportionate response for a minor violation. Along with the requirement for “procedure established by law”, this is indeed a well carved out exception if the “necessary” test is interpreted conservatively. The only points of concern here are the infringement of a fundamental right for minor offences and the “prevention” of offences, which implies processing of personal data of innocent persons.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Ideally, consent should be introduced for law-abiding citizens even if it is merely tokenism, because you cannot revoke consent if you have not granted it in the first place. Alternatively, a less protective option would be to admit that all e-governance in India will be based on surveillance; therefore “necessary” should be conservatively defined and the “proportionate” test should be introduced as an additional safeguard.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/economic-times-july-30-2018-sunil-abraham-lining-up-data-on-srikrishna-privacy-draft-bill'&gt;https://cis-india.org/internet-governance/blog/economic-times-july-30-2018-sunil-abraham-lining-up-data-on-srikrishna-privacy-draft-bill&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>sunil</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2018-07-31T02:52:23Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/huffington-post-gopal-sathe-july-16-2018-after-securing-net-neutrality-in-india-trai-goes-to-bat-for-data-privacy">
    <title>After Securing Net Neutrality In India, TRAI Goes To Bat For Data Privacy</title>
    <link>https://cis-india.org/internet-governance/news/huffington-post-gopal-sathe-july-16-2018-after-securing-net-neutrality-in-india-trai-goes-to-bat-for-data-privacy</link>
    <description>
        &lt;b&gt;This will be a stop-gap measure before the creation of a privacy bill.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article by Gopal Sathe was published in &lt;a class="external-link" href="https://www.huffingtonpost.in/2018/07/16/after-securing-net-neutrality-in-india-trai-goes-to-bat-for-data-privacy_a_23483166/"&gt;Huffington Post&lt;/a&gt; on July 16, 2018. Pranesh Prakash was quoted.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;Last week, the Department of Telecom gave the nod to net neutrality regulations, ensuring that there would be no discrimination of data at a time when the US is moving in the &lt;a href="https://www.theverge.com/2018/6/11/17439456/net-neutrality-dead-ajit-pai-fcc-internet" target="_blank"&gt;opposite direction&lt;/a&gt;. The net neutrality norms were based on the recommendations of the Telecom Regulatory Authority of India (TRAI) - which the BBC in November described as &lt;a href="https://www.bbc.com/news/world-asia-india-42162979" target="_blank"&gt;the world's strongest&lt;/a&gt; - but the regulator isn't celebrating right now; it has moved on to another equally important topic: privacy and data protection.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;On Monday, TRAI announced its &lt;a href="https://trai.gov.in/sites/default/files/RecommendationDataPrivacy16072018_0.pdf" target="_blank"&gt;recommendations&lt;/a&gt; on privacy, security, and ownership of data in the telecom sector, and the 77-page document serves as the first major public guidelines on privacy and data protection in India.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;TRAI has outlined a consent-based framework, where users have to clearly choose what data is being used, which bears some similarities to Europe's GDPR. TRAI noted that while the right to privacy should not be treated solely as a property right, it must be noted that the controllers of personal data are mere custodians without any primary right over the same. In other words, your data should belong to you, and not to Google, or Facebook, or any other company which holds your data.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;"The Right to Choice, Notice, Consent, Data Portability, and Right to be Forgotten should be conferred upon the telecommunication consumers," TRAI recommended.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In section 2.3, it also notes that meta-data is personal information and as such should be given the same protections. This is an important point given that even metadata can be used to track and identify people accurately. It also noted that there needs to be a right to be forgotten, and once you stop using a service it should not store your data beyond what's mandated by the law, according to section 2.46. Section 2.49 also gives users the right to withdraw consent, which means that even if users have given consent to the gathering of their data, they will be able to stop tracking on demand.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;At the same time, TRAI also noted the stop-gap nature of its  recommendations, and said, "till such time a general data protection law  is notified by the government, the existing Rules/ License conditions  applicable to the Telecom Service Providers for protection of users  should be made applicable to all the entities in the digital  eco-system."&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Good, with some caveats&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Early reactions to the recommendations are largely positive. On  Twitter, lawyer Apar Gupta, who is one of the founding members of the  Internet Freedom Foundation shared some &lt;a href="https://twitter.com/apargupta84/status/1018856500775841793" target="_blank"&gt;quick thoughts&lt;/a&gt; about the recommendations. Describing this as a substantive document he  called it "partly positive since it calls for interim safeguards", but  added that the "form of some seems problematic."&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;On the plus side, he noted that many of the protections in the  recommendations "focus on a user rights model, which includes notice,  choice, consent, portability, deletion and erasure." He also praised the  recommendations for not taking a view on data localisation, and that  the protections need to apply to private as well as state entities.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;However, he criticized the fact that TRAI is planning to impose  license conditions on all OTT providers - that is to say, all third  party services. He also noted that the recommendations did not directly  address state surveillance. He also pointed out that an Electronic  Consent Framework as described in the recommendations may "centralise  consent requests thereby may end up generating more personal data and  unifying them into a single portal managed by the govt/regulators."&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;"We are happy with the TRAI's recommendations on Privacy, Security and Ownership of Data as the regulator is calling for all digital entities to be brought under a data protection framework. This would include all devices, operating systems, browsers, and applications and would be a welcome stop-gap measure till rules and regulations of the telecom service providers are applicable to them," said Rajan Matthews, DG, Cellular Operators Association of India.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;"This will ensure, in prevailing circumstances, that the privacy of  users is protected and maintained. National security and privacy issues  are of paramount importance. Accordingly, the regulator by making this  recommendation, is ensuring that no exception is made for any service  provider, while subjecting them to the rules to meet the national  security and privacy norms. However, this is our preliminary view and we  will need to review the other recommendations to determine their  implications."&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Speaking in a &lt;a href="https://twitter.com/ETNOWlive/status/1018849319300972544" target="_blank"&gt;television interview&lt;/a&gt;,  Pranesh Prakash, Policy Director at the Centre for Internet and  Society, said he's still processing the document, but "on the face of it  it seems good."&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;"There are still certain concerns I have which haven't been  addressed. The telecom licenses themselves, which are issued by the  Government of India, require a whole lot of data to be collected,  metadata to be collected, by telecom companies. So I'm not sure how that  requirement by the Government of India squares off with what is now  being recommended by TRAI."&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;"Let me also point out that one of the things that TRAI says, and it  might be exceeding its brief a little bit, is that it says this should  not only cover telecom operators, but also device manufacturers,  operating systems, application creators, and other kinds of software.  What TRAI seems to want to do is actually quite a bit more than what I  think the DoT has, or really ought to be doing. I really don't  understand whether this will find any favour in the interim before the  government decides to take up the Justice Srikrishna Committee report."&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Justice Srikrishna committee report still due&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Although TRAI's recommendations are an important document, and will serve as stopgap privacy rules, India is also on the verge of a data protection and privacy bill, which will be based on the recommendations of the Justice BN Srikrishna committee on the subject. The committee was formed in August and was expected to deliver its report in June, but sources say that disagreements over Aadhaar have caused some delays.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The committee is expected to send its recommendations to the  government soon, at which point things could change, but for now, TRAI's  recommendations are an important development as India moves to secure  the privacy of its people.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Ahead of that though, you can read the full TRAI recommendations &lt;a href="https://trai.gov.in/sites/default/files/RecommendationDataPrivacy16072018_0.pdf" target="_blank"&gt;here&lt;/a&gt;.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/huffington-post-gopal-sathe-july-16-2018-after-securing-net-neutrality-in-india-trai-goes-to-bat-for-data-privacy'&gt;https://cis-india.org/internet-governance/news/huffington-post-gopal-sathe-july-16-2018-after-securing-net-neutrality-in-india-trai-goes-to-bat-for-data-privacy&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Admin</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Telecom</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    

   <dc:date>2018-07-29T05:28:20Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/livemint-july-26-2018-mihir-dalal-and-anirban-sen-byte-by-byte-protecting-her-privacy">
    <title>Bit by byte protecting her privacy</title>
    <link>https://cis-india.org/internet-governance/news/livemint-july-26-2018-mihir-dalal-and-anirban-sen-byte-by-byte-protecting-her-privacy</link>
    <description>
        &lt;b&gt;The Srikrishna committee draft law on data protection is days away. Here’s a bucket list of issues that will matter&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article by Mihir Dalal and Anirban Sen was published in &lt;a class="external-link" href="https://www.livemint.com/Politics/qZg7qJoXhHIwnyLUYVsaxL/Bit-by-byte-protecting-her-privacy.html"&gt;Livemint&lt;/a&gt; on July 26, 2018. Amber Sinha was quoted.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;In an  era dominated by “free” platforms such as Google, Facebook and Amazon,  among others, data privacy had largely been considered an academic  matter. However, in the past one year that notion has changed forever,  bringing data privacy to the fore, as one of the defining issues of the  internet, both in India and abroad.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Last August, the Supreme Court ruled that privacy was a fundamental right under the Constitution of India. Concomitantly, the debate over Aadhaar and its potential misuse picked up steam on the back of reports about data breaches in the biometric ID system, though these reports were denied by the Unique Identification Authority of India, which built Aadhaar. (The apex court will deliver its verdict on petitions that have challenged the constitutional validity of Aadhaar and its legal framework.)&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Globally,  Facebook came under severe criticism after it was revealed that the  social media giant had compromised user data in the run up to the US  elections. Finally, in May, Europe introduced its landmark data privacy  law, General Data Protection Regulation (GDPR), which has put users in  control of their data through various measures.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The stage  is now set for the much-delayed draft law on data protection, which is  expected to be submitted soon by the 10-member panel headed by former  Supreme Court justice B.N. Srikrishna.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The committee, which had been set up last July, has attracted criticism from some quarters. Earlier this month, more than 150 lawyers, activists and journalists, among others, wrote to the Srikrishna committee, complaining about the lack of transparency in its process and the lack of diversity in the views held by members of the committee, besides other issues. In an earlier letter in November last year, activists, lawyers and others had alleged that too many members of the committee held pro-Aadhaar views. Some experts believe that the mandate of the committee was flawed to begin with. “Given that personal information is omnipresent in so many different sectors, it is better to have a light touch legislation that deals mostly with key principles of data privacy and empowers a data commissioner to frame more detailed regulations,” said Stephen Mathias, partner, Kochhar and Co.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Last week, the Telecom Regulatory  Authority of India (Trai) released a set of recommendations on data  privacy that favour giving users control of their data and personal  information, while severely restricting the ways in which telecom and  internet companies can use customer data. Here are the major issues to  watch out for in the draft data protection law.&lt;/p&gt;
&lt;p class="orangeXh" style="text-align: justify; "&gt;&lt;b&gt;Users vs. collectors &lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This broad umbrella includes mandatory consent of users for data collection, data portability, the right to be forgotten and the right to erasure. Last week, Trai gave its recommendations on some of these issues in what were considered pro-privacy and progressive suggestions. Those recommendations tracked GDPR measures. The Srikrishna committee is also expected to suggest pro-privacy measures, though the details will be all-important. The committee is also expected to define what is ‘sensitive’ or ‘critical’ data. “In India, government agencies, private entities and others collect various forms of data on individuals,” said Chetan Nagendra, partner, AZB Partners. “The committee will have to clarify what category of data is allowed to be collected and whether this should be standardized across different entities. It will also have to standardize rules on how long it is okay to store such user-collected data.”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The flip side of user rights is the role of  data repositories that collect and process user data. The committee will  be required to clarify what data firms and government agencies can  gather on users and what will be their responsibilities toward the usage  of that data. This includes the principle of privacy by design, that  is, companies must ensure by default that their platforms are designed  to protect rather than exploit user data and privacy.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;IndusLaw partner Namita Viswanath said that in terms of data repositories, there was a need to distinguish between a data controller and a data processor. A data controller is the user-facing platform that gathers data, whereas a data processor is often a third-party firm that provides infrastructure for the platform. “Responsibilities for user personal data should be shared between a data controller and processor. The nature and extent of liability should depend on the nature of data, the party responsible for handling data and the measures adopted, but ultimately, the data controller should bear most responsibility,” Viswanath said.&lt;/p&gt;
&lt;p class="orangeXh" style="text-align: justify; "&gt;&lt;b&gt;Regulation  vs. Self-control&lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Given  that data is such a broad-ranging topic, the Srikrishna committee will  be expected to recommend who should have oversight of data-related  matters. Will there be a new data protection authority? If so, what will  be its scope, given that regulators, such as the RBI, Sebi and Trai,  will all be affected by a privacy framework in their respective areas?  And what will be the punitive measures and fines for offenders on data  matters?&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Some experts said the government should appoint a data protection authority. As the recent travails at Facebook show, relying solely on self-regulation of internet platforms is a disastrous policy. But it's unlikely that the entire burden of regulation will fall on one authority.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;“Logistical problems are likely, especially  in the early days, with having a top-down regulatory approach,” said  Kriti Trehan, partner, Panag and Babu. “The process of training,  requirement of funding and access to skilled human resources will  necessitate organisational and administrative inputs. With this in mind,  I believe that a co-regulatory framework for data protection will be  efficient. With this approach, established parameters may guide  escalation in specific instances.”&lt;/p&gt;
&lt;p class="orangeXh" style="text-align: justify; "&gt;&lt;b&gt;Data localisation &lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In  April, the RBI had issued norms on the storage of payments system data,  which requires digital payment providers to store data in India. That  has sparked another debate over the possible stance of the Srikrishna  committee. Many start-ups and firms use data servers located in overseas  locations because of several reasons, including economies of scale and  tax planning. “Data protection should not be confused with data access,”  said Kartik Maheshwari, leader, Nishith Desai Associates. “For  instance, if a firm is storing user data abroad, that should be fine as  long as it is secure and access in India is provided, whenever required.  Storing data locally is not necessarily the best solution from the  perspective of data security as better infrastructure may be available  abroad. However, the government may, in exceptional cases of  sensitivity, legitimately require local storage of very narrowly defined  streams of data.”&lt;/p&gt;
&lt;p class="orangeXh" style="text-align: justify; "&gt;&lt;b&gt;Surveillance is key&lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The  law will also need to clearly define the contours of the contentious  issue of surveillance and how to ensure that India does not end up  replicating the policies in place in countries such as China, which are  notorious for mass surveillance practices. Surveillance that has been  legally sanctioned is part of the exceptions to regular privacy  practices. The committee will have to define the parameters of these  exceptions. In the case of surveillance, some experts, including Amber  Sinha of Centre for Internet and Society, said that while it needs to be  allowed in specific instances such as issues related to national  security, a judicial system needs to be in place to protect the rights  of the parties that are being put under surveillance. This, in many  ways, is the heart of a very important matter.&lt;/p&gt;
&lt;p class="orangeXh" style="text-align: justify; "&gt;&lt;b&gt;The Aadhaar factor&lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The most hot-button of all issues for the committee is, of course, Aadhaar. Former UIDAI chairman Nandan Nilekani told &lt;i&gt;Mint &lt;/i&gt;this  week that “if something needs to be modified in the Aadhaar law, it  will be done” by the Srikrishna committee. The changes that the  committee will suggest to the Aadhaar law will go a long way in  determining whether its draft law is truly pro-privacy.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/livemint-july-26-2018-mihir-dalal-and-anirban-sen-byte-by-byte-protecting-her-privacy'&gt;https://cis-india.org/internet-governance/news/livemint-july-26-2018-mihir-dalal-and-anirban-sen-byte-by-byte-protecting-her-privacy&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Admin</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2018-07-29T01:46:38Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/the-centre-for-internet-and-society2019s-comments-and-recommendations-to-the-indian-privacy-code-2018">
    <title>The Centre for Internet and Society’s Comments and Recommendations to the Indian Privacy Code, 2018</title>
    <link>https://cis-india.org/internet-governance/blog/the-centre-for-internet-and-society2019s-comments-and-recommendations-to-the-indian-privacy-code-2018</link>
    <description>
        &lt;b&gt;The debate surrounding privacy has in recent times gained momentum due to the Aadhaar judgement and the growing concerns around the use of personal data by corporations and governments.&lt;/b&gt;
        &lt;p&gt;Click to download the &lt;a class="external-link" href="http://cis-india.org/internet-governance/files/indian-privacy-code"&gt;file here&lt;/a&gt;&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;As India moves towards greater digitization, and technology becomes even more pervasive, there is a need to ensure the privacy of the individual as well as hold the private and public sector accountable for the use of personal data. Towards enabling public discourse and furthering the development a privacy framework for India, a group of lawyers and policy analysts backed by the Internet Freedom Foundation (IFF) have put together a draft a citizen's bill encompassing a citizen centric privacy code that is based on seven guiding principles.&lt;a href="#_ftn1"&gt;&lt;sup&gt;&lt;sup&gt;[1]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; This draft builds on the Citizens Privacy Bill, 2013 that had been drafted by CIS on the basis of a series of roundtables conducted in India.&lt;a href="#_ftn2"&gt;&lt;sup&gt;&lt;sup&gt;[2]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Privacy is one of the key areas of research at CIS and we welcome this initiative and hope that our comments make the Act a stronger embodiment of the right to privacy.&lt;/p&gt;
&lt;h1 style="text-align: justify; "&gt;Section by Section Recommendations&lt;/h1&gt;
&lt;h2 style="text-align: justify; "&gt;Preamble&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Comment:&lt;/b&gt; The Preamble specifies that the need for privacy has increased in the digital age, with the emergence of big data analytics.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Recommendation:&lt;/b&gt; It could instead be worded as ‘with the emergence of technologies such as big data analytics’, so as to recognize the impact of multiple technologies and processes including big data analytics.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Comment:&lt;/b&gt; The Preamble states that it is necessary for good governance that all interceptions of communication and surveillance be conducted in a systematic and transparent manner subservient to the rule of law.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Recommendation: The word ‘systematic’ is out of place, and can be interpreted incorrectly. It could instead be replaced with words such as ‘necessary’, ‘proportionate’, ‘specific’, and ‘narrow’, which would be more appropriate in this context.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Chapter 1&lt;/h2&gt;
&lt;h2 style="text-align: justify; "&gt;Preliminary&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Section 2: &lt;/b&gt;This Section defines the terms used in the Act.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Comment:&lt;/b&gt; Some of the terms are incomplete and a few of the terms used in the Act have not been included in the list of definitions.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Recommendations:&lt;/b&gt;&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;The term “effective consent” needs to be defined. The term is first used in the Proviso to Section 7(2), which states “Provided that effective consent can only be said to have been obtained where...:” It is crucial that the Act define effective consent, especially with respect to sensitive data.&lt;/li&gt;
&lt;li&gt;The term “open data” needs to be defined. The term is first used in Section 5 that states the exemptions to the right to privacy. Subsection 1 clause ii states as follows “the collection, storage, processing or dissemination by a natural person of personal data for a strictly non-commercial purposes which may be classified as open data by the Privacy Commission”. Hence the term open data needs to be defined in order to ensure that there is no ambiguity in terms of what open data means.&lt;/li&gt;
&lt;li&gt;The Act does not define “erasure”, although the term does appear in the definition of destroy (Section 2(1)(p)). Some provisions use the word erasure; hence, if erasure and destruction denote different acts, the term erasure needs to be defined. Otherwise, in order to maintain uniformity, the sections where erasure is used could substitute the term “destroy” as defined under this Act.&lt;/li&gt;
&lt;li&gt;The definition of “sensitive personal data” does not include location data and identification numbers. The definition of sensitive data must include location data as the Act also deals in depth with surveillance. With respect to identification numbers, the Act needs to consider identification numbers (e.g. the Aadhaar number, PAN number etc.) as sensitive information, as such a number is linked to a person's identity and can reveal sensitive personal data such as name, age, location, biometrics etc. An example can be taken from Article 4(1) of the GDPR&lt;a href="#_ftn3"&gt;&lt;sup&gt;&lt;sup&gt;[3]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;, which identifies location data as well as identification numbers as personal data, along with other identifiers such as biometric data, gender, race etc.&lt;/li&gt;
&lt;li&gt;The Act defines consent as the “unambiguous indication of a data subject’s agreement”; however, the definition does not indicate that the consent needs to be informed. Hence the revised definition could read as follows: “the informed and unambiguous indication of a data subject’s agreement”. It is also unclear how this definition of consent relates to ‘effective consent’; this relationship needs to be clarified.&lt;/li&gt;
&lt;li&gt;The Act defines ‘data controller’ in Section 2(1)(l) as “any person including appropriate government..”. In order to remove any ambiguity over the definition of the term person, the definition could specify that the term person means any natural or legal person.&lt;/li&gt;
&lt;li&gt;The Act defines ‘data processor’ in Section 2(1)(m) as “means any person including appropriate government”. In order to remove any ambiguity over the definition of the term ‘any person’, the definition could specify that the term person means any natural or legal person.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 style="text-align: justify; "&gt;CHAPTER II&lt;/h2&gt;
&lt;h2 style="text-align: justify; "&gt;Right to Privacy&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Section 5: &lt;/b&gt;This section provides exemption to the rights to privacy&lt;b&gt;. &lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Comment: &lt;/b&gt;Section 5(1)(ii) states that the collection, storage, processing or dissemination by a natural person of personal data for a strictly non-commercial purposes are exempted from the provisions of the right to privacy. This clause also states that this data may be classified as open data by the Privacy Commission. This section hence provides individuals the immunity from collection, storage, processing and dissemination of data of another person. However this provision fails to state what specific activities qualify as non commercial use.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Recommendation: &lt;/b&gt;This provision could potentially be strengthened by specifying that the use must be in the public interest. The other issue with this subsection is that it fails to define open data. If open data was to be examined using its common definition i.e “data that can be freely used, modified, and shared by anyone for any purpose”&lt;a href="#_ftn4"&gt;&lt;sup&gt;&lt;sup&gt;[4]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; then this section becomes highly problematic. As a simple interpretation would mean that any personal data that is collected, stored, processed or disseminated by a natural person can possibly become available to anyone. Beyond this, India has an existing framework governing open data. Ideally the privacy commissioner could work closely with government departments to ensure that open data practices in India are in compliance with the privacy law.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;CHAPTER III&lt;/h2&gt;
&lt;h2 style="text-align: justify; "&gt;Protection of Personal Data&lt;/h2&gt;
&lt;h2 style="text-align: justify; "&gt;PART A&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Notice by data controller &lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Section 6: &lt;/b&gt;This section specifies the obligations to be followed by data controllers in their communication, to maintain transparency and lays down provisions that all communications by Data Controllers need to be complied with.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Comment:&lt;/b&gt; There seems to be a error in the &lt;i&gt;Proviso &lt;/i&gt;to this section. The proviso states “Provided that all communications by the Data Controllers including but not limited to the rights of Data Subjects under this part &lt;b&gt;shall may be &lt;/b&gt;refused when the Data Controller is, unable to identify or has a well founded basis for reasonable doubts as to the identity of the Data Subject or are manifestly unfounded, excessive and repetitive, with respect to the information sought by the Data Subject ”.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Recommendation: &lt;/b&gt;The proviso could read as follows “The proviso states “Provided that all communications by the Data Controllers including but not limited to the rights of Data Subjects under this part &lt;b&gt;&lt;i&gt;may&lt;/i&gt;&lt;/b&gt; be refused when the Data Controller is…”. We suggest the use of the ‘may’ as this makes the provision less limiting to the rights of the data controller.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Additionally, it is not completely clear what ‘included but not limited to...’ would entail. This could be clarified further.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;PART B&lt;/h2&gt;
&lt;h2 style="text-align: justify; "&gt;CONSENT OF DATA SUBJECTS&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Section 10: &lt;/b&gt;This section talks about the collection of personal data.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Comment:&lt;/b&gt; Section 10(3) lays down the information that a person must provide before collecting the personal data of an individual.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Comment:&lt;/b&gt; Section 10(3)(xi) states as follows “the time and manner in which it will be destroyed, or the criteria used to Personal data collected in pursuance of a grant of consent by the data subject to whom it pertains shall, if that consent is subsequently withdrawn for any reason, be destroyed forthwith: determine that time period;”. There seems to be a problem with the sentence construction and the rather complex sentence is difficult to understand.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Recommendation:&lt;/b&gt; This section could be reworked in such as way that two conditions are clear, one - the time and manner in which the data will be destroyed and two the status of the data once consent is withdrawn.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Comment:&lt;/b&gt; Section 10(3)(xiii) states that the identity and contact details of the data controller and data processor must be provided. However it fails to state that the data controller should provide more details with regard to the process for grievance redressal. It does not provide guidance on what type of information needs to go into this notice and the process of redressal. This could lead to very broad disclosures about the existence of redress mechanisms without providing individuals an effective avenue to pursue.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Recommendation: &lt;/b&gt;As part of the requirement for providing the procedure for redress, data controllers could specifically be required to provide the details of the Privacy Officers, privacy commissioner, as well as provide more information on the redressal mechanisms and the process necessary to follow.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Section 11:&lt;/b&gt;This section lays out the provisions where collection of personal data without prior consent is possible.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Comment:&lt;/b&gt; Section 11 states “Personal data may be collected or received from a third party by a Data Controller the prior consent of the data subject only if it is:..”. However as the title of the section suggests the sentence could indicate the situations where it is permissible to collect personal data without prior consent from the data subject”. Hence the word “without” is missing from the sentence. Additionally the sentence could state that the personal data may be collected or received directly from an individual or from a third party as it is possible to directly collect personal data from an individual without consent.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Recommendation:&lt;/b&gt;The sentence could read as “Personal data may be collected or received from an &lt;b&gt;individual or a third party &lt;/b&gt;by a Data Controller &lt;b&gt;&lt;i&gt;without&lt;/i&gt;&lt;/b&gt; the prior consent of the data subject only if it is:..”.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Comment:&lt;/b&gt; Section 11(1)(i) states that the collection of personal data without prior consent when it is “necessary for the provision of an emergency medical service or essential services”. However it does not specify the kind or severity of the medical emergency.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Recommendation: &lt;/b&gt;In addition to medical emergency another exception could be made for imminent threats to life.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Section 12: &lt;/b&gt;This section details the Special provisions in respect of data collected prior to the commencement of this Act.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Comment:&lt;/b&gt; This section states that all data collected, processed and stored by data controllers and data processors prior to the date on which this Act comes into force shall be destroyed within a period of two years from the date on which this Act comes into force. Unless consent is obtained afresh within two years or that the personal data has been anonymised in such a manner to make re-identification of the data subject absolutely impossible. However this process can be highly difficult and impractical in terms of it being time consuming, expensive particularly, in cases of analog collections of data. This is especially problematic in cases where the controller cannot seek consent of the data subject due to change in address or inavailability or death. This will also be problematic in cases of digitized government records.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Recommendation:&lt;/b&gt; We suggest three ways in which the issue of data collected prior to the Act can be handled. One way is to make a distinction on the data based on whether the data controller has specified the purpose of the collection before collecting the data. If the purpose was not defined then the data can be deleted or anonymised. Hence there is no need to collect the data afresh for all the cases. The purpose of the data can also be intimated to the data subject at a later stage and the data subject can choose if they would like the controller to store or process the data.The second way is by seeking consent afresh only for the sensitive data. Lastly, the data controller could be permitted to retain records of data, but must necessarily obtain fresh consent before using them. By not having a blanket provision of retrospective data deletion the Act can address situations where deletion is complicated or might have a potential negative impact by allowing storage, deletion, or anonymisation of data based on its purpose and kind.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Comment:&lt;/b&gt; Section (2)(1)(i) of the Act states that the data will not be destroyed provided that &lt;b&gt;effective consent&lt;/b&gt; is obtained afresh within two years. However as stated earlier the Act does not define effective consent.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Recommendation: The term &lt;b&gt;effective consent &lt;/b&gt;needs to be defined in order to bring clarity to this provision.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;PART C&lt;/h2&gt;
&lt;h2 style="text-align: justify; "&gt;FURTHER LIMITATIONS ON DATA CONTROLLERS&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Section 16: &lt;/b&gt;This section deals with the security of personal data and duty of confidentiality.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Comment:&lt;/b&gt; Section 16(2) states “ Any person who collects, receives, stores, processes or otherwise handles any personal data shall be subject to a duty of confidentiality and secrecy in respect of it.” Similarly Section 16(3) states “data controllers and data processors shall be subject to a duty of confidentiality and secrecy in respect of personal data in their possession or control. However apart from the duty of confidentiality and secrecy the data collectors and processors could also have a duty to maintain the security of the data.” Though it is important for confidentiality and secrecy to be maintained, ensuring security requires adequate and effective technical controls to be in place.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Recommendation:&lt;/b&gt; This section could also emphasise on the duty of the data controllers to ensure the security of the data. The breach notification could include details about data that is impacted by a breach or attach as well as the technical details of the infrastructure compromised.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Section 17:&lt;/b&gt; This section details the conditions for the transfer of personal data outside the territory of India.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Comment:&lt;/b&gt; Section 17 allows a transfer of personal data outside the territory of India in 3 situations- If the Central Government issues a notification deciding that the country/international organization in question can ensure an adequate level of protection, compatible with privacy principles contained in this Act; if the transfer is pursuant to an agreement which binds the recipient of the data to similar or stronger conditions in relation to handling the data; or if there are appropriate legal instruments and safeguards in place, to the satisfaction of the data controller. However, there is no clarification for what would constitute ‘adequate’ or ‘appropriate’ protection, and it does not account for situations in which the Government has not yet notified a country/organisation as ensuring adequate protection. In comparison, the GDPR, in Chapter V&lt;a href="#_ftn5"&gt;&lt;sup&gt;&lt;sup&gt;[5]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;, contains factors that must be considered when determining adequacy of protection, including relevant legislation and data protection rules, the existence of independent supervisory authorities, and international commitments or obligations of the country/organization. Additionally, the GDPR allows data transfer even in the absence of the determination of such protection in certain instances, including the use of standard data protection clauses, that have been adopted or approved by the Commission; legally binding instruments between public authorities; approved code of conduct, etc. Additionally, it allows derogations from these measures in certain situations: when the data subject expressly agrees, despite being informed of the risks; or if the transfer is necessary for conclusion of contract between data subject and controller, or controller and third party in the interest of data subject; or if the transfer is necessary for reasons of public interest, etc. 
No such circumstances are accounted for in Section 17.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Recommendation: &lt;/b&gt;Additionally, data controllers and processors could be provided with a period to allow them to align their policies towards the new legislation. Making these provisions operational as soon as the Act is commenced might put the controllers or processors guilty of involuntary breaching the provisions of the Act.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Section 19: &lt;/b&gt;This section&lt;b&gt; &lt;/b&gt;states the special provisions for sensitive personal data.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Comment:&lt;/b&gt; Section 19(2) states that in addition to the requirements set out under sub-clause (1), the Privacy Commission shall set out additional protections in respect of:i.sensitive personal data relating to data subjects who are minors; ii.biometric and deoxyribonucleic acid data; and iii.financial and credit data.This however creates additional categories of sensitive data apart from the ones that have already been created.&lt;a href="#_ftn6"&gt;&lt;sup&gt;&lt;sup&gt;[6]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; These additional categories can result in confusion and errors.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Recommendation: &lt;/b&gt;Sensitive data must not be further categorised as this can lead to confusion and errors. Hence all sensitive data could be subject to the same level of protection.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Section 20:&lt;/b&gt; This section states the special provisions for data impact assessment.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Comment:&lt;/b&gt; This section states that all data impact assessment reports will be submitted periodically to the State Privacy commission. This section does not make provisions for instances of circumstances in which such records may be made public. Additionally the data impact assessment could also include a human rights impact assessment.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Recommendation:&lt;/b&gt; The section could also have provisions for making the records of the impact assessment or relevant parts of the assessment public. This will ensure that the data controllers / processors are subjected to a standard of accountability and transparency. Additionally as privacy is linked to human rights the data impact assessment could also include a human rights impact assessment. The Act could further clarify the process for submission to State Privacy Commissions and potential access by the Central Privacy Commission to provide clarity in process.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Section 20 requires controllers who use new technology to assess the risks to the data protection rights that occur from processing. ‘New technology’ is defined to include pre-existing technology that is used anew. Additionally, the reports are required to be sent to the State Privacy Commission periodically. However, there is no clarification on the situations in which such an assessment becomes necessary, or whether all technology must undergo such an assessment before their use. Additionally, the differentiation between different data processing activities based on whether the data processing is incidental or a part of the functioning needs to be clarified. This differentiation is necessary as there are some data processors and controllers who need the data to function; for instance an ecommerce site would require your name and address to deliver the goods, although these sites do not process the data to make decisions. This can be compared to a credit rating agency that is using the data to make decisions as to who will be given a loan based on their creditworthiness. Example can taken from the GDPR, which in Article 35, specifies instances in which a data impact assessment is necessary: where a new technology, that is likely to result in a high risk to the rights of persons, is used; where personal aspects related to natural persons are processed automatically, including profiling; where processing of special categories of data (including data revealing ethnic/racial origin, sexual orientation etc), biometric/genetic data; where data relating to criminal convictions is processed; and with data concerning the monitoring of publicly accessible areas. Additionally, there is no requirement to publish the report, or send it to the supervising authority, but the controller is required to review the processor’s operations to ensure its compliance with the assessment report.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Recommendation:&lt;/b&gt; The reports could be sent to a central authority, which according to this Act is the Privacy Commission, along with the State Privacy Commission. Additionally there needs to be a differentiation between the incidental and express use of data. The data processors must be given at least a period of one year after the commencement of the Act to present their impact assessment report. This period is required for the processors to align themselves with the provisions of the Act as well as conduct capacity building initiatives.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;PART C&lt;/h2&gt;
&lt;h2 style="text-align: justify; "&gt;RIGHTS OF A DATA SUBJECT&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Section 21: &lt;/b&gt;This section explains the right of the data subject with regard to accessing her data. It states that the data subject has the right to obtain from the data controller information as to whether any personal data concerning her is collected or processed. The data controller also has to not only provide access to such information but also the personal data that has been collected or processed.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Comment:&lt;/b&gt; This section does not provide the data subject the right to seek information about security breaches.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Recommendation: &lt;/b&gt;This section could state that the data subject has the right to seek information about any security breaches that might have compromised her data (through theft, loss, leaks etc.). This could also include steps taken by the data controller to address the immediate breach as well as steps to minimise the occurrence of such breaches in the future.&lt;a href="#_ftn7"&gt;&lt;sup&gt;&lt;sup&gt;[7]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;CHAPTER IV&lt;/h2&gt;
&lt;h2 style="text-align: justify; "&gt;INTERCEPTION AND SURVEILLANCE&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Section 28: &lt;/b&gt;This section lists out the special provisions for competent organizations.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Comment:&lt;/b&gt; Section 28(1) states ”all provisions of Chapter III shall apply to personal data collected, processed, stored, transferred or disclosed by competent organizations unless when done as per the provisions under this chapter ”.This does not make provisions for other categories of data such as sensitive data.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Recommendation:&lt;/b&gt; This section needs to include not just personal data but also sensitive data, in order to ensure that all types of data are protected under this Act.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Section 30:&lt;/b&gt; This section states the provisions for prior authorisation by the appropriate Surveillance and Interception Review Tribunal.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Comment:&lt;/b&gt; Section 30(5) states “any interception involving the infringement of the privacy of individuals who are not the subject of the intended interception, or where communications relate to &lt;b&gt;medical, journalistic, parliamentary or legally privileged material&lt;/b&gt; may be involved, shall satisfy additional conditions including the provision of specific prior justification in writing to the Office for Surveillance Reform of the Privacy Commission as to the necessity for the interception and the safeguards providing for minimizing the material intercepted to the greatest extent possible and the destruction of all such material that is not strictly necessary to the purpose of the interception.” This section needs to state why these categories of communication are more sensitive than others. Additionally, interceptions typically target people and not topics of communication - thus medical may be part of a conversation between two construction workers and a doctor will communicate about finances.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Recommendation:&lt;/b&gt; The section could instead of singling out “medical, journalistic, parliamentary or legally privileged material” state that “any interception involving the infringement of the privacy of individuals who are not the subject of the intended interception may be involved, shall satisfy additional conditions including the provision of specific prior justification in writing to the Office for Surveillance Reform of the Privacy Commission.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Section 37&lt;/b&gt;: This section details the bar against surveillance.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Comment: &lt;/b&gt;Section 37(1) states that “no person shall order or carry out, or cause or assist the ordering or carrying out of, any surveillance of another person”. The section also prohibits indiscriminate monitoring, or mass surveillance, unless it is necessary and proportionate to the stated purpose. However, it is unclear whether this prohibits surveillance by a resident of their own residential property, which is allowed in Section 5, as the same could also fall within ‘indiscriminate monitoring/mass surveillance’. For instance, in the case of a camera installed in a residential property, which is outward facing, and therefore captures footage of the road/public space.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Recommendation:&lt;/b&gt; The Act needs to bring more clarity with regard to surveillance especially with respect to CCTV cameras that are installed in private places, but record public spaces such as public roads. The Act could have provisions that clearly define the use of CCTV cameras in order to ensure that cameras installed in private spaces are not used for carrying out mass surveillance. Further, the Act could address the use of emerging techniques and technology such as facial recognition technologies, that often rely on publicly available data.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;CHAPTER V&lt;/h2&gt;
&lt;h2 style="text-align: justify; "&gt;THE PRIVACY COMMISSION&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Section 53:&lt;/b&gt; This section details the powers and functions of the Privacy Commission.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Comment:&lt;/b&gt; Section 53(2)(xiv) states that the Privacy Commission shall publish periodic reports “providing description of performance, findings, conclusions or recommendations of any or all of the functions assigned to the Privacy Commission”. However this Section does not make provisions for such reporting to happen annually and to make them publicly available, as well as contain details including financial aspects of matters contained within the Act.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Recommendation: &lt;/b&gt;The functions could include a duty to disclose the information regarding the functioning and financial aspects of matters contained within the Act. Categories that could be included in such reports include: the number of data controllers, number of data processors, number of breaches detected and mitigated etc.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;CHAPTER IX&lt;/h2&gt;
&lt;h2 style="text-align: justify; "&gt;OFFENCES AND PENALTIES&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt; Sections 73 to 80:&lt;/b&gt; These sections lay out the different punishments for controlling and processing data in contravention to the provisions of this Act.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Comment:&lt;/b&gt; These sections, while laying out different punishments for controlling and processing data in contravention to the provisions of this Act, mets out a fine extending upto Rs. 10 crore. This is problematic as it does not base these penalties on the finer aspects of proportionality, such as  offences that are not as serious as the others.&lt;br /&gt; &lt;br /&gt; &lt;b&gt;Recommendation:&lt;/b&gt; There could be a graded approach to the penalties based on the degree of severity of the offence.This could be in the form of name and shame, warnings and penalties that can be graded based on the degree of the offence. &lt;br /&gt; ----------------------------------------------------------------------&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Additional thoughts: As India moves to a digital future there is a need for laws to be in place to ensure that individual's rights are not violated. By riding on the push to digitization, and emerging technologies such as AI, a strong all encompassing privacy legislation can allow India to leapfrog and use these emerging technologies for the benefit of the citizens without violating their privacy. A robust legislation can also ensure a level playing field for data driven enterprises within a framework of openness, fairness, accountability and transparency.&lt;/p&gt;
&lt;hr style="text-align: justify; " /&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref1"&gt;&lt;sup&gt;&lt;sup&gt;[1]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; These seven principles include: Right to Access, Right to Rectification, Right to Erasure And Destruction of Personal Data,Right to Restriction Of Processing, Right to Object, Right to Portability of Personal Data,Right to Seek Exemption from Automated Decision-Making.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref2"&gt;&lt;sup&gt;&lt;sup&gt;[2]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;The Privacy (Protection) Bill 2013: A Citizen’s Draft, Bhairav Acharya, Centre for Internet &amp;amp; Society, https://cis-india.org/internet-governance/blog/privacy-protection-bill-2013-citizens-draft&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref3"&gt;&lt;sup&gt;&lt;sup&gt;[3]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;General Data Protection Regulation, available at https://gdpr-info.eu/art-4-gdpr/.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref4"&gt;&lt;sup&gt;&lt;sup&gt;[4]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Antonio Vetro, Open Data Quality Measurement Framework: Definition and Application to Open Government Data, available at https://www.sciencedirect.com/science/article/pii/S0740624X16300132&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref5"&gt;&lt;sup&gt;&lt;sup&gt;[5]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; General Data Protection Regulation, available at https://gdpr-info.eu/chapter-5/.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref6"&gt;&lt;sup&gt;&lt;sup&gt;[6]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Sensitive personal data under Section 2(bb) includes, biometric data; deoxyribonucleic acid data;&lt;br /&gt; sexual preferences and practices;medical history and health information;political affiliation;&lt;br /&gt; membership of a political, cultural, social organisations including but not limited to a trade union as defined under Section 2(h) of the Trade Union Act, 1926;ethnicity, religion, race or caste; and&lt;br /&gt; financial and credit information, including financial history and transactions.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref7"&gt;&lt;sup&gt;&lt;sup&gt;[7]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Submission to the Committee of Experts on a Data Protection Framework for India, Amber Sinha, Centre for Internet &amp;amp; Society, available at https://cis-india.org/internet-governance/files/data-protection-submission&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/the-centre-for-internet-and-society2019s-comments-and-recommendations-to-the-indian-privacy-code-2018'&gt;https://cis-india.org/internet-governance/blog/the-centre-for-internet-and-society2019s-comments-and-recommendations-to-the-indian-privacy-code-2018&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Shweta Mohandas, Elonnai Hickok, Amber Sinha and Shruti Trikanand</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Aadhaar</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2018-07-20T13:55:46Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/files/indian-privacy-code">
    <title>Indian Privacy Code</title>
    <link>https://cis-india.org/internet-governance/files/indian-privacy-code</link>
    <description>
        
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/files/indian-privacy-code'&gt;https://cis-india.org/internet-governance/files/indian-privacy-code&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Admin</dc:creator>
    <dc:rights></dc:rights>


   <dc:date>2018-07-20T13:54:35Z</dc:date>
   <dc:type>File</dc:type>
   </item>




</rdf:RDF>
