<?xml version="1.0" encoding="utf-8" ?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:syn="http://purl.org/rss/1.0/modules/syndication/" xmlns="http://purl.org/rss/1.0/">
<channel rdf:about="https://cis-india.org/search_rss">
  <title>Centre for Internet and Society</title>
  <link>https://cis-india.org</link>
  
  <description>These are the search results for the query, showing results 251 to 265.</description>
  
  
  
  
  <image rdf:resource="https://cis-india.org/logo.png"/>

  <items>
    <rdf:Seq>
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/economic-times-july-30-2018-sunil-abraham-lining-up-data-on-srikrishna-privacy-draft-bill"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/huffington-post-gopal-sathe-july-16-2018-after-securing-net-neutrality-in-india-trai-goes-to-bat-for-data-privacy"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/livemint-july-26-2018-mihir-dalal-and-anirban-sen-byte-by-byte-protecting-her-privacy"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/the-centre-for-internet-and-society2019s-comments-and-recommendations-to-the-indian-privacy-code-2018"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/files/indian-privacy-code"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/raw/indian-express-july-15-2018-nishant-shah-digital-native-the-citys-watching"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/about/newsletters/june-2018-newsletter"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/hindustan-times-june-8-2018-vidhi-choudhary-draft-bill-proposes-rs-1-crore-fine-3-year-jail-for-data-privacy-violation"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/the-ai-task-force-report-the-first-steps-towards-indias-ai-framework"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/files/ai-task-force-report.pdf"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/india-legal-live-june-21-2018-data-privacy"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/comments-on-the-draft-digital-communications-policy"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/niti-aayog-discussion-paper-an-aspirational-step-towards-india2019s-ai-policy"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/economic-times-june-10-2018-sunil-abraham-why-npci-and-facebook-need-urgent-regulatory-attention"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/files/niti-aayog-discussion-paper"/>
        
    </rdf:Seq>
  </items>

</channel>


    <item rdf:about="https://cis-india.org/internet-governance/blog/economic-times-july-30-2018-sunil-abraham-lining-up-data-on-srikrishna-privacy-draft-bill">
    <title>Lining up the data on the Srikrishna Privacy Draft Bill</title>
    <link>https://cis-india.org/internet-governance/blog/economic-times-july-30-2018-sunil-abraham-lining-up-data-on-srikrishna-privacy-draft-bill</link>
    <description>
        &lt;b&gt;In the run-up to the Justice BN Srikrishna committee report, some stakeholders advocated that consent be eliminated and replaced with stronger accountability obligations. This was rejected, and the committee has released a draft bill that has consent as its bedrock, just like the GDPR. And, like the GDPR, there exists a legal basis for non-consensual processing of data for the “functions of the state”. What does this mean for law-abiding persons?&lt;/b&gt;
        &lt;p&gt;The article was published in &lt;a class="external-link" href="https://economictimes.indiatimes.com/small-biz/startups/newsbuzz/lining-up-the-data-on-the-srikrishna-privacy-draft-bill/articleshow/65192296.cms"&gt;Economic Times&lt;/a&gt; on July 30, 2018&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;Non-consensual processing is permitted in the bill as long as it is “necessary for any function of the” Parliament or any state legislature. These functions need not be authorised by law.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Or, alternatively, processing that is “necessary for any function of the state authorised by law” for the provision of a service or benefit, or the issuance of any certification, licence or permit.&lt;br /&gt;Fortunately, however, the state remains bound by the eight obligations in chapter two: fair and reasonable processing, purpose limitation, collection limitation, lawful processing, notice, data quality, data storage limitation, and accountability. The corresponding ground in the GDPR has two sub-clauses: one, that the task passes a public interest test; and two, a loophole, like the Indian bill’s, that potentially covers all interactions the state has with all persons.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The “necessary” test appears both on the grounds for non-consensual processing, and in the “collection limitation” obligation in chapter two of the bill. For sensitive personal data, the test is raised to “strictly necessary”. But the difference is not clarified and the word “necessary” is used in multiple senses.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Under the “collection limitation” obligation the bill says “necessary for the purposes of processing” which indicates a connection to the “purpose limitation” obligation. The “purpose limitation” obligation, however, only requires the state to have a purpose that is “clear, specific and lawful” and processing limited to the “specific purpose” and “any other incidental purpose that the data principal would reasonably expect the personal data to be used for”. It is perhaps important at this point to note that the phrase “data minimisation” does not appear anywhere in the bill.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Therefore “necessary” could broadly be understood to mean data that Parliament or a state legislature requires to perform some function unauthorised by law, and data that the citizen might reasonably expect a state authority to consider incidental to the provision of a service or benefit, or the issuance of a certificate, licence or permit.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Or, more conservatively, it could be understood to mean data without which it would be impossible for Parliament and the state legislatures to carry out functions mandated by law, and data without which it would be impossible for the state to provide the specific service or benefit or issue certificates, licences and permits. As with the GDPR, it is completely unclear why an additional test of “strictly necessary” is — if you will forgive the redundancy — necessary.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;After 10 years of Aadhaar, the average citizen “reasonably expects” the state to ask for biometric data to provide subsidised grain. But it is not impossible to provide subsidised grain in a corruption-free manner without using surveillance technology that can remotely, covertly and non-consensually identify persons. Smart cards, for example, implement privacy by design. A “reasonable expectation” test is therefore inappropriate, since this is not a question of changing social mores.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;When it comes to persons who are not law-abiding, the bill has two exceptions — “security of the state” and “prevention, detection, investigation and prosecution of contraventions of law”. Here the “necessary” test is combined with the “proportionate” test.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The proportionate test further constrains processing. For example, GPS data may be necessary for detecting that someone has jumped a traffic signal, but it might not be a proportionate response to a minor violation. Along with the requirement for “procedure established by law”, this is indeed a well carved-out exception if the “necessary” test is interpreted conservatively. The only points of concern here are the infringement of a fundamental right for minor offences, and the “prevention” of offences, which implies processing the personal data of innocent persons.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Ideally, consent should be introduced for law-abiding citizens even if it is merely tokenism, because you cannot revoke consent you have not granted in the first place. Alternatively, a less protective option would be to admit that all e-governance in India will be based on surveillance; in that case, “necessary” should be conservatively defined and the “proportionate” test should be introduced as an additional safeguard.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/economic-times-july-30-2018-sunil-abraham-lining-up-data-on-srikrishna-privacy-draft-bill'&gt;https://cis-india.org/internet-governance/blog/economic-times-july-30-2018-sunil-abraham-lining-up-data-on-srikrishna-privacy-draft-bill&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>sunil</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2018-07-31T02:52:23Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/huffington-post-gopal-sathe-july-16-2018-after-securing-net-neutrality-in-india-trai-goes-to-bat-for-data-privacy">
    <title>After Securing Net Neutrality In India, TRAI Goes To Bat For Data Privacy</title>
    <link>https://cis-india.org/internet-governance/news/huffington-post-gopal-sathe-july-16-2018-after-securing-net-neutrality-in-india-trai-goes-to-bat-for-data-privacy</link>
    <description>
        &lt;b&gt;This will be a stop-gap measure before the creation of a privacy bill.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article by Gopal Sathe was published in &lt;a class="external-link" href="https://www.huffingtonpost.in/2018/07/16/after-securing-net-neutrality-in-india-trai-goes-to-bat-for-data-privacy_a_23483166/"&gt;Huffington Post&lt;/a&gt; on July 16, 2018. Pranesh Prakash was quoted.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;Last week, the Department of Telecom gave  the nod to net neutrality regulations, ensuring that there would be no  discrimination of data at a time when the US is moving in the &lt;a href="https://www.theverge.com/2018/6/11/17439456/net-neutrality-dead-ajit-pai-fcc-internet" target="_blank"&gt;opposite direction&lt;/a&gt;.  The net neutrality norms were based on the recommendations from the  Telecom Regulatory Authority of India (TRAI) - which the BBC in November  described as &lt;a href="https://www.bbc.com/news/world-asia-india-42162979" target="_blank"&gt;the world's strongest&lt;/a&gt; - but the regulator isn't celebrating right now - it's moved on to  another equally important topic - privacy and data protection.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;On Monday, TRAI announced its &lt;a href="https://trai.gov.in/sites/default/files/RecommendationDataPrivacy16072018_0.pdf" target="_blank"&gt;recommendations&lt;/a&gt; on privacy, security, and ownership of data in the telecom sector, and the 77-page document serves as the first major public guidelines on privacy and data protection in India.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;TRAI has outlined a consent-based framework, where users have to clearly choose what data is being used, which bears some similarities to Europe's GDPR. TRAI noted that while the right to privacy should not be treated solely as a property right, the controllers of personal data are mere custodians without any primary right over the same. In other words, your data should belong to you, and not to Google, or Facebook, or any other company which holds your data.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;"The Right to Choice, Notice, Consent, Data Portability, and Right to be Forgotten should be conferred upon the telecommunication consumers," TRAI recommended.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In section 2.3, it also notes that metadata is personal information and as such should be given the same protections. This is an important point, given that even metadata can be used to track and identify people accurately. It also noted that there needs to be a right to be forgotten, and that once you stop using a service, it should not store your data beyond what's mandated by the law, according to section 2.46. Section 2.49 also allows users the right to withdraw consent, which means that even if users have given consent to the gathering of their data, they will be able to stop tracking on demand.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;At the same time, TRAI also noted the stop-gap nature of its  recommendations, and said, "till such time a general data protection law  is notified by the government, the existing Rules/ License conditions  applicable to the Telecom Service Providers for protection of users  should be made applicable to all the entities in the digital  eco-system."&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Good, with some caveats&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Early reactions to the recommendations are largely positive. On  Twitter, lawyer Apar Gupta, who is one of the founding members of the  Internet Freedom Foundation shared some &lt;a href="https://twitter.com/apargupta84/status/1018856500775841793" target="_blank"&gt;quick thoughts&lt;/a&gt; about the recommendations. Describing this as a substantive document he  called it "partly positive since it calls for interim safeguards", but  added that the "form of some seems problematic."&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;On the plus side, he noted that many of the protections in the  recommendations "focus on a user rights model, which includes notice,  choice, consent, portability, deletion and erasure." He also praised the  recommendations for not taking a view on data localisation, and that  the protections need to apply to private as well as state entities.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;However, he criticized the fact that TRAI is planning to impose  license conditions on all OTT providers - that is to say, all third  party services. He also noted that the recommendations did not directly  address state surveillance. He also pointed out that an Electronic  Consent Framework as described in the recommendations may "centralise  consent requests thereby may end up generating more personal data and  unifying them into a single portal managed by the govt/regulators."&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;"We are happy with the TRAI's recommendations on Privacy, Security and Ownership of Data, as the regulator is calling for all digital entities to be brought under a data protection framework. This would include all devices, operating systems, browsers, and applications, and would be a welcome stop-gap measure till the rules and regulations of the telecom service providers are applicable to them," said Rajan Matthews, DG, Cellular Operators Association of India.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;"This will ensure, in prevailing circumstances, that the privacy of  users is protected and maintained. National security and privacy issues  are of paramount importance. Accordingly, the regulator by making this  recommendation, is ensuring that no exception is made for any service  provider, while subjecting them to the rules to meet the national  security and privacy norms. However, this is our preliminary view and we  will need to review the other recommendations to determine their  implications."&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Speaking in a &lt;a href="https://twitter.com/ETNOWlive/status/1018849319300972544" target="_blank"&gt;television interview&lt;/a&gt;,  Pranesh Prakash, Policy Director at the Centre for Internet and  Society, said he's still processing the document, but "on the face of it  it seems good."&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;"There are still certain concerns I have which haven't been  addressed. The telecom licenses themselves, which are issued by the  Government of India, require a whole lot of data to be collected,  metadata to be collected, by telecom companies. So I'm not sure how that  requirement by the Government of India squares off with what is now  being recommended by TRAI."&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;"Let me also point out that one of the things that TRAI says, and it  might be exceeding its brief a little bit, is that it says this should  not only cover telecom operators, but also device manufacturers,  operating systems, application creators, and other kinds of software.  What TRAI seems to want to do is actually quite a bit more than what I  think the DoT has, or really ought to be doing. I really don't  understand whether this will find any favour in the interim before the  government decides to take up the Justice Srikrishna Committee report."&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Justice Srikrishna committee report still due&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Although TRAI's recommendations are an important document, and will serve as stopgap privacy rules, India is also on the verge of a data protection and privacy bill, which will be based on the recommendations of the Justice BN Srikrishna committee on the subject. The committee was formed in August and was expected to deliver its report in June, but sources say that disagreements over Aadhaar have caused some delays.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The committee is expected to send its recommendations to the  government soon, at which point things could change, but for now, TRAI's  recommendations are an important development as India moves to secure  the privacy of its people.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Ahead of that though, you can read the full TRAI recommendations &lt;a href="https://trai.gov.in/sites/default/files/RecommendationDataPrivacy16072018_0.pdf" target="_blank"&gt;here&lt;/a&gt;.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/huffington-post-gopal-sathe-july-16-2018-after-securing-net-neutrality-in-india-trai-goes-to-bat-for-data-privacy'&gt;https://cis-india.org/internet-governance/news/huffington-post-gopal-sathe-july-16-2018-after-securing-net-neutrality-in-india-trai-goes-to-bat-for-data-privacy&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Admin</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Telecom</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    

   <dc:date>2018-07-29T05:28:20Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/livemint-july-26-2018-mihir-dalal-and-anirban-sen-byte-by-byte-protecting-her-privacy">
    <title>Bit by byte protecting her privacy</title>
    <link>https://cis-india.org/internet-governance/news/livemint-july-26-2018-mihir-dalal-and-anirban-sen-byte-by-byte-protecting-her-privacy</link>
    <description>
        &lt;b&gt;The Srikrishna committee draft law on data protection is days away. Here’s a bucket list of issues that will matter&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article by Mihir Dalal and Anirban Sen was published in &lt;a class="external-link" href="https://www.livemint.com/Politics/qZg7qJoXhHIwnyLUYVsaxL/Bit-by-byte-protecting-her-privacy.html"&gt;Livemint&lt;/a&gt; on July 26, 2018. Amber Sinha was quoted.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;In an  era dominated by “free” platforms such as Google, Facebook and Amazon,  among others, data privacy had largely been considered an academic  matter. However, in the past one year that notion has changed forever,  bringing data privacy to the fore, as one of the defining issues of the  internet, both in India and abroad.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Last August, the Supreme Court ruled that privacy was a fundamental right under the Constitution of India. Concomitantly, the debate over Aadhaar and its potential misuse picked up steam on the back of reports about data breaches in the biometric ID system, though these reports were denied by the Unique Identification Authority of India, which built Aadhaar. (The apex court will deliver its verdict on petitions that have challenged the constitutional validity of Aadhaar and its legal framework.)&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Globally,  Facebook came under severe criticism after it was revealed that the  social media giant had compromised user data in the run up to the US  elections. Finally, in May, Europe introduced its landmark data privacy  law, General Data Protection Regulation (GDPR), which has put users in  control of their data through various measures.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The stage  is now set for the much-delayed draft law on data protection, which is  expected to be submitted soon by the 10-member panel headed by former  Supreme Court justice B.N. Srikrishna.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The committee, which had been set up last July, has attracted criticism from some quarters. Earlier this month, more than 150 lawyers, activists and journalists, among others, wrote to the Srikrishna committee, complaining about the lack of transparency in its process and the lack of diversity in the views held by members of the committee, besides other issues. In an earlier letter, in November last year, activists, lawyers and others had alleged that too many members of the committee held pro-Aadhaar views. Some experts believe that the mandate of the committee was flawed to begin with. “Given that personal information is omnipresent in so many different sectors, it is better to have a light touch legislation that deals mostly with key principles of data privacy and empowers a data commissioner to frame more detailed regulations,” said Stephen Mathias, partner, Kochhar and Co.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Last week, the Telecom Regulatory  Authority of India (Trai) released a set of recommendations on data  privacy that favour giving users control of their data and personal  information, while severely restricting the ways in which telecom and  internet companies can use customer data. Here are the major issues to  watch out for in the draft data protection law.&lt;/p&gt;
&lt;p class="orangeXh" style="text-align: justify; "&gt;&lt;b&gt;Users vs. collectors &lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This broad umbrella includes mandatory consent of users for data collection, data portability, the right to be forgotten and the right to erasure. Last week, Trai gave its recommendations on some of these issues in what were considered pro-privacy and progressive suggestions. Those recommendations tracked GDPR measures. The Srikrishna committee is also expected to suggest pro-privacy measures, though the details will be all-important. The committee is also expected to define what is ‘sensitive’ or ‘critical’ data. “In India, government agencies, private entities and others collect various forms of data on individuals,” said Chetan Nagendra, partner, AZB Partners. “The committee will have to clarify what category of data is allowed to be collected and whether this should be standardized across different entities. It will also have to standardize rules on how long it is okay to store such user-collected data.”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The flip side of user rights is the role of  data repositories that collect and process user data. The committee will  be required to clarify what data firms and government agencies can  gather on users and what will be their responsibilities toward the usage  of that data. This includes the principle of privacy by design, that  is, companies must ensure by default that their platforms are designed  to protect rather than exploit user data and privacy.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;IndusLaw partner Namita Viswanath said that in terms of data repositories, there was a need to distinguish between a data controller and a data processor. A data controller is the user-facing platform that gathers data, whereas a data processor is often a third-party firm that provides infrastructure for the platform. “Responsibilities for user personal data should be shared between a data controller and processor. The nature and extent of liability should depend on the nature of data, the party responsible for handling data and the measures adopted, but ultimately, the data controller should bear the most responsibility,” Viswanath said.&lt;/p&gt;
&lt;p class="orangeXh" style="text-align: justify; "&gt;&lt;b&gt;Regulation  vs. Self-control&lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Given  that data is such a broad-ranging topic, the Srikrishna committee will  be expected to recommend who should have oversight of data-related  matters. Will there be a new data protection authority? If so, what will  be its scope, given that regulators, such as the RBI, Sebi and Trai,  will all be affected by a privacy framework in their respective areas?  And what will be the punitive measures and fines for offenders on data  matters?&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Some experts said the government should appoint a data protection authority. As the recent travails at Facebook show, relying solely on self-regulation of internet platforms is a disastrous policy. But it’s unlikely that the entire burden of regulation will fall on one authority.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;“Logistical problems are likely, especially  in the early days, with having a top-down regulatory approach,” said  Kriti Trehan, partner, Panag and Babu. “The process of training,  requirement of funding and access to skilled human resources will  necessitate organisational and administrative inputs. With this in mind,  I believe that a co-regulatory framework for data protection will be  efficient. With this approach, established parameters may guide  escalation in specific instances.”&lt;/p&gt;
&lt;p class="orangeXh" style="text-align: justify; "&gt;&lt;b&gt;Data localisation &lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In  April, the RBI had issued norms on the storage of payments system data,  which requires digital payment providers to store data in India. That  has sparked another debate over the possible stance of the Srikrishna  committee. Many start-ups and firms use data servers located in overseas  locations because of several reasons, including economies of scale and  tax planning. “Data protection should not be confused with data access,”  said Kartik Maheshwari, leader, Nishith Desai Associates. “For  instance, if a firm is storing user data abroad, that should be fine as  long as it is secure and access in India is provided, whenever required.  Storing data locally is not necessarily the best solution from the  perspective of data security as better infrastructure may be available  abroad. However, the government may, in exceptional cases of  sensitivity, legitimately require local storage of very narrowly defined  streams of data.”&lt;/p&gt;
&lt;p class="orangeXh" style="text-align: justify; "&gt;&lt;b&gt;Surveillance is key&lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The  law will also need to clearly define the contours of the contentious  issue of surveillance and how to ensure that India does not end up  replicating the policies in place in countries such as China, which are  notorious for mass surveillance practices. Surveillance that has been  legally sanctioned is part of the exceptions to regular privacy  practices. The committee will have to define the parameters of these  exceptions. In the case of surveillance, some experts, including Amber  Sinha of Centre for Internet and Society, said that while it needs to be  allowed in specific instances such as issues related to national  security, a judicial system needs to be in place to protect the rights  of the parties that are being put under surveillance. This, in many  ways, is the heart of a very important matter.&lt;/p&gt;
&lt;p class="orangeXh" style="text-align: justify; "&gt;&lt;b&gt;The Aadhaar factor&lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The most hot-button of all issues for the committee is, of course, Aadhaar. Former UIDAI chairman Nandan Nilekani told &lt;i&gt;Mint &lt;/i&gt;this  week that “if something needs to be modified in the Aadhaar law, it  will be done” by the Srikrishna committee. The changes that the  committee will suggest to the Aadhaar law will go a long way in  determining whether its draft law is truly pro-privacy.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/livemint-july-26-2018-mihir-dalal-and-anirban-sen-byte-by-byte-protecting-her-privacy'&gt;https://cis-india.org/internet-governance/news/livemint-july-26-2018-mihir-dalal-and-anirban-sen-byte-by-byte-protecting-her-privacy&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Admin</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2018-07-29T01:46:38Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/the-centre-for-internet-and-society2019s-comments-and-recommendations-to-the-indian-privacy-code-2018">
    <title>The Centre for Internet and Society’s Comments and Recommendations to the Indian Privacy Code, 2018</title>
    <link>https://cis-india.org/internet-governance/blog/the-centre-for-internet-and-society2019s-comments-and-recommendations-to-the-indian-privacy-code-2018</link>
    <description>
        &lt;b&gt;The debate surrounding privacy has in recent times gained momentum due to the Aadhaar judgement and the growing concerns around the use of personal data by corporations and governments.&lt;/b&gt;
        &lt;p&gt;Click to download the &lt;a class="external-link" href="http://cis-india.org/internet-governance/files/indian-privacy-code"&gt;file here&lt;/a&gt;&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;As India moves towards greater digitization, and technology becomes even more pervasive, there is a need to ensure the privacy of the individual as well as hold the private and public sector accountable for the use of personal data. Towards enabling public discourse and furthering the development of a privacy framework for India, a group of lawyers and policy analysts backed by the Internet Freedom Foundation (IFF) have put together a draft citizens' bill encompassing a citizen-centric privacy code that is based on seven guiding principles.&lt;a href="#_ftn1"&gt;&lt;sup&gt;&lt;sup&gt;[1]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; This draft builds on the Citizens Privacy Bill, 2013, which had been drafted by CIS on the basis of a series of roundtables conducted in India.&lt;a href="#_ftn2"&gt;&lt;sup&gt;&lt;sup&gt;[2]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Privacy is one of the key areas of research at CIS; we welcome this initiative and hope that our comments make the Act a stronger embodiment of the right to privacy.&lt;/p&gt;
&lt;h1 style="text-align: justify; "&gt;Section by Section Recommendations&lt;/h1&gt;
&lt;h2 style="text-align: justify; "&gt;Preamble&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Comment:&lt;/b&gt; The Preamble specifies that the need for privacy has increased in the digital age, with the emergence of big data analytics.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Recommendation:&lt;/b&gt; It could instead be worded as ‘with the emergence of technologies such as big data analytics’, so as to recognize the impact of multiple technologies and processes including big data analytics.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Comment:&lt;/b&gt; The Preamble states that it is necessary for good governance that all interceptions of communication and surveillance be conducted in a systematic and transparent manner subservient to the rule of law.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Recommendation: The word ‘systematic’ is out of place, and can be interpreted incorrectly. It could instead be replaced with words such as ‘necessary’, ‘proportionate’, ‘specific’, and ‘narrow’, which would be more appropriate in this context.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Chapter 1&lt;/h2&gt;
&lt;h2 style="text-align: justify; "&gt;Preliminary&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Section 2: &lt;/b&gt;This Section defines the terms used in the Act.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Comment:&lt;/b&gt; Some of the terms are incomplete and a few of the terms used in the Act have not been included in the list of definitions.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Recommendations:&lt;/b&gt;&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;The term “effective consent” needs to be defined. The term is first used in the Proviso to Section 7(2), which states: “Provided that effective consent can only be said to have been obtained where...” It is crucial that the Act defines effective consent, especially with respect to sensitive data.&lt;/li&gt;
&lt;li&gt;The term “open data” needs to be defined. The term is first used in Section 5, which states the exemptions to the right to privacy. Subsection 1 clause ii states as follows: “the collection, storage, processing or dissemination by a natural person of personal data for a strictly non-commercial purposes which may be classified as open data by the Privacy Commission”. Hence the term needs to be defined in order to remove any ambiguity as to what open data means.&lt;/li&gt;
&lt;li&gt;The Act does not define “erasure”, although the term does come under the definition of destroy (Section 2(1)(p)). Some provisions use the word erasure; hence, if erasure and destruction denote different acts, the term erasure needs to be defined. Otherwise, in order to maintain uniformity, the sections where erasure is used could substitute the term “destroy” as defined under this Act.&lt;/li&gt;
&lt;li&gt;The definition of “sensitive personal data” does not include location data and identification numbers. The definition of sensitive data must include location data, as the Act also deals in depth with surveillance. With respect to identification numbers, the Act needs to consider identification numbers (e.g. the Aadhaar number, PAN number, etc.) as sensitive information, as such a number is linked to a person's identity and can reveal sensitive personal data such as name, age, location, biometrics, etc. An example can be taken from Article 4(1) of the GDPR,&lt;a href="#_ftn3"&gt;&lt;sup&gt;&lt;sup&gt;[3]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; which recognizes location data as well as identification numbers as identifiers of personal data, along with other identifiers such as biometric and genetic data.&lt;/li&gt;
&lt;li&gt;The Act defines consent as the “unambiguous indication of a data subject’s agreement”; however, the definition does not indicate that consent needs to be informed. Hence the revised definition could read as follows: “the informed and unambiguous indication of a data subject’s agreement”. It is also unclear how this definition of consent relates to ‘effective consent’; this relationship needs to be clarified.&lt;/li&gt;
&lt;li&gt;The Act defines ‘data controller’ in Section 2(1)(l) as “any person including appropriate government..”. In order to remove any ambiguity over the term person, the definition could specify that it means any natural or legal person.&lt;/li&gt;
&lt;li&gt;The Act defines ‘data processor’ in Section 2(1)(m) as “means any person including appropriate government”. In order to remove any ambiguity over the term ‘any person’, the definition could specify that it means any natural or legal person.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 style="text-align: justify; "&gt;CHAPTER II&lt;/h2&gt;
&lt;h2 style="text-align: justify; "&gt;Right to Privacy&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Section 5: &lt;/b&gt;This section provides exemption to the rights to privacy&lt;b&gt;. &lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Comment: &lt;/b&gt;Section 5(1)(ii) states that the collection, storage, processing or dissemination by a natural person of personal data for a strictly non-commercial purposes are exempted from the provisions of the right to privacy. This clause also states that this data may be classified as open data by the Privacy Commission. This section hence provides individuals the immunity from collection, storage, processing and dissemination of data of another person. However this provision fails to state what specific activities qualify as non commercial use.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Recommendation: &lt;/b&gt;This provision could potentially be strengthened by specifying that the use must be in the public interest. The other issue with this subsection is that it fails to define open data. If open data was to be examined using its common definition i.e “data that can be freely used, modified, and shared by anyone for any purpose”&lt;a href="#_ftn4"&gt;&lt;sup&gt;&lt;sup&gt;[4]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; then this section becomes highly problematic. As a simple interpretation would mean that any personal data that is collected, stored, processed or disseminated by a natural person can possibly become available to anyone. Beyond this, India has an existing framework governing open data. Ideally the privacy commissioner could work closely with government departments to ensure that open data practices in India are in compliance with the privacy law.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;CHAPTER III&lt;/h2&gt;
&lt;h2 style="text-align: justify; "&gt;Protection of Personal Data&lt;/h2&gt;
&lt;h2 style="text-align: justify; "&gt;PART A&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Notice by data controller &lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Section 6: &lt;/b&gt;This section specifies the obligations to be followed by data controllers in their communication, to maintain transparency and lays down provisions that all communications by Data Controllers need to be complied with.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Comment:&lt;/b&gt; There seems to be a error in the &lt;i&gt;Proviso &lt;/i&gt;to this section. The proviso states “Provided that all communications by the Data Controllers including but not limited to the rights of Data Subjects under this part &lt;b&gt;shall may be &lt;/b&gt;refused when the Data Controller is, unable to identify or has a well founded basis for reasonable doubts as to the identity of the Data Subject or are manifestly unfounded, excessive and repetitive, with respect to the information sought by the Data Subject ”.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Recommendation: &lt;/b&gt;The proviso could read as follows “The proviso states “Provided that all communications by the Data Controllers including but not limited to the rights of Data Subjects under this part &lt;b&gt;&lt;i&gt;may&lt;/i&gt;&lt;/b&gt; be refused when the Data Controller is…”. We suggest the use of the ‘may’ as this makes the provision less limiting to the rights of the data controller.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Additionally, it is not completely clear what ‘included but not limited to...’ would entail. This could be clarified further.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;PART B&lt;/h2&gt;
&lt;h2 style="text-align: justify; "&gt;CONSENT OF DATA SUBJECTS&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Section 10: &lt;/b&gt;This section talks about the collection of personal data.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Comment:&lt;/b&gt; Section 10(3) lays down the information that a person must provide before collecting the personal data of an individual.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Comment:&lt;/b&gt; Section 10(3)(xi) states as follows “the time and manner in which it will be destroyed, or the criteria used to Personal data collected in pursuance of a grant of consent by the data subject to whom it pertains shall, if that consent is subsequently withdrawn for any reason, be destroyed forthwith: determine that time period;”. There seems to be a problem with the sentence construction and the rather complex sentence is difficult to understand.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Recommendation:&lt;/b&gt; This section could be reworked in such as way that two conditions are clear, one - the time and manner in which the data will be destroyed and two the status of the data once consent is withdrawn.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Comment:&lt;/b&gt; Section 10(3)(xiii) states that the identity and contact details of the data controller and data processor must be provided. However it fails to state that the data controller should provide more details with regard to the process for grievance redressal. It does not provide guidance on what type of information needs to go into this notice and the process of redressal. This could lead to very broad disclosures about the existence of redress mechanisms without providing individuals an effective avenue to pursue.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Recommendation: &lt;/b&gt;As part of the requirement for providing the procedure for redress, data controllers could specifically be required to provide the details of the Privacy Officers, privacy commissioner, as well as provide more information on the redressal mechanisms and the process necessary to follow.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Section 11:&lt;/b&gt;This section lays out the provisions where collection of personal data without prior consent is possible.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Comment:&lt;/b&gt; Section 11 states “Personal data may be collected or received from a third party by a Data Controller the prior consent of the data subject only if it is:..”. However as the title of the section suggests the sentence could indicate the situations where it is permissible to collect personal data without prior consent from the data subject”. Hence the word “without” is missing from the sentence. Additionally the sentence could state that the personal data may be collected or received directly from an individual or from a third party as it is possible to directly collect personal data from an individual without consent.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Recommendation:&lt;/b&gt;The sentence could read as “Personal data may be collected or received from an &lt;b&gt;individual or a third party &lt;/b&gt;by a Data Controller &lt;b&gt;&lt;i&gt;without&lt;/i&gt;&lt;/b&gt; the prior consent of the data subject only if it is:..”.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Comment:&lt;/b&gt; Section 11(1)(i) states that the collection of personal data without prior consent when it is “necessary for the provision of an emergency medical service or essential services”. However it does not specify the kind or severity of the medical emergency.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Recommendation: &lt;/b&gt;In addition to medical emergency another exception could be made for imminent threats to life.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Section 12: &lt;/b&gt;This section details the Special provisions in respect of data collected prior to the commencement of this Act.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Comment:&lt;/b&gt; This section states that all data collected, processed and stored by data controllers and data processors prior to the date on which this Act comes into force shall be destroyed within a period of two years from the date on which this Act comes into force. Unless consent is obtained afresh within two years or that the personal data has been anonymised in such a manner to make re-identification of the data subject absolutely impossible. However this process can be highly difficult and impractical in terms of it being time consuming, expensive particularly, in cases of analog collections of data. This is especially problematic in cases where the controller cannot seek consent of the data subject due to change in address or inavailability or death. This will also be problematic in cases of digitized government records.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Recommendation:&lt;/b&gt; We suggest three ways in which the issue of data collected prior to the Act can be handled. One way is to make a distinction on the data based on whether the data controller has specified the purpose of the collection before collecting the data. If the purpose was not defined then the data can be deleted or anonymised. Hence there is no need to collect the data afresh for all the cases. The purpose of the data can also be intimated to the data subject at a later stage and the data subject can choose if they would like the controller to store or process the data.The second way is by seeking consent afresh only for the sensitive data. Lastly, the data controller could be permitted to retain records of data, but must necessarily obtain fresh consent before using them. By not having a blanket provision of retrospective data deletion the Act can address situations where deletion is complicated or might have a potential negative impact by allowing storage, deletion, or anonymisation of data based on its purpose and kind.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Comment:&lt;/b&gt; Section (2)(1)(i) of the Act states that the data will not be destroyed provided that &lt;b&gt;effective consent&lt;/b&gt; is obtained afresh within two years. However as stated earlier the Act does not define effective consent.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Recommendation: The term &lt;b&gt;effective consent &lt;/b&gt;needs to be defined in order to bring clarity to this provision.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;PART C&lt;/h2&gt;
&lt;h2 style="text-align: justify; "&gt;FURTHER LIMITATIONS ON DATA CONTROLLERS&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Section 16: &lt;/b&gt;This section deals with the security of personal data and duty of confidentiality.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Comment:&lt;/b&gt; Section 16(2) states “ Any person who collects, receives, stores, processes or otherwise handles any personal data shall be subject to a duty of confidentiality and secrecy in respect of it.” Similarly Section 16(3) states “data controllers and data processors shall be subject to a duty of confidentiality and secrecy in respect of personal data in their possession or control. However apart from the duty of confidentiality and secrecy the data collectors and processors could also have a duty to maintain the security of the data.” Though it is important for confidentiality and secrecy to be maintained, ensuring security requires adequate and effective technical controls to be in place.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Recommendation:&lt;/b&gt; This section could also emphasise on the duty of the data controllers to ensure the security of the data. The breach notification could include details about data that is impacted by a breach or attach as well as the technical details of the infrastructure compromised.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Section 17:&lt;/b&gt; This section details the conditions for the transfer of personal data outside the territory of India.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Comment:&lt;/b&gt; Section 17 allows a transfer of personal data outside the territory of India in 3 situations- If the Central Government issues a notification deciding that the country/international organization in question can ensure an adequate level of protection, compatible with privacy principles contained in this Act; if the transfer is pursuant to an agreement which binds the recipient of the data to similar or stronger conditions in relation to handling the data; or if there are appropriate legal instruments and safeguards in place, to the satisfaction of the data controller. However, there is no clarification for what would constitute ‘adequate’ or ‘appropriate’ protection, and it does not account for situations in which the Government has not yet notified a country/organisation as ensuring adequate protection. In comparison, the GDPR, in Chapter V&lt;a href="#_ftn5"&gt;&lt;sup&gt;&lt;sup&gt;[5]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;, contains factors that must be considered when determining adequacy of protection, including relevant legislation and data protection rules, the existence of independent supervisory authorities, and international commitments or obligations of the country/organization. Additionally, the GDPR allows data transfer even in the absence of the determination of such protection in certain instances, including the use of standard data protection clauses, that have been adopted or approved by the Commission; legally binding instruments between public authorities; approved code of conduct, etc. Additionally, it allows derogations from these measures in certain situations: when the data subject expressly agrees, despite being informed of the risks; or if the transfer is necessary for conclusion of contract between data subject and controller, or controller and third party in the interest of data subject; or if the transfer is necessary for reasons of public interest, etc. 
No such circumstances are accounted for in Section 17.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Recommendation: &lt;/b&gt;Additionally, data controllers and processors could be provided with a period to allow them to align their policies towards the new legislation. Making these provisions operational as soon as the Act is commenced might put the controllers or processors guilty of involuntary breaching the provisions of the Act.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Section 19: &lt;/b&gt;This section&lt;b&gt; &lt;/b&gt;states the special provisions for sensitive personal data.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Comment:&lt;/b&gt; Section 19(2) states that in addition to the requirements set out under sub-clause (1), the Privacy Commission shall set out additional protections in respect of:i.sensitive personal data relating to data subjects who are minors; ii.biometric and deoxyribonucleic acid data; and iii.financial and credit data.This however creates additional categories of sensitive data apart from the ones that have already been created.&lt;a href="#_ftn6"&gt;&lt;sup&gt;&lt;sup&gt;[6]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; These additional categories can result in confusion and errors.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Recommendation: &lt;/b&gt;Sensitive data must not be further categorised as this can lead to confusion and errors. Hence all sensitive data could be subject to the same level of protection.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Section 20:&lt;/b&gt; This section states the special provisions for data impact assessment.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Comment:&lt;/b&gt; This section states that all data impact assessment reports will be submitted periodically to the State Privacy commission. This section does not make provisions for instances of circumstances in which such records may be made public. Additionally the data impact assessment could also include a human rights impact assessment.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Recommendation:&lt;/b&gt; The section could also have provisions for making the records of the impact assessment or relevant parts of the assessment public. This will ensure that the data controllers / processors are subjected to a standard of accountability and transparency. Additionally as privacy is linked to human rights the data impact assessment could also include a human rights impact assessment. The Act could further clarify the process for submission to State Privacy Commissions and potential access by the Central Privacy Commission to provide clarity in process.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Section 20 requires controllers who use new technology to assess the risks to the data protection rights that occur from processing. ‘New technology’ is defined to include pre-existing technology that is used anew. Additionally, the reports are required to be sent to the State Privacy Commission periodically. However, there is no clarification on the situations in which such an assessment becomes necessary, or whether all technology must undergo such an assessment before their use. Additionally, the differentiation between different data processing activities based on whether the data processing is incidental or a part of the functioning needs to be clarified. This differentiation is necessary as there are some data processors and controllers who need the data to function; for instance an ecommerce site would require your name and address to deliver the goods, although these sites do not process the data to make decisions. This can be compared to a credit rating agency that is using the data to make decisions as to who will be given a loan based on their creditworthiness. Example can taken from the GDPR, which in Article 35, specifies instances in which a data impact assessment is necessary: where a new technology, that is likely to result in a high risk to the rights of persons, is used; where personal aspects related to natural persons are processed automatically, including profiling; where processing of special categories of data (including data revealing ethnic/racial origin, sexual orientation etc), biometric/genetic data; where data relating to criminal convictions is processed; and with data concerning the monitoring of publicly accessible areas. Additionally, there is no requirement to publish the report, or send it to the supervising authority, but the controller is required to review the processor’s operations to ensure its compliance with the assessment report.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Recommendation:&lt;/b&gt; The reports could be sent to a central authority, which according to this Act is the Privacy Commission, along with the State Privacy Commission. Additionally there needs to be a differentiation between the incidental and express use of data. The data processors must be given at least a period of one year after the commencement of the Act to present their impact assessment report. This period is required for the processors to align themselves with the provisions of the Act as well as conduct capacity building initiatives.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;PART C&lt;/h2&gt;
&lt;h2 style="text-align: justify; "&gt;RIGHTS OF A DATA SUBJECT&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Section 21: &lt;/b&gt;This section explains the right of the data subject with regard to accessing her data. It states that the data subject has the right to obtain from the data controller information as to whether any personal data concerning her is collected or processed. The data controller also has to not only provide access to such information but also the personal data that has been collected or processed.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Comment:&lt;/b&gt; This section does not provide the data subject the right to seek information about security breaches.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Recommendation: &lt;/b&gt;This section could state that the data subject has the right to seek information about any security breaches that might have compromised her data (through theft, loss, leaks etc.). This could also include steps taken by the data controller to address the immediate breach as well as steps to minimise the occurrence of such breaches in the future.&lt;a href="#_ftn7"&gt;&lt;sup&gt;&lt;sup&gt;[7]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;CHAPTER IV&lt;/h2&gt;
&lt;h2 style="text-align: justify; "&gt;INTERCEPTION AND SURVEILLANCE&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Section 28: &lt;/b&gt;This section lists out the special provisions for competent organizations.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Comment:&lt;/b&gt; Section 28(1) states ”all provisions of Chapter III shall apply to personal data collected, processed, stored, transferred or disclosed by competent organizations unless when done as per the provisions under this chapter ”.This does not make provisions for other categories of data such as sensitive data.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Recommendation:&lt;/b&gt; This section needs to include not just personal data but also sensitive data, in order to ensure that all types of data are protected under this Act.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Section 30:&lt;/b&gt; This section states the provisions for prior authorisation by the appropriate Surveillance and Interception Review Tribunal.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Comment:&lt;/b&gt; Section 30(5) states “any interception involving the infringement of the privacy of individuals who are not the subject of the intended interception, or where communications relate to &lt;b&gt;medical, journalistic, parliamentary or legally privileged material&lt;/b&gt; may be involved, shall satisfy additional conditions including the provision of specific prior justification in writing to the Office for Surveillance Reform of the Privacy Commission as to the necessity for the interception and the safeguards providing for minimizing the material intercepted to the greatest extent possible and the destruction of all such material that is not strictly necessary to the purpose of the interception.” This section needs to state why these categories of communication are more sensitive than others. Additionally, interceptions typically target people and not topics of communication - thus medical may be part of a conversation between two construction workers and a doctor will communicate about finances.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Recommendation:&lt;/b&gt; The section could instead of singling out “medical, journalistic, parliamentary or legally privileged material” state that “any interception involving the infringement of the privacy of individuals who are not the subject of the intended interception may be involved, shall satisfy additional conditions including the provision of specific prior justification in writing to the Office for Surveillance Reform of the Privacy Commission.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Section 37&lt;/b&gt;: This section details the bar against surveillance.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Comment: &lt;/b&gt;Section 37(1) states that “no person shall order or carry out, or cause or assist the ordering or carrying out of, any surveillance of another person”. The section also prohibits indiscriminate monitoring, or mass surveillance, unless it is necessary and proportionate to the stated purpose. However, it is unclear whether this prohibits surveillance by a resident of their own residential property, which is allowed in Section 5, as the same could also fall within ‘indiscriminate monitoring/mass surveillance’. For instance, in the case of a camera installed in a residential property, which is outward facing, and therefore captures footage of the road/public space.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Recommendation:&lt;/b&gt; The Act needs to bring more clarity with regard to surveillance especially with respect to CCTV cameras that are installed in private places, but record public spaces such as public roads. The Act could have provisions that clearly define the use of CCTV cameras in order to ensure that cameras installed in private spaces are not used for carrying out mass surveillance. Further, the Act could address the use of emerging techniques and technology such as facial recognition technologies, that often rely on publicly available data.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;CHAPTER V&lt;/h2&gt;
&lt;h2 style="text-align: justify; "&gt;THE PRIVACY COMMISSION&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Section 53:&lt;/b&gt; This section details the powers and functions of the Privacy Commission.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Comment:&lt;/b&gt; Section 53(2)(xiv) states that the Privacy Commission shall publish periodic reports “providing description of performance, findings, conclusions or recommendations of any or all of the functions assigned to the Privacy Commission”. However this Section does not make provisions for such reporting to happen annually and to make them publicly available, as well as contain details including financial aspects of matters contained within the Act.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Recommendation: &lt;/b&gt;The functions could include a duty to disclose the information regarding the functioning and financial aspects of matters contained within the Act. Categories that could be included in such reports include: the number of data controllers, number of data processors, number of breaches detected and mitigated etc.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;CHAPTER IX&lt;/h2&gt;
&lt;h2 style="text-align: justify; "&gt;OFFENCES AND PENALTIES&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt; Sections 73 to 80:&lt;/b&gt; These sections lay out the different punishments for controlling and processing data in contravention to the provisions of this Act.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Comment:&lt;/b&gt; These sections, while laying out different punishments for controlling and processing data in contravention to the provisions of this Act, mets out a fine extending upto Rs. 10 crore. This is problematic as it does not base these penalties on the finer aspects of proportionality, such as  offences that are not as serious as the others.&lt;br /&gt; &lt;br /&gt; &lt;b&gt;Recommendation:&lt;/b&gt; There could be a graded approach to the penalties based on the degree of severity of the offence.This could be in the form of name and shame, warnings and penalties that can be graded based on the degree of the offence. &lt;br /&gt; ----------------------------------------------------------------------&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Additional thoughts: As India moves to a digital future there is a need for laws to be in place to ensure that individual's rights are not violated. By riding on the push to digitization, and emerging technologies such as AI, a strong all encompassing privacy legislation can allow India to leapfrog and use these emerging technologies for the benefit of the citizens without violating their privacy. A robust legislation can also ensure a level playing field for data driven enterprises within a framework of openness, fairness, accountability and transparency.&lt;/p&gt;
&lt;hr style="text-align: justify; " /&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref1"&gt;&lt;sup&gt;&lt;sup&gt;[1]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; These seven principles include: Right to Access, Right to Rectification, Right to Erasure And Destruction of Personal Data,Right to Restriction Of Processing, Right to Object, Right to Portability of Personal Data,Right to Seek Exemption from Automated Decision-Making.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref2"&gt;&lt;sup&gt;&lt;sup&gt;[2]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;The Privacy (Protection) Bill 2013: A Citizen’s Draft, Bhairav Acharya, Centre for Internet &amp;amp; Society, https://cis-india.org/internet-governance/blog/privacy-protection-bill-2013-citizens-draft&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref3"&gt;&lt;sup&gt;&lt;sup&gt;[3]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;General Data Protection Regulation, available at https://gdpr-info.eu/art-4-gdpr/.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref4"&gt;&lt;sup&gt;&lt;sup&gt;[4]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Antonio Vetro, Open Data Quality Measurement Framework: Definition and Application to Open Government Data, available at https://www.sciencedirect.com/science/article/pii/S0740624X16300132&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref5"&gt;&lt;sup&gt;&lt;sup&gt;[5]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; General Data Protection Regulation, available at https://gdpr-info.eu/chapter-5/.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref6"&gt;&lt;sup&gt;&lt;sup&gt;[6]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Sensitive personal data under Section 2(bb) includes, biometric data; deoxyribonucleic acid data;&lt;br /&gt; sexual preferences and practices;medical history and health information;political affiliation;&lt;br /&gt; membership of a political, cultural, social organisations including but not limited to a trade union as defined under Section 2(h) of the Trade Union Act, 1926;ethnicity, religion, race or caste; and&lt;br /&gt; financial and credit information, including financial history and transactions.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref7"&gt;&lt;sup&gt;&lt;sup&gt;[7]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Submission to the Committee of Experts on a Data Protection Framework for India, Amber Sinha, Centre for Internet &amp;amp; Society, available at https://cis-india.org/internet-governance/files/data-protection-submission&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/the-centre-for-internet-and-society2019s-comments-and-recommendations-to-the-indian-privacy-code-2018'&gt;https://cis-india.org/internet-governance/blog/the-centre-for-internet-and-society2019s-comments-and-recommendations-to-the-indian-privacy-code-2018&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Shweta Mohandas, Elonnai Hickok, Amber Sinha and Shruti Trikanand</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Aadhaar</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2018-07-20T13:55:46Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/files/indian-privacy-code">
    <title>Indian Privacy Code</title>
    <link>https://cis-india.org/internet-governance/files/indian-privacy-code</link>
    <description>
        
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/files/indian-privacy-code'&gt;https://cis-india.org/internet-governance/files/indian-privacy-code&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Admin</dc:creator>
    <dc:rights></dc:rights>


   <dc:date>2018-07-20T13:54:35Z</dc:date>
   <dc:type>File</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/raw/indian-express-july-15-2018-nishant-shah-digital-native-the-citys-watching">
    <title>Digital Native: How smart cities can make criminals out of denizens</title>
    <link>https://cis-india.org/raw/indian-express-july-15-2018-nishant-shah-digital-native-the-citys-watching</link>
    <description>
        &lt;b&gt;People download information and share it without knowing about the intellectual property rights. On social media bullying, harassment and hate speech find easy avenues.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article was published in &lt;a class="external-link" href="https://indianexpress.com/article/express-sunday-eye/digital-native-the-citys-watching-5258165/"&gt;Indian Express&lt;/a&gt; on July 15, 2018.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;I first heard about smart cities in 2003. Sitting in India, it seemed to  be a very strange concept being developed in the Netherlands, where the  planners were trying to arm an entire city with smartness. The idea was  that if we deploy enough cameras, devices that see, machines that hear,  and data connectivity that envelopes the city in a seamless cloud, it  might lead to more order, discipline, and control. To me that felt like a  strange experiment because under all of those different imaginations of  the city as a neat, organised, controlled environment, were assumptions  that were alien to my Indian sensibilities.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;It was strange to look at all the promises that “smartness” would deliver — it would make human life easier. It would increase safety and create order out of chaos. It would build new lifestyles that are filled with assistive technologies. In all of these, was the imagination of the city as a laboratory — controlled and efficient, as opposed to riotous and serendipitous. The cities were positioned as filled with intention, so that the interruptions of people, animals, festivals, traffic and crowds would be removed through the deployment of these digital devices and networks. What needed to be preserved was the city and its infrastructure, rather than the individuals and communities that make the city alive and exciting. We wanted our infrastructure to be smart, taking decisions on our behalf, and shaping our lives through the algorithmic protocols that they were coded to embody.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In that faraway time, these had felt like idle speculations. Fifteen  years on, I have now come to realise that the biggest motivation for  building smart cities was not really facilitating human movement,  habitation and habits. Indeed, at the heart of the smart city project  was the setting up of a massive surveillance apparatus that would  clinically diagnose the unwanted people and processes in the city, and  surgically remove them — with the assistance of predictive technologies  that would be implemented in policing and planning these city spaces.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Smart cities were not constructed to make people’s lives easier. They  were constructed because, increasingly, all the people in a city are  imagined as “users”, who need to be instructed through terms of  services, how they must behave and live in these city spaces. One of the  biggest cultural turns in the massification of the digital web was that  almost all users were imagined as potential criminals by the very  virtue of them being connected. Internet service providers and  regulators knew that if people are connected, they will be violating the  law at some point or another, sometimes unknowingly. People download  information and share it without knowing about the intellectual property  rights. On social media bullying, harassment and hate speech find easy  avenues. The largest traffic on the internet is for pornographic and  often banned material which finds its audiences on the connected web.  Spammers, viruses, hijacked machines, and, often, searches for  unexpected items lead people onto the dark web where the questionable  human interactions happen frequently.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The introduction of the digital terms of services was essentially to  presume that the user was a potential criminal who leases hardware and  software, and, platforms from proprietary companies and governments  could then control and discipline the user through comprehensive  surveillance practices. Construction of smart cities performs a similar  function in the physical space. Instead of thinking about citizens as  co-owners who shape city spaces, smart cities establish a service level  agreement with its occupants, and reduces them to users. Any deviation  results in punitive action or devaluation, often curbing the movement,  and the rights of belonging to the city spaces.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;While it is true that smart technologies can facilitate certain  aspects of human life, they depend on unfettered data collection,  predictive profiling, correlative algorithms and conditions of extreme  invasion and control — which are all predicated on the idea that you  will falter. And when you do, the technologies will be there to witness,  record, archive, and punish you for the daily transgressions till you  are wiped into becoming a predictable, controlled, cleaned up drone that  travels in docility across the networked edges of the city. We will be  assimilated. Resistance will be futile.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/raw/indian-express-july-15-2018-nishant-shah-digital-native-the-citys-watching'&gt;https://cis-india.org/raw/indian-express-july-15-2018-nishant-shah-digital-native-the-citys-watching&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>nishant</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Researchers at Work</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Digital Natives</dc:subject>
    

   <dc:date>2018-08-01T00:19:23Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/about/newsletters/june-2018-newsletter">
    <title>June 2018 Newsletter</title>
    <link>https://cis-india.org/about/newsletters/june-2018-newsletter</link>
    <description>
        &lt;b&gt;CIS newsletter for the month of June 2018.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;&lt;span&gt;Dear readers,&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Previous issues of the newsletters can be &lt;a class="external-link" href="http://cis-india.org/about/newsletters"&gt;accessed here&lt;/a&gt;.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Highlights&lt;/h2&gt;
&lt;ul&gt;
&lt;li style="text-align: justify; "&gt;Balbharati – the Maharashtra state bureau of textbook production and  curriculum research – has issued a copyright policy that forces all  publishers, digital educational-content creators, and coaching classes  to obtain expensive licenses for developing material directly or  indirectly relating to Balbharati’s content. This is an alarming development for Indian students reported Anubha Sinha &lt;a class="external-link" href="https://cis-india.org/a2k/blogs/asia-times-june-20-anubha-sinha-maharastras-copyright-policy-makes-education-unaffordable"&gt;in an article in the Asian Times&lt;/a&gt; on June 20, 2018.&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;CIS-A2K has &lt;a class="external-link" href="https://meta.wikimedia.org/wiki/Marathi_Publishers%27_orientation_session_on_FOSS,Open_knowledge_%26_Wikimedia_Projects"&gt;started dialogue with the publishers for the last 6 months  regarding FOSS, Open knowledge and content donation to Wikimedia  Projects&lt;/a&gt;. As a result Akhil Bharatiya Marathi Prakashak Sangh, an apex body of publishers at all India level invited us for a orientation session at their annual gathering in Pune.&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;Submitted &lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/comments-on-the-draft-digital-communications-policy"&gt;comments on the Draft Digital Communications Policy&lt;/a&gt; which was released to the public by the Department of Telecommunications of the Ministry of Communications on 1st May 2018 for comments and views. &lt;/li&gt;
&lt;li style="text-align: justify; "&gt;Submitted &lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/comments-on-the-telecom-commercial-communications-customer-preference-regulations"&gt;comments on the Telecom Commercial Communications Customer Preference Regulations&lt;/a&gt; which was released to the public by the Telecom Regulatory Authority of India (TRAI) on 29th May 2018 for comments and views. &lt;/li&gt;
&lt;li style="text-align: justify; "&gt;The Task Force on Artificial Intelligence was established by the Ministry of Commerce and Industry to leverage AI for economic benefits, and provide policy recommendations on the deployment of AI for India. Elonnai Hickok, Shweta Mohandas and Swaraj Paul Barooah &lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/the-ai-task-force-report-the-first-steps-towards-indias-ai-framework"&gt;wrote a blog entry on the artificial intelligence task force&lt;/a&gt;. The blog entry was edited by Swagam Dasgupta. &lt;/li&gt;
&lt;li style="text-align: justify; "&gt;The world’s oldest networked infrastructure, money, is increasingly dematerialising and fusing with the world’s latest networked infrastructure, the Internet, wrote Sunil Abraham in an article published in the &lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/economic-times-june-10-2018-sunil-abraham-why-npci-and-facebook-need-urgent-regulatory-attention"&gt;Economic Times&lt;/a&gt; on June 10, 2018.&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;An essay by P.P. Sneha &lt;a class="external-link" href="https://cis-india.org/raw/new-contexts-and-sites-of-humanities-practice-in-the-digital-paper"&gt;was published in Summer Hill, a journal published by Indian Institute of Advanced Study&lt;/a&gt;. In the essay, edited by Dr. Bindu Menon, Sneha draws upon excerpts from a study on the field of digital humanities and related practices in India, to outline the diverse contexts of humanities practice with the advent of the digital and explore the developing discourse around digital humanities in the Indian context. &lt;/li&gt;
&lt;li style="text-align: justify; "&gt;The &lt;a class="external-link" href="https://cis-india.org/raw/dhai-inagural-conference-2018-puthiya-purayil-sneha-keynote"&gt;inaugural conference of the Digital Humanities Alliance of India &lt;/a&gt;(DHAI) was held at the Indian Institute of Management (IIM), Indore on June 1-2, 2018. P.P. Sneha was a keynote speaker at the event. Her talk was titled ‘New Contexts and Sites of Humanities Practice in the Digital’.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;Articles&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/economic-times-june-10-2018-sunil-abraham-why-npci-and-facebook-need-urgent-regulatory-attention"&gt;Why NPCI and Facebook need urgent regulatory attention&lt;/a&gt; (Sunil Abraham; Economic Times; June 10, 2018).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/raw/indian-express-nishant-shah-june-17-2018-digital-native-cause-an-effect"&gt;Digital Native: Cause an Effect&lt;/a&gt; (Nishant Shah; Indian Express; June 17, 2018).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/a2k/blogs/asia-times-june-20-anubha-sinha-maharastras-copyright-policy-makes-education-unaffordable"&gt;Maharashtra's Copyright Policy Makes Education Unaffordable&lt;/a&gt; (Anubha Sinha; Asia Times; June 20, 2018).&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;CIS in the News&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/news/times-of-india-june-1-2018-allow-admins-to-add-users-to-online-group-chats-only-after-permission-sflc-in"&gt;Allow admins to add users to online group chats only after permission: SFLC.in&lt;/a&gt; (Times of India; June 1, 2018).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/news/economic-times-june-6-2018-akshatha-m-ec-disables-easy-access-to-electoral-data-across-states"&gt;EC disables easy access to electoral data across states&lt;/a&gt; (Akshatha M; Economic Times; June 5, 2018).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/hindustan-times-june-8-2018-vidhi-choudhary-draft-bill-proposes-rs-1-crore-fine-3-year-jail-for-data-privacy-violation"&gt;Draft bill proposes Rs 1 crore fine, 3 year jail for data privacy violation&lt;/a&gt; (Vidhi Choudhury; Hindustan Times; June 8, 2018).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/news/bloomberg-quint-june-9-2018-draft-bill-seeks-to-revolutionise-data-collection-storage-in-india"&gt;Citizens’ Draft Privacy Bill Seeks To Revolutionise Data Collection, Storage In India&lt;/a&gt; (Arpan Chaturvedi; Bloomberg Quint; June 9, 2018).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/news/the-times-of-india-nilesh-christopher-and-naveen-menezes-june-14-2018-police-to-counter-fake-news-on-whatsapp"&gt;Police to counter fake news on WhatsApp&lt;/a&gt; (Nilesh Christopher and Naveen Menezes; Times of India; June 14, 2018).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/news/times-of-india-june-18-2018-full-belief-in-fake-texts-shows-cops-not-trusted"&gt;'Full belief in fake texts shows cops not trusted'&lt;/a&gt; (Times of India; June 18, 2018).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/news/deccan-herald-june-19-2018-anushka-finds-support-her-anti-litter-tirade"&gt;Anushka finds support for her anti-litter tirade&lt;/a&gt; (Nina C. George; Deccan Herald; June 19, 2018).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/news/economic-times-june-19-2018-jindal-varsitys-international-affairs-students-shine-in-job-market"&gt;Jindal varsity's international affairs students shine in job market&lt;/a&gt; (Economic Times; June 19, 2018).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/news/india-legal-live-june-21-2018-data-privacy"&gt;Data Privacy: Footprints on the Web&lt;/a&gt; (Sujit Bhar; IndiaLegal; June 21, 2018).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="http://https//cis-india.org/internet-governance/news/death-by-whatsapp"&gt;Death By WhatsApp&lt;/a&gt; (News18.com, June 25, 2018).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/news/your-story-june-29-2018-tech-transformation-agriculture-redefined-digital-innovation-startups"&gt;Tech transformation: how agriculture is being redefined through digital innovation and startups&lt;/a&gt; (Your Story; June 29, 2018).&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;&lt;a href="http://cis-india.org/a2k"&gt;Access to Knowledge&lt;/a&gt;&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;Access to Knowledge (A2K) is a campaign to promote the fundamental  principles of justice, freedom, and economic development. It deals with  issues like copyrights, patents and trademarks, which are an important  part of the digital landscape. Our A2K program comprises 2 projects:  Pervasive Technologies done under a grant from International Development  Research Centre examining interplay between cost-effective pervasive  technologies and intellectual property and encouraging development of  such technologies for social good, and Wikipedia under a grant from  Wikimedia Foundation to enable the growth of Indic language communities  and cultivate new editors in different Indian languages.&lt;/p&gt;
&lt;h3&gt;Wikipedia&lt;/h3&gt;
&lt;p&gt;&lt;b&gt;Event Organized&lt;/b&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="external-link" href="https://meta.wikimedia.org/wiki/Marathi_Publishers%27_orientation_session_on_FOSS,Open_knowledge_%26_Wikimedia_Projects"&gt;Marathi Publishers' orientation session on FOSS,Open knowledge &amp;amp; Wikimedia Projects&lt;/a&gt; (Co-organized by Akhil Bharatiya Marathi Prakashak Sangh and CIS-A2K; Maratha Chamber of Commerce, Tilak Road, Pune; June 17, 2018).&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;&lt;a href="http://cis-india.org/internet-governance"&gt;Internet Governance&lt;/a&gt;&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;The Tunis Agenda of the second World Summit on the Information Society  has defined internet governance as the development and application by  governments, the private sector and civil society, in their respective  roles of shared principles, norms, rules, decision making procedures and  programs that shape the evolution and use of the internet. CIS is  engaged in two different projects. The  first one (under a grant from Privacy International and IDRC) is on  surveillance and freedom of expression (SAFEGUARDS). The second one  (under a grant from MacArthur Foundation) is on restrictions that the  Indian government has placed on freedom of expression online.&lt;/p&gt;
&lt;h3&gt;Privacy&lt;/h3&gt;
&lt;p&gt;&lt;b&gt;Blog Entries&lt;/b&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/niti-aayog-discussion-paper-an-aspirational-step-towards-india2019s-ai-policy"&gt;NITI Aayog Discussion Paper: An aspirational step towards India’s AI policy&lt;/a&gt; (Sunil Abraham, Elonnai Hickok, Amber Sinha, Swaraj Barooah, Shweta Mohandas, Pranav M Bidare, Swagam Dasgupta, Vishnu Ramachandran and Senthil Kumar; June 13, 2018).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="http://cis-india.org/internet-governance/files/ai-task-force-report.pdf"&gt;The AI Task Force Report - The first steps towards India’s AI framework&lt;/a&gt; (Authored by Elonnai Hickok, Shweta Mohandas and Swaraj Paul Barooah and Edited by Swagam Dasgupta; June 27, 2018).&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;b&gt;Submissions&lt;/b&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/comments-on-the-draft-national-policy-on-official-statistics"&gt;Comments on the Draft National Policy on Official Statistics&lt;/a&gt; (Gurshabad Grover and Sandeep Kumar; June 7, 2018).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/comments-on-the-draft-digital-communications-policy"&gt;Comments on the Draft Digital Communications Policy&lt;/a&gt; (Anubha Sinha, Gurshabad Grover and Swaraj Barooah; June 14, 2018).&lt;/li&gt;
&lt;/ul&gt;
&lt;div&gt;&lt;b&gt;Participation in Event&lt;/b&gt;&lt;/div&gt;
&lt;div&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/news/is-privacy-obsolete"&gt;Is Privacy Obsolete?&lt;/a&gt; (Organized by TERI; Bangalore; June 22, 2018). Pranesh Prakash was a panelist.&lt;/li&gt;
&lt;/ul&gt;
&lt;/div&gt;
&lt;ul&gt;
&lt;/ul&gt;
&lt;h3&gt;Free Speech &amp;amp; Expression&lt;/h3&gt;
&lt;p&gt;Blog Entry&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/network-disruptions-report-by-global-network-initiative"&gt;Network Disruptions Report by Global Network Initiative&lt;/a&gt; (Akriti Bopanna; June 12, 2018).&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;&lt;span style="text-align: justify; "&gt;&lt;a class="external-link" href="http://cis-india.org/telecom"&gt;Telecom&lt;/a&gt;&lt;/span&gt;&lt;/h2&gt;
&lt;p&gt;&lt;span style="text-align: justify; "&gt;CIS is involved in promoting access and accessibility to telecommunications services and resources, and has provided inputs to ongoing policy discussions and consultation papers published by TRAI. It has prepared reports on unlicensed spectrum and accessibility of mobile phones for persons with disabilities and also works with the USOF to include funding projects for persons with disabilities in its mandate:&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;b&gt;&lt;span style="text-align: justify; "&gt;Submission&lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;span style="text-align: justify; "&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/comments-on-the-telecom-commercial-communications-customer-preference-regulations"&gt;Comments on the Telecom Commercial Communications Customer Preference Regulations&lt;/a&gt; (Sandeep Kumar, Torsha Sarkar, Swaraj Barooah, and Gurshabad Grover; June 22, 2018).&lt;br /&gt;&lt;br /&gt;&lt;/span&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;&lt;span style="text-align: justify; "&gt;&lt;span style="text-align: justify; "&gt;&lt;a href="http://cis-india.org/raw"&gt;Researchers at Work&lt;/a&gt;&lt;/span&gt;&lt;/span&gt;&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;The Researchers at Work (RAW) programme is an interdisciplinary research initiative driven by an emerging need to understand the reconfigurations of social practices and structures through the Internet and digital media technologies, and vice versa. It aims to produce local and contextual accounts of interactions, negotiations, and resolutions between the Internet, and socio-material and geo-political processes:&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Participation in Event&lt;br /&gt;&lt;/b&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/raw/dhai-inagural-conference-2018-puthiya-purayil-sneha-keynote"&gt;Digital Humanities Alliance of India - Inagural Conference 2018&lt;/a&gt; (Co-organized by IIM and IIT, Indore with support from CIS; IIM, Indore; June 1 - 2, 2018). P.P. Sneha was a speaker and gave the keynote address.&lt;/li&gt;
&lt;/ul&gt;
&lt;div&gt;&lt;/div&gt;
&lt;div&gt;&lt;b&gt;Essay&lt;/b&gt;&lt;/div&gt;
&lt;div&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/raw/new-contexts-and-sites-of-humanities-practice-in-the-digital-paper"&gt;New Contexts and Sites of Humanities Practice in the Digital&lt;/a&gt; (Paper) (P.P. Sneha; June 25, 2018).&lt;/li&gt;
&lt;/ul&gt;
&lt;/div&gt;
&lt;div&gt;&lt;/div&gt;
&lt;hr /&gt;
&lt;h2&gt;&lt;span style="text-align: justify; "&gt;&lt;span style="text-align: justify; "&gt;&lt;span style="text-align: justify; "&gt;&lt;a href="http://cis-india.org/"&gt;About CIS&lt;/a&gt;&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;The Centre for Internet and  Society (CIS) is a non-profit organisation that undertakes  interdisciplinary research on internet and digital technologies from  policy and academic perspectives. The areas of focus include digital  accessibility for persons with disabilities, access to knowledge,  intellectual property rights, openness (including open data, free and  open source software, open standards, open access, open educational  resources, and open video), internet governance, telecommunication  reform, digital privacy, and cyber-security. The academic research at  CIS seeks to understand the reconfigurations of social and cultural  processes and structures as mediated through the internet and digital  media technologies.&lt;/p&gt;
&lt;p&gt;► Follow us elsewhere&lt;/p&gt;
&lt;div&gt;
&lt;ul&gt;
&lt;li&gt;Twitter:&lt;a href="http://twitter.com/cis_india"&gt; http://twitter.com/cis_india&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;Twitter - Access to Knowledge: &lt;a href="https://twitter.com/CISA2K"&gt;https://twitter.com/CISA2K&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;Twitter - Information Policy: &lt;a href="https://twitter.com/CIS_InfoPolicy"&gt;https://twitter.com/CIS_InfoPolicy&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;Facebook - Access to Knowledge:&lt;a href="https://www.facebook.com/cisa2k"&gt; https://www.facebook.com/cisa2k&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;E-Mail - Access to Knowledge: &lt;a&gt;a2k@cis-india.org&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;E-Mail - Researchers at Work: &lt;a&gt;raw@cis-india.org&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;List - Researchers at Work: &lt;a href="https://lists.ghserv.net/mailman/listinfo/researchers"&gt;https://lists.ghserv.net/mailman/listinfo/researchers&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;/div&gt;
&lt;p&gt;► Support Us&lt;/p&gt;
&lt;div&gt;Please help us defend consumer and citizen rights on the Internet! Write a cheque in favour of 'The Centre for Internet and Society' and mail it to us at No. 194, 2nd 'C' Cross, Domlur, 2nd Stage, Bengaluru - 560071.&lt;/div&gt;
&lt;p&gt;► Request for Collaboration&lt;/p&gt;
&lt;div&gt;
&lt;p style="text-align: justify; "&gt;We invite researchers, practitioners, artists, and theoreticians,  both organisationally and as individuals, to engage with us on topics  related internet and society, and improve our collective understanding  of this field. To discuss such possibilities, please write to Sunil  Abraham, Executive Director, at sunil@cis-india.org (for policy research), or Sumandro Chattapadhyay, Research Director, at sumandro@cis-india.org (for  academic research), with an indication of the form and the content of  the collaboration you might be interested in. To discuss collaborations  on Indic language Wikipedia projects, write to Tanveer Hasan, Programme  Officer, at &lt;a&gt;tanveer@cis-india.org&lt;/a&gt;.&lt;/p&gt;
&lt;div style="text-align: justify; "&gt;&lt;i&gt;CIS is grateful to its primary donor the Kusuma Trust founded  by Anurag Dikshit and Soma Pujari, philanthropists of Indian origin for  its core funding and support for most of its projects. CIS is also  grateful to its other donors, Wikimedia Foundation, Ford Foundation,  Privacy International, UK, Hans Foundation, MacArthur Foundation, and  IDRC for funding its various projects&lt;/i&gt;.&lt;/div&gt;
&lt;/div&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/about/newsletters/june-2018-newsletter'&gt;https://cis-india.org/about/newsletters/june-2018-newsletter&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>praskrishna</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Telecom</dc:subject>
    
    
        <dc:subject>Researchers at Work</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Access to Knowledge</dc:subject>
    

   <dc:date>2018-08-11T02:52:10Z</dc:date>
   <dc:type>Page</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/hindustan-times-june-8-2018-vidhi-choudhary-draft-bill-proposes-rs-1-crore-fine-3-year-jail-for-data-privacy-violation">
    <title>Draft bill proposes Rs 1 crore fine, 3 year jail for data privacy violation</title>
    <link>https://cis-india.org/internet-governance/hindustan-times-june-8-2018-vidhi-choudhary-draft-bill-proposes-rs-1-crore-fine-3-year-jail-for-data-privacy-violation</link>
    <description>
        &lt;b&gt;The move comes at a time when user data of Indians is under threat from social media firms accused of data mining and sharing information with private companies for advertising and marketing purposes. There has also been a growing concern over Aadhaar.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article by Vidhi Choudhury was published in the &lt;a class="external-link" href="https://www.hindustantimes.com/india-news/draft-bill-proposes-rs-1-crore-fine-3-year-jail-for-data-privacy-violation/story-Cbxt5LxKhINJiDdtipZlGI.html"&gt;Hindustan Times&lt;/a&gt; on June 8, 2018. Sunil Abraham was quoted.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;Even as a 10-member government panel is due to submit its recommendations for a new data privacy bill, a group of lawyers on Friday uploaded a model citizens’ code, which they said could give the panel pointers to what India’s final privacy law should be.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Internet Freedom Foundation (IFF) launched its community project, ‘Save our Privacy’, in what it described as a bid to safeguard individuals’ right to privacy. This model code, titled ‘Indian Privacy Code, 2018’, has been drafted by lawyers such Gautam Bhatia, Apar Gupta and Raman Jit Singh Chima, among others.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Many of these lawyers made a joint submission to the Justice BN Srikrishna Committee in the past. On Friday, they sent him an email with the copy of the code with its seven core principles. The core principles follow what IFF calls a “rights-based approach to protect people from harmful use of their personal data”.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;“In a world where personal data has power, people need to be put in charge of their own lives,” said New Delhi-based lawyer Apar Gupta.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The draft bill sets a penalty of up to Rs 1 crore for the violation of privacy of citizens and a prison sentence of up to three years. It also provides for a penalty of up to Rs 10 crore to anyone found to be performing surveillance unlawfully, with a prison term of up to five years.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The move comes at a time when user data of Indians is under potential threat from social media companies that have been accused of data mining and sharing user information with private firms for advertising and marketing purposes.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;There has also been a growing concern in India over the validity of the Aadhaar law. A Constitution bench of the Supreme Court has finished hearing a slew of petitions against the unique identity number and has reserved its judgment.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;On 31 July, the government constituted the panel headed by Justice Srikrishna to study various issues relating to data protection and suggest a draft data protection bill.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;IFF said in a statement that it had concerns over the “composition, lack of diversity and transparency” of the committee. It also said it was concerned about the lack of urgency India had shown about making a privacy law, and that its civil society project was important to build awareness on privacy and data protection in India.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;“The Indian Privacy Code, 2018 ensures that right to privacy does not undermine the Right to Information Act. All the other existing laws including the Telegraph Act and the Aadhaar Act should be subject to this law,” Chima said.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;“We hope the Justice BN Srikrishna Committee considers and adopts the language we propose,” he added.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;According to a senior official at the home ministry who spoke on the condition of anonymity, the privacy bill hasn’t come up for discussion yet. “In any case, the said bill will be taken up by the IT ministry first. The IT ministry will be responsible for piloting the proposed bill on privacy and MHA will, in the later stages, give its opinion on security issues related to the proposed bill,” he said.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;A government official on condition of anonymity said that its for the Justice Srikirshna Committee to look at the model privacy code launched today and decide what they want to use from it.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;When contacted, Ajay Sawhney, secretary for ministry of electronics and technology said: “The Justice Srikrishna Committee will submit its report shortly.”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;“The reason civil society is doing this is because the government is not sharing their draft bills,” said Sunil Abraham, founder of think tank Centre for Internet and Society (CIS). In 2013, CIS had also published a citizen’s draft privacy protection bill.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;i&gt;(With inputs from Azaan Javaid)&lt;/i&gt;&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/hindustan-times-june-8-2018-vidhi-choudhary-draft-bill-proposes-rs-1-crore-fine-3-year-jail-for-data-privacy-violation'&gt;https://cis-india.org/internet-governance/hindustan-times-june-8-2018-vidhi-choudhary-draft-bill-proposes-rs-1-crore-fine-3-year-jail-for-data-privacy-violation&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Admin</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2018-06-29T16:48:48Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/the-ai-task-force-report-the-first-steps-towards-indias-ai-framework">
    <title>The AI Task Force Report - The first steps towards India’s AI framework </title>
    <link>https://cis-india.org/internet-governance/blog/the-ai-task-force-report-the-first-steps-towards-indias-ai-framework</link>
    <description>
        &lt;b&gt;The Task Force on Artificial Intelligence was established by the Ministry of Commerce and Industry to leverage AI for economic benefits, and provide policy recommendations on the deployment of AI for India.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The blog post was edited by Swagam Dasgupta. &lt;a class="external-link" href="http://cis-india.org/internet-governance/files/ai-task-force-report.pdf"&gt;Download &lt;strong&gt;PDF&lt;/strong&gt; here&lt;/a&gt;&lt;/p&gt;
&lt;hr /&gt;
&lt;p&gt;&lt;span style="text-align: justify; "&gt;The Task Force’s Report, released on March 21st 2018, is a result of the combined expertise of members from different sectors&lt;/span&gt;&lt;a name="_ftnref1"&gt;&lt;/a&gt;&lt;span style="text-align: justify; "&gt; and examines how AI will benefit India. It sheds light on the Task Force’s perception of AI, the sectors in which AI can be leveraged in India, the challenges endemic to India and certain ethical considerations. It concludes with a set of policy recommendations for the government to leverage AI for the next five years. While acknowledging AI as a social and economic problem solver,&lt;/span&gt;&lt;a name="_ftnref2"&gt;&lt;/a&gt;&lt;span style="text-align: justify; "&gt; the Report attempts to answer three policy questions:&lt;/span&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;What are the areas where government should play a role?&lt;/li&gt;
&lt;li&gt;How can AI improve quality of life and solve problems at scale for Indian citizens?&lt;/li&gt;
&lt;li&gt;What are the sectors that can generate employment and growth by the use of AI technology?&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;span style="text-align: justify; "&gt;This blog will look at how the Task Force answered these three policy questions. In doing so, it gives an overview of salient aspects and reflects on the strengths and weaknesses of the Report.&lt;/span&gt;&lt;/p&gt;
&lt;h3&gt;&lt;span&gt;Sectors of Relevance and Challenges&lt;/span&gt;&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;In order to navigate the outlined questions, the Report looks at ten sectors that it refers to as ‘domains of relevance to India’. Furthermore, it examines the use of AI along with its major challenges, and possible solutions for each sector. These sectors include: Manufacturing, FinTech, Agriculture, Healthcare, Technology for the Differently-abled, National Security, Environment, Public Utility Services, Retail and Customer Relationship, and Education.&lt;a name="_ftnref3"&gt;&lt;/a&gt; While these ten domains are part of the 16 domains of focus listed in the AITF’s web page,&lt;a name="_ftnref4"&gt;&lt;/a&gt; it would have been useful to know the basis on which these sectors were identified. A particular strength of the identified sectors is the consideration of technology for the differently abled as well as the recognition to the development of AI systems in spoken and sign languages in the Indian context.&lt;a name="_ftnref5"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Some of the problems endemic to India that were recognized include infrastructural barriers, managing scale and innovation, and the collection, validation and distribution of data.&lt;/span&gt;&lt;a name="_ftnref6"&gt;&lt;/a&gt;&lt;span&gt; The Task Force also noted the lack of consumer awareness, and inability of technology providers to explain benefits to end users as further challenges.&lt;/span&gt;&lt;a name="_ftnref7"&gt;&lt;/a&gt;&lt;span&gt; The Task Force — by putting the onus on the individual — seems to hint that the impediment to the uptake of technology is the inability of individuals to understand the benefits of the technology, rather than aspects such as poor design, opacity, or misuse of data and insights. Furthermore, although the Report recognizes the challenges associated to data in India and highlights the importance of quality and quantity of data; it overlooks the importance of data curation in creatinge reliable AI systems.&lt;/span&gt;&lt;a name="_ftnref8"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Although the Report examines challenges to AI in each sector, it fails to include all challenges that require addressal. For example, the report fails to acknowledge challenges such as the lack of appropriate certification systems for AI driven health systems and technologies.&lt;a name="_ftnref9"&gt;&lt;/a&gt; In the manufacturing sector, the Report fails to highlight contextual challenges associated with the use of AI. This includes the deployment of autonomous vehicles compared to the use of industrial robots.&lt;a name="_ftnref10"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;On the use of AI in retail, the Report while examining consumer data and its respective regulatory policies, identified the issues to be related to the definition, discrimination, data breaches, digital products and safety awareness and reporting standards.&lt;a name="_ftnref11"&gt;&lt;/a&gt; In this, the Report is limited in its understanding of what categories of data can lead to discrimination and restricts mechanisms for transparency and accountability to data breaches. The Report could have also been more forward looking in its position on security — including security by design and security by default. Furthermore, these issues were noted only in the context of the retail sector and ideally should have been discussed across all sectors.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The challenges for utilizing AI for national security could have been examined beyond cost and capacity to include associated ethical and legal challenges such as the need for legal backing. The use of AI in national security demands clear accountability and oversight as it is a ground for legitimate state interference with fundamental rights such as privacy and freedom of expression. As such, there is a need for human rights impact assessments, as well as a need for such uses to be aligned with international human rights norms. Government initiatives that allow country wide surveillance and AI decisions based on such data should ideally be implemented only after a comprehensive privacy law is in place and India’s surveillance regime has been revisited.&lt;a name="_ftnref12"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Recognizing the potential of AI for the benefit of the differently abled is one of the key takeaways from this section of the Report. Furthermore, it also brings in the need for AI inclusivity. AI in natural language generation and translation systems have the potential to help the large number of youth that are disabled or deprived.&lt;a name="_ftnref13"&gt;&lt;/a&gt; Therefore, AI could have a large positive impact through inclusive growth and empowerment.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;strong&gt; &lt;/strong&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Although the Report examines each of the ten domains in an attempt to provide an insight into the role the government can play, there seems to be a lack of clarity in terms of the role that each department will and is playing with respect to AI. Even the section which lays down the relevant ministries for each of the ten domains failed to include key ministries and departments. For example, the Report does not identify the Ministry of Education, nor does it list the Ministry of Law for national security. The Report could have also identified government departments which would be responsible for regulation and standardization. This could include the Medical Council of India (healthcare), CII (manufacture and retail), RBI (Fintech) etc. The Report also does not recognize other developments around AI emerging out the government. For example, the Draft National Digital Communications Policy (published on May 1, 2018) seeks to empower the Department of Telecommunication to provide a roadmap for AI and robotics.&lt;a name="_ftnref14"&gt;&lt;/a&gt; Along similar lines, the Department of Defence Production has also created a task force earlier this year to study the use of AI to accelerate military technology and economic growth.&lt;a name="_ftnref15"&gt;&lt;/a&gt; The government should look at building a cohesive AI government body, or clearly delineating the role of each ministry, in order to ensure harmonization going forward.&lt;/p&gt;
&lt;h3&gt;Areas in need of Government Intervention&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;&lt;strong&gt; &lt;/strong&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Report also lists out the grand challenges where government intervention is required. This includes data collection and management and the need for widespread expertise contributing to research, innovation, and response. However, while highlighting the need for AI experts from diverse backgrounds, it fails to include experts from law and policy into the discussion.&lt;a name="_ftnref16"&gt;&lt;/a&gt; While identifying manufacturing, agriculture, healthcare and public utility to be places where government intervention is needed, the Report failed to examine national security beyond an important domain to India and as a sector where government intervention is needed.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;strong&gt;Participation in International Forums&lt;/strong&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;strong&gt; &lt;/strong&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Another relevant concern that the Report underscores is India’s scarce participation as researchers, AI developers and government engagement in global discussions around AI. The Report states that although efforts were being made by Indian universities to increase their presence in international AI conferences, they were lagging behind other nations. On the subject of participation by the government it recommends regular presence in International AI policy forums. Hence, emphasising the need for India’s active participation in global conversations around AI and international rulemaking.&lt;/p&gt;
&lt;h3&gt;&lt;span&gt;Key Enablers to AI&lt;/span&gt;&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;&lt;strong&gt; &lt;/strong&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Report while analysing the key enablers for AI deployment in India states that positive societal attitudes will be the driving force behind the proliferation of AI.&lt;a name="_ftnref17"&gt;&lt;/a&gt; Although relying on positive social attitudes alone will not help in increasing the trust on AI, steps such as making algorithms that are used by public bodies public, enacting a data protection law etc. will be important in enabling trust beyond highlighting success stories.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;strong&gt; &lt;/strong&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;strong&gt;Data and Data Marketplaces&lt;/strong&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;strong&gt; &lt;/strong&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;While the Report identifies data as a challenge where government intervention is needed, it also points to the Aadhaar ecosystem as an enabler. It states that Aadhaar will help in the proliferation of AI in three ways: one as a creator of jobs as related to the collection and digitization of data, two as a collector of reliable data, and three as a repository of Indian data. However, since the very constitutionality of Aadhaar is yet to be determined by the Supreme Court,&lt;a name="_ftnref18"&gt;&lt;/a&gt; the task force should have used caution in identifying Aadhaar as a definitive solution. Especially while making statements that the Aadhaar along with the SC judgement has created adequate frameworks to protect consumer data. Additionally, the Task Force should have recognized the various concerns that have been voiced about Aadhaar, particularly in the context of the case before the Supreme Court.&lt;a name="_ftnref19"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;This section also proposes the creation of a Digital Data Marketplace. A data marketplace needs to be framed carefully so as to not create a situation where privacy becomes a right available to only those who can afford it.&lt;/span&gt;&lt;a name="_ftnref20"&gt;&lt;/a&gt;&lt;span&gt; It is concerning that the discussion on data protection and privacy in the Report is limited to policies and guidelines for businesses and not centered around the individual.&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;&lt;strong&gt;Innovation and Patents&lt;/strong&gt;&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;strong&gt; &lt;/strong&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Report states that the Indian startups working in the field of AI must be encouraged, and industry collaborations and funding must be taken up as a policy measure. One of the ways in which this could be achieved is by encouraging innovations, and one of the ways to do so is by adding a commercial incentive to it, such as through IP rights. Although the Report calls for a stronger IP regime that protects and incentivises innovation, it remains ambiguous as to which aspect of IP rights — patents, trade secrets and copyrights — need significant changes.&lt;a name="_ftnref21"&gt;&lt;/a&gt; If the Report is specifically advocating for stronger patent rights in order to match those of China and US, then it shows that the the task force fails to understand the finer aspects of Indian patent law and the history behind India’s stance on patenting. This includes the fact that Indian patent law excludes algorithms from being patented. Indian patent law, by providing a higher threshold for patenting computer related inventions (CRIs), ensures that only truly innovative patents are granted.&lt;a name="_ftnref22"&gt;&lt;/a&gt; Given the controversies over CRIs that have dotted the Indian patent landscape&lt;a name="_ftnref23"&gt;&lt;/a&gt;, the task force would have done well to provide more clarity on the ‘how’ and ‘why’ of patenting in this sector, if that is their intent with this suggestion.&lt;/p&gt;
&lt;h3&gt;&lt;span&gt;Ethical AI framework&lt;/span&gt;&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;&lt;strong&gt; &lt;/strong&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;strong&gt;Responsible AI&lt;/strong&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;strong&gt; &lt;/strong&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In terms of establishing an ethical AI framework, the Task Force suggests measures such as making AI explainable, transparent, and auditable for biases. The Report addresses the fact that currently with the increase in human and AI interaction there is a need to have new standards set for the deployment of AI as well as industrial standards for robots. However, the Report does not go into details of how AI could cause further bias based on various identifiers such as gender and caste, as well as the myriad concerns around privacy and security. This is especially a concern given that the Report envisions widespread use of AI in all major sectors. In this way, the Report looks at data as both a challenge and an enabler, but fails to dedicate time towards explaining the various ethical considerations behind the collection and use of data in the context of privacy, security and surveillance as well as account for unintended consequences. In laying out the ethical considerations associated with AI, the report does not make a distinction between the use of AI by the public sector and private sector. As the government is responsible for ensuring the rights of citizens and holds more power than the citizenry, the public sector needs to be more accountable in their use of AI. This is especially so in cases where AI is proposed to be used for sovereign functions such as national security.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;strong&gt; &lt;/strong&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;strong&gt;Privacy and Data&lt;/strong&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;strong&gt; &lt;/strong&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Report also recognises the significance of the implementation of the Aadhaar Act&lt;a name="_ftnref24"&gt;&lt;/a&gt;, the privacy judgement&lt;a name="_ftnref25"&gt;&lt;/a&gt; and the proposed data protection laws&lt;a name="_ftnref26"&gt;&lt;/a&gt;, on the development and use of AI for India. Yet, the Report does not seem to recognize the importance of a robust and multi-faceted privacy framework as it assumes that the Aadhaar Act and the Supreme Court Judgement on privacy and potential privacy law have already created a basis for safe and secure utilization and sharing of customer data.&lt;a name="_ftnref27"&gt;&lt;/a&gt; Although the Report has tried to be an expansive examination of various aspects of AI for India, it unfortunately has not looked in depth at the current issues and debates around AI privacy and ethics and makes policy recommendations without appearing to fully reflect on the implementation and potential impact of the same. Similar to the discussion paper by the Niti Aayog,&lt;a name="_ftnref28"&gt;&lt;/a&gt; this Report does not consider the emerging principles of data protection such as right to explanation and right to opt-out of automated processing, which directly relate to AI.&lt;a name="_ftnref29"&gt;&lt;/a&gt; Furthermore, there is a lack of discussion on issues such as data minimisation and purpose limitation which some big data and AI proponents argue against.&lt;a name="_ftnref30"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;&lt;strong&gt;Liability&lt;/strong&gt;&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;strong&gt; &lt;/strong&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;On the question of liability, the Report only states that specific liability mechanisms need to be worked out for certain categories of machines. The Report does not address the questions of liability that should be applicable to all AI systems, and on whom the duty of care lies, not only in case of robots but also in the case of automated decision making etc. Thus, there is a need for further thinking on mechanisms for determining liability and how these could apply to different types of AI (deep learning models and other machine learning models) and AI systems.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;strong&gt;AI and Employment &lt;/strong&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;strong&gt; &lt;/strong&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;On the topic of jobs and employment, the Report states that AI will create more jobs than it takes as a result of an increase in the number of companies and avenues created by AI technologies. Additionally, the Report provides examples of jobs where AI could replace the human (autonomous drivers, industrial robots etc,) but does not go as far as envisioning what jobs could be created directly from this replacement. Though the Report recognizes emerging forms of work such as crowdsourcing platforms like Mturk&lt;a name="_ftnref31"&gt;&lt;/a&gt;, it fails to examine the impact of such models of work on workers and traditional labour market structures and processes.&lt;a name="_ftnref32"&gt;&lt;/a&gt; Going forward, it will be important that the government and the private sector undertake the necessary steps to ensure that fair, protected, and fulfilling jobs are created simultaneously with the adoption of AI. This will include revisiting national and organizational skilling programmes, labor laws, social benefit schemes, relevant economic policies, and exploring best practices with respect to the adoption and integration of AI in work.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;strong&gt;Education and Re-skilling&lt;/strong&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;strong&gt; &lt;/strong&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The task force emphasised the need for a change in the education curriculum as well as the need to reskill the labour force to ensure an AI ready future. This level of reskilling will be a massive effort, and a thorough review and audit of existing skilling programmes in India is needed before new skilling programmes are established and financed. The Report also clarifies that the statistics used were based on a study on the IT component of the industry, and that a similar study was required to analyse AI’s effect on the automation component.&lt;a name="_ftnref33"&gt;&lt;/a&gt; Going forward, there is the need for a comprehensive study of the labour intensive sectors and formal and informal sectors to develop evidence based policy responses.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;strong&gt;Policy Recommendations &lt;/strong&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;strong&gt; &lt;/strong&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Task Force&lt;sub&gt;,&lt;/sub&gt; in its policy recommendations, notes that the successful adoption of AI in India will depend on three factors: people, process and technology. However, it does not explain these three factors any further.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;strong&gt;National Artificial Intelligence Mission&lt;/strong&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The most significant suggestion made in the Report is for the establishment of the National Artificial Intelligence Mission (N-AIM) — a centralised nodal agency for coordinating and facilitating research, collaboration and providing economic impetuous to AI startups.&lt;a name="_ftnref34"&gt;&lt;/a&gt; The mission with a budget allocation of Rs 1,200 crore over five years aims, among other things, to look at various ways to encourage AI research and deployment.&lt;a name="_ftnref35"&gt;&lt;/a&gt; Some of the suggestions include targeting and prototyping AI systems and setting up of a generic AI test bed. These suggestions seems to draw inspiration from other countries such as the US DARPA Challenge&lt;a name="_ftnref36"&gt;&lt;/a&gt; and Japan’s sandbox for self driving trucks.&lt;a name="_ftnref37"&gt;&lt;/a&gt; The establishment of N-AIM is a welcome step to encourage both AI research and development on a national scale. The availability of public funds will encourage more AI research and development.&lt;a name="_ftnref38"&gt;&lt;/a&gt;Additionally, government engagement in AI projects has thus far been fragmented&lt;a name="_ftnref39"&gt;&lt;/a&gt;and a centralised body will presumably bring about better coordination and harmonization. Some of the initiatives such as Capture the flag competition&lt;a name="_ftnref40"&gt;&lt;/a&gt; that seeks to centre around the provision for real datasets to catalyze innovation will need to be implemented with appropriate safeguards in place.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;strong&gt;Other recommendations&lt;/strong&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;strong&gt; &lt;/strong&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;There are other suggestions that are problematic — particularly that of funding “an inter-disciplinary large data integration center in pilot mode to develop an autonomous AI Machine that can work on multiple data streams in real time and provide relevant information and predictions to public across all domains.”&lt;a name="_ftnref41"&gt;&lt;/a&gt; Before such a project is developed and implemented there are a number of factors where legal clarity is required; a few being: data collection and use, accuracy and quality of the AI system. There is also a need to ensure that bias and discrimination have been accounted for and fairness, responsibility and liability have been defined with consideration that this will be a government driven AI system. Additionally, such systems should be transparent by design and should include redress mechanisms for potential harms that may arise. This can be through the presence of a human in the loop, or the existence of a kill switch. These should be addressed through ethical principles, standards, and regulatory frameworks.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The recommendations propose establishing operation standards for data storage and  privacy, communication standards for autonomous systems, and standards to allow for interoperability between AI based systems. A significant lacuna in this list is the development of safety, accuracy, and quality standards for AI algorithms and systems.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Similarly, although the proposed public private partnership model for research and startups is a good idea, this initiative should be undertaken only after questions such as the implications of liability, ownership of IP and data, and the exclusion of critical sectors are thought through.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Furthermore, the suggestion to ‘fund a national level survey on identification of cluster of clean annotated data necessary for building effective AI systems’&lt;a name="_ftnref42"&gt;&lt;/a&gt; needs to recognize the existing initiatives around open data or use this as a starting place. The Report does not clarify if this survey would involve identifying data.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;strong&gt;Conclusion&lt;/strong&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;strong&gt; &lt;/strong&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The inconspicuous release of the Report as well as the lack of a call for public comments&lt;a name="_ftnref43"&gt;&lt;/a&gt; results in the fact that the Report does not incorporate or reflect on the sentiments of the public or draw upon the expertise that exists in India on the topic or policies around emerging technologies, which will have a pervasive and wide effect on society. The need for multi stakeholder engagement and input cannot be understated. Nonetheless, the Report of the Task Force is a welcome step towards understanding the movement towards an definitive AI policy. The task force has attempted answering the three policy questions keeping people, process and technology in mind. However, it could have provided greater details about these indices. The Report, which is meant for a wider audience, would have done well to provide greater detail, while also providing clarity on technical terms. On a definitional plane, a list of technologies that the task force perceived as AI for this Report, could have also helped keep it grounded on possible and plausible 5 year recommendations.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Compared to the recent Niti Aayog Discussion Paper&lt;/span&gt;&lt;a name="_ftnref44"&gt;&lt;/a&gt;&lt;span&gt;, this Report misses out on a detailed explanation on AI and ethics, however, it does spend some considerable amount of time on education and the use of AI for the differently abled. Additionally, the Report’s statement on the democratization of development and equal access as well as assigning ownership and framing transparent rules for usage of the infrastructure is a positive step towards making AI inclusive. Overall, the Report is a progressive step towards laying down India’s path forward in the field of Artificial Intelligence. The emphasis on India’s involvement in International rulemaking gives India an opportunity to be a leader of best practice in international forums by adopting forward looking and human rights respecting practices. Whether India will also become a strong contender in the AI race, with policies favouring the development of a socio-economically beneficial, and ethical-AI backed industries and services is yet to be seen.&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a name="_ftn1"&gt;&lt;/a&gt;&lt;span&gt; The Task Force consists of 18 members in total. Of these, 11 members are from the field of AI technology both research and industry, three from the civil services, one from healthcare research, one with and Intellectual property law background, and two from a finance background. The specializations of the members are not limited to one area as the members have experience or education in various areas relevant to AI. &lt;/span&gt;&lt;a href="https://www.aitf.org.in/"&gt;https://www.aitf.org.in//&lt;/a&gt;&lt;span&gt; There is a notable lack of members from Civil Society. It may also be noted that only 2 of the 18 members are women&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a name="_ftn2"&gt;&lt;/a&gt; The Report on the Artificial Intelligence Task Force, Pg. 1,&lt;span&gt;http://dipp.nic.in/sites/default/files/Report_of_Task_Force_on_ArtificialIntelligence_20March2018_2.pdf&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a name="_ftn3"&gt;&lt;/a&gt; ibid.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a name="_ftn4"&gt;&lt;/a&gt; The Artificial Intelligence Task Force https://www.aitf.org.in/&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a name="_ftn5"&gt;&lt;/a&gt; The Report on the Artificial Intelligence Task Force, Pg. 8&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a name="_ftn6"&gt;&lt;/a&gt; The Report on the Artificial Intelligence Task Force, Pg. 9,10.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a name="_ftn7"&gt;&lt;/a&gt; The Report on the Artificial Intelligence Task Force, Pg. 9&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a name="_ftn8"&gt;&lt;/a&gt; ibid.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a name="_ftn9"&gt;&lt;/a&gt; Artificial Intelligence in the Healthcare Industry in India https://cis-india.org/internet-governance/files/ai-and-healtchare-report&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a name="_ftn10"&gt;&lt;/a&gt;Artificial Intelligence in the Manufacturing and Services Sector https://cis-india.org/internet-governance/files/AIManufacturingandServices_Report   _02.pdf&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a name="_ftn11"&gt;&lt;/a&gt; The Report on the Artificial Intelligence Task Force, Pg. 21.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a name="_ftn12"&gt;&lt;/a&gt; Submission to the Committee of Experts on a Data Protection Framework for India, Centre for Internet and Society https://cis-india.org/internet-governance/files/data-protection-submission&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a name="_ftn13"&gt;&lt;/a&gt; The Report on the Artificial Intelligence Task Force, Pg. 22&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a name="_ftn14"&gt;&lt;/a&gt; Draft National Digital Communications Policy-2018, http://www.dot.gov.in/relatedlinks/draft-national-digital-communications-policy-2018&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a name="_ftn15"&gt;&lt;/a&gt; Task force set up to study AI application in military,https://indianexpress.com/article/technology/tech-news-technology/task-force-set-up-to-study-ai-application-in-military-5049568/&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a name="_ftn16"&gt;&lt;/a&gt;It is not just technical experts  that are needed, ethical, technical, and legal experts as well as domain experts need to be part of the decision making process.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a name="_ftn17"&gt;&lt;/a&gt; The Report on the Artificial Intelligence Task Force, Pg. 31&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a name="_ftn18"&gt;&lt;/a&gt;Constitutional validity of Aadhaar: the arguments in Supreme Court so far, http://www.thehindu.com/news/national/constitutional-validity-of-aadhaar-the-arguments-in-supreme-court-so-far/article22752084.ece&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a name="_ftn19"&gt;&lt;/a&gt; ibid.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a name="_ftn20"&gt;&lt;/a&gt; CIS Submission to TRAI Consultation on Free Data http://trai.gov.in/Comments_FreeData/Companies_n_Organizations/Center_For_Internet_and_Society.pdf&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a name="_ftn21"&gt;&lt;/a&gt; The Report on the Artificial Intelligence Task Force, Pg. 30&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a name="_ftn22"&gt;&lt;/a&gt; Section 3(k) of the patent act describes that a mere mathematical or business method or a computer programme or algorithm cannot be patented.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a name="_ftn23"&gt;&lt;/a&gt;Patent Office Reboots CRI Guidelines Yet Again: Removes “novel hardware” Requirement&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;https://spicyip.com/2017/07/patent-office-reboots-cri-guidelines-yet-again-removes-novel-hardware-requirement.html&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a name="_ftn24"&gt;&lt;/a&gt; The Report on the Artificial Intelligence Task Force, Pg. 37&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a name="_ftn25"&gt;&lt;/a&gt;The Report on the Artificial Intelligence Task Force, Pg. 7&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a name="_ftn26"&gt;&lt;/a&gt; ibid.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a name="_ftn27"&gt;&lt;/a&gt; The Report on the Artificial Intelligence Task Force, Pg. 8&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a name="_ftn28"&gt;&lt;/a&gt; National Strategy for Artificial Intelligence: &lt;a href="http://niti.gov.in/writereaddata/files/document_publication/NationalStrategy-for-AI-Discussion-Paper.pdf"&gt;http://niti.gov.in/writereaddata/files/document_publication/NationalStrategy-for-AI-Discussion-Paper.pdf&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a name="_ftn29"&gt;&lt;/a&gt; Meaningful information and the right to explanation,Andrew D Selbst  Julia Powles, International Data Privacy Law, Volume 7, Issue 4, 1 November 2017, Pages 233–242&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a name="_ftn30"&gt;&lt;/a&gt; The Principle of Purpose Limitation and Big Data, https://www.researchgate.net/publication/319467399_The_Principle_of_Purpose_Limitation_and_Big_Data&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a name="_ftn31"&gt;&lt;/a&gt; M-Turk https://www.mturk.com/&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a name="_ftn32"&gt;&lt;/a&gt; For example a lesser threshold of minimum wages, no job secuirity etc, https://blogs.scientificamerican.com/guilty-planet/httpblogsscientificamericancomguilty-planet20110707the-pros-cons-of-amazon-mechanical-turk-for-scientific-surveys/&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a name="_ftn33"&gt;&lt;/a&gt; The Report on the Artificial Intelligence Task Force, Pg. 41&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a name="_ftn34"&gt;&lt;/a&gt; Report of Artificial Intelligence Task Force Pg, 46, 47&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a name="_ftn35"&gt;&lt;/a&gt; ibid.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a name="_ftn36"&gt;&lt;/a&gt;The DARPAChallenge https://www.darpa.mil/program/darpa-robotics-challenge&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a name="_ftn37"&gt;&lt;/a&gt;Japan may set regulatory sandboxes to test drones and self driving vehicles http://techwireasia.com/2017/10/japan-may-set-regulatory-sandboxes-test-drones-self-driving-vehicles/&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a name="_ftn38"&gt;&lt;/a&gt; Mariana Mazzucato in her 2013 book The Entrepreneurial State, argued that it was the government that drives technological innovation. In her book she stated that high-risk discovery and development were made possible by government spending, which the private enterprises capitalised once the difficult work was done.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a name="_ftn39"&gt;&lt;/a&gt;&lt;a href="https://tech.economictimes.indiatimes.com/news/technology/govt-of-karnataka-launches-centre-of-excellence-for-data-science-and-artificial-intelligence/61689977"&gt;https://tech.economictimes.indiatimes.com/news/technology/govt-of-karnataka-launches-centre-of-excellence-for-data-science-and-artificial-intelligence/61689977&lt;/a&gt;,https://analyticsindiamag.com/amaravati-world-centre-for-ai-data/&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a name="_ftn40"&gt;&lt;/a&gt; The Report on the Artificial Intelligence Task Force, Pg. 47&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a name="_ftn41"&gt;&lt;/a&gt; Report of Artificial Intelligence Task Force Pg. 49&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a name="_ftn42"&gt;&lt;/a&gt; The Report on the Artificial Intelligence Task Force, Pg. 47&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a name="_ftn43"&gt;&lt;/a&gt; The AI task force website has a provision for public comments although it is only for the vision and mission and the domains mentioned in the website.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a name="_ftn44"&gt;&lt;/a&gt;National Strategy for Artificial Intelligence: &lt;a href="http://niti.gov.in/writereaddata/files/document_publication/NationalStrategy-for-AI-Discussion-Paper.pdf"&gt;http://niti.gov.in/writereaddata/files/document_publication/NationalStrategy-for-AI-Discussion-Paper.pdf&lt;/a&gt;&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/the-ai-task-force-report-the-first-steps-towards-indias-ai-framework'&gt;https://cis-india.org/internet-governance/blog/the-ai-task-force-report-the-first-steps-towards-indias-ai-framework&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Elonnai Hickok, Shweta Mohandas and Swaraj Paul Barooah</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Artificial Intelligence</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2018-06-27T14:32:56Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/files/ai-task-force-report.pdf">
    <title>AI Task Force Report</title>
    <link>https://cis-india.org/internet-governance/files/ai-task-force-report.pdf</link>
    <description>
        &lt;b&gt;&lt;/b&gt;
        
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/files/ai-task-force-report.pdf'&gt;https://cis-india.org/internet-governance/files/ai-task-force-report.pdf&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Admin</dc:creator>
    <dc:rights></dc:rights>


   <dc:date>2018-06-27T14:22:11Z</dc:date>
   <dc:type>File</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/india-legal-live-june-21-2018-data-privacy">
    <title>Data Privacy: Footprints on the Web</title>
    <link>https://cis-india.org/internet-governance/news/india-legal-live-june-21-2018-data-privacy</link>
    <description>
        &lt;b&gt;Technology has made data protection a hot button issue. Now, a group of eminent citizens, mostly lawyers, have formulated a draft privacy bill, a legal framework that protects the individual’s right to privacy, but it faces legal jurisdiction issues &lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The blog post by Sujit Bhar was published in &lt;a class="external-link" href="http://www.indialegallive.com/constitutional-law-news/acts-and-bills-news/data-privacy-footprints-on-the-web-50261"&gt;IndiaLegal&lt;/a&gt; on June 21, 2018.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;Lack of data privacy is a modern day peril. Quite like the individual’s right to privacy—one that has been raised to the level of a Fundamental Right by the Supreme Court—data privacy today is prime, because technology has made our lives fully dependant on associated data. Hence, by extension of the same logic and arguments that the top court used for personal privacy, data privacy should be protected.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The methodology to be adopted, though, is not as easy to determine given the lack of legislation in the field, the improbability of existing technology to ensure complete privacy and because of legal jurisdiction issues.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Also, to what extent data privacy can and should be allowed is a legal argument that needs to be supported by other fields of knowledge. The Supreme Court decision to award privacy as a Fundamental Right will act as a plinth in determining this.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;To that end a group of eminent citizens, mostly lawyers, came together and formulated a draft privacy bill with the objective of slicing through banal arguments that would ensue if this was to wait for public re-reference/debate.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The proponents—Apar Gupta, Gautam Bhatia, Kritika Bhardwaj, Maansi Verma, Naman M Aggarwal, Praavita Kashyap, Prasanna S, Raman Jit Singh Chima, Ujwala Uppaluri and Vrinda Bhandari—have tried to develop their own privacy bill, based on the foundation of the Privacy (Protection) Bill, 2013, “which was drafted over a series of roundtables and inputs conducted by the Centre for Internet and Society, Bangalore”.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In doing so the group started from what it calls “seven privacy principles”, derived from various constitutional and expert texts.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Principle 1: Individual rights are at the centre of privacy and data protection.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This says that “the individual and her rights are primary. The law on privacy must empower you by advancing your right to privacy…”including “your right to autonomy and dignity.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Principle 2: A data protection law must be based on privacy principles.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Here reference is made to the report of the Justice AP Shah Committee of Experts. It’s a method that has been left flexible, to accommodate fast developing technology. There is a reference to Moore’s Law in this. Moore’s Law has remained one of the most overwhelmingly true laws of the IT industry. Originating in 1970, it says that processor speeds, or overall processing power for computers “will double every two years”. While that has remained true till now, with the development of multiple core processors, this law too has seemingly run its course. With the world changing at such a fast pace, if the data privacy bill/law does not remain flexible, it would also be quickly consigned to a museum of laws. Hence this flexible approach will be crucial.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Principle 3: A strong privacy commission must be created to enforce the privacy principles.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This is the part of establishing an oversight authority, “a strong body to ensure that the data protection rights are put into practice and enforced”. This structure has been treated for something “that works in principle and in practice.”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;There is one part that says that this proposed “Privacy Commission”, has been “provided wide powers of investigation, adjudication, rule-making and enforcement. The Commission should adopt an approach that builds accountability for the rights of users by having powers to impose penalties that are proportionate to the harm and build deterrence.” This, obviously, means that it will be stepping onto the toes of other laws and that would be a rough road to navigate. However, as the group’s own philosophy says that the problem with technology oriented legislation is that it takes catching up with the progress of technology. To overcome this, the group wants to “make sure that the Privacy Code is not outdated” and hence wants to make sure that the “Privacy Commission can exercise rule making powers to give effect to the data protection principles under the regulation”.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The other part of the philosophy is of acknowledging and addressing public complaints. Hence the legal rigidity of regular acts would be dismissed. How this can work with enforcement agencies, though, will remain a matter of debate. The draft bill says that the “Privacy Commission must serve as the forum for the redressal of the general public’s grievances”, and that “Privacy Commissions should have the ability to investigate (independently through the office of a Director General), hold hearings and pass orders with directions and fines”.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;That could be legal nightmare, because unlike a simple code, the bill has to pass through parliament to become an act, and legislators are the ones who have final say in remodelling an existing law. How much power they would agree to delegate is anybody’s guess.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Of course, the draft also calls for the courts to welcome public opinion. There seems to be a slight hitch in the wording, which says that “…while the Privacy Commission serves as the forum for redressal, the public should retain the remedies of approaching the civil courts (even in instances where harm is suffered by a group of people) and of filing police complaints directly”. That questions even the oversight authority of the commission. There is another objective—a hope, one would say—that the Privacy Commission must have jurisdiction over the government, as it does over the private sector. The Privacy Commission should have overriding power and superintendence over all legal entities in matter of data protection and privacy”. While this sounds good on paper, the issue of national security can override all. At this point, according to a cyber security expert, there is talk within the Indian government on how to deal with the social media messaging app WhatsApp. Technically, as the company points out, messaging through an app is encrypted (military grade encryption, it is said) end-to-end. Hence terrorist groups have zeroed in on this as a common idea exchange platform. There could possibly be restrictive legislation on this. That could strike at the heart of data privacy.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The government’s reaction, though, could become counter-productive. This could be visible in what the Justice Srikrishna-led Committee of Experts possibly could recommend.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Principle 4: The government should respect user privacy. Technically, if this bill, in its current form, has to go through parliament, members of both houses should be willing to accept that it will have no snooping powers, ever. The way the government fought tooth and nail against personal privacy in court—and the Aadhaar verdict is still awaited—this proposal seems unlikely to have an easy passage. The draft says: “It is imperative that the government, its arms, bodies and programmes be compliant with the privacy protection principles through a data protection law.”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;There is a caveat within this, saying: “We support the use of digital technologies for public benefit. However, they should not be privileged over fundamental rights.” The proposal also says: “The government is responsible for the delivery of many essential services to the public of India. These services must not be withheld from an individual, due to such individual not sharing data with the government. Withholding services on the pretext of requirement of collection of data effectively amounts to extortion of consent. Individuals cannot be forced to trade away their data and citizenship at the altar of being permitted to use government services and access legal entitlements on welfare.” This will have to wait its validation or dismissal through the Aadhaar verdict.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Principle 5: A complete privacy code comes with surveillance reform&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This is another tricky issue for any government. It talks about how the Snowden revelations “brought to public knowledge that our personal data is collected in an indiscriminate manner by governments”. The draft calls this collection procedure “dragnet surveillance”, because it “contravenes the principles of necessity, proportionality and purpose limitation”. Necessity and proportionality have been argued in detail during the Aadhaar debate in court and till that verdict is out, it would, possibly, not be right to delve into this, though a recommendation for procedural safeguards might run into the same wall as in the case of encrypted software in social media apps. The draft accepts the possibility of “individual interception and surveillance”, but says “this should be severely limited in substance and practice through procedural safeguards”.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Principle 6: The right to information needs to be strengthened and protected&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This basically refers to the Right to Information Act and seems completely justified, with Information Commissioners being “exempted from interference or control by the Privacy Commissioner”.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Principle 7: International protections and harmonisation to protect the open internet must be incorporated&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Another contentious issue, being fuelled by the loss of face by Facebook in its effort to introduce graded access (with paywalls).&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The group widens its scope in stating that “we need to be guided by the &lt;a href="http://www.indialegallive.com/topic/supreme-court"&gt;Supreme Court’s&lt;/a&gt; Right to Privacy decision and make reference to the European Union’s General Data Protection Regulation”. More interestingly, the group admits that every law will have certain exceptions. It says: “…but without clear wording sometimes exceptions swallow up the rule. We adopted a three part test in our drafting process in which any exceptions to these privacy principles should be: (a) worded clearly; (b) limited in purpose, necessary and proportionate to the aim; and (c) accompanied by sufficient procedural safeguards”.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;On the face of it, the overall draft represents a novel and upright way of thinking, and if some of this is accepted while the government mulls the Justice Srikrishna Committee’s recommendations (expected late this month), it would be a good beginning.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/india-legal-live-june-21-2018-data-privacy'&gt;https://cis-india.org/internet-governance/news/india-legal-live-june-21-2018-data-privacy&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Admin</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2018-06-25T16:48:34Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/comments-on-the-draft-digital-communications-policy">
    <title>Comments on the Draft Digital Communications Policy</title>
    <link>https://cis-india.org/internet-governance/blog/comments-on-the-draft-digital-communications-policy</link>
    <description>
        &lt;b&gt;This submission presents comments by the Centre for Internet &amp; Society, India (“CIS”) on the Draft Digital Communications Policy, which was released to the public by the Department of Telecommunications of the Ministry of Communications on 1st May 2018 for comments and views.&lt;/b&gt;
        
&lt;h2&gt;Preliminary&lt;/h2&gt;
&lt;div&gt;&amp;nbsp;&lt;/div&gt;
&lt;p&gt;On 1st May 2018, the Department of Telecommunications of the Ministry of Communications released the Draft Digital Communications Policy for comments and feedback. We laud the Government’s attempts to realise the socio-economic potential of India by increasing access to the Internet, and drafting a comprehensive policy while adequately keeping in mind the various security and privacy concerns that arise due to online communication. On behalf of the Centre for Internet &amp;amp; Society (CIS), we thank the Department of Telecommunications for the opportunity to submit these comments on the draft policy.&lt;/p&gt;
&lt;p&gt;We would like to point out two concerns with the consultation process: (i) a character limit imposed on the comments to each section, due to which this submission has had to sacrifice comprehensive references to research; and (ii) issues with signing in on the MyGov platform where this consultation was hosted. We strongly recommend that the consultation process be liberal in accepting content, and allow for multiple types of submissions.&lt;/p&gt;
&lt;h2&gt;Comments&lt;/h2&gt;
&lt;h3&gt;Connect India: Creating a Robust Digital Communication Infrastructure&lt;/h3&gt;
&lt;div&gt;&amp;nbsp;&lt;/div&gt;
&lt;div&gt;
&lt;div&gt;On 2022 Goals&lt;/div&gt;
&lt;div&gt;&amp;nbsp;&lt;/div&gt;
&lt;div&gt;&lt;strong&gt;&lt;em&gt;a. Provide Universal broadband coverage at 50 Mbps to every citizen&lt;/em&gt;&lt;/strong&gt;&lt;/div&gt;
&lt;div&gt;&lt;strong&gt;&lt;em&gt;&lt;br /&gt;&lt;/em&gt;&lt;/strong&gt;&lt;/div&gt;
&lt;div&gt;According to UNICEF’s 2017 report, &lt;a class="external-link" href="https://www.unicef.org/publications/files/SOWC_2017_ENG_WEB.pdf"&gt;Children in a Digital World&lt;/a&gt;, only 29% of the internet users in India are female. It is essential that the policy recognise the wide digital gender gap and other differences in internet access that result from traditional sociocultural barriers. Therefore, we recommend that the goal read as: “Provide Universal broadband coverage at 50 Mbps to every citizen, with special focus on increasing internet access for women, people with disabilities, and historically-marginalised communities.”&lt;/div&gt;
&lt;div&gt;&amp;nbsp;&lt;/div&gt;
&lt;div&gt;&lt;strong&gt;&lt;em&gt;g. Ensure connectivity to all uncovered areas&lt;/em&gt;&lt;/strong&gt;&lt;/div&gt;
&lt;div&gt;&lt;strong&gt;&lt;em&gt;&lt;br /&gt;&lt;/em&gt;&lt;/strong&gt;&lt;/div&gt;
&lt;div&gt;The term “connectivity” should be changed to “active internet 
connectivity”. As per the current norms, a gram panchayat may be 
considered “connected” if the fibre infrastructure exists, but this does
 not necessarily mean an active internet connection being serviced in 
the area. For example, &lt;a class="external-link" href="http://indianexpress.com/article/business/four-years-of-modi-government-telecom-and-it-ravi-shankar-prasad-5188871/"&gt;as on May 20&lt;/a&gt;, “of 1.22 lakh gram panchayats with 
fibre connectivity, 1.09 lakh had active internet.”&amp;nbsp;&lt;/div&gt;
&lt;div&gt;&amp;nbsp;&lt;/div&gt;
&lt;div&gt;On Strategies&lt;/div&gt;
&lt;div&gt;&amp;nbsp;&lt;/div&gt;
&lt;div&gt;&lt;strong&gt;&lt;em&gt;1.1 (a) i. BharatNet – Providing 1 Gbps to Gram Panchayats upgradeable to 10 Gbps&lt;/em&gt;&lt;/strong&gt;&lt;/div&gt;
&lt;div&gt;&lt;strong&gt;&lt;em&gt;&lt;br /&gt;&lt;/em&gt;&lt;/strong&gt;&lt;/div&gt;
&lt;div&gt;The Central Government, under the “State-led” implementation of the
 BharatNet initiative, has allowed certain state governments to 
implement the program in their respective states. This has allowed State
 Governments to take misplaced liberty with the core objective of the 
program, which originally was to increase access to internet services. 
For example, after the Telecom Commission’s approval of Andhra Pradesh’s
 “State-led” implementation of the program, the state government set up a
 body corporate Andhra Pradesh State FiberNet Limited. This body then 
went on to &lt;a class="external-link" href="https://164.100.158.235/question/annex/245/Au4554.pdf"&gt;exceed&lt;/a&gt; its mandate by venturing into the television broadcasting and distribution business by offering Internet Protocol Television (IPTV) services. This is deeply problematic as it indicates that central government funds meant for increasing internet access are being used for IPTV services, despite the TRAI’s repeated &lt;a class="external-link" href="http://www.trai.gov.in/notifications/press-release/trai-issues-recommendations-%E2%80%9Cissues-related-entry-certain-entities"&gt;recommendations&lt;/a&gt; (since 2012) that state-owned entities should not be allowed to enter broadcasting and distribution activities; allowing state entities in the business is against fair play and competition, runs contrary to the principle of independent and free media, and has chilling effects on the freedom of expression.&lt;/div&gt;
&lt;div&gt;Additionally, this has created a problem for the availability of aggregated data on expenditure on the program. While the central 
government should ideally have all data pertaining to state-wise 
expenditure of funds for the program, data regarding the states 
implementing the initiative on their own is &lt;a class="external-link" href="http://164.100.47.190/loksabhaquestions/annex/14/AU4334.pdf"&gt;generally&lt;/a&gt; &lt;a class="external-link" href="http://164.100.47.190/loksabhaquestions/annex/14/AS73.pdf"&gt;excluded&lt;/a&gt; from the 
data provided by the Ministry. The goals of the program need to be 
specifically defined so that funds are directed effectively. The program
 needs stricter monitoring mechanisms to ensure that the intended 
objectives are met.&lt;/div&gt;
&lt;div&gt;&amp;nbsp;&lt;/div&gt;
&lt;div&gt;&lt;strong&gt;&lt;em&gt;1.1 (a) iv. JanWiFi – Establishing 2 Million Wi-Fi Hotspots in rural areas&lt;/em&gt;&lt;/strong&gt;&lt;/div&gt;
&lt;div&gt;&lt;strong&gt;&lt;em&gt;&lt;br /&gt;&lt;/em&gt;&lt;/strong&gt;&lt;/div&gt;
&lt;div&gt;Under present regulations, resale of communication data logged by 
WiFi hotspots is not permitted. However, &lt;a class="external-link" href="https://www.livemint.com/Industry/T4c6JlgpofYfHODmuQUjJP/Govt-may-allow-data-resale-in-boost-to-public-WiFi-plan.html"&gt;recent&lt;/a&gt; news &lt;a class="external-link" href="https://www.livemint.com/Industry/1jJ6MGWuQM7RiBNhPOb4zI/Data-resale-should-be-allowed-to-boost-public-WiFi-hotspots.html"&gt;reports&lt;/a&gt; suggest
 that the DoT may change these norms to permit (virtual network) 
operators to further sell this information. While changing such norms may incentivise operators to set up WiFi hotspots, the proliferation of internet access cannot come at the cost of users’ privacy. The data available to the operators of 
these hotspots includes all browsing data, which is sensitive private 
information, and thus, should be restricted from sale. We strongly 
recommend that, consistent with the security &amp;amp; privacy goals 
for consumers envisioned in the latter sections of this draft policy, 
the DoT ensure that strong privacy measures are in place for public WiFi
 hotspots made available through programs like JanWiFi.&lt;/div&gt;
&lt;div&gt;&amp;nbsp;&lt;/div&gt;
&lt;div&gt;&lt;strong&gt;&lt;em&gt;1.1 (f) Enabling Infrastructure Convergence of IT, telecom and broadcasting sectors&lt;/em&gt;&lt;/strong&gt;&lt;/div&gt;
&lt;div&gt;&lt;strong&gt;&lt;em&gt;&lt;br /&gt;&lt;/em&gt;&lt;/strong&gt;&lt;/div&gt;
&lt;div&gt;The policy proposes a convergence of the infrastructure 
administration currently performed by three central Government 
departments: IT, Broadcasting and Telecom. As admitted in the draft, 
this will require amendments, amongst many Acts, to the Telegraph Act. 
However, the draft policy has not clearly delineated the new proposed 
responsibilities for each department, and avoids elaborating on the 
nuance that will be required to address the multiple legal and 
administrative concerns stemming from the proposed convergence. The 
document also fails to detail how infrastructure (say, internet access through 4G) will be regulated differently from services (say, IPTV operating on 4G). Further clarity is also required on (i) how department-specific concerns (which are unsuited for a larger body) will be handled; and 
(ii) regarding the auspices under which the new converged body will 
operate.&lt;/div&gt;
&lt;div&gt;&amp;nbsp;&lt;/div&gt;
&lt;div&gt;&lt;strong&gt;&lt;em&gt;1.2 (a) Making adequate spectrum available to be equipped for the new broadband era&lt;/em&gt;&lt;/strong&gt;&lt;/div&gt;
&lt;div&gt;&lt;strong&gt;&lt;em&gt;&lt;br /&gt;&lt;/em&gt;&lt;/strong&gt;&lt;/div&gt;
&lt;div&gt;TRAI’s &lt;a class="external-link" href="https://www.trai.gov.in/sites/default/files/Consultation-Paper_Final%2028-3-14.pdf"&gt;consultation paper&lt;/a&gt;, Allocation and Pricing of Microwave 
Access (MWA) and Microwave Backbone (MWB) RF carriers (March 2014), 
recommends exploring the use of the E-band (71-76 / 81-86 GHz) and V-band (57-64 GHz), and the allocation of the same to 
telecom service providers.&amp;nbsp; We recommend that the Ministry accept TRAI’s
 recommendations, and reflect it in this policy.&lt;/div&gt;
&lt;div&gt;While the draft policy aims to decrease regulation of spectrum, including liberalising the spectrum “sharing, leasing and trading” regime, the policy should also clarify the government’s stance on 
unlicensed spectrum usage. CIS has written earlier (&lt;a class="external-link" href="https://cis-india.org/telecom/unlicensed-spectrum-policy-brief-for-govt-of-india"&gt;June 2012&lt;/a&gt;) about the
 demonstrable need for unlicensed spectrum to create a path for 
inexpensive connectivity in rural and remote areas.&amp;nbsp;&lt;/div&gt;
&lt;div&gt;&amp;nbsp;&lt;/div&gt;
&lt;div&gt;&lt;strong&gt;&lt;em&gt;1.2 (a) v. Optimal Pricing of Spectrum to ensure sustainable and affordable access to Digital Communications&lt;/em&gt;&lt;/strong&gt;&lt;/div&gt;
&lt;div&gt;&lt;strong&gt;&lt;em&gt;&lt;br /&gt;&lt;/em&gt;&lt;/strong&gt;&lt;/div&gt;
&lt;div&gt;The draft policy should review the existing approach to spectrum 
pricing in India. The Indian telecom sector is under heavy debt, and if 
rejuvenating this sector is a purported goal of this policy via “optimal
 pricing of spectrum”, auctions with a view to revenue maximisation 
should no longer remain the preferred method of assigning spectrum. The 
National Telecom Policy, 1999, which adopted a revenue-sharing approach 
to license fees, showed good results for the sector and translated into 
huge benefits for consumers. The government should adopt a similar 
approach to rescue the industry.&lt;/div&gt;
&lt;/div&gt;
&lt;div&gt;&amp;nbsp;&lt;/div&gt;
&lt;h3&gt;Propel India: Enabling Next Generation Technologies and Services 
through Investments, Innovation, Indigenous Manufacturing and IPR 
Generation&lt;/h3&gt;
&lt;div&gt;&amp;nbsp;&lt;/div&gt;
&lt;p&gt;On Strategies&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;&lt;em&gt;2.2 (a) ii. Simplifying&amp;nbsp; licensing&amp;nbsp; and regulatory frameworks 
whilst&amp;nbsp; ensuring&amp;nbsp; appropriate security&amp;nbsp; frameworks&amp;nbsp; for&amp;nbsp; IoT/&amp;nbsp; M2M&amp;nbsp; /&amp;nbsp; 
future services&amp;nbsp; and&amp;nbsp; network&amp;nbsp; elements incorporating international best
 practices&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;The process of “simplifying” the licensing and regulatory regime is 
currently vague, and the intentions remain unclear. Simplifying licences
 without clear intentions can lead to losing the necessary nuance in the
 license agreements required to maintain competitive markets. In recent 
months, the industry has already witnessed a dilution of provisions 
which were placed to ensure healthy competition in the sector. For 
example, on May 31st, new norms were &lt;a class="external-link" href="https://telecom.economictimes.indiatimes.com/news/dot-amends-licence-rule-to-allow-higher-spectrum-holding/64406115"&gt;announced&lt;/a&gt; by the DoT which now allow an operator to hold up to 35% of the total spectrum, as opposed to the earlier regulation which capped holdings at a maximum of 25% of the total spectrum.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;&lt;em&gt;2.3 (d) (iii) Providing financial incentives for the 
development of Standard Essential Patents(SEPs) in the field of digital 
communications technologies&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;This is a welcome step by the government to incentivise the 
development of SEPs in the country. However, this appreciable step will 
only yield results in the long term - and realistically speaking, not 
before a decade. It is equally necessary to improve the environment of 
licensing of SEPs in the short-term. The government should take 
initiative for creation of government-controlled patent pools for SEPs, 
which will solve issues of licensing for SEP holders, and also improve 
transparency of information relating to SEPs. Specifically, we recommend
 that the government initiate the &lt;a class="external-link" href="https://cis-india.org/a2k/blogs/open-letter-to-prime-minister-modi"&gt;formation of a patent pool&lt;/a&gt; of critical
 mobile technologies and apply a five percent compulsory license.&amp;nbsp;&lt;/p&gt;
&lt;h3&gt;Secure India: Ensuring Digital Sovereignty, Safety and Security of Digital Communications&lt;/h3&gt;
&lt;div&gt;&amp;nbsp;&lt;/div&gt;
&lt;p&gt;On Strategies&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;&lt;em&gt;3.1 Harmonising communications law and policy with the evolving
 legal framework and jurisprudence relating to privacy and data 
protection in India&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;We welcome the Ministry’s intention to amend licence agreements to 
include data protection and privacy provisions. In the same vein, the 
Ministry should also consider removing provisions from licenses that 
prevent the operator from using certain encryption methods in its 
network. For example, Clause 2.2 (vii) of the &lt;a class="external-link" href="http://www.dot.gov.in/isplicense/template-agreement-between-internet-service-provider-isp-and-vendor-equipment-product-and"&gt;License Agreement between 
DoT &amp;amp; ISP&lt;/a&gt; prohibits bulk encryption.&amp;nbsp; Additionally, in the License 
Agreement, only encryption with a key length of up to 40 bits in RSA (or equivalent) is 
normally permitted.&amp;nbsp; Similarly, Clause 37.1 of the &lt;a class="external-link" href="http://www.dot.gov.in/sites/default/files/Unified%20Licence_0.pdf"&gt;Unified Service 
License Agreement&lt;/a&gt; prohibits bulk encryption.&amp;nbsp; These provisions must be 
revised to ensure that ISPs and other service providers can employ more 
cryptographically secure methods.&lt;/p&gt;
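A rough back-of-the-envelope calculation illustrates why fixed bit-length caps such as the 40-bit limit are cryptographically inadequate. This sketch is our own illustration, not part of the licence agreements; the attacker speed of 10^9 keys per second is an assumption for the sake of the arithmetic:

```python
# Worst-case brute-force time for a key of a given bit length,
# assuming a hypothetical attacker testing 10**9 keys per second.
RATE = 10**9  # keys per second (illustrative assumption)

def brute_force_seconds(bits: int) -> float:
    """Time to exhaust a keyspace of 2**bits keys at RATE keys/second."""
    return (2 ** bits) / RATE

# A 40-bit keyspace falls in under 20 minutes on this single machine...
minutes_40 = brute_force_seconds(40) / 60
# ...while a 128-bit keyspace takes astronomically long.
years_128 = brute_force_seconds(128) / (3600 * 24 * 365)
```

Each added bit doubles the attacker's work, which is why the recommendation below argues for minimum mandates rather than upper limits.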
&lt;p&gt;When regulating on encryption, we recommend that the government only 
set positive minimum mandates for the storage and transmission of data, 
and not set upper limits on the number of bits or on the quality of 
the cryptographic method. In pursuance of the same goals, we also 
recommend adding point ‘iii’ to 3.1 (b): “promoting the use of 
encryption in private communication by providing positive minimum 
mandates for strong encryption in (or along with) the data protection 
framework.”&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;&lt;em&gt;3.2 (a) Recognising the need to uphold the core principles of net neutrality&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;Like other goals of the draft policy, the target for ensuring and 
enforcing net neutrality principles has been set as 2022. However, this 
goal is achievable by as early as December 2018. We suggest that the 
Government take the first step towards this goal by accepting the net 
neutrality principles proposed by the TRAI and its recommendations to 
the government which have been pending with the Ministry since November 
2017. The government may additionally take into consideration &lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/cis-position-on-net-neutrality"&gt;CIS’ 
position&lt;/a&gt; on &lt;a class="external-link" href="https://cis-india.org/internet-governance/resources/net-neutrality/2015-06-29_PositionPaperonNetNeutralityinIndia/view"&gt;net neutrality&lt;/a&gt;.&amp;nbsp;&lt;/p&gt;
&lt;p&gt;The vaguely worded “appropriate exclusions and exceptions” carved out
 to net-neutrality principles in the policy need urgent elaboration. 
Given the vague boundaries between different control layers in digital communication, it is easy to slip into content regulation, which the government must consciously avoid.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;&lt;em&gt;3.3 (f) ii. Facilitating lawful interception agencies with 
state of the art lawful intercept and analysis systems for 
implementation of law and order and national security&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;There is no clarity in the policy on how the government plans to meet the
 goal of “[f]acilitating lawful interception agencies with state of the 
art lawful intercept and analysis systems for implementation of law and 
order and national security.”&amp;nbsp; It has been &lt;a class="external-link" href="https://ajayshahblog.blogspot.com/2018/05/indias-communication-surveillance.html"&gt;recently suggested&lt;/a&gt; that some 
legal provisions that enable targeted communication surveillance might 
be violative of the privacy guidelines laid out in the recent Supreme 
Court judgment that affirmed the Right to Privacy.&amp;nbsp; Additionally, mass 
surveillance, prima facie, does not meet the “proportionality test.” Therefore, the policy document needs to detail how the Ministry 
will aid intelligence agencies, and whether these interception details 
will be known to ISPs, TSPs and the public via reflection in the various
 License Agreements.&lt;/p&gt;
&lt;div&gt;&amp;nbsp;&lt;/div&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/comments-on-the-draft-digital-communications-policy'&gt;https://cis-india.org/internet-governance/blog/comments-on-the-draft-digital-communications-policy&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Anubha Sinha, Gurshabad Grover and Swaraj Barooah</dc:creator>
    <dc:rights></dc:rights>


   <dc:date>2018-06-14T12:43:10Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/niti-aayog-discussion-paper-an-aspirational-step-towards-india2019s-ai-policy">
    <title>NITI Aayog Discussion Paper: An aspirational step towards India’s AI policy</title>
    <link>https://cis-india.org/internet-governance/blog/niti-aayog-discussion-paper-an-aspirational-step-towards-india2019s-ai-policy</link>
    <description>
        &lt;b&gt;The National Strategy for Artificial Intelligence, a discussion paper on India’s path forward in AI, is a welcome step towards a comprehensive document that reflects the government's AI ambitions. The 115-page discussion paper attempts to be an all-encompassing document looking at a host of AI-related issues including privacy, security, ethics, fairness, transparency and accountability.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;&lt;a class="external-link" href="http://cis-india.org/internet-governance/files/niti-aayog-discussion-paper"&gt;&lt;strong&gt;Download the Report&lt;/strong&gt;&lt;/a&gt;&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;The 115-page discussion paper attempts to be an all-encompassing document looking at a host of AI-related issues including privacy, security, ethics, fairness, transparency and accountability. The paper identifies five focus areas where AI could have a positive impact in India.&lt;/span&gt;&lt;span&gt; It also focuses on reskilling as a response to the potential problem of job loss due to the future large-scale adoption of AI in the job market.&lt;/span&gt;&lt;span&gt; This blog is a follow-up to the comments made by CIS on Twitter&lt;/span&gt;&lt;span&gt; on the paper and seeks to reflect on the National Strategy as a well-researched AI roadmap for India. In doing so, it identifies areas that can be strengthened and built upon.&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Identified Focus Areas for AI Intervention&lt;/strong&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;The paper identifies five focus areas—Healthcare, Agriculture, Education, Smart Cities and Infrastructure, and Smart Mobility and Transportation—which Niti Aayog believes will benefit most from the use of AI in bringing about social welfare for the people of India.&lt;/span&gt;&lt;span&gt; Although these sectors are essential in the development of a nation, the failure to include the manufacturing and services sectors is an oversight. Focussing on manufacturing is fundamental not only in terms of economic development and user base, but also regarding questions of safety and the impact of AI on jobs and economic security. The same holds true for the services sector, particularly since AI products are being made for the use of consumers, not just businesses. Use of AI in the services sector also raises critical questions about user privacy and ethics. Another sector the paper fails to include is defense; this is worrying since India is chairing the Group of Governmental Experts &lt;/span&gt;&lt;span&gt;on Lethal Autonomous Weapons Systems (LAWS) in 2018.&lt;/span&gt;&lt;span&gt; Across sectors, the report fails to look at how AI could be utilised to ensure accessibility and inclusion for the disabled. This is surprising, as aid for the differently abled and accessibility technology was one of the 10 domains identified in the Task Force Report on AI published earlier this year. &lt;/span&gt;&lt;span&gt;This should have been a focus point in the paper as it aims to identify applications with maximum social impact and inclusion.&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;In its vision for the use of AI in smart cities, the&lt;/span&gt;&lt;span&gt; paper suggests the adoption of a sophisticated surveillance system as well as the use of social media intelligence platforms to check and monitor people’s movement both online and offline to maintain public safety.&lt;/span&gt;&lt;span&gt; This is at variance with constitutional standards of due process and criminal law principles of reasonable ground and reasonable suspicion. Further, use of such methods will pose issues of judicial inscrutability. From a rights perspective, state surveillance can directly interfere with fundamental rights including privacy, freedom of expression, and freedom of assembly. Privacy organizations around the world have raised concerns regarding increased public surveillance through the use of AI.&lt;/span&gt;&lt;span&gt; Though the paper recognized the impact on privacy that such uses would have, it failed to set a strong and forward-looking position on the issue, such as advocating that such surveillance must be lawful and in line with international human rights norms.&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;&lt;strong&gt;Harnessing the Power of AI and Accelerating Research&lt;/strong&gt;&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;One of the ways suggested for the proliferation of AI in India was to increase research, both core and applied, to bring about innovation that can be commercialised.&lt;/span&gt;&lt;span&gt; In order to attain this goal, the paper proposes a two-tier integrated approach: the establishment of COREs (Centres of Research Excellence in Artificial Intelligence) and ICTAI (International Centre for Transformational Artificial Intelligence).&lt;/span&gt;&lt;span&gt; However, the roadmap to increase research in AI fails to acknowledge the principles of publicly funded research such as free and open source software (FOSS), open standards and open data. The report also blames the current Indian Intellectual Property regime for being “unattractive” and averse to incentivising research and adoption of AI.&lt;/span&gt;&lt;span&gt; Section 3(k) of the Patents Act exempts algorithms from being patented, and the Computer Related Inventions (CRI) Guidelines have faced much controversy over the patentability of mere software without a novel hardware component.&lt;/span&gt;&lt;span&gt; The paper provides no concrete answers to the question of whether it should be permissible to patent algorithms, and if yes, to what extent. Furthermore, there needs to be a standard, either in the CRI Guidelines or the Patents Act, that distinguishes between AI algorithms and non-AI algorithms. Additionally, given that there is no historical precedent for the requirement of patent rights to incentivise the creation of AI, innovative investment protection mechanisms that have lesser negative externalities, such as compensatory liability regimes,&lt;/span&gt;&lt;span&gt; would be more desirable. The report further failed to look at the issue holistically and recognize that facilitating rampant patenting can form a barrier preventing smaller companies from using or developing AI. This is important to be cognizant of given the central role of startups in the AI ecosystem in India and because it can work against the larger goal of inclusion articulated by the report.&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;&lt;strong&gt;Ethics, Privacy, Security and Safety&lt;/strong&gt;&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;In a positive step forward, the paper addresses a broader range of ethical issues concerning AI, including transparency, fairness, privacy, security and safety, in more detail when compared to the earlier report of the Task Force.&lt;/span&gt;&lt;span&gt; Yet despite a dedicated section covering these issues, a number of concerns still remain unanswered.&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;&lt;strong&gt;Transparency&lt;/strong&gt;&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;The section on transparency and opening the Black Box has several lacunae.&lt;/span&gt;&lt;span&gt; First, AI that is used by the government must, to an acceptable extent, be available in the public domain for audit, if not released as Free and Open Source Software (FOSS). This should hold true in particular for uses that impinge on fundamental rights. Second, if the AI is utilised in the private sector, there currently exists a right to reverse engineer within the Indian Copyright Act,&lt;/span&gt;&lt;span&gt; which is not accounted for in the paper. Furthermore, if the AI was involved either in the commission of a crime or the violation of human rights, or in the investigation of such transgressions, questions with regard to judicial scrutability of the AI remain. In addition to explainability, the source code must be made circumstantially available, since explainable AI&lt;/span&gt;&lt;span&gt; alone cannot solve all the problems of transparency. In addition to availability of source code and explainability, a greater discussion is needed about the tradeoff between a complex and potentially more accurate AI system (with more layers and nodes) vs. an AI system which is potentially not as accurate but is able to provide a human-readable explanation.&lt;/span&gt;&lt;span&gt; It is interesting to note that transparency within human-AI interaction is absent in the paper. Key questions on transparency, such as whether an AI should disclose its identity to a human, have not been answered.&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;&lt;strong&gt;Fairness&lt;/strong&gt;&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;With regard to fairness, the paper mentions how AI can amplify bias in data and create unfair outcomes.&lt;/span&gt;&lt;span&gt; However, the paper neither suggests detailed or satisfactory solutions nor does it deal with biased historical data in an Indian context. More specifically, there seems to be no mention of regulatory tools to tackle the problem of fairness, such as:&lt;/span&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;span&gt;Self-certification&lt;/span&gt;&lt;/li&gt;
&lt;li&gt;&lt;span&gt;Certification by a self-regulatory body&lt;/span&gt;&lt;/li&gt;
&lt;li&gt;&lt;span&gt;Discrimination impact assessments&lt;/span&gt;&lt;/li&gt;
&lt;li&gt;&lt;span&gt;Investigations by the privacy regulator &lt;/span&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;span&gt;Such tools will need to proactively ensure&lt;/span&gt;&lt;span&gt; inclusion, diversity, and equity in composition and decisions.&lt;/span&gt;&lt;/p&gt;
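To make concrete what a discrimination impact assessment from the list above might measure, the sketch below computes a simple demographic-parity gap, i.e. the difference in favourable-outcome rates between groups. The function name and toy data are hypothetical illustrations, not drawn from the paper:

```python
def demographic_parity_gap(outcomes, groups):
    """Largest difference in positive-outcome rates across groups.

    outcomes: iterable of 0/1 decisions (1 = favourable)
    groups:   iterable of group labels, aligned with outcomes
    """
    rates = {}
    for g in set(groups):
        decisions = [o for o, gg in zip(outcomes, groups) if gg == g]
        rates[g] = sum(decisions) / len(decisions)
    return max(rates.values()) - min(rates.values())

# Toy data: group A receives a favourable decision 3/4 of the time,
# group B only 1/4 of the time.
outcomes = [1, 1, 0, 1, 0, 0, 1, 0]
groups = ["A", "A", "A", "A", "B", "B", "B", "B"]
gap = demographic_parity_gap(outcomes, groups)  # 0.75 - 0.25 = 0.5
```

A regulator or self-regulatory body could flag systems whose gap exceeds a threshold for further review; demographic parity is only one of several competing fairness definitions, as the next paragraphs note.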
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Additionally, with reference to correcting bias in AI, it should be noted that the technocratic view that as an AI solution continues to be trained on larger amounts of data, systems will self-correct, does not fully recognize the importance of data quality and data curation, and is inconsistent with fundamental rights. Policy objectives of AI innovation must be technologically nuanced and cannot be at the cost of intermediary denial of rights and services.&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Further, the paper does not deal with the issues of multiple definitions and principles of fairness, or with the fact that building definitions into AI systems may often involve choosing one definition over another. For instance, it can be argued that the set of AI ethical principles articulated by Google&lt;/span&gt;&lt;span&gt; are more consequentialist in nature, involving a cost-benefit analysis, whereas a human rights approach may be more deontological in nature. In this regard, there is a need for interdisciplinary research involving computer scientists, statisticians, ethicists and lawyers.&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;&lt;strong&gt;Privacy&lt;/strong&gt;&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Though the paper underscores the importance of privacy and the need for privacy legislation in India, it limits the potential privacy concerns arising from AI to collection, inappropriate use of data, personal discrimination, unfair gain from insights derived from consumer data (the solution being to explain to consumers the value they gain as consumers), and unfair competitive advantage from collecting mass amounts of data (which is not directly related to privacy).&lt;/span&gt;&lt;span&gt; In this way the paper fails to discuss the full implications AI might have on privacy and fails to address the data rights necessary to enable the right to privacy in a society where AI is pervasive. The paper fails to engage with emerging principles from data protection, such as the right to explanation and the right to opt out of automated processing, which directly relate to AI. Further, there is no discussion of issues such as data minimisation and purpose limitation, which some big data and AI proponents argue against. To that extent, there is a lack of appreciation of the difficult policy questions concerning privacy and AI. The paper is also completely silent on redress and remedy. Further, the paper endorses the seven data protection principles postulated by the Justice Srikrishna Committee.&lt;/span&gt;&lt;span&gt; However, CIS has pointed out that these principles are generic and not specific to data protection.&lt;/span&gt;&lt;span&gt; Moreover, the law chapter of IEEE’s ‘&lt;/span&gt;&lt;em&gt;&lt;span&gt;Global Initiative on Ethics of Autonomous and Intelligent Systems’&lt;/span&gt;&lt;/em&gt;&lt;span&gt; has been ignored in favor of the chapter on ‘&lt;/span&gt;&lt;em&gt;&lt;span&gt;Personal Data and Individual Access Control in Ethically Aligned Design&lt;/span&gt;&lt;/em&gt;&lt;span&gt;’&lt;/span&gt;&lt;span&gt; as the recommended international standard.&lt;/span&gt;&lt;span&gt; Ideally, both chapters should be recommended for a holistic approach to the issue of ethics and privacy with respect to AI.&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;&lt;strong&gt;AI Regulation and Sectoral Standards&lt;/strong&gt;&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;The discussion paper’s approach towards sectoral regulation advocates collaboration with industry to formulate regulatory frameworks for each sector. However, the paper is silent on the possibility of reviewing existing sectoral regulation to understand whether it requires amendment. We believe that this is an important solution to consider, since amending existing regulation and standards often takes less time than formulating and implementing new regulatory frameworks.&lt;/span&gt;&lt;span&gt; Furthermore, although the emphasis on awareness in the paper is welcome, it must complement regulation and be driven by all stakeholders, especially given India’s limited regulatory budget. Over-reliance on industry self-regulation, by itself, is not advisable, as there is an absence of robust industry governance bodies in India and self-regulation raises questions about the strength and enforceability of such practices. The privacy debate in India has recognized this, and reports like the Report of the Group of Experts on Privacy recommend a co-regulatory framework with industry developing binding standards that are in line with the national privacy law and that are approved and enforced by the Privacy Commissioner.&lt;/span&gt;&lt;span&gt; That said, the UN Guiding Principles on Business and Human Rights and its “protect, respect, and remedy” framework should guide any self-regulatory action.&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;&lt;strong&gt;Security and Safety of AI Systems&lt;/strong&gt;&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;In terms of the security and safety of AI systems, the paper seeks to shift the discussion of accountability from being primarily about liability to one about the explainability of AI.&lt;/span&gt;&lt;span&gt; Furthermore, there is no recommendation of immunities or incentives for whistleblowers or researchers to report on privacy breaches and vulnerabilities. The report also does not recognize certain uses of AI as being more critical than others because of their potential harm to humans. This would include uses in healthcare and autonomous transportation. A key component of accountability in these sectors will be the evolution of appropriate testing and quality assurance standards. Only then should safe harbours be discussed as an extension of the negligence test for damages caused by AI software. Additionally, the paper fails to recommend kill switches, which should be mandatory for all kinetic AI systems.&lt;/span&gt;&lt;span&gt; Finally, there is no mention of a mandatory human-in-the-loop in all systems where there are significant risks to safety and human rights. Autonomous AI is only viewed as an economic boost, but its potential risks have not been explored sufficiently. A welcome recommendation would be for all autonomous AI to go through human rights impact assessments.&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;&lt;strong&gt;Research and Education&lt;/strong&gt;&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Being a government think-tank, the NITI Aayog could have dealt in detail with the AI policies of the government and looked at how different arms of the government are aiming to leverage AI and tackle the problems arising out of its use. Beyond tabulating the government’s role in each area, especially research, the report could also have listed the various areas where each department could play a role in the AI ecosystem through regulation, education, funding research, etc. In terms of the recommendations for introducing AI curricula in schools and colleges,&lt;/span&gt;&lt;span&gt; the government could also ensure that ethics and rights are part of the curriculum, especially in technical institutions. A possible course of action could include corporations paying for a pan-Indian AI education campaign. This would also require the government to formulate the required academic curriculum, updated to include rights and ethics.&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;&lt;strong&gt;Data Standards and Data Sharing&lt;/strong&gt;&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Based on the amount of data the Government of India collects through its numerous schemes, it has the potential to be the largest aggregator of data specific to India. However, the paper does not consider the use of this data with enough gravity. For example, the paper recommends Corporate Data Sharing for “social good” and making government datasets from the social sector available publicly.&lt;/span&gt;&lt;span&gt; Yet this section does not mention privacy-enhancing technologies and standards such as pseudonymization, anonymization standards, differential privacy, etc. Additionally, there should be provisions that allow the government to prevent the formation of monopolies by regulating companies that hoard user data. The open data standards could also be made applicable to private companies, so that they can share their data in compliance with the privacy-enhancing technologies mentioned above. The paper also acknowledges that AI Marketplaces require monitoring and maintenance of quality. It recognises the need for “continuous scrutiny of products, sellers and buyers”&lt;/span&gt;&lt;span&gt;, and proposes that the government enable these regulations in a manner that private players could set up the marketplace. This is a welcome suggestion, but the legal and ethical framework of the AI Marketplace requires further discussion and clarification.&lt;/span&gt;&lt;/p&gt;
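As one concrete example of the privacy-enhancing techniques this section argues for, here is a minimal sketch of a differentially private count using the Laplace mechanism, a standard construction in the differential-privacy literature; the function name and parameters are our own illustrations, not anything proposed in the paper:

```python
import random

def dp_count(records, epsilon: float) -> float:
    """Release the count of records with Laplace(1/epsilon) noise.

    A counting query has sensitivity 1, so adding noise drawn from
    Laplace(0, 1/epsilon) yields epsilon-differential privacy.
    The difference of two Exp(epsilon) draws is Laplace(0, 1/epsilon),
    which avoids edge cases of inverse-CDF sampling.
    """
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return len(records) + noise

# Smaller epsilon -> more noise -> stronger privacy, lower accuracy.
noisy_total = dp_count(["row"] * 1000, epsilon=0.1)
```

A government dataset released through such a mechanism lets analysts learn aggregate statistics while bounding what can be inferred about any single individual's record.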
&lt;p&gt;&lt;span&gt;&lt;strong&gt;An AI Garage for Emerging Economies&lt;/strong&gt;&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;The discussion paper also describes India as an “ideal test-bed”&lt;/span&gt;&lt;span&gt; for trying out AI-related solutions. This is problematic, since questions of AI regulation in India have yet to be legally clarified and defined, and India does not have a comprehensive privacy law. Without a strong ethical and regulatory framework, the use of new and possibly untested technologies in India could lead to unintended and possibly harmful outcomes. The government's ambition to position India as a leader amongst developing countries on AI-related issues should not be achieved by using Indians as test subjects for technologies whose effects are unknown.&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;&lt;strong&gt;Conclusion&lt;/strong&gt;&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;In conclusion, NITI Aayog’s discussion paper represents a welcome step towards a comprehensive AI strategy for India. However, the trend of inconspicuously releasing reports (both this one and the AI Task Force report), combined with the lack of a call for public comments, seems the wrong way to foster discussion on emerging technologies that will be as pervasive as AI.&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;The paper offers blanket recommendations without examining their viability in each sector.&lt;/span&gt;&lt;span&gt; Furthermore, it does not sufficiently explore or, at times, completely omits key areas. It barely touches upon societal, cultural and sectoral challenges to the adoption of AI, research that CIS is currently in the process of undertaking. &lt;/span&gt;&lt;span&gt;Future reports on Indian AI strategy should pay more attention to the country’s unique legal context and to possible defence applications, and should take the opportunity to establish a forward-looking, human-rights-respecting, and holistic position in global discourse and developments. Reports should also consider infrastructure investment as an important prerequisite for AI development and deployment. Digitised data and connectivity, as well as more basic infrastructure such as rural electricity and well-maintained roads, require more funding if AI is to be leveraged for inclusive economic growth. Despite these concerns, the discussion paper is an aspirational step toward India’s AI strategy.&lt;/span&gt;&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/niti-aayog-discussion-paper-an-aspirational-step-towards-india2019s-ai-policy'&gt;https://cis-india.org/internet-governance/blog/niti-aayog-discussion-paper-an-aspirational-step-towards-india2019s-ai-policy&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Sunil Abraham, Elonnai Hickok, Amber Sinha, Swaraj Barooah, Shweta Mohandas, Pranav M Bidare, Swagam Dasgupta, Vishnu Ramachandran and Senthil Kumar</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Artificial Intelligence</dc:subject>
    

   <dc:date>2018-06-13T13:08:47Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/economic-times-june-10-2018-sunil-abraham-why-npci-and-facebook-need-urgent-regulatory-attention">
    <title>Why NPCI and Facebook need urgent regulatory attention </title>
    <link>https://cis-india.org/internet-governance/blog/economic-times-june-10-2018-sunil-abraham-why-npci-and-facebook-need-urgent-regulatory-attention</link>
    <description>
        &lt;b&gt;The world’s oldest networked infrastructure, money, is increasingly dematerialising and fusing with the world’s latest networked infrastructure, the Internet. &lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article was published in the &lt;a class="external-link" href="https://economictimes.indiatimes.com/industry/banking/finance/banking/why-npci-and-facebook-need-urgent-regulatory-attention/articleshow/64522587.cms"&gt;Economic Times&lt;/a&gt; on June 10, 2018.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;As network effects compound, disruptive acceleration hurtles us towards financial utopia or dystopia. Our fate depends on what we get right and what we get wrong with the law, the code and architecture, and the market.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Internet, unfortunately, has completely transformed from how it was first architected: from a federated, generative network based on free software and open standards into a centralised environment with an increasing dependency on proprietary technologies.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In countries like Myanmar, some citizens mistake a single social media website, Facebook, for the internet, according to LirneAsia research. India is another market where Facebook could still get its brand mistaken for access itself by some users coming online. This is why Facebook put so many resources into the battle over Free Basics, in the run-up to India’s network neutrality regulation. Facebook is an odd corporation.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;On one hand, its business model is what some term surveillance capitalism. On the other hand, by acquiring WhatsApp and by keeping end-to-end (E2E) encryption “on”, it has ensured that one and a half billion users can concretely exercise their right to privacy. At the time of the acquisition, WhatsApp’s founders believed Facebook’s promise that it would never compromise their high standards of privacy and security. But 18 months later, Facebook started harvesting data and diluting E2E.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In April this year, my colleague Ayush Rathi and I wrote in Asia Times that WhatsApp no longer deletes multimedia on download but continues to store it on its servers. Theoretically, using the very same mechanism, Facebook could also be retaining encrypted text messages and comprehensive metadata from WhatsApp users indefinitely without making this obvious.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;My friend, Srikanth Lakshmanan, founder of the CashlessConsumer collective, is a keen observer of this space. He says in India, “we are seeing an increasing push towards a bank-led model, thanks to National Payments Corporation of India (NPCI) and its control over Unified Payments Interface (UPI), which is also known as the cashless layer of the India Stack.”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;NPCI is best understood as a shape shifter. Arundhati Ramanathan puts it best when she says “depending on the time and context, NPCI is a competitor. It is a platform. It is a regulator. It is an industry association. It is a profitable non-profit. It is a rule maker. It is a judge. It is a bystander.”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This results in UPI becoming, what Lakshmanan calls, a NPCI-club-good rather than a new generation digital public good. He also points out that NPCI has an additional challenge of opacity — “it doesn’t provide any metrics on transaction failures, and being a private body, is not subject to proactive or reactive disclosure requirements under the RTI.”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Technically, he says, UPI increases fragility in our financial ecosystem since it “is a centralised data maximisation network where NPCI will always have the superset of data.” Given that NPCI has opted for a bank-led model in India, it is very unlikely that Facebook will be able to leverage its monopoly in the social media market, or the duopoly it shares with Google in the digital advertising market, to become a digital payments monopoly.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;However, NPCI and Facebook both share the following traits: one, an insatiable appetite for personal information; two, a fetish for hyper-centralisation; three, a marginal commitment to transparency; and four, a poor track record as custodians of consumer trust. The marriage between these like-minded entities has already had a dubious beginning.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Previously, every financial technology company wanting direct access to the NPCI infrastructure had to have a tie-up with a bank. But for Facebook and Google, as they are large players, it was decided to introduce a multi-bank model. This was definitely the right thing to do from a competition perspective. But, unfortunately, the marriage between the banks and the internet giant was arranged by NPCI in an opaque process, and WhatsApp was exempted from the full NPCI certification process for its beta launch.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Both NPCI and Facebook need urgent regulatory attention. A modern data protection law and a more proactive competition regulator are required for Facebook. The NPCI will hopefully also be subject to the upcoming data protection law. But it also requires a range of design, policy and governance fixes: greater privacy and security via data minimisation and decentralisation; greater accountability and transparency to the public; separation of powers for better governance; and open access policies to prevent anti-competitive behaviour.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/economic-times-june-10-2018-sunil-abraham-why-npci-and-facebook-need-urgent-regulatory-attention'&gt;https://cis-india.org/internet-governance/blog/economic-times-june-10-2018-sunil-abraham-why-npci-and-facebook-need-urgent-regulatory-attention&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>sunil</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Social Media</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2018-06-12T02:07:42Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/files/niti-aayog-discussion-paper">
    <title>NITI Aayog Discussion Paper</title>
    <link>https://cis-india.org/internet-governance/files/niti-aayog-discussion-paper</link>
    <description>
        &lt;b&gt;&lt;/b&gt;
        
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/files/niti-aayog-discussion-paper'&gt;https://cis-india.org/internet-governance/files/niti-aayog-discussion-paper&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Admin</dc:creator>
    <dc:rights></dc:rights>


   <dc:date>2018-06-12T01:52:04Z</dc:date>
   <dc:type>File</dc:type>
   </item>




</rdf:RDF>
