<?xml version="1.0" encoding="utf-8" ?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:syn="http://purl.org/rss/1.0/modules/syndication/" xmlns="http://purl.org/rss/1.0/">




    



<channel rdf:about="https://cis-india.org/internet-governance/blog/online-anonymity/search_rss">
  <title>We are anonymous, we are legion</title>
  <link>https://cis-india.org</link>
  
  <description>
    
            These are the search results for the query, showing results 331 to 345.
        
  </description>
  
  
  
  
  <image rdf:resource="https://cis-india.org/logo.png"/>

  <items>
    <rdf:Seq>
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/owasp-seasides-conference"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/about/newsletters/february-2019-newsletter"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/imagine-a-feminist-internet-research-practice-and-policy-in-south-asia"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/vipul-kharbanda-february-25-2019-comments-on-draft-second-protocol-to-convention-on-cybercrime-budapest-convention"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/hindu-businessline-february-19-2019-arindrajit-basu-resurrecting-the-marketplace-of-ideas"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/economic-times-february-20-2019-are-rss-fears-about-tik-tok-true"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/webinar-on-counter-comments-to-the-draft-intermediary-guidelines"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/cis-submission-to-the-un-special-rapporteur-on-freedom-of-speech-and-expression-surveillance-industry-and-human-rights"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/jessica-corbett-common-dreams-february-5-2019-civil-liberties-groups-warn-proposed-eu-terrorist-content-rule-threat-democratic"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/2019-international-asia-conference"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/economic-times-nilanjana-bhowmick-february-13-2019-make-our-digital-backyard-safe"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/zack-whittaker-natasha-lomas-february-15-2019-tech-crunch-even-years-later-twitter-doesnt-delete-your-direct-messages"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/unbox-2019-festival"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/news/hindustan-times-february-10-2019-smriti-kak-ramachandran-and-vidhi-choudhary-willing-to-participate-in-parliamentary-panel-hearing"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/data-infrastructures-inequities-reproductive-health-surveillance-india"/>
        
    </rdf:Seq>
  </items>

</channel>


    <item rdf:about="https://cis-india.org/internet-governance/news/owasp-seasides-conference">
    <title>OWASP Seasides Conference</title>
    <link>https://cis-india.org/internet-governance/news/owasp-seasides-conference</link>
    <description>
        &lt;b&gt;Karan Saini attended the OWASP Seasides security conference held on February 27 and 28, 2019 at Cavelossim, Goa. The event was organized by OWASP Seasides.&lt;/b&gt;
        &lt;p&gt;For conference details &lt;a class="external-link" href="https://www.owaspseasides.com/schedule/workshops"&gt;click here&lt;/a&gt;.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/owasp-seasides-conference'&gt;https://cis-india.org/internet-governance/news/owasp-seasides-conference&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Admin</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Artificial Intelligence</dc:subject>
    

   <dc:date>2019-03-07T23:53:47Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/about/newsletters/february-2019-newsletter">
    <title>February 2019 Newsletter</title>
    <link>https://cis-india.org/about/newsletters/february-2019-newsletter</link>
    <description>
        &lt;b&gt;Our newsletter for the month of February is below.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The CIS &lt;span class="highlightedSearchTerm"&gt;newsletter&lt;/span&gt; aims to  highlight developments in copyright and patent, free speech and  expression, privacy, cyber security, telecom, etc. as well as Industry  4.0, big data, additive manufacturing and so on which are  revolutionizing and moving the digital world forward. Through this &lt;span class="highlightedSearchTerm"&gt;newsletter&lt;/span&gt; we look to engage you with our research and build a strong bond by  bringing you insightful articles and blog posts which will be beneficial  for you and your business. Throughout the year we will send you stories  and insights from our board, staff and community leaders. We welcome  your feedback, suggestions or comments regarding our &lt;span class="highlightedSearchTerm"&gt;newsletter&lt;/span&gt; or any other aspect of our research.&lt;/p&gt;
&lt;hr /&gt;
&lt;h3 style="text-align: justify; "&gt;Highlights for February 2019&lt;/h3&gt;
&lt;ul&gt;
&lt;li style="text-align: justify; "&gt;Maharashtra is a state which is rich in diversity in terms of language and culture seen in its various regions such as Konkan, Marathwada, Western Maharashtra, Northern Maharashtra and Vidarbha. Awareness needs to be created to make Wikimedia movement inclusive and diverse in these geographical regions as well as in their social strata. Keeping this in view CIS-A2K &lt;a class="external-link" href="https://cis-india.org/a2k/blogs/marathi-language-fortnight-workshops-2019"&gt;conducted five workshops&lt;/a&gt; in different parts of the state.&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;Marathi language department of Goa University &lt;a class="external-link" href="https://cis-india.org/a2k/blogs/marathi-wikipedia-workshop-1lib1ref-session-at-goa-university"&gt;has initiated the process to document the culture of Goa on Marathi Wikipedia and Commons&lt;/a&gt;. Subodh Kulkarni reports this in a blog entry.&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;Khetrimayum Monish Singh and Rajiv K. Mishra &lt;a class="external-link" href="https://cis-india.org/raw/media-infrastructures-digital-practices-north-east-of-india-presentation"&gt;co-authored a research paper&lt;/a&gt; which was presented at the Young Scholars International Conference on “Margins and Connections,” organised by the Special Centre for the Study of North East India, Jawaharlal Nehru University, on February 7-8, 2019.&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;Following consultations with data protection, civil society, industry and others, during the Cybercrime Convention Committee (T-CY) meeting from 29 November 2018 onwards, the Cybercrime Convention Committee had sought additional contributions regarding the provisional draft text for a Second Additional Protocol to the Budapest Convention on Cybercrime (“Budapest Convention”). Vipul Kharbanda &lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/vipul-kharbanda-february-25-2019-comments-on-draft-second-protocol-to-convention-on-cybercrime-budapest-convention"&gt;submitted comments on behalf of CIS&lt;/a&gt;.&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;CIS &lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/cis-submission-to-the-un-special-rapporteur-on-freedom-of-speech-and-expression-surveillance-industry-and-human-rights"&gt;responded to the call for submissions&lt;/a&gt; from the UN Special Rapporteur on Freedom of Speech and Expression. The submission was on the Surveillance Industry and Human Rights. &lt;/li&gt;
&lt;li style="text-align: justify; "&gt;CIS and University of Munich, Germany are co-organizing an event&lt;a class="external-link" href="https://cis-india.org/internet-governance/events/internet-speech-perspectives-on-regulation-and-policy"&gt; 'Internet Speech: Perspectives on Regulation and Policy' &lt;/a&gt;at India Habitat Centre on April 5, 2019.&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;Akriti Bopanna on behalf of CIS &lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/akriti-bopanna-february-8-2019-comment-on-icann-draft-fy-20-operating-plan-and-budget"&gt;provided comments on the proposed draft of ICANN’s FY20 Operating Plan and Budget&lt;/a&gt; along with their Five-Year Operating Plan Update. &lt;/li&gt;
&lt;li style="text-align: justify; "&gt;Harsh Bajpai, Ambika Tandon, and Amber Sinha have co-authored a case study '&lt;a class="external-link" href="https://cis-india.org/internet-governance/future-of-work-in-automotive-sector.pdf"&gt;The Future of Work in the Automotive Sector in India&lt;/a&gt;'. The case study highlights the impact of technologies such as artificial intelligence, industry 4.0, internet of things, and so on at industry workplace. The case study was edited by Rakhi Sehgal. Manav Mehta provided research assistance.&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;In a response to the &lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/response-to-the-draft-of-the-information-technology-intermediary-guidelines-amendment-rules-2018"&gt;Draft of The Information Technology [Intermediary Guidelines (Amendment) Rules] 2018&lt;/a&gt;, CIS examined whether the draft rules meet tests of constitutionality and whether they are consistent with the parent Act. The submission also examined potential harms that may arise from the Rules as they are currently framed and make recommendations to the draft rules that may enable government to meet its objectives while remaining situated within the constitutional ambit. &lt;/li&gt;
&lt;li style="text-align: justify; "&gt;A UN high-level panel on Digital Cooperation issued a call for inputs that called for responses to various questions. &lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/cis-submission-to-un-high-level-panel-on-digital-cooperation"&gt;CIS responded to the call for inputs&lt;/a&gt;. The response was drafted by Aayush Rathi, Ambika Tandon, Arindrajit Basu and Elonnai Hickok. &lt;/li&gt;
&lt;/ul&gt;
&lt;h3&gt;CIS and the News&lt;/h3&gt;
&lt;p&gt;The following news pieces were authored by CIS and published on its website in February:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/epw-engage-february-9-2019-data-infrastructures-inequities-why-does-reproductive-health-surveillance-india-need-urgent-attention"&gt;Data Infrastructures and Inequities: Why Does Reproductive Health Surveillance in India Need Our Urgent Attention?&lt;/a&gt; (Aayush Rathi and Ambika Tandon; EPW Engage , Vol. 54, Issue No. 6, February 9, 2019).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/business-standard-february-9-2019-sunil-abraham-intermediary-liability-law-needs-updating"&gt;Intermediary liability law needs updating&lt;/a&gt; (Sunil Abraham; Business Standard; February 9, 2019).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/hindu-businessline-february-19-2019-arindrajit-basu-resurrecting-the-marketplace-of-ideas"&gt;Resurrecting the marketplace of ideas&lt;/a&gt; (Arindrajit Basu; Hindu Businessline; February 22, 2019).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="http://https//cis-india.org/raw/indian-express-nishant-shah-february-24-2019-what-i-learned-from-going-offline-for-48-hours"&gt;&lt;span class="external-link"&gt;What I learned from going offline for 48 hours&lt;/span&gt;&lt;/a&gt; (Nishant Shah; Indian Express; February 24, 2019).&lt;/li&gt;
&lt;/ul&gt;
&lt;h3&gt;CIS in the News&lt;/h3&gt;
&lt;p&gt;CIS was quoted in these news articles published elsewhere:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/news/jessica-corbett-common-dreams-february-5-2019-civil-liberties-groups-warn-proposed-eu-terrorist-content-rule-threat-democratic"&gt;Civil Liberties Groups Warn Proposed EU 'Terrorist Content' Rule a Threat to Democratic Values&lt;/a&gt; (Jessica Corbett; Common Dreams; February 5, 2019).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/news/hindustan-times-february-10-2019-smriti-kak-ramachandran-and-vidhi-choudhary-willing-to-participate-in-parliamentary-panel-hearing"&gt;‘Willing to participate, but need more time’: Twitter on parliamentary panel hearing&lt;/a&gt; (Smriti Kak Ramachandran and Vidhi Choudhary; February 10, 2019).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/news/livemint-abhijit-ahaskar-february-12-2019-what-the-governments-draft-it-intermediary-guidelines-say"&gt;What the government's draft IT intermediary guidelines say&lt;/a&gt; (Abhijit Ahaskar; Livemint; February 12, 2019).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/news/economic-times-nilanjana-bhowmick-february-13-2019-make-our-digital-backyard-safe"&gt;Make our digital backyard safe&lt;/a&gt; (Nilanjana Bhowmick; Economic Times; February 13, 2019).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/news/zack-whittaker-natasha-lomas-february-15-2019-tech-crunch-even-years-later-twitter-doesnt-delete-your-direct-messages"&gt;Even years later, Twitter doesn't delete your direct messages&lt;/a&gt; (Zack Whittaker and Natasha Lomas; Tech Crunch; February 15, 2019).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/news/economic-times-february-20-2019-are-rss-fears-about-tik-tok-true"&gt;Are RSS's fears about Tik Tok true? Here's what you should know&lt;/a&gt; (Economic TimZack Whittaker and Natasha Lomases; February 20, 2019). Also published in Moneycontrol News on the same day.&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/news/gulf-times-february-24-2019-dr-r-seetharaman-risk-integration-is-key-to-better-cybersecurity-management"&gt;Risk integration is key to better cybersecurity management&lt;/a&gt; (Dr. R. Seetharaman; Gulf Times; February 24, 2019).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/news/kashmir-watch-february-25-2019-any-failure-to-resolve-the-kashmir-problem-could-lead-the-south-asia-to-a-nuclear-disaster"&gt;Any failure to resolve the Kashmir problem could lead the South Asia to a nuclear disaster&lt;/a&gt; (Kashmir Watch; February 25, 2019).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/news/times-of-india-pti-february-28-2019-over-30-organisations-industry-bodies-oppose-proposal-to-ban-vape-content"&gt;Over 30 organisations, industry bodies oppose proposal to ban vape content&lt;/a&gt; (Times of India; February 28, 2019).&lt;br /&gt;.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;&lt;a class="external-link" href="https://cis-india.org/a2k"&gt;Access to Knowledge&lt;/a&gt;&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;Our Access to Knowledge programme currently consists of                  two projects. The Pervasive Technologies project,                  conducted under a grant from the International                  Development Research Centre (IDRC), aims to conduct                  research on the complex interplay between low-cost                  pervasive technologies and intellectual property, in                  order to encourage the proliferation and development of                  such technologies as a social good. The Wikipedia                  project, which is under a grant from the Wikimedia                  Foundation, is for the growth of Indic language                  communities and projects by designing community                  collaborations and partnerships that recruit and                  cultivate new editors and explore innovative approaches                  to building projects.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Wikipdedia&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;As part of the &lt;a href="http://cis-india.org/a2k/access-to-knowledge-program-plan"&gt;project                   grant from the Wikimedia Foundation&lt;/a&gt; we have                 reached out to more than 3500 people across  India by                 organizing more than 100 outreach events and  catalysed                 the release of encyclopaedic and other content  under the                 Creative Commons (CC-BY-3.0) license in four  Indian                 languages (21 books in Telugu, 13 in Odia, 4  volumes of                 encyclopaedia in Konkani and 6 volumes in  Kannada, and 1                 book on Odia language history in  English).&lt;/p&gt;
&lt;p&gt;&lt;b&gt;Blog Entries&lt;/b&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/a2k/blogs/marathi-wikipedia-workshop-1lib1ref-session-at-goa-university"&gt;Marathi Wikipedia Workshop &amp;amp; 1lib1ref session at Goa University&lt;/a&gt; (Subodh Kulkarni; February 1, 2019).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/a2k/blogs/marathi-language-fortnight-workshops-2019"&gt;Marathi Language Fortnight Workshops 2019&lt;/a&gt; (Subodh Kulkarni February 26, 2019).&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;b&gt;Media Coverage&lt;/b&gt;&lt;/p&gt;
&lt;div&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/a2k/news/times-of-india-february-20-2019-goa-university-students-update-goa-marathi-articles-on-wikipedia"&gt;Goa University students update ‘Goa’ Marathi articles on Wikipedia&lt;/a&gt; (Times of India; February 20, 2019).&lt;/li&gt;
&lt;/ul&gt;
&lt;/div&gt;
&lt;h2&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance"&gt;Internet Governance&lt;/a&gt;&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;As part of its research on privacy and free speech, CIS is engaged with two different projects. The first one (under a grant from Privacy International and IDRC) is on surveillance and freedom of expression (SAFEGUARDS). The second one (under a grant from MacArthur Foundation) is on restrictions that the Indian government has placed on freedom of expression online.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Cyber Security&lt;/h3&gt;
&lt;p&gt;&lt;b&gt;Submission&lt;/b&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/vipul-kharbanda-february-25-2019-comments-on-draft-second-protocol-to-convention-on-cybercrime-budapest-convention"&gt;Comments on the Draft Second Protocol to the Convention on Cybercrime (Budapest Convention)&lt;/a&gt; (Vipul Kharbanda; February 25, 2019).&lt;/li&gt;
&lt;/ul&gt;
&lt;div&gt;
&lt;h3&gt;Privacy&lt;/h3&gt;
&lt;p&gt;&lt;b&gt;Submissions&lt;/b&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/cis-submission-to-un-high-level-panel-on-digital-cooperation"&gt;CIS Submission to UN High Level Panel on Digital Cooperation&lt;/a&gt; (Aayush Rathi, Ambika Tandon, Arindrajit Basu and Elonnai Hickok; February 7, 2019).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/cis-submission-to-the-un-special-rapporteur-on-freedom-of-speech-and-expression-surveillance-industry-and-human-rights"&gt;CIS Submission to the UN Special Rapporteur on Freedom of Speech and Expression: Surveillance Industry and Human Rights&lt;/a&gt; (Elonnai Hickok, Arindrajit Basu, Gurshabad Grover, Akriti Bopanna, Shweta Mohandas and Martyna Kalvaityte; February 20, 2019). &lt;/li&gt;
&lt;/ul&gt;
&lt;div&gt;&lt;b&gt;Participation in Event&lt;/b&gt;&lt;/div&gt;
&lt;div&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/news/participation-in-the-meeting-of-bis-litd-17"&gt;BIS LITD 17&lt;/a&gt; (Organized by the Bureau of Indian Standards; February 26, 2019). Gurshabad Grover participated in the meeting conducted online.&lt;/li&gt;
&lt;/ul&gt;
&lt;/div&gt;
&lt;h3&gt;Gender&lt;/h3&gt;
&lt;p&gt;&lt;b&gt;Workshop Organized&lt;/b&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/unbox-2019-festival"&gt;Unbox Festival 2019: CIS organizes two Workshops&lt;/a&gt; (Organized by CIS; Bangalore; February 15 - 17, 2019). CIS organized two workshops on What is your Feminist Infrastructure Wishlist? and AI for Good.&lt;/li&gt;
&lt;/ul&gt;
&lt;div&gt;&lt;b&gt;Participation in Events&lt;/b&gt;&lt;/div&gt;
&lt;div&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/news/imagine-a-feminist-internet-research-practice-and-policy-in-south-asia"&gt;Imagine a Feminist Internet: Research, Practice and Policy in South Asi&lt;/a&gt;a (Organized by Internet Democracy Project and Point of View; Sri Lanka; February 21 - 22, 2019). Ambika Tandon was a speaker and presented a paper 'Framing Reproductive Health as a Data Problem? Unpacking ‘Dataveillance’ in India' which was co-authored by herself and Aayush Rathi. &lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/news/firn-convening-design"&gt;Feminist Internet Research Network (FIRN) Convening Design&lt;/a&gt; (Organized by Association for Progressive Communications; Malaysia; February 27 - March 1, 2019). Ambika Tandon attended the event.&lt;/li&gt;
&lt;/ul&gt;
&lt;/div&gt;
&lt;h3&gt;Free Speech and Expression&lt;/h3&gt;
&lt;p&gt;&lt;b&gt;Submissions&lt;/b&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/response-to-the-draft-of-the-information-technology-intermediary-guidelines-amendment-rules-2018"&gt;Response to the Draft of The Information Technology [Intermediary Guidelines (Amendment) Rules] 2018&lt;/a&gt; (Gurshabad Grover, Elonnai Hickok, Arindrajit Basu and Akriti Bopanna; February 7, 2019).&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/akriti-bopanna-february-8-2019-comment-on-icann-draft-fy-20-operating-plan-and-budget"&gt;CIS Comment on ICANN's Draft FY20 Operating Plan and Budget&lt;/a&gt; (Akriti Bopanna; February 12, 2019).&lt;/li&gt;
&lt;/ul&gt;
&lt;div&gt;&lt;b&gt;Upcoming Event&lt;/b&gt;&lt;/div&gt;
&lt;div&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/events/internet-speech-perspectives-on-regulation-and-policy"&gt;Internet Speech: Perspectives on Regulation and Policy&lt;/a&gt; (Organized by CIS and the University of Munich (LMU), Germany; India Habitat Centre, New Delhi; April 5, 2019).&lt;/li&gt;
&lt;/ul&gt;
&lt;/div&gt;
&lt;div&gt;&lt;b&gt;Participation in Event&lt;/b&gt;&lt;/div&gt;
&lt;div&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/news/webinar-on-counter-comments-to-the-draft-intermediary-guidelines"&gt;Webinar on counter-comments to the draft Intermediary Guidelines&lt;/a&gt; (Organized by CCAOI and the ISOC Delhi Chapter; February 11, 2019). Gurshabad Grover attended the event.&lt;/li&gt;
&lt;/ul&gt;
&lt;/div&gt;
&lt;h3&gt;Artificial Intelligence&lt;/h3&gt;
&lt;p&gt;&lt;b&gt;Case Study&lt;/b&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/blog/harsh-bajpai-ambika-tandon-and-amber-sinha-february-8-2018-the-future-of-work-in-automotive-sector-in-india"&gt;The Future of Work in the Automotive Sector in India&lt;/a&gt; (Harsh Bajpai, Ambika Tandon and Amber Sinha; February 8, 2019). Case study was edited by Rakhi Sehgal.&lt;/li&gt;
&lt;/ul&gt;
&lt;div&gt;&lt;b&gt;Participation in Event&lt;/b&gt;&lt;/div&gt;
&lt;div&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/news/2019-international-asia-conference"&gt;2019 International Asia Conference &lt;/a&gt;(Organized by ITECHLAW; Bangalore; January 31 - February 1, 2019). Sunil Abraham was a panelist in the session "Policy Making for the Emerging Tech in India". &lt;/li&gt;
&lt;/ul&gt;
&lt;/div&gt;
&lt;/div&gt;
&lt;h2&gt;&lt;a class="external-link" href="https://cis-india.org/raw"&gt;Researchers at Work (RAW)&lt;/a&gt;&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;The Researchers at Work (RAW) programme is an interdisciplinary research initiative driven by an emerging need to understand the reconfigurations of social practices and structures through the Internet and digital media technologies, and vice versa. It aims to produce local and contextual accounts of interactions, negotiations, and resolutions between the Internet, and socio-material and geo-political processes:&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Research Paper&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="external-link" href="https://cis-india.org/raw/media-infrastructures-digital-practices-north-east-of-india-presentation"&gt;Media Infrastructures and Digital Practices: Case Studies from the North East of India&lt;/a&gt; (Khetrimayum Monish Singh; February 5, 2019).&lt;/li&gt;
&lt;/ul&gt;
&lt;hr /&gt;
&lt;h2&gt;&lt;a class="external-link" href="http://cis-india.org/"&gt;About CIS&lt;/a&gt;&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;The Centre for Internet and  Society  (CIS) is a non-profit organisation that undertakes  interdisciplinary  research on internet and digital technologies from  policy and academic  perspectives. The areas of focus include digital  accessibility for  persons with disabilities, access to knowledge,  intellectual property  rights, openness (including open data, free and  open source software,  open standards, open access, open educational  resources, and open  video), internet governance, telecommunication  reform, digital privacy,  and cyber-security. The academic research at  CIS seeks to understand  the reconfigurations of social and cultural  processes and structures as  mediated through the internet and digital  media technologies.&lt;/p&gt;
&lt;p&gt;► Follow us elsewhere&lt;/p&gt;
&lt;div&gt;
&lt;ul&gt;
&lt;li&gt;Twitter: &lt;a href="http://twitter.com/cis_india"&gt;http://twitter.com/cis_india&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;Twitter - Access to Knowledge: &lt;a href="https://twitter.com/CISA2K"&gt;https://twitter.com/CISA2K&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;Twitter - Information Policy: &lt;a href="https://twitter.com/CIS_InfoPolicy"&gt;https://twitter.com/CIS_InfoPolicy&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;Facebook - Access to Knowledge: &lt;a href="https://www.facebook.com/cisa2k"&gt;https://www.facebook.com/cisa2k&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;E-Mail - Access to Knowledge: &lt;a&gt;a2k@cis-india.org&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;E-Mail - Researchers at Work: &lt;a&gt;raw@cis-india.org&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;List - Researchers at Work: &lt;a href="https://lists.ghserv.net/mailman/listinfo/researchers"&gt;https://lists.ghserv.net/mailman/listinfo/researchers&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;/div&gt;
&lt;p&gt;► Support Us&lt;/p&gt;
&lt;div&gt;Please help us defend consumer and citizen rights on the Internet! Write a cheque in favour of 'The Centre for Internet and Society' and mail it to us at No. 194, 2nd 'C' Cross, Domlur, 2nd Stage, Bengaluru - 560071.&lt;/div&gt;
&lt;p&gt;► Request for Collaboration&lt;/p&gt;
&lt;div&gt;
&lt;p style="text-align: justify; "&gt;We invite researchers, practitioners,  artists, and theoreticians,  both organisationally and as individuals,  to engage with us on topics  related internet and society, and improve  our collective understanding  of this field. To discuss such  possibilities, please write to Sunil  Abraham, Executive Director, at sunil@cis-india.org (for policy research), or Sumandro Chattapadhyay, Research Director, at sumandro@cis-india.org  (for  academic research), with an indication of the form and the  content of  the collaboration you might be interested in. To discuss  collaborations  on Indic language Wikipedia projects, write to Tanveer  Hasan, Programme  Officer, at &lt;a&gt;tanveer@cis-india.org&lt;/a&gt;.&lt;/p&gt;
&lt;div style="text-align: justify; "&gt;&lt;i&gt;CIS is grateful to its primary  donor the Kusuma Trust founded  by Anurag Dikshit and Soma Pujari,  philanthropists of Indian origin for  its core funding and support for  most of its projects. CIS is also  grateful to its other donors,  Wikimedia Foundation, Ford Foundation,  Privacy International, UK, Hans  Foundation, MacArthur Foundation, and  IDRC for funding its various  projects&lt;/i&gt;.&lt;/div&gt;
&lt;/div&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/about/newsletters/february-2019-newsletter'&gt;https://cis-india.org/about/newsletters/february-2019-newsletter&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Admin</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Access to Knowledge</dc:subject>
    

   <dc:date>2019-03-14T16:40:12Z</dc:date>
   <dc:type>Page</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/imagine-a-feminist-internet-research-practice-and-policy-in-south-asia">
    <title>Imagine a Feminist Internet: Research, Practice and Policy in South Asia</title>
    <link>https://cis-india.org/internet-governance/news/imagine-a-feminist-internet-research-practice-and-policy-in-south-asia</link>
    <description>
        &lt;b&gt;Internet Democracy Project and Point of View co-organized the two-day Imagine a Feminist Internet event in Sri Lanka on 21 and 22 February 2019. Ambika Tandon was a speaker and presented the paper 'Framing Reproductive Health as a Data Problem? Unpacking ‘Dataveillance’ in India', which she co-authored with Aayush Rathi.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The panel also had a presentation by Dr. Anja Kovacs, and was moderated by Eva Blum-Dumontet from Privacy International. Ambika also participated in a committee that drafted a declaration for policymakers based on the presentations at the conference, which is yet to be finalised. The agenda can be &lt;a class="external-link" href="http://cis-india.org/internet-governance/files/ifi-draft-agenda"&gt;seen here&lt;/a&gt;.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/imagine-a-feminist-internet-research-practice-and-policy-in-south-asia'&gt;https://cis-india.org/internet-governance/news/imagine-a-feminist-internet-research-practice-and-policy-in-south-asia&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Admin</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Gender</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    

   <dc:date>2019-02-27T01:52:55Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/vipul-kharbanda-february-25-2019-comments-on-draft-second-protocol-to-convention-on-cybercrime-budapest-convention">
    <title>Comments on the Draft Second Protocol to the Convention on Cybercrime (Budapest Convention) </title>
    <link>https://cis-india.org/internet-governance/blog/vipul-kharbanda-february-25-2019-comments-on-draft-second-protocol-to-convention-on-cybercrime-budapest-convention</link>
    <description>
        &lt;b&gt;Following consultations with data protection experts, civil society, industry and others during its meeting from 29 November 2018 onwards, the Cybercrime Convention Committee (T-CY) has sought additional contributions regarding the provisional draft text for a Second Additional Protocol to the Budapest Convention on Cybercrime (“Budapest Convention”).&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The Centre for Internet and Society, (“CIS”), is a non-profit organisation that undertakes interdisciplinary research on internet and digital technologies from policy and academic perspectives. The areas of focus include digital accessibility for persons with diverse abilities, access to knowledge, intellectual property rights, openness (including open data, free and open source software, open standards, and open access), internet governance, telecommunication reform, digital privacy, artificial intelligence, freedom of expression, and cyber-security. This submission is consistent with CIS’ commitment to safeguarding general public interest, and the rights of stakeholders. CIS is thankful to the Cybercrime Convention Committee for this opportunity to provide feedback to the Draft.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The draft text addresses three issues viz. language of requests, emergency multilateral cooperation and taking statements through video conferencing. Click to download the &lt;a href="https://cis-india.org/internet-governance/comments-on-the-draft-second-protocol-to-the-convention-on-cybercrime-budapest-convention" class="internal-link"&gt;entire submission here&lt;/a&gt;.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/vipul-kharbanda-february-25-2019-comments-on-draft-second-protocol-to-convention-on-cybercrime-budapest-convention'&gt;https://cis-india.org/internet-governance/blog/vipul-kharbanda-february-25-2019-comments-on-draft-second-protocol-to-convention-on-cybercrime-budapest-convention&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>vipul</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Cyber Security</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    

   <dc:date>2019-02-25T16:48:18Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/hindu-businessline-february-19-2019-arindrajit-basu-resurrecting-the-marketplace-of-ideas">
    <title>Resurrecting the marketplace of ideas</title>
    <link>https://cis-india.org/internet-governance/blog/hindu-businessline-february-19-2019-arindrajit-basu-resurrecting-the-marketplace-of-ideas</link>
    <description>
        &lt;b&gt;There is no ‘silver bullet’ for regulating content on the web. It requires a mix of legal and empirical analysis.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article by Arindrajit Basu was published in &lt;a class="external-link" href="https://www.thehindubusinessline.com/opinion/resurrecting-the-marketplace-of-ideas/article26313605.ece"&gt;Hindu Businessline&lt;/a&gt; on February 19, 2019.&lt;/p&gt;
&lt;hr style="text-align: justify; " /&gt;
&lt;p style="text-align: justify; "&gt;A century after the ‘marketplace of ideas’ first found its way into a  US Supreme Court judgment through the dissenting opinion of Justice  Oliver Wendell Holmes Jr &lt;i&gt;(Abrams v United States, 1919&lt;/i&gt;), the oft-cited rationale for free speech is arguably under siege.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The  increasing quantity and range of online speech hosted by internet  platforms coupled with the shock waves sent by revelations of rampant  abuse through the spread of misinformation has lead to a growing  inclination among governments across the globe to demand more aggressive  intervention by internet platforms in filtering the content they host.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Rule  3(9) of the Draft of the Information Technology [Intermediary  Guidelines (Amendment) Rules] 2018 released by the Ministry of  Electronics and Information Technology (MeiTy) last December follows the  interventionist regulatory footsteps of countries like Germany and  France by mandating that platforms use “automated tools or appropriate  mechanisms, with appropriate controls, for proactively identifying and  removing or disabling public access to unlawful information or content.”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Like its global counterparts, this rule, which serves as a  pre-condition for granting immunity to the intermediary from legal  claims arising out of user-generated communications, might not only have  an undue ‘chilling effect’ on free speech but is also a thoroughly  uncooked policy intervention.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Censorship by proxy&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;Rule  3(9) and its global counterparts might not be in line with the  guarantees enmeshed in the right to freedom of speech and expression for  three reasons. First, the vague wording of the law and the abstruse  guidelines for implementation do not provide clarity, accessibility and  predictability — which are key requirements for any law restricting free  speech .The NetzDG-the German law, aimed at combating agitation and  fake news, has attracted immense criticism from civil society activists  and the UN Special Rapporteur David Kaye on similar grounds.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Second,  as proved by multiple empirical studies across the globe, including one  conducted by CIS on the Indian context, it is likely that legal  requirements mandating that private sector actors make determinations on  content restrictions can lead to over-compliance as the intermediary  would be incentivised to err on the side of removal to avoid expensive  litigation.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Finally, by shifting the burden of determining and  removing ‘unlawful’ content onto a private actor, the state is  effectively engaging in ‘censorship by proxy’. As per Article 12 of the  Constitution, whenever a government body performs a ‘public function’,  it must comply with all the enshrined fundamental rights.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Any  individual has the right to file a writ petition against the state for  violation of a fundamental right, including the right to free speech.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;However,  judicial precedent on the horizontal application of fundamental rights,  which might enable an individual to enforce a similar claim against a  private actor has not yet been cemented in Indian constitutional  jurisprudence.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This means that any individual whose content has  been wrongfully removed by the platform may have no recourse in law —  either against the state or against the platform.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Algorithmic governmentality&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;Using  automated technologies comes with its own set of technical challenges  even though they enable the monitoring of greater swathes of content.  The main challenge to automated filtering is the incomplete or  inaccurate training data as labelled data sets are expensive to curate  and difficult to acquire, particularly for smaller players.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Further, an algorithmically driven solution is an amorphous process.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Through  it is hidden layers and without clear oversight and accountability  mechanisms, the machine generates an output, which corresponds to  assessing the risk value of certain forms of speech, thereby reducing it  to quantifiable values — sacrificing inherent facets of dignity such as  the speaker’s unique singularities, personal psychological motivations  and intentions.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Possible policy prescriptions&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;The first  step towards framing an adequate policy response would be to segregate  the content needing moderation based on the reason for them being  problematic.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Detecting and removing information that is false  might require the crafting of mechanisms that are different from those  intended to tackle content that is true but unlawful, such as child  pornography.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Any policy prescription needs to be adequately  piloted and tested before implementation. It is also likely that the  best placed prescription might be a hybrid amalgamation of the methods  outlined below.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Second, it is imperative that the nature of  intermediaries to which a policy applies are clearly delineated. For  example, Whatsapp, which offers end-to-end encrypted services would not  be able to filter content in the same way internet platforms like  Twitter can.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The first option going forward is user-filtering,  which as per a recent paper written by Ivar Hartmann, is a decentralised  process, through which the users of an online platform collectively  endeavour to regulate the flow of information.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Users collectively  agree on a set of standards and general guidelines for filtering. This  method combined with an oversight and grievance redressal mechanism to  address any potential violation may be a plausible one.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The second  model is enhancing the present model of self-regulation. Ghonim and  Rashbass recommend that the platform must publish all data related to  public posts and the processes followed in a certain post attaining  ‘viral’ or ‘trending’ status or conversely, being removed.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This,  combined with Application Programme Interfaces (APIs) or ‘Public  Interest Algorithms’, which enables the user to keep track of the  data-driven process that results in them being exposed to a certain  post, might be workable if effective pilots for scaling are devised.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The  final model that operates outside the confines of technology are  community driven social mechanisms. An example of this is Telengana  Police Officer Remi Rajeswari’s efforts to combat fake news in rural  areas by using Janapedam — an ancient form of story-telling — to raise  awareness about these issues.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Given the complex nature of the  legal, social and political questions involved here, the quest for a  ‘silver-bullet’ might be counter-productive.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Instead, it is  essential for us to take a step back, frame the right questions to  understand the intricacies in the problems involved and then, through a  mix of empirical and legal analysis, calibrate a set of policy  interventions that may work for India today.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/hindu-businessline-february-19-2019-arindrajit-basu-resurrecting-the-marketplace-of-ideas'&gt;https://cis-india.org/internet-governance/blog/hindu-businessline-february-19-2019-arindrajit-basu-resurrecting-the-marketplace-of-ideas&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>basu</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Freedom of Speech and Expression</dc:subject>
    
    
        <dc:subject>Internet Freedom</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    

   <dc:date>2019-02-22T02:18:53Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/economic-times-february-20-2019-are-rss-fears-about-tik-tok-true">
    <title>Are RSS's fears about Tik Tok true? Here's what you should know</title>
    <link>https://cis-india.org/internet-governance/news/economic-times-february-20-2019-are-rss-fears-about-tik-tok-true</link>
    <description>
        &lt;b&gt;Swadeshi Jagran Manch has flagged security, business and social risks posed by Chinese apps such as TikTok. The RSS fears may not be totally unfounded.&lt;/b&gt;
        &lt;p&gt;The article was &lt;a class="external-link" href="https://economictimes.indiatimes.com/news/politics-and-nation/are-rsss-fears-about-tik-tok-true-heres-what-you-should-know/articleshow/68066972.cms"&gt;published in Economic Times&lt;/a&gt; on February 20, 2019. Shweta Mohandas was quoted. The story was also published by &lt;a class="external-link" href="https://www.moneycontrol.com/news/india/rss-calls-for-ban-on-chinese-social-media-apps-like-tik-tok-like-3562401.html"&gt;Moneycontrol News&lt;/a&gt;.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;Should India let Chinese social media apps and telecom companies proliferate in India? Swadeshi Jagran Manch (SJM), the economic wing of the Rashtriya Swayamsevak Sangh has written to Prime Minister Narendra Modi for a ban on these Chinese companies.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The statement comes days after the Pulwama attack by terrorists of Jaish-e-Muhammad (JeM). China has repeatedly helped Pakistan by blocking India’s efforts to get Pakistan-based JeM chief Masood Azhar listed by the UN Security Council as a global terrorist.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;SJM has flagged security, business and social risks posed by Chinese apps such as hugely popular TikTok.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The RSS fears may not be totally unfounded.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;TikTok, Kwai and LIKE have been downloaded by millions of smartphone users in small town India who are using them to share personal videos, away from the glare of scrutiny that falls on more mainstream social media platforms.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In November last year, ET reviewed more than 20 Chinese video apps that dominate the mobile entertainment network of tier-2 and tier-3 cities mostly thanks to titillating videos, suggestive notifications, risqué humour and raunchy content.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Chinese apps pose several potential risks, Swetha Mohandas, policy officer at the Center for Internet and Society, an advocacy group, told ET in November last year. “The draft DP (data protection) Bill in the current stage provides greater responsibility on data fiduciaries to maintain the privacy of the individual and the security of the data,” she said. “There are a lot of questions that these apps pose with respect to the Bill, some of them being the security, the data storage provision, the personal data of children, and most importantly that these apps might have recordings that might be sensitive personal data.”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Most of these apps including TikTok explicitly state that though they have appropriate technical and organisational measures in place, “they cannot guarantee the security of your information transmitted through the platform”.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;TikTok, the popular lip-sync app, is filled with 15-second clips of meme-friendly content featuring its youthful users miming to their favourite songs. The videos range from the harmless to the explicit, depending upon the users followed. The app has gone viral, having racked up close to 100 million downloads and with 20 million monthly active users in India.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;While all such apps carry a disclaimer stating that they are not directed at children, their target audience encompasses preteens and adolescents in tier-2 and tier-3 cities, according to experts.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Despite the rapidly growing user base, apps like TikTok don’t have a grievance redressal officer in India. The government is insisting on this for all major social media platforms.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In its letter to the PM, SJM said it was the duty of all Indians to take steps to prevent the economic gains of any nation or individual that directly or tacitly supports terrorists.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Referring to India putting economic pressure on Pakistan, SJM said, “At such a time, we believe it is imperative that the government create similar hurdles for Chinese companies that are using India for their economic gain. As has been said often, data is now considered the new oil. We should not allow Chinese companies to capture Indian user data without any restrictions and monitoring.”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Bytedance's response: TikTok and Helo are committed to respecting local laws and regulations as well as maintaining a safe and positive in-app environment for our users in India. There is no basis for the factually incorrect claims raised by certain groups recently. We treat the safety and security of our user data very seriously. Moreover, we have robust measures to protect users against misuse, including easy reporting mechanisms that enable users and law enforcement to report content that violates our terms of use and community guidelines.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/economic-times-february-20-2019-are-rss-fears-about-tik-tok-true'&gt;https://cis-india.org/internet-governance/news/economic-times-february-20-2019-are-rss-fears-about-tik-tok-true&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Admin</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    

   <dc:date>2019-02-22T02:13:35Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/webinar-on-counter-comments-to-the-draft-intermediary-guidelines">
    <title>Webinar on counter-comments to the draft Intermediary Guidelines</title>
    <link>https://cis-india.org/internet-governance/news/webinar-on-counter-comments-to-the-draft-intermediary-guidelines</link>
    <description>
        &lt;b&gt;CCAOI and the ISOC Delhi Chapter organised a webinar on February 11 to discuss the comments submitted to the Information Technology [Intermediary Guidelines (Amendment) Rules] 2018, and counter-comments that were due by February 14. &lt;/b&gt;
        &lt;p&gt;The agenda of the discussion was:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;A brief introduction to the counter comment process [Shashank Mishra]&lt;/li&gt;
&lt;li&gt;Invited stakeholders commented on key issues and perspectives on the submissions and on the points to be countered.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;The following people participated:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Amba Kak, Mozilla&lt;/li&gt;
&lt;li&gt;Rajesh Chharia, ISPAI&lt;/li&gt;
&lt;li&gt;Gurshabad Grover, CIS&lt;/li&gt;
&lt;li&gt;Priyanka Chaudhari, SFLC&lt;/li&gt;
&lt;li&gt;Divij Joshi, Vidhi Centre for Legal Policy&lt;/li&gt;
&lt;/ul&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/webinar-on-counter-comments-to-the-draft-intermediary-guidelines'&gt;https://cis-india.org/internet-governance/news/webinar-on-counter-comments-to-the-draft-intermediary-guidelines&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Admin</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Intermediary Liability</dc:subject>
    
    
        <dc:subject>Information Technology</dc:subject>
    

   <dc:date>2019-02-22T01:51:19Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/cis-submission-to-the-un-special-rapporteur-on-freedom-of-speech-and-expression-surveillance-industry-and-human-rights">
    <title>CIS Submission to the UN Special Rapporteur on Freedom of Speech and Expression: Surveillance Industry and Human Rights</title>
    <link>https://cis-india.org/internet-governance/blog/cis-submission-to-the-un-special-rapporteur-on-freedom-of-speech-and-expression-surveillance-industry-and-human-rights</link>
    <description>
        &lt;b&gt;CIS responded to the call for submissions from the UN Special Rapporteur on Freedom of Speech and Expression. The submission was on the Surveillance Industry and Human Rights.&lt;/b&gt;
        
&lt;p&gt;CIS is grateful for the opportunity to respond to the United Nations (UN) Special Rapporteur’s call for submissions on the surveillance industry and human rights. Over the last decade, CIS has worked extensively on research around state and private surveillance around the world. In this response, individuals working at CIS wish to highlight such surveillance programs, with a special focus on India.&lt;/p&gt;
&lt;p&gt;The response can be accessed &lt;a href="https://cis-india.org/internet-governance/resources/the-surveillance-industry-and-human-rights.pdf"&gt;here&lt;/a&gt;.&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/cis-submission-to-the-un-special-rapporteur-on-freedom-of-speech-and-expression-surveillance-industry-and-human-rights'&gt;https://cis-india.org/internet-governance/blog/cis-submission-to-the-un-special-rapporteur-on-freedom-of-speech-and-expression-surveillance-industry-and-human-rights&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Elonnai Hickok, Arindrajit Basu, Gurshabad Grover, Akriti Bopanna, Shweta Mohandas, Martyna Kalvaityte</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Human Rights</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Surveillance</dc:subject>
    

   <dc:date>2019-02-20T10:48:24Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/jessica-corbett-common-dreams-february-5-2019-civil-liberties-groups-warn-proposed-eu-terrorist-content-rule-threat-democratic">
    <title>Civil Liberties Groups Warn Proposed EU 'Terrorist Content' Rule a Threat to Democratic Values</title>
    <link>https://cis-india.org/internet-governance/news/jessica-corbett-common-dreams-february-5-2019-civil-liberties-groups-warn-proposed-eu-terrorist-content-rule-threat-democratic</link>
    <description>
        &lt;b&gt;Requiring filtering tools would be "a gamble with European Internet users' rights to privacy and data protection, freedom of expression and information, and non-discrimination and equality before the law."&lt;/b&gt;
        &lt;p&gt;The blog post by Jessica Corbett was published by &lt;a class="external-link" href="https://www.commondreams.org/news/2019/02/05/civil-liberties-groups-warn-proposed-eu-terrorist-content-rule-threat-democratic"&gt;Common Dreams&lt;/a&gt; on February 5, 2019. The Centre for Internet &amp;amp; Society was a signatory.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;Dozens of human rights groups and academics have signed on to an &lt;a href="https://cdt.org/files/2019/02/Civil-Society-Letter-to-European-Parliament-on-Terrorism-Database.pdf"&gt;open letter&lt;/a&gt; (pdf) raising alarm about the European Union's proposed &lt;a href="https://edri.org/terrorist-content-regulation-document-pool/"&gt;Regulation on Preventing the Dissemination of Terrorist Content Online&lt;/a&gt;,  warning that its call for Internet hosts to employ "proactive measures"  to censor such content "will almost certainly lead platforms to adopt  poorly understood tools" at the expense of democratic values across the  globe.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;One of those tools is the Hash Database developed by Facebook,  YouTube, Microsoft, and Twitter. The 13 companies that use the  database—which supposedly contains 80,000 images and videos—can  automatically filter out material deemed "extreme" terrorist content.  However, as the letter explains, "almost nothing is publicly known about  the specific content that platforms block using the database, or about  companies' internal processes or error rates, and there is insufficient  clarity around the participating companies' definitions of 'terrorist  content.'"&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;"Countering terrorist violence is a shared priority, and our point is  not to question the good intentions of the database operators. But  lawmakers and the public have no meaningful information about how well  the database or any other existing filtering tool serves this goal, and  at what cost to democratic values and individual human rights," notes  the letter, whose signatories include the American Civil Liberties Union  (ACLU), the Brennan Center for Justice, the Electronic Frontier  Foundation (EFF), and the European Digital Rights (EDRi).&lt;/p&gt;
&lt;p&gt;As an EDRi &lt;a href="https://edri.org/open-letter-on-the-terrorism-database/"&gt;statement&lt;/a&gt; outlines, among the groups' main concerns are the following:&lt;/p&gt;
&lt;ul&gt;
&lt;li style="text-align: justify; "&gt;Lack of transparency of how the database works, and its  effectiveness, proportionality, and appropriateness to achieve the goals  the Terrorist Content Regulation aims to achieve;&lt;/li&gt;
&lt;li&gt;Filters are unable to understand context and are therefore error-prone; and&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;The pervasive online monitoring of disadvantaged and marginalized individuals, regardless of whether filters become accurate in the future.&lt;/li&gt;
&lt;/ul&gt;
&lt;p style="text-align: justify; "&gt;Given the uncertainties over the effectiveness and societal  costs of such tools, the letter charges that "requiring all platforms to  use black-box tools like the database would be a gamble with European  Internet users' rights to privacy and data protection, freedom of  expression and information, and non-discrimination and equality before  the law."&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;With those fundamental rights under threat, the groups are calling on  members of the European Parliament "to reject proactive filtering  obligations; provide sound, peer-reviewed research data supporting  policy recommendations and legal mandates around counter-terrorism; and  refrain from enacting laws that will drive Internet platforms to adopt  untested and poorly understood technologies to restrict online  expression."&lt;/p&gt;
&lt;p&gt;Read the full letter:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;i&gt;Dear Members of the European Parliament,&lt;/i&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;i&gt;The undersigned organizations write to share our concerns about  the EU’s proposed Regulation on Preventing the Dissemination of  Terrorist Content Online, and in particular the Regulation’s call for  Internet hosts to use “proactive measures” to detect terrorist content.  We are concerned that if this Regulation is adopted, it will almost  certainly lead platforms to adopt poorly understood tools, such as the  Hash Database referenced in the Explanatory Memorandum to the Regulation  and currently overseen by the Global Internet Forum to Counter  Terrorism. Countering terrorist violence is a shared priority, and our  point is not to question the good intentions of the Database operators.  But lawmakers and the public have no meaningful information about how  well the Database or any other existing filtering tool serves this goal,  and at what cost to democratic values and individual human rights. We  urge you to reject proactive filtering obligations; provide sound,  peer-reviewed research data supporting policy recommendations and legal  mandates around counter-terrorism; and refrain from enacting laws that  will drive Internet platforms to adopt untested and poorly understood  technologies to restrict online expression.&lt;/i&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;i&gt;The Database was initially developed by Facebook, YouTube,  Microsoft, and Twitter as a voluntary measure, and announced to the  public in 2016. It contains digital hash “fingerprints” of images  and4videos that platforms have identified as “extreme” terrorist  material, based not on the law but on their own Community Guidelines or  Terms of Service. The platforms can use automated filtering tools to  identify and remove duplicates of the hashed images or videos. As of  2018, the Database was said to contain hashes representing over 80,000  images or videos. At least thirteen companies now use the Database, and  some seventy companies have reportedly discussed adopting it.&lt;/i&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;i&gt;Almost nothing is publicly known about the specific content that  platforms block using the Database, or about companies’ internal  processes or error rates, and there is insufficient clarity around the  participating companies’ definitions of “terrorist content.”  Furthermore, there are no reports about how many legal processes or  investigations were opened after the content was blocked. This data  would be crucial to understand to what extent the measures are effective  and necessary in a democratic society, which are some of the sine qua  non requisites for restrictions of fundamental rights. We do know,  however, of conspicuous problems that seemingly result from content  filtering gone awry. The Syrian Archive, a civil society organization  preserving evidence of human rights abuses in Syria, for example,  reports that YouTube deleted over 100,000 of its videos. Videos and  other content which may be used in one context to advocate terrorist  violence may be essential elsewhere for news reporting, combating  terrorist recruitment online, or scholarship. Technical filters are  blind to these contextual differences. As three United Nations special  rapporteurs noted in a December 2018 letter, this problem raises serious  concerns about free expression rights under the proposed Regulation. It  is far from clear whether major platforms like YouTube or Facebook  adequately correct for this through employees’ review of filtering  decisions—and it seems highly unlikely that smaller platforms could even  attempt to do so, if required to use the Database or other filtering  tools.&lt;/i&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;i&gt;Failures of this sort seriously threaten Internet users’ rights  to seek and impart information. The pervasive monitoring that platforms  carry out in order to filter users’ communications also threatens  privacy and data protection rights. Moreover, these harms do not appear  to be equally distributed, but instead disproportionately disadvantage  individual Internet users based on their ethnic background, religion,  language, or location—in other words, harms fall on users who might  already be marginalized. More extensive use of the Database and other  automated filtering tools will amplify the risk of harms to users whose  messages and communications about matters of urgent public concern may  be wrongly removed by platforms. The United Nations Special Rapporteur  on the promotion and protection of human rights and fundamental freedoms  while countering terrorism has expressed concern about this lack of  clarity, and said that Facebook’s rules for classifying organizations as  terrorist are “at odds with international humanitarian law”.&lt;/i&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;i&gt;Due to the opacity of the Database’s operations, it is impossible  to assess the consequences of its nearly two years of operation. The  European public is being asked to rely on claims by platforms or vendors  about the efficacy of the Database and similar tools—or else to assume  that any current problems will be solved by hypothetical future  technologies or untested, post-removal appeal mechanisms. Such  optimistic assumptions cannot be justified given the serious problems  researchers have found with the few filtering tools available for  independent review. Requiring all platforms to use black-box tools like  the Database would be a gamble with European Internet users’ rights to  privacy and data protection, freedom of expression and information, and  non-discrimination and equality before the law. That gamble is neither  necessary nor proportionate as an exercise of state power.&lt;/i&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;i&gt;EU institutions’ embrace of the database and other filtering  tools will also have serious consequences for Internet users all over  the world, including in countries where various of the undersigned  organizations work to protect human rights. For one thing, when  platforms filter a video or image in response to a European authority’s  request, it will likely disappear for users everywhere—even if it is  part of critical news reporting or political discourse in other parts of  the world. For another, encoding proactive measures to filter and  remove content in an EU regulation gives authoritarian and  authoritarian-leaning regimes the cover they need to justify their own  vaguely worded and arbitrarily applied anti-terrorism legislation.  Platforms that have already developed content filtering capabilities in  order to comply with EU laws will find it difficult to resist demands to  use them in other regions and under other laws, to the detriment of  vulnerable Internet users around the globe. Your decisions in this area  will have global consequences.&lt;/i&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;i&gt;Signatories:&lt;/i&gt;&lt;br /&gt;&lt;i&gt; Access Now; Africa Freedom of  Information Centre; Agustina Del Campo, in an individual capacity  (Center for Studies on Freedom of Expression CELE); American Civil  Liberties Union (ACLU); ApTI Romania; Article 19; Bits of Freedom;  Brennan Center for Justice; Catalina Botero Marino, in an individual  capacity (Former Special Rapporteur of Freedom of Expression of the  Organization of American States; Center for Democracy &amp;amp; Technology  (CDT); Centre for Internet and Society; Chinmayi Arun, in an individual  capacity; Damian Loreti, in an individual capacity; Daphne Keller, in an  individual capacity (Stanford CIS); Derechos Digitales · América  Latina; Digital Rights Watch; Electronic Frontier Finland; Electronic  Frontier Foundation (EFF); Electronic Frontier Norway; Elena  Sherstoboeva, in an individual capacity (Higher School of Economics);  European Digital Rights (EDRi); Hermes Center; Hiperderecho; Homo  Digitalis; IT-Pol; Joan Barata, in an individual capacity (Stanford  CIS); Krisztina Rozgonyi, in an individual capacity (University of  Vienna); Open Rights Group; Open Technology Institute at New America;  Ossigeno; Pacific Islands News Association (PINA); People Over Politics;  Prostasia Foundation; R3D: Red en Defensa de los Derechos Digitales;  Sarah T. Roberts, Ph.D., in an individual capacity; Southeast Asian  Press Alliance; Social Media Exchange (SMEX), Lebanon; WITNESS; and  Xnet.&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/jessica-corbett-common-dreams-february-5-2019-civil-liberties-groups-warn-proposed-eu-terrorist-content-rule-threat-democratic'&gt;https://cis-india.org/internet-governance/news/jessica-corbett-common-dreams-february-5-2019-civil-liberties-groups-warn-proposed-eu-terrorist-content-rule-threat-democratic&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Admin</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    

   <dc:date>2019-02-19T00:49:00Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/2019-international-asia-conference">
    <title>2019 International Asia Conference</title>
    <link>https://cis-india.org/internet-governance/news/2019-international-asia-conference</link>
    <description>
        &lt;b&gt;ITECHLAW organized the 2019 edition of its International Asia Conference at the JW Marriott hotel in Bangalore on January 31 and February 1, 2019. Sunil Abraham was a panelist in the session "Policy Making for the Emerging Tech in India".&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The rush of emerging technologies of Machine Learning, Internet of Things (IoT) and Virtual Reality (VR) is revolutionising the landscape in which humans exist. Innovators of the generation are ambitious, and their contributions have significantly impacted on various fields like healthcare, media and entertainment, agriculture, and other service models. As these technology advancements are driving new business and service models, there is a need for stakeholders and governments to ensure security and stability of the market without stifling innovations, stigmatising incentives or creating obstacles. Rapid spreading technology applications are resulting in drastic changes in today’s regulatory model, posing the difficult challenges for regulators. In India, the expeditiously developing start-up ecosystem and online consumer base, has stirred the regulators.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Intermediary liability, surveillance, data and privacy, digital taxation, data governance and sovereignty are the dominating debatable topics in India. The debates are not only between regulators and stakeholders, but consumers also joining in it. As the competition between Indian and Foreign Technology intensifies in the turf, the debate on tech-policy is considerably being mentioned in run-up of political parties to the general elections as well. Over the past one year, the country has witnessed some landmark judgments and contentious government proposals related to data and privacy, implications of which have affected over-the-top (“OTT”) services, online media, social media, e-commerce platforms, IoT services etc. The Indian regulatory framework on tech-policy is becoming stricter due to a very disruptive phase last year. The tech-giants like Facebook, Google, Twitter, and Amazon are themselves realising their enormous market influence. After the episodes of lynching, hate speeches etc., they are participating in policy-making efforts related to fake news and digital malfeasance. In this process legal industry is making considerable lobbying efforts for corporations to work with government to curb the menace of digital malpractice and make the internet safer.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;As the legal industry is participating in the process of creating an innovators-friendly regulatory regime, they are also striving to understand the disruptive technologies and adopt them for their own convenience. However, legal firms must understand that the technology cannot do their job for clients but can only upgrade the business model for them. The traditional law firm business model is not in sync with legal buyers. Effective deployment of technology will ameliorate the factor of its approachability to its clients.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;With the growing technology-based start-ups in India, it is going to be a hub for investments by big corporations. In order to keep attracting the investors there is a need for government to remove the potential hindrances that may make investors double-think. The government should prepare a level-playing field in the market by making citizens aware of the standard tech-policies and fostering the innovators-friendly regulatory regime.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;For more info &lt;a class="external-link" href="https://www.itechlaw.org/Bangalore2019"&gt;see the website&lt;/a&gt;&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/2019-international-asia-conference'&gt;https://cis-india.org/internet-governance/news/2019-international-asia-conference&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Admin</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Intermediary Liability</dc:subject>
    

   <dc:date>2019-02-19T00:23:43Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/economic-times-nilanjana-bhowmick-february-13-2019-make-our-digital-backyard-safe">
    <title>Make our digital backyard safe</title>
    <link>https://cis-india.org/internet-governance/news/economic-times-nilanjana-bhowmick-february-13-2019-make-our-digital-backyard-safe</link>
    <description>
        &lt;b&gt;India has been patting itself on the back for being at the forefront of the ‘Fourth Industrial Revolution’ driven by digitisation. Reports have gushed about the speed and scale of digitisation. But this speed and scale have come at a cost to our privacy.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article by Nilanjana Bhowmick was published in &lt;a class="external-link" href="https://economictimes.indiatimes.com/blogs/et-commentary/make-our-digital-backyard-safe/"&gt;Economic Times&lt;/a&gt; on February 13, 2019.&lt;/p&gt;
&lt;hr style="text-align: justify; " /&gt;
&lt;p style="text-align: justify; "&gt;According to GoI, this digital push has led to 99% of adult Indians having an Aadhaar number in 2017. GoI has also integrated personal information through the Jan Dhan-Aadhaar-Mobile phone trinity (JAM).&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;According to GoI, this digital push has led to 99% of adult Indians having an Aadhaar number in 2017. GoI has also integrated personal information through the Jan Dhan-Aadhaar-Mobile phone trinity (JAM).&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;A July 2018 &lt;a href="https://www-01.ibm.com/common/ssi/cgi-bin/ssialias?htmlfid=55017055USEN&amp;amp;"&gt;IBM report&lt;/a&gt; stated  that the probability of data breach went up by 8.7% in India over the  last four years based on past experiences. The study also stated that  malicious or criminal attacks were the root cause for 42% of data  breaches, followed by system glitch at 30% and human error at 28%. This  28% has the potential to cause incalculable havoc, which includes the  leak of personal information by anyone — from a call centre executive to  a bank manager — who has access to it.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The prime reason for our lackadaisical attitude is that most Indians  don’t value privacy. We are yet to register the value of personal  information — the actual monetary, marketable value. My personal data,  for instance, costs roughly $2. If I take that as an average, then at  least $2 billion worth of data belonging to 1.3 billion Indians are at  stake here. Which is why, when this data is taken without consent, it is  a financial crime.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;What is perhaps more frightening is that when this data is taken  without consent by an untrusted source, it may also land you, victim of a  data breach, in jail.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Last month, I had noticed a suspicious movement of money in my  account. A large sum of money was deposited in my account in two  instalments, withinthe space of 12 hours. And while I am waiting for the  issue to be addressed by the authorities — RBI ombudsman, bank customer  service, enforcement directorate — the person who wired the money to my  account had got hold of my personal information, including my address  and phone number.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;He kept calling me on my phone and ‘requested’ I give the money  ‘back’ to his brother, ‘in cash or cheque’. Then his brother started  calling me, demanding I ‘return’ the money to him.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The movement of funds in my account could well have been a  money-laundering operation, and if I made the payment to the ‘sender’ as  demanded, the money trail would have implicated me. But what’s most  alarming is that if I was dealing with criminals, someone from my bank  had made them privy to my private information. And this is a top bank  with supposedly top-notch security.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Unfortunately, security is woefully lagging behind India’s speedy  digitisation. Neither are we investing enough on fortifying the system,  nor are we spending enough on postbreach responses. India spends a mere  $20,000 in notification costs, compared to the US’ $740,000.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The US also spends $1.76 million in post-data breach response  activities, including help desk activities, special investigations and  remediation. US and Canadian firms spend $258 and $213 per record  respectively to resolve amalicious or criminal attacks. Indian ones, on  an average, spend $76 per record.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Yes, digitisation is the future. But let’s first plug the social, institutional and systemic weaknesses in our systems.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/economic-times-nilanjana-bhowmick-february-13-2019-make-our-digital-backyard-safe'&gt;https://cis-india.org/internet-governance/news/economic-times-nilanjana-bhowmick-february-13-2019-make-our-digital-backyard-safe&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Nilanjana Bhowmick</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    

   <dc:date>2019-02-18T14:37:25Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/zack-whittaker-natasha-lomas-february-15-2019-tech-crunch-even-years-later-twitter-doesnt-delete-your-direct-messages">
    <title>Even years later, Twitter doesn't delete your direct messages</title>
    <link>https://cis-india.org/internet-governance/news/zack-whittaker-natasha-lomas-february-15-2019-tech-crunch-even-years-later-twitter-doesnt-delete-your-direct-messages</link>
    <description>
        &lt;b&gt;When does “delete” really mean delete? Not always, or even at all, if you’re Twitter.&lt;/b&gt;
        &lt;p&gt;The blog post by Zack Whittaker and Natasha Lomas was published in &lt;a class="external-link" href="https://techcrunch.com/2019/02/15/twitter-direct-messages/"&gt;Tech Crunch&lt;/a&gt; on February 15, 2019. Karan Saini was quoted.&lt;/p&gt;
&lt;hr /&gt;
&lt;p&gt;Twitter  retains direct messages for years, including messages you and others  have deleted, but also data sent to and from accounts that have been  deactivated and suspended, according to security researcher Karan Saini.&lt;/p&gt;
&lt;p&gt;Saini  found years-old messages in a file from an archive of his data obtained  through the website from accounts that were no longer on Twitter. He  also reported a similar bug, found a year earlier but not disclosed  until now, that allowed him to use a since-deprecated API to retrieve  direct messages even after a message was deleted from both the sender  and the recipient — though, the bug wasn’t able to retrieve messages  from suspended accounts.&lt;/p&gt;
&lt;p&gt;Saini told TechCrunch that he had “concerns” that the data was retained by Twitter for so long.&lt;/p&gt;
&lt;p&gt;Direct messages &lt;a href="https://www.cnet.com/how-to/how-to-unsend-twitter-direct-messages/"&gt;once let users “unsend” messages&lt;/a&gt; from someone else’s inbox, simply by deleting it from their own.  Twitter changed this years ago, and now only allows a user to delete  messages from their account. “Others in the conversation will still be  able to see direct messages or conversations that you have deleted,”  Twitter says in &lt;a href="https://help.twitter.com/en/using-twitter/direct-messages"&gt;a help page&lt;/a&gt;. Twitter also says in its &lt;a href="https://twitter.com/en/privacy"&gt;privacy policy&lt;/a&gt; that  anyone wanting to leave the service can have their account “deactivated  and then deleted.” After a 30-day grace period, the account disappears,  along with its data.&lt;/p&gt;
&lt;p&gt;But, in our tests, we could recover direct  messages from years ago — including old messages that had since been  lost to suspended or deleted accounts. By downloading &lt;a href="https://twitter.com/settings/your_twitter_data"&gt;your account’s data&lt;/a&gt;, it’s possible to download all of the data Twitter stores on you.&lt;/p&gt;
&lt;p&gt;&lt;img src="https://cis-india.org/home-images/Twitter.png/@@images/40867bd2-2284-4c9c-b42f-fb7a500b1c92.png" alt="Twitter" class="image-inline" title="Twitter" /&gt;&lt;/p&gt;
&lt;p&gt;A conversation, dated March 2016, with a suspended Twitter account was still retrievable today (Image: TechCrunch)&lt;/p&gt;
&lt;p&gt;Saini says this is a “functional bug” rather than a security flaw, but argued that the bug allows anyone a “clear bypass” of Twitter mechanisms meant to prevent access to suspended or deactivated accounts.&lt;/p&gt;
&lt;p&gt;But it’s also a privacy matter, and a reminder that “delete” doesn’t mean delete — especially with your direct messages. That can open up users, particularly high-risk accounts like journalists and activists, to government data demands that call for data from years earlier.&lt;/p&gt;
&lt;p&gt;That’s despite &lt;a href="https://help.twitter.com/en/rules-and-policies/twitter-law-enforcement-support"&gt;Twitter’s claim&lt;/a&gt; that once an account has been deactivated, there is “a very brief  period in which we may be able to access account information, including  tweets,” to law enforcement.&lt;/p&gt;
&lt;p&gt;A Twitter spokesperson said the  company was “looking into this further to ensure we have considered the  entire scope of the issue.”&lt;/p&gt;
&lt;p&gt;Retaining direct messages for years may put the company in a legal grey area amid Europe’s new data protection laws, which allow users to demand that a company delete their data.&lt;/p&gt;
&lt;p&gt;Neil Brown, a telecoms, tech and internet lawyer at &lt;a href="https://decoded.legal/"&gt;U.K. law firm Decoded Legal&lt;/a&gt;,  said there’s “no formality at all” to how a user can ask for their data  to be deleted. Any request from a user to delete their data that’s  directly communicated to the company “is a valid exercise” of a user’s  rights, he said.&lt;/p&gt;
&lt;p&gt;Companies can be fined up to four percent of their annual turnover for violating GDPR rules.&lt;/p&gt;
&lt;p&gt;“A  delete button is perhaps a different matter, as it is not obvious that  ‘delete’ means the same as ‘exercise my right of erasure’,” said Brown.  Given that there’s no case law yet under the new General Data Protection  Regulation regime, it will be up to the courts to decide, he said.&lt;/p&gt;
&lt;p&gt;When asked if Twitter thinks that consent to retain direct messages is withdrawn when a message or account is deleted, Twitter’s spokesperson had “nothing further” to add.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/zack-whittaker-natasha-lomas-february-15-2019-tech-crunch-even-years-later-twitter-doesnt-delete-your-direct-messages'&gt;https://cis-india.org/internet-governance/news/zack-whittaker-natasha-lomas-february-15-2019-tech-crunch-even-years-later-twitter-doesnt-delete-your-direct-messages&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Zack Whittaker and Natasha Lomas</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    

   <dc:date>2019-02-18T14:17:54Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/unbox-2019-festival">
    <title>Unbox Festival 2019: CIS organizes two Workshops</title>
    <link>https://cis-india.org/internet-governance/blog/unbox-2019-festival</link>
    <description>
        &lt;b&gt;Centre for Internet &amp; Society organized two workshops at the Unbox Festival 2019, in Bangalore, on 15 and 17 February 2019. &lt;/b&gt;
        &lt;h3 style="text-align: justify; "&gt;'What is your Feminist Infrastructure Wishlist?&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The first workshop 'What is your Feminist Infrastructure Wishlist?' was on Feminist Infrastructure Wishlists that was conducted by P.P. Sneha and Saumyaa Naidu on  15 February 2019. The objective of the workshop was to explore what it means to have infrastructure that is feminist. How do we build spaces, networks, and systems that are equal, inclusive, diverse, and accessible? We will also reflect on questions of network configurations, expertise, labour and visibility. For reading material &lt;a class="external-link" href="https://feministinternet.org/"&gt;click here&lt;/a&gt;.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;AI for Good&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;With a backdrop of AI for social good, we explore existing applications of artificial intelligence, how we interact and engage with this technology on a daily basis. A discussion led by Saumyaa Naidu and Shweta Mohandas invited participants to examine current narratives around AI and imagine how these may transform with time. Questions around how we can build an AI for the future will become the starting point to trace its implications relating to social impact, policy, gender, design, and privacy. For reading materials see &lt;a class="external-link" href="https://ainowinstitute.org/AI_Now_2018_Report.pdf"&gt;AI Now Report 2018&lt;/a&gt;, &lt;a class="external-link" href="https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing"&gt;Machine Bias&lt;/a&gt;, and &lt;a class="external-link" href="https://www.theatlantic.com/technology/archive/2016/03/why-do-so-many-digital-assistants-have-feminine-names/475884/"&gt;Why Do So Many Digital Assistants Have Feminine Names?&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;For info on Unbox Festival, &lt;a class="external-link" href="http://unboxfestival.com/"&gt;click here&lt;/a&gt;&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/unbox-2019-festival'&gt;https://cis-india.org/internet-governance/blog/unbox-2019-festival&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>saumyaa</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Gender</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Artificial Intelligence</dc:subject>
    

   <dc:date>2019-02-26T01:53:39Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/news/hindustan-times-february-10-2019-smriti-kak-ramachandran-and-vidhi-choudhary-willing-to-participate-in-parliamentary-panel-hearing">
    <title>‘Willing to participate, but need more time’: Twitter on parliamentary panel hearing</title>
    <link>https://cis-india.org/internet-governance/news/hindustan-times-february-10-2019-smriti-kak-ramachandran-and-vidhi-choudhary-willing-to-participate-in-parliamentary-panel-hearing</link>
    <description>
        &lt;b&gt;Executives from social media firm Twitter’s US headquarters will not appear before a parliamentary panel that has summoned them on Monday over perceived bias towards right-wing handles on the micro-blogging platform.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article by Smriti Kak Ramachandran and Vidhi Choudhary was &lt;a class="external-link" href="https://www.hindustantimes.com/india-news/twitter-says-willing-to-participate-in-parliamentary-panel-hearing-seeks-more-time/story-C7cDq6n7kOJM3DOFOX45dI.html"&gt;published in Hindustan Times&lt;/a&gt; on February 10, 2019. Sunil Abraham was quoted.&lt;/p&gt;
&lt;hr style="text-align: justify; " /&gt;
&lt;p style="text-align: justify; "&gt;Executives from social media firm Twitter’s US headquarters will not appear before a parliamentary panel that has summoned them on Monday over perceived bias towards right-wing handles on the micro-blogging platform although a spokesperson for the firm said in a statement that this is only on account of timing and that Twitter is “willing to participate in” a hearing by the panel.&lt;br /&gt;&lt;br /&gt;“We have indicated that we are willing to participate in such a broad hearing process. Given the short notice of the hearing, we informed the committee that it would not be possible for senior officials from Twitter to travel from the United States to appear on Monday,” the statement said. The panel’s summons were issued on February 5, with a meeting with the parliamentary panel scheduled for Monday, February 11.&lt;br /&gt;&lt;br /&gt;A right-wing group, Youth for Social Media Democracy, recently held protests claiming the microblogging site suspends or shadow-bans accounts that appear sympathetic to the ruling Bharatiya Janata Party (BJP) and the government.&lt;br /&gt;&lt;br /&gt;Anurag Thakur, a BJP MP who heads the parliamentary panel on information and technology, asked IT ministry officials and Twitter representatives to be present at the meeting. He said the committee takes a serious note of Twitter’s response and would take “appropriate action on February 11.”&lt;br /&gt;&lt;br /&gt;According to an official aware of the letter sent to Twitter, the company was told “it may be noted that the Head of the Organisation has to appear before the Committee”.&lt;br /&gt;&lt;br /&gt;Twitter added in its statement that while it will work with the Lok Sabha secretariat to find a mutually agreeable date for a meeting so that a senior Twitter official (from the US) can attend it has “also offered representatives from Twitter India to come and answer questions on Monday”. 
“We await feedback from the government on both matters,” the statement added.&lt;br /&gt;&lt;br /&gt;In a previous statement, Twitter said that its India representatives do not enforce policy and that this is done “with impartiality” by a “specialized global team”.&lt;br /&gt;&lt;br /&gt;Thakur’s intervention wasn’t prompted by protests by Youth for Social Media Democracy alone. According to the people familiar with the matter, the issue has been repeatedly flagged at meetings of the Rashtriya Swayamsevak Sangh (RSS), the ideological parent of the BJP.&lt;br /&gt;&lt;br /&gt;Twitter denied these allegations. In a statement issued on Friday, the company said, “Twitter is a global platform that serves a global, public conversation. Elevating debate and open discourse is fundamental to the platform’s service, and its core values as a company. Twitter is committed to remain unbiased with the public interest in mind.”&lt;br /&gt;&lt;br /&gt;“The public conversation around Twitter’s policies and actions may be distorted by some who have a political agenda and this may be particularly acute during election cycles when highly-charged political rhetoric becomes more common. For our part, we will endeavour to be even more transparent in how we develop and enforce our policies to dispel conspiracy theories and mistrust,” Colin Crowell, global vice president, public policy, Twitter, added in the statement.&lt;br /&gt;&lt;br /&gt;A senior functionary of the RSS said it was soon after the January 1, 2018 clash between Maratha and Dalit groups in Maharashtra’s Bhima Koregaon that escalated into violence that functionaries of the Sangh began to notice posts on social media that were allegedly “anti-national” and had the potential to create “communal friction”.&lt;br /&gt;&lt;br /&gt;The content of some of the posts was construed to be similar to the expressions used by so-called “urban naxals”, this person said on condition of anonymity. 
Urban naxals is a term coined by the right wing for left-wing intellectuals who, they say, are suspected to have links to Maoist organisations.&lt;br /&gt;&lt;br /&gt;“Posts that spoke of destabilising the nation, that attacked the sovereignty of the country were being put up. No action was being taken, despite complaints to Twitter,” the functionary added.&lt;br /&gt;&lt;br /&gt;It was then that the Sangh chose to knock on Thakur’s doors.&lt;br /&gt;&lt;br /&gt;With 34.4 million users, Twitter has emerged as a key platform for political and social conversations. Given the reach of the medium, even the Election Commission has been monitoring the posts to ensure there is no adverse impact on election processes.&lt;br /&gt;&lt;br /&gt;Experts said Twitter and other platforms need to become more transparent. “Unless Twitter and other internet giants implement principles of natural justice, they will always be accused of bias,” said Sunil Abraham, co-founder of the think tank Centre for Internet and Society, adding that the platform does not “provide sufficient transparency regarding its decisions”.&lt;br /&gt;&lt;br /&gt;Lawyer Apar Gupta said that the parliamentary panel on IT needs to function more robustly. “It has not invited experts, academics, and civil society voices for deliberations. Also, the outcomes from hearings such as the ones on Aadhaar, privacy, data breaches, and net neutrality, done a while back, remain outstanding.
Reports or recommendations have not been made to parliament.”&lt;br /&gt;&lt;br /&gt;In general, parliamentary panels do allow hearings to be deferred at the request of someone who has been summoned, although this is usually at the discretion of the chairman and also if the request is made immediately after the summons is issued.&lt;br /&gt;&lt;br /&gt;Gupta added that usually, a breach of privilege complaint is made by the chairman of the committee to the Lok Sabha speaker “who will then approve it and send it to the Privileges Committee of the Lok Sabha”.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/news/hindustan-times-february-10-2019-smriti-kak-ramachandran-and-vidhi-choudhary-willing-to-participate-in-parliamentary-panel-hearing'&gt;https://cis-india.org/internet-governance/news/hindustan-times-february-10-2019-smriti-kak-ramachandran-and-vidhi-choudhary-willing-to-participate-in-parliamentary-panel-hearing&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Smriti Kak Ramachandran and Vidhi Choudhary</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    

   <dc:date>2019-02-15T02:29:55Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/data-infrastructures-inequities-reproductive-health-surveillance-india">
    <title>Data Infrastructures and Inequities: Why Does Reproductive Health Surveillance in India Need Our Urgent Attention?</title>
    <link>https://cis-india.org/internet-governance/blog/data-infrastructures-inequities-reproductive-health-surveillance-india</link>
    <description>
        &lt;b&gt;In order to bring out certain conceptual and procedural problems with health monitoring in the Indian context, this article by Aayush Rathi and Ambika Tandon posits health monitoring as surveillance and not merely as a “data problem.” Casting a critical feminist lens on the historicity of surveillance practices unveils the gendered power differentials wedded into taken-for-granted “benign” monitoring processes. The unpacking of the Mother and Child Tracking System and the National Health Stack reveals the neo-liberal aspirations of the Indian state.&lt;/b&gt;
        
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;p&gt;&lt;em&gt;The article was first published by &lt;a href="https://www.epw.in/engage/article/data-infrastructures-inequities-why-does-reproductive-health-surveillance-india-need-urgent-attention" target="_blank"&gt;EPW Engage, Vol. 54, Issue No. 6&lt;/a&gt;, on 9 February 2019.&lt;/em&gt;&lt;/p&gt;
&lt;hr /&gt;
&lt;h3&gt;&lt;strong&gt;Framing Reproductive Health as a Surveillance Question&lt;/strong&gt;&lt;/h3&gt;
&lt;p&gt;The approach of the postcolonial Indian state to healthcare has been Malthusian, with the prioritisation of family planning and birth control (Hodges 2004). Supported by the notion of socio-economic development arising out of a “modernisation” paradigm, the target-based approach to achieving reduced fertility rates has shaped India’s reproductive and child health (RCH) programme (Simon-Kumar 2006).&lt;/p&gt;
&lt;p&gt;This is also the context in which India’s abortion law, the Medical Termination of Pregnancy (MTP) Act, was framed in 1971, placing the decisional privacy of women seeking abortions in the hands of registered medical practitioners. The framing of the MTP Act invisibilises females seeking abortions for non-medical reasons within the legal framework. The exclusionary provisions only exacerbated existing gaps in health provisioning, as access to safe and legal abortions had already been curtailed by severe geographic inequalities in funding, infrastructure, and human resources. The state has concomitantly been unable to meet the contraceptive needs of married couples or reduce maternal and infant mortality rates in large parts of the country, mediating access along the lines of class, social status, education, and age (Sanneving et al 2013).&lt;/p&gt;
&lt;p&gt;While the official narrative around the RCH programme transitioned to focus on universal access to healthcare in the 1990s, the target-based approach continues to shape the reality on the ground. The provision of reproductive healthcare has been deeply unequal, with sterilisation targets pursued through mass camps and, in some cases, in hospitals. These targets have been known to be met through the practice of forced, and often unsafe, sterilisation, in the absence of adequate provisions, trained professionals, pre-sterilisation counselling, or alternative forms of contraception (Sama and PLD 2018). Further, patients have regularly been provided cash incentives, foreclosing the notion of free consent, especially given that the target population of these camps has been women from marginalised economic classes in rural India.&lt;/p&gt;
&lt;p&gt;Placing surveillance studies within a feminist praxis allows us to frame the reproductive health landscape as more than just an ill-conceived, benign monitoring structure. The critical lens becomes useful for highlighting that taken-for-granted structures of monitoring are wedded with power differentials: genetic screening in fertility clinics, identification documents such as birth certificates, and full-body screeners are just some of the manifestations of this (Andrejevic 2015). Emerging conversations around feminist surveillance studies highlight that these data systems are neither benign nor free of gendered implications (Andrejevic 2015). In the continual remaking of the social, corporeal body as a data actor in society, such practices render some bodies normative and obfuscate others, based on categorisations put in place by the surveiller.&lt;/p&gt;
&lt;p&gt;In fact, the history of surveillance can be traced back to the colonial state where it took the form of systematic sexual and gendered violence enacted upon indigenous populations in order to render them compliant (Rifkin 2011; Morgensen 2011). Surveillance, then, manifests as a “scientific” rationalisation of complex social hieroglyphs (such as reproductive health) into formats enabling administrative interventions by the modern state. Lyon (2001) has also emphasised how the body emerged as the site of surveillance in order for the disciplining of the “irrational, sensual body”—essential to the functioning of the modern nation-state—to effectively happen.&lt;/p&gt;
&lt;h3&gt;&lt;strong&gt;Questioning the Information and Communications Technology for Development (ICT4D) and Big Data for Development (BD4D) Rhetoric&lt;/strong&gt;&lt;/h3&gt;
&lt;p&gt;Information and Communications Technology (ICT) and data-driven approaches to the development of a robust health information system, and by extension, welfare, have been offered as solutions to these inequities and exclusions in access to maternal and reproductive healthcare in the country.&lt;/p&gt;
&lt;p&gt;The move towards data-driven development in the country commenced with the introduction of the Health Management Information System in Andhra Pradesh in 2008, and the Mother and Child Tracking System (MCTS) nationally in 2011. These are reproductive health information systems (HIS) that collect granular data about each pregnancy from the antenatal to the post-natal period, at the level of each sub-centre as well as each primary and community health centre. The introduction of HIS comprised cross-sectoral digitisation measures that were a part of the larger national push towards e-governance; along with health, thirty other distinct areas of governance, from land records to banking to employment, were identified for this move towards the digitalised provisioning of services (MeitY 2015).&lt;/p&gt;
&lt;p&gt;The HIS have been seen as playing a critical role in the ecosystem of health service provision globally. HIS-based interventions in reproductive health programming have been envisioned as a means of: (i) improving access to services in the context of a healthcare system ridden with inequalities; (ii) improving the quality of services provided; and (iii) producing better quality data to facilitate the objectives of India’s RCH programme, including family planning and population control. Accordingly, starting in 2018, the MCTS is being replaced by the RCH portal in a phased manner. The RCH portal, in areas where the ANMOL (ANM Online) application has been introduced, captures data in real time through tablets provided to health workers (MoHFW 2015).&lt;/p&gt;
&lt;p&gt;A proposal to mandatorily link Aadhaar with data on pregnancies and abortions through the MCTS/RCH has been made by the union minister for Women and Child Development as a deterrent to gender-biased sex selection (Tembhekar 2016). The proposal stems from the prohibition of gender-biased sex selection provided under the Pre-Conception and Pre-Natal Diagnostics Techniques (PCPNDT) Act, 1994. The approach taken so far under the PCPNDT Act has been to regulate the use of technologies involved in sex determination. However, the steady decline in the national sex ratio since the passage of the PCPNDT Act provides a clear indication that the regulation of such technology has been largely ineffective. A national policy linking Aadhaar with abortions would be aimed at discouraging gender-biased sex selection through state surveillance, in direct violation of a female’s right to decisional privacy with regard to her own body.&lt;/p&gt;
&lt;p&gt;Linking Aadhaar would also serve as a mechanism to enable direct benefit transfer (DBT) to the beneficiaries of the national maternal benefits scheme. Linking reproductive health services to the Aadhaar ecosystem has been critiqued because it is exclusionary towards women with legitimate claims to abortions and other reproductive services and benefits, and it heightens the risk of data breaches in a cultural fabric that already stigmatises abortions. The bodies on which this stigma is disproportionately placed, unmarried or disabled females, for instance, experience the harms of visibility through centralised surveillance mechanisms more acutely than others by being penalised for their deviance from cultural expectations. This is in accordance with the theory of “data extremes,” wherein marginalised communities are seen as living on the extremes of data capture, leading to a data regime that either refuses to recognise them as legitimate entities or subjects them to overpolicing in order to discipline deviance (Arora 2016). In both developed and developing contexts, the broader purpose of identity management has largely been to demarcate legitimate and illegitimate actors within a population, either within the framework of security or welfare.&lt;/p&gt;
&lt;h3&gt;&lt;strong&gt;Potential Harms of the Data Model of Reproductive Health Provisioning&lt;/strong&gt;&lt;/h3&gt;
&lt;p&gt;Informational privacy and decisional privacy are critically shaped by data flows and security within the MCTS/RCH. Beyond role-based authentication, no standards for data sharing and storage, or for the anonymisation and encryption of data, have been implemented (NHSRC and Taurus Glocal 2011). The risks of this architectural design are further amplified in the context of the RCH/ANMOL, where data is captured in real time. In the absence of adequate safeguards against data leaks, real-time data capture risks the publicising of reproductive health choices in an already stigmatised environment. This opens up avenues for further dilution of autonomy in making future reproductive health choices.&lt;/p&gt;
&lt;p&gt;Several core principles of informational privacy, such as limitations regarding data collection and usage, or informed consent, also need to be reworked within this context.&lt;sup&gt;[1]&lt;/sup&gt; For instance, the centrality of the requirement of “free, informed consent” by an individual would need to be replaced by other models, especially in the context of reproductive health of rape survivors who are vulnerable and therefore unable to exercise full agency. The ability to make a free and informed choice, already dismantled in the context of contemporary data regimes, gets further precluded in such contexts. The constraints on privacy in decisions regarding the body are then replicated in the domain of reproductive data collection.&lt;/p&gt;
&lt;p&gt;What is uniform across these digitisation initiatives is their treatment of maternal and reproductive health as solely a medical event, framed as a data scarcity problem. In doing so, they tend to amplify the understanding of reproductive health through measurable indicators that ignore social determinants of health. For instance, several studies conducted in the rural Indian context have shown that the degree of women’s autonomy influences the degree of usage of pregnancy care, and that the uptake of pregnancy care was associated with village-level indicators such as economic development, provisioning of basic infrastructure, and social cohesion. These contextual factors get overridden in pervasive surveillance systems that treat reproductive healthcare as comprising only measurable indicators and behaviours that depend on the individual conduct of practitioners and women themselves, rather than on structural gaps within the system.&lt;/p&gt;
&lt;p&gt;While traditionally associated with state governance, the contemporary surveillance regime is experienced as distinct from its earlier forms due to its reliance on a nexus between surveillance by the state and private institutions and actors, with both legal frameworks and material apparatuses for data collection and sharing (Shepherd 2017). As with historical forms of surveillance, the harms of contemporary data regimes accrue disproportionately among already marginalised and dissenting communities and individuals. Data-driven surveillance has been critiqued for its excesses in multiple contexts globally, including in the domains of predictive policing, health management, and targeted advertising (Mason 2015). In the attempts to achieve these objectives, surveillance systems have been criticised for their reliance on replicating past patterns, reifying proximity to a hetero-patriarchal norm (Haggerty and Ericson 2000). Under data-driven surveillance systems, this proximity informs the preexisting boxes of identity from which algorithmic representations of the individual are formed. The boxes are defined contingent on the distinct objectives of the particular surveillance project, collating disparate pieces of data flows and resulting in the recasting of the singular offline self into various 'data doubles' (Haggerty and Ericson 2000). Refractive, rather than reflective, the data doubles have implications for the physical, embodied life of the individual, with an increasing amount of service provisioning relying on the data doubles (Lyon 2001). Consider, for instance, apps on menstruation, fertility, and health, and wearables such as fitness trackers and pacers, that support corporate agendas around what a woman’s healthy body should look, be, or behave like (Lupton 2014). Once viewed through the lens of power relations, the fetishised, apolitical notion of the data “revolution” gives way to what we may better understand as “dataveillance.”&lt;/p&gt;
&lt;h3&gt;&lt;strong&gt;Towards a Networked State and a Neo-liberal Citizen&lt;/strong&gt;&lt;/h3&gt;
&lt;p&gt;Following in this tradition of ICT being treated as the solution to problems plaguing India’s public health information system, a larger, all-pervasive healthcare ecosystem is now being proposed by the Indian state (NITI Aayog 2018). Termed the National Health Stack, it seeks to create a centralised electronic repository of health records of Indian citizens with the aim of capturing every instance of healthcare service usage. Among other functions, it also envisions a platform for the provisioning of health and wellness-based services that may be dispensed by public or private actors in an attempt to achieve universal health coverage. By allowing private parties to utilise the data collected through pullable open application programming interfaces (APIs), it also fits within the larger framework of the National Health Policy 2017 that envisions the private sector playing a significant role in the provision of healthcare in India. It also then fits within the state–private sector nexus that characterises dataveillance. This, in turn, follows broader trends towards market-driven solutions and private financing of health sector reform measures that have already had profound consequences on the political economy of healthcare worldwide (Joe et al 2018).&lt;/p&gt;
&lt;p&gt;These initiatives are, in many ways, emblematic of the growing adoption of network governance reform by the Indian state (Newman 2001). This is a stark shift from its traditional posturing as the hegemonic sovereign nation state. This shift entails the delayering from large, hierarchical and unitary government systems to horizontally arranged, more flexible, relatively dispersed systems.&lt;sup&gt;[2]&lt;/sup&gt; The former govern through the power of rules and law, while the latter take the shape of self-regulating networks such as public–private contractual arrangements (Snellen 2005). ICTs have been posited as an effective tool in enabling the transition to network governance by enhancing local governance and interactive policymaking, enabling the co-production of knowledge (Ferlie et al 2011). The development of these capabilities is also critical to addressing “wicked problems” such as healthcare (Rittel and Webber 1973).&lt;sup&gt;[3]&lt;/sup&gt; The application of the techno-deterministic, data-driven model to reproductive healthcare provision, then, resembles a fetishised approach to technological change. The NHSRC describes this as the collection of data without an objective, leading to a disproportionate burden on data collection over use (NHSRC and Taurus Glocal 2011).&lt;/p&gt;
&lt;p&gt;The blurring of the functions of state and private actors is reflective of the neo-liberal ethic, which produces new practices of governmentality. Within the neo-liberal framework of reproductive healthcare, the citizen is constructed as an individual actor, with agency over and responsibility for their own health and well-being (Maturo et al 2016).&lt;/p&gt;
&lt;h3&gt;&lt;strong&gt;“Quantified Self” of the Neo-liberal Citizen&lt;/strong&gt;&lt;/h3&gt;
&lt;p&gt;Nowhere can the manifestation of this neo-liberal citizen be seen as clearly as in the “quantified self” movement. The quantified self movement refers to the emergence of a whole range of apps that enable the user to track bodily functions and record data to achieve wellness and health goals, including menstruation, fertility, pregnancies, and health indicators in the mother and baby. Lupton (2015) labels this as the emergence of the “digitised reproductive citizen,” who is expected to be attentive to her fertility and sexual behaviour to achieve better reproductive health goals. The practice of collecting data around reproductive health is not new to the individual or the state, as has been demonstrated by the discussion above. What is new in this regime of datafication under the self-tracking movement is the monetisation of reproductive health data by private actors, the labour for which is performed by the user. Focusing on embodiment draws attention to different kinds of exploitation engendered by reproductive health apps. Not only is data about the body collected and sold, but the unpaid labour of collection is also extracted from the user. The reproductive body can then be understood as a cyborg, or a woman-machine hybrid, systematically digitising its bodily functions for profit-making within the capitalist (re)production machine (Fotopoulou 2016). Accordingly, all major reproductive health tracking apps have a business model that relies on selling information about users for direct marketing of products around reproductive health and well-being (Felizi and Varon nd).&lt;/p&gt;
&lt;p&gt;As has been pointed out in the case of big data more broadly, reproductive health applications (apps) facilitate the visibility of the female reproductive body in the public domain. Supplying anonymised data sets to medical researchers and universities fills some of the historical gaps in research around the female body and reproductive health. Reproductive and sexual health tracking apps globally provide their users a platform to engage with biomedical information around sexual and reproductive health. Through group chats on the platform, they are also able to engage with experiential knowledge of sexual and reproductive health. This could also help form transnational networks of solidarity around the body and health (Fotopoulou 2016).&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;This radical potential of network-building around reproductive and sexual health is, however, tempered to a large extent by the reconfiguration of gendered stereotypes through these apps. In a study on reproductive health apps on Google Play Store, Lupton (2014) finds that products targeted towards female users are marketed through the discourse of risk and vulnerability, while those targeted towards male users are framed within that of virility. Apart from reiterating gendered stereotypes around the male and female body, such a discourse assumes that the entire labour of family planning is performed by females. The same is the case with the MCTS/RCH.&lt;/p&gt;
&lt;p&gt;Technological interventions such as reproductive health apps as well as HIS are based on the assumption that females have perfect control over decisions regarding their own bodies and reproductive health, despite evidence to the contrary in India: the Guttmacher Institute (2014) has found that 60% of women in India report not having control over decisions regarding their own healthcare. The failure to account for the husband or the family as stakeholders in decision-making around reproductive health has been a historical shortcoming of the family planning programme in India, and is now being replicated in other modalities. This notion of an autonomous citizen who is able to take responsibility for their own reproductive health and well-being does not hold true in the Indian context. It can even be seen as marginalising females who have already been excluded from the reproductive health system, as they are held responsible for their own inability to access healthcare.&lt;/p&gt;
&lt;h3&gt;&lt;strong&gt;Concluding Remarks&lt;/strong&gt;&lt;/h3&gt;
&lt;p&gt;The interplay that emerges between reproductive health surveillance and data infrastructures is a complex one. It requires the careful positioning of the political nature of data collection and processing as well as its hetero-patriarchal and colonial legacies, within the need for effective utilisation of data for achieving developmental goals. Assessing this discourse through a feminist lens identifies the web of power relations in data regimes. This problematises narratives of technological solutions for welfare provision.&lt;/p&gt;
&lt;p&gt;The reproductive healthcare framework in India then offers up a useful case study to assess these concerns. The growing adoption of ICT-based surveillance tools to equalise access to healthcare needs to be understood in the socio-economic, legal, and cultural context where these tools are being implemented. Increased surveillance has historically been associated with causing the structural gendered violence that it is now being offered as a solution to. This is a function of normative standards being constructed for reproductive behaviour that necessarily leave out broader definitions of reproductive health and welfare when viewed through a feminist lens. Within the larger context of health policymaking in India, moves towards privatisation then demonstrate the peculiarity of dataveillance as it functions through an unaccountable and pervasive overlapping of state and private surveillance practices. It remains to be seen how these trends in ICT-driven health policies affect access to reproductive rights and decisional privacy for millions of females in India and other parts of the global South.&lt;/p&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/data-infrastructures-inequities-reproductive-health-surveillance-india'&gt;https://cis-india.org/internet-governance/blog/data-infrastructures-inequities-reproductive-health-surveillance-india&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Aayush Rathi and Ambika Tandon</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Big Data</dc:subject>
    
    
        <dc:subject>Data Systems</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Researchers at Work</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Research</dc:subject>
    
    
        <dc:subject>BD4D</dc:subject>
    
    
        <dc:subject>Healthcare</dc:subject>
    
    
        <dc:subject>Surveillance</dc:subject>
    
    
        <dc:subject>Big Data for Development</dc:subject>
    

   <dc:date>2019-12-30T16:44:32Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>




</rdf:RDF>
