<?xml version="1.0" encoding="utf-8" ?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:syn="http://purl.org/rss/1.0/modules/syndication/" xmlns="http://purl.org/rss/1.0/">




    



<channel rdf:about="https://cis-india.org/blogs/RSS">
  <title>All Blogs</title>
  <link>https://cis-india.org</link>
  
  <description></description>
  
  
  
            <syn:updatePeriod>daily</syn:updatePeriod>
            <syn:updateFrequency>1</syn:updateFrequency>
            <syn:updateBase>2008-09-22T07:49:00Z</syn:updateBase>
        
  
  <image rdf:resource="https://cis-india.org/logo.png"/>

  <items>
    <rdf:Seq>
        
            <rdf:li rdf:resource="https://cis-india.org/raw/unrecognised-and-vulnerable-reviewing-social-protection-instruments-for-platform-workers"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/raw/how-concentration-kills-innovation-a-round-table-on-investing-and-innovating-in-the-age-of-big-tech"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/mapping-the-legal-and-regulatory-frameworks-of-the-ad-tech-ecosystem-in-india"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/emotional-contagion-theorising-role-of-affect-in-covid-19-information-disorder"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/raw/whose-technology-is-it-anyway-an-exploratory-essay-on-the-political-economy-of-indias-digital-revolution"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/emotional-contagion-theorising-role-of-affect-in-covid-19-information-disorder"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/emotional-contagion-theorising-the-role-of-affect-in-covid-19"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/raw/navigating-the-digitalisation-of-finance"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/the-cost-of-free-basics-in-india"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/openness/state-of-openness-in-indias-e-governance-applications"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/aparna-bhatnagar-and-amrita-sengupta-education-epistemologies-and-ai-understanding-role-of-generative-ai-in-education"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/cis-comments-and-feedback-to-digital-personal-data-protection-rules-2025"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/cis-comments-and-recommendations-to-report-on-ai-governance-guidelines-development"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/a2k/blogs/she-leads-bootcamp"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/submission-to-igf-2025-call-for-thematic-inputs"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/do-we-need-separate-health-data-law-in-india"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/raw/bahujan-digital-publishing-infrastructures"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/blog/privacy-policy-framework-for-indian-metal-health-apps"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/internet-governance/cis-digest-2024"/>
        
        
            <rdf:li rdf:resource="https://cis-india.org/raw/explainer-predatory-pricing"/>
        
    </rdf:Seq>
  </items>

</channel>


    <item rdf:about="https://cis-india.org/raw/unrecognised-and-vulnerable-reviewing-social-protection-instruments-for-platform-workers">
    <title>Unrecognised and Vulnerable: Reviewing social protection instruments for platform workers</title>
    <link>https://cis-india.org/raw/unrecognised-and-vulnerable-reviewing-social-protection-instruments-for-platform-workers</link>
    <description>
        &lt;b&gt;In this issue brief, Chetna V M and Chiara Furtado reflect on developments in social protection for platform workers in India, issues of exclusions and insufficient coverage for workers, and tensions between existing social protection instruments and the call for labour rights-affirming protections for platform workers.
&lt;/b&gt;
        
&lt;p id="docs-internal-guid-2451d7fe-7fff-b118-cc25-a56bbcb59609" style="text-align: justify;" dir="ltr"&gt;&amp;nbsp;&lt;/p&gt;
&lt;p style="text-align: justify;" dir="ltr"&gt;Platform work is often touted by industry and the state as a form of aspirational employment and opportunity. However, behind this promise lies exploitative working conditions, safety concerns, high operational costs, and inadequate earnings. A key concern is the denial of employment protections, as platform companies are notorious for evading their responsibility by classifying platform workers as “partners” or “micro-entrepreneurs” rather than employees. There has been growing pressure from platform workers’ unions, workers’ collectives, and civil society in the past several years towards recognising and securing platform workers’ rights. This has resulted in steps taken by both governments and platform companies—although substantially limited—to extend social security benefits to platform workers. Currently, in India, social protection instruments available to platform workers are ad-hoc and fragmented across various government and platform initiatives.&amp;nbsp;&amp;nbsp;&lt;/p&gt;
&lt;p style="text-align: justify;" dir="ltr"&gt;&amp;nbsp;&lt;/p&gt;
&lt;p style="text-align: justify;" dir="ltr"&gt;This issue brief offers an overview of the various social protection instruments for platform workers designed by governments and platform companies. It highlights exclusions of platform workers in key social protection benefits, discusses challenges in accessing eligible social protections, reveals the insidious role of the design of social protection instruments in denying the legal employment recognition of platform workers. Finally, the brief foregrounds longstanding and unmet demands by platform workers and unions on critical social protections. The social protection instruments discussed in this brief include government responses including the Code on Social Security and the E-Shram portal, the Motor Vehicle Aggregator Guidelines, state-level legislation, as well platform-specific initiatives.&lt;/p&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;p style="text-align: justify;" dir="ltr"&gt;Read the full issue brief &lt;a href="https://cis-india.org/raw/unrecognised-and-vulnerable-social-protection-pdf" class="internal-link" title="Unrecognised-and-Vulnerable-Social-Protection-pdf"&gt;here&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/raw/unrecognised-and-vulnerable-reviewing-social-protection-instruments-for-platform-workers'&gt;https://cis-india.org/raw/unrecognised-and-vulnerable-reviewing-social-protection-instruments-for-platform-workers&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Chetna V M and Chiara Furtado</dc:creator>
    <dc:rights></dc:rights>


   <dc:date>2025-06-18T13:20:10Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/raw/how-concentration-kills-innovation-a-round-table-on-investing-and-innovating-in-the-age-of-big-tech">
    <title>How Concentration Kills Innovation: A Round Table on Investing and Innovating in the Age of Big Tech</title>
    <link>https://cis-india.org/raw/how-concentration-kills-innovation-a-round-table-on-investing-and-innovating-in-the-age-of-big-tech</link>
    <description>
        &lt;b&gt;We are at the cusp of a pivotal technological transformation – driven by rapid advancements in Artificial Intelligence, computing, robotics, biotechnology, and digital infrastructure, including the next generation of communication technologies. India aspires to be one of the leaders in this technological age, with ambitions of achieving a US$1 trillion digital economy in the near future. Domestic startups are anticipated to be a major driving force behind achieving this milestone, and as such have been instrumental in ushering in investments from state and market actors alike.&lt;/b&gt;
        &lt;p&gt;At the same time, AI continues to ride a wave of popularity fuelled by a surge in corporate investment, an increase in M&amp;amp;A activity, and aggressive hiring strategies. Over the past three years, AI has emerged as the most hyped ‘innovation’ of our times, driven largely by the emergence and rapid adoption of Large Language Models (LLMs) such as OpenAI’s GPT and Meta’s Llama. Although AI was initially touted to disrupt Big Tech’s dominance and create a vibrant competitive landscape, it is hard to imagine current AI systems without Big Tech.&lt;br /&gt;&lt;br /&gt;These phenomena of market consolidation and Big Tech dependence raise some important concerns for all involved, but most pertinently for entrepreneurs and investors – who form the core constituency of today’s digital revolution. Some of these concerns reflect the funders’ sectoral interests, and how their choices are affected by Big Tech’s market consolidation. Other issues relate to the outcomes of these choices, and whether the current arc of innovation excludes, or even harms, certain stakeholders.&lt;br /&gt;&lt;br /&gt;To this end, Abhineet Nayyar and Isha Suri – with research assistance from Girish Chandra and Ayush Menon – conducted an analysis of the three seasons of Shark Tank India, and organised an online roundtable discussion with entrepreneurs, investors, and market researchers. Their analysis, and the subsequent roundtable, aimed to answer the following key questions:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;How does the startup ecosystem’s dependence on Big Tech affect i) how entrepreneurs think about innovation, and ii) how investors think about funding?&lt;/li&gt;
&lt;li&gt;How can regulation be channelled to facilitate innovation in this context?&lt;/li&gt;
&lt;li&gt;What role do mergers, acquisitions, and investments play in killing market competition?&lt;/li&gt;
&lt;li&gt;Since ‘innovativeness’ remains a subjective concept, what kinds of metrics (proxy or otherwise) does the startup ecosystem usually rely on to identify innovative ideas?&lt;/li&gt;
&lt;/ol&gt;
&lt;hr /&gt;
&lt;p&gt;You can read through their findings and key learnings from the roundtable &lt;a class="external-link" href="http://cis-india.org/raw/files/concentration-and-innovation-roundtable-discussion-note"&gt;here&lt;/a&gt;.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/raw/how-concentration-kills-innovation-a-round-table-on-investing-and-innovating-in-the-age-of-big-tech'&gt;https://cis-india.org/raw/how-concentration-kills-innovation-a-round-table-on-investing-and-innovating-in-the-age-of-big-tech&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Abhineet Nayyar and Isha Suri</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Researchers at Work</dc:subject>
    

   <dc:date>2025-04-29T16:27:18Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/mapping-the-legal-and-regulatory-frameworks-of-the-ad-tech-ecosystem-in-india">
    <title>Mapping the Legal and Regulatory Frameworks of the Ad-Tech Ecosystem in India</title>
    <link>https://cis-india.org/internet-governance/blog/mapping-the-legal-and-regulatory-frameworks-of-the-ad-tech-ecosystem-in-india</link>
    <description>
        &lt;b&gt;The main purpose of regulation in any sector is essentially twofold: one is to ensure that the interests of the general public or consumers are protected, and the other is to ensure that the sector itself flourishes and grows. Too much regulation may stifle the commercial potential of a sector, whereas too little runs the risk of leaving consumers vulnerable to harmful practices.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;In this paper, we try to map the legal and regulatory framework dealing with Advertising Technology (Adtech) in India as well as a few other leading jurisdictions. Our analysis is divided into three main parts, the first being general consumer regulations, which apply to all advertising irrespective of the media – to ensure that advertisements are not false or misleading and do not violate any laws of the country. This part also covers the consumer laws which are specific to malpractices in the technology sector such as Dark Patterns, Influencer based advertising, etc.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The second part of the paper covers data protection laws in India and how they are relevant for the Adtech industry. The Adtech industry requires and is based on the collection and processing of large amounts of data from the users. It is therefore important to discuss the data protection and consent requirements that have been laid out in the spate of recent data protection regulations, which have the potential to severely impact the Adtech industry.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The last part of the paper covers the competition angle of the Adtech industry. Like with social media intermediaries, the Adtech industry in the world is also dominated by two or three players and such a scenario always lends itself easily to anti-competitive practices. It is therefore imperative to examine the competition law framework to see whether the laws as they exist are robust enough to deal with any possible anti competitive practices that may be prevalent in the Adtech sector.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The research was reviewed by Pallavi Bedi, it can be &lt;a class="external-link" href="http://cis-india.org/internet-governance/files/mapping-the-legal-and-regulatory-frameworks-of-the-ad-tech-ecosystem-in-india"&gt;accessed here&lt;/a&gt;.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/mapping-the-legal-and-regulatory-frameworks-of-the-ad-tech-ecosystem-in-india'&gt;https://cis-india.org/internet-governance/blog/mapping-the-legal-and-regulatory-frameworks-of-the-ad-tech-ecosystem-in-india&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>vipul</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2025-04-24T14:52:29Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/emotional-contagion-theorising-role-of-affect-in-covid-19-information-disorder">
    <title>Emotional Contagion: Theorising the Role of Affect in COVID-19 Information Disorder</title>
    <link>https://cis-india.org/internet-governance/emotional-contagion-theorising-role-of-affect-in-covid-19-information-disorder</link>
    <description>
        &lt;b&gt;In this paper, we investigate the underexplored emotional drivers of information disorder, with a particular focus on how it manifested in COVID-19 misinformation in India. While "fake news" has received considerable attention for its impact on elections, marginalized communities, and public health, mainstream information disorder research does not sufficiently prioritise the underlying psychological factors that influence information trust. &lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;&lt;span style="text-align: start; float: none; "&gt;By incorporating theoretical frameworks from psychology, sociology, and communication studies, we reveal the complex foundations of both the creation and consumption of misinformation. From this research, fear emerged as the predominant emotional driver in both the creation and consumption of misinformation, demonstrating how negative affective responses frequently override rational analysis during crises. Our findings suggest that effective interventions must address these affective dimensions through tailored digital literacy programs, diversified information sources on online platforms, and expanded multimodal misinformation research opportunities in India.&lt;/span&gt;&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;&lt;span style="text-align: start; float: none; "&gt;Click to download the &lt;a class="external-link" href="https://cis-india.org/internet-governance/files/emotional-contagion.pdf"&gt;research paper&lt;/a&gt;&lt;strong&gt;&lt;a class="external-link" href="http://cis-india.org/internet-governance/emotional-contagion"&gt;&lt;br /&gt;&lt;/a&gt;&lt;/strong&gt;&lt;/span&gt;&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/emotional-contagion-theorising-role-of-affect-in-covid-19-information-disorder'&gt;https://cis-india.org/internet-governance/emotional-contagion-theorising-role-of-affect-in-covid-19-information-disorder&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Yesha Tshering Paul and Amrita Sengupta</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Information Disorders</dc:subject>
    
    
        <dc:subject>Fake News</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Covid19</dc:subject>
    

   <dc:date>2025-04-14T05:23:21Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/raw/whose-technology-is-it-anyway-an-exploratory-essay-on-the-political-economy-of-indias-digital-revolution">
    <title>Whose Technology is it Anyway? An exploratory essay on the political economy of India's digital revolution</title>
    <link>https://cis-india.org/raw/whose-technology-is-it-anyway-an-exploratory-essay-on-the-political-economy-of-indias-digital-revolution</link>
    <description>
        &lt;b&gt;The story of India's digital journey has become an oft-cited tale of economic success across the globe, inspiring similar experiments in other nations in the Global Majority world - most prominently across sub-Saharan Africa. At home, however, this tale has been used to rapidly normalise the deployment of digital technologies. In this process, these innovations have not just bolstered the state's control over individuals and their actions, but have also enabled the tech elite to extract more value from workers, small businesses, and even consumers.&lt;/b&gt;
        &lt;p style="text-align: left; "&gt;&lt;span style="text-align: start; float: none; "&gt;In this exploratory essay, Abhineet reiterates that technology's presumed benefits are not, and have never quite been, an inevitable consequence of progress. By presenting a historical view of the industrial revolution and the green revolution, the piece instead shows that the outcomes of any innovation depend crucially on the choices made by the political and economic elite of the time. Consequently, to truly reflect the interests of the country's masses, the essay calls for India's digital revolution to also take this insight into account.&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: left; "&gt;&lt;span style="text-align: start; float: none; "&gt;The central thesis of the essay is borrowed from Daron Acemoglu and Simon Johnson's 'Power and Progress', and its substantiation for the various historical moments is done through secondary material.&lt;/span&gt;&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: left; "&gt;&lt;span style="text-align: start; float: none; "&gt;&lt;a class="external-link" href="http://cis-india.org/raw/files/whose-technology-anyway"&gt;Click here&lt;/a&gt; to download the research&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/raw/whose-technology-is-it-anyway-an-exploratory-essay-on-the-political-economy-of-indias-digital-revolution'&gt;https://cis-india.org/raw/whose-technology-is-it-anyway-an-exploratory-essay-on-the-political-economy-of-indias-digital-revolution&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Abhineet Nayyar</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Telecom</dc:subject>
    
    
        <dc:subject>Researchers at Work</dc:subject>
    

   <dc:date>2025-04-11T15:42:41Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/emotional-contagion-theorising-role-of-affect-in-covid-19-information-disorder">
    <title>Emotional Contagion: Theorising the Role of Affect in COVID-19 Information Disorder</title>
    <link>https://cis-india.org/internet-governance/blog/emotional-contagion-theorising-role-of-affect-in-covid-19-information-disorder</link>
    <description>
        &lt;b&gt;In this paper, we investigate the underexplored emotional drivers of information disorder, with a particular focus on how it manifested in COVID-19 misinformation in India. While "fake news" has received considerable attention for its impact on elections, marginalized communities, and public health, mainstream information disorder research does not sufficiently prioritise the underlying psychological factors that influence information trust. &lt;/b&gt;
        
&lt;p style="text-align: justify;"&gt;&lt;span style="text-align: start; float: none;"&gt;By incorporating theoretical frameworks from psychology, sociology, and communication studies, we reveal the complex foundations of both the creation and consumption of misinformation. From this research, fear emerged as the predominant emotional driver in both the creation and consumption of misinformation, demonstrating how negative affective responses frequently override rational analysis during crises. Our findings suggest that effective interventions must address these affective dimensions through tailored digital literacy programs, diversified information sources on online platforms, and expanded multimodal misinformation research opportunities in India.&lt;/span&gt;&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify;"&gt;&lt;span style="text-align: start; float: none;"&gt;Click to download the &lt;a href="https://cis-india.org/internet-governance/files/emotional-contagion.pdf" class="internal-link" title="Emotional Contagion: Theorising the Role of Affect in COVID-19 Information Disorder"&gt;research paper&lt;strong&gt;&lt;br /&gt;&lt;/strong&gt;&lt;/a&gt;&lt;/span&gt;&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/emotional-contagion-theorising-role-of-affect-in-covid-19-information-disorder'&gt;https://cis-india.org/internet-governance/blog/emotional-contagion-theorising-role-of-affect-in-covid-19-information-disorder&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Yesha Tshering Paul and Amrita Sengupta</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Information Disorders</dc:subject>
    
    
        <dc:subject>Fake News</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Covid19</dc:subject>
    

   <dc:date>2025-04-14T18:51:27Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/emotional-contagion-theorising-the-role-of-affect-in-covid-19">
    <title>Emotional Contagion: Theorising the Role of Affect in COVID-19 Information Disorder</title>
    <link>https://cis-india.org/internet-governance/emotional-contagion-theorising-the-role-of-affect-in-covid-19</link>
    <description>
        &lt;b&gt;In this paper, we investigate the underexplored emotional drivers of information disorder, with a particular focus on how it manifested in COVID-19 misinformation in India. While "fake news" has received considerable attention for its impact on elections, marginalized communities, and public health, mainstream information disorder research does not sufficiently prioritise the underlying psychological factors that influence information trust. &lt;/b&gt;
        
&lt;p style="text-align: justify;"&gt;&lt;span style="text-align: start; float: none;"&gt;By incorporating theoretical frameworks from psychology, sociology, and communication studies, we reveal the complex foundations of both the creation and consumption of misinformation. From this research, fear emerged as the predominant emotional driver in both the creation and consumption of misinformation, demonstrating how negative affective responses frequently override rational analysis during crises. Our findings suggest that effective interventions must address these affective dimensions through tailored digital literacy programs, diversified information sources on online platforms, and expanded multimodal misinformation research opportunities in India.&lt;/span&gt;&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify;"&gt;&lt;span style="text-align: start; float: none;"&gt;Click to download the &lt;a class="external-link" href="http://cis-india.org/internet-governance/files/emotional-contagion-theorising-the-role-of-affect-in-covid-19"&gt;research paper&lt;/a&gt; &lt;strong&gt;&lt;a class="external-link" href="http://cis-india.org/internet-governance/emotional-contagion"&gt; &lt;br /&gt;&lt;/a&gt;&lt;/strong&gt;&lt;/span&gt;&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/emotional-contagion-theorising-the-role-of-affect-in-covid-19'&gt;https://cis-india.org/internet-governance/emotional-contagion-theorising-the-role-of-affect-in-covid-19&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Yesha Tshering Paul and Amrita Sengupta</dc:creator>
    <dc:rights></dc:rights>


   <dc:date>2025-04-14T10:51:56Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/raw/navigating-the-digitalisation-of-finance">
    <title>Navigating the Digitalisation of Finance: User experiences of risks and harms</title>
    <link>https://cis-india.org/raw/navigating-the-digitalisation-of-finance</link>
    <description>
        &lt;b&gt;Our study unpacks the experiences of marginalised users navigating the digitalisation of finance. Through a survey of 3,784 users, 18 interviews and 7 focus group discussions, our study’s findings highlight user experiences of risks and harms while accessing digital financial services, unpacking experiences specifically of persons with disabilities, transgender persons, gender and sexual minorities, elderly persons, women, regional language-first users, and persons facing digital and economic vulnerabilities.
&lt;/b&gt;
        
&lt;p&gt;Read the &lt;a href="https://cis-india.org/CIS_Navigating-the-digitalisation-of-finance" class="external-link"&gt;full report here&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;Executive Summary&lt;/h2&gt;
&lt;p style="text-align: justify;"&gt;The last couple of decades have seen significant changes in the financial ecosystem in India, both within the fintech sector and with respect to digital financial inclusion. The rapid growth in the reach of banking services to previously unbanked citizens through the Pradhan Mantri Jan Dhan Yojana has been followed by digitalisation in financial and public services.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;However, a commensurate increase in digital and financial literacy has not followed, and rates of access to digital devices and the internet are still growing for many user groups, like rural women and regional language speakers. From the proliferation of fraudulent schemes and cybercrime to regulatory loopholes and inadequate consumer protections, the landscape of online financial services in India presents numerous risks. Factors such as weak cybersecurity measures, data breaches, lack of awareness among users, and the absence of comprehensive regulations create a fertile ground for financial scams. Simultaneously, rapid digitalisation of financial services, especially post demonetisation and the COVID-19 pandemic, has also brought to the fore concerns around omissions and exclusion of sections of users from databases, and a steep learning curve in adapting to this new digital ecosystem.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;These combined factors open up users to a range of potential financial risks and harms, with differential impact on specific marginalised and vulnerable groups. With this understanding, we use the term digital financial harms to refer to adverse financial outcomes and other related detrimental consequences in the use of digital financial services.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;Through this study, we aim to situate these experiences in a continuum of harms within a rapidly digitalising financial ecosystem. By exploring questions of access, accessibility and language we hope to bring aspects of cybersecurity, digital and financial literacy and design justice into conversation with each other. While some research has aimed to understand technology-facilitated gender based violence, financial fraud, misinformation, and other forms of digital risks in siloes, the correlations between these risks online remain severely understudied. In this report, we focus on the experiences of groups long marginalised within the financial system, to recommend that their needs are centred in shaping digital financial services.&lt;/p&gt;
&lt;p&gt;Key questions guiding our research were:&lt;/p&gt;
&lt;ul&gt;
&lt;li style="text-align: justify;"&gt;How were digital financial risks understood and experienced by users of digital financial services across groups? What factors have amplified risks for marginalised and at-risk user groups?&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;What potential vulnerabilities, risks and harms have emerged relating to digital financial services around device and internet access, accessibility, challenges with use, exclusions from digitalised social protection, and forms of social engineering and digital financial fraud?&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;How accessible were digital financial service providers’ and governments’ reporting and grievance redressal systems?&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;What role should fintech platforms, social media platforms, banking and financial institutions, government, and regulatory bodies play in reducing digital financial risks across the ecosystem?&lt;/li&gt;&lt;/ul&gt;
&lt;p style="text-align: justify;"&gt;This was a mixed methods study, consisting of a review of available literature in the field, followed by quantitative and qualitative data collection through surveys and in-depth interviews. The report highlights the experiences of persons with disabilities, gender minorities, the elderly, low income users, and regional language first users; to better understand how discrimination, exclusion or slow redressal processes may increase their risk or cause disproportionate harm when using digital financial services. It discusses users’ experiences of fraud in the context of an evolving regulatory ecosystem, as well as practical challenges users face with redressal systems.&lt;/p&gt;
&lt;p&gt;Key findings include:&lt;/p&gt;
&lt;ol&gt;
&lt;li style="text-align: justify;"&gt;Access to digital financial services, still requires improving      access and accessibility of physical and phygital banking services, and      good internet connectivity.&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;Even among mobile phone owners, many users still rely on shared      devices, particularly among women and persons with disabilities.&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;There is a high need for support in utilising net banking services,      with 60% of surveyed netbanking users mentioning that they sought help to      conduct online banking transactions. Migrating to digital financial      services is not a purely digital journey for users who are still building      comfort with digital interfaces, or those whose languages are      deprioritised in the development of digital financial platforms.&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;Age, gender, and income were significant factors in the access to      the internet, and adoption of digital financial services. For instance,      women and transgender persons over 45 years were less likely to have a      Unified Payments Interface (UPI) account. Women, transgender persons, and      disabled users of UPI were also more likely to be infrequent users      compared to the rest of the sample.&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;Over reliance on digital platforms for the administration of direct      benefit transfer programmes results in challenges and risks of exclusions      for beneficiaries.&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;While awareness of common forms of fraud is high, awareness of      security protocols, Know Your Customer (KYC) requirements and markers of      trustworthy banking and non-banking institutions is low.&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;Irrespective of the amount of money lost during frauds, it caused      significant financial and emotional burden, especially for low-income      persons.&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;In the absence of monitoring frameworks, bad actors within the      financial system are able to exploit vulnerabilities like the dependence      of account holders on banking correspondents.&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;Gender and sexual minorities, and women face disproportionate      impacts of harm in the event of financial loss, including the consequences      of image-based sexual abuse, victim blaming, domestic violence and limits      on financial independence.&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;Ineffective grievance redressal for cybercrimes is a major      deterrent to reporting.&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;Digitalisation and the use of assistive technology have allowed      some persons with visual impairments to gain relative levels of      independence in managing their own finances and conducting transactions.      However, implementation of accessibility policies and features remains      uneven, and is marred by the continued exclusion and discrimination within      traditional banking services.&lt;/li&gt;&lt;/ol&gt;
&lt;p style="text-align: justify;"&gt;Based on these findings, this report offers a set of recommendations addressed to stakeholders within the financial ecosystem such as banking and other financial institutions, regulatory bodies, fintech companies, cybersecurity professionals, as well as social media platforms and civil society organisations working on digital inclusion, safety and literacy. The recommendations offer nuanced perspectives on how digital financial harms can be prevented and mitigated based on our interactions with various stakeholders during the research process.&lt;/p&gt;
&lt;p&gt;Key recommendations emerging from the study are:&lt;/p&gt;
&lt;ol&gt;
&lt;li style="text-align: justify;"&gt;Create meaningful connectivity and access to digital platforms by      improving public infrastructure and addressing the challenges associated      with shared devices and mediated use.&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;Improve platform design to engender trust; increase accessibility      and usability through assessment and better implementation of available      technologies, regular design audits and facilitate availability in Indian      languages.&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;Building awareness and capacity across user and stakeholder groups      through customised and inclusive programming, working in partnership with      communities.&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;Centre consumer protection in regulatory interventions and      approaches to law enforcement, by implementing robust time-sensitive      reporting and redressal mechanisms, placing accountability on financial      institutions, and monitoring and curbing fraudulent activity.&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;Encourage transparent governance and public oversight by measuring      and evaluating digital public infrastructures to maximise their public      value.&lt;/li&gt;&lt;/ol&gt;
&lt;h1&gt;Contributors&lt;/h1&gt;
&lt;p style="text-align: justify;"&gt;&lt;strong&gt;Research design and/or report writing:&lt;/strong&gt; Amrita Sengupta, Chiara Furtado, Garima Agrawal, Nishkala Sekhar, Puthiya Purayil Sneha, and Vipul Kharbanda&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;&lt;strong&gt;Research advice and/or review:&lt;/strong&gt; Antara Rai Chowdhury, Janaki Srinivasan, Nayantara Sarma, Palak Gadhiya, Pallavi Bedi, Sameet Panda, Semanti Chakladar, Shashidhar K J, Shweta Mohandas, and Taranga Sriraman&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;&lt;strong&gt;Research and/or data analysis support:&lt;/strong&gt; Chetna V M, Pallavi Krishnappa, and Yesha Tshering Paul&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;&lt;strong&gt;Data analysis:&lt;/strong&gt; Chiara Furtado, Garima Agrawal, and Nishkala Sekhar&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;&lt;strong&gt;Research tool translation:&lt;/strong&gt;&amp;nbsp;  Aravind R (Kannada), Balaji J (Tamil), Bhaskar Bhuyan (Assamese), Nettime Sujata (Bangla), and Suresh Khole (Marathi)&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;&lt;strong&gt;Research tool pilots:&lt;/strong&gt; Raveenaben (Megha Cooperative, SEWA), Sunaben (Megha Cooperative, SEWA), and Raja Mouli N&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;&lt;strong&gt;Data collection (survey):&lt;/strong&gt; D-Cor Consulting&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;&lt;strong&gt;Data collection (focus group discussions):&lt;/strong&gt; D-Cor Consulting; Jnana Prabodhini, Pune; Transgender Rights Association, Chennai; and Subodh Kulkarni&lt;/p&gt;
&lt;p&gt;This work is shared under the &lt;a href="https://creativecommons.org/licenses/by-sa/4.0/"&gt;Creative Commons Attribution-ShareAlike 4.0 International License (CC BY-SA 4.0)&lt;/a&gt;.&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/raw/navigating-the-digitalisation-of-finance'&gt;https://cis-india.org/raw/navigating-the-digitalisation-of-finance&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Amrita Sengupta, Chiara Furtado, Garima Agrawal, Nishkala Sekhar, Puthiya Purayil Sneha, and Vipul Kharbanda</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Financial Platforms</dc:subject>
    
    
        <dc:subject>Digital Financial Harms</dc:subject>
    
    
        <dc:subject>Homepage</dc:subject>
    
    
        <dc:subject>Digital Financial Services</dc:subject>
    
    
        <dc:subject>Featured</dc:subject>
    
    
        <dc:subject>Researchers at Work</dc:subject>
    

   <dc:date>2025-04-10T05:49:23Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/the-cost-of-free-basics-in-india">
    <title>The Cost of Free Basics in India: Does Facebook's 'walled garden' reduce or reinforce digital inequalities?</title>
    <link>https://cis-india.org/internet-governance/blog/the-cost-of-free-basics-in-india</link>
    <description>
        &lt;b&gt;In this essay, written in April 2016 soon after India's Telecom Regulatory Authority (TRAI) upheld net neutrality and effectively banned Free Basics in India, the author uses development theories to study the Free Basics programme. The author explores three key paradigms: (1) construction of knowledge, power structures, and virtual colonisation in the Free Basics programme; (2) a sub-internet of the marginalised; and (3) the Capabilities Approach, examining how the programme reinforces digital inequalities rather than reducing them. This essay was written in 2016, and there have been various shifts in the digital and tech landscape since; many of the numbers and statistics are from 2016, and not all ideas held here may be transferable today. It should be read as such. It is being published now to mark ten years since the Free Basics programme was set to be implemented in India. &lt;/b&gt;
        &lt;p&gt;&lt;span id="m_7467646325406972221m_3271523195114453167docs-internal-guid-0bbf9e25-7fff-1674-a0f6-07e4a18be700"&gt;&lt;span&gt;In 2015, Facebook introduced &lt;a href="http://internet.org" rel="noreferrer" target="_blank"&gt;internet.org&lt;/a&gt; in India and it faced a lot of criticism. The programme was relaunched  as the Free Basics programme, ostensibly to provide, free of cost, access to the Internet to the economically  deprived section of society. The content, i.e. websites, were pre-selected  by Facebook and was provided by third-party providers. Later, Telecom Regulatory Authority of India (TRAI) ruled in favor of net neutrality, banning the program in India. A crucial conversation in this debate was also about whether the Free Basics program was going to actually be helpful for those it set out to support. &lt;/span&gt;&lt;span&gt; &lt;/span&gt;&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span id="m_7467646325406972221m_3271523195114453167docs-internal-guid-6bb0cb03-7fff-1427-120a-c31ca11f2c7c"&gt; &lt;/span&gt;&lt;/p&gt;
&lt;p dir="ltr"&gt;&lt;span&gt;T&lt;/span&gt;&lt;span&gt;his paper examines Facebook’s Free Basics programme and its perceived role in bridging digital divides, &lt;/span&gt;&lt;span&gt; &lt;/span&gt;&lt;span&gt;in the context of India, where it has been widely debated, criticized and finally banned in a ruling from &lt;/span&gt;&lt;span&gt; T&lt;/span&gt;&lt;span&gt;elecom Regulatory Authority of India (TRAI). &lt;/span&gt;&lt;span&gt;While the debate on the Free Basics programme has &lt;/span&gt;&lt;span&gt; &lt;/span&gt;&lt;span&gt;largely been embroiled around the principles of network neutrality, this paper will try to examine it from an ICT4D perspective, embedding the &lt;/span&gt;&lt;span&gt; &lt;/span&gt;&lt;span&gt;discussion around key development paradigms.&lt;/span&gt;&lt;span&gt; &lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;This essay begins by introducing the Free Basics programme in India and the associated proceedings, following which existing literature is reviewed to explore the concept of development and the perceived role of ICT in development, thus laying out the scope of this discussion. The essay then examines whether the Free Basics programme reduces or reinforces digital inequality by looking at three development paradigms: (1) construction of knowledge, power structures, and virtual colonisation in the Free Basics programme; (2) a sub-internet of the marginalised, looking at second-level digital divides; and (3) the Capabilities Approach and the premise of connectivity as a source of equality and freedom.&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span id="m_7467646325406972221m_3271523195114453167docs-internal-guid-bf9460ef-7fff-ea3a-581d-9dc8c753f219"&gt; &lt;/span&gt;&lt;/p&gt;
&lt;p dir="ltr"&gt;&lt;span&gt;The essay concludes with a view that the need for  digital access should be viewed as a subset of overall contextual development as opposed to programs unto  themselves and taking purely techno-solutionist approaches. &lt;/span&gt;&lt;span&gt;There is a requirement for effective needs  identification as part of ICT4D research to locate the users at the center and not at the periphery of the  discussions. Lastly, policymakers should look into the addressal of more basic concerns like that of access and connectivity and not just on solutions which can be claimed as “quick-wins” in policy implementation. &lt;/span&gt;&lt;/p&gt;
&lt;hr /&gt;
&lt;p dir="ltr"&gt;&lt;span&gt;&lt;a class="external-link" href="https://cis-india.org/internet-governance/files/free-basics"&gt;Click to download the Essay&lt;/a&gt;&lt;/span&gt;&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/the-cost-of-free-basics-in-india'&gt;https://cis-india.org/internet-governance/blog/the-cost-of-free-basics-in-india&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Amrita Sengupta</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Free Basics</dc:subject>
    
    
        <dc:subject>Facebook</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    

   <dc:date>2025-04-05T04:10:28Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/openness/state-of-openness-in-indias-e-governance-applications">
    <title>State of Openness in India's E-Governance Applications</title>
    <link>https://cis-india.org/openness/state-of-openness-in-indias-e-governance-applications</link>
    <description>
        &lt;b&gt;Open source software (OSS), also commonly known as free and open source software (FOSS) or free libre open source software (FLOSS), is software that is made available with its source code. It is licensed liberally, granting users access to study, use, modify, improve, or redistribute it. This work was sponsored by Mozilla Foundation. &lt;/b&gt;
        &lt;p style="text-align: left; "&gt;In this context, the term ‘open’ refers to the source code being made available without having to pay royalties or licensing fees, while the term ‘free’ refers to the freedom to copy and use the software rather than being ‘free of cost’. The two organisations that are the self-appointed custodians of these definitions are the Free Software Foundation (FSF)2 and the Open Software Initiative (OSI).3 While the two organisations and the two terms resulted from different philosophies and represent different methodologies, the FSF and OSI acknowledge that for all practical purposes, “they both refer to essentially the same thing”4; “however, the differences in extension of the category are small: nearly all free software is open source, and nearly all open source software is free.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: left; "&gt;&lt;b&gt;&lt;a class="external-link" href="https://cis-india.org/openness/files/state-of-openness-in-indias-e-governance"&gt;Click to download the research paper&lt;/a&gt;&lt;/b&gt; authored by Upasana Hembram and reviewed by Divyansha Sehgal. Shared under Creative Commons Attribution 4.0 International license.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/openness/state-of-openness-in-indias-e-governance-applications'&gt;https://cis-india.org/openness/state-of-openness-in-indias-e-governance-applications&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Upasana Hembram</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Openness</dc:subject>
    
    
        <dc:subject>FOSS</dc:subject>
    

   <dc:date>2025-03-26T02:01:18Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/aparna-bhatnagar-and-amrita-sengupta-education-epistemologies-and-ai-understanding-role-of-generative-ai-in-education">
    <title>Education, Epistemologies and AI: Understanding the role of Generative AI in Education</title>
    <link>https://cis-india.org/internet-governance/blog/aparna-bhatnagar-and-amrita-sengupta-education-epistemologies-and-ai-understanding-role-of-generative-ai-in-education</link>
    <description>
        &lt;b&gt;As generative AI becomes more deeply embedded in educational contexts, it raises critical questions about trust, epistemic reliability, and the nature of knowledge production. While AI offers significant opportunities for enhancing pedagogical methodologies, facilitating personalised learning, and augmenting research, it also raises concerns regarding cognitive offloading, the erosion of critical thinking skills, and the perpetuation of biases inherent in training data.

This essay examines how higher education institutions navigate these complexities, focusing on institutional adaptation, ethical considerations, and policy responses. Central to this inquiry is an analysis of key theoretical frameworks in education and epistemology, used to understand how they shape the discourse around generative AI and the role of AI in the classroom. Furthermore, the study assesses existing institutional and national AI policies, evaluating their efficacy in addressing governance challenges, and offers forward-looking questions and recommendations to guide the responsible integration of generative AI in education.&lt;/b&gt;
        &lt;p&gt;&lt;span style="text-align: justify; "&gt;&lt;span&gt;&lt;a class="external-link" href="http://cis-india.org/internet-governance/files/education-epistemologies-and-ai-understanding-the-role-of-generative-ai-in-education"&gt;&lt;/a&gt;&lt;/span&gt;&lt;/span&gt;&lt;/p&gt;
&lt;hr /&gt;
&lt;p&gt;&lt;a class="external-link" href="http://cis-india.org/internet-governance/files/education-epistemologies-and-ai-understanding-the-role-of-generative-ai-in-education"&gt;Click to&lt;/a&gt; download the full text.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/aparna-bhatnagar-and-amrita-sengupta-education-epistemologies-and-ai-understanding-role-of-generative-ai-in-education'&gt;https://cis-india.org/internet-governance/blog/aparna-bhatnagar-and-amrita-sengupta-education-epistemologies-and-ai-understanding-role-of-generative-ai-in-education&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Aparna Bhatnagar and Amrita Sengupta</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Responsible AI Integration</dc:subject>
    
    
        <dc:subject>Critical Thinking</dc:subject>
    
    
        <dc:subject>Knowledge Production</dc:subject>
    
    
        <dc:subject>Education Policy</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Epistemic Trust</dc:subject>
    
    
        <dc:subject>Algorithmic Bias</dc:subject>
    
    
        <dc:subject>Artificial Intelligence</dc:subject>
    

   <dc:date>2025-03-21T15:03:58Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/cis-comments-and-feedback-to-digital-personal-data-protection-rules-2025">
    <title>The Centre for Internet and Society’s comments and feedback to the: Digital Personal Data Protection Rules 2025</title>
    <link>https://cis-india.org/internet-governance/blog/cis-comments-and-feedback-to-digital-personal-data-protection-rules-2025</link>
    <description>
        &lt;b&gt;The Centre for Internet &amp; Society (CIS) submitted its comments and feedback to the Digital Personal Data Protection Rules 2025 initiated by the Indian government.&lt;/b&gt;
        &lt;p&gt;&lt;b&gt;&lt;span style="text-decoration: underline;"&gt;Rule 3 - Notice given by data fiduciary to data principal&lt;/span&gt;&lt;/b&gt; - Under Section 5(2) of the DPDP Act, when the personal data of the data principal has been processed before the commencement of the Act, then the data fiduciary is required to give notice to the data principal as soon as reasonably practicable. However, the Rules fail to specify what is meant by reasonably practicable. The timeline for a notice in such circumstances is unclear.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;In addition, under Rule 3(a) the phrase “be presented and be understandable independently” is ambiguous. It is not clear whether the consent notice has to be presented independently of any other information, or whether it only needs to be independently understandable and can be presented along with other information.&lt;/li&gt;
&lt;li&gt;In addition, we suggest that the requirement of “privacy by design” mentioned in earlier drafts be brought back, with a focus on preventing deceptive design practices (dark patterns) from being used while collecting data.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;br /&gt;&lt;b&gt;&lt;span style="text-decoration: underline;"&gt;Rule 4 - Registration and obligations of Consent Manager&lt;/span&gt;&lt;/b&gt; - The concept of independent consent managers, similar to account aggregators in the financial sector and consent manager platforms in the EU, is a positive step. However, the Act and the Rules need to flesh out the interplay between the data fiduciary and the consent managers in more detail: for example, how does the data fiduciary know whether a data principal is using a consent manager? Under what circumstances can the data fiduciary bypass the consent manager, and what are the penalties or consequences of doing so?&lt;/p&gt;
&lt;p&gt;&lt;b&gt;&lt;span style="text-decoration: underline;"&gt;Rule 6 - Reasonable security safeguards&lt;/span&gt;&lt;/b&gt; - While we appreciate the guidance provided in terms of the measures for security such as “encryption, obfuscation or masking or the use of virtual tokens”, it would also be good to refer to the SPDI Rules and include the example of the The international Standard IS/ISO/IEC 27001 on Information Technology - Security Techniques - Information Security Management System as an illustration to guide data fiduciaries.&lt;/p&gt;
&lt;p&gt;&lt;b&gt;&lt;span style="text-decoration: underline;"&gt;Rule 7 - Intimation of personal data breach&lt;/span&gt;&lt;/b&gt; - As per the Rules, the data fiduciary on becoming aware of any personal data breach is required to notify the data principal and the Data Protection Board without delay; a plain reading of this Rule suggests that data fiduciary has to report the breach almost immediately, and this could be a practical challenge. Further, the absence of any threshold (materiality, gravity of the breach, etc) for notifying the data principal means that the data fiduciary will have to inform the data principal about even an isolated data breach which may not have an impact on the data principal. In this context, we recommend the Rule be amended to state that the data fiduciary should be required to inform the Data Protection Board about every data breach, however the data principal should be informed depending on the gravity and materiality of the breach and when it is likely to result in high risk to the data principal.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;While the Rules provide for intimation of a data breach, there is no provision requiring the Data Fiduciary to take the measures necessary to mitigate the risk arising out of the breach. Although there is an obligation to report any such measures to the Data Principal (Rule 7(1)(c)) and to the DPBI (Rule 7(2)(b)(iii)), no positive obligation is imposed on the Data Fiduciary to actually implement them; the Rules and the Act merely presume that the Data Fiduciary would take mitigation measures, which is perhaps why the notification requirements exist. This could lead to a situation where a Data Fiduciary takes no measures to mitigate the risks arising out of a data breach, yet remains in compliance with its legal obligations by merely notifying the Data Principal and the DPBI that no measures have been taken. In addition, the SPDI Rules state that in the event of a breach the body corporate is required to demonstrate that it had implemented reasonable security standards. This provision could be incorporated into this Rule to emphasise the need to implement robust security standards, which is one way to curb data breaches and to ensure that there is a protocol to mitigate a breach.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;b&gt;&lt;span style="text-decoration: underline;"&gt;Rule 10 - Verifiable consent for processing of personal data of child or of person with disability who has a lawful guardian&lt;/span&gt;&lt;/b&gt; - The two mechanisms provided under the Rules to verify the age and identity of parents pre-suppose a high degree of digital literacy on the part of the parents. They may either give or refuse consent without thinking too much about the consequences arising out of giving or not giving consent. As there is always a risk of individuals not providing the correct information regarding their age or their relationship with the child, platforms may have to verify every user’s age; thereby preventing users from accessing the platform anonymously. Further, there is also a risk of data maximisation of personal data rather than data minimisation; i.e parents may be required to provide far more information than required to prove their identity. One recommendation/suggestion that we propose is to remove the processing of children's personal data from the ambit of this law, and instead create a separate standalone legislation dealing with children’s digital rights. Another important issue to highlight here is the importance of the Digital Protection Board and its capacity to levy fines and impose strictures on the platforms. We have seen from examples from other countries that platforms are forced to redesign and provide for better privacy and data protection mechanisms when the regulator steps in and imposes high penalties&lt;/p&gt;
&lt;p&gt;&lt;b&gt;&lt;span style="text-decoration: underline;"&gt;Rule 12 - Additional obligations of Significant Data Fiduciary&lt;/span&gt;&lt;/b&gt; - The Rules do not clarify which entities will be considered as a Significant Data Fiduciary, leaving that to the government notifications. This creates uncertainty for data fiduciaries, especially smaller organisations that might not be able to set up the mechanisms and people for conducting data protection impact assessment, and auditing. The Rule provides that SDFs will have to conduct an annual Data Protection Impact Assessment. While this is a step in the right direction, the Rules are currently silent on the granularity of the DPIA. Similarly for “audit” the Rules do not clarify what type of audit is needed and what the parameters are. It is therefore imperative that the government notifies the level of details that the DPIA and the audit need to go into in order to ensure that the SDFs actually address issues where their data governance practices are lacking and not use the DPIA as a whitewashing tactic.There is also a  need to reduce some of the ambiguity with regards to the parameters, and responsibilities in order to make it easier for startups and smaller players to comply with the regulations.  In addition, while there is a need to protect data and increase responsibility on organisations collecting sensitive data or large volumes of data, there is a need to look beyond compliance and look at ways that preserve the rights of the data principal. Hence significant data fiduciaries should also be given the added responsibility of collecting explicit consent from the data principal, and also have easier access for correction of data, grievance redressal and withdrawal of consent.&lt;/p&gt;
&lt;p&gt;&lt;b&gt;&lt;span style="text-decoration: underline;"&gt;Rule 14 - Processing of personal data outside India&lt;/span&gt;&lt;/b&gt; - As per section 16 of the Act the government could, by notification, restrict the transfer of data to specific countries as notified. This system of a negative list envisaged under the Act appears to have been diluted somewhat by the use of the phrase “any foreign State” under the Rules. This ambiguity should be addressed and the language in the Rules may be altered to bring it in line with the Act. Further, the rules also appear to be ultra vires to the Act. As per the DPDP Act, personal data could be shared to outside India, except to countries which were on the negative list, however, the dilution of the provision through the rules appears to have now created a white list of countries; i.e. permissible list of countries to which data can be transferred.&lt;/p&gt;
&lt;p&gt;&lt;b&gt;&lt;span style="text-decoration: underline;"&gt;Rule 15 Exemption from Act for research, archiving or statistical purposes&lt;/span&gt;- &lt;/b&gt;While creating an exception for research and statistical purposes is an understandable objective, the current wording of the provision is vague and subject to mischief. The objective behind the provision is to ensure that research activities are not hindered due to the requirements of taking consent, etc. as required under the Act. However the way the provision is currently drafted, it could be argued that a research lab or a research centre established by a large company, for e.g. Google, Meta, etc. could also seek exemptions from the provisions of this Act for conducting “research”. The research conducted may not be shared with the public in general and may be used by the companies that funded/established the research centre. Therefore there should be further conditions attached to this provision, that would keep such research centers outside the purview of the exemption. Conditions such as making the results of the research publicly available, public interest, etc. could be considered for this purpose.&lt;/p&gt;
&lt;p&gt;&lt;b&gt;&lt;span style="text-decoration: underline;"&gt;Rule 22 - Calling for Information from data fiduciary or intermediary&lt;/span&gt; - &lt;/b&gt;This rule read with the seventh schedule appears to dilute the data minimisation and purpose limitation provisions provided for in the Act. The wide ambit of powers appears to be in contravention of the Supreme Court judgement in the Puttaswamy case, which places certain restrictions on the government while collecting personal data. This “omnibus” provision flouts guardrails like necessity and proportionality that are important to safeguard the fundamental right to privacy.&lt;/p&gt;
&lt;p&gt;It should be clarified whether this Rule is merely an enabling provision to facilitate the sharing of information, and whether only competent authorities designated under law can avail of it.&lt;/p&gt;
&lt;p&gt;&lt;b&gt;&lt;span style="text-decoration: underline;"&gt;Need for Confidentiality&lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p&gt;Additionally, the Rule mandates that the government may “require the Data Fiduciary or intermediary to not disclose” any request for information made under the Act. There is no requirement of confidentiality indicated in the governing section, i.e. Section 36, from which Rule 22 derives its authority. On the avoidance of secrecy in government business, the Supreme Court in State of U.P. v. Raj Narain, (1975) 4 SCC 428 held that:&lt;br /&gt; &lt;i&gt;“In a government of responsibility like ours, where all the agents of the public must be responsible for their conduct, there can but few secrets. The people of this country have a right to know every public act, everything, that is done in a public way, by their public functionaries. They are entitled to know the particulars of every public transaction in all its bearing. The right to know, which is derived from the concept of freedom of speech, though not absolute, is a factor which should make one wary, when secrecy is claimed for transactions which can, at any rate, have no repercussions on public security (2). To cover with [a] veil [of] secrecy the common routine business, is not in the interest of the public. Such secrecy can seldom be legitimately desired. It is generally desired for the purpose of parties and politics or personal self-interest or bureaucratic routine. The responsibility of officials to explain and to justify their acts is the chief safeguard against oppression and corruption.” &lt;/i&gt;&lt;br /&gt; In order to ensure that state interests are also protected, there may be an enabling provision whereby confidentiality may be maintained in certain instances, but there has to be a supervisory mechanism whereby such action may be judged on the anvil of legal propriety.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/cis-comments-and-feedback-to-digital-personal-data-protection-rules-2025'&gt;https://cis-india.org/internet-governance/blog/cis-comments-and-feedback-to-digital-personal-data-protection-rules-2025&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Pallavi Bedi, Vipul Kharbanda, Shweta Mohandas, Anubha Sinha and Isha Suri</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Data Governance</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Data Management</dc:subject>
    

   <dc:date>2025-03-06T02:06:44Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/cis-comments-and-recommendations-to-report-on-ai-governance-guidelines-development">
    <title>The Centre for Internet and Society’s comments and recommendations to the Report on AI Governance Guidelines Development</title>
    <link>https://cis-india.org/internet-governance/blog/cis-comments-and-recommendations-to-report-on-ai-governance-guidelines-development</link>
    <description>
        &lt;b&gt;The Centre for Internet &amp; Society (CIS) submitted its comments and recommendations on the Report on AI Governance Guidelines Development.&lt;/b&gt;
        
&lt;p&gt;With research assistance by Anuj Singh&lt;/p&gt;
&lt;hr /&gt;
&lt;h2&gt;I. Background&lt;/h2&gt;
&lt;p&gt;On 6 January 2025, the Subcommittee on ‘AI Governance and Guidelines Development’ under the Advisory Group released the Report on AI Governance Guidelines Development, which advocates a whole-of-government approach to AI governance. The sub-committee was constituted by the Ministry of Electronics and Information Technology (MeitY) on 9 November 2023 to analyse gaps and offer recommendations for developing a comprehensive framework for the governance of Artificial Intelligence (AI). As various AI governance conversations take centre stage, this is a welcome step, and we hope there will be more opportunities, through public comments and consultations, to improve this important document. &lt;br /&gt;&lt;br /&gt;In line with the submission guidelines, CIS has provided both comments and suggestions based on the headings and text provided in the report.&lt;/p&gt;
&lt;h2&gt;II. Governance of AI&lt;/h2&gt;
&lt;p&gt;The subcommittee report explains its reasons for staying away from a definition of AI. However, it would be helpful to set out the scope of AI at the outset of the report, given that different AI systems have different roles and functionalities; a clearer framework at the beginning would help readers better understand the scope of the conversation. The section also states that AI can now “perform complex tasks without active human control or supervision”. While there are instances where AI is used without active human control, there is a need to emphasise keeping humans in the loop, as highlighted in the &lt;a href="https://oecd.ai/en/dashboards/ai-principles/P6"&gt;OECD AI principles&lt;/a&gt; from which this report draws inspiration.&lt;/p&gt;
&lt;h3&gt;A. AI Governance Principles&lt;/h3&gt;
&lt;p&gt;&lt;strong&gt;A proposed list of AI Governance principles (with their explanations) is given below. &lt;/strong&gt;&lt;br /&gt;While referring to the OECD AI principles is a good first step in understanding global best practices, we suggest undertaking a mapping of all global AI principles documents published by international and multinational organisations and civil society, to determine the principles most important for India. The OECD AI principles also come from regions with better internet penetration and higher literacy rates than India; for those regions, the principle of “digital by design governance” may be achievable, but in India a digital-first approach, especially in governance, could lead to large-scale exclusions.&lt;/p&gt;
&lt;h3&gt;B. Considerations to operationalise the principles&lt;/h3&gt;
&lt;p&gt;&lt;strong&gt;1. Examining AI systems using a lifecycle approach &lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;The sub-committee has taken a novel approach to defining the AI lifecycle: the terms “Development, Deployment and Diffusion” do not appear in any of the major publications on the AI lifecycle. While some academics (e.g. &lt;a href="https://www.sciencedirect.com/org/science/article/pii/S1438887123002224"&gt;Chen et al. (2023)&lt;/a&gt;, &lt;a href="https://www.cell.com/patterns/pdfExtended/S2666-3899(22)00074-5"&gt;De Silva and Alahakoon (2022)&lt;/a&gt;) describe the AI lifecycle in terms of design, development and deployment, others (&lt;a href="https://www.sciencedirect.com/science/article/pii/S2666389922000745"&gt;Ng et al. (2022)&lt;/a&gt;) define it as “data creation, data acquisition, model development, model evaluation and model deployment”. Even NASSCOM’s &lt;a href="https://nasscom.in/ai/pdf/the-developer%27s-playbook-for-responsible-ai-in-india.pdf"&gt;Responsible AI Playbook&lt;/a&gt; follows “conception, designing, development and deployment” as some of the key stages in the AI lifecycle. Similarly, the OECD recognises “i) ‘design, data and models’; ii) ‘verification and validation’; iii) ‘deployment’; and iv) ‘operation and monitoring’” as the phases of the AI lifecycle. The subcommittee could therefore provide citations and a justification for this novel approach, and state its reasons for moving away from the recognised stages. Steering away from an understood approach could cause confusion among stakeholders who may not be well versed in AI terminology and the AI lifecycle to begin with.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;2. Taking an ecosystem-view of AI actors &lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;While the report rightly states that multiple actors are involved across the AI lifecycle, it is also important to note that the same actor can be involved in multiple stages of the lifecycle. Take, for example, an AI app used for disease diagnosis: the medical professional can be the data principal (using their own data), the data provider (using the app and thereby providing data), and the end user (using the app for diagnosis). Similarly, a government body can be the data provider, the developer (if the system is made in-house or outsourced through tenders), the deployer, and the end user. Hence, for each AI application there may be multiple actors playing different roles, and those roles may not be static. &lt;br /&gt;&lt;br /&gt;When considering governance approaches, the approach should ideally not be limited to responsibilities and liabilities, especially when the “data principal” and individual end users are highlighted as actors; it should also include rights and means of redressal in order to be a rights-based, people-centric approach to AI governance.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;3. Leveraging technology for governance &lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;While the use of a techno-legal approach in governance is picking up speed, there is a need to examine existing Central and State capacity to undertake it, and to consider how it could affect people who still do not have access to the internet. One example of a techno-legal approach that has seen some success is the &lt;a href="https://www.techinasia.com/indian-state-running-pilot-put-land-records-blockchain"&gt;Bhumi programme&lt;/a&gt; in Andhra Pradesh, which used blockchain for land records; however, it also weakened local institutions and led to the exclusion of marginalised people (&lt;a href="https://www.tandfonline.com/doi/full/10.1080/01436597.2021.2013116"&gt;Kshetri (2021)&lt;/a&gt;), and the study noted the need to strengthen existing institutions before deploying a technological measure. &lt;br /&gt;&lt;br /&gt;Secondly, while the sub-committee has emphasised the improvements in the quality of generative AI tools, there is a need to assess how these tools work for Indian use cases. It was reported last year that ChatGPT could not answer all the questions in the Indian civil services examination and failed to correctly answer questions on geography, even though it was able to crack &lt;a href="https://indiaai.gov.in/news/chatgpt-fails-to-clear-the-prestigious-civil-service-examination"&gt;tough exams in the USA&lt;/a&gt;. In addition, a month ago the Finance Ministry advised government officials to refrain from using generative AI tools on official devices for fear of leakage of &lt;a href="https://www.thehindu.com/sci-tech/technology/indias-finance-ministry-asks-employees-to-avoid-ai-tools-like-chatgpt-deepseek/article69183180.ece"&gt;confidential information&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Thirdly, the subcommittee needs to assess India’s data preparedness for a techno-legal approach at this scale. In our study on healthcare and AI in India, in which we surveyed medical professionals, hospitals and technology companies, a common understanding was that data quality in Indian datasets was an issue and that there was some reliance on data from the global north. This could be similar in other sectors as well; when such data is used to train systems, it could lead to harms and biases.&lt;/p&gt;
&lt;h2&gt;III. GAP ANALYSIS&lt;/h2&gt;
&lt;h3&gt;A. The need to enable effective compliance and enforcement of existing laws.&lt;/h3&gt;
&lt;p&gt;The sub-committee has highlighted the importance of ensuring that the growth of AI does not lead to unfair trade practices and market dominance. It is hence important to analyse whether the existing laws on antitrust and competition, and the regulatory capacity of the Competition Commission of India, are robust enough to deal with AI and the changing landscape of technology and technology developers.&lt;/p&gt;
&lt;p&gt;There is also an urgent need to assess the issues that might come under the ambit of competition throughout the lifecycle of AI, including in areas of chip manufacturing, compute, data, models and IP. While the players could keep changing in this evolving area of technology there is a need to strengthen the existing regulatory system, before looking at techno legal measures.&lt;/p&gt;
&lt;p&gt;We suggest that before a techno-legal approach is sought in all forms of governance, there is an urgent need to map the existing regulations, both central and state, assess how they apply to regulating AI, and assess the capacity of existing regulatory bodies to address AI issues. In healthcare, for example, multiple laws, policies, guidelines and regulatory bodies apply to various stages of healthcare and various actors, and at times these regulations do not refer to each other, or create duplications that could lead to a &lt;a href="https://www.kas.de/documents/d/politikdialog-asien/panorama_2024-1-107-122"&gt;lack of clarity&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Below we add our comments and suggestions on certain subsections of this section on &lt;strong&gt;the need to enable effective compliance and enforcement of existing laws&lt;/strong&gt;.&lt;/p&gt;
&lt;h3&gt;1. Intellectual property rights&lt;/h3&gt;
&lt;p&gt;&lt;strong&gt;a. Training models on copyrighted data and liability in case of&amp;nbsp; infringement&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;While Section 14 of the Indian Copyright Act, 1957 provides copyright holders with exclusive rights to copy and store works, considering that training AI models involves making &lt;a href="https://spicyip.com/2019/08/should-indian-copyright-law-prevent-text-and-data-mining.html"&gt;non-expressive uses of works&lt;/a&gt;, a straightforward conclusion cannot be drawn. Hence, the presumption that training models on copyrighted data constitutes infringement is premature and unfounded.&lt;/p&gt;
&lt;p&gt;&lt;em&gt;The report states: “The Indian law permits a very closed list of activities in using copyrighted data without permission that do not constitute an infringement. Accordingly, it is clear that the scope of the exception under Section 52(1)(a)(i) of the Copyright Act, 1957 is extremely narrow. Commercial research is not exempted; not-for-profit institutional research is not exempted. Not-for-profit research for personal or private use, not with the intention of gaining profit and which does not compete with the existing copyrighted work is exempted.” &lt;/em&gt;&lt;/p&gt;
&lt;p&gt;Indian copyright law follows a ‘hybrid’ model of limitations and exceptions under Section 52(1). Section 52(1)(a), the ‘fair dealing’ provision, is more open-ended than the rest of the clauses in the section: it permits fair dealing with any work (not being a computer programme) for the purposes of private or personal use, including research. &lt;br /&gt; &lt;br /&gt; If India is keen on indigenous AI development, specifically as it relates to foundation models, it should work towards developing frameworks for suitable exceptions, as may be appropriate. Lawmakers could distinguish between the different types of copyrighted works and public-interest purposes while considering the issue of infringement and liability.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;b. Copyrightability of work generated by using foundation models &lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;We suggest that a public consultation would be a useful exercise to ensure that the opinions and concerns of all stakeholders, including copyright holders, authors, and users, are taken into account.&lt;/p&gt;
&lt;h3&gt;C. The need for a whole-of-government approach.&lt;/h3&gt;
&lt;p&gt;While information existing in silos is a significant issue and roadblock, if the many existing guidelines and principles have taught us anything, it is that without specificity and direct applicability it is difficult for implementers to translate principles into their development, deployment and governance mechanisms. The committee assumes a sectoral understanding within the government of the various players in highly regulated sectors such as healthcare or financial services. However, as our recent study on &lt;a href="https://cis-india.org/internet-governance/blog/ai-for-healthcare-understanding-data-supply-chain-and-auditability-in-india"&gt;AI in healthcare&lt;/a&gt; indicates, there are significant information gaps in the shared understanding of what data is being used for AI development, where AI models are being developed, and what kinds of partnerships are being entered into for the development and deployment of AI systems. While the report highlights concerns about the siloed regulatory framework, it is also important to consider how sector-specific challenges lend themselves to cross-sectoral discussion. Consider, for example, an AI credit-scoring system in financial services that leads to exclusion errors.&lt;/p&gt;
&lt;p&gt;Additionally, consider an AI system deployed for disease diagnosis. While both use predictive AI, the nature of the risks and harms differs. There can be common, broad frameworks to test the efficacy of both AI models, but the exact parameters for testing them would have to be unique. Therefore, it will be important to consider where bringing together cross-sectoral stakeholders will be useful and where deeper work is needed at the sector level.&lt;/p&gt;
&lt;h2&gt;IV. Recommendations&lt;/h2&gt;
&lt;h3&gt;1. To implement a whole-of-government approach to AI Governance, MeitY and the Principal Scientific Adviser should establish an empowered mechanism to coordinate AI Governance.&lt;/h3&gt;
&lt;p&gt;We would like to reiterate the earlier section and highlight the importance of considering how sector-specific challenges lend themselves to cross-sectoral discussion. While the whole-of-government approach is welcome, as it will help build a common understanding among different government institutions, it might not be sufficient for AI governance, because it rests on the implicit assumption that internal coordination among various government bodies is enough to manage AI-related risks.&lt;/p&gt;
&lt;h3&gt;2. To develop a systems-level understanding of India’s AI ecosystem, MeitY should establish, and administratively house, a Technical Secretariat to serve as a technical advisory body and coordination focal point for the Committee/ Group.&lt;/h3&gt;
&lt;p&gt;The Subcommittee report states that, at this stage, it is not recommended to establish the Committee/Group or its Secretariat as statutory authorities, since such a decision requires significant analysis of gaps, requirements, and possible unintended outcomes. While these are valid considerations, adequate checks and balances are necessary. If the Secretariat is placed within MeitY, safeguards must be in place to ensure that officials have autonomy in decision-making. The subcommittee suggests that MeitY can bring in officials on deputation from other departments, and it also proposes bringing in experts from industry; while this is important for informed policy-making, it carries a risk of &lt;a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4931927"&gt;regulatory capture&lt;/a&gt;. Setting a cap on the percentage of industry representatives and requiring full disclosure of the affiliations of the experts involved are some safeguards that can be considered. We also suggest that members of civil society be considered for this Secretariat.&lt;/p&gt;
&lt;h3&gt;3. To build evidence on actual risks and to inform harm mitigation, the Technical Secretariat should establish, house, and operate an AI incident database as a repository of problems experienced in the real world that should guide responses to mitigate or avoid repeated bad outcomes.&lt;/h3&gt;
&lt;p&gt;The report suggests that the Technical Secretariat will develop a record of actual incidents of AI-related risks in India. In most instances, an AI incident database assumes that an AI-related unfavourable incident has already taken place, which implies that it is no longer a potential risk but an actual harm. This recommendation takes a post-facto approach to assessing AI systems, as opposed to conducting risk assessments prior to the actual deployment of an AI system. Further, it lays emphasis on receiving reports from public sector organisations deploying AI systems. Given that public sector organisations would in many cases be the deployers of AI systems rather than the developers, they may have limited know-how about the functionality of the tools and therefore about the risks and harms.&lt;/p&gt;
&lt;p&gt;It is important to clarify and define what will be considered an AI risk, as this can depend on the stakeholder: for example, a company losing clients due to an AI system is a risk, as is an individual being denied health insurance because of AI bias. With this understanding, while there is a need for active assessment of existing and emerging risks, the Technical Secretariat could also undertake a mapping of the risks already highlighted by academia, civil society and international organisations, and seed the risk database with them. In addition, the “AI incident database” should be open to research institutions and civil society organisations, similar to &lt;a href="https://oecd.ai/en/incidents"&gt;the OECD AI Incidents Monitor&lt;/a&gt;.&lt;/p&gt;
&lt;h3&gt;4. To enhance transparency and governance across the AI ecosystem, the Technical Secretariat should engage the industry to drive voluntary commitments on transparency across the overall AI ecosystem and on baseline commitments for high capability/widely deployed systems.&lt;/h3&gt;
&lt;p&gt;It is commendable that the sub-committee extends the transparency requirement to the government, with the example of law enforcement. This would create more trust in the systems and place responsibility on the companies providing these services to comply with existing laws and regulations.&lt;/p&gt;
&lt;p&gt;While the transparency measures listed will ensure a better understanding of the processes of AI developers and deployers, there is also a need to bring in responsibility along with transparency. While the report mentions ‘peer review by third parties’, we would also suggest auditing as a mechanism for both transparency and responsibility. Our study on the &lt;a href="https://cis-india.org/internet-governance/blog/ai-for-healthcare-understanding-data-supply-chain-and-auditability-in-india-pdf"&gt;AI data supply chain &amp;amp; auditability and healthcare in India&lt;/a&gt;, which surveyed 150 medical professionals, 175 respondents from healthcare institutions and 175 respondents from technology companies, found that 77 percent of the healthcare institutions and 64 percent of the technology companies surveyed conducted audits or evaluations of their privacy and security measures for data.&lt;/p&gt;
&lt;p&gt;&lt;img src="https://cis-india.org/home-images/AIGovernanceComments.png" alt="null" class="image-inline" title="AI Governance Comments" /&gt;&lt;/p&gt;
&lt;div class="visualClear"&gt;Source: CIS survey of professionals in AI and healthcare, January- April 2024. Medical professionals (n = 133); healthcare institutions (n = 162); technology companies (n = 171)&lt;/div&gt;
&lt;div class="visualClear"&gt;&amp;nbsp;&lt;/div&gt;
&lt;h3&gt;5. Form a sub-group to work with MEITY to suggest specific measures that may be considered under the proposed legislation like Digital India Act (DIA) to strengthen and harmonise the legal framework, regulatory and technical capacity and the adjudicatory set-up for the digital industries to ensure effective grievance redressal and ease of doing business.&lt;/h3&gt;
&lt;p&gt;It would be helpful to provide some clarity on where the Digital India Act process currently stands. While there were public consultations in 2023, there has been little news about the progress of the Act. The most recent discussion was in January 2025, when S Krishnan, Secretary, Ministry of Electronics and IT (MeitY), &lt;a href="https://www.financialexpress.com/life/technology-will-not-rush-in-bringing-digital-india-act-meity-secretary-3708673/"&gt;stated&lt;/a&gt; that the ministry was in no hurry to carry forward the draft Digital India Act and a regulatory framework around AI, and that the existing legal frameworks were currently sufficient to handle AI intermediaries. &lt;br /&gt; &lt;br /&gt; We would also like to highlight that during the consultations on the DIA it was proposed to replace the &lt;a href="https://vidhilegalpolicy.in/blog/explained-the-digital-india-act-2023/"&gt;Information Technology Act 2000&lt;/a&gt;. The subcommittee should provide clarity on this, since if the DIA is enacted, this report’s Section III on gap analysis, especially around the IT Act and cyber security, will need to be revisited.&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/cis-comments-and-recommendations-to-report-on-ai-governance-guidelines-development'&gt;https://cis-india.org/internet-governance/blog/cis-comments-and-recommendations-to-report-on-ai-governance-guidelines-development&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Shweta Mohandas, Amrita Sengupta and Anubha Sinha</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Artificial Intelligence</dc:subject>
    

   <dc:date>2025-03-06T06:32:45Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/a2k/blogs/she-leads-bootcamp">
    <title>She Leads Bootcamp 2025</title>
    <link>https://cis-india.org/a2k/blogs/she-leads-bootcamp</link>
    <description>
        &lt;b&gt;CIS-A2K is committed to bridging the gender gap within Indian Wikimedia communities, and to further this goal, last year we launched the impactful She Leads program. This initiative is designed to empower female Wikimedians to take on leadership roles within their language communities, promoting diversity and inclusivity.&lt;/b&gt;
        &lt;p&gt;&lt;span style="text-align: start; "&gt;She Leads offers vital support and resources, helping women bring their ideas and initiatives to life, while fostering an inclusive, supportive environment that encourages growth and collaboration.&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The response and feedback from the first iteration of She Leads illustrated a larger need to nurture female leadership in the open knowledge movement. Subsequently, we had several conversations with remarkable women in the open knowledge space to explore avenues to foster female leadership. We are grateful to Rosie Stephenson Goodknight, Masana Mulaudzi, Netha Hussain, Sneha PP, Amrita Sengupta, Manavpreet Kaur, and Medhavi Gandhi for their support and guidance in conceptualizing She Leads Bootcamp 2025. A special shout out to Satdeep Gill for being an ally and contributing to the program design. &lt;br /&gt;&lt;br /&gt;The She Leads Bootcamp 2025, which was held in Bangalore from January 31st to February 3rd, 2025, aimed to further these efforts by gathering budding women leaders from Indic Wikimedia communities. This immersive, in-person event provided participants with the tools, resources, and connections necessary to thrive as leaders. The She Leads Bootcamp 2025 helped create a robust network of women leaders who were able to collaborate and support each other’s initiatives. The training sessions focused on leadership skills, feminist methodologies, project management, and advocacy strategies. Organizers fostered a sense of belonging among participants, encouraging them to share experiences and learn from one another.&lt;/p&gt;
&lt;hr /&gt;
&lt;p&gt;&lt;a class="external-link" href="https://cis-india.org/a2k/files/she-bootcamp"&gt;Click to download&lt;/a&gt; the event report authored by Soni Wadhwa and edited by Chris and Nitesh Gill.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/a2k/blogs/she-leads-bootcamp'&gt;https://cis-india.org/a2k/blogs/she-leads-bootcamp&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Soni Wadhwa</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Wikimedia</dc:subject>
    
    
        <dc:subject>CIS-A2K</dc:subject>
    
    
        <dc:subject>Wikipedia</dc:subject>
    
    
        <dc:subject>Access to Knowledge</dc:subject>
    

   <dc:date>2025-02-19T14:30:11Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/submission-to-igf-2025-call-for-thematic-inputs">
    <title>Submission to IGF 2025 Call for Thematic Inputs</title>
    <link>https://cis-india.org/internet-governance/submission-to-igf-2025-call-for-thematic-inputs</link>
    <description>
        &lt;b&gt;Below are CIS's inputs submitted in response to the IGF 2025 Call for Thematic Inputs. They will inform the MAG’s discussions and assist them in determining the thematic priorities of the IGF 2025 programme.&lt;/b&gt;
        
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;div class="views-field views-field-webform-submission-value-21"&gt;&lt;span class="field-content"&gt;On AI governance, AI risks and AI and data: &lt;br /&gt;
Over the past several years there have been rapid advances in the use of AI, most recently with the use of generative AI by end users and citizens. While questions of the ethical use of AI and the need for fairness, accountability and transparency are not new, the very rapid, large-scale deployment of AI across different fields, and its easy use by many different users, have raised questions of exacerbated harms and infringement of copyright, among others, and much of the current focus is on developing governance for AI. There has been a broad acceptance of the inevitability and near-omnipresence of AI across different contexts, which has furthered deliberations around harnessing AI for good. We ask that, while “AI for good” is being mainstreamed, avenues remain to discuss and understand areas where AI should not be used (because its harms outweigh its benefits) or can be used only with limited resources (given the wide-ranging environmental impacts associated with AI and the resource-intensive areas of computational power and data centres), and mechanisms to actualise that. This means discussing AI governance not only in the contexts where AI is already deployed, but also the conditions in which it should not be deployed. &lt;br /&gt;
There also need to be greater and more specific regional conversations around data use for AI, especially for developing predictive AI systems, in sensitive settings such as healthcare and financial services. The challenges of using data from one geographical setting in another have been well documented (consider, for example, training data from the global north being used to develop and deploy AI diagnostics for a country in the global south). There need to be more specific conversations and transparency around the data sources being used and how they can be both ethically sourced and made contextually relevant. IGF can support these conversations by inviting specific inputs from the multi-stakeholder community on these issues. &lt;br /&gt;
&lt;br /&gt;
On digital identity: &lt;br /&gt;
There is growing interest in digital public infrastructure and its use 
for public service delivery and has potential to offer benefits and 
meaningful governance, if done well, as certain examples may suggest. 
However,  the implementation of digital ID systems for example, 
particularly when they are the sole means of identification, raises 
critical questions. Such systems must have robust legislative backing, 
including privacy and data protection frameworks, if not regulations, 
along with sufficient legislative and judicial oversight to ensure 
accountability. Concerns about mission creep—where systems initially 
introduced for specific purposes gradually expand to other uses without 
adequate scrutiny—highlight the need for clearly defined objectives and 
legal safeguards. These systems should proactively assess and mitigate 
risks and harms before implementation. Furthermore, given that many of 
these systems rely heavily on private companies with limited oversight, 
it is crucial to ensure meaningful community participation and 
accountability throughout the entire process to prioritize public 
interest over private gains. As we think about DPIs, we urge that their 
applicability, the necessary infrastructural availability, and the 
assessment of risks are adequately considered and detailed through the 
themes and sessions at the IGF. &lt;br /&gt;
&lt;br /&gt;
On data governance and youth engagement: &lt;br /&gt;
Personal data is being captured by different actors in an unprecedented 
manner, and at times without any legislative backing or grievance 
redressal mechanism. With the advent of generative AI, there are also 
concerns regarding the extent to which publicly available data is being 
used, and for what purposes. These concerns are exacerbated when 
children’s data is used for generative AI, in most cases without the 
children’s knowledge or consent. In an increasingly digitised world, how
 should children navigate the internet? What is the appropriate age for 
children to access it, should there be age-gating, and if so, how should
 that be implemented? What are the mechanisms for parental verification?
 As more and more young people come online, it will be essential to 
define and develop frameworks for children’s use and experience of the 
internet, including having young people participate in these 
discussions. &lt;/span&gt;&lt;/div&gt;
&lt;div class="views-field views-field-nothing-2"&gt;&lt;span class="field-content"&gt;&lt;br /&gt;&lt;/span&gt;&lt;/div&gt;
&lt;div class="views-field views-field-webform-submission-value-20"&gt;&lt;span class="field-content"&gt;We request some steps to be taken at the IGF annual meeting and during its intersessional work: &lt;br /&gt;
- Reduce duplication of processes and efforts in the implementation of 
the GDC, and continue to build on existing arenas like WSIS+20 and the 
IGF, with greater coordination and collaboration among the various UN 
bodies. &lt;br /&gt;
- Robust support for civil society participation at the IGF and other 
internet governance processes, especially from the Global South. &lt;br /&gt;
- Creation of well-resourced working groups to oversee GDC implementation work where relevant. &lt;/span&gt;&lt;/div&gt;
&lt;div class="views-field views-field-webform-submission-value-27"&gt;&lt;span class="field-content"&gt;Given
 the diverse set of stakeholders and the wide ranging nature topics 
discussed, it is understandable that the IGF covers a lot of ground. It 
would be beneficial if there might be deeper reflections on fewer issues
 if possible, so that there is greater depth in conversations as opposed
 to a much wider coverage. We understand that this might be difficult 
given what IGF sets out to do, but a more focused approach might help 
stakeholders have a better understanding of priorities and areas of 
focus. It will also be very helpful if all sessions have space for 
Q&amp;amp;A, even if it is for 10 minutes. It allows for listeners to 
reflect and also ask questions, where possible. &lt;/span&gt;&lt;/div&gt;
&lt;div class="views-field views-field-nothing-3"&gt;&lt;span class="field-content"&gt;&lt;br /&gt;For all inputs, please visit: https://intgovforum.org/en/igf-2025-proposed-issues (CIS's inputs are under ID322)&lt;/span&gt;&lt;/div&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/submission-to-igf-2025-call-for-thematic-inputs'&gt;https://cis-india.org/internet-governance/submission-to-igf-2025-call-for-thematic-inputs&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Amrita Sengupta, Yesha Tshering Paul, and Pallavi Bedi</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance Forum</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    

   <dc:date>2025-03-06T06:36:10Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/do-we-need-separate-health-data-law-in-india">
    <title>Do We Need a Separate Health Data Law in India?</title>
    <link>https://cis-india.org/internet-governance/blog/do-we-need-separate-health-data-law-in-india</link>
    <description>
        &lt;b&gt;This report discusses current definitions of health data, including international efforts, and then shares some key themes that emerged at three roundtables we conducted in May, August, and October 2024. Participants included experts from diverse stakeholder groups, including civil society organisations, lawyers, medical professionals, and academicians. In this report, we collate the various responses to the two main aspects that were the focus of the roundtables.&lt;/b&gt;
        &lt;h2&gt;Chapter 1. Background&lt;/h2&gt;
&lt;p&gt;Digitisation has become a cornerstone of India’s governance ecosystem since the &lt;a class="external-link" href="https://www.meity.gov.in/divisions/national-e-governance-plan"&gt;National e-Governance Plan&lt;/a&gt; (NeGP) of 2006. This trend can also be seen in healthcare, especially during the COVID-19 pandemic, with initiatives like the &lt;a class="external-link" href="https://abdm.gov.in/"&gt;Ayushman Bharat Digital Mission&lt;/a&gt; (ABDM). However, the digitisation of healthcare has been largely conducted without legislative backing or judicial oversight. This has resulted in inadequate grievance redressal mechanisms, potential data breaches, and threats to patient privacy.&lt;/p&gt;
&lt;p&gt;Unauthorised access to or disclosure of health data can result in stigmatisation, mental and physical harassment, and discrimination against patients. Moreover, because of the digital divide, overdependence on digital health tools to deliver health services can lead to the exclusion of the most marginalised and vulnerable sections of society, thereby undermining the equitable availability and accessibility of health services. Health data in the digitised form is also vulnerable to cyberattacks and breaches. This was evidenced in the recent ransomware attack on All India Institute of Medical Science, which, apart from violating the right to privacy of patients, also brought patient care to a &lt;a class="external-link" href="https://thewire.in/government/aiims-servers-cyberattack-ransomware-rajya-sabh"&gt;grinding halt&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;In this context, and with the rise in health data collection and uptick in the use of AI in healthcare, there is a need to look at whether India needs a standalone legislation to regulate the digital health sphere. It is also necessary to evaluate whether the existing policies and regulations are sufficient, and if amendments to these regulations would suffice.&lt;/p&gt;
&lt;p&gt;This report discusses current definitions of health data, including international efforts, and then shares some key themes that emerged at the three roundtables we conducted in May, August, and October 2024. Participants included experts from diverse stakeholder groups, including civil society organisations, lawyers, medical professionals, and academicians. In this report, we collate the various responses to two main aspects, which were the focus of the roundtables:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;In which areas are the current health data policies and laws lacking in India?&lt;/li&gt;
&lt;li&gt;Do we need a separate health data law for India? What are the challenges associated with this? What are other ways in which health data can be regulated? &lt;/li&gt;
&lt;/ol&gt;
&lt;h2&gt;Chapter 2. How is health data defined?&lt;/h2&gt;
&lt;p&gt;There are multiple definitions of health data globally. These include those incorporated into the text of data protection legislations or under separate health data laws. In the European Union (EU), the General Data Protection Regulation defines “data concerning health” as &lt;a class="external-link" href="https://uk.practicallaw.thomsonreuters.com/8-200-3413?originationContext=document&amp;amp;transitionType=DocumentItem&amp;amp;contextData=(sc.Default)&amp;amp;ppcid=754607539a464afcb39865bf752577b7&amp;amp;comp=pluk"&gt;personal data&lt;/a&gt; that falls under &lt;a class="external-link" href="https://uk.practicallaw.thomsonreuters.com/8-200-3413?originationContext=document&amp;amp;transitionType=DocumentItem&amp;amp;contextData=(sc.Default)&amp;amp;ppcid=754607539a464afcb39865bf752577b7&amp;amp;comp=pluk"&gt;special category data&lt;/a&gt;. This includes data that requires stringent and special protection due to its sensitive nature. Data concerning health is defined under Article 4(15) as “personal data related to the physical or mental health of a natural person, including the provision of healthcare services, which reveal information about his or her health status”. The United States has the Health Insurance Portability and Accountability Act (HIPAA), which was created to make sure that the personally identifiable information (PII) gathered by healthcare and insurance companies is protected against fraud and theft and cannot be disclosed without consent. As per the World Health Organisation (WHO), ‘digital health’ refers to “a broad umbrella term encompassing eHealth, as well as emerging areas, such as the use of advanced computing sciences in &lt;a class="external-link" href="https://apps.who.int/iris/bitstream/handle/10665/311941/9789241550505-eng.pdf?ua=1."&gt;‘big data’, genomics and artificial intelligence&lt;/a&gt;”.&lt;/p&gt;
&lt;h3&gt;2.1. Current legal framework for regulating the digital healthcare ecosystem in India&lt;/h3&gt;
&lt;p&gt;In India, digital health data was defined under the draft Digital Information Security in Healthcare Act (&lt;a class="external-link" href="https://archive.org/details/draftdishaact"&gt;DISHA&lt;/a&gt;), 2017, as an electronic record of health-related information about an individual, which includes the following: (i) information concerning the physical or mental health of the individual; (ii) information concerning any health service provided to the individual; (iii) information concerning the donation by the individual of any body part or any bodily substance; (iv) information derived from the testing or examination of a body part or bodily substance of the individual; (v) information that is collected in the course of providing health services to the individual; or (vi) information relating to the details of the clinical establishment accessed by the individual.&lt;/p&gt;
&lt;p&gt;However, DISHA was subsumed into the &lt;a class="external-link" href="https://sansad.in/getFile/BillsTexts/LSBillTexts/Asintroduced/341%20of%202019As%20Int....pdf?source=legislation"&gt;2019 version&lt;/a&gt; of the data protection legislation, the Personal Data Protection Bill, 2019, which had a definition of health data and a demarcation between sensitive personal data and personal data. Both these definitions are absent from the &lt;a class="external-link" href="https://www.meity.gov.in/writereaddata/files/Digital Personal Data Protection Act 2023.pdf"&gt;Digital Personal Data Protection Act&lt;/a&gt; (DPDPA), 2023. This leaves it uncertain what is defined as health data in India. It is also important to note that the health data management policies released during the pandemic relied on the definition of health data under the then draft of the data protection legislation.&lt;/p&gt;
&lt;p&gt;&lt;b&gt;(i) Drugs and Cosmetics Act, and Rules &lt;/b&gt;&lt;/p&gt;
&lt;p&gt;At present, there is no specific law that regulates the digital health ecosystem in India. The ecosystem is currently regulated by a mix of laws regulating the offline/legacy healthcare system and policies notified by the government from time to time. The primary law governing the healthcare system in India is the &lt;a class="external-link" href="https://cdsco.gov.in/opencms/export/sites/CDSCO_WEB/Pdf-documents/acts_rules/2016DrugsandCosmeticsAct1940Rules1945.pdf"&gt;Drugs and Cosmetics Act&lt;/a&gt; (DCA), 1940, read with the &lt;a class="external-link" href="https://cdsco.gov.in/opencms/export/sites/CDSCO_WEB/Pdf-documents/acts_rules/2016DrugsandCosmeticsAct1940Rules1945.pdf"&gt;Drugs and Cosmetics Rules, 1945&lt;/a&gt;. These regulations govern the manufacture, sale, import, and distribution of drugs in India. The central and state governments are responsible for enforcing the DCA. In 2018, the central government published the &lt;a class="external-link" href="https://cdsco.gov.in/opencms/resources/UploadCDSCOWeb/2018/UploadPublic_NoticesFiles/omimport17dec18.pdf"&gt;Draft Rules&lt;/a&gt; to amend the Drugs and Cosmetics Rules in order to incorporate provisions relating to the sale of drugs by online pharmacies (Draft Rules). However, the final rules are yet to be notified. The Draft Rules prohibit online pharmacies from disclosing the prescriptions of patients to any third person. However, they also mandate the disclosure of such information to the central and state governments, as and when required for public health purposes.&lt;/p&gt;
&lt;p&gt;&lt;b&gt;(ii) Clinical Establishments (Registration and Regulation) Act, and Rules &lt;/b&gt;&lt;/p&gt;
&lt;p&gt;The &lt;a class="external-link" href="http://clinicalestablishments.gov.in/WriteReadData/386.pdf"&gt;Clinical Establishments Rules, 2012&lt;/a&gt;, which are issued under the Clinical Establishments (Registration and Regulation) Act, 2010, require clinical establishments to maintain electronic health records (EHRs) in accordance with the standards determined by the central government. The &lt;a class="external-link" href="https://esanjeevani.mohfw.gov.in/assets/guidelines/ehr_guidlines.pdf"&gt;Electronic Health Record (EHR) Standards, 2016&lt;/a&gt;,  were formulated to create a uniform standards-based system for EHRs in India. They provide guidelines for clinical establishments to maintain health data records as well as data and security measures. Additionally, they also lay down that ownership of the data is vested with the individual, and the healthcare provider holds such medical data in trust for the individual.&lt;/p&gt;
&lt;p&gt;&lt;b&gt;(iii) Health digitisation policies under the National Health Authority &lt;/b&gt;&lt;/p&gt;
&lt;p&gt;In 2017, the central government formulated the &lt;a class="external-link" href="https://mohfw.gov.in/sites/default/files/9147562941489753121.pdf"&gt;National Health Policy&lt;/a&gt; (NHP). A core component of the NHP is deploying technology to deliver healthcare services. The NHP recommends creating a National Digital Health Authority (NDHA) to regulate, develop, and deploy digital health across the continuum of care. In 2019, the Niti Aayog proposed the &lt;a class="external-link" href="https://abdm.gov.in:8081/uploads/ndhb_1_56ec695bc8.pdf"&gt;National Digital Health Blueprint&lt;/a&gt; (Blueprint). The Blueprint recommended the creation of the National Digital Health Mission, stating that “the Ministry of Health and Family Welfare has prioritised the utilisation of digital health to ensure effective service delivery and citizen empowerment so as to bring significant improvements in public health delivery”. It also stated that an institution such as the National Digital Health Mission (NDHM), which is undertaking significant reforms in health, should have legal backing.&lt;/p&gt;
&lt;p&gt;&lt;b&gt;(iv) Telemedicine Practice Guidelines &lt;/b&gt;&lt;/p&gt;
&lt;p&gt;On 25 March 2020, the &lt;a class="external-link" href="https://esanjeevani.mohfw.gov.in/assets/guidelines/Telemedicine_Practice_Guidelines.pdf"&gt;Telemedicine Practice Guidelines &lt;/a&gt;under the Indian Medical Council Act were notified. The Guidelines provide a framework for registered medical practitioners to follow for teleconsultations.&lt;/p&gt;
&lt;h3&gt;2.2. Digital Personal Data Protection Act, 2023&lt;/h3&gt;
&lt;p&gt;There has been much hope for India’s data protection legislation to cover definitions of health data, keeping in mind the removal of DISHA and the uptick in health digitisation in both the public and private health sectors. The privacy/data protection law, the DPDPA, was notified on 12 August 2023. However, its provisions have still not come into force. So, currently, health data and patient medical history are regulated by the Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules (&lt;a class="external-link" href="https://www.meity.gov.in/sites/upload_files/dit/files/GSR313E_10511(1).pdf"&gt;SPDI Rules&lt;/a&gt;), 2011. The SPDI Rules will be replaced by the DPDPA as and when its different provisions are enforced. On 3 January 2025, the Ministry of Electronics and Information Technology released the Draft Digital Personal Data Protection Rules, 2025, for public consultation. The last date for submitting comments is 18 February 2025.&lt;/p&gt;
&lt;p&gt;Health data is regarded as sensitive personal data under the SPDI Rules. Earlier drafts of the data protection legislation had demarcated data as personal data and sensitive personal data, and health data was regarded as sensitive personal data. However, the DPDPA has removed the distinction between personal data and sensitive personal data. Instead, all data is regarded as personal data. Therefore, the extra protection that was previously afforded to health data has been removed. The &lt;a class="external-link" href="https://innovateindia.mygov.in/dpdp-rules-2025/"&gt;Draft Rules&lt;/a&gt; also do not mention health data or provide any additional safeguards for protecting it. However, they exempt healthcare professionals from the obligations placed on data fiduciaries when processing children’s data; such processing has to be restricted to the extent necessary to protect the health of the child.&lt;/p&gt;
&lt;p&gt;As seen so far, while there are multiple healthcare-related regulations that govern stakeholders – from medical device manufacturers to medical professionals – there is still a vacuum in terms of the definition of health data. The DPDPA does not clarify this definition. Further, there are no clear guidelines for how these regulations work with one another, especially in the case of newer technologies like AI, which have already started disrupting the Indian health ecosystem.&lt;/p&gt;
&lt;h2&gt;Chapter 3. Key takeaways from the health data roundtables&lt;/h2&gt;
&lt;p&gt;The three health data roundtables covered various important topics related to health data governance in India. The first roundtable highlighted the major concerns and examined the granular details of considering a separate law for digital healthcare. The second roundtable featured a detailed discussion on whether a separate law is needed, or whether the existing laws can be modified to address extant concerns. There was also a conversation on whether the absence of a classification absolves organisations of the responsibility to protect or secure health data. Participants stated that, due to the sensitivity of health data, data fiduciaries processing health data could qualify as significant data fiduciaries under the proposed DPDPA Rules (which, at the time the roundtables were held, were yet to be published). The final roundtable concluded with an in-depth discussion on the need for a health data law. However, no consensus emerged among the different stakeholders.&lt;/p&gt;
&lt;p&gt;The roundtables highlighted that the different stakeholders – medical professionals, civil society workers, academics, lawyers, and people working in startups – were indeed thinking about how to regulate health data. But there was no single approach that all agreed on.&lt;/p&gt;
&lt;h3&gt;3.1. Health data concerns&lt;/h3&gt;
&lt;p&gt;Here, we summarise the key points that emerged during the three roundtables. These findings shed light on concerns regarding the collection, sharing, and regulation of health data.&lt;/p&gt;
&lt;p&gt;&lt;b&gt;(i) Removal of sensitive personal data classification&lt;/b&gt;&lt;/p&gt;
&lt;p&gt;In the second roundtable, there was a discussion on the removal of the definition of health data from the final version of the DPDPA, which also removed the provision for sensitive personal data; health data previously came under this category. One participant stated that differentiating between sensitive personal data and personal data was important, as sensitive personal data such as health data warrants more security. They further stated that without such a clear distinction, data such as health status and sexual history could be easily accessed. Participants also pointed out that, given the current infrastructure of digital data, the security of personal data is not up to the mark. Hence, a clear classification of sensitive and personal data would ensure that data fiduciaries collecting and processing sensitive personal data have greater responsibility and accountability.&lt;/p&gt;
&lt;p&gt;&lt;b&gt;(ii) Definition of informed consent &lt;/b&gt;&lt;/p&gt;
&lt;p&gt;The term ‘informed consent’ came up several times during the roundtable discussions. But there was no clarity on what it means. A medical professional stated that in their practice, informed consent applies only to treatment. However, if a patient’s data is being used for research, it goes through the necessary internal review board and ethics board for clearance. One participant mentioned that Section 2(i) of the &lt;a class="external-link" href="https://www.indiacode.nic.in/bitstream/123456789/2249/1/A2017-10.pdf"&gt;Mental Healthcare Act (MHA), 2017 &lt;/a&gt;defines informed consent as&lt;/p&gt;
&lt;p class="callout"&gt;consent given for a specific intervention, without any force, undue influence, fraud, threat, mistake or misrepresentation, and obtained after disclosing to a person adequate information including risks and benefits of, and alternatives to, the specific intervention in a language and manner understood by the person; a nominee to make a decision and consent on behalf of another person.&lt;/p&gt;
&lt;p&gt;Neither the DPDPA nor the Draft DPDPA Rules define informed consent. However, the Draft DPDPA Rules state that the notice given by the data fiduciary to the data principal must use simple, plain language to provide the data principal with a full and transparent account of the information necessary for them to provide informed consent to the processing of their personal data.&lt;/p&gt;
&lt;p&gt;A stakeholder pointed out that consent is taken without much nuance or any real option for choice. Indeed, consent is often presented in non-negotiable terms, creating power imbalances and undermining patient autonomy. Suggested solutions include instituting granular and revocable consent mechanisms. This point also emerged during the third roundtable, where it was highlighted that consenting to a medical procedure is different from consenting to data being used to train AI. When a consent form that a patient or caregiver is asked to sign gives the relevant information but no choice except to sign, it creates a severe power imbalance. Participants also emphasised the need to assess whether consent is being used as a tool to enable more data-sharing, or as a mechanism to give citizens other rights, such as the reasonable expectation that their medical information will not be used for commercial interests, especially to their own detriment, just because they signed a form. One suggested way to tackle this is greater demarcation of the aspects a person can consent to. This would give people more control over the various ways in which their data is used.&lt;/p&gt;
&lt;p&gt;&lt;b&gt;(iii) Data sharing with third parties &lt;/b&gt;&lt;/p&gt;
&lt;p&gt;Discussions also focused on concerns about sharing health data with third parties, especially when the data is transferred outside India. Data is, or can be, shared with tech companies and research organisations. The discussions therefore highlighted the regulations and norms that govern how such data sharing occurs despite the fragmented regulatory landscape. For instance:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;The Indian Council of Medical Research (ICMR) &lt;a class="external-link" href="https://www.icmr.gov.in/ethical-guidelines-for-application-of-artificial-intelligence-in-biomedical-research-and-healthcare"&gt;Ethical guidelines for application of Artificial Intelligence in Biomedical Research&lt;/a&gt; and Healthcare mandate strict protocols for sharing health data, but these are not binding. They state that the sharing of health data by medical institutions with tech companies and collaborators must go through the ICMR and Health Ministry’s Screening Committee. This committee has strict guidelines on how much data can be shared and how it must be shared. The process also requires that all PII is removed and that only 10 percent of the total data is shared with any collaborator outside Indian jurisdiction.&lt;/li&gt;
&lt;li&gt;Companies working internationally have to comply with global standards like the GDPR and HIPAA, highlighting the gaps in India’s domestic framework, which leave companies uncertain about which regulations to comply with. There is a need to balance the interests of startups, which require more data and better longitudinal health records, against the need for strong data protection, data minimisation, and storage limitation.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;b&gt;(iv) Inadequate healthcare infrastructure&lt;/b&gt;&lt;/p&gt;
&lt;p&gt;With respect to the implementation challenges associated with health data laws, participants noted that, currently, the Indian healthcare infrastructure is not up to the mark. Moreover, smaller and rural hospitals are not yet on board with health digitisation and may not be able to comply with additional rules and responsibilities. In terms of capacity as well, smaller healthcare facilities lack the resources to implement and comply with complex regulations.&lt;/p&gt;
&lt;h3&gt;3.2. Regulatory challenges&lt;/h3&gt;
&lt;p&gt;Significant time was spent on discussing the regulatory challenges and deficiencies in India’s healthcare infrastructure. The discussion primarily revolved around the following points:&lt;/p&gt;
&lt;p&gt;&lt;b&gt;(i) State vs. central jurisdiction &lt;/b&gt;&lt;/p&gt;
&lt;p&gt;Under the Constitutional Scheme, legislative responsibilities for various subjects are demarcated between the centre and the states, and are sometimes shared between them. The topics of public health and sanitation, hospitals, and dispensaries fall under the state list set out in the Seventh Schedule of the Constitution. This means that state governments have the primary responsibility of framing and implementing laws on these subjects. Under this, local governance institutions, namely local bodies, also play an important role in discharging public health responsibilities.&lt;/p&gt;
&lt;p&gt;&lt;b&gt;(ii) Do we bring back DISHA? &lt;/b&gt;&lt;/p&gt;
&lt;p&gt;During the conversation about the need for health data regulation, participants brought up that there had been an earlier push for a health data law in the form of DISHA, 2017. But this was later abandoned. DISHA aimed to set up digital health authorities at the national and state levels to implement privacy and security measures for digital health data and create a mechanism for the exchange of electronic health data. Another concern that arose with respect to a central health data legislation was that, as health is a state subject, there could be confusion about having a separate, centralised regulatory body to oversee how the data is being handled. This might come with a lack of clarity on who would address what, or which ministry (in the state or central government) would handle the redressal mechanism.&lt;/p&gt;
&lt;h3&gt;3.3. Are the existing guidelines enough?&lt;/h3&gt;
&lt;p&gt;Participants highlighted that enacting a separate law to regulate digital health would be challenging, considering that the DPDPA took seven years to be enacted, the rules are yet to be drafted, and the Data Protection Board has not been established. Hence, any new legislation would take significant resources, including manpower and time.&lt;/p&gt;
&lt;p&gt;In this context, there were discussions acknowledging that although the DPDPA does not currently regulate health data, there are other forms of regulation and policies prescribed for specific types of interventions involving health data; for example, the Telemedicine Practice Guidelines, 2020, and the Medical Council of India Rules. These are binding on medical practitioners, with penalties for non-compliance, such as the revoking of medical licenses. Similarly, the ICMR guidelines on the use of data in biomedical research include specific transparency measures and obligations on health data collectors that would apply irrespective of the lack of distinction between sensitive personal data and personal data under the DPDPA.&lt;/p&gt;
&lt;p&gt;However, another participant rightly pointed out that the ICMR guidelines and the policies from the Ministry of Health and Family Welfare are not binding. Similarly, regulations like the Telemedicine Practice Guidelines and Indian Medical Council Act are only applicable to medical practitioners. There are now a number of companies that collect and process a lot of health data; they are not covered by these regulations. Although there are multiple regulations on healthcare and pharma, none of them cover or govern technology. The only relevant one is the Telemedicine Practice Guidelines, which say that AI cannot advise any patient; it can only provide support.&lt;/p&gt;
&lt;h2&gt;Chapter 4. Recommendations&lt;/h2&gt;
&lt;p&gt;Several key points were raised and highlighted during the three roundtables. There were also a few suggestions for how to regulate the digital health sphere. These recommendations and points can be classified into short-term measures and long-term measures.&lt;/p&gt;
&lt;p&gt;&lt;b&gt;4.1. Short-term measures &lt;/b&gt;&lt;/p&gt;
&lt;p&gt;We propose two short-term measures, as follows:&lt;/p&gt;
&lt;p&gt;(i) Make amendments to the DPDPA: Introduce sector-specific provisions for health data within the existing framework. The provisions should include guidelines for informed consent, data security, and grievance redressal.&lt;/p&gt;
&lt;p&gt;(ii) Capacity-building: Provide training for healthcare providers and data fiduciaries on data security and compliance.&lt;/p&gt;
&lt;p&gt;&lt;b&gt;4.2. Long-term measures &lt;/b&gt;&lt;/p&gt;
&lt;p&gt;We offer five long-term measures, as follows:&lt;/p&gt;
&lt;p&gt;(i) Standalone legislation: Enact a dedicated health data law that&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Defines health data and its scope;&lt;/li&gt;
&lt;li&gt;Establishes a regulatory authority for oversight; and&lt;/li&gt;
&lt;li&gt;Includes provisions for data sharing, security, and patient rights.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;b&gt;(ii) National Digital Health Authority &lt;/b&gt;&lt;/p&gt;
&lt;p&gt;Establish a central authority, similar to the EU’s Health Data Space, to regulate and monitor digital health initiatives.&lt;/p&gt;
&lt;p&gt;&lt;b&gt;(iii) Cross-sectoral coordination &lt;/b&gt;&lt;/p&gt;
&lt;p&gt;Develop mechanisms to align central and state policies and ensure seamless implementation.&lt;/p&gt;
&lt;p&gt;&lt;b&gt;(iv) Technological safeguards &lt;/b&gt;&lt;/p&gt;
&lt;p&gt;Encourage the development of AI-specific policies and guidelines to address the ethics of using health data.&lt;/p&gt;
&lt;p&gt;&lt;b&gt;(v) Stringent measures to address data breaches&lt;/b&gt;&lt;/p&gt;
&lt;p&gt;Build public trust by addressing data breaches and by fostering proactive dialogue between patients, the medical community, the government, and civil society. Narrow the exemptions for data processing, such as those granted to the state for healthcare.&lt;/p&gt;
&lt;h2&gt;Conclusion&lt;/h2&gt;
&lt;p&gt;The roundtable discussions highlighted the fragmented nature of the digital health sphere and the issues that emanate from such fragmentation. Considering the variations in healthcare infrastructure and budget allocation across states, the feasibility of enacting a central digital health law requires more in-depth research. The existing laws governing the offline/legacy health space also need careful examination to determine whether amendments to them are sufficient to regulate the digital health space.&lt;/p&gt;
&lt;hr /&gt;
&lt;p&gt;&lt;a class="external-link" href="http://cis-india.org/internet-governance/do-we-need-separate-health-data-india.pdf"&gt;Click to download the file&lt;/a&gt;&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/do-we-need-separate-health-data-law-in-india'&gt;https://cis-india.org/internet-governance/blog/do-we-need-separate-health-data-law-in-india&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Pallavi Bedi and Shweta Mohandas</dc:creator>
    <dc:rights></dc:rights>


   <dc:date>2025-02-07T14:13:02Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/raw/bahujan-digital-publishing-infrastructures">
    <title>Bahujan Digital Publishing Infrastructures</title>
    <link>https://cis-india.org/raw/bahujan-digital-publishing-infrastructures</link>
    <description>
        &lt;b&gt;In this study, we look at alternative Bahujan digital publishing as sites where Bahujans can claim media representation and how a vision of an anti-caste internet is emerging through these publishing practices.&lt;/b&gt;
        
&lt;p&gt;Formal knowledge production, media, and technology in India are dominated and hegemonised by elite oppressor castes (the Savarnas). The exclusion of the caste-oppressed majority (the Bahujans) from mass media systematically erases their narratives, their histories, and the opportunities available to them.&lt;/p&gt;
&lt;p&gt;We study how, despite systemic challenges, Bahujan publication spaces have emerged across digital media as sites of intersectional discourse on caste, using new media such as blogs, visual art, memes, YouTube channels, infographics, podcasts, etc. Further, we look at how this has exposed casteism buried under the ‘casteless’ facade of digital technologies, which are rife with issues of caste-based hate speech, poor moderation, algorithmic bias, and inadequate platform governance. For this, we draw on qualitative interviews with ten Bahujan publishing projects across social media.&lt;/p&gt;
&lt;p&gt;Through a caste-critical lens, we look at motivations, infrastructural needs, editorial processes, audience engagement, other challenges, and the future vision for these publishing projects. We discuss questions of identity, community, hate speech, platform censorship, mental health, and self-care that emerge in online anti-caste publishing. Finally, we try to articulate an emerging vision of an anti-caste internet.&lt;/p&gt;
&lt;p&gt;We explore the following questions through our research:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Why do Bahujans start publishing?&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;What are the infrastructures of Bahujan publishing? &lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Who engages with anti-caste content?&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;What resistance do Bahujan publishers face?&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;How do Bahujan publishers view mainstream progressive movements?&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;How do Bahujan publishers think about the future of the internet?&lt;/strong&gt;&lt;/li&gt;&lt;/ul&gt;
&lt;p&gt;The key takeaways from our research are:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Publishing is a socio-technical response:&lt;/strong&gt; Digital Bahujan publishers have largely started in response to shifting political landscapes within India, where caste oppression, while increasingly invisibilised, has only strengthened. Bahujan publishing uses digital tools to challenge caste oppression, fostering anti-caste discourse and community building despite limited resources and systemic barriers.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Publishing is a community effort:&lt;/strong&gt; Bahujan publishing exists primarily within online anti-caste communities. These communities help each other navigate resource constraints to raise funds, build safe spaces that provide critical mental health support, offer safety from hate speech, and build resistance and resilience together.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Caste mediates publishing infrastructure:&lt;/strong&gt; Caste hierarchies restrict resources and opportunities for Bahujan publishers, who often precariously self-fund their work. Meanwhile, media circles and the funding ecosystem are dominated by Savarnas, who gatekeep their resources and knowledge from Bahujan publishers.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Online casteism is enabled by platforms:&lt;/strong&gt; Social media platforms have failed to address rampant caste-based hate speech effectively, leaving Bahujan publishers to manage it on their own. This takes a severe toll on publishers’ mental and emotional health, especially hurting Bahujan publishers from smaller towns, women, and queer folks.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;The future of anti-caste publishing is uncertain:&lt;/strong&gt; The reach of Bahujan publishers varies wildly and unpredictably, which makes it difficult for them to rely on social media for audiences and monetisation. Bahujan publishers face a triple whammy: algorithms that suppress anti-caste content, social media platforms moving away from political content, and contentious legislation that censors independent political content.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;The future of anti-caste publishing is on independent platforms:&lt;/strong&gt; Bahujan publishers desire platform sovereignty—to own and control their own platforms and to control what they put out and how it reaches their audiences—and a vision of the internet that works towards the annihilation of caste, both online and offline.&lt;/li&gt;
&lt;/ul&gt;
&lt;hr /&gt;
&lt;p&gt;Read the full report &lt;a href="https://cis-india.org/raw/files/dba-report.pdf" class="external-link"&gt;here&lt;/a&gt;&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/raw/bahujan-digital-publishing-infrastructures'&gt;https://cis-india.org/raw/bahujan-digital-publishing-infrastructures&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Yatharth</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Digital Cultures</dc:subject>
    
    
        <dc:subject>Researchers at Work</dc:subject>
    
    
        <dc:subject>Publications</dc:subject>
    
    
        <dc:subject>Caste</dc:subject>
    

   <dc:date>2025-01-20T10:48:39Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/blog/privacy-policy-framework-for-indian-metal-health-apps">
    <title>Privacy Policy Framework for Indian Mental Health Apps </title>
    <link>https://cis-india.org/internet-governance/blog/privacy-policy-framework-for-indian-metal-health-apps</link>
    <description>
        &lt;b&gt;This report analyses the privacy policies of mental health apps in India and provides recommendations for making the policies not only legally compliant but also user-centric&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The report’s findings indicate a significant gap in the structure and content of privacy policies in Indian mental health apps. This highlights the need to develop a framework that can guide organisations in developing their privacy policies. Therefore, this report proposes a holistic framework to guide the development of privacy policies for mental health apps in India. It focuses on three key segments that are an essential part of the privacy policy of any mental health app. First, it must include factors considered essential by the Digital Personal Data Protection Act 2023 (DPDPA) such as consent mechanisms, rights of the data principal, provision to withdraw consent etc. Second, the privacy policy must state how the data provided by them to these apps will be used. Finally, developers must include key elements, such as provisions for third-party integrations and data retention policies.”&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;Click to download the full research paper &lt;a class="external-link" href="https://cis-india.org/internet-governance/files/privacy-policy-framework.pdf"&gt;here&lt;/a&gt;&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/blog/privacy-policy-framework-for-indian-metal-health-apps'&gt;https://cis-india.org/internet-governance/blog/privacy-policy-framework-for-indian-metal-health-apps&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Chakshu Sang and Shweta Mohandas</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2025-01-10T00:11:24Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/internet-governance/cis-digest-2024">
    <title>CIS Research Digest - 2024</title>
    <link>https://cis-india.org/internet-governance/cis-digest-2024</link>
    <description>
        &lt;b&gt;Read ahead for a summary of the in-depth research and analysis published by CIS in 2024.&lt;/b&gt;
        
&lt;p id="docs-internal-guid-8dc2da47-7fff-2015-cb15-a18528388458" dir="ltr"&gt;&amp;nbsp;&lt;/p&gt;
&lt;p dir="ltr"&gt;2024&amp;nbsp;saw several incidents and developments that can be used to further global and domestic discourse focused on rights-based policy making. At the Centre for Internet &amp;amp; Society, we analysed some of these developments through our research on important contemporary issues, while also attempting to connect with relevant stakeholders and build communities that can utilise our research impactfully. From platform economy research examining women’s work on digital labour platforms, to research comparing data governance trends between India and the EU, and an in-depth analysis of AI audit practices and the data supply chain for AI in Indian healthcare, this is a synopsis of the in-depth research and analysis published by CIS in 2024.&amp;nbsp;&lt;/p&gt;
&lt;h4 dir="ltr"&gt;Research Reports, Papers and Case Studies:&lt;/h4&gt;
&lt;ol&gt;&lt;li style="list-style-type: decimal;" dir="ltr"&gt;
&lt;p dir="ltr"&gt;Subodh Kulkarni explores opportunities within the Wikimedia movement and projects to help revitalise small and underrepresented languages in India and provide recommendations in furthering this effort. [&lt;a href="https://cis-india.org/a2k/blogs/using-wikimedia-sphere-for-revitalization-of-small-and-underrepresented-languages-in-india"&gt;link&lt;/a&gt;]&lt;/p&gt;
&lt;/li&gt;&lt;li style="list-style-type: decimal;" dir="ltr"&gt;
&lt;p dir="ltr"&gt;Soni Wadhwa identifies broad patterns that have materialized in the Open Movement in the country in the last decade, and reflects on the nature of the Open and the need to envision it differently from what it currently is. [&lt;a href="https://cis-india.org/a2k/blogs/open-movement-in-india-idea-and-its-expressions"&gt;link&lt;/a&gt;]&lt;/p&gt;
&lt;/li&gt;&lt;li style="list-style-type: decimal;" dir="ltr"&gt;
&lt;p dir="ltr"&gt;In partnership with the Tilburg Institute of Law, Technology and Society, Netherlands, the Centre for Communication Governance at the National Law University Delhi, India, Arindrajit Basu and Isha Suri compare data governance trends between the EU and India. [&lt;a href="https://cis-india.org/internet-governance/blog/reconfiguring-data-governance-insights-from-india-and-eu"&gt;link&lt;/a&gt;]&lt;/p&gt;
&lt;/li&gt;&lt;li style="list-style-type: decimal;" dir="ltr"&gt;
&lt;p dir="ltr"&gt;Puthiya Purayil Sneha and Saumyaa Naidu explored the growth and interpretations of feminist infrastructures, through research on feminist publishing, content creation and curation spaces and how they have informed the contemporary discourse on feminism, gender, and sexuality in India. Some key learnings from this report were also the subject of a panel discussion in partnership with Khabar Lahariya. [&lt;a href="https://cis-india.org/raw/understanding-feminist-structures"&gt;link&lt;/a&gt;] [&lt;a href="https://www.youtube.com/watch?v=EGqLd0o7060"&gt;link&lt;/a&gt;]&lt;/p&gt;
&lt;/li&gt;&lt;li style="list-style-type: decimal;" dir="ltr"&gt;
&lt;p dir="ltr"&gt;Divyansha Sehgal and Lakshmi T. Nambiar looked at how short-form video platforms in India address online gender based violence by analysing their terms of service, community guidelines, and reporting workflows. [&lt;a href="https://cis-india.org/raw/online-gender-based-violence-on-short-form-video-platforms"&gt;link&lt;/a&gt;]&lt;/p&gt;
&lt;/li&gt;&lt;li style="list-style-type: decimal;" dir="ltr"&gt;
&lt;p dir="ltr"&gt;In collaboration with The Australian Strategic Policy Institute, Arindrajit Basu and Isha Suri conducted an in-depth study of technical standards that govern Artificial Intelligence (AI), technical standards and diplomacy. [&lt;a href="https://www.aspi.org.au/report/negotiating-technical-standards-artificial-intelligence-techdiplomacy-playbook-policymakers"&gt;link&lt;/a&gt;]&lt;/p&gt;
&lt;/li&gt;&lt;li style="list-style-type: decimal;" dir="ltr"&gt;
&lt;p dir="ltr"&gt;Soni Wadhwa, Puthiya Purayil Sneha, Garima Agrawal, and Nishant Shankar summarise discussions from the “Future of the Commons” Conference in Pune, which focused on AI, Indian Languages, and Archives. The three day conference focused on framing AI and Indian languages in the context of achievable goals, navigating digital inclusion roadblocks, and digitisation and archiving in India [&lt;a href="https://cis-india.org/raw/files/future-of-commons-report.pdf/at_download/file"&gt;link&lt;/a&gt;]&lt;/p&gt;
&lt;/li&gt;&lt;li style="list-style-type: decimal;" dir="ltr"&gt;
&lt;p dir="ltr"&gt;To gain insights on the effectiveness of union structures, priority of union demands, and workers’ interest in joining cooperative societies, the Telangana Gig and Platform Workers’ Union conducted a survey with app-based platform companies’ workers. Chetna V M, Nishkala Sekhar, Chiara Furtado, and Shaik Salauddin’s report highlights these insights and findings. [&lt;a href="https://cis-india.org/raw/gig-and-platform-workers-perspectives-on-worker-collectives"&gt;link&lt;/a&gt;]&lt;/p&gt;
&lt;/li&gt;&lt;li style="list-style-type: decimal;" dir="ltr"&gt;
&lt;p dir="ltr"&gt;Ashwini Lele delves into the role of 'Open Knowledge' players within the framework of the NEP 2020, and provides insights and recommendations for effective implementation with a focus on Wikimedia’s ‘Open Knowledge’ platform. [&lt;a href="https://cis-india.org/a2k/envisioning-role-of-open-knowledge-in-implementation-of-national-education-policy"&gt;link&lt;/a&gt;]&lt;/p&gt;
&lt;/li&gt;&lt;li style="list-style-type: decimal;" dir="ltr"&gt;
&lt;p dir="ltr"&gt;Aayush Rathi, Abhishek Sekharan, Ambika Tandon, Chetna VM, Chiara Furtado, Nishkala Sekhar, and Sriharsha Devulapalli conducted quantitative surveys with over 800 workers employed in the app-based taxi and delivery sectors across 4 Indian cities. The data briefs following this survey form a foundational evidence base for labour rights policy, social protection, and urban inclusion in platform work. [&lt;a href="https://cis-india.org/raw/platforming-precarity-data-narratives-workers"&gt;link&lt;/a&gt;]&lt;/p&gt;
&lt;/li&gt;&lt;li style="list-style-type: decimal;" dir="ltr"&gt;
&lt;p dir="ltr"&gt;Amrita Sengupta, Shweta Mohandas, Abhineet Nayyar, Chetna VM, Puthiya Purayil Sneha, and Yatharth study the prevalence and use of AI auditing practices in the healthcare sector, and unpack how AI systems are developed and deployed to achieve healthcare outcomes, and how AI audits are perceived and implemented by key stakeholders in the healthcare ecosystem. [&lt;a href="https://cis-india.org/internet-governance/blog/ai-for-healthcare-understanding-data-supply-chain-and-auditability-in-india"&gt;link&lt;/a&gt;]&lt;/p&gt;
&lt;/li&gt;&lt;li style="list-style-type: decimal;" dir="ltr"&gt;
&lt;p dir="ltr"&gt;Yesha Tshering Paul and Amrita Sengupta examine the emotional and technological underpinnings of gender-based violence faced by women in politics in India, and how gender-based violence is weaponised to diminish the political participation and influence of women in the public eye. [&lt;a href="https://cis-india.org/internet-governance/blog/technology-facilitated-gender-based-violence-and-women2019s-political-participation-in-india-a-position-paper"&gt;link&lt;/a&gt;]&lt;/p&gt;
&lt;/li&gt;&lt;/ol&gt;
&lt;h4 dir="ltr"&gt;Public Consultations:&lt;/h4&gt;
&lt;p dir="ltr"&gt;CIS has consistently participated in policy consultation processes, both in the form of public consultation events, and written submissions to draft policies that invite public comments.&lt;/p&gt;
&lt;ol&gt;&lt;li style="list-style-type: decimal;" dir="ltr"&gt;
&lt;p dir="ltr"&gt;Chiara Furtado and Nishkala Shankar contributed to the joint submission of comments to the draft Karnataka Platform based Gig Workers (Social Security and Welfare) Bill, 2024, in partnership with Vidhi Centre for Legal Policy, and the Indian Federation of App Based Transport Workers. [&lt;a href="https://vidhilegalpolicy.in/research/comments-on-the-draft-karnataka-platform-based-gig-workers-bill-2024/"&gt;link&lt;/a&gt;]&lt;/p&gt;
&lt;/li&gt;&lt;li style="list-style-type: decimal;" dir="ltr"&gt;
&lt;p dir="ltr"&gt;Garima Agrawal’s comments to the “Draft Circular on Digital Lending: Transparency in Aggregation of Loan Products from Multiple Lenders” to the Reserve Bank of India focus on reducing information asymmetry, enhancing market fairness, and issues of data privacy and security in the fintech ecosystem. [&lt;a href="https://cis-india.org/internet-governance/blog/draft-circular-on-digital-lending-2013-transparency-in-aggregation-of-loan-products-from-multiple-lenders"&gt;link&lt;/a&gt;]&lt;/p&gt;
&lt;/li&gt;&lt;li style="list-style-type: decimal;" dir="ltr"&gt;
&lt;p dir="ltr"&gt;Abhineet Nayyar, Isha Suri, and Pallavi Bedi submitted comments on the draft Digital Competition Bill, 2024, focusing on the transition from an ex-post to an ex-ante approach for digital competition regulation, alongside other issues and gaps in the new Bill. [&lt;a href="https://cis-india.org/internet-governance/blog/comments-to-the-draft-digital-competition-bill"&gt;link&lt;/a&gt;]&lt;/p&gt;
&lt;/li&gt;&lt;/ol&gt;
&lt;h4 dir="ltr"&gt;Explainer Series:&lt;/h4&gt;
&lt;p dir="ltr"&gt;In an attempt to make our research more accessible to diverse audiences, we have also begun the Explainer series, which summarizes key socio-legal concepts underlying commonly used technologies, using visual explainers.&lt;/p&gt;
&lt;ol&gt;&lt;li style="list-style-type: decimal;" dir="ltr"&gt;
&lt;p dir="ltr"&gt;The first issue of Explainer delves deeper into 'Tying and Bundling', one of the nine Anti-Competitive Practices in the draft Digital Competition Bill, 2024, which has seen little discussion, usually featuring heavy jargon and limited accessibility. [&lt;a href="https://cis-india.org/raw/explainer-tying-and-bundling"&gt;link&lt;/a&gt;]&lt;/p&gt;
&lt;/li&gt;&lt;li style="list-style-type: decimal;" dir="ltr"&gt;
&lt;p dir="ltr"&gt;The second issue of Explainer focuses on ‘Predatory Pricing’, another Anti-Competitive Practice that the draft bill relies on, which has seen very limited discussion. [&lt;a href="https://cis-india.org/raw/explainer-predatory-pricing"&gt;link&lt;/a&gt;]&lt;/p&gt;
&lt;/li&gt;&lt;/ol&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/cis-digest-2024'&gt;https://cis-india.org/internet-governance/cis-digest-2024&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>CIS</dc:creator>
    <dc:rights></dc:rights>


   <dc:date>2025-02-06T08:06:33Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/raw/explainer-predatory-pricing">
    <title>Explainer | Predatory Pricing </title>
    <link>https://cis-india.org/raw/explainer-predatory-pricing</link>
    <description>
        &lt;b&gt;Who doesn't love discounts? After all, that is what got so many of us on the internet for the first time. And yet, earlier this year, the Ministry of Corporate Affairs, in its draft Digital Competition Bill, mentioned 'Pricing/Deep Discounting' as one of the Anti-Competitive Practices, or ACPs, that the draft Bill relies on. Does this mean that discounting or pricing can be anti-competitive? If so, how do we identify this form of predatory pricing?&lt;/b&gt;
        &lt;p style="text-align: start; "&gt;In this Explainer, we explore the practice of 'Predatory Pricing' and unpack it not just for legal practitioners and antitrust authorities, but also for lay people whose reliance on BigTech platforms continues to increase every day."&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Credits&lt;/h3&gt;
&lt;ul style="text-align: start; "&gt;
&lt;li style="text-align: justify; "&gt;Conceptualisation and research by Abhineet Nayyar and Isha Suri&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;Design by Chris&lt;/li&gt;
&lt;/ul&gt;
&lt;hr /&gt;
&lt;div style="text-align: start; "&gt;Click &lt;a class="external-link" href="http://cis-india.org/raw/files/explainer-predatory-pricing.pdf"&gt;here&lt;/a&gt; to see the explainer.&lt;/div&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/raw/explainer-predatory-pricing'&gt;https://cis-india.org/raw/explainer-predatory-pricing&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Abhineet Nayyar and Isha Suri</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Researchers at Work</dc:subject>
    
    
        <dc:subject>Digital Competition</dc:subject>
    
    
        <dc:subject>Antitrust</dc:subject>
    

   <dc:date>2025-04-23T13:50:12Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>





</rdf:RDF>
