<?xml version="1.0" encoding="utf-8" ?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:syn="http://purl.org/rss/1.0/modules/syndication/" xmlns="http://purl.org/rss/1.0/">
<channel rdf:about="https://cis-india.org/search_rss">
  <title>Centre for Internet and Society</title>
  <link>https://cis-india.org</link>
  
  <description>These are the search results for the query, showing results 21 to 24.</description>
  
  
  
  
  <image rdf:resource="https://cis-india.org/logo.png"/>

  <items>
    <rdf:Seq>
      <rdf:li rdf:resource="https://cis-india.org/internet-governance/files/concept-note-digital-citizen-summit"/>
      <rdf:li rdf:resource="https://cis-india.org/jobs/researchers-welfare-gender-surveillance-call"/>
      <rdf:li rdf:resource="https://cis-india.org/raw/big-data-reproductive-health-india-mcts"/>
      <rdf:li rdf:resource="https://cis-india.org/raw/oes-ambika-tandon-ai-in-the-future-of-work"/>
    </rdf:Seq>
  </items>

</channel>


    <item rdf:about="https://cis-india.org/internet-governance/files/concept-note-digital-citizen-summit">
    <title>Concept Note - Digital Citizen Summit</title>
    <link>https://cis-india.org/internet-governance/files/concept-note-digital-citizen-summit</link>
    <description>
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/internet-governance/files/concept-note-digital-citizen-summit'&gt;https://cis-india.org/internet-governance/files/concept-note-digital-citizen-summit&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>ambika</dc:creator>
    <dc:rights></dc:rights>


   <dc:date>2018-11-07T02:50:22Z</dc:date>
   <dc:type>File</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/jobs/researchers-welfare-gender-surveillance-call">
    <title>Call for Researchers: Welfare, Gender, and Surveillance</title>
    <link>https://cis-india.org/jobs/researchers-welfare-gender-surveillance-call</link>
    <description>
        &lt;b&gt;We are inviting applications for two researchers. Each researcher is expected to write a narrative essay that interrogates the modes of surveillance that people of LGBTHIAQ+ and gender non-conforming identities and sexual orientations are put under as they seek sexual and reproductive health (SRH) services in India. The researchers are expected to undertake field research in the location they are based in, and reflect on lived experiences gathered through field research as well as their own experiences of doing field research. Please read the sections below for more details about the work involved, the timeline for the same, and the application process for this call.&lt;/b&gt;
        
&lt;h4&gt;Call for Researchers: &lt;a href="https://github.com/cis-india/website/raw/master/docs/CIS_Researchers_WelfareGenderSurveillance_Call_20200110.pdf" target="_blank"&gt;Download&lt;/a&gt; (PDF)&lt;/h4&gt;
&lt;hr /&gt;
&lt;h3&gt;&lt;strong&gt;Description of the Work&lt;/strong&gt;&lt;/h3&gt;
&lt;p&gt;Each researcher is expected to author a narrative essay that presents and reflects on lived experiences of people of LGBTHIAQ+ and gender non-conforming identities and sexual orientations as they seek sexual and reproductive health (SRH) services in India. We expect the essay to contribute to a larger body of knowledge around the increasing focus on data-driven initiatives for public health provision in the country and elsewhere. Accordingly, the researcher may respond to any one or more than one of the following questions, within the context of the geographical focus as specified by the researcher:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;What are the modes of surveillance, especially in terms of generation and exploitation of digital data, experienced by people of marginalised gender identities and sexual orientations in India, as they avail of sexual and reproductive healthcare?&lt;/li&gt;
&lt;li&gt;How are the lived experiences of underserved populations, such as people of marginalised gender identities and sexual orientations, shaped by gendered surveillance while accessing sexual and reproductive services?&lt;/li&gt;
&lt;li&gt;What are the modes of governance and gender ideologies that have mediated the increasing datafication of such provision?&lt;/li&gt;&lt;/ul&gt;
&lt;p&gt;We expect the researchers to a) draw on the Indian Supreme Court’s framing of privacy in India as a fundamental right, and its implications; and b) apply and/or build on feminist conceptualisations of privacy. Further, we expect the researchers to respond to the uncertain landscape of legal rights accessible to people of LGBTHIAQ+ and gender non-conforming identities and sexual orientations, especially in the current context shaped by The Transgender Persons (Protection of Rights) Act, 2019.&lt;/p&gt;
&lt;p&gt;The researchers will undertake field research in locations of their choice, conduct interviews and discussions with people of LGBTHIAQ+ and gender non-conforming identities and sexual orientations seeking such services, and conduct formal and informal interviews with officials and personnel associated with public and private sector agencies involved in the provision of SRH services.&lt;/p&gt;
&lt;h3&gt;&lt;strong&gt;Eligibility and Application Process&lt;/strong&gt;&lt;/h3&gt;
&lt;h4&gt;We specifically encourage people of LGBTHIAQ+ and gender non-conforming identities and sexual orientations to submit their applications for this call for researchers.&lt;/h4&gt;
&lt;p&gt;We are seeking applications from individuals who:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Are based in the place where the field study is to be undertaken, for the duration of the study;&lt;/li&gt;
&lt;li&gt;Are fluent in the main regional language(s) spoken in the city where the study will be conducted, and in English (especially written);&lt;/li&gt;
&lt;li&gt;Preferably have a postgraduate degree (current students should also apply) in social or technical sciences, journalism, or legal studies (undergraduate degree-holders with research or work experience should also apply); and&lt;/li&gt;
&lt;li&gt;Have previous research and writing experiences on issues at the intersection of sexual and reproductive health, gender justice and women’s rights, and health informatics or digital public health.&lt;/li&gt;&lt;/ul&gt;
&lt;p&gt;Please send the following documents (in text or PDF formats) to &lt;strong&gt;raw@cis-india.org by Friday, January 24&lt;/strong&gt; to apply for the researcher positions:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Brief CV with relevant academic and professional information;&lt;/li&gt;
&lt;li&gt;Two samples of academic/professional (published/unpublished) writing by the applicant; and&lt;/li&gt;
&lt;li&gt;A brief research proposal (around 500 words) that should specify the scope (geographical and conceptual), research questions, and motivation of the essay to be authored by the applicant.&lt;/li&gt;&lt;/ul&gt;
&lt;p&gt;All applicants will be informed of the selection decisions by Friday, January 31.&lt;/p&gt;
&lt;h3&gt;&lt;strong&gt;Timeline of the Work&lt;/strong&gt;&lt;/h3&gt;
&lt;p&gt;&lt;strong&gt;February 3-7&lt;/strong&gt; The CIS research team will have a call with each researcher to plan out the work to be undertaken by them&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;February - March&lt;/strong&gt; Researchers are to undertake field research, as proposed by the researchers and discussed with the CIS research team&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;March 27&lt;/strong&gt; Researchers are to submit a full draft essay (around 3,000 words)&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;March 30 - April 3&lt;/strong&gt; The CIS research team will have a call with each researcher to discuss the shared draft essays and make plans towards their finalisation&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;May 15&lt;/strong&gt; Researchers are to submit the final essay (around 5,000 words, excluding footnotes and references)&lt;/p&gt;
&lt;p&gt;As part of this project, CIS will organise two discussion events in Bengaluru and New Delhi during April-June (tentatively). Event dates will be decided in conversation with the researchers, who will be invited to present their work at these events.&lt;/p&gt;
&lt;h3&gt;&lt;strong&gt;Remuneration&lt;/strong&gt;&lt;/h3&gt;
&lt;p&gt;Each researcher will be paid a remuneration of Rs. 1,00,000 (inclusive of taxes) in two equal installments: the first on signing of the agreement in February 2020, and the second on submission of the final essay in May 2020.&lt;/p&gt;
&lt;p&gt;We will also reimburse local travel expenses of each researcher up to Rs. 10,000, and translation and transcription expenses (if any) incurred by each researcher up to Rs. 10,000. These reimbursements will be made on the basis of expense invoices shared by the researcher.&lt;/p&gt;
&lt;h3&gt;&lt;strong&gt;Description of the Project&lt;/strong&gt;&lt;/h3&gt;
&lt;p&gt;Previous research conducted by CIS on the subject of sexual and reproductive health (SRH) services in India observes that there is a complex web of surveillance, or ‘dataveillance’, around each patient as they avail of SRH services from the state. In this current project, we are aiming to map the ecosystem of surveillance around SRH services as their provision becomes increasingly ‘data-driven’, and explore its implications for patients and beneficiaries.&lt;/p&gt;
&lt;p&gt;Through this project, we are interested in documenting the roles played by both the public and the private sector actors in this ecosystem of health surveillance. We understand the role of private sector actors as central to state provision of sexual and reproductive health services, especially through the institutionalisation of data-driven health insurance models, as well as through extensive privatisation of public health services. By studying semi-private, private, and public medical establishments including hospitals, primary/community health centres and clinics, we aim to develop a comparative analysis of surveillance ecosystems across the three establishment types.&lt;/p&gt;
&lt;p&gt;This project is led by Ambika Tandon, Aayush Rathi, and Sumandro Chattapadhyay at the Centre for Internet and Society, and is supported by a grant from Privacy International.&lt;/p&gt;
&lt;h3&gt;&lt;strong&gt;Indicative Reading List&lt;/strong&gt;&lt;/h3&gt;
&lt;p&gt;&lt;em&gt;We are sharing below a short and indicative list of readings that may be useful for potential applicants&lt;/em&gt;.&lt;/p&gt;
&lt;p&gt;Aayush Rathi, &lt;a href="https://www.epw.in/engage/article/indias-digital-health-paradigm-foolproof" target="_blank"&gt;Is India's Digital Health System Foolproof?&lt;/a&gt; (2019)&lt;/p&gt;
&lt;p&gt;Aayush Rathi and Ambika Tandon, &lt;a href="https://www.epw.in/engage/article/data-infrastructures-inequities-why-does-reproductive-health-surveillance-india-need-urgent-attention" target="_blank"&gt;Data Infrastructures and Inequities: Why Does Reproductive Health Surveillance in India Need Our Urgent Attention?&lt;/a&gt; (2019)&lt;/p&gt;
&lt;p&gt;Ambika Tandon, &lt;a href="https://cis-india.org/internet-governance/blog/ambika-tandon-december-23-2018-feminist-methodology-in-technology-research" target="_blank"&gt;Feminist Methodology in Technology Research: A Literature Review&lt;/a&gt; (2018)&lt;/p&gt;
&lt;p&gt;Ambika Tandon, &lt;a href="https://cis-india.org/raw/big-data-reproductive-health-india-mcts" target="_blank"&gt;Big Data and Reproductive Health in India: A Case Study of the Mother and Child Tracking System&lt;/a&gt; (2019)&lt;/p&gt;
&lt;p&gt;Anja Kovacs, &lt;a href="https://genderingsurveillance.internetdemocracy.in/theory/" target="_blank"&gt;Reading Surveillance through a Gendered Lens: Some Theory&lt;/a&gt; (2017)&lt;/p&gt;
&lt;p&gt;Lindsay Weinberg, &lt;a href="https://www.westminsterpapers.org/articles/10.16997/wpcc.258/" target="_blank"&gt;Rethinking Privacy: A Feminist Approach to Privacy Rights after Snowden&lt;/a&gt; (2017)&lt;/p&gt;
&lt;p&gt;Nicole Shephard, &lt;a href="https://www.apc.org/en/pubs/big-data-and-sexual-surveillance" target="_blank"&gt;Big Data and Sexual Surveillance&lt;/a&gt; (2016)&lt;/p&gt;
&lt;p&gt;Sadaf Khan, &lt;a href="https://deepdives.in/data-bleeding-everywhere-a-story-of-period-trackers-8766dc6a1e00" target="_blank"&gt;Data Bleeding Everywhere: A Story of Period Trackers&lt;/a&gt; (2019)&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/jobs/researchers-welfare-gender-surveillance-call'&gt;https://cis-india.org/jobs/researchers-welfare-gender-surveillance-call&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>ambika</dc:creator>
    <dc:rights></dc:rights>

    
    <dc:subject>Welfare Governance</dc:subject>
    <dc:subject>Privacy</dc:subject>
    <dc:subject>Gender</dc:subject>
    <dc:subject>Gender, Welfare, and Privacy</dc:subject>
    <dc:subject>Researchers at Work</dc:subject>
    

   <dc:date>2020-02-13T15:05:37Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/raw/big-data-reproductive-health-india-mcts">
    <title>Big Data and Reproductive Health in India: A Case Study of the Mother and Child Tracking System</title>
    <link>https://cis-india.org/raw/big-data-reproductive-health-india-mcts</link>
    <description>
        &lt;b&gt;In this case study undertaken as part of the Big Data for Development (BD4D) network, Ambika Tandon evaluates the Mother and Child Tracking System (MCTS) as a data-driven initiative in reproductive health at the national level in India. The study also assesses the potential of MCTS to contribute towards the big data landscape on reproductive health in the country, as the Indian state’s imagination of health informatics moves towards big data.&lt;/b&gt;
        
&lt;h4&gt;Case study: &lt;a href="https://github.com/cis-india/website/raw/master/bd4d/CIS_CaseStudy_AT_BigDataReproductiveHealthMCTS.pdf" target="_blank"&gt;Download&lt;/a&gt; (PDF)&lt;/h4&gt;
&lt;hr /&gt;
&lt;h3&gt;Introduction&lt;/h3&gt;
&lt;p&gt;The reproductive health information ecosystem in India comprises a range of different databases across state and national levels. These collect data through a combination of manual and digital tools. Two national-level databases have been launched by the Ministry of Health and Family Welfare - the Health Management Information System (HMIS) in 2008, and the MCTS in 2009. The MCTS focuses on collecting data on maternal and child health. It was instituted due to reported gaps in the HMIS, which records monthly data across health programmes including reproductive health. There are several other state-level initiatives on reproductive health data that have either been subsumed into, or run in parallel with, the MCTS.&lt;/p&gt;
&lt;p&gt;With this case study, we aim to evaluate the MCTS as a data-driven initiative in reproductive health at the national level. The study will also assess its potential to contribute towards the big data landscape on reproductive health in the country, as the Indian state’s imagination of health informatics moves towards big data. The methodology for the case study involved a desk-based review of existing literature on the use of health information systems globally, as well as analysis of government reports, journal articles, media coverage, policy documents, and other material on the MCTS.&lt;/p&gt;
&lt;p&gt;The first section of this report details the theoretical framing of the case study, drawing on the feminist critique of reproductive data systems. The second section maps the current landscape of reproductive health data produced by the state in India, with a focus on data flows, and barriers to data collection and analysis at the local and national levels. The case of abortion data is used to substantiate the argument about flawed data collection systems at the national level. Section three briefly discusses the state’s imagination of reproductive health policy and the role of data systems through a discussion of the National Health Policy, 2017 and the National Health Stack, 2018. Finally, we make some policy recommendations and identify directions for future research, taking into account the ongoing shift towards big data globally to democratise reproductive healthcare.&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/raw/big-data-reproductive-health-india-mcts'&gt;https://cis-india.org/raw/big-data-reproductive-health-india-mcts&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>ambika</dc:creator>
    <dc:rights></dc:rights>

    
    <dc:subject>Big Data</dc:subject>
    <dc:subject>Data Systems</dc:subject>
    <dc:subject>Researchers at Work</dc:subject>
    <dc:subject>Reproductive and Child Health</dc:subject>
    <dc:subject>Research</dc:subject>
    <dc:subject>Featured</dc:subject>
    <dc:subject>Publications</dc:subject>
    <dc:subject>BD4D</dc:subject>
    <dc:subject>Healthcare</dc:subject>
    <dc:subject>Big Data for Development</dc:subject>
    

   <dc:date>2019-12-06T04:57:55Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="https://cis-india.org/raw/oes-ambika-tandon-ai-in-the-future-of-work">
    <title>AI in the Future of Work</title>
    <link>https://cis-india.org/raw/oes-ambika-tandon-ai-in-the-future-of-work</link>
    <description>
        &lt;b&gt;Artificial Intelligence and allied technologies form part of what is being called the fourth Industrial Revolution.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;Some analysts &lt;a href="https://workofthefuturecongress.mit.edu/wp-content/uploads/2019/06/w25682.pdf"&gt;project the loss of jobs&lt;/a&gt; as AI replaces humans, especially in job roles that consist of repetitive tasks that are easier to automate. Another prediction is that AI, as preceding technologies, will &lt;a href="https://www.ilo.org/wcmsp5/groups/public/---dgreports/---cabinet/documents/publication/wcms_647306.pdf"&gt;enhance and complement&lt;/a&gt; human capability, rather than replacing it at large scales. AI at the workplace includes a wide range of technologies, from &lt;a href="https://www.infosys.com/human-amplification/Documents/manufacturing-ai-perspective.pdf"&gt;machine-to-machine interactions on the factory floor&lt;/a&gt;, to automated decision-making systems.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Studying the Platform Economy&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The platform economy, in particular, is dependent on AI in the design of aggregator platforms that form a two-way market between customers and workers. Platforms deploy AI at a number of different stages, from recruitment to assignment of tasks to workers. AI systems often reflect existing social biases, as they are built using biased datasets, and by non-diverse teams that are not attuned to such biases. This has been the case in the platform economy as well, where biased systems impact the ability of marginalised workers to access opportunities. To take an example, Amazon’s algorithm to filter workers’ resumes was &lt;a href="https://www.reuters.com/article/us-amazon-com-jobs-automation-insight-idUSKCN1MK08G"&gt;biased against women&lt;/a&gt; because it was trained on 10 years of hiring data, and ended up reflecting the underrepresentation of women in the tech industry. That is not to say that algorithms introduce biases where they didn’t exist earlier, but that they take existing biases and hard code them into systems in a systematic and predictable manner.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Biases are made even more explicit in marketplace platforms, which allow employers to review workers’ profiles and skills for a fee. In a study of platforms offering home-based services in India, we found that marketplace platforms offer filtering mechanisms which allow employers to filter workers by demographic characteristics such as gender, age, religion, and in one case, caste (the research publication is forthcoming). The design of the platform itself, in this case, encourages and enables discrimination against workers. One of the leading platforms in India had ‘Hindu maid’ and ‘Hindu cook’ as its top search terms, reflecting the ways in which employers from the dominant religion are encouraged to discriminate against workers from minority religions in the Indian platform economy.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Another source of bias in the platform economy is rating and pricing systems, which can reduce the quality and quantum of work offered to marginalised workers. Rating systems exist across platform types - those that offer on-demand or location-based work, microwork platforms, and marketplace platforms. They allow customers and employers to rate workers on a scale, and are most often one-way feedback systems to review a worker’s performance (as our forthcoming research discusses, we found very few examples of feedback loops that also allow workers to rate employers). Rating systems &lt;a href="https://datasociety.net/pubs/ia/Discriminating_Tastes_Customer_Ratings_as_Vehicles_for_Bias.pdf"&gt;have been found&lt;/a&gt; to be a source of anxiety for workers, as they can be rated poorly for unfair reasons, including their demographic characteristics. Most platforms penalise workers for poor ratings, and may even stop them from accessing any tasks at all if their ratings fall below a certain threshold. Without adequate grievance redressal mechanisms that allow workers to contest poor ratings, rating systems are prone to reflect customer biases while appearing neutral. It is difficult to assess the level of such bias without companies releasing data comparing ratings of workers by their demographic characteristics, but it &lt;a href="https://datasociety.net/pubs/ia/Discriminating_Tastes_Customer_Ratings_as_Vehicles_for_Bias.pdf"&gt;has been argued&lt;/a&gt; that there is ample evidence to believe that demographic characteristics will inevitably impact workers’ ratings due to widespread biases.&lt;/p&gt;
&lt;h3&gt;Searching for a Solution&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;It is clear that platform companies need to be pushed to address biases and make their systems fairer and non-discriminatory. Some companies, such as Amazon in the example above, have responded by suspending algorithms that are proven to be biased. However, this is a temporary fix, as companies rarely seek to drop such projects indefinitely. In the platform economy, where algorithms are central to the business model of companies, complete suspension is near impossible. Amazon also tried another quick fix - it &lt;a href="https://www.reuters.com/article/us-amazon-com-jobs-automation-insight-idUSKCN1MK08G"&gt;altered the algorithm&lt;/a&gt; to respond neutrally to terms such as ‘woman’. This is a process known as debiasing the model, through which any biased connections (such as between the word ‘woman’ and downgrading) being made by the algorithm are explicitly removed. Another solution is diversifying or debiasing datasets. In this example, the algorithm could be fed a larger sample of resumes and decision-making logics from industries that have a higher representation of women.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Another set of solutions could be drawn from anti-discrimination law, which prohibits discrimination at the workplace. In India, anti-discrimination laws protect against wage inequality, as well as discrimination at the stage of recruitment for protected groups such as transgender persons. While it can be argued that biased rating systems lead to wage inequality, there are several barriers to applying anti-discrimination law for workers in the platform economy. First, most jurisdictions, including India, protect only employees from discrimination, not self-employed contractors. Another challenge is the lack of data to prove that rating or recruitment algorithms are discriminatory, without which legal recourse is impossible. &lt;a href="https://datasociety.net/pubs/ia/Discriminating_Tastes_Customer_Ratings_as_Vehicles_for_Bias.pdf"&gt;Rosenblat et al.&lt;/a&gt; (2016) discuss these challenges in the context of the US, suggesting solutions such as addressing employment misclassification or modifying pleading requirements to bring platform workers under the protection of the law.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Feminist principles point to structural shifts that are required to ensure robust protections for workers. Analysing algorithmic systems from a feminist lens indicates several points in the design at which interventions must be focused to ensure impact. The teams designing algorithms need to be made more diverse, along with integrating an explicit focus on assessing the impact of systems at the stage of design. Companies need to be more transparent with their data, and encourage independent audits of their systems. Corporate and government actors must be held to account to fix broken AI systems.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Ambika Tandon is a Senior Researcher at the &lt;a href="https://cis-india.org/"&gt;Centre for Internet &amp;amp; Society (CIS)&lt;/a&gt; in India, where she studies the intersections of gender and technology. She focuses on women’s work in the digital economy, and the impact of emerging technologies on social inequality. She is also interested in developing feminist methods for technology research. Ambika tweets at &lt;a href="https://twitter.com/AmbikaTandon"&gt;@AmbikaTandon&lt;/a&gt;.&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This blog post was originally &lt;a class="external-link" href="https://ethicalsource.dev/blog/ai-in-the-future-of-work/"&gt;published by the Organization for Ethical Source&lt;/a&gt;.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='https://cis-india.org/raw/oes-ambika-tandon-ai-in-the-future-of-work'&gt;https://cis-india.org/raw/oes-ambika-tandon-ai-in-the-future-of-work&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>ambika</dc:creator>
    <dc:rights></dc:rights>

    
    <dc:subject>CISRAW</dc:subject>
    <dc:subject>Researchers at Work</dc:subject>
    <dc:subject>Artificial Intelligence</dc:subject>
    <dc:subject>Future of Work</dc:subject>
    

   <dc:date>2021-12-07T01:51:42Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>




</rdf:RDF>
