The Centre for Internet and Society
https://cis-india.org
Notes for India as the digital trade juggernaut rolls on
https://cis-india.org/internet-governance/blog/the-hindu-arindrajit-basu-february-8-2022-notes-for-india-as-the-digital-trade-juggernaut-rolls-on
<b>Sitting out trade negotiations could result in the country losing out on opportunities to shape the rules.</b>
<p>The article by Arindrajit Basu was <a class="external-link" href="https://www.thehindu.com/opinion/op-ed/notes-for-india-as-the-digital-trade-juggernaut-rolls-on/article38393921.ece">published in the Hindu</a> on February 8, 2022</p>
<hr />
<p style="text-align: justify; ">Despite the cancellation of the Twelfth Ministerial Conference (MC12) of the World Trade Organization (WTO) late last year (originally scheduled for November 30–December 3, 2021) due to COVID-19, digital trade negotiations continue their ambitious march forward. On December 14, Australia, Japan, and Singapore, co-convenors of the plurilateral Joint Statement Initiative (JSI) on e-commerce, welcomed the ‘substantial progress’ made at the talks over the past three years and stated that they expected a convergence on more issues by the end of 2022.</p>
<h3>Holding out</h3>
<p style="text-align: justify; ">But therein lies the rub: even though JSI members account for over 90% of global trade, and the initiative welcomes newer entrants, over half of WTO members (largely from the developing world) continue to opt out of these negotiations. They fear being arm-twisted into accepting global rules that could etiolate domestic policymaking and economic growth. India and South Africa have led the resistance and been the JSI’s most vocal critics. India has thus far resisted pressures from the developed world to jump onto the JSI bandwagon, largely through coherent legal argumentation against the JSI and a long-term developmental vision. Yet, given the increasingly fragmented global trading landscape and the rising importance of the global digital economy, can India tailor its engagement with the WTO to better accommodate its economic and geopolitical interests?</p>
<h3><strong>Global rules on digital trade</strong></h3>
<p style="text-align: justify; ">The WTO emerged in a largely analogue world in 1994. It was only at the Second Ministerial Conference (1998) that members agreed on core rules for e-commerce regulation. A temporary moratorium was imposed on customs duties relating to the electronic transmission of goods and services. This moratorium has been renewed continuously, to consistent opposition from India and South Africa. They argue that the moratorium imposes significant costs on developing countries as they are unable to benefit from the revenue customs duties would bring.</p>
<p style="text-align: justify; ">The members also agreed to set up a work programme on e-commerce across four issue areas at the General Council: goods, services, intellectual property, and development. Frustrated by a lack of progress in the two decades that followed, 70 members brokered the JSI in December 2017 to initiate exploratory work on the trade-related aspects of e-commerce. Several countries, including developing countries, signed up in 2019 despite holding contrary views to most JSI members on key issues. Surprise entrants, China and Indonesia, argued that they sought to shape the rules from within the initiative rather than sitting on the sidelines.</p>
<p style="text-align: justify; ">India and South Africa have rightly pointed out that the JSI contravenes the WTO’s consensus-based framework, where every member has a voice and vote regardless of economic standing. Unlike the General Council Work Programme, which India and South Africa have attempted to revitalise in the past year, the JSI does not include all WTO members. For the process to be legally valid, the initiative must either build consensus or negotiate a plurilateral agreement outside the aegis of the WTO.</p>
<p style="text-align: justify; ">India and South Africa’s positioning strikes a chord at the heart of the global trading regime: how to balance the sovereign right of states to shape domestic policy with international obligations that would enable them to reap the benefits of a global trading system.</p>
<h3><strong>A contested regime</strong></h3>
<p style="text-align: justify; ">There are several issues upon which the developed and developing worlds disagree. One such issue concerns international rules relating to the free flow of data across borders. Several countries, both within and outside the JSI, have imposed data localisation mandates that compel corporations to store and process data within territorial borders. This is a key policy priority for India. Several payment card companies, including Mastercard and American Express, were prohibited from issuing new cards for failure to comply with a 2018 financial data localisation directive from the Reserve Bank of India. The Joint Parliamentary Committee (JPC) on data protection has recommended stringent localisation measures for sensitive personal data and critical personal data in India’s data protection legislation. However, for nations and industries in the developed world looking to access new digital markets, these restrictions impose unnecessary compliance costs, thus arguably hampering innovation and supposedly amounting to unfair protectionism.</p>
<p style="text-align: justify; ">There is a similar disagreement regarding domestic laws that mandate the disclosure of source codes. Developed countries believe that this hampers innovation, whereas developing countries believe it is essential for algorithmic transparency and fairness — which was another key recommendation of the JPC report in December 2021.</p>
<h3><strong>India’s choices</strong></h3>
<p style="text-align: justify; ">India’s global position is reinforced through narrative building by political and industrial leaders alike. Data sovereignty is championed as a means of resisting ‘data colonialism’, the exploitative economic practices and intensive lobbying of Silicon Valley companies. Policymaking for India’s digital economy is at a critical juncture. Surveillance reform, personal data protection, algorithmic governance, and non-personal data regulation must be galvanised through evidence-based insights, and work for individuals, communities, and aspiring local businesses — not just established larger players.</p>
<p style="text-align: justify; ">Hastily signing trading obligations could reduce the space available to frame appropriate policy. But sitting out trade negotiations will mean that the digital trade juggernaut will continue unchecked, through mega-regional trading agreements such as the Regional Comprehensive Economic Partnership (RCEP) and the Comprehensive and Progressive Agreement for Trans-Pacific Partnership (CPTPP). India could risk becoming an unwitting standard-taker in an already fragmented trading regime and lose out on opportunities to shape these rules instead.</p>
<p style="text-align: justify; ">Alternatives exist; negotiations need not mean compromise. For example, exceptions to digital trade rules, such as ‘legitimate public policy objective’ or ‘essential security interests’, could be negotiated to preserve policymaking where needed while still acquiescing to the larger agreement. Further, any outcome need not be an all-or-nothing arrangement. Taking a cue from the Digital Economy Partnership Agreement (DEPA) between Singapore, Chile, and New Zealand, India can push for a framework where countries can pick and choose modules with which they wish to comply. These combinations can be amassed incrementally as emerging economies such as India work through domestic regulations.</p>
<p style="text-align: justify; ">Despite its failings, the WTO plays a critical role in global governance and is vital to India’s strategic interests. Negotiating without surrendering domestic policy-making holds the key to India’s digital future.</p>
<hr />
<p style="text-align: justify; "><i>Arindrajit Basu is Research Lead at the Centre for Internet and Society, India. The views expressed are personal. The author would like to thank The Clean Copy for edits on a draft of this article.</i></p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/the-hindu-arindrajit-basu-february-8-2022-notes-for-india-as-the-digital-trade-juggernaut-rolls-on'>https://cis-india.org/internet-governance/blog/the-hindu-arindrajit-basu-february-8-2022-notes-for-india-as-the-digital-trade-juggernaut-rolls-on</a>
</p>
Blog entry published 2022-02-09.
Submission to the Facebook Oversight Board: Policy on Cross-checks
https://cis-india.org/internet-governance/blog/submission-to-the-facebook-oversight-board-policy-on-cross-checks
<b>The Centre for Internet & Society (CIS) submitted public comments to the Facebook Oversight Board on a policy consultation.</b>
<h2>Is a cross-check system needed?</h2>
<p style="text-align: justify;"><strong>Recommendation for the Board</strong>: The Board should investigate the cross-check system as part of Meta’s larger problems with algorithmically amplified speech, and how such speech gets moderated.</p>
<p style="text-align: justify;"><strong>Explanation</strong>: The issues surrounding Meta’s cross-check system are not an isolated phenomenon, but rather a reflection of the problems of algorithmically amplified speech, as well as the lack of transparency in the company’s content moderation processes at large. At the outset, it must be stated that the majority of information on the cross-check system only became available after the media <a href="https://www.wsj.com/articles/facebook-files-xcheck-zuckerberg-elite-rules-11631541353?mod=article_inline">reports</a> published by the Wall Street Journal. While these reports have been extensive in documenting various aspects of the system, there is no guarantee that the disclosures they obtained provide a complete picture of the system. Further, given that Meta has been found to purposely mislead the Board and the public on how the cross-check system operates, it is worth investigating the incentives that necessitate the cross-check system in the first place.</p>
<p style="text-align: justify;">Meta claims that the cross-check system works as a check against false positives: they “employ additional reviews for high-visibility content that may violate our policies.” Essentially, they want to make sure that content that stays up on the platform and reaches a large audience follows their content guidelines. However, previous disclosures have <a href="https://www.wsj.com/articles/facebook-hate-speech-india-politics-muslim-hindu-modi-zuckerberg-11597423346">proven</a> that policy executives have prioritized the company’s ‘business interests’ over removing content that violates their policies, and have <a href="https://www.theguardian.com/technology/2021/apr/12/facebook-fake-engagement-whistleblower-sophie-zhang">waited to act on known problematic content</a> until significant external pressure built up, including in India. In this context, the cross-check system seems less like a measure designed to protect users who might be exposed to problematic content, and more like a measure for managing public perception of the company.</p>
<p style="text-align: justify;">Thus, the Board should investigate both how content gains an audience on the platform and how it gets moderated. Previous <a href="https://www.theguardian.com/technology/2021/apr/12/facebook-fake-engagement-whistleblower-sophie-zhang">whistleblower disclosures</a> have shown that the mechanics of algorithmically amplified speech, which prioritize <a href="https://www.technologyreview.com/2021/03/11/1020600/facebook-responsible-ai-misinformation/">engagement and growth over safety</a>, are easily taken advantage of by bad actors to promote their viewpoints through artificially induced virality. The cross-check system and other measures of content moderation at scale would not be needed if it were harder to spread problematic content on the platform in the first place. Instead of focusing only on one specific system, the Board needs to urge Meta to re-evaluate the incentives that drive content sharing on the platform and come up with ways to make the platform safer.</p>
<h2 style="text-align: justify;">Meta’s Obligations under Human Rights Law</h2>
<p style="text-align: justify;"><strong>Recommendation for the Board: </strong>The Board must consider the cross-check system to be violative of Meta’s obligations under the International Covenant on Civil and Political Rights (ICCPR). Additionally, the cross-check ranker must incorporate Meta’s commitments towards human rights, as outlined in its Corporate Human Rights Policy.</p>
<p style="text-align: justify;"><strong>Explanation</strong>: Meta’s content moderation, and by extension its cross-check system, is bound by both international human rights law and the Board’s past decisions. At the outset, the system fails the three-pronged test of legality, legitimacy, and necessity and proportionality, as delineated under Article 19(3) of the International Covenant on Civil and Political Rights (ICCPR). Firstly, this system has been “<a href="https://www.wsj.com/articles/facebook-files-xcheck-zuckerberg-elite-rules-11631541353?mod=article_inline">scattered throughout the company, without clear governance or ownership</a>”, which violates the legality principle, since there is no clear guidance on what sort of speech, or which classes of users, would receive the treatment of this system. Secondly, there is no understanding of the legitimacy of the aims with which this system was set up in the first place, beyond Meta’s own assertions, which have been <a href="https://www.oversightboard.com/news/215139350722703-oversight-board-demands-more-transparency-from-facebook/">countered</a> by evidence to the contrary. Thirdly, the necessity and proportionality of the restriction has to be <a href="https://www.oversightboard.com/decision/FB-691QAMHJ">read along</a> with the <a href="https://www.ohchr.org/en/issues/freedomopinion/articles19-20/pages/index.aspx">Rabat Plan of Action</a>, which requires that for a statement to become a criminal offense, a six-pronged threshold test is to be applied: a) the social and political context, b) the speaker’s position or status in society, c) intent to incite the audience against a target group, d) the content and form of the speech, e) the extent of its dissemination, and f) the likelihood of harm. As news reports have indicated, Meta has been utilizing the cross-check system to privilege speech from influential users, and in the process has shielded inflammatory, inciting speech that would otherwise have met the Rabat threshold. As such, the third requirement is not fulfilled either.</p>
<p style="text-align: justify;">Additionally, Meta’s own <a href="https://about.fb.com/wp-content/uploads/2021/03/Facebooks-Corporate-Human-Rights-Policy.pdf">Corporate Human Rights Policy</a> commits to respecting human rights in line with the UN Guiding Principles on Business and Human Rights (UNGPs). Therefore, the cross-check ranker must incorporate these existing commitments to human rights, including:</p>
<ul>
<li style="text-align: justify;">The right to freedom of expression: UN Special Rapporteur on freedom of opinion and expression report <a href="https://ap.ohchr.org/documents/dpage_e.aspx?si=A/HRC/38/35">A/HRC/38/35</a> (2018); <a href="https://www.ohchr.org/EN/NewsEvents/Pages/DisplayNews.aspx?NewsID=25729&LangID=E">Joint Statement of international freedom of expression monitors on COVID-19 (March, 2020)</a>.</li></ul>
<p style="text-align: justify;">The Report of the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression addresses the regulation of user-generated online content.</p>
<p>The Joint Statement addresses governmental promotion and protection of access to, and the free flow of, information during the pandemic.</p>
<ul>
<li>The right to non-discrimination: International Convention on the Elimination of All Forms of Racial Discrimination (<a href="https://www.ohchr.org/EN/ProfessionalInterest/Pages/CERD.aspx">ICERD</a>), Articles 1 and 4.</li></ul>
<p>Article 1 of the ICERD defines racial discrimination.</p>
<p>Article 4 of the ICERD condemns propaganda and organisations that attempt to justify discrimination or are based on the idea of racial supremacism.</p>
<ul>
<li>Participation in public affairs and the right to vote: ICCPR Article 25.</li>
<li>The right to remedy: General Comment No. 31, Human Rights Committee (2004) (<a href="https://tbinternet.ohchr.org/_layouts/15/treatybodyexternal/Download.aspx?symbolno=CCPR%2fC%2f21%2fRev.1%2fAdd.13&Lang=en">General Comment 31</a>); UNGPs, Principle 22.</li></ul>
<p>The General Comment discusses the nature of the general legal obligation imposed on State Parties to the Covenant.</p>
<p style="text-align: justify;">Guiding Principle 22 states that where business enterprises identify that they have caused or contributed to adverse impacts, they should provide for or cooperate in their remediation through legitimate processes.</p>
<h2>Meta’s obligations to avoid political bias and false positives in its cross-check system</h2>
<p style="text-align: justify;"><strong>Recommendation for the Board: </strong>The Board must urge Meta to adopt and implement the Santa Clara Principles on Transparency and Accountability to ensure that it is open about risks to user rights when there is involvement from the State in content moderation. Additionally, the Board must ask Meta to undertake a diversity and human rights audit of its existing policy teams, and commit to regular cultural training for its staff. Finally, the Board must investigate the potential conflicts of interest that arise when Meta’s policy team has any sort of nexus with political parties, and how that might impact content moderation.</p>
<p style="text-align: justify;"><strong>Explanation</strong>: For the cross-check system to be free from biases, it is important for Meta to be transparent with the Board regarding the rationale, standards, and processes of the cross-check review, and to report on the relative error rates of determinations made through cross-check compared with ordinary enforcement procedures. It also needs to disclose to the Board the particular situations in which it uses the system and those in which it does not. Principle 4 under the Foundational Principles of the <a href="https://santaclaraprinciples.org/">Santa Clara Principles on Transparency and Accountability in Content Moderation</a> encourages companies to recognize the risk to user rights when there is involvement from the State in processes of content moderation, and asks companies to make users aware that: a) a state actor has requested/participated in an action on their content/account, and b) the company believes that the action was needed as per the relevant law. Users should be allowed access to any rules or policies, and any formal or informal working relationships the company holds with state actors in terms of content regulation, the process of flagging accounts/content, and state requests for action.</p>
<p style="text-align: justify;">The Board must consider that an erroneous lack of action (false positives) might not always be a flaw of the system, but a larger, structural issue regarding how policy teams at Meta function. As previous disclosures have <a href="https://www.wsj.com/articles/facebook-hate-speech-india-politics-muslim-hindu-modi-zuckerberg-11597423346">proven</a>, the contours of what sort of violating content gets to stay up on the platform have been ideologically and politically coloured, as policy executives have prioritized the company’s ‘business interests’ over social harmony. In this light, it is not sufficient to simply propose better transparency and accountability measures for Meta to adopt within its content moderation processes to avoid political bias. Rather, the Board’s recommendations must focus on the structural aspects of the human moderator and policy teams behind these processes. The Board must ask Meta to a) urgently undertake a diversity and human rights audit of its existing team and its hiring processes, and b) commit to regular training to ensure that its policy staff are culturally literate in the socio-political regions they work in. Further, the Board must seriously investigate the potential <a href="https://time.com/5883993/india-facebook-hate-speech-bjp/">conflicts of interest</a> that arise when regional policy teams at Meta, with ties to political parties, are also tasked with regulating content from representatives of those parties, and how that impacts the moderation processes at large.</p>
<p style="text-align: justify;">Finally, in case decision <a href="https://www.oversightboard.com/decision/FB-691QAMHJ">2021-001-FB-FBR</a>, the Board made a number of recommendations to Meta which must be implemented in the current situation, including: a) considering the political context while looking at potential risks, b) employment of specialized staff in content moderation while evaluating political speech from influential users, c) familiarity with the political and linguistic context, d) absence of any interference and undue influence, e) public explanation regarding the rules Meta uses when imposing sanctions against influential users, and f) the sanctions being time-bound.</p>
<h2 style="text-align: justify;">Transparency of the cross-check system</h2>
<p style="text-align: justify;"><strong>Recommendation for the Board: </strong>The Board must urge Meta to adopt and implement the Santa Clara Principles on Transparency and Accountability to increase the transparency of its cross-check system.</p>
<p style="text-align: justify;"><strong>Explanation: </strong>There are ways in which Meta can increase the transparency of not only the cross-check system, but the content moderation process in general. The following recommendations draw from <a href="https://santaclaraprinciples.org/">The Santa Clara Principles</a> and the Board’s own previous decisions:</p>
<p style="text-align: justify;">Considering Principle 2 of the Santa Clara Principles: Understandable Rules and Policies, Meta should ensure that the policies and rules governing moderation of content and user behaviors on Facebook are<strong> clear, easily understandable, and available in the languages</strong> in which the user operates.</p>
<p style="text-align: justify;">Drawing from Principle 5 on Integrity and Explainability, and from the Board’s recommendation in case decision <a href="https://www.oversightboard.com/decision/FB-691QAMHJ">2021-001-FB-FBR</a>, which advises Meta to “<em>Provide users with accessible information on how many violations, strikes and penalties have been assessed against them, and the consequences that will follow future violations</em>”, Meta should be able to <strong>explain content moderation decisions to users in all cases</strong>: when content is under review, when the decision has been made to leave it up, or when it is taken down. We recommend that Meta keep a publicly accessible running tally of the number of moderation decisions made on a piece of content to date, with their explanations. This would allow third parties (like journalists, activists, researchers, and the OSB) to hold Facebook accountable when it does not follow its own policies, as has previously been the case.</p>
<p style="text-align: justify;">In the same case decision, the Board has also previously recommended that Meta “<em>Produce more information to help users understand and evaluate the process and criteria for applying the newsworthiness allowance, including how it applies to influential accounts. The company should also clearly explain the rationale, standards and processes of the cross-check review, and report on the relative error rates of determinations made through cross-checking compared with ordinary enforcement procedures.</em>” Thus, Meta should <strong>publicly explain the cross check system </strong>in detail with examples, and make public the list of attributes that qualify a piece of content for secondary review.</p>
<p style="text-align: justify;">The Operational Principles further provide actionable steps that Meta can take to improve the transparency of its content moderation systems. Drawing from Principle 2: Notice and Principle 3: Appeals, Meta should make a satisfactory <strong>appeals process available</strong> to users, whether for decisions to leave content up or to take it down. The appeals process should be handled by context-aware teams. Meta should then <strong>publish the results</strong> of the cross-check system and the appeals processes as part of its transparency reports, including data like total content actioned, rate of success in appeals and the cross-check process, decisions overturned and preserved, etc., which would also satisfy the first Operational Principle: Numbers.</p>
<h2 style="text-align: justify;">Resources needed to improve the system for users and entities who do not post in English</h2>
<p style="text-align: justify;"><strong>Recommendations for the Board: </strong>The Board must urge Meta to urgently invest in resources to expand Meta’s content moderation services into the local contexts in which the company operates and invest in training data for local languages.</p>
<p style="text-align: justify;"><strong>Explanation: </strong>The cross-check system is not a fundamentally different problem from content moderation at large. It has been shown time and time again that Meta’s handling of content from non-Western, non-English-language contexts is severely lacking. Content hosted on the platform has been used to<a href="https://www.theguardian.com/technology/2021/apr/12/facebook-fake-engagement-whistleblower-sophie-zhang"> inflame existing tensions in developing countries</a>, <a href="https://www.wsj.com/articles/facebook-services-are-used-to-spread-religious-hatred-in-india-internal-documents-show-11635016354?mod=article_inline">promote religious hatred in India</a>, and <a href="https://www.wsj.com/articles/burn-the-houses-rohingya-survivors-recount-the-day-soldiers-killed-hundreds-1526048545?mod=article_inline">fuel genocide in Myanmar</a>, and the platform continues to host <a href="https://www.wsj.com/articles/facebook-drug-cartels-human-traffickers-response-is-weak-documents-11631812953?mod=article_inline">human traffickers and drug cartels</a> even after these issues have been identified.</p>
<p style="text-align: justify;">There is an urgent need to invest resources in expanding Meta’s content moderation services into the local contexts in which the company operates. The company should make all policy and rule documents available in the languages of its users; invest in creating automated tools that are capable of flagging content that is not posted in English; and add people familiar with local contexts to provide context-aware second-level reviews. The Facebook Files show that, even according to the company’s own engineers, <a href="https://www.wsj.com/articles/facebook-ai-enforce-rules-engineers-doubtful-artificial-intelligence-11634338184?mod=article_inline">automated content moderation</a> is still not very effective at identifying hate speech and other harmful content. Meta should focus on hiring, training, and retaining human moderators who have knowledge of local contexts. Bias training for all content moderators, especially those who will participate in second-level reviews in the cross-check system, is also extremely important to ensure acceptable decisions.</p>
<p style="text-align: justify;">Additionally, in keeping with Meta’s human rights commitments, the company should develop and publish a policy for responding to human rights violations when they are pointed out by activists, researchers, journalists and employees as a matter of due process. It should not wait for a negative news cycle to stir them into action <a href="https://www.theguardian.com/technology/2021/apr/12/facebook-fake-engagement-whistleblower-sophie-zhang">as it seems to have done in previous cases</a>.</p>
<h2 style="text-align: justify;">Benefits and limitations of automated technologies</h2>
<p style="text-align: justify;">Meta <a href="https://www.theverge.com/2020/11/13/21562596/facebook-ai-moderation">recently changed</a> its moderation practice: it now uses technology to prioritize content for human reviewers based on a severity index. Facebook <a href="https://transparency.fb.com/policies/improving/prioritizing-content-review/">has not specified</a> the technology it uses to prioritize high-severity content, but its research record shows that it <a href="https://ai.facebook.com/blog/the-shift-to-generalized-ai-to-better-identify-violating-content">uses</a> a host of automated <a href="https://ai.facebook.com/tools#frameworks-and-tools">frameworks and tools</a> to detect violating content, including image recognition tools, object detection tools, natural language processing models, speech models, and reasoning models. One such model is <a href="https://ai.facebook.com/blog/community-standards-report/">Whole Post Integrity Embeddings</a> (“WPIE”), which can assess the various elements of a given post (caption, comments, OCR, image, etc.) to work out its context and content. Facebook also uses image-matching models (SimSearchNet++) that are trained to match variations of an image with a high degree of precision and improved recall, and multilingual masked language models for cross-lingual understanding, such as <a href="https://ai.facebook.com/blog/-xlm-r-state-of-the-art-cross-lingual-understanding-through-self-supervision/">XLM-R</a>, that can identify hate speech and other policy-violating content across a wide range of languages. More recently, Facebook introduced its machine translation model, <a href="https://analyticsindiamag.com/facebooks-new-machine-translation-model-works-without-help-of-english-data/">M2M-100</a>, which can translate directly between any pair of 100 languages without routing through English.</p>
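<p style="text-align: justify;">Conceptually, severity-based prioritization amounts to a priority queue over flagged content. The sketch below is purely illustrative: the <code>ReviewQueue</code> class and its severity scores are invented for this example and do not reflect Meta’s actual system, which the company has not disclosed.</p>
<pre>
```python
import heapq

# Hypothetical sketch of severity-based review prioritization. The class,
# method names, and scores are assumptions for illustration only; Meta has
# not published its actual severity index or queueing logic.

class ReviewQueue:
    """Max-priority queue: highest-severity content reaches reviewers first."""

    def __init__(self):
        self._heap = []
        self._counter = 0  # tie-breaker so equal severities keep flag order

    def flag(self, post_id, severity):
        # heapq is a min-heap, so negate severity to get max-first ordering
        heapq.heappush(self._heap, (-severity, self._counter, post_id))
        self._counter += 1

    def next_for_review(self):
        # Returns the highest-severity flagged post, or None if empty
        if not self._heap:
            return None
        _, _, post_id = heapq.heappop(self._heap)
        return post_id

queue = ReviewQueue()
queue.flag("post-a", severity=0.2)  # low severity, e.g. borderline spam
queue.flag("post-b", severity=0.9)  # high severity, e.g. suspected incitement
queue.flag("post-c", severity=0.5)

print(queue.next_for_review())  # prints "post-b": reviewed first
```
</pre>
<p style="text-align: justify;">Under such a scheme, the accuracy of the severity scorer determines everything downstream: content the model under-scores may wait indefinitely for human review, which is one way the limitations discussed below translate into real-world harm.</p>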
<p style="text-align: justify;">Despite advances in this field, such automated tools have inherent <a href="https://www.ofcom.org.uk/__data/assets/pdf_file/0028/157249/cambridge-consultants-ai-content-moderation.pdf">limitations</a>. <a href="https://www.theverge.com/2019/2/27/18242724/facebook-moderation-ai-artificial-intelligence-platforms">Experts</a> have repeatedly maintained that while AI will get better at understanding context, it will not replace human moderators for the foreseeable future. One instance where these limitations were <a href="https://www.politico.eu/article/facebook-content-moderation-automation/">exposed</a> was during the COVID-19 pandemic, when Facebook sent its human moderators home: the number of removals flagged as hate speech on its platform more than doubled to 22.5 million in the second quarter of 2020, while the number of successful content appeals dropped to 12,600 from the 2.3 million figure for the first three months of 2020.</p>
<p style="text-align: justify;"><a href="https://www.wsj.com/articles/facebook-ai-enforce-rules-engineers-doubtful-artificial-intelligence-11634338184?mod=article_inline">The Facebook Files</a> show that Meta’s AI cannot consistently identify first-person shooting videos, racist rants and even the difference between cockfighting and car crashes. Its automated systems are only capable of removing posts that generate just 3% to 5% of the views of hate speech on the platform and 0.6% of all content that violates Meta’s policies against violence and incitement. As such, it is difficult to accept the company’s claim that nearly all of the hate speech it takes down was discovered by AI before it was reported by users.</p>
<p style="text-align: justify;">However, the benefits of such technology cannot be discounted, especially as a way of reducing <a href="https://www.theverge.com/2019/2/25/18229714/cognizant-facebook-content-moderator-interviews-trauma-working-conditions-arizona">trauma</a> for human moderators. Using AI to prioritize content for review can make moderators more efficient and reduce the harmful effects of content moderation on them, while also limiting internet users’ exposure to harmful content. AI can further reduce that toll by allocating content to moderators on the basis of their exposure history. Theoretically, if the company’s claims are to be believed, using automated technology to prioritize content for review can help improve the mental health of Facebook’s human moderators.</p>
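<p style="text-align: justify;">A rough sketch of how severity-based prioritization and exposure-aware allocation might work in principle follows. This is purely illustrative — Facebook has not disclosed its actual system, and every name and data structure here is hypothetical:</p>

```python
import heapq

# Purely illustrative; Facebook has not disclosed its actual system.
# Flagged posts are reviewed highest-severity-first, and each post is
# routed to the moderator with the least prior exposure to its category.

class ReviewQueue:
    def __init__(self):
        self._heap = []
        self._count = 0  # tie-breaker keeps FIFO order among equal severities

    def add(self, post_id, severity):
        # heapq is a min-heap, so negate severity to pop the highest first
        heapq.heappush(self._heap, (-severity, self._count, post_id))
        self._count += 1

    def next_for_review(self):
        neg_severity, _, post_id = heapq.heappop(self._heap)
        return post_id, -neg_severity

def assign_reviewer(exposure_history, category):
    """Pick the moderator with the least recorded exposure to a category."""
    return min(exposure_history,
               key=lambda mod: exposure_history[mod].get(category, 0))

queue = ReviewQueue()
queue.add("post-a", severity=0.3)
queue.add("post-b", severity=0.9)
queue.add("post-c", severity=0.6)
print(queue.next_for_review())  # ('post-b', 0.9)

history = {"mod-1": {"graphic": 14}, "mod-2": {"graphic": 3}}
print(assign_reviewer(history, "graphic"))  # mod-2
```

<p style="text-align: justify;">The two pieces correspond to the two claimed benefits: the priority queue surfaces the most harmful content fastest, and the allocation step spreads the most distressing categories across moderators instead of concentrating them on a few.</p>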
<hr />
<p>Click to download the file <a class="external-link" href="https://cis-india.org/internet-governance/policy-on-cross-checks">here</a>.</p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/submission-to-the-facebook-oversight-board-policy-on-cross-checks'>https://cis-india.org/internet-governance/blog/submission-to-the-facebook-oversight-board-policy-on-cross-checks</a>
</p>
No publisher[in alphabetical order] Anamika Kundu, Digvijay Singh, Divyansha Sehgal and Torsha SarkarFreedom of Speech and ExpressionInternet FreedomFacebookInternet Governance2022-02-09T05:31:32ZBlog EntryTransference: Reimagining Data Systems: Beyond the Gender Binary
https://cis-india.org/internet-governance/events/transference-reimagining-data-systems-beyond-the-gender-binary
<b>The Centre for Internet and Society (CIS) invites you to participate in a day-long convening on the rights of transgender persons, specifically the right to privacy and digital rights. Through this convening, we hope to highlight the concerns of transgender persons in accessing digital data systems and the privacy challenges faced by the community: access to their rights, including the right to self-identify their gender and to receive welfare services offered by the State, and the privacy challenges transgender and intersex persons face in revealing their identity.</b>
<p dir="ltr" style="text-align: justify; ">True to the meaning of the word ‘Transference’, through this convening we hope to capture and transfer the realities of transgender persons in engaging with, and being part of, digital data systems in India. Given the rapid digitisation of public and private data systems in India, we hope to initiate a conversation that understands their struggles and challenges, so as to realistically begin re-imagining data systems — digital and otherwise — in a way that is mindful of their everyday struggles with privacy and access.</p>
<p dir="ltr" style="text-align: justify; ">Owing to the history of systemic exclusion faced by transgender persons, their difficulties in accessing technological systems, and the impact of those systems on their privacy, must be highlighted as central issues requiring serious consideration. At present, the State appears to ignore their realities when designing most technology laws and policies governing digital systems.</p>
<h3 dir="ltr" style="text-align: justify; ">Background</h3>
<p dir="ltr" style="text-align: justify; "><span>In its landmark 2014 verdict in NALSA v. Union of India, the Supreme Court of India for the first time recognised the right of an individual to self-identify their gender as male, female or transgender. The verdict detailed nine directives to be implemented by the central and state governments in India for the inclusion of transgender persons.</span></p>
<p dir="ltr" style="text-align: justify; "><span>Similarly, 2017 was a watershed moment in India’s constitutional history when the Supreme Court held the right to privacy to be a fundamental right. More importantly, the Court expounded on this right and held that the protection of an individual’s gender identity is an essential component of the right to privacy and that privacy at its core includes the preservation of personal intimacies, autonomy, the sanctity of family life, marriage, procreation, the home and sexual orientation.</span></p>
<p dir="ltr" style="text-align: justify; "><span>The 2017 privacy judgement paved the way for the Supreme Court’s 2018 decision in </span><span>Navtej Johar v. Union of India</span><span>, which struck down the </span><span>Koushal</span><span> judgement and decriminalised consensual non-heterosexual acts of intimacy. In 2019, the Personal Data Protection Bill, 2019 was introduced in Parliament for the regulation and protection of personal data. The PDP Bill classifies data into two categories: (i) personal data; and (ii) sensitive personal data. Under the PDP Bill, data identifying transgender status and intersex status falls within the ambit of sensitive personal data. Around the time the PDP Bill was tabled in Parliament, the Transgender Persons (Protection of Rights) Act, 2019 was passed despite </span><a href="https://scroll.in/article/944943/explainer-despite-criticism-the-transgender-persons-bill-was-just-passed-whats-next"><span>severe opposition</span></a><span> from civil society members as well as members of Parliament.</span></p>
<p dir="ltr" style="text-align: justify; "><span>There is a lack of clarity on the interplay between the PDP Bill and the Transgender Act, and on the challenges the PDP Bill may pose to the transgender community. Moving beyond mere mentions in legal definitions framed through a cisgendered, heteronormative lens, the discourse on data and privacy must broaden its scope to realistically include people of different sexual orientations, gender and sexual identities, gender expressions and sex characteristics.</span></p>
<h3><span>About the Event</span></h3>
<p dir="ltr" style="text-align: justify; ">Through these panel discussions, we propose to highlight the concerns of transgender persons in accessing digital data systems and the privacy challenges they face: access to their rights, including the right to self-identify their gender and to access welfare services offered by the State, and the privacy challenges transgender persons face in revealing their identity.</p>
<p dir="ltr" style="text-align: justify; ">The objective of these discussions is to initiate more conversations about the technological and data exclusions faced by this historically marginalised community in India. The intent is to better understand the realities of transgender persons and contribute to the larger advocacy on privacy, intersectionality and (digital) systems design.</p>
<hr />
<p>Click to register for the event <a class="external-link" href="https://us06web.zoom.us/meeting/register/tZUpcOiqrD8uG9X_4L6EIzXI-QFCipmFqqDV"><b>here</b></a></p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/events/transference-reimagining-data-systems-beyond-the-gender-binary'>https://cis-india.org/internet-governance/events/transference-reimagining-data-systems-beyond-the-gender-binary</a>
</p>
No publishertorshaGender, Welfare, and PrivacyEventInternet Governance2021-12-15T12:58:31ZEventLaunching CIS’s Flagship Report on Private Crypto-Assets
https://cis-india.org/internet-governance/blog/launching-flagship-cis-report-on-private-crypto-assets
<b>The Centre for Internet & Society is launching its flagship report on regulating private crypto-assets in India, as part of its newly formed Financial Technology (or Fintech) research agenda. The event will be held on Zoom, at 17:30 IST on Wednesday, 15th December, 2021.</b>
<p style="text-align: justify;">This event will serve as a venue to bring together the various stakeholders involved in the crypto-asset space to discuss the state of crypto-asset regulation in India from a multitude of perspectives.</p>
<h3 style="text-align: justify;">About the private crypto-assets report</h3>
<p style="text-align: justify;">The first output under this agenda is our report on regulating private cryptocurrencies in India. The report aims to serve as an introductory resource for policymakers looking to implement a regulatory framework for private crypto-assets. It covers the technical elements of crypto-assets, their history, and their proposed use cases, benefits and limitations. It also examines how crypto-assets fit within India’s current regulatory and legislative frameworks and makes recommendations accordingly.</p>
<h3 style="text-align: justify;">About the Event</h3>
<p style="text-align: justify;">The launch event will feature an initial presentation by CIS researchers on the findings and recommendations of the flagship report. This will be followed by a moderated discussion with five panelists representing policy, academia and industry. The discussion will centre on the current status of crypto-assets in India, the government’s newly proposed regulations, and what the future holds for the Indian crypto market.</p>
<p dir="ltr">The confirmed panelists are as follows:</p>
<ol>
<li>Tanvi Ratna - Founder, Policy 4.0 and expert on blockchain and cryptocurrencies</li>
<li>Shehnaz Ahmed - Senior Resident Fellow and Fintech Lead at Vidhi Centre for Legal Policy</li>
<li>Nithya R. - Chief Executive Officer, Unos.Finance</li>
<li>Prashanth Irudayaraj - Head of R&D, Zebpay</li>
<li>Vipul Kharbanda - Non resident Fellow specialising in Fintech at CIS</li>
<li>Aman Nair - Policy Officer, CIS (Moderator)</li></ol>
<p>Registration link: <a class="external-link" href="https://us06web.zoom.us/webinar/register/WN_TdY-EPLoRvGY2rfsq4CENw">https://us06web.zoom.us/webinar/register/WN_TdY-EPLoRvGY2rfsq4CENw</a></p>
<h3>Agenda</h3>
<table class="grid listing">
<tbody>
<tr>
<td>17.30 - 17.35</td>
<td>Welcome Note</td>
</tr>
<tr>
<td>17.35 - 18.35</td>
<td>
<p>The status of private crypto-assets in India</p>
<ul>
<li>Presentation on CIS’ flagship Report on regulating private crypto-assets in India</li>
<li style="text-align: justify;">Moderated discussion with panelists across industry, government, journalism and academia providing their insight as to the current and future state of private crypto-assets, and their regulation, in India.</li></ul>
</td>
</tr>
<tr>
<td>18.35 - 19.00</td>
<td>Audience questions and discussion</td>
</tr>
</tbody>
</table>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/launching-flagship-cis-report-on-private-crypto-assets'>https://cis-india.org/internet-governance/blog/launching-flagship-cis-report-on-private-crypto-assets</a>
</p>
No publisherAman NairInternet GovernanceCryptocurrencies2021-12-13T09:11:18ZBlog EntryLaunching CIS’s Flagship Report on Private Crypto-Assets
https://cis-india.org/internet-governance/events/launching-cis-flagship-report-on-private-crypto-assets
<b>The Centre for Internet & Society is launching its flagship report on regulating private crypto-assets in India, as part of its newly formed Financial Technology (or Fintech) research agenda. This event will serve as a venue to bring together the various stakeholders involved in the crypto-asset space to discuss the state of crypto-asset regulation in India from a multitude of perspectives.</b>
<h3>About the private crypto-assets report</h3>
<p style="text-align: justify; ">The first output under this agenda is our report on regulating private cryptocurrencies in India. The report aims to serve as an introductory resource for policymakers looking to implement a regulatory framework for private crypto-assets. It covers the technical elements of crypto-assets, their history, and their proposed use cases, benefits and limitations. It also examines how crypto-assets fit within India’s current regulatory and legislative frameworks and makes recommendations accordingly.</p>
<h3 style="text-align: justify; ">About the Event</h3>
<p dir="ltr" style="text-align: justify; ">The launch event will feature an initial presentation by CIS researchers on the findings and recommendations of the flagship report. This will be followed by a moderated discussion with five panelists representing policy, academia and industry. The discussion will centre on the current status of crypto-assets in India, the government’s newly proposed regulations, and what the future holds for the Indian crypto market.</p>
<p dir="ltr" style="text-align: justify; ">The confirmed panelists are as follows:</p>
<ol>
<li>Tanvi Ratna - Founder, Policy 4.0 and expert on blockchain and cryptocurrencies </li>
<li>Shehnaz Ahmed - Senior Resident Fellow and Fintech Lead at Vidhi Centre for Legal Policy </li>
<li>Nithya R. - Chief Executive Officer, Unos.Finance</li>
<li>Prashanth Irudayaraj - Head of R&D, Zebpay </li>
<li>Vipul Kharbanda - Non resident Fellow specialising in Fintech at CIS </li>
<li>Aman Nair - Policy Officer, CIS (Moderator)</li>
</ol>
<p>Registration link: <a class="external-link" href="https://us06web.zoom.us/webinar/register/WN_TdY-EPLoRvGY2rfsq4CENw">https://us06web.zoom.us/webinar/register/WN_TdY-EPLoRvGY2rfsq4CENw</a></p>
<h3>Agenda</h3>
<table class="plain">
<tbody>
<tr>
<td>17.30 - 17.35</td>
<td>Welcome Note</td>
</tr>
<tr>
<td>17.35 - 18.35</td>
<td>
<p>The status of private crypto-assets in India</p>
<ul>
<li>Presentation on CIS’ flagship Report on regulating private crypto-assets in India</li>
<li>Moderated discussion with panelists across industry, government, journalism and academia providing their insight as to the current and future state of private crypto-assets, and their regulation, in India.</li>
</ul>
</td>
</tr>
<tr>
<td>18.35 - 19.00</td>
<td>Audience questions and discussion</td>
</tr>
</tbody>
</table>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/events/launching-cis-flagship-report-on-private-crypto-assets'>https://cis-india.org/internet-governance/events/launching-cis-flagship-report-on-private-crypto-assets</a>
</p>
No publisherAdminInternet GovernanceEventCryptocurrenciesWebinar2021-12-03T15:16:27ZEventPanel discussion on 'How to Avoid Digital ID Systems That Put People at Risk: Lessons from Afghanistan' at Freedom Online Conference
https://cis-india.org/internet-governance/news/panel-discussion-how-to-avoid-digital-id-systems-that-put-people-at-risk
<b>Amber Sinha participated as a panelist in a panel discussion on How to Avoid Digital ID Systems That Put People at Risk: Lessons from Afghanistan at the Freedom Online Conference yesterday.</b>
<p style="text-align: justify; ">The Freedom Online Coalition (FOC) was established in 2011 in response to the growing recognition of the importance of the Internet for the enjoyment of human rights. Periodically, the FOC holds a multistakeholder Conference that aims to deepen the discussion on how online freedoms are helping to promote social, cultural and economic development. The ownership of the Conference program and outputs lies with the host country, most often the Chair of the Coalition during that year.</p>
<p style="text-align: justify; ">The aim of the panel was to use the lessons learned from the Afghanistan case to take a critical and realistic look at the implementation of digital identification programs around the world. A video of the panel can be <a class="external-link" href="https://www.freedomonlineconference.com/session/how-to-avoid-digital-id-systems-that-put-people-at-risk-lessons-from-afghanistan">accessed here</a>.</p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/news/panel-discussion-how-to-avoid-digital-id-systems-that-put-people-at-risk'>https://cis-india.org/internet-governance/news/panel-discussion-how-to-avoid-digital-id-systems-that-put-people-at-risk</a>
</p>
No publisherpraskrishnaFreedom of Speech and ExpressionDigital IDInternet Governance2021-12-03T14:52:35ZNews ItemFacial Recognition Technology in India
https://cis-india.org/internet-governance/facial-recognition-technology-in-india.pdf
<b></b>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/facial-recognition-technology-in-india.pdf'>https://cis-india.org/internet-governance/facial-recognition-technology-in-india.pdf</a>
</p>
No publisherElonnai Hickok, Pallavi Bedi, Aman Nair and Amber SinhaPrivacyInternet GovernanceFacial Recognition2021-09-02T16:17:44ZFileFacial Recognition Technology in India
https://cis-india.org/internet-governance/blog/hrbdt-and-cis-august-31-2021-facial-recognition-technology-in-india
<b>The Human Rights, Big Data and Technology Project, University of Essex, UK and the Centre for Internet & Society (CIS) have jointly published a research paper on facial recognition technology. Authors, Elonnai Hickok, Pallavi Bedi, Aman Nair and Amber Sinha, examine technological tools such as CCTV and FRT which are increasingly being deployed by the government.</b>
<h3>Executive Summary</h3>
<p style="text-align: justify; ">Over the past two decades there has been a sustained effort at digitising India’s governance structure in order to foster development and innovation. The field of law enforcement and safety has seen significant change in that direction, with technological tools such as Closed Circuit Television (CCTV) and Facial Recognition Technology (FRT) increasingly being deployed by the government.</p>
<p style="text-align: justify; ">Yet for all its increased use, India still lacks a coherent legal and regulatory framework governing FRT. Towards informing such a framework, this paper seeks to document present uses of FRT in India, specifically by law enforcement agencies and central and state governments; to understand the applicability of existing legal frameworks to the use of FRT; and to define key areas that need to be addressed when using the technology in India. We also briefly look at how FRT has spread beyond law enforcement: it is now used in educational institutions, for employment purposes, and in the delivery of Covid-19 vaccines.</p>
<p style="text-align: justify; ">We begin by examining use cases of FRT systems by various divisions of central and state governments. In doing so, it becomes apparent that there is a lack of uniform standards or guidelines at either the state or central level - leading to different FRT systems having differing standards of applicability and scope of use. And while the use of such systems seems to be growing at a rapid rate, questions around their legality persist.</p>
<p style="text-align: justify; ">It is unclear whether the use of FRT is compliant with the fundamental right to privacy as affirmed by the Supreme Court in 2017 in <i>Puttaswamy</i>. While the right to privacy is not an absolute right, any state restriction of it must satisfy a three-fold requirement, the first element being an explicit legislative mandate for the curtailment. The FRT systems we have analysed lack such a mandate and are often the result of administrative or executive decisions with no legislative backing or judicial oversight.</p>
<p style="text-align: justify; ">We further locate the use of FRT within the country’s wider legislative, judicial and constitutional frameworks governing surveillance, and briefly offer comparative perspectives on the use of FRT in other jurisdictions. We then analyse the impact of the proposed Personal Data Protection Bill on the deployment of FRT. Finally, we propose a set of recommendations for the technology’s use, chief among them the need for a comprehensive legal and regulatory framework governing FRT. Such a framework must take into consideration necessity of use, proportionality, consent, security, retention, redressal mechanisms, purpose limitation, and other such principles. Since the use of FRT in India is still at a nascent stage, greater public research and dialogue into its development and use are imperative to ensure that any harms that may arise are mitigated.</p>
<hr />
<p>Click to download the entire <a href="https://cis-india.org/internet-governance/facial-recognition-technology-in-india.pdf" class="external-link">research paper here</a></p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/hrbdt-and-cis-august-31-2021-facial-recognition-technology-in-india'>https://cis-india.org/internet-governance/blog/hrbdt-and-cis-august-31-2021-facial-recognition-technology-in-india</a>
</p>
No publisherElonnai Hickok, Pallavi Bedi, Aman Nair and Amber SinhaPrivacyInternet GovernanceFacial Recognition2021-09-02T16:21:24ZBlog EntryJune and July Newsletter
https://cis-india.org/about/newsletters/june-july-2021-newsletter
<b>The newsletter presents the work done in the months of June and July 2021.</b>
<h3>Announcements</h3>
<p style="text-align: justify; ">We are pleased to announce the launch of a <strong>seminar series</strong> to showcase research around digital rights and technology policy, with a focus on the Global South. The CIS seminar series will be a venue for researchers to share works-in-progress, exchange ideas, identify avenues for collaboration, and curate research. It will also seek to mitigate the impact of Covid-19 on research exchange, and foster collaborations among researchers and academics from diverse geographies. For more details on the first session, <strong>on Information Disorders</strong>, and to register, click here: [<a href="https://4jok2.r.ag.d.sendibm3.com/mk/cl/f/5rYRQ0U6yOrzlX_5e9iqnD_UB7xRMkmO8EVgecX5S9vDUhOLzn5WpJ0OxgmH2vkh7APoOqCGaRVN7fbP4hfGnUPT63lb2O87rMGdk4RE4xpKcYzABQ2MhfjmOr_3FkIJtbxITjKFXrZRVlI-An9WPxyiN-QtsOJjpxV0baaFxLqDmy_TnlrW_FLKnXYXkTNBbxlIifakqN_m9fPpBaaaMJF_KetoeIUtNQIoHYTtcIQhNoelJ8-I28gyVM1-9w61Ew">link</a>]</p>
<p style="text-align: justify; ">We are also hiring for two full time remote positions:</p>
<ul>
<li>Research Associate: Access to Knowledge Programme: Apply by <strong>August 13</strong> [<a href="https://4jok2.r.ag.d.sendibm3.com/mk/cl/f/tn9z7DynIuxWFSSRGmZ50s_HYg65AwLX75HcYf9qBiEJsrkj6teE0WzDGHWCezRU7S0d4Li9WxClerez9wuhwJFHRpki4ynQYqrFoAh7dKnqJKulAW_7VyZIrgxsBri_sYFlGanbqT0IW-9HdYDbVbqyjvgAUl06_OlaHwOMDzO833kR5cT3BwaLUSDOhZqfFvwVNZav-DBH1q9Kr9bWXdtPe_g_wDm-PW3lMxudyF7SKkCLrGceKAec1QiU">link</a>]</li>
<li>Communication Designer: Apply by <strong>August 20</strong> [<a href="https://4jok2.r.ag.d.sendibm3.com/mk/cl/f/lskNSP_MjDCNYOT2PmiuZiGB29gga3crwxuXyJYEF8rdPYDDerNnNYnnCV-GG8rdnyqkxU4eJofgQXU1-iS2IPRRGRRtBXXEaUSVB3mioQNSRwwIecWmm2TIFkfi2fAL7grkxRKKKAX2PG87TiWk8hdmOUqcqtEX9dqbsudTQ3xgmZOio5BOC4GL6mxMzN_9Q5_YzOzZxSZzpT7SMm1J_HASTKNuUktcaESwbMV7PO5sPic41ymaDT8">link</a>]</li>
</ul>
<h3>Cybersecurity, Privacy, and Emerging Technology</h3>
<ol>
<li style="text-align: justify; ">Following the MCA notification <strong>mandating disclosures of cryptocurrency holdings</strong> by companies, Aryan Gupta, in an issue brief, discusses the policy landscape in the United States of America, the United Kingdom, and Japan, with particular emphasis on <strong>definitions, accounting practices, and taxation with respect to cryptocurrencies.</strong> [<a href="https://4jok2.r.ag.d.sendibm3.com/mk/cl/f/IapPj_hXCzk7v6Hf21yy36-Sz8hRKHv8zkjWHYoTB7Tu5pnKDAw25QMx5zjerDAadU3BAHF2npDH_q9m81nhsGEbEBQqfWIksFuU7FqAIoREOxap2dkrtGy-X49B1okL_K-zz4zOgG1nyg6ct03r-xSZw_C94Cc8MzubQ2tzmsZjEYGRlxHywlK8a7988SepnX7wbWd2aDt6rhgDNxSBU6AJh3DeygvFctc-wWW9F-Q5e81ADlC9Xei9IoYdHlJrbvOMikdM2WlvJLzb0vnVlDJqd_7x4B7_XdshOYFQ4YRljV4O">link</a>]</li>
<li style="text-align: justify; ">We submitted comments in response to the Supreme Court E-committee’s draft vision document of <strong>phase III of the E-courts project</strong>. Aman Nair, Arinjay Vyas, Pallavi Bedi and Garima Saxena submitted their general comments and recommendations, and comparatively analysed the <strong>integration of digital technology into the judiciary in both South Asia and Africa</strong>. [<a href="https://4jok2.r.ag.d.sendibm3.com/mk/cl/f/a-ADiN4WA0-BN9-GzZs_TH-rDZ6m1ii-4HzEzLfXdwVXmGyrIYBcuU7EMPd865oDaqEYSihJoqjxTyuC4usIwryJorATCH47YWEUlUAXce8b2TudJcdAsWryfDvls0WhJFQ9TTw4Bt5ZPfdDmToylNX9ECLuOvO851uSycsDHetWiQhQXaDELUcbQKXBZEbhxtFos2ugg4PHwLXNhwM9iKMb1Q-4OuONy6YcnpFcB3fVUeLvWVp4aBEngQVUnvfLfeVdMvGWNoDk">link</a>]</li>
<li style="text-align: justify; ">Google’s new Privacy Sandbox platform promises to <strong>preserve anonymity when serving tailored advertising</strong>. But does this new framework help users in any way? Maria Jawed’s analysis reveals that Google’s gambit to <strong>reorient the ad-tech ecosystem under the garb of privacy</strong>, ultimately ends up undermining it. [<a href="https://4jok2.r.ag.d.sendibm3.com/mk/cl/f/pwRhJ3bFqQSxSMBZ-qNYKO59aoQ95F8ro9x-8vBy2QDQiBpNFb-qLH4I8Ph-o65OT_bJnNcMoJzFBig6nxqFFcT7qtvR0b6bakvkH4pQRJalgbpLCylKEblBaFkiAudZPamJaz7XIeQ3mMQNQcnk9jxhjGW4yu6YFB8-h_G4nYcZg9lJCj35EZMG-bdl79YR6VEUb9jVxmNFoDXuTiUBCHjeSqP8yqPgHS40nzZgSqD7JMoGiSPT6G7K1xwQUBQLKzlCjKGGoaioxOOWS7qw8BrAQtuKIc4xxRvos-IkyJUA0g1W8wUqjNK7NvYR">link</a>]</li>
<li style="text-align: justify; ">Pandemic technology is taking a toll on data privacy, especially in the absence of any legal framework; these tools are being used for purposes beyond managing the pandemic. In an article published in the <i>Deccan Herald</i>, Aman Nair and Pallavi Bedi argue that <strong>India’s digital response to the pandemic</strong> has stoked concerns that surveillance could pose threats to the privacy of the personal data collected. [<a href="https://4jok2.r.ag.d.sendibm3.com/mk/cl/f/Aye_SwuSiE165Jg5KCM8Xlu9VfO971hqjgMyX4Gv278-mjdbOrJ-pT_WYUbbFG0344IvZPu_ZqcvDp0hcVjfGVaWGAhKvBZDinhfhGSD7VvAE53bWwBah-W8vKt_3F0VP70pUKqESr5WztG-fPEOtB94MghogG528WknuMCtyA29jFZg7JvA2Qy1mR4MHAwQq2tJjvzyA_woJHqaQ2zW9at0DVmsSszAoApTe76XUE-ZoPMUtpNXT464bp-CYx1vY0jeFHyECbR6gHkoBNl-h4pwjkz2i9yOaOntXmNuf1kTX2ARhZpiMNjSmnYMf_5K_vEoGzQK0w1N6CuYG9dHLX2l">link</a>]</li>
<li style="text-align: justify; ">In a piece for <i>The Wire</i>, Aman Nair analyses <strong>Tether, a lesser-known cryptocurrency</strong> that is at the heart of a $3 trillion market. Issued by Tether Limited, Tether forms the foundation of modern-day crypto trading and could potentially be one of the <strong>biggest schemes in financial history</strong>. [<a href="https://4jok2.r.ag.d.sendibm3.com/mk/cl/f/YKCj-XnMRae1xKW-I5Vc2QZ531_WbOyKyzDAaHwXjqatVsRL9KTiy0LW50cP7Thc5zIV1vTZpRlnJuXzfYGNyOH92MtVSacioSMhehA-8TpG62qt1HMjOndXVcukp5TrJ_Z4jhyr_B0qg7hItuk5fJ9-Kw1Hh-SiRjvYGdVX_ZD2dY8NxTfKn4f7GnqP2bzHT3HWNO9yPzA6KfVPSawYFVLyyIf46leO7oJ5SIKyT4MawaPTtu9FDH5nfhMMgdm9YIFYIkuc12ZF8vargG4gMd608s5mt8kg1hpub4d3pi3o">link</a>]</li>
<li style="text-align: justify; ">India has 500 million internet users — over a third of its total population — making it the country with the <strong>second largest number of internet users</strong> after China. With this comes several kinds of digital threats that an average digital consumer in India must regularly contend with. Pranav M.B. attempts to identify the <strong>existing state of digital safety in India</strong>, with a report that maps digital threats in the country. [<a href="https://4jok2.r.ag.d.sendibm3.com/mk/cl/f/7DnN6eodtvhnJdNwrTh3BU4_wJCm2_Ct9eG7-nmis2QkS4qgiiX4--Qa0TTqxqJqUNHmn3xnedwSoNGVRd0smQAgaFGQ1PLpfwVhmYPO4vaXGiF0dkcRjZTHk1W5mCRTZ4CpIx2zKt4yn1WKAy3dIBxa-xnoEQMUY4YrZRqeQr1M_JwHV3KmHWG2J1CgmXUdY13h6bQ9QEDL16a5G-eN6zH8ttyLM2kXF30BnXgkAL11Sl_vZs9AdeR_UoDQJKObf3BEoq8">link</a>]</li>
<li style="text-align: justify; ">Since last year, there have been regular questions around the <strong>anti-competitive practices</strong> of digital platforms. After 46 US states filed an antitrust case against Facebook along with the Federal Trade Commission (FTC) in December 2020, Kamesh Shekar analyzed these developments in a blog post. [<a href="https://4jok2.r.ag.d.sendibm3.com/mk/cl/f/svyv1CoITzbqrsIl54oOKHsVb5xbZsOjr-IIfJndIFs4FbasMTa8xPr308vsVz_owTEDCl52kc-B-8gqND7dedFPmINs25UkG8kwkeYNcktOKUUty9Zms5UqyAXnyBUFkrbccLYTL8X7DtYXy9UCoLj6i9kGiUgJyNR_ePM-32LsWT2dzMRvY3MLjtyTTeWzqv1kPYcud-kpCxX9zMid4KJZIY7fJSLCsCPiXvrcc5RjQ6wO8SxOlNzRwDLztrG9MlWjBAOom4m32Hc3Az86wUcL5h_dTnpcqiHVCjudMiD2Wz9hKAcXbBF-mMlrTS61GXYC3B9PEMLilqy1XdCSLA">link</a>]</li>
<li style="text-align: justify; ">Recently, the Indian government mandated <strong>online messaging providers to enable identification of originators of messages on their platforms</strong>. In an academic paper for the <i>NUJS Law Review</i>, Gurshabad Grover, Tanaya Rajwade and Divyank Katira conduct a legal and constitutional analysis of this ‘traceability’ requirement, how it can be implemented, and how these methods come with serious costs to usability, security, and privacy. [<a href="https://4jok2.r.ag.d.sendibm3.com/mk/cl/f/7VVDI4qoefdH1M0wYht5ypELl3sgVp1Sbz2TM_DsnX0l0o2wb-2Jq0wob7as43ltZn6ZssVx21Kb6WNIz16SwxuNYxLMwFaVL7Yqu-8eX3FzktAgtzePud71Rw38aDqYPUcb7aIzIkcrEgohiTTqr4KBZglu-g5Vc21w3pwXDKyjSXh_jk_8EIqLlZ2GF5ItEZspJwQGD9VzftHVEmz5AdqcK0Zcar_OOU9nGP8JrckN9xehbcAxzJ9V7lbKaLa6fVq_xbwLO2UqdClq7XIpCoUf9EgkKQ">link</a>]</li>
<li style="text-align: justify; ">The National Digital Health Mission: Health Data Management Policy seeks to establish a digital health ecosystem by creating a <strong>unique health identity</strong> (UHID) for every Indian citizen. Pallavi Bedi points out that hasty implementation of the policy without adequate safeguards not only risks the <strong>privacy and security of medical data</strong>, but also undermines trust in the system leading to low uptake. [<a href="https://4jok2.r.ag.d.sendibm3.com/mk/cl/f/I2XtCVqE0YUtaHHNBuG2SqhPciFDA8vAFssL8OFfrAIIw4IF4i0pC5aKw-bZofPUZI2o59tp6OVhScUGULq-yqLWvlZRi8AvmUhsS6gOvkWJJnC3Jpjyu5u2I2wysy-Q4Kt4TAOMgvcyr49ledwzRKHEo0lsRhQdFZ4VJMq10oyuB5bMF0vIWCJ3VqXUrb41hRJI5OUhxzXiGZmznPSy0p-gua0i5SvyeIn-uZTQjOFvdP5He9mT3HSsaw">link</a>]</li>
<li style="text-align: justify; ">In our comments to the proposed amendments to the <strong>Consumer Protection (E-Commerce) Rules, 2020</strong>, our analysis focuses on eight points: Definitions and Registration, Compliance, Data Protection and Surveillance, Flash Sales, Unfair Trade Practices, Jurisdictional Issues with Competition Law, Compliance with International Trade Law and Liabilities of Marketplace E-commerce Entities. [<a href="https://4jok2.r.ag.d.sendibm3.com/mk/cl/f/KsxrVD9CtofFFSJKNnNl4rbZSQJxomJbHYtB6gaF-CJrz6NTc3iLI__BZ3Af7DRwDzklM6bD3o3OU8Z9g2llAOWtrNsQdWfxmaky4BZfyHArp59Ciryun36-inqvCvTtCz4MfM_SxYe7DWZQjbigMwPTuyM1nTjfuZZESbCU0kHL5uxK09aQvMmYUfBPfBjrUuCPSnz1q_SHSOh38kHHRw6JdIuOl-FX_Fu_pSAFCPpBCjmoqiyRpWbgQQw3C8dbSnJ9sMWXbopXwWS99f4vPqMGK6Tn7w6tWJqmQa8hA3wAQsH8wJgl315nOQ">link</a>]</li>
</ol>
<h3>Freedom of Expression, Intermediary Liability and Information Disorders</h3>
<ol>
<li style="text-align: justify; ">The recent “Infodemic” clearly shows that <strong>disinformation costs people’s lives</strong>. CIS, and the Global Disinformation Index have published a report that examines <strong>the risk of disinformation on digital news platforms in India</strong>, creating an index that is intended to serve donors and stakeholders with a neutral assessment of news sites that they can utilise to defund disinformation. [<a href="https://4jok2.r.ag.d.sendibm3.com/mk/cl/f/oAbyvMS6qTJApmJnnokcclFKfhiXT90qwxve7vAzjNgoVJE7zL3znp9z-jVBaY_A_UghvzrqrbzPyQ8MWgNOqFX_zmz-LXX_QXxpTHcJCq0iQbudFAskKA4MQbW9ipPMHHkvCZ4sjD9YJ-f76ZHCOVs8aTp09SRza6UxxFqz2Lf-wyXOBkjjnSojLEnIzg_6Xyg-MV80GnR0MyptpLT6Ox44jMpuKSDNkziRqXdVFv2UiHFPUq5_kQFItEunUPazzjbXiO6aT6InqGhlHTpBpFR1ojSmP1YOtTCl7efQ-b_jHIbk-BBXDoDE4JF-TskvA8NvEln98dD-0ADQRopsvLp9XWDGiQ">link</a>]</li>
<li style="text-align: justify; ">Torsha Sarkar, Gurshabad Grover, Raghav Ahooja, Pallavi Bedi and Divyank Katira examine the legality and constitutionality of the <strong>Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021</strong>, highlighting potential benefits and harms that may arise from the rules, and making recommendations to retain the rules within constitutional bounds, and retain consistency with human rights based approaches to content regulation. [<a href="https://4jok2.r.ag.d.sendibm3.com/mk/cl/f/xeCVOWx8opFVXsJsk8tGp7BqtYUkK2zovJDarS6GLbKTR6VL0JLLSA-ap81tloriYQLLg6Cv1HxAws110HUv2UUabdK0aCbOvdeL2AtTWGD4zL7LEsC1gAIHyvP5DCYWo8flbZwKL0UNrMa-Bp8mmAOPTNTaHHyHjt6SyvidPNrc2nvjuwWNDsgPITp_PBAYDBmfwu02GfVr14URroyiEeqExwha0b0RlSPhrunshSDIXND6-AaBkVuGJ8VdnE-bMD7FHdAa559EsTcyhmnPiIYanR9fmV6UQHb7Q65yD7jENV3-lbzRCkAjki09Qvia1nxacxBIWHb-w3_PlbB7GkJXbl8_qVZHEWhyzTnAxVoGA-je-7W-x-eFOetThpo">link</a>]</li>
<li style="text-align: justify; ">The passage of the <strong>Intermediary Liability Rules, 2021</strong>, has also formalized the legal requirement for the utilization of automated tools in content moderation. In a blog-post for the <i>KU Leuven’s Centre for IT and IP (CITIP) Blog</i>, Shweta Mohandas and Torsha Sarkar analyze the requirement in light of concerns of freedom of expression of Internet users. [<a href="https://4jok2.r.ag.d.sendibm3.com/mk/cl/f/kfCCqzfLNuv79Hdeo_EA2wt5o0LRgortN3TKK_wup26r0wlpxdBW0C-m_IDPDssS9Ie8vuBmq3TrK6Bo0jfGRs1qD89TEU2wzVysBv9kAjUiosw2pXQiNir2ylQAnNBxnwyCe_qibQIf9UOGjlvP8d8iB1XZ1QPqQUl_yHKFDrPUme0OS2EUpis_rSoVy1ZOfH-GGHo7iNYRMcqqjbmCKtfZjmLvWY86v2Zk2EjLPXr8OA">link</a>]</li>
<li style="text-align: justify; ">Our comments to the <strong>Cinematograph (Amendment) Bill, 2021</strong>, authored by Tanvi Apte, Anubha Sinha, and Torsha Sarkar, examine the <strong>constitutionality and legality of the Bill</strong> and whether the proposed amendments are compatible with established constitutional principles, precedents, previous policy positions and existing law. [<a href="https://4jok2.r.ag.d.sendibm3.com/mk/cl/f/Ao1Sghs95JSFnpzMq8bTUYQ0z1F6uZOfg6M2Stt2ceVvCf4b0iB_3f-Yx7uywoASrATvOSS6uPYTVbP8x_JLqoD9QfvjD5soYvlNJBd87FuNyxqAb4wQ8cjOuN7B44pRo65xvX9K29eBGFp7fgv-AD_ok80j4SXnAZ6LrYClxPiHC48fiisVOW7McLfsFpLtUsns1u6MIG_7FMAKNY0GHFxa5xs3lM21mrhkEcC6I7sbimtF0jmOkid5nzYbcOrtQ5ZsvrdxSRllmmOy">link</a>]</li>
<li style="text-align: justify; ">Tanvi Apte and Torsha Sarkar, in a submission to the <strong>Facebook Oversight Board</strong> in Case 2021-008-FB-FBR: Brazil, Health Misinformation and Lockdowns, answer questions set out by the Board which concerned a post made by a Brazilian sub-national health official, and raised questions on <strong>health misinformation and enforcement of Facebook's community standards</strong>. [<a href="https://4jok2.r.ag.d.sendibm3.com/mk/cl/f/h-QObkDu8td1bmkfzIEHJlAmS10MohQnXiyqHQKNEnQkEpvkdTxLkKV3yJO7CcTJGDcS0kRQVTDEE8KNbb-551uGYLiaV3wFoxJ9tGnvMBaqvtPgYgxZbnAMOowSxN7gQJTqSOZwzMVQtSbr449f6KC0Bb208ApIh2a8OX_HCRwn2BYpoTvqUfeyFZyp2qoyW5LbeAe9P-JTlFrDaB7oFBXvTHvlJfTRrT6ZeLlkQqA_RqMOga71-sxDIxBo0vvn-9r28DcTePg3p659lJ0CWQMCXiz4tY1p3cLrJgKl3K3fjignnvexZpNwk91paBQ_Bia2DDUxc1Vxmvci1p3AASg3FtYqL5l1">link</a>]</li>
<li style="text-align: justify; ">In an essay for the <i>Indian Journal of Law and Technology (IJLT)</i>, Torsha Sarkar analyzes issues rising out of the recent <strong>litigation between Trump and Twitter</strong>. Torsha examines intermediary liability issues under American law, and draws parallel for India, in light of the ongoing litigation around the suspension of advocate Sanjay Hegde’s Twitter account. [<a href="https://4jok2.r.ag.d.sendibm3.com/mk/cl/f/JxA_S2DzStQUHeEVzf9_Df15_QnK0WHgMEjaaCqNjLmfXPAS4teU_fvrDtG9R4OwwOzWYiAXWPE3QFaxOZvJ5VCHuwincnLyGpYpWME0K5x8CJwyW0vUhC-stExhsSV_5pLmEtfaVyzcGRaXsJ4jGnLWnrADSdYzpPjUTPAb6hKDDL5BBjLjzvRt14_y3_9RNos99UKlpOCv9UFR6gC6cmOQmqte1UICPRw54oI7TUMC8TfPow-JZGmeA8lmMtODPi5dPN91euSX0g">link</a>]</li>
</ol>
<h3>Copyright & Access to Knowledge</h3>
<ol>
<li style="text-align: justify; ">The Indian Parliamentary Standing Committee on Commerce’s report weighs on several aspects of the <strong>Indian IPR system and issues of protection and enforcement</strong>. In a blog post, Anubha Sinha summarily notes the observations and recommendations of the Committee on the Copyright Act, 1957 which stand to impact <strong>access to knowledge</strong>. [<a href="https://4jok2.r.ag.d.sendibm3.com/mk/cl/f/20Alo2_Tse_JJBXG7sp9tp3Jf_qIUy2ksAvhoVH4heonMxDYRQK4nweCNF8LP29mpKvznQC8vljEX7TCv-Wb6SQREV5ph4uYOVIgz4wf36MaGTw8T5dkCxjqttA5V1tzNxdpfKi1WqQJKSFJ3o9Eog0uVFhHd3wXaYwiukkD3WHoDeYkOSZR_DYTGlm6nebmtCjaRRhTqwGMPYkZsKxM2td9xO2GBfP-J5R8llhxsrl1MvaUyiRBLIASh1l_KNpvCtlix-3Hot2VozymMTWyPG15W6s">link</a>]</li>
<li style="text-align: justify; ">The 41st edition of the Standing Committee on Copyright and Related Rights (SCCR) organized by the World Intellectual Property Organization (WIPO) was held from 28 June to 1 July. Anubha Sinha participated in the event as a speaker and delivered statements on the <strong>Protection of Broadcasting Organisations</strong> [<a href="https://4jok2.r.ag.d.sendibm3.com/mk/cl/f/VysBbmMrMfJH2U5C8TeeVWtBq8wqBadivgBYyh26sNYegYdfaR4Tg_G6v1FqMgyVD6KAm3Z1tKWm256qR0VlPwGircBtmecePp2_-24cYoFWCoDH5v_5MuytzvKUIHkSlZ4cXN9CtUZ9t-92oeqAe5qm_CDhT0Xu7G5OZKn1_9s56JlL7E9FiWa0U5l2PYeonXi9H026DNWNaOPHQ8nvvYlmvIcTkwvKWQ">link</a>], and on <strong>Limitations and Exceptions</strong> [<a href="https://4jok2.r.ag.d.sendibm3.com/mk/cl/f/TBrEeBXDldm7nDPpsENoKMft-G03I54LhjmedXzSkg1RPImWfwqhCZ7bwXpwsXbIuVvOLd7G0RtA7PgCDKqHKcYjWzHr1K8Dd8oSUYIasd8N_tlEiMedkl8eTmoz5Cm_cLV8NlYLzIbsrHCxZhhPUApqXJprQ39qHf89pyRS2Zcw1HUYW8d-rVWobmlbW4MVr0EvBz0gbWpz3NLbh9W71pVK1VN9j-ge--ine3yx-uSoyel8qUGs0mPqw0NXp0nEUnIP32r3qHvdjzEbz4Ynagm2ww">link</a>]. Readers can access the notes from Day 1 [<a href="https://4jok2.r.ag.d.sendibm3.com/mk/cl/f/W_H8QjZ4FUv92dhzAdWKRTS508l6DEy7YOb8mnsf-ZzcQeMZe8TCW3XG5Fs7j1BO678zXMJn5jZiXL2eI4ZVNjrE6Sz8XcQs5fJ4z1EZSQTr-vMsaJsroyckdwmtQnOepz5KMLPZl4OnPm6ERcnJGBCVp6v7PZgpxVBGp5PR9Fo4e_TncX2qm_q_aB_e9s3I2vp8PReJJVYoEl53xIqWKkBqXlWk2RbqOQ">link</a>], Day 2 [<a href="https://4jok2.r.ag.d.sendibm3.com/mk/cl/f/DRaLcVvuB-VfY7fjrVtjA5hPHTFt2KwIt2hsH4mjuuYlzJLCv5r9O3R5-4Rg72Bhvw3kMYaowZuZorJN8DXJjhf5NABvf519ig4SyCsIUri4mXWjDA1lmCHY_Oe1WfTq_VLVxwOb4XYp8VVnKIIcgAg1kseXVSENaugyRZI3otS_IUn_zNwEkw2PdFEojqryYcf5kiEADKQ5sRuVH8WB9pncRKgCvpOfFA">link</a>], and Days 3 & 4 [<a href="https://4jok2.r.ag.d.sendibm3.com/mk/cl/f/dTkOebRyoXNDfdFetpwM6-mmRSpH7gwM1RL-SJmGMrbF25H9Y4-lo-nQ8HINcrM1eUmX9nqvpmoL26wsIsbAhOJ3MQygMDJpTQc-RNGk07WOUyH4GFUuejBJzsRBkQn44CEDxkcSQBzyLQHGjKakTPDRFszrjnLqD3e9jXfs77ie7wKRazrFjyssNPscxSg8xmrcfv89klVCo-Ts6ApD6nuRi3t0nndX2DAQ_hw_WlYLCgfmyw">link</a>].</li>
<li style="text-align: justify; ">The CIS Access to Knowledge team published a comparative analysis of two prominent Wikimedia initiatives, <strong>Wikipedia Asian Month</strong> and <strong>Project Tiger</strong>, to understand prevailing challenges and opportunities, and strategies to address the same. Nitesh Gill in a two-part report outlines the research questions and methods of this study [<a href="https://4jok2.r.ag.d.sendibm3.com/mk/cl/f/HZI5YNgRhNViR9DS-ewrTbGX-5PkynXGEMDr5kfCauCk2OYuygd2I3Da7Tp1kyhG1Oboc0MxIelbvOqpVQHHq0JVRgbyEVMPZiTWPhQENwnv_pfOR8KYHZzzLKv7Tc-iFk6qBgCCDSbnwjmA7sfiC3FDHFvqzbEGlMMUIg1XvcRNu6fFBWe2S1W5lsdZD00dY0r-w8o3IkzCSbKwHqJMld7CQvl48lpzGHtKFreKT_MiB33iis0Fehz-nrz7DlT-k2GLTpwScqX4DcHrLjWb7A">link</a>], and then presents some of the observations and learnings [<a href="https://4jok2.r.ag.d.sendibm3.com/mk/cl/f/bdLNf3_CCDaXpSzzhYF_2ThcU-LuTFb6k6HDcZ_4myjIWm-GlwXcDVQweGpaYjKKt4NmMol-HxoPucMx6w3-HC4QUmPULVJ882x8AMHaRehpgFh9t8cYPB6VPyjXNgcbzjSfOQXE6GpUDhrGYYg6KTmuH6t7F1qlOcoc_qlglL4vz5yCBL8Ri03yfZZVcfheY5Ly5lUb3WSZMpsO1u6n6KaRC_YFemwGu0sWsWgjW-XPRSNAyxHKeGLlUS7eN7wNvx-iLCLb2-VhEtN64QZHaxUd724J8Fg5">link</a>].</li>
</ol>
<h3>Labour and Social Justice</h3>
<ol>
<li style="text-align: justify; ">In a flagship report on <strong>domestic and care workers on digital platforms</strong>, Aayush Rathi and Ambika Tandon argue that digital platforms are complicit in discriminating against workers on the basis of their identities, and that domestic workers continue to remain in precarious positions without any legal recognition or support. This work was jointly authored between the Centre for Internet and Society and the Domestic Workers’ Rights Union. [<a href="https://4jok2.r.ag.d.sendibm3.com/mk/cl/f/sm3NIXtD7ClOE3mjbw6fg2ZvZB0TI3dh6rnb4vb6Hv0Ev_VwikRY-XOESwuw3-Gfglvi7OHT5l-PthXPf2rn3UDbiRRE3jaRzidnzl5uPs6ZqdtktRRVINgR3CCtZ-grN_QKqZN9KefjfMYgB7klWARTLAkZbSsKmoyrLiIZ0XMVXkYWu_F1do2eH73g_cTDDyKJiQiq9wWsbLzwjsEWoZ1uR0H2wqUp1ZOfkEyfkTbU0YojEnLVenrB-X7HDp812pjRMqHbw1qAskYpol6w_Tca">link</a>]</li>
<li style="text-align: justify; ">The ongoing pandemic has raised very valid questions of <strong>access and infrastructure in India</strong>, especially during a time when the Internet and digital technologies are essential, and in many ways the ‘new normal’. P.P. Sneha and Anasuya Sengupta write in <i>Seminar Magazine</i>, outlining some key <strong>challenges in digitalisation and representation of non-dominant/marginalised languages</strong> on the Internet, through reflections on two recent projects related to languages and the Internet. [<a href="https://4jok2.r.ag.d.sendibm3.com/mk/cl/f/iWhSEkwBqINHVVX-zy-cEtFRkWyCSoGeumeW2KNYU8gylOUgjNWiIceMev9vAcoTdrNvCoBtuZKcHSmrG3oEZ5Wypr7VRmrecPMNbuxUDoIF4FJGIlzAPeQ8dpdyeeHeQqANiU3oUN2xKTpRQ5Tin8PUoWRfMm5YXh_iougUbkun-Tq6NSjRkmvbiWXeZyphO9R44QWTrxDm2wWOdlCh2reGxocxbpNMzDPlGmxnA18sMsFi73SksnR9lQh76ylSM2iIYr3ptZk61DznsmUdfr0BK-GQL7HcD4M">link</a>]</li>
<li style="text-align: justify; ">With the onset of the national lockdown on 24th March 2020 in response to the outbreak of COVID-19, the fate of millions of migrant workers was left uncertain. In addition, lack of enumeration and registration of migrant workers became a major obstacle for all state governments and the Central Government to channelize relief and welfare measures. Ankan Barman compiled a report to <strong>qualitatively assess health conditions of migrant workers and access to welfare</strong> during the first COVID-19 lockdown, in three host-states, Tamil Nadu, Maharashtra and Haryana. [<a href="https://4jok2.r.ag.d.sendibm3.com/mk/cl/f/hU5-1FD3nbo69KurjQmXES36QSFtRZSHr4FuCzsscEMQOUOZD523Cc-iKliMQQWvm7AFZQ2JJtrcPhNeqoAS7ASS2X0_c9D3D_yvS9IuqLpt_xHpSUdVxnh85ZSVlSr07zj4mucQogJy6c2ZHw6zgQAmLQGkcl4xr__txUaycSpVKrqmHcBb3RBw2YkBTvxRfFnll2FcPmmfFYhGf1_SGM1baLyoZscYZ96h-AB1tHzg4Lao2KfFIhJ-RxHtC67r1nytTWNCRy8pY4QWmx2g-kBw0EAD4vl94LmPX10tdqmvBreDz3xxfN4o9h0OHfEzZARXb2dQFnHltqvRjPq5msyzW69oXuZZsDs0pcS6yYA">link</a>]</li>
<li style="text-align: justify; ">Between July to November 2019, Indian Federation of App-based Transport Workers (IFAT) and International Transport Workers’ Federation (ITF) conducted 2,128 surveys across six major cities: Bengaluru, Chennai, Delhi NCR, Hyderabad, Jaipur, and Lucknow, to determine the occupational health and safety of app-based transport workers. Findings from the survey have been compiled as a report which <strong>reveals the complete absence of social security and protection of workers in a digital platform economy.</strong> [<a href="https://4jok2.r.ag.d.sendibm3.com/mk/cl/f/J4FjrBD647MV8lneM-mPFxr7IWwYeETEgk17OI3lDkqNVRmfoRqhmAs1CqZXDQx-MyEntGeO7vOMUu6lslvGQbMg4Pp6Gvpz7GaUrXiOXti7YGBNPHMzLCP3BsDeYstDOYNs6Rry3eMUvPI-mV1kh6aNGWf_WlBXjwoevFZdwmt660vTJbRaUGuI1Cc45TFmp3ur5qDJNg3vaTXElkuEvo7Dz9rPcEHOTDNy-k2LW3cX9mOB_QNC5yt4sy0CCWvf-2yHAYa_2j6pVmVx2PwbbSrfMfSdK0-WL1PSZpcAHlqcRVU05C5Js__byzmLjmWUKO-kMbw">link</a>]</li>
</ol>
<p>
For more details visit <a href='https://cis-india.org/about/newsletters/june-july-2021-newsletter'>https://cis-india.org/about/newsletters/june-july-2021-newsletter</a>
</p>
No publisher | pranav | Internet Governance | Researchers at Work | Copyright | Access to Knowledge | 2021-08-10T15:57:16Z | Page
Techno-solutionist Responses to COVID-19
https://cis-india.org/internet-governance/blog/economic-and-political-weekly-july-17-2021-amber-sinha-pallavi-bedi-aman-nair-techno-solutionist-responses-to-covid-19
<b>The Indian state has increasingly adopted a digital approach to service delivery over the past decade, with vaccination being the latest area to be subsumed by this strategy. In the context of the need for universal vaccination, the limitations of the government’s vaccination platform Co-WIN need to be analysed.</b>
<p><span style="text-align: justify; ">The article by Amber Sinha, Pallavi Bedi, and Aman Nair was published in the </span><a class="external-link" href="https://www.epw.in/journal/2021/29/commentary/techno-solutionist-responses-covid-19.html" style="text-align: justify; ">Economic & Political Weekly</a><span style="text-align: justify; ">, Vol. 56, Issue No. 29, 17 Jul, 2021.</span></p>
<hr />
<p style="text-align: justify; ">Over the last two decades, slowly but steadily, the governance agenda of the Indian state has moved to the digital realm. In 2006, the National e-Governance Plan (NeGP) was approved by the Indian state wherein a massive infrastructure was developed to reach the remotest corners and facilitate easy access of government services efficiently at affordable costs. The first set of NeGP projects focused on digitalising governance schemes that dealt with taxation, regulation of corporate entities, issuance of passports, and pensions. Over a period of time, they have come to include most interactions between the state and citizens from healthcare to education, transportation to employment, and policing to housing. Upon the launch of the Digital India Mission by the union government, the NeGP was subsumed under the e-Gov and e-Kranti components of the project. The original press release by the central government reporting the approval by the cabinet of ministers of the Digital India programme speaks of “cradle to grave” digital identity as one of its vision areas. This identity was always intended to be “unique, lifelong, online and authenticable.”</p>
<p style="text-align: justify; ">Since the inception of the Digital India campaign by the current government, there have been various concerns raised about the privacy issues posed by this project. The initiative includes over 50 “mission mode projects” in various stages of implementation. All of these projects entail collection of vast quantities of personally identifiable information of the citizens. However, most of these initiatives do not have clearly laid down privacy policies. There is also a lack of properly articulated access control mechanism and doubts exist over important issues such as data ownership owing to most projects involving public–private partnership which involves a private organisation collecting, processing and retaining large amounts of data. Most importantly, they have continued to exist and prosper in a state of regulatory vacuum with no data protection legislation to govern them. Further, the state of digital divide and digital literacy in India should automatically underscore the need to not rely solely on digital solutions.</p>
<hr />
<p><span>Click to </span><a class="external-link" href="https://www.epw.in/journal/2021/29/commentary/techno-solutionist-responses-covid-19.html">read the full article here</a></p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/economic-and-political-weekly-july-17-2021-amber-sinha-pallavi-bedi-aman-nair-techno-solutionist-responses-to-covid-19'>https://cis-india.org/internet-governance/blog/economic-and-political-weekly-july-17-2021-amber-sinha-pallavi-bedi-aman-nair-techno-solutionist-responses-to-covid-19</a>
</p>
No publisher | Amber Sinha, Pallavi Bedi and Aman Nair | Digital Governance | Privacy | Digitalisation | Co-WIN | Covid19 | Digital Technologies | Internet Governance | Technology | E-Governance | 2021-08-10T15:34:06Z | Blog Entry
March - May Newsletter
https://cis-india.org/about/newsletters/march-may-2021-newsletter
<b></b>
<h3>Cybersecurity, and Emerging Technology</h3>
<ol>
<li><strong>Doctrinal clarity</strong> and <strong>institutional coherence</strong> are essential for a robust cybersecurity posture. Arindrajit Basu and Pranesh Prakash analyze this in an opinion piece in <em>The Hindu</em>. [<a href="https://4jok2.r.ag.d.sendibm3.com/mk/cl/f/mkAIQo7C4IZmt9JYL5DoADKYnQqxm9fka-gdBSvoA81rsg6GEgy07tjzn0qNQvz4PxT4dYB5ZeNQ1Bbi1ubYUR0z6z8dy3e5FK9grxNzzgZSO0IUwVPm8behwp6dBjhS3_xc9_d4Bz234TH-U0qMpqF9sJzKUGtQ7MZi0hnzsUaVhsA2VGsqoSC3xrrr1cD9ZX8AlcPmIR3uj5moIhV9EfHcU2EHOQqhu6OCGcfuUBS-tgGe1iBvbOikAjEWMJin4Q61Rd8p31vaLtqTwVe2uw">link</a>]</li>
<li style="text-align: justify; ">U.S. and Indian decisions about <strong>Huawei</strong> have implications not just for their separate relations with China, but the <strong>U.S.-India bilateral</strong> as well. Arindrajit Basu and Justin Sherman co-authored an article in <em>The Diplomat</em> examining Huawei’s role in India [<a href="https://4jok2.r.ag.d.sendibm3.com/mk/cl/f/M0GGHsg5EtZWdtPNqwbeCiMiN7elnvi6aLYTpAVn0gw7se-z20XDgj6jfb79INZxyFmGtDXDcD0pf_RfRo3K_RyXEav9HKy_gV1G8nDVPhoN8Kp2G9-NLUeUCXxW6WYbiyyWDZdKwxzd4PsyoxybVKoJ9XH7JhsVFDPhN0ySqc8Mi6MD0zq8q_CRT9dDkdCC2queRjZdcOr4eoC8YPjU-LVpaxJGge0rOaPrYmM3oe__OoIjvA">link</a>]</li>
<li>In an article for <em>The Wire</em>, Aman Nair points out that India might miss out on <strong>NFTs (non-fungible tokens)</strong>, which are set to become a mainstay of the modern digital zeitgeist. [<a href="https://4jok2.r.ag.d.sendibm3.com/mk/cl/f/wKv_Gt32QSHdLE3-ykqX_8DMhA2QohVdjXJn-C65rBN_0nsI9LCIhp3WrANkb-8cDzw1rSkKGrJ0gyPwV_p9aqBIOu3ioMRLjQmVdwMwcVH6nVHELvDJiebOfI5HgW0DS2jvjYUGiFNuBE4y5k7D6hcdEnmRXZ0cGaM-VT0qPJcw28gDhe7eJcg_rmvGhHbJBm_h0VnZfNJyjqZ8CFoiIU0z3QaGDqk16_gOlCYYR98VTEehLBYUs8ymz6Fggw">link</a>]</li>
<li>Arindrajit Basu and Andre Barrinha co-wrote an article for <em>EU Cyber Direct</em> <strong>on outer space diplomacy in the 1960s</strong>, and on why cyber (security) diplomacy isn’t progressing as well or as fast. [<a href="https://4jok2.r.ag.d.sendibm3.com/mk/cl/f/Ud7YZJn5YKOmIROHOUMyLVB-69aNwlb-FParRXYStS_vdQ3SDwErMwxNQlu8iFNnUlSI5lejtsIHgERXyVY3xzTjRGyNP9_sR-uAyfxusTZlSMU3qNs5OPlSJfRErWBEkj_TiT2y1QQwZH8brbn6P8H4S1rDBX1QFICDOe5HjYF2GOdrgzwA1vaeJB6YrFcn2BUNmpsDD4f0mKwcYkCVVFCYgOtbj1-59CoswRfSqgA">link</a>]</li>
<li style="text-align: justify; ">Arindrajit Basu, Irene Poetranto and Justin Lau co-wrote an article for <em>Carnegie Endowment for International Peace</em> which captures some <strong>concerns with the United Nations OEWG process</strong> dealing with cyber norms and the absence of discussion at the forum on key issues. [<a href="https://4jok2.r.ag.d.sendibm3.com/mk/cl/f/G-0Ok05_UomEqWTkmsuUXGq9V-i2zMa0ul5zzkfLKC8Rj5rCGsl12lrJl7tfGzORBxTOYoVPoLUlHF_KaD2z05TyeW3cQDqaxvlhUDxfr2Z9n64Lbe1_p8FYKFvLXrsNVAoEbxsCbOncqzkKgVebcxHe_HF5Murx9aVk6Ps9ik34I4Sj3y26-_Nj98iLwMPZO0rs8hYNZbvsjcUbyGxm6G5xlfjakhy-UsjioXEGdz7zQdV6O_FCG1BoP1Rvm8fPxvdK1JEbGkedHgwk9ENn9na2J6I">link</a>]</li>
<li style="text-align: justify; ">In an article for the <em>Observer Research Foundation</em>, Arindrajit Basu writes about how India must avoid getting its <strong>data policy</strong> caught up in tired existing machinations and instead forge <strong>a new path that prioritizes Indian strategic interests</strong>. [<a href="https://4jok2.r.ag.d.sendibm3.com/mk/cl/f/eZHdtXVJIePupyeXaX8RUlkusvtOgHe4VHCDeiVpkTS0P4ji1lGib5cqvQX0nGf5iIx6vb52mwWtd9Z5G5z71_dGvd89c5xn2JyZ-f9cdOWTAsHKRwxo_Tk2Kp7Dfb4JEi4r2Sd5r3dHPc3YmRMYLseDLnESCpmxnPkbX5y1sMitN5OUu4x1ydiYZxfB3FKVZjnnXSCAmB2yPWS7pL4cGcVWpJ1PqBoqPAvvs_Ofqyg58K7inxfax-5tIPk5wyLsEARP92qYgPo">link</a>]</li>
<li style="text-align: justify; ">Aman Nair, Arinjay Vyas, Pallavi Bedi, and Garima Saxena authored a <strong>response to the Supreme Court E-committee’s draft vision document of phase III of the E-courts project</strong>. This response recommends consideration be given to the digital and gender divide, and lack of clarity in the document on several data-related aspects. [<a href="https://4jok2.r.ag.d.sendibm3.com/mk/cl/f/eLr3hXFonL5hfAUH5ux5zoQcTrY2PxRDO9kflkNqtcUObBbYWm-vqp7v4Ex0g_o7YtCokB315adj-1k_QwDebJ1k9G626m1MGuTYmlfKdwSVl7mYsfna4Dy96z8Eb7iJ7gtcZZF8s5JQCGN1ux3PiYvgDrxbs3MeXeZizpIZsm9OsPvCGzvC5HbxkhfdFG2B6853ajax3xofJRcucZ2Jc1AFEg5iAVrwiopY0SFIb99XHRESaUFEP9KYNs2bC1nAXaAW4AU7OPG_">link</a>]</li>
</ol>
<h3>Privacy</h3>
<ol>
<li style="text-align: justify; ">The proposed <i>Personal Data Protection Bill, 2019</i> is being deliberated by the Joint Parliamentary Committee and is expected to be tabled in the Monsoon Session of Parliament. Pallavi Bedi and Amber Sinha co-authored a white paper to examine the <strong>personal data implications on welfare delivery models in India</strong> and to suggest ways to operationalise key provisions. [<a href="https://4jok2.r.ag.d.sendibm3.com/mk/cl/f/_Gjo4q_RVbTa0sA8X1FOhYiB4McMtr_8JgcG33Uf9nXIX9VsXvDxzVvYABfOz-DyVN14iCoyotGqfkjezyNjJFt4RsiYkw6m0UFNhGd9NYLj3fkrn8IfKwI3YJtO9-FrkgMxcCOTc1PdedlPXPGO2cafHCYUaLhHNMXIepnX2L2KC-mG_-l0Fjx5m-GvmP6GcXg1eyOyNZjrCL8eFWzyCT9XVDv8afLm2D3F0l-28tz-MwSJRRqc4vIjV0PCykM6NXQ">link</a>]</li>
<li style="text-align: justify; ">Shweta Mohandas authored an article for <i>Rajiv Gandhi National University of Law Student Research Review (RSRR)</i><span>. In this article, which forms a part of RSRR’s ‘</span><i>Excerpts from Experts Blog Series</i><span>,’ Shweta examines whether </span><strong>Indian data protection legislation can act as a check on growing workplace surveillance</strong><span>. [</span><a href="https://4jok2.r.ag.d.sendibm3.com/mk/cl/f/5X-z4Ay91QvhUYmdHomWwzdgLp7eCubPpwLyaH1H0MWiyiQfU9PIIQSg2Nshk2mfLJYrb65hiGIj3xyuffXiDnOu9lbwfFsrQCL6D5DnQ9HkvOoZHcq3_Kgf9NVKSAX7tv-aqy00L3jjJtbWbvfaqwnagmdUVSLEP9E7S6s-UTBvO-KCO82DhWELF0Od6dhVrbr0WvVi980IX67IkCiSNaKwpuNwSXuYS9bgD0s">link</a><span>]</span></li>
<li style="text-align: justify; ">Aman Nair and Arindrajit Basu examine the changes in the context of <strong>data sharing between WhatsApp and Facebook as being an anticompetitive action in violation of the Indian Competition Act, 2002</strong><span>. Having previously </span><a href="https://4jok2.r.ag.d.sendibm3.com/mk/cl/f/u35U0gu1I7Y81i6OYu20wN7zgiA4FxKWowVPgk7Gmafn69IJLoZapqrfCSWui33Sh0ntbkPajjtW_p35C3qMoCP5xcrC2dHSO3DX9MZ7uFNbJZ-p_NRBv5bOZ_1jKeH2KYBYohqWlZ83VVG3CDvNl1AK_4xmNrr9L578OragYyJQo2U93bxHbLw1fnLc1CPWqkfZvcmydFo1HGyNBeFpRqiTVn6ytQjyAiUw2Gisx7itlxVHmb_QCuSd0T8nD47U4UBH_i_dg6PN5R4PcjU">examined</a><span> the implications of WhatsApp’s changes to its privacy policy in 2021, this issue brief is the second output of the series examining the effects of the changes. [</span><a href="https://4jok2.r.ag.d.sendibm3.com/mk/cl/f/jOUH-SfgRCjdp9DORlyEL16nnyJ_ogGha0d2DdYJGcRnBOiZt6F3SuhZzZYX8t1umpAtId1_80WNiW3Y6CgGDA-TYQ2hORCBWeOvvoPphGzr0DfCy_6tD8QQMzgb3mCm1GXECkmJM_kTL9kfRrj8GVpe3DHJ7_jX3pKBQx9HHWKqkgftY_8wTG6zCG4J8HZC-1Hv66BsR1didil6DVh-HtetydLcMzlikdBj4bvxTjzFRAoLvsyeBH9PaoDRJuUXTYR5-8BcE8ITu2TyiOyc_ME2kuDJ3DJiE4PDeNHutpTJyuc7lqwp-g">link</a><span>]</span></li>
<li style="text-align: justify; ">In a blog-post, Pallavi Bedi provides recommendations for the <strong>Covid vaccine intelligence network (Co-Win) platform</strong><span>. She says that as a first step it is essential that Co-Win has a separate dedicated privacy policy which conforms to the internationally accepted privacy principles and enumerated in the Personal Data Protection Bill. [</span><a href="https://4jok2.r.ag.d.sendibm3.com/mk/cl/f/o19mW5Vyy3giilmnC_ef5khZu85qA-A3uDr687psJN0UhAkPY43mYt7Jaw7cXwy0NJK7ky9IvnklXsGPIME4bYH2cCVK_NeXEhZK-N6RRRSSDFUG33BpdaFtUD3cqIxrsEV_-ILCXF4SDN3IBmJFKeJDBFZA4bLuUWEzsAhBQbnFcbGuITTNq74cViuBSO-p09OT9-AtzOUgce0Brhta6YmU5iSmpMGW2XWhWTw3ueesRR_8fjDkF7XoLDGCMmkdjvAeyfbCIee0z-30EbUN5sbLzCCHVUHmuYVPzqtLeV8">link</a><span>]</span></li>
</ol>
<h3>Freedom of Expression, and Intermediary Liability</h3>
<ol>
<li style="text-align: justify; ">In February, the Advertising Standards Council of India (ASCI) had issued draft rules for <strong>regulation of digital influencers</strong>, with an aim to <em>“understand the peculiarities of [online] advertisements and the way consumers view them,”</em> as well as to ensure that: <em>“consumers must be able to distinguish when something is being promoted with an intention to influence their opinion or behaviour for an immediate or eventual commercial gain.”</em> Torsha Sarkar and Shweta Mohandas respond with comments and recommendations to the rules. [<a href="https://4jok2.r.ag.d.sendibm3.com/mk/cl/f/nP6_NZer0OIQv_bMG6p9Vzx-uTdYi17sYHl0xdFjMYzEzv9xmTvSG73K8_7sq4J6NPdQ5sNA5eaQvAwMHBrYkAt2mGFF9SLlrCSfNZ3K6rpRyst36jbtHpdD3_Pc9ukKdBW3_lhiGpISLi7H2TBa0BumRk2JV3PFdUBH6R3kk0ywJuvcHeJJWxAsnyydYY2s2_iRpo5Sc0MvHbC8vlDCoI6mtuL0_PC6B2eL0G8wZqbtwYYM2hNO-DfobKXJV16nfGC8GxASmN2FmH07pif0Cn5xSXoeadfmwb-Fox-B03UAn-0THELMM1beVubJWnOAOrPXoA5JIZ7CQe5x3g">link</a>]</li>
</ol>
<h3>Copyright, and Access to Knowledge</h3>
<ol>
<li> Anubha Sinha explains what the draft national science, technology and innovation policy means for <strong>open access to scientific literature</strong> for Indians. This article was published in <em>The Wire Science</em>. [<a href="https://4jok2.r.ag.d.sendibm3.com/mk/cl/f/CJjg4ihUvxLz1chJKcO03n5_Ydr9rvEDH_kFGYPs7_aijAvgsioqcqvZU0n41Ly6CNagHY1Upc0-3eCPsdo3GxXWC6baFyPSXImgs7tRy-Tio7TdRDS1qHU9i5YghNVjsoIunFozlrsutZGnXjXNF6Ce04lDrZ0g0dOdBIDt-InCeubeq35RnbIj3Qb2jdf2vwlkcAeyC925K6WeyzPM7sGUAVmMH1wKu9pmN-bgHJfNRodxOWODiF_o5vmu6g25UP6IdunHwUKorudI_0RopdHXBA">link</a>] </li>
<li> In an article published in <em>Info Justice</em>, Anubha Sinha provides a summary of the progress of the <strong>copyright infringement suit against Sci-Hub and LibGen</strong> in India. [<a href="https://4jok2.r.ag.d.sendibm3.com/mk/cl/f/Jg1NJZxuFnR-Srq0Tz1RS3XZZ17cL4JxJFlOY2g12wpoHPIxsc-lW18hjUe7sg309BNiO1i0V_yLGaQsQiAzILlWe2zd3ctx4dTTFvyFbs_Ds1w3W91GNEdoWszaryWzeKs-ZSDZYR1IPZa4ZGXpOrd21RiKK6InuJVXGZRN6WJzmgdBr4ZWre9-NP3AxduZDFnzXrjfCho6iDPhS7CuR8ZW4bFCwkmvCr70-yTDLLkT2DUmkB-caRfvMxukUyr1fjilhp-3vJwEt1gHi0HP-kpyx3wac8mjFxSCbsVg-5AiRMti">link</a>] </li>
</ol>
<h3>Digital Cultures, and Social Justice</h3>
<ol>
<li style="text-align: justify; ">In a research paper, Noopur Raval offers critical historical insights from the fields of international development, anthropology, and postcolonial history to caution against both the possible harms of <strong>gender disaggregated datafication</strong>, as well as the consequences of <strong>non-participatory datafication of women</strong>. [<a href="https://4jok2.r.ag.d.sendibm3.com/mk/cl/f/WmB3X2tO_c8hEDCY-QCDD1tTPBIEB7Gt4bFRLY7mNCB3X5sRuV6npbW4eIX8ta-lGod2fia1v8ZTxZurtXczkJQbBg5ckgKRSG3eYKfG9ntQ5qRKVkq12g9YEmZ1eP1raJjh5p5aHQ-0MhUsQafyvBQpzVEdDK9ZJecvYAq3GyD42aSWkS0iQ17sS9WCDchDhFQn20CS7MAEmZm6rM0yymmNBqTHRR7GuKxP3edQqiMTblOufA4mhx62YuIgqn_mRv5uOPqxevVBmTtlTTyMmZihFccK">link</a>]</li>
<li>Kaarika Das, a research scholar at NIEPA and Sravya C, a researcher in the Humanizing Automation project at IIIT Bangalore published <strong>a study on migrants in India's Gig Economy</strong>. [<a href="https://4jok2.r.ag.d.sendibm3.com/mk/cl/f/H6Jr3Xykf4-nxghqRxErQtEVs4TH-l3S2LVhiXIisAPDyUCm6fiWyLGCI_V9jrofmSaX7B1sFEjjVvhsqbNcHpKz6_ztX9o6ZMp-BRrke6HgLScE3FYxJKKFhtGyp_w_xUwJu1jybdsltHMKm1oNjRgYm4Z_hbpUTmJlK72raCD6jC7VjvTmuJmIGZLFa1J18o0IoImVO8VLqbV_lUigTVBNQWqZsgl_TyjYf3a6H8oLBlG4fo3jIXAsU5S2aySLzNO9u46C1Zv5g-D3wc6jChAhrMcOtcp2NNeEOJRw_n-nzYNrfVNwwLKdIOY">link</a>]</li>
<li>Sameet Panda and Vipul Kumar wrote a blog for <em>Privacy International</em> pointing out the <strong>failures in the digitisation of India’s food security programme</strong> in light of the <strong>exclusion of married women in Odisha</strong>. [<a href="https://4jok2.r.ag.d.sendibm3.com/mk/cl/f/czsORnDtqHr4eMfKxD9huAqfK9BfJ_oZWslVsCoG63dJQwSqFhMbQzBgtolMXmsnvl3TuEaSJXOIWWc6z-EcMaMSfZwAZR6Tixu7KVE3u343x0qCePCh6k_Mbyo1ckxpCdq6R4M2f8b-8PdxHsW1OzgIALcgF63n63DmmmP3krIGfTsWj-kO03xSa6lho6qrFDnEQeDW6zuMc8mHf-o34ogIveNxvYoa_gtPEag390DefdFa5not77SmRSLeLd-oAFxkcQ_jrSEiEnyjD9UNdb0COOFbk8KlrD2y7SBM27_5U_oRY1tHFTDIpBT3z4k">link</a>]</li>
<li>Shreya Ghosh, a research scholar at the Centre for Political Studies, Jawaharlal Nehru University, New Delhi authored an article in <em>EPW</em> on <strong>access to welfare and health for women during the initial phase of the pandemic</strong>. [<a href="https://4jok2.r.ag.d.sendibm3.com/mk/cl/f/WrUVPoWi-5LlI7z8_qy9HVtjyDoIgjSdclz7-wdA1OV2tG7GWSuUQ-F31hf1TpaGumhcxYeQJE9vqj1LRYpoKJfaHyCQHx_Dnt8PcNB2eEvQAbtHEdjAZLIu6Pno55XvtCJ33EBRdNRU-tu0Tt8j_lXT_nSChepY18OpIu69PUGNBI7Lsp6pkOo4LXhtUKdImoitU_-lBg1-paVePznLYRWL7bhk5rm_OrIsJPZuKbEnew8kXTwbDvjUgZbD">link</a>]</li>
<li>Ambika Tandon and Aayush Rathi in a research paper, <strong>“Fault lines at the Front lines”</strong> analyze the <strong>changing employment conditions for domestic workers</strong> in the growing platform economies of South and Southeast Asia. By analyzing different platform designs and comparing regulations in <strong>India, Indonesia, Pakistan and Vietnam</strong>, the authors present a thorough picture of the situation for domestic workers in the new economy. [<a href="https://4jok2.r.ag.d.sendibm3.com/mk/cl/f/kPMoaM42DpjpGxHbzGnAXycfRBu9fPzVJ6jQoyePUjDKKV9KMz8HDo8M3h5fDoOFAynoCq8ARyzHdBIkACBBy8eWHRWjcbXslejcnZZIn2LP-BsWh_Sr4FMl2AWDTQktt8tlZAZ2PcTfL_KE1sYJD1d4522v3eLvu_QUX8LCXvuznSIusIe7e_vFu3MNdylOuSIK_-L61Uin8gAEZ-eO4DDwYaE42Uc0">link</a>]</li>
<li>In a blog post published by <em>Ethical Source</em>, Ambika Tandon throws light on <strong>artificial intelligence and allied technologies</strong> that form part of <strong>Industry 4.0</strong> in the future of work. [<a href="https://4jok2.r.ag.d.sendibm3.com/mk/cl/f/Hrd-w4fWPa8ThFlmr-Zw_-LR96KsoFTBchzDQ8QwDJALcjcwz1fCn49RAws3-xmNATUZIYUaSQT4nJxodQvSgrzlzKXEOdj64Sx8aRvtkyPaolpAml7hSDcczWdPJPaZISxUxCl9S1DHnfujOulrLkdqgEf1xPsWSQk_TQZJU4dOE7Vnqm_pmCnFVs_WLo4yQ2P00Td3VYd78HikHsyLC3yqju4">link</a>]</li>
<li>Ambika Tandon and Aayush Rathi authored a chapter titled <strong>“Care in the Platform Economy: Interrogating the Digital Organisation of Domestic Work in India”</strong> in a book titled <em>“The Gig Economy: Workers and Media in the Age of Convergence.”</em> [<a href="https://4jok2.r.ag.d.sendibm3.com/mk/cl/f/-vxAl0-OSphrFabwlh8Ir2yhdE_cYeWryiSavWFOByLbxWzlndVfgl1K0awHZjD1J6LmUbu2OaoCgNKL3Dcozv_hQ9WEi1MeQdSRmT1kKProU_9fJexLKPbw80T69AfzXMtjpfX_6zYPpWohxsh1xxOwK86Vs5S_x73hOG7hhuQxFfy4VF4co0Ls2jX-Wi7-L4pf-SBVBekVFuObAI6dOsUwWyywiSYldGbFbxxPfyVegmZuKMtD4bBycNBw_B__X1IogiPK5fj0851hxFM4eo5Wl2s0dZY37-UhpKL4xS0gLZI9UozMux7JbmzM4jpZT1AAGGCNlYb4DM3_Alf0YHI1KQ">link</a>]</li>
</ol>
<p>
For more details visit <a href='https://cis-india.org/about/newsletters/march-may-2021-newsletter'>https://cis-india.org/about/newsletters/march-may-2021-newsletter</a>
</p>
No publisher | pranav | Internet Governance | Copyright | Access to Knowledge | 2021-08-08T15:45:45Z | Page
Finding Needles in Haystacks - Discussing the Role of Automated Filtering in the New Indian Intermediary Liability Rules
https://cis-india.org/internet-governance/blog/finding-needles-in-haystacks-discussing-the-role-of-automated-filtering-in-the-new-indian-intermediary-liability-rules
<b>On 25 February this year, the Government of India notified the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021. The new Rules broaden the scope of which entities can be considered intermediaries, now including curated-content platforms (such as Netflix) as well as digital news publications. This blog post analyzes the rule on automated filtering in the context of the growing use of automated content moderation.</b>
<p class="p1"><span class="s1">This article first <a class="external-link" href="https://www.law.kuleuven.be/citip/blog/finding-needles-in-haystacks/">appeared</a> on the KU Leuven's Centre for IT and IP (CITIP) blog. Cross-posted with permission.</span></p>
<p class="p1"><span class="s1">----</span></p>
<p class="p1"><span class="s1">Matthew Sag, in his 2018 <a href="https://scholarship.law.nd.edu/cgi/viewcontent.cgi?article=4761&context=ndlr"><span class="s2">paper</span></a> on internet safe harbours, discussed how the internet resulted in a shift from the traditional gatekeepers of knowledge (publishing houses), which used to decide what knowledge could be showcased, to a system where everybody with access to the internet can showcase their work. A “<em>content creator</em>” today ranges from a legacy media company to any person with a smartphone and an internet connection. In a similar trajectory, with the increase in websites and mobile apps and the functions they serve, the scope of what constitutes an internet intermediary has widened all over the world. </span></p>
<p class="p1"><span class="s1"><strong>Who is an Intermediary?</strong></span></p>
<p class="p1"><span class="s1">In India, the definition of “<em>intermediary</em>” is found in Section 2(w) of the <a href="https://www.meity.gov.in/writereaddata/files/itbill2000.pdf"><span class="s2">Information Technology (IT) Act 2000</span></a>, which states that an intermediary, <em>“with respect to any particular electronic records, means any person who on behalf of another person receives, stores or transmits that record or provides any service with respect to that record and includes telecoms service providers, network service providers, internet service providers, web-hosting service providers, search engines, online payment sites, online-auction sites, online-marketplaces and cyber cafes”.</em> This all-encompassing definition has allowed the dynamic nature of intermediaries to be covered by the Act and by the Guidelines that have been published periodically (<a href="https://www.meity.gov.in/writereaddata/files/GSR314E_10511%25281%2529_0.pdf"><span class="s2">2011</span></a>, <a href="https://www.meity.gov.in/writereaddata/files/Draft_Intermediary_Amendment_24122018.pdf"><span class="s2">2018</span></a> and <a href="https://www.meity.gov.in/writereaddata/files/Intermediary_Guidelines_and_Digital_Media_Ethics_Code_Rules-2021.pdf"><span class="s2">2021</span></a>). With more websites and social media companies, and even more content creators online today, there is a need to look at the ways in which intermediaries can remove illegal content or content that goes against their community guidelines.</span></p>
<p class="p1"><span class="s1">Along with the definition of an intermediary, the IT Act, under Section 79, provides exemptions that grant internet intermediaries safe harbour from liability for third-party content, and further empowers the central government to make Rules that act as guidelines for intermediaries to follow. The Intermediary Liability (IL) Rules hence seek to regulate content and lay down safe harbour provisions for intermediaries and internet service providers, providing a framework that can keep up with the changing nature of the internet and internet intermediaries. Under this provision, India has so far published three versions of the IL Rules: the first in<a href="https://www.meity.gov.in/writereaddata/files/GSR314E_10511%25281%2529_0.pdf"><span class="s2"> 2011</span></a>, followed by draft amendments in<a href="https://www.meity.gov.in/writereaddata/files/Draft_Intermediary_Amendment_24122018.pdf"><span class="s2"> 2018</span></a>, and finally the latest <a href="https://www.meity.gov.in/writereaddata/files/Intermediary_Guidelines_and_Digital_Media_Ethics_Code_Rules-2021.pdf"><span class="s2">2021 </span></a>version, which supersedes the 2011 Rules. </span></p>
<p class="p1"><span class="s1"><strong>The Growing Use of Automated Content Moderation </strong></span></p>
<p class="p1"><span class="s1">With each version of the Rules there were changes meant to keep them abreast of the changing face of the internet and the changing nature of both content and content creators. The 2018 version of the Rules showcases a push towards automated content filtering. The text of Rule 3(9) reads as follows: “<em>The Intermediary shall deploy technology based automated tools or appropriate mechanisms, with appropriate controls, for proactively identifying and removing or disabling public access to unlawful information or content</em>”.</span></p>
<p class="p1"><span class="s1">Under Rule 3(9), intermediaries were required to deploy automated tools or appropriate mechanisms to proactively identify and remove, or disable public access to, unlawful content. However, neither the 2018 IL Rules nor the parent Act (the IT Act) specified which content could be deemed unlawful. The 2018 Rules also failed to establish the specific responsibilities of intermediaries, relying instead on vague terms like “<em>appropriate mechanisms</em>” and “<em>appropriate controls</em>”. Hence, though the Rules mandated the use of automated tools, neither they nor the IT Act provided clear guidelines on what could be removed. </span></p>
<p class="p1"><span class="s1">The lack of clear guidelines, and of a list of content that could be removed, left it to intermediaries to decide which content, if not actively removed, could cost them their immunity. It has previously been documented that the lack of clear guidelines in the 2011 version of the <a href="https://cis-india.org/internet-governance/chilling-effects-on-free-expression-on-internet"><span class="s2">Rules</span></a> led to intermediaries over-complying with takedown notices, often taking down content that did not warrant it. This existing tendency to over-comply, combined with automated filtering, could have resulted in a number of <a href="https://cis-india.org/internet-governance/how-india-censors-the-web-websci#:~:text=One%2520of%2520the%2520primary%2520ways,certain%2520websites%2520for%2520its%2520users."><span class="s2">unwarranted takedowns</span></a>.</span></p>
<p class="p1"><span class="s1">While the 2018 Rules mandated the deployment of automated tools, the year 2020 (possibly due to pandemic-induced work-from-home protocols and global lockdowns) saw major social media companies announcing a move towards a fully automated system of content<a href="https://www.medianama.com/2020/03/223-facebook-content-moderation-coronavirus-medianamas-take/"><span class="s2"> moderation</span></a>. Though the use of automated content removal seems like the right step considering the <a href="https://www.businessinsider.in/tech/news/facebook-content-moderator-who-quit-reportedly-wrote-a-blistering-letter-citing-stress-induced-insomnia-among-other-trauma/articleshow/82075608.cms"><span class="s2">trauma </span></a>that human moderators had to go through, the algorithms now used to remove content rely on the parameters, practices and data from earlier removals made by human moderators. More recently in India, with the emergence of the second wave of COVID-19, the Ministry of Electronics and Information Technology has <a href="https://www.thehindu.com/news/national/govt-asks-social-media-platforms-to-remove-100-covid-19-related-posts/article34406733.ece"><span class="s2">asked </span></a>social media platforms to remove “<em>unrelated, old and out of the context images or visuals, communally sensitive posts and misinformation about COVID19 protocols</em>”.</span></p>
<p class="p1"><span class="s1"><strong>The New IL Rules - A ray of hope?</strong></span></p>
<p class="p3"><span class="s3">The 2021 version of the IL Rules provides a more nuanced approach to the use of automated content filtering than the earlier version. Rule 4(4) now requires only “</span><span class="s1">significant social media intermediaries” to use automated tools to identify and take down content pertaining to “child sexual abuse material”, or “depicting rape”, or any information identical to content that has already been removed through a take-down notice. The Rules define a social media intermediary as an “<em>intermediary which primarily or solely enables interaction between two or more users and allows them to create, upload, share, disseminate, modify or access information using its services”</em>. The Rules also go a step further to create another type of intermediary, the significant social media intermediary, defined as one “<em>having a number of registered users in India above such threshold as notified by the Central Government</em>”. Hence, which social media intermediaries qualify as significant ones could change at any time.</span></p>
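Rule 4(4)'s requirement to take down content “identical” to previously removed material is most naturally understood as hash matching. The sketch below is purely illustrative (the class and names are invented for this example, not any platform's actual system; real deployments tend to use perceptual hashes such as PhotoDNA to also catch near-duplicates). It shows both how such a filter works and why exact matching is brittle:

```python
import hashlib

class ReuploadFilter:
    """Hypothetical hash-matching filter that blocks exact re-uploads
    of content previously removed via a takedown notice."""

    def __init__(self):
        self._removed = set()

    def record_takedown(self, content: bytes) -> None:
        # Store only a digest of the removed content, not the content itself.
        self._removed.add(hashlib.sha256(content).hexdigest())

    def is_blocked(self, content: bytes) -> bool:
        return hashlib.sha256(content).hexdigest() in self._removed

f = ReuploadFilter()
f.record_takedown(b"unlawful image bytes")
print(f.is_blocked(b"unlawful image bytes"))   # exact re-upload: True
print(f.is_blocked(b"unlawful image bytes."))  # one byte changed: False
```

The brittleness cuts both ways: a trivial edit evades an exact-match filter, while switching to fuzzier perceptual matching widens the net and with it the risk of taking down lawful content.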
<p class="p1"><span class="s4">Along with adding a new threshold (qualifying as a significant social media intermediary), the Rules, in contrast to the 2018 version, also emphasise the need for such removal to be </span><span class="s1">proportionate to the interests of freedom of speech and expression and the privacy of users. The Rules also call for “<em>appropriate human oversight</em>” as well as a periodic review of the tools used for content moderation. By using the term “<em>shall endeavour</em>”, the Rules reduce the pressure on intermediaries to set up these mechanisms: the requirement is now on a best-effort basis, as opposed to the mandatory “<em>shall</em>” in the 2018 version of the Rules.</span></p>
<p class="p1"><span class="s1">Although the Rules now narrow down the instances where automated content removal can take place, concerns around over-compliance and censorship still loom. One reason for concern is that the Rules still fail to require intermediaries to set up a mechanism for redress or appeal against such removal. Additionally, the provision that automated systems may remove content that has previously been taken down is a cause for worry, as the propensity of intermediaries to over-comply and take down content has already been documented. This brings us back to the earlier problem of social media companies’ automated systems removing legitimate news sources. Though the 2021 Rules try to clarify certain provisions related to automated filtering, such as the addition of safeguards, they also suffer from vague provisions that could cause compliance issues. Terms such as “<em>proportionate</em>” and “<em>having regard to free speech</em>” fail to lay down definitive directions for the intermediaries (in this case, SSMIs) to comply with. Additionally, as stated earlier, whether a platform qualifies as an SSMI can change at any time, based either on a change in its number of users or on a change in the user threshold mandated by the government. The absence of human intervention during removal, vague guidelines, and the fear of losing safe harbour protection add to the already increasing trend of censorship on social media.
The use of automated means, with its fast, almost immediate removal of content, means that certain content creators might not even be able to post their content <a href="https://www.eff.org/wp/unfiltered-how-youtubes-content-id-discourages-fair-use-and-dictates-what-we-see-online"><span class="s2">online.</span></a></span><span class="s6"> </span><span class="s1">Given India’s current trend of new internet users, some of these creators would also be <a href="https://timesofindia.indiatimes.com/business/india-business/for-the-first-time-india-has-more-rural-net-users-than-urban/articleshow/75566025.cms"><span class="s2">first-time users</span></a> of the internet. </span></p>
<p class="p3"><span class="s1"><strong>Conclusion</strong></span></p>
<p class="p3"><span class="s1">The need for automated removal of content is understandable, given not only the sheer volume of content but also the nightmare stories of the toll it takes on human content moderators, who otherwise have to go through hours of disturbing content. Though the Indian Intermediary Liability Guidelines have improved on the earlier versions by moving away from mandating proactive filtering, there still needs to be consideration of how these technologies are used, and the law should account for the shift in who a content creator is. There need to be avenues of recourse against unfair removal of content, and a means to obtain an explanation of why content was removed, via notices to the user. In the case of India, the notices should be in Indian languages as well, so that people are able to understand them. </span></p>
<p class="p3"><span class="s1">In the absence of clearer guidelines, the peril of over-censorship by intermediaries trying to stay out of trouble could further stifle not just freedom of speech but also access to information. In addition, the fear of content being taken down, or even of potential prosecution, could push people towards self-censorship, preventing them from exercising their fundamental rights to freedom of speech and expression as guaranteed by the Indian Constitution. We hope that the next version of the Rules takes a more nuanced approach to automated content removal and provides adequate and specific safeguards, ensuring a conducive environment for both intermediaries and content creators. </span></p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/finding-needles-in-haystacks-discussing-the-role-of-automated-filtering-in-the-new-indian-intermediary-liability-rules'>https://cis-india.org/internet-governance/blog/finding-needles-in-haystacks-discussing-the-role-of-automated-filtering-in-the-new-indian-intermediary-liability-rules</a>
</p>
No publisher · Shweta Mohandas and Torsha Sarkar · Internet Governance, Intermediary Liability, Artificial Intelligence · 2021-08-03T07:28:53Z · Blog Entry
The Ministry And The Trace: Subverting End-To-End Encryption
https://cis-india.org/internet-governance/blog/the-ministry-and-the-trace-subverting-end-to-end-encryption
<b>A legal and technical analysis of the 'traceability' rule and its impact on messaging privacy.</b>
<p> </p>
<p>The paper was published in the <a class="external-link" href="http://nujslawreview.org/2021/07/09/the-ministry-and-the-trace-subverting-end-to-end-encryption/">NUJS Law Review Volume 14 Issue 2 (2021)</a>.</p>
<hr />
<h2>Abstract</h2>
<div class="justify">
<div class="pbs-main-wrapper">
<p>End-to-end encrypted messaging allows individuals to hold confidential conversations free from the interference of states and private corporations. To aid surveillance and prosecution of crimes, the Indian Government has mandated online messaging providers to enable identification of originators of messages that traverse their platforms. This paper establishes how the different ways in which this ‘traceability’ mandate can be implemented (dropping end-to-end encryption, hashing messages, and attaching originator information to messages) come with serious costs to usability, security and privacy. Through a legal and constitutional analysis, we contend that traceability exceeds the scope of delegated legislation under the Information Technology Act, and is at odds with the fundamental right to privacy.</p>
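Of the three implementation routes named in the abstract, message hashing can be sketched in a few lines. The following is a hypothetical illustration only (not any provider's actual design; the function and variable names are invented) of how storing per-message digests makes identical forwards linkable to a "first sender", and of how that linkage breaks the moment a single character changes:

```python
import hashlib

# Hypothetical 'hashing messages' approach to traceability: the platform
# stores a digest of each message with the sender who first sent it, and
# looks the digest up when served a tracing order.
first_sender = {}

def register(sender, plaintext):
    digest = hashlib.sha256(plaintext).hexdigest()
    # Only the first sender of a given plaintext is recorded as originator.
    first_sender.setdefault(digest, sender)
    return digest

register("alice", b"forwarded rumour")
register("bob", b"forwarded rumour")         # same bytes, same digest
d = register("carol", b"forwarded rumour")
print(first_sender[d])                       # 'alice': identical forwards are linkable
d2 = register("dave", b"forwarded rumour!")  # one character added, new digest
print(first_sender[d2])                      # 'dave': the trace is trivially broken
```

Both properties cut against the mandate: linkability erodes the confidentiality that end-to-end encryption promises, while the fragility of the digest limits its investigative value.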
<p> </p>
<p>Click here to read the <a class="external-link" href="http://nujslawreview.org/2021/07/09/the-ministry-and-the-trace-subverting-end-to-end-encryption/">full paper</a>.</p>
</div>
</div>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/the-ministry-and-the-trace-subverting-end-to-end-encryption'>https://cis-india.org/internet-governance/blog/the-ministry-and-the-trace-subverting-end-to-end-encryption</a>
</p>
No publisher · Gurshabad Grover, Tanaya Rajwade and Divyank Katira · Cryptography, Intermediary Liability, Constitutional Law, Internet Governance, Messaging, Encryption Policy · 2021-07-12T08:18:18Z · Blog Entry
State of Consumer Digital Security in India
https://cis-india.org/internet-governance/blog/state-of-consumer-digital-security-in-india
<b>This report attempts to identify the existing state of digital safety in India, with a mapping of digital threats, which will aid stakeholders in identifying and addressing digital security problems in the country. This project was funded by the Asia Foundation.</b>
<p style="text-align: justify;"> </p>
<p style="text-align: justify;">Since 2006, successive Union governments in India have shown an increased focus on digital governance. The National e-Governance Plan was launched by the UPA government in 2006, and several state-led digital projects, such as the digitisation of tax filing, the appointment process for passports, corporate governance, and the Aadhaar programme (India’s unique digital identity system that utilises biometric and demographic data), arose under it in the form of mission mode projects (projects that are part of the broader National e-Governance initiative, each focusing on a specific aspect of e-governance, like banking, land records, or commercial taxes). In 2014, when the NDA government came to power, the National e-Governance Plan was subsumed under the government’s flagship Digital India project, and several mission mode projects were added. In the meantime, internet connectivity, first wired and later mobile, has increased greatly. In the same period, the use of digital services, first services native to the internet such as email, social networking and instant messaging, and later the platformisation and disruption of traditional business models in transportation, healthcare, finance and virtually every other sector, has led to a deluge of private digital service providers in India.</p>
<p style="text-align: justify;">Currently, India has 500 million internet users — over a third of its total population — making it the country with the second largest number of internet users after China. The uptake of these technological services has also been accompanied by several kinds of digital threats that an average digital consumer in India must regularly contend with. This report is a mapping of consumer-facing digital threats in India and is intended to aid stakeholders in identifying and addressing digital security problems. The first part of the report categorises digital threats into four kinds: Personal Data Threats, Online Content Related Threats, Financial Threats, and Online Sexual Harassment Threats. Threats under each category are then defined, with detailed consumer-facing consequences and past instances where harm has been caused because of these threats.</p>
<hr />
<p> </p>
<p>Read the full report <a href="https://cis-india.org/internet-governance/report-state-of-consumer-digital-security-in-india" class="internal-link" title="Report - State of Consumer Digital Security in India">here</a>.</p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/state-of-consumer-digital-security-in-india'>https://cis-india.org/internet-governance/blog/state-of-consumer-digital-security-in-india</a>
</p>
No publisher · pranav · Digital Governance, Privacy, Digital Knowledge, Internet Governance, Digital Media · 2021-07-05T11:07:24Z · Blog Entry
At the Heart of Crypto Investing, There is Tether. But Will its Promise Pan Out?
https://cis-india.org/internet-governance/blog/the-wire-aman-nair-june-30-2021-cryptocurrency-tether-stablecoin-dollar
<b>The $18.5 million fine levied by the New York attorney general’s office earlier this year to settle a legal dispute raises more questions than answers.</b>
<p style="text-align: justify; ">The article was <a class="external-link" href="https://thewire.in/tech/cryptocurrency-tether-stablecoin-dollar">published in the Wire</a> on June 30, 2021.</p>
<hr />
<p style="text-align: justify; ">Cryptocurrencies have become the centerpiece of the global digital zeitgeist in 2021. Anyone remotely familiar with them would probably be able to name a few of the famous ones like Bitcoin and Ethereum.</p>
<p style="text-align: justify; ">However, there exists a lesser-known cryptocurrency at the heart of this $3 trillion market: Tether. Issued by the company Tether.ltd, Tether forms the foundation of modern-day crypto trading and could potentially be one of the biggest schemes in financial history.</p>
<p style="text-align: justify; ">Tether is a special type of cryptocurrency known as a stablecoin. Unlike coins such as Bitcoin and Ethereum, Tether’s monetary value is not a function of the forces of the crypto market but is rather pegged to the US Dollar. What this means is that 1 Tether will always be worth exactly 1 USD. This fixed value has allowed it to occupy a unique position within the crypto ecosphere, with it becoming the de facto standard of liquidity within these markets by acting as a widely accepted substitute to the US dollar.</p>
<p style="text-align: justify; ">At present, buying cryptocurrency using traditional fiat money (like dollars or rupees) comes with certain challenges. Purchasing with traditional currencies requires the use of banking services that come with a host of fees and time delays. At the same time, purchasing one type of crypto coin like Bitcoin with another coin like Ethereum can prove difficult due to the constantly shifting values of both coins. This is where Tether comes in. Acting as a bridge between the traditional financial world and the crypto market, it has become a sort of digital dollar — one that makes cryptocurrency trading significantly easier.</p>
<h3 style="text-align: justify; ">The problem with tether</h3>
<p style="text-align: justify; ">On the surface, Tether seems like a perfectly reasonable innovation that looks to fill in the gaps that exist within the market. Dig a little deeper than the surface and the discrepancies start to appear.</p>
<p style="text-align: justify; ">The premise of Tether’s appeal comes from its value being pegged to the US dollar. The company<a href="https://web.archive.org/web/20180202054322/https://tether.to/"> initially claimed</a> to have achieved this by ensuring that their currency was “fully backed” by cash reserves.</p>
<p style="text-align: justify; ">The process looked something like this: You gave the company 1 US dollar and they gave you 1 Tether that you could use to make other crypto purchases. If you returned your Tether, you would get your dollar back and the Tether you returned would be ‘burned’ (removed from circulation). This meant that for every Tether that existed the company would have 1 corresponding dollar in reserve in the bank, ensuring that the currency was backed.</p>
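The issue-and-burn cycle described above reduces to one invariant: tokens in circulation must never exceed dollars in reserve. The toy model below (the class name and figures are invented for illustration; this is not Tether's actual accounting) makes that invariant, and the way unbacked issuance violates it, concrete:

```python
class BackedStablecoin:
    """Toy model of a fully reserve-backed stablecoin."""

    def __init__(self):
        self.reserve_usd = 0  # dollars held in the bank
        self.supply = 0       # tokens in circulation

    def issue(self, usd_deposited):
        # One token created per dollar deposited.
        self.reserve_usd += usd_deposited
        self.supply += usd_deposited

    def redeem(self, tokens):
        # Returned tokens are 'burned' and the dollars handed back.
        assert tokens <= self.supply
        self.supply -= tokens
        self.reserve_usd -= tokens

    def fully_backed(self):
        return self.reserve_usd >= self.supply

t = BackedStablecoin()
t.issue(100)
t.redeem(40)
print(t.fully_backed())  # True: 60 tokens against $60 in reserve
t.supply += 1000         # unbacked issuance, as alleged in the NY AG case
print(t.fully_backed())  # False: 1060 tokens against only $60
```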
<p><img src="https://cis-india.org/home-images/copy2_of_CryptoCurrrency.png/@@images/054a9af7-7949-4765-b4be-bf50e8094a41.png" alt="Crypto Currency" class="image-inline" title="Crypto Currency" /></p>
<p><span class="discreet">An illustrated image shows US dollars, cryptocurrency and NFT written on a phone. Photo: Marco Verch/Flickr CC BY 2.0</span></p>
<p style="text-align: justify; ">However, there was an enormous flaw in this system. Since Tether.ltd was the sole creator of the coin, it could create as many coins as it wanted while falsely claiming that these new Tethers were also fully backed by cash reserves. And this is exactly what is alleged to have happened in a <a href="https://ag.ny.gov/press-release/2021/attorney-general-james-ends-virtual-currency-trading-platform-bitfinexs-illegal">case brought forward</a> against Tether.ltd by the New York Attorney General’s office. The filings made by the attorney general noted that the investigation found not only that the company had inadequate reserves to back the number of Tethers in circulation, but also that there were significant periods during which the company did not have any bank accounts or access to banking at all — thereby exposing Tether’s claims of being backed as demonstrably false.</p>
<p style="text-align: justify; ">The scam was alleged to have worked as follows. First, the company would issue new coins that were not actually backed by any corresponding dollars. These new Tethers were then transferred to Bitfinex – a cryptocurrency exchange that was <a href="https://news.bitcoin.com/paradise-papers-reveal-bitfinexs-devasini-and-potter-established-tether-already-back-in-2014/">owned by Tether.ltd</a>. These unbacked Tethers would then be used to buy bitcoin, with the momentum from this increased demand causing the price of bitcoin to rise. They would then exchange their newly appreciated bitcoins for actual US dollars — thereby essentially creating real money where none had previously existed. While there is no conclusive evidence for this being true, <a href="https://www.researchgate.net/publication/342185292_Is_Bitcoin_Really_Untethered">research</a> has pointed to increased tether supply causing a boom in bitcoin prices in 2017.</p>
<p style="text-align: justify; ">The company has since altered its claim from being backed by cash reserves, to now being backed by a number of assets (which it refers to as its ‘reserves’) – of which <a href="https://tether.to/wp-content/uploads/2021/05/tether-march-31-2021-reserves-breakdown.pdf">cash only formed a small subset</a>. It maintains that the cumulative value of their assets does equal the number of Tethers in circulation, though it is worth noting that the veracity of these claims has been consistently <a href="https://davidgerard.co.uk/blockchain/2021/05/13/tether-publishes-two-pie-charts-of-its-reserves/">challenged</a>.</p>
<h3>How does this affect the rest of the crypto market?</h3>
<p>Tether’s problems are unfortunately not limited to itself; they affect the entire crypto marketplace. If the New York Attorney General’s filings are true, it would mean that a significant amount of the demand in the crypto market is potentially not backed by any actual purchasing power, and that the price of cryptocurrencies like bitcoin has been artificially inflated.</p>
<p style="text-align: justify; ">If Tether were ever found (either by a regulatory body or through leaks) to have been creating unbacked units of its currency, a significant amount of buying pressure would disappear from the crypto market. And since Tether isn’t just any other cryptocurrency but a medium of exchange in the crypto world, its downfall would have severe knock-on effects that could cause a serious crash in the entire crypto market.</p>
<p style="text-align: justify; ">Quantifying such knock-on effects would be extremely difficult; however, as previously mentioned, research has outlined a significant causal relationship between Tether’s supply and increased bitcoin prices. This leads to the conclusion that the reverse would likely be true: a rapid decrease in Tethers would cause a significant decrease in the price of bitcoin and other cryptocurrencies.</p>
<p style="text-align: justify; ">Ultimately, no one knows for sure whether Tether is a scheme. However, mounting evidence from a number of independent sources has pointed to discrepancies in the company’s functioning. What is clear is that, if the allegations are in fact true, Tether poses a serious risk to the entire crypto marketplace and its investors.</p>
<p>
For more details visit <a href='https://cis-india.org/internet-governance/blog/the-wire-aman-nair-june-30-2021-cryptocurrency-tether-stablecoin-dollar'>https://cis-india.org/internet-governance/blog/the-wire-aman-nair-june-30-2021-cryptocurrency-tether-stablecoin-dollar</a>
</p>
No publisher · aman · Internet Governance, Bitcoin, Cryptocurrencies · 2021-07-01T14:46:46Z · Blog Entry