[This is the second part of a two-part article authored by Saurav Kumar, a third-year student from Dr. RML National Law University, Lucknow. The first part can be found here.]
Arguments Submitted by Google
[This post has been authored by Riya Sharma and Atulit Raj, second-year students at the Institute of Law, Nirma University.]
Introduction
[This post is authored by Sohina Pawah, a second-year student at the NALSAR University of Law, who is also an Editor for the TLF]
Back in June 2022, the Ministry of Electronics and Information Technology (“MeitY”) first released the proposed amendments to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (“IT Rules 2021”) for public consultation. Recently, MeitY notified the amendments to Parts I and II of the IT Rules 2021 by introducing the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Amendment Rules, 2022 (“IT Amendment Rules 2022”). The IT Amendment Rules 2022 seek to regulate social media intermediaries by increasing their compliance burden and ensuring that the safe harbours provided to them are not abused. On the whole, the Rules aim to strengthen the protective framework for “netizens’ interests” by prioritising their fundamental rights under Articles 14, 19, and 21 of the Indian Constitution.
[This post has been authored by Sanjana L.B., a 4th year student at Symbiosis Law School, Hyderabad.]
In January 2021, India had the highest number of Facebook users at 320 million. This was followed by the United States of America (“USA”), with 190 million users. As of February 2021, about 53.1% of the population of Myanmar were active social media users. These numbers are not only indicative of internet penetration, but also of the audience for user-generated content on platforms like Facebook. This article focuses, firstly, on the need for content moderation on social media by looking at harmful precedents of inefficient moderation, and secondly, on the Indian Government’s approach to content moderation through the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (“Intermediary Guidelines”) and recent developments surrounding the regulation of social media content in India.
Welcome to our fortnightly newsletter, where our reporters Harsh Jain and Harshita Lilani put together handpicked stories from the world of tech law! You can find other issues here, and you can sign up for future editions of the newsletter here.
Facebook has reached an agreement with the Australian Government and will restore news pages in the country days after restricting them. The decision follows negotiations between the tech giant and the Australian Government, which is set to pass a new media law that will require digital platforms to pay for news. The law, if passed, will make digital platforms pay local media outlets and publishers to link their content in news feeds or search results. Under the amendments, the Australian Government will give digital platforms and news publishers two months to mediate and broker commercial deals before subjecting them to mandatory arbitration under the proposed media law. Both Google and Facebook have fought against the media law since last year. Google previously threatened to remove its search service from Australia in response to the proposed law. But the company has since struck commercial deals with local publishers including the Murdoch family-owned media conglomerate News Corp. Facebook, for its part, followed through with a threat to remove news features from Australia.
[Lian Joseph is a fourth-year law student and contributing editor at robos of Tech Law and Policy (“r-TLP”), a platform for marginalized genders in the technology law and policy field. This essay is part of an ongoing collaboration between r-TLP and the NALSAR Tech Law Forum Blog. Posts in the series may be found here.]
Facebook’s Oversight Board (“OB”) was instituted in response to growing concerns over Facebook’s inadequate content moderation standards. The company has been alleged to have played an important role in several instances of human rights violations, as well as in hate and misinformation campaigns relating to elections and COVID-19, among other issues. The introduction of the OB – dubbed the Facebook Supreme Court – was met with considerable skepticism, with many arguing that it was an attempt to deflect actual accountability. The Board was established as an independent body of up to 40 members, separate from Facebook’s internal content review process, with the power to review decisions made by the company and to suggest changes and recommendations. Notably, the OB reviews cases that are of grave concern and have the potential to guide future decisions and policies. Appeals may be filed by the original poster, by the person who previously submitted the content for review, or by Facebook itself referring matters to the Board.
Welcome to our fortnightly newsletter, where our reporters Kruttika Lokesh and Dhananjay Dhonchak put together handpicked stories from the world of tech law! You can find other issues here.
In an increasingly globalised world, major retail companies like Amazon have reached even the most inaccessible places. Consumers exposed to e-commerce companies can only be protected in the presence of increased accountability. The newly issued E-Commerce Rules set up a Central Consumer Protection Authority to police companies that violate consumer rights. To curb misleading advertisements and unfair trade practices, e-retailers must mandatorily disclose return, refund, warranty, exchange, guarantee, delivery, and grievance redressal details. Henceforth, prices of products cannot be manipulated to produce unreasonable profits for companies. The Rules apply to retailers registered in India as well as those operating from abroad.
The Ministry of Electronics and Information Technology announced in a press release on 29th June that it had invoked its powers under Section 69A of the Information Technology Act to ban 59 Chinese applications. The Indian government cited ‘raging concerns on aspects relating to data security and safeguarding the privacy of 130 crore Indians’ as the reason behind the ban. The move comes after a border skirmish with China resulted in the deaths of 20 Indian soldiers. Regardless of the cybersecurity concerns cited in the press release, speculation remains rife over whether the ban was a retaliatory measure in light of the worsening geopolitical situation between India and China. India is a huge market for Chinese apps, particularly the video-sharing platform TikTok, which had previously been banned in February 2019 for encouraging the spread of pornography and ‘cultural degradation’. That ban was ultimately lifted after assurances from TikTok that it had the tools to censor explicit content. The current ban has been called a purely political decision and has been criticised for its procedural impropriety and its excessive restriction on the dissemination of online content.