
The Hindu Editorial Analysis for UPSC: Content Moderation Through Co-regulation | Social Media Content Moderation Laws

About Content Moderation


  • Content moderation simply refers to the practice of analyzing user-generated submissions, such as reviews, videos, social media posts, or forum discussions, and applying a platform’s rules to them.
  • Social media content moderation is the process of moderating content on social media platforms such as Facebook, Twitter, Instagram, LinkedIn, or Tumblr.


Who is a Social Media Intermediary?


  • A social media intermediary is an intermediary which primarily or solely enables interaction between two or more users and allows them to create, upload, share, disseminate, modify or access information using its services.
  • Read broadly, this definition may cover any intermediary that enables interaction among its users.
  • This could include email service providers, e-commerce platforms, video conferencing platforms, and internet telephony service providers.


What is modern intermediary law?


  • Under a modern intermediary law, government orders to remove content must be necessary and proportionate, and must also comply with due process.
  • For example, the recent European Union (EU) Digital Services Act (DSA) requires government take-down orders to be proportionate and reasoned.
  • A modern intermediary law must also devolve social media content moderation decisions to the platform level.


Why does Social Media need a modern intermediary law to re-imagine the role of governments?


  • Social media now has millions of users, and the existing model of direct government control over online speech is unsustainable at that scale.
  • With the increasing reach of the Internet, its potential harms have also increased.
  • There is more illegal and harmful content online today, for example, disinformation campaigns on social media during COVID-19 and hate speech against the Rohingya in Myanmar.
  • Hate speech, online trolling, violent content, and sexually explicit content are other major challenges.


What is the EU’s Digital Services Act (DSA)?


  • The Digital Services Act (DSA) regulates the obligations of digital services that act as intermediaries in their role of connecting consumers with goods, services, and content. This includes online marketplaces amongst others.
  • It will give better protection to users and to fundamental rights online, establish a powerful transparency and accountability framework for online platforms and provide a single, uniform framework across the EU.
  • The Digital Services Act is a Regulation that will be directly applicable across the EU.
  • Some of the obligations include:
    • Measures to counter illegal content online, including illegal goods and services
    • Obligations for very large online platforms and search engines
    • Bans on targeted advertising on online platforms when it is based on sensitive personal data or aimed at minors
    • New obligations for the protection of minors on any platform, etc.


What has changed with the IT (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021?


  • Until the government introduced the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, it was voluntary for digital media platforms to establish a grievance redressal mechanism through their terms of service.
  • These Rules mandate platforms to establish a grievance redressal mechanism to resolve user complaints within fixed timelines.
  • Recently, the government amended these Rules and established Grievance Appellate Committees (GACs) comprising government appointees.
  • What will GACs do?

    • GACs will now sit in appeal over the platforms’ grievance redressal decisions.
    • This signifies the government’s tightening control of online speech, much like Section 69A of the IT Act.
    • The IT Act was passed in 2000, and Section 69A was introduced in 2008, when social media barely existed.


What kind of intermediary law does India Need?


  • An intermediary law must devolve crucial social media content moderation decisions to the platform level. Platforms must have the responsibility to regulate content under broad government guidelines.
  • Instituting such a co-regulatory framework will serve three functions.
    • First, platforms will retain reasonable autonomy over their terms of service.
    • Second, co-regulation aligns government and platform interests.
    • Third, instituting co-regulatory mechanisms allows the state to outsource content regulation to platforms, which are better equipped to tackle modern content moderation challenges.
  • The modality of a co-regulatory model for content moderation must be carefully worked out. It is important that co-regulation, while maintaining platform autonomy, also holds platforms accountable for their content moderation decisions.
  • With higher stakes for free speech and increasing online risks, a modern intermediary law must re-imagine the role of governments.




