
Technical Note: The ‘chilling effect’ under debate: content moderation on digital platforms

  • erickdau
  • Jun 24


In June 2025, the Supreme Federal Court (STF) resumed its hearing on Article 19 of the Brazilian Internet Civil Rights Framework, which had been interrupted in December 2024 by a request for review from Justice André Mendonça. Article 19 is one of the most debated provisions of Law No. 12,965/2014, and the dispute revolves around the liability of digital platforms for content published by third parties. The current text stipulates that internet application providers can only be held civilly liable for damages arising from third-party content if, following a specific court order, they fail to take steps to remove the content identified as infringing (Brasil, 2014). In other words, platforms today face no liability for harm caused by third-party content unless they ignore a specific court order to take it down.


Current legislation gives platforms the freedom to decide, based on their own criteria—which are largely opaque and often arbitrary—which posts should be removed and which should remain online, without incurring legal liability, even when these decisions result in harm to users.


Although the current stage of the trial points toward a finding that Article 19 is unconstitutional, there is significant controversy among the Supreme Federal Court justices regarding crimes against honor committed on social media platforms. Some justices believe these cases should remain covered by the virtual immunity under which the platforms currently operate. This view is defended by the companies themselves (Schreiber, 2025), who cite the lack of technical resources to evaluate controversial content as it is posted, along with concerns about the so-called chilling effect, the alleged tendency of platforms to over-remove content in order to avoid legal liability, as arguments for keeping Article 19 in its current form.


The platforms' claims, however, find no support in the survey conducted by NetLab UFRJ. The laboratory consolidated data on content moderation in Europe, where Germany implemented a content-removal notification system in 2018 and the European Union subsequently regulated platforms in 2022. As the data presented throughout this technical note demonstrates, the daily moderation carried out by these same companies involves enormous volumes of posts.


At the same time, the data categorically demonstrates that the accountability initiatives implemented in Europe have had no inhibiting effect on the volume of posts or on moderation on digital platforms, refuting concerns that such rules could lead to indiscriminate censorship of users. While it is difficult to assess qualitatively the impact of changes in the platform liability regime on freedom of expression, quantitative analyses indicate neither excessive moderation nor significant threats to the exercise of this right.


Given the problems Brazil has been facing due to the lack of regulation and transparency on digital platforms, NetLab UFRJ consolidated data and information based on reports available in Europe to inform the debate on this topic. The need to use data from other countries to conduct this study highlights another problem with the platforms' operations in Brazil. Technology companies operating in the country do not offer—and are not required to offer—any transparency tools for their moderation practices. Decisions to remove content, based on arbitrary criteria adopted by the platforms themselves, cannot be adequately scrutinized by academics, regulatory agencies, or society at large.


NetLab UFRJ's contributions are supported by the most current data available on content moderation on social media platforms worldwide. The objective of this technical note is to offer information based on consolidated data provided by the digital platforms themselves in Europe, contributing to a more informed and evidence-driven public debate in Brazil.


Read the Technical Note



DATA ACCESS

If you would like to access the database for this report, please send an email to netlab@eco.ufrj.br identifying yourself and explaining why you are interested in the data and how it will be used. NetLab will evaluate your request and get in touch.


DISCLAIMER

This report is an independent production of NetLab UFRJ. All decisions regarding this work were made exclusively by the researchers of the laboratory. The funders of NetLab UFRJ have no influence on the laboratory's research agenda and did not participate in any stage of the production of this report.


Information on NetLab UFRJ's funding sources is available here.


