1. Introduction
In 2014, Brazil enacted the Civil Rights Framework for the Internet (Federal Law 12.965/14, the "Internet Bill of Rights" or "IBR"), which safeguards internet users' fundamental rights. Article 19 of the IBR protects freedom of expression online by enshrining a general safe harbour rule: intermediaries are exempt from liability for third-party content unless they fail to remove the content after receiving a judicial order.
At the time, the judiciary welcomed the adoption of a general safe harbour rule that would harmonise judicial approaches to intermediaries' liability. Now, almost a decade later, Brazil's safe harbour regime is no longer praised but criticised.
In this work, we map the potential pathways for changing the regime of intermediaries' liability in Brazil. We begin with a brief overview of the IBR and its rule on intermediaries' liability, followed by an assessment of its reception in Brazilian society at the time of its enactment. We then show how the Brazilian political and social landscape has changed over the last decade, leading to claims that article 19 IBR is no longer suitable to respond to present-day threats to democracy and fundamental rights. We conclude that the regime on intermediaries' liability in Brazil is likely to change soon and outline two possible pathways for this change.
2. Article 19 of the IBR: from good guy to villain in one decade
2.1. Article 19: safe harbour for intermediaries
As previously mentioned, the IBR aims to safeguard fundamental rights online. Article 19 expressly recognises freedom of expression as a principle of internet usage in Brazil and enshrines a safe harbour rule for intermediaries: application providers shall not be liable for damages caused by third-party content unless they refrain from taking down infringing material after receiving a specific court order identifying the content's URL. Article 21 introduces a single exception to this rule, stating that intermediaries may be held liable if they fail to remove unauthorised nude or pornographic material upon request from the victim or their authorised representative.
The safe harbour rule was well received by the Brazilian courts. Prior to the IBR, the lack of rules on intermediary liability resulted in rulings grounded in consumer law and general civil liability, under which intermediaries could be held strictly liable for third-party content. Moreover, the lack of harmonisation amongst judicial rulings created legal uncertainty for companies. Article 19 thus represented a step towards harmonisation and greater legal certainty for companies and users.
2.2. Article 19’s loss of popularity
Although the safe harbour rule was initially well received in society, this scenario swiftly changed due to a series of political and social events. First, Brazil had two presidential elections in which the "fight against fake news" generated considerable tension between platforms and the courts. Online misinformation arguably played a crucial role in Bolsonaro's election, leading to claims that platforms were relying on article 19 to avoid liability for misinformation.
After Bolsonaro's loss in the subsequent election in 2022, his supporters invaded the Congress and Supreme Court buildings, claiming electoral fraud and calling for military intervention. They relied on right-wing fake news widely shared on Telegram, WhatsApp, Facebook, and Twitter. Soon after, Supreme Court Ministers and government officials stated at a public event that the safe harbour rules were insufficient to protect democracy and fundamental rights. Although these statements are not legally binding, and the Ministers could rule differently when adjudicating cases in the Supreme Court, such statements often anticipate the Ministers' positions.
Lastly, pressure to alter article 19 mounted following a surge in school attacks in the country at the beginning of April 2023. Again, the safe harbour rule came into the spotlight as government officials stated that social media platforms should do more to prevent the spread of hate speech online, allegedly linked to the school attacks. In response, the Ministry of Justice and Public Security issued an administrative ordinance establishing a duty of care for social media platforms: they must monitor and prevent systemic risks associated with the protection of children online, as well as the dissemination of violent content inciting school attacks. Failure to do so may result in penalties, such as fines and the suspension of activities. In this context, the protection of children became yet another rallying cry against the safe harbour regime.
3. ISPs' liability for third-party content – a legislative matter?
These political and social changes triggered legislative responses. In 2020, Senator Alessandro Vieira presented Bill 2.630/2020 (known as the "Fake News Law") to curb certain online behaviours, such as inauthentic accounts, disinformation bots, and bulk messaging. After the Senate's approval, the Bill went to the House of Representatives, with congressman Orlando Silva as rapporteur. In 2022, following a series of public consultations, Silva presented an amended text, which introduced, among other things, broader transparency obligations regarding the standards applicable to content moderation and references to regulated self-regulation.
By 2022, the Bill had already been expanded beyond its original scope, but new rules on intermediary liability were introduced only in 2023, when the government presented DSA-inspired amendments. In light of these amendments, Silva presented a new version of the text, incorporating some of the suggestions. The current version thus prescribes, among other things, new obligations for intermediaries regarding reports on systemic risks and algorithmic transparency.
If approved, the Fake News Law would not repeal article 19 of the IBR outright but would introduce new exceptions to the general safe harbour rule. Bill 2.630/2020 provides for joint liability of platforms when they (i) promote third-party content; and/or (ii) violate the duty of care during the term of a security protocol, provided there is evidence of actual knowledge of the infringing content. A simple notification about infringing content suffices to establish the intermediary's actual knowledge. Furthermore, the bill defines the security protocol as a special regime that may be triggered in case of an imminent systemic risk; however, the exact scope of what constitutes a security protocol is still pending regulation.
The Bill also introduces a duty of care regime, under which platforms must prevent the dissemination of criminal content. Additionally, it establishes an obligation to notify the competent authorities upon acquiring knowledge of any suspected criminal content that poses a threat to life. Lastly, platforms must monitor, report, and mitigate systemic risks on their services. Violations of these obligations may lead to administrative sanctions, ranging from fines to the temporary suspension of activities.
There are some issues with these provisions. At first sight, the proposed regime appears to mirror the DSA's (including the systemic risks, crisis protocol, and other due diligence obligations). However, Bill 2.630/2020 is worded more broadly, leaving some gaps. For example, the DSA clearly indicates when a user notification may trigger liability and delimits the instances in which the crisis protocol may be triggered; this is not the case in the Fake News Law. Additionally, the bill provides for the creation of a regulatory authority, but there is no consensus on which administrative body should be entrusted with this function. There are ongoing debates about whether these functions should be assigned to existing bodies, such as the Internet Steering Committee in Brazil (CGI) or the National Telecommunications Agency (ANATEL), or whether a new regulatory body should be established.
The Bill is currently awaiting a vote by the Plenary of the House of Representatives. It was expected to be voted on May 2nd, but lobbying by tech companies and right-wing congressmen led to the postponement of the vote. Silva requested a deferral, citing the need for further debate on certain issues, such as the absence of a regulatory authority.
4. ISPs' liability – a question for the Supreme Court?
Alongside the legislative debates on intermediaries' liability, the Supreme Court is about to hear two cases on the constitutionality of article 19 of the IBR: Extraordinary Appeals n. 1.057.258 and n. 1.037.396 (referred to here as "Aliandra" and "Lourdes", respectively, for clarity).
Aliandra dates back to 2010, when students allegedly created a community on Orkut, a social media website, and shared a photo of their teacher, Aliandra, along with offensive comments. Aliandra claimed that Google, Orkut's owner, was liable for damages to her reputation. The case predates the IBR, and the lower courts held Google liable based on the Brazilian Civil Code.
Lourdes, on the other hand, was brought before the courts after the IBR came into effect. The claimant argued that a third party had created a Facebook profile using her name and photo and was using it to insult her family. She therefore claimed that Facebook was liable and sought moral damages. The lower courts performed a diffuse control of constitutionality and held Facebook liable, finding article 19 of the IBR unconstitutional. According to the lower courts, requiring the user to obtain a court order to take down third-party content infringed constitutional rights, including consumer rights, which entail the platform's strict liability, contrary to the wording of article 19 IBR.
Facebook and Google appealed the cases on constitutional grounds, and both are now pending before the Supreme Court, which will rule on (i) the constitutionality of article 19 of the IBR; and (ii) whether intermediaries are under an obligation to monitor and take down offensive content created by third parties, even without a court order. The outcome of these cases will bind all courts in Brazil.
Considering the sensitive nature of these cases, the Supreme Court conducted public hearings with civil society, companies, and other interested parties before assessing the merits. These hearings took place only in March 2023 – after two presidential elections, the anti-democratic protests, and the school attacks. The Supreme Court heard experts from civil society, tech companies, and academia. Big tech companies argued that striking down the safe harbour rule would lead to censorship and chilling effects, as the mere risk of facing liability for third-party content would create incentives to remove more content; they also argued that a declaration of unconstitutionality of article 19 would, ultimately, create an obligation to monitor all third-party content. Some organisations, such as the Brazilian Bar Association and Article 19, likewise argued for the constitutionality of article 19, contending that removing the safe harbour rule would only give platforms more power, as they would ultimately become the moderators and censors of the public debate. On the other side, other civil society organisations argued that article 19 IBR treats freedom of expression as an absolute right, rendering it unconstitutional.
It is, of course, impossible to predict the Supreme Court's ruling in these cases. If it follows the specialists heard in the public hearings, it is unlikely to declare article 19 IBR unconstitutional: of the 47 experts heard, 22 considered the article constitutional, 17 considered the safe harbour constitutional subject to the creation of a 'duty of care' for platforms, and only 8 deemed it unconstitutional. Nonetheless, some Supreme Court Ministers have already suggested, at an academic event, that the regulatory framework for platforms' liability in Brazil should change.
5. Conclusion
Almost a decade after its enactment, the safe harbour rule established in article 19 IBR is under threat in Brazil. Although initially celebrated, it is now accused of being outdated and insufficient to respond to new threats to democracy and fundamental rights.
Given the rapidly changing online environment and the latest political and social events, it seems indisputable that Brazil will soon change the safe harbour regime enshrined in the IBR. It is unclear, however, whether the change will come from the legislature or the judiciary – and what the new liability scheme will entail.
Acknowledgments
With thanks to the anonymous reviewer whose suggestions helped improve and clarify this piece. I also thank Paula Pedigoni Ponce for her useful contributions and comments.

Juliana da Cunha Mota
Juliana is a Ph.D. candidate at the University of Oxford, where she is currently researching the enforcement of data protection and privacy rights by the European courts in the age of surveillance capitalism. She holds an LL.M. degree from the University of Cambridge and is a qualified lawyer in Brazil.