Once again, courts have addressed questions about the limits of content moderation on social media, particularly decisions to remove content or block the accounts of political actors. The ban of former President Donald Trump was only one of the cases underlining the role of social media in making decisions on political speech online. While US courts have found in the First Amendment a constitutional bastion shielding social media from responsibility or accountability, European courts have scrutinised the role of social media, even extending constitutional obligations to content moderation. Nonetheless, courts seem to disagree on how far this “horizontal” extension of constitutional values to social media can go, and, particularly in light of the recently adopted Digital Services Act, this situation raises questions for the rule of law.
In December 2022, the Court of Rome rejected the claims brought by the neo-fascist movement CasaPound against Meta and, consequently, revoked an earlier preliminary injunction against the social media company. The case is not new: it originates from a dispute dating back to September 2019, when Meta (then Facebook) removed all the pages and individual profiles linked to CasaPound from its platforms (Facebook and Instagram) for violating the company’s community standards on hate speech and incitement to violence.
At that time, the Court of Rome ordered the reinstatement of the pages and their content. The measure rested not on the contractual rights and obligations arising from Meta’s Terms of Service and community guidelines but directly on Art. 49 of the Italian Constitution, which recognises the right to political participation. The systemic role of Meta in the effective enjoyment of that constitutional right led the Court of Rome to reject Meta’s argument that the outright disabling of the pages and accounts linked to CasaPound was a measure aimed at preventing the promotion of a movement pursuing aims contrary to the Constitution. In other words, the Court rejected the role Meta claimed in making these decisions, at least insofar as it translates into the power to effectively exclude a movement from the political life of the country. As a consequence, it ordered Meta to reactivate CasaPound’s page.
Following that decision, CasaPound sought a declaration that the ban of its page was unlawful and that the reasons put forward by Meta to justify it constituted an unlawful act detrimental to CasaPound’s image and reputation. Further, CasaPound asked the Court to find that Meta had deprived the applicants of the content published on the pages and on the profile, as well as of their messages and private conversations.
CasaPound also alleged that Meta had not fully complied with the preliminary injunction, since the page and profile were reinstated only for Italian users rather than globally, and Meta kept removing individual pieces of content and the profiles of users sharing the removed posts. More generally, in the plaintiff’s view, the importance of social media for the effective exercise of fundamental rights required that the contractual clauses be interpreted in light of constitutional principles, protecting the right to express oneself freely so long as the expression did not amount to an offence. In its response, Meta opposed these claims by underlining its right to remove such content from the platform and to designate CasaPound as an organisation that incites hatred.
In its 2022 decision, the Court completely reversed the preliminary injunction issued in 2019 and recognised that, among the limits to the freedom of expression, in the balance with other fundamental rights of the person, respect for human dignity and the prohibition of all discrimination are of particular importance. In this instance, and at least partially reversing the reasoning of the preliminary injunction, the Court emphasised that the case at hand does not necessarily restrict the freedom of expression of individuals or groups, but rather the possibility of access to a specific social media platform managed by a private actor. Although Meta carries out an activity of undoubted social importance, the Court observed that it remains a private actor. Therefore, the relationship between the parties to the dispute is governed not only by the law but also by the contractual conditions to which users adhere when they register with the social network.
The Court observed that the Terms of Service and the Community Standards form an integral part of the contract. In particular, users undertake not to use Facebook for illegal, misleading, malicious or discriminatory purposes and not to publish content or perform actions that infringe the rights of third parties or the laws in force. In turn, Meta has the right to remove such content and to stop providing its services to users who violate these rules. Moreover, in determining whether CasaPound falls within the definition of a hate organisation, the Court of Rome noted that the dissemination of fascist symbolism is prohibited by Facebook’s contractual conditions, making the removal of posts reproducing it lawful. According to the Court, Meta could not only terminate the contract on the basis of the clauses accepted at its conclusion, but also had a legal duty to remove the content once it became aware of it, otherwise risking liability.
This duty would derive from a broad set of sources of both hard and soft law, at both national and international levels. Among them, the Court recalled Art. 14(3) of the E-Commerce Directive, the CJEU’s Glawischnig-Piesczek ruling, and even the recently adopted Digital Services Act. Particular emphasis was placed on the “Code of Conduct on countering illegal hate speech online”, a soft law instrument drafted by the European Commission in 2016 with the collaboration of the leading companies in the field, which many others have progressively joined. This Code requires the rapid assessment of reported content (within 24 hours of notification) and the removal of discriminatory and hateful posts or comments.
On this basis, the Court rejected the plaintiff’s claims and, accordingly, revoked the 2019 preliminary injunction issued against Meta. In doing so, it followed the line of reasoning that had emerged in other decisions, both interim and on the merits, and in particular the 2020 order that, in a precautionary proceeding parallel to the one instituted by CasaPound, rejected the appeal filed by another neo-fascist movement (‘Forza Nuova’) against the removal of its pages and content.
The Constitutional Limit of Content Moderation
This case provides another example of how courts deal with the increasing freedom of social media to make decisions on freedom of expression, and, in this case, political speech. It underlines that neither international nor European law imposes positive obligations on States to generally deny organisations that might incite hatred access to digital platforms. Rather, they establish obligations to prosecute, remove, and prevent acts of incitement to hatred in individual cases. Denial of access tout court is probably permitted, within certain limits, but it is not an obligation, as also underlined by the UN Human Rights Committee in its 2011 General Comment No. 34 on Article 19 of the ICCPR (freedoms of opinion and expression).
Secondly, the E-Commerce Directive, the Glawischnig-Piesczek ruling, and the 2016 ‘Code of Conduct’ do not give rise to direct obligations on private parties. Rather, they allow and/or empower States to require private parties such as Meta to put an end to violations or to prevent them in the future. Unlike Germany, and before the entry into force of the DSA, Italy did not adopt any national legislation going in that direction. Further, Art. 14(3) of the E-Commerce Directive itself ‘shall not affect the possibility for a court or administrative authority, in accordance with Member States’ legal systems, of requiring the service provider to terminate or prevent an infringement, nor does it affect the possibility for Member States of establishing procedures governing the removal or disabling of access to information’ (emphasis added). Similarly, according to the Glawischnig-Piesczek ruling, ‘it is apparent from Article 14(3) […] that that exemption is without prejudice to the power of the national courts or administrative authorities to require the host provider concerned to terminate or prevent an infringement, including by removing the illegal information or by disabling access to it’ (emphasis added).
Framing Meta’s conduct in terms of obligation rather than freedom probably responds to the need to pre-empt objections about the discretion that the private actor would otherwise hold in such a delicate matter. Still, this approach raises questions for the rule of law, considering the potential expansion of case law imposing constitutional obligations onto social media. That is why this decision highlights once more the constitutional significance and systemic role of the recently approved DSA. In this case, the DSA would provide a way to limit reliance on unforeseeable, judge-made horizontal effects to address social media discretion in content moderation, while defining the boundaries of social media freedoms. In particular, Art. 14 requires online platforms to consider not only the protection of freedom of expression but also other fundamental rights when enforcing their terms of service. This approach entrusts social media with the role of balancing constitutional rights while allowing public actors to scrutinise their powers. The Digital Services Act thus not only contributes to the consolidation of social media powers over online content but also defines the limits of their freedom in content moderation.