The rise of social media platforms has created a new type of litigation, the ‘must-carry’ lawsuit, in which parties demand access to their accounts or reinstatement of their content on a platform. Must-carry litigation reflects the largely binary nature of content moderation: most often, the outcome is that content is either published or removed, and an account either stays open or is blocked. ‘In-between’ options, such as limiting a post’s reach or adding fact-checking labels, are on the rise, but removal and account blocking remain by far the most common measures. As a result, a growing number of cases worldwide have debated whether certain content should be reinstated and accounts reopened. Those cases must weigh the civil-law relationship between platforms and their users against the public responsibility those platforms carry for facilitating democratic debate and moderating illegal content. Lacking a comprehensive framework for weighing those interests, courts usually resort to reasoning by analogy. A popular analogy, discussed briefly in this blog, is the ‘public forum’ or shopping mall analogy. To provide a framework, the European Commission proposed the Digital Services Act (DSA), a legislative initiative to create a safer digital space for users and to protect their fundamental rights. Although the DSA is far from entering into force, the decision discussed in this blog offers a glimpse of what its application could look like.
In the Netherlands, the number of content moderation cases is relatively limited compared to larger jurisdictions. However, when the COVID-19 pandemic triggered a new wave of disinformation, platforms removed content from a number of politicians and opinion makers. Wrongfully, those content creators say. One such decision came on 6 October 2021, when the Noord-Holland District Court handed down a judgment in interim proceedings brought by Dutch MP Wybren van Haga against the Microsoft-owned social media platform LinkedIn. Central to the dispute was the removal of Van Haga’s posts and the subsequent blocking of his access to the platform. Van Haga had actively promoted the use of Ivermectin and cast doubt on the usefulness of vaccines and the accuracy of PCR tests. LinkedIn removed his content and blocked his account because it labelled his posts as disinformation. Van Haga requested access to his profile and reinstatement of his posts, together with a rectification by LinkedIn. The district court granted Van Haga access to his account but denied the reinstatement of his posts and the rectification. In its reasoning, heavily influenced by the proposed Digital Services Act, the decision develops a framework for the role of social media platforms in the public debate that differs from previous Dutch decisions.
Previous decisions: the Shopping Mall Analogy
The decision is noteworthy because it takes a different approach from similar disputes between MPs and social media platforms, such as the Amsterdam District Court’s recent decisions of 15 September 2021 and 18 August 2021. The claimants in those disputes alleged that the platforms acted unlawfully and in conflict with the principles of fairness and equity. To determine whether those principles were indeed infringed through a violation of freedom of expression, the district court drew an analogy with the 2003 European Court of Human Rights (ECtHR) decision in Appleby v United Kingdom.[1] The reasoning in Appleby was that denying access to a private location or platform does not infringe the right to freedom of expression unless “any effective exercise” of that freedom is made impossible or “the essence of the right has been destroyed” (para 47).
In both earlier cases, the Amsterdam District Court decided that the removal of content from a social media platform does not in itself constitute an infringement of freedom of expression, following Appleby. After all, the MPs concerned could still express their opinions on other social media platforms; the effective exercise of their right to freedom of expression was not made impossible, nor was the essence of the right destroyed. That reasoning seems rather blunt. Realistically, it is difficult to express one’s opinion at a similar scale, especially as a politician, without access to social media platforms. By ignoring this difficulty, the Appleby analogy fails to reflect the societal reality of the enormous scale and potential for exposure those platforms offer. Precisely because of that scale and the limited alternatives offering similar exposure, the analogy between the physical reality of Appleby and the virtual reality of access to social media platforms seems misguided.
The Digital Services Act casting its shadow forward
In its decision of 6 October 2021, the Noord-Holland District Court took a different approach. It skips the shopping mall analogy and assesses whether (i) LinkedIn, by carrying out government guidelines, is a state actor, which would justify horizontal application of art. 10 of the European Convention on Human Rights,[2] and (ii) LinkedIn is acting in violation of its terms of service agreement and whether termination of that agreement violates the principles of fairness and equity. The second part of this assessment is interesting, as LinkedIn relies on the Code of Practice on Disinformation and the proposed Digital Services Act.
LinkedIn argues that it is under no must-carry obligation and is answering the call to moderate disinformation that it reads in the proposed Digital Services Act (DSA; paras 2.8 & 4.5). The district court states that the relevance of the DSA extends beyond the ‘must-carry’ question: the DSA’s objectives (art. 1(2)) can be seen as an in abstracto balancing of the freedom of expression against the creation of a safe online environment, while its rules are intended to guide that balance in concreto (para 4.13). The district court refers to the procedural safeguards laid down in the proposed DSA (articles 12-15) as a framework for that balancing act: transparency, due process and a statement of reasons upon removal of content (para 4.14). Those principles are common in administrative proceedings; the court justifies their application as legal standards in private law proceedings against social media platforms by pointing to the responsibility those platforms bear toward the public interest. The district court therefore considers that it should use these principles as a standard for reviewing LinkedIn’s content moderation (para 4.14).
Additionally, the court considers that the terms of service agreement between Van Haga and LinkedIn qualifies as a continuing performance agreement. Under Dutch law, fairness and equity, together with the nature and substance of the agreement and the circumstances of the case, can impose strict requirements on the termination of such an agreement. Because access to social media platforms is essential to the effective exercise of freedom of expression, a strict application of fairness and equity is warranted in terminating the agreement, which can be informed by the norms of the proposed DSA (para 4.15).
In light of the foregoing, the district court finds that LinkedIn acted carelessly in terminating the terms of service agreement. LinkedIn had not formulated a clear policy on disinformation delineating the line between disinformation and policy critique, which made its removal decisions unforeseeable and opaque (para 4.11). Further, LinkedIn did not notify Van Haga of the removal of his content or the blocking of his account, violating due process principles (para 4.21). Upon removal, every user should receive a notification, and the platform should aim to state the reasons for the removal, so that the user may learn from the decision (paras 4.21-4.22).
‘Just visit another platform’ is misguided – under the DSA, Appleby is untenable
In my opinion, the district court has taken a step in the right direction for content moderation disputes such as this one. Content moderation on social media platforms will remain a balancing of interests between the platform’s freedom of enterprise and the user’s freedom of expression. In this decision, despite the brevity of reasoning inherent to interim relief proceedings, the district court uses a better framework for weighing those interests. It relies to a degree on the principles of the DSA, which casts its shadow ahead on this case as a framework for weighing interests in the user-platform relationship. The district court states that the DSA intends to balance freedom of expression against a safe online environment, and it reads in the DSA a justification for applying administrative principles, such as transparency and due process, to private parties. It uses those principles as a framework for reviewing the fairness and equitability of LinkedIn’s content moderation decisions. In this particular case, that balance was struck fairly easily, as LinkedIn had done little to help Van Haga avoid publishing infringing content. The question is whether the balance can be struck just as easily in grey-area cases, especially since the European Parliament’s amendment proposes stricter requirements on terms of service agreements and on platforms’ due process obligations than the original DSA proposal did.
Following the district court’s (indirect) application, the Digital Services Act will likely render the Appleby analogy untenable in content moderation, especially if the final DSA turns out stricter than the original proposal. In the shopping mall analogy, the private forum refusing service carries no obligations toward its users, nor any responsibility for the public interest; under the DSA, platforms must observe a number of strict requirements in their content moderation process precisely to serve the public interest. In the Dutch context, this framework better reflects the near-monopolistic position social media platforms occupy in facilitating democratic debate than previous decisions did. The Appleby logic of ‘just visit an alternative’ should be avoided for three reasons. Firstly, there are not that many platforms available, especially once a user has built a follower base on a specific platform that may be hard to match elsewhere. Secondly, because platforms cooperate by signing hybrid self-regulatory instruments such as the Code of Practice on Disinformation, their content moderation guidelines converge to a degree, making a move to a different platform an ineffective solution. Thirdly, this reasoning incentivizes the creation of platforms where extreme opinions accumulate through the ‘echo chamber’ effect: if posters of a certain type of content are all encouraged to go elsewhere, that content will accumulate elsewhere, where those similar voices may find each other and further accelerate their distancing from society. This shift is already visible on some alt-right social media platforms, such as Gab and Trump’s Truth Social. One might criticize the Millian ‘marketplace of ideas’, but the alternative of platforms solely hosting extremist expressions seems even worse.
Jacob van de Kerkhof. ‘Dutch District Courts on the Shopping Mall Analogy in content moderation and the Proposed Digital Services Act’ (The Digital Constitutionalist, 30 March 2022) <https://digi-con.org/shopping-mall-analogy-content-moderation/>
- [1] Interestingly, in Pruneyard v Robins (447 U.S. 74, 1980), the Supreme Court found the opposite in the U.S. More on the mall analogy here.
- [2] As in almost any jurisdiction, the district court does not allow direct horizontal application (para 4.7).
Jacob van de Kerkhof
Jacob van de Kerkhof is a Ph.D. candidate with the Montaigne Centre at Utrecht University. His research focuses on the protection of freedom of expression on social media platforms.