This post is a contribution to the symposium What is Digital Constitutionalism? and not an official editorial position. We nevertheless welcome the author’s contribution and encourage further posts probing the meaning of digital constitutionalism and its limits as an analytical approach.
“Digital constitutionalism” might sound strange to most people who do not deal with academia daily. Many would probably struggle to see the connection between the digital world and the old-fashioned debate about the powers vested in public institutions. This perception is mistaken. Talking about constitutionalism boils down to talking about power relations, an eternal theme that will never go out of fashion and that remains crucial both in the world of atoms and in the world of bits.
Classical constitutionalist theories address power relations by discussing the allocation of powers among the different institutions of democratic States, offering checks and balances wherever power is vested in a specific institution. This approach proceeds from the assumption that democratic institutions hold the most significant power over the regulation of citizens, being able to shape our social contract.
Over time, the rise of big corporations soon became an issue at the constitutional level. Whether it came to guaranteeing freedom of enterprise, safeguarding workers’ rights or achieving a more sustainable development model, the question has always been about striking a balance between competing interests. Such a balance has been pursued by shaping the powers of regulation and intervention vested in democratic institutions. In this view, corporations were seen either as subjects of constitutional rights (e.g., under Article 41 of the Italian Constitution, freedom of enterprise amounts to a constitutional right of individuals) or as subjects of rules designed to limit the potentially harmful impact of industrial development on society and the environment.
This classical approach is no longer exhaustive and cannot explain the role of digital constitutionalism. A mere contrast between corporations and civil rights (for example) would fail to grasp a crucial point: nowadays, many digital enterprises are essential for individuals and smaller undertakings to exercise their civil rights and liberties. In 2017, Gillespie observed that digital platforms are “architects of public spaces”: that is a fact, and the way such spaces are engineered is not neutral to us. Transnational corporations have increasingly been performing quasi-public functions, sometimes competing with public authorities in doing so. The impact of this development goes far beyond purely economic dynamics and has led to several disruptive consequences.
Provocatively, one could say that one of the most important consequences of this new scenario is that the “rule of Terms and Conditions” has replaced the rule of law. Consider social media platforms and content moderation. To access social platforms, users agree to a set of rules imposed by the provider. The terms and conditions amount to a private contract that sets the rules for using the service and establishes the unilateral power of the platform to terminate the service, fully or partly, in case of a user’s breach (or even at its sole discretion). From a purely legal standpoint, this can make perfect sense, but it can undeniably lead to questionable consequences: de-platforming and private censorship are just two examples.
As to de-platforming, consider former US President Donald Trump’s removal from social media after the Capitol Hill riots: a decision of enormous impact, materially shaping public debate, made essentially by private entities on the basis of unilaterally set standards and with no opportunity for the silenced party to appeal. Whether right or wrong, the decision is a blatant example of the “rule of Terms and Conditions” replacing the rule of law, one of whose typical functions is precisely to limit discretionary powers affecting individuals and to shield them from unilateral, unquestionable decisions.
Besides potentially affecting individuals’ rights and freedoms, social platforms’ decisions can also have material consequences from a diplomatic perspective. A good example is YouTube’s decision to remove two German-language channels run by RT, the Kremlin-funded television station, for alleged breaches of the platform’s policies on misinformation about the coronavirus. Since Moscow suspected that the German Government had pushed for the decision, the spokeswoman for the Russian Foreign Ministry publicly threatened to block German media in Russia in retaliation.
While it is unquestionable that digital platforms in general (and social platforms specifically) have brought services that were unimaginable a few decades ago, fostering development and connection everywhere, the two examples above clearly show the magnitude of the issues they can raise. After a long period of tolerance in the name of a liberal economic approach, many States are increasingly trying to design new governance rules for digital platforms, taking local peculiarities and macro-economic dynamics into account. Whereas the US focuses on competition-related matters, China is trying to maintain control over its tech giants. Through its proposal for a Digital Services Act, Europe aims to set rules that, building on the existing liability system, incorporate procedural fairness in terms of notice-and-takedown mechanisms and the accountability and transparency of platforms vis-à-vis their clients/users. For some, however, this might not be enough. By way of example, both in the US and in Europe, content moderation has always been a hot topic regulators have struggled to cope with. From the exclusion of liability set forth under US legislation to the accountability system designed under the EU proposals, regulators have always wrestled with the same dilemma: how far can we go in imposing content-monitoring obligations on platforms? This question has much to do with digital constitutionalism.
The “onlife” dimension (Floridi 2014) has probably made us more familiar with the fact that disruptive news can come from anywhere. When it comes to hosting providers’ liability, a recent disruptive initiative came not from a Parliament, scholars or advocates, but from a credit-card firm. Reportedly, under an update to the requirements Mastercard sets for banks that process payments for sellers of adult content, as of October 2021 adult websites wishing to have the firm process payments on their behalf will have to review all content before publication. Of course, platforms remain free to decide not to work with Mastercard. Still, it is reasonable to believe that this move will significantly impact the industry, considering that the credit-card giant is estimated to handle about 30% of all card payments made outside China. The goal (hardly questionable) is clear: fighting child pornography and the publication of non-consensual content. Ironically, platforms were this time hit by the same “rule of Terms and Conditions” they apply to their users: a private, unilaterally established set of commercial rules, materially affecting their business and imposed on a “take it or leave it” basis. That is probably a muscle-flexing exercise by the credit-card firm, but a very significant one.
Since the entry into force of Section 230 of the US Communications Decency Act, ex-ante review of content before publication on platforms has been taboo in the digital world. Scholars and case law have often discussed the role of hosting providers and their accountability with respect to content posted by third parties. Still, both in the European Union and in the US, it has always been crystal clear that – except where specific circumstances occur – hosting providers are not required to review content systematically before publication. Now, this principle has been boldly challenged by a credit-card firm.
Somehow, this episode can be regarded as a (digital) constitutional exercise for two reasons. First, it is a clear example of power relations, as referred to at the beginning of this contribution. Second, it should not be forgotten that constitutionalism is not only about designing the powers vested in public institutions: it is also about setting a common set of values we all share and base our social contract on. When a Constitution says that all individuals are naturally free and have the inviolable right to free speech, freedom of movement, freedom of religion and so on, it lays down the fundamental values a given society is based upon. Similarly, when a financial institution states that it will not process payments related to content that violates certain principles, it is saying that even its ultimate goal – i.e. making profits – rests on some core principles which cannot be negotiated. Those principles are unquestionably worth sharing in the case at hand, but what if the firm’s goals clashed with constitutional values? It must be remarked that the “rule of Terms and Conditions” reflects unilateral decisions made by private entities, whereas a constitutional exercise should consist of representatives of various interests working together to find common solutions on which democratic institutions can base their functioning. Agreeing on common values and engineering the tools to protect them is another side of digital constitutionalism, and this is why it is so important.
The risk of an unregulated approach cannot be ignored. Whether because of a certain reluctance to interfere with the internal mechanics of important businesses, or because the public law-making exercise is inherently time-consuming, the task of protecting core democratic values and civil liberties could end up being delegated to private firms. This cannot be accepted. Whether the concern is about de-platforming or data processing, about the visibility and ranking of content or platform accountability, little changes: the governance of online platforms, as well as the core values they must preserve, must be publicly guided (better still, at a supranational, global level), taking into account the principles of transparency vis-à-vis users, predictability of decisions, accountability and commonly shared values. It is high time to call for a new platform governance, one inspired by digital constitutionalism.
We can conclude that digital constitutionalism is both a model and a methodology that platform governance should draw inspiration from: a model based on the rule of law rather than on the “rule of Terms and Conditions”.
Mariavittoria La Rosa, ‘What is digital constitutionalism? A practical view based on an “adult-only” precedent’ (The Digital Constitutionalist, 28 February 2022) <https://digi-con.org/what-is-digital-constitutionalism-a-practical-view-based-on-an-adult-only-precedent/>
Maria Vittoria La Rosa