The DSA proposal and its deontic status

Reading Time: 7 minutes

1. Introduction

One of the key objectives of the DSA proposal, as of the e-Commerce Directive (ECD) before it, is to tackle “illegal content” online (the ECD refers only to “illegal activity”). Unlike the ECD, where the notion of “illegal activity” was taken as a given, the DSA proposal provides a definition of illegal content under article 2(g). It is a very comprehensive definition that covers not only information that is illegal in itself but also illegal actions regarding certain (legal or illegal) information (see Recital 12). In this respect, although the expression “illegal content” has triumphed, the ECD made the better choice: information in itself is never illegal. What a sentence of this kind implicitly states is that certain human actions regarding information violate norms of prohibition or obligation (the information should not have been produced, should not have been disseminated, should not be kept, had to be complete, had to explicitly mention the author, etc.). This perhaps explains why the proposal still mentions “illegal activity” under articles 5 and 7.

Illegal content for the purposes of the DSA proposal is therefore: i) information whose dissemination is prohibited; or ii) information disseminated in breach of content obligations (for instance, a publication on a matter where a conflict-of-interest disclosure obligation exists: such a publication would be illegal if the mandatory information was not provided). Illegality in this domain derives, as expected, from the violation of norms of prohibition and of obligation. These norms are not laid down in the DSA proposal, which merely refers to them: they are found in national law and other Union law (see Recital 12 in fine). Many norms exist that forbid or mandate certain actions online. Addressing the violation of such norms, that is, such illegal content, is one of the pillars of EU legislation concerning digital services. For instance, hosting service providers are not liable for illegal content stored or posted by the recipients of their services as long as they do not know of the illegality or, upon obtaining such knowledge, remove the content (see article 5).
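This two-fold structure can be rendered in standard deontic notation. The following sketch is my own formalization, not the proposal’s wording; F stands for the prohibition operator, O for the obligation operator, and disseminate and disclose are hypothetical action predicates introduced only for illustration:

```latex
% A sketch in standard deontic notation (my formalization, not the proposal's):
% F = prohibition, O = obligation; disseminate(x) and disclose(x) are
% hypothetical action predicates over an item of information x.
\begin{align*}
  \text{illegal}_{i}(x)  &\iff F(\mathrm{disseminate}(x)) \;\land\; \mathrm{disseminate}(x)\\
  \text{illegal}_{ii}(x) &\iff O(\mathrm{disclose}(x)) \;\land\; \mathrm{disseminate}(x) \;\land\; \lnot\mathrm{disclose}(x)
\end{align*}
```

In both cases illegality attaches to an action regarding the information, never to the information alone, which is precisely the point made above.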

This approach leaves out one important kind of norm: permissions. Indeed, most fundamental freedoms, such as the freedom of expression, are built on norms of permission, and some of the most important conflicts arising on online platforms stem from the clash between actions covered by permissions and actions foreseen in other norms, with no applicable obligation or prohibition serving as an exception to prevent such conflicts. Given the assumed goal of the DSA to foster the exercise and protection of fundamental rights under the Charter of Fundamental Rights of the European Union, it is to be expected that this kind of conflict between fundamental rights is addressed in the DSA proposal. The issue is diagnosed: the Explanatory Memorandum to the DSA proposal states that “[t]here is a general agreement among stakeholders that ‘harmful’ (yet not, or at least not necessarily, illegal) content should not be defined in the Digital Services Act and should not be subject to removal obligations, as this is a delicate area with severe implications for the protection of freedom of expression”. Thus, beyond illegal content, derived from the violation of norms of prohibition and obligation, another category is considered, that of “harmful content”. This category does not necessarily collapse into illegality, because one may cause harm while acting under a permission without infringing any norm of prohibition or obligation. The fact that the DSA proposal does not define “harmful content” and does not subject it to removal obligations does not mean that the proposal is silent on this matter. One should inquire into the status of all content under the DSA proposal, not only of illegal (forbidden) content. Indeed, as we have seen, it is of the utmost importance to discern what happens under the DSA proposal when lawful actions regarding content conflict with other permitted actions. Such conflicts are as important as actions concerning illegal content, because both involve the adequate exercise of fundamental rights.
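The structure of such a conflict can also be sketched in deontic terms (again my own notation, with a hypothetical predicate for factual incompatibility): two users each act under a permission, the actions collide in the circumstances, and no prohibition or obligation provides an exception that would resolve the clash:

```latex
% Sketch of a conflict of permissions (my notation, not the proposal's):
% a_u, b_v are actions by users u and v, each covered by a permission P;
% incompatible(a_u, b_v) is a hypothetical predicate for factual incompatibility.
P(a_u) \;\land\; P(b_v) \;\land\; \mathrm{incompatible}(a_u, b_v)
% and no norm of prohibition or obligation applies as an exception to either action
```

Because neither action is forbidden or mandatory, no subsumption can settle the conflict; only a balancing of the underlying permissions, and of the fundamental rights behind them, can.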

In the following sections, I will look at two distinct but connected issues dealing with content: i) illegal content and the task of interpretation and ii) conflicts of lawful actions and balancing.

2. Illegal content and interpretation

Finding illegal content demands interpretation. It demands that we determine whether a certain action is subsumed under a norm. Interpretation deals with natural languages and legal concepts. The possibility of vagueness, ambiguity and misunderstanding when deriving a norm varies with the norm sentences and the fields of law within each legal system. This trivial remark is essential regarding the DSA proposal, because providers of intermediary hosting services are, under specific conditions, called to determine when actions taken by their users are illegal in order to avoid liability for such actions. This is the case with the “notice and action mechanisms” of article 14, where online platforms must assess and decide on notices submitted by users who consider certain content illegal. Article 14 requires notices to meet certain conditions, such as “an explanation of the reasons why the individual or entity considers the information in question to be illegal content” (point (a)), and where these conditions are met, article 14(3) prescribes that “[n]otices that include the elements referred to in paragraph 2 shall be considered to give rise to actual knowledge or awareness for the purposes of Article 5 in respect of the specific item of information concerned.” This, in turn, seems to mean that, from that moment on, either the online platform removes the content to exclude liability or it decides that the content, although argued to be illegal, is not (article 14(5)). If the platforms decide in favour of the notifiers and remove the content, the affected recipients of the service can submit a complaint under article 17 of the proposal. Whichever way the platforms decide, this mechanism will commonly give rise to further discussion and litigation on the adequate legal interpretation in each specific case, be it before courts, out-of-court bodies or administrative authorities. This means that online platforms will play an even more important part in interpreting legal norms that regulate the exercise of fundamental rights. Freedom for their users will be one of the first arguments for or against the (il)legality of content (see also the decisions on “manifestly illegal content” of article 20). This is similar to what already happens in Germany under the NetzDG. In my view, such interpretation, coming from the first of several instances where the conflict may be addressed, should be welcomed, as it enriches the legal reasoning and helps the task of other adjudication bodies, such as out-of-court dispute settlement bodies and courts.
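The conditional structure just described can be made explicit. The following is only my schematic rendering of the interplay between articles 5 and 14, not the proposal’s text:

```latex
% Schematic rendering (mine) of the art. 5 / art. 14 interplay:
% N(c) = a notice on content c meeting the art. 14(2) conditions,
% K(c) = actual knowledge or awareness of illegality,
% R(c) = expeditious removal or disabling of access.
N(c) \rightarrow K(c)                                            % art. 14(3)
\bigl(\lnot K(c) \,\lor\, R(c)\bigr) \rightarrow \lnot\mathrm{Liable}(c)  % art. 5, schematically
```

Once a compliant notice arrives, the first disjunct is no longer available, so the platform must either remove the content or take a reasoned stand on its legality, with its liability exposure riding on that interpretive decision.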

3. Conflicts of lawful actions and balancing

Even if content is not illegal, it can raise problems when it conflicts with the application of other norms, calling for balancing operations to decide which norm prevails. Online platforms have an interest in preventing and solving these conflicts. The rules and methods to prevent and solve them are usually laid down in the general contractual clauses that platforms present to their users for acceptance. It is thus imperative that users understand which fundamental rights and freedoms may be affected. In effect, by accepting the terms and conditions of an online platform (the proposal defines “terms and conditions” under article 2(q)), users are allowing their rights and freedoms to be restricted under such rules. It is a case of restriction of fundamental rights by authorisation of the right-holder, the only kind that a strong permission, in Bulygin’s sense, allows. For this reason, article 12(1) of the DSA proposal addresses this subject, establishing a duty for online platforms to inform users “on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions”. The goal is to provide the user with information that i) lets her make an informed decision on whether to use the online platform and ii) allows her to know the conditions that will apply to any restriction of her freedoms and rights regarding “information provided”.
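The distinction invoked here goes back to Alchourrón and Bulygin’s standard account, which can be sketched as follows (my schematic summary, not the proposal’s text): a weak permission is the mere absence of a prohibition in the normative system, while a strong permission is itself the content of a permissive norm of the system:

```latex
% Weak vs. strong permission (after Alchourrón & Bulygin; my schematic summary):
% A = the normative system, p = an action, \vdash = "is a consequence of".
P_w(p) \iff A \nvdash F(p)    % weak: the system simply does not prohibit p
P_s(p) \iff A \vdash P(p)     % strong: the system explicitly permits p
```

Only a strong permission positively entitles its holder, and hence only the holder’s own authorisation can legitimise a contractual restriction of the permitted conduct.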

Article 12(2) complements the duty of information with balancing instructions. Since “restrictions” in the sense of the proposal are exceptions to permissions, exceptions which the user has authorised the platforms to apply (in the form of prohibitions or obligations for specific cases), article 12(2) specifies their method and scope. First, it states how the application and enforcement of such restrictions must be carried out, in practice submitting them to a prior balancing operation under the principle of proportionality. Second, it determines that such a balancing operation must comprise the applicable fundamental rights under the Charter (“with due regard”), where these occupy the positions in conflict: a fundamental right that justifies the restriction and another fundamental right that is restricted. Again, the platforms are called to interpret and apply the law, and this time the proposal calls for direct application of the Charter.
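What such a structured balancing operation can look like is well illustrated by Alexy’s weight formula, which I give here purely as an illustration; the proposal neither mentions nor mandates it:

```latex
% Alexy's weight formula (an illustration only; not prescribed by the DSA proposal):
% for the restricting principle i and the restricted principle j,
% I = intensity of interference, W = abstract weight, R = reliability of the
% empirical premises underlying the assessment.
W_{i,j} \;=\; \frac{I_i \cdot W_i \cdot R_i}{I_j \cdot W_j \cdot R_j}
% W_{i,j} > 1 : the principle justifying the restriction prevails;
% W_{i,j} < 1 : the restricted right prevails; W_{i,j} = 1 : a stalemate.
```

Whatever method platforms actually adopt, article 12(2) requires that the result be defensible as a proportionate reconciliation of the Charter rights at stake.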

4. Final remarks

Article 1(2)(b) of the proposal states that one of the aims of the DSA is to “set out uniform rules for a safe, predictable and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected”. The DSA is a framework for applying the Charter to online platforms and markets. This should remind us that the Charter would always apply, be the actions legal or illegal. Although a different approach to legal and illegal activity is justified, the proposal cannot be expected to respond comprehensively to the challenges of exercising fundamental rights online. That is why the DSA should be all about procedure (as it mostly is), from the orders to act against illegal content and to provide information, to risk assessment, passing through internal complaint-handling mechanisms and trusted flaggers, and beyond. The DSA should create an evolved framework for the inescapable application of balancing and exceptions (contractual and legal) within the fundamental rights domain that pervades online platforms. This does not mean that the proposal should be oblivious to norms that materially define which content can be permitted and which should be prevented or removed. Either through the interpretation of restrictions set out in national or Union law under norms of prohibition or obligation (illegal content), or by balancing the reasons for preferring one conflicting instantiation of a norm over another (lawful content), providers are called by the DSA proposal to make the first decisions on how fundamental rights under the Charter should be applied. They must offer reasons (articles 14, 15 and 17) from the perspective of the environment enabler. It is difficult not to see here a horizontal effect of fundamental rights regarding the users of online platforms, even if often by triangulation. This effect is one in which the platforms assume a role of co-legislators (terms and conditions) and co-adjudicators (notices and complaints) in a system populated by users, administrative authorities, and dispute settlement bodies (foremost the courts), but where the Charter rules them all.

Domingos Farinho

Assistant Professor of Constitutional Law, Administrative Law and Legal Theory at the University of Lisbon School of Law.

