DigiExplainers is a new series of DigiCon posts composed of longer essays on key pieces of legislation. They normally include a description of a regime together with a critical analysis of its potential to shape online environments.
1. Some Context
The EU Digital Services Act is out, and although it is formally an amendment of the e-Commerce Directive, the last twenty years have made it so much more.
Back in the 1990s, e-commerce was all about a new, upcoming dimension of human interaction online, a world where freedom reigned. So it was no surprise that both Section 230(c)(1) of the US Communications Decency Act (1996) and articles 12 to 14 of the EU e-Commerce Directive (eCD) (2000) granted general immunity to online service providers regarding third-party content. Those were the days when business models relied more on allowing access to the Internet (the rise of the ISP – Internet Service Provider) than on hosting, managing and promoting content (as would come to pass in the next decade with the rise of the first social networks). Fewer people were online, and online was less important.
More than twenty years on, the landscape has changed immensely, and one feature stands out above all: information published online is no longer confined to closed, subscription-based newsgroups moderated by users with little or no control by ISPs or a few other service providers, or to private e-mails. Information is now publicly available across multiple platforms, alongside an exponential increase in internet users, from 7% of the world in 2000 to 60% in 2020. This means that in the last twenty years not only has well over half of the world population become internet users, but digital instantiations of fundamental rights have become common through that use. The world population, especially in developed and developing countries, now experiences some of the most important dimensions of life online: privacy and the protection of personal data, work, the expression of thoughts, convictions and religion, the will to come together, to protest, to go on strike, to vote, and many other dimensions. Unlike in the offline world, people do not do this on their own property or in public spaces; they do it in privately owned virtual spaces that they access because they buy a service that private companies provide and/or because they work for the owners.
Online service providers are the enablers of our online life. They already were in 2000, but there were fewer users then, and we had not yet put so much of our lives on the Internet. By 2022 that importance is evident.
The Digital Services Act is thus much more than an amendment to a piece of legislation setting rules for electronic commerce. By regulating digital services, the EU is regulating our online lives, from how we access the Internet to how we can act online in our interactions with others. This overwhelming change has not gone unnoticed by the EU legislator, which now not only addresses the set of different roles we can assume online – as service providers, as users, as consumers, as traders – but also addresses what rights and duties we have, taking the EU Charter of Fundamental Rights as its key parameter (see Recitals 3 and 9). This is a huge shift. From no mention of fundamental rights in the e-Commerce Directive, the DSA now brings us 39 mentions of “fundamental right” (26 in the recitals and 13 in the articles). Even accounting for the fact that the EU Charter, although proclaimed in 2000, only came into force in 2009, the role that fundamental rights now take up in the DSA could already have been claimed in the e-Commerce Directive, not only through Member States’ transpositions referring to their Constitutions but also by direct reference to the European Convention on Human Rights (ECHR) and other “relevant international standards for the protection of human rights”, as the DSA now does (see Recital 47). This is also a big challenge for the DSA: online, to be citizens exercising fundamental rights we need to be users, clients and consumers. The DSA promises a combination of these two worlds. Lawyers, better than anyone, know they are not easily reconciled.
As the DSA becomes the statute of online behavior, it does not do so in a vacuum. It is fully embedded in the EU’s digital strategies, which have been delivered through a comprehensive set of legislation, most of which interconnects directly and extensively with the DSA: the GDPR, on data protection; the Digital Markets Act, on competition rules for the market of service providers; in the future, the amendment of the e-Privacy Directive, acting as lex specialis to the GDPR; and the AI Act, of great importance for online platforms’ moderation algorithms. So we should also look at the DSA as an interface law, one that branches out to many already enacted and soon-to-be enacted pieces of legislation and calls them into its application. In the remainder of this blogpost we will not concern ourselves with this branching out, but we have acknowledged it and will mention its consequences on occasion. We will instead focus on the many roles and rules that the DSA comprises, starting with service providers, then traders, administrative and supervisory authorities, courts and, finally, users (as consumers), so as to show the DSA as we presented it: a statute for the entire online ecosystem, a statute that might enact more than just a common digital market – a common digital republic.
2. Providers of intermediary services
Chapter II of the DSA, which has the honour of carrying the heritage of the e-Commerce Directive, is one of the most interesting parts of the Regulation. It has a sort of anti-Gattopardo flavor: something must be kept the same in order for change to occur. Here (articles 4 to 6) are the three kinds of providers of intermediary services that we find in the e-Commerce Directive – mere conduit, caching and hosting (articles 12 to 14 of the eCD) – with a legal regime of providers’ liability that seems very similar to the one known since 2000, shaped by US influence. Thus, in the DSA, providers of intermediary services i) are generally not liable for the information they transmit, store or host, unless specific conditions are met (articles 4(1), 5(1) and 6(1)), such as the provider’s unlawful interference with communications or its knowledge of, and inaction over, illegal content; ii) do not have a “general obligation to monitor information […] nor actively to seek facts or circumstances indicating illegal activity” (article 8 of the DSA and article 15 of the eCD); and iii) can be subject to orders to remove content or to provide information on users (articles 9 and 10 of the DSA and articles 12(3), 13(2) and 14(3) of the eCD).
Although Chapter II of the DSA offers a regime of providers’ liability very similar to the one foreseen in the eCD, article 7 already serves as a warning of what is to come in the rest of the Regulation: although written to appear as a reinforcement of the previously mentioned exemptions from liability, this provision can only be understood in light of the new importance that platforms’ content moderation assumes in the DSA. In fact, one must understand article 7 as tracing a very clear line between the mere, a priori existence of content moderation mechanisms and what these mechanisms can actually find: by having content moderation, service providers, especially hosting ones, do not waive the exemptions from liability they are given; but if, in the course of moderating content, illegal content is found or made apparent (see article 6(1)(a) and (b)), then the service provider is liable for such content if it does not act to counter it. With this clear distinction in mind, Chapter II of the DSA heralds the big changes it enacts with regard to the eCD. This is because, in 2022, the EU legislator presupposes that online service providers, specifically online platforms, use a business model that rests on very extensive knowledge of the hosted content (article 15) and of the ways in which it can affect their public image and, by extension, their advertising revenues (as we are currently witnessing with Twitter). Thus, they must have particular duties of care regarding the way they use such powers of moderation, even in circumstances where no clear illegality is found.
The first big difference in the regime of service providers introduced by the DSA is the very taxonomy of such providers. Where the eCD used a simple threefold distinction among providers, with little differentiation in regimes, the DSA now builds on that classification to present a taxonomy pyramid where, like Russian dolls, further specific regimes are foreseen for increasingly specific types extracted from the three original, wider categories of service providers. Thus, while Chapters I and II and Section 1 of Chapter III contain rules applicable to all providers of intermediary services, thereafter the DSA foresees specific rules for hosting providers, for a type of hosting provider called online platforms, for a type of online platform that allows distance contracts between their users and traders and, finally, for a very small group of very large online platforms, or VLOP for short (see Figure 1).
Figure 1. The DSA pyramid of incremental material scopes by types of service providers
- Generic provider of intermediary services
From the perspective of what we shall call the generic provider of intermediary services, that is, a provider that does not fall under any specific qualification foreseen in the DSA other than the threefold distinction of article 3(g), the Regulation comprises the liability regime of Chapter II, presented above and inherited from the eCD, combined with:
- the novel due diligence regime of Chapter III Section 1, which we shall examine in more depth presently;
- other, softer, due diligence obligations stemming from Chapter III Section 6;
- the regime of common provisions on enforcement from Chapter IV Section 5; and
- some provisions from Chapter V, with special attention given to article 93 on entry into force.
Even without considering the new subjective categories of the DSA, generic providers face a profoundly new regime. Beyond the liability regime, Section 1 of Chapter III contains two very important articles: while articles 11 to 13 are merely part of the architecture of the DSA’s geographical scope foreseen in article 2(1), articles 14 and 15 are the real stars. Starting with the latter, we can see the influence of the German NetzDG (see, for instance, the latest Facebook report here) on the new duty of transparency regarding content moderation. Under the DSA, all service providers that engage in content moderation, in the sense of article 3(t), have to submit “at least once a year, clear, easily comprehensible reports on any content moderation that they engaged in during the relevant period” (article 15(1)). However, whereas the criteria justifying any action against users or content in the German NetzDG are, besides the platforms’ Terms and Conditions, a set of provisions from the German criminal code, under the DSA the criteria will comprise any illegal content, as can be concluded from article 15. This calls for comprehensive private enforcement by providers, supported by the ability to ascertain illegality on the basis of Member State and EU law.
The idea that private companies should ascertain the illegality of users’ behavior may at first seem a problem, but it is no different from the same kind of judgment that some owners of offline services open to the public must exercise, for instance regarding certain public venues such as football stadiums. It does, no doubt, raise important challenges concerning the capability of providers to comply – in legal terms – with article 15. But if one thinks that the duty to provide transparency reports weighs heavily on service providers, that will be nothing compared to article 14, the true game changer of the DSA. This article foresees a rule that prescribes nothing short of a duty to align Terms and Conditions with the EU Charter of Fundamental Rights: “Providers of intermediary services shall act in a diligent, objective and proportionate manner in applying and enforcing the restrictions referred to in paragraph 1, with due regard to the rights and legitimate interests of all parties involved, including the fundamental rights of the recipients of the service, such as the freedom of expression, freedom and pluralism of the media, and other fundamental rights and freedoms as enshrined in the Charter” (§4).
Terms and Conditions is where it all begins, but the Charter is where closure can be found. Terms and Conditions are standard contractual clauses under private law (article 3(u)). The academic discussion has been long and strenuous on the relationship between fundamental rights and private law or, seen from another perspective, on the effects of fundamental rights on the horizontal relationship established between private parties, as opposed to the vertical relationship that exists between private parties and the State. However, more and more the courts, especially the European Court of Human Rights (ECtHR), are making tabula rasa of these discussions through tools like the State’s positive obligations concerning human rights in horizontal relations. Such seems to be the case in the DSA. Not only does the DSA regulate in great detail the consequences, for users, arising from Terms and Conditions, such as i) any restrictions on users’ behavior, ii) the tools of content moderation adopted and iii) the procedures used to handle conflicts, it also regulates how Terms and Conditions should look to users: the information conveyed in the Terms and Conditions must “be set out in clear, plain, intelligible, user-friendly and unambiguous language, and shall be publicly available in an easily accessible and machine-readable format” (article 14(1)). Also, users must be informed of any “significant” changes (article 14(2)). So, for service providers, the DSA is not just about a duty to communicate what you do regarding users (under your content moderation schemes). It is about the very essence of how you plan and organize what you intend to do with your service and its users (Terms and Conditions). More on this when, at the end, we look at it from the users’ perspective.
Next in the new DSA regime aimed at generic service providers is a set of rules foreseen in Section 6 of Chapter III. Although the Section is called “Other provisions concerning due diligence obligations”, the word “voluntary” comes up a lot in it (articles 44(1), 45(1), 46(1) and 48(1)), so it should be noted that the standards, codes of conduct and crisis protocols foreseen in this Section, although strongly promoted by the Commission, only prescribe duties after voluntary acceptance by the service providers. This is a step up from what we know from the GDPR and a way to foster compliance by providers.
Finally, the regime of liability, Terms and Conditions, transparency reports on content moderation and voluntary obligations prescribed for generic service providers comes under public regulation according to Sections 1 to 3 of Chapter IV. Service providers will be under Member State supervision, entrusted to one or several authorities, one of which will be the new Digital Services Coordinator (article 49). This is common in EU regulation, the GDPR being a case in point, where an independent national authority is also foreseen. The Digital Services Coordinators have extensive powers over generic service providers, as set out in article 51, and they will be the providers’ prime reference when complying with the DSA and making sure that their actions are aligned with it and with the EU Charter of Fundamental Rights.
- Hosting providers
The DSA adds three articles to the compliance menu when dealing specifically with hosting providers, one of the three original types of service providers. These articles deal with i) a revamped version of the notice and takedown mechanisms, now called notice and action mechanisms (article 16), ii) the duty to state reasons concerning restrictions imposed on users (article 17), and iii) the new notification of suspicions of criminal offences (article 18). The regime specially crafted for all hosting providers offers a fine example of the procedural approach of the DSA. Since the standards for problems of content – as a substantive issue – are found elsewhere – in Terms and Conditions as well as in Member States’ legal orders and other EU legislation, with the Charter at the zenith – the DSA focuses heavily on procedural duties aimed at ensuring that the standards for evaluating content are applied, and applied well. Hosting providers are thus called to comply with a set of procedural duties very well known to administrative lawyers, especially when we think of articles 16 and 17.
- Online platforms
Online platforms and their more restricted type, the VLOP, are the primary concern of the DSA. From article 19 onward it’s all about platforms, small and large. Regarding online platforms in general, the stars are i) the new internal complaint-handling system (article 20) and ii) the out-of-court dispute settlement (article 21). However, online platforms’ special regime also includes measures and protection against misuse that platforms must adopt (article 23), enhanced transparency reporting obligations (article 24), as well as duties regarding online interface design and organization (article 25), advertising (article 26), recommender system transparency (article 27) and the online protection of minors (article 28). Here lies the bulk of the new compliance duties of the DSA. Trusted flaggers (article 22), although implying some work on the part of online platforms, are better seen from a users’ perspective, and we’ll get back to them below.
- Online platforms that allow distance contracts
Online platforms that allow consumers to conclude distance contracts with traders constitute yet another type of service provider, subject to additional rules. The aim of the DSA is to act by proxy to protect consumers beyond what it already does under EU consumer law. Thus, all the rules regarding trade on online platforms are aimed at the platforms as enablers of such trade. The goal is to make sure i) that consumers do not confuse platforms with the traders offering services and products through those platforms and ii) that traders are accountable to platforms and hence to supervisory authorities and consumers. Traceability of traders (article 30) and compliance by design (article 31), both aimed at traders, are key to enforcing these goals.
- Very large online platforms – VLOP
Amongst the concerns with online platforms as the online places where much of our virtual lives takes place, it is natural to find the greatest concern with those platforms that garner the most attention and use. Thus, the DSA carves one final type of service provider out of the already narrower type of online platform: the VLOP. Besides providing the criteria to define VLOP in article 33, the DSA deals with VLOP in a way similar to what we find in the AI Act: it’s all about risk. Because VLOP have such power to shape the public sphere and societal relations, the DSA takes a very specific approach to them that would merit another blogpost. Suffice it to say that VLOP will have to develop their own special apparatus to comply with the DSA, with special attention given to fundamental rights. Not only do they have upgraded duties compared to those foreseen for the rest of online platforms and service providers, they also have to adopt a risk-based, crisis-management approach. On top of this, they have their own regulator, none other than the Commission itself. More on this aspect below.
3. Traders
As I’ve argued above, the DSA is not just for providers of intermediary services. Traders (article 3(f)) receive special attention therein. Reading article 2(2), it would seem not: “This Regulation shall not apply to any service that is not an intermediary service or to any requirements imposed in respect of such a service, irrespective of whether the service is provided through the use of an intermediary service”.
Nevertheless, traders are secondary addressees of articles 30 and 31, and this makes them part of the DSA ecosystem. A trader using an online platform to offer its services or goods will know, under article 30 of the DSA, that it is required to provide certain information to the platform in order to do business there (§1), on pain of suspension of access to the platform until such time as it provides the information (§2, sub§2). The trader will also know that “inaccurate, incomplete or not up-to-date” information provided to the platform will result in a request to the trader to remedy the situation (§3), with consequences similar to those presented above. Given this regime, the trader is explicitly granted the same rights as any other platform user under articles 20 and 21 (§4). So, unequivocally, the many thousands of traders who use online platforms to conduct business will have to look at the DSA in order to plan their activities and be able to use such platforms. Moreover, these are not just formal requirements. Article 30(1)(e) prescribes that platforms demand “a self-certification by the trader committing to only offer products or services that comply with the applicable rules of Union law”.
4. Governance in the DSA: Supervisory authorities and courts
The first thing that comes to the mind of a digital lawyer looking at the DSA from a public law perspective is that the EU legislator has tried to take some lessons from the GDPR. It’s still too soon to know if it has succeeded.
There is a dual approach to the tasks of public bodies in the DSA. On one side we find the classic administrative and judicial approach of foreseeing supervisory and judicial control over providers and users. On the other, the DSA foresees one of the most complex and novel systems of regulated self-regulation in EU digital law.
Looking at the classic approach, under articles 9 and 10 supervisory authorities and courts serve an important function, although they are not established by the DSA itself. They can, pursuant to article 9, order any provider of intermediary services to suspend or remove content, subject to certain procedural requisites (§2) and with a case-by-case territorial scope (§2(b)). Under article 10, they can order any service provider to provide them with “specific information about one or more specific individual recipients of the service” (§1). These two powers, encompassing both administrative authorities and the courts, point to a public supervision model that goes beyond the purely self-regulatory model that most service providers adopt, and even beyond the regulated self-regulatory model that the rest of the DSA expresses. In fact, if articles 9 and 10 are accompanied, within Member States, by proactive administrative authorities that investigate, both of their own volition and following complaints by users of the service provider, we have all the characteristics of a fundamental rights supervisory model, such as those implemented by national authorities entrusted with protecting personal data and the freedom of the press, to give two examples. The administrative and judicial impetus and momentum in each Member State and, ultimately, in the CJEU will of course be decisive for the success of this model. This cannot be stressed enough. It should be remembered that administrative authorities and courts may be extensively sought after, especially in the first years after the DSA comes into force, to sort out and fine-tune the conditions under which information obtained by providers is “sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and, where appropriate, act against the allegedly illegal content” (Recital 22).
However, the grounds for illegality are mostly found in Member States law (Recital 31), with only other EU law and the Charter offering a common parameter. Furthermore, the same can be said of the task that awaits courts in aligning Terms and Conditions with the EU Charter of Fundamental Rights, under article 14(4).
The public supervision dimension of the DSA goes well beyond this classic model of direct public supervision based on the prosecution of illegality. Again following the German NetzDG, the DSA puts in place a complex system of regulated self-regulation. We have seen above the set of due diligence obligations that will now constitute the framework of providers’ self-regulation. It will be self-regulation co-determined by both normative and administrative constraints. Let’s focus on the latter kind. The first big issue is the novel division between regulators. All service providers except VLOP will be under Member States’ supervision, with a special position taken by the new Digital Services Coordinator, but VLOP will be under the direct supervision of the EU Commission. A problem immediately stands out: while national Digital Services Coordinators must be independent authorities (article 50), the Commission is not. At least not if the criterion for independence is freedom from “any external influence, whether direct or indirect, and [from taking] instructions from any other public authority or any private party”. It seems the EU legislator does not think that the same standard should apply to the Commission.
Both the Digital Services Coordinator and the Commission have a wide-ranging and forceful toolkit to deal with service providers in general and with VLOP in particular, respectively.
Looking at the Digital Services Coordinator, the toolkit comprises powers of investigation and enforcement (article 51(1), (2)): to require information relating to infringements of the DSA, to carry out inspections, to ask for explanations from providers’ staff, to accept commitments to comply with the DSA, to order the cessation of infringements, to impose fines and periodic penalty payments, and to adopt interim measures to avoid “the risk of serious harm”. In certain ultima ratio situations (§3), the Digital Services Coordinator has the power to require management bodies “to examine the situation, adopt and submit an action plan setting out the necessary measures to terminate the infringement, ensure that the provider takes those measures, and report on the measures taken” (sub§1 a)) and even to “order the temporary restriction of access of recipients to the service concerned by the infringement or, only where that is not technically feasible, to the online interface of the provider of intermediary services on which the infringement takes place” (sub§1 b), sub§2 and 3). It must be borne in mind that some of these powers may be given not to the Digital Services Coordinator but, alternatively, to courts only, with a power of initiative given to the Coordinators in some Member States. Even accounting for these cases, the powers of Digital Services Coordinators are very ample and forceful, especially if we add the power to apply penalties within the framework given to Member States by article 52. The same can be said of the powers conferred upon the Commission regarding VLOP, under articles 64 to 83, which should be the subject of another blogpost, given their importance and complexity.
Both Member States’ public administrations and the Commission are called upon by the DSA to develop a powerful digital regulatory arm, much more pervasive than anything seen in the EU digital landscape up to now. This should be combined with the Commission’s powers under the Digital Markets Act. At the same time, great leeway is given to Member States to define the substantive standards of illegality and the procedural mechanisms into which the DSA, with its own procedural demands, must fit. In this way, Member States’ legislatures and governments will have a crucial role to play in making the DSA operative and successful. The differences between Member States’ administrative capabilities will be much more visible than what we know from GDPR enforcement. Self-regulation and compliance alone cannot fully achieve the goals of the DSA, clearly expressed in the first recitals. The administrative machinery of the Member States and the EU Commission is decisive if we are to have a sound system of regulated self-regulation.
5. Users
The key to the success of the DSA, as the statute of internet citizenship and of digital instantiations of fundamental rights, is undoubtedly platform users. This is not in itself a good thing. Most citizens are also professionals, parents and grandparents, sons and daughters, and have their own lives to lead, in which activating the tools offered by the DSA to exercise and protect digital rights may seem disproportionate in most circumstances, given our limited daily time and all our other (pre)occupations. The DSA, like all national legal instruments before it, must offer simple and effective tools for citizens to exercise rights that, most of the time, concern actions performed in an instant, even if with long-lasting consequences. The DSA cannot be a law that is only worth using when a life-shattering event occurs. The same can be said of national law, where it all begins: “the problem of illegal content and activities online should not be dealt with by solely focusing on their liability and responsibilities. Where possible, third parties affected by illegal content transmitted or stored online should attempt to resolve conflicts relating to such content without involving the providers of intermediary services in question. Recipients of the service should be held liable, where the applicable rules of Union and national law determining such liability so provide, for the illegal content that they provide and may disseminate to the public through intermediary services” (Recital 27). But the DSA is here, and with it the promise of a better Internet and thus better virtual lives. What mechanisms does it offer to users with busy lives and a strong internet presence, who may not have the time or willingness to act against what they think is wrong in other users’ actions or providers’ policies?
I would argue that there are several promising mechanisms in the DSA for users to explore, which will make the DSA more efficient and better at delivering the goals that the EU legislator envisaged. First and foremost, the standing given, under article 53, to “any body, organisation or association mandated to exercise the rights conferred by this Regulation on […] behalf [of users…] to lodge a complaint against providers of intermediary services alleging an infringement of this Regulation with the Digital Services Coordinator of the Member State where the recipient of the service is located or established”. Here is the digital citizenship clause; here is the digital activism clause. Because digital rights relate to such instantaneous actions with such permanent consequences, regarding, most of the time, such powerful companies, representation and collective action seem the appropriate way to respond. Consumer associations and digital rights associations and foundations have their work cut out for them, not only through article 53 but also under national mechanisms that allow users to petition legislators and the Administration and to present claims in court. This in turn connects with another role featured in the DSA: trusted flaggers, under article 22. Again, collective action and specialization are the key to representing all users and benefiting them all. If users unite and get represented through a diversified set of entities under the banner of digital rights, the DSA and the national and EU legal systems offer many possibilities. Even when such bodies do not have the formal standing to act under article 53, they can still offer guidance and support, both in complaining to platforms under article 20 and in submitting disputes to out-of-court dispute settlement under article 21.
Moreover, such bodies can also activate the administrative authorities and courts, under national mechanisms, to bring about the orders foreseen in articles 9 and 10, as well as to ensure the compliance of Terms and Conditions with the EU Charter of Fundamental Rights. This also means that all the due diligence obligations set forth in the DSA are the province of supervision, if not by users directly, then by their representatives: their associations, foundations, cooperatives and the like. The DSA promises a lot of strife, a lot of legal battles to achieve better criteria for balancing an almost infinite array of conflicts between fundamental rights, but it also gives us the tools – for providers, traders, public administrations, courts and, certainly, users – to achieve a better digital ecosystem. If we can build it. The DSA offers a comprehensive statute for online behavior. A European digital republic, if we can keep it.
Assistant Professor of Constitutional Law, Administrative Law and Legal Theory at the University of Lisbon Law School.