The Digital Services Act on the test bench

1. As we mark one year since the entry into force of the Digital Services Act (DSA), it is a fitting time for an interim evaluation. This blog post provides a succinct overview of the first issues arising from its implementation: the enforcement timeline, the clarification of the subjective scope of application, the setting up of a functioning governance structure linking national competent authorities and the Commission, and the Regulation’s future constitutional implications.

2. Given the complexity and broad impact of the subject matter it addresses, the DSA is characterized by a staggered and deferred application over time. As of 16 November 2022, therefore, only a limited number of its articles applied[1]. Among them, a notable one is Article 24(2), which set 17 February 2023 as the date from which platform providers must publicly release information on the ‘average monthly active recipients’ of their service in the Union.
Although this initial self-reporting obligation marks an important step forward, it is the full application of the DSA, due next February, that will signify the culmination of an extensive process involving years of drafting and negotiation.

3. The subjective scope of application was one of the first challenges to be faced. The heart of the DSA lies, in fact, in Section 5 of Chapter III and its mechanism for tackling systemic risks arising from big platforms. To this end, the designation of very large platforms is a paramount stage, and the first round occurred on 25 April 2023. On that day, the Commission published its first ad hoc designation decisions, drawing up a list of 19 platforms and search engines that, based on data released by the providers themselves, surpassed the threshold of 45 million average monthly active users. Among the 17 ‘very large online platforms’ (VLOPs) we find widely used social networks (the two Meta platforms, LinkedIn, Pinterest, Snapchat, TikTok, Twitter (now ‘X’) and YouTube), online retailers and marketplaces (Amazon, AliExpress, Booking.com, Google Shopping and Zalando) and other web services such as the Apple AppStore, Google Play, Google Maps and Wikipedia. As for the ‘very large online search engines’ (VLOSEs), only two were designated (Bing and Google Search). In line with the specific timeline provisions, the DSA entered into application for the designated entities on 25 August 2023.
However, this decision left both the targeted companies and other stakeholders disappointed. On one side, Zalando and Amazon filed actions before the EU General Court in Luxembourg seeking annulment of their designation, arguing that online marketplaces should not be considered potential avenues for the spread of disinformation because they do not host third-party content. On the other side, two distinct issues were raised. First, active-user figures rest on voluntary disclosure by the platform providers: they are neither certified by an independent external body nor based on a common methodology for counting users. The only reference point is recital 77 of the DSA, which indicates how to identify monthly active recipients over a given period of time and to determine their level of engagement and interaction with the platform in a way that avoids tracking individuals online. Faced with these many perplexities, the Commission, rather than adopting delegated acts pursuant to Article 33, released a document containing FAQs to address providers’ difficulties. Notably, these interpretive guidelines confirm that a common methodology is indeed absent: each platform, in spontaneously submitting a communication to a specific mailbox hosted by the Commission’s DG CNECT, is to explain the methodology it adopted for counting its own active users.
Second, some big platforms are suspected of deliberately underestimating their traffic to avoid designation as VLOPs or VLOSEs. Although an entire business sector such as adult entertainment – whose platforms are estimated to cater to millions, if not billions, of users – could plausibly have featured in the Commission’s list, requests to include it went completely unheard. Civil society associations argued that the major platforms of the adult entertainment industry operating within the EU’s jurisdiction should also adhere to the DSA, along with other relevant regulations.
Allegedly, these platforms’ delay in disclosing user numbers was a strategy to avoid designation, as exemplified by XVideos, which only belatedly reported an average of 160 million monthly service recipients. Considering this figure, there is speculation that other well-known platforms in the same industry may have provided implausible user counts and could escape inclusion in the next round of designations. Should this strategy be replicated, it may undermine the DSA’s core mechanism.

4. Another important test is the finalization of the DSA’s monitoring and compliance governance, namely the designation of the national Digital Services Coordinators (DSCs), the establishment of the European Board for Digital Services (EBDS) – provided for in Article 61 and composed of high-level officials representing the DSCs – and the network of vetted researchers and trusted flaggers. All of this will have to be accomplished by 17 February 2024. In a Recommendation of 10 October 2023, which will apply until the DSA becomes fully applicable, the Commission urged Member States to fast-track their national implementation of the DSA, in particular the designation of DSCs, lamenting also that without those designations the EBDS cannot take shape. As the Recommendation underlines, the need to ensure the effectiveness of the DSA’s mechanisms from the outset arises from the growing global instability reverberating on the Union owing to Russia’s war of aggression against Ukraine and the unprecedented Hamas terrorist attacks. This situation makes it even more important to secure a safe and monitored online environment, addressing the risks related to the dissemination of illegal content in full respect of fundamental rights and freedom of expression.
So far, only a limited number of Member States have designated their DSCs, namely the Czech Republic, France, Germany, Ireland, Lithuania, the Netherlands and, most recently, Italy. Drawing on data circulated by some national authorities, a report has mapped which other Member States may soon appoint their telecoms regulators as DSCs, while others may designate consumer protection or competition authorities. Still, only further insight into the financial budgets allotted to these bodies will offer a plausible picture of how solid the DSA’s overall governance system will prove.

5. The constitutional implications of the DSA constitute a multifaceted dimension, which various authors, delving into the intricate relationship between platforms and public power, place under the umbrella term ‘digital constitutionalism’ (Fitzgerald, Lessig, Berman, Lemley, Suzor, Redeker et al. and Waldron, followed by Celeste, De Gregorio and Pollicino). Their approaches, however, diverge significantly, ranging from crafting a constitutional bill of rights (Redeker) to constitutionalizing the digital landscape by preventing both public authorities and private entities from infringing upon fundamental rights (Celeste). Some authors propose reformulating established principles to address emerging online issues effectively (De Gregorio), while others suggest applying traditional constitutional principles through the horizontal effects doctrine, emphasizing the pivotal role of courts in this evolving constitutional discourse (Pollicino).
This debate is rooted in the transnational, quasi-public dimension of platforms, which calls for new constitutional patterns to uphold fundamental rights. It has, however, failed to crystallize into a coherent and unambiguous doctrine, prompting other commentators to caution against the potential erosion of the foundational principles of constitutionalism. While some highlight the risk of these principles being progressively supplanted by pseudo-universal values (Vigevani), or try to reconcile the various positions into a new, compatible proposal (Golia), others flatly dismiss the normative value of the concept of digital constitutionalism as a “faux ami” (Costello).
Nonetheless, this hard-to-grasp nature of platforms is precisely what the DSA tries to address by setting up a sophisticated blend of self-regulation, co-regulation and hard-regulation strategies. It explicitly declares its intent to establish a ‘horizontal’ normative system, aiming to resolve the key regulatory issues raised by intermediary services within the European internal market. Pursuing the overarching goal of ‘ensuring a safe, predictable and trusted online environment’ (recital 9), the DSA presents itself as a lex generalis acknowledging general principles, rather than another lex specialis with a limited scope of application. The term ‘horizontal’ thus alludes to a cross-sectoral approach, encompassing all subject areas so as to create a comprehensive framework.
In doing so, the DSA seeks to provide a holistic strategy, addressing the risks and challenges emerging from the online environment in a cohesive and thorough manner (for criticism concerning the interplay with sectoral legislation, see Quintais and Schwemer). Even with such a composite horizontal system, the DSA’s implementation still faces numerous challenges: the operationalisation of the whole risk-assessment architecture, the functioning and efficiency of platforms’ remedies for fundamental rights infringements, the need for a harmonised definition of ‘illegal content’, not to mention the relationship between DSA-based rights and the constitutional rights of each Member State. Finally, the extensive reliance on voluntary approaches calls into question the DSA’s ability to achieve substantial transparency and accountability in platform governance, risking a de facto conferral of public power on privately owned platforms.

[1] As stated in Art. 93(2), “Article 24(2), (3) and (6), Article 33(3) to (6), Article 37(7), Article 40(13), Article 43 and Sections 4, 5 and 6 of Chapter IV shall apply from 16 November 2022”.