How to remove unlawful content from Google or Instagram: DSA vs LCEN.
Comparison of content removal mechanisms under the DSA and LCEN.
The Digital Services Act (DSA), which entered into force on November 16, 2022, and became applicable on February 17, 2024, marks a major turning point in the regulation of digital services within the European Union.
This legislative evolution profoundly transforms the liability regime of hosting providers, particularly through its Articles 6, 16, and 17, while considerably simplifying the procedures for reporting illegal content.
The DSA represents a fundamental break with the previous regime regarding the modalities for notifying illegal content.
Under the LCEN (Law on Confidence in the Digital Economy of June 21, 2004), the notification procedure required a preliminary step with the author of the disputed content before any report to the hosting provider.
This requirement, which theoretically aimed to favor amicable dispute resolution, had in practice proven to be a major obstacle to the effectiveness of the fight against illegal content.
The new European framework abolishes this obligation of prior notification to the author.
This evolution marks a clear will of the European legislator to prioritize efficiency and speed in the processing of problematic content.
Henceforth, victims or witnesses of illegal content can directly approach the hosting provider, thereby considerably reducing processing times and increasing the effectiveness of reports.
This procedural simplification is accompanied by a reform of the liability regime of hosting providers.
Chief among these changes is the abandonment of the notion of “_manifestly illegal_” content [1] in favor of a broader conception of illegality [2].
This difference, significant on paper, may nevertheless prove more tenuous in practice, particularly in light of Recital 63 of the DSA, which retains the notion of “_manifestly illegal_” content.
The notification sent by a user who considers content to be illegal (intellectual property infringement, criminal offenses, etc.) must meet certain requirements [3]:
**a)** a sufficiently substantiated explanation of the reasons why the individual or entity alleges that the information in question is illegal content;
**b)** a clear indication of the exact electronic location of that information, such as the exact URL(s), and, where applicable, additional information enabling the identification of the illegal content depending on the type of content and the specific type of hosting service;
**c)** the name and email address of the individual or entity submitting the notification, unless the information is considered to involve one of the offenses referred to in Articles 3 to 7 of Directive 2011/93/EU;
**d)** a statement confirming that the individual or entity submitting the notification believes, in good faith, that the information and allegations contained therein are accurate and complete.
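The four elements above can be read as a structured record that a hosting provider checks for completeness before processing a notice. The following Python sketch models that check; the class and field names are illustrative assumptions for this article, not terminology prescribed by the regulation.

```python
from __future__ import annotations
from dataclasses import dataclass

@dataclass
class Article16Notice:
    """Minimal, illustrative model of a DSA Article 16 notice."""
    explanation: str       # (a) substantiated reasons why the content is illegal
    locations: list[str]   # (b) exact URL(s) or other location information
    name: str | None       # (c) identity; may be omitted for offenses under
    email: str | None      #     Articles 3-7 of Directive 2011/93/EU
    good_faith: bool       # (d) good-faith accuracy-and-completeness statement

def missing_elements(n: Article16Notice, identity_exempt: bool = False) -> list[str]:
    """Return the notice elements that are missing; empty means formally complete."""
    missing = []
    if not n.explanation.strip():
        missing.append("explanation")
    if not n.locations:
        missing.append("locations")
    if not identity_exempt and not (n.name and n.email):
        missing.append("identity")
    if not n.good_faith:
        missing.append("good_faith")
    return missing
```

A notice reporting, say, a defamatory post would pass with all four elements filled in, while `identity_exempt=True` reflects the carve-out in point (c) for the offenses covered by Directive 2011/93/EU.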
Article 17 of the DSA counterbalances this procedural simplification with a significant strengthening of hosting providers' transparency obligations.
While the notification of illegal content is simplified upstream, hosting providers must, however, justify their moderation decisions in detail downstream.
The statement of reasons required by Article 17 of the DSA must therefore include precise information on the nature of the measure taken, its legal or contractual basis, the factual circumstances that led to the decision, and the available redress mechanisms.
This transparency requirement constitutes a fundamental guarantee for users, compensating for the abolition of prior formalities through enhanced _ex post_ control.
Hosting providers must now:
- Directly process received reports without waiting for a preliminary step with the author;
- Implement rapid analysis processes for reported content;
- Document each moderation decision precisely;
- Ensure transparent and detailed communication with concerned users.
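Taken together with the Article 17 elements described above (nature of the measure, legal or contractual basis, factual circumstances, redress mechanisms), this workflow can be sketched as follows. The structure and wording are illustrative assumptions for this article, not an official format.

```python
from __future__ import annotations
from dataclasses import dataclass

@dataclass
class StatementOfReasons:
    """Illustrative model of a DSA Article 17 statement of reasons."""
    measure: str        # nature of the measure taken (e.g. removal, visibility restriction)
    basis: str          # "legal" (illegal content) or "contractual" (terms of service)
    facts: str          # factual circumstances that led to the decision
    redress: list[str]  # available redress mechanisms

def handle_notice(url: str, assessment: str) -> StatementOfReasons | None:
    """Process a report directly (no prior step with the author) and
    document the moderation decision, per the obligations listed above."""
    if assessment == "illegal":
        return StatementOfReasons(
            measure="removal",
            basis="legal",
            facts=f"Content at {url} assessed as illegal following a user notice.",
            redress=[
                "internal complaint-handling system",
                "out-of-court dispute settlement",
                "judicial redress",
            ],
        )
    # No restriction imposed; in practice the notifier is still informed of the outcome.
    return None
```

The point of the sketch is the _ex post_ shift: the decision record, not a prior formality with the author, is what carries the user's guarantees.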
The abolition of the obligation of prior notification to the author constitutes one of the major innovations of the DSA.
This evolution, combined with the strengthening of transparency obligations, marks a profound transformation of online content regulation. The success of this reform will largely depend on the ability of hosting providers to implement moderation processes that are both effective and transparent.
The coming months will be crucial for evaluating the practical impact of these changes on the fight against illegal content and on the protection of user rights.
Observation of the concrete implementation of these new obligations will make it possible to measure whether the DSA has indeed struck the right balance between the speed of processing problematic content and the guarantee of user rights.
About the author
Partner
Co-founding partner of INFLUXIO, Maître Alexandre BIGOT-JOLY has worked in business law firms and public institutions where he trained in media and communications law, intellectual property and criminal law.