EU 2022, A New Era for Digital Regulation
Part 1/2: The Digital Services Act
In a “historic moment in digital regulation,” according to Commissioner Thierry Breton, the European Parliament adopted the Digital Services Package on July 5th, 2022. The legislative package contains two pieces of legislation: the Digital Services Act (DSA) and the Digital Markets Act (DMA). The former seeks to establish a trusted and safe online environment, while the latter addresses competition issues in digital markets. With these acts, the European Commission (“the Commission”) is given additional oversight powers to supervise so-called “gatekeepers.”
This high-level briefing note summarizes the key points of the Digital Services Act and will be followed by another summary dedicated to the Digital Markets Act.
The Digital Services Act
The regulation builds on the rules of the e-Commerce Directive, drafted more than twenty years ago, and addresses the issues emerging around online intermediaries. It will amend and complement the e-Commerce Directive, which will remain in force. The text’s 74 articles, preceded by 106 recitals, introduce obligations for all online intermediary service providers (“ISPs”) and create a more stringent regime for larger services. Following the same logic as the GDPR, the rules will also apply to services located outside the EU that are provided to users established in the EU, making the text applicable across the globe.
1. Scope of application
Once it enters into force, the regulation will be directly and uniformly applicable across member states. The awaited novelty of the text is the creation of a set of due-diligence obligations for all intermediaries, while preserving the liability regime of the e-Commerce Directive. It also strengthens the oversight and investigative powers of national authorities and the Commission, which are expected to cooperate on enforcement.
2. Definitions
The objective of establishing a safe, predictable and trusted online environment is primarily pursued through obligations to identify and reduce the presence of illegal content online. The regulation applies horizontally to all kinds of illegal content, e.g. hate speech, counterfeit goods, or commercial scams. Hence, the text adopts a broad definition of illegal content: “any information or activity, including the sale of products or provision of services, which is not in compliance with EU law or the law of a Member State, irrespective of the precise subject matter or nature of that law.”
In line with the e-Commerce Directive, intermediary services in the DSA still broadly cover services that store or transmit third-party content for EU users. The well-known definitions of mere conduit, caching and hosting services remain unchanged.
The DSA introduces obligations for two types of hosting services:
1. Online platforms, defined as providers of hosting services which, at the request of a user, store and disseminate information to the public, and
2. Very large online platforms that, for at least four consecutive months, provide their services to at least 45 million average monthly active users in the EU and have been designated as such by the Commission.
It also introduces obligations targeting online search engines and very large online search engines, respectively defined as:
1. Providers of digital services that allow users to input queries in order to perform searches of, in principle, all websites, or all websites in a particular language, and
2. Online search engines that reach at least 45 million average monthly active recipients of the service in the EU and have been designated as such by the Commission.
3. Liability rules: the e-Commerce Directive revisited
The DSA’s liability exemption rules for mere conduit and caching services concerning illegal content mirror those of the e-Commerce Directive: the services must remain passive with regard to the content they transmit and comply with notice-and-takedown requirements to avoid liability. Regarding the liability of hosting services, the DSA also mirrors the e-Commerce Directive’s provisions: the service provider is not liable for illegal content provided it has no actual knowledge of it or, upon gaining awareness, it removes or disables access to it.
However, the DSA introduces a scenario in which consumer law may preclude the application of the liability exemption for hosting services. Consumers concluding distance contracts with traders will now be able to bring claims against the hosting service when items or information are presented “in a way that would lead an average consumer to believe that the information, or the product or service that is the object of the transaction, is provided either by the online platform itself or by a recipient of the service who is acting under its authority or control.”
The text also encourages intermediaries to fight the appearance of illegal content by introducing a Good Samaritan clause, which preserves eligibility for the liability exemptions for those ISPs that voluntarily detect, identify and remove, or disable access to, illegal content.
4. Due diligence, transparency and handling of complaints
4.1 All services
All ISPs, regardless of their size, will be subject to due diligence and transparency obligations. Essentially, the DSA seeks to restore a trusted and user-centric online environment. Amongst other obligations, content moderation rules will have to be published in the services’ terms and conditions. The DSA also codifies some industry practices: it imposes an obligation to publish yearly transparency reports which shall, for instance, include decisions affecting the removal or blocking of content, or the number of complaints received and handled (this obligation will not apply to micro and small enterprises). Such reports have already reached the public’s eye, as companies like Twitter or Google have voluntarily disclosed information of the sort.
Another novelty is that all intermediaries will be required to designate a “contact point” to act as the direct correspondent with the authorities, and intermediaries established outside the European Union will additionally have to appoint a legal representative in one of the member states where their services are offered. The legal representative’s responsibility will be greater than that of the contact point, since it may be held jointly liable for violations of the rules and must have an established physical location. It should be noted that the roles of contact point and legal representative can be fulfilled by the same entity.
4.2 Hosting services
Hosting services, including providers of online platforms, will be subject to additional obligations directed at the handling of users’ complaints concerning illegal content. For instance, they will have to implement user-friendly tools allowing the signalling of illegal content, and provide statements of reasons for decisions that, amongst others, affect the visibility of items or information provided by their recipients.
4.3 Online platforms
Providers of online platforms (excluding micro and small enterprises) will be obliged to implement free-of-charge electronic systems for handling users’ complaints against decisions relating to illegal content moderation or violations of T&Cs. Recipients affected by online platforms’ decisions concerning illegal content will also be entitled to turn to out-of-court dispute settlement bodies designated by the newly created national Digital Services Coordinators. In addition, other obligations relating to the reporting of criminal offences, information on third-party traders, and the transparency of advertising practices will have to be complied with.
4.4 Very large online platforms and very large search engines: special obligations
Providers of very large online platforms and very large online search engines will be subject to higher due diligence standards. Amongst other obligations, they will have to provide users with concise summaries of their T&Cs.
An innovative set of obligations also concerns the management of systemic risks stemming from the design, functioning and use of their services. Such risks (divided into four categories in the DSA) include, for instance, the dissemination of illegal content or the ability for users to sell illegal products. These platforms will thus have to conduct yearly risk assessments and take appropriate mitigating measures where the functioning of the service would leave the door open to the dissemination of illegal content.
They will also have to submit to external audits of their services, publish transparency repositories including details of the advertisements presented on their services and the advertisers concerned, provide access to their data to the Commission and to researchers for compliance and research purposes, and appoint DSA compliance officers.
The platforms and search engines falling under this category will be designated by the Commission, in cooperation with the national Digital Services Coordinators, through decisions published in the Official Journal of the European Union.
5. Governance and enforcement
Member states will have to designate (i) a national enforcement authority and (ii) a Digital Services Coordinator. Both roles can be entrusted to the same authority, which will have to comply with classic requirements of administrative independence from the executive and the legislature.
Digital Services Coordinators will enforce the DSA at the national level. They will be entitled to request information from any service, empowered to inspect services (including through on-site inspections), and allowed to order the cessation of infringements, levy fines, impose interim measures, or accept binding commitments, in a similar fashion to the powers available in competition law proceedings.
Intermediaries located in the EU will fall under the jurisdiction of their member state of establishment, whilst non-EU intermediaries will fall under the jurisdiction of the member state of their appointed legal representative. Importantly, the European Commission is granted competence to open investigations and initiate proceedings on its own motion against very large online platforms.
The DSA also creates a European Board for Digital Services at the EU level, composed of the national Digital Services Coordinators. Like the EDPB in data protection, the Board will have advisory tasks towards the national Digital Services Coordinators and the Commission. The specific guidance it may issue in the future is awaited.
6. Non-compliance
Rules on penalties for non-compliance with the regulation are left to be defined by member states, within the maximum amounts of the fines set out in the DSA. For a failure to comply with an obligation, fines will not exceed 6% of the annual worldwide turnover of the ISP concerned. For the supply of incorrect, incomplete or misleading information, or a failure to submit to an inspection, the fine will not exceed 1% of the annual income or worldwide turnover.
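By way of illustration only (the turnover figure used here is a purely hypothetical assumption, not drawn from the DSA), for a provider with an annual worldwide turnover of EUR 10 billion, the two caps would amount to:

\[
6\% \times \text{EUR 10 billion} = \text{EUR 600 million}
\qquad\text{and}\qquad
1\% \times \text{EUR 10 billion} = \text{EUR 100 million}.
\]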
7. Next steps
The text will enter into force twenty days after its publication in the Official Journal of the European Union and will apply fifteen months after entry into force or from 1 January 2024, whichever comes later. As regards the obligations for very large online platforms and very large online search engines, the DSA will apply earlier: four months after they have been designated as such by the Commission.
Commissioner Breton also shared that the enforcement and implementation of the DSA will be supported by teams dedicated to the DSA and the DMA within the Commission’s DG CONNECT. For instance, issues such as risk assessments and audits will be handled by the societal issues team.
This instrument starts a new era for internet compliance; there is much work yet to come for the online actors subject to it.