The European Parliament and the Council agreed on a comprehensive package of legislation establishing new rules for online platforms. The package consists of two Regulations:

  • Regulation (EU) 2022/1925 of the European Parliament and of the Council of 14 September 2022 on contestable and fair markets in the digital sector and amending Directives (EU) 2019/1937 and (EU) 2020/1828 (Digital Markets Act);
  • Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC (Digital Services Act).

These two acts had been proposed by the Commission on 15 December 2020 (→ eucrim 4/2020, 273–274).

The Digital Markets Act

The Digital Markets Act (DMA) supplements competition law and limits the power of large digital companies. It establishes obligations that so-called gatekeepers must comply with in their daily operations. Gatekeeper platforms must, for instance, allow their business users to promote their offers and conclude contracts with their customers outside the gatekeeper’s platform. Prohibited practices include, for example, ranking services and products offered by the gatekeeper itself more favourably than similar services or products offered by third parties on the gatekeeper’s platform, and tracking end users outside the gatekeeper’s core platform service for the purpose of targeted advertising without effective consent.

Non-compliance with the DMA’s obligations can lead to:

  • Fines (of up to 10% of the company’s total worldwide annual turnover, or up to 20% in the event of repeated infringements);
  • Periodic penalty payments (of up to 5% of the average daily turnover);
  • Additional remedial measures in case of systematic infringements and after a market investigation.

The DMA was published in the Official Journal of 12 October 2022 (O.J. L 265, 1), entered into force on 1 November 2022 and applies as of 2 May 2023. From that date, potential gatekeepers have two months to notify the Commission if their platform meets the thresholds provided for in Regulation 2022/1925. Gatekeepers are defined as undertakings providing core platform services; this is presumed in particular if the platform service has at least 45 million monthly active end users and at least 10,000 yearly active business users established in the Union, and the undertaking has an annual turnover in the Union of at least €7.5 billion in each of the last three financial years. After notification, the platform will be assessed and, where the criteria are met, designated as a gatekeeper by the Commission. After this designation, gatekeepers have six months to comply with the obligations foreseen in the DMA.

The Digital Services Act

The Digital Services Act (DSA) will complement and update parts of the now more than 20-year-old E-Commerce Directive. It provides for uniform horizontal rules on due diligence obligations and conditional exemptions from liability for online intermediary services (such as online platforms), as well as common rules on the implementation and enforcement of the Regulation, including as regards cooperation and coordination between the competent authorities. The DSA thus aims to contribute to a safe, predictable and trustworthy online environment and to the smooth functioning of the EU single market for intermediary services.

The DSA applies to all online intermediaries offering their services in the single market, whether they are established in the EU or outside it. It sets obligations tailored to the size and type of intermediary service. Specific due diligence obligations apply to hosting services, including online platforms such as social networks, content-sharing platforms, app stores, online marketplaces, and online travel and accommodation platforms. More far-reaching rules apply to very large online platforms and very large online search engines, which have a significant societal and economic impact. Services are considered “very large online platforms” or “very large online search engines” if their average number of monthly active recipients of the service in the Union is equal to or higher than 45 million.

At the core of the DSA are the new EU-wide rules on countering illegal content online, including illegal goods and services. The DSA foresees standardised procedures for notifying illegal content, uniform rules on access to complaints and redress mechanisms across the single market, EU-wide transparency standards for content moderation and advertising systems, and uniform risk management obligations for very large online platforms. It should be stressed that the DSA does not define “illegal content” – this is regulated in other laws, either at the EU level or at the Member State level. However, the DSA provides that intermediary services must give effect to orders to act against one or more specific items of illegal content issued by national judicial or administrative authorities, irrespective of where the platform is established. The same obligation applies to orders to provide specific information about one or more specific individual recipients of the service.

Users will be empowered to report illegal content in an easy and effective way. Platforms are also obliged to cooperate with “trusted flaggers” (i.e. entities that have demonstrated particular expertise and competence) to identify and remove illegal content. Very large online platforms need to take additional mitigating measures at the level of their overall organisation to protect users from illegal content, goods and services. Obligations include, for instance, the traceability of sellers on online marketplaces in order to help identify illegal goods. Online marketplaces must also carry out random checks against existing databases to establish whether products or services offered on their sites have been identified as illegal.

The DSA emphasises, however, that providers of intermediary services are not subject to any general obligation to monitor the information they transmit or store, nor to actively seek facts or circumstances indicating illegal activity.

Other rules in the DSA include the following:

  • Effective safeguards for users, including the possibility to challenge platforms’ content moderation decisions;
  • Several transparency measures for online platforms, including on the algorithms used for recommendations and better information on terms and conditions;
  • Obligations for very large platforms and very large online search engines to prevent the misuse of their systems by taking risk-based action and by undergoing independent audits of their risk management systems;
  • Ban on certain types of targeted advertising on online platforms (e.g. advertising based on the profiling of children or on special categories of personal data, such as data on ethnicity, political views or sexual orientation);
  • Prohibition of “dark patterns” on online interfaces, which distort or impair the ability of recipients of the service to make autonomous and informed choices or decisions;
  • Oversight structure to address the complexity of the online space: EU countries will have the primary role and must each designate a “Digital Services Coordinator” to supervise the application of the DSA. Member States will also be supported by a new European Board for Digital Services. For very large online platforms and very large online search engines, supervision and enforcement lie with the Commission, which has enforcement powers similar to those in antitrust proceedings.

The DSA was published in the Official Journal of 27 October 2022 (O.J. L 277, 1) and entered into force on 16 November 2022. Its rules will apply in two steps:

  • From the date of entry into force, very large online platforms and very large online search engines, which are directly supervised by the Commission as regards their systemic obligations, have three months (until 17 February 2023) to publish the number of their average monthly active recipients of the service on their websites and to report it to the Commission. The Commission will then assess whether a platform reaches the threshold of 45 million recipients and should therefore be designated as a very large online platform or very large online search engine (cf. supra). Once designated by the Commission, these platforms have four months to comply with the obligations of the DSA, including those that go beyond the obligations applicable to all online intermediary services, such as providing a comprehensive risk assessment;
  • For smaller platforms, the rules will apply as of 17 February 2024, i.e. fifteen months after the entry into force of the DSA. By then, Member States must have empowered their national authorities to enforce the rules for all intermediary services covered by the DSA.

The European Commission outlines the key objectives of the DSA and the main new obligations for online services and platforms on a dedicated fact page and has provided a summary of the rules in a Q&A memo. On 16 November 2022, the Commission also announced the establishment of the European Centre for Algorithmic Transparency (ECAT). Its task is to support the Commission’s supervisory role with in-house and external multidisciplinary knowledge. The Centre will provide support with assessments as to whether the functioning of algorithmic systems is in line with the risk management obligations that the DSA establishes for very large online platforms and search engines in order to ensure a safe, predictable and trusted online environment. The ECAT is hosted by the Commission’s Joint Research Centre (JRC) in close cooperation with the Directorate-General for Communications Networks, Content and Technology (DG CONNECT).

Statements

After the adoption of the DSA/DMA by the EP on 5 July 2022, the EP’s rapporteur for the DSA, Christel Schaldemose (S&D, DK), said: “For too long tech giants have benefited from an absence of rules. The digital world has developed into a Wild West, with the biggest and strongest setting the rules. But there is a new sheriff in town - the DSA. Now rules and rights will be strengthened. We are opening up the black box of algorithms so that we can have a proper look at the moneymaking machines behind these social platforms.”

Andreas Schwab (EPP, DE), the EP’s rapporteur for the DMA, said: “We no longer accept the ‘survival of the financially strongest’. The purpose of the digital single market is that Europe gets the best companies and not just the biggest. This is why we need to focus on the legislation’s implementation. We need proper supervision to make sure that the regulatory dialogue works. It is only once we have a dialogue of equals that we will be able to get the respect the EU deserves; and this, we owe to our citizens and businesses”.

On behalf of the Czech Council Presidency, Jozef Síkela, Czech Minister for Industry and Trade, affirmed after the Council’s final approval of the DSA on 4 October 2022: “The Digital Services Act is one of the EU’s most ground-breaking horizontal regulations and I am convinced it has the potential to become the ‘gold standard’ for other regulators in the world. By setting new standards for a safer and more accountable online environment, the DSA marks the beginning of a new relationship between online platforms and users and regulators in the European Union and beyond.”

In the wake of the Council’s approval of the DMA on 18 July 2022, Ivan Bartoš, Czech Deputy Prime Minister for Digitisation and Minister of Regional Development, stressed that the DMA makes “large online platforms responsible for their actions. Hereby, the EU will change the online space worldwide. The gatekeepers that the DMA addresses are omnipresent – we all use their services on a daily basis. However, their power is growing to an extent that negatively affects competition. Thanks to the DMA, we will ensure fair competition online, more convenience for consumers and new opportunities for small businesses.”