Overview of the Latest Developments on the DSA: November 2024-January 2025
The Digital Services Act (DSA) is designed to foster a safer, fairer, and more transparent online environment (→ eucrim 4/2022, 228-230). It establishes new obligations for online platforms, thereby ensuring that EU users are safeguarded against the dissemination of illicit goods and content and that their rights are respected when they engage in interactions, share information, or make purchases online. The DSA is also highly relevant for law enforcement purposes (→ eucrim 1/2024, 13).
This news item continues the reporting on the latest developments concerning the DSA in the form of a chronological overview. For overviews of the previous developments: April-August 2024 → eucrim 2/2024, 94-95; September-October 2024 → eucrim 3/2024, 178.
- 25 November 2024: The short message platform Bluesky comes under increasing scrutiny from the European Commission for alleged non-compliance with the DSA. The DSA imposes disclosure obligations on all online platforms operating in the EU, including requirements to provide details on user numbers, designate an EU-based contact person, and publish a dedicated webpage with legal and operational information. According to the Commission, Bluesky has not fulfilled these obligations. The Commission has therefore tasked the national Digital Services Coordinators (DSCs) with investigating whether Bluesky is adhering to the DSA requirements. In Germany, the Federal Network Agency confirmed that it had been asked to determine whether Bluesky has a branch, legal representative, or contact person in the country; it reported that Bluesky has not complied with any of these obligations. While the Commission oversees the largest platforms, enforcement against smaller platforms like Bluesky falls within the jurisdiction of the national authorities. The Commission has emphasized that it is up to the individual DSCs to enforce compliance, with Germany and other Member States taking the lead in examining Bluesky's operations. Bluesky currently has around 20 million users worldwide, far below the DSA threshold for designation as a "Very Large Online Platform" (VLOP), which requires at least 45 million monthly active users in the EU. Nonetheless, the platform must meet the baseline obligations, including the appointment of an EU contact person and the reporting of user data. The Federal Network Agency has stated that no further action is currently required, as the Commission's inquiry into Bluesky is considered closed for now. However, should any national DSC take enforcement action, it will act on behalf of all EU Member States, and Bluesky will be required to comply once a DSC formally intervenes.
- 29 November 2024: The European Commission convenes a roundtable with major platforms, including TikTok, Meta, Google, Microsoft, and X, to discuss election readiness under the DSA in the context of the presidential and parliamentary elections in Romania. The Commission requested that the platforms share risk assessments and mitigation measures for threats such as disinformation and platform manipulation. Discussions also addressed cooperation with stakeholders, recommender systems, and the need for independent researcher access to platform data. With election integrity a key DSA priority, the Commission is monitoring compliance, with proceedings ongoing against X, Facebook, and Instagram for alleged violations. TikTok has also been asked to clarify its handling of information manipulation risks.
- 17 December 2024: The European Commission launches formal proceedings against TikTok for suspected violations of the DSA related to systemic risks impacting election integrity. The proceedings follow allegations of foreign interference during Romania's recent presidential elections. The investigation will examine TikTok's management of risks linked to recommender systems, possible coordinated inauthentic activity, and policies on political advertisements and paid-for content. It will focus on whether TikTok properly addressed regional and linguistic risks tied to national elections. The Commission is working closely with Ireland's Digital Services Coordinator, given TikTok's EU establishment there. It will gather further evidence through additional information requests, monitoring actions, and inspections, including an analysis of TikTok's algorithms. The proceedings empower the Commission to impose interim measures or accept commitments from TikTok to address the identified risks. This is the third investigation into TikTok under the DSA, underscoring the growing scrutiny of the platform in the EU.
- 17 January 2025: The European Commission takes additional investigatory steps into X's compliance with the DSA regarding its recommender system. The measures include: a request for internal documentation on the platform’s recommender system and recent changes, due 15 February 2025; a retention order requiring X to preserve documents related to future algorithm changes from 17 January to 31 December 2025 or until the investigation concludes; access to X’s commercial APIs (Application Programming Interfaces) to assess content moderation and account virality. These steps aim to evaluate whether X's systems align with the DSA's goals of ensuring a fair, safe, and democratic online environment. The investigation remains ongoing.
- 20 January 2025: The European Commission incorporates the revised Code of Conduct+ on Countering Illegal Hate Speech Online into the Digital Services Act framework. Major platforms, including Facebook, TikTok, X, and YouTube, have committed to reviewing flagged hate speech within 24 hours, improving transparency, and collaborating with experts and civil society. The Code supports DSA compliance and includes annual audits to ensure that platforms mitigate hate speech risks effectively. It builds on EU legal frameworks, aiming to combat hate speech while upholding democratic values and freedom of expression. Regular evaluations will ensure that the Code continues to meet emerging challenges.
- 21 January 2025: The European Parliament debates the enforcement of the DSA to protect elections and democracy from disinformation, foreign interference, and biased algorithms. The Commission highlighted its ongoing investigations into platforms such as TikTok and X for election-related risks and emphasized the DSA's transparency requirements, including user opt-out options for profiling and content moderation disclosures. The Commission also announced plans for a European Democracy Shield to counter disinformation and strengthen electoral integrity, building on the European Democracy Action Plan. Collaboration with the national Digital Services Coordinators and international partners is intended to ensure robust enforcement, with the Commission's DSA enforcement staff and resources being doubled to address rising threats.
- 31 January 2025: The Federal Network Agency (FNA), in its capacity as Digital Services Coordinator (DSC) for Germany, conducts a stress test with VLOPs ahead of the parliamentary elections in Germany. Participants included representatives of Google (YouTube), LinkedIn, Microsoft, Meta (Facebook, Instagram), Snapchat, TikTok, and X, as well as of national authorities and civil society organisations. The test aims to assess the platforms' readiness to address behaviour that could occur in the run-up to the elections and pose a risk to civic discourse and electoral processes. It follows a roundtable held on 24 January 2025, at which the FNA and the VLOPs discussed current election-related trends and risk-minimising measures taken by the major online platforms and search engines in order to ensure their election readiness (see also above, 29 November 2024, as regards the elections in Romania).