Overview of the Latest Developments Under the Digital Services Act: November 2025 - February 2026
24 April 2026 // Preprint Issue 4/2025
Dr. Anna Pingen

Eucrim regularly reports on the EU's new major legislation regulating the digital space, i.e., the Digital Services Act and the Digital Markets Act (→ eucrim 1/2024, 12-13 with further references). The Digital Services Act (DSA) is designed to foster a safer, fairer, and more transparent online environment (→ eucrim 4/2022, 228-230). It establishes new obligations for online platforms, thereby ensuring that EU users are safeguarded against the dissemination of illicit goods and content and that their rights are respected when they engage in interactions, share information, or make purchases online. The DSA is also highly relevant for law enforcement purposes (→ eucrim 1/2024, 13).

This news item continues the reporting on the latest DSA developments by giving a chronological overview. It covers the period from November 2025 to February 2026. For overviews of previous developments, see: November 2024-January 2025 → eucrim 4/2024, 272-273; February-April 2025 → eucrim 1/2025, 12-13; and May-mid-October 2025 → eucrim 2/2025, 120-122 - each with further references.

  • 12 November 2025: The European Commission presents the European Democracy Shield, which includes several measures directly relevant to the DSA. To safeguard the integrity of the information space, the Commission prepares a DSA incidents and crisis protocol to facilitate coordination among competent authorities in cases of large-scale or cross-border information manipulation. It also works with signatories of the Code of Conduct on Disinformation within the DSA framework. The initiatives reinforce the DSA’s role as a key instrument for addressing systemic risks, including disinformation and foreign information manipulation, while enhancing cooperation between EU institutions, Member States, and relevant stakeholders.
  • 18 November 2025: The European Board for Digital Services, in cooperation with the Commission, publishes its first annual report under Art. 35(2) of the DSA, identifying the most prominent and recurrent systemic risks linked to Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs). The report highlights recurring systemic risks in four main areas: (1) the dissemination of illegal content (including illegal products, terrorist content, CSAM, and hate speech); (2) impacts on fundamental rights (notably freedom of expression, non-discrimination, privacy, and consumer protection); (3) risks to civic discourse, elections, and public security (including disinformation, foreign information manipulation, and algorithmic amplification); and (4) risks related to gender-based violence, public health, the protection of minors, and mental well-being. The Board underlines that systemic risks concern platform design, functioning, and large-scale effects rather than individual pieces of content. It emphasises the central role of Arts. 34 and 35 DSA, requiring VLOPs and VLOSEs to assess and mitigate systemic risks in a proportionate manner, with particular regard to fundamental rights. The report also outlines mitigation practices currently in use or proposed, including adjustments to content moderation systems, recommender systems and advertising systems, as well as the use of trusted flaggers, transparency tools, and codes of conduct.
  • 26 November 2025: The Commission sends a request for information (RFI) to the Chinese multinational online clothing retailer Shein under the DSA, following indications that illegal products – including child-like sex dolls and weapons – are being offered on the platform. The Commission asks Shein, designated as a VLOP, to provide detailed information and internal documents on how it prevents the sale of illegal goods and how it protects minors from exposure to age-inappropriate content, including through age assurance measures. It also seeks clarification on the effectiveness of Shein’s risk mitigation systems. This is the third RFI the Commission has sent to Shein.
  • 27 November 2025: In his speech “Trusted Sources of Information in a Democratic Society”, delivered in Brussels to the Citizens Information Board, Commissioner Michael McGrath underlines the central role of the DSA in safeguarding the integrity of the information space. He stresses that the DSA requires VLOPs to assess and mitigate systemic risks, including coordinated inauthentic behaviour and disinformation risks linked to recommender systems. The Commission has issued DSA guidelines to mitigate risks to electoral processes, and several DSA investigations remain ongoing. Looking ahead, the Commission is preparing a DSA incidents and crisis protocol, together with the European Board for Digital Services, to strengthen coordination in response to major information manipulation incidents.
  • 5 December 2025: The European Commission fines X €120 million for breaching its obligations under the DSA. This marks the first non-compliance decision adopted under the DSA. The Commission finds that X violates Art. 25(1) DSA by using a misleading “blue checkmark” design. Because users can obtain verified status through payment without meaningful identity verification, the platform creates a deceptive impression of authenticity, contrary to the DSA’s prohibition of deceptive design practices. The Commission also finds infringements of Art. 39 DSA due to deficiencies in X’s advertising repository. The repository lacks key transparency elements, including clear information on ad content and the paying entity, and contains access barriers that undermine scrutiny by researchers and civil society. In addition, X breaches Art. 40(12) DSA by failing to provide researchers with effective access to public data. X must now submit corrective measures within set deadlines. Failure to comply may result in periodic penalty payments. The Commission states that investigations concerning other potential DSA breaches by X remain ongoing.
  • 5 December 2025: The Commission accepts binding commitments from TikTok to address concerns regarding advertising transparency under the DSA. Following preliminary findings in May 2025 (see →eucrim 2/2025, 120-122), TikTok commits to ensure that its advertising repository fully complies with DSA requirements. The platform will provide, inter alia, the complete content of advertisements as shown to users, including embedded URLs, and will update its repository within 24 hours. TikTok must implement the commitments within agreed deadlines of up to 12 months. The Commission will monitor compliance under Art. 71 DSA. Other DSA investigations concerning TikTok – including recommender systems, age assurance, data access for researchers, protection of minors, and election-related risks – remain ongoing.
  • 26 January 2026: The Commission launches a new formal investigation against X under the DSA (Arts. 34, 35, and 42, relating to risk assessment, risk mitigation, and independent auditing obligations). The investigation focuses on risks linked to the deployment of the AI tool Grok within the platform. In parallel, the Commission extends its ongoing proceedings, opened in December 2023, concerning X’s recommender systems. It assesses whether X has properly identified and mitigated systemic risks arising from Grok’s integration, including the dissemination of illegal content such as manipulated sexually explicit images and potential child sexual abuse material. It also examines whether X conducted and submitted an ad hoc risk assessment prior to deploying Grok functionalities that significantly affect its risk profile.
  • 10 February 2026: Under the Action Plan Against Cyberbullying, the Commission announces several measures directly linked to the DSA. The Commission will review the DSA guidelines on the protection of minors to strengthen the obligations of online platforms to prevent minors from being exposed to harmful content and to ensure that reporting mechanisms are easily accessible. It will also adopt DSA guidelines on trusted flaggers to clarify their role in addressing illegal content, including illegal cyberbullying content. According to the Action Plan, the DSA requires online platforms to ensure a high level of privacy, safety, and security for minors and to mitigate systemic risks related to harmful content.
  • 17 February 2026: The Commission opens formal proceedings against Shein under the DSA. The investigation concerns whether the platform has adequate systems to prevent the sale of illegal products in the EU, including content that may constitute child sexual abuse material. The Commission also examines whether Shein’s design features, such as reward-based engagement mechanisms, create addictive effects and whether the company properly assesses and mitigates related systemic risks. In addition, it reviews compliance with DSA transparency obligations for recommender systems, including disclosure of main ranking parameters and the requirement to offer users at least one option not based on profiling.