Controversial Proposal on Combating Child Sexual Abuse Online
27 July 2022 // Published in printed Issue 2/2022 pp 91 – 92
Dr. Anna Pingen

On 11 May 2022, the Commission presented its proposal to prevent and combat child sexual abuse online (COM(2022) 209 final). The Commission pointed out that the EU is still failing to protect children from falling victim to child sexual abuse and that the online dimension represents a particular challenge. The circulation of images and videos of sexual abuse of children has increased dramatically with the development of the digital world.

Building on the Directive on Child Sexual Abuse (Directive 2011/93/EU), the 2020 EU strategy for a more effective fight against child sexual abuse, and Member States' rules to fight online child sexual abuse, the proposed regulation aims to set out targeted measures that are proportionate to the risk of misuse of a given service for online child sexual abuse and are subject to robust conditions and safeguards. The new Regulation will complement the Child Sexual Abuse Directive and repeal Regulation (EU) 2021/1232, which provides for a temporary solution in respect of the use of technologies by certain providers for the purpose of combating online child sexual abuse.

The Commission submits that the current system based on voluntary detection and reporting by companies has proven insufficient to adequately protect children and will, in any case, no longer be possible once the interim solution currently in place expires. The proposal consists of two building blocks:

  • Imposition of obligations for online service providers to detect, report, remove, and block child sexual abuse material on their services.
  • Establishment of an EU Centre on child sexual abuse ("the EU Centre") as a decentralised agency to enable the implementation of the new Regulation and support removal of obstacles to the internal market.

The EU Centre will support national law enforcement and Europol by reviewing the reports from the providers to ensure that they are not submitted in error. It will channel reports quickly to law enforcement and support Member States by serving as a knowledge hub for best practices on the prevention of child sexual abuse and assistance to victims. It will also make detection technologies available to providers free of charge so that detection orders addressed to providers can be executed. The main elements of the planned regulation include:

  • Mandatory risk assessment and risk mitigation measures: Providers will have to assess the risk that their services are misused to disseminate child sexual abuse material or to solicit children (known as grooming); they must also take reasonable mitigation measures tailored to the risk identified;
  • Strong safeguards on detection: Companies that have received a detection order will only be able to detect content using indicators of child sexual abuse verified and provided by the EU Centre. Detection technologies must only be used for the purpose of detecting child sexual abuse. Providers will have to deploy technologies that are the least privacy-intrusive in accordance with the state of the art in the industry and that limit the rate of false positives to the maximum extent possible;
  • Reporting obligations: Providers that have detected online child sexual abuse will have to report it to the EU Centre;
  • Effective removal: National authorities can issue removal orders if the child sexual abuse material is not swiftly taken down. Internet access providers will also be required to disable access to images and videos that cannot be taken down;
  • Oversight mechanisms and judicial redress: Detection orders will be issued by courts or independent national authorities. To minimise the risk of erroneous detection and reporting, the EU Centre will verify reports of potential online child sexual abuse made by providers before sharing them with law enforcement authorities and Europol. Both providers and users will have the right to challenge any measure affecting them in court.

The proposal and its measures have been widely criticised for attacking the right to privacy. Particular criticism has been levelled at the obligation for online service providers to detect, report, remove, and block child sexual abuse material on their services, as this would also affect publicly available interpersonal communications services, such as messaging services and web-based e-mail services, as well as services enabling the direct interpersonal and interactive exchange of information, such as chat and gaming services, image-sharing services, and video-hosting platforms, all of which would be obliged to search for and report child abuse material. Critics see in this the risk of creating a massive new surveillance system and therefore an attack on privacy.