Civil Rights Organisations Criticise Predictive Policing Projects
In April 2025, civil liberties, human rights and justice organisations and experts voiced concerns over national police forces' growing use of predictive policing systems and the parallel development of AI-supported automated police decision-making systems.
On 15 April 2025, the civil rights organisations Statewatch, the Ligue des droits humains and the Liga voor mensenrechten jointly published a report on "predictive" policing and data-profiling in Belgium. The report analysed "predictive" policing initiatives and ambitious digitalisation projects within the Belgian police. These included:
- Location-focused "predictive" policing systems used by local Belgian police forces;
- The databases that are or will be used to inform those systems;
- The Belgian Federal Police "i-Police" project, designed to use data from police and other public agencies, as well as a range of other data sources, to inform police decision-making and activities.
The report highlights several serious problems with the use of advanced data analysis techniques to try to "predict" crime:
- Lack of transparency at both the local and federal levels and limited information available;
- Significant shortcomings in managing and controlling databases by the Belgian police forces;
- Often biased or unfounded information in the databases;
- Predictive policing systems that produce structural inequalities and discrimination against the most marginalised groups in society.
In conclusion, the report says: "[I]t is imperative that Belgium prohibits the use of ‘predictive’ policing and automated decision-making systems in policing and criminal justice settings. By banning these systems, Belgium can take a significant step towards building a more equitable, just, and democratic society. It is an opportunity to reaffirm the commitment to upholding fundamental rights, promoting equality, and maintaining the principles of justice and accountability."
On 9 April 2025, Statewatch criticised the United Kingdom's system for "predicting" the reoffending risk of offenders or alleged offenders. According to the UK Ministry of Justice, the system uses a combination of "structured professional judgement" and risk prediction algorithms to generate "risk scores." The manual assessment, usually conducted by the Prison Offender Manager (POM, a Prison Service official), gathers information across various categories. Statewatch stressed that over 1,300 people are profiled daily by this AI system, and criticised the fact that, despite serious concerns over racism and data inaccuracies, it continues to heavily influence decision-making on imprisonment and parole. New digital tools are due to replace the system in 2026. In addition, the UK government is working on other predictive policing projects, as well as on a bill that would allow police decisions to be made solely by computers, Statewatch said.