Key Points
- The Met announced on 13 April 2026 that it is exploring AI to rapidly grade and triage child sexual abuse imagery, enabling faster victim identification and safeguarding.
- The Met investigated more than 5,400 child sexual abuse offences in the past year, safeguarding more than 1,300 children from online child sexual abuse.
- The announcement comes alongside a £10 million investment in 23 dedicated Visually Recorded Interview suites across London for child victims.
- Deputy Commissioner Matt Jukes confirmed human judgment, strict oversight, and victim care will remain central to all investigations.
- London’s Victims’ Commissioner Andrea Simon welcomed the move but cautioned that facilities alone are not enough to keep victims engaged throughout the justice process.
London (North London News) April 14, 2026 – The Metropolitan Police Service has announced it is exploring the use of artificial intelligence to accelerate the identification of child sexual abuse victims. The move could transform investigations across London and reduce the psychological toll on officers who handle deeply distressing material.
What is the Met Police’s AI plan for child sexual abuse investigations?
The Met confirmed on 13 April that it is exploring how AI can grade and triage child sexual abuse imagery, a process currently carried out manually by specialist officers who spend hours reviewing seized material. The technology would classify content by severity and identify potential new victims far more quickly than is currently possible.
Why is the Met investing in AI now?
Online child sexual abuse is one of London’s fastest-growing crime types, up 25% year-on-year. The Met investigated more than 5,400 offences in the past year, safeguarding more than 1,300 children. Deputy Commissioner Matt Jukes said the force must evolve in response, and that faster, AI-assisted identification directly improves its ability to protect children.
What are the Visually Recorded Interview suites?
Alongside the AI plans, the Met has also announced a £10 million investment in 23 child-friendly interview suites across London. Plumstead Police Station served as the pilot, with six sites now complete. High-demand locations, including Brixton and Holborn, are among those earmarked for renovation.
What safeguards will govern the use of AI?
The Met confirmed that any AI use will operate within strict legal, ethical, and safeguarding frameworks, with specialist officers retaining full decision-making responsibility throughout. No AI system has yet been deployed; the work remains at an exploratory stage.
What did London’s Victims’ Commissioner say?
Andrea Simon welcomed the investment in interview suites but stressed that many victims withdraw before a charging decision is made. She said treating victims with care and dignity throughout every interaction with police is equally critical to improving outcomes.
