Met Police's AI Tool Flags Misconduct, Sparking Privacy Concerns in Predictive Policing Debate
April 26, 2026
The Metropolitan Police launched investigations into hundreds of officers after deploying a Palantir-built AI tool to flag misconduct, including work-from-home violations, corruption, and serious allegations such as rape.
A Palantir Foundry pilot flagged officers for misconduct risk based on data such as sickness records, overtime, and absence patterns, triggering internal reviews.
Globally, the use of predictive policing is rising while governance lags behind, underscoring calls for transparent decision logic, certified auditors, and scrutiny of Palantir's international ties.
Metropolitan Police officials say the software aims to build trust, reduce crime, and raise standards by leveraging existing force data and strengthening vetting and accountability.
The Met has pledged to review the pilot by year-end amid ongoing governance discussions and demand for greater disclosure and oversight of predictive policing tools.
The Met frames the deployment as part of broader AI adoption plans, including talks to acquire Palantir technology for investigations and the use of drones and live facial recognition to enhance safety and reduce crime.
The Times reported that the AI flagged issues including sexual abuse of authority and fraud, with human review following automated flags.
The Met Commissioner said the aim is to identify risk earlier, act faster, and raise standards, noting that the vast majority of officers serve with integrity and that action will be taken against misconduct.
The system also flagged undeclared Freemasonry membership: 12 officers face gross misconduct investigations, and 30 more have received prevention notices over suspected but uncorroborated membership.
Palantir's associations with ICE and the Israeli military, along with an NHS contract under scrutiny, highlight broader political and ethical considerations surrounding the technology partner.
The pilot ran for months, raising privacy concerns and prompting an internal investigation; the ICO has urged privacy-by-design measures, and an impact assessment remains unpublished.
Eleven police forces use Palantir, and the Met employs about 46,000 officers; concerns about false positives and damage to careers have prompted calls from unions and MPs for safeguards and transparency.
Summary based on 2 sources
Sources

The Guardian • Apr 25, 2026
Met investigates hundreds of officers after using Palantir AI tool
Startup Fortune • Apr 26, 2026
Met Police Palantir AI flags hundreds of officers, sparking privacy probe