British police forces are preparing to trial artificial intelligence tools designed to forecast criminal activity, raising legal, ethical, and civil liberties concerns about surveillance and pre-emptive law enforcement.
By yourNEWS Media Newsroom
Police forces across the United Kingdom are preparing to trial artificial intelligence systems designed to predict and prevent crime before an offence is committed, according to senior policing officials, as part of a broader push to expand the use of advanced analytics and surveillance technologies in law enforcement.
Sir Andy Marsh, head of the College of Policing, said police forces are currently involved in roughly 100 AI-related projects nationwide. Many are aimed at reducing administrative burdens such as paperwork, but others involve predictive analytics intended to identify individuals or locations deemed at higher risk of future criminal activity.
The predictive approach has drawn comparisons to “The Minority Report,” the 1956 science fiction short story by Philip K. Dick, later adapted into a 2002 film, in which authorities carry out “pre-crime” arrests based on foreknowledge of future offences. Under the proposed policing initiatives, data-driven models would be used to forecast crimes before they occur, potentially allowing police to intervene earlier or deploy officers to anticipated hotspots.
According to a report by The Telegraph, the UK government has committed £4 million to develop an interactive, AI-driven map of Britain intended to identify potential crimes by 2030. Proposed uses include detecting early indicators of fights or anti-social behaviour and directing police resources toward areas considered at elevated risk of knife crime.
Marsh said predictive policing tools could also be used to identify men who pose heightened risks to women and girls. “We know the data and case histories tell us that, unfortunately, it’s far from uncommon for these individuals to move from one female victim to another, and we understand all of the difficulties of bringing successful cases to bear in court,” he said.
Home Secretary Shabana Mahmood has been a central advocate of expanding surveillance technologies, including live facial recognition systems. She has said she wants to transform policing through AI so that “the eyes of the state can be on you at all times.” According to a Guardian analysis, Mahmood has publicly likened her vision to the Panopticon, a Georgian-era prison concept developed by Jeremy Bentham and later analysed by philosopher Michel Foucault as a symbol of constant surveillance and social control.
Speaking at a Tony Blair Institute event attended by former prime minister Tony Blair, Mahmood said, “When I was in justice, my ultimate vision for that part of the criminal justice system was to achieve, by means of AI and technology, what Jeremy Bentham tried to do with his Panopticon. That is that the eyes of the state can be on you at all times… I think there’s big space here for being able to harness the power of AI and tech to get ahead of the criminals, frankly, which is what we’re trying to do.”
The renewed push for AI-led policing follows a recent controversy involving the misuse of artificial intelligence by law enforcement. Last week, Craig Guildford resigned as chief constable of West Midlands Police after it emerged that a decision to ban away supporters from a match between Aston Villa and Israeli club Maccabi Tel Aviv in Birmingham relied in part on false claims generated by Microsoft Copilot. The chatbot had fabricated a violent incident at a fixture between Maccabi Tel Aviv and West Ham United that never took place.
Despite that episode, plans to expand AI use within policing have continued, drawing criticism from lawmakers. Conservative MP David Davis warned that predictive policing risks undermining basic principles of justice. In a public statement, Davis noted that Minority Report was presented as dystopian fiction, yet police leadership now appears willing to adopt similar concepts in practice.
“If an AI system deems you to be at risk of committing a crime, how do you go about proving the AI is wrong, and you pose no threat? The impact on your life when falsely accused of something is enormous,” Davis said in a public post.
He further warned that predictive policing systems could entrench inequality. “The use of these sorts of predictive policing algorithms creates a postcode lottery of justice, reinforcing existing biases and inequalities. Infusing those systems with AI will only exacerbate the injustice,” Davis said.
Davis added that if the government wants to address the high proportion of unsolved crimes, it should prioritise traditional policing methods. “If the Government really wishes to start tackling the 94 per cent of crimes currently going unsolved, then it should focus on neighbourhood policing, stop wasting resources on things like ‘non-crime hate incidents’, which can amount to censorship, and tackle low-level, high-impact offending like burglary and phone theft,” he said.