
Can Computers Cut Crime?

With all the political posturing in the UK that surrounded December’s General Election, it was no surprise that cutting crime rates was a central talking point — after “oven-ready” Brexit deals, of course!

In January 2019, it was reported that violent crime in England and Wales had risen by 19% in a year — the highest level since 2007.

A cartoon image showing two crossed yellow "DO NOT CROSS POLICE LINE" tapes against a dark background. Faint outlines of a body, suggesting a crime scene under investigation.

For the year ending June 2019, the Office for National Statistics reported:

  • 5% decrease in the number of homicides following a period of increases over the last four years (719 to 681 offences)

  • 4% increase in the number of police recorded offences involving firearms

  • 7% increase in the number of police recorded offences involving knives or sharp instruments

  • 11% increase in robbery

The report noted that many of these lower-volume, higher-harm types of violence were concentrated in metropolitan areas including London, the West Midlands, West Yorkshire and Greater Manchester.

What I want to know is what those in power are going to do differently. Bandying around astronomical monetary figures is one thing, but how will funding effect real change on our streets? What real-world policies or technology can be deployed to ensure a reduction in knife crime, gang violence and terrorism? And how soon can it be done?

Well, technology has already helped to reduce crime over the last 20 years — it just depends on the type of crime we’re looking at. Take car crime: with the introduction of central locking and electronic immobilisers, vehicle theft fell dramatically in the countries that required manufacturers to implement the technology. Thermal imaging, GPS, robotic cameras, gunshot detection systems, automatic number plate recognition and handheld lasers have also played their part in ensuring public safety on both sides of the Pond and beyond.

An article by Tom Gash in Wired in 2016 pinpointed a possible cause of the fall in murder rates in the United States, referencing a study led by University of Massachusetts professor Anthony Harris. Harris’ study found that US murder rates might be three times higher today were it not for advances in communication and medical technologies.

Much like Spielberg’s Minority Report, “predictive policing” plays a part. By applying mathematical, predictive analytics — and other analytical techniques — in law enforcement, it is possible to identify potential criminal activity. Predictive policing methods fall into four core categories: predicting crimes, predicting offenders, predicting perpetrators’ identities and predicting the victims of crimes.
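To make the first category concrete, here is a minimal, purely illustrative sketch of place-based crime prediction — ranking areas of a city by their history of incidents. The grid, coordinates and cell size are all invented for illustration; real systems layer far more sophisticated statistics on top of this basic idea:

```python
from collections import Counter

# Hypothetical past incidents as (x, y) coordinates on a city grid.
incidents = [(1, 2), (1, 2), (1, 3), (4, 4), (1, 2), (4, 4), (2, 2)]

def hotspot_ranking(incidents, cell_size=1):
    """Rank grid cells by historical incident count — the crudest
    form of place-based crime prediction."""
    cells = Counter((x // cell_size, y // cell_size) for x, y in incidents)
    return cells.most_common()

ranking = hotspot_ranking(incidents)
print(ranking)  # cells with the most past incidents come first
```

Even this toy version hints at the core critique: the model can only surface places where crime was *recorded* in the past, so patrol decisions based on it risk reinforcing whatever biases shaped the original records.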

Back in the UK, knife crime is particularly prevalent. To combat the rise, British company Thruvision has developed technology that can safely detect weapons — including knives, guns and explosive devices — concealed under clothing at distances of up to 30 feet. It works by revealing objects concealed in clothing that block a person’s body heat, without the need for physical searches. The technology is already in use on the Los Angeles Metro and is due to be trialled in the UK over a five-day period at Stratford station, East London. 

And whilst predictive policing has been lauded by the media in the past as a revolutionary innovation capable of “stopping crime before it starts”, others are more reserved about its merits.

A graphic image featuring a neon blue circuit board pattern in the shape of a human fingerprint, set against a dark background.

A RAND Corporation report from 2013 tempers enthusiasm slightly, stating:

"These tools are not a substitute for integrated approaches to policing, nor are they a crystal ball; the most effective predictive policing approaches are elements of larger proactive strategies that build strong relationships between police departments and their communities to solve crime problems… 

"To be effective, predictive policing must include interventions based on analytical findings. Successful interventions typically have top-level support, sufficient resources, automated systems providing needed information, and assigned personnel with both the freedom to resolve crime problems and accountability for doing so."


The report continues:

"They [predictive policing methods] cannot foretell the future. They can only identify people and locations at increased risk of crime.

"The operational value of predictive policing tools is in their contribution to broader law enforcement strategies that use the tools' risk assessments to inform resource allocation and problem-solving decisions.


"The collection and use of data on individuals has raised a number of concerns about privacy rights and civil liberties. An understanding of the legal precedent, along with regular audits, public outreach strategies, and greater community involvement and buy-in, have helped police departments address these concerns."

The concerns around privacy are of particular interest to me. 

As an ordinary citizen, I’m sure I’m not the only one who already feels like they’re being watched — or listened to — without having given explicit consent. Whether it’s Amazon recommendations or adverts on Facebook, I regularly get the sense that big corporates are utilising listening tactics to reel in their audiences. It’s all a bit 1984 — Big Brother is watching you… but where’s the real harm?

Predictive technology, on the other hand, has been accused of relying on discriminatory profiling that could lead to biased policing strategies. Personalised advertising is one thing, but potentially ending up in the dock for something you haven’t done is another thing entirely…   

A cartoon image depicting a smartphone broadcasting a live trial with the word "GUILTY" at the top. The screen shows a figure at a stand, flanked by yellow columns, under the scrutiny of a pointing hand and a judge's gavel. The scene is framed by two scales of justice on either side, set against a black backdrop.

In February 2019, BBC News reported that crime prediction software had allegedly been adopted by 14 police forces across the UK. When contacted by the BBC, two of the named forces confirmed that they had already stopped using the technology: Cheshire Police had trialled a mapping programme between January and November 2015, and Kent Police had introduced a predictive policing mapping tool in 2014 but decided not to renew the contract.

Several forces, however, are involved in the National Analytics Solutions (NAS) — a £4.5m proof-of-concept project funded by the Home Office and led by West Midlands Police. Drawing on information already held by the police — incident logs, conviction histories and so on — and employing machine-learning techniques, the project aims to calculate the “risk score” of individuals.
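Reporting on the NAS doesn’t detail its model, but a “risk score” of this kind is typically a probability produced by a statistical model over features drawn from police records. A minimal sketch of the general shape, assuming a simple logistic model — the features, weights and inputs below are all invented, not the NAS’s:

```python
import math

def risk_score(prior_offences, months_since_last, known_associates):
    """Toy logistic model: combines invented record-based features
    into a score between 0 and 1. Real systems learn their weights
    from data; these are made up for illustration."""
    z = (0.4 * prior_offences
         - 0.05 * months_since_last
         + 0.3 * known_associates
         - 2.0)
    return 1 / (1 + math.exp(-z))

score = risk_score(prior_offences=5, months_since_last=6, known_associates=2)
print(round(score, 2))
```

The point of the sketch is that the output is a probability, not a verdict — everything then hinges on how accurate the learned weights are and what the police do with a number like 0.57.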

On paper, closely monitoring serial offenders in our society sounds like common sense, but as things stand the concept of predictive policing is far from foolproof.

The Alan Turing Institute delivered an Ethics Advisory Report on the NAS, echoing such sentiments:

"The NAS document seeks to legitimise proactive and preventative policing. This is well meant in terms of preventing ‘harm’, but should differentiate the specific types and degrees of harm that might be mitigated or avoided, and consider the necessity and proportionality of measures taken in pursuit of such aims. Among our concerns are the ethical dangers of inaccurate prediction (false positives or negatives) given the state of the art of predictive policing. 

"We believe that the predictive risk model should be subjected to rigorous ex ante evaluation of effectiveness and ethical impact, and that a strong monitoring and evaluation programme should be included. Ethical issues abound, concerning surveillance and autonomy and the potential reversal of the presumption of innocence. 

"We believe that this whole question needs careful and evidenced discussion if the NAS is not to be based on algorithmic instruments and machine learning that are seriously flawed, at least for this kind of application. Work needs to be done to understand better the limits of the reliability of the analytics in the light of a fuller picture of all the consequences for citizens."
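The Institute’s worry about false positives is easy to quantify: when the behaviour being predicted is rare, even a seemingly accurate model flags mostly innocent people. A quick illustrative calculation, with entirely hypothetical numbers:

```python
def false_positive_share(population, base_rate, sensitivity, specificity):
    """Of everyone the model flags as high-risk, what fraction
    are actually false positives?"""
    offenders = population * base_rate
    true_pos = offenders * sensitivity                    # offenders correctly flagged
    false_pos = (population - offenders) * (1 - specificity)  # innocents wrongly flagged
    return false_pos / (true_pos + false_pos)

# A model that is right 90% of the time, applied to a behaviour
# with a 1% base rate in a population of 100,000:
share = false_positive_share(100_000, base_rate=0.01,
                             sensitivity=0.9, specificity=0.9)
print(f"{share:.0%} of flagged people are innocent")
```

With these assumed figures, roughly nine out of ten people flagged would be false positives — which is precisely why a risk score can only ever be one input into a broader, human-led strategy.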

Whilst safety and security are obviously — and rightly — paramount, it is also important that those leading our country focus on delivering fair and just practices. An algorithm is only as trustworthy as its accuracy: anything short of certainty makes it merely a guide, and its fallibility should not be forgotten at the expense of the innocent.

