18 May 2021 | 13:47 Europe/London

‘Re-Counting Crime’ project investigates accuracy and precision of crime estimates

Two researchers from the University of Manchester (David Buil-Gil from Criminology and Alexandru Cernat from Social Statistics), alongside Ian Brunton-Smith from Surrey and Jose Pina-Sánchez from Leeds, have secured funding from the Secondary Data Analysis Initiative of the Economic and Social Research Council (£108,948) to investigate the accuracy and precision of crime estimates in their ‘Re-Counting Crime’ project.

[Image: a map of violent crime in Islington based on ambulance data.]

Understanding the amount of crime that occurs across geographic areas is important to society. Not only is this information used to help allocate police resources, but it is also a central theme of political debate, with apparent increases in crime serving as an indictment of existing law and order policies.

Academics also make regular use of crime statistics in their work, both seeking to understand why some places and people are more prone to crime, and using variations in crime to help explain other social outcomes. Members of the public can also refer to this information, with historic crime trends now included on many house-buying websites.

Currently, there are two main ways of estimating the amount of crime: directly, using police records of incidents known to them; and approximating crime using victimisation surveys like the Crime Survey for England and Wales, in which a sample of people is asked to report any victimisation in the past year. Both approaches are deficient: police records miss crimes that are never reported or recorded, while surveys are subject to sampling and reporting errors.

However, whilst theoretical work has highlighted a number of sources of potential error in these data, we currently lack an empirically robust quantification of the different sources of random and systematic error in each. Nor do we fully understand the potential impact that these errors might have on the estimates from academic work that makes use of these data, although evidence from other fields suggests that this impact may be substantial.
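To see why such errors matter, consider the classical measurement-error result: random noise in an explanatory variable biases a regression slope towards zero by the reliability ratio. The sketch below is purely illustrative and is not taken from the project; the variable names and parameter values are invented for the demonstration.

```python
# Minimal simulation of attenuation bias from classical measurement error.
# All names and values here are hypothetical, for illustration only.
import random

random.seed(42)
n = 100_000
true_slope = 2.0

x = [random.gauss(0, 1) for _ in range(n)]                # true measure (standardised)
y = [true_slope * xi + random.gauss(0, 1) for xi in x]    # outcome depends on true x
x_obs = [xi + random.gauss(0, 1) for xi in x]             # observed with noise (variance 1)

def ols_slope(xs, ys):
    """Simple least-squares slope of ys on xs."""
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    var = sum((a - mx) ** 2 for a in xs)
    return cov / var

# With error variance equal to signal variance, the reliability ratio is 0.5,
# so the slope estimated from the noisy measure sits near half its true value.
print(ols_slope(x, y))      # close to 2.0
print(ols_slope(x_obs, y))  # close to 1.0
```

The same mechanism applies when, say, a noisy area-level crime count is used as a predictor of some other social outcome: the estimated effect is attenuated even though the noise itself is purely random.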

This project uses cutting-edge measurement error methods from the fields of epidemiology, economics, and biostatistics to properly quantify the types of error that are present in police recorded crime and crime survey data.

Drawing on data from 2011 to 2020, this project investigates the extent of systematic bias and random error in these two data sources, and how this error may have evolved over time. The project also uses advanced statistical methods to generate adjusted counts of crime across England and Wales, providing a unique picture of how different crimes vary across space and time. Finally, these findings are used in tandem with ‘off-the-shelf’ measurement error adjustment techniques to demonstrate the potential influence that measurement error has on the findings of existing research.
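One of the simplest off-the-shelf adjustments of this kind is disattenuation: dividing an error-prone regression slope by a reliability ratio estimated from external information, such as a validation or re-interview study. The sketch below is a generic illustration of that idea, not the project's actual method, and all of the numbers are hypothetical.

```python
# Disattenuation of a regression slope by a reliability ratio.
# Hypothetical figures, for illustration only.
signal_var = 1.0   # variance of the true measure (assumed known,
error_var = 0.6    # e.g. estimated from a validation study)
reliability = signal_var / (signal_var + error_var)   # 0.625 here

beta_observed = 1.25                      # slope from the error-prone measure
beta_corrected = beta_observed / reliability

print(round(beta_corrected, 2))
```

In practice the reliability ratio is itself estimated with uncertainty, which is why the project's more sophisticated methods are needed; this correction only illustrates the basic logic.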

 
