Complexity and Lack of Standardization Make Crime Statistics Less Useful
A lack of standardization in how crime statistics are gathered, combined with the complexity of the causes of and cures for crime, makes those statistics difficult to use.
For instance, whether using marijuana causes crime is an important question as more states consider legalizing recreational marijuana. Legalization proponents could point to a paper in the Journal of Economic Behavior and Organization that found thefts decreased by 20 percent and rapes by as much as 30 percent post-legalization in Washington state.
But in a survey by The New Yorker of 75 sheriffs in states that legalized recreational marijuana, the 25 who responded were evenly split between seeing no change and being certain that crime had increased. The survey and the journal paper thus reach differing conclusions, illustrating how difficult it is to compare crime statistics.
The most basic problem in crime statistics is defining what a crime is. Behavior that is criminal in some places is not criminal in others.
The definition of what a crime is also changes over time, making historical comparisons difficult.
Further, how does one classify misdemeanors? Is speeding a crime? Is having an overgrown lawn? That is why the paper focused on rape and theft, behaviors that have historically been criminal in every state.
Even if the definition of crime were settled, there are differing methods of gathering crime statistics. The FBI’s Uniform Crime Reporting Program relies on information solicited from around 20,000 law enforcement agencies, while the National Crime Victimization Survey of the federal Bureau of Justice Statistics (“BJS”) is a random survey of households that relies on reports by victims — including those who did not report the crime to authorities.
This can lead to vast differences. For instance, the FBI statistics on rape show it almost doubled between 1973 and 1990, while the BJS data show a decline of 40 percent. Vanderbilt University researchers investigated the discrepancy and reported correlations between the FBI’s figures and an increase in the number of female police officers, the advent of rape crisis centers, and reformed styles of investigation. They concluded that the incidence of rape likely declined while the reporting of the crime greatly increased because of reforms in policing, swamping the decline and giving the appearance of an increase in the FBI statistics.
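The researchers' explanation rests on simple arithmetic: if the share of crimes that victims report to police rises faster than the true incidence falls, police-based counts can climb even as the crime itself declines. A minimal sketch, using made-up numbers rather than actual FBI or BJS figures:

```python
# Illustrative only: the incidence and reporting-rate values below are
# invented to show the mechanism, not taken from any real data set.

def reported_count(true_incidents: int, reporting_rate: float) -> int:
    """Number of incidents that show up in police-based statistics."""
    return round(true_incidents * reporting_rate)

# Hypothetical earlier year: many incidents, few reported to police.
before = reported_count(true_incidents=1000, reporting_rate=0.25)

# Hypothetical later year: true incidence down 40 percent, but policing
# reforms mean a far larger share of incidents is reported.
after = reported_count(true_incidents=600, reporting_rate=0.80)

print(before, after)  # reported counts nearly double despite the decline
```

A victim survey like the NCVS samples incidents directly, so in this sketch it would register the 40 percent drop in true incidence while the police-based count moved the opposite way.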
Finding such a coherent explanation is rare. Further, random increases in crime may soon return to average, a common phenomenon called “regression to the mean.” However, laws passed in response to the random increase may be incorrectly credited with the subsequent reduction. This makes it difficult to correlate crime rates with legal measures taken to counter crime.
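Regression to the mean can be demonstrated with a short simulation; the parameters below are arbitrary, not real crime data. Yearly counts fluctuate randomly around a fixed average, and years that spike well above that average tend to be followed by lower years, even though nothing about the underlying process changed:

```python
import random

# Illustrative simulation with invented parameters: crime counts drawn
# independently each "year" from the same distribution.
random.seed(0)
MEAN, SPREAD, YEARS = 100, 15, 10_000

counts = [random.gauss(MEAN, SPREAD) for _ in range(YEARS)]

# Pair each unusually high year (more than one SPREAD above the mean)
# with the year that immediately followed it.
spikes = [(counts[i], counts[i + 1])
          for i in range(YEARS - 1) if counts[i] > MEAN + SPREAD]

avg_spike = sum(s for s, _ in spikes) / len(spikes)
avg_next = sum(n for _, n in spikes) / len(spikes)

print(f"average spike year:     {avg_spike:.1f}")
print(f"average following year: {avg_next:.1f}")  # falls back toward 100
```

A law passed during one of those spike years would, on average, be followed by a drop in the count, and could be wrongly credited with causing it.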
The causes of crime also are hard to correlate to crime statistics because they are complex and interacting, and the results of intervention may be counter-intuitive or ignored simply because people believe in a program regardless of what the statistics show. For instance, a 1992 meta-analysis of 443 published studies on juvenile delinquency programs showed that a third of them did more harm than good. Yet programs such as D.A.R.E. and Scared Straight continue to be popular.
Likewise, a follow-up on the famous Cambridge-Somerville Youth Study, the first large-scale randomized controlled criminology study, showed that the youths selected to receive counseling, tutoring and summer camp were more likely to have committed multiple crimes, be alcoholic, or be mentally ill than those who received no intervention.
Criminologists agree that most people hold simple ideas about what causes crime, such as violent videos or music, or sexist or racist attitudes, and that those ideas are simply wrong. We should therefore be wary when simple explanations for changes in crime rates, or simple solutions for crime, are offered.