By Katherine Barrett and Richard Greene
It may appear that efforts to adopt an evidence-based approach, using data to improve the effectiveness, efficiency and fairness of law enforcement, had their genesis back in 1995, when New York City kicked off work on its so-called CompStat system. In that very successful effort, geographic information systems, or GIS, were used to identify the places in the city where officers could be deployed to best use. It worked so well that New York's crime rates plummeted, and a number of other places tried to emulate the work.

But while CompStat may have been at the forefront of using technology in this way, "the history of quantitative crime analysis spans decades," wrote Jennifer Bachner, a director in the Johns Hopkins University Center for Advanced Governmental Studies. As Bachner pointed out, in 1829 "an Italian geographer and French statistician designed the first maps that visualized crime data," including three years of property crime rates as well as education information garnered from France's census. The maps showed a correlation between the two: more education tended to equate to less crime. Jump forward about 190 years, and you'll find that a number of states, counties and cities have been using the seemingly magical capacity of computers to advance this work dramatically.
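The core idea behind CompStat-style hot-spot mapping can be sketched very simply: bin incident locations into grid cells and rank the busiest cells. The snippet below is an illustrative toy only, not CompStat's actual method; the coordinates (meters on a hypothetical local grid) and the 100-meter cell size are invented for demonstration.

```python
import math
from collections import Counter

def hot_spots(incidents, cell_size=100, top_n=3):
    """Bin (x, y) incident points into square grid cells and rank the busiest.

    Returns a list of ((cell_x, cell_y), count) pairs, busiest first.
    """
    counts = Counter(
        (math.floor(x / cell_size), math.floor(y / cell_size))
        for x, y in incidents
    )
    return counts.most_common(top_n)

# Invented example data: six incidents clustered in two areas.
incidents = [(120, 340), (130, 360), (150, 355),
             (820, 910), (830, 905), (400, 50)]
print(hot_spots(incidents))
# → [((1, 3), 3), ((8, 9), 2), ((4, 0), 1)]
```

A real deployment system would of course use geocoded incident records and proper GIS tooling rather than a flat grid, but the ranking-by-density principle is the same.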

The War on Marijuana in Black and White, a report published by the American Civil Liberties Union in June 2013, looks at marijuana possession arrest rates and racial disparities among these arrests, as well as the costs of enforcement, for all states and counties from 2001 to 2010.

The FBI’s annual Uniform Crime Report for 2010, in its preliminary findings, reports a 5.5% decrease in violent crime and a 2.8% decrease in property crime nationally (when compared with 2009 figures). And in 2009, these categories were down 5.5% and 4.9%, respectively.