By Katherine Barrett and Richard Greene
It may appear that efforts to adopt an evidence-based approach, using data to improve the effectiveness, efficiency and fairness of law enforcement, had their genesis back in 1995, when New York City kicked off work on its so-called CompStat system. In that highly successful effort, geographic information systems, or GIS, were used to identify the places in the city where officers could be deployed to best effect. It worked so well that New York's crime rates plummeted, and a number of other places tried to emulate the work.

But while CompStat may have been at the forefront of using technology in this way, "the history of quantitative crime analysis spans decades," wrote Jennifer Bachner, a director in the Johns Hopkins University Center for Advanced Governmental Studies. As Bachner pointed out, in 1829 "an Italian geographer and French statistician designed the first maps that visualized crime data," including three years of property crime rates as well as education information drawn from France's census. The maps showed a correlation between the two: more education tended to equate to less crime.

Jump forward about 190 years and you'll find that a number of states, counties and cities have been using the seemingly magical capacity of computers to advance this work dramatically.