The Hidden Costs of Police Technology: Evaluating Acoustic Gunshot Detection Systems

Many law enforcement agencies face questions about which technologies to invest in, a concern that is especially pressing in an era of contracting budgets. One such technology, the acoustic gunshot detection system (AGDS), is advertised as capturing gunshots above and beyond those called in by residents. The rapid pace of adoption within the field attests to its quick rise as “standard” law enforcement technology. ShotSpotter, one AGDS vendor, currently lists 93 cities as its clients, including major cities such as Chicago and New York.1 AGDS work by using microphones that capture and triangulate sound waves from gunshots. Anecdotal evidence suggests the systems can be effective in cases of active shooters and might assist officers in making arrests. The systems can accurately pinpoint a gunshot’s GPS location and forward this information in real time. It is difficult to deny that the systems generally work as intended—they do a fine job capturing gunfire in outdoor settings. What will be highlighted here, however, are the functionality and the potential hidden costs of such systems for law enforcement.2 Typical vendor costs for AGDS are around $65,000–$90,000 per square mile per year, but these figures account neither for the increases in call volume agencies may experience nor for any losses of efficiency in the response.3

A study of the efficacy of such a system installed for the St. Louis, Missouri, Metropolitan Police Department (SLMPD) found evidence that implementation of AGDS leads to a substantial increase in the total number of “shots fired” calls, costing departments additional response resources. This finding might not be unique, as other researchers have found that while AGDS might improve response time, it does not improve case resolution.4 Similarly, yet another study found that SENTRI sensors linked to CCTV in Philadelphia led to a 259 percent increase in reported gunshot incidents in the immediate vicinity of sensors in the eight months following installation.5 While an increase in incidents was expected, it was not matched by a rise in founded incidents. This finding suggests that police agencies with a high volume of gunshot calls might receive additional service requests from AGDS without gaining actionable data. Similar results emerge in the current analysis of SLMPD data, as the additional AGDS calls produce little actionable data. In addition, there is substantial evidence that citizen-initiated calls for “shots fired” decline in the AGDS areas. Combined, these results suggest that, while the total volume of calls increases with the implementation of AGDS, the quality of these calls is lower, leading to an overall reduction in actionable results. St. Louis’s experience with AGDS, the findings of the current study, and their implications are detailed herein.

St. Louis Experience and Evaluation

St. Louis is an ideal candidate to test AGDS, as the city suffers from a very high and very concentrated gun crime problem. In 2008, the SLMPD contracted with an AGDS vendor to cover a one-square-mile area in one of its densest gun crime hotspots. The city decided to expand its AGDS in March 2013 to cover over 3.5 square miles, despite evidence suggesting the technology had little impact on crime reduction.6 Since the expansion, the system has covered many high crime neighborhoods that together account for around 40 percent of the city’s violent offenses. Given the volume of data generated in the last six years, it was time to revisit the functionality of the system from a policing perspective.

Study Design

One of the key challenges in evaluating the functionality of AGDS rests in the fact that the companies that provide the service own the data. Because the equipment is leased rather than owned, companies might retain ownership of the data. Unlike in some cities, where AGDS calls for service may simply be assigned a “shots fired” code and therefore mixed in with citizen-initiated calls, the SLMPD devised a separate call code for AGDS notifications, not only allowing one to determine the exact neighborhoods covered but also allowing the examination of the impact AGDS might have had relative to other call codes.
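
To illustrate the analytical value of that design choice, a minimal sketch follows; the call codes, column names, and records are entirely hypothetical, but they show how a dedicated AGDS code lets in-house analysts separate sensor-generated calls from citizen-initiated ones with a simple query:

```python
import pandas as pd

# Hypothetical calls-for-service extract. A dedicated AGDS call code
# lets analysts split sensor-generated calls from citizen-initiated
# "shots fired" calls with a basic group-by.
calls = pd.DataFrame({
    "call_code": ["SHOTS_FIRED", "AGDS_ALERT", "SHOTS_FIRED", "AGDS_ALERT"],
    "minutes_on_call": [26, 22, 31, 18],
})

# Counts and average handling time by call type.
summary = calls.groupby("call_code")["minutes_on_call"].agg(["count", "mean"])
print(summary)
```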

Because AGDS locations were chosen from a strictly tactical point of view, without evaluation in mind, the present evaluation suffers from a lack of randomized control locations. In traditional experiments, treatment locations (e.g., those with AGDS) are assigned randomly, and a similar group of locations does not receive the treatment. From a police administrator’s perspective, withholding a possible solution from violent communities may seem unethical. While the authors concur in spirit, the issue for evaluators is that saturating the most violent communities with technology does not allow for a fully fair comparison. Given that the most violent places also tend to have the most volatile crime trends, such evaluations are biased in favor of positive results. By placing technology in the most violent communities of the day, one can expect to see the largest drops in those communities simply due to normal variation (regression to the mean). Consequently, one may falsely attribute a drop in gun violence to the placement of technology. To avoid such mistakes, the current study carefully selected other high crime control neighborhoods and examines long-term trends that smooth out some neighborhood volatility.
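
The bias can be demonstrated without any real data. The following sketch is a pure simulation with hypothetical figures: it “treats” the worst neighborhoods in one year, as a vendor-driven deployment would, and shows that they improve the next year even though nothing was done to them:

```python
import random

# Simulate regression to the mean: neighborhoods with the most gun
# violence in year 1 tend to record fewer incidents in year 2 even
# when nothing about them has changed. All figures are hypothetical.
random.seed(42)

# Each neighborhood has a stable underlying rate plus yearly noise.
true_rates = [random.uniform(20, 120) for _ in range(200)]
year1 = [r + random.gauss(0, 25) for r in true_rates]
year2 = [r + random.gauss(0, 25) for r in true_rates]

# "Treat" the 20 worst neighborhoods in year 1 and measure their
# change with no intervention at all.
worst = sorted(range(200), key=lambda i: year1[i], reverse=True)[:20]
before = sum(year1[i] for i in worst) / 20
after = sum(year2[i] for i in worst) / 20
print(f"Year 1 mean: {before:.1f}, Year 2 mean: {after:.1f}")
# The 'treated' sites improve on average purely through normal
# variation, which is why untreated high-crime comparison areas
# are essential.
```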

Results and Outcomes

Between the expansion of the AGDS in 2013 and February 2018, the SLMPD responded to roughly 17,000 calls for service generated by the AGDS. By contrast, in these same areas, the department responded to about 14,000 citizen-initiated calls for “shots fired.” The average time spent responding to calls was examined by tallying the travel and investigative time logged; officers log an average of 22 minutes on AGDS calls versus 26 minutes on citizen-initiated “shots fired” calls. Whereas AGDS calls were handled more quickly overall, there is no statistical difference in the time it took officers to reach the scene. The latter makes sense, as both types of calls share the same priority in St. Louis. A noteworthy finding is that AGDS adds almost 1,000 calls for service per square mile per year, and, although such calls are less time consuming than traditional “shots fired” calls, they do not get officers to the scene more expediently.
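
That per-square-mile figure can be checked directly from the totals reported above; a minimal sketch follows, treating the post-expansion period as roughly five years:

```python
# Rough check of the 'almost 1,000 AGDS calls per square mile per
# year' figure, using the totals reported in the text.
agds_calls_total = 17_000   # AGDS calls, 2013 expansion through Feb. 2018
years = 5                   # approximate length of that period
coverage_sq_mi = 3.5        # square miles covered after the expansion

calls_per_sq_mi_per_year = agds_calls_total / years / coverage_sq_mi
print(f"{calls_per_sq_mi_per_year:.0f} calls per square mile per year")  # ~971
```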

This substantial number of additional calls for service must be incorporated into the cost of AGDS, especially in high crime areas where workloads are substantial to begin with. Assuming, conservatively, that only one officer responds to these calls, the current estimate suggests that over 1,200 hours each year are consumed in St. Louis responding to such calls. Given that a police officer may cost a department about $75 per hour (training, gear, salary, benefits, and so forth), St. Louis likely spends $90,000 annually, or roughly $25,000 per square mile, on top of the cost of the AGDS itself. What seems like an annual commitment of $65,000–$90,000 per square mile may thus more likely be $90,000–$115,000. The latter assumes that all other things stay equal, however.
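
For transparency, the cost arithmetic is easily reproduced; every input below comes directly from the figures reported above:

```python
# Back-of-the-envelope reproduction of the hidden response cost.
officer_hours = 1_200    # annual officer hours consumed by AGDS calls
cost_per_hour = 75       # rough fully loaded cost of an officer hour ($)
coverage_sq_mi = 3.5     # AGDS coverage area in St. Louis

annual_cost = officer_hours * cost_per_hour   # $90,000
per_sq_mi = annual_cost / coverage_sq_mi      # ~$25,700

print(f"Annual response cost: ${annual_cost:,}")
print(f"Per square mile: ${per_sq_mi:,.0f} on top of vendor fees")
```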

A more formal statistical evaluation of the system examines changes in the number of reported “shots fired” and “shootings.” Given the sizeable expansion of AGDS in St. Louis after 2013, the new AGDS neighborhoods are compared against similar high crime neighborhoods without AGDS; the analysis also includes a reexamination of earlier data from the first implementation in 2008 (see Figure 1). After both implementations of AGDS, some residents in the impacted neighborhoods stopped calling in shots fired. Based on multivariate analysis comparing results before and after installation of AGDS and controlling for factors known to be associated with crime changes, a statistically significant reduction of about 25 percent emerges for “shots fired” calls by residents of the AGDS neighborhoods. The St. Louis AGDS was turned off due to budgetary constraints for a period of about four months in 2016. This period provides a valuable additional measuring point: analysis shows that community members’ calls for “shots fired” returned to pre-AGDS levels during this four-month period, increasing by about 25 percent. Nearly identical results emerge when examining civilian calls for “shootings.” In sum, as AGDS expands, traditional citizen-initiated calls for service decline. Overall, however, calls for gunshots (both AGDS- and citizen-initiated) still increase. Annual citizen-initiated calls in the impacted neighborhoods may drop from about 3,700 to about 2,800, but because AGDS add 3,400 new calls in an average year, the new annual average hovers around 6,200 calls for service, a 67 percent increase.
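
The net effect on workload follows directly from the call counts reported above:

```python
# Worked check of the net change in annual call volume.
citizen_before = 3_700   # citizen "shots fired" calls per year, pre-AGDS
citizen_after = 2_800    # citizen calls after the ~25 percent decline
agds_per_year = 3_400    # new AGDS-generated calls in an average year

total_after = citizen_after + agds_per_year        # 6,200 calls
increase = (total_after - citizen_before) / citizen_before
print(f"Total annual calls: {total_after:,} ({increase:.1%} increase)")
# ~67.6 percent, the roughly 67 percent increase cited above.
```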

Figure 1. AGDS and “Shots Fired” Calls for Service in St. Louis, 2005–2017

At first glance, this increase is a positive effect—in theory, more calls mean more crimes uncovered, more people arrested, and more lives saved. The authors had the same thought and decided to dig a little deeper. For all 19,000+ AGDS calls over the last decade, police officers were led to the scenes of 5 homicides, 58 aggravated assaults, and 2 robberies (many of which were also called in by community members). After cross-referencing the calls-for-service database with arrest data, the current study was able to identify only 13 arrests uniquely tied to the AGDS calls. For a city with between 100 and 200 homicides annually, this is not exactly a great catch, especially considering the AGDS neighborhoods produce over 40 percent of total violent incidents. After examining the data more closely, it appears that every 100 AGDS calls for service generates 0.9 founded crime incidents (including both Part I and Part II crimes); regular community member calls, by contrast, generate 7.6 crime incidents per 100 calls. This indicates that alerting police to potential activity is not enough; human intelligence supporting that information is critical to turn a notice of potential activity into something police can act upon. In effect, prior to the implementation of AGDS, community member calls for service in the impacted neighborhoods were generating an estimated 281 founded crime incidents per year; after the system was installed in 2013, an estimated 243 founded crime incidents were generated (212.8 from traditional calls and 30.6 from AGDS). Despite responding to more calls for service, officers receive less actionable data on the ground. A reduction in uncovering founded crimes through “shots fired” calls for service should not be equated with a reduction in crime; the current evaluation, for example, finds no crime trend differences between AGDS neighborhoods and similar areas. The only reasonable conclusion is that AGDS produce less actionable data and are less efficient than traditional sources of information.
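
The founded-incident estimates above follow from the per-100-call rates and the annual call volumes already reported:

```python
# Reproducing the founded-incident estimates from the reported rates.
citizen_rate = 7.6 / 100   # founded crimes per citizen-initiated call
agds_rate = 0.9 / 100      # founded crimes per AGDS-generated call

before = 3_700 * citizen_rate                      # ~281 per year
after = 2_800 * citizen_rate + 3_400 * agds_rate   # 212.8 + 30.6 = ~243
print(f"Founded incidents before AGDS: {before:.0f}")
print(f"Founded incidents after AGDS:  {after:.0f}")
```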

The question that cannot be fully answered is why fewer people call police about gunshots once AGDS are in place. Two very different reasons are proposed. One, people might simply wait a few minutes to notify police after hearing gunshots, perhaps assuming someone else has already called; once they hear sirens or see a patrol car, they decide not to call. In this case, AGDS simply preempts an already cold call. Two, people know AGDS are in their area and assume officers do not need the public’s help to find the right spot after gunshots are detected. In the latter case, AGDS might well cause officers to miss out on actionable human intelligence.

Conclusions and Recommendations

Results show that AGDS seem simply to replace traditional calls for service, and do so less efficiently and at a greater monetary cost to departments. Given the tepid results in guiding police to the scenes of crimes and the hidden costs of these systems illustrated here, AGDS might not be well-suited for the audience the technology is marketed toward. High-volume agencies will likely experience substantial increases in their call volumes with remarkably little to show for it, at a cost that might have taxpayers questioning the logic behind the expense. While this technology can be useful, especially from an analytical point of view, it is difficult to see how agencies benefit from expensive technology that increases financial strain on departments when its only discernible impact is fewer founded crime incidents.7 To be fair, the current study examines only St. Louis, and results might differ in places with a dissimilar policing context. There is good reason to be confident, however, that these results will translate to agencies with crime levels and budgetary constraints similar to those of St. Louis. Agencies with a high workload simply cannot lift AGDS calls to the highest priority, meaning response time is equal to that of traditional calls. Essentially, if officers cannot be teleported to the scene instantaneously, agencies cannot expect big results from AGDS.

One might ask: Why is it, then, that I hear so many positive things about AGDS? There are a few reasons that AGDS technology has been able to insert itself into the police technology market. For one, the technology has been riding the broad downward slide in crime that has taken place since the 1990s, especially in high crime areas that saw even steeper declines. It may certainly seem that AGDS areas show a larger downward trend in gun crimes when compared to overall city levels, but once one compares like areas, those differences disappear. Two, anecdotal evidence is powerful and inspires the public and politicians alike. A few highly publicized cases in which AGDS technology led to a high-profile arrest or stopped an active shooter are certainly eye-catching, but, as the above analysis shows, these are not frequent occurrences. In effect, the systems get lucky occasionally, but, in St. Louis, the average results over nearly a decade show that AGDS are no match for the efficiency of human intelligence. The tight grip that some AGDS companies keep on the underlying data makes it difficult for policy makers to quickly analyze the efficacy of such systems. Finally, companies selling AGDS suggest that traditional shots fired calls are flawed; on its website, one vendor proclaims: “On average only 20% of gunfire incidents are reported to 911. This means police are working with an 80% deficit in intelligence as it relates to gun violence and are unable to respond to the majority of shootings.”8

At the same time, these companies argue that a decline in traditional shots fired calls is “proof” of their systems’ efficacy; in discussing specific results in Oakland, California, a vendor’s website states: “There has been significant progress in these efforts, with a 29% overall reduction in gunfire incidents per day from 2012-2017, and a 71% decrease in gunfire incidents per square mile from 2012-2017.”9 The current study disagrees and finds that traditional citizen-initiated “shots fired” calls provide more actionable data for law enforcement. That such calls decline upon implementation of AGDS—without an accompanying actual reduction in gun crimes—is evidence that AGDS have a suppressive effect on actionable data.

For readers undaunted by the cautionary results of this study, the authors offer a few pieces of advice. Always have a plan in place to evaluate new technology, and work with independent researchers or analysts in the department before implementation. Simple steps can help achieve a reliable evaluation in little time. Randomization of treatment is important to allow comparison of similar areas. Vendors will likely push for blanket coverage of all high crime locations and will make a persuasive argument that grants may cover most of the cost; however, aside from biasing results in favor of the technology, as illustrated above, blanket coverage means departments must still supply the officers responding to the additional calls for service. The current results strongly recommend small steps in implementation, making sure that not all high crime areas receive immediate coverage, since blanket coverage makes it harder to determine the impact the technology might have and whether it will work in that specific policing context. For AGDS, it is additionally important to create new or separate incident codes in calls-for-service databases so specific effects may be measured and compared. Evaluation of technology does not necessarily have to involve the latest and most complex statistical modeling; if the technology is implemented with evaluation in mind, simple statistical tests that can be done by in-house staff, such as the sketch below, are indicative enough to inform decisions about next steps.
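
As one illustration of how simple such an in-house test can be, the following sketch compares before-to-after changes in monthly citizen “shots fired” calls for treated and untreated neighborhoods using a basic two-sample t-test. The counts are entirely hypothetical, and SciPy is assumed to be available:

```python
from scipy import stats

# Hypothetical before-to-after changes in monthly citizen "shots
# fired" calls: one value per neighborhood.
agds_change = [-4, -7, -2, -5, -6, -3, -8, -4]   # AGDS neighborhoods
control_change = [-1, 2, -2, 0, -3, 1, -1, 0]    # comparison neighborhoods

# A basic two-sample t-test is often all that is needed to see whether
# citizen calls dropped more in AGDS areas than in comparable areas.
t_stat, p_value = stats.ttest_ind(agds_change, control_change)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```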

Notes

1 “Cities,” ShotSpotter.

2 Cory Watkins et al., “Technological Approaches to Controlling Random Gunfire,” Policing: An International Journal of Police Strategies & Management 25, no. 2 (2002): 345–370; Yasemin Irvin-Erickson et al., “What Does Gunshot Detection Technology Tell Us about Gun Violence?” Applied Geography 86 (2017): 262–273.

3 Matt Drange, “We’re Spending Millions on This High-Tech System Designed to Reduce Gun Violence. Is It Making a Difference?” Forbes, November 18, 2016, viewed November 8, 2018.

4 Kyung-Shick Choi, Mitch Librett, and Taylor J. Collins, “An Empirical Evaluation: Gunshot Detection System and Its Effectiveness on Police Practices,” Police Practice and Research 15, no. 1 (2013): 48–61.

5 Jerry H. Ratcliffe et al., “A Partially Randomized Field Experiment on the Effect of an Acoustic Gunshot Detection System on Police Incident Reports,” Journal of Experimental Criminology 15, no. 1 (2018): 67–76.

6 Dennis Mares and Emily Blackburn, “Evaluating the Effectiveness of an Acoustic Gunshot Location System in St. Louis, MO,” Policing 6, no. 1 (2012): 26–42.

7 Jillian B. Carr and Jennifer L. Doleac, “The Geography, Incidence, and Underreporting of Gun Violence: New Evidence Using Shotspotter Data,” SSRN Electronic Journal (2016).

8 “ShotSpotter Flex,” ShotSpotter.

9 “ShotSpotter Enables Police to Have Greater Impact,” ShotSpotter.