adam toledo, shotspotter, and the expansion of a carceral state
Early in the morning on March 29, a Chicago police officer shot and killed 13-year-old Adam Toledo in Little Village, a neighborhood on the Southwest Side of the city. Officials didn’t notify Adam’s mom until two full days later – she thought he was missing until officers showed up at her door, asking her to identify his body.
It’s not just Adam, either. Cops have a long history of killing Black and brown youth. In 1947, Beverly Lee, a 13-year-old Black boy, was shot in the back by a Detroit police officer. In 2016, LAPD shot and killed 14-year-old Jesse Romero. I vividly remember the protests in Pittsburgh the summer of 2018 after cops killed 17-year-old Antwon Rose, shooting him in the back as he ran for his life. Within a week of Adam Toledo’s death, Travon Chadwell and Anthony Alvarez were also shot and killed by CPD.
Police do not keep us safe. Police enact violence on marginalized communities in order to uphold a white supremacist and capitalist status quo. And when they are guilty of murder, as in the case of Adam Toledo, they’ll lie and drag their feet on releasing the evidence for as long as they can. I shouldn’t need to reiterate this fact, but sometimes it feels like I’m yelling at a brick wall.
It’s been two weeks since Adam died and we still don’t have a full picture of what exactly happened that night or who those cops were. The shooting was captured on an officer’s body camera, but the Civilian Office of Police Accountability (COPA) initially declined to release it, citing a state law that prevents agencies from publicly releasing videos involving minors. On Friday of last week, COPA reversed its decision and agreed to release the footage within 60 days, but only after facing significant public pressure from activists and elected officials. CPD will be releasing the body cam footage next week.
One fact we do know is that the cops were responding to a ShotSpotter alert for eight gunshots that night. So as we wait for more details (or for CPD to release the information they clearly do not want us to know), I decided to learn some more about ShotSpotter, the concept of which initially sounded sketchy (at best) to me. I’m always wary of throwing “data-driven” technology at police reform because 1) we shouldn’t be investing more in policing and 2) more often than not, tools like ShotSpotter are not neutral technologies; rather, they’re used to further extend the reach of a carceral and surveillance state.
ShotSpotter is an acoustic gunshot detection (AGD) system: it uses a network of sound sensors installed on rooftops and telephone poles to determine a gunshot’s location, then sends alerts, treated like 911 calls, to police departments so officers can be dispatched to the area. In practice, that means installing 20 to 25 sensors per square mile of coverage. When the sensors pick up audio from a potential gunshot, it’s analyzed by a computer and then sent to employees on call 24/7 who review the sound to make sure it’s actually a gunshot. All of this supposedly takes less than a few minutes, according to ShotSpotter’s company documents.
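To make the localization step concrete, here’s a toy sketch of how a sensor network can pinpoint a loud impulse from the tiny differences in when each sensor hears it. Everything in it – the sensor layout, the grid search, the numbers – is my own illustrative guess, not ShotSpotter’s actual (proprietary, unauditable) method:

```python
import math

SPEED_OF_SOUND = 343.0  # meters per second, roughly, at 20°C

# Hypothetical sensor positions (in meters) scattered over part of a square mile.
SENSORS = [(0, 0), (800, 0), (0, 800), (800, 800), (400, 400)]

def arrival_times(source):
    """When each sensor hears an impulse fired at `source` (fired at t=0)."""
    return [math.dist(source, s) / SPEED_OF_SOUND for s in SENSORS]

def locate(times, step=10):
    """Grid-search for the point whose predicted arrival-time *differences*
    best match the observed ones. We compare differences relative to the
    first sensor because the actual moment of the shot is unknown."""
    observed = [t - times[0] for t in times]
    best, best_err = None, float("inf")
    for x in range(0, 801, step):
        for y in range(0, 801, step):
            pred = arrival_times((x, y))
            rel = [t - pred[0] for t in pred]
            err = sum((a - b) ** 2 for a, b in zip(rel, observed))
            if err < best_err:
                best, best_err = (x, y), err
    return best

# A simulated gunshot at (620, 250): the network recovers the location
# from arrival times alone.
estimate = locate(arrival_times((620, 250)))
```

A real system works in three dimensions with GPS-synchronized clocks and has to cope with echoes, wind, and overlapping sounds; the machine classifier and the 24/7 human reviewers described above sit *after* this localization step, deciding whether the impulse was a gunshot at all.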
Currently, 110 cities and 12 college campuses in the U.S., South Africa, and the Bahamas use ShotSpotter. Although the company, which is publicly traded (NASDAQ: SSTI), was founded in 1996, it only recently signed its two largest contracts – Chicago and New York – in 2017 and 2015, respectively. Chicago accounted for 18% of its total revenue in 2020, and New York, 15%.
ShotSpotter isn’t cheap either. In 2018, Chicago expanded ShotSpotter by signing a $23 million contract running through the end of 2021 that covers over 100 square miles and 12 police districts. (Funding for Chicago’s initial trial period of ShotSpotter came from assets seized through drug-war forfeitures.) The company operates on a per-square-mile subscription model, which means that $23 million works out to roughly $75,000 per square mile, per year (ShotSpotter notes a typical contract runs $65,000 to $90,000 per square mile).
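The back-of-the-envelope math, using the contract figures reported above (roughly three years, roughly 100 square miles):

```python
contract_total = 23_000_000  # dollars, Chicago's 2018 contract
coverage = 100               # square miles (the contract covers "over 100")
years = 3                    # roughly 2018 through the end of 2021

per_sq_mile_per_year = contract_total / coverage / years
print(round(per_sq_mile_per_year))  # ≈ 76,667, in line with the ~$75k figure
```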
Imagine if we reallocated that amount of money each year to mental health or community support services across the city! Maybe then we would actually have funding for free mental health services at Gary Elementary, the school Adam went to. For context, Chicago’s police budget for 2021 is $1.69 billion, so ShotSpotter is still a tiny portion of an extremely bloated police budget.
So ShotSpotter is relatively expensive. Yet we don’t even know if it works – results vary across cities, and we basically have to take ShotSpotter’s or the city’s word for it because ShotSpotter is a private company whose collected data is “proprietary” and exempt from FOIA. A company memo even discourages agencies from responding to FOIA requests for their ShotSpotter data.
The company says it has a “97% aggregate accuracy rate” across all of its customers, but it’s unclear how this metric is calculated, and it can’t be verified because, well, ShotSpotter owns the data. Though ShotSpotter claims to use proprietary programs and human review to filter out non-gunshot sounds, there have been numerous examples of sensors picking up fireworks, plane crashes, and vehicle backfires.
These false positives are particularly harmful because cops responding to an alert come into a neighborhood assuming a violent situation with an active shooter, and act accordingly. Do we really want more places in our neighborhoods wired with sound sensors so that lethally armed cops can descend within minutes of any loud, gunshot-like sound?
Cities must also pick and choose neighborhoods for ShotSpotter because it’s so expensive. A report from the Urban Institute recommends cities put sensors only in places with high concentrations of gun violence to “maximize benefits” while “conserving resources”. The problem is that these are likely Black and brown neighborhoods that are already over-policed and surveilled. For example, there were seven surveillance cameras installed around the area where Adam Toledo was killed. And the more sensors an area has installed, the more gunshots it will report compared to other areas. This creates a self-reinforcing feedback loop where ShotSpotter data is used to further reinforce the need for more policing, justifying the investment in the first place. ShotSpotter is just one more way the police state justifies extending its numerous appendages deeper and deeper into already over-policed communities.
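You can see the feedback loop in a toy simulation (all numbers invented): two areas have the *same* true rate of gunfire, but one starts with double the sensor coverage, so it detects a larger share of its shots. If the city then shifts sensors toward whichever area “reports more,” the gap only widens:

```python
import random

random.seed(0)

TRUE_RATE = 5              # actual gunshots per week, identical in both areas
DETECT_PER_SENSOR = 0.10   # chance one sensor hears one shot (made up)

def reported(shots, n_sensors):
    """Shots the network picks up: each sensor independently hears
    each shot with probability DETECT_PER_SENSOR."""
    miss_all = (1 - DETECT_PER_SENSOR) ** n_sensors
    return sum(1 for _ in range(shots) if random.random() > miss_all)

sensors = {"A": 20, "B": 10}   # area A starts with double the coverage
totals = {"A": 0, "B": 0}

for week in range(52):
    for area in sensors:
        totals[area] += reported(TRUE_RATE, sensors[area])
    # Each quarter, reallocate a sensor toward the "higher-crime" area.
    if week % 13 == 12:
        hot = max(totals, key=totals.get)
        cold = min(totals, key=totals.get)
        sensors[hot] += 1
        sensors[cold] = max(sensors[cold] - 1, 0)

# Despite identical true gunfire, area A ends the year with more reported
# shots AND more sensors than it started with. The data justifies itself.
```

The underlying gunfire never differed; only the measurement did. Swap “sensors” for “patrols” and you have the standard critique of predictive policing built on biased inputs.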
ShotSpotter claims that equipping police departments with faster and more accurate data on gunshots leads to lower rates of gun crimes. You can see this on their “results” page: city after city is listed with a percentage reduction in gun-related crimes. I’m highly skeptical of these stats because there could be many confounding variables, making it hard to prove any sort of causal relationship.
But it’s also unclear how much more effective ShotSpotter is than ordinary 911 calls. In 2017, the South Side Weekly analyzed roughly 4,000 ShotSpotter-linked events and found that only about 10 percent yielded enough evidence for CPD to open an investigation – roughly comparable to the 14 percent of 911 calls for which CPD opens cases.
Moreover, having police arrive faster at the scene of a potential gun crime doesn’t mean that more crimes will be solved, nor does it mean more victims will be saved, especially when those victims are Black or brown. In 2016, 22-year-old Courtney Copeland was shot, but when the cops arrived at the scene, they treated him as a suspect rather than a victim and delayed calling an ambulance, costing him precious minutes that could have saved his life.
Perhaps what is more worrisome and insidious is what ShotSpotter chooses to do with the data it collects. As I mentioned earlier, the company owns all of the data, not the city or the police department. It’s a common strategic play among SaaS (software-as-a-service) companies: retain your clients’ data and monetize it. ShotSpotter CEO Ralph Clark told ABA Journal in 2016 that “we don’t want the data to be given away so that other people could derive value from the process.” But when ShotSpotter privatizes data, any accountability mechanism is gone. There is no way for public agencies to audit what data ShotSpotter is collecting, or how it is using (or selling) it. The data is now in the hands of a profit-maximizing company that may then run machine learning algorithms on that data and sell it right back to the city as a predictive policing product at a higher price point.
Guess what? That’s exactly what ShotSpotter plans to do as part of its “growth strategy”, according to its most recent 10-K. In 2018, ShotSpotter acquired a risk-modeling and AI company called HunchLab in order to build a predictive policing platform. HunchLab was also used by CPD at the time, likely as part of CPD’s controversial predictive policing program that ran from 2012 to 2020. ShotSpotter rolled HunchLab into a new predictive policing product called ShotSpotter Missions, since renamed ShotSpotter Connect, which “forecasts” likely hotspots of gun crime and helps police departments deploy officers to certain patrol areas.
It’s a well-known refrain among critics that predictive policing programs are racist, especially since many are built on arrest records, which say more about how an area or population is policed (i.e., higher arrest rates for Black people) than about the actual nature of crime there.
ShotSpotter says its data doesn’t rely on arrest records but rather “objective” data such as gunshot sounds and the number of liquor stores and banks in an area. But that reasoning leads to quite a slippery slope: the more police racially profile suspects and the more incompetent they prove themselves, the more money they’ll receive to reform themselves with supposedly more “objective” technology like ShotSpotter. You see? Not only is ShotSpotter a form of tech-washing for existing biases in policing, but it’s a mechanism to actively expand the police state by “reforming” the reforms.
Also, what constitutes an “objective” dataset? What about the underlying ideologies and methodologies used to analyze said data, and can those even be “objective”? The South Side Weekly has an op-ed on UChicago Crime Lab’s symbiotic relationship with CPD, and how their research methods, which were presented as rational and objective (and also utilized ShotSpotter data), were rooted in racist assumptions.
“…the implicit assumption of their work is that the individual choices of Black and brown Chicagoans are the fundamental cause of intracommunal violence. This approach strips the concept of crime of any structural context, and blames the victims of oppressive structures for their plight instead of indicting the social system that harms them.”
Beyond the question of whether an algorithm is racist or not though, we ought to also question why such data should exist in the first place, what context it exists in, and what purpose it really serves.
Ultimately, investments in technologies like ShotSpotter are used by the state to surveil people it deems dangerous and expendable. Eventually, policing will become so embedded in communities like the block in Little Village where Adam died, and in countless other Black, brown, and immigrant communities across the country, that those communities become nothing but open-air prisons.
And who stands to benefit from all this? Shareholders whose bank accounts literally grow as the number of bodies ShotSpotter surveils grows. That is the logic of carceral capitalism, and it’s a logic inherent to all the technology companies who compete tooth and nail for lucrative government contracts as the state outsources large portions of its apparatus in the name of neoliberalism.
I hate to end this post on a defeatist note like this, but I do think it is important to be aware of how the gears turn behind the scenes and to question why things are the way they are. Technology isn’t inherently bad, and I’m not against technological advancement. I just think we need to turn a critical eye toward how technology is being used and not have blind faith in brute technological progress. I’ll end with this quote from Jackie Wang, author of Carceral Capitalism, which gives me food for thought on how we might reframe technology for alternative uses:
“Maybe if the context in which data collection took place was not defined by capitalism and white supremacy, we could start thinking about other uses for data — we could use data to determine social needs and resource redistribution rather than punishment and profits.”
Thanks for bearing with me and making it to the end of this long post. As always, if you have questions or feedback, I’d love to hear from you.
Amy