Previously we covered the possibility that AI could be used to augment the fielding and processing of first-response and emergency calls, such as those handled by 911 centers running computer-aided dispatch. That broader trend of AI in law enforcement was the subject of a recent Yahoo Finance article by Chris Bibey titled “From January 6th Rioters To Black Lives Matter Protests, Artificial Intelligence Is Becoming An Integral Part Of Law Enforcement.”

“Artificial intelligence (AI) technology continues to make its way into every industry, and that holds true of law enforcement,” Bibey writes. “Law enforcement agencies across the country have begun to integrate predictive policing, facial recognition and [other] technologies into their day-to-day work. To date, the use of facial recognition software to identify Capitol rioters on Jan. 6, 2021, is perhaps the most notable use case in recent history. But the technology has notably had a number of other use cases, both good and bad.”

Bibey goes on to explain that facial recognition has led to at least one wrongful arrest (as chronicled in The Guardian). That hasn’t stopped the proliferation of the technology, however. Moreover, with body cameras increasingly common in law enforcement, the sheer volume of footage now being generated makes it practically impossible to review it all in real time. Artificial intelligence, he explains, can be used to analyze that footage to find potential problems, misdeeds, and other events of note: incidents that would otherwise go unnoticed because the hours of human effort needed to process all that footage simply aren’t available.
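To make that idea a bit more concrete, here is a minimal sketch of what footage triage could look like: take segments of footage that some model has already scored for “events of note” and surface only the highest-scoring clips for human review. Everything here is illustrative; the scores, names, and threshold are assumptions, not any agency’s or vendor’s actual system.

```python
from dataclasses import dataclass

# Sketch of body camera footage triage: given segments that a hypothetical
# video/audio classifier has already scored, flag only the highest-priority
# clips so limited reviewer hours go where they matter most.

@dataclass
class ScoredSegment:
    camera_id: str
    start_seconds: int
    end_seconds: int
    score: float  # hypothetical 0.0-1.0 "event of note" score from a model


def triage(segments: list[ScoredSegment], threshold: float = 0.8) -> list[ScoredSegment]:
    """Return only segments at or above the review threshold, highest
    score first, so reviewers see the most likely incidents first."""
    flagged = [s for s in segments if s.score >= threshold]
    return sorted(flagged, key=lambda s: s.score, reverse=True)


if __name__ == "__main__":
    # Made-up example scores, standing in for real model output.
    footage = [
        ScoredSegment("unit-12", 0, 30, 0.12),
        ScoredSegment("unit-12", 30, 60, 0.91),
        ScoredSegment("unit-07", 0, 30, 0.85),
    ]
    for segment in triage(footage):
        print(f"Review {segment.camera_id} at {segment.start_seconds}s "
              f"(score {segment.score:.2f})")
```

The point isn’t the specific model; it’s that an automated first pass turns an unreviewable mountain of footage into a short, prioritized queue for human eyes.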

Bibey then explains that a North Carolina State University report from a few months ago underscores that AI technology could either bridge the gap between the public and law enforcement or widen that chasm. “On the plus side,” Bibey writes, “many law enforcement officials believe that AI technology has the potential to improve public safety. But there’s a growing concern that implementing this technology in certain situations could cause more harm than good regarding trust between police and the public.”

Beyond facial recognition and the use of AI to help process calls for first responders and emergency response, AI can also be applied to tasks like gunshot detection. More and more municipalities are deploying gunshot detectors to alert them to the need for a response, but of course not all “loud bangs” are equal. Knowing when to dispatch units and when not to could be a game changer for these listening technologies.
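To illustrate the kind of decision layer that might sit on top of acoustic detection, here is a rough sketch in which dispatch is only recommended when multiple sensors report a high-confidence gunshot within the same short window. The detections, thresholds, and corroboration rule are purely illustrative assumptions, not any vendor’s actual logic.

```python
from dataclasses import dataclass

# Sketch of a dispatch-decision layer on top of acoustic gunshot detection.
# The idea: a single sensor hearing one loud bang shouldn't send units,
# but several nearby sensors agreeing within a few seconds probably should.

@dataclass
class Detection:
    sensor_id: str
    timestamp: float   # seconds since epoch
    confidence: float  # hypothetical 0.0-1.0 "gunshot vs. other loud bang" score


def should_recommend_dispatch(
    detections: list[Detection],
    min_confidence: float = 0.7,
    corroborating_sensors: int = 2,
    window_seconds: float = 5.0,
) -> bool:
    """Recommend dispatch only when enough distinct sensors report a
    high-confidence gunshot within the same short time window."""
    strong = sorted(
        (d for d in detections if d.confidence >= min_confidence),
        key=lambda d: d.timestamp,
    )
    for anchor in strong:
        sensors_in_window = {
            d.sensor_id
            for d in strong
            if abs(d.timestamp - anchor.timestamp) <= window_seconds
        }
        if len(sensors_in_window) >= corroborating_sensors:
            return True
    return False


if __name__ == "__main__":
    events = [
        Detection("sensor-a", 1000.0, 0.91),
        Detection("sensor-b", 1001.5, 0.84),
        Detection("sensor-c", 1950.0, 0.55),  # low-confidence bang, ignored
    ]
    print(should_recommend_dispatch(events))  # True: two sensors corroborate
```

Requiring corroboration is one plausible way to keep a lone car backfire from triggering a response while still getting units rolling quickly on likely gunfire; the right thresholds would be a policy decision as much as a technical one.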

As with the use of AI in computer-aided dispatch software, there are always ethical and practical concerns. AI could (and very likely will) prove to be a valuable tool for public safety software supporting law enforcement and emergency responders… but as with anything, there will be details to iron out. There is no replacing a thinking, feeling human being when it comes to compassionately helping our fellow human beings… and that means, no matter how good the technology gets, the true power behind public safety software will always be the people designing, operating, and making use of it.