Artificial Intelligence and Community-Police Relations


Your spouse yells from upstairs: “Honneyyy! The Wi-Fi is down again!” It’s the third time this week your home’s wireless has stopped working. “Okaayyy!” you bellow up the stairwell. “Rebooting!”

This time the reboot doesn’t work.

You check the cable carrier website, logging in successfully after trying several passwords. You click “Internet,” certain that you will be routed and rerouted to page upon page, ultimately having to call and wait on hold for an interminable time, while your spouse continues to “encourage” you from upstairs—“Honey! The Wi-Fi is still down!”

But something different happens this time. A chat balloon appears on your webpage. “Hi,” says the text. “I’m Sally, your Wi-Fi assistant. What seems to be the problem?”

Can this really be happening? Are you really getting immediate assistance via one-on-one chat with someone who can really help? It sure seems like it. Within minutes, your home’s Wi-Fi is up and running.

Here is the big secret: There was no one-on-one assistant. A computer helped fix your computer.


Virtual assistants like the cable maven Sally in the preceding example have proven that timely, outstanding customer service is possible—even when companies are providing it from the server room or in the cloud.1

So, does it matter whether it was a real person if it got the job done and made the user feel like he or she got personal attention? Most commercial entities are successfully betting that it most assuredly does not. This raises a question: If the business world is using virtual assistants successfully, why aren’t law enforcement agencies?

To explore whether virtual customer service can work for law enforcement, it’s necessary to consider some further questions:

• How might law enforcement use artificial intelligence to impact self-reporting portals and public interface points used by police departments?

• Can artificial intelligence reduce labor costs without compromising swift, accurate customer service?

• In what ways can artificial intelligence fulfill the obligation of police to make their community feel attended to and heard?

• How can the capabilities of artificial intelligence be “supercharged” to impact crime control and public safety, with a “community policing” feel?

Connecting Artificial Intelligence to Community-Police Relations

Artificial intelligence (AI) is the use of computers to simulate some of the precise processes humans use to consider and interpret information and to provide decisions that mimic human reasoning.2 This technology has been applied in various forms, from complex problem-solving to strategic applications and games to virtual and simulated human interaction. AI is already playing an important role as an effective partner to law enforcement. Facial and character recognition, critical data extraction from mountains of digital evidence, and advancements in the battle against child exploitation and online human trafficking enterprises are all important examples of AI in policing that will continue to play a big part in preserving public safety for years to come.3

In policing, self-reporting processes and public-police interfaces happen in various ways. The classic form of public reporting may be a police agency’s public website or online reporting portal. New inquiries from the public to the police also come in via telephone, text message, email, and various social media platforms.4 Additionally, these interaction points have become critical recruiting grounds for the next generation of public safety employees.5 Every contact that members of the public have with the police—for any reason, and through any medium—results in an interaction that positively or negatively impacts trust and confidence in police and affects the public perception of police as an element of society.

A law enforcement agency’s relationship with its community relies on two-way communication and effective partnerships.6 These two factors not only enable members of the public to influence peace and order in their neighborhoods, but also foster community-police collaboration in hiring officers who reflect the community they serve and in applying effective tools to stop and solve crime.7 However, effort must be put into building and maintaining the community-police relationship. Like commercial businesses, law enforcement agencies supply a “product”—peace, order, and quality of life—to “customers.” Law enforcement’s customer base includes members of the public, including, arguably, crime suspects.

These customers sometimes need immediate help and call police via 911. However, many of them need a much lower intensity of help than an armed police response. They need advice, resources, or sometimes just a place to vent their frustration. Prudent law enforcement agencies look for ways to provide excellent service while conserving heightened-risk responses for the times they are truly needed and efficiently utilizing the officers trained to address such crises. This requires tiered responses prioritized by levels of risk, urgency, and need for in-person assessment. The challenge, however, is ensuring that those receiving less urgent services have the same high-quality customer service experience as those with more urgent requests.

Interaction with the public is crucial to establishing trust in law enforcement and the safety of a community. Automated systems like online reporting provide good data, but only if the system is used—and only if the right questions are being asked and answered; otherwise it is “garbage in, garbage out.”8 As agencies transition from traditional to AI-enhanced online reporting, the ideal interactive online system will not only collect important data, but also serve as a bridge between the public and the police that leads to improved community-police relations.

Police Virtual Assistant: Safe, Efficient, and Friendly

One might ask, “What if a police agency added a virtual assistant, driven by artificial intelligence, to make its online reporting system more community friendly and better at collecting data?” Law enforcement could create an AI-driven online persona that would identify crises for immediate transfer to dispatch, ask the right questions of the public to help give excellent service, and sort the answers into excellent data for intelligence-led policing and public safety. The anticipated result? A nearly zero-maintenance application that answers important questions, sorts and routes emergency calls, collects critical crime data, and even recruits the next generation of law enforcement professionals.9

How could this be done? The first priority is to ensure safety. A simple yes/no question system can be crafted and programmed to determine whether the public interacting with the AI element has an emergency. This type of initial question is already incorporated in an interactive public safety chatbot system being developed by CivitasAI, a startup in Silicon Valley.10
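To make the safety gate concrete, here is a minimal sketch of how such an initial yes/no screening question might work. All names are illustrative assumptions for this article; they do not represent CivitasAI’s actual system or API.

```python
# Hypothetical sketch of the safety-first gate: before any other interaction,
# a yes/no question determines whether the visitor has an emergency.

def triage(answer: str) -> str:
    """Return the next step based on the visitor's yes/no emergency answer."""
    normalized = answer.strip().lower()
    if normalized in {"yes", "y"}:
        # An emergency: stop the chatbot flow and route to a live dispatcher.
        return "transfer_to_dispatch"
    if normalized in {"no", "n"}:
        # Not an emergency: continue to the FAQ / reporting menu.
        return "continue_to_menu"
    # Ambiguous input: err on the side of safety and ask the question again.
    return "repeat_question"
```

The key design choice is that any input the system cannot confidently classify is treated as a possible emergency prompt to re-ask, rather than silently continuing.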

For non-emergency calls, a prudent place to start an AI-fueled interactive space could be to focus on the least complicated of police duties—answering questions and providing referrals. A list of frequently asked questions (FAQs) can be generated and used in conjunction with a list of local referrals and services. These lists are locally created and are consistently referenced by police everywhere. Cops know a lot about local resources—and savvy cops know what they don’t know and keep reference lists at hand. An AI system can use a network of FAQs and local references in a manner nearly identical to commercial industry, presenting a list of FAQs as a menu, as well as providing FAQ and resource answers to pre-programmed keywords.11
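The FAQ-plus-keywords model described above can be sketched in a few lines. The FAQ entries and keywords below are invented examples, not any agency’s real list.

```python
# Illustrative FAQ lookup: pre-programmed keywords map free-text questions
# to canned answers, with the full FAQ menu as a fallback.

FAQS = {
    "noise": "To report a noise complaint, call the non-emergency line.",
    "parking": "Parking permits are issued by the city clerk's office.",
    "records": "Copies of police reports can be requested from the records unit.",
}

def answer(question: str) -> str:
    """Match pre-programmed keywords in a free-text question to an FAQ answer."""
    text = question.lower()
    for keyword, response in FAQS.items():
        if keyword in text:
            return response
    # No keyword matched: fall back to presenting the FAQ menu.
    return "Here is a menu of frequently asked questions: " + ", ".join(FAQS)
```

In practice, commercial chatbot platforms layer synonym lists and natural-language matching on top of this basic keyword idea, but the underlying data is still the agency’s own FAQ and referral lists.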

Customer service via chatbot to answer basic frequent questions and requests is no longer cutting-edge; it is “mainstream” and accepted by the consumer public, although the capabilities are continuously evolving, and becoming more powerful.12 CivitasAI is already beta testing such systems for police in Dunwoody, Georgia, and Lafayette, Indiana, providing a chatbot that checks for an emergency, reacts to a variety of FAQs, and supplies readily available resources and referrals in an interactive chatbot format.13 The FAQ interaction model of AI is also being used by the Los Angeles, California, Police Department as a recruiting tool—the virtual recruiter “Officer Chip” answers thousands of possible questions posed by prospective LAPD officers.14

With the increasing demand to process crime data and display trends, both for intelligence-led policing and for public-facing crime maps, the need for accurate, quickly aggregated data is greater than it has ever been. Police departments in various regions use various types of online portals to facilitate public reporting of crimes that do not require a direct visit from a police officer. Although this approach helps keep cops on the street, the process of encouraging people to report crimes online is imperfect, for reasons such as distrust of the systems or the feeling that no follow-up will be done.15 In an online reporting study, users often felt that the online systems either did not understand what information they input, clearly didn’t “care,” or provided “no human interaction,” leading to the feeling of an unresponsive, sterile process without any hope of follow-up.16 All of these shortcomings could be addressed by the application of AI to this process.

The following are some of the more critical ways AI can help:

• AI can identify urgent inquiries and reroute them to 911 emergency operators (even if they are submitted via text or social media).

• AI can field basic information through an interactive webpage with a virtual assistant and route non-urgent reports to a virtual “officer.” This AI-based “officer” will understand the customer’s problem and provide referrals for non-police services or walk the customer through a detailed reporting process with valued follow-up.

• Police officer candidates cruising police websites will encounter a virtual recruiter, who will provide potential recruits with an image and a vision of a progressive law enforcement agency that just might be the right fit for them.

• Data collected by all these elements will be memorialized and analyzed, making the agency constantly better, smarter, faster, and more connected.

• AI customer service will operate all day, every day, throughout the year, just like the constantly vigilant officers deployed on the streets and the ever-prepared dispatchers staffing the 911 lines.
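The tiered routing in the list above can be sketched as a simple classifier. The keyword lists and destination names here are assumptions for illustration only; a production system would use a trained intent model rather than keyword matching.

```python
# Minimal sketch of tiered routing: urgent messages go to dispatch,
# recruiting inquiries to a virtual recruiter, everything else to the
# non-urgent virtual "officer".

URGENT_WORDS = {"gun", "fire", "hurt", "breaking in", "help now"}
RECRUIT_WORDS = {"career", "hiring", "apply", "recruit"}

def route(message: str) -> str:
    """Classify an incoming public message into a response tier."""
    text = message.lower()
    if any(word in text for word in URGENT_WORDS):
        return "911_dispatch"          # urgent: hand off to a live operator
    if any(word in text for word in RECRUIT_WORDS):
        return "virtual_recruiter"     # prospective officers
    return "virtual_officer"          # non-urgent reports and referrals
```

Note the ordering: the urgency check always runs first, so a message that is both urgent and recruiting-related still reaches dispatch.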

Adding Respect, Empathy, and Active Listening to AI

Empathy can be simulated very successfully by computers. Simple, proven techniques like authentic listening, paraphrasing, and reflecting the reporting party’s emotions are already part of the AI process for commercial customer service and chatbot interaction, with positive effects.17

Just as important as the addition of empathy is the reduction of perceived judgment or apathy. Virtual interactivity with an artificial counterpart removes a sense of judgment (since machines don’t have spontaneous natural reactions like humans). Machines programmed to be inquisitive and ask clarifying questions clearly appear to be paying attention to the reporting party, making apathy a non-issue.18 Further, these processes can easily be programmed to create and send follow-up emails thanking reporting parties for their reports and providing updates.19 These capabilities can provide a clear impression of proactive and impactful interaction with the public, a key characteristic of community policing.

Perhaps one of the most impressive outcomes of an AI-based police system is the ability to enhance gathered data to bolster hot spot or intelligence-led policing. Automated reporting systems can be intelligently tailored to drill down for accuracy through follow-up questions, enabling agencies to better document and track crime trends.20 These agencies can then use the more accurate data to better place “cops on dots,” leading to more effective resource deployment and strengthening their ability to solve or prevent crime. This is especially impactful in jurisdictions where small numbers of police personnel make accurate intelligence-led policing critical to operations, but it can be just as beneficial to larger agencies with wide areas to cover and a need to patrol with peak effectiveness.21
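The “drill down for accuracy” idea can be sketched as a completeness check: when a report is missing fields that analysts need for hot-spot mapping, the assistant asks targeted follow-up questions instead of accepting an incomplete record. The field names and prompts are illustrative assumptions, not a real agency schema.

```python
# Sketch of automated follow-up: return a clarifying question for every
# required field the submitted report is missing.

REQUIRED_FIELDS = ["location", "date", "description"]

PROMPTS = {
    "location": "Where did this happen? A street address or intersection helps most.",
    "date": "When did this happen (date and approximate time)?",
    "description": "Can you briefly describe what happened or what was taken?",
}

def followup_questions(report: dict) -> list:
    """Return clarifying questions for any required field the report lacks."""
    return [PROMPTS[f] for f in REQUIRED_FIELDS if not report.get(f)]
```

A report containing only a location would generate two follow-up questions (date and description); a complete report generates none, and the same mechanism produces the interactive, attentive feel described in the next paragraph.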

This apparently inquisitive process provides an enormous secondary benefit, as the resulting interactive feel enhances the caring, community policing–type experience for the reporting party.

Conclusion

Law enforcement agencies must create effective solutions to remain accountable and connected to their communities. For the vast majority of the public that is not in crisis and calling 911, AI might be the most influential emerging tool for progressive law enforcement. There are some actions law enforcement leaders can take today to become well-situated to engage AI as an active part of their community relations strategy:

1. Survey your agency’s needs. Take a look at your individual community and sort out where interaction that does not need to be in-person could be amplified. Could you present your agency’s highlights or its needed areas of transparency in a way that might be more interactive and interesting? Perhaps you feel you could do more with your recruiting webpage. Maybe your “contact us” page needs an improvement in sorting the different types of incoming communication so that one administrative assistant doesn’t have to evaluate and route both “officer complaints” and “traffic complaints.” Also, review your online reporting portal—file a fictitious report yourself as “Joe or Jane Citizen,” and see if your portal needs to be more attentive or satisfying. Every agency has different needs.

2. Research the tools that are available. What would serve you best—or at least get you started? Many departments have an FAQ page, or possibly several sets of FAQs under different topics, that could be used as the foundation of a very functional one-stop page facilitated by a “virtual assistant.” Compile your FAQs on various topics and couple them with the resource list that your officers no doubt carry with them. Browse some of the commercially used virtual assistants like those used by your local utility or cell carrier or retailers like Amazon or Best Buy. Walk through some question-and-answer sessions with those chatbots and see if you can imagine what they could do for your agency page. The sky is truly the limit for this emerging technology. Still skeptical? Just ask Siri, “Hey, Google,” or Alexa about it!

3. Beta test a small strategy. Some basic customer service applications are in full use and improving every day in the private sector. They could easily be repurposed for portions of the basic law enforcement mission in rapid fashion. In addition, companies focused on the public safety sector are emerging with similar solutions. Worried about the budget? It is true that the more customization and direct help agencies need, the more expensive it will be, but simple processes as a start could be built from scratch for much less by one of the agency’s tech-savvy employees using any of a plethora of available custom chatbot applications.

4. Continually reassess, adjust, and grow. Start with a small application on a seldom-viewed webpage and test it. Have agency members test it. Have trusted community members test it. Try expanding to a wider range of community members and repeat until leadership and staff are confident. Trial and readjustment are stalwart components of every problem-solving and evolutionary process, and the mantra of progressive technology.22 Try, fail modestly, improve—then readjust and go for it!

AI systems are here to stay, in all parts of society. Commercial companies are successfully using AI to quickly handle easily solvable problems and engage with their customer bases. Law enforcement agencies, often latecomers to emerging technologies, should follow this lead and leverage existing powerful tools now to provide assistance to those needing help, fuel intelligence-led strategies, connect with communities, and attract the next generation of law enforcement officers.

Notes:

1 Shep Hyken, “AI Is Super-Charging the Customer Service World,” Forbes, November 26, 2017.

2 Bernard Marr, “The Key Definitions of Artificial Intelligence (AI) that Explain Its Importance,” Forbes, February 14, 2018.

3 Laura Neitzel, “How Intelligent Computing Can Help Win the War on Crime,” BrandFocus, PoliceOne, December 27, 2018.

4 Federal Communications Commission, “Text to 911: What You Need to Know” April 16, 2018; Amanda Henderson, “Social Media Use to Report Crime Instead of Going to the Police,” Fox Illinois, December 7, 2017.

5 Kimberly A. Miller, David Pearson, and Joe Monroe, “Marketing & Hiring: How to Stand Out When You Are a Small Fish in a Big Pond” (presentation, IACP 2018 Annual Conference and Exposition, Orlando, FL).

6 Sarah Lawrence and Bobby McCarthy, What Works in Community Policing? A Best Practices Contexts for Measure Y Efforts (Berkeley, CA: The Chief Justice Earl Warren Institute on Law and Social Policy, 2013).

7 “3 Ways to Improve Police/Community Relations,” ICMA, February 27, 2015.

8 Thomas B. Cross, “Top Chatbot Features – Garbage in, Garbage Out,” Part 1, April 3, 2018.

9 Theo Douglas, “Los Angeles Chatbot Deputized to Help with Police Recruitment,” Government Technology, February 20, 2018.

10 Justine Fenwick (CEO, CivitasAI), personal communication, September 2018.

11 “10 Chatbot Best Practices for Mobile Customer Support,” botanalytics blog, July 24, 2017.

12 Hyken, “AI Is Super-Charging the Customer Service World.”

13 Fenwick, personal communication, September 2018.

14 Douglas, “Los Angeles Chatbot Deputized to Help with Police Recruitment.”

15 Jim Thomas, “Why Don’t People Report Crimes to the Police?” Legal Beagle, June 17, 2017.

16 Justine Fenwick, “Impact of Online Crime Reporting on Resident’s Trust in Their Law Enforcement Agency” (presentation, University of San Francisco, 2017).

17 Hal 90210, “Tired of Texting? Google Tests Robot to Chat with Friends for You,” Guardian, February 14, 2018; Jesse Lahey, “8 Key Skills of Empathy,” Engaging Leader (blog), May 13, 2015.

18 Tanya Abrams, “Virtual Humans Inspire Patients to Open Up, USC Study Suggests,” USC News, July 9, 2014.

19 Fenwick, “Impact of Online Crime Reporting on Resident’s Trust in Their Law Enforcement Agency.”

20 Daniel Newman, “Chatbots and the Future of Conversation-Based Interfaces,” Forbes, May 24, 2016.

21 Jennifer Bachner, “Predictive Policing: Preventing Crime with Data and Analytics,” The Business of Government, 86–90.

22 Colin Marshall, “‘Try Again. Fail Again. Fail Better’: How Samuel Beckett Created the Unlikely Mantra That Inspires Entrepreneurs Today,” Open Culture, December 7, 2017.


Please cite as

Dave Norris, “Artificial Intelligence and Community-Police Relations,” Police Chief online, June 12, 2019.