Implicit Bias and Law Enforcement

By Tracey G. Gove, Captain, West Hartford, Connecticut, Police Department


Racial profiling has been an obvious point of contention between law enforcement and minority group members. Over the past decade, the term “bias-based policing” has been coined, and the subject has been the topic of much research and debate. It often paints the picture of ill-intentioned officers deliberately acting upon preconceived stereotypes and prejudices. What if, perhaps, there were another answer?

In the spring of 2010, professor Jerry Kang from the UCLA School of Law presented to Connecticut judges, prosecutors, public defenders, and police administrators on the topic of implicit, or hidden, bias. His talk shed light on what has become an increasingly popular subject in social science circles. In brief, researchers contend that implicit biases are predilections held by all that operate largely outside of one’s awareness. Although hidden, these biases are both pervasive and powerful.1 Much research on the topic has focused on racial bias and has netted some intriguing results.

While the science does have its detractors, the growing research and potential implications for the criminal justice field make this a topic with which all law enforcement personnel should be familiar. The reader may find this article to be interesting, provocative, and enlightening—or some combination of the three. The purpose of this piece is to raise awareness on a topic that is growing in popularity and that has begun to emerge in the criminal justice system. It is up to readers to decide whether the science is relevant and pertinent to their lives and their workplaces.

The article discusses implicit bias, the latest testing and research into the phenomenon, and practical approaches for law enforcement interventions as recommended by social scientists in the field.


Implicit Bias

The implicit bias phenomenon is being explored in many phases of the criminal justice system and is not limited to law enforcement. Specifically, implicit bias is being studied in judicial decision making (for example, jury selection, jury instruction, and sentencing decisions), as well as in hiring and promotion decisions within criminal justice agencies. Outside of the criminal justice field, the topic has been examined in the fields of education and medicine, as well as in CEO selection at Fortune 500 companies.

A discussion on implicit bias must start with a brief explanation of how the brain sorts, relates, and processes information. Much of the day-to-day processing is done at an unconscious level as the mind works through what Professor Kang calls schemas, which are “templates of knowledge that help us organize specific examples into broad categories. A stool, sofa, and office chair are all understood to be ‘chairs.’ Once our brain maps some item into that category, we know what to do with it—in this case . . . sit on it. Schemas exist not only for objects, but also for people. Automatically, we categorize individuals by age, gender, race, and role. Once an individual is mapped into that category, specific meanings associated with that category are immediately activated and influence our interaction with that individual.”2

When used to categorize people, these schemas are called stereotypes. Although the term stereotype carries a negative connotation, social scientists posit that stereotyping is simply the way the brain naturally sorts those we meet into recognizable groups.3 Attitudes, on the other hand, are the overall evaluative feelings, positive or negative, associated with these individuals or groups. That is to say, attitude is the tendency to like or dislike, or to act favorably or unfavorably, toward someone or something.

For example, “[I]f we think that a particular category of human beings is frail—such as the elderly—we will not raise our guard.” Also, “[I]f we identify someone as having graduated from our beloved alma mater, we will feel more at ease.”4 Lastly, when introduced to someone new, about whom nothing is known but who is reminiscent of an old, admired friend, one may instantly feel comfortable and at ease with that person.

Implicit bias, then, is said to include both implicit stereotypes and implicit attitudes5 and is shaped by both history and cultural influences (for example, upbringing; life experiences; relationships; and all manner of media—books, movies, television, newspapers, and so on). Research has shown that a person’s previous experiences (both positive and negative) leave a “memory record.”6 Implicit biases encompass the myriad fears, feelings, perceptions, and stereotypes that lie deep within the subconscious; they act on those memory records and exist without an individual’s permission or acknowledgement.7 In fact, implicit bias can be completely contradictory to an individual’s stated beliefs—a form of conscious-unconscious divergence.8


Measuring Implicit Bias

How is an unconscious bias measured? Social psychologists have developed a variety of instruments to measure such cognitions, the largest class of which relies on reaction-time analysis. The most widely used is the Implicit Association Test (IAT),9 which measures reaction time to certain stimuli. The centerpiece for research into implicit bias is Project Implicit, a collaborative effort among research scientists, technicians, and laboratories at Harvard University, the University of Virginia, and the University of Washington.10 The IAT tests a host of associations, including biases related to race, skin tone, gender, age, and weight.

The IAT is likened to a sorting game played on a computer and is available to the general public at http://www.implicit.harvard.edu. During the test, the participant is asked to sort categories of pictures and words. The premise is that two concepts closely associated in the participant’s mind should be easier to pair: “If the word ‘red’ is painted in the color red, the participant will be faster in stating its color than if the word ‘green’ is painted in red.”11

As an example, consider the Age IAT; while seated at a computer, the participant is tasked with associating pictures of faces, both young and old, with “good” words (for example, happy, joy, love, and so on) and “bad” words (for example, angry, nasty, failure, and so on). If a participant responds more quickly when associating a young face with good words and an elderly face with bad words, a bias is said to be shown. The participant’s reaction times are measured in milliseconds. Once the test is complete, the participant receives an implicit bias rating of “slight,” “moderate,” or “strong.” The IAT has been validated12 but is not without critics.13
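
To make the scoring idea concrete, the sketch below (in Python) turns reaction-time differences between the two pairing conditions into a rough bias rating. It is a minimal illustration under simplifying assumptions: the published IAT uses the more involved D-score algorithm, and the pooling and cutoff values shown here only loosely follow the conventional “slight,” “moderate,” and “strong” bands.

    # Simplified sketch of IAT-style scoring from reaction times (milliseconds).
    # The real IAT uses the D-score algorithm (Greenwald et al., 2003); the
    # pooling and cutoffs below are illustrative assumptions only.
    from statistics import mean, stdev

    def iat_score(compatible_rts, incompatible_rts):
        """Return (d, label) from two lists of reaction times in ms.

        compatible_rts   -- trials pairing, e.g., young faces with "good" words
        incompatible_rts -- trials pairing, e.g., elderly faces with "good" words
        A positive d means the "compatible" pairing was faster, suggesting an
        implicit preference in that direction.
        """
        pooled_sd = stdev(compatible_rts + incompatible_rts)
        d = (mean(incompatible_rts) - mean(compatible_rts)) / pooled_sd

        magnitude = abs(d)
        if magnitude < 0.15:
            label = "little or no bias"
        elif magnitude < 0.35:
            label = "slight"
        elif magnitude < 0.65:
            label = "moderate"
        else:
            label = "strong"
        return d, label

    # Hypothetical participant roughly 80 ms faster on the "compatible" block.
    young_good = [620, 650, 640, 700, 610, 660]
    elderly_good = [720, 740, 690, 780, 700, 760]
    print(iat_score(young_good, elderly_good))

A real administration also randomizes block order and penalizes error trials; the point here is only that the rating is derived from small but systematic differences in response latency.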

After seven years of research, the general findings from Project Implicit are summarized as follows:

Implicit biases are pervasive. They appear as statistically “large” effects that are often shown by majorities of samples of Americans. More than 80 percent of web respondents show implicit negativity toward the elderly compared to the young; 75 percent to 80 percent of self-identified whites and Asians show an implicit preference for white relative to black.

People are often unaware of their implicit biases. Ordinary people, including the researchers who direct this project, are found to harbor negative associations in relation to various social groups (that is, implicit biases) even while honestly reporting that they regard themselves as lacking these biases.

Implicit biases predict behavior. From simple acts of friendliness and inclusion to more consequential acts such as the evaluation of work quality, those who are higher in implicit bias have been shown to display greater discrimination.

People differ in levels of implicit bias. Implicit biases vary from person to person—for example, as a function of a person’s group memberships, the dominance of a person’s membership group in society, consciously held attitudes, and the level of bias existing in the immediate environment. This last observation makes clear that implicit attitudes are modified by experience.14

The research and general findings suggest that implicit biases are held by all15 and, interestingly, a person’s own race does not determine the results. For example, of the 50,000 African Americans who have taken the Race IAT, about half had stronger associations with whites than with blacks. To some, this is not surprising; it has been argued that “we live in North America, where we are surrounded every day by cultural messages linking white with good.”16 According to Mahzarin Banaji, a psychology professor at Harvard and a leader of IAT research, “You don’t choose to make positive associations with the dominant group, but you are required to. All around you, that group is being paired with good things. You open the newspaper and you turn on the television, and you can’t escape it.”17

In other words, the belief is that media bombardment in which a certain race is consistently linked with crime, deviance, and so on may form the basis for implicit biases. This conclusion runs contrary to the existing assumption that discrimination and bias are intrinsic characteristics held only by ignorant, pernicious individuals; the research on implicit bias indicates that discrimination and bias are rooted more in these pervasive social and cultural influences.

This does not mean that those who show a preference for “white” on the IAT are racists. In fact, the results of the test may be entirely incompatible with an individual’s conscious, stated beliefs,18 and individuals can use those stated values to direct their behavior. Research has shown, however, that the IAT is a powerful predictor of how one reacts in certain spontaneous situations, likely without the individual’s awareness.19 Referring back to the earlier example of a person recognized as graduating from “our beloved alma mater,” one may unconsciously lean closer to that person, smile more, maintain longer eye contact, and engage that person in more animated conversation.

While there are many implications of implicit racial bias in law enforcement, the remainder of this article focuses on the theory of race-crime associations. The most salient research on this topic involves studies on what is termed “shooter bias.”


“Shooter Bias” Studies

The media is awash with stories about disparate treatment of minorities by police: excessive force when restraining minority suspects, the profiling of black motorists, and the now infamous “beer summit” among President Obama, a professor, and a police officer, to name a few. None, however, raises more uproar than a police shooting, particularly one involving a white police officer and a black suspect. Are these shootings always influenced by bias? Implicit bias researchers have examined this question, and the results are mixed; sometimes police officers show the same bias as nonpolice participants, and sometimes officers show less. Both of the studies discussed below, however, show promising results that bear examination by law enforcement personnel.

In 2005, researchers E. Ashby Plant and B. Michelle Peruche conducted a study utilizing 50 certified police patrol officers who participated in computer-simulated “shoot—don’t shoot” scenarios.20 Although the officers were predominantly white males, the cohort also included female, black, Native American, and Hispanic officers. During the test, pictures of faces with either a gun or a neutral object superimposed over each were shown in various positions on a screen. If the suspect and a gun were pictured, the officers were to shoot. If the suspect and some other object were pictured (for example, a wallet, a cellphone, and so on), the officers were to choose the “don’t shoot” option. The “suspects” pictured were both black and white college-age males.

The results of the study showed that some “officers were initially more likely to mistakenly shoot unarmed black suspects than unarmed white suspects.”21 These stereotype-consistent behaviors had also emerged among a community sample of both black and white participants in a prior study.22 On a more promising note, however, the researchers found that “after extensive exposure [for example, repeated trials] to the program, the officers were able to eliminate this bias.”23 Thus, although both citizens and officers showed an implicit bias toward the unarmed black suspects, the results were not inevitable, and it appeared that proper training may improve overall accuracy in decisions to shoot. The researchers cautioned, however, that there is currently no evidence to show that the elimination of bias during computer simulation will necessarily transfer to decisions made by officers in the field.
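
To make the design concrete, the sketch below shows one way outcomes from such “shoot/don’t shoot” trials could be tallied into the error rates the researchers compared, namely how often unarmed targets of each race are mistakenly shot. The trial record format, field names, and example numbers are hypothetical illustrations, not the study’s actual data or code.

    # Hypothetical tally of errors in a "shoot / don't shoot" simulation.
    # Field names and numbers are invented for illustration only.
    from collections import defaultdict

    def error_rates(trials):
        """trials: iterable of dicts with keys 'race', 'armed', 'shot'.
        Returns, per race, the rate of shooting unarmed targets (false alarms)
        and of failing to shoot armed targets (misses)."""
        counts = defaultdict(lambda: {"unarmed": 0, "false_shoot": 0,
                                      "armed": 0, "miss": 0})
        for t in trials:
            c = counts[t["race"]]
            if t["armed"]:
                c["armed"] += 1
                c["miss"] += 0 if t["shot"] else 1
            else:
                c["unarmed"] += 1
                c["false_shoot"] += 1 if t["shot"] else 0
        return {race: {"false_shoot_rate": c["false_shoot"] / max(c["unarmed"], 1),
                       "miss_rate": c["miss"] / max(c["armed"], 1)}
                for race, c in counts.items()}

    # Invented example: unarmed black targets shot on 3 of 10 trials versus
    # 1 of 10 for unarmed white targets would show the reported pattern.
    example = (
        [{"race": "black", "armed": False, "shot": i < 3} for i in range(10)] +
        [{"race": "white", "armed": False, "shot": i < 1} for i in range(10)]
    )
    print(error_rates(example))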

In 2007, Correll et al. conducted a similar study on shooter bias. This study, however, utilized more officers (237) with a greater diversity of backgrounds (for example, patrol, investigators, SWAT, traffic, and so on) than the previous study. Further, Correll et al. utilized 127 civilians for comparison, as well as a sampling of college students. Each group contained a mix of males, females, whites, blacks, Latinos, and other minorities. Like the 2005 study, the officer group was predominantly white; however, the civilian sample contained many more ethnic minority members.24

The results of the Correll et al. study revealed that police officers were far less influenced by racial bias than the civilians.

Like community members, police were slower to make correct decisions when faced with an unarmed black man or an armed white man. It is important to note, however, that the officers differed dramatically from civilians in terms of the decisions they ultimately made. Community members showed a clear tendency to favor the shoot response for black targets. . . . Police, however, showed no bias in their criteria. Moreover, they showed greater discriminability and a less trigger-happy orientation in general (i.e., for both black and white targets).

When the target was white, all of the samples [police and civilian]. . . set a relatively high criterion. . . . But when the target was black, the community set a significantly lower (more trigger-happy) criterion than officers.25

To validate the study, the trials were run twice, and the researchers increased the speed at which the images appeared on the screen. The results were the same. “Compared to the public at large. . . police officers had a ‘less trigger-happy orientation.’ ” The lead researcher stated, “We don’t mean to conclude that this is conclusive evidence that there is no racial bias in police officers’ decision to shoot. . . . But we’ve run these tests with thousands of people now, and we’ve never seen this ability to restrain behavior in any group other than police officers.”26
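
The quoted passages rely on two signal-detection ideas: “discriminability” (how well armed and unarmed targets are told apart) and the “criterion” (how readily a “shoot” response is made, with a lower criterion meaning a more trigger-happy orientation). The sketch below shows the conventional way these indices, d-prime and c, are computed from hit and false-alarm rates; the rates in the example are invented for illustration and are not figures from the study.

    # Conventional signal-detection indices from hit and false-alarm rates.
    # d' (discriminability): how well armed vs. unarmed targets are distinguished.
    # c (criterion): overall readiness to respond "shoot"; lower = more trigger-happy.
    # The rates in the example below are invented, not data from the study.
    from statistics import NormalDist

    def sdt_indices(hit_rate, false_alarm_rate):
        z = NormalDist().inv_cdf      # inverse of the standard normal CDF
        d_prime = z(hit_rate) - z(false_alarm_rate)
        criterion = -0.5 * (z(hit_rate) + z(false_alarm_rate))
        return d_prime, criterion

    # Same hit rate, but a higher false-alarm rate for one target group
    # yields a lower (more trigger-happy) criterion for that group.
    print(sdt_indices(hit_rate=0.95, false_alarm_rate=0.10))
    print(sdt_indices(hit_rate=0.95, false_alarm_rate=0.25))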

The results from the two studies, Plant and Peruche in 2005 and Correll et al. in 2007, were inconsistent. The reasons for this have been argued as follows: (1) the 2005 study sampled 50 police officers from Florida, whereas the 2007 study sampled 237 officers from Colorado and 14 other states; and (2) the 2005 study used stimuli consisting of faces on which objects (for example, guns, wallets, and so on) had been superimposed, whereas the 2007 study utilized full-body images of men holding similar objects. It has been argued that the 2007 study more closely mirrored police training and on-the-job experiences.27

Taken together, these studies indicate that police officers showed no more bias than their civilian counterparts. In one study, officers were comparable to civilians; in the other, the officers showed better discretion and judgment in the decision to shoot.


Implicit Bias and Law Enforcement

The study of implicit bias has important implications for police leaders. Police officers are human and, as the theory contends, may be affected by implicit biases just as any other individual. In other words, well-intentioned officers who err may do so not as a result of intentional discrimination, but because they harbor the widespread human biases the research describes. Social psychologists do not contend that implicit bias should be a scapegoat for unethical police behavior; however, an understanding that biased behavior can be manifested even by well-intentioned officers can reduce police defensiveness around this issue and motivate change.28

Lorie A. Fridell, PhD, an associate professor of Criminal Justice at the University of South Florida and the former Director of Research at the Police Executive Research Forum (PERF), writes on this topic and provides command-level training around the country. Her recommendations for agencies form what she calls a comprehensive agency program to produce fair and impartial policing. Dr. Fridell argues that all agencies need to proactively promote fair and impartial policing because they hire humans to do police work and the research shows that humans have implicit biases. Her recommendations help agencies to address the ill-intentioned officers who engage in biased policing and the overwhelming majority of well-intentioned officers who aspire to police fairly and impartially, but who are human.

Many of the following recommendations come from Dr. Fridell’s work.


Recruitment and Hiring

First, police agencies should hire a diverse workforce and individuals who can police in an unbiased fashion. Ideally, the composition of personnel should reflect the diversity of the community that is served. Doing so conveys a sense of equality to the public and allows the officers within the agency to better understand and communicate with the minority community.29 It also “increases the likelihood that officers will come to better understand and respect various racial and cultural perspectives through their daily interactions with one another.”30 The intergroup contact theory provides another reason for a diverse workforce. According to this theory, positive interactions with people in other groups outside of one’s race or ethnicity will reduce implicit biases against those groups.31 When officers work with a diverse group of peers within an agency, their implicit biases are weakened through repeated positive interactions.


Community Policing

Community policing can promote fair and impartial policing. Community policing that facilitates positive interactions between police and community members harnesses the power of the intergroup contact theory. “Knowing many citizens by face and name improves officers’ abilities to differentiate between suspicious and nonsuspicious people on a basis other than race; getting to know the community’s law-abiding citizens helps police overcome stereotypes based on characteristics such as race.”32 Also, in building relationships, community members develop a sense of trust in their local officers; this may reduce the biases that citizens may hold against the police.


Training

Training can play a critical role in reducing the impact of implicit bias on behavior. Research has found that individuals who are made aware of their implicit biases are motivated and able to implement “controlled” (that is, unbiased) behaviors. Although the recommendations made here do not by themselves reduce biases—rather, they raise consciousness about them—research has suggested that making one aware of unconscious biases, which are malleable, may reduce them.33 A type of “cognitive correction”34 is said to take place.

At the basic level, law enforcement recruits should be challenged to identify key police decisions and scenarios that are at greatest risk of manifesting bias, such as traffic stops, consent searches, reasonable suspicion to frisk, and other procedures—and then reflect on the potential impact of implicit bias on their perceptions and behaviors in those scenarios.35 Race, gender, age, disability, and sexual orientation all have the potential to impact and influence decisions. Further, seasoned officers should be similarly challenged at in-service and other training venues. Supervisors should be challenged to consider how implicit biases may manifest not only in themselves but also in their subordinates.36 Officers at all levels should be versed not only in diversity training but also in training on cultural competency, Fourth Amendment restrictions, and professional motor vehicle stops.

Working with police practitioners and researchers who have conducted key work in this area, and with funding from the U.S. Department of Justice Office of Community Oriented Policing Services (COPS), Dr. Fridell is producing a model curriculum that will help police recruits to

  • understand that even well-intentioned people have biases;
  • understand how implicit biases affect what we perceive and see and, unless prevented, affect what we do;
  • understand that fair and impartial policing leads to effective policing; and
  • use tools that help officers to recognize their conscious and implicit biases and implement controlled (unbiased) behavioral responses.37

Similarly, with the same expert team and COPS office funding, Dr. Fridell has developed a science-based curriculum for first-line supervisors. These training programs supplement more traditional curriculums designed to address racially biased policing.


Policy

Dr. Fridell advocates that an agency have a clear policy that informs officers when they can and cannot use race, ethnicity, and perhaps other factors as well during situations that require law enforcement choices such as decisions to stop, requests for consent to search, arrests, and so on. Focus groups held by PERF revealed that “all levels—line officers, command staff, and executives—have very different perceptions regarding the circumstances in which officers can use race/ethnicity.”38 These differing perceptions exist not only across agencies but also within the same police department.

Policies that incorporate the words “sole” or “solely” predominate nationwide but, argues Dr. Fridell, do not provide meaningful guidance.39 Connecticut’s statute on racial profiling reflects the “solely model”; it reads

The race or ethnicity of an individual shall not be the sole factor in determining the existence of probable cause to place in custody or arrest an individual or in constituting a reasonable and articulable suspicion that an offense has been or is being committed so as to justify the detention of an individual or the investigatory stop of a motor vehicle. (Alvin W. Penn Racial Profiling Prohibition Act, C.G.S. 54-1l)

Dr. Fridell argues that such policies “define the problem out of existence.” These single-factor policies do not prohibit an officer from engaging in biased behavior that is motivated by two factors, such as race and gender, because such behaviors are not based “solely” on race. Thus, an officer who pulls over all black males for speeding because they are black males and leaves whites alone would not be violating a “solely” policy. The decisions of that officer are not based “solely” on race; they are based on race and another factor (in this case, gender).40

Two recommended model policies include the suspect-specific policy and the PERF Report policy.

The suspect-specific model reads as follows:

Officers may not consider the race or ethnicity of a person in the course of any law enforcement action unless the officer is seeking to detain, apprehend, or otherwise be on the lookout for a specific suspect sought in connection with a specific crime who has been identified or described in part by race or ethnicity.41

In accordance with this policy, officers must be able to show the link between the set of identifiers (for example, race or ethnicity) and the particular suspect sought, detained, or arrested. For example, if a witness described a robbery suspect by clothing, physical descriptors, and race, then race, along with the other identifiers, can be used as a means to interdict, detain, or arrest a suspect.

The PERF policy reads as follows:

Officers shall not consider race/ethnicity to establish reasonable suspicion or probable cause except that officers may take into account the reported race/ethnicity of a potential suspect(s) based on trustworthy, locally relevant information that links a person or persons of a specific race/ethnicity to a particular unlawful incident(s).42

Both policies mandate that certain criteria be met in order to interdict or detain based on race or ethnicity.


Supervision

Police supervisors are an agency’s first line of defense against all manner of problems, issues, and liabilities. When dealing with biased policing—including that which is produced by implicit bias—the same is true. Dr. Fridell states that supervisors “must be alert to any pattern or practice of possible discriminatory treatment by individual officers or squads (through observation, information from fellow officers, or close review of complaints) and be willing and able to take appropriate action in response to inappropriate behavior.”43 Accordingly, supervisors should receive specific training on implicit bias and how it can affect not only themselves but also their officers and the entire police organization. Further, the department’s policy on biased policing must be well understood, communicated to all personnel, and strictly enforced.

If an officer displays a tendency toward discriminatory or biased behavior, the issue must be quickly addressed by a supervisor. According to Dr. Fridell, when such instances are based on implicit bias (that is, they are not intentional acts of discrimination or bias), discipline per se would not be appropriate and the evidence will likely be ambiguous. “Since, in many instances, there will only be ‘indications’ and not ‘proof,’ it will be important [for the police agency] to determine when and how supervisors can intervene to stop/prevent what appears to be inappropriate conduct while keeping in mind the ambiguous nature of the evidence as well as the sensitive nature of the issue.”44


Conclusion

Discussions on bias in policing are difficult, to say the least, as there is always the danger of polarizing groups. Implicit bias research, although still in its infancy, has grown in popularity and may soon drive such discussions. The criminal justice field has begun to feel the effects of implicit bias research: The judicial systems in several states have considered the literature and are taking steps to implement the findings into their daily operations; similarly, the Department of Justice has recruit and first-line supervisor training curricula under development in response to research findings on implicit bias. While individual police leaders and personnel may have their own thoughts or beliefs on the topic, its pervasiveness requires, at the very least, a familiarity with the subject matter and its potential implications for the law enforcement field. ■

The author would like to thank Dr. Lorie A. Fridell, associate professor of criminal justice at the University of South Florida, for her help and guidance with this article.


Notes:

1Mark W. Bennett, “Unraveling the Gordian Knot of Implicit Bias in Jury Selection: The Problems of Judge-Dominated Voir Dire, the Failed Promise of Batson, and Proposed Solutions,” Harvard Law and Policy Review 4, no. 1 (Winter 2010): 149–171.
2Americans for American Values, “What Is Implicit Bias?,” 2009, http://americansforamericanvalues.org/unconsciousbias (accessed August 16, 2011).
3Ibid.
4Jerry Kang, Implicit Bias: A Primer for Courts (August 2009), 1, http://wp.jerrykang.net.s110363.gridserver.com/wp-content/uploads/2010/10/kang-Implicit-Bias-Primer-for-courts-09.pdf (accessed August 16, 2011).
5Ibid.
6Anthony G. Greenwald and Linda Hamilton Krieger, “Implicit Bias: Scientific Foundations,” California Law Review 94, no. 4 (July 2006): 945–967.
7Bennett, “Unraveling the Gordian Knot of Implicit Bias in Jury Selection.”
8“IAT Home,” Project Implicit, https://implicit.harvard.edu/implicit/demo (accessed August 16, 2011).
9Jerry Kang and Kristin Lane, “Seeing through Colorblindness: Implicit Bias and the Law,” UCLA Law Review 58 (2010): 471–472.
10“What Is Project Implicit,” Project Implicit, http://www.projectimplicit.net/about.php (accessed August 16, 2011).
11Kang, Implicit Bias: A Primer for Courts.
12Anthony G. Greenwald et al., “Understanding and Using the Implicit Association Test: III. Meta-Analysis of Predictive Validity,” Journal of Personality and Social Psychology 97, no. 1 (2009): 17–41, http://faculty.washington.edu/agg/pdf/GPU&B.meta-analysis.JPSP.2009.pdf (accessed August 31, 2011). See also, Kang and Lane, “Seeing through Colorblindness: Implicit Bias and the Law,” 477–481, 488–489.
13John Tierney, “In Bias Test, Shades of Gray,” New York Times, November 18, 2008; and Hal R. Arkes and Philip E. Tetlock, “Attributions of Implicit Prejudice, or ‘Would Jesse Jackson “Fail” the Implicit Association Test?,’ ” Psychological Inquiry 15, no. 4 (2004): 257–278, http://faculty.washington.edu/agg/IATmaterials/PDFs/AT.psychinquiry.2004.pdf (accessed August 16, 2011).
14“General Information,” Project Implicit, http://projectimplicit.net/generalinfo.php (accessed August 16, 2011); and Bennett, “Unraveling the Gordian Knot of Implicit Bias in Jury Selection.”
15Bennett, “Unraveling the Gordian Knot of Implicit Bias in Jury Selection.”
16Malcolm Gladwell, Blink: The Power of Thinking without Thinking (New York: Hachette Book Group, 2005), 85.
17Ibid.
18Gladwell, Blink, 84–85.
19Ibid., 85.
20E. Ashby Plant and B. Michelle Peruche, “The Consequences of Race for Police Officers’ Responses to Criminal Suspects,” Psychological Science 16, no. 3 (2005): 180–183.
21Ibid.
22Joshua Correll et al., “The Police Officer’s Dilemma: Using Ethnicity to Disambiguate Potentially Threatening Individuals,” Journal of Personality and Social Psychology 83 (2002): 1324–1325.
23Plant and Peruche, “The Consequences of Race for Police Officers’ Responses to Criminal Suspects.”
24Joshua Correll et al., “Across the Thin Blue Line: Police Officers and Racial Bias in the Decision to Shoot,” Journal of Personality and Social Psychology 92, no. 6 (2007): 1006–1023.
25Ibid.
26Benedict Carey, “Study Finds Police Training Plays Key Role in Shootings,” New York Times, June 2, 2007, http://www.nytimes.com/2007/06/02/us/02police.html (accessed August 16, 2011).
27Correll et al., “Across the Thin Blue Line: Police Officers and Racial Bias in the Decision to Shoot.”
28Lorie A. Fridell, “Racially Biased Policing: The Law Enforcement Response to Implicit Black-Crime Association,” in Racial Divide: Racial and Ethnic Bias in the Criminal Justice System, ed. Michael J. Lynch, Britt Patterson, and Kristina K. Childs (Monsey, N.Y.: Criminal Justice Press, 2008), 39–59.
29Ibid.
30Lorie Fridell and M. Scott, “Law Enforcement Agency Responses to Racially Biased Policing and the Perceptions of Its Practice,” in Critical Issues in Policing, 5th ed., ed. R.G. Dunham and G.P. Alpert (Prospect Heights, Ill.: Waveland Press, 2005), 304–321.
31Christopher L. Aberson, Carl Shoemaker, and Christina Tomolillo, “Implicit Bias and Contact: The Role of Interethnic Friendships,” Journal of Social Psychology 144, no. 3 (2004): 335–347; and Kang, Implicit Bias: A Primer for Courts.
32Fridell, “Racially Biased Policing: The Law Enforcement Response to Implicit Black-Crime Association.”
33Kristin A. Lane, Jerry Kang, and Mahzarin R. Banaji, “Implicit Social Cognition and Law,” Annual Review of Law and Social Science 3 (December 2007); and Irene V. Blair, “The Malleability of Automatic Stereotypes and Prejudice,” Personality and Social Psychology Review 6, no. 3 (2002): 242–261.
34Bennett, “Unraveling the Gordian Knot of Implicit Bias in Jury Selection.”
35Fridell, “Racially Biased Policing: The Law Enforcement Response to Implicit Black-Crime Association.”
36Ibid.
37Lorie Fridell and Anna T. Laszlo, “Reducing Biased Policing through Training,” Community Policing Dispatch 2, no. 2 (2009), http://www.cops.usdoj.gov/html/dispatch/February_2009/biased_policing.htm (accessed August 18, 2011).
38Fridell, “Racially Biased Policing: The Law Enforcement Response to Implicit Black-Crime Association.”
39Ibid.
40Ibid.
41Ibid.
42Lorie A. Fridell et al., Racially Biased Policing: A Principled Response (Washington, D.C.: Police Executive Research Forum, 2001), http://www.cops.usdoj.gov/files/ric/Publications/RaciallyBiasedPolicing.pdf (accessed August 16, 2011).
43Fridell and Scott, “Law Enforcement Agency Responses to Racially Biased Policing and the Perceptions of Its Practice.”
44Fridell, “Racially Biased Policing: The Law Enforcement Response to Implicit Black-Crime Association.”

Please cite as:

Tracey G. Gove, "Implicit Bias and Law Enforcement," The Police Chief 78, no. 10 (October 2011): 44–56.


From The Police Chief, vol. LXXVIII, no. 10, October 2011. Copyright held by the International Association of Chiefs of Police, 515 North Washington Street, Alexandria, VA 22314 USA.







