Failures in Criminal Investigation

By D. Kim Rossmo, University Endowed Chair in Criminology, and Director, Center for Geospatial Intelligence and Investigation, Department of Criminal Justice, Texas State University, San Marcos, Texas, and Detective Inspector (retired), Vancouver Police Department, British Columbia, Canada


It seems most unlikely that, with all the checks and balances of the criminal justice system, someone today could be convicted of a crime he or she did not commit. The unfortunate reality, however, is that it does happen.

Developments in DNA analysis have resulted in the discovery of 240 such cases in the United States, some more egregious than others. How is this possible?

The Innocence Project,1 dedicated to exonerating wrongfully convicted people through DNA testing, has identified a number of contributing causes, including eyewitness misidentification, improper forensic science, false confessions, police/prosecutorial misconduct, lying informants, and bad lawyering (see figure 1).

However, there is an important causal factor missing from the list developed by the Innocence Project: faulty investigative thinking. During the investigation of a case, even the best and most ethical detectives can fall prey to subtle cognitive and organizational traps. Organizational awareness of these hazards, combined with appropriate investigative procedures and supervision, can reduce that risk.

While the imprisonment of an innocent person is a travesty of justice, a criminal investigative failure more commonly results in the offender escaping justice (a wrongful conviction also allows the real offender to go free). The damage resulting from criminal investigative failures, whether to victims, innocent people, or the public, is significant. Consider homicide alone—there have been approximately 16,000 homicides annually in the United States between 2000 and 2009, with a clearance rate of 63 percent.2 In other words, every day, an average of 16 murders occur that may never be solved and whose perpetrators may never be arrested.

The key failings in investigative thinking can be grouped into three areas: cognitive biases, organizational traps, and probability errors. Like cascading failures in airplane crashes, an investigative failure often has more than one contributing factor.


Cognitive Biases

Perception and Memory Limitations. A witness to a crime must observe, interpret, remember, recall, and then communicate information to a police investigator who, in turn, must understand and record it. Each stage has the potential for error. People are influenced by experiences and expectations, and different people view the world through different lenses. What witnesses think they see is a function of what they expected to see, what they wanted to see, and what they actually saw; the more ambiguous the last, the greater the influence of the first two factors.

Similarly, what people remember depends upon what they believe. The human brain does not objectively record data, and memories are subjective interpretations that are seldom reinterpreted, even when information changes. People tend to remember those facts consistent with their theories, and forget those that are not. More weight is placed on evidence that supports a hypothesis than on evidence that contradicts it, a phenomenon called belief perseverance.

Belief perseverance is illustrated in the case of David Milgaard, convicted as a teenager in the 1969 rape and murder of nursing assistant Gail Miller in Saskatoon, Saskatchewan, Canada. After 23 years in prison, he was exonerated when DNA testing determined the semen stains on the victim’s clothing came from Larry Fisher, a serial rapist who caught the bus at the same stop as Miller did every morning. However, even after Fisher’s conviction, there are still people who cannot accept Milgaard’s innocence, “explaining” the DNA evidence by suggesting Milgaard had first murdered Miller, and then Fisher happened to come along, find her body, and have sex with it.

At some point, belief perseverance conflicts with Occam’s razor, or the principle of parsimony. When more than one explanation for an event is possible, the best choice is the simplest (the one with the fewest assumptions). Do not make things more complicated than necessary.

Intuition. Humans use two types of decision-making processes: the intuitive and the rational. Intuition falls between the automatic operations of perception and the deliberate operations of reasoning. This is what is meant by the misnomer “gut instinct.” Intuition, however, is often misunderstood. It is not a paranormal ability or a form of extrasensory perception; while it operates at a below-consciousness level, intuition is still based on normal sensory input.

Intuition is automatic and effortless, fast and powerful. It is learned slowly. Because of its nature, intuition is difficult to control or modify. It can be influenced by emotion and is often prone to error. Typically, intuition involves the use of heuristics (cognitive shortcuts).

In contrast, reasoning is slow and effortful, vulnerable to interference and easily disrupted. However, it is flexible and controllable. Reasoning can overrule intuition.

Different situations require different types of judgment. When data are unreliable or incomplete, or when quick decisions must be made under chaotic and uncertain conditions, intuitive decision making is preferable. Such situations can occur on the street or on a battlefield.

However, when we have reliable and adequate data, and time for proper analysis, reasoning produces the most accurate results. Complex and rule-bound tasks, such as major crime investigations or courtroom prosecutions, require careful analysis and sound logic.

Heuristics and Biases. Clear and rational thinking is not easy. People sometimes exhibit limited rationality in the face of life’s complexities because human brains are not wired to deal effectively with uncertainty. People therefore employ heuristics—intuitive rules of thumb—to make judgments under such conditions. A heuristic does not have to be correct most of the time, as long as it promotes survival. While a street police officer’s intuition may sometimes be wrong, it is still an unwise thing to ignore.

While these mental shortcuts work well most of the time, under certain conditions they can lead to cognitive biases. Cognitive biases are mental errors caused by this simplified information-processing technique. They can result in distorted judgments and faulty analyses. Like optical illusions, cognitive biases are consistent and predictable.

Psychologists have identified many heuristics and biases, some of which are particularly problematic for criminal investigators. The anchoring heuristic results from the strong influence of the starting point on the final estimate. The available information determines first approximations, so if we have limited or incorrect information, our starting point will be wrong. There have been many murder cases in which detectives were led astray because the crime appeared to be something other than what it was.

Tunnel vision—one of the leading causes of wrongful convictions—results from a narrow focus on a limited range of possibilities. Consequently, alternative theories of the crime are not considered, and other potential suspects are prematurely eliminated from the investigation. This narrow focus is particularly ill-suited to complex, dynamic investigations. Focusing on the first likely suspect and then closing the investigation off to alternative theories is a recipe for disaster.

One example of tunnel vision is the case of Rachel Nickell, an attractive 23-year-old woman murdered on London’s Wimbledon Common in July 1992. Her throat was cut, and she was stabbed 49 times. The only witness to the attack was her two-year-old son. In September, New Scotland Yard detectives received a tip regarding an odd man named Colin Stagg. For the next year, he was their investigative focus. In August 1993, after a covert operation involving an undercover policewoman who “befriended” Stagg to obtain (unsuccessfully) further incriminating information, he was finally arrested.

The case went to trial in September 1994. The judge quickly threw out most of the prosecution’s evidence. Calling the covert operation misconceived, he stated, “I am afraid this behavior betrays not merely an excessive zeal, but a substantial attempt to incriminate a suspect by positive and deceptive conduct of the grossest kind.”3 The prosecution withdrew the charges, and Stagg was released.

Several years later, enhanced DNA from Nickell’s clothing was linked to Robert Napper, a psychopath now detained indefinitely in Broadmoor, a secure hospital, for murder and rape.

Confirmation bias is a form of selective thinking in which an individual is more likely to notice or search for evidence that confirms his or her theory, while ignoring or refusing to search for contradicting evidence. Confirming evidence is given more weight, while contradicting evidence is given less. The components of confirmation bias include failure to seek evidence that would disprove the theory, failure to use such evidence when it is found, refusal to consider alternative hypotheses, and failure to evaluate the diagnosticity of evidence (whether it actually distinguishes between competing hypotheses).


Organizational Traps

A criminal investigator operates within an organizational structure with its own unique dynamics. The powerful police subculture has long been recognized as possessing both positive and negative characteristics, some of which can influence the dynamics of a major crime investigation.

Inertia and Momentum. Law enforcement agencies are conservative, often suffering from bureaucratic inertia—a lethargy or unwillingness to evolve or act. Change is disruptive, and requires effort, time, and money. This inertia can slow a police agency’s response to a new crime problem. Moreover, when a response finally does occur, it is often insufficient. Organizational inertia was a problem in the Green River Killer investigation in Washington State;4 by the time a functional task force was formed, the killer had stopped.

Organizational momentum—the inability to change direction in the midst of a major investigation—is the opposite problem. It is difficult for investigators to redirect their focus from an established theory of a crime, or a particular suspect, especially if the organization has to then publicly admit it was wrong.

Assumptions and Red Herrings. Investigators must outline their assumptions. If a particular assumption later turns out to be invalid, then everything that flowed from it must be rethought. As the human mind does not automatically reevaluate information, specific organizational procedures are necessary to address this situation. Documenting assumptions facilitates this process and protects investigations from “creeping credibility”—what happens when an idea or theory gains credence not from supporting evidence, but from the passage of time. A possibility hardens into a probability, and then crystallizes into fact.

Large investigations can also suffer from rumors. Detectives should always understand their knowledge base and be able to answer the question: “How do we know what we think we know?”

President John Kennedy’s assassination has provided fertile ground for conspiracy theorists, many of whom fall victim to the traps discussed here. For example, they have questioned how the “magic bullet” could have exited Kennedy’s throat and entered the right shoulder of Texas Governor John Connally Jr., seated in front of the president in the motorcade limousine.

However, conspiracy theorists have assumed the limousine seating was arranged in the same manner as a normal vehicle, with Governor Connally positioned directly in front of President Kennedy. In reality, the limousine had been specially altered and had three rows of seats, with two Secret Service agents in the front, the Connallys in the middle, and the Kennedys in the rear. President Kennedy’s seat was positioned significantly higher and outboard of Governor Connally’s. Furthermore, Governor Connally was turned to the right as he waved to the crowd. Consequently, the two men’s torso alignment is consistent with the “single bullet theory.”

Red herrings—tips that misdirect an investigation—can be particularly dangerous in high-profile major crime cases. Constant media attention brings forth a flood of public information, some of it relevant, most of it not. Witness misinformation has sent several high-profile investigations down the wrong path. Suspect vehicle sightings appear to be particularly problematic—for instance, the white box truck/van reported so often during the Washington, D.C., Sniper shootings5 (the criminals actually drove a blue Chevrolet Caprice sedan). Once followed, red herrings can be very hard to shake. They have been responsible for great waste: millions of dollars of police resources and time, higher victim counts, and unsolved crimes.

Ego and Fatigue. Ego, both personal and organizational, can prevent investigators from adjusting to new information or seeking alternative avenues of exploration. Sometimes the most prestigious law enforcement agencies are the most reluctant to admit mistakes. But truth is more important than reputation. An investigator must have the flexibility to admit his or her original theory was incorrect, and avoid falling into the ego trap. Stubbornness, which often accompanies ego, is just as problematic.

Fatigue, overwork, and stress—endemic in high-profile crime investigations—can create problems for investigative personnel. Tiredness dulls even the sharpest minds. Critical assessment abilities drop in overworked and fatigued individuals, who start to engage in uncritical “automatic believing.”

Groupthink. Groupthink is the reluctance to think critically and challenge the dominant theory. (No one wants to tell the emperor he has no clothes.) It occurs in highly cohesive groups under pressure to make important decisions. The main symptoms of groupthink include the following:

  • Power overestimation—belief in the group’s moral purpose, resulting in risk taking and blindness to the ethical consequences of decisions
  • Close-mindedness—group rationalizations and discrediting of warnings
  • Uniformity pressures—conformity demands and self-censorship

Groupthink has several negative outcomes. Group members selectively gather information and fail to seek expert opinions. They also neglect to critically assess their ideas and examine few alternatives. Groupthink can be a disaster in a major crime investigation.


Probability Errors

The justice system—required to make decisions based on often uncertain or incomplete information—revolves around probabilities. Important legal standards such as “probable grounds,” “balance of probabilities,” and “guilt beyond a reasonable doubt” all rest on notions of probability. Unfortunately, the human mind is not good at understanding probability, and, as a result, humans often make irrational decisions. Probability errors in criminal investigations can misdirect investigators, prosecutors, and juries.

Uncertainty in Language. Discussing chance, likelihood, or risk involves talking about probability. However, the words used to describe mathematical probabilities—“unlikely,” “frequent,” “risky”—are not well matched to the underlying numbers. They also mean different things to different people at different times. Furthermore, the real probability of an event is often unknown. Ambiguity in statements of probability can become a problem during an investigation or criminal trial, as happened during the rape and murder trial of David Milgaard, discussed earlier. Sperm recovered at the crime scene contained type A antigens. When Milgaard’s saliva was tested, however, the results showed he was a non-secretor. About 80 percent of the North American population are secretors—individuals whose saliva, semen, and other bodily fluids contain ABO antigens from which their blood type can be determined; non-secretors do not possess this genetic trait.

To circumvent this problem, the prosecution advanced the possibility of blood contamination, and the pathologist supported this theory in his testimony. When asked under what conditions human blood could get into seminal fluid or spermatozoa in males, he answered: “One would be local injury to the male genitals. A second and quite common occurrence would be any inflammation, either internal or external, of the male genitals.”6

But when the pathologist said “quite common,” was he speaking from the perspective of a doctor treating such an affliction, from a lifetime risk perspective, or from the perspective of someone suffering from the problem on a given day? Only the last perspective was relevant to Milgaard’s guilt. It turns out that such a medical condition is actually quite rare.

Coincidence. Skeptical detectives often say they do not believe in coincidences. However, when looking for patterns within a large number of items (events, suspects), coincidences are inevitable. A task force that has examined hundreds of suspects will find some of them, by sheer chance, appear circumstantially guilty. Efforts to solve a crime by “working backwards” (from the suspect to the crime, rather than from the crime to the suspect) are susceptible to errors of coincidence. These types of errors are often seen in the proffered “solutions” to such famous cases as Jack the Ripper.
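
How quickly chance matches accumulate is easy to underestimate. The short calculation below is a minimal sketch with purely illustrative numbers (a hypothetical pool of 1,000 suspects and assumed match rates for three circumstantial criteria); it is not drawn from any actual case.

```python
# Illustrative sketch of coincidence in a large suspect pool.
# Every number here is an assumption chosen for demonstration purposes.

n_suspects = 1000        # suspects examined by a hypothetical task force

# Assumed probabilities that an innocent person happens to fit each
# circumstantial criterion (e.g., right age and sex, similar vehicle, no alibi).
criteria = [0.5, 0.1, 0.05]

p_fits_all = 1.0
for p in criteria:
    p_fits_all *= p      # chance one innocent suspect fits every criterion

expected_false_leads = n_suspects * p_fits_all
p_at_least_one = 1 - (1 - p_fits_all) ** n_suspects

print(f"P(an innocent suspect fits all criteria):  {p_fits_all:.4f}")
print(f"Expected coincidental matches in the pool: {expected_false_leads:.1f}")
print(f"P(at least one coincidental match):        {p_at_least_one:.1%}")
```

Under these assumptions, two or three innocent people in the pool would be expected to look circumstantially guilty, and the chance of at least one such coincidence exceeds 90 percent.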

Computation Mistakes. In November 1999, British solicitor Sally Clark was convicted of smothering her two infant sons, who died, a year apart, of sudden infant death syndrome (SIDS, or crib death). At her murder trial, pediatrician Sir Roy Meadow testified that the probability of two crib deaths occurring in a single family of affluent means was “vanishingly small,” approximately 1 in 73,000,000. He calculated this number by squaring 1/8,543, the probability of a single SIDS death in such a family in England.7

There were several problems with Dr. Meadow’s analysis. First, he incorrectly assumed that crib deaths within a single family are independent events, ignoring the possibility of a genetic effect. Second, he committed an ecological fallacy by treating individual-level risk as equivalent to the average risk across the overall population. Third, because crib deaths are relatively common but nonrandom events, a recurrence happens somewhere in England about once every 18 months. The Royal Statistical Society issued a press release stating there was no statistical basis for Dr. Meadow’s estimate.8 In 2003, Clark’s conviction was quashed on appeal.
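
The scale of the error can be illustrated with a rough calculation. The 1-in-8,543 figure is the one cited at trial; the within-family dependence multiplier below is purely an illustrative assumption.

```python
# Sketch of the Sally Clark probability calculation.
# 1/8,543 is the single-death figure cited at trial; the dependence
# multiplier is an illustrative assumption, not an estimate from the case.

p_sids = 1 / 8543                  # cited probability of one SIDS death

# Meadow's figure: squaring implicitly treats the two deaths as independent.
p_two_independent = p_sids ** 2    # roughly 1 in 73 million

# If a shared genetic or environmental factor made a second death, say,
# ten times more likely after a first one, the joint probability changes sharply.
relative_risk = 10                 # hypothetical dependence multiplier
p_two_dependent = p_sids * (relative_risk * p_sids)

print(f"Assuming independence:       1 in {1 / p_two_independent:,.0f}")
print(f"Assuming tenfold dependence: 1 in {1 / p_two_dependent:,.0f}")
```

Even a corrected figure is not, by itself, evidence of guilt: the relevant comparison is between the probability of two natural deaths and the probability of two murders, both of which are rare events.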

Another probability error arose during the O.J. Simpson murder trial. Harvard law professor Alan Dershowitz testified for the defense regarding Simpson’s previous domestic assault on Nicole Simpson. Claiming “academic expertise in this area,” he concluded that extremely few domestic battery cases result in murder, that battery is not a good independent predictor of murder, and that discussing their statistical relationship might confuse the jury.9

They were not the only ones confused—it turns out Dershowitz answered the wrong question. At issue is not the probability that a battered woman will be murdered by her abuser, but rather the probability that a battered and murdered woman was killed by her abuser. The former probability—the evidence given at trial—is less than 0.04 percent.10 The latter, correct probability, however, is almost 90 percent.11 Unbelievably, the prosecution let this testimony go unchallenged.
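
The difference between the two questions can be made concrete with a short calculation. The 0.04 percent figure is the one cited above; the rate at which women are murdered by someone other than an abusive partner is an assumed, illustrative value.

```python
# Sketch of the two conditional probabilities in the Simpson trial testimony.
# p_killed_by_other is an illustrative assumption, not a figure from the trial.

p_killed_by_abuser = 0.0004    # cited: P(a battered woman is murdered by her abuser in a year)
p_killed_by_other = 0.00005    # assumed: P(she is murdered by someone else in a year)

# The relevant question: given that a battered woman has been murdered,
# how likely is it that her abuser was the killer?
p_abuser_given_murder = p_killed_by_abuser / (p_killed_by_abuser + p_killed_by_other)

print(f"P(murdered by abuser | battered):            {p_killed_by_abuser:.2%}")
print(f"P(killed by abuser | battered and murdered): {p_abuser_given_murder:.0%}")
```

Under these assumptions the second probability works out to roughly 90 percent, in line with the figure cited above, even though the first remains a small fraction of a percent.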

Errors of Thinking. Two errors relate to understanding probability within the court context: the prosecutor’s fallacy and the defense attorney’s fallacy. The latter is the deliberate work of a clever defense lawyer who convinces a jury or judge to consider each item of incriminating evidence in isolation (to minimize its impact), rather than in totality. The prosecutor’s fallacy is more insidious because it typically happens by mistake. It occurs when the probability of the evidence given guilt is equated to the probability of guilt given the evidence. Even if the evidence is highly probable given the hypothesis, it does not follow that the hypothesis itself must be highly probable.
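
Stated formally, with E representing the evidence and G representing guilt, Bayes’ theorem shows how the two quantities are related and why they cannot simply be interchanged:

```latex
P(G \mid E) \;=\; \frac{P(E \mid G)\,P(G)}{P(E)}
\qquad \text{so, in general,} \qquad
P(G \mid E) \neq P(E \mid G).
```

The two conditional probabilities coincide only when the prior probability of guilt, P(G), happens to equal the overall probability of observing the evidence, P(E).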

The case of the Birmingham Six in England12 is an infamous example of the prosecutor’s fallacy. In 1974, horrendous bomb explosions in two Birmingham pubs killed 21 people and injured 182. The bombings were attributed to the Provisional IRA. Special Branch officers detained a group of six men traveling to a funeral in Belfast. Their hands were swabbed, and the swabs subsequently analyzed for traces of nitroglycerine. A forensic scientist testified, during what was the largest mass murder trial in Britain, that he was 99 percent certain the defendants had handled explosives, based on the results of his tests.13

What he should have said was that the test is positive 99 percent of the time when someone has handled explosives. It was later disclosed that many other substances can produce positive test results, including nitrocellulose, which is found in paint, playing cards, soil, gasoline, cigarettes, and soap. The defendants had played a game of cards on the train shortly before their arrest. The convictions of the Birmingham Six were overturned on appeal.
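
A small numerical sketch shows how a test described as 99 percent reliable can still leave substantial doubt. The 99 percent true-positive rate comes from the testimony described above; the false-positive rate and the prior proportion of explosive handlers are purely illustrative assumptions.

```python
# Bayes' theorem applied to the Birmingham Six swab testimony.
# Only the 99% sensitivity comes from the testimony; the other two
# numbers are illustrative assumptions, not figures from the case.

p_pos_given_handled = 0.99    # testimony: positive 99% of the time if explosives were handled
p_pos_given_innocent = 0.01   # assumed: positives from nitrocellulose (cards, paint, soap)
p_handled = 0.001             # assumed prior: fraction of such travelers who handled explosives

p_positive = (p_pos_given_handled * p_handled
              + p_pos_given_innocent * (1 - p_handled))

# Probability the person handled explosives, given a positive swab
p_handled_given_pos = p_pos_given_handled * p_handled / p_positive

print(f"P(positive swab | handled explosives): {p_pos_given_handled:.0%}")
print(f"P(handled explosives | positive swab): {p_handled_given_pos:.1%}")
```

With these assumed rates, a positive swab raises the probability that the person handled explosives to only about 9 percent, a far cry from the 99 percent certainty expressed at trial.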


Wrongful Innocence Claims

Ironically, the same factors that can lead to wrongful convictions are also found in wrongful innocence claims. Several imprisoned and truly guilty criminals have made loud and persistent cries of innocence, ensnaring well-meaning but undiscerning supporters. Such cases typically involve a coordinated campaign, with lawyers, reporters, politicians, and academics crusading for the release of the “wronged” party. Unsurprisingly, it turns out that crusaders are no more immune to psychological biases, group dynamics, and probability misunderstandings than police investigators and prosecutors.

The Benjamin LaGuer case is a typical example. In January 1984, LaGuer was charged with rape, assault, robbery, and burglary in relation to the brutal sexual assault of a 59-year-old woman. He was convicted and given a life sentence. LaGuer has always claimed to be innocent, and his efforts for release have been supported by a number of scholars, politicians, and journalists. In 2002, DNA proved them all wrong.14

LaGuer continues to refuse to accept responsibility for his actions, and his supporters, some of whom now argue the DNA evidence was contaminated, have only enabled him.

The supporters of these criminals suffered from the halo effect, a cognitive bias whereby one positive trait (intelligence, talent, charm) influences the perception of other traits (moral rehabilitation, lack of dangerousness). Charisma, intelligence, and communication skills are trademarks of criminals successful in convincing others of their “innocence.” Unfortunately, these characteristics have nothing to do with actual innocence. In fact, the opposite traits—dislikeable personalities, low intelligence, and poor communication skills—are more likely to be correlated with wrongful convictions.

Well-intentioned people eager to help often do not question their mission’s validity. Wrongful convictions do occur, and innocent people are imprisoned. But as DNA testing shows, most claims of prisoner innocence are false.15 In any criminal investigation—especially a long and complicated one—mistakes and errors occur. Nevertheless, these mistakes do not automatically translate into the accused’s innocence.

The best approach is to seek the truth—an objective, careful truth based on evidence, facts, logic, and skepticism—not a subjective, dogmatic opinion entangled in biases, egos, and politics.


Preventing Investigative Failures

A major crime “whodunit” can be a challenging investigation. Factors identified with cognitive and organizational failures (low information levels, limited resources, and pressure to obtain quick results) are all too common in such investigations. The potential benefits of advanced forensic techniques, comprehensive criminal databases, and highly skilled police personnel are undermined by the wrong mind-set and a limited organizational approach.

There are three rules of investigative failure that detectives should keep in mind:

  • One mistake, one coincidence, and one piece of bad luck can produce an investigative failure.
  • Once one mistake has been made, the likelihood of further mistakes increases.
  • Usually the biggest problem is refusing to acknowledge the original mistake.

Police agencies should ensure that detectives and their managers are aware of these traps. Investigations should be led by the evidence, not by the suspects. Case conclusions should be deferred until sufficient information has been gathered, and tunnel vision should be avoided at all costs. Investigative managers must remain neutral and encourage open inquiries, discussion, and dissent. Assumptions, inference chains, and uncertainties need to be recognized and recorded. Outside help should be sought when necessary.

Being aware of these problems, however, is usually not enough. Police agencies need to establish organizational mechanisms to mitigate their risk.

The criminal investigation process plays an important and special role in countries governed by the rule of law. Its function is to seek the truth, without fear or favor. That task, integral to both public safety and justice concerns, must be conducted in an unbiased and professional manner. If it is not, the result is unsolved crimes, unapprehended offenders, and wrongful convictions. Understanding what can go wrong is the first step towards preventing a criminal investigative failure. ■

Dr. Kim Rossmo is the University Endowed Chair in Criminology and the Director of the Center for Geospatial Intelligence and Investigation in the Department of Criminal Justice at Texas State University. He has a Ph.D. in criminology from Simon Fraser University, and was formerly a detective inspector with the Vancouver Police Department. He is a member of the IACP Police Investigative Operations Committee. His second book, Criminal Investigative Failures, was recently published by Taylor & Francis (2009).

Notes:
1The Innocence Project is a national organization whose mission is to exonerate wrongfully convicted persons through DNA testing and reform the criminal justice system to prevent future wrongful convictions; for more information, visit http://www.innocenceproject.org/.
2D. Kim Rossmo, Criminal Investigative Failures (Boca Raton, FL: Taylor and Francis, 2009).
3Paul Britton, The Jigsaw Man (London: Bantam Press, 1997): 366.
4Carlton Smith and Thomas Guillen, The Search for the Green River Killer (New York, NY: Penguin Books, 1991).
5Newseum, “D.C. Sniper,” http://www.newseum.org/exhibits_th/fbi_feat/index.aspx?item=sniper_index&style=c (accessed August 17, 2009).
6Her Majesty the Queen v. David Edgar Milgaard, 1970 QB 1157-1158.
7Celia Hall, “‘Statistical Error’ in Child Murder Trial,” Telegraph (December 31, 1999), http://www.telegraph.co.uk/htmlContent.jhtml?html=/archive/1999/12/31/nsal31.html (accessed July 12, 2008).
8“Royal Statistical Society Concerned by Issues Raised in Sally Clark Case,” news release, October 23, 2001, http://www.rss.org.uk/PDF/RSS%20Statement%20regarding%20statistical%20issues%20in%20the%20Sally%20Clark%20case,%20October%2023rd%202001.pdf (accessed August 17, 2009).
9Alan M. Dershowitz, Reasonable Doubts: The Criminal Justice System and the O.J. Simpson Case (New York: Simon and Schuster, 1996): 104.
10Alan M. Dershowitz, Reasonable Doubts.
11Ibid.
12The Birmingham Six are Hugh Callaghan, Patrick Joseph Hill, Gerard Hunter, Richard McIlkenny, William Power, and John Walker, who were sentenced to life imprisonment in 1975.
13Bernard Robertson and Georges A. Vignaux, Interpreting Evidence: Evaluating Forensic Evidence in the Courtroom (Chichester: John Wiley and Sons, 1995).
14David Arnold, “Convict’s Cause Is Tested: Supporters Shaken by DNA Findings,” The Boston Globe, March 28, 2002, http://truthinjustice.org/laguer.htm (accessed August 17, 2009).
15Mathew Bruun, “DNA Findings Difficult to Rebut: Doctor Rejects LaGuer Claims,” Worcester Telegram and Gazette, March 31, 2002.


From The Police Chief, vol. LXXVI, no. 10, October 2009. Copyright held by the International Association of Chiefs of Police, 515 North Washington Street, Alexandria, VA 22314 USA.