Your Trial Message

(formerly the Persuasive Litigator blog)

Think Your Jury Understands Probability? Don’t Bet on It

By Dr. Ken Broda Bahm:

Many legal cases involve probability. The known risks of a medical condition affect liability. The chances of the same occurrence without the defendant’s actions affect causation. The likelihood that a plaintiff would have earned a given income if an accident hadn’t happened affects damages. Litigators might take it for granted that the kind of odds-making we engage in every day won’t present any special challenges to jurors. But probability requires some higher-order thinking and some resistance to bias. Rather than being one of those situations in which bias occasionally creeps into an otherwise rational decision, probability is best seen as an area of reasoning governed by bias, making it probable that your jury is, at least to some extent, misunderstanding probability.

The master bias is something called “probability neglect.” A term coined by Cass Sunstein (2003), it refers to the common tendency to disregard probability when making a decision under uncertain conditions: Small risks can be either neglected entirely or over-valued. “As a result of probability neglect,” Sunstein notes, “people are often far more concerned about the risks of terrorism than about statistically larger risks that they confront in ordinary life.” Instead of just looking at the known chances of an inherited disease, for example, we rely on irrelevant factors, other people’s stories, or our own feelings of personal exemption from those odds. These tendencies to distort the chances by viewing them through our own filters have broad implications for legal reasoning within a jury. This post draws on a few related areas of social psychology literature to focus on three common ways probability neglect can play a role in trial settings.

Three Common Probability Errors (and Fixes) 

One: Past Events Were Inevitable

Looking back, the chances that something that did happen would have happened under the same conditions seem to approach 100 percent: It had to have happened because it did happen. Logically, we know that isn’t true, because the winning lottery ticket still carried incredibly long odds at the time of sale. But for jurors assessing a story, the moment the doctor decides to go with a simpler test, it appears inevitable that this choice will mean missing the critical diagnosis. This, of course, is one manifestation of hindsight bias (Roese & Vohs, 2012), or the tendency to project present knowledge into past assessments.

Fix: Focusing on the chances alone isn’t the answer, since jurors will feel like they know the odds based on the outcome. Instead, push back against the tendency to exaggerate probability based on known outcomes. One way of doing that is by encouraging jurors to think counterfactually about all of the things that could have happened, but didn’t. The delay in ordering more comprehensive testing could have delayed treatment of a more likely and equally threatening medical condition, for example.

Two: Control Trumps Risk

A subjective feeling of power or control creates the perception of being personally immune to the odds. Known as the “illusion of control” (Fast, Gruenfeld, Sivanathan & Galinsky, 2009), it explains the response of gun owners to statistics showing that the risks of gun possession in the home greatly exceed the benefits: They will say those statistics relate to other people who take fewer precautions with their guns. And, to be sure, they might. Control isn’t always an illusion, and there can be truth to those differences. But in litigation, that perceived power can be an important variable. Jurors identifying with a plaintiff who chose his own protective cycling helmet, for example, could give greater weight to the idea that this choice should have given him greater immunity even to the natural risks of cycling. Or a juror who appreciates all of the power a manufacturer had over the testing process could feel that this power should have given the company greater protection even from problems it didn’t test for.

Fix: If the problem is the belief that control creates immunity to probabilities, the answer may lie in turning to the flip side of that bias: A tendency to equate choice with responsibility. So rather than focusing on the objective probabilities of something happening, focus instead on the subjective choices that led to it happening. The gun owner who made all the decisions about how and where the gun is stored arguably bears greater responsibility for its misuse. The cyclist who had several choices in helmets is more responsible for the consequences of his choice. The company that could have tested but didn’t bears more responsibility for the result.

Three: Examples Are More Powerful Than Statistics

In a recent Science in the Courtroom blog post, guest author Chris Dominic of Tsongas Litigation Consulting shares an employment class action example: “The trial team we were working with hired an expert to do a regression analysis that concluded clearly that the career paths of the protected class were roughly the same as the white men in the organization. In one mock trial we had the plaintiff ignore the study but provide three specific examples of horrible racist (or sexist) behavior. The plaintiffs crushed us.” In that case, the logical knowledge that “these examples are atypical outliers” does not beat the more visceral knowledge that “these examples are real.” The bias at work here is called the “availability heuristic,” based on Amos Tversky and Daniel Kahneman’s (1973) research and more recently discussed in Thinking, Fast and Slow (Kahneman, 2011). Simply put, the bias is “if you can think of it, it must be important.” By being concrete, personal, visual, and story-based, the example is simply easier to think of than the statistic.

Fix: It would be nice to believe that the fix for this is greater emphasis on, or education about, the statistics. They are, after all, more comprehensive and representative than a story could ever be. That choice, however, means going to war with basic human psychology. You might succeed, but the odds are against you. Instead, or in addition, the solution is to make your own points as “available” as, or more so than, the other party’s. In other words, tell your own stories and provide your own examples.

Ultimately, will your jury misunderstand probability? Probably. Lawyers like to believe that logic, and especially the power of explanation and proof, will solve any problem if the jury would just be rational. And some biases can be overcome by raising awareness. But probability is tricky business. The chances are good that this is one bias that needs to be adapted to rather than erased.

______

Fast, N. J., Gruenfeld, D. H., Sivanathan, N., & Galinsky, A. D. (2009). Illusory control: A generative force behind power’s far-reaching effects. Psychological Science, 20(4), 502-508.

Kahneman, D. (2011). Thinking, fast and slow. Macmillan.

Roese, N. J., & Vohs, K. D. (2012). Hindsight bias. Perspectives on Psychological Science, 7(5), 411-426.

Sunstein, C. R. (2003). Terrorism and probability neglect. Journal of Risk and Uncertainty, 26(2-3), 121-136.

Tversky, A., & Kahneman, D. (1973). Availability: A heuristic for judging frequency and probability. Cognitive Psychology, 5(2), 207-232.

Photo Credit: Leasepics, Flickr Creative Commons