
Beware of Junk Science in Disguise

By Dr. Ken Broda-Bahm:

Our trial system is designed to restrict the factfinders’ information to that which is relevant, probative, and sound. When it comes to expert testimony, it is the responsibility of the trial judge to ensure that the testimony has a reliable foundation. But in the case of science, particularly social science, that can be a challenge. I have written in the past on research showing that jurors are only partially effective at understanding basic research flaws and discounting dubious science, and that judges are not necessarily better. That means that advocates often need to be the last line of defense against the use of junk science in the courtroom.

In a recent ProPublica article, “They Called 911 for Help. Police and Prosecutors Used a New Junk Science to Decide They Were Liars,” journalist Brett Murphy provides a timely and disturbing example of the problems that arise when a supposed science is used to decide whether 911 callers might be guilty of the crimes they are reporting. The article is a fantastic example of thorough and incisive reporting, and it is worth the full read for anyone interested in the rights of the accused, in the frailties of American courts when it comes to protecting those rights, or simply in a good story. It is also an interesting case study of the ways that purported scientific claims can escape scrutiny.

The System: 911 Call Analysis

The system, 911 Call Analysis, seems to have been largely created by a single person: Tracy Harpster, a retired deputy police chief from suburban Dayton, Ohio, with no scientific background and little experience in homicide investigations. In a criminal justice master’s thesis, Harpster analyzed a set of one hundred 911 calls in which half of the callers were ultimately convicted of something and the other half were not. He then used speech patterns — tone of voice, pauses, word choice, and grammar — to try to differentiate the guilty from the innocent callers. He found a set of 20 correlating factors, including use of words like “Hi,” “Please,” “Somebody,” “Thank you,” or “I need help,” which, he says, indicated guilt. He then apparently validated the scale using the same dataset he used to create it, rather than testing it on a fresh set of calls. As you might expect, that’s a pretty big no-no when it comes to research validity and reliability.
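To see concretely why that is a no-no, here is a minimal sketch in Python (using scikit-learn and randomly generated stand-in data, not Harpster’s actual calls, factors, or method): a classifier scored on the same cases used to build it can look far better than chance even when the “features” are pure noise.

```python
# A toy illustration of in-sample versus out-of-sample validation.
# The "calls" and "speech factors" below are random noise, so any
# apparent accuracy above 50% is an artifact of testing the model
# on the same data it was trained on.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 20))    # 100 "calls," 20 arbitrary speech factors
y = rng.integers(0, 2, size=100)  # coin-flip "guilty"/"innocent" labels

model = LogisticRegression().fit(X, y)

# Scored on the very calls it was fit to, the model looks impressive.
print("Accuracy on the training calls:", model.score(X, y))

# Scored on held-out calls it never saw, it falls back toward chance.
scores = cross_val_score(LogisticRegression(), X, y, cv=5)
print("Cross-validated accuracy:", scores.mean())
```

The first number will typically land well above 50 percent; the second hovers around the coin flip you would expect from meaningless data. Testing against a fresh set of calls is what separates a real finding from a pattern the method has simply memorized.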

This, however, did not stop the research from being promoted to prosecutors across the country in an FBI Bulletin, and it has not stopped Harpster from offering classes — 8 or 16 hours of training in these detection methods — to hundreds of prosecutors. Based on 80 public records requests (some of which had to be litigated) and 120 interviews, the reporter uncovered a wealth of examples of the system’s use as pivotal testimony in at least 100 criminal cases in 26 states. The author takes a deeper dive into five cases where, by all appearances, individuals were targeted and sometimes convicted of crimes because they called 911 and used the wrong words or the wrong tone of voice. Based on surprisingly unguarded correspondence from many prosecutors, the reporter also uncovers discussions of ways to admit the testimony while bypassing the traditional checks on scientific reliability and validity.

Harpster does not allow non-prosecutors, including reporters and other researchers, to attend these trainings, and he refuses to share the underlying data. So the basic problem is that there is no published or replicable research showing that this system for separating guilty from innocent callers actually works. But worse than that, five studies have been undertaken to corroborate Harpster’s research, and all have failed. A team of researchers at ASU is now pursuing a larger-scale study because, as lead researcher Jessica Salerno explains, “We think there’s no normal way to act on a 911 call.”

The Problem: Back-Dooring Scientific Opinion Testimony on 911 Calls

The typical check, in federal courts and in the states that have adopted the Daubert standard, is for a claimed expert to be cross-examined and required to defend their data and their methods, ideally before trial. But one fascinating fact in this case is that, despite having developed and popularized this system — including a one-page worksheet used for determining a caller’s guilt or innocence — Harpster does not appear to have ever testified about the system in any court. As one Iowa prosecutor shared in correspondence, “He knows there will be a great legal hurdle getting the research admitted,” and “he doesn’t want the legal precedent.” So instead, he trains those who will testify, encouraging them to rely on tactics for evading scientific scrutiny of his methods.

The trick is to identify a law enforcement witness who has taken the course (these have been detectives or even dispatchers) and then have them share conclusions about a 911 call based on Harpster’s system, or even on Harpster’s own out-of-court analysis, while grounding the testimony generally in their “training” and “experience.” The reporter cites a defense attorney noting, “It can look very much like regular opinion testimony from a witness,” and not some exotic scientific method that would be subject to greater analysis and criticism.

This approach of getting the testimony in through another witness is, of course, a problem that can occur with any kind of scientific testimony, including that offered in civil cases. When experts “team” together in order to subtly reinforce each other’s testimony, that can be an issue. In addition, when a non-testifying advisor works with testifying fact or expert witnesses, cross-examining attorneys can similarly face the problem of assessing conclusions that are coming from somewhere else. Ideally, the courts should prevent that strategy of disguised testimony, and to help that happen, I have two recommendations for advocates.

To the Courts, Cite the Social Science

Just as jurors don’t always have the knowledge or the comprehension to see exactly what is wrong with a given scientific method, the 911 Call Analysis example underscores the fact that judges don’t necessarily have that skill either. Some judges have kept it out, while others have gone with the old standby of “Let it in, and you can bring all of this up in cross-examination.” That often isn’t an adequate solution, because the jury trial is not the best setting for a comprehensive introduction to research standards and methods.

I have long believed that advocates should be more aggressive in citing to the court not just legal precedent, but social science research as well, particularly when the court is relying on a questionable premise (like “jurors can know and control their own biases” or “jurors can readily understand complex research flaws”) that the social science would effectively overturn. In this case, if the other side wants to directly or indirectly rely on 911 Call Analysis, cite the five studies (referenced in the ProPublica article) that show it does not work. You may also want to draw from the article in educating the judge about the other problems with 911 Call Analysis (e.g., secret methods, concealed data) that prevent it from being a reliable part of anyone’s testimony. In a parallel fashion, if the other side is relying on similarly unsupported or debunked methods, then educate the judge.

To the Jurors, Teach 

Ultimately, it is inevitable that jurors will apply their own perceptions and make credibility assessments about a 911 call, or frankly about any evidence that is offered. But those common-sense perceptions shouldn’t be artificially buttressed or given a scientific sheen through unsupported or unreviewable references to research. As many of the prosecutors note in the ProPublica article, they like to use 911 Call Analysis because it works: Even if it isn’t true, jurors understand the idea that there’s a “normal” way to act during a 911 call, and with the right testimony, they might trust that there is a reliable and scientific way for an experienced person to determine the culpability of the caller just by listening to the recording. The same applies to other examples of dubious science: It is only “junk” to the experts who understand the critiques; to everyone else, it may sound meaningful and useful.

As a result, attorneys need to teach the difference between good and bad science, to focus jurors on the reasons and not just the conclusions, and to encourage an understanding of opinion testimony as a process and not just a product. To do that, your witness needs to be the better teacher, using clear language, analogies, visual aids, and every other communication tool at your disposal.

Even setting aside the implications for science testimony in general, the ProPublica article is a disturbing read. Through the examples it shares, it shows not only how innocent people can face prosecution for the simple act of calling 911, but also how courts can be fooled. It is one of many reminders that courts and fact finders need to become more sophisticated about what is and isn’t good social science.


Image credit: Shutterstock, used under license