By Dr. Ken Broda Bahm:
The setting of an in-court oral voir dire can be seen as a “perfect storm” of information. Data about your panel and your potential future fact-finders is coming at you from all directions. You may have responses from a questionnaire as well as information from prospective jurors’ social media and public records. On top of that, you will have jurors’ numerous answers to questions from the court and from parties on all sides. Finally, you will have whatever observations you make, and judgments you apply, to the members of the panel. Somehow, your full team needs to keep track of all of that information in order to ground your decisions on cause challenges and peremptory strikes.
How do you do it? For many, the method still starts with a blank chart full of boxes, or a folder covered with Post-It notes. We use those methods as well, but we have long been interested in tracking the innovations that promise to bring this task into the computer age. There have been a number of systems for tablets and computers, some of which have been reviewed in our pages. Right now, our attention is on a recently developed cloud-based system that is unique in being designed with teams in mind. The team at JurorSearch, led by CEO Dan Johnson, has developed a platform that allows any number of team members to collaboratively enter information on the panel before and during voir dire, with dedicated and fully customizable fields for surveys, hand-raiser questions, and open-ended comments. It also allows all users on your team to offer and compare ratings and comments on potential jurors on the screen. This past week, we used a panel of volunteers to conduct a live test, which also gave us a chance to step into an attorney’s shoes and appreciate the task from that perspective. In this post, we’ll share a brief video of that test along with a few observations.
Using a medical malpractice scenario involving a plaintiff who suffered a stroke shortly after being discharged from an emergency room, we simulated the full voir dire process. We asked the volunteer jurors to fill out questionnaires while we also conducted social media analysis. During the live questioning, we each played the role of an attorney, one for the plaintiff and one for the defendant in the scenario. Throughout all of this activity, data was being entered by several members of the team. Responses to the “hand-raiser” questions, where you need to quickly record a number of individuals all answering “yes” to the same question, are the ones shown on the screen. Simultaneously, several other fields were being populated as well, including fields for the jurors’ comments and for our own evaluations and rankings of them. Ultimately, we used a red-to-green scoring system, and combined that score with the individual report on each potential juror in order to guide our strike decisions.
The video appears below or can be accessed at this link:
As the two doing the questioning, we both gained a fresh appreciation of the demands on the attorney in that role. The questioner has to work hard to maintain a connection with potential jurors, something they likely cannot do well if they are also trying to keep physical or mental track of everything that is being said. That is a big reason why a team-based data entry system like JurorSearch fills an important niche. As we have more opportunities to innovate and enhance our ability to keep track of this storm of information, we will likely have more to share.
______________
Other Posts on Jury Selection:
- Tap Into Computer-Aided Jury Selection: A Video Review of Jury Box Software
- Don’t Select Your Jury Based on Demographics: A Skeptical Look at JuryQuest
- Don’t Mistake the Purpose of “Scientific Jury Selection”