“It was the best of times, it was the worst of times, it was the age of wisdom, it was the age of foolishness, it was the epoch of belief, it was the epoch of incredulity, it was the season of Light, it was the season of Darkness…” (Charles Dickens, A Tale of Two Cities)
Mock trials can be the best of times and they can be the worst of times; attorneys may gain wisdom or foolish guidance; mock jurors may offer hope or provoke despair. The reliability of the results, however, depends upon the methods used to conduct the mock trial. The old adage “Garbage In, Garbage Out” holds for social scientists -- the quality and reliability of the data depend upon the research methods. Poor methods yield unreliable data and waste precious time and resources. But what if you could compare two distinct methods to determine which is superior? That is a rare opportunity for consultants, but I was recently given the chance to re-test a previously conducted mock trial.
A new client recently called lamenting the results of a mock trial conducted with another consulting firm. The research process disappointed him, which led him to doubt the project’s findings. Worse still, jurors had given his client extremely positive support -- nearly 90% sided with his client -- and the client had become emboldened to take the case to trial. In the attorney’s view, the client did not fully grasp the gravity of the human factors in the case, in large part because of the research methods employed. He contacted us because of Courtroom Sciences’ reputation for realistic trial simulation.
Receiving a call describing an unsatisfactory experience is not uncommon, but clients typically lack the time, money, or desire to conduct a second research project as their case heads to trial. I was elated at the rare opportunity to put CSI’s methods up against a competitor’s and demonstrate the difference in the quality of data our approach could obtain.
A Tale of Two Mock Trials
A Flawed Method
When I inquired about the methods employed by the prior consulting firm, the attorney explained:
Attorneys prepared plaintiff and defense scripts, which the consultant reviewed to ensure argument consistency. Attorneys then read the scripts to jurors, deliberately avoiding any injection of personality. Presentations stripped of “personality” supposedly ensured that the facts were properly tested and not influenced by “stylistic differences.” No witness videos were shown to jurors.
The audience consisted of 24 jurors matching local demographics, who reported to a focus facility, listened to the presentations, and completed questionnaires measuring their support for either party. Based upon jurors’ responses, eight jurors were released and the remaining 16, in two groups of eight, deliberated over a jury charge.
While the above research may sound fairly standard, the method is unusual in that it attempts to isolate jurors’ view of the case to specific facts. Trial, however, is not a sterile, scripted environment devoid of attorney personalities and witnesses, and a method that removes these human factors produces data that cannot be relied upon at trial.

Within this research design, nearly 90% of jurors supported the defendant. But as the attorney explained, he could not interject any of the “human” issues the judge had not yet ruled upon (spoliation of evidence, alleged theft of trade secrets, breach of contract) or the personality of the opposing attorney, because the consultant wanted to remain focused on the facts. The consultant appears to have employed an exploratory research design for a project in which the client wanted findings that could predict the outcome at trial. Essentially, the consultant used the research to gather juror feedback about the case, but the client wanted data upon which he could rely at trial. To the attorney the results were invalid, and he ignored the consultant’s recommendations based upon the flawed data. He turned to us because he sought data produced under simulated trial conditions that could be relied upon at trial.
Retesting the case through a traditional simulative mock trial identified a distinctly different path that counsel had not considered until hearing jurors’ feedback. In his words, he “learned significantly more about the case and its weaknesses than the prior project,” and while the client received less than 40% support from jurors, the data indicated multiple paths to potentially prevail at trial. For researchers, identifying the challenges surrounding a case and the paths to overcome them is the primary goal of most projects -- few compliments rank higher than hearing how much a client learned about their case as a result of the research.
Consulting firms often pitch exploratory research designs to keep costs low. Confirmatory research is typically more expensive because predictive results require more jurors (24-36) and, ideally, multiple mock trials. Our internal research over 25 years shows that with six juries of eight or more participants, consistent case issues in each project, and an environment simulating trial conditions, mock trial jurors reflect the actual jurors’ decision nearly 80% of the time. Few firms have the caseload to draw upon to offer comparable data, but much of our success rests upon the methods we employ. For more on predictive methods, see George Speckart’s articles “Trial by Science” (Risk and Insurance, 2008) and “Do Mock Trials Predict Actual Trial Outcomes” (In House, 2010).
Environmental simulation is the most fundamental principle of predictive human behavior research, and at CSI we replicate the courtroom environment as closely as practicable during our predictive research projects (Mock Trials or Exploratory Mock Trials).
Our in-house recruiting department recruits the mock jurors, because a group of research participants who are representative of a typical jury in the trial venue, carefully screened, and properly oriented is as important to the validity and utility of the research as the content of the presentations.
Both deliberations and a focus group enable CSI consultants to replicate the courtroom process and to probe thoroughly for issues that questionnaires or presentations may not capture. While the above description covers a typical mock trial, the research design ultimately depends upon the questions a client needs answered.
What answers do you need?
“Just get a focus group together.” These words were offered by a claims adjuster to trial counsel who had requested juror research. Here’s the problem with that demand: if attorneys or clients seek particular answers, the research must be designed to provide those answers accurately. In this instance, both trial counsel and the claims adjuster sought to establish the case’s potential value for settlement and trial purposes. Because case valuation was the primary goal, a focus group would not be appropriate: it would not replicate the deliberation process, and in personal injury cases group damage awards trend higher than individual awards. In addition to traditional mock trials, we conduct juror research through a variety of designs depending upon our clients’ needs, but the following is among the most common exploratory designs:
Online Mock Trial: Jurors read or watch video of counsel arguments and witness testimony, individually complete a verdict form, and answer open-ended questions. This design is ideal for identifying case themes, assessing witnesses, and valuing damages.
Identifying the answers you or your client need should be the starting point of any discussion of research design, but keep in mind that limited resources (time or money) also limit the research project’s data. The budget for an exploratory project is not comparable to that of a predictive research project -- useful information may be gleaned from both exercises, but knowing the limitations of a project’s data is crucial.
Finally, we regularly meet clients who were dissatisfied with the research or analysis provided by other trial consultants. When reading through reports prepared by other consulting firms, it quickly becomes apparent that the research design produced unreliable results or that the report is a data dump lacking critical analysis and practical suggestions. For example, in reviewing a recent report from a competitor, I examined the questionnaires prepared by the consultant and discovered that the consultant had included the attorneys’ verbatim arguments in each questionnaire following each presentation -- sounds benign, right? However, instead of jurors naturally gravitating toward the attorneys’ arguments as they would at trial, jurors focused upon the arguments the consultant placed in front of them, skewing the research. What might not have been an important issue to jurors suddenly became an issue they focused upon. For social scientists this is an egregious error, and a look into the consultant’s background revealed only a bachelor’s degree in the social sciences.
Because predicting human behavior is one of the most challenging tasks in social science research, and because research design plays a critical role in the reliability of data at trial, we highly recommend seeking a professional with a Ph.D. or M.S./M.A. in the social sciences, particularly psychology or communication. Marketing professionals, public relations specialists, thespians, physicians, and lawyers represent some of the motley backgrounds held by “litigation consultants”; however, these backgrounds lack the requisite training and knowledge to design and conduct research appropriately. When significant sums of money are at stake, do you really want to gamble on an individual who lacks proper education and training?
Identifying Quality: Questions to Ask
Ok, I get it; while I and other social scientists may geek out over methods, you are probably not planning on a Ph.D., so what questions should you ask?
When comparing consultants and consulting companies, ask about the differences between Company A’s research design and Company B’s. When comparing proposals and costs, ensure you are comparing “apples to apples” (research design, project structure, company reputation, and the consultants’ education). Finally, while cost may be an issue, most consulting firms can offer budget-saving options (fewer jurors, a reduced report, etc.). Selecting a consulting firm on price alone invites risky and unforeseen consequences that may result in costly strategic miscalculations at trial -- “the bitter taste of poor quality lingers long after the sweet taste of low price is forgotten.”
Courtroom Sciences, Inc. provides litigation support services to outside counsel and corporate legal departments. CSI offers a comprehensive suite of services that assists legal counsel in managing the lifecycle of litigation. Call or email us today for more information.