Since 1998, 375 individuals who were locked away in prison, serving long sentences, have been exonerated by DNA evidence. Of those 375, nearly 70% were falsely convicted largely thanks to eyewitness testimony: someone identified them and testified that they were, in fact, the perpetrator.
Eyewitness testimony is fraught with issues, according to decades of research.
Yet, when it comes to making up our minds, deciding what to believe in, we fervently stick to our own versions of eyewitness testimony. We are hardwired to cling to ideas that confirm our beliefs, to overestimate our personal experience, to put what we feel above something abstract like statistics or research. We cling to reports of friends of friends who experienced something. We pin our hopes on before-and-after shots of weight loss success with a magic supplement. We look around, see that none of our friends have died from a disease, and think things must be okay.
It’s not our fault. We’re riddled with dozens of cognitive biases, or quirks in how our minds work. These cognitive biases tend to push us in directions that blind us to objective truth. Perhaps there is no greater example of this effect than when dealing with a pandemic:
Confirmation bias? Check.
Bandwagon effect? Yep.
Recency bias? Of course.
Illusion of validity? Yes.
And so on…
What are we to do? Bias has a negative connotation. When mentioned, it’s as if we’re accusing the person of deliberate manipulation. But bias is simply a glitch. It’s an acknowledgment that, likely for evolutionary reasons that favored quick decisions and avoiding sudden death, our minds work in ways that might not lead toward objective truth in a given situation.
Science is the process of systematically acknowledging and trying to minimize our bias. Science isn’t free of bias. But, up to this point in history, it’s the best way we’ve come up with for dealing with bias, and finding the capital T Truth.
Which brings us back to eyewitness testimony. Science has a way to answer the question, “What evidence is best?” It’s called the hierarchy of evidence. Where do we think that eyewitness testimony falls on that hierarchy?
At the very bottom. A doctor reporting that a drug seemed to help a patient? Regardless of whether that observation covers one patient or one hundred? The very lowest rung of evidence.
Why? Because it’s the most fraught with bias. An observation, or case study, is the start of the process of science. Notice a trend or observe a phenomenon? It’s not time to throw your hands up in victory. It’s the beginning of a long process to find the truth.
As we make our way up the hierarchy of evidence, studies become more systematic and controlled. They become more specific to the nature of the problem and the population it affects. The study design, methods, and results are all subject to a first round of critique by peer reviewers, and then published for all to see. It’s a long process. Why? We’re trying to eliminate bias.
As we make our way to the vaunted randomized placebo-controlled study, you might think, “Ahh! Now we are going to find the truth!”
Not quite. A single randomized controlled trial is a huge step up, but even with the best study design, it’s not perfect. We’re humans living in a complicated world. We can’t control and account for everything. A single study is helpful, but it doesn’t always show us the way. It needs to be replicated by another group, one that doesn’t have a stake in the claim and isn’t chasing the glory of being first to publish a provocative result.
At the top of the hierarchy of evidence is a systematic review or meta-analysis: this is where we collect the dozens of studies done on a topic, and see what they say as a whole. Do they all point in the same direction? Good news! There’s probably an effect. Are the results mixed? Do they point in the opposite direction? Well, that leads us to a different interpretation.
It’s a long and complicated process. One that doesn’t actually lead us to 100 percent certainty (no such thing exists). But it’s a process that gets us to a point where we can feel reasonably confident that whatever we’ve found is very likely to be true.
Today’s world is fraught with disinformation, opinions galore, and many people stuck in what a friend called the “I don’t know what to believe” trap. It’s important to understand the scientific process, and more so, to understand what your own hierarchy of evidence is.
Do you value what you feel? Do you put a lot of weight on your experience? Do you rely on experts or news outlets? Do you peruse Google Scholar or PubMed for research studies? Do you only read the studies that back up your opinion?
I’m not saying that science is perfect. But the scientific method of systematically trying to eliminate bias is the best thing we’ve come up with so far. And now, more than ever, it’s worthwhile to take that same approach to your current beliefs. How can you minimize your bias?
(For related reads see: Tech Gurus and ALL CAPS Twitter versus Actual Science and The Art and Practice of Science.)
If you enjoyed this post, you'll love our new book Do Hard Things: Why We Get Resilience Wrong and The Surprising Science of Real Toughness! It provides a roadmap for navigating life’s challenges and doing so in a way that makes us happier, more successful, and, ultimately, better people.
For a limited time, it's over 30% off! Get your copy today!