by Robert
In one of the podcasts I regularly listen to, I came across an interesting topic: cognitive biases (i.e., systematic, faulty inclinations in perceiving, remembering, thinking, and judging) and how you should deal with them. In this post, however, I don’t want to give a lesson in bias theory, but rather point out what I see as a wrong approach to biases and what a much better one could be.
My point is: You are constantly told you must eliminate your biases to stay objective, but, at least for me, it’s like telling someone who has the hiccups, “Stop it!”—it simply doesn’t work.
We all have biases, and for a good reason. Biases literally help us survive. Without these “errors” in perception and thinking, it would be impossible to make decisions. If we were 100% rational, we would fail when deciding whether to wear the red or green socks today—due to the lack of objective decision criteria—and, while brooding about it, we would starve miserably. Something within us has to develop a preference for red or green on that day.
On the other hand, errors are of course things we want to avoid, and the same goes for faulty perception. Hence, the demand to eliminate biases is particularly understandable in research: we don’t want our inner selves to take away our objectivity. And I fully support this intention, of course. I just see the “how” a bit differently. Instead of wanting to eliminate biases, we should actively manage them.
This is especially important in research, because otherwise serious harm can occur. Take the example of Andrew Wakefield. Wakefield was a true believer in a causal link between vaccination and autism and therefore had a strong confirmation bias, i.e., he was only looking for evidence that supported his theory. In his paper, published in The Lancet, he claimed that the measles, mumps, and rubella vaccine was associated with colitis and autism spectrum disorders in children. However, subsequent investigations revealed that Wakefield’s research was not only flawed but also fraudulent, and that he had manipulated evidence.
The fallout from this fraudulent study was significant. Vaccination rates dropped sharply, leading to outbreaks of measles and mumps that resulted in thousands of deaths and serious permanent injuries. Despite the study’s retraction in 2010 and Wakefield being struck off the UK medical register, the false claims continue to fuel vaccine hesitancy to this day.
But how do you manage your biases, then? In my opinion, you should try to reduce the impact of biases instead of trying to avoid being biased. Active bias management can look like this:
Acknowledge That You Have Them
First and foremost, it’s crucial to recognize that having biases is not a personal flaw but a universal condition of being human. It’s like being aware that you need glasses to see properly—once you accept that fact, you can start adjusting your approach to gain more clarity. When I say “acknowledge that you have them,” I mean making it a daily habit to remind yourself that your perception of reality is never 100% neutral. Many of us like to think we’re immune to bias, especially when we consider ourselves thoughtful, open-minded individuals. A simple way to keep this top of mind is to literally post a visible note somewhere you’ll see it every day—on your mirror, laptop, or fridge. That note can be as blunt as “You are biased!” or something gentler like “Remember to question your assumptions.” Either way, creating a constant reminder that our interpretations of data, and even our everyday choices, are colored by hidden mental shortcuts helps you to manage your biases.
To deepen this step, you can schedule team workshops on bias, making sure everyone in your lab or department learns about the major types of cognitive biases and how they creep into the scientific process. By creating a shared language to talk about confirmation bias, anchoring, or availability bias, you’ll find it much easier to spot when one of these patterns threatens to steer your project. If everyone knows the vocabulary, pointing out “Hey, this might be anchoring bias” or “Could we be falling into confirmation bias here?” becomes normal, expected, and incredibly useful. If you are interested in such a team workshop, just reach out to us.
Explore Your Biases and Understand How They Manipulate You
Once you acknowledge that biases exist, it’s time to pin down precisely how they affect your decisions. Are you the type who only sees data that backs your cherished hypothesis? Do you have a tendency to focus on dramatic or recent findings at the expense of older, equally valid research? Reflect on past projects and be honest about whether you brushed aside inconvenient results a bit too quickly. By digging into these possibilities, you gain a sense of how your thought patterns shape your decisions. And this isn’t just some dry academic exercise—it can be surprisingly eye-opening.
Systematic documentation of decisions can also help you stay on top of your biases. In a detailed “bias log,” you note the moment you sense a strong “gut feeling” and later assess whether it led you astray. Or, if you’re more methodical, you can use checklists that prompt you to question whether you’re unconsciously ignoring contradictory data. Having a real paper trail of your choices makes it much easier to spot when personal leanings might be overshadowing objective judgment.
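As an illustration, here is a minimal sketch of what such a bias log could look like, assuming a plain CSV file per project; the file name bias_log.csv, the column names, and the helper log_decision are placeholders I chose for this example, not a prescribed format.

```python
# Minimal bias-log sketch: one CSV row per decision, reviewed later.
import csv
from datetime import date
from pathlib import Path

LOG_FILE = Path("bias_log.csv")
FIELDS = ["date", "decision", "gut_feeling", "evidence_considered", "later_assessment"]

def log_decision(decision, gut_feeling, evidence_considered, later_assessment=""):
    """Append one decision to the bias log; fill in later_assessment at review time."""
    is_new = not LOG_FILE.exists()
    with LOG_FILE.open("a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if is_new:
            writer.writeheader()
        writer.writerow({
            "date": date.today().isoformat(),
            "decision": decision,
            "gut_feeling": gut_feeling,
            "evidence_considered": evidence_considered,
            "later_assessment": later_assessment,
        })

# Example entry: record the hunch now, judge it at the next review.
log_decision(
    decision="Excluded sample #14 as an outlier",
    gut_feeling="The value 'looked wrong' compared to my hypothesis",
    evidence_considered="Only eyeballed the plot; no formal outlier test",
)
```

The tooling is beside the point: a spreadsheet or a page in your lab notebook works just as well, as long as the gut feeling and the later assessment end up next to each other.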
Encourage Open Discussions
One of the best ways to poke holes in your own bias bubble is to invite others in. If you’re surrounded by people who all share the same assumptions, e.g., your team members and like-minded colleagues in your department, it can easily become an echo chamber. That’s why open discussion, and even constructive confrontation, is essential.
Whether it’s in your research or a casual conversation with friends, encourage people to challenge your viewpoint and ask questions like “Why do you think that?” or “Could there be another angle?” Each time you encounter a well-reasoned objection, you’re nudged to reassess your own perspective. Over time, this constant recalibration helps reduce the distorting effects of bias.
To make sure these discussions don’t happen by accident but become a consistent practice, set up peer-led bias checks and structured feedback sessions. For instance, you could designate one person at each meeting to be the “devil’s advocate” who challenges the group’s assumptions or calls out any tendency to discount conflicting evidence. This approach can be surprisingly freeing because it normalizes the act of disagreeing. Instead of feeling like a naysayer, the devil’s advocate is simply fulfilling a valuable role. Add in collaborators from different fields or cultural backgrounds, and you’ll see how the influx of fresh perspectives can illuminate biases none of you even realized you had.
Use Structured Decision-Making (Including Preregistration)
One of the best ways to reduce the pull of biases is to embed structure into your research process. Frameworks like checklists, rubrics, and standardized forms can keep you from veering off-course just because a particular result excites you. When you rely on a set methodology at each step—like how to handle outliers or which statistical tests to use—you minimize opportunities for subjective tinkering.
Preregistration is another standout tool. By documenting your hypotheses, methods, and analyses before gathering data, you shield yourself from the temptation to cherry-pick results once you see them. This preregistration doesn’t need to be public; you can establish such a process within your department, for example. The only important thing is that you communicate your setup to somebody before the project starts. Coupled with standard operating procedures (SOPs) for data handling, this makes it significantly more difficult for hidden biases to twist your findings. Systematic documentation not only ensures transparency but also gives you a clear decision trail if you ever need to reevaluate your process later.
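To illustrate, a lightweight, non-public preregistration could be as simple as freezing the plan to a file and sharing its hash with a colleague before data collection begins; the field names, the file preregistration.json, and the hashing step below are assumptions made for this sketch, not an established standard.

```python
# Sketch of a non-public preregistration: write the plan down before data collection,
# then record its hash (e.g., in an email to a colleague) so later edits are detectable.
import hashlib
import json
from datetime import datetime, timezone

prereg = {
    "registered_at": datetime.now(timezone.utc).isoformat(),
    "hypotheses": ["H1: The treatment group shows higher expression of gene X than the control group."],
    "sample_size": 40,
    "exclusion_criteria": ["Samples with an RNA integrity number below 7 are excluded."],
    "planned_analysis": "Two-sided Welch's t-test, alpha = 0.05, no interim analyses.",
    "outlier_handling": "No post-hoc exclusions beyond the criteria listed above.",
}

# Freeze the plan to disk ...
with open("preregistration.json", "w") as f:
    json.dump(prereg, f, indent=2)

# ... and send this hash to a colleague before the first data point is collected.
digest = hashlib.sha256(json.dumps(prereg, sort_keys=True).encode()).hexdigest()
print("Preregistration hash:", digest)
```

Note how the checklist items from the previous paragraph, such as the outlier rule and the choice of statistical test, become fixed fields that can no longer be adjusted quietly once the data is in.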
Expose Yourself to Different Perspectives
The human mind gravitates toward what’s familiar, which is why reading the same journals and talking to the same colleagues can solidify biases without you even realizing it. If you want to break out of that loop, make a habit of exploring viewpoints that diverge from your own.
When you immerse yourself in unfamiliar viewpoints, it’s like learning a new language—suddenly, you have a richer vocabulary to interpret the world around you. This expansion of your mental horizons doesn’t necessarily mean you’ll abandon your original positions, but it does mean you’ll hold them with a bit more humility and a lot more understanding.
Don’t Get Paralyzed by Overthinking
While it’s crucial to be aware of biases and implement strategies to mitigate them, you don’t want to second-guess every single thought and action. Most of what you do is grounded in sound, robust, and reliable methodology and objective logic. The point of acknowledging bias isn’t to question each and every step to the point of paralysis; it’s to remain open to the possibility that you might be missing something. Constant self-doubt can stall progress and erode confidence in your own skills. Science itself also has a built-in corrective system of peer review, replication studies, and open data sharing. If you find yourself overthinking, step back and recognize that a large portion of your work is indeed “objective to a high degree.” By remaining conscious of biases without obsessing over them, you get the best of both worlds: intellectual rigor and tangible progress. You need to find the sweet spot between healthy skepticism and trusting your own training and experience.

Cognitive biases help us navigate a complicated world, but they can also warp your decisions if you aren’t careful. Each of the steps above becomes even more potent when reinforced by team initiatives: hold workshops to normalize bias awareness, systematically document your decisions, regularly assign “bias oversight” roles, and don’t be afraid to bring in external voices that might see what you’re missing.
– Robert