
How bias influences the way we treat patients. Perspectives from the US and UK.





Richard Feynman once wrote, “Scientific knowledge is a body of statements of varying degrees of certainty — some most unsure, some nearly sure, none absolutely certain.” Scientific curiosity, not scientific knowledge, must be the focus for clinicians to excel. School and continuing education courses are great for gathering new information, but that information can be applied in many ways, and clinicians receive little formal education on heuristics, biases, and cognitive fallacies.


Clinicians bring their own experiences and perspectives to clinical situations, which leads to differing opinions on the same information. It is imperative to know how to use the information we are given and how to navigate cognitive traps. Without a foundation for understanding information, our biases will take over. Relatedly, we need to be comfortable with doubt and uncertainty.


I recently had the opportunity to host a #PhysioTalk tweet chat with clinicians and students in the US and UK on the topic of cognitive fallacies. It was an engaging conversation and an enriching experience for me. Here are the five questions we asked, along with my answers and a couple of great insights from other physical therapists.



Q1: What is an example of confirmation bias you have experienced in a healthcare setting? It could come from a clinician or a patient.


“Confirmation bias is ‘tunnel-vision’ style searching to support an initial diagnosis while ignoring potential data which will reject the initial hypotheses.” This is how Dale Whelehan, a physiotherapist and researcher from Ireland who has published on this topic, succinctly describes confirmation bias.


Confirmation bias can occur in the clinic during the exam or in the research we do before and after. It is common to search for information that builds on our current knowledge rather than risk tearing it down. As a new graduate, I would often hunt for studies on PubMed that strengthened my resolve, as opposed to appraising all the available research. To combat the bias, we need to channel our inner mathematician.


In his book How Not to Be Wrong: The Power of Mathematical Thinking, Jordan Ellenberg writes, “When you’re working hard on a theorem you should try to prove it by day and disprove it by night.” He is a mathematician, so theorems are his game, but the same concept applies to any field. The best way to combat confirmation bias is to try to prove yourself wrong.


I experience the most clinical growth when I research the opposite side of my current treatment approach. For example, if you use dry needling, research all the reasons why you shouldn’t and try to convince yourself not to. If you can’t, great, you now understand the technique better. If you end up convincing yourself, drop the tool. No tool is undroppable.



Don't get too attached to clinical "tools"

Q2: What is the most common bias you fall victim to?


In addition to confirmation bias, the sunk-cost fallacy and the availability heuristic are common among clinicians. The availability heuristic is when we give greater weight to information that comes to mind more easily. Who we are surrounded by will influence our thinking. The culture of the practice we work for, the training available to us, the courses we attend, and the school we graduated from all shape our experiences, our perspectives, and the information we routinely hear.


The sunk-cost fallacy amplifies our commitment to past investments, and I frequently see it affect new graduates. School is designed primarily for students to pass their board examination. Unfortunately, some of the information taught is outdated and no longer standard practice. But it is challenging for a new graduate to accept that a skill that required significant past investment — time, money, effort — is no longer relevant. We want to justify our investments and use the tool regardless.


As a couple of physios pointed out, the cost of school is typically not an issue outside the US. But sunk cost does not only refer to financial investment; time and effort easily trigger the fallacy as well. I still occasionally fall victim to this bias. It’s challenging when past investments don’t provide the value we think they should. Instead of ruminating on the past and what could have been done differently, simply learn from it and then focus on future investments.


“A rational decision maker is interested only in the future consequences of current investments. Justifying earlier mistakes is not among the Econ’s concerns.” — Daniel Kahneman in Thinking, Fast and Slow



Q3: How do you confront someone clearly demonstrating a cognitive bias or fallacy when having a clinical debate?


I struggle with this. Fact punching doesn’t work. Some people view biases and heuristics as theoretical; they only want hard numbers. Another common response is digging heels in and becoming resistant to any argument made.


I find the best strategy is to seek understanding and ask more questions. Simply pointing out a bias rarely works, as it can still come across as an attack. It can also give the impression that you consider yourself more intelligent for bringing psychology and metacognition into the conversation.

Instead of focusing on others, begin by learning how to address your own biases. Whelehan provided five specific strategies for addressing biases in the chat:

  1. Skills development and reflective practice (to reduce over-reliance on limited information)

  2. Training and simulation and ‘cognitive forcing’ situations (i.e. confronting your own biases)

  3. Technology for pattern recognition to assist decision making

  4. Shared decision-making with patients

  5. Culture change and open disclosure in human factor issues (e.g. fatigue) and in error-making

It is important to remember that we all use heuristics and have biases; it is unavoidable. Before attacking someone else’s position and pointing out their cognitive fallacies, look internally at your own potential biases. Our viewpoints are not comprehensive, which brings us back to seeking to understand. I recommend adopting the philosophy of Charlie Munger: “I never allow myself to hold an opinion on anything that I don’t know the other side’s argument better than they do.”



Q4: How can cognitive fallacies, biases, and heuristics best be integrated into clinical education?


Heuristics, biases, and metacognition need to be a foundation of our education. Without that understanding, we do not know how to apply all the information we gather.


It gets better with time

Whelehan once again gives us great suggestions:

  1. Train the trainer (important to differentiate here between ‘education’, which is the theoretical understanding of a concept, and ‘training’, i.e. imparting a set of skills or ‘toolkit’ for practitioners to use)

  2. Have peer review of where biases are identified

  3. Work as ‘partners’ with students providing them with effective modelling of when best to use heuristics. Discuss scenarios with students about their own personal experiences with biases in clinical practice and create a ‘cognitive roadmap’ for students to aid decision making

  4. Promote a culture of learning from error. Our colleagues in pharmacy are brilliant at this, and ultimately have personal/organizational barriers preventing error-making in this regard. Medicine, allied health professionals and nursing alike should look at pharmacy as a framework for practice

Educators need to acknowledge their own biases and be willing to discuss doubt and uncertainty. I believe those two qualities — doubt and uncertainty — are far more important than knowledge. Without doubt and uncertainty, we stop asking questions. Heuristics are valuable tools, but they need to be constantly updated, and we need to recognize their potential for errors (System 1 vs. System 2 thinking).



Q5: What are some heuristics that are beneficial in a clinical setting?


Mental shortcuts in general allow us to assess clinical situations more rapidly. The key is taking the time to activate System 2 and reflect when we are able. Additionally, practicing recognizing biases can help us navigate cognitive traps more effectively.


For example, the representativeness heuristic can speed up an examination and eliminate unnecessary tests. But we need to be wary of satisfaction of search — prematurely stopping an exam once we receive the first plausible answer.


Another example is the availability heuristic, which plays into the concept of learning versus performance. We will more successfully recall recent information, which can be beneficial if we prepare for an evaluation. But it can also narrow our focus too much and cause us to miss important details.



Where do we go from here?


At the end of the day, we cannot eliminate bias and heuristics, nor do we want to. Heuristics can help us make decisions more rapidly. The key is finding the balance between rapidity and accuracy. We must learn when to slow down and reflect. We must be vigilant and recognize when bias may hinder our decision making and clinical practice.

“Intelligence is not only the ability to reason; it is also the ability to find relevant material in memory and to deploy attention when needed.” — Daniel Kahneman, in Thinking, Fast and Slow



This blog post is available on 'Medium.' Please take a minute to give a 'clap' to the post to help it reach more readers. Thank you!
