Double-Blind
We’ve all seen the studies. In the past few years, there’s been an avalanche of evidence. Paper after paper has shown widespread racial and ethnic inequality in the delivery of healthcare, with statistically significant impact on patient-provider interactions, treatment decisions, treatment adherence, and health outcomes. And since we are good, evidence-based practitioners, we accept that some form of inequity or bias must exist. What we often deny, though, is that we are part of the problem.
We come to this conclusion based on our own personal experience. After all, we feel compassion and caring for all of our patients. When we examine our thoughts, we find no conscious bias. We make our decisions based on the available data and on best clinical practice as we know it. Maybe other healthcare providers let bias influence their care, but not us, right?
An attending of mine once warned me that the most compelling (and therefore most misleading) evidence is our own anecdotal evidence. What we think and feel is so vivid and readily knowable to our conscious minds that it can blind us to how we actually behave. And in this case, it blinds us twice: it keeps us from seeing the privilege we grant to some, and the indifference we inflict on others.
Back when the human brain was evolving, and knowing the difference between friend and foe had mortal consequences, we developed a compulsive need to divide all people into “us” or “them.” That compulsion goes a long way toward explaining our current political polarization, but it also influences us in more subtle ways. It’s the software program running in the background as we go about our daily business.
Imagine yourself practicing medicine on a close family member (setting aside for a moment the ethical pitfalls that would entail). Would you treat them differently? Might you take a more thorough history? Might you be a little slower to dismiss some diagnoses? Would you go the extra mile to make sure they got the very best care? Would you follow up with them a little more closely? Most of us, if we’re honest, would say yes.
Okay, how about a close friend? Or the spouse or child of a close friend? Or a long-time co-worker? What about the star of your favorite sports team? Or an actor or politician you admire? Or another physician? How about a patient who went to your college and is now a professor there? Or someone who shares your ethnic background and grew up in your hometown?
As soon as we label someone as one of “us,” we unconsciously grant them special status. The closer the connection, the more special the treatment.
Now let’s look at the flip side. Do you give that same level of care and consideration to the psychotic patient who’s screaming in the waiting room? Or the one who’s threatening you because you won’t give them an opioid prescription? Probably not.
How about the patient who hasn’t bathed in months, and who doesn’t know why they’re there? Do you leave the exam room a little sooner? Or maybe the patient who speaks no English, and the interpreter didn’t show up. Do you ask one or two fewer questions? Or the person whose ideas about physiology and anatomy differ radically from your own? Do they get the gold-star, friends-and-family version of care? Or does your frustration and stress make you a little less patient, and a little less thorough?
It’s easy to believe that our behavior in those situations is governed by external forces, rather than bias. But one of the most common mistakes we make is the Fundamental Attribution Error. We tend to attribute our own behavior to circumstances beyond our control, while we blame the behavior of others on their personality. The truth is, both external circumstances and internal characteristics shape everything we do.
The transition from “us” to “them” is a long, slippery slope, and we don’t pay much attention to what part of it we’re on at any given moment. That lack of consciousness is our defense. “If I’m not thinking biased thoughts, how can I be biased?” But by ignoring the “us” vs “them” program that’s running in the background, we allow our hidden biases to go unchecked.
Bias doesn’t have to wear a white hood or tattoo a swastika on its arm. Sometimes it’s just a slightly less thorough exam, or a failure to ask if there are any questions. Bias can be a series of tiny omissions and miscommunications that add up across an entire healthcare system until people get hurt, and people die.
Yes, the vast majority of us are kind, and ethical, and trying as hard as we can. We are also biased, because we’re human. Until we learn to see that bias, and consciously work to mitigate it, we will continue to cause harm.