Speaker 1: Welcome to BrainStuff, a production of iHeartRadio.

Speaker 1: Hey BrainStuff, Lauren Vogelbaum here, and this is a classic episode from our archives. This one goes into unconscious biases. We all have them, and many of us are working on them. But what happens when our doctors have them?

Speaker 1: Hey BrainStuff, Lauren Vogelbaum here. Years of playing basketball had given Damon Tweedy some bum knees. When the swelling didn't go down on his left knee after a few days, he went to an urgent care clinic for treatment. It was his day off, and he was dressed in a T-shirt and sweatpants. He remembers, "The doctor never looked at me. He just had me stand up, looked at my knees, and then said, 'You'll be okay, take it easy.' He never even asked what kind of job I had." What if Tweedy's job required a lot of moving around? In a way, it did: Damon Tweedy is a doctor himself, and once he made that clear to the physician who was treating him, everything changed. The doctor made eye contact and started asking him questions. He even took Tweedy to get an X-ray.
Speaker 1: Tweedy said, "So it was an example of two different levels of care. I was two different people. The first time, I was Damon Tweedy, random Black guy, not to be taken seriously. In the second case, I was Damon Tweedy, MD, and worthy of the same care as anyone else." Tweedy told his story at the Decatur Book Festival in Georgia and has written a book that is part memoir and part discussion of bias in medicine, called "Black Man in a White Coat." Not every medical misstep has to do with the accidental slip of a scalpel or a medication dosing error. The unconscious biases that everyone possesses to one degree or another can impact how a doctor cares for a patient. Wholly separate from personally accepted prejudices like overt racism or homophobia, unconscious biases are just that: biases that we don't even know we have, yet they can impact how we treat others. We also spoke with Renee Salazar, MD, a professor of clinical medicine and the director of diversity in the Department of Medicine at the University of California, San Francisco.
Speaker 1: She put it this way: "They're so deep within our psyche that we're unaware of their existence." And we spoke with Gordon Wallace, MD, of the Canadian Medical Protective Association via email. He explained that cognitive biases, or distortions of thinking, are hardwired functions of the human brain, and they can occasionally interfere with a doctor's ability to reach a correct diagnosis. Racial bias is probably the most commonly studied type. However, it's possible to harbor unconscious biases against all kinds of people due, for example, to their body weight, gender, or sexual orientation. Cognitive bias isn't as obvious or easy to pinpoint in a clinical setting because it isn't intentional. Many studies have been conducted to determine whether or not doctors possess unconscious biases, but one published in the Journal of General Internal Medicine took it a step further to measure how these biases would actually affect treatment. In the study, physicians using an online tool were presented with randomized Black and white patients showing signs of coronary artery disease.
Speaker 1: The doctors assessed the patients and recommended a course of treatment for each, but the results showed that doctors more often suggested thrombolysis, a treatment to break up blood clots, to the white patients, while the Black patients were left with less aggressive options. The researchers drew the conclusion that unconscious biases can impact the types of treatments prescribed to patients even when they present the same symptoms as others. So if doctors are completely unaware of their biases, how can they possibly change their patient care strategies? Many turn to the Implicit Association Test, a respected tool that assesses and reports on unconscious bias. Dr. Salazar said, "What we find most often is there's a disconnect between what people explicitly feel and what they feel unconsciously." Once the results are available, it's easier to be aware of personal cognitive biases and take steps to minimize them. Many medical schools and hospitals are establishing curricula to better train doctors on how to avoid the pitfalls of cognitive bias, offering seminars and encouraging the use of the IAT assessment tool.
Speaker 1: The Canadian Medical Protective Association also backs recommendations by expert Dr. Pat Croskerry, an emergency physician and psychologist at Dalhousie University in Halifax, Nova Scotia. Croskerry suggests group decision making and consultation, along with mindful reflection and slowing-down strategies, to help the doctor deliberately transition from intuitive (a.k.a. biased) thinking to a more analytic mode. Following checklists and computerized decision support systems also helps to remove the human element, and experts suggest abiding by general rules of thumb to avoid the impact of bias. For example, anyone exhibiting specific neurological symptoms should always have their blood sugar tested. Self-awareness is also key to avoiding medical bias. Salazar explains, "Just by knowing that these biases are there, we can really take steps to reduce the impact. Let me stop that process right now and make sure that I go in with a clean slate and provide care with as open a mind as possible." From a patient perspective, it's not always going to be easy to figure out if a medical provider is unconsciously biased against you or a family member.
Speaker 1: To avoid being swept under the rug, ask questions and document the answers, and never be afraid to request an additional opinion or consult. Doctors aren't the only people who experience unconscious bias. You can take an Implicit Association Test online at implicit.harvard.edu to find out your true opinions on a variety of issues, including sexual orientation, race, and gender. The results could help you identify areas where you might benefit from being more intentional, which, let's face it, we've all got a few.

Speaker 1: Today's episode is based on the article "How Do Doctors' Biases Affect Your Health Care?" on HowStuffWorks.com, written by Alia Hoyt. BrainStuff is a production of iHeartRadio in partnership with HowStuffWorks.com, and it's produced by Tyler Klang. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.