1 00:00:15,480 --> 00:00:18,400 Speaker 1: Welcome to TechStuff. I'm Oz Woloshyn, and today I 2 00:00:18,440 --> 00:00:21,000 Speaker 1: want to start with a story. It's about a man, 3 00:00:21,480 --> 00:00:26,079 Speaker 1: Matthew Williams, who experienced such intense stomach pain that he 4 00:00:26,120 --> 00:00:29,080 Speaker 1: felt he had to go to the ER. He was 5 00:00:29,120 --> 00:00:32,720 Speaker 1: given laxatives and told that he was just constipated, but 6 00:00:32,760 --> 00:00:35,400 Speaker 1: when his symptoms worsened, he went to a new ER 7 00:00:35,800 --> 00:00:38,879 Speaker 1: for a second opinion, and it's there they realized that 8 00:00:38,960 --> 00:00:44,239 Speaker 1: his intestine is twisting on itself. This is serious, and 9 00:00:44,320 --> 00:00:49,040 Speaker 1: Matthew immediately goes in for life-saving surgery. But unfortunately 10 00:00:49,600 --> 00:00:52,720 Speaker 1: this isn't where his problems end. Every time he eats, 11 00:00:52,840 --> 00:00:57,840 Speaker 1: he has significant diarrhea and trouble keeping on weight. It's 12 00:00:58,120 --> 00:01:03,680 Speaker 1: completely life-altering. Matthew spends years talking to gastroenterologists and nutritionists, 13 00:01:03,680 --> 00:01:06,520 Speaker 1: but no one is able to help relieve his symptoms 14 00:01:07,200 --> 00:01:11,759 Speaker 1: until ChatGPT. Here's doctor Dhruv Khullar. 15 00:01:12,040 --> 00:01:15,559 Speaker 2: He puts his symptoms in and he tells the bot, 16 00:01:15,720 --> 00:01:18,639 Speaker 2: you know, these are the things that are bothering me most. 17 00:01:18,880 --> 00:01:21,920 Speaker 2: You know, when I eat certain foods, that's when it 18 00:01:21,959 --> 00:01:25,520 Speaker 2: really starts to trigger my abdominal pain, my diarrhea, et cetera. 19 00:01:26,080 --> 00:01:30,039 Speaker 2: And within seconds, ChatGPT comes up with this diagnosis that, hey, 20 00:01:30,760 --> 00:01:34,560 Speaker 2: the foods that you're describing are high in this compound 21 00:01:34,560 --> 00:01:36,920 Speaker 2: called oxalate, which he had never heard of, that's found 22 00:01:36,959 --> 00:01:39,360 Speaker 2: in leafy greens and other types of foods. And he 23 00:01:39,400 --> 00:01:42,039 Speaker 2: takes that information to a nutritionist and they design a 24 00:01:42,080 --> 00:01:44,479 Speaker 2: diet, and basically he has his life back. 25 00:01:44,760 --> 00:01:48,440 Speaker 1: On the one hand, this is absolutely incredible. ChatGPT 26 00:01:48,800 --> 00:01:52,640 Speaker 1: gave Matthew his life back. On the other hand, a 27 00:01:52,720 --> 00:01:55,680 Speaker 1: story like this makes you sort of question, what are 28 00:01:55,680 --> 00:01:59,919 Speaker 1: doctors actually for? It's this question that Dhruv, a doctor 29 00:02:00,240 --> 00:02:03,440 Speaker 1: and a journalist, attempted to answer in a recent article 30 00:02:03,480 --> 00:02:09,080 Speaker 1: for The New Yorker. Let's get right into it. It feels 31 00:02:09,080 --> 00:02:12,120 Speaker 1: like a seismic moment, right? Because, you know, my stepmother's a doctor, 32 00:02:12,160 --> 00:02:13,920 Speaker 1: I've got some friends who are doctors. I remember the 33 00:02:14,320 --> 00:02:17,200 Speaker 1: constant eye roll about Doctor Google, like patients coming in 34 00:02:17,320 --> 00:02:20,760 Speaker 1: thinking that they'd been able to, like, diagnose themselves with WebMD.
35 00:02:21,600 --> 00:02:24,560 Speaker 1: But now here's a story about a guy who was 36 00:02:24,760 --> 00:02:29,000 Speaker 1: totally failed by the medical profession, frankly right, and who 37 00:02:29,040 --> 00:02:31,600 Speaker 1: was able to do something which many people in the 38 00:02:31,639 --> 00:02:34,640 Speaker 1: medical profession still advise against, which was use ChatGPT to 39 00:02:34,680 --> 00:02:35,920 Speaker 1: diagnose himself. 40 00:02:36,160 --> 00:02:39,080 Speaker 2: Right, right. I mean, it is a seismic shift. And 41 00:02:39,240 --> 00:02:41,600 Speaker 2: you know, we can talk about how much the technology 42 00:02:41,639 --> 00:02:45,120 Speaker 2: has even advanced since he, you know, put in symptoms 43 00:02:45,120 --> 00:02:46,760 Speaker 2: a couple of years ago, and how there are now 44 00:02:47,040 --> 00:02:50,520 Speaker 2: specific models that are designed on healthcare information to give 45 00:02:50,639 --> 00:02:54,400 Speaker 2: even more effective diagnostic support. But I want to raise 46 00:02:54,560 --> 00:02:56,720 Speaker 2: just one other point here. I mean, we talked about 47 00:02:56,720 --> 00:02:58,919 Speaker 2: Matthew Williams; we should also talk about this other patient 48 00:02:59,040 --> 00:03:01,639 Speaker 2: that was in the article that I highlighted, and this 49 00:03:01,760 --> 00:03:03,720 Speaker 2: was a sixty-year-old guy who was just concerned 50 00:03:03,720 --> 00:03:05,720 Speaker 2: about how much salt was in his diet. A concern 51 00:03:05,760 --> 00:03:08,480 Speaker 2: that a lot of people, particularly with high blood pressure, 52 00:03:08,720 --> 00:03:12,160 Speaker 2: around the world probably have. He turns to ChatGPT, 53 00:03:12,320 --> 00:03:16,680 Speaker 2: asks for alternatives, and the chatbot gives him a suggestion 54 00:03:16,760 --> 00:03:20,799 Speaker 2: of something called sodium bromide, which is a kind of 55 00:03:20,840 --> 00:03:23,480 Speaker 2: an anti-seizure medicine that was used in the past 56 00:03:23,080 --> 00:03:27,079 Speaker 2: but had had significant toxicity and so isn't really used 57 00:03:27,160 --> 00:03:30,359 Speaker 2: much anymore. He orders this compound online based on the 58 00:03:30,400 --> 00:03:33,400 Speaker 2: advice he got from ChatGPT. He starts taking it, 59 00:03:33,440 --> 00:03:36,520 Speaker 2: it accumulates in his body, he starts hallucinating. He's kind 60 00:03:36,520 --> 00:03:38,520 Speaker 2: of out of his mind. He goes to the emergency room, 61 00:03:38,560 --> 00:03:41,200 Speaker 2: they find out that his bromide levels are, you know, 62 00:03:41,280 --> 00:03:44,560 Speaker 2: hundreds of times above normal, and it takes him weeks 63 00:03:44,600 --> 00:03:46,800 Speaker 2: to recover, where he gets it out of his system 64 00:03:47,120 --> 00:03:49,320 Speaker 2: and gets his life back. And so I just want 65 00:03:49,360 --> 00:03:52,240 Speaker 2: to put those two stories, you know, in the same narrative, 66 00:03:52,320 --> 00:03:55,040 Speaker 2: because it shows the tremendous power but also the tremendous 67 00:03:55,120 --> 00:03:57,600 Speaker 2: risk of turning to chatbots for health advice.
68 00:03:58,040 --> 00:04:00,640 Speaker 1: Now, you can probably guess this question, but I have to 69 00:04:00,680 --> 00:04:03,240 Speaker 1: ask you anyway: which of those two stories do you think 70 00:04:03,280 --> 00:04:05,360 Speaker 1: is more representative of where we are in terms of 71 00:04:05,480 --> 00:04:08,720 Speaker 1: using AI for self-directed healthcare? 72 00:04:09,080 --> 00:04:11,800 Speaker 2: Well, I think they both have some truth. I think 73 00:04:11,880 --> 00:04:15,040 Speaker 2: that as people get more familiar with how to use 74 00:04:15,080 --> 00:04:17,960 Speaker 2: these things and what the best approaches are to get 75 00:04:17,960 --> 00:04:20,120 Speaker 2: the information that you need, and as they get more 76 00:04:20,200 --> 00:04:23,000 Speaker 2: integrated into the healthcare system, as both patients but also 77 00:04:23,080 --> 00:04:26,479 Speaker 2: doctors get more comfortable with them being along for the 78 00:04:26,560 --> 00:04:28,960 Speaker 2: ride in clinical encounters, I think we're going to see 79 00:04:28,960 --> 00:04:30,960 Speaker 2: a lot more of that first story, a lot more 80 00:04:31,000 --> 00:04:35,120 Speaker 2: of the Matthew Williams type story, and not only that, 81 00:04:35,279 --> 00:04:38,560 Speaker 2: but a lot more of AI being able to be a 82 00:04:38,800 --> 00:04:42,320 Speaker 2: kind of co-navigator of the healthcare system, which, as we know, 83 00:04:42,520 --> 00:04:45,960 Speaker 2: is incredibly complex. It's hard to access; you know, you're not 84 00:04:46,440 --> 00:04:48,760 Speaker 2: always able to get the answers that you want or 85 00:04:48,839 --> 00:04:51,279 Speaker 2: need in a timely manner. And this is something that 86 00:04:51,320 --> 00:04:54,440 Speaker 2: I think could really support medical care if it's used 87 00:04:54,520 --> 00:04:55,120 Speaker 2: in the right way. 88 00:04:55,240 --> 00:04:58,640 Speaker 1: I saw the announcement about ChatGPT Health. I haven't 89 00:04:58,800 --> 00:05:01,719 Speaker 1: trialed the product myself or even heard a huge amount 90 00:05:01,720 --> 00:05:04,440 Speaker 1: beyond the headlines. Have you played around with 91 00:05:04,480 --> 00:05:06,719 Speaker 1: it, or heard any patient sort of accounts of using it? 92 00:05:06,839 --> 00:05:08,880 Speaker 1: How's it compare to the regular ChatGPT? 93 00:05:09,440 --> 00:05:11,599 Speaker 2: Yeah, so you know, it was released a few 94 00:05:11,720 --> 00:05:13,520 Speaker 2: weeks ago and it's still, I think, you know, most 95 00:05:13,560 --> 00:05:16,120 Speaker 2: people are still on waitlists to kind of use it. 96 00:05:16,400 --> 00:05:16,560 Speaker 1: You know. 97 00:05:16,600 --> 00:05:18,760 Speaker 2: The idea here is that you can upload your medical 98 00:05:18,800 --> 00:05:22,560 Speaker 2: records safely. You're able to put in, you know, if 99 00:05:22,560 --> 00:05:25,440 Speaker 2: you have an Apple Health app or other device, you 100 00:05:25,480 --> 00:05:27,840 Speaker 2: can put some of your metrics in there. The way 101 00:05:27,839 --> 00:05:30,360 Speaker 2: that they have presented what it's doing is it's again 102 00:05:30,440 --> 00:05:32,800 Speaker 2: supposed to be more of a guide than a diagnostician. 103 00:05:33,160 --> 00:05:34,920 Speaker 2: I think they've been explicit about the idea that it 104 00:05:34,960 --> 00:05:38,640 Speaker 2: shouldn't replace a medical professional.
But if you want additional 105 00:05:38,640 --> 00:05:42,239 Speaker 2: insights about your health, it may be useful in that way. 106 00:05:42,320 --> 00:05:44,320 Speaker 2: You know, as a general matter, I think, you know, 107 00:05:44,400 --> 00:05:47,640 Speaker 2: Claude and Anthropic have released something similar, and I think we're 108 00:05:47,640 --> 00:05:50,200 Speaker 2: going to see a lot more of this. I think 109 00:05:50,200 --> 00:05:53,200 Speaker 2: these things are helpful, but in a way limited. A 110 00:05:53,240 --> 00:05:55,800 Speaker 2: lot of people in the current kind of discussion think 111 00:05:55,839 --> 00:05:59,840 Speaker 2: that getting better health is about having more information, having 112 00:05:59,839 --> 00:06:03,239 Speaker 2: more personalized information, and that's part of the story. 113 00:06:03,320 --> 00:06:05,960 Speaker 2: You want to know, you know, what your specific risk 114 00:06:06,000 --> 00:06:09,599 Speaker 2: factors are, how your sleep patterns maybe are aiding or 115 00:06:09,880 --> 00:06:12,480 Speaker 2: diminishing the quality of your life, you know, what you 116 00:06:12,520 --> 00:06:15,040 Speaker 2: should be eating, et cetera. But a lot of health 117 00:06:15,120 --> 00:06:18,280 Speaker 2: is actually in behavior change. It's not just information. You know, 118 00:06:18,320 --> 00:06:21,920 Speaker 2: if you think about someone who's smoking cigarettes, that person 119 00:06:21,920 --> 00:06:24,119 Speaker 2: probably knows that it elevates the risk of lung cancer. 120 00:06:24,120 --> 00:06:26,039 Speaker 2: That's not a secret. And even if you told them, 121 00:06:26,400 --> 00:06:28,200 Speaker 2: your risk of lung cancer if you keep smoking is 122 00:06:28,200 --> 00:06:30,359 Speaker 2: twenty seven point two percent or whatever it might be, 123 00:06:30,960 --> 00:06:33,919 Speaker 2: that's not necessarily going to inspire behavior change. And so 124 00:06:33,960 --> 00:06:37,080 Speaker 2: I think part of the conversation is getting the right information, 125 00:06:37,480 --> 00:06:40,240 Speaker 2: getting the right diagnosis. But there's this whole other part 126 00:06:40,240 --> 00:06:43,159 Speaker 2: of medicine that I think the chatbots and AI are 127 00:06:43,200 --> 00:06:45,520 Speaker 2: still not as helpful with, which is how do we 128 00:06:45,560 --> 00:06:47,839 Speaker 2: get people to change their behavior, how do we manage 129 00:06:47,839 --> 00:06:49,880 Speaker 2: conditions over the long term, how do we deal with 130 00:06:49,920 --> 00:06:52,680 Speaker 2: the uncertainty that comes with a lot of diagnoses, and 131 00:06:52,720 --> 00:06:54,400 Speaker 2: how do we ultimately get better treatments. 132 00:06:54,800 --> 00:06:56,240 Speaker 1: Yeah, I want to come back to that, because I 133 00:06:56,240 --> 00:06:58,040 Speaker 1: think you had this great quote in the piece, which 134 00:06:58,120 --> 00:07:00,600 Speaker 1: was that, you know, being a doctor feels less like 135 00:07:00,640 --> 00:07:03,520 Speaker 1: being Sherlock Holmes and more like Sisyphus, in terms of 136 00:07:04,279 --> 00:07:06,440 Speaker 1: constantly pushing a boulder up a hill. So we'll come 137 00:07:06,440 --> 00:07:09,720 Speaker 1: back to that. But before we leave Matthew behind, I 138 00:07:09,760 --> 00:07:13,520 Speaker 1: mean, he said to you, I trust AI more than doctors. 139 00:07:14,240 --> 00:07:15,520 Speaker 1: I don't think I'm the only one.
140 00:07:15,360 --> 00:07:18,840 Speaker 2: That was striking. And I think part of it 141 00:07:18,880 --> 00:07:21,680 Speaker 2: is that we are in a moment where there's broad 142 00:07:21,960 --> 00:07:26,040 Speaker 2: distrust of institutions, and medicine is certainly not immune to that. 143 00:07:26,440 --> 00:07:29,120 Speaker 2: Part of that is earned mistrust. Part of that is 144 00:07:29,400 --> 00:07:32,960 Speaker 2: the behavior and the rhetoric of political actors. But you know, 145 00:07:33,000 --> 00:07:35,760 Speaker 2: we're in this general moment where people don't trust a 146 00:07:35,760 --> 00:07:39,840 Speaker 2: lot of expertise and institutional advice. The other part of 147 00:07:39,840 --> 00:07:41,400 Speaker 2: it is that, you know, as we talked about, his 148 00:07:41,400 --> 00:07:43,680 Speaker 2: story is one of kind of being failed by the 149 00:07:43,720 --> 00:07:45,920 Speaker 2: medical system in some way. That first time he went 150 00:07:45,960 --> 00:07:48,200 Speaker 2: to the emergency room all those years ago, he got 151 00:07:48,240 --> 00:07:52,680 Speaker 2: a misdiagnosis. He got laxatives for what they thought was constipation, 152 00:07:52,840 --> 00:07:55,560 Speaker 2: and that even could have made his condition worse, because 153 00:07:55,600 --> 00:07:57,960 Speaker 2: he had this twisting of the bowel. And then after 154 00:07:58,000 --> 00:07:59,840 Speaker 2: he had the surgery and got the correct diagnosis, no one 155 00:07:59,840 --> 00:08:02,880 Speaker 2: was able to pinpoint what was really bothering him. 156 00:08:03,320 --> 00:08:06,400 Speaker 2: And so I think, as a doctor, as part of 157 00:08:06,440 --> 00:08:10,000 Speaker 2: the medical profession, if we set up an antagonistic relationship 158 00:08:10,240 --> 00:08:13,600 Speaker 2: with artificial intelligence or companies that are trying to 159 00:08:13,640 --> 00:08:15,800 Speaker 2: make health care better and more efficient in some ways, 160 00:08:15,800 --> 00:08:18,040 Speaker 2: and we feel like we want to be the gatekeepers 161 00:08:18,360 --> 00:08:21,400 Speaker 2: of medical care, I think we are going to force 162 00:08:21,400 --> 00:08:23,640 Speaker 2: people into a situation where a lot of folks are 163 00:08:24,000 --> 00:08:25,960 Speaker 2: trying to decide, do I trust doctors more? Do I 164 00:08:25,960 --> 00:08:29,320 Speaker 2: trust AI more? What I'm hoping to push towards, as 165 00:08:29,360 --> 00:08:32,000 Speaker 2: a medical profession, is that we should be leaders in 166 00:08:32,120 --> 00:08:35,360 Speaker 2: trying to figure out how best to integrate AI into 167 00:08:35,400 --> 00:08:37,559 Speaker 2: medicine and healthcare. How do we make the most of 168 00:08:37,600 --> 00:08:40,760 Speaker 2: these technologies to do the things that we can't do? I'm 169 00:08:40,800 --> 00:08:43,280 Speaker 2: not able to, in fifteen or twenty minutes, go through 170 00:08:43,280 --> 00:08:45,760 Speaker 2: twenty years of medical records for the patient in front 171 00:08:45,760 --> 00:08:48,200 Speaker 2: of me and come up with a treatment plan. But 172 00:08:48,320 --> 00:08:51,200 Speaker 2: AI could synthesize that information and provide it.
We have 173 00:08:51,240 --> 00:08:53,000 Speaker 2: to make sure that the AI is doing that in 174 00:08:53,040 --> 00:08:55,560 Speaker 2: a valid way, that there aren't important mistakes, and so 175 00:08:55,600 --> 00:08:57,440 Speaker 2: there needs to be a lot of validation of that 176 00:08:57,480 --> 00:09:00,840 Speaker 2: type of synthesis. In summary, we really need to start thinking 177 00:09:00,840 --> 00:09:04,000 Speaker 2: about AI as a partner and how we integrate it 178 00:09:04,040 --> 00:09:05,559 Speaker 2: into the care that we're delivering to people. 179 00:09:06,240 --> 00:09:10,840 Speaker 1: I want to talk about diagnosis. You referenced 180 00:09:11,040 --> 00:09:14,720 Speaker 1: Kasparov versus Deep Blue, the famous nineties chess face 181 00:09:14,760 --> 00:09:19,440 Speaker 1: off between man and chess supercomputer, which kind of took 182 00:09:19,440 --> 00:09:24,320 Speaker 1: the world by storm. There was recently a diagnosis face 183 00:09:24,400 --> 00:09:28,040 Speaker 1: off between an expert diagnostician, who I think you went 184 00:09:28,080 --> 00:09:33,400 Speaker 1: to medical school and residency with, and a specialized 185 00:09:33,840 --> 00:09:37,560 Speaker 1: medical AI diagnostician, not just regular ChatGPT but 186 00:09:37,920 --> 00:09:40,320 Speaker 1: a system called Cabot. And you actually, I mean, there 187 00:09:40,360 --> 00:09:42,480 Speaker 1: was a dramatic scene, so describe it. 188 00:09:42,800 --> 00:09:45,120 Speaker 2: That's right. So I went to Harvard last year to 189 00:09:45,559 --> 00:09:48,959 Speaker 2: witness this kind of showdown with this bot called Cabot. 190 00:09:49,240 --> 00:09:52,320 Speaker 2: It is named after this famous physician named Richard Cabot. 191 00:09:52,480 --> 00:09:54,959 Speaker 2: That's a great, great play on words. But Richard Cabot 192 00:09:54,960 --> 00:09:58,040 Speaker 2: came up with this kind of way of teaching trainees 193 00:09:58,440 --> 00:10:01,079 Speaker 2: how to think through complex diagnostic cases. So in the 194 00:10:01,120 --> 00:10:04,240 Speaker 2: early nineteen hundreds he's a physician at Massachusetts General Hospital 195 00:10:04,240 --> 00:10:07,040 Speaker 2: and he starts a seminar series where an expert physician 196 00:10:07,080 --> 00:10:09,160 Speaker 2: gets up in front of the room and he's presented 197 00:10:09,200 --> 00:10:11,800 Speaker 2: a very complex case, and the details come out kind 198 00:10:11,840 --> 00:10:14,120 Speaker 2: of drip by drip. You talk about the symptoms and the 199 00:10:14,200 --> 00:10:16,839 Speaker 2: labs, and he's talking through, you know, how he's thinking 200 00:10:16,880 --> 00:10:20,120 Speaker 2: about getting to the right diagnosis. And this was kind 201 00:10:20,160 --> 00:10:24,240 Speaker 2: of the basis for what this Cabot was trained on. 202 00:10:24,320 --> 00:10:27,920 Speaker 2: So all the CPCs, these are these clinical pathological 203 00:10:27,960 --> 00:10:31,920 Speaker 2: case conferences. It's trained on this literature, of which now there's 204 00:10:32,000 --> 00:10:33,400 Speaker 2: hundreds and hundreds of these things.
205 00:10:33,559 --> 00:10:36,079 Speaker 1: Just to clarify for the audience, a CPC, as I understand, 206 00:10:36,240 --> 00:10:40,760 Speaker 1: is basically a kind of form or a way of 207 00:10:41,440 --> 00:10:45,000 Speaker 1: recording the key details of a medical case so that 208 00:10:45,000 --> 00:10:48,120 Speaker 1: it can be basically turned into a textbook example, so 209 00:10:48,200 --> 00:10:51,040 Speaker 1: that future doctors can learn how a diagnosis was come 210 00:10:51,080 --> 00:10:52,080 Speaker 1: to. Is that, is that kind 211 00:10:51,920 --> 00:10:56,439 Speaker 2: of fair? Mostly fair. So they're called clinical pathological conferences, 212 00:10:56,520 --> 00:10:58,520 Speaker 2: and the pathology part is that at the end of 213 00:10:58,520 --> 00:11:01,640 Speaker 2: the case, a pathologist usually has looked at what the 214 00:11:01,640 --> 00:11:03,600 Speaker 2: correct answer was. So maybe you did a biopsy, maybe 215 00:11:03,640 --> 00:11:05,240 Speaker 2: you got a blood test, maybe you had some stain, 216 00:11:05,480 --> 00:11:07,760 Speaker 2: and so you know the right answer. And these are 217 00:11:07,880 --> 00:11:10,920 Speaker 2: the most complex cases that came into the Massachusetts General Hospital, 218 00:11:11,200 --> 00:11:15,120 Speaker 2: and they were selected because they were educationally instructive. And 219 00:11:15,160 --> 00:11:17,199 Speaker 2: then these things were written up and they were published 220 00:11:17,200 --> 00:11:19,199 Speaker 2: in the New England Journal of Medicine. And they've been doing 221 00:11:19,200 --> 00:11:21,440 Speaker 2: this for more than one hundred years now, and so 222 00:11:21,480 --> 00:11:24,240 Speaker 2: they're kind of thought of as this gold standard of clinical 223 00:11:24,320 --> 00:11:26,640 Speaker 2: reasoning and diagnostics. If you can solve a CPC, you 224 00:11:26,640 --> 00:11:30,120 Speaker 2: could solve pretty much anything. And most doctors, you know, 225 00:11:30,120 --> 00:11:31,800 Speaker 2: if you just gave them the case, probably couldn't solve 226 00:11:31,840 --> 00:11:34,160 Speaker 2: a lot of these. These are challenging. And so 227 00:11:34,400 --> 00:11:36,240 Speaker 2: I go to Harvard and they're having this kind of 228 00:11:36,240 --> 00:11:39,080 Speaker 2: showdown between one of my residency classmates, who, you know, 229 00:11:39,120 --> 00:11:41,040 Speaker 2: I was always envious of because he was, you know, 230 00:11:41,080 --> 00:11:43,600 Speaker 2: he was kind of a god walking amongst the rest 231 00:11:43,600 --> 00:11:47,000 Speaker 2: of us residents in terms of his diagnostic acumen. So 232 00:11:47,040 --> 00:11:49,160 Speaker 2: he gets up there and he's given the case and 233 00:11:49,200 --> 00:11:52,079 Speaker 2: he walks through it. He creates these kind of four big, 234 00:11:52,320 --> 00:11:54,520 Speaker 2: kind of Venn diagrams of the things that he's thinking 235 00:11:54,520 --> 00:11:56,760 Speaker 2: about, in terms of labs, and he's thinking about imaging, 236 00:11:56,800 --> 00:11:59,760 Speaker 2: and he's thinking about symptoms. And in the middle he 237 00:12:00,160 --> 00:12:04,040 Speaker 2: points to this diagnosis called Löfgren syndrome, which is 238 00:12:04,080 --> 00:12:07,280 Speaker 2: a kind of an autoimmune condition, and everyone claps, and 239 00:12:07,320 --> 00:12:10,360 Speaker 2: it's an impressive display of diagnostic acumen as it is.
240 00:12:10,400 --> 00:12:13,920 Speaker 1: Revealed to be right, correct. The wizard has 241 00:12:14,000 --> 00:12:15,200 Speaker 1: taken the rabbit out of the 242 00:12:15,240 --> 00:12:18,439 Speaker 2: hat. Exactly, exactly, and it's something that, again, most 243 00:12:18,440 --> 00:12:21,520 Speaker 2: people probably wouldn't get. And then Cabot was given the 244 00:12:21,559 --> 00:12:25,560 Speaker 2: same prompt, and within five minutes it comes up with 245 00:12:25,640 --> 00:12:28,440 Speaker 2: a presentation. Within five minutes. Again, you know, most doctors 246 00:12:28,440 --> 00:12:30,480 Speaker 2: would get six weeks to do this and present it. 247 00:12:30,960 --> 00:12:35,160 Speaker 2: And it is, it's funny, it's kind of professional sounding 248 00:12:35,160 --> 00:12:39,160 Speaker 2: in terms of the voice, casual enough. It talks about 249 00:12:39,200 --> 00:12:43,000 Speaker 2: the salient features of the case, and ultimately it arrives 250 00:12:43,080 --> 00:12:46,559 Speaker 2: at the correct diagnosis, Löfgren syndrome. And there's kind 251 00:12:46,600 --> 00:12:48,719 Speaker 2: of this chill in the room when it comes to 252 00:12:48,800 --> 00:12:53,199 Speaker 2: that realization that this AI has basically solved something that 253 00:12:53,240 --> 00:12:57,560 Speaker 2: even might be difficult for expert diagnosticians, and it made 254 00:12:57,600 --> 00:12:59,280 Speaker 2: me think. You know, in the past, I'd been kind 255 00:12:59,320 --> 00:13:03,240 Speaker 2: of skeptical: can AI do the kind of very complex, 256 00:13:03,400 --> 00:13:07,240 Speaker 2: difficult cognitive work, the reasoning, that's required to come to 257 00:13:07,280 --> 00:13:11,280 Speaker 2: a diagnosis of this level? And now I basically had 258 00:13:11,280 --> 00:13:14,040 Speaker 2: to reassess that whole thought process, where I was thinking, 259 00:13:14,120 --> 00:13:15,640 Speaker 2: you know, can AI do it? And now I say, 260 00:13:15,720 --> 00:13:17,640 Speaker 2: how could we not be using this thing? You know, 261 00:13:17,840 --> 00:13:19,760 Speaker 2: how can we not be using this as a diagnostic 262 00:13:19,760 --> 00:13:22,560 Speaker 2: aid, if in fact it is as good as it 263 00:13:22,600 --> 00:13:23,080 Speaker 2: seems to be. 264 00:13:24,200 --> 00:13:27,040 Speaker 1: I mean, you mentioned the sodium bromide as a replacement 265 00:13:27,080 --> 00:13:31,520 Speaker 1: for sodium chloride and the gentleman who poisoned himself. Is 266 00:13:31,520 --> 00:13:34,200 Speaker 1: that an ongoing risk, or was that like a story which 267 00:13:34,280 --> 00:13:38,080 Speaker 1: kind of was characteristic of the earliest stage of generative AI, 268 00:13:38,160 --> 00:13:40,800 Speaker 1: which had more hallucinations? I mean, how much 269 00:13:40,800 --> 00:13:43,240 Speaker 1: of a hallucination risk is there in Cabot? 270 00:13:43,400 --> 00:13:46,640 Speaker 2: I think that it is definitely still an ongoing risk, 271 00:13:46,679 --> 00:13:49,400 Speaker 2: and so we should not be turning over our medical 272 00:13:49,440 --> 00:13:52,160 Speaker 2: reasoning to these bots just yet.
One of the key 273 00:13:52,240 --> 00:13:54,880 Speaker 2: insights of the piece that I wrote, and I continue 274 00:13:54,880 --> 00:13:58,520 Speaker 2: to feel this way, is that the effectiveness of a 275 00:13:58,559 --> 00:14:01,440 Speaker 2: lot of these models depends on the curation of the 276 00:14:01,480 --> 00:14:04,200 Speaker 2: information that is presented to them. So if you give 277 00:14:04,760 --> 00:14:08,160 Speaker 2: information to these bots that's organized in the right way, 278 00:14:08,160 --> 00:14:11,600 Speaker 2: that has the right salient features, that's talking about things 279 00:14:11,679 --> 00:14:14,520 Speaker 2: in a way that, you know, is legible, it 280 00:14:14,600 --> 00:14:16,760 Speaker 2: is very, very good at coming to the right diagnosis. 281 00:14:16,800 --> 00:14:19,760 Speaker 2: But a lot of medicine is actually about gathering those 282 00:14:19,800 --> 00:14:22,280 Speaker 2: clues, figuring out how to curate the case in your 283 00:14:22,280 --> 00:14:24,320 Speaker 2: own mind. So if you're talking to these models in 284 00:14:24,360 --> 00:14:27,360 Speaker 2: broad strokes or don't emphasize the right details, you can 285 00:14:27,760 --> 00:14:31,480 Speaker 2: very easily get a different and possibly incorrect diagnosis. And 286 00:14:31,480 --> 00:14:33,440 Speaker 2: in fact, I played around with Cabot and I gave 287 00:14:33,480 --> 00:14:36,760 Speaker 2: it Matthew Williams's case, and, yes, 288 00:14:36,840 --> 00:14:39,640 Speaker 2: I did. And when I gave it kind of broad strokes, 289 00:14:39,800 --> 00:14:43,440 Speaker 2: you know, without the sufficient level of detail, didn't emphasize things, 290 00:14:43,680 --> 00:14:45,520 Speaker 2: first of all, it just made up some stuff, made 291 00:14:45,560 --> 00:14:47,920 Speaker 2: up its vitals, and it came to the wrong answer. 292 00:14:47,960 --> 00:14:50,560 Speaker 2: It did not deliver the correct answer. When I gave 293 00:14:50,560 --> 00:14:53,360 Speaker 2: it the kind of exact transcript of what happened in 294 00:14:53,440 --> 00:14:55,680 Speaker 2: the emergency room, what the doctors had thought, what 295 00:14:55,760 --> 00:14:57,800 Speaker 2: the labs showed, what they ordered, how they were thinking about 296 00:14:57,840 --> 00:15:00,360 Speaker 2: the process, that's when it nailed the diagnosis. And so, 297 00:15:00,600 --> 00:15:02,520 Speaker 2: you know, one of the things that's really powerful about 298 00:15:02,520 --> 00:15:05,000 Speaker 2: a doctor using it is often they know what are 299 00:15:05,000 --> 00:15:07,880 Speaker 2: the salient things we should be emphasizing to the bots, 300 00:15:07,880 --> 00:15:11,320 Speaker 2: and patients might not always, you know, have that background 301 00:15:11,320 --> 00:15:12,800 Speaker 2: and that ability to do that. 302 00:15:13,320 --> 00:15:13,520 Speaker 1: You know. 303 00:15:14,000 --> 00:15:16,400 Speaker 2: The one thing that I want to get to as 304 00:15:16,440 --> 00:15:19,480 Speaker 2: well when we're thinking about doctors using these things is 305 00:15:19,520 --> 00:15:23,800 Speaker 2: this idea of cognitive deskilling, when you're offloading 306 00:15:24,200 --> 00:15:27,480 Speaker 2: the cognitive work of thinking through something yourself.
And it's 307 00:15:27,480 --> 00:15:29,680 Speaker 2: not so different than a student writing an essay and not 308 00:15:29,720 --> 00:15:32,600 Speaker 2: learning to write the essay themselves, or any other type 309 00:15:32,640 --> 00:15:35,280 Speaker 2: of, you know, professional who's using these bots. 310 00:15:35,720 --> 00:15:39,120 Speaker 2: But I think there's something really challenging about relying 311 00:15:39,160 --> 00:15:42,480 Speaker 2: on these things, using them effectively, but not letting them 312 00:15:42,520 --> 00:15:45,400 Speaker 2: replace your judgment and your thinking, because then we end 313 00:15:45,440 --> 00:15:49,040 Speaker 2: up with a generation of physicians who aren't thinking for themselves. 314 00:15:49,040 --> 00:15:51,480 Speaker 2: And when something goes wrong, or you're not able to 315 00:15:51,520 --> 00:15:55,120 Speaker 2: spot where the AI is making the incorrect judgment, then 316 00:15:55,200 --> 00:15:57,440 Speaker 2: patients could really be hurt. And so this idea of 317 00:15:57,520 --> 00:15:59,680 Speaker 2: cognitive deskilling is something that I think we really 318 00:15:59,720 --> 00:16:00,640 Speaker 2: need to guard against. 319 00:16:00,920 --> 00:16:02,840 Speaker 1: I think there was a physician quoted in your piece 320 00:16:02,840 --> 00:16:05,440 Speaker 1: who basically confessed to you that he'd gone through a 321 00:16:05,480 --> 00:16:08,400 Speaker 1: whole day at work and realized he hadn't made a 322 00:16:08,440 --> 00:16:12,200 Speaker 1: single diagnosis himself. He just outsourced every single one to AI. 323 00:16:12,560 --> 00:16:14,800 Speaker 2: Yeah. It was a medical student actually, and I think, 324 00:16:14,880 --> 00:16:16,160 Speaker 2: you know, we need to think a lot about how 325 00:16:16,160 --> 00:16:19,280 Speaker 2: we educate students in this environment. But he basically said 326 00:16:19,280 --> 00:16:21,760 Speaker 2: that every time he would step out of a patient's room, 327 00:16:21,920 --> 00:16:24,080 Speaker 2: he would basically put some version of what the patient 328 00:16:24,080 --> 00:16:26,480 Speaker 2: had told him into the bot and it would create 329 00:16:26,520 --> 00:16:28,440 Speaker 2: a list of potential diagnoses, and then he would go 330 00:16:28,480 --> 00:16:31,720 Speaker 2: present those diagnoses to the physician who's in charge, 331 00:16:31,760 --> 00:16:35,160 Speaker 2: the supervising physician. And he basically looked up one day 332 00:16:35,160 --> 00:16:37,560 Speaker 2: and said, I haven't thought about a single patient unassisted 333 00:16:37,720 --> 00:16:40,000 Speaker 2: the whole day. So he did, in my view, a very 334 00:16:40,000 --> 00:16:42,120 Speaker 2: smart thing. He said, I'm not going to use this 335 00:16:42,160 --> 00:16:43,720 Speaker 2: in the way that I have been doing it. 336 00:16:43,920 --> 00:16:46,000 Speaker 2: I'm first going to come up with the diagnosis, the 337 00:16:46,000 --> 00:16:48,560 Speaker 2: list of diagnoses, myself.
I'm first going to think about 338 00:16:48,800 --> 00:16:51,960 Speaker 2: what I think is happening, and then, almost as a 339 00:16:52,000 --> 00:16:55,600 Speaker 2: second opinion, use the AI and see, you know, where 340 00:16:55,760 --> 00:16:57,760 Speaker 2: are some things I might have missed, or, you know, what 341 00:16:57,840 --> 00:17:00,960 Speaker 2: is it emphasizing that I didn't emphasize. And that type 342 00:17:00,960 --> 00:17:03,880 Speaker 2: of second opinion consultation I think is a much more 343 00:17:03,880 --> 00:17:06,160 Speaker 2: effective way of using these bots at this stage. 344 00:17:06,640 --> 00:17:09,280 Speaker 1: I think one of the most striking sort of medical 345 00:17:09,320 --> 00:17:12,040 Speaker 1: AI stories of last year was this study that tested 346 00:17:12,040 --> 00:17:16,240 Speaker 1: whether doctors plus AI are better than doctors or AI alone. 347 00:17:16,280 --> 00:17:19,320 Speaker 1: This one study in fact demonstrated that doctors plus 348 00:17:19,520 --> 00:17:23,040 Speaker 1: AI were worse than AI on its own in terms 349 00:17:23,040 --> 00:17:26,560 Speaker 1: of diagnosing, to be fair. Now more recently 350 00:17:26,600 --> 00:17:29,280 Speaker 1: you have the people building these models, I mean, Dario 351 00:17:29,320 --> 00:17:33,160 Speaker 1: Amodei in particular, being much more stark about the future 352 00:17:33,560 --> 00:17:39,080 Speaker 1: role for humans in a world of AI decimation of 353 00:17:39,080 --> 00:17:41,280 Speaker 1: white collar work, of which I guess the medical profession 354 00:17:41,359 --> 00:17:44,320 Speaker 1: is in a sense part, white coats versus white collars. 355 00:17:44,520 --> 00:17:47,200 Speaker 1: But why did it happen? Why were doctors made 356 00:17:47,240 --> 00:17:49,880 Speaker 1: worse in this study rather than better by using AI? 357 00:17:50,240 --> 00:17:53,200 Speaker 2: Well, one thing to note is that this study was 358 00:17:53,240 --> 00:17:55,119 Speaker 2: done at a time where a lot of doctors hadn't 359 00:17:55,200 --> 00:17:57,960 Speaker 2: used AI in the past. And so one of the 360 00:17:58,000 --> 00:18:00,760 Speaker 2: things that that study raised for me, which, as you said, 361 00:18:00,760 --> 00:18:04,840 Speaker 2: showed that basically AI analyzing cases by itself 362 00:18:05,000 --> 00:18:08,520 Speaker 2: performed better than doctors that were using AI, is that 363 00:18:08,560 --> 00:18:11,280 Speaker 2: the doctors didn't really know how to use these things, 364 00:18:11,280 --> 00:18:14,520 Speaker 2: what the advantages and disadvantages of using them are, what specific 365 00:18:14,560 --> 00:18:17,000 Speaker 2: techniques they should be using. And so the initial study, 366 00:18:17,000 --> 00:18:19,280 Speaker 2: as you say, showed that they didn't get any better. 367 00:18:19,600 --> 00:18:21,360 Speaker 2: Now in a follow-up study, I should note that 368 00:18:21,760 --> 00:18:26,240 Speaker 2: the team suggested doctors use AI in specific ways. They 369 00:18:26,280 --> 00:18:28,280 Speaker 2: asked, you know, some doctors to read the AI's 370 00:18:28,280 --> 00:18:31,200 Speaker 2: output and then analyze their cases. They told other ones 371 00:18:31,240 --> 00:18:35,119 Speaker 2: to, you know, use specific ways to talk with the AI.
372 00:18:35,240 --> 00:18:37,040 Speaker 2: They asked other people, you know, to come up with 373 00:18:37,040 --> 00:18:39,159 Speaker 2: their own working diagnosis and ask for a second opinion. 374 00:18:39,800 --> 00:18:42,200 Speaker 2: And in this case, actually, the doctors did get better 375 00:18:42,359 --> 00:18:45,080 Speaker 2: at using the AI and coming up with the 376 00:18:45,160 --> 00:18:47,400 Speaker 2: right diagnosis. And so I think a lot of this 377 00:18:47,640 --> 00:18:51,280 Speaker 2: depends on how we're going to be interacting with the AI. 378 00:18:51,760 --> 00:18:54,520 Speaker 2: I'm open to the idea that for certain tasks and 379 00:18:54,560 --> 00:18:57,919 Speaker 2: for certain things, AI alone will just be better. I mean, 380 00:18:58,000 --> 00:19:00,000 Speaker 2: AI is going to be better, you know, as calculators 381 00:19:00,119 --> 00:19:02,879 Speaker 2: are better than humans at adding complex numbers. I mean, 382 00:19:02,880 --> 00:19:04,840 Speaker 2: that's possible to me. But I think a lot of 383 00:19:04,880 --> 00:19:07,760 Speaker 2: the important parts of medicine are still not able to 384 00:19:07,800 --> 00:19:10,879 Speaker 2: be automated. I think that a lot of it requires, 385 00:19:11,240 --> 00:19:14,280 Speaker 2: you know, judgment and understanding the context and the perspective 386 00:19:14,320 --> 00:19:17,360 Speaker 2: of the patient and what's actually happening. You know, I'll 387 00:19:17,359 --> 00:19:19,280 Speaker 2: give you an example. The other day, you know, I 388 00:19:19,320 --> 00:19:21,639 Speaker 2: had a patient who came in with a cat bite 389 00:19:22,000 --> 00:19:24,800 Speaker 2: on her arm, and it looked like maybe 390 00:19:24,800 --> 00:19:27,200 Speaker 2: there was an infection there. The infection was getting worse, 391 00:19:27,200 --> 00:19:29,960 Speaker 2: and red and swollen, and I started some antibiotics. It didn't 392 00:19:29,960 --> 00:19:31,440 Speaker 2: seem to be getting better, and so I got 393 00:19:31,560 --> 00:19:33,199 Speaker 2: kind of concerned, you know, so let me, let me 394 00:19:33,200 --> 00:19:38,520 Speaker 2: ask an AI. And basically its recommendation was pretty stark. 395 00:19:38,560 --> 00:19:41,639 Speaker 2: It was, this person could have something called necrotizing fasciitis, 396 00:19:41,680 --> 00:19:43,920 Speaker 2: which is kind of a flesh-eating bacteria. They could 397 00:19:43,920 --> 00:19:45,840 Speaker 2: lose their arm. You should call surgery. They need surgery, 398 00:19:45,920 --> 00:19:48,480 Speaker 2: and that's the next step. I had seen a lot 399 00:19:48,480 --> 00:19:50,840 Speaker 2: of these types of cellulitis. Sometimes they get a little 400 00:19:50,840 --> 00:19:53,440 Speaker 2: bit worse before they get better. You know, I needed 401 00:19:53,480 --> 00:19:56,119 Speaker 2: to be able to contextualize what it was giving me, 402 00:19:56,240 --> 00:19:58,119 Speaker 2: and so I didn't have that knee-jerk response. We 403 00:19:58,119 --> 00:20:00,240 Speaker 2: added a different antibiotic. It started to get better, and 404 00:20:00,880 --> 00:20:03,480 Speaker 2: over time the woman improved.
And so I think even 405 00:20:03,520 --> 00:20:05,720 Speaker 2: something as simple as that, you can't just, you know, 406 00:20:05,760 --> 00:20:08,320 Speaker 2: automatically take what the AI is telling you without using 407 00:20:08,320 --> 00:20:11,240 Speaker 2: your own judgment to figure out, you know, is this something? 408 00:20:11,560 --> 00:20:14,040 Speaker 2: What level of priority should I give this output that 409 00:20:14,080 --> 00:20:14,639 Speaker 2: I've been given? 410 00:20:40,520 --> 00:20:44,440 Speaker 1: After the break, Dhruv's answer to the question, if AI 411 00:20:44,520 --> 00:21:06,280 Speaker 1: can diagnose patients, what are doctors for? Stay with us. So, 412 00:21:06,359 --> 00:21:09,200 Speaker 1: if AI can diagnose patients, what are doctors for? 413 00:21:08,920 --> 00:21:11,800 Speaker 2: I think there's a lot of things that doctors 414 00:21:11,920 --> 00:21:14,879 Speaker 2: still do that AI is not able to do, and 415 00:21:14,920 --> 00:21:17,639 Speaker 2: I think won't be able to do in the near future. 416 00:21:18,080 --> 00:21:22,200 Speaker 2: You know, one of the really important things is managing uncertainty. 417 00:21:22,280 --> 00:21:26,119 Speaker 2: So even if the AI model gives you an answer, 418 00:21:26,160 --> 00:21:28,119 Speaker 2: and often it won't be one answer, it could be 419 00:21:28,200 --> 00:21:31,080 Speaker 2: several answers, or it's not clear what to do with 420 00:21:31,160 --> 00:21:34,480 Speaker 2: the information that you've been given. Maybe they have the 421 00:21:34,560 --> 00:21:37,280 Speaker 2: right answer, the right diagnosis, but there are many treatment 422 00:21:37,320 --> 00:21:40,120 Speaker 2: options, and some of those, you know, have trade 423 00:21:40,160 --> 00:21:42,800 Speaker 2: offs between efficacy and side effects and so on. So 424 00:21:43,000 --> 00:21:45,919 Speaker 2: there will always be this realm of managing uncertainty, and I 425 00:21:45,920 --> 00:21:48,800 Speaker 2: think you want a human, a clinician, someone who's been 426 00:21:48,800 --> 00:21:50,720 Speaker 2: trained to think about this, to help you with that. 427 00:21:51,440 --> 00:21:54,280 Speaker 2: The second is integrating values. I think a lot of 428 00:21:54,320 --> 00:21:56,399 Speaker 2: medicine, it's a science, but it's also an art, and 429 00:21:56,440 --> 00:21:59,359 Speaker 2: part of that art is to elicit patient values, to 430 00:21:59,560 --> 00:22:02,240 Speaker 2: understand what's important to them, what their preferences are, what's 431 00:22:02,240 --> 00:22:04,000 Speaker 2: important to them in the short run but also in 432 00:22:04,000 --> 00:22:07,480 Speaker 2: the long run, integrating those values with the best science available, 433 00:22:07,920 --> 00:22:10,639 Speaker 2: and coming up with a treatment plan. And the third is, 434 00:22:11,440 --> 00:22:14,439 Speaker 2: you know, you want someone to take responsibility for the 435 00:22:14,480 --> 00:22:18,200 Speaker 2: care that you're receiving, particularly for complex or challenging care. 436 00:22:18,640 --> 00:22:21,560 Speaker 2: We want someone who's kind of the quarterback. You want 437 00:22:21,560 --> 00:22:24,320 Speaker 2: someone who, you know, you've been given this cancer diagnosis, 438 00:22:24,320 --> 00:22:26,960 Speaker 2: you've been given a heart failure diagnosis, you know you're scared, 439 00:22:27,000 --> 00:22:30,359 Speaker 2: you're uncertain about what the path forward is.
Many people 440 00:22:30,359 --> 00:22:33,440 Speaker 2: in that situation would want someone who is able to 441 00:22:33,520 --> 00:22:36,800 Speaker 2: take responsibility for them, to guide them through the most 442 00:22:36,840 --> 00:22:40,280 Speaker 2: challenging aspects of diagnosis and treatment. And so I think 443 00:22:40,520 --> 00:22:42,480 Speaker 2: there are a lot of things, to say nothing of 444 00:22:42,560 --> 00:22:44,880 Speaker 2: the kind of inconsistencies that we've talked about in even 445 00:22:44,880 --> 00:22:47,720 Speaker 2: the best AI models, but those kind of human aspects 446 00:22:47,800 --> 00:22:49,680 Speaker 2: of care I don't see being replaced for 447 00:22:49,720 --> 00:22:50,200 Speaker 2: a long time. 448 00:22:50,520 --> 00:22:52,720 Speaker 1: Talk about that idea of being less like Sherlock and 449 00:22:52,760 --> 00:22:54,720 Speaker 1: more like Sisyphus. 450 00:22:54,760 --> 00:22:58,360 Speaker 2: Well, you know, that gets back to this idea that when people watch 451 00:22:58,440 --> 00:23:02,200 Speaker 2: shows like House, they might think that a lot of medicine, 452 00:23:02,320 --> 00:23:05,280 Speaker 2: or most of medicine, is kind of sitting around and 453 00:23:05,560 --> 00:23:08,280 Speaker 2: trying to crack the case, as if most of what we're 454 00:23:08,280 --> 00:23:12,119 Speaker 2: doing as doctors is figuring out, you know, this person, 455 00:23:12,280 --> 00:23:14,240 Speaker 2: you know, ate this meal three weeks ago, or you 456 00:23:14,280 --> 00:23:16,520 Speaker 2: look at their fingernail and it has this type 457 00:23:16,560 --> 00:23:18,840 Speaker 2: of dirt from this county, and that county has, 458 00:23:18,920 --> 00:23:20,439 Speaker 2: you know, this type of bacteria or something like this. 459 00:23:20,800 --> 00:23:23,720 Speaker 2: That's not really the way that medicine works. And 460 00:23:23,800 --> 00:23:26,200 Speaker 2: so with a lot of medicine, part of it is getting 461 00:23:26,200 --> 00:23:28,000 Speaker 2: the right diagnosis. I don't want to diminish that; we 462 00:23:28,080 --> 00:23:30,840 Speaker 2: have a huge problem with diagnostic errors in this country 463 00:23:30,880 --> 00:23:32,840 Speaker 2: and elsewhere, and a lot of what we need to 464 00:23:32,840 --> 00:23:35,120 Speaker 2: be doing is making sure we have the right diagnosis. 465 00:23:35,960 --> 00:23:38,440 Speaker 2: But even when you have the right diagnosis, let's say 466 00:23:38,440 --> 00:23:42,000 Speaker 2: someone has emphysema or heart failure or cancer, or sickle 467 00:23:42,000 --> 00:23:43,679 Speaker 2: cell disease. You have that, you know 468 00:23:43,720 --> 00:23:47,720 Speaker 2: what's going on. Managing that condition takes a tremendous amount 469 00:23:47,720 --> 00:23:51,280 Speaker 2: of work, a lot of balance between different organs. You 470 00:23:51,440 --> 00:23:54,119 Speaker 2: remove the fluid to improve someone's heart failure, it dehydrates 471 00:23:54,160 --> 00:23:57,000 Speaker 2: the kidneys. You've got to balance these things together. You know, 472 00:23:57,040 --> 00:23:59,840 Speaker 2: you're convincing someone to stop smoking who has emphysema, or, 473 00:24:00,240 --> 00:24:02,360 Speaker 2: you know, whatever it might be.
There's a lot more 474 00:24:02,480 --> 00:24:05,879 Speaker 2: just kind of brute-force work, and staying at it, 475 00:24:05,960 --> 00:24:09,080 Speaker 2: and making sure that people are plugged in after they 476 00:24:09,200 --> 00:24:11,359 Speaker 2: leave the hospital and that they have the supports that 477 00:24:11,400 --> 00:24:13,560 Speaker 2: they need. And so the point I'm trying to get 478 00:24:13,560 --> 00:24:16,320 Speaker 2: at there, with the Sherlock and Sisyphus distinction, is 479 00:24:16,320 --> 00:24:20,200 Speaker 2: that diagnosis will get you only so far. A lot 480 00:24:20,240 --> 00:24:23,240 Speaker 2: of medicine is about the management of the patient afterwards. 481 00:24:23,440 --> 00:24:28,320 Speaker 1: It strikes me there's two kind of parallel situations here, 482 00:24:28,400 --> 00:24:36,280 Speaker 1: or two different use cases for AI in the medical system. 483 00:24:36,960 --> 00:24:39,800 Speaker 1: I believe you're at Weill Cornell, right? That's right. So one 484 00:24:39,880 --> 00:24:43,439 Speaker 1: is, you live in New York, you're fully insured, and 485 00:24:43,480 --> 00:24:47,240 Speaker 1: you get to go to Weill Cornell when you're sick. How 486 00:24:47,280 --> 00:24:51,480 Speaker 1: do you and your doctor harness AI to get you 487 00:24:52,440 --> 00:24:57,080 Speaker 1: the best possible care? The other is, you live in 488 00:24:57,359 --> 00:24:59,280 Speaker 1: Kenya, as an example in your piece, but also 489 00:24:59,320 --> 00:25:02,240 Speaker 1: in many places in the US, and are not insured, 490 00:25:02,640 --> 00:25:06,320 Speaker 1: don't get to go to Weill Cornell, and AI might 491 00:25:06,320 --> 00:25:10,679 Speaker 1: be your only choice, or at least a major, like, 492 00:25:11,200 --> 00:25:14,160 Speaker 1: almost an alternative source of truth versus a complementary source 493 00:25:14,160 --> 00:25:17,560 Speaker 1: of truth. Can you talk about both of those sort 494 00:25:17,600 --> 00:25:21,320 Speaker 1: of healthcare situations and what's similar and what's different in 495 00:25:21,400 --> 00:25:25,359 Speaker 1: terms of how you think about applying AI within them? 496 00:25:25,080 --> 00:25:27,520 Speaker 2: Right. So, you know, one of the things that you're 497 00:25:27,520 --> 00:25:30,639 Speaker 2: getting at is many people just don't have access to 498 00:25:30,880 --> 00:25:32,480 Speaker 2: the type of care that we'd want them to have 499 00:25:32,520 --> 00:25:35,200 Speaker 2: access to. That's true, you know, internationally, and it's true 500 00:25:35,240 --> 00:25:38,600 Speaker 2: within the United States, either because they don't have adequate 501 00:25:38,640 --> 00:25:41,960 Speaker 2: forms of health insurance or because there are simply not enough 502 00:25:42,200 --> 00:25:44,920 Speaker 2: doctors in their area. So, you know, something like half 503 00:25:44,960 --> 00:25:47,440 Speaker 2: of US counties don't have a single psychiatrist, and so 504 00:25:47,480 --> 00:25:50,040 Speaker 2: you can imagine the challenges that creates for people with 505 00:25:50,280 --> 00:25:53,880 Speaker 2: mental health issues. The other point it raises is that 506 00:25:54,320 --> 00:25:57,600 Speaker 2: often, when we're talking about AI, we are hesitant 507 00:25:57,640 --> 00:25:59,959 Speaker 2: to use it or reluctant to use it
if it's 508 00:26:00,119 --> 00:26:03,760 Speaker 2: not perfect, if the error rate isn't functionally zero, without 509 00:26:03,800 --> 00:26:06,800 Speaker 2: recognizing or emphasizing that there's a lot of error that's 510 00:26:06,800 --> 00:26:08,800 Speaker 2: going on in the medical system right now. And so 511 00:26:08,960 --> 00:26:11,240 Speaker 2: often it's not the case that it has to be perfect, 512 00:26:11,240 --> 00:26:13,320 Speaker 2: but it should be better than, or as good as, what 513 00:26:13,440 --> 00:26:17,439 Speaker 2: people otherwise have an opportunity to access. And so, you know, 514 00:26:17,480 --> 00:26:20,639 Speaker 2: in an ideal world, you'd like doctors and AI working together, 515 00:26:20,760 --> 00:26:23,399 Speaker 2: everyone gets kind of the best level of care. But 516 00:26:23,640 --> 00:26:25,320 Speaker 2: in reality, I think this is going to be a 517 00:26:25,320 --> 00:26:28,480 Speaker 2: situation in which a lot of people turn to AI, 518 00:26:28,760 --> 00:26:31,960 Speaker 2: and if that's absent oversight by clinicians, or not in 519 00:26:32,000 --> 00:26:35,359 Speaker 2: conjunction with clinicians, they will receive a worse level of 520 00:26:35,359 --> 00:26:37,800 Speaker 2: care than they would if those things were working together, 521 00:26:38,119 --> 00:26:40,760 Speaker 2: but probably a better level of care, or at least 522 00:26:40,760 --> 00:26:43,320 Speaker 2: some care, some insight into what's happening with their bodies 523 00:26:43,400 --> 00:26:46,240 Speaker 2: and their minds, that they wouldn't be able to access. And 524 00:26:46,280 --> 00:26:49,800 Speaker 2: so we already see a larger level of autonomy for 525 00:26:50,000 --> 00:26:52,800 Speaker 2: AI in the healthcare system. You may have seen recently 526 00:26:52,920 --> 00:26:56,560 Speaker 2: Utah signed a partnership with AI health company Doctronic, and 527 00:26:56,640 --> 00:27:01,119 Speaker 2: so some types of medication refills will be automatically refilled 528 00:27:01,160 --> 00:27:03,760 Speaker 2: after you have a conversation with an AI. You know, 529 00:27:03,800 --> 00:27:06,160 Speaker 2: I think that's a smart way to start going about 530 00:27:06,160 --> 00:27:09,240 Speaker 2: this, because most of the medications are pretty low-risk medications, 531 00:27:09,280 --> 00:27:11,639 Speaker 2: and, you know, after all, you're just doing refills and 532 00:27:11,680 --> 00:27:14,080 Speaker 2: you're not doing the initial prescription. But you can envision 533 00:27:14,160 --> 00:27:16,640 Speaker 2: a world in which much more of healthcare, at least 534 00:27:16,640 --> 00:27:19,800 Speaker 2: transactional forms of healthcare, you know, you just have a UTI, 535 00:27:19,960 --> 00:27:22,480 Speaker 2: or you need a quick X-ray for your ankle sprain 536 00:27:22,560 --> 00:27:25,159 Speaker 2: to make sure nothing's broken, those types of things, even for 537 00:27:25,240 --> 00:27:28,159 Speaker 2: people who are well insured or who have access to doctors, 538 00:27:28,560 --> 00:27:30,840 Speaker 2: it might just be more convenient to have the AI 539 00:27:31,119 --> 00:27:33,760 Speaker 2: do that, and only for the more complex things, the 540 00:27:33,840 --> 00:27:37,200 Speaker 2: conditions that require managing uncertainty or certain forms of judgment, 541 00:27:37,240 --> 00:27:39,040 Speaker 2: that's where AI and doctors come together.
542 00:27:39,640 --> 00:27:42,880 Speaker 1: Most of the piece was anchored in a patient experience, 543 00:27:42,960 --> 00:27:45,360 Speaker 1: but tell me a bit more about the doctor experience. 544 00:27:45,400 --> 00:27:48,080 Speaker 1: I mean, are you unusually interested in this topic because 545 00:27:48,080 --> 00:27:50,200 Speaker 1: you're also a writer for The New Yorker and therefore 546 00:27:50,240 --> 00:27:53,120 Speaker 1: you have an audience who's hungry to know about AI medicine, 547 00:27:53,280 --> 00:27:54,840 Speaker 1: or would you say that you have a kind of 548 00:27:54,920 --> 00:27:57,960 Speaker 1: characteristic level of interest in how this might change your profession? 549 00:27:58,880 --> 00:28:01,280 Speaker 2: I think a lot of my colleagues are very interested 550 00:28:01,280 --> 00:28:03,359 Speaker 2: in the same questions. I don't think it's just about 551 00:28:03,520 --> 00:28:05,680 Speaker 2: being a journalist or, you know, having this other part 552 00:28:05,680 --> 00:28:09,560 Speaker 2: of my career. A lot of people are using AI already. 553 00:28:09,760 --> 00:28:12,639 Speaker 2: You know, we already used various forms of decision support. 554 00:28:12,680 --> 00:28:15,080 Speaker 2: You know, we've left the era where a doctor 555 00:28:15,119 --> 00:28:18,080 Speaker 2: had to keep everything around in his or her head. 556 00:28:18,200 --> 00:28:21,080 Speaker 2: We had first, you know, accessible textbooks and then kind 557 00:28:21,080 --> 00:28:23,199 Speaker 2: of digital textbooks, and so we were looking things up 558 00:28:23,280 --> 00:28:25,720 Speaker 2: and supplementing, you know, the knowledge in our brains with 559 00:28:25,840 --> 00:28:28,960 Speaker 2: those things. But you know, now it seems like almost 560 00:28:29,000 --> 00:28:32,160 Speaker 2: everyone gets a second opinion, often from an AI, 561 00:28:32,359 --> 00:28:35,159 Speaker 2: before they need a second opinion from a consultant or 562 00:28:35,520 --> 00:28:37,600 Speaker 2: a colleague of some sort. And so I think there 563 00:28:37,720 --> 00:28:42,200 Speaker 2: is a lot more interest and willingness, appetite, to engage 564 00:28:42,240 --> 00:28:45,160 Speaker 2: with these types of technologies, in part because, you know, 565 00:28:45,240 --> 00:28:48,000 Speaker 2: medicine and the delivery of medicine is broken in a 566 00:28:48,000 --> 00:28:50,160 Speaker 2: lot of ways. I mean, there's so much administrative burden, 567 00:28:50,280 --> 00:28:53,920 Speaker 2: the time pressure that physicians are under, that it can 568 00:28:54,000 --> 00:28:57,400 Speaker 2: be very helpful to have this offload some of that pressure, 569 00:28:57,640 --> 00:29:00,720 Speaker 2: if it's done in a responsible way. I think the 570 00:29:00,760 --> 00:29:04,320 Speaker 2: more general point is that I feel that the medical 571 00:29:04,360 --> 00:29:08,640 Speaker 2: profession is kind of fundamentally changing, and it's undergoing this 572 00:29:08,680 --> 00:29:12,040 Speaker 2: transformation from a very twentieth-century model. Part of that 573 00:29:12,200 --> 00:29:14,080 Speaker 2: is keeping things in your head, but part of that 574 00:29:14,200 --> 00:29:17,800 Speaker 2: is just being the ultimate authority on all things medicine. 575 00:29:17,840 --> 00:29:20,640 Speaker 2: You know, you have all the knowledge, people couldn't get knowledge otherwise, 576 00:29:20,920 --> 00:29:23,440 Speaker 2: you have all the access.
Patients had nowhere else to go, 577 00:29:23,520 --> 00:29:26,320 Speaker 2: and you had kind of the ultimate authority. People trusted 578 00:29:26,480 --> 00:29:30,400 Speaker 2: doctors and healthcare professionals more than any other part of society, 579 00:29:30,680 --> 00:29:32,600 Speaker 2: and those things are all changing. You know, knowledge is 580 00:29:32,640 --> 00:29:35,920 Speaker 2: more democratized, first with the Internet, now with AI. Access 581 00:29:36,000 --> 00:29:39,760 Speaker 2: is becoming democratized. We have direct-to-consumer telehealth companies. 582 00:29:40,040 --> 00:29:43,240 Speaker 2: I talked about AI writing prescriptions, and I think more 583 00:29:43,240 --> 00:29:45,720 Speaker 2: of that will happen. And then there's been this crisis 584 00:29:45,720 --> 00:29:48,080 Speaker 2: of trust in institutions, and so the other thing that 585 00:29:48,120 --> 00:29:50,200 Speaker 2: I've been trying to think about is how does AI, 586 00:29:50,880 --> 00:29:54,200 Speaker 2: you know, play into this more general phenomenon by which 587 00:29:54,240 --> 00:29:57,720 Speaker 2: the medical profession is really undergoing a fundamental transformation in 588 00:29:57,760 --> 00:29:58,680 Speaker 2: the twenty-first century. 589 00:29:58,880 --> 00:30:02,880 Speaker 1: Yeah, you mentioned sort of an earned crisis of trust. 590 00:30:03,040 --> 00:30:05,840 Speaker 1: You wrote another piece in The New Yorker recently about 591 00:30:05,840 --> 00:30:09,040 Speaker 1: the Gilded Age of healthcare, which got a lot of pickup. 592 00:30:09,160 --> 00:30:11,760 Speaker 1: I think you were on CBS News talking about it. Why 593 00:30:11,760 --> 00:30:14,479 Speaker 1: do you think that piece struck a chord? And what's the connection 594 00:30:14,520 --> 00:30:16,600 Speaker 1: between that piece and your work on AI in the 595 00:30:16,640 --> 00:30:17,320 Speaker 1: medical setting? 596 00:30:17,680 --> 00:30:20,880 Speaker 2: Well, I think both of these pieces, and a number 597 00:30:20,880 --> 00:30:23,959 Speaker 2: of other ones, get at this idea of the 598 00:30:23,960 --> 00:30:28,800 Speaker 2: fundamental frustration that patients and doctors have with the current 599 00:30:28,840 --> 00:30:33,280 Speaker 2: state of the healthcare system. And when we think about corporatization, 600 00:30:34,120 --> 00:30:37,720 Speaker 2: that is, things like private equity buying practices and hospitals, 601 00:30:38,120 --> 00:30:41,360 Speaker 2: things like hospitals that are nominally nonprofit but 602 00:30:41,480 --> 00:30:45,040 Speaker 2: actually behave very much like for-profit entities, 603 00:30:45,080 --> 00:30:48,960 Speaker 2: things like insurers engaging in prior authorizations and care denials. 604 00:30:49,440 --> 00:30:51,880 Speaker 2: All these things kind of come together to create a 605 00:30:51,920 --> 00:30:56,240 Speaker 2: system that isn't really working for anyone.
And so what 606 00:30:56,280 --> 00:30:58,440 Speaker 2: I was trying to get at in that piece 607 00:30:58,560 --> 00:31:00,320 Speaker 2: is that, you know, it's almost like a 608 00:31:00,360 --> 00:31:04,120 Speaker 2: gilded age, where the 609 00:31:04,160 --> 00:31:06,720 Speaker 2: healthcare system, much of the time, is seen as a 610 00:31:06,800 --> 00:31:10,720 Speaker 2: vehicle through which various people can profit, as opposed to 611 00:31:10,760 --> 00:31:14,120 Speaker 2: a vehicle through which we can best help patients and 612 00:31:14,200 --> 00:31:18,640 Speaker 2: support the health of people. And that is just so backwards. 613 00:31:18,640 --> 00:31:21,880 Speaker 2: And so, you know, to the extent that AI plays 614 00:31:21,880 --> 00:31:24,600 Speaker 2: into that story, you know, the hope is that it 615 00:31:24,640 --> 00:31:27,320 Speaker 2: creates certain types of efficiencies. You know, a lot of 616 00:31:27,800 --> 00:31:31,480 Speaker 2: the documentation and red tape that occurs, either through prior 617 00:31:31,480 --> 00:31:37,240 Speaker 2: authorization or regulatory reporting, maybe that gets automated. Maybe AI 618 00:31:37,360 --> 00:31:39,880 Speaker 2: takes on a lot of the tasks that are preventing, 619 00:31:40,120 --> 00:31:43,200 Speaker 2: you know, clinicians from getting to spend more time with 620 00:31:43,440 --> 00:31:46,680 Speaker 2: patients in the room. Maybe people feel like they have 621 00:31:46,800 --> 00:31:50,520 Speaker 2: better access now, because their kind of easy, simple questions 622 00:31:50,520 --> 00:31:53,400 Speaker 2: can be answered by AI, and they're able to spend 623 00:31:53,400 --> 00:31:56,560 Speaker 2: more time on the more difficult stuff with real doctors. So, 624 00:31:57,400 --> 00:32:00,480 Speaker 2: you know, again, the profession is changing. There are many 625 00:32:00,520 --> 00:32:03,200 Speaker 2: reasons for that. One of them is that we've become 626 00:32:03,240 --> 00:32:05,480 Speaker 2: a much more corporate system than we were, you know, 627 00:32:05,520 --> 00:32:08,840 Speaker 2: half a century ago, and maybe AI, you know, has 628 00:32:09,200 --> 00:32:12,280 Speaker 2: an ability to get us to a place where, strangely enough, 629 00:32:12,280 --> 00:32:14,360 Speaker 2: AI technology could make care more humane. 630 00:32:14,400 --> 00:32:18,680 Speaker 1: And just to close, Dhruv, you said, quote, doctors can 631 00:32:18,760 --> 00:32:24,760 Speaker 1: remake their profession, working with other powers to help shape rules, norms, 632 00:32:24,920 --> 00:32:29,040 Speaker 1: and relationships. What's your prescription for how they should do that? 633 00:32:29,520 --> 00:32:32,400 Speaker 2: Well, I think we need to think about how we 634 00:32:32,440 --> 00:32:36,520 Speaker 2: can engage beyond the walls of a clinic or beyond 635 00:32:36,520 --> 00:32:41,240 Speaker 2: the halls of a hospital. And so that requires leaning 636 00:32:41,240 --> 00:32:44,520 Speaker 2: into the ways that medicine is changing and trying 637 00:32:44,560 --> 00:32:47,320 Speaker 2: to play more of an active role. And so, you know, 638 00:32:47,320 --> 00:32:51,800 Speaker 2: I've talked about this idea that the medical profession is changing. 639 00:32:51,840 --> 00:32:55,160 Speaker 2: We can, you know, focus on gatekeeping and keeping others out.
640 00:32:55,400 --> 00:32:57,960 Speaker 2: We can kind of retreat into this idea that we're 641 00:32:58,000 --> 00:33:00,280 Speaker 2: just going to focus on kind of technical skills, making 642 00:33:00,360 --> 00:33:03,680 Speaker 2: sure that we're compensated in a certain way. Or we 643 00:33:03,720 --> 00:33:05,640 Speaker 2: can kind of reinvent the profession. We can kind of 644 00:33:05,680 --> 00:33:08,560 Speaker 2: embrace this world that we're now in. Part of that 645 00:33:08,960 --> 00:33:12,080 Speaker 2: means being on social media, doctors who are talented in 646 00:33:12,120 --> 00:33:15,280 Speaker 2: that way making engaging videos that are both informative and 647 00:33:15,360 --> 00:33:19,080 Speaker 2: truthful and getting people's attention that way. Part of that 648 00:33:19,240 --> 00:33:22,720 Speaker 2: is getting involved with these new healthcare startups and companies 649 00:33:22,920 --> 00:33:26,160 Speaker 2: that are using technology to try to improve care. Part 650 00:33:26,200 --> 00:33:28,880 Speaker 2: of that is running for office, you know, as physicians. 651 00:33:29,000 --> 00:33:32,520 Speaker 2: Part of that is banding together in professional societies. And, 652 00:33:32,600 --> 00:33:35,320 Speaker 2: you know, at a time when the information coming out 653 00:33:35,360 --> 00:33:39,040 Speaker 2: of the federal government isn't always accurate and reliable, maybe 654 00:33:39,040 --> 00:33:42,000 Speaker 2: people are turning to these alternate forms of knowing, by 655 00:33:42,000 --> 00:33:45,600 Speaker 2: which I mean professional societies, local health departments, these other 656 00:33:45,640 --> 00:33:48,040 Speaker 2: ways in which we feel like we can get the 657 00:33:48,040 --> 00:33:51,120 Speaker 2: best health information across to people. And so the idea 658 00:33:51,440 --> 00:33:54,240 Speaker 2: is that we live in a world where we're not a hegemon anymore. The 659 00:33:54,240 --> 00:33:57,280 Speaker 2: medical profession is not just something that people blindly follow. 660 00:33:57,440 --> 00:33:59,920 Speaker 2: It's something that is going to require work to engage 661 00:34:00,040 --> 00:34:03,280 Speaker 2: these other actors and these other ways of becoming leaders 662 00:34:03,320 --> 00:34:04,160 Speaker 2: in the public sphere. 663 00:34:05,960 --> 00:34:07,680 Speaker 1: Dhruv Khullar, thank you for joining Tech Stuff. 664 00:34:07,720 --> 00:34:14,120 Speaker 2: Thanks so much for having me. 665 00:34:30,400 --> 00:34:32,560 Speaker 1: That's it for Tech Stuff this week. I'm Oz Woloshyn. 666 00:34:32,960 --> 00:34:35,840 Speaker 1: This episode was produced by Eliza Dennis and Melissa Slaughter. 667 00:34:36,360 --> 00:34:39,399 Speaker 1: It was executive produced by me, Karah Preiss, Julian Nutter, 668 00:34:39,480 --> 00:34:43,680 Speaker 1: and Kate Osborne for Kaleidoscope, and Katrina Norvell for iHeart Podcasts. 669 00:34:44,080 --> 00:34:47,160 Speaker 1: Jack Insley mixed this episode, and Kyle Murdoch wrote our 670 00:34:47,200 --> 00:34:50,719 Speaker 1: theme song. Please rate, review, and reach out to us 671 00:34:50,760 --> 00:34:53,520 Speaker 1: at techstuff podcast at gmail dot com. We love 672 00:34:53,560 --> 00:34:54,160 Speaker 1: hearing from you.