1 00:00:04,960 --> 00:00:09,000 Speaker 1: On this episode of Newt's World. How is artificial intelligence 2 00:00:09,039 --> 00:00:13,360 Speaker 1: going to impact healthcare? Many experts have indicated AI may 3 00:00:13,480 --> 00:00:17,639 Speaker 1: change the way we diagnose disease, manage medical records, and 4 00:00:17,720 --> 00:00:22,480 Speaker 1: review medical data. High-powered artificial intelligence and machine learning 5 00:00:22,920 --> 00:00:26,880 Speaker 1: are enabling scientists and doctors to create and analyze vast 6 00:00:26,880 --> 00:00:30,680 Speaker 1: amounts of data and to develop new molecular entities to disrupt 7 00:00:30,760 --> 00:00:34,559 Speaker 1: targeted diseases. My guest today believes we are on the 8 00:00:34,600 --> 00:00:38,920 Speaker 1: cusp of really significant medical breakthroughs. I'm really pleased to 9 00:00:39,000 --> 00:00:42,800 Speaker 1: welcome my guest, doctor Marschall Runge. He is the former 10 00:00:42,880 --> 00:00:46,320 Speaker 1: executive dean of the University of North Carolina School of Medicine, 11 00:00:46,760 --> 00:00:51,000 Speaker 1: is currently executive vice president for medical affairs and dean 12 00:00:51,040 --> 00:00:54,120 Speaker 1: of the medical school at the University of Michigan, and 13 00:00:54,200 --> 00:00:58,720 Speaker 1: his most recent book is a techno-medical thriller entitled 14 00:00:59,040 --> 00:01:12,400 Speaker 1: Coded to Kill. Marschall, thank you for joining me. I 15 00:01:12,400 --> 00:01:13,440 Speaker 1: think this is very cool. 16 00:01:14,800 --> 00:01:16,600 Speaker 2: It's a privilege to be on your show. Thank you 17 00:01:16,680 --> 00:01:17,080 Speaker 2: very much. 18 00:01:17,640 --> 00:01:20,080 Speaker 1: Was it a big shock moving from UNC to Michigan? 19 00:01:20,840 --> 00:01:24,640 Speaker 3: Well, I had to change sports. UNC is all basketball, 20 00:01:24,720 --> 00:01:27,880 Speaker 3: no football. Here it's mainly all football, no basketball, although 21 00:01:27,880 --> 00:01:28,800 Speaker 3: there have been some good teams. 22 00:01:28,920 --> 00:01:32,800 Speaker 1: Although Michigan at times has had very good men's basketball teams. 23 00:01:33,319 --> 00:01:35,760 Speaker 1: I was just curious. I thought that'd be almost a 24 00:01:35,800 --> 00:01:39,240 Speaker 1: culture shock. Now, in a piece you published in Real 25 00:01:39,280 --> 00:01:43,080 Speaker 1: Clear Health entitled We're on the Cusp of a Historic 26 00:01:43,160 --> 00:01:48,240 Speaker 1: Epoch of Discovery, you describe the medical breakthroughs that we're 27 00:01:48,240 --> 00:01:52,240 Speaker 1: experiencing right now. Will you discuss some of them? 28 00:01:52,800 --> 00:01:53,120 Speaker 2: Sure. 29 00:01:53,520 --> 00:01:58,160 Speaker 3: The AI movement's been termed a Promethean moment, meaning, you 30 00:01:58,160 --> 00:02:01,160 Speaker 3: know, everything's going to change about everything. I tend to 31 00:02:01,160 --> 00:02:03,320 Speaker 3: be more or less a believer in that. I'll just 32 00:02:03,360 --> 00:02:05,840 Speaker 3: give you a couple of examples.
One has to do, 33 00:02:05,960 --> 00:02:10,040 Speaker 3: as you mentioned, with drug discovery. In the past, if 34 00:02:10,080 --> 00:02:12,600 Speaker 3: you had a great target that looked like it was 35 00:02:12,639 --> 00:02:17,919 Speaker 3: going to cure or significantly, positively impact a disease, you'd 36 00:02:17,919 --> 00:02:20,960 Speaker 3: get one hundred medicinal chemists working on creating a molecule, 37 00:02:21,600 --> 00:02:26,480 Speaker 3: and that process itself, much less testing, would take years. Now, 38 00:02:26,960 --> 00:02:30,000 Speaker 3: I'm familiar with a drug discovery project which was 39 00:02:30,040 --> 00:02:33,640 Speaker 3: done using AI, taking a big database of drugs and 40 00:02:33,840 --> 00:02:40,040 Speaker 3: training the AI algorithm on it, and over about three days it 41 00:02:40,240 --> 00:02:44,639 Speaker 3: identified about five molecules to treat a disease that is 42 00:02:44,720 --> 00:02:49,760 Speaker 3: virtually untargetable today, and all five of these turned out 43 00:02:49,800 --> 00:02:53,200 Speaker 3: to have potency, and a couple of them were 44 00:02:54,080 --> 00:02:58,120 Speaker 3: unknown compounds. So it's almost like thinking back to natural products, 45 00:02:58,160 --> 00:03:00,160 Speaker 3: where you dig something up out of the soil and you 46 00:03:00,200 --> 00:03:04,040 Speaker 3: find a new antibiotic. So it finds things, and I 47 00:03:04,160 --> 00:03:06,880 Speaker 3: find it incomprehensible how it brings all that data together, but 48 00:03:06,919 --> 00:03:10,360 Speaker 3: it really does. And so if you think about drug discovery, 49 00:03:10,480 --> 00:03:13,760 Speaker 3: I think that's awfully important, because when we think about healthcare, 50 00:03:14,120 --> 00:03:16,600 Speaker 3: we have so many diseases that are difficult to treat. 51 00:03:17,040 --> 00:03:20,040 Speaker 3: Some are rare diseases, rare genetic diseases. I mean, one 52 00:03:20,040 --> 00:03:23,120 Speaker 3: of the biggest problems is type two diabetes, or Alzheimer's, 53 00:03:23,480 --> 00:03:28,160 Speaker 3: and these are very difficult-to-treat diseases. I 54 00:03:28,200 --> 00:03:30,639 Speaker 3: think diabetes has gotten better, but I think we'll have 55 00:03:31,120 --> 00:03:33,560 Speaker 3: solutions for them in the coming few years. 56 00:03:34,320 --> 00:03:37,240 Speaker 1: When you say a few years, are we talking a decade, 57 00:03:37,320 --> 00:03:39,080 Speaker 1: two decades? What are you thinking? 58 00:03:39,440 --> 00:03:42,760 Speaker 3: I'm thinking less than that, less than five years, maybe 59 00:03:42,760 --> 00:03:46,560 Speaker 3: as soon as two or three years, because another approach 60 00:03:46,640 --> 00:03:50,160 Speaker 3: to drug discovery, which can be aided tremendously by AI, 61 00:03:50,800 --> 00:03:55,920 Speaker 3: is using RNA therapies, inhibitory RNA therapies, and they also 62 00:03:56,000 --> 00:03:58,720 Speaker 3: can be developed very quickly in a very targeted way. 63 00:03:59,320 --> 00:04:02,080 Speaker 1: It's just going to kind of stress our regulatory system. 64 00:04:02,840 --> 00:04:06,480 Speaker 3: For sure, in a lot of different ways. As you 65 00:04:06,640 --> 00:04:09,200 Speaker 3: I'm sure know, much better than I do, 66 00:04:09,640 --> 00:04:12,440 Speaker 3: some of these new therapies are enormously expensive, so how 67 00:04:12,480 --> 00:04:15,440 Speaker 3: can we afford them?
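A brief technical aside for listeners who want a concrete picture: the workflow Dr. Runge describes, training on a database of compounds with known activity and then ranking an untested library, is usually called virtual screening. The Python sketch below shows only that shape; it is not the project he mentions, and the fingerprints, activity labels, and candidate library are all synthetic stand-ins.

```python
# Minimal virtual-screening sketch (illustrative only). A classifier is
# trained on fingerprints of compounds with known activity against a
# target, then used to rank an untested library; the top few become
# candidates for wet-lab testing. All data here is synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Stand-ins for 1024-bit molecular fingerprints (ECFP-style bit vectors).
n_known, n_library, n_bits = 2_000, 5_000, 1024
known_fps = rng.integers(0, 2, size=(n_known, n_bits))
known_active = rng.integers(0, 2, size=n_known)   # 1 = known to hit target
library_fps = rng.integers(0, 2, size=(n_library, n_bits))

# Learn an activity model from the known compounds.
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(known_fps, known_active)

# Score every untested molecule and keep the five highest scorers,
# mirroring the "about five molecules" outcome described above.
scores = model.predict_proba(library_fps)[:, 1]
top5 = np.argsort(scores)[::-1][:5]
print("candidate indices:", top5.tolist())
print("predicted activity:", scores[top5].round(3).tolist())
```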
And that's where I'd like to 68 00:04:15,480 --> 00:04:18,919 Speaker 3: flip over to thinking about AI and healthcare. We have 69 00:04:19,120 --> 00:04:22,240 Speaker 3: so far to go in terms of reducing the cost 70 00:04:22,279 --> 00:04:25,280 Speaker 3: of healthcare, and we desperately need to do that. So 71 00:04:26,120 --> 00:04:32,000 Speaker 3: there are currently some small companies out there that enable 72 00:04:32,160 --> 00:04:35,040 Speaker 3: the use of an AI bot that won't replace 73 00:04:35,080 --> 00:04:38,200 Speaker 3: a doctor or a nurse, but is available twenty-four 74 00:04:38,200 --> 00:04:41,760 Speaker 3: seven and can get far along in diagnosis. And 75 00:04:41,760 --> 00:04:44,039 Speaker 3: then you take that along with some new technologies. 76 00:04:44,560 --> 00:04:47,640 Speaker 3: There was, I thought, a fascinating paper in the 77 00:04:47,720 --> 00:04:52,920 Speaker 3: Washington Post about Alzheimer's disease, where five different groups described 78 00:04:52,960 --> 00:04:58,120 Speaker 3: five different ways of early diagnosis of Alzheimer's, recognizing patterns 79 00:04:58,120 --> 00:05:01,880 Speaker 3: that humans, even the most brilliant humans, can't 80 00:05:01,920 --> 00:05:04,679 Speaker 3: recognize, because AI brings together so much data. 81 00:05:04,760 --> 00:05:07,479 Speaker 3: So I think if you combine some of that, it 82 00:05:07,720 --> 00:05:11,320 Speaker 3: leads to earlier treatment for chronic diseases. The payback won't 83 00:05:11,320 --> 00:05:14,560 Speaker 3: be overnight, but the payback for the healthcare system of the 84 00:05:14,600 --> 00:05:18,000 Speaker 3: United States could come over a decade. We could really 85 00:05:18,040 --> 00:05:22,200 Speaker 3: see a dramatic reduction in what I'll call unnecessary testing. 86 00:05:22,680 --> 00:05:24,800 Speaker 3: Maybe it's necessary today, but it won't be necessary in 87 00:05:24,880 --> 00:05:25,920 Speaker 3: a couple of years. 88 00:05:26,120 --> 00:05:28,680 Speaker 1: Yeah. I had the experience. I went in to 89 00:05:28,720 --> 00:05:31,720 Speaker 1: have my teeth cleaned last week, and the dentist now 90 00:05:31,800 --> 00:05:36,440 Speaker 1: has an artificial intelligence component. When they look at the 91 00:05:36,560 --> 00:05:41,440 Speaker 1: X-rays, the artificial intelligence component scans the same data 92 00:05:42,520 --> 00:05:48,440 Speaker 1: and can identify, at about ten times the accuracy, what 93 00:05:48,560 --> 00:05:51,560 Speaker 1: the dentist can identify. It's a local dentist, not some 94 00:05:51,640 --> 00:05:54,080 Speaker 1: advanced research center. They said it's just a 95 00:05:54,120 --> 00:05:58,680 Speaker 1: technology that's come online, and that it dramatically enhances his 96 00:05:58,760 --> 00:06:03,560 Speaker 1: ability to tell, for example, very, very early, if you 97 00:06:03,560 --> 00:06:06,200 Speaker 1: have a cavity. There are a couple of things you mentioned, 98 00:06:06,560 --> 00:06:09,039 Speaker 1: and one of them is this whole question of really 99 00:06:09,080 --> 00:06:13,120 Speaker 1: advanced imaging techniques that I think are going to revolutionize 100 00:06:13,720 --> 00:06:18,120 Speaker 1: our ability to have early intervention, which sometimes increases cost, 101 00:06:18,160 --> 00:06:21,279 Speaker 1: sometimes decreases, but it almost always extends. 102 00:06:20,839 --> 00:06:22,840 Speaker 2: Life, right. Absolutely.
103 00:06:23,520 --> 00:06:26,039 Speaker 3: Yeah, if you think of any chronic disease, the earlier 104 00:06:26,080 --> 00:06:29,400 Speaker 3: you're intervening, the better the outcome, so I couldn't 105 00:06:29,520 --> 00:06:31,839 Speaker 3: agree more. I'll tell you about just sort of a 106 00:06:31,880 --> 00:06:35,239 Speaker 3: weird research study I read recently. It 107 00:06:35,680 --> 00:06:37,880 Speaker 3: had to do with looking at the concept that 108 00:06:38,160 --> 00:06:40,520 Speaker 3: the color of your tongue can predict disease, and this 109 00:06:40,640 --> 00:06:44,720 Speaker 3: is like ancient Chinese medicine. But the group that was 110 00:06:44,760 --> 00:06:48,520 Speaker 3: investigating it looked at using AI in that circumstance, and 111 00:06:48,600 --> 00:06:52,440 Speaker 3: AI was fantastic, and it was a small group. You know, 112 00:06:52,480 --> 00:06:54,440 Speaker 3: we're not ready to get AI on our tongues right now, 113 00:06:54,480 --> 00:06:57,719 Speaker 3: but it proved the point that AI can pick up 114 00:06:57,760 --> 00:07:00,800 Speaker 3: things we just don't see. But also, we don't always 115 00:07:00,800 --> 00:07:02,600 Speaker 3: know what AI is looking at. So maybe it's not 116 00:07:02,640 --> 00:07:04,240 Speaker 3: looking at the color at all. Maybe it's looking at 117 00:07:04,240 --> 00:07:07,880 Speaker 3: the papules and the distribution of the papules of the tongue. 118 00:07:08,000 --> 00:07:09,960 Speaker 3: I don't really know, and that's one of the great 119 00:07:10,040 --> 00:07:12,440 Speaker 3: mysteries of AI. You don't know what data it's bringing in, 120 00:07:12,480 --> 00:07:16,840 Speaker 3: but it sure does turn out to be provocatively accurate. 121 00:07:17,360 --> 00:07:21,560 Speaker 1: How much research do you see going on at both UNC 122 00:07:21,600 --> 00:07:23,840 Speaker 1: and Michigan in the application of AI? 123 00:07:24,640 --> 00:07:26,440 Speaker 3: I don't follow things at UNC quite as 124 00:07:26,520 --> 00:07:28,320 Speaker 3: closely as I once did, but I can tell you we 125 00:07:28,360 --> 00:07:32,360 Speaker 3: have an explosion of it. So we're looking at applications 126 00:07:32,360 --> 00:07:36,800 Speaker 3: of AI, direct applications, in healthcare. We're using AI to 127 00:07:37,480 --> 00:07:41,480 Speaker 3: be able to organize and scan and make much quicker 128 00:07:41,920 --> 00:07:45,440 Speaker 3: decisions about electronic medical records. I don't know if you've 129 00:07:45,440 --> 00:07:48,080 Speaker 3: ever requested your own electronic medical record, but you know 130 00:07:48,120 --> 00:07:51,160 Speaker 3: it's like page after page after page after page. And 131 00:07:51,640 --> 00:07:55,280 Speaker 3: AI does a fantastic job of just summarizing that, 132 00:07:55,320 --> 00:07:59,680 Speaker 3: summarizing maybe three hundred pages down to two or three pages 133 00:08:00,240 --> 00:08:03,560 Speaker 3: for your care provider. The main thing I 134 00:08:03,640 --> 00:08:05,560 Speaker 3: think that helps is they don't miss stuff. I know 135 00:08:05,600 --> 00:08:08,280 Speaker 3: when I start looking through a medical record, after a while 136 00:08:08,280 --> 00:08:10,760 Speaker 3: I'm just nodding off. So it makes it more 137 00:08:10,760 --> 00:08:14,000 Speaker 3: accurate and much easier for the providers.
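A brief technical note on that summarization step: a three-hundred-page record will not fit into a single model call, so systems of this kind typically summarize chunk by chunk and then summarize the summaries, which is also what keeps them from skipping pages. The Python sketch below shows only that shape; summarize() is a hypothetical stand-in for whatever LLM endpoint a given health system actually licenses.

```python
# Chunked ("map-reduce") summarization sketch for a long medical record.
# summarize() is a placeholder; in a real system it would call an LLM.
from textwrap import wrap

def summarize(text: str, max_words: int = 150) -> str:
    """Hypothetical LLM call, stubbed here as simple truncation."""
    return " ".join(text.split()[:max_words])

def summarize_record(record: str, chunk_chars: int = 8000) -> str:
    # Map step: every chunk is summarized independently, so no page of
    # the record is skipped.
    partials = [summarize(chunk) for chunk in wrap(record, chunk_chars)]
    # Reduce step: condense the partial summaries into one short brief.
    return summarize(" ".join(partials), max_words=500)

# A fake record roughly the size of "three hundred pages."
record = "Progress note: patient seen, medications reviewed. " * 10_000
print(summarize_record(record)[:120], "...")
```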
The same thing 138 00:08:14,040 --> 00:08:18,240 Speaker 3: in terms of thinking about how we interact with what 139 00:08:18,280 --> 00:08:20,240 Speaker 3: we call the inbox in Epic, where people ask a 140 00:08:20,240 --> 00:08:23,080 Speaker 3: lot of questions. We can't be sure that it's going 141 00:08:23,120 --> 00:08:25,160 Speaker 3: to answer every one of those, but you get 142 00:08:25,240 --> 00:08:27,400 Speaker 3: like an excellent first draft, and so it 143 00:08:27,440 --> 00:08:30,160 Speaker 3: really speeds things up in getting back to patients in 144 00:08:30,280 --> 00:08:30,920 Speaker 3: real time. 145 00:08:31,480 --> 00:08:32,480 Speaker 2: So I think those are great. 146 00:08:32,520 --> 00:08:35,880 Speaker 3: But more importantly, we're looking at applications of AI in 147 00:08:36,400 --> 00:08:40,040 Speaker 3: areas that do involve imaging, so like eye exams in ophthalmology, 148 00:08:40,640 --> 00:08:45,199 Speaker 3: interpreting mammograms, and interpreting MRIs and CTs. When a 149 00:08:45,280 --> 00:08:48,199 Speaker 3: radiologist looks at an MRI, you may have seen them 150 00:08:48,200 --> 00:08:50,360 Speaker 3: doing this in real time, but they're looking at thirty 151 00:08:50,440 --> 00:08:54,600 Speaker 3: or forty images. And one thing AI can certainly do 152 00:08:54,679 --> 00:08:59,640 Speaker 3: is say, images one through thirty-two, don't worry about 153 00:08:59,640 --> 00:09:02,480 Speaker 3: them, they're normal; images thirty-three and thirty-four you'd 154 00:09:02,520 --> 00:09:05,719 Speaker 3: better take a quick look at. And then the radiologist 155 00:09:05,720 --> 00:09:08,160 Speaker 3: has much more time to ponder it and try to 156 00:09:08,160 --> 00:09:11,560 Speaker 3: put that together with the clinical scenario. But I think 157 00:09:11,600 --> 00:09:13,440 Speaker 3: the next step is going to be AI looking at 158 00:09:13,440 --> 00:09:16,520 Speaker 3: the medical record and AI looking into the images and 159 00:09:16,600 --> 00:09:17,560 Speaker 3: pulling it all together. 160 00:09:17,600 --> 00:09:19,160 Speaker 2: And I think that's just around the corner. 161 00:09:19,280 --> 00:09:21,560 Speaker 1: You know, one of the problems is political. We were 162 00:09:21,559 --> 00:09:26,600 Speaker 1: looking at a little company that had developed a computerized 163 00:09:26,840 --> 00:09:30,800 Speaker 1: eye exam. The normal model is you go see your 164 00:09:30,800 --> 00:09:34,920 Speaker 1: optometrist or ophthalmologist every two years in order to have 165 00:09:35,000 --> 00:09:37,600 Speaker 1: an exam so they can reorder whatever you might need, 166 00:09:37,920 --> 00:09:40,240 Speaker 1: and then you go back in the alternating year just 167 00:09:40,280 --> 00:09:42,840 Speaker 1: to have a quick checkup. Well, they had developed a 168 00:09:42,880 --> 00:09:47,240 Speaker 1: computerized exam that would perform that second-year checkup, and in 169 00:09:47,360 --> 00:09:52,640 Speaker 1: state after state, the ophthalmology and optometry lobbyists would get 170 00:09:52,640 --> 00:09:55,920 Speaker 1: it outlawed. I can't believe it. Yeah, because they 171 00:09:55,960 --> 00:09:58,280 Speaker 1: saw it as a direct threat to their income. One 172 00:09:58,320 --> 00:10:00,480 Speaker 1: of the great lessons for me when I stepped down as Speaker, 173 00:10:00,840 --> 00:10:04,000 Speaker 1: I took two areas, national security and healthcare, which were 174 00:10:04,040 --> 00:10:07,880 Speaker 1: both very complicated.
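One more technical aside, on the imaging triage described above ("images one through thirty-two are normal; take a quick look at thirty-three and thirty-four"): in sketch form, that is per-slice scoring against a threshold, with only the high-scoring slices routed to the radiologist first. The abnormality_score model below is a random placeholder, not any cleared clinical device.

```python
# Worklist-triage sketch: score each slice of an imaging series and
# flag the ones a radiologist should examine first. The scoring model
# is a random stand-in for a trained image classifier.
import random

def abnormality_score(image) -> float:
    """Stand-in for a model's per-slice probability of abnormality."""
    return random.random()

def triage(series, threshold: float = 0.9):
    flagged, cleared = [], []
    for idx, image in enumerate(series, start=1):
        (flagged if abnormality_score(image) >= threshold else cleared).append(idx)
    return flagged, cleared

random.seed(34)
series = [object() for _ in range(40)]   # stand-in for 40 MRI slices
flagged, cleared = triage(series)
print(f"take a quick look at slices {flagged}; the rest scored normal")
```

In practice the threshold would be set conservatively, since a triage tool that clears an abnormal slice is far more costly than one that flags a normal one.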
Healthcare, I think, is about ten times 175 00:10:08,240 --> 00:10:13,319 Speaker 1: more complicated than national security. It's unbelievably dense. But one 176 00:10:13,320 --> 00:10:15,200 Speaker 1: of the things I learned that was, I think, a 177 00:10:15,240 --> 00:10:18,360 Speaker 1: real surprise to me was the degree to which the 178 00:10:18,440 --> 00:10:22,280 Speaker 1: systems operate in an entirely human manner. That is, if 179 00:10:22,280 --> 00:10:26,160 Speaker 1: you're a hospital, your interest is optimizing the hospital, and 180 00:10:26,200 --> 00:10:29,520 Speaker 1: if you're the local pharmacy, your interest is optimizing the pharmacy. 181 00:10:29,800 --> 00:10:33,240 Speaker 1: And in the process, if that means that you're suboptimizing 182 00:10:33,280 --> 00:10:36,480 Speaker 1: the country, that's okay for you, because you know your 183 00:10:36,520 --> 00:10:39,280 Speaker 1: pharmacy is doing well. So it made me realize how 184 00:10:39,840 --> 00:10:43,240 Speaker 1: you both had to figure out an improvement, but then 185 00:10:43,280 --> 00:10:44,840 Speaker 1: you had to figure out how to get through all 186 00:10:44,840 --> 00:10:49,120 Speaker 1: these minefields to get the improvement both culturally adopted, 187 00:10:49,480 --> 00:10:51,840 Speaker 1: you know, would professionals be willing to do it, and 188 00:10:51,960 --> 00:10:54,760 Speaker 1: to get it adopted by people who might feel that 189 00:10:55,160 --> 00:10:57,880 Speaker 1: their income, or their prestige, or their status was threatened. 190 00:10:58,360 --> 00:11:00,680 Speaker 1: I don't know how much of that you experience in 191 00:11:00,720 --> 00:11:03,719 Speaker 1: a medical school, but it's been a fascinating challenge to me. 192 00:11:04,679 --> 00:11:08,920 Speaker 3: That's so interesting, because it's absolutely true. And I'm currently 193 00:11:09,160 --> 00:11:12,120 Speaker 3: working on a book with Forbes. Forbes will ask CEOs 194 00:11:12,120 --> 00:11:14,960 Speaker 3: in different areas to write books, and so ours is about 195 00:11:15,400 --> 00:11:18,640 Speaker 3: changes in healthcare, really disruptors in healthcare. And so all 196 00:11:18,679 --> 00:11:21,960 Speaker 3: these disruptors are companies like what you talked about, that 197 00:11:22,000 --> 00:11:24,800 Speaker 3: are coming in and saying, hey, there's a problem, there's 198 00:11:24,840 --> 00:11:27,760 Speaker 3: a pain point in medicine, we can fix it, and 199 00:11:28,000 --> 00:11:31,800 Speaker 3: they do. But then everyone's competing with everybody for that 200 00:11:31,920 --> 00:11:32,559 Speaker 3: same dollar. 201 00:11:33,160 --> 00:11:34,040 Speaker 2: And it's true. 202 00:11:34,200 --> 00:11:36,840 Speaker 3: You think about hospitals, you think about insurance companies, you 203 00:11:36,880 --> 00:11:40,160 Speaker 3: think about pharmacy benefit managers. They're all after that same 204 00:11:40,200 --> 00:11:43,480 Speaker 3: dollar, and they're pretty successful, which I think does lead 205 00:11:43,520 --> 00:11:46,719 Speaker 3: to an awful lot of cost in healthcare. So I 206 00:11:46,760 --> 00:11:48,720 Speaker 3: can't tell you how happy I am to hear that 207 00:11:48,800 --> 00:11:52,960 Speaker 3: somebody who has been so integrally involved in decision making 208 00:11:53,000 --> 00:11:55,880 Speaker 3: at the national level understands this. It's a pleasant surprise 209 00:11:55,960 --> 00:11:56,680 Speaker 3: to me too.
210 00:11:56,760 --> 00:12:01,800 Speaker 1: To go back to artificial intelligence, to what extent is the 211 00:12:02,120 --> 00:12:06,400 Speaker 1: University of Michigan Hospital using this and trying to integrate 212 00:12:06,440 --> 00:12:12,920 Speaker 1: it into either accelerating speed or maximizing the ability to 213 00:12:12,920 --> 00:12:13,840 Speaker 1: deal with complexity? 214 00:12:14,559 --> 00:12:17,559 Speaker 3: I would say a year ago we were 215 00:12:17,640 --> 00:12:21,079 Speaker 3: using no AI. I'd say we're using AI in ten 216 00:12:21,120 --> 00:12:23,680 Speaker 3: percent of what we do now. I believe that in 217 00:12:23,720 --> 00:12:25,560 Speaker 3: the next year or two we'll be using it in 218 00:12:25,720 --> 00:12:28,440 Speaker 3: half of what we do, because people are a little 219 00:12:28,520 --> 00:12:31,120 Speaker 3: anxious about it to start with, but I think it's proven 220 00:12:31,120 --> 00:12:34,320 Speaker 3: that it has real value and is not just a spectacle. 221 00:12:34,400 --> 00:12:35,960 Speaker 2: It offers real value. 222 00:12:36,160 --> 00:12:41,880 Speaker 1: Is it the early diagnostics, or the ability to radically enhance 223 00:12:42,000 --> 00:12:45,680 Speaker 1: research, or to create connectivity for the patients? I've always thought, 224 00:12:45,920 --> 00:12:47,600 Speaker 1: if you think about the number of times in your 225 00:12:47,640 --> 00:12:51,040 Speaker 1: life you'll go into a doctor or a dentist or 226 00:12:51,040 --> 00:12:54,920 Speaker 1: something and sit down and fill out paper. And I'm 227 00:12:54,920 --> 00:12:57,000 Speaker 1: now old enough that when they say, you know, list all 228 00:12:57,080 --> 00:12:59,440 Speaker 1: of your drugs, well, I carry all my drugs in 229 00:12:59,559 --> 00:13:01,600 Speaker 1: my phone, so I can pull up my phone and 230 00:13:01,640 --> 00:13:04,520 Speaker 1: do it. But I think back to my mother, who 231 00:13:04,559 --> 00:13:07,319 Speaker 1: did not have a phone at that time that was smart. 232 00:13:07,480 --> 00:13:09,920 Speaker 1: She had a regular phone, and she was taking a number 233 00:13:09,920 --> 00:13:12,440 Speaker 1: of drugs. I am certain there are occasions where she 234 00:13:12,520 --> 00:13:15,360 Speaker 1: went to see a doctor or a hospital and only 235 00:13:15,360 --> 00:13:17,559 Speaker 1: gave them half the drugs she was using, because she 236 00:13:17,679 --> 00:13:20,920 Speaker 1: just forgot. Shouldn't there be some way to almost automate 237 00:13:20,960 --> 00:13:22,160 Speaker 1: that whole process? 238 00:13:22,559 --> 00:13:24,800 Speaker 3: I think there is, and I think, particularly now that 239 00:13:24,800 --> 00:13:28,480 Speaker 3: we have electronic medical records that can connect to pharmacy records, 240 00:13:28,679 --> 00:13:32,200 Speaker 3: we do already look at pharmacy records. When somebody says, yes, 241 00:13:32,240 --> 00:13:35,040 Speaker 3: I haven't missed my dose of Lipitor in three years, 242 00:13:35,040 --> 00:13:37,360 Speaker 3: you see they also haven't refilled their prescription in the 243 00:13:37,400 --> 00:13:40,640 Speaker 3: last two years, so something doesn't quite connect there. I 244 00:13:40,679 --> 00:13:42,600 Speaker 3: think as we get more and more connected now, I 245 00:13:42,600 --> 00:13:45,200 Speaker 3: do think that there's a danger in getting connected. I'll 246 00:13:45,200 --> 00:13:47,800 Speaker 3: come back to that in just a minute.
But it's 247 00:13:47,920 --> 00:13:50,360 Speaker 3: long been a dogma that, wow, if we could just 248 00:13:51,360 --> 00:13:56,640 Speaker 3: pick medications or surgical procedures or whatever therapies that were 249 00:13:56,679 --> 00:13:59,320 Speaker 3: targeted toward the individual, rather than saying, well, we know 250 00:13:59,400 --> 00:14:03,560 Speaker 3: this approach works in sixty percent of people. Many 251 00:14:03,600 --> 00:14:06,319 Speaker 3: people, take hypertension as an easy example, may 252 00:14:06,360 --> 00:14:08,240 Speaker 3: try two or three or four drugs before they find 253 00:14:08,240 --> 00:14:11,960 Speaker 3: one that works. I think it's going 254 00:14:12,000 --> 00:14:15,880 Speaker 3: to be possible, by combining genotype and the other omics, 255 00:14:16,120 --> 00:14:19,640 Speaker 3: so to speak, the proteomics and others, along with medical 256 00:14:19,680 --> 00:14:23,360 Speaker 3: history, to be much more accurate in predicting the kinds 257 00:14:23,400 --> 00:14:24,680 Speaker 3: of therapies that will. 258 00:14:24,440 --> 00:14:26,640 Speaker 2: Be really useful. 259 00:14:26,680 --> 00:14:29,640 Speaker 3: So that's where I see us going, and that saves time, 260 00:14:29,840 --> 00:14:34,080 Speaker 3: it's more effective, and it saves money in the long run. 261 00:14:34,720 --> 00:14:37,120 Speaker 3: When I said I think there's a worry about all 262 00:14:37,160 --> 00:14:41,000 Speaker 3: this connectivity, that's actually one of the streams of my novel, 263 00:14:41,080 --> 00:14:46,000 Speaker 3: because once you're totally connected, you're totally connected. And all 264 00:14:46,040 --> 00:14:49,120 Speaker 3: it takes is one kind of rogue twenty-year-old 265 00:14:49,160 --> 00:14:52,960 Speaker 3: at Amazon to know that. Gee, if I put together 266 00:14:53,600 --> 00:14:58,640 Speaker 3: Marschall's purchasing, his medications, his EHR, what can I do 267 00:14:58,680 --> 00:15:00,480 Speaker 3: with that? And in my novel, it's what evil can 268 00:15:00,520 --> 00:15:01,080 Speaker 3: I do with it? 269 00:15:18,160 --> 00:15:21,760 Speaker 1: Given your prestige and your background in health, what 270 00:15:21,920 --> 00:15:23,080 Speaker 1: led you to write a novel? 271 00:15:24,320 --> 00:15:28,800 Speaker 3: Well, a couple of things. So I started because I 272 00:15:28,800 --> 00:15:31,320 Speaker 3: had a patient. I'm a cardiologist, and this patient came 273 00:15:31,360 --> 00:15:33,720 Speaker 3: in and he could have been a perfect character in 274 00:15:33,720 --> 00:15:37,640 Speaker 3: a Grisham novel. An attorney who worked for the FBI, got 275 00:15:37,720 --> 00:15:40,320 Speaker 3: on drugs, lost his family, and was kind of working 276 00:15:40,360 --> 00:15:41,120 Speaker 3: his way back. 277 00:15:40,880 --> 00:15:41,760 Speaker 2: And he had heart problems. 278 00:15:41,800 --> 00:15:43,960 Speaker 3: So one day he said, hey, you know John Grisham? 279 00:15:44,000 --> 00:15:46,520 Speaker 3: And I'm like, no, I don't think I do. And 280 00:15:46,560 --> 00:15:48,680 Speaker 3: he said, he's an author. And I said, oh. He said, 281 00:15:48,680 --> 00:15:50,120 Speaker 3: I'll bring you a book. So I read a Grisham 282 00:15:50,120 --> 00:15:52,160 Speaker 3: book and I was forever hooked on Grisham. 283 00:15:52,200 --> 00:15:52,680 Speaker 2: I love the guy.
284 00:15:52,680 --> 00:15:55,320 Speaker 3: I loved these thrillers, and I thought, well, it'd be 285 00:15:55,360 --> 00:15:57,200 Speaker 3: fun to try to write one. I bet I can 286 00:15:57,240 --> 00:15:59,560 Speaker 3: do that. Of course, that was like fifteen years before 287 00:15:59,600 --> 00:16:02,880 Speaker 3: I finished my novel. But at the end of the day, 288 00:16:02,920 --> 00:16:06,040 Speaker 3: what I wanted was to combine two things. One was to 289 00:16:06,080 --> 00:16:07,800 Speaker 3: make a thriller that people would like to read, and 290 00:16:07,840 --> 00:16:12,080 Speaker 3: the other was to hit an audience that isn't reading 291 00:16:12,560 --> 00:16:16,480 Speaker 3: editorials in the Washington Post or the New York Times 292 00:16:16,920 --> 00:16:18,760 Speaker 3: about some of the things they have to think about. 293 00:16:18,800 --> 00:16:22,680 Speaker 3: How do they protect their PHI, their protected health information? 294 00:16:23,160 --> 00:16:25,560 Speaker 3: How do they keep things out of the medical record 295 00:16:25,600 --> 00:16:27,800 Speaker 3: that they don't want in the medical record? This is not 296 00:16:27,920 --> 00:16:30,360 Speaker 3: a real, what do they call that? Not a teaser. 297 00:16:30,400 --> 00:16:33,120 Speaker 3: But there is a character in the novel that is 298 00:16:33,160 --> 00:16:36,640 Speaker 3: a Southern politician that somebody wants to kill. And they 299 00:16:36,680 --> 00:16:38,760 Speaker 3: figured out how they could kill him through the medical record. 300 00:16:39,200 --> 00:16:41,880 Speaker 3: I found this fascinating. I didn't know this, but in 301 00:16:41,920 --> 00:16:45,400 Speaker 3: big hospitals, when you see that IV bag hanging, you think, well, 302 00:16:45,440 --> 00:16:47,440 Speaker 3: there's a pharm tech or a pharmacist down there who 303 00:16:47,440 --> 00:16:47,960 Speaker 3: mixed it up. 304 00:16:48,160 --> 00:16:49,920 Speaker 2: Well, in the big hospitals, that's not the case. 305 00:16:50,120 --> 00:16:54,080 Speaker 3: They use robots, and the robots are incredibly accurate. But 306 00:16:54,320 --> 00:16:56,400 Speaker 3: you can imagine a scenario where somebody hacks into the 307 00:16:56,440 --> 00:16:59,800 Speaker 3: medical record, reprograms a robot, and gives me something that 308 00:17:00,080 --> 00:17:02,440 Speaker 3: is lethal to me. So that's kind of one of 309 00:17:02,440 --> 00:17:04,800 Speaker 3: the themes that's swirling around in the novel. 310 00:17:05,160 --> 00:17:07,840 Speaker 1: Did you find it challenging to learn how to do 311 00:17:08,560 --> 00:17:12,040 Speaker 1: dialogue? Because a novel is really different than writing 312 00:17:12,080 --> 00:17:14,000 Speaker 1: nonfiction, yeah? 313 00:17:13,840 --> 00:17:15,040 Speaker 2: I found it very challenging. 314 00:17:15,560 --> 00:17:19,119 Speaker 3: What helped me the most was, for several years, I'd 315 00:17:19,160 --> 00:17:20,520 Speaker 3: get up early, so I'd do a little writing in 316 00:17:20,560 --> 00:17:24,200 Speaker 3: the morning, and inevitably somebody would aggravate me, so I'd 317 00:17:24,200 --> 00:17:26,639 Speaker 3: come home and write something really nasty about that person. 318 00:17:26,920 --> 00:17:29,480 Speaker 3: So after about five years, I was so proud of 319 00:17:29,480 --> 00:17:31,399 Speaker 3: my novel. I showed it to my wife and she said, well, 320 00:17:31,800 --> 00:17:34,240 Speaker 3: let's see. Now, there are about one hundred characters.
There's no 321 00:17:34,320 --> 00:17:36,399 Speaker 3: dialogue and no plot. Maybe you ought to go off 322 00:17:36,400 --> 00:17:39,280 Speaker 3: to a writing course. So I did do that. I 323 00:17:39,760 --> 00:17:41,640 Speaker 3: don't know if you do creative writing or not. It's 324 00:17:41,640 --> 00:17:44,320 Speaker 3: fun to do, because unlike everything that you and I 325 00:17:44,400 --> 00:17:46,760 Speaker 3: have to do in our lives, there's no background check. 326 00:17:46,800 --> 00:17:48,960 Speaker 2: It's just you make it up. So it's fiction. 327 00:17:49,480 --> 00:17:50,680 Speaker 1: Where was your writing course? 328 00:17:51,600 --> 00:17:54,399 Speaker 3: I went to a course that was put on by a 329 00:17:54,440 --> 00:17:57,680 Speaker 3: company, it's not run anymore, called SEAK. It was 330 00:17:57,720 --> 00:18:01,760 Speaker 3: on Cape Cod, and it was hosted by two really 331 00:18:01,800 --> 00:18:06,000 Speaker 3: great doctor authors, Michael Palmer and Tess Gerritsen, and both 332 00:18:06,000 --> 00:18:08,359 Speaker 3: write thrillers. I went to it thinking, well, they're going 333 00:18:08,359 --> 00:18:09,879 Speaker 3: to make a brief appearance, but they were there for 334 00:18:09,920 --> 00:18:13,280 Speaker 3: three days. And I also went there thinking, wow, I'm 335 00:18:13,280 --> 00:18:14,840 Speaker 3: going to get in, no issue. This is back in 336 00:18:14,880 --> 00:18:17,560 Speaker 3: two thousand and seven or eight. Because how many doctor 337 00:18:17,600 --> 00:18:20,520 Speaker 3: writers can there be? And so I was lucky I 338 00:18:20,520 --> 00:18:22,359 Speaker 3: got in. They said, well, you're lucky; it filled up. 339 00:18:22,400 --> 00:18:26,639 Speaker 3: There are three hundred and fifty doctors coming, including two classmates 340 00:18:26,680 --> 00:18:28,879 Speaker 3: of mine from medical school. So a lot of interest in 341 00:18:28,880 --> 00:18:29,640 Speaker 3: this field. 342 00:18:29,760 --> 00:18:32,040 Speaker 1: Most of them writing fiction, or did they just want to 343 00:18:32,080 --> 00:18:32,880 Speaker 1: learn how to write? 344 00:18:33,160 --> 00:18:36,840 Speaker 3: Most of them were writing fiction, and it ranges from 345 00:18:37,000 --> 00:18:39,560 Speaker 3: some of the most bizarre stuff you can imagine to 346 00:18:39,840 --> 00:18:41,719 Speaker 3: people trying to write things like I was writing. 347 00:18:42,119 --> 00:18:47,760 Speaker 1: I've written both fiction and nonfiction, and they're totally different challenges. 348 00:18:48,200 --> 00:18:50,320 Speaker 2: Tell me what you wrote about in fiction, because I 349 00:18:50,359 --> 00:18:51,040 Speaker 2: love reading fiction. 350 00:18:51,880 --> 00:18:53,360 Speaker 1: I started with a very good friend of mine named 351 00:18:53,400 --> 00:18:57,760 Speaker 1: Bill Forstchen, who's a history professor but also a professional 352 00:18:57,800 --> 00:19:01,240 Speaker 1: novelist and has written many, many novels. Our first book 353 00:19:01,280 --> 00:19:06,399 Speaker 1: was an alternative history in which Hitler is in a 354 00:19:06,440 --> 00:19:10,280 Speaker 1: car accident just before Pearl Harbor, and so Germany doesn't 355 00:19:10,320 --> 00:19:13,960 Speaker 1: attack us, and the result is we don't have an 356 00:19:14,000 --> 00:19:17,200 Speaker 1: excuse to declare war.
So we turn to the Pacific, 357 00:19:17,680 --> 00:19:21,000 Speaker 1: and Hitler dominates in Europe, and the novel is called 358 00:19:21,080 --> 00:19:24,240 Speaker 1: Nineteen Forty-Five. And so in this alternate nineteen 359 00:19:24,320 --> 00:19:27,720 Speaker 1: forty-five, the Germans are dominant in Europe, we're dominant 360 00:19:27,720 --> 00:19:30,879 Speaker 1: in the Pacific, and Hitler has decided he has to 361 00:19:30,920 --> 00:19:34,560 Speaker 1: attack us, and he has been briefed that there's a 362 00:19:34,880 --> 00:19:39,680 Speaker 1: facility in Oak Ridge, Tennessee, developing a nuclear weapon. You've got 363 00:19:39,720 --> 00:19:42,800 Speaker 1: some fact in there, yeah. And so the whole thing 364 00:19:42,880 --> 00:19:46,080 Speaker 1: is an adventure story, you know, launching a specialized raid 365 00:19:46,520 --> 00:19:49,159 Speaker 1: to try to destroy this before the Americans can build it. 366 00:19:49,760 --> 00:19:52,720 Speaker 1: And I learned so much from Bill because, for example, 367 00:19:52,760 --> 00:19:57,120 Speaker 1: I would talk with him and say, the Germans had 368 00:19:57,240 --> 00:20:00,639 Speaker 1: arrived just off South Carolina, and the troops that were 369 00:20:00,680 --> 00:20:04,280 Speaker 1: going to go to Tennessee had to move from a 370 00:20:04,359 --> 00:20:06,960 Speaker 1: large ocean-going ship to a small ship that was 371 00:20:06,960 --> 00:20:10,000 Speaker 1: going to take them upriver. Well, by the time 372 00:20:10,080 --> 00:20:14,440 Speaker 1: Bill taught me how you write the last guy trying 373 00:20:14,440 --> 00:20:17,280 Speaker 1: to get off the ship, and the ship swaying back 374 00:20:17,320 --> 00:20:20,399 Speaker 1: and forth, and the danger of being crushed, and in 375 00:20:20,440 --> 00:20:22,920 Speaker 1: the end he does get crushed. I mean, he took 376 00:20:23,480 --> 00:20:25,800 Speaker 1: a couple of sentences and turned them into five pages 377 00:20:26,160 --> 00:20:27,840 Speaker 1: where you're sitting on the edge of your seat, thinking, 378 00:20:27,840 --> 00:20:29,840 Speaker 1: oh my god, what's going to happen with this? We went 379 00:20:29,880 --> 00:20:34,720 Speaker 1: on from there and wrote several books about the Revolutionary 380 00:20:34,760 --> 00:20:37,920 Speaker 1: War from Washington's perspective, and then we wrote a couple 381 00:20:37,960 --> 00:20:43,480 Speaker 1: of novels about the Civil War. And probably our best gambit, 382 00:20:44,240 --> 00:20:47,480 Speaker 1: we wrote an alternative history of Gettysburg. And we got 383 00:20:47,480 --> 00:20:51,920 Speaker 1: the Army War College in Carlisle, Pennsylvania. We got their 384 00:20:51,960 --> 00:20:55,520 Speaker 1: commanding general and the person who teaches their Gettysburg class, 385 00:20:56,080 --> 00:20:58,120 Speaker 1: and we went out and we walked them through our 386 00:20:58,200 --> 00:21:00,719 Speaker 1: theory, because we were both military historians, so we had 387 00:21:00,720 --> 00:21:03,400 Speaker 1: a theory of how this thing might have happened, and 388 00:21:03,480 --> 00:21:06,680 Speaker 1: we went out and literally walked the battlefield. And when 389 00:21:06,720 --> 00:21:08,480 Speaker 1: we got done, the guy who taught the course said, 390 00:21:08,480 --> 00:21:11,679 Speaker 1: you know, that actually sounds more like Lee than what 391 00:21:11,760 --> 00:21:15,520 Speaker 1: he did that day.
And so we wrote an alternative 392 00:21:15,560 --> 00:21:18,840 Speaker 1: history of Gettysburg, which then led to two more alternative histories 393 00:21:18,880 --> 00:21:21,639 Speaker 1: of how the Civil War would have occurred in that setting. 394 00:21:22,240 --> 00:21:23,240 Speaker 2: What's that one called? 395 00:21:23,640 --> 00:21:24,879 Speaker 1: The first one's called Gettysburg. 396 00:21:25,240 --> 00:21:27,240 Speaker 2: Oh, I've seen that book. I've never read it, though. 397 00:21:27,280 --> 00:21:29,960 Speaker 1: We did it deliberately, because Gettysburg is the most 398 00:21:29,960 --> 00:21:34,359 Speaker 1: frequently written-about event in American history. We were trying 399 00:21:34,359 --> 00:21:37,240 Speaker 1: to find some way to launch our fiction efforts. But 400 00:21:37,280 --> 00:21:39,919 Speaker 1: it was great fun. So when we describe things in 401 00:21:39,960 --> 00:21:43,520 Speaker 1: the book, that is how they were. You could go 402 00:21:43,560 --> 00:21:45,320 Speaker 1: and take our book and you could walk the whole 403 00:21:45,359 --> 00:21:48,159 Speaker 1: battlefield and get a sense for it. But it was 404 00:21:48,240 --> 00:21:51,240 Speaker 1: really fun, and it was fun to take personalities like 405 00:21:51,359 --> 00:21:56,280 Speaker 1: Lee or Meade or Longstreet and weave them in in 406 00:21:56,320 --> 00:21:57,960 Speaker 1: a way that worked well. Now, I will tell you, 407 00:21:58,040 --> 00:22:01,800 Speaker 1: I recently did a podcast with one of the great 408 00:22:01,840 --> 00:22:05,160 Speaker 1: American historians, and his father had written a book called 409 00:22:05,200 --> 00:22:10,080 Speaker 1: The Killer Angels, which is the best book about Gettysburg ever written. 410 00:22:10,560 --> 00:22:14,160 Speaker 1: It's like poetry. I've written far more nonfiction than fiction. 411 00:22:14,720 --> 00:22:17,600 Speaker 1: When you find somebody who has the rhythm and they 412 00:22:17,640 --> 00:22:21,000 Speaker 1: have the words and they suck you into their world, 413 00:22:21,200 --> 00:22:22,400 Speaker 1: I find it amazing. 414 00:22:22,960 --> 00:22:24,280 Speaker 2: I love it. I'll tell you what. 415 00:22:24,320 --> 00:22:26,239 Speaker 3: I didn't know all this, but I'm going to get 416 00:22:26,320 --> 00:22:45,879 Speaker 3: Nineteen Forty-Five and Gettysburg and read them. 417 00:22:46,320 --> 00:22:48,080 Speaker 1: When you're looking down the road the next four or 418 00:22:48,080 --> 00:22:50,680 Speaker 1: five years, do you think you're going to be able 419 00:22:50,680 --> 00:22:54,240 Speaker 1: to collaborate with the insurance companies and get them to 420 00:22:54,280 --> 00:22:58,760 Speaker 1: help accept and help implement the kind of changes that 421 00:22:58,840 --> 00:22:59,680 Speaker 1: are coming with AI? 422 00:23:01,119 --> 00:23:03,399 Speaker 3: Well, not knowing who in the world might listen to this, 423 00:23:03,440 --> 00:23:05,960 Speaker 3: I'm gonna go ahead and tell you what I think, and 424 00:23:06,560 --> 00:23:09,119 Speaker 3: that is, we have tried very hard to collaborate with 425 00:23:09,200 --> 00:23:13,119 Speaker 3: insurance companies. It's been difficult to collaborate, because in so 426 00:23:13,240 --> 00:23:19,320 Speaker 3: many ways we're at odds financially, and so we haven't 427 00:23:19,359 --> 00:23:21,840 Speaker 3: tried around AI. I'll tell you one good experience.
We 428 00:23:21,880 --> 00:23:23,879 Speaker 3: had a good experience with Blue Cross Blue Shield of Michigan 429 00:23:24,359 --> 00:23:29,159 Speaker 3: around developing quality initiatives, and this was looking at the 430 00:23:29,280 --> 00:23:32,960 Speaker 3: quality of different kinds of operations across the state, if 431 00:23:33,000 --> 00:23:34,760 Speaker 3: you have your prostate taken out, or if you have 432 00:23:35,359 --> 00:23:38,440 Speaker 3: any kind of cancer surgery. And I think that actually 433 00:23:38,800 --> 00:23:42,040 Speaker 3: was very productive. We combined our data with their data, 434 00:23:42,320 --> 00:23:46,119 Speaker 3: their claims data, and it helped identify people that 435 00:23:46,680 --> 00:23:49,240 Speaker 3: needed to improve their performance. And it was done in 436 00:23:49,280 --> 00:23:53,160 Speaker 3: a very protected environment. But I think that was good 437 00:23:53,160 --> 00:23:57,639 Speaker 3: for healthcare. Subsequently, we tried to collaborate in other areas, unsuccessfully, 438 00:23:58,160 --> 00:24:01,879 Speaker 3: and so AI ought to be one that is a natural. 439 00:24:01,920 --> 00:24:03,360 Speaker 3: But I'll tell you a funny story I heard about 440 00:24:03,359 --> 00:24:06,520 Speaker 3: AI recently. So we have one consultant who's telling us 441 00:24:06,600 --> 00:24:10,960 Speaker 3: that they can come deliver to us AI to help 442 00:24:11,000 --> 00:24:15,840 Speaker 3: with pre-authorization for procedures. So whatever you're going to have, 443 00:24:16,359 --> 00:24:20,399 Speaker 3: from an eye surgery to heart surgery, your insurance company 444 00:24:20,440 --> 00:24:22,720 Speaker 3: has to pre-authorize that, and that can be a 445 00:24:22,720 --> 00:24:26,159 Speaker 3: difficult process, and we sometimes end up with somebody who's 446 00:24:26,800 --> 00:24:28,520 Speaker 3: on the gurney getting ready to have surgery and we 447 00:24:28,520 --> 00:24:30,959 Speaker 3: still don't have pre-authorization, and if we don't get it, 448 00:24:31,280 --> 00:24:34,560 Speaker 3: we eat the entire procedure in terms of the cost. 449 00:24:35,240 --> 00:24:37,439 Speaker 3: So I thought, well, this could be really good. And 450 00:24:37,480 --> 00:24:40,480 Speaker 3: then I was at another meeting of colleagues. They said, man, 451 00:24:40,520 --> 00:24:43,120 Speaker 3: you've got to be careful, because the insurance companies now 452 00:24:43,160 --> 00:24:46,280 Speaker 3: are using AI to figure out how to best deny 453 00:24:46,359 --> 00:24:49,639 Speaker 3: pre-authorization. So, you know, just a great example of how 454 00:24:49,680 --> 00:24:51,720 Speaker 3: AI works in one way for us, another way for 455 00:24:51,720 --> 00:24:54,320 Speaker 3: the insurance companies. But I think we can put 456 00:24:54,320 --> 00:24:56,960 Speaker 3: that aside and think about how we could collaborate in 457 00:24:57,040 --> 00:25:00,920 Speaker 3: terms of quality outcomes. It's been talked about for decades, 458 00:25:01,000 --> 00:25:03,800 Speaker 3: bending the cost curve in healthcare. We've got to 459 00:25:03,840 --> 00:25:06,400 Speaker 3: do that. We're out of control right now. That's hard 460 00:25:06,440 --> 00:25:09,240 Speaker 3: for me to say, because I've been in healthcare for 461 00:25:09,240 --> 00:25:12,080 Speaker 3: forty years now.
I think AI, in the ways we 462 00:25:12,119 --> 00:25:15,119 Speaker 3: talked about it before, in terms of better, more rapid, 463 00:25:15,200 --> 00:25:19,480 Speaker 3: earlier diagnosis, better selection of therapies, can make a big difference. 464 00:25:19,800 --> 00:25:23,040 Speaker 1: You know, as we think about this much more electronic world, 465 00:25:23,680 --> 00:25:26,760 Speaker 1: your book Coded to Kill really centers around the hacking 466 00:25:27,160 --> 00:25:30,280 Speaker 1: of an electronic health record system. As we 467 00:25:30,320 --> 00:25:33,800 Speaker 1: put more and more things in an electronic format, how 468 00:25:33,880 --> 00:25:38,080 Speaker 1: much do you worry not just about the criminal hacker, 469 00:25:38,880 --> 00:25:41,679 Speaker 1: but also the North Koreans or the Chinese? I mean, 470 00:25:41,680 --> 00:25:45,560 Speaker 1: they are very sophisticated players, as the Israelis have shown in Lebanon. 471 00:25:45,880 --> 00:25:48,520 Speaker 1: I mean, there are people who have capabilities that are 472 00:25:48,520 --> 00:25:49,240 Speaker 1: a little scary. 473 00:25:49,520 --> 00:25:51,800 Speaker 2: How on earth could they do that with the pagers? 474 00:25:51,840 --> 00:25:54,080 Speaker 1: You know, it was about a twelve-year project. 475 00:25:54,280 --> 00:25:54,600 Speaker 2: Really? 476 00:25:54,880 --> 00:25:58,720 Speaker 1: Yeah, they ultimately got inside the supply chain and made 477 00:25:58,760 --> 00:26:03,440 Speaker 1: sure that only theirs got to Hezbollah. It's a very elegant, 478 00:26:03,560 --> 00:26:06,879 Speaker 1: very difficult project, and it takes a country that's making 479 00:26:07,000 --> 00:26:10,280 Speaker 1: desperate efforts to survive. But if you think about 480 00:26:10,320 --> 00:26:12,440 Speaker 1: all these different kinds of things, how do you build 481 00:26:12,560 --> 00:26:15,760 Speaker 1: layers of defense so you can't be hacked into? 482 00:26:16,400 --> 00:26:16,600 Speaker 2: Right? 483 00:26:16,760 --> 00:26:20,040 Speaker 3: Well, I'm happy to both tell you and hear you 484 00:26:20,119 --> 00:26:22,879 Speaker 3: critique it, given your knowledge of national security, as you 485 00:26:22,960 --> 00:26:26,520 Speaker 3: know pretty much what we're thinking about. So when I 486 00:26:26,560 --> 00:26:29,000 Speaker 3: came here, one of the first questions I asked was, 487 00:26:29,080 --> 00:26:33,160 Speaker 3: how often do we have people inappropriately accessing our medical records? 488 00:26:33,280 --> 00:26:35,560 Speaker 3: When I was at UNC, I got into this because, as 489 00:26:35,600 --> 00:26:38,399 Speaker 3: the dean at the time, we had a bunch of 490 00:26:38,440 --> 00:26:42,560 Speaker 3: people, many staff and about forty faculty, who had inappropriately 491 00:26:42,600 --> 00:26:45,800 Speaker 3: accessed the electronic medical record of one of the fabulous 492 00:26:45,800 --> 00:26:49,040 Speaker 3: basketball players who was out for a few games. And so, 493 00:26:49,480 --> 00:26:51,239 Speaker 3: lucky me, I got assigned to talk to all 494 00:26:51,280 --> 00:26:53,840 Speaker 3: those faculty about this, and, you know, how it's a 495 00:26:53,880 --> 00:26:56,760 Speaker 3: federal crime, and try to scare the dickens out of them. 496 00:26:57,359 --> 00:26:58,600 Speaker 2: You know, I got the wildest stories.
497 00:26:58,640 --> 00:27:01,119 Speaker 3: One guy told me, well, I'm a sickle cell expert, so I 498 00:27:01,160 --> 00:27:02,840 Speaker 3: worried he might have sickle cell disease, so I was 499 00:27:02,840 --> 00:27:04,600 Speaker 3: looking to see if I could be helpful. I mean, 500 00:27:04,600 --> 00:27:06,960 Speaker 3: that wasn't it. He's a sports fan. I got here 501 00:27:06,960 --> 00:27:09,560 Speaker 3: and I said, does that ever happen at Michigan? They said, oh, no, no, no, 502 00:27:09,600 --> 00:27:12,080 Speaker 3: we know it doesn't. And I said, well, what automated 503 00:27:12,119 --> 00:27:14,159 Speaker 3: system are you using to look at every medical record? There 504 00:27:14,200 --> 00:27:16,280 Speaker 3: are some out there, and they weren't using one. It 505 00:27:16,320 --> 00:27:18,200 Speaker 3: took two or three years to get them to use one, 506 00:27:18,440 --> 00:27:22,159 Speaker 3: and it turned out we had many, many people that 507 00:27:22,240 --> 00:27:25,240 Speaker 3: were inappropriately accessing medical records. Now, that's not a hack 508 00:27:25,280 --> 00:27:27,920 Speaker 3: at all. They just have access. They find out you're 509 00:27:27,920 --> 00:27:30,040 Speaker 3: in the hospital, or the governor's in the hospital, and they say, huh, 510 00:27:30,119 --> 00:27:32,359 Speaker 3: let me see what's going on. And so we fired 511 00:27:32,359 --> 00:27:34,679 Speaker 3: a whole slew of people. It's taken about five years 512 00:27:35,160 --> 00:27:38,000 Speaker 3: to get it under control. Now we're down to about 513 00:27:38,040 --> 00:27:41,960 Speaker 3: twenty times a year that that happens. But that's trivial. 514 00:27:42,320 --> 00:27:45,200 Speaker 3: What I worry about is we get about six 515 00:27:45,320 --> 00:27:49,639 Speaker 3: thousand or more attempted hacks a day externally, and most 516 00:27:49,640 --> 00:27:52,480 Speaker 3: of those I think of as like those robocalls 517 00:27:52,520 --> 00:27:56,000 Speaker 3: you get during dinner time, but there are about eighty a 518 00:27:56,080 --> 00:28:01,760 Speaker 3: month that are serious attempts from foreign actors, and sometimes domestic. 519 00:28:02,560 --> 00:28:05,880 Speaker 3: And what's said in healthcare, which I believe is sadly true, 520 00:28:05,960 --> 00:28:08,680 Speaker 3: is it's not if you will ever get hacked, it's 521 00:28:08,720 --> 00:28:10,840 Speaker 3: when you'll get hacked and what you do. And so 522 00:28:11,320 --> 00:28:13,160 Speaker 3: there are the layers we've tried to build, and we have lots 523 00:28:13,160 --> 00:28:15,679 Speaker 3: of security on top to try to prevent the hacks. 524 00:28:16,320 --> 00:28:18,080 Speaker 3: But some of the best things we're doing, and this is 525 00:28:18,119 --> 00:28:23,959 Speaker 3: AI-enabled, actually, is to detect inappropriate access into our 526 00:28:23,960 --> 00:28:27,280 Speaker 3: medical record and immediately cordon it off. So, you know, the 527 00:28:27,280 --> 00:28:31,400 Speaker 3: CrowdStrike thing that happened recently, we were back within hours, 528 00:28:31,560 --> 00:28:35,120 Speaker 3: a couple of hours, whereas some systems in Michigan were 529 00:28:35,119 --> 00:28:37,480 Speaker 3: down for two or three weeks. You know, we can't 530 00:28:37,480 --> 00:28:41,480 Speaker 3: do anything when your EHR is blocked.
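For the technically inclined, the automated monitoring described here starts from one simple rule: flag any chart access by someone with no treatment relationship to the patient. The Python sketch below shows that core check; the log fields, names, and care-team table are hypothetical, and production systems layer scoring for VIP charts, bulk pulls, and odd-hours access on top.

```python
# Access-audit sketch: flag EHR accesses by staff who are not on the
# patient's care team, the basic signal behind catching "curiosity"
# lookups of a well-known patient. All names and fields are made up.
access_log = [
    {"user": "dr_allen",   "patient": "pt_17"},
    {"user": "rn_baker",   "patient": "pt_17"},
    {"user": "dr_curious", "patient": "pt_17"},  # no care relationship
]
care_team = {"pt_17": {"dr_allen", "rn_baker"}}

def flag_inappropriate(log, team):
    """Return accesses with no documented treatment relationship."""
    return [e for e in log if e["user"] not in team.get(e["patient"], set())]

for event in flag_inappropriate(access_log, care_team):
    print("route to compliance review:", event)
```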
And our university 531 00:28:41,520 --> 00:28:44,959 Speaker 3: had a big hack about a year ago, during student 532 00:28:45,000 --> 00:28:47,560 Speaker 3: registration time, that was in the medical school, and for 533 00:28:47,600 --> 00:28:50,640 Speaker 3: the university, it's like a pimple. How do we surround it 534 00:28:50,680 --> 00:28:53,720 Speaker 3: so it doesn't spread? That's our strategy. I'd love to 535 00:28:53,760 --> 00:28:55,200 Speaker 3: hear what you think we ought to be doing. 536 00:28:55,520 --> 00:28:58,560 Speaker 1: The only advice I would have is to find one 537 00:28:58,600 --> 00:29:03,280 Speaker 1: of the most recent National Security Agency heads and bring 538 00:29:03,320 --> 00:29:06,240 Speaker 1: them in and ask them to review everything we have 539 00:29:06,360 --> 00:29:06,760 Speaker 1: not done. 540 00:29:06,760 --> 00:29:08,160 Speaker 2: That's a great idea. 541 00:29:08,680 --> 00:29:10,200 Speaker 1: I'm very close to one of the former 542 00:29:10,240 --> 00:29:14,479 Speaker 1: NSA heads. That's a world that is so sophisticated, and 543 00:29:14,520 --> 00:29:18,440 Speaker 1: it's still the leading capacity in internet activity in 544 00:29:18,480 --> 00:29:21,200 Speaker 1: the world. Marschall, I want to thank you for joining me. 545 00:29:21,440 --> 00:29:24,120 Speaker 1: This has been a wide-ranging and I think pretty 546 00:29:24,120 --> 00:29:28,160 Speaker 1: fascinating conversation. Your book Coded to Kill, I suspect, is 547 00:29:28,200 --> 00:29:28,760 Speaker 1: the beginning of. 548 00:29:28,760 --> 00:29:30,560 Speaker 2: A whole new career, I hope. 549 00:29:30,560 --> 00:29:34,880 Speaker 1: So that's available now on Amazon and in bookstores everywhere, and 550 00:29:34,960 --> 00:29:37,320 Speaker 1: I really appreciate you talking with us. I hope we 551 00:29:37,360 --> 00:29:39,280 Speaker 1: can do this again when your next book comes out. 552 00:29:39,560 --> 00:29:41,040 Speaker 2: Well, thank you again. It was awesome. 553 00:29:45,080 --> 00:29:48,080 Speaker 1: Thank you to my guest, doctor Marschall Runge. You can 554 00:29:48,120 --> 00:29:50,520 Speaker 1: get a link to buy his book Coded to Kill 555 00:29:50,800 --> 00:29:54,000 Speaker 1: on our show page at newtsworld dot com. Newt's World is 556 00:29:54,000 --> 00:29:58,160 Speaker 1: produced by Gingrich three sixty and iHeartMedia. Our executive producer 557 00:29:58,520 --> 00:30:02,800 Speaker 1: is Guernsey Sloan, and our researcher is Rachel Peterson. The artwork for 558 00:30:02,840 --> 00:30:06,720 Speaker 1: the show was created by Steve Penley. Special thanks to 559 00:30:06,760 --> 00:30:09,800 Speaker 1: the team at Gingrich three sixty. If you've been enjoying Newt's World, 560 00:30:10,160 --> 00:30:13,000 Speaker 1: I hope you'll go to Apple Podcasts and both rate 561 00:30:13,080 --> 00:30:15,920 Speaker 1: us with five stars and give us a review, so 562 00:30:16,040 --> 00:30:19,320 Speaker 1: others can learn what it's all about. Right now, listeners 563 00:30:19,320 --> 00:30:23,520 Speaker 1: of Newt's World can sign up for my three free weekly columns at gingrich three 564 00:30:23,560 --> 00:30:28,200 Speaker 1: sixty dot com slash newsletter. I'm Newt Gingrich. This is Newt's World.