1 00:00:04,440 --> 00:00:12,399 Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio. Hey there, and 2 00:00:12,480 --> 00:00:15,440 Speaker 1: welcome to Tech Stuff. I'm your host, Jonathan Strickland. I'm 3 00:00:15,440 --> 00:00:18,640 Speaker 1: an executive producer with iHeart Podcasts and how the Tech 4 00:00:18,680 --> 00:00:22,479 Speaker 1: are you? Okay, So this isn't really tech stuff today, 5 00:00:22,640 --> 00:00:24,919 Speaker 1: I thought I would do something a little different. So 6 00:00:25,000 --> 00:00:28,680 Speaker 1: recently we had Jacob Goldstein on the show. And Jacob 7 00:00:29,000 --> 00:00:34,040 Speaker 1: is a journalist. He's done tons of work for multiple 8 00:00:34,760 --> 00:00:38,919 Speaker 1: prestigious news outlets, and he's also the host of a 9 00:00:39,000 --> 00:00:43,040 Speaker 1: podcast called What's Your Problem with Jacob Goldstein? And on 10 00:00:43,080 --> 00:00:46,760 Speaker 1: that podcast, Jacob talks with various smarty pants in the 11 00:00:46,840 --> 00:00:51,240 Speaker 1: engineering field to talk about how technology can potentially help 12 00:00:51,320 --> 00:00:54,240 Speaker 1: us solve some very difficult problems. And I thought it 13 00:00:54,240 --> 00:00:57,840 Speaker 1: would be great to bring you an episode of his podcast, 14 00:00:58,040 --> 00:01:01,000 Speaker 1: because I think if you dig tech, you're also going 15 00:01:01,040 --> 00:01:03,880 Speaker 1: to dig What's Your Problem. But I know it can 16 00:01:03,920 --> 00:01:07,360 Speaker 1: be a hassle to go seek out another podcast, and 17 00:01:08,000 --> 00:01:11,480 Speaker 1: a lot of y'all may never take that initiative. So 18 00:01:11,520 --> 00:01:14,679 Speaker 1: I thought, well, I'll bring one episode in just for today, 19 00:01:15,000 --> 00:01:17,119 Speaker 1: and we can listen to an episode of What's Your 20 00:01:17,160 --> 00:01:20,520 Speaker 1: Problem and enjoy that, and then if you like it, 21 00:01:20,680 --> 00:01:23,119 Speaker 1: you can go seek out that podcast and subscribe to it. 22 00:01:23,200 --> 00:01:26,280 Speaker 1: And if otherwise you're like this isn't my bag, well 23 00:01:26,319 --> 00:01:28,520 Speaker 1: don't worry. We'll have another Tech Stuff episode for you 24 00:01:28,560 --> 00:01:33,240 Speaker 1: on Wednesday. So this episode is called Using AI to 25 00:01:33,360 --> 00:01:36,760 Speaker 1: Help Doctors Save Lives, and I think that's a cool 26 00:01:36,959 --> 00:01:39,720 Speaker 1: topic to talk about. Often on this show, I'm talking 27 00:01:39,720 --> 00:01:44,560 Speaker 1: about artificial intelligence in a rather skeptical way, because I 28 00:01:44,640 --> 00:01:47,600 Speaker 1: feel it's not a fault with AI necessarily. It's a 29 00:01:47,640 --> 00:01:50,800 Speaker 1: fault in how lots of businesses are rushing to try 30 00:01:50,920 --> 00:01:56,320 Speaker 1: and incorporate and adopt AI without fully baking in a 31 00:01:56,400 --> 00:02:00,280 Speaker 1: business reason for it, and that kind of short sightedness 32 00:02:00,360 --> 00:02:04,360 Speaker 1: can often have negative consequences. But I would never deny 33 00:02:04,400 --> 00:02:07,640 Speaker 1: the fact that artificial intelligence does have its place and 34 00:02:07,680 --> 00:02:11,000 Speaker 1: it can end up being a huge benefit to us 35 00:02:11,080 --> 00:02:14,720 Speaker 1: if we design it properly and implement it properly.
That's 36 00:02:14,760 --> 00:02:18,160 Speaker 1: a big if, and I think healthcare is one 37 00:02:18,200 --> 00:02:21,519 Speaker 1: place where AI makes a lot of sense, again, assuming 38 00:02:21,560 --> 00:02:24,359 Speaker 1: that we do take the care to design and implement 39 00:02:24,400 --> 00:02:28,919 Speaker 1: it appropriately. Obviously, there are very high stakes when we're 40 00:02:28,960 --> 00:02:33,119 Speaker 1: talking about healthcare. So let's listen in on this episode 41 00:02:33,200 --> 00:02:36,200 Speaker 1: of What's Your Problem? And I hope you enjoy. 42 00:02:37,480 --> 00:02:40,920 Speaker 2: When you walk into a hospital, technology is everywhere. In 43 00:02:40,919 --> 00:02:43,800 Speaker 2: one room, a surgeon is giving a patient a bionic knee. 44 00:02:44,040 --> 00:02:47,639 Speaker 2: In another room, a CT scanner is creating this incredible 45 00:02:47,720 --> 00:02:50,640 Speaker 2: three D picture of the inside of a person's body. 46 00:02:51,080 --> 00:02:55,240 Speaker 2: But in other places the hospital feels less high tech. 47 00:02:56,160 --> 00:03:00,720 Speaker 2: Doctors are still reading patients' charts and making decisions partly 48 00:03:00,800 --> 00:03:04,240 Speaker 2: on evidence but largely on instinct. This part of the 49 00:03:04,240 --> 00:03:06,520 Speaker 2: hospital is not so different from what it might have 50 00:03:06,560 --> 00:03:09,839 Speaker 2: looked like, you know, fifty years ago, and bringing new 51 00:03:09,919 --> 00:03:13,560 Speaker 2: technology to this part of medicine to care at the 52 00:03:13,560 --> 00:03:18,320 Speaker 2: bedside is a really hard, really interesting problem, because you 53 00:03:18,400 --> 00:03:20,800 Speaker 2: not only have to figure out how to use technology 54 00:03:20,840 --> 00:03:24,080 Speaker 2: to deliver useful information to the doctor at the right time, 55 00:03:24,760 --> 00:03:27,520 Speaker 2: you also have to figure out how to convince the 56 00:03:27,560 --> 00:03:36,920 Speaker 2: doctor that the information is actually worth listening to. I'm 57 00:03:37,000 --> 00:03:39,480 Speaker 2: Jacob Goldstein and this is What's Your Problem, the show 58 00:03:39,520 --> 00:03:41,760 Speaker 2: where I talk to people who are trying to make 59 00:03:41,880 --> 00:03:46,360 Speaker 2: technological progress. My guest today is Suchi Saria. She's the 60 00:03:46,440 --> 00:03:49,600 Speaker 2: founder and CEO of a company called Bayesian Health, and 61 00:03:49,680 --> 00:03:52,840 Speaker 2: she's also a professor at Johns Hopkins, where she runs 62 00:03:52,840 --> 00:03:57,360 Speaker 2: a lab focused on machine learning and healthcare. Suchi's problem 63 00:03:57,480 --> 00:04:01,360 Speaker 2: is this, how can you use artificial intelligence to detect 64 00:04:01,560 --> 00:04:05,760 Speaker 2: when hospital patients are at risk of potentially deadly complications? 65 00:04:06,360 --> 00:04:08,920 Speaker 2: And then once you've done that, how can you get 66 00:04:08,960 --> 00:04:12,520 Speaker 2: doctors to believe that the AI's warning is worth paying 67 00:04:12,600 --> 00:04:15,800 Speaker 2: attention to. She told me she first got interested in 68 00:04:15,840 --> 00:04:18,480 Speaker 2: healthcare sort of by accident, when she was a grad 69 00:04:18,520 --> 00:04:21,440 Speaker 2: student at Stanford studying AI and robots. 70 00:04:25,240 --> 00:04:28,080 Speaker 3: You know, I grew up actually being fascinated by AI.
71 00:04:28,160 --> 00:04:30,919 Speaker 3: I loved AI, and really most of my interest was 72 00:04:30,960 --> 00:04:33,200 Speaker 3: in the algorithm front and like looking at robotics and 73 00:04:33,200 --> 00:04:36,360 Speaker 3: building robots that were really smart, you know. And I 74 00:04:36,400 --> 00:04:41,000 Speaker 3: got acquainted with medicine through a friend colleague who was 75 00:04:41,040 --> 00:04:44,880 Speaker 3: a doctor taking care of babies. And what I learned 76 00:04:44,880 --> 00:04:47,640 Speaker 3: through her was that there's all this data we're 77 00:04:47,800 --> 00:04:52,520 Speaker 3: starting to collect, but literally nobody was designing any 78 00:04:52,520 --> 00:04:55,280 Speaker 3: software to make sense of it. So it was just, 79 00:04:55,360 --> 00:04:59,680 Speaker 3: coming from a world where you know, I studied all 80 00:04:59,720 --> 00:05:05,240 Speaker 3: kinds of data day in, day out, with robots doing 81 00:05:05,279 --> 00:05:07,680 Speaker 3: fun tasks like getting the robot to hold the ball 82 00:05:07,800 --> 00:05:11,480 Speaker 3: or juggle the ball, to then realizing, holy crap, there's 83 00:05:11,560 --> 00:05:14,160 Speaker 3: like so many more useful things we could be doing. 84 00:05:14,200 --> 00:05:17,520 Speaker 3: So that was really my first discovery of like how 85 00:05:17,520 --> 00:05:19,960 Speaker 3: big a gap there was between the people who thought about 86 00:05:20,000 --> 00:05:23,280 Speaker 3: AI and the problems that needed to be solved, 87 00:05:23,279 --> 00:05:25,240 Speaker 3: and how little we understood about these problems. 88 00:05:25,760 --> 00:05:28,800 Speaker 2: So so you decide that this is going to be 89 00:05:28,880 --> 00:05:31,080 Speaker 2: your thing, right, this is your life's work now. 90 00:05:31,240 --> 00:05:34,120 Speaker 3: I mean in the beginning, I wasn't convinced. In the beginning, 91 00:05:34,240 --> 00:05:37,320 Speaker 3: it was just about spending a few years helping out 92 00:05:37,520 --> 00:05:39,760 Speaker 3: and making sure we are able to make you know, 93 00:05:40,080 --> 00:05:42,240 Speaker 3: in the beginning, it was about my next three years. 94 00:05:42,960 --> 00:05:45,640 Speaker 3: Like I was afraid of investing. I was afraid of 95 00:05:45,680 --> 00:05:48,839 Speaker 3: the complexity of medicine. Like it wasn't an easy field. 96 00:05:48,880 --> 00:05:52,160 Speaker 3: It's not one where they welcome you, right, just as 97 00:05:52,160 --> 00:05:54,680 Speaker 3: an engineer, you don't come in and, like, at least 98 00:05:54,720 --> 00:05:56,960 Speaker 3: twelve thirteen years ago, that wasn't the culture there. 99 00:05:57,120 --> 00:06:00,920 Speaker 2: Like right, like, like an MD at a hospital does 100 00:06:00,960 --> 00:06:04,440 Speaker 2: not want to hear from some AI researcher. They're busy, 101 00:06:04,440 --> 00:06:05,400 Speaker 1: Oh no. 102 00:06:05,680 --> 00:06:08,120 Speaker 3: For sure, and they're like, we're busy, we have real 103 00:06:08,160 --> 00:06:08,640 Speaker 3: work to do. 104 00:06:08,880 --> 00:06:10,400 Speaker 2: Yeah, what is this? 105 00:06:10,839 --> 00:06:13,200 Speaker 3: Like this all sounds like esoteric mumbo jumbo. 106 00:06:13,440 --> 00:06:16,640 Speaker 2: Yeah, And so you say, you know, we're collecting all 107 00:06:16,680 --> 00:06:19,200 Speaker 2: this data in healthcare and we're not doing anything with it.
108 00:06:20,040 --> 00:06:24,039 Speaker 2: That is not intuitive, Like, that's not you know, I 109 00:06:24,040 --> 00:06:26,200 Speaker 2: think most people's sort of prior. And this is at 110 00:06:26,240 --> 00:06:28,839 Speaker 2: an academic hospital, right, Your friend is at Stanford Hospital, 111 00:06:28,880 --> 00:06:31,719 Speaker 2: a very prestigious academic hospital. I think Stanford Hospital, I 112 00:06:31,760 --> 00:06:34,320 Speaker 2: think data. I think these are people doing research. So 113 00:06:34,400 --> 00:06:36,120 Speaker 2: what do you mean when you say we're collecting all 114 00:06:36,120 --> 00:06:37,560 Speaker 2: this data and not doing anything with it. 115 00:06:37,760 --> 00:06:40,880 Speaker 3: Yeah, So twelve thirteen, fourteen years ago, this field was 116 00:06:41,000 --> 00:06:44,480 Speaker 3: very new and at the time, even collecting and storing 117 00:06:44,520 --> 00:06:47,960 Speaker 3: this data, the natural question was, can we afford it? It costs 118 00:06:48,000 --> 00:06:50,200 Speaker 3: dollars to store this data. Why would we do that? 119 00:06:50,520 --> 00:06:52,120 Speaker 2: And when you say, what what kind of data are 120 00:06:52,160 --> 00:06:54,120 Speaker 2: you talking about here? When you say collect and store 121 00:06:54,120 --> 00:06:54,920 Speaker 2: this data? 122 00:06:55,040 --> 00:06:58,560 Speaker 3: So literally, this was at the time babies entering, you know, 123 00:06:58,600 --> 00:07:01,480 Speaker 3: the neonatal ICU, these are premature babies; as they're 124 00:07:01,520 --> 00:07:04,800 Speaker 3: born, in real time, devices are collecting heart rate and 125 00:07:04,880 --> 00:07:08,760 Speaker 3: vitals and oxygen saturation data and like, and so that 126 00:07:08,839 --> 00:07:11,680 Speaker 3: kind of detailed data, which is much more bulky, was 127 00:07:11,880 --> 00:07:14,320 Speaker 3: historically not stored. Instead, what they would do is they'd 128 00:07:14,360 --> 00:07:18,840 Speaker 3: take like fifteen minute averages and capture that okay, And 129 00:07:19,280 --> 00:07:21,320 Speaker 3: naturally the question came up, do we need to store it? 130 00:07:21,360 --> 00:07:23,760 Speaker 3: This is really expensive data. Let's just throw it away 131 00:07:24,160 --> 00:07:26,600 Speaker 3: after forty eight hours, we don't need it anymore. Let's 132 00:07:26,640 --> 00:07:27,880 Speaker 3: just keep a quick summary of it. 133 00:07:28,160 --> 00:07:30,080 Speaker 2: Huh. So you might do a study, you might track 134 00:07:30,200 --> 00:07:32,760 Speaker 2: certain data points, but the idea that you're going to 135 00:07:32,840 --> 00:07:36,120 Speaker 2: just as a matter of course, be storing all of 136 00:07:36,120 --> 00:07:40,080 Speaker 2: this data that is now being generated and saved because 137 00:07:40,440 --> 00:07:44,360 Speaker 2: electronic medical records are just being adopted. Nobody was doing that. 138 00:07:44,440 --> 00:07:46,000 Speaker 2: Nobody had really thought to do it. It was an 139 00:07:46,000 --> 00:07:48,920 Speaker 2: expensive prospect. It didn't seem like there would be a 140 00:07:48,920 --> 00:07:50,880 Speaker 2: good reason to do it exactly.
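To make the storage practice just described concrete, here is a minimal sketch, not from the episode, of collapsing a raw per-second heart-rate stream into the kind of fifteen-minute averages hospitals historically kept; the sampling rate and the synthetic readings are assumptions for illustration only.

```python
# Minimal illustration of keeping only fifteen-minute averages of a vitals stream,
# as opposed to retaining the raw per-second readings.
# The one-sample-per-second rate and the values below are made up for the example.
from statistics import mean

SAMPLES_PER_SECOND = 1                      # assume one heart-rate reading per second
WINDOW = 15 * 60 * SAMPLES_PER_SECOND       # samples in a fifteen-minute window

def fifteen_minute_averages(heart_rates):
    """Return the coarse summary: one average per fifteen-minute window."""
    return [mean(heart_rates[i:i + WINDOW])
            for i in range(0, len(heart_rates), WINDOW)]

# One hour of synthetic readings: 3600 raw samples reduce to just 4 stored numbers.
raw_stream = [140 + (i % 10) for i in range(3600)]
print(len(raw_stream), len(fifteen_minute_averages(raw_stream)))
```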
141 00:07:51,320 --> 00:07:56,160 Speaker 3: And coming from AI, where we looked at, you know, 142 00:07:56,240 --> 00:08:00,680 Speaker 3: fingerprint data on the internet in retail or finance, then 143 00:08:00,920 --> 00:08:04,200 Speaker 3: you know, it was so natural to think 144 00:08:04,240 --> 00:08:07,880 Speaker 3: about how this data teaches you things that it felt 145 00:08:08,120 --> 00:08:11,520 Speaker 3: crazy to me that we weren't learning, similarly, all sorts 146 00:08:11,560 --> 00:08:14,520 Speaker 3: of amazing things about these babies or the human body or 147 00:08:14,560 --> 00:08:16,760 Speaker 3: how we evolve, or like what are the signs and 148 00:08:16,800 --> 00:08:19,360 Speaker 3: fingerprints of disease? How do they show up? 149 00:08:19,600 --> 00:08:22,400 Speaker 2: When you say fingerprint data, that's a that's a metaphor, right, 150 00:08:22,560 --> 00:08:25,360 Speaker 2: what does fingerprint data mean? In the context of sort 151 00:08:25,400 --> 00:08:27,440 Speaker 2: of e commerce and online finance. 152 00:08:28,160 --> 00:08:30,400 Speaker 3: Well, like they went to this site and then they 153 00:08:30,440 --> 00:08:33,400 Speaker 3: came to this site, or like they saw an ad 154 00:08:33,559 --> 00:08:36,480 Speaker 3: somewhere else about this, and now you know they're searching 155 00:08:37,360 --> 00:08:39,920 Speaker 3: for something, and it shows you intent. 156 00:08:39,720 --> 00:08:43,640 Speaker 2: It's this moment ten years ago when like the when people 157 00:08:43,679 --> 00:08:46,480 Speaker 2: are using data to know like everything about what I 158 00:08:46,559 --> 00:08:49,680 Speaker 2: do when I'm shopping for new shoes. But you you're 159 00:08:49,800 --> 00:08:52,800 Speaker 2: but they're not collecting data on like sick newborn babies 160 00:08:53,120 --> 00:08:53,960 Speaker 2: exactly right. 161 00:08:54,200 --> 00:08:57,600 Speaker 3: Is that mind blowing to you? Because it was crazy 162 00:08:57,679 --> 00:08:58,560 Speaker 3: mind blowing to me. 163 00:08:59,000 --> 00:09:02,199 Speaker 2: Okay, yes, my mind is blown. So what do you do? 164 00:09:03,880 --> 00:09:07,240 Speaker 3: Well, I mean it seemed like such a pressing problem. 165 00:09:07,320 --> 00:09:09,960 Speaker 3: It also helped that we were funded as a moonshot 166 00:09:09,960 --> 00:09:14,640 Speaker 3: project by the Google founders, that it was a high 167 00:09:14,640 --> 00:09:17,400 Speaker 3: profile investment, and it sort of naturally paved the way, 168 00:09:17,880 --> 00:09:21,520 Speaker 3: at a place like Stanford, for curiosity, and we had some amazing 169 00:09:21,559 --> 00:09:26,560 Speaker 3: collaborators who were equally curious, who said, well, let's dive 170 00:09:26,600 --> 00:09:30,360 Speaker 3: in and see what we'll understand. And that was the 171 00:09:30,360 --> 00:09:34,160 Speaker 3: start of it.
I literally got hold of this massive, 172 00:09:34,160 --> 00:09:38,360 Speaker 3: twelve hundred page, like this huge thick book to learn 173 00:09:38,400 --> 00:09:40,880 Speaker 3: about babies and what conditions they experience and what does 174 00:09:40,920 --> 00:09:42,920 Speaker 3: it all mean, and then starting to understand how does 175 00:09:42,960 --> 00:09:45,800 Speaker 3: it show up in the data, and you know, spent 176 00:09:46,040 --> 00:09:49,400 Speaker 3: evenings and weekends, and actually I remember sitting in the 177 00:09:49,440 --> 00:09:55,560 Speaker 3: basement of Stanford Hospital over Christmas trying to work 178 00:09:55,600 --> 00:09:57,679 Speaker 3: on getting data out of the health record 179 00:09:57,720 --> 00:09:59,640 Speaker 3: in the first place. And we were trying to experiment 180 00:09:59,640 --> 00:10:02,360 Speaker 3: with all sorts of techniques for pulling the data out, which 181 00:10:02,640 --> 00:10:04,720 Speaker 3: you know now is a whole lot easier than it 182 00:10:04,840 --> 00:10:06,439 Speaker 3: was twelve years ago because. 183 00:10:06,160 --> 00:10:09,480 Speaker 2: It's not built for that, right, It's basically built somewhat 184 00:10:09,480 --> 00:10:11,640 Speaker 2: to track the patient and to a significant degree to 185 00:10:11,720 --> 00:10:15,400 Speaker 2: like bill insurance. Right, that's traditionally what electronic medical records 186 00:10:15,400 --> 00:10:15,760 Speaker 2: were for. 187 00:10:16,120 --> 00:10:17,240 Speaker 3: That's exactly right. 188 00:10:17,280 --> 00:10:19,640 Speaker 2: Kind of amazing and kind of weird. I mean, I 189 00:10:19,679 --> 00:10:23,120 Speaker 2: want to talk more about the bigger idea of data 190 00:10:23,320 --> 00:10:28,559 Speaker 2: and healthcare, but just to kind of land this moment 191 00:10:29,120 --> 00:10:31,960 Speaker 2: early in your career at Stanford, like, is there some 192 00:10:32,160 --> 00:10:34,160 Speaker 2: project you do, Like what is the end of your 193 00:10:34,200 --> 00:10:35,079 Speaker 2: work at Stanford. 194 00:10:35,480 --> 00:10:39,559 Speaker 3: So the project was, you know, we're monitoring these premature 195 00:10:39,600 --> 00:10:43,640 Speaker 3: babies right anywhere between twenty four week old babies which 196 00:10:43,679 --> 00:10:46,479 Speaker 3: are very very tiny, like very twenty. 197 00:10:46,200 --> 00:10:47,839 Speaker 2: Four weeks of gestation. 198 00:10:47,600 --> 00:10:52,040 Speaker 3: To be exact, to like twenty eight, thirty, thirty two. 199 00:10:52,880 --> 00:10:55,720 Speaker 3: And the idea was, these babies, you know, are like 200 00:10:55,760 --> 00:10:59,800 Speaker 3: they're at risk for significant, like an array of complications. Yeah, 201 00:11:00,040 --> 00:11:03,120 Speaker 3: and the idea is the sooner you know, the earlier 202 00:11:03,200 --> 00:11:05,240 Speaker 3: you can do something about it, the greater the chance 203 00:11:05,320 --> 00:11:08,680 Speaker 3: that you're going to actually resuscitate them. So our job was, like, 204 00:11:08,800 --> 00:11:10,839 Speaker 3: could we look at this data from the second they're 205 00:11:10,880 --> 00:11:15,160 Speaker 3: born and collect this data to start analyzing and modeling 206 00:11:15,520 --> 00:11:17,960 Speaker 3: which babies are at risk for which of these complications?
And 207 00:11:18,000 --> 00:11:20,959 Speaker 3: if you could, then you could start to put more 208 00:11:21,000 --> 00:11:25,320 Speaker 3: of these preventative prophylactic type pathways or approaches in place 209 00:11:25,360 --> 00:11:26,000 Speaker 3: for caring for them. 210 00:11:25,760 --> 00:11:29,640 Speaker 2: Basically identify problems more quickly leading to better outcomes. That's 211 00:11:29,679 --> 00:11:31,920 Speaker 2: the basic desire exactly. 212 00:11:31,920 --> 00:11:34,960 Speaker 3: And in the process I discovered, like, you know, a 213 00:11:35,040 --> 00:11:38,320 Speaker 3: long time ago, there was a physician named Virginia Apgar, 214 00:11:38,880 --> 00:11:42,120 Speaker 3: and what she figured out is like, just by measuring 215 00:11:42,880 --> 00:11:46,040 Speaker 3: five different things from when the baby is born, she 216 00:11:46,160 --> 00:11:48,600 Speaker 3: can compute a very simple score that tells you how 217 00:11:48,640 --> 00:11:52,160 Speaker 3: the baby's doing. And so so naturally, the question we 218 00:11:52,200 --> 00:11:54,319 Speaker 3: asked is, Okay, so now that we are seeing all 219 00:11:54,320 --> 00:11:57,200 Speaker 3: these ways in which the machine learning and AI is 220 00:11:57,240 --> 00:12:01,319 Speaker 3: discovering novel signs and patterns that are predictive. Could we just 221 00:12:01,360 --> 00:12:03,960 Speaker 3: simply combine this to come up with a simple score 222 00:12:04,840 --> 00:12:08,160 Speaker 3: that says, you know, can I predict complications? And what 223 00:12:08,240 --> 00:12:10,800 Speaker 3: we found was this new simple score, which uses data 224 00:12:10,840 --> 00:12:13,079 Speaker 3: that requires nothing special to do, it's already 225 00:12:13,120 --> 00:12:15,719 Speaker 3: being collected, we just analyze it and are able to 226 00:12:15,760 --> 00:12:18,600 Speaker 3: compute the score, turns out to be much more predictive 227 00:12:18,600 --> 00:12:21,559 Speaker 3: than the Apgar at predicting complications. 228 00:12:22,080 --> 00:12:24,959 Speaker 2: And so so it worked. I mean, did do people 229 00:12:25,040 --> 00:12:27,320 Speaker 2: use it? Is it standard of care now? What happened 230 00:12:27,360 --> 00:12:28,640 Speaker 2: with that, with that research? 231 00:12:29,160 --> 00:12:30,840 Speaker 3: So at that point I was like, oh, this is 232 00:12:30,880 --> 00:12:33,679 Speaker 3: so cool. And literally we got all these journalists who 233 00:12:33,679 --> 00:12:35,360 Speaker 3: wanted to write about it, and it was on the 234 00:12:35,360 --> 00:12:39,400 Speaker 3: fundraising you know, it was like Stanford's fundraising highlight for 235 00:12:39,480 --> 00:12:42,360 Speaker 3: like the next five years, et cetera. But what was 236 00:12:42,400 --> 00:12:44,120 Speaker 3: the saddest thing about it is that there was no 237 00:12:44,280 --> 00:12:48,439 Speaker 3: natural mechanism for implementing it in practice. And it had 238 00:12:48,480 --> 00:12:50,400 Speaker 3: to do with so many different pieces to it, Like 239 00:12:50,760 --> 00:12:54,040 Speaker 3: we didn't have the infrastructure, we didn't have the like 240 00:12:54,320 --> 00:12:56,800 Speaker 3: know-how of like how do you get physicians to 241 00:12:56,840 --> 00:12:59,240 Speaker 3: trust something like this. How do you build this in 242 00:12:59,280 --> 00:13:01,640 Speaker 3: a way that is trustworthy and reliable.
How 243 00:13:01,679 --> 00:13:03,160 Speaker 3: do you do this so that it's not just like 244 00:13:03,240 --> 00:13:06,199 Speaker 3: a pet project in one hospital, but it's like a 245 00:13:06,600 --> 00:13:10,000 Speaker 3: system that is scalable nationally. And you know, what is 246 00:13:10,040 --> 00:13:12,520 Speaker 3: the incentive structure? Who pays for it and why would 247 00:13:12,520 --> 00:13:15,240 Speaker 3: they pay for it? And all of that is literally 248 00:13:15,280 --> 00:13:18,600 Speaker 3: what sort of got me, like got me super interested 249 00:13:18,640 --> 00:13:20,920 Speaker 3: in the field where I started to feel, Wow, we're 250 00:13:20,920 --> 00:13:24,480 Speaker 3: at the start of what feels like a massive movement 251 00:13:25,440 --> 00:13:27,920 Speaker 3: that has many components to be figured out, but we need 252 00:13:27,960 --> 00:13:32,920 Speaker 3: to figure this out. Interestingly, at the time on Sand 253 00:13:32,960 --> 00:13:35,400 Speaker 3: Hill Road, you know, by virtue of being in Palo Alto. 254 00:13:35,240 --> 00:13:38,880 Speaker 2: Yes, Sand Hill Road, where all the venture capitalists are. Exactly. 255 00:13:38,440 --> 00:13:41,920 Speaker 3: People were like, this is fantastic, here's money. Why don't 256 00:13:41,920 --> 00:13:45,320 Speaker 3: you start a company on this topic? And I spent 257 00:13:45,440 --> 00:13:48,840 Speaker 3: six months investigating, you know, talking to lots of peers, 258 00:13:50,440 --> 00:13:55,680 Speaker 3: health systems, hospitals, and realizing we were just too early. There's 259 00:13:55,720 --> 00:13:57,680 Speaker 3: a lot of work that needs to go in place 260 00:13:57,920 --> 00:14:00,840 Speaker 3: for this to become something that will scale nationally. Now, 261 00:14:00,880 --> 00:14:02,360 Speaker 3: fast forward ten years later. 262 00:14:02,200 --> 00:14:03,920 Speaker 2: I want to fast forward, but give me just 263 00:14:03,960 --> 00:14:07,360 Speaker 2: another moment when you say it's too early, Like in 264 00:14:07,440 --> 00:14:10,440 Speaker 2: what ways was it too early? Like specifically, what was 265 00:14:10,520 --> 00:14:13,200 Speaker 2: not ready in the world to start a company 266 00:14:13,240 --> 00:14:13,680 Speaker 2: at that time? 267 00:14:13,800 --> 00:14:15,720 Speaker 3: So the first thing we needed is for hospitals to 268 00:14:15,760 --> 00:14:18,000 Speaker 3: be ready to implement a system like that. For that 269 00:14:18,080 --> 00:14:21,800 Speaker 3: to happen, they needed to have implemented the electronic health record, huh, 270 00:14:22,120 --> 00:14:24,560 Speaker 3: be stable users of the EHR so that they'd be 271 00:14:24,600 --> 00:14:26,840 Speaker 3: willing to plug in third party systems on top of it. 272 00:14:27,080 --> 00:14:30,080 Speaker 2: And it's kind of amazing that ten years ago, you know, 273 00:14:30,400 --> 00:14:36,240 Speaker 2: twenty whatever, twenty teens, still hospitals were not sort of 274 00:14:36,360 --> 00:14:40,440 Speaker 2: ubiquitous users of electronic medical records, right, like doctors were 275 00:14:40,440 --> 00:14:41,640 Speaker 2: still writing on paper.
276 00:14:42,400 --> 00:14:45,320 Speaker 3: Honestly, coming from computer science where I did you know, 277 00:14:45,360 --> 00:14:47,920 Speaker 3: where I was involved in other areas of AI and 278 00:14:47,960 --> 00:14:51,440 Speaker 3: computer science, like this was like the biggest, like, shift 279 00:14:51,720 --> 00:14:55,240 Speaker 3: in mindset. I felt every time I came back into 280 00:14:55,240 --> 00:14:57,040 Speaker 3: the healthcare side of the equation, it felt like I 281 00:14:57,120 --> 00:15:00,040 Speaker 3: was going at least twenty, thirty years back, right. 282 00:15:00,120 --> 00:15:02,320 Speaker 2: Like getting a time machine going into the past when 283 00:15:02,360 --> 00:15:06,120 Speaker 2: you walk into the hospital, which is particularly, I don't know, 284 00:15:06,200 --> 00:15:11,120 Speaker 2: ironic, surprising, given how in some ways healthcare feels very 285 00:15:11,160 --> 00:15:14,280 Speaker 2: cutting edge, right. Like, a central interesting thing to me 286 00:15:14,400 --> 00:15:18,040 Speaker 2: about the work that you do is the way in 287 00:15:18,080 --> 00:15:20,840 Speaker 2: which healthcare is. You know, you go get a whatever, 288 00:15:21,120 --> 00:15:24,280 Speaker 2: a CT scan. It's this incredible machine and it uploads 289 00:15:24,280 --> 00:15:27,920 Speaker 2: to a computer and a whatever AI radiologist can you 290 00:15:27,960 --> 00:15:31,520 Speaker 2: know read the scan blah blah blah. And yet on 291 00:15:31,560 --> 00:15:34,480 Speaker 2: the kind of data side, on the complicated patient at 292 00:15:34,480 --> 00:15:38,240 Speaker 2: the bedside side, it's still very kind of old fashioned 293 00:15:38,240 --> 00:15:39,440 Speaker 2: and almost artisanal. 294 00:15:40,520 --> 00:15:43,360 Speaker 3: I mean, you raise like a fantastic point, which is 295 00:15:43,640 --> 00:15:48,120 Speaker 3: I think when it comes to introducing and designing new medicines, Yeah, 296 00:15:48,200 --> 00:15:52,120 Speaker 3: we've become really really good, but in terms of once 297 00:15:52,160 --> 00:15:56,560 Speaker 3: the medicine is produced, in terms of actually accelerating the adoption, 298 00:15:56,800 --> 00:16:00,520 Speaker 3: optimizing the uptake, yeah, deciding who gets it and what 299 00:16:00,600 --> 00:16:04,680 Speaker 3: dose and when, detecting early who would benefit from it. 300 00:16:05,120 --> 00:16:07,680 Speaker 3: That's what I call the healthcare delivery side of the equation. 301 00:16:07,920 --> 00:16:11,240 Speaker 3: I feel like there's a very very vast gap of 302 00:16:11,320 --> 00:16:12,800 Speaker 3: what needs to happen to get better. 303 00:16:13,560 --> 00:16:17,480 Speaker 2: So, okay, so you do this project. You see that 304 00:16:17,560 --> 00:16:21,960 Speaker 2: it's too early to start a company because the world 305 00:16:22,040 --> 00:16:25,600 Speaker 2: isn't ready yet, because hospitals aren't even widely using electronic 306 00:16:25,640 --> 00:16:28,240 Speaker 2: medical records yet. Much less being ready to sort of 307 00:16:28,480 --> 00:16:31,840 Speaker 2: export the data and listen to the data, et cetera. 308 00:16:32,280 --> 00:16:36,560 Speaker 2: And you take a job as a professor at Johns Hopkins, Right, 309 00:16:36,600 --> 00:16:37,440 Speaker 2: is that the next step? 310 00:16:38,040 --> 00:16:41,040 Speaker 3: That's right.
And part of the move to Hopkins was 311 00:16:41,680 --> 00:16:45,840 Speaker 3: realizing there's so much depth and breadth to medicine, not 312 00:16:45,880 --> 00:16:48,920 Speaker 3: just on the actual devices or the engineering 313 00:16:48,960 --> 00:16:51,480 Speaker 3: or the chemical or the drug development, but also on 314 00:16:51,520 --> 00:16:53,960 Speaker 3: the delivery side, like, what does it take to 315 00:16:54,240 --> 00:16:57,640 Speaker 3: scale ideas nationally? How do you design policy around it? 316 00:16:58,440 --> 00:17:01,800 Speaker 3: There was sort of a whole institute dedicated to scaling 317 00:17:02,160 --> 00:17:06,280 Speaker 3: ideas nationally, So to me that was extremely exciting, to 318 00:17:06,400 --> 00:17:10,520 Speaker 3: learn about what would it take to really build the 319 00:17:10,560 --> 00:17:13,680 Speaker 3: foundations of a field like this. And moving to Baltimore 320 00:17:13,720 --> 00:17:16,919 Speaker 3: was a big move, but I was just excited by 321 00:17:16,920 --> 00:17:19,240 Speaker 3: the idea of learning it all and learning it especially 322 00:17:19,320 --> 00:17:22,040 Speaker 3: as an engineer, as an AI researcher, as an outsider coming 323 00:17:22,080 --> 00:17:22,840 Speaker 3: into healthcare. 324 00:17:25,160 --> 00:17:27,720 Speaker 2: In a minute, Suchi and her colleagues figure out how 325 00:17:27,760 --> 00:17:30,720 Speaker 2: to use AI to detect when certain patients are at 326 00:17:30,800 --> 00:17:35,439 Speaker 2: risk for complications and also how to get doctors to listen. 327 00:17:44,080 --> 00:17:46,800 Speaker 2: So Suchi is at Johns Hopkins in Baltimore and she 328 00:17:46,880 --> 00:17:50,960 Speaker 2: has this big idea using AI to help doctors treat 329 00:17:51,000 --> 00:17:54,240 Speaker 2: hospital patients, but she has to figure out exactly what 330 00:17:54,400 --> 00:17:55,640 Speaker 2: to focus on. 331 00:17:55,640 --> 00:17:57,720 Speaker 3: One of the big areas was this idea of like 332 00:17:57,840 --> 00:18:03,680 Speaker 3: early detection of patients at risk for complications and diagnostic 333 00:18:03,840 --> 00:18:07,760 Speaker 3: errors being the third leading cause of death. Like, that's nuts. 334 00:18:07,800 --> 00:18:11,600 Speaker 3: So today, you know, there are critical moments that are missed. 335 00:18:11,760 --> 00:18:14,760 Speaker 3: We give patients the wrong diagnosis, or they're developing 336 00:18:14,800 --> 00:18:17,560 Speaker 3: something subtly and slowly. That's like a whole branch of 337 00:18:17,600 --> 00:18:21,960 Speaker 3: diagnostic errors where, you know, a complication or a condition develops, 338 00:18:21,960 --> 00:18:24,840 Speaker 3: but it doesn't get noticed in a timely fashion. And 339 00:18:24,920 --> 00:18:29,000 Speaker 3: so these seemed perfect for AI to come in with 340 00:18:29,040 --> 00:18:31,439 Speaker 3: the kind of data that exists to be able to 341 00:18:31,680 --> 00:18:34,440 Speaker 3: flag patients that are high risk and make it easy 342 00:18:34,480 --> 00:18:35,600 Speaker 3: to provide a second pair of eyes.
343 00:18:35,760 --> 00:18:39,760 Speaker 2: Because it's basically pattern matching, right, I mean, differential diagnosis 344 00:18:39,840 --> 00:18:44,080 Speaker 2: is taking lots of different variables from the patient and 345 00:18:45,200 --> 00:18:48,920 Speaker 2: trying to put those variables together to match the patient 346 00:18:49,000 --> 00:18:52,520 Speaker 2: to you know, thousands of other patients and say, oh, 347 00:18:52,680 --> 00:18:56,080 Speaker 2: all of these, all of these variables, all of these 348 00:18:56,240 --> 00:18:59,199 Speaker 2: health indicators suggest that the patient has disease X. Like 349 00:18:59,240 --> 00:19:02,679 Speaker 2: that's fundamentally what a differential diagnosis is, and like machine 350 00:19:02,760 --> 00:19:04,439 Speaker 2: learning should be very good. 351 00:19:04,320 --> 00:19:09,040 Speaker 3: At that, exactly. And previously people have attempted differential diagnosis 352 00:19:09,359 --> 00:19:13,159 Speaker 3: with very coarse symptoms, like a high level description of, like, 353 00:19:13,320 --> 00:19:16,120 Speaker 3: you have a cough or a fever. What was different this time 354 00:19:16,160 --> 00:19:18,800 Speaker 3: around is because of the EHR, we had very detailed data. 355 00:19:18,480 --> 00:19:22,639 Speaker 2: The EHR, the electronic health record, right, exactly. 356 00:19:22,200 --> 00:19:26,399 Speaker 3: And so it provided this brand new opportunity to do this. 357 00:19:26,560 --> 00:19:28,720 Speaker 3: And then you know, naturally when you go down the 358 00:19:28,760 --> 00:19:32,239 Speaker 3: list and start looking at problem areas, sepsis is a 359 00:19:32,320 --> 00:19:35,439 Speaker 3: model disease we chose to demonstrate the idea. 360 00:19:35,880 --> 00:19:38,720 Speaker 2: So let's just talk about sepsis for a minute. What 361 00:19:38,880 --> 00:19:39,440 Speaker 2: is sepsis? 362 00:19:39,720 --> 00:19:43,080 Speaker 3: So let's say your patient gets infected. Your immune system 363 00:19:43,160 --> 00:19:45,840 Speaker 3: is now going to respond in order to protect 364 00:19:45,840 --> 00:19:50,560 Speaker 3: your body, but in sepsis, it overreacts and starts attacking 365 00:19:50,600 --> 00:19:56,840 Speaker 3: your organ systems, leading to organ failure and death. And 366 00:19:56,920 --> 00:19:59,800 Speaker 3: so the idea of sepsis treatment is very much, 367 00:19:59,840 --> 00:20:02,040 Speaker 3: the earlier you can detect it, the better you are 368 00:20:02,080 --> 00:20:03,840 Speaker 3: at like tackling it. 369 00:20:03,960 --> 00:20:09,720 Speaker 2: Right, Okay, so I buy it. It seems like 370 00:20:09,760 --> 00:20:11,600 Speaker 2: a big problem and it seems like one that might 371 00:20:11,640 --> 00:20:15,359 Speaker 2: be solved, or at least, you know, made less bad, 372 00:20:15,600 --> 00:20:19,119 Speaker 2: with the application of machine learning. So how do 373 00:20:19,160 --> 00:20:22,000 Speaker 2: you how do you actually do it? What do you 374 00:20:22,000 --> 00:20:24,600 Speaker 2: have to do to build the model and see if 375 00:20:24,600 --> 00:20:26,159 Speaker 2: it works and get people to use it. 376 00:20:26,280 --> 00:20:29,119 Speaker 3: Yeah, so this is almost like, what I'm about to 377 00:20:29,160 --> 00:20:31,760 Speaker 3: describe in two minutes was almost a five year journey.
378 00:20:32,160 --> 00:20:34,680 Speaker 3: So first, it's collecting a huge amount of data where 379 00:20:34,680 --> 00:20:38,320 Speaker 3: you can identify both patients who are septic versus non septic 380 00:20:38,320 --> 00:20:40,239 Speaker 3: and when they had it, and what other conditions did 381 00:20:40,240 --> 00:20:42,720 Speaker 3: they have, and what else was happening in their life right, 382 00:20:42,920 --> 00:20:44,879 Speaker 3: and you know, all the data leading up to that 383 00:20:44,960 --> 00:20:47,919 Speaker 3: episode and what was done after the fact. So you 384 00:20:47,960 --> 00:20:50,000 Speaker 3: get the data. Then the next part is, you know, 385 00:20:50,040 --> 00:20:52,639 Speaker 3: you have to actually understand the biological process or the 386 00:20:52,640 --> 00:20:55,320 Speaker 3: clinical process that's happening and layer that on top of 387 00:20:55,359 --> 00:20:57,240 Speaker 3: the data to make sure you're going from like just 388 00:20:57,280 --> 00:21:00,240 Speaker 3: bits and bytes to data that makes sense, okay, And 389 00:21:00,480 --> 00:21:04,280 Speaker 3: then you implement lots of different learning algorithms to be 390 00:21:04,359 --> 00:21:06,879 Speaker 3: able to experiment, you know, the thing that we first 391 00:21:06,920 --> 00:21:10,119 Speaker 3: did versus the thing we do now. There's like lots 392 00:21:10,119 --> 00:21:12,399 Speaker 3: of generations of improvements in order to get to a 393 00:21:12,400 --> 00:21:16,440 Speaker 3: place where you're going from like, you know, not very 394 00:21:16,440 --> 00:21:18,320 Speaker 3: good signal to very good signal. 395 00:21:18,840 --> 00:21:22,359 Speaker 2: So you're building a model through trial and error, basically 396 00:21:22,400 --> 00:21:25,879 Speaker 2: trying to get an AI model that has a high 397 00:21:26,440 --> 00:21:30,800 Speaker 2: sensitivity and specificity that's good at issuing an alert when 398 00:21:30,800 --> 00:21:33,239 Speaker 2: a patient has sepsis, and doesn't issue too many 399 00:21:33,280 --> 00:21:35,159 Speaker 2: alerts when the patient doesn't have sepsis. 400 00:21:34,760 --> 00:21:37,320 Speaker 3: Basically exactly, and also does it in a way that 401 00:21:37,680 --> 00:21:40,359 Speaker 3: you know, when it says somebody has sepsis, it's able 402 00:21:40,359 --> 00:21:43,280 Speaker 3: to explain why. It's able to provide enough information so 403 00:21:43,320 --> 00:21:46,480 Speaker 3: that the clinician can act on it. And it's not 404 00:21:46,560 --> 00:21:49,240 Speaker 3: doing it so early that there's not enough to work on, 405 00:21:49,280 --> 00:21:51,600 Speaker 3: and it's not doing it so late that it's useless. 406 00:21:51,920 --> 00:21:56,520 Speaker 2: Like often people talk about AI models, machine learning models, 407 00:21:56,600 --> 00:21:59,840 Speaker 2: as black boxes, right, like, very good at pattern matching, 408 00:22:00,040 --> 00:22:02,399 Speaker 2: very good at predicting the next word, but we don't 409 00:22:02,440 --> 00:22:04,760 Speaker 2: know why, And so you're saying in this instance, you 410 00:22:04,840 --> 00:22:06,080 Speaker 2: sort of need to know why.
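To make the trade-off in this exchange concrete, here is a minimal, hypothetical sketch, not Bayesian Health's actual system: it trains a simple classifier on synthetic EHR-style features, then sweeps the alert threshold to show how sensitivity (catching septic patients) trades off against false alerts. The feature names, data, and model choice are illustrative assumptions only.

```python
# Hypothetical sketch of the sensitivity vs. false-alert trade-off described above.
# Everything here (features, labels, model) is synthetic and for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000
# Synthetic "patients": heart rate, temperature, white blood cell count, lactate.
X = np.column_stack([
    rng.normal(85, 15, n),      # heart rate (bpm)
    rng.normal(37.0, 0.7, n),   # temperature (C)
    rng.normal(9, 3, n),        # WBC (10^9/L)
    rng.normal(1.2, 0.6, n),    # lactate (mmol/L)
])
# Synthetic label: more deranged vitals -> more likely "septic" in this toy data.
risk = (0.03 * (X[:, 0] - 85) + 1.5 * (X[:, 1] - 37.0)
        + 0.2 * (X[:, 2] - 9) + 1.0 * (X[:, 3] - 1.2))
y = (risk + rng.normal(0, 1, n) > 1.5).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
scores = model.predict_proba(X_test)[:, 1]

# Sweep alert thresholds: a lower threshold catches more sepsis (higher sensitivity)
# but fires more alerts on patients who don't have it, which drives alert fatigue.
for threshold in (0.2, 0.5, 0.8):
    alerts = scores >= threshold
    sensitivity = alerts[y_test == 1].mean()       # fraction of septic patients flagged
    false_alert_rate = alerts[y_test == 0].mean()  # fraction of non-septic patients flagged
    print(f"threshold={threshold:.1f}  sensitivity={sensitivity:.2f}  "
          f"false_alert_rate={false_alert_rate:.2f}")
```

In practice, as the conversation goes on to note, the harder part is not this math but explaining each alert and fitting it into clinicians' workflow; the sketch only shows the statistical knob being described.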
411 00:22:07,480 --> 00:22:10,359 Speaker 3: A very key evolution for me as a scientist working in this 412 00:22:10,480 --> 00:22:12,320 Speaker 3: area was, in the beginning, I saw it all as 413 00:22:12,400 --> 00:22:16,560 Speaker 3: data and math, and then as I started working more 414 00:22:16,560 --> 00:22:19,040 Speaker 3: and more in interfacing and actually deploying systems like this, 415 00:22:19,160 --> 00:22:21,960 Speaker 3: what I started realizing is it's actually not math and data, 416 00:22:22,000 --> 00:22:26,760 Speaker 3: it's about trust, because ultimately, to get adoption and to 417 00:22:26,800 --> 00:22:30,360 Speaker 3: get outcomes, I need to get trust from these highly 418 00:22:30,440 --> 00:22:35,320 Speaker 3: trained clinicians who studied this year in and year out, and 419 00:22:35,400 --> 00:22:38,760 Speaker 3: they have a process and a system for working and 420 00:22:38,800 --> 00:22:40,640 Speaker 3: you have to fit within this system. 421 00:22:40,760 --> 00:22:43,280 Speaker 2: And they're very busy, and it's very high stakes, and 422 00:22:43,320 --> 00:22:46,720 Speaker 2: they kind of think they know everything, and so it's 423 00:22:46,880 --> 00:22:51,840 Speaker 2: presumably very hard to get them to trust you in 424 00:22:51,920 --> 00:22:54,440 Speaker 2: making their clinical judgments. Exactly. 425 00:22:54,520 --> 00:22:57,560 Speaker 3: But moreover, I've also been on the other side of 426 00:22:57,680 --> 00:23:00,480 Speaker 3: like tons of engineers making all sorts of claims that their 427 00:23:00,560 --> 00:23:03,439 Speaker 3: system knows better, but when you actually go and make 428 00:23:03,520 --> 00:23:06,720 Speaker 3: sense of the evaluations they've done, they literally have 429 00:23:06,880 --> 00:23:09,880 Speaker 3: very little understanding of medicine and the practice of healthcare, 430 00:23:09,960 --> 00:23:14,119 Speaker 3: so like their claims are mostly not good. So a 431 00:23:14,240 --> 00:23:16,960 Speaker 3: huge part of it is like developing respect and humility 432 00:23:17,400 --> 00:23:20,119 Speaker 3: for the system, the complexity, so that when you're bringing 433 00:23:20,119 --> 00:23:23,719 Speaker 3: in this new thing, it really truly fits, it's easy 434 00:23:23,760 --> 00:23:28,320 Speaker 3: to use, it makes sense, it creates value. Without all that, 435 00:23:28,480 --> 00:23:30,440 Speaker 3: you're not going to get to the benefit. 436 00:23:31,880 --> 00:23:34,879 Speaker 2: So now you say it creates value, and suddenly you 437 00:23:34,960 --> 00:23:38,720 Speaker 2: sound like a founder, an entrepreneur and not like an 438 00:23:38,800 --> 00:23:43,960 Speaker 2: academic. Where, where in this arc do you start a company? 439 00:23:44,080 --> 00:23:46,320 Speaker 3: You know, it was somewhere in twenty eighteen. I remember 440 00:23:46,720 --> 00:23:49,000 Speaker 3: twenty eighteen was a transformative year for me for a 441 00:23:49,080 --> 00:23:53,320 Speaker 3: number of reasons. I'll start with the very simple thing 442 00:23:53,359 --> 00:23:57,760 Speaker 3: of like, when we first built this system and deployed it, 443 00:23:58,040 --> 00:24:01,639 Speaker 3: only like two or three clinicians used it, and it 444 00:24:01,720 --> 00:24:03,840 Speaker 3: was the two or three clinicians who were involved in 445 00:24:03,920 --> 00:24:06,679 Speaker 3: working on the project with us.
What I realized was 446 00:24:06,800 --> 00:24:09,479 Speaker 3: we knew from looking at large amounts of data that 447 00:24:09,560 --> 00:24:12,719 Speaker 3: the system was working, it was working correctly, and we 448 00:24:12,760 --> 00:24:15,399 Speaker 3: could identify these cases. We could identify them early, and 449 00:24:15,520 --> 00:24:18,399 Speaker 3: even from interacting with the clinicians, we knew you could do 450 00:24:18,440 --> 00:24:20,960 Speaker 3: something differently about it. So it's one thing for the system 451 00:24:21,000 --> 00:24:24,000 Speaker 3: to detect. You know, clinicians will say, so what, so 452 00:24:24,040 --> 00:24:25,879 Speaker 3: what am I supposed to do about it? And 453 00:24:25,920 --> 00:24:28,280 Speaker 3: in this scenario, we've even done studies to know that 454 00:24:28,840 --> 00:24:31,199 Speaker 3: actually they could be acting, you know, they could use 455 00:24:31,240 --> 00:24:35,360 Speaker 3: this output to meaningfully change the patient's care. So then 456 00:24:35,520 --> 00:24:38,679 Speaker 3: to me, the question was, Okay, if we know this 457 00:24:38,760 --> 00:24:41,479 Speaker 3: thing works, why the heck are we not succeeding? And 458 00:24:41,520 --> 00:24:43,960 Speaker 3: that's kind of where it went from the puzzle of 459 00:24:44,119 --> 00:24:46,280 Speaker 3: math and data to trust. You know, how do we 460 00:24:46,320 --> 00:24:49,040 Speaker 3: develop and deploy it in a way that's transparent. How 461 00:24:49,080 --> 00:24:51,639 Speaker 3: do we understand like what are the top of mind 462 00:24:51,800 --> 00:24:54,280 Speaker 3: issues from a practicing clinician's point of view, and how 463 00:24:54,320 --> 00:24:56,880 Speaker 3: do we address it? Where are we creating value? How 464 00:24:56,920 --> 00:24:58,480 Speaker 3: do we start quantifying value? 465 00:25:00,040 --> 00:25:02,480 Speaker 2: Were there any moments where you're like, you know, you have 466 00:25:02,600 --> 00:25:06,280 Speaker 2: this thing that can be helpful, and yet someone, a doctor, 467 00:25:06,359 --> 00:25:10,080 Speaker 2: a hospital administrator, whatever, is telling you why they're not 468 00:25:10,160 --> 00:25:10,800 Speaker 2: going to use it? 469 00:25:10,920 --> 00:25:15,480 Speaker 4: Basically, I mean so many moments I can't even like 470 00:25:15,920 --> 00:25:19,400 Speaker 4: begin. So I think I remember this time when they 471 00:25:19,400 --> 00:25:22,399 Speaker 4: basically were like, Okay, this thing, the system, has flagged something. 472 00:25:22,840 --> 00:25:24,719 Speaker 3: What do I do with it? And I was like, 473 00:25:24,920 --> 00:25:26,960 Speaker 3: you should look if the patient has something, And they 474 00:25:26,960 --> 00:25:28,840 Speaker 3: were like, are you kidding me? How many flags? 475 00:25:29,080 --> 00:25:29,320 Speaker 1: Do you know? 476 00:25:29,359 --> 00:25:32,400 Speaker 3: How many alerting systems exist? If I were to take 477 00:25:32,480 --> 00:25:36,199 Speaker 3: every single alerting system and start to use that to 478 00:25:36,320 --> 00:25:39,520 Speaker 3: start informing when I'm doing a diagnostic workup and what 479 00:25:39,560 --> 00:25:42,239 Speaker 3: am I doing, I basically would not get my day 480 00:25:42,280 --> 00:25:43,600 Speaker 3: to day work done right.
481 00:25:43,840 --> 00:25:45,919 Speaker 2: It's like, it's like when you're, if you're ever in 482 00:25:45,960 --> 00:25:49,119 Speaker 2: an emergency room, like everything is beeping all the time, 483 00:25:49,680 --> 00:25:52,119 Speaker 2: and your system is just one more beep in a 484 00:25:52,160 --> 00:25:54,639 Speaker 2: sea of beeps that everybody ignores, and. 485 00:25:54,560 --> 00:25:56,680 Speaker 3: You feel passionately about it. 486 00:25:56,320 --> 00:25:59,159 Speaker 2: Yeah, you have your reasons you care about this beep, but 487 00:25:59,240 --> 00:26:00,480 Speaker 2: nobody else cares about this beep. 488 00:26:00,440 --> 00:26:03,679 Speaker 3: Nobody gives a damn. And it was just like, 489 00:26:04,359 --> 00:26:07,520 Speaker 3: so it was difficult, right, like, you come, I was 490 00:26:07,600 --> 00:26:10,240 Speaker 3: sort of like, you know, I felt defeated. I sat there, 491 00:26:10,320 --> 00:26:13,080 Speaker 3: I was like, this is so unbelievable. This is like 492 00:26:13,160 --> 00:26:15,880 Speaker 3: so powerful. Why aren't they believing me? And so there 493 00:26:15,960 --> 00:26:19,080 Speaker 3: was an information gap, right. Like, then it was like understanding, 494 00:26:19,480 --> 00:26:22,520 Speaker 3: oh, this, you know, the system in which they live. Okay, 495 00:26:22,800 --> 00:26:26,199 Speaker 3: I understand that all these different alerts exist. How are 496 00:26:26,240 --> 00:26:28,760 Speaker 3: these alerts created? How are we different? How can we 497 00:26:28,800 --> 00:26:32,560 Speaker 3: demonstrate we're different? Why should we be trusted? And so 498 00:26:32,640 --> 00:26:35,960 Speaker 3: that was, as an example, a starting point. Like another one 499 00:26:36,000 --> 00:26:38,520 Speaker 3: was like we deployed it, and we deployed it in 500 00:26:38,520 --> 00:26:41,160 Speaker 3: a way where it was you know, within the electronic 501 00:26:41,200 --> 00:26:42,639 Speaker 3: health record, but it was done in a way that 502 00:26:42,760 --> 00:26:46,040 Speaker 3: was really cumbersome, like every time they needed to respond, 503 00:26:46,640 --> 00:26:48,679 Speaker 3: it was like a few you know, it was like 504 00:26:49,200 --> 00:26:53,080 Speaker 3: a minute and a half of work, and you know, honestly, 505 00:26:53,160 --> 00:26:55,560 Speaker 3: they're so busy. A minute and a half extra to 506 00:26:55,600 --> 00:26:58,480 Speaker 3: do something that they don't already have total conviction in 507 00:26:59,119 --> 00:27:02,439 Speaker 3: is like a lot to ask. So then you spend a 508 00:27:02,440 --> 00:27:04,720 Speaker 3: bunch of time optimizing, well, how do we 509 00:27:04,720 --> 00:27:06,120 Speaker 3: take it from a minute and a half 510 00:27:06,200 --> 00:27:09,480 Speaker 3: to like three seconds? How do we optimize it so 511 00:27:09,520 --> 00:27:13,359 Speaker 3: that it's instantaneous? It's easy, it's just there. 512 00:27:14,440 --> 00:27:16,719 Speaker 2: So this isn't about the data at all. This is 513 00:27:16,880 --> 00:27:19,760 Speaker 2: just user experience basically. 514 00:27:19,840 --> 00:27:23,840 Speaker 3: Hugely human factors, like human factors. And human factors here 515 00:27:23,920 --> 00:27:27,400 Speaker 3: is very different and complicated because you're trying to optimize 516 00:27:27,440 --> 00:27:30,480 Speaker 3: human factors within a chassis that is very complicated.
Right, 517 00:27:30,560 --> 00:27:33,840 Speaker 3: Like you're not like standalone software, This is like you're 518 00:27:34,080 --> 00:27:37,439 Speaker 3: within an electronic health record, and like, how do you 519 00:27:37,520 --> 00:27:39,639 Speaker 3: do this in a way that the electronic health record 520 00:27:39,640 --> 00:27:41,080 Speaker 3: providers will allow. 521 00:27:40,800 --> 00:27:43,440 Speaker 2: Your information, not your software? 522 00:27:43,680 --> 00:27:46,040 Speaker 3: Yeah, it's not your software, And how can you do 523 00:27:46,080 --> 00:27:49,160 Speaker 3: it in a way that is smooth and seamless and 524 00:27:49,200 --> 00:27:51,960 Speaker 3: they actually like it? And then you can do this 525 00:27:52,040 --> 00:27:54,200 Speaker 3: in a way where it's not just custom built for 526 00:27:54,320 --> 00:27:56,920 Speaker 3: a Johns Hopkins, but it's something that you can then 527 00:27:56,960 --> 00:27:59,600 Speaker 3: take to a rural hospital, right. 528 00:28:00,119 --> 00:28:02,320 Speaker 2: So you're doing all this, at what point in this 529 00:28:02,440 --> 00:28:03,560 Speaker 2: arc do you start the company? 530 00:28:04,280 --> 00:28:09,000 Speaker 3: So another like personal thing happened, which is I lost 531 00:28:09,000 --> 00:28:14,320 Speaker 3: my nephew to sepsis. And you know, it was the craziest, 532 00:28:14,960 --> 00:28:20,399 Speaker 3: like saddest, like you know, most insane feeling to be 533 00:28:20,480 --> 00:28:23,080 Speaker 3: able to like, you know, as like a researcher, as 534 00:28:23,080 --> 00:28:26,159 Speaker 3: a scientist. I'm like neck deep in these research areas. 535 00:28:26,160 --> 00:28:29,840 Speaker 3: And then it's one thing to go and talk about it, 536 00:28:29,880 --> 00:28:32,200 Speaker 3: to say, well, here's how you do it, and here's 537 00:28:32,240 --> 00:28:34,919 Speaker 3: how it works, and here's why it will work, and 538 00:28:34,960 --> 00:28:37,480 Speaker 3: here's why this is a great idea. And it's another 539 00:28:37,600 --> 00:28:39,640 Speaker 3: to then come to that moment of realization where like, 540 00:28:40,480 --> 00:28:42,600 Speaker 3: well, I haven't actually done anything to make a difference. 541 00:28:42,800 --> 00:28:46,959 Speaker 2: So you're already working on sepsis, yes, and your nephew 542 00:28:47,040 --> 00:28:49,120 Speaker 2: you say, nephew meaning younger than you? 543 00:28:49,240 --> 00:28:50,880 Speaker 3: He was young, much younger than me. 544 00:28:51,000 --> 00:28:51,360 Speaker 2: Wow. 545 00:28:52,200 --> 00:28:56,720 Speaker 3: And realizing, like, what I was doing, like, it all sounded 546 00:28:56,760 --> 00:28:59,240 Speaker 3: excellent, like it all sounded great on paper, 547 00:28:59,400 --> 00:29:01,800 Speaker 3: you know. It was like, you know, I'd go 548 00:29:01,840 --> 00:29:04,600 Speaker 3: to meetings and lots of people would listen and they'd say, yay, 549 00:29:04,760 --> 00:29:07,400 Speaker 3: great idea, et cetera. But then at the end of 550 00:29:07,440 --> 00:29:11,040 Speaker 3: the day, for me, it was like I'd gotten too 551 00:29:11,160 --> 00:29:13,200 Speaker 3: used to, you know, it's easy.
It's easy to like 552 00:29:13,240 --> 00:29:16,160 Speaker 3: talk about something smart and then people say it's a 553 00:29:16,160 --> 00:29:17,600 Speaker 3: great idea, and then you leave the room and you 554 00:29:17,600 --> 00:29:19,840 Speaker 3: feel good about it, and then you go back and 555 00:29:19,880 --> 00:29:23,480 Speaker 3: you work on it some more. And I think it 556 00:29:23,560 --> 00:29:27,880 Speaker 3: was hard, like hard for me to sort of realize 557 00:29:27,920 --> 00:29:31,520 Speaker 3: like I had gotten too carried away, and I'd 558 00:29:31,560 --> 00:29:35,040 Speaker 3: gotten too carried away, like, not thinking about what 559 00:29:35,120 --> 00:29:36,800 Speaker 3: is it actually going to take to make it real? 560 00:29:37,520 --> 00:29:40,280 Speaker 3: And the making it real is what's like just so 561 00:29:40,480 --> 00:29:42,840 Speaker 3: much harder than I thought. But part of it is 562 00:29:42,880 --> 00:29:47,320 Speaker 3: I also felt like this isn't just a sad, this 563 00:29:47,400 --> 00:29:50,280 Speaker 3: isn't just like, you know, an idea for sepsis. 564 00:29:50,400 --> 00:29:52,760 Speaker 3: This is really like crazy to me that this isn't 565 00:29:52,760 --> 00:29:55,480 Speaker 3: how we operate. Like, I think the time has 566 00:29:55,560 --> 00:29:58,280 Speaker 3: come and what is exciting to me is in the 567 00:29:58,360 --> 00:30:00,640 Speaker 3: last year or two, I'm starting to see the world 568 00:30:00,800 --> 00:30:05,320 Speaker 3: has shifted. There's been a very meaningful change in the 569 00:30:05,400 --> 00:30:09,480 Speaker 3: last few years. I think losing my, like, losing my nephew, 570 00:30:09,560 --> 00:30:12,760 Speaker 3: made it very real. It went from this idea to 571 00:30:13,040 --> 00:30:17,320 Speaker 3: feeling like this was an opportunity where it's very real. 572 00:30:17,400 --> 00:30:20,560 Speaker 3: Now we can make a difference. The pieces exist, and 573 00:30:20,640 --> 00:30:23,400 Speaker 3: I need to make it happen. I can't hide anymore. 574 00:30:23,640 --> 00:30:27,600 Speaker 3: And in twenty eighteen I started to 575 00:30:27,640 --> 00:30:31,840 Speaker 3: realize, like, most systems had finished implementing the health record, 576 00:30:31,960 --> 00:30:37,480 Speaker 3: the electronic health record, policies were starting to change. The AI 577 00:30:37,760 --> 00:30:40,120 Speaker 3: was mature enough that it was really clear we could 578 00:30:40,160 --> 00:30:43,080 Speaker 3: do a lot with it. And it was my very 579 00:30:43,120 --> 00:30:49,280 Speaker 3: little part I could do to, you know, address my, 580 00:30:49,280 --> 00:30:51,920 Speaker 3: you know, my part of grief related to my nephew. 581 00:30:52,040 --> 00:30:54,560 Speaker 3: Like it was the very little role I could play. 582 00:30:54,640 --> 00:30:58,040 Speaker 3: So, so in twenty eighteen I started to, you know, 583 00:30:58,360 --> 00:30:59,920 Speaker 3: go after it with the idea of the work 584 00:31:00,080 --> 00:31:02,280 Speaker 3: to actually start a company. We're actually going to turn 585 00:31:02,320 --> 00:31:05,200 Speaker 3: this into something that scales nationally. And that's where it 586 00:31:05,200 --> 00:31:05,680 Speaker 3: all began.
587 00:31:06,040 --> 00:31:10,000 Speaker 2: So you start the company, and you do build this 588 00:31:11,160 --> 00:31:17,200 Speaker 2: AI model to detect sepsis in hospitalized patients, and 589 00:31:18,040 --> 00:31:21,480 Speaker 2: you do this study and you wind up publishing the 590 00:31:21,520 --> 00:31:25,760 Speaker 2: outcome in the journal Nature Medicine, right, which seems like 591 00:31:25,800 --> 00:31:28,760 Speaker 2: a big, big moment in your work, in the life 592 00:31:28,760 --> 00:31:31,600 Speaker 2: of your company. So tell me about that study. 593 00:31:32,760 --> 00:31:35,640 Speaker 3: Yeah, so in twenty twenty two, in July twenty twenty two, we 594 00:31:35,680 --> 00:31:38,320 Speaker 3: had three studies. They were featured on the cover of 595 00:31:38,400 --> 00:31:40,680 Speaker 3: Nature Medicine. These were very big studies for the field. 596 00:31:41,320 --> 00:31:45,040 Speaker 3: Then the studies that came out in twenty twenty two were 597 00:31:45,080 --> 00:31:51,160 Speaker 3: basically showing how we implemented the system at five different sites, 598 00:31:51,240 --> 00:31:54,400 Speaker 3: like both in the emergency department, the floor, the hospital 599 00:31:54,440 --> 00:31:58,440 Speaker 3: floors, the ICUs, across academic and community hospitals. So 600 00:31:58,600 --> 00:32:02,560 Speaker 3: five different hospitals in totally different geographic regions, right, in 601 00:32:02,640 --> 00:32:08,400 Speaker 3: Maryland, in DC, rich communities, poor communities. And what we 602 00:32:08,400 --> 00:32:12,280 Speaker 3: were able to show, with the system, like, you know, 603 00:32:12,320 --> 00:32:14,440 Speaker 3: almost three quarters of a million patients in the study, 604 00:32:14,720 --> 00:32:17,840 Speaker 3: forty four hundred physicians and nurses who were part of 605 00:32:17,840 --> 00:32:23,440 Speaker 3: the study, was that you could detect sepsis significantly earlier than 606 00:32:23,480 --> 00:32:26,040 Speaker 3: they were currently detecting and acting on it. So that was 607 00:32:26,560 --> 00:32:30,680 Speaker 3: one. Second, we showed that, in fact, when we then 608 00:32:30,760 --> 00:32:35,560 Speaker 3: implemented the system, we saw meaningful reduction in treatment timing, 609 00:32:35,760 --> 00:32:39,800 Speaker 3: like patients were getting treatment in a more timely fashion 610 00:32:39,800 --> 00:32:42,200 Speaker 3: when providers were seeing the alert and acting off of it. 611 00:32:43,080 --> 00:32:46,640 Speaker 3: And then the third: we know early detection is possible 612 00:32:46,680 --> 00:32:49,440 Speaker 3: now and we know treatment timing has moved, and we've 613 00:32:49,480 --> 00:32:51,760 Speaker 3: known in sepsis that early treatment is the key to 614 00:32:51,800 --> 00:32:53,680 Speaker 3: better outcomes. So the question is, do we see that in 615 00:32:53,720 --> 00:32:56,520 Speaker 3: our population as well? And we saw that in patients 616 00:32:56,560 --> 00:33:01,120 Speaker 3: who actually got, you know, early alerts, who got 617 00:33:01,120 --> 00:33:03,920 Speaker 3: the alerts and whose providers acted on them, we actually saw 618 00:33:04,000 --> 00:33:08,880 Speaker 3: much better outcomes in terms of reductions in mortality, morbidity, 619 00:33:09,320 --> 00:33:13,120 Speaker 3: length of stay, fewer complications, secondary complications that arise out 620 00:33:13,120 --> 00:33:17,320 Speaker 3: of sepsis.
So it was just extremely exciting to see 621 00:33:17,800 --> 00:33:21,479 Speaker 3: that we could go from, you know, a technical idea 622 00:33:22,000 --> 00:33:24,560 Speaker 3: to actual outcomes. And then one of the most interesting 623 00:33:24,560 --> 00:33:29,120 Speaker 3: things we'd studied here was adoption. Will clinicians adopt? It 624 00:33:29,160 --> 00:33:32,440 Speaker 3: was a very real world study to show, like, can 625 00:33:32,440 --> 00:33:35,720 Speaker 3: a system like this actually work? And we showed ninety 626 00:33:35,800 --> 00:33:39,280 Speaker 3: percent physician adoption. So that was extremely exciting to see. 627 00:33:39,320 --> 00:33:41,920 Speaker 3: And that's what, you know, was 628 00:33:42,000 --> 00:33:43,360 Speaker 3: about closing the trust gap. 629 00:33:43,760 --> 00:33:47,800 Speaker 2: So, okay, so you published this paper whatever, a year 630 00:33:47,800 --> 00:33:50,200 Speaker 2: and a half ago. Where are you now? What's your 631 00:33:50,200 --> 00:33:50,760 Speaker 2: company doing? 632 00:33:50,880 --> 00:33:54,360 Speaker 3: One thing that's also important, that I didn't cover earlier, 633 00:33:54,400 --> 00:33:57,600 Speaker 3: is that we expanded the system dramatically from not just 634 00:33:57,680 --> 00:34:02,880 Speaker 3: working on sepsis, but to a variety of other conditions like sepsis, 635 00:34:03,160 --> 00:34:06,880 Speaker 3: where there is very significant both clinical benefit but also 636 00:34:07,000 --> 00:34:09,520 Speaker 3: financial benefit for the health system. The reason the financial 637 00:34:09,560 --> 00:34:12,240 Speaker 3: piece matters is, you know, ultimately health systems are working 638 00:34:12,239 --> 00:34:14,719 Speaker 3: on one to two percent margins. For them to be able 639 00:34:14,760 --> 00:34:17,879 Speaker 3: to implement systems that actually improve care, they still need 640 00:34:17,880 --> 00:34:21,240 Speaker 3: to be able to financially justify that this can be done, 641 00:34:21,680 --> 00:34:23,000 Speaker 3: and that was crucial. 642 00:34:23,239 --> 00:34:26,080 Speaker 2: So what are some of the other things you're working 643 00:34:26,080 --> 00:34:27,279 Speaker 2: on besides sepsis now? 644 00:34:27,880 --> 00:34:32,280 Speaker 3: Like, another example area is pressure ulcers. Okay, huge area where. 645 00:34:32,120 --> 00:34:37,560 Speaker 5: Like bedsores, exactly. Like, it's an area where 646 00:34:37,600 --> 00:34:40,799 Speaker 5: again huge patient impact in terms of, like, you know, 647 00:34:40,960 --> 00:34:42,880 Speaker 5: if you do end up getting a serious bedsore, 648 00:34:42,920 --> 00:34:45,839 Speaker 5: how detrimental it is for the patient, sometimes leading to death, 649 00:34:45,920 --> 00:34:48,279 Speaker 5: sometimes leading to the need for amputation. 650 00:34:48,800 --> 00:34:53,160 Speaker 3: But even more interestingly, huge burden on the caregivers themselves, 651 00:34:53,200 --> 00:34:56,440 Speaker 3: like nurses today have to do a huge amount of 652 00:34:56,440 --> 00:34:58,919 Speaker 3: work to take care of these patients.
Like today, there 653 00:34:58,800 --> 00:35:01,560 Speaker 3: are lots of scenarios where these patients are missed, and 654 00:35:01,560 --> 00:35:03,919 Speaker 3: there's an opportunity where you can actually use the data 655 00:35:03,920 --> 00:35:07,640 Speaker 3: to identify those at higher risk and start, again, implementing these 656 00:35:07,680 --> 00:35:10,600 Speaker 3: new ways in which you can do targeted, you know, 657 00:35:11,120 --> 00:35:12,160 Speaker 3: preventative measures. 658 00:35:12,440 --> 00:35:16,120 Speaker 2: What has to happen for you to, you know, for 659 00:35:16,200 --> 00:35:18,840 Speaker 2: your software to get adopted at hospitals all around the country? 660 00:35:18,920 --> 00:35:21,839 Speaker 2: Like, I buy that it's helpful. How do you get 661 00:35:21,840 --> 00:35:24,120 Speaker 2: from it being a kind of researchy thing to being 662 00:35:24,160 --> 00:35:25,600 Speaker 2: a thing that everybody uses? 663 00:35:25,760 --> 00:35:28,640 Speaker 3: So the hurdles we needed to cross were: one, we 664 00:35:28,680 --> 00:35:30,480 Speaker 3: needed to figure out a way to get approvals from 665 00:35:30,520 --> 00:35:32,560 Speaker 3: the electronic health records to be able to integrate it. 666 00:35:32,680 --> 00:35:33,040 Speaker 3: We did. 667 00:35:33,040 --> 00:35:35,080 Speaker 2: That took a couple of years, from like just 668 00:35:35,160 --> 00:35:37,680 Speaker 2: the big software makers, Epic, whatever, the companies that make 669 00:35:37,719 --> 00:35:40,240 Speaker 2: the electronic health records. They have to say yes. Okay, 670 00:35:40,280 --> 00:35:43,600 Speaker 2: so that's done. Check. Great, what has to happen next? Yeah? 671 00:35:43,719 --> 00:35:46,040 Speaker 3: Next, you need a system that is able to, you know, 672 00:35:46,080 --> 00:35:47,600 Speaker 3: when you go from one site to the next, to 673 00:35:47,600 --> 00:35:49,239 Speaker 3: the next, to the next, you need the ability to 674 00:35:49,239 --> 00:35:51,320 Speaker 3: be able to measure and generalize as you go cross 675 00:35:51,320 --> 00:35:52,680 Speaker 3: site and reliably perform. 676 00:35:53,120 --> 00:35:55,200 Speaker 2: So it has to work in lots of different kinds 677 00:35:55,239 --> 00:35:58,840 Speaker 2: of hospitals that collect different kinds of data in different settings. 678 00:35:58,800 --> 00:36:00,880 Speaker 3: And in our partnerships we've shown that with data. 679 00:36:01,040 --> 00:36:02,840 Speaker 2: Okay, third, check. 680 00:36:02,719 --> 00:36:05,040 Speaker 3: Like I said, we have to show that basically people 681 00:36:05,080 --> 00:36:07,239 Speaker 3: will adopt in these different environments. So we have data 682 00:36:07,280 --> 00:36:08,120 Speaker 3: to show that. Okay. 683 00:36:08,880 --> 00:36:09,080 Speaker 2: Four. 684 00:36:09,680 --> 00:36:12,480 Speaker 3: In some of these areas you need FDA approval, and 685 00:36:12,600 --> 00:36:14,600 Speaker 3: in the areas where we need the approval, we're working 686 00:36:14,600 --> 00:36:15,960 Speaker 3: with the FDA to get those approvals. 687 00:36:16,000 --> 00:36:19,600 Speaker 2: Okay. So that's kind of the next step. Correct. 688 00:36:19,480 --> 00:36:21,960 Speaker 3: And then once that's done, you can now start to, 689 00:36:22,200 --> 00:36:25,839 Speaker 3: you know, it's available, it can be marketed, you can 690 00:36:25,880 --> 00:36:28,880 Speaker 3: scale it nationally. All very exciting things.
691 00:36:29,200 --> 00:36:35,719 Speaker 2: So if things go well for you, what will the 692 00:36:35,760 --> 00:36:38,080 Speaker 2: world look like in, say, five 693 00:36:37,880 --> 00:36:41,439 Speaker 3: years? Oh my god, so exciting. I think we will 694 00:36:41,480 --> 00:36:46,280 Speaker 3: actually be implemented at sixty, seventy, eighty percent of the market, 695 00:36:46,360 --> 00:36:50,920 Speaker 3: I hope, in the US. What's interesting now is like, 696 00:36:50,960 --> 00:36:52,719 Speaker 3: you know, healthcare is a market which is a leader 697 00:36:52,760 --> 00:36:56,279 Speaker 3: follower market, and once you show things that work, 698 00:36:56,360 --> 00:36:59,080 Speaker 3: it makes logical sense. You have the proof points, you've 699 00:36:59,080 --> 00:37:01,719 Speaker 3: tackled most of the issues that people struggle with. 700 00:37:02,520 --> 00:37:04,360 Speaker 3: Then this is an area where you can scale. And 701 00:37:04,400 --> 00:37:06,160 Speaker 3: when it comes to like the areas we're working in, 702 00:37:06,160 --> 00:37:09,360 Speaker 3: which is clinical, unlike some of the others like billing 703 00:37:09,400 --> 00:37:12,799 Speaker 3: and messaging and back office, you know, the years of 704 00:37:12,840 --> 00:37:15,560 Speaker 3: development required to build what we build is very long. 705 00:37:15,640 --> 00:37:17,600 Speaker 3: Like it's taken us eight to nine years to do 706 00:37:17,640 --> 00:37:19,520 Speaker 3: all the pieces necessary to get to where we are, 707 00:37:19,560 --> 00:37:21,920 Speaker 3: so there aren't a lot of, like, other competitors 708 00:37:21,920 --> 00:37:22,400 Speaker 3: in the market. 709 00:37:22,480 --> 00:37:24,799 Speaker 2: You have a moat, and FDA approval is going to 710 00:37:24,800 --> 00:37:25,520 Speaker 2: be even more of a 711 00:37:25,480 --> 00:37:28,640 Speaker 3: moat, among other things, exactly. So we have a very, 712 00:37:29,040 --> 00:37:33,000 Speaker 3: very significant, like, moat and hurdles people have to cross 713 00:37:33,040 --> 00:37:35,640 Speaker 3: to really get it to work, and we've invested in them. 714 00:37:36,120 --> 00:37:39,880 Speaker 2: And so in your happy five year future, most of 715 00:37:39,920 --> 00:37:42,960 Speaker 2: the hospitals in the country will be using your software, 716 00:37:43,000 --> 00:37:48,279 Speaker 2: your models to detect sepsis, to detect bedsores earlier, and 717 00:37:48,960 --> 00:37:49,279 Speaker 2: in a 718 00:37:49,239 --> 00:37:51,799 Speaker 3: variety of other conditions. Like we've looked at our 719 00:37:51,840 --> 00:37:54,640 Speaker 3: own financial models and shown that, like, a, you know, 720 00:37:54,760 --> 00:37:59,560 Speaker 3: modest four to five hospital health system stands to gain 721 00:37:59,680 --> 00:38:03,040 Speaker 3: like fifty to two hundred million dollars from the implementation of 722 00:38:03,080 --> 00:38:06,239 Speaker 3: our system in some of, you know, the condition areas we're tackling. 723 00:38:06,000 --> 00:38:08,480 Speaker 2: And people will die less and be less sick as 724 00:38:08,480 --> 00:38:09,560 Speaker 2: a benefit also. 725 00:38:09,800 --> 00:38:12,640 Speaker 3: And that is honestly the biggest maturity I've had in 726 00:38:12,680 --> 00:38:16,240 Speaker 3: building this company.
I started from like the cause of caring, 727 00:38:16,560 --> 00:38:20,719 Speaker 3: and it was realizing, like, it's funny, in healthcare they're 728 00:38:20,760 --> 00:38:23,640 Speaker 3: so used to caring for patients who are dying every day. 729 00:38:23,680 --> 00:38:27,600 Speaker 3: They've gotten desensitized. You then come back to realizing you 730 00:38:27,680 --> 00:38:30,520 Speaker 3: need the other things to follow, like the money. You 731 00:38:30,520 --> 00:38:32,320 Speaker 3: need to figure out a way to make it easy 732 00:38:32,360 --> 00:38:34,920 Speaker 3: for them to do the right thing. And when you 733 00:38:35,000 --> 00:38:38,279 Speaker 3: do that, then they do actually care about doing the 734 00:38:38,360 --> 00:38:39,960 Speaker 3: right thing, because that's why they were there in the 735 00:38:40,000 --> 00:38:40,560 Speaker 3: first place. 736 00:38:43,600 --> 00:38:53,080 Speaker 2: We'll be back in a minute with the lightning round. Okay, 737 00:38:53,880 --> 00:38:56,439 Speaker 2: I'm going to keep you another two minutes or something 738 00:38:56,480 --> 00:39:00,120 Speaker 2: to do a lightning round. You went to college at 739 00:39:00,160 --> 00:39:03,920 Speaker 2: Mount Holyoke, an all women's college. Yeah, and so I'm curious, 740 00:39:03,960 --> 00:39:07,360 Speaker 2: what is one thing you would tell someone considering attending 741 00:39:07,360 --> 00:39:08,440 Speaker 2: an all women's college? 742 00:39:08,520 --> 00:39:10,919 Speaker 3: Oh, I loved Mount Holyoke. It was so much fun. 743 00:39:10,960 --> 00:39:13,279 Speaker 3: It's where I got my confidence that I could do 744 00:39:13,360 --> 00:39:16,279 Speaker 3: really, really hard things and not be, you know, not 745 00:39:16,280 --> 00:39:17,319 Speaker 3: feel defeated. 746 00:39:17,600 --> 00:39:21,439 Speaker 2: If you weren't working in healthcare, where would you be 747 00:39:21,520 --> 00:39:22,640 Speaker 2: trying to apply AI? 748 00:39:23,880 --> 00:39:26,280 Speaker 3: Oh my god, I've just been so obsessed with healthcare 749 00:39:26,360 --> 00:39:28,960 Speaker 3: for the last decade. I haven't really lifted my head 750 00:39:29,000 --> 00:39:31,120 Speaker 3: to think about other things. I mean, honestly, there are 751 00:39:31,120 --> 00:39:35,120 Speaker 3: a million areas you could apply it, but I don't 752 00:39:35,160 --> 00:39:37,080 Speaker 3: like thinking about it, because it's just that the need 753 00:39:37,160 --> 00:39:39,080 Speaker 3: is so dire in health care, and it's so hard. 754 00:39:39,160 --> 00:39:40,880 Speaker 3: It's so hard for an AI researcher to focus in 755 00:39:40,880 --> 00:39:44,359 Speaker 3: healthcare because they don't make it easy. You can make 756 00:39:44,440 --> 00:39:46,719 Speaker 3: a lot more money doing the same kind of things 757 00:39:46,760 --> 00:39:49,000 Speaker 3: in finance. You can get the data more easily, you 758 00:39:49,040 --> 00:39:51,840 Speaker 3: can make money off of it more easily. Like, it 759 00:39:51,920 --> 00:39:54,320 Speaker 3: is annoying. It is really annoying. 760 00:39:54,719 --> 00:39:56,920 Speaker 2: Is ChatGPT overrated or underrated? 761 00:39:57,600 --> 00:39:59,719 Speaker 3: Actually, I think it's underrated. 762 00:40:00,000 --> 00:40:03,319 Speaker 2: Okay, go on. I think, you know. 763 00:40:04,160 --> 00:40:06,279 Speaker 3: When we see the math, we're like, okay, that's the math. 764 00:40:06,320 --> 00:40:08,920 Speaker 3: That's interesting to me.
What was really informative was like 765 00:40:08,960 --> 00:40:13,440 Speaker 3: the experience, the social experience. It was so exciting to 766 00:40:13,480 --> 00:40:16,480 Speaker 3: see people who first interacted with it and, you know, 767 00:40:16,560 --> 00:40:19,879 Speaker 3: have their mind be blown by the experience. And 768 00:40:20,320 --> 00:40:23,400 Speaker 3: that's sort of then informing how important the user experience 769 00:40:23,440 --> 00:40:25,839 Speaker 3: part of this is. Like, you know, we had some 770 00:40:25,880 --> 00:40:28,319 Speaker 3: of the chatbot technology before, we had some of the 771 00:40:28,880 --> 00:40:32,080 Speaker 3: interactive pieces, but it's sort of how OpenAI designed it 772 00:40:32,120 --> 00:40:36,560 Speaker 3: and the use cases, like storytelling, poems, like the use 773 00:40:36,600 --> 00:40:39,120 Speaker 3: cases where they trained the system to be very good 774 00:40:39,160 --> 00:40:43,760 Speaker 3: at being conversant, that was what made the experience so exciting, 775 00:40:43,800 --> 00:40:47,360 Speaker 3: because then people could start, you know, like, experiencing it 776 00:40:47,400 --> 00:40:50,200 Speaker 3: themselves, and that sort of opened up their mind to 777 00:40:50,200 --> 00:40:51,040 Speaker 3: what else could it do? 778 00:40:51,320 --> 00:40:53,920 Speaker 2: Analogous to the lesson you were talking about in your 779 00:40:53,960 --> 00:40:57,680 Speaker 2: own work, where getting the answer right, figuring out if 780 00:40:57,680 --> 00:41:01,080 Speaker 2: the person has sepsis, is actually only part of 781 00:41:01,120 --> 00:41:01,839 Speaker 2: what you have to 782 00:41:01,800 --> 00:41:05,719 Speaker 3: do. Huge. And that's, I think, where AI as a 783 00:41:05,800 --> 00:41:07,719 Speaker 3: field has a lot of growing up 784 00:41:07,760 --> 00:41:10,320 Speaker 3: to do, because historically the people who entered this field 785 00:41:10,400 --> 00:41:14,920 Speaker 3: are, you know, they gravitate towards the math, they gravitate 786 00:41:14,920 --> 00:41:17,719 Speaker 3: towards the hard science. But what they don't realize is 787 00:41:17,800 --> 00:41:21,880 Speaker 3: ultimately it is a people problem that you're solving. You 788 00:41:21,960 --> 00:41:24,200 Speaker 3: have to get people to love it. You have to 789 00:41:24,200 --> 00:41:26,920 Speaker 3: get people to incorporate it in their daily lives for 790 00:41:26,960 --> 00:41:29,880 Speaker 3: this to be successful, and you have to operate in 791 00:41:29,920 --> 00:41:33,239 Speaker 3: a world which is not very precise, like people have 792 00:41:33,280 --> 00:41:35,560 Speaker 3: their faults and their mistakes and they work in a 793 00:41:35,560 --> 00:41:37,439 Speaker 3: particular way, and you've got to get this thing to fit. 794 00:41:41,960 --> 00:41:44,839 Speaker 2: Suchi Saria is a professor at Johns Hopkins and the 795 00:41:44,880 --> 00:41:50,400 Speaker 2: founder and CEO of Bayesian Health. Today's show was produced 796 00:41:50,440 --> 00:41:54,560 Speaker 2: by Edith Russlo and Gabriel Hunter Chang. It was edited 797 00:41:54,560 --> 00:41:58,359 Speaker 2: by Karen Chakerji and engineered by Sarah Bruguer. You can 798 00:41:58,400 --> 00:42:02,120 Speaker 2: email us at problem at Pushkin dot Fm. I'm 799 00:42:02,160 --> 00:42:04,840 Speaker 2: Jacob Goldstein and we'll be back next week with another 800 00:42:04,880 --> 00:42:12,680 Speaker 2: episode of What's Your Problem.