1 00:00:15,356 --> 00:00:23,516 Speaker 1: Pushkin. When you walk into a hospital, technology is everywhere. 2 00:00:23,716 --> 00:00:25,756 Speaker 1: In one room, a surgeon is giving a patient a 3 00:00:25,796 --> 00:00:29,716 Speaker 1: bionic knee. In another room, a CT scanner is creating 4 00:00:29,716 --> 00:00:32,676 Speaker 1: this incredible three D picture of the inside of a 5 00:00:32,716 --> 00:00:37,636 Speaker 1: person's body. But in other places the hospital feels less 6 00:00:37,676 --> 00:00:42,636 Speaker 1: high tech. Doctors are still reading patients' charts and making 7 00:00:42,676 --> 00:00:46,956 Speaker 1: decisions partly on evidence but largely on instinct. This part 8 00:00:46,996 --> 00:00:49,116 Speaker 1: of the hospital is not so different from what it 9 00:00:49,196 --> 00:00:52,756 Speaker 1: might have looked like fifty years ago, and bringing new 10 00:00:52,836 --> 00:00:56,476 Speaker 1: technology to this part of medicine, to care at the 11 00:00:56,476 --> 00:01:01,196 Speaker 1: bedside, is a really hard, really interesting problem, because you 12 00:01:01,316 --> 00:01:03,676 Speaker 1: not only have to figure out how to use technology 13 00:01:03,756 --> 00:01:07,076 Speaker 1: to deliver useful information to the doctor at the right time, 14 00:01:07,676 --> 00:01:10,396 Speaker 1: you also have to figure out how to convince the 15 00:01:10,476 --> 00:01:19,836 Speaker 1: doctor that the information is actually worth listening to. I'm 16 00:01:19,916 --> 00:01:22,396 Speaker 1: Jacob Goldstein and this is What's Your Problem, the show 17 00:01:22,396 --> 00:01:24,676 Speaker 1: where I talk to people who are trying to make 18 00:01:24,796 --> 00:01:29,276 Speaker 1: technological progress. My guest today is Suchi Saria. She's the 19 00:01:29,356 --> 00:01:32,516 Speaker 1: founder and CEO of a company called Bayesian Health, and 20 00:01:32,556 --> 00:01:35,716 Speaker 1: she's also a professor at Johns Hopkins, where she runs 21 00:01:35,756 --> 00:01:40,316 Speaker 1: a lab focused on machine learning and healthcare. Suchi's problem 22 00:01:40,396 --> 00:01:44,236 Speaker 1: is this: how can you use artificial intelligence to detect 23 00:01:44,476 --> 00:01:48,676 Speaker 1: when hospital patients are at risk of potentially deadly complications? 24 00:01:49,276 --> 00:01:51,796 Speaker 1: And then once you've done that, how can you get 25 00:01:51,836 --> 00:01:55,476 Speaker 1: doctors to believe that the AI's warning is worth paying 26 00:01:55,516 --> 00:01:58,716 Speaker 1: attention to. She told me she first got interested in 27 00:01:58,756 --> 00:02:01,396 Speaker 1: healthcare sort of by accident, when she was a grad 28 00:02:01,396 --> 00:02:04,356 Speaker 1: student at Stanford studying AI and robots. 29 00:02:08,156 --> 00:02:11,196 Speaker 2: You know, I grew up actually being fascinated by AI. I 30 00:02:11,236 --> 00:02:13,876 Speaker 2: loved AI, and really most of my interest was on 31 00:02:13,996 --> 00:02:16,436 Speaker 2: the algorithm front, and like looking at robotics and building 32 00:02:16,556 --> 00:02:19,436 Speaker 2: robots that were really smart, you know.
And I got 33 00:02:19,476 --> 00:02:24,036 Speaker 2: acquainted with medicine through a friend and colleague who was a 34 00:02:24,076 --> 00:02:27,996 Speaker 2: doctor taking care of babies. And what I learned through 35 00:02:28,036 --> 00:02:31,076 Speaker 2: her was that there is all this data we're starting 36 00:02:31,076 --> 00:02:35,796 Speaker 2: to collect, but literally nobody was designing any software 37 00:02:35,836 --> 00:02:38,596 Speaker 2: to make sense of it. So I was just coming 38 00:02:38,636 --> 00:02:42,956 Speaker 2: from a world where, you know, I studied all kinds 39 00:02:42,996 --> 00:02:48,396 Speaker 2: of data day in, day out, with robots doing fun 40 00:02:48,516 --> 00:02:50,756 Speaker 2: tasks like getting the robot to hold the ball or 41 00:02:50,796 --> 00:02:54,556 Speaker 2: juggle the ball, to then realizing, holy crap, there's like 42 00:02:54,996 --> 00:02:57,196 Speaker 2: so many more useful things we could be doing. So 43 00:02:57,196 --> 00:03:00,636 Speaker 2: that was really my first discovery of like how big 44 00:03:00,636 --> 00:03:03,156 Speaker 2: a gap there was between people who thought about AI 45 00:03:03,276 --> 00:03:06,196 Speaker 2: versus the problems that needed to be solved, 46 00:03:06,196 --> 00:03:08,156 Speaker 2: and how little we understood about these problems. 47 00:03:08,636 --> 00:03:11,716 Speaker 1: So so you decide that this is going to be 48 00:03:11,796 --> 00:03:13,996 Speaker 1: your thing, right, this is your life's work now. 49 00:03:14,156 --> 00:03:17,036 Speaker 2: I mean in the beginning, I wasn't convinced. In the beginning, 50 00:03:17,156 --> 00:03:20,196 Speaker 2: it was just about spending a few years helping out 51 00:03:20,436 --> 00:03:22,556 Speaker 2: and making sure we were able to make, you know... 52 00:03:22,996 --> 00:03:25,116 Speaker 2: in the beginning, it was about my next three years. 53 00:03:25,836 --> 00:03:28,556 Speaker 2: Like I was afraid of investing. I was afraid of 54 00:03:28,596 --> 00:03:31,756 Speaker 2: the complexity of medicine. Like it wasn't an easy field. 55 00:03:31,756 --> 00:03:35,036 Speaker 2: It's not one where they welcome you, right, just as 56 00:03:35,076 --> 00:03:37,596 Speaker 2: an engineer, you don't come in and like... at least 57 00:03:37,636 --> 00:03:40,036 Speaker 2: twelve, thirteen years ago, that wasn't the culture. 58 00:03:39,756 --> 00:03:42,996 Speaker 1: Right, like, an MD, an MD at 59 00:03:42,996 --> 00:03:46,196 Speaker 1: a hospital does not want to hear from some AI researcher. 60 00:03:46,236 --> 00:03:50,436 Speaker 2: They're busy. Oh no, for sure. And they're like, we're busy, 61 00:03:50,516 --> 00:03:51,516 Speaker 2: we have real work to do. 62 00:03:51,796 --> 00:03:53,316 Speaker 1: Yeah, what is this? 63 00:03:53,756 --> 00:03:56,036 Speaker 2: Like this all sounds like esoteric mumbo jumbo. 64 00:03:56,356 --> 00:03:59,556 Speaker 1: Yeah. And so you say, you know, we're collecting all 65 00:03:59,556 --> 00:04:02,116 Speaker 1: this data in healthcare and we're not doing anything with it. 66 00:04:02,956 --> 00:04:06,956 Speaker 1: That is not intuitive. Like, that's not... you know, I 67 00:04:06,956 --> 00:04:08,956 Speaker 1: think most people's sort of prior on this is that 68 00:04:09,156 --> 00:04:11,756 Speaker 1: at an academic hospital, right, your friend is at Stanford Hospital, 69 00:04:11,756 --> 00:04:14,636 Speaker 1: a very prestigious academic hospital.
I think Stanford Hospital, I 70 00:04:14,636 --> 00:04:17,236 Speaker 1: think data. I think these are people doing research. So 71 00:04:17,276 --> 00:04:18,996 Speaker 1: what do you mean when you say we're collecting all 72 00:04:19,036 --> 00:04:20,476 Speaker 1: this data and not doing anything with it? 73 00:04:20,676 --> 00:04:23,796 Speaker 2: Yeah. So twelve, thirteen, fourteen years ago, this field was 74 00:04:23,916 --> 00:04:27,396 Speaker 2: very new, and at the time, even collecting and storing 75 00:04:27,436 --> 00:04:30,876 Speaker 2: this data, the natural question was, can we afford it? It costs 76 00:04:30,916 --> 00:04:33,196 Speaker 2: dollars to store this data. Why would we do that? 77 00:04:33,436 --> 00:04:35,036 Speaker 1: And when you say... what kind of data are 78 00:04:35,036 --> 00:04:36,996 Speaker 1: you talking about here? When you say collect and store 79 00:04:37,036 --> 00:04:37,876 Speaker 1: this data? 80 00:04:37,956 --> 00:04:41,476 Speaker 2: So literally, this was at the time babies entering, you know, 81 00:04:41,516 --> 00:04:44,436 Speaker 2: the neonatal ICU. These are premature babies. As they're 82 00:04:44,436 --> 00:04:47,716 Speaker 2: born, in real time, devices are collecting heart rate and 83 00:04:47,796 --> 00:04:51,636 Speaker 2: vitals and oxygen saturation data, and so that 84 00:04:51,716 --> 00:04:54,596 Speaker 2: kind of detailed data, which is much more bulky, was 85 00:04:54,756 --> 00:04:57,276 Speaker 2: historically not stored. Instead, what they would do is they'd 86 00:04:57,276 --> 00:05:01,756 Speaker 2: take like fifteen minute averages and capture that. Okay. And 87 00:05:02,196 --> 00:05:04,236 Speaker 2: naturally the question came up, do we need to store it? 88 00:05:04,276 --> 00:05:06,636 Speaker 2: This is really expensive data. Let's just throw it away 89 00:05:07,036 --> 00:05:09,236 Speaker 2: after forty eight hours, we don't need it anymore, or 90 00:05:09,316 --> 00:05:10,796 Speaker 2: let's just keep a quick summary of it. 91 00:05:10,956 --> 00:05:12,996 Speaker 1: Huh. So you might do a study, you might track 92 00:05:13,116 --> 00:05:15,636 Speaker 1: certain data points, but the idea that you're going to, 93 00:05:15,756 --> 00:05:18,996 Speaker 1: just as a matter of course, be storing all of 94 00:05:19,036 --> 00:05:22,956 Speaker 1: this data that is now being generated and saved because 95 00:05:23,356 --> 00:05:27,276 Speaker 1: electronic medical records are just being adopted. Nobody was doing that. 96 00:05:27,356 --> 00:05:28,876 Speaker 1: Nobody had really thought to do it. It was an 97 00:05:28,916 --> 00:05:31,796 Speaker 1: expensive prospect. It didn't seem like there would be a 98 00:05:31,796 --> 00:05:33,796 Speaker 1: good reason to do it. Exactly.
99 00:05:34,196 --> 00:05:39,036 Speaker 2: And coming from AI, where we looked at, you know, 100 00:05:39,156 --> 00:05:43,596 Speaker 2: fingerprint data on the internet, in retail or finance, then 101 00:05:43,836 --> 00:05:47,116 Speaker 2: you know, it was so natural to think 102 00:05:47,156 --> 00:05:50,756 Speaker 2: about how this data teaches you things, that it felt 103 00:05:51,036 --> 00:05:54,156 Speaker 2: crazy to me that, like, we could similarly learn all 104 00:05:54,196 --> 00:05:57,276 Speaker 2: sorts of amazing things about these babies, or the human body, 105 00:05:57,356 --> 00:05:59,596 Speaker 2: or how we evolve, or like what are the signs 106 00:05:59,596 --> 00:06:02,276 Speaker 2: and fingerprints of disease? How do they show up? 107 00:06:02,516 --> 00:06:05,276 Speaker 1: When you say fingerprint data, that's a metaphor, right, 108 00:06:05,476 --> 00:06:08,276 Speaker 1: what does fingerprint data mean in the context of sort 109 00:06:08,316 --> 00:06:10,316 Speaker 1: of e-commerce, online finance? 110 00:06:11,076 --> 00:06:13,316 Speaker 2: Well, like they went to this site and then they 111 00:06:13,316 --> 00:06:16,316 Speaker 2: came to this site, or like they saw an ad 112 00:06:16,436 --> 00:06:19,396 Speaker 2: somewhere else about this, and now you know they're searching 113 00:06:20,276 --> 00:06:22,836 Speaker 2: for something, and it shows you intent. 114 00:06:22,436 --> 00:06:26,316 Speaker 1: It's this moment ten years ago when 115 00:06:26,316 --> 00:06:29,236 Speaker 1: people are using data to know like everything about what 116 00:06:29,316 --> 00:06:32,396 Speaker 1: I do when I'm shopping for new shoes. But 117 00:06:32,396 --> 00:06:35,236 Speaker 1: they're not collecting data on like sick newborn 118 00:06:35,276 --> 00:06:36,836 Speaker 1: babies. Exactly right. 119 00:06:37,236 --> 00:06:39,316 Speaker 2: Is that mind blowing to you? Because it was 120 00:06:40,036 --> 00:06:41,476 Speaker 2: crazy mind blowing to me. 121 00:06:41,916 --> 00:06:44,876 Speaker 1: Okay, yes, my mind is blown. So what do you 122 00:06:44,916 --> 00:06:47,036 Speaker 1: do? 123 00:06:47,436 --> 00:06:50,316 Speaker 2: Well, I mean, it seemed like such a pressing problem. It 124 00:06:50,356 --> 00:06:53,196 Speaker 2: also helped that we were funded as a moonshot project 125 00:06:53,276 --> 00:06:58,516 Speaker 2: by the Google founders, that it was a high profile investment, 126 00:06:58,596 --> 00:07:01,316 Speaker 2: and it sort of naturally paved the way. And at a place 127 00:07:01,396 --> 00:07:05,796 Speaker 2: like Stanford, with that curiosity, we had some amazing collaborators who 128 00:07:05,796 --> 00:07:09,876 Speaker 2: were equally curious, who said, well, let's dive in and 129 00:07:09,916 --> 00:07:13,876 Speaker 2: see what we'll understand. And that was the start of it.
130 00:07:14,396 --> 00:07:17,316 Speaker 2: I literally, you know, got hold of this massive, twelve 131 00:07:17,396 --> 00:07:21,596 Speaker 2: hundred page, like this huge, big book to learn about 132 00:07:21,676 --> 00:07:23,876 Speaker 2: babies and what conditions they experience and what does it 133 00:07:23,916 --> 00:07:25,916 Speaker 2: all mean, and then starting to understand how does it 134 00:07:25,916 --> 00:07:29,516 Speaker 2: show up in the data, and you know, spent evenings 135 00:07:29,556 --> 00:07:32,796 Speaker 2: and weekends, and actually I remember sitting in the basement 136 00:07:32,876 --> 00:07:38,596 Speaker 2: of Stanford Hospital over Christmas trying to work on 137 00:07:38,676 --> 00:07:40,676 Speaker 2: getting data out of the health record in 138 00:07:40,716 --> 00:07:42,636 Speaker 2: the first place. And we were trying to experiment with 139 00:07:42,676 --> 00:07:45,276 Speaker 2: all sorts of techniques for pulling the data out, which 140 00:07:45,556 --> 00:07:47,636 Speaker 2: you know now is a whole lot easier than it 141 00:07:47,716 --> 00:07:49,276 Speaker 2: was twelve years ago, because... 142 00:07:49,076 --> 00:07:52,396 Speaker 1: It's not built for that, right? It's basically built somewhat 143 00:07:52,396 --> 00:07:54,556 Speaker 1: to track the patient and to a significant degree to 144 00:07:54,636 --> 00:07:58,276 Speaker 1: like bill insurance. Right, that's traditionally what electronic medical records 145 00:07:58,316 --> 00:07:58,676 Speaker 1: were for. 146 00:07:59,036 --> 00:08:00,156 Speaker 2: That's exactly right. 147 00:08:00,156 --> 00:08:02,556 Speaker 1: Kind of amazing and kind of weird. I mean, I 148 00:08:02,556 --> 00:08:05,996 Speaker 1: want to talk more about the bigger idea of data 149 00:08:06,196 --> 00:08:11,476 Speaker 1: in healthcare, but just to kind of land this moment 150 00:08:11,996 --> 00:08:14,876 Speaker 1: early in your career at Stanford, like, is there some 151 00:08:15,036 --> 00:08:17,076 Speaker 1: project you do? Like what is the end of your 152 00:08:17,076 --> 00:08:17,956 Speaker 1: work at Stanford? 153 00:08:18,396 --> 00:08:22,436 Speaker 2: So the project was, you know, we're monitoring these premature 154 00:08:22,476 --> 00:08:26,556 Speaker 2: babies, right, anywhere between twenty four week old babies, which 155 00:08:26,596 --> 00:08:29,396 Speaker 2: are very very tiny, like very... 156 00:08:29,116 --> 00:08:31,876 Speaker 1: Twenty four weeks of gestation exactly. 157 00:08:31,956 --> 00:08:36,556 Speaker 2: To like twenty eight, thirty, thirty two. And the idea was, 158 00:08:36,596 --> 00:08:40,436 Speaker 2: these babies, you know, are like they're at risk for significant, 159 00:08:40,476 --> 00:08:43,596 Speaker 2: like an array of complications. And the idea is, the 160 00:08:43,716 --> 00:08:47,236 Speaker 2: sooner you know, the earlier you can do something about it, 161 00:08:47,356 --> 00:08:49,876 Speaker 2: the greater the chance that you're going to actually resuscitate them. 162 00:08:50,356 --> 00:08:52,476 Speaker 2: So our job was, like, could we look at this 163 00:08:52,556 --> 00:08:55,596 Speaker 2: data from the second they're born and collect this data 164 00:08:55,636 --> 00:08:59,596 Speaker 2: to start analyzing and modeling which babies are at risk for 165 00:08:59,636 --> 00:09:02,116 Speaker 2: which of these complications?
And if you could, then you 166 00:09:02,156 --> 00:09:06,236 Speaker 2: could start to put more of these preventative, prophylactic type 167 00:09:06,436 --> 00:09:08,916 Speaker 2: pathways or approaches in place for caring. 168 00:09:08,676 --> 00:09:12,556 Speaker 1: Basically identify problems more quickly, leading to better outcomes. That's 169 00:09:12,596 --> 00:09:14,836 Speaker 1: the basic desire. Exactly. 170 00:09:14,836 --> 00:09:17,876 Speaker 2: And in the process I discovered, like, you know, a 171 00:09:17,956 --> 00:09:21,276 Speaker 2: long time ago, there was a physician named Virginia Apgar, 172 00:09:21,796 --> 00:09:24,996 Speaker 2: and what she figured out is like, just by measuring 173 00:09:25,796 --> 00:09:29,316 Speaker 2: five different things from when the baby's born, she can 174 00:09:29,556 --> 00:09:31,596 Speaker 2: compute a very simple score that tells you how the 175 00:09:31,636 --> 00:09:35,836 Speaker 2: baby's doing. And so naturally, the question we asked is, okay, 176 00:09:35,876 --> 00:09:37,756 Speaker 2: so now that we are seeing all these ways in 177 00:09:37,756 --> 00:09:41,676 Speaker 2: which the machine learning and AI is discovering novel signs 178 00:09:41,676 --> 00:09:45,156 Speaker 2: and patterns that are predictive, could we just simply combine 179 00:09:45,196 --> 00:09:48,316 Speaker 2: this to come up with a simple score? Huh, one that says, 180 00:09:48,796 --> 00:09:51,596 Speaker 2: you know, can I predict complications? And what we found 181 00:09:51,676 --> 00:09:54,276 Speaker 2: was this new simple score, that uses data that's no 182 00:09:54,396 --> 00:09:56,676 Speaker 2: special thing you have to do, it's already being collected, 183 00:09:57,156 --> 00:09:59,076 Speaker 2: we just analyze it and we auto-compute the 184 00:09:59,116 --> 00:10:01,756 Speaker 2: score, turns out to be much more predictive than the 185 00:10:01,756 --> 00:10:04,036 Speaker 2: Apgar at predicting complications. 186 00:10:04,996 --> 00:10:07,876 Speaker 1: And so it worked. I mean, did people 187 00:10:07,956 --> 00:10:10,236 Speaker 1: use it? Is it standard of care now? What happened 188 00:10:10,236 --> 00:10:11,516 Speaker 1: with that research? 189 00:10:12,036 --> 00:10:13,716 Speaker 2: So at that point I was like, oh, this is 190 00:10:13,756 --> 00:10:16,556 Speaker 2: so cool. And literally we got all these journalists who 191 00:10:16,596 --> 00:10:18,236 Speaker 2: wanted to write about it, and it was on the 192 00:10:18,276 --> 00:10:22,316 Speaker 2: fundraising... you know, it was like Stanford's fundraising highlight for 193 00:10:22,356 --> 00:10:25,276 Speaker 2: like the next five years, et cetera. But what was 194 00:10:25,316 --> 00:10:27,076 Speaker 2: the saddest thing about it is that there was no 195 00:10:27,196 --> 00:10:31,356 Speaker 2: natural mechanism for implementing it in practice. And it had 196 00:10:31,396 --> 00:10:33,316 Speaker 2: to do with so many different pieces to it. Like 197 00:10:33,676 --> 00:10:36,956 Speaker 2: we didn't have the infrastructure, we didn't have the like 198 00:10:37,236 --> 00:10:39,716 Speaker 2: know-how of like how do you get physicians to 199 00:10:39,756 --> 00:10:42,156 Speaker 2: trust something like this? How do you build this in 200 00:10:42,196 --> 00:10:44,756 Speaker 2: a way that is trustworthy and reliable?
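For readers unfamiliar with the bedside score described above: the Apgar score sums five observations of a newborn (appearance, pulse, grimace, activity, respiration), each rated 0, 1, or 2, into a single 0-10 number. A minimal sketch of that arithmetic, with made-up example ratings, just to show how simple such a score is:

```python
# The real Apgar criteria; each is rated 0, 1, or 2 at the bedside.
APGAR_CRITERIA = ("appearance", "pulse", "grimace", "activity", "respiration")

def apgar_score(ratings: dict) -> int:
    """Sum the five 0-2 ratings into the familiar 0-10 Apgar score."""
    assert set(ratings) == set(APGAR_CRITERIA)
    assert all(r in (0, 1, 2) for r in ratings.values())
    return sum(ratings.values())

# Hypothetical newborn assessed one minute after birth (values are illustrative).
print(apgar_score({"appearance": 1, "pulse": 2, "grimace": 2,
                   "activity": 1, "respiration": 2}))  # -> 8
```

The new score described in the interview follows the same spirit, simple enough to use at the bedside, but built from machine-learned signals in the routinely collected data rather than five hand-picked observations.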
How do you 201 00:10:44,796 --> 00:10:46,996 Speaker 2: do this so that it's not just like a pet 202 00:10:47,076 --> 00:10:50,036 Speaker 2: project in one hospital, but it's like a system that 203 00:10:50,156 --> 00:10:54,156 Speaker 2: is scalable nationally? And you know, what is the incentive structure? 204 00:10:54,156 --> 00:10:56,036 Speaker 2: Who pays for it and why would they pay for it? 205 00:10:56,076 --> 00:10:59,676 Speaker 2: And all of that is literally what sort of got me, 206 00:10:59,996 --> 00:11:02,316 Speaker 2: like got me super interested in the field, where I 207 00:11:02,516 --> 00:11:04,716 Speaker 2: started to feel, wow, we're at the start of what 208 00:11:05,156 --> 00:11:09,396 Speaker 2: feels like a massive movement, it has many components to 209 00:11:09,436 --> 00:11:12,516 Speaker 2: be figured out, but we need to figure this out. Interestingly, 210 00:11:13,356 --> 00:11:16,996 Speaker 2: at the time on Sand Hill Road, you know, by virtue 211 00:11:17,036 --> 00:11:18,156 Speaker 2: of being in Palo Alto. 212 00:11:18,116 --> 00:11:21,716 Speaker 1: Yeah, Sand Hill Road, where all the venture capitalists are, exactly. 213 00:11:21,316 --> 00:11:24,796 Speaker 2: People were like, this is fantastic, here's money. Why don't 214 00:11:24,836 --> 00:11:28,236 Speaker 2: you start a company on this topic? And I spent 215 00:11:28,356 --> 00:11:31,756 Speaker 2: six months investigating, you know, talking to lots of peers, 216 00:11:33,356 --> 00:11:38,596 Speaker 2: health systems, hospitals, and realizing we were just too early. There's 217 00:11:38,636 --> 00:11:40,596 Speaker 2: a lot of work that needs to go in place 218 00:11:40,836 --> 00:11:43,756 Speaker 2: for this to become something that will scale nationally. Now 219 00:11:43,796 --> 00:11:45,276 Speaker 2: fast forward ten years. 220 00:11:45,116 --> 00:11:46,836 Speaker 1: Later, I want to fast forward, but give me just 221 00:11:46,876 --> 00:11:50,276 Speaker 1: another moment. When you say it's too early, like in 222 00:11:50,316 --> 00:11:53,356 Speaker 1: what ways was it too early? Like specifically, what was 223 00:11:53,436 --> 00:11:56,236 Speaker 1: not ready in the world to start a company at 224 00:11:56,236 --> 00:11:56,596 Speaker 1: that time? 225 00:11:56,676 --> 00:11:58,636 Speaker 2: So the first thing we needed is for hospitals to 226 00:11:58,636 --> 00:12:00,916 Speaker 2: be ready to implement a system like that. For that 227 00:12:00,956 --> 00:12:04,116 Speaker 2: to happen, they needed to have implemented the electronic health record, 228 00:12:05,036 --> 00:12:07,516 Speaker 2: be stable users of the EHR, so that they'd be 229 00:12:07,516 --> 00:12:09,756 Speaker 2: willing to plug in third party systems on top of it. 230 00:12:09,996 --> 00:12:12,956 Speaker 1: And it's kind of amazing that ten years ago, you know, 231 00:12:13,316 --> 00:12:19,156 Speaker 1: twenty whatever, twenty teens, still hospitals were not sort of 232 00:12:19,276 --> 00:12:23,316 Speaker 1: ubiquitous users of electronic medical records, right, like doctors were 233 00:12:23,356 --> 00:12:24,556 Speaker 1: still writing on paper.
234 00:12:25,276 --> 00:12:28,196 Speaker 2: Honestly, coming from computer science, where I did, you know, 235 00:12:28,236 --> 00:12:30,836 Speaker 2: where I was involved in other areas of AI and 236 00:12:30,876 --> 00:12:34,356 Speaker 2: computer science, like this was like the biggest like shift 237 00:12:34,636 --> 00:12:38,156 Speaker 2: in mindset. I felt every time I came back into 238 00:12:38,156 --> 00:12:39,956 Speaker 2: the healthcare side of the equation, it felt like I 239 00:12:40,036 --> 00:12:42,836 Speaker 2: was going at least twenty, thirty years back. 240 00:12:42,796 --> 00:12:45,036 Speaker 1: Right, like a time machine going into the past 241 00:12:45,076 --> 00:12:48,636 Speaker 1: when you walk into the hospital, which is particularly, I 242 00:12:48,676 --> 00:12:53,356 Speaker 1: don't know, ironic, surprising, given how in some ways healthcare 243 00:12:53,436 --> 00:12:56,956 Speaker 1: feels very cutting edge, right? Like, a central interesting thing 244 00:12:56,996 --> 00:13:00,596 Speaker 1: to me about the work that you do is the 245 00:13:00,596 --> 00:13:02,916 Speaker 1: way in which healthcare is... you know, you go get 246 00:13:02,956 --> 00:13:06,396 Speaker 1: a, whatever, a CT scan. It's this incredible machine and 247 00:13:06,436 --> 00:13:10,476 Speaker 1: it uploads to a computer and a, whatever, AI radiologist 248 00:13:10,476 --> 00:13:13,076 Speaker 1: can read the scan, blah blah blah. And yet 249 00:13:14,196 --> 00:13:17,156 Speaker 1: on the kind of data side, on the complicated patient 250 00:13:17,236 --> 00:13:20,636 Speaker 1: at the bedside side, it's still very kind of old 251 00:13:20,676 --> 00:13:22,316 Speaker 1: fashioned and almost artisanal. 252 00:13:23,396 --> 00:13:26,276 Speaker 2: I mean, you raised like a fantastic point, which is 253 00:13:26,556 --> 00:13:31,076 Speaker 2: I think when it comes to introducing and designing new medicines, yeah, 254 00:13:31,116 --> 00:13:35,036 Speaker 2: we've become really really good. But in terms of, once 255 00:13:35,076 --> 00:13:39,476 Speaker 2: the medicine is produced, in terms of actually accelerating the adoption, 256 00:13:39,676 --> 00:13:43,876 Speaker 2: optimizing the uptake, designing who gets it and at what dose 257 00:13:43,916 --> 00:13:48,236 Speaker 2: and when, detecting early who would benefit from it, that's 258 00:13:48,276 --> 00:13:50,556 Speaker 2: what I call the healthcare delivery side of the equation. 259 00:13:50,836 --> 00:13:54,156 Speaker 2: I feel like there's a very very vast gap of 260 00:13:54,196 --> 00:13:55,716 Speaker 2: what needs to happen to get better. 261 00:13:56,476 --> 00:14:00,396 Speaker 1: So, okay, so you do this project. You see that 262 00:14:00,476 --> 00:14:04,916 Speaker 1: it's too early to start a company because the world 263 00:14:04,956 --> 00:14:08,476 Speaker 1: isn't ready yet, because hospitals aren't even widely using electronic 264 00:14:08,516 --> 00:14:11,156 Speaker 1: medical records yet, much less being ready to sort of 265 00:14:11,396 --> 00:14:14,756 Speaker 1: export the data and listen to the data, et cetera, 266 00:14:15,156 --> 00:14:19,476 Speaker 1: and you take a job as a professor at Johns Hopkins. Right, 267 00:14:19,516 --> 00:14:20,356 Speaker 1: is that the next step? 268 00:14:20,956 --> 00:14:23,956 Speaker 2: That's right.
And part of the move to Hopkins was 269 00:14:24,556 --> 00:14:28,756 Speaker 2: realizing there's so much depth and breadth of medicine, not 270 00:14:28,796 --> 00:14:31,956 Speaker 2: just around the actual devices or the engineering or 271 00:14:31,956 --> 00:14:34,516 Speaker 2: the chemical or the drug development, but also on the 272 00:14:34,556 --> 00:14:37,476 Speaker 2: delivery side, like how... what does it take to scale 273 00:14:37,516 --> 00:14:41,516 Speaker 2: ideas nationally? How do you design policy around it? There 274 00:14:41,596 --> 00:14:46,076 Speaker 2: was sort of a whole institute dedicated to scaling ideas nationally. 275 00:14:46,516 --> 00:14:49,956 Speaker 2: So to me that was extremely exciting, to learn about 276 00:14:50,396 --> 00:14:54,276 Speaker 2: what would it take to really build the foundations of 277 00:14:54,356 --> 00:14:56,756 Speaker 2: a field like this. And moving to Baltimore was a 278 00:14:56,796 --> 00:15:00,196 Speaker 2: big move, but I was just excited by the idea 279 00:15:00,236 --> 00:15:02,476 Speaker 2: of learning it all, and learning it especially as an 280 00:15:02,476 --> 00:15:05,636 Speaker 2: engineer, as an AI researcher, as an outsider coming into healthcare. 281 00:15:08,076 --> 00:15:10,636 Speaker 1: In a minute, Suchi and her colleagues figure out how 282 00:15:10,676 --> 00:15:13,636 Speaker 1: to use AI to detect when certain patients are at 283 00:15:13,716 --> 00:15:18,316 Speaker 1: risk for complications and also how to get doctors to listen. 284 00:15:28,396 --> 00:15:31,156 Speaker 1: So Suchi is at Johns Hopkins in Baltimore and she 285 00:15:31,236 --> 00:15:35,276 Speaker 1: has this big idea: using AI to help doctors treat 286 00:15:35,356 --> 00:15:38,556 Speaker 1: hospital patients, but she has to figure out exactly what 287 00:15:38,716 --> 00:15:39,956 Speaker 1: to focus on. 288 00:15:39,956 --> 00:15:42,036 Speaker 2: One of the big areas was this idea of like 289 00:15:42,156 --> 00:15:48,036 Speaker 2: early detection of patients at risk for complications, and diagnostic 290 00:15:48,156 --> 00:15:50,836 Speaker 2: errors being the third leading cause of death. Like, huh, 291 00:15:50,916 --> 00:15:54,996 Speaker 2: that's nuts. Like, so today, you know, there are critical 292 00:15:54,996 --> 00:15:57,956 Speaker 2: moments that are missed. We get patients the wrong diagnosis, 293 00:15:58,036 --> 00:16:01,036 Speaker 2: or they're developing something subtly and slowly. That's like 294 00:16:01,036 --> 00:16:04,596 Speaker 2: a whole branch of diagnostic errors where, you know, a complication 295 00:16:04,876 --> 00:16:07,316 Speaker 2: or a condition develops, but it doesn't get noticed in 296 00:16:07,356 --> 00:16:12,436 Speaker 2: a timely fashion. And so these seemed perfect for AI 297 00:16:12,556 --> 00:16:14,756 Speaker 2: to come in, with the kind of data that exists, 298 00:16:15,116 --> 00:16:17,636 Speaker 2: to be able to flag patients that are high risk 299 00:16:18,076 --> 00:16:20,076 Speaker 2: and make it easy to provide a second pair of eyes.
300 00:16:20,076 --> 00:16:24,076 Speaker 1: Because it's basically pattern matching, right? I mean, differential diagnosis 301 00:16:24,156 --> 00:16:28,436 Speaker 1: is taking lots of different variables from the patient and 302 00:16:29,516 --> 00:16:33,236 Speaker 1: trying to put those variables together to match the patient 303 00:16:33,316 --> 00:16:36,876 Speaker 1: to, you know, thousands of other patients and say, oh, 304 00:16:36,996 --> 00:16:40,436 Speaker 1: all of these, all of these variables, all of these 305 00:16:40,556 --> 00:16:43,556 Speaker 1: health indicators suggest that the patient has disease X. Like 306 00:16:43,556 --> 00:16:47,036 Speaker 1: that's fundamentally what a differential diagnosis is, and like machine 307 00:16:47,076 --> 00:16:50,196 Speaker 1: learning should be very good at that. Exactly. 308 00:16:50,316 --> 00:16:54,236 Speaker 2: And previously people have attempted differential diagnosis with very 309 00:16:54,356 --> 00:16:57,916 Speaker 2: coarse symptoms, like a high level description of, like, you have 310 00:16:57,996 --> 00:17:00,916 Speaker 2: a cough or fever. What was different this time around is 311 00:17:00,916 --> 00:17:03,356 Speaker 2: because of the EHR, we had very detailed data. 312 00:17:03,156 --> 00:17:05,236 Speaker 1: The EHR, the electronic health record, right. 313 00:17:05,116 --> 00:17:09,876 Speaker 2: Exactly, and so it provided this brand new opportunity to 314 00:17:10,436 --> 00:17:12,756 Speaker 2: do this. And then you know, naturally when you go 315 00:17:12,796 --> 00:17:16,116 Speaker 2: down the list and start looking at problem areas, sepsis 316 00:17:16,236 --> 00:17:19,756 Speaker 2: is a model disease we chose to demonstrate the idea. 317 00:17:20,196 --> 00:17:23,036 Speaker 1: So let's just talk about sepsis for a minute. What 318 00:17:23,196 --> 00:17:23,756 Speaker 1: is sepsis? 319 00:17:24,036 --> 00:17:27,396 Speaker 2: So let's say a patient gets infected. Your immune system 320 00:17:27,476 --> 00:17:30,156 Speaker 2: is now going to respond in order to protect 321 00:17:30,156 --> 00:17:34,876 Speaker 2: your body. But in sepsis, it overreacts and starts attacking 322 00:17:34,916 --> 00:17:40,876 Speaker 2: your organ systems, leading to organ failure and death. 323 00:17:41,036 --> 00:17:43,876 Speaker 2: And so the idea with sepsis treatment is very 324 00:17:43,956 --> 00:17:46,196 Speaker 2: much the earlier you can detect it, the better you 325 00:17:46,236 --> 00:17:48,156 Speaker 2: are at like tackling it. 326 00:17:48,276 --> 00:17:54,036 Speaker 1: Right. Okay, so I buy it. It seems like 327 00:17:54,076 --> 00:17:55,916 Speaker 1: a big problem and it seems like one that might 328 00:17:55,956 --> 00:17:59,676 Speaker 1: be solved, or at least, you know, made less bad, 329 00:17:59,916 --> 00:18:03,436 Speaker 1: with the application of machine learning. So how do 330 00:18:03,516 --> 00:18:06,316 Speaker 1: you... how do you actually do it? What do you 331 00:18:06,316 --> 00:18:08,836 Speaker 1: have to do to build a model and see if 332 00:18:08,876 --> 00:18:10,476 Speaker 1: it works and get people to use it? 333 00:18:10,596 --> 00:18:13,436 Speaker 2: Yeah, so this is almost like, what you're about to 334 00:18:13,476 --> 00:18:16,076 Speaker 2: have me describe in two minutes was almost a five year journey.
335 00:18:16,476 --> 00:18:18,956 Speaker 2: So first, it's collecting a huge amount of data where 336 00:18:18,996 --> 00:18:22,636 Speaker 2: you can identify both patients who were septic versus non-septic, 337 00:18:22,636 --> 00:18:24,556 Speaker 2: and when they had it, and what other conditions did 338 00:18:24,556 --> 00:18:26,636 Speaker 2: they have, and what else was happening in their life, 339 00:18:27,236 --> 00:18:29,196 Speaker 2: and you know, all the data leading up to that 340 00:18:29,276 --> 00:18:32,236 Speaker 2: episode and what was done after the fact. So you 341 00:18:32,276 --> 00:18:34,316 Speaker 2: get the data. Then the next part is, you know, 342 00:18:34,356 --> 00:18:36,956 Speaker 2: you have to actually understand the biological process or the 343 00:18:36,956 --> 00:18:39,636 Speaker 2: clinical process that's happening and layer that on top of 344 00:18:39,716 --> 00:18:41,556 Speaker 2: the data to make sure you're going from like just 345 00:18:41,596 --> 00:18:44,476 Speaker 2: bits and bytes to data that makes sense, okay. And 346 00:18:44,796 --> 00:18:48,596 Speaker 2: then you implement lots of different learning algorithms to be 347 00:18:48,676 --> 00:18:51,236 Speaker 2: able to experiment, you know, the thing that we first 348 00:18:51,236 --> 00:18:54,436 Speaker 2: did versus the thing we do now. There's like lots 349 00:18:54,476 --> 00:18:56,716 Speaker 2: of generations of improvements in order to get to a 350 00:18:56,716 --> 00:19:00,756 Speaker 2: place where you're going from, like, you know, not very 351 00:19:00,756 --> 00:19:02,676 Speaker 2: good signal to very good signal. 352 00:19:03,156 --> 00:19:06,156 Speaker 1: So you're building a model through trial and error, 353 00:19:06,156 --> 00:19:09,636 Speaker 1: basically trying to get an AI model that has 354 00:19:09,676 --> 00:19:13,756 Speaker 1: a high sensitivity and specificity, that's good at issuing an 355 00:19:13,796 --> 00:19:17,396 Speaker 1: alert when a patient has sepsis and doesn't issue too 356 00:19:17,436 --> 00:19:19,476 Speaker 1: many alerts when the patient doesn't have sepsis. 357 00:19:19,116 --> 00:19:21,636 Speaker 2: Basically exactly, and also does it in a way that, 358 00:19:21,996 --> 00:19:24,676 Speaker 2: you know, when it says somebody has sepsis, it's able 359 00:19:24,676 --> 00:19:27,596 Speaker 2: to explain why. It's able to provide enough information so 360 00:19:27,636 --> 00:19:30,796 Speaker 2: that the clinician can act on it. And it's not 361 00:19:30,876 --> 00:19:33,556 Speaker 2: doing it so early that there's not enough to work on, 362 00:19:33,596 --> 00:19:35,916 Speaker 2: and it's not doing it so late that it's useless. 363 00:19:36,236 --> 00:19:40,876 Speaker 1: Like often people talk about AI models, machine learning models, 364 00:19:40,956 --> 00:19:44,236 Speaker 1: as black boxes, right, like very good at pattern matching, 365 00:19:44,316 --> 00:19:46,716 Speaker 1: very good at predicting the next word, but we don't 366 00:19:46,756 --> 00:19:49,076 Speaker 1: know why. And so you're saying, in this instance, you 367 00:19:49,156 --> 00:19:50,396 Speaker 1: sort of need to know why.
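To make the train-and-evaluate loop described above concrete, here is a minimal sketch, not Bayesian Health's actual system, of the kind of cycle being discussed: fit a classifier on labeled septic versus non-septic encounters, then check the sensitivity/specificity trade-off of an alert threshold. The features, synthetic data, and threshold below are all made up for illustration; it assumes NumPy and scikit-learn are available.

```python
# Hypothetical sketch: train a simple risk model on labeled encounters and
# measure how an alert threshold trades off sensitivity against specificity.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(0)

# Synthetic stand-in for EHR-derived features (e.g. heart rate, temperature, labs).
n = 5000
X = rng.normal(size=(n, 6))
risk = X @ np.array([0.9, 0.7, 0.0, 0.5, 0.0, 0.3])
y = (risk + rng.normal(scale=1.0, size=n) > 1.5).astype(int)  # 1 = became septic

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression().fit(X_train, y_train)

# Alert when predicted risk exceeds a threshold; tuning this is the
# "catch cases early" vs. "don't flood clinicians with alerts" balance.
threshold = 0.3
alerts = (model.predict_proba(X_test)[:, 1] >= threshold).astype(int)

tn, fp, fn, tp = confusion_matrix(y_test, alerts).ravel()
sensitivity = tp / (tp + fn)   # fraction of septic patients flagged
specificity = tn / (tn + fp)   # fraction of non-septic patients left alone
print(f"sensitivity={sensitivity:.2f}, specificity={specificity:.2f}")
```

In practice, as the interview goes on to explain, the harder parts are everything around this loop: explaining why an alert fired, timing it so clinicians can act, and earning their trust.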
368 00:19:51,796 --> 00:19:54,676 Speaker 2: A very key evolution for me as a scientist working in this 369 00:19:54,796 --> 00:19:56,636 Speaker 2: area was, in the beginning, I saw it all as 370 00:19:56,716 --> 00:20:00,876 Speaker 2: data and math, and then as I started working more 371 00:20:00,876 --> 00:20:03,356 Speaker 2: and more on interfacing and actually deploying systems like this, 372 00:20:03,476 --> 00:20:06,276 Speaker 2: what I started realizing is it's actually not math and data. 373 00:20:06,316 --> 00:20:10,956 Speaker 2: It's about trust. Huh. Because ultimately, to get adoption and 374 00:20:10,996 --> 00:20:14,036 Speaker 2: to get outcomes, I need to get trust from these 375 00:20:14,236 --> 00:20:19,316 Speaker 2: highly trained clinicians who studied this year in and year out, 376 00:20:19,516 --> 00:20:22,916 Speaker 2: and they have a process and a system for working, 377 00:20:22,956 --> 00:20:24,956 Speaker 2: and you have to fit within this system. 378 00:20:25,076 --> 00:20:27,596 Speaker 1: And they're very busy, and it's very high stakes, and 379 00:20:27,676 --> 00:20:31,036 Speaker 1: they kind of think they know everything, and so it's 380 00:20:31,196 --> 00:20:36,156 Speaker 1: presumably very hard to get them to trust you in 381 00:20:36,236 --> 00:20:38,756 Speaker 1: making their clinical judgments. Exactly. 382 00:20:38,836 --> 00:20:41,876 Speaker 2: But moreover, I've also been on the other side of, 383 00:20:41,996 --> 00:20:44,716 Speaker 2: like, tons of engineers making all sorts of claims about 384 00:20:44,716 --> 00:20:47,596 Speaker 2: how their system knows better. But when you actually go and 385 00:20:47,636 --> 00:20:50,876 Speaker 2: make sense of the evaluations they've done, they literally 386 00:20:50,956 --> 00:20:54,196 Speaker 2: have very little understanding of medicine and the practice of healthcare. 387 00:20:54,276 --> 00:20:58,436 Speaker 2: So like their claims are mostly not good. So a 388 00:20:58,556 --> 00:21:01,276 Speaker 2: huge part of it is like developing respect and humility 389 00:21:01,716 --> 00:21:04,436 Speaker 2: for the system, the complexity, so that when you're bringing 390 00:21:04,476 --> 00:21:08,036 Speaker 2: in this new thing, it really truly fits. It's easy 391 00:21:08,076 --> 00:21:12,996 Speaker 2: to use, and it creates value. Without all that, you're 392 00:21:13,036 --> 00:21:14,796 Speaker 2: not going to get to the benefit. 393 00:21:16,196 --> 00:21:19,196 Speaker 1: So now you say it creates value and suddenly you 394 00:21:19,276 --> 00:21:23,676 Speaker 1: sound like a founder, an entrepreneur, and not like an academic. 395 00:21:24,276 --> 00:21:28,276 Speaker 1: Where in this arc do you start a company? 396 00:21:28,396 --> 00:21:30,636 Speaker 2: You know, it was somewhere in twenty eighteen. I remember 397 00:21:31,036 --> 00:21:33,356 Speaker 2: twenty eighteen was a transformative year for me for a 398 00:21:33,396 --> 00:21:37,636 Speaker 2: number of reasons. I'll start with the very simple thing 399 00:21:37,676 --> 00:21:42,076 Speaker 2: of like, when we first built this system and deployed it, 400 00:21:42,356 --> 00:21:45,956 Speaker 2: only like two or three clinicians used it, and it 401 00:21:46,036 --> 00:21:48,156 Speaker 2: was the two to three clinicians who were involved in 402 00:21:48,236 --> 00:21:51,036 Speaker 2: working on the project with us.
What I realized was 403 00:21:51,116 --> 00:21:53,756 Speaker 2: we knew from looking at large amounts of data that 404 00:21:53,876 --> 00:21:57,036 Speaker 2: the system was working, it was working correctly, and we 405 00:21:57,076 --> 00:21:59,756 Speaker 2: could identify these cases. We could identify them early, and 406 00:21:59,836 --> 00:22:02,716 Speaker 2: even from interacting with the clinicians, we knew you could do 407 00:22:02,756 --> 00:22:05,276 Speaker 2: something differently about it. So it's one thing for a system 408 00:22:05,316 --> 00:22:08,316 Speaker 2: to detect. You know, clinicians will say, so what? So 409 00:22:08,316 --> 00:22:10,196 Speaker 2: what am I supposed to do about it? And 410 00:22:10,236 --> 00:22:12,596 Speaker 2: in this scenario, we'd even done studies to know that 411 00:22:13,156 --> 00:22:15,516 Speaker 2: actually they could be acting, you know, they could use 412 00:22:15,556 --> 00:22:19,676 Speaker 2: this output to meaningfully change the patient's care. So then 413 00:22:19,836 --> 00:22:22,996 Speaker 2: to me, the question was, okay, if we know this 414 00:22:23,076 --> 00:22:25,796 Speaker 2: thing works, why the heck are we not succeeding? And 415 00:22:25,836 --> 00:22:28,276 Speaker 2: that's kind of where it went from the puzzle of 416 00:22:28,436 --> 00:22:30,596 Speaker 2: math and data to trust. You know, how do we 417 00:22:30,636 --> 00:22:33,356 Speaker 2: develop and deploy it in a way that's transparent? How 418 00:22:33,396 --> 00:22:35,956 Speaker 2: do we understand, like, what are the top of mind 419 00:22:36,116 --> 00:22:38,596 Speaker 2: issues from a practicing clinician's point of view, and how 420 00:22:38,636 --> 00:22:41,156 Speaker 2: do we address it? Where are we creating value? How 421 00:22:41,236 --> 00:22:42,756 Speaker 2: do we start quantifying value? 422 00:22:44,036 --> 00:22:46,476 Speaker 1: Now, are there any moments where you're like... you know, 423 00:22:46,556 --> 00:22:48,596 Speaker 1: you have this thing that can be helpful, and yet 424 00:22:49,116 --> 00:22:53,756 Speaker 1: someone, a doctor, a hospital administrator, whatever, is telling you 425 00:22:53,836 --> 00:22:55,076 Speaker 1: why they're not going to use it? 426 00:22:55,236 --> 00:22:59,796 Speaker 2: Basically, I mean, so many moments, I can't even like 427 00:23:00,236 --> 00:23:03,716 Speaker 2: begin. So I think I remember this time when they 428 00:23:03,716 --> 00:23:06,716 Speaker 2: basically were like, okay, the system has flagged this thing. 429 00:23:07,156 --> 00:23:08,956 Speaker 2: What do I do with it? And I was like, 430 00:23:09,276 --> 00:23:11,116 Speaker 2: you should look if the patient has sepsis. And 431 00:23:11,156 --> 00:23:13,156 Speaker 2: they were like, are you kidding me? Do you know 432 00:23:13,436 --> 00:23:16,196 Speaker 2: how many flags, how many alerting systems exist? If I 433 00:23:16,236 --> 00:23:19,556 Speaker 2: were to take every single alerting system and start to 434 00:23:19,676 --> 00:23:23,236 Speaker 2: use that to start informing when I'm doing a diagnostic 435 00:23:23,276 --> 00:23:26,076 Speaker 2: workup and what am I doing, I basically would not 436 00:23:26,116 --> 00:23:27,916 Speaker 2: get my day to day work done, right?
437 00:23:28,156 --> 00:23:29,956 Speaker 1: It's like, if you're... it's like when you're, if you're 438 00:23:29,956 --> 00:23:32,916 Speaker 1: ever in an emergency room, like everything is beeping all 439 00:23:32,956 --> 00:23:36,156 Speaker 1: the time, and your system is just one more beep 440 00:23:36,236 --> 00:23:38,956 Speaker 1: in a sea of beeps that everybody ignores, and... 441 00:23:38,876 --> 00:23:41,756 Speaker 2: You feel passionately about it. 442 00:23:41,236 --> 00:23:44,116 Speaker 1: You have reasons to care about this beep, but nobody else 443 00:23:44,156 --> 00:23:44,996 Speaker 1: cares about this beep. 444 00:23:45,116 --> 00:23:48,756 Speaker 2: Nobody gives a damn. And it was just like, so... 445 00:23:49,676 --> 00:23:52,076 Speaker 2: it was difficult, right? Like, you come... I was sort 446 00:23:52,116 --> 00:23:54,516 Speaker 2: of like, you know, I felt defeated. I sat there and 447 00:23:54,636 --> 00:23:57,396 Speaker 2: I was like, this is so unbelievable. This is like 448 00:23:57,476 --> 00:24:00,196 Speaker 2: so powerful. Why aren't they believing me? And so there 449 00:24:00,276 --> 00:24:04,316 Speaker 2: was an information gap, right? Like, then it was like understanding, oh, 450 00:24:04,996 --> 00:24:07,236 Speaker 2: you know, the system in which they live. Okay, I 451 00:24:07,356 --> 00:24:10,676 Speaker 2: understand that all these different alerts exist. How are these 452 00:24:10,716 --> 00:24:13,636 Speaker 2: alerts created? How are we different? How can we demonstrate 453 00:24:13,676 --> 00:24:17,156 Speaker 2: we're different? Why should we be trusted? And so that 454 00:24:17,316 --> 00:24:20,436 Speaker 2: was, as an example, a starting point. Like, another one was 455 00:24:20,516 --> 00:24:22,916 Speaker 2: like, we deployed it, and we deployed it in a 456 00:24:22,956 --> 00:24:26,036 Speaker 2: way where it was, you know, within the electronic health record, 457 00:24:26,036 --> 00:24:27,956 Speaker 2: but it was done in a way that was really cumbersome, 458 00:24:28,476 --> 00:24:31,476 Speaker 2: like every time they needed to respond, it was like 459 00:24:31,676 --> 00:24:33,996 Speaker 2: a few, you know, it was like a minute and 460 00:24:34,076 --> 00:24:38,276 Speaker 2: a half of work, and you know, honestly, they're so busy. 461 00:24:38,476 --> 00:24:40,716 Speaker 2: A minute and a half extra to do something that 462 00:24:40,756 --> 00:24:43,836 Speaker 2: they don't already have total conviction in is like a 463 00:24:43,876 --> 00:24:47,156 Speaker 2: lot to ask. So then we spent a bunch of 464 00:24:47,196 --> 00:24:49,276 Speaker 2: time optimizing, well, how do we take it from 465 00:24:49,356 --> 00:24:50,796 Speaker 2: a minute and a half to like 466 00:24:50,916 --> 00:24:55,356 Speaker 2: three seconds? How do we optimize it so that it's instantaneous, 467 00:24:55,396 --> 00:24:57,636 Speaker 2: it's easy, it's just there? 468 00:24:58,756 --> 00:25:01,036 Speaker 1: So this isn't about the data at all. This is 469 00:25:01,196 --> 00:25:04,156 Speaker 1: just user experience, basically. 470 00:25:04,156 --> 00:25:08,156 Speaker 2: Hugely, human factors. And human factors here 471 00:25:08,236 --> 00:25:11,716 Speaker 2: is very different and complicated, because you're trying to optimize 472 00:25:11,756 --> 00:25:14,796 Speaker 2: human factors within a chassis that is very complicated. Right, 473 00:25:14,876 --> 00:25:18,196 Speaker 2: like, you're not like standalone software.
This is like you're 474 00:25:18,436 --> 00:25:21,756 Speaker 2: within an electronic health record, and like, how do you 475 00:25:21,836 --> 00:25:23,916 Speaker 2: do this in a way that the electronic health record 476 00:25:23,956 --> 00:25:24,756 Speaker 2: providers will... 477 00:25:24,596 --> 00:25:27,756 Speaker 1: Allow you in. It's not your software. 478 00:25:28,636 --> 00:25:30,476 Speaker 2: It's not your software. And how can you do it 479 00:25:30,476 --> 00:25:33,476 Speaker 2: in a way that, like, is smooth and seamless and 480 00:25:33,516 --> 00:25:36,276 Speaker 2: they actually like it? And then you can do this 481 00:25:36,356 --> 00:25:38,556 Speaker 2: in a way where it's not just custom built for 482 00:25:38,636 --> 00:25:41,236 Speaker 2: a Johns Hopkins, but it's something that you can 483 00:25:41,276 --> 00:25:43,156 Speaker 2: take to a rural hospital. 484 00:25:44,356 --> 00:25:46,516 Speaker 1: So you're doing all this. At what point in 485 00:25:46,556 --> 00:25:47,876 Speaker 1: this arc do you start the company? 486 00:25:48,596 --> 00:25:53,316 Speaker 2: So another, like, personal thing happened, which is I lost 487 00:25:53,316 --> 00:25:58,636 Speaker 2: my nephew to sepsis. And, you know, it was the craziest, 488 00:25:59,316 --> 00:26:04,716 Speaker 2: like, saddest, like, you know, most insane feeling, to be 489 00:26:04,796 --> 00:26:07,396 Speaker 2: able to... like, you know, as like a researcher, as 490 00:26:07,396 --> 00:26:10,476 Speaker 2: a scientist, I'm like neck deep in these research areas. 491 00:26:10,476 --> 00:26:14,156 Speaker 2: And then it's one thing to go and talk about it, 492 00:26:14,196 --> 00:26:16,516 Speaker 2: to say, well, here's how you do it, and here's 493 00:26:16,556 --> 00:26:19,236 Speaker 2: how it works, and here's why it will work, and 494 00:26:19,276 --> 00:26:21,796 Speaker 2: here's why this is a great idea. And it's another 495 00:26:21,916 --> 00:26:23,996 Speaker 2: to then come to that moment of realization where, like, 496 00:26:24,796 --> 00:26:26,956 Speaker 2: well, I haven't actually done anything to make a difference. 497 00:26:27,116 --> 00:26:31,276 Speaker 1: So you're already working on sepsis. Yes. And your nephew, 498 00:26:31,356 --> 00:26:33,436 Speaker 1: you say, nephew meaning younger than you? 499 00:26:33,556 --> 00:26:35,196 Speaker 2: Yes, he's young, much younger than me. 500 00:26:35,316 --> 00:26:35,676 Speaker 1: Wow. 501 00:26:36,516 --> 00:26:41,076 Speaker 2: And realizing, like... it all sounded 502 00:26:41,076 --> 00:26:43,556 Speaker 2: like an excellent... it all sounded great on paper. 503 00:26:43,716 --> 00:26:46,236 Speaker 2: You know, it was like, you know, I'd go to 504 00:26:46,316 --> 00:26:48,956 Speaker 2: meetings and lots of people would listen and they'd say, yay, 505 00:26:49,076 --> 00:26:51,716 Speaker 2: great idea, et cetera. But then at the end of 506 00:26:51,756 --> 00:26:55,356 Speaker 2: the day, for me, it was like I'd gotten too 507 00:26:55,476 --> 00:26:57,516 Speaker 2: used to... you know, it's easy. It's easy to, like, 508 00:26:57,556 --> 00:27:00,476 Speaker 2: talk about something smart, and then people say it's a 509 00:27:00,516 --> 00:27:01,916 Speaker 2: great idea, and then you leave the room and you 510 00:27:01,916 --> 00:27:04,196 Speaker 2: feel good about it, and then you go back and 511 00:27:04,196 --> 00:27:07,836 Speaker 2: you work on it some more.
And I think it 512 00:27:07,876 --> 00:27:12,196 Speaker 2: was hard, like, hard for me to sort of realize, 513 00:27:12,236 --> 00:27:15,836 Speaker 2: like, I had gotten too carried away, and I'd 514 00:27:15,836 --> 00:27:19,356 Speaker 2: gotten too carried away, like, not thinking about what 515 00:27:19,436 --> 00:27:21,116 Speaker 2: is it actually going to take to make it real? 516 00:27:21,836 --> 00:27:24,596 Speaker 2: And the making it real is what's, like, just so 517 00:27:24,796 --> 00:27:27,156 Speaker 2: much harder than I thought. But part of it is 518 00:27:27,196 --> 00:27:31,636 Speaker 2: I also felt like this isn't just sad. This 519 00:27:31,716 --> 00:27:34,636 Speaker 2: isn't just like, you know, an idea for sepsis. 520 00:27:34,716 --> 00:27:37,076 Speaker 2: This is really, like, crazy to me that this isn't 521 00:27:37,076 --> 00:27:39,796 Speaker 2: how we operate. Like, I think the time has 522 00:27:39,876 --> 00:27:42,596 Speaker 2: come, and what is exciting to me is in the 523 00:27:42,676 --> 00:27:44,956 Speaker 2: last year or two, I'm starting to see the world 524 00:27:45,116 --> 00:27:49,636 Speaker 2: has shifted. There's been a very meaningful change in the 525 00:27:49,716 --> 00:27:53,796 Speaker 2: last few years. I think losing my... like, losing my nephew 526 00:27:53,876 --> 00:27:57,076 Speaker 2: made it very real. It went from this idea to 527 00:27:57,356 --> 00:28:01,636 Speaker 2: feeling like this was an opportunity where it's very real. 528 00:28:01,716 --> 00:28:04,876 Speaker 2: Now we can make a difference. The pieces exist, and 529 00:28:04,956 --> 00:28:07,716 Speaker 2: I need to make it happen. I can't hide anymore. 530 00:28:07,956 --> 00:28:11,916 Speaker 2: And in twenty eighteen, I started to 531 00:28:11,996 --> 00:28:16,116 Speaker 2: realize, like, most systems had finished implementing the 532 00:28:16,316 --> 00:28:21,796 Speaker 2: electronic health record, policies were starting to change, the AI 533 00:28:22,076 --> 00:28:24,436 Speaker 2: was mature enough that it was really clear we could 534 00:28:24,516 --> 00:28:27,396 Speaker 2: do a lot with it. And it was my very 535 00:28:27,436 --> 00:28:33,596 Speaker 2: little part I could do to, you know, address my... 536 00:28:33,596 --> 00:28:36,236 Speaker 2: my, you know, my part of grief related to my nephew. 537 00:28:36,356 --> 00:28:38,876 Speaker 2: Like, it was the very little role I could play. 538 00:28:38,956 --> 00:28:42,356 Speaker 2: So in twenty eighteen, I started to, you know, 539 00:28:42,676 --> 00:28:44,396 Speaker 2: go after it with the idea that we're going 540 00:28:44,436 --> 00:28:46,596 Speaker 2: to actually start a company. We're actually going to turn 541 00:28:46,636 --> 00:28:49,476 Speaker 2: this into something that scales nationally. And that's where it 542 00:28:49,516 --> 00:28:49,996 Speaker 2: all began. 543 00:28:50,356 --> 00:28:54,356 Speaker 1: So you start the company, and you do build this 544 00:28:55,476 --> 00:29:02,436 Speaker 1: AI model to detect sepsis in hospitalized patients, and you 545 00:29:02,476 --> 00:29:06,316 Speaker 1: do this study and you wind up publishing the outcome 546 00:29:07,516 --> 00:29:10,476 Speaker 1: in the journal Nature Medicine, which seems like a big, 547 00:29:10,676 --> 00:29:13,636 Speaker 1: big moment in your work, in the life of your company. 548 00:29:13,676 --> 00:29:15,916 Speaker 1: So tell me about that study.
549 00:29:17,076 --> 00:29:19,956 Speaker 2: Yeah. So in twenty two, in July twenty two, we 550 00:29:20,036 --> 00:29:22,636 Speaker 2: had three studies. They were featured on the cover of 551 00:29:22,716 --> 00:29:24,996 Speaker 2: Nature Medicine. These were very big studies for the field. 552 00:29:25,636 --> 00:29:29,356 Speaker 2: Then the studies that came out in twenty two were 553 00:29:29,396 --> 00:29:35,476 Speaker 2: basically showing how we implemented the system at five different sites, 554 00:29:35,556 --> 00:29:38,716 Speaker 2: like both in the emergency department, the hospital 555 00:29:38,796 --> 00:29:42,796 Speaker 2: floors, the ICUs, across academic and community hospitals. So 556 00:29:42,916 --> 00:29:46,916 Speaker 2: five different hospitals in totally different geographic regions, right, in 557 00:29:46,956 --> 00:29:52,716 Speaker 2: Maryland and DC, rich communities, poor communities. And what we 558 00:29:52,716 --> 00:29:56,636 Speaker 2: were able to show was, with, you know, 559 00:29:56,636 --> 00:29:58,796 Speaker 2: almost three quarters of a million patients in the study, 560 00:29:59,036 --> 00:30:02,156 Speaker 2: forty four hundred physicians and nurses who were part of 561 00:30:02,156 --> 00:30:07,756 Speaker 2: the study, that you could detect sepsis significantly earlier than 562 00:30:07,796 --> 00:30:10,356 Speaker 2: they were currently detecting and acting on it. So that was 563 00:30:10,876 --> 00:30:14,996 Speaker 2: one. Second, we showed that, in fact, when we then 564 00:30:15,076 --> 00:30:19,916 Speaker 2: implemented the system, we saw meaningful reduction in treatment timing, 565 00:30:20,076 --> 00:30:24,116 Speaker 2: like patients were getting treatment in a more timely fashion 566 00:30:24,156 --> 00:30:26,516 Speaker 2: when providers were seeing the alert and acting off of it. 567 00:30:27,396 --> 00:30:30,956 Speaker 2: And then the third: we know early detection is possible 568 00:30:30,996 --> 00:30:33,756 Speaker 2: now, and we know treatment timing is moved, and we've 569 00:30:33,796 --> 00:30:36,076 Speaker 2: known in sepsis that early treatment is the key to 570 00:30:36,116 --> 00:30:38,036 Speaker 2: better outcomes. So the question is, do we see that in 571 00:30:38,036 --> 00:30:40,836 Speaker 2: our population as well? And we saw that in patients 572 00:30:40,876 --> 00:30:45,436 Speaker 2: who actually got, you know, early alerts, who got 573 00:30:45,436 --> 00:30:48,236 Speaker 2: the alerts and providers acted on them, we actually saw 574 00:30:48,316 --> 00:30:53,196 Speaker 2: much better outcomes in terms of reductions in mortality, morbidity, 575 00:30:53,636 --> 00:30:57,436 Speaker 2: length of stay, fewer complications, secondary complications that arise out 576 00:30:57,436 --> 00:31:01,596 Speaker 2: of sepsis. So it was just extremely exciting to see 577 00:31:02,116 --> 00:31:05,796 Speaker 2: that we could go from, you know, a technical idea 578 00:31:06,316 --> 00:31:08,836 Speaker 2: to actual outcomes. And then one of the most interesting 579 00:31:08,836 --> 00:31:13,436 Speaker 2: things we've studied here was adoption. Will clinicians adopt? It 580 00:31:13,476 --> 00:31:16,756 Speaker 2: was a very real world study to show, like, can 581 00:31:16,756 --> 00:31:20,036 Speaker 2: a system like this actually work? And we showed ninety 582 00:31:20,116 --> 00:31:23,596 Speaker 2: percent physician adoption. So that was extremely exciting to see.
583 00:31:23,636 --> 00:31:26,236 Speaker 2: And that's what I call, that's what, you know, was 584 00:31:26,316 --> 00:31:27,676 Speaker 2: about closing the trust gap. 585 00:31:28,076 --> 00:31:32,116 Speaker 1: So, okay, so you published this paper, whatever, a year 586 00:31:32,116 --> 00:31:34,516 Speaker 1: and a half ago. Where are you now? What's your 587 00:31:34,516 --> 00:31:35,116 Speaker 1: company doing? 588 00:31:35,196 --> 00:31:38,676 Speaker 2: One thing that I didn't cover earlier 589 00:31:38,716 --> 00:31:41,916 Speaker 2: is that we expanded the system dramatically, from not just 590 00:31:41,996 --> 00:31:47,196 Speaker 2: working on sepsis but to a variety of other conditions like sepsis, 591 00:31:47,476 --> 00:31:51,196 Speaker 2: where there is very significant both clinical benefit but also 592 00:31:51,316 --> 00:31:53,836 Speaker 2: financial benefit for the health system. The reason the financial 593 00:31:53,876 --> 00:31:56,556 Speaker 2: piece matters is, you know, ultimately health systems are working 594 00:31:56,556 --> 00:31:59,036 Speaker 2: on one to two percent margins. For them to be able 595 00:31:59,076 --> 00:32:02,196 Speaker 2: to implement systems that actually improve care, they still need 596 00:32:02,236 --> 00:32:05,436 Speaker 2: to be able to financially justify that this can be done, 597 00:32:05,996 --> 00:32:07,316 Speaker 2: and that was crucial. 598 00:32:07,556 --> 00:32:10,396 Speaker 1: So what are some of the other things you're working 599 00:32:10,396 --> 00:32:11,596 Speaker 1: on besides sepsis now? 600 00:32:12,196 --> 00:32:16,116 Speaker 2: Like, another example area is pressure ulcers. Okay, huge area 601 00:32:16,156 --> 00:32:19,956 Speaker 2: where, like... bedsores, exactly. Like, 602 00:32:20,556 --> 00:32:24,076 Speaker 2: it's an area where, again, huge patient impact in terms 603 00:32:24,116 --> 00:32:26,236 Speaker 2: of, like, you know, if you do end up getting 604 00:32:26,236 --> 00:32:28,836 Speaker 2: a serious bedsore, how detrimental it is for the patient, 605 00:32:29,036 --> 00:32:32,036 Speaker 2: sometimes leading to death, sometimes leading to the need for amputation, 606 00:32:33,116 --> 00:32:37,476 Speaker 2: but even more interestingly, huge burden on the caregivers themselves. 607 00:32:37,516 --> 00:32:40,756 Speaker 2: Like, nurses today have to do a huge amount of 608 00:32:40,756 --> 00:32:43,236 Speaker 2: work to take care of these patients. Like, today there 609 00:32:43,116 --> 00:32:45,876 Speaker 2: are lots of scenarios where these patients are missed, and 610 00:32:45,916 --> 00:32:48,236 Speaker 2: there's an opportunity where you can actually use this data 611 00:32:48,236 --> 00:32:51,996 Speaker 2: to identify who's at higher risk and start, again, implementing these 612 00:32:51,996 --> 00:32:54,916 Speaker 2: new ways in which you can do targeted, you know, 613 00:32:55,476 --> 00:32:56,516 Speaker 2: preventative measures. 614 00:32:56,756 --> 00:33:00,436 Speaker 1: What has to happen for you to, you know, for 615 00:33:00,516 --> 00:33:03,156 Speaker 1: your software to get adopted at hospitals all around the country? 616 00:33:03,236 --> 00:33:06,156 Speaker 1: Like, I buy that it's helpful. How do you get 617 00:33:06,156 --> 00:33:08,436 Speaker 1: from it being a kind of researchy thing to being 618 00:33:08,476 --> 00:33:09,836 Speaker 1: a thing that everybody uses? 619 00:33:10,076 --> 00:33:12,956 Speaker 2: So the hurdles we needed to cross were: one,
we 620 00:33:13,036 --> 00:33:14,796 Speaker 2: needed to figure out a way to get approvals from 621 00:33:14,836 --> 00:33:16,876 Speaker 2: the electronic health records to be able to integrate it. 622 00:33:16,996 --> 00:33:18,956 Speaker 2: We did. That took a couple of years. 623 00:33:18,676 --> 00:33:21,116 Speaker 1: From, like, just the big software makers, Epic, whatever, 624 00:33:21,196 --> 00:33:23,596 Speaker 1: the companies that make the electronic health records. They have 625 00:33:23,676 --> 00:33:26,676 Speaker 1: to say yes. Okay, so that's done. Check. Great. What 626 00:33:26,796 --> 00:33:27,916 Speaker 1: has to happen next? Yeah? 627 00:33:28,036 --> 00:33:30,356 Speaker 2: Next, you need a system that is able to, you know, 628 00:33:30,396 --> 00:33:31,916 Speaker 2: when you go from one site to the next, to 629 00:33:31,916 --> 00:33:33,556 Speaker 2: the next, to the next, you need the ability to 630 00:33:33,556 --> 00:33:35,636 Speaker 2: be able to measure and generalize as you go cross 631 00:33:35,636 --> 00:33:37,036 Speaker 2: site and reliably perform. 632 00:33:37,436 --> 00:33:39,516 Speaker 1: So it has to work in lots of different kinds 633 00:33:39,556 --> 00:33:42,636 Speaker 1: of hospitals that collect different kinds of data in different settings. 634 00:33:43,116 --> 00:33:45,316 Speaker 2: And in our partnerships we've shown that data. 635 00:33:45,396 --> 00:33:47,116 Speaker 1: Okay. Third, check. 636 00:33:47,036 --> 00:33:49,356 Speaker 2: Like I said, we have to show that basically people 637 00:33:49,396 --> 00:33:51,556 Speaker 2: will adopt it in these different environments. So we have data 638 00:33:51,596 --> 00:33:51,996 Speaker 2: to show that. 639 00:33:52,076 --> 00:33:53,196 Speaker 1: Okay. 640 00:33:53,196 --> 00:33:56,636 Speaker 2: Four, in some of these areas you need FDA approval, okay, 641 00:33:56,676 --> 00:33:58,676 Speaker 2: and in the areas where we need FDA approval, we're 642 00:33:58,676 --> 00:34:00,276 Speaker 2: working with the FDA to get those approvals. 643 00:34:00,316 --> 00:34:03,916 Speaker 1: Okay. So that's kind of the next step, correct? 644 00:34:03,796 --> 00:34:06,276 Speaker 2: And then once that's done, you can now start to, 645 00:34:06,516 --> 00:34:09,716 Speaker 2: you know, it's available, it can be marketed, 646 00:34:09,916 --> 00:34:13,196 Speaker 2: you can scale it nationally. All very exciting things. 647 00:34:13,516 --> 00:34:19,956 Speaker 1: So, so if things go well for you, what will 648 00:34:19,956 --> 00:34:23,236 Speaker 1: the world look like in, say, five years? 649 00:34:23,356 --> 00:34:26,156 Speaker 2: Oh my god, so exciting. I think we will actually 650 00:34:26,236 --> 00:34:30,596 Speaker 2: be implemented at sixty, seventy, eighty percent of the market, 651 00:34:30,676 --> 00:34:35,236 Speaker 2: I hope, in the US. What's interesting now is, like, 652 00:34:35,276 --> 00:34:37,036 Speaker 2: you know, healthcare is a market which is a leader 653 00:34:37,116 --> 00:34:40,596 Speaker 2: follower market. And once you show things that work, 654 00:34:40,676 --> 00:34:43,396 Speaker 2: it makes logical sense. You have the proof points, you've 655 00:34:43,396 --> 00:34:46,076 Speaker 2: tackled most of the common issues that people struggle with. 656 00:34:46,836 --> 00:34:48,676 Speaker 2: Then this is an area where you can scale.
And 657 00:34:48,716 --> 00:34:50,476 Speaker 2: when it comes to, like, the areas we're working in, 658 00:34:50,476 --> 00:34:53,196 Speaker 2: which is clinical, unlike some of the other areas like 659 00:34:53,276 --> 00:34:57,036 Speaker 2: billing and messaging and back office, you know, the years 660 00:34:57,036 --> 00:34:59,876 Speaker 2: of development required to build what we build is very long. 661 00:34:59,956 --> 00:35:01,916 Speaker 2: Like, it's taken us eight to nine years to do 662 00:35:01,956 --> 00:35:03,836 Speaker 2: all the pieces necessary to get to where we are, 663 00:35:03,876 --> 00:35:06,236 Speaker 2: so there aren't, like, a lot of other competitors 664 00:35:06,236 --> 00:35:06,716 Speaker 2: in the market. 665 00:35:06,796 --> 00:35:09,036 Speaker 1: You have a moat, and FDA approval is going to 666 00:35:09,036 --> 00:35:09,836 Speaker 1: be even more of a. 667 00:35:09,796 --> 00:35:12,996 Speaker 2: Moat, among other things, exactly. So we have a very, 668 00:35:13,356 --> 00:35:17,316 Speaker 2: very significant, like, moat and hurdles people have to cross 669 00:35:17,356 --> 00:35:19,956 Speaker 2: to really get it to work, and we've invested in them. 670 00:35:20,436 --> 00:35:24,236 Speaker 1: And so in your happy five year future, most of 671 00:35:24,236 --> 00:35:27,316 Speaker 1: the hospitals in the country will be using your software, 672 00:35:27,316 --> 00:35:32,596 Speaker 1: your models, to detect sepsis, to detect bedsores earlier, and 673 00:35:33,276 --> 00:35:34,196 Speaker 1: a variety. 674 00:35:33,916 --> 00:35:36,356 Speaker 2: Of other conditions. Like, we've looked at our own 675 00:35:36,356 --> 00:35:39,556 Speaker 2: financial models and shown that, like, a, you know, modest 676 00:35:40,076 --> 00:35:44,156 Speaker 2: four to five hospital health system stands to gain like 677 00:35:44,676 --> 00:35:47,516 Speaker 2: fifty to two hundred million dollars from the implementation of our 678 00:35:47,596 --> 00:35:50,556 Speaker 2: system in some of, you know, the condition areas we're tackling. 679 00:35:50,316 --> 00:35:52,756 Speaker 1: And people will die less and be less sick as 680 00:35:52,796 --> 00:35:54,436 Speaker 1: a benefit also. 681 00:35:54,356 --> 00:35:57,356 Speaker 2: And that is honestly the biggest maturing I've had in building 682 00:35:57,356 --> 00:36:00,556 Speaker 2: this company. I started from, like, the cause of caring, 683 00:36:00,916 --> 00:36:05,036 Speaker 2: and it was realizing, like, it's funny. In healthcare, they're 684 00:36:05,076 --> 00:36:07,956 Speaker 2: so used to caring for patients who are dying every day. 685 00:36:07,996 --> 00:36:11,796 Speaker 2: They've gotten desensitized. You then come back to realizing 686 00:36:11,836 --> 00:36:14,716 Speaker 2: you need the other things to follow, like the money. 687 00:36:14,756 --> 00:36:16,196 Speaker 2: You need to figure out a way to make it 688 00:36:16,236 --> 00:36:19,116 Speaker 2: easy for them to do the right thing, and when 689 00:36:19,156 --> 00:36:22,556 Speaker 2: you do that, then they do actually care about doing 690 00:36:22,556 --> 00:36:24,196 Speaker 2: the right thing, because that's why they were there in 691 00:36:24,196 --> 00:36:24,876 Speaker 2: the first place. 692 00:36:27,916 --> 00:36:39,756 Speaker 1: We'll be back in a minute with the lightning round.
Okay, 693 00:36:40,556 --> 00:36:43,116 Speaker 1: I'm going to keep you another two minutes or something 694 00:36:43,156 --> 00:36:46,796 Speaker 1: to do a lightning round. You went to college at 695 00:36:46,836 --> 00:36:50,596 Speaker 1: Mount Holyoke, an all-women's college. Yeah, and so I'm curious, 696 00:36:50,636 --> 00:36:54,036 Speaker 1: what is one thing you would tell someone considering attending 697 00:36:54,036 --> 00:36:55,116 Speaker 1: an all-women's college? 698 00:36:55,236 --> 00:36:57,596 Speaker 2: Oh, I loved Mount Holyoke. It was so much fun. 699 00:36:57,636 --> 00:36:59,956 Speaker 2: It's where I got my confidence that I could do 700 00:37:00,036 --> 00:37:02,916 Speaker 2: really, really hard things and not be, you know, not 701 00:37:02,996 --> 00:37:03,716 Speaker 2: feel defeated. 702 00:37:04,276 --> 00:37:08,116 Speaker 1: If you weren't working in healthcare, where would you be 703 00:37:08,196 --> 00:37:09,036 Speaker 1: trying to apply AI? 704 00:37:10,556 --> 00:37:12,956 Speaker 2: Oh my god, I've just been so obsessed with healthcare 705 00:37:13,036 --> 00:37:15,676 Speaker 2: for the last decade. I haven't really lifted my head 706 00:37:15,676 --> 00:37:17,796 Speaker 2: to think about other things. I mean, honestly, there are 707 00:37:17,796 --> 00:37:21,796 Speaker 2: a million areas you could apply it, but I don't 708 00:37:21,836 --> 00:37:23,756 Speaker 2: like thinking about it, because it's just that the need 709 00:37:23,836 --> 00:37:25,756 Speaker 2: is so dire in healthcare, and it's so hard. 710 00:37:25,836 --> 00:37:27,556 Speaker 2: It's so hard for an AI researcher to focus on 711 00:37:27,556 --> 00:37:31,036 Speaker 2: healthcare, because they don't make it easy. You can make 712 00:37:31,156 --> 00:37:33,436 Speaker 2: a lot more money doing the same kind of things 713 00:37:33,436 --> 00:37:35,676 Speaker 2: in finance. You can get the data more easily, you 714 00:37:35,716 --> 00:37:38,516 Speaker 2: can make money off of it more easily. Like, it 715 00:37:38,636 --> 00:37:40,996 Speaker 2: is annoying. It is really annoying. 716 00:37:41,396 --> 00:37:43,596 Speaker 1: Is ChatGPT overrated or underrated? 717 00:37:44,276 --> 00:37:46,436 Speaker 2: Actually, I think it's underrated. 718 00:37:46,596 --> 00:37:49,596 Speaker 1: Okay, go on. 719 00:37:49,716 --> 00:37:52,396 Speaker 2: I think, you know, when we see the math, we're like, okay, 720 00:37:52,436 --> 00:37:54,716 Speaker 2: that's the math, that's interesting. To me, what was really 721 00:37:54,716 --> 00:37:58,716 Speaker 2: informative was, like, the experience, the social experience. It was 722 00:37:58,956 --> 00:38:02,276 Speaker 2: so exciting to see people who first interacted with it 723 00:38:02,796 --> 00:38:05,196 Speaker 2: and, you know, have their minds blown by 724 00:38:05,196 --> 00:38:09,036 Speaker 2: the experience.
And that's sort of then informing how important 725 00:38:09,116 --> 00:38:11,716 Speaker 2: the user experience is, like, you know, 726 00:38:11,796 --> 00:38:14,556 Speaker 2: we had some of the chatbot technology before, we had 727 00:38:14,596 --> 00:38:17,236 Speaker 2: some of the interactivity, but it's sort of how OpenAI 728 00:38:17,316 --> 00:38:21,836 Speaker 2: designed it and the use cases, like storytelling, poems, 729 00:38:22,356 --> 00:38:25,076 Speaker 2: like the use cases where they trained the system to 730 00:38:25,116 --> 00:38:29,596 Speaker 2: be very good at conversation, that was what made the 731 00:38:29,596 --> 00:38:31,916 Speaker 2: experience so exciting, because then people could start, you know, 732 00:38:32,036 --> 00:38:36,316 Speaker 2: like, experiencing it themselves, and that sort of opened up 733 00:38:36,356 --> 00:38:37,756 Speaker 2: their minds to what else it could do. 734 00:38:37,996 --> 00:38:40,596 Speaker 1: Analogous to the lesson you were talking about in your 735 00:38:40,636 --> 00:38:44,356 Speaker 1: own work, where getting the answer right, figuring out if 736 00:38:44,356 --> 00:38:47,956 Speaker 1: the person has sepsis, is actually only part of what 737 00:38:47,996 --> 00:38:48,516 Speaker 1: you have to 738 00:38:48,476 --> 00:38:52,716 Speaker 2: do. Huge. And that's, I think, where AI as a field 739 00:38:52,796 --> 00:38:54,516 Speaker 2: has a lot of growing up to 740 00:38:54,516 --> 00:38:57,436 Speaker 2: do, because historically the people who entered this field are, 741 00:38:57,636 --> 00:39:01,876 Speaker 2: you know, they gravitate towards the math, they gravitate towards 742 00:39:01,916 --> 00:39:05,076 Speaker 2: the hard science. But what they don't realize is, ultimately, 743 00:39:05,636 --> 00:39:08,716 Speaker 2: it is a people problem that you're solving. You have 744 00:39:08,796 --> 00:39:11,036 Speaker 2: to get people to love it. You have to get 745 00:39:11,036 --> 00:39:13,716 Speaker 2: people to incorporate it in their daily lives for this 746 00:39:13,796 --> 00:39:16,676 Speaker 2: to be successful, and you have to operate in a 747 00:39:16,716 --> 00:39:20,116 Speaker 2: world which is not very precise. Like, people have their 748 00:39:20,116 --> 00:39:22,836 Speaker 2: faults and their mistakes, and they work in a particular way, 749 00:39:22,876 --> 00:39:24,356 Speaker 2: and you've got to get this thing to fit. 750 00:39:28,636 --> 00:39:31,436 Speaker 1: Suchi Saria is a professor at Johns Hopkins and the 751 00:39:31,556 --> 00:39:37,076 Speaker 1: founder and CEO of Bayesian Health. Today's show was produced 752 00:39:37,116 --> 00:39:41,236 Speaker 1: by Edith Russolo and Gabriel Hunter Chang. It was edited 753 00:39:41,236 --> 00:39:45,036 Speaker 1: by Karen Chakerji and engineered by Sarah Bruguer. You can 754 00:39:45,076 --> 00:39:48,796 Speaker 1: email us at problem at pushkin dot fm. I'm 755 00:39:48,836 --> 00:39:51,516 Speaker 1: Jacob Goldstein, and we'll be back next week with another 756 00:39:51,556 --> 00:39:52,716 Speaker 1: episode of What's Your Problem.