Speaker 1: Pushkin. This episode is a paid partnership with T-Mobile for Business. Hello, hello, Malcolm Gladwell here today. I wanted to share a very special conversation I had recently, hosted by my good friends at T-Mobile for Business, about how AI is changing our world. My guests are Mo Katibeh, the CMO of T-Mobile for Business; Dr. Azizi Seixas, chair of the Department of Informatics and Health Data Science at the University of Miami Miller School of Medicine; and Ryan Litt, COO and co-founder of 3AM Innovations. Mo I know from years ago, when we had a fascinating conversation about 5G while that technology was in its infancy. Ryan is from Buffalo, and we share a deep affection for the Buffalo Bills. And Azizi, as will soon be obvious, is Jamaican, which of course is the surest way to my heart. Anyway, we talked about some really cool applications of AI and 5G, and the way really smart people like Ryan and Azizi are using these technologies to solve some pretty hard and fascinating problems.

Speaker 1: Thank you.
Hey everyone. We're all wearing our... should we just put our... I know this is a podcast, you can't see it, but we're all wearing T-Mobile sneakers right now. I see two of us have got the Air Forces, and then the others Converse. Yeah, so we're all representing the brand. I think it's very effective. So we're here to talk about AI and 5G, but what we're really here to talk about is something much simpler and more important than that, which is problem solving, right? All of you guys are people who basically solve problems for a living, and I wanted to start there. Maybe, Ryan, you could kick us off: tell us a little bit about what you do, but then tell us about the problem you're trying to solve.

Speaker 2: Sure. So, when we think about emergency events, really in the majority of the world, the primary tool set that firefighters use is a radio to communicate their status to the outside operation.
And I'm sure we can all imagine: you know, winding hallways, dense forests, black smoke, falling debris. It's pretty reasonable to expect that people can become disoriented, that they can be a bit confused. And the issue for firefighters is that when they're confused, inherently so is the rest of the operation.

Speaker 1: Yeah, wait, before you go on, tell us a little bit about how it is you landed in this particular world. Why is it you're thinking about this problem of the disorientation of the firefighter?

Speaker 2: Well, you know, ultimately when there's confusion, it leads to injury and sometimes death. So the true inspiration for our origin is in Buffalo, New York. There was a convenience store that was on fire, and, you know, upon arrival, firefighters quickly got to trying to put out the fire. But the fire itself moved faster, so they had to call an evacuation and pull everybody out. But they were unsure if everybody got out, so they assigned a team to go sweep the facility to try to rescue anybody remaining, and unfortunately the structure collapsed over top of them and killed them both.
Speaker 1: And when was this?

Speaker 2: Two thousand and nine. Yeah. And to make matters worse, there was nobody inside. They were just unsure. And so for us, it's no slight against them; we just feel like they deserve better tools. There has to be more than what they have, that radio. No slight against the radio either, but all of us are here because technology is flush in many other places, and our belief is that they deserve to have it too.

Speaker 1: Tell us how the company starts. Does it arise out of that particular incident? I'm just curious how you evolved to the point where you were looking for solutions to that problem.

Speaker 2: Yes. So, I mean, that event was in two thousand and nine. We officially started in twenty seventeen, so there's time and distance between those two things.
My co-founder Patrick is a volunteer firefighter. To be clear for everybody, career firefighters are getting paid and work as firefighters every day, while volunteers typically have a day job and then get called to an emergency event in the middle of it. So the message Patrick was constantly taught was: if the career people can make mistakes, we're definitely going to be prone to making mistakes, so let's learn from this. And he kind of lived with that education for years and felt, come on, I've got this iPhone in my pocket, there's got to be something more. Finally, by twenty seventeen, technology seemed to be in a place that made sense, and he had to find a partner to help him do it, and that's why we ended up pursuing it from there. So our original intent was to build technology to help them in these emergency events. The hard part, though, is that an emergency is inherently chaotic, unpredictable, right? And all of a sudden we think, okay, we're just going to repurpose technology that already exists and afford it to the fire service.
Instead, we're at the edge of technology, actually pushing on capabilities that, according to colleagues and people we worked with at NASA and DHS, didn't exist. So, for example, tracking someone's location when they are GPS-denied, or helping communication be shared when you are communication-denied. It turned out that not many people around the world were doing it, at which point we said, oh, this is going to be a lot more difficult an endeavor than we had anticipated. So that's the origin of why we're here.

Speaker 1: This is all super interesting, and I want to come back to it in more detail after we've gone down the panel a little bit. But one thing I wanted you to talk about just a little, so we understand this: when you have a kind of fire that's out of control, the specific issue you were trying to solve is that once someone, a firefighter, entered the facility, you lost track of where that person was, right? And there was no existing system in place that would allow you to easily track that person.
Speaker 2: Correct.

Speaker 1: Yeah. And is it because the fire had destroyed any kind of infrastructure that might make that possible? Or, I mean, what's particularly hard about tracking someone in the middle of a burning building?

Speaker 2: Well, circumstantially, it wasn't necessarily the case that comms were completely blown out; sometimes the radio system continues to function. There are so many dynamics to the situation that to give someone a tool you say will universally help them is a very precarious undertaking. It has to handle a large structure like the one we're in now, a small structure in a suburban area, remote areas and wildland fires and such, right? And it has to work in all of those places in order to work for a firefighter, because the modern firefighter experiences so much. So, you know, chasing those problems is fundamentally difficult: a lot of data, a lot of
error, right? And, you know, you push hard to make sure that it's purpose-built. So I think this is where the AI portion of our discussion makes sense, right? It can help to interpret a lot of inputs and give us some simple surfacings and understandings that we can leverage from there.

Speaker 1: Yeah. Well, Mo, I want you to respond to Ryan. And I'm curious: when you started down this road, did you imagine you'd be having conversations with people like Ryan?

Speaker 6: I was certainly hopeful. Being able to serve the first responder community is such an important undertaking, you know, every single day, to protect you and me, our families, our communities. And, you know, from a T-Mobile for Business perspective: how can we take this incredible best-in-the-nation 5G network, and how can we harness some very specific capabilities, to bring to life a solution that serves the first responder community and companies like Ryan's 3AM?
And just a few weeks ago now, we launched what we call T-Priority, which brings not just the network, which has forty percent more capacity, which means more firefighters and police and EMS showing up at a location are able to get on the network and do what they need to do, but also something that we call a slice, which is really a fancy technology term. The idea is: hey, can we create a traffic cop, if you will, a capability so that as first responders are getting on the network, they get not only access to the network, but priority access, and then preemption access to essentially bump, if you will, a commercial user off of the network. That's been around for four, five, six, seven years at this point. But can we also give them the ability to manage that traffic and dynamically allocate the amount of capacity on the network to the first responders, so that in these sorts of scenarios where extreme congestion can occur, you know, like a train derailment or a massive natural disaster, et cetera, we can essentially give up to one hundred percent of the network over to
the first responders so that they can save lives.

Speaker 1: Yeah, I want to return to that, but I want to talk a little bit, too, to Azizi. Tell us the title of your job.

Speaker 5: So, I currently serve as interim chair of the Department of Informatics and Health Data Science, I'm the founding director of the Media and Innovation Lab, I co-lead sleep and circadian science, and I lead population health informatics. So, not to be funny, but as Jamaicans, we're known for multiple jobs. And yes, this is at the University of Miami.

Speaker 1: But you're a doctor by...

Speaker 5: I'm a PhD, a clinical psychologist. But I lead many of the efforts at the university to drive digital transformation. And so I was recruited from another large institution, NYU School of Medicine, to lead this effort at the University of Miami.
And the reason why it's important is because the University of Miami really serves as the academic epicenter of the Southeast, particularly Florida, and Miami in particular is really considered the gateway to the Global South. For those of you not familiar, the Global South represents eighty percent of the world's population, yet, as a euphemism, they're oftentimes seen as the poorest and least resourced, particularly in healthcare. And so I was brought in to lead that effort: to create models that would serve not just South Florida, but that could be translated to similarly socioeconomically deprived communities throughout Florida, and then used as a model to really do this in the Global South.

Speaker 1: At what point during your career did you realize that what you wanted to do was use technology to solve problems? I mean, you have a PhD in clinical psychology. You're not looking at AI and 5G when you're doing a PhD.

Speaker 5: Well, you know, great question.
When I realized that technology was important was when I saw that many of the most vexing healthcare challenges showed up in my own family, with my grandmother who raised me. We realized that there was just a significant lack of resources. She had insurance, but what we saw was a significant gap in the continuity of care. And I extrapolated her experience to what I see when I go to barbershops, beauty salons, and faith-based organizations, because we're the kind of folks who like to be in the community. We don't believe in this sterile, brick-and-mortar healthcare, because we believe healthcare needs to be more. And what we found out was that in order to meet the challenges our nation and our globe see, we either need to train a whole lot more healthcare practitioners, which we still need to do, but that was not going to be sufficient to close that gap in good time.
So what we realized was that technology, though it is not a panacea that can cure all, was going to be the means by which we would be able to, one, provide the care that so many people desperately need, but also provide adjunctive, supportive, and augmentative care to healthcare providers. And so technology became the means that would allow us to really extend our tentacles into places we thought were unimaginable.

Speaker 1: Give us a specific example, yeah, of a moment where you realized, oh, this is a knot we can only crack with technology.

Speaker 5: Yes. So we created our own remote health monitoring solution called the MILBox, and we were funded to do some really novel research looking at cardiovascular health in urban and rural areas. And, like most scientists, we don't care about, you know, accolades per se. We just wanted to do the work, and we did the work. And then COVID happened. We would typically send technicians into someone's home. And I remember,
because she's part of our study and because of HIPAA compliance I can't say her name, so we'll call her Miss Jones. Miss Jones is a sixty-year-old African American woman who lives in Brooklyn. And we called her: Miss Jones, such-and-such will be coming down there to do the study. And she said, hey, honey, you ain't coming here at all, because I ain't trying to get the 'rona. And that made us realize: how can we flip it? That's what really spurred us into action quickly, to create a remote health monitoring solution, knowing very well that it could be used for so many people. It's oftentimes said that since twenty sixteen there are about one hundred and forty million emergency department visits, and about sixty percent of global deaths can be attributed to non-communicable diseases like cardiometabolic conditions. And what are the biggest drivers of that? No healthcare, right? People don't have access. So that's what we saw with someone like Miss Jones, and what we've seen borne out in our studies. We've seen another woman who lives
267 00:15:48,756 --> 00:15:52,156 Speaker 5: In government housing in Florida and she would always go 268 00:15:52,236 --> 00:15:56,196 Speaker 5: to her landlord because she had these respiratory illnesses and 269 00:15:56,636 --> 00:15:59,356 Speaker 5: the landlord will push her aside and said, no, nothing 270 00:15:59,396 --> 00:16:01,756 Speaker 5: is wrong. You're trying to evade pain your rent. And 271 00:16:01,796 --> 00:16:03,836 Speaker 5: she's like, no, there's something wrong with you. You need 272 00:16:03,876 --> 00:16:06,956 Speaker 5: to change something. And she was part of our study 273 00:16:07,436 --> 00:16:10,316 Speaker 5: and we have as part of our remote health monitoring 274 00:16:10,396 --> 00:16:13,396 Speaker 5: solution and air quality device and she was able to 275 00:16:13,556 --> 00:16:16,476 Speaker 5: use that to show to her landlord that there is 276 00:16:16,556 --> 00:16:19,396 Speaker 5: something significantly wrong in terms of mold. 277 00:16:19,876 --> 00:16:20,756 Speaker 3: And so look at this. 278 00:16:21,556 --> 00:16:25,036 Speaker 5: Many of us live in environments that we just trust 279 00:16:25,556 --> 00:16:28,356 Speaker 5: that it has the right environment, it has. 280 00:16:28,116 --> 00:16:30,636 Speaker 3: Everything, even if you have health care. 281 00:16:31,436 --> 00:16:33,356 Speaker 5: And what we want to be able to do is 282 00:16:33,396 --> 00:16:36,996 Speaker 5: to put a wearable on the environment, put a wearable 283 00:16:36,996 --> 00:16:41,116 Speaker 5: on individuals, and it is facilitated through technology so that 284 00:16:41,196 --> 00:16:44,076 Speaker 5: we can quantify, so that we can show and prove 285 00:16:44,396 --> 00:16:47,716 Speaker 5: so that we can further empower our patients. That's just 286 00:16:47,796 --> 00:16:50,716 Speaker 5: one example. 
There's another example as well: another woman, who lives in a rural area in Florida, went to the physician, and, like most of us, we get all of these printouts of our lab work and we don't know what they mean. Let's be real, and not to knock my colleagues, but you will be very lucky if someone goes through with you what each measurement means, right? So this is what happened. This woman went to her provider, and the provider said, I think something is up with your heart. Something is up with your heart. Now, this is a woman who works two jobs and has three kids, so she was like, what should I do? Well, you should go ahead and see a cardiologist. They didn't provide the necessary handoff at all. And so here is where we dropped the ball as a community: this lady just went off and said, well, I guess something is wrong with my heart, we'll see, I'll go to the ER, which is why we have so many ER visits. And so, what was she able to do by wearing one of our rings? She called us, angry.
She said, Dr. Seixas, your device is waking me up every ten minutes, I don't want to be part of your study anymore. Then we looked at our command center and we saw what was happening: this lady's oxygen levels were dropping below eighty percent. Critical. So what we ended up doing, we said, you know what, we don't care about healthcare and insurance right now; we have a study physician. We connected her, and she was able to see a cardiologist in no time. She called us crying, saying thank you very much, because if she hadn't gotten that intervention, she probably would have died, and she would have left her kids orphans. This is what we've seen with Black and brown families all the time. It's not just a healthcare issue; she had healthcare. But are we able to connect the dots? And we believe that through technology we can have a physical in a box to do that with her.

Speaker 1: I want to come back to Ryan with the same question, but let's talk about the technology here. You gave her a ring. Describe this.
Speaker 5: Yeah, I mean, I have the ring here. It's a ring that measures what we call cardiopulmonary coupling. Big term; here's what it means. Typically, your respiratory system, your lungs, operates in conjunction with your circulatory system, your heart. What ends up happening in between, in that physiology, is so many things, and that's where we believe many of the illnesses that go undetected surface. And they surface mostly in your sleep, so you will never feel those symptoms at all. So what we were able to do through the ring is measure cardiopulmonary coupling, because your watch doesn't do that; your watch only measures one or the other. We're able to measure the two, and we're able to measure how the two interact and connect with each other.

Speaker 1: So this ring, is this an off-the-shelf thing or something you...

Speaker 5: We're trying to get it off the shelf, but it's more of a medical device.
And dare I say, it's 345 00:19:54,596 --> 00:19:57,156 Speaker 5: not like any other, but it's not as expensive as 346 00:19:57,196 --> 00:20:00,036 Speaker 5: others. We've worked with some other proprietary ones. It is not 347 00:20:00,076 --> 00:20:00,836 Speaker 5: as expensive. 348 00:20:00,956 --> 00:20:03,676 Speaker 1: So you wear this ring and then it's connected to. 349 00:20:03,716 --> 00:20:05,156 Speaker 3: What it's connected to. 350 00:20:06,716 --> 00:20:09,876 Speaker 5: A cell phone that we provide, so it's tethered. So 351 00:20:10,076 --> 00:20:13,036 Speaker 5: when you fall asleep, you hit start and it starts 352 00:20:13,076 --> 00:20:15,396 Speaker 5: to measure. It can measure if you're at risk for 353 00:20:15,476 --> 00:20:20,436 Speaker 5: sleep apnea. It can measure if you have significant oxygen 354 00:20:20,796 --> 00:20:23,396 Speaker 5: desaturation, you know, lowering of the levels. 355 00:20:23,076 --> 00:20:24,676 Speaker 1: And that data is coming back. 356 00:20:25,236 --> 00:20:28,396 Speaker 5: Yes, so that data comes back to the command center 357 00:20:28,516 --> 00:20:29,636 Speaker 5: that we are able to 358 00:20:29,676 --> 00:20:32,716 Speaker 1: see, which is at the University of, which is at? Yes. 359 00:20:32,556 --> 00:20:34,596 Speaker 3: In our group at the University of Miami. 360 00:20:34,836 --> 00:20:37,516 Speaker 1: And how many patients do you have on it, for example? 361 00:20:37,236 --> 00:20:40,876 Speaker 5: Yes. So right now we're piloting this in our research studies. 362 00:20:40,876 --> 00:20:44,556 Speaker 5: So we have fifteen hundred participants, African American and Hispanic, 363 00:20:44,556 --> 00:20:48,636 Speaker 5: in urban and rural areas. And we've partnered with community 364 00:20:48,636 --> 00:20:54,036 Speaker 5: health centers, federally qualified health centers.
Oftentimes academic centers are 365 00:20:54,036 --> 00:20:57,276 Speaker 5: the ones who wave the flag 366 00:20:57,356 --> 00:21:00,676 Speaker 5: of technology. What we said at the University of Miami 367 00:21:00,716 --> 00:21:03,036 Speaker 5: is that we have to do more. That it is 368 00:21:03,196 --> 00:21:06,756 Speaker 5: our vocation and it is our mission to really be 369 00:21:07,396 --> 00:21:12,556 Speaker 5: that, you know, supporting force. So we work with the 370 00:21:12,716 --> 00:21:14,756 Speaker 5: largest free clinic in the state of Florida. 371 00:21:17,996 --> 00:21:33,916 Speaker 1: We'll be right back with more from the panel. We're 372 00:21:33,916 --> 00:21:37,596 Speaker 1: back with Mo Katibeh, Doctor Azizi Seixas, and Ryan 373 00:21:37,676 --> 00:21:40,796 Speaker 1: Litt. So walk us through how you use technology to 374 00:21:41,836 --> 00:21:42,796 Speaker 1: answer those questions. 375 00:21:43,916 --> 00:21:46,476 Speaker 2: I think that the place that we start, and 376 00:21:46,516 --> 00:21:50,036 Speaker 2: for some of us in technology, because myself, you know, probably 377 00:21:50,036 --> 00:21:55,076 Speaker 2: more of a technologist, it's to start with the person first, right, 378 00:21:55,116 --> 00:21:59,596 Speaker 2: to observe, to understand, and then augment. But ideally we 379 00:21:59,636 --> 00:22:03,516 Speaker 2: always say complement, not complicate, right. So if there's something 380 00:22:03,516 --> 00:22:04,596 Speaker 2: that's already available, if 381 00:22:04,516 --> 00:22:06,516 Speaker 4: there are tools that are already there, 382 00:22:06,396 --> 00:22:08,436 Speaker 2: can we listen to those tools so that it can 383 00:22:08,516 --> 00:22:10,916 Speaker 2: feel seamless to the first responder.
The last thing 384 00:22:10,916 --> 00:22:12,516 Speaker 2: we want them to do is be playing with new 385 00:22:12,556 --> 00:22:14,956 Speaker 2: tech and buttons and other things to make their jobs 386 00:22:14,996 --> 00:22:18,196 Speaker 2: even more complex. So we sought to make a more 387 00:22:18,196 --> 00:22:22,316 Speaker 2: integrative solution, which is where, you know, five G and software 388 00:22:22,356 --> 00:22:24,956 Speaker 2: and these sorts of things start to, you know, form, 389 00:22:25,156 --> 00:22:28,436 Speaker 2: because it makes sense to do. We've thought a lot 390 00:22:28,476 --> 00:22:33,356 Speaker 2: about bioindicators, like the doctor was talking about. Cardiac arrest is still 391 00:22:33,396 --> 00:22:36,716 Speaker 2: one of the greatest killers in the fire service. Detecting 392 00:22:36,716 --> 00:22:40,556 Speaker 2: blood oxygen levels would be amazing, because if we could 393 00:22:40,556 --> 00:22:43,236 Speaker 2: capture those things as a precursor, we could draw those 394 00:22:43,236 --> 00:22:45,996 Speaker 2: individuals out before it's too late. The hard part is 395 00:22:45,996 --> 00:22:49,276 Speaker 2: the stress. It's such a high stress environment that 396 00:22:49,996 --> 00:22:52,156 Speaker 2: we need the technology to get to a point where 397 00:22:52,196 --> 00:22:54,796 Speaker 2: it can actually give us that accuracy when we need 398 00:22:54,836 --> 00:22:56,876 Speaker 2: it, and not tell us after the cardiac arrest 399 00:22:56,876 --> 00:22:59,836 Speaker 2: has already happened, oh, you know, this person's about to 400 00:22:59,836 --> 00:23:02,796 Speaker 2: have one. So for us we look at interfacing with 401 00:23:02,876 --> 00:23:06,836 Speaker 2: other technology. But inevitably what got interesting is phones had 402 00:23:06,836 --> 00:23:09,556 Speaker 2: a role to play, right, and in a couple of 403 00:23:09,556 --> 00:23:11,876 Speaker 2: different ways, one of which is the compute.
All the 404 00:23:11,916 --> 00:23:13,636 Speaker 2: things that phones can do for all of us in 405 00:23:13,636 --> 00:23:16,756 Speaker 2: our daily lives, those are great assets and tools for 406 00:23:16,796 --> 00:23:20,036 Speaker 2: the fire service. Right now, they literally have that radio 407 00:23:20,116 --> 00:23:22,876 Speaker 2: I explained before, and rarely much else. 408 00:23:23,396 --> 00:23:24,156 Speaker 4: So an example. 409 00:23:24,636 --> 00:23:27,356 Speaker 2: Again, we're human centric, so we stay with people, 410 00:23:27,356 --> 00:23:30,036 Speaker 2: we embed in fire stations. And I was following a 411 00:23:30,076 --> 00:23:33,396 Speaker 2: fire chief and the alarms went off, and we went 412 00:23:33,436 --> 00:23:36,396 Speaker 2: off to an emergency event, and I watched him as 413 00:23:36,396 --> 00:23:39,236 Speaker 2: he pulled out two radios, turned each one to a 414 00:23:39,236 --> 00:23:42,636 Speaker 2: different channel, placed them against his ears, and looked up 415 00:23:42,636 --> 00:23:45,236 Speaker 2: at the event and proceeded to manage it. Manage it, 416 00:23:45,236 --> 00:23:49,196 Speaker 2: in other words, keep it safe, you know, mitigate the emergency. Thankfully, 417 00:23:49,516 --> 00:23:53,116 Speaker 2: everything was all clear, nobody got hurt. We went back 418 00:23:53,156 --> 00:23:55,516 Speaker 2: to the station and I asked him, hey, Chief, have 419 00:23:55,596 --> 00:23:57,436 Speaker 2: you taught yourself over the years to listen to two 420 00:23:57,476 --> 00:24:00,996 Speaker 2: conversations at the same time? And he's like, nah. He's like, 421 00:24:01,156 --> 00:24:04,876 Speaker 2: the intensity draws my attention. So he listens for the 422 00:24:04,916 --> 00:24:07,356 Speaker 2: intensity of the voice to say, this might be something, 423 00:24:07,396 --> 00:24:08,516 Speaker 2: it's time for me to listen. 424 00:24:08,636 --> 00:24:10,396 Speaker 4: Oh, and that's so fascinating.
425 00:24:10,476 --> 00:24:12,876 Speaker 2: Yeah. And the thought process, coming home, driving back 426 00:24:12,876 --> 00:24:15,036 Speaker 2: to our headquarters in Buffalo is a bit of a drive, 427 00:24:15,636 --> 00:24:19,516 Speaker 2: I thought, computers don't have ears, right? What about the 428 00:24:19,516 --> 00:24:23,036 Speaker 2: idea of opening up a phone and allowing the phone 429 00:24:23,036 --> 00:24:25,436 Speaker 2: to listen to as many conversations as may be happening at 430 00:24:25,436 --> 00:24:28,156 Speaker 2: any given time, and maybe take it a little further? 431 00:24:28,276 --> 00:24:31,396 Speaker 2: Instead of just listening for intensity, we can actually listen 432 00:24:31,436 --> 00:24:34,636 Speaker 2: to that conversation and interpret it. And that was literally 433 00:24:34,636 --> 00:24:37,836 Speaker 2: the dawn of us starting to use AI. And, you know, 434 00:24:37,876 --> 00:24:39,916 Speaker 2: when we think about other tools, what other tools do 435 00:24:39,996 --> 00:24:41,716 Speaker 2: we have that can fundamentally bring that? 436 00:24:42,076 --> 00:24:46,116 Speaker 1: So, just so I understand, we're at a complex fire scene. 437 00:24:46,236 --> 00:24:51,756 Speaker 1: We have multiple firefighters, multiple people talking on radios. The 438 00:24:51,796 --> 00:24:53,756 Speaker 1: guy in charge has to make sense of, has to 439 00:24:53,796 --> 00:24:57,156 Speaker 1: coordinate, all the things going on. And you're saying we could 440 00:24:57,196 --> 00:25:02,076 Speaker 1: have AI listen to all of those conversations simultaneously and 441 00:25:02,156 --> 00:25:06,156 Speaker 1: do what exactly? Prioritize them? Summarize them? How does the 442 00:25:06,196 --> 00:25:09,556 Speaker 1: AI interface with the human decision maker?
443 00:25:09,796 --> 00:25:11,676 Speaker 2: Yeah. So the nice part is you can teach it 444 00:25:11,756 --> 00:25:15,036 Speaker 2: what you want to listen for. So a lot 445 00:25:15,036 --> 00:25:18,036 Speaker 2: of times there are operative words of concern that are communicated. 446 00:25:18,356 --> 00:25:20,476 Speaker 4: They want to know when certain indicators happen. 447 00:25:20,876 --> 00:25:23,356 Speaker 2: But let's be honest, the real thing that most people 448 00:25:23,356 --> 00:25:26,156 Speaker 2: are looking for is when the firefighter is under duress, 449 00:25:26,316 --> 00:25:29,396 Speaker 2: when the firefighter's at risk of a loss of life, 450 00:25:29,596 --> 00:25:33,436 Speaker 2: so mayday, and these types of situations are pretty consistent. 451 00:25:33,676 --> 00:25:35,276 Speaker 2: So the way we think about it is we take 452 00:25:35,356 --> 00:25:40,116 Speaker 2: the communication standard operating procedure. How do people communicate officially 453 00:25:40,156 --> 00:25:41,396 Speaker 2: through these radio systems? 454 00:25:41,636 --> 00:25:42,836 Speaker 4: When do we know it's bad? 455 00:25:43,236 --> 00:25:45,596 Speaker 2: Let's teach the AI to listen for that, and then 456 00:25:45,596 --> 00:25:47,676 Speaker 2: that way they rise to the top. We 457 00:25:47,716 --> 00:25:50,156 Speaker 2: have a software interface, of course, and the chief will 458 00:25:50,156 --> 00:25:53,276 Speaker 2: see someone just said something that is of concern. They 459 00:25:53,276 --> 00:25:56,156 Speaker 2: turn red, they glow, we show them where they're located, 460 00:25:56,436 --> 00:25:57,916 Speaker 2: and then the chief can take it from there. 461 00:25:58,036 --> 00:25:59,716 Speaker 1: So the chief's looking at his phone, or is 462 00:25:59,716 --> 00:26:01,876 Speaker 4: he? So the chief is actually looking at a tablet.
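[Editor's note: the "teach the AI to listen for operative words" idea can be caricatured in a few lines. Real systems transcribe live audio and use trained models; the phrase list, function names, and radio traffic below are invented purely for illustration and are not 3AM Innovations' actual code.]

```python
# Distress phrases a department's standard operating procedure might define.
DISTRESS_PHRASES = ("mayday", "firefighter down", "low air", "trapped")

def flag_traffic(transmissions):
    """transmissions: list of (unit_id, transcribed_text) tuples.
    Returns unit_ids whose traffic contains a distress phrase, in the
    order heard, so they can 'rise to the top' of the command interface."""
    flagged = []
    for unit_id, text in transmissions:
        lowered = text.lower()
        if any(phrase in lowered for phrase in DISTRESS_PHRASES):
            flagged.append(unit_id)
    return flagged

traffic = [
    ("Engine 7", "Interior attack, conditions improving"),
    ("Ladder 3", "Mayday, mayday, firefighter down on floor two"),
    ("Engine 2", "Low air, backing out now"),
]
print(flag_traffic(traffic))  # prints ['Ladder 3', 'Engine 2']
```

The sketch mirrors the workflow described above: every channel is "heard" at once, and only the units matching the SOP's words of concern are surfaced to the human.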
463 00:26:01,996 --> 00:26:04,036 Speaker 2: A tablet just because you want a little bit more 464 00:26:04,036 --> 00:26:06,276 Speaker 2: surface area to kind of be able to. 465 00:26:07,116 --> 00:26:10,156 Speaker 1: In real time, the tablet is ranking everybody and 466 00:26:10,516 --> 00:26:14,196 Speaker 1: prioritizing the person who is in most distress or under 467 00:26:14,236 --> 00:26:14,796 Speaker 1: the most stress. 468 00:26:14,916 --> 00:26:16,676 Speaker 2: Yes. And then the other nice part with the phone, 469 00:26:16,716 --> 00:26:19,156 Speaker 2: because of the amount of data that's available, we can 470 00:26:19,356 --> 00:26:22,516 Speaker 2: localize people in three dimensional space, so we can actually 471 00:26:22,556 --> 00:26:25,876 Speaker 2: show where they exist in the world, but inside even 472 00:26:25,916 --> 00:26:28,556 Speaker 2: a given structure, and with height considered. 473 00:26:28,996 --> 00:26:31,236 Speaker 4: So that's where we fuse these things together. 474 00:26:31,316 --> 00:26:33,836 Speaker 2: So we use some of the capabilities inside the phone, 475 00:26:33,876 --> 00:26:36,236 Speaker 2: all the sensors, all the networks, and we can say, hey, 476 00:26:36,276 --> 00:26:37,236 Speaker 2: this person's up here. 477 00:26:37,476 --> 00:26:39,636 Speaker 4: Oh, by the way, through the AI, they 478 00:26:39,516 --> 00:26:42,116 Speaker 2: said something that you need to know about. So now 479 00:26:42,116 --> 00:26:45,356 Speaker 2: we can really localize, this is where that person exists, 480 00:26:45,756 --> 00:26:47,916 Speaker 2: and then from there they can decide what they want 481 00:26:47,956 --> 00:26:48,116 Speaker 2: to do.
482 00:26:48,156 --> 00:26:51,956 Speaker 1: Well, I'm listening to these two, Ryan and 483 00:26:51,996 --> 00:26:55,676 Speaker 1: Azizi, and I'm seeing, so here are people in very 484 00:26:55,676 --> 00:27:01,196 Speaker 1: specific corners of the world taking these technologies and 485 00:27:01,396 --> 00:27:05,116 Speaker 1: doing very, very practical things with them. I'm curious, how 486 00:27:05,156 --> 00:27:09,796 Speaker 1: does T-Mobile interact in this? Are you, are you 487 00:27:09,876 --> 00:27:14,116 Speaker 1: a cheerleader? Are you an instigator? Are you, are you 488 00:27:14,196 --> 00:27:18,276 Speaker 1: the person who helps them? There must be obstacles. I mean, 489 00:27:18,316 --> 00:27:21,196 Speaker 1: you're changing the way people do business. I'm curious what role 490 00:27:21,276 --> 00:27:23,876 Speaker 1: T-Mobile plays in this, and how would 491 00:27:23,876 --> 00:27:26,876 Speaker 1: you characterize, yes, your partnership? 492 00:27:26,676 --> 00:27:27,596 Speaker 3: At the end of the day, 493 00:27:28,236 --> 00:27:31,436 Speaker 6: what we love to do is to visit with business 494 00:27:31,516 --> 00:27:36,476 Speaker 6: customers on, what's your challenge? Like, what is the heart 495 00:27:36,556 --> 00:27:40,676 Speaker 6: of what you're trying to accomplish with your solution, your product, 496 00:27:40,796 --> 00:27:46,316 Speaker 6: your service, and how can we build capabilities in and 497 00:27:46,356 --> 00:27:50,636 Speaker 6: around our network that really support that?
So as an example, 498 00:27:51,796 --> 00:27:54,396 Speaker 6: and I can touch on both of the use cases 499 00:27:54,396 --> 00:27:55,756 Speaker 6: that have come up in the last few minutes, but 500 00:27:56,156 --> 00:27:58,276 Speaker 6: talking about Ryan and three AM for just a moment, 501 00:27:59,196 --> 00:28:03,676 Speaker 6: I loved that the conversation really oriented around, hey, as you 502 00:28:03,756 --> 00:28:07,476 Speaker 6: think about your platform and the situational awareness that you're 503 00:28:07,476 --> 00:28:10,036 Speaker 6: trying to give the chief or whoever's doing command and 504 00:28:10,076 --> 00:28:13,916 Speaker 6: control of that specific situation, how can we leverage both 505 00:28:13,996 --> 00:28:18,556 Speaker 6: devices, whether it's wearables that give you insights if 506 00:28:18,556 --> 00:28:22,116 Speaker 6: a person can't even talk, perhaps smoke inhalation and they've 507 00:28:22,196 --> 00:28:24,596 Speaker 6: fallen, and okay, now I need to know, hey, they're 508 00:28:24,676 --> 00:28:25,876 Speaker 6: not moving, how's 509 00:28:25,636 --> 00:28:26,876 Speaker 4: that information coming back? 510 00:28:27,436 --> 00:28:31,316 Speaker 6: Using the devices for things like both near field communications 511 00:28:31,356 --> 00:28:35,596 Speaker 6: and barometric pressure, which has been in the phones for 512 00:28:35,756 --> 00:28:39,076 Speaker 6: six, seven years at this point. Again, that lets you know, hey, 513 00:28:39,116 --> 00:28:41,556 Speaker 6: not only the X and Y axis of where they are, 514 00:28:41,796 --> 00:28:44,516 Speaker 6: but how many floors up in a building they are, 515 00:28:44,556 --> 00:28:48,996 Speaker 6: which is incredibly important for firefighters. And then over time 516 00:28:49,476 --> 00:28:54,436 Speaker 6: we're also going to be enabling API access into the network.
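[Editor's note: the barometric point above is worth unpacking. Near sea level, pressure falls by roughly 0.12 hPa per meter of elevation, so a phone's pressure sensor plus a ground-level reference reading gives a floor estimate via the standard atmosphere formula. The function name and the assumed three-meter storey height below are illustrative, not any vendor's actual implementation.]

```python
def floors_above_ground(pressure_hpa, ground_pressure_hpa, floor_height_m=3.0):
    """Estimate storeys above ground from barometric pressure using the
    International Standard Atmosphere altitude formula. floor_height_m
    is an assumed storey height; real systems calibrate per building."""
    ratio = pressure_hpa / ground_pressure_hpa
    altitude_m = 44330.0 * (1.0 - ratio ** (1.0 / 5.255))
    return round(altitude_m / floor_height_m)

# A drop of ~1 hPa from a sea-level reference is roughly 8 to 9 m of height.
print(floors_above_ground(1012.2, 1013.25))  # prints 3: about three floors up
```

This is why the speaker calls barometric pressure the Z axis to GPS's X and Y: the absolute reading drifts with weather, but the difference from a fresh ground reference is stable enough to count floors.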
517 00:28:54,436 --> 00:28:57,436 Speaker 6: We've announced this, it's coming out in the near future, 518 00:28:57,716 --> 00:29:01,516 Speaker 6: which will allow the three AM platform to enhance all 519 00:29:01,516 --> 00:29:05,356 Speaker 6: of the capabilities they already have, around things like even 520 00:29:05,396 --> 00:29:09,396 Speaker 6: more precise location, quality of service. Hey, I'm in the building, 521 00:29:09,476 --> 00:29:12,756 Speaker 6: it's burning, I need to dial up the network resources 522 00:29:12,756 --> 00:29:15,956 Speaker 6: to support everything that's happening there. It's the number one thing. 523 00:29:16,196 --> 00:29:20,556 Speaker 2: What does API mean, by the way? Application programming interface? 524 00:29:20,676 --> 00:29:21,716 Speaker 4: Thank you very much. 525 00:29:21,876 --> 00:29:26,916 Speaker 6: It's basically, in plain English, a way of building a 526 00:29:27,036 --> 00:29:30,596 Speaker 6: door so that someone else's platform can come knock on 527 00:29:30,676 --> 00:29:32,876 Speaker 6: the door, the door is opened, and we give them 528 00:29:32,956 --> 00:29:36,716 Speaker 6: very specific capabilities on things that they can do with 529 00:29:36,916 --> 00:29:43,316 Speaker 6: network resourcing in real time: quality of service, location, application support. 530 00:29:43,876 --> 00:29:48,556 Speaker 1: So we have this complicated situation happening, 531 00:29:48,996 --> 00:29:53,116 Speaker 1: and at various moments we want to use as many 532 00:29:53,156 --> 00:29:57,556 Speaker 1: resources as possible to answer very specific problems, and 533 00:29:57,636 --> 00:30:02,276 Speaker 1: you're making sure that the necessary network resources go 534 00:30:02,516 --> 00:30:03,836 Speaker 1: to the right place at the right time. 535 00:30:03,996 --> 00:30:06,036 Speaker 6: Exactly. All of these, at the heart of it, 536 00:30:06,036 --> 00:30:09,636 Speaker 6: setting aside the technology, are ways of ensuring that 537 00:30:09,716 --> 00:30:14,156 Speaker 6: you're diverting or allocating the right amount of resources to 538 00:30:14,236 --> 00:30:18,836 Speaker 6: a given use case, so that the first responder or 539 00:30:18,956 --> 00:30:22,756 Speaker 6: the doctor, or the mobile network that's enabling, you know, 540 00:30:22,876 --> 00:30:25,396 Speaker 6: this clinical health at scale, no matter where you happen 541 00:30:25,436 --> 00:30:27,836 Speaker 6: to be in America, is available for them to be 542 00:30:27,876 --> 00:30:28,676 Speaker 6: able to do that thing. 543 00:30:29,476 --> 00:30:32,476 Speaker 1: I read a study recently, a couple weeks ago maybe, 544 00:30:32,516 --> 00:30:34,556 Speaker 1: so you may have seen it. 545 00:30:34,556 --> 00:30:39,916 Speaker 1: It was some study talking about an AI diagnostic tool 546 00:30:40,156 --> 00:30:42,476 Speaker 1: for doctors. Did you guys see this? And it's like, 547 00:30:43,996 --> 00:30:46,276 Speaker 1: arm number one was the doctor all by himself doing 548 00:30:46,316 --> 00:30:49,756 Speaker 1: a diagnosis, and they're like seventy two percent right. Arm 549 00:30:49,836 --> 00:30:54,156 Speaker 1: number two is doctor plus AI, and it was seventy seven. 550 00:30:54,676 --> 00:30:57,156 Speaker 1: Arm number three was AI alone, and it was ninety two. 551 00:30:58,076 --> 00:31:01,796 Speaker 1: And the conclusion of the study was, we gave doctors these tools 552 00:31:02,156 --> 00:31:04,316 Speaker 1: and most of the time they didn't want to use them. 553 00:31:04,756 --> 00:31:10,756 Speaker 1: So I'm curious about that problem in your worlds. When 554 00:31:10,796 --> 00:31:13,276 Speaker 1: do you get pushback? You're sure that you've 555 00:31:13,316 --> 00:31:17,436 Speaker 1: given a marvelous suite of tools to people out in 556 00:31:17,596 --> 00:31:20,876 Speaker 1: these fields, but do they use them? Is there a roadblock there?
557 00:31:20,876 --> 00:31:22,596 Speaker 3: And I can comment on that. 558 00:31:22,836 --> 00:31:25,356 Speaker 5: So I know that study very well, because there's some 559 00:31:25,396 --> 00:31:29,236 Speaker 5: of my colleagues who did that work. Really? Yeah. So 560 00:31:29,236 --> 00:31:32,756 Speaker 5: here, in terms of pushback, definitely. And I 561 00:31:32,756 --> 00:31:35,436 Speaker 5: think in healthcare, one of the things that we 562 00:31:35,476 --> 00:31:38,356 Speaker 5: get pushback around is our own data privacy and security. 563 00:31:38,516 --> 00:31:45,356 Speaker 5: That is huge, particularly for information technology departments. But what 564 00:31:45,396 --> 00:31:48,556 Speaker 5: we have done, because we know that there is 565 00:31:48,596 --> 00:31:51,836 Speaker 5: going to be some, this is disruptive technology and we 566 00:31:52,076 --> 00:31:54,836 Speaker 5: have to be able to better socialize it, we have 567 00:31:54,956 --> 00:31:58,356 Speaker 5: led an entire year of what we call innovation retreats 568 00:31:58,396 --> 00:32:02,316 Speaker 5: at the University of Miami, so that we can give 569 00:32:02,356 --> 00:32:04,876 Speaker 5: it to them in bite size format, so that they 570 00:32:05,116 --> 00:32:08,876 Speaker 5: understand that it's not just focused on the technology, but 571 00:32:09,156 --> 00:32:12,116 Speaker 5: how is it that we can actually help to solve 572 00:32:12,156 --> 00:32:12,836 Speaker 5: what they're doing. 573 00:32:12,996 --> 00:32:14,596 Speaker 3: And so when we broach it. 574 00:32:14,356 --> 00:32:16,676 Speaker 1: The ones that you're talking about, clinicians?
575 00:32:16,116 --> 00:32:19,076 Speaker 5: Clinicians, and not just clinicians, because I think when you're 576 00:32:19,076 --> 00:32:24,476 Speaker 5: talking about healthcare, let me just kind of deconstruct: behind 577 00:32:24,716 --> 00:32:30,516 Speaker 5: that provider, you have administrative staff, billing, you know, scheduling, 578 00:32:31,276 --> 00:32:35,276 Speaker 5: all of those people who are critical to ensure the operations. 579 00:32:35,636 --> 00:32:39,756 Speaker 5: And particularly some of those operations are very mundane and 580 00:32:39,996 --> 00:32:43,196 Speaker 5: very time consuming, and it collects a lot of data, 581 00:32:43,276 --> 00:32:45,316 Speaker 5: and therefore, as a result, it can lead to a 582 00:32:45,356 --> 00:32:48,916 Speaker 5: tremendous amount of error. So what we're trying to do, 583 00:32:48,956 --> 00:32:52,836 Speaker 5: and what we did, was to lead this digital innovation transformation, 584 00:32:53,076 --> 00:32:56,956 Speaker 5: set of retreats, focusing on the problem, trying to understand 585 00:32:56,956 --> 00:33:01,156 Speaker 5: what their pain points are, and then have the technology 586 00:33:01,196 --> 00:33:04,116 Speaker 5: come second, or have the technology come last. 587 00:33:04,116 --> 00:33:06,916 Speaker 1: Give me an example of what someone's pain point might be. 588 00:33:07,476 --> 00:33:08,876 Speaker 1: What's, what's an objection you get? 589 00:33:09,196 --> 00:33:16,236 Speaker 5: Yeah, yeah. So, for example, digital literacy. Some providers, 590 00:33:17,036 --> 00:33:20,836 Speaker 5: unfortunately, are stuck in their ways. They believe that they 591 00:33:20,876 --> 00:33:23,756 Speaker 5: want to feel and touch the patient, as they should, 592 00:33:23,756 --> 00:33:26,716 Speaker 5: and we're not saying, what we're proposing is, we're not 593 00:33:26,756 --> 00:33:30,076 Speaker 5: saying that they shouldn't do that.
But I think some 594 00:33:30,156 --> 00:33:34,116 Speaker 5: of them have a form of technophobia as well. And 595 00:33:34,156 --> 00:33:37,556 Speaker 5: by digital literacy, I'm talking, you know, they may feel 596 00:33:37,596 --> 00:33:39,596 Speaker 5: as if they don't know, or they may not be 597 00:33:39,636 --> 00:33:42,796 Speaker 5: as facile in working some of the technology. So we 598 00:33:42,876 --> 00:33:45,596 Speaker 5: really pare it down, you know, for them. And I 599 00:33:45,636 --> 00:33:48,796 Speaker 5: think some of the technology, I think, gets some pushback as well. 600 00:33:49,196 --> 00:33:51,956 Speaker 5: I think many people, especially in the community that we serve, 601 00:33:52,356 --> 00:33:55,716 Speaker 5: many people believe, and it's an important issue, that access 602 00:33:55,756 --> 00:33:58,036 Speaker 5: is a huge issue for their patients. So they may 603 00:33:58,076 --> 00:34:01,676 Speaker 5: say, well, my patient doesn't have a cell phone. And 604 00:34:01,716 --> 00:34:03,996 Speaker 5: I'm like, we push back and say, actually, Pew 605 00:34:04,116 --> 00:34:08,396 Speaker 5: says ninety two percent of the US population, particularly low 606 00:34:08,396 --> 00:34:12,356 Speaker 5: income folks, actually have a smartphone or some form of 607 00:34:12,516 --> 00:34:16,436 Speaker 5: mobile device. Now, it's a different thing when we're talking 608 00:34:16,476 --> 00:34:18,076 Speaker 5: about, do they know how to use it? Do they 609 00:34:18,076 --> 00:34:19,956 Speaker 5: know how to optimally use it as well? And this 610 00:34:20,036 --> 00:34:22,356 Speaker 5: is what we do as well. We provide training to 611 00:34:22,596 --> 00:34:25,356 Speaker 5: patients on how to use it as well. 612 00:34:25,356 --> 00:34:27,836 Speaker 5: So those are some of the unique pushbacks. And then 613 00:34:27,836 --> 00:34:31,916 Speaker 5: obviously data, where do my data go?
And providers 614 00:34:31,956 --> 00:34:33,956 Speaker 5: ask those questions as well. And I think this is 615 00:34:33,996 --> 00:34:39,556 Speaker 5: why having very robust, secure environments is important. So, 616 00:34:39,596 --> 00:34:42,676 Speaker 5: similar to what we do, especially with the box, we 617 00:34:42,716 --> 00:34:45,436 Speaker 5: have about seven or so devices that were not 618 00:34:45,636 --> 00:34:48,036 Speaker 5: built to communicate with each other. So, you know, the 619 00:34:48,076 --> 00:34:51,316 Speaker 5: API is another thing, and we call it handshakes, you know. 620 00:34:51,796 --> 00:34:54,156 Speaker 5: And what we try to do is, we said we 621 00:34:54,196 --> 00:34:56,996 Speaker 5: wanted to create a remote health monitoring solution that's like 622 00:34:57,036 --> 00:35:00,516 Speaker 5: the Walmart version, because typically when you look at remote 623 00:35:00,556 --> 00:35:04,356 Speaker 5: health monitoring solutions, they're very expensive and quite proprietary. We 624 00:35:04,436 --> 00:35:07,516 Speaker 5: want providers and patients to 625 00:35:07,516 --> 00:35:11,516 Speaker 5: be empowered: bring your own device, whatever device you have, 626 00:35:11,556 --> 00:35:15,716 Speaker 5: as long as it actually has the necessary API connectivity, 627 00:35:16,036 --> 00:35:17,876 Speaker 5: then we'll be able to collect that data. So 628 00:35:17,996 --> 00:35:20,316 Speaker 5: those are some of the pushbacks that we've experienced. 629 00:35:20,756 --> 00:35:24,436 Speaker 1: Ryan, do you, surely this must be, I mean, you're 630 00:35:24,596 --> 00:35:28,156 Speaker 1: entering a field that has been fighting fires in the 631 00:35:28,156 --> 00:35:29,836 Speaker 1: same way for a very, very long time. 632 00:35:30,156 --> 00:35:30,756 Speaker 4: Yeah. 633 00:35:30,996 --> 00:35:33,716 Speaker 2: Hate the way things are, but hate change probably even more.
634 00:35:34,316 --> 00:35:38,036 Speaker 2: That's their saying, not mine, I promise. The first place 635 00:35:38,036 --> 00:35:40,916 Speaker 2: that it started was absolutely social media. The biggest fear 636 00:35:41,356 --> 00:35:44,316 Speaker 2: in the fire service about even bringing a phone into 637 00:35:44,356 --> 00:35:46,716 Speaker 2: the mix, or let's call it a smart device, is 638 00:35:46,756 --> 00:35:49,956 Speaker 2: the propensity to share this information publicly. But the reality was, 639 00:35:49,996 --> 00:35:52,596 Speaker 2: I reminded them, when you go to a supermarket, you 640 00:35:52,636 --> 00:35:55,196 Speaker 2: know the kids that are ringing up your groceries? That's 641 00:35:55,236 --> 00:35:58,396 Speaker 2: a Windows computer, but they're not cruising around on social media. 642 00:35:58,676 --> 00:36:01,156 Speaker 2: We can configure the device to only do the thing 643 00:36:01,236 --> 00:36:04,076 Speaker 2: you want it to do, so we can take advantage of 644 00:36:04,116 --> 00:36:04,876 Speaker 2: the capability. 645 00:36:05,036 --> 00:36:06,716 Speaker 4: So that was the first obstacle, and 646 00:36:06,756 --> 00:36:08,956 Speaker 2: now that we deal with AI, the big one is 647 00:36:09,116 --> 00:36:13,996 Speaker 2: hallucination and inaccuracy, naturally, right? Well, great, I like this idea, 648 00:36:14,116 --> 00:36:16,956 Speaker 2: but what happens if it's wrong? And I think, to 649 00:36:17,076 --> 00:36:20,876 Speaker 2: quote a chief that I work with at the Philadelphia 650 00:36:20,876 --> 00:36:25,436 Speaker 2: Fire Department, he actually wrote his thesis on leveraging AI 651 00:36:25,516 --> 00:36:29,116 Speaker 2: in the Philadelphia Fire Department and beyond.
And his argument 652 00:36:29,276 --> 00:36:33,036 Speaker 2: was decision support: not to make the decisions for you, 653 00:36:33,196 --> 00:36:35,636 Speaker 2: not to ask it and you shall receive and just 654 00:36:35,716 --> 00:36:38,556 Speaker 2: do what it says. Have it go retrieve the things 655 00:36:38,756 --> 00:36:41,676 Speaker 2: that you need, right? And so this concept of augmented 656 00:36:41,716 --> 00:36:44,236 Speaker 2: retrieval, giving it domain specific knowledge: 657 00:36:44,636 --> 00:36:47,596 Speaker 4: here is something about what you're dealing with, let me go 658 00:36:47,596 --> 00:36:50,436 Speaker 2: find the best information and present it to you, so 659 00:36:50,516 --> 00:36:53,036 Speaker 2: that you can decide from there. I think those bits 660 00:36:53,076 --> 00:36:55,636 Speaker 2: are essential. And then lastly, absolutely, for all of 661 00:36:55,676 --> 00:36:58,396 Speaker 2: us, is security. So the nice part is Microsoft and 662 00:36:58,436 --> 00:37:02,556 Speaker 2: some of these groups have made sort of enterprise contained AIs, 663 00:37:02,796 --> 00:37:06,156 Speaker 2: so we're not dispersing this throughout some central knowledge. This 664 00:37:06,236 --> 00:37:09,116 Speaker 2: is specific to the fire department, which from our perspective 665 00:37:09,196 --> 00:37:11,276 Speaker 2: actually helps accuracy go up. 666 00:37:11,436 --> 00:37:14,276 Speaker 1: But when someone from three AM 667 00:37:14,356 --> 00:37:16,676 Speaker 1: goes out on a sales call, you go and visit 668 00:37:16,676 --> 00:37:19,956 Speaker 1: a fire department somewhere and you say, we have this 669 00:37:20,316 --> 00:37:23,276 Speaker 1: whole set of ideas to solve some problems for you, 670 00:37:23,276 --> 00:37:25,836 Speaker 1: you have your conversation with the chief, who you've never 671 00:37:25,876 --> 00:37:27,796 Speaker 1: talked to before. What does the chief say?
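[Editor's note: the "decision support through retrieval" idea the chief describes, go fetch the most relevant piece of the department's own knowledge and let the human decide, can be sketched in miniature. The corpus text and word-overlap scoring below are invented placeholders; production systems use embedding-based retrieval over real standard operating procedures.]

```python
import re

# A toy corpus standing in for a department's own documents.
SOP_CORPUS = {
    "mayday procedure": "On mayday, clear the channel and deploy the RIT team.",
    "low air protocol": "Below 25 percent air, exit with your crew and report.",
    "ventilation basics": "Coordinate ventilation with the attack line advance.",
}

def words(text):
    """Lowercased word set, punctuation stripped."""
    return set(re.findall(r"[a-z]+", text.lower()))

def retrieve(query):
    """Return the passage sharing the most words with the query; the human
    reads it and decides, the system never acts on its own."""
    q = words(query)
    best = max(SOP_CORPUS, key=lambda title: len(q & words(SOP_CORPUS[title])))
    return SOP_CORPUS[best]

print(retrieve("crew reporting low air on floor two"))
# prints: Below 25 percent air, exit with your crew and report.
```

Because retrieval is restricted to the department's own documents, the same property the speaker credits to enterprise-contained AIs, answers can only come from vetted material, which is what makes the "support, don't decide" framing workable.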
672 00:37:29,716 --> 00:37:32,196 Speaker 2: The chief, immediately, any single time it has to 673 00:37:32,196 --> 00:37:35,676 Speaker 2: do with the safety of their firefighters, they're obviously compelled to listen. 674 00:37:37,036 --> 00:37:39,316 Speaker 2: The hard part, I think, really is how much change 675 00:37:39,356 --> 00:37:41,236 Speaker 2: is this going to bring to my organization? In other words, 676 00:37:41,356 --> 00:37:45,396 Speaker 2: how much friction is me implementing this technology going to bring? 677 00:37:45,556 --> 00:37:48,076 Speaker 2: And so one of my proudest moments, which sounds super innocuous: 678 00:37:48,236 --> 00:37:50,556 Speaker 2: we did the Fourth of July, and it was hundreds 679 00:37:50,556 --> 00:37:53,076 Speaker 2: of people, and they all forgot they had the device. 680 00:37:53,396 --> 00:37:57,356 Speaker 2: And I was super happy, right, because it became invisible. 681 00:37:57,756 --> 00:38:01,596 Speaker 2: And if we can do that, you know, the obstacles 682 00:38:01,596 --> 00:38:04,236 Speaker 2: are sort of overcome, right? So the idea of automation 683 00:38:04,756 --> 00:38:07,316 Speaker 2: and streamlining all of this contextually, just put the smart 684 00:38:07,316 --> 00:38:10,356 Speaker 2: thing in your pocket, don't worry about anything else. That's 685 00:38:10,396 --> 00:38:13,236 Speaker 2: our fundamental goal, and that's the way that we overcome 686 00:38:13,276 --> 00:38:14,676 Speaker 2: those objections. 687 00:38:14,756 --> 00:38:20,556 Speaker 1: All these innovations have multiple constituencies, right? And I 688 00:38:21,916 --> 00:38:23,596 Speaker 1: wonder if you can sort of opine on this. This 689 00:38:24,036 --> 00:38:28,756 Speaker 1: must be a kind of perennial issue for anyone 690 00:38:28,876 --> 00:38:32,836 Speaker 1: like T-Mobile who is driving innovation: to ask yourself, 691 00:38:32,876 --> 00:38:36,596 Speaker 1: who's the customer here? Right?
Do you have these, do 692 00:38:36,636 --> 00:38:40,636 Speaker 1: you face this kind of tension? Is it 693 00:38:40,676 --> 00:38:44,476 Speaker 1: important to clarify who we're serving with this innovation before 694 00:38:44,516 --> 00:38:46,996 Speaker 1: you go down the road towards pushing the innovation? 695 00:38:47,556 --> 00:38:52,556 Speaker 6: It really goes back to, if you will, selling through curiosity, 696 00:38:52,956 --> 00:38:55,476 Speaker 6: meaning when you're sitting down with the customer, you're trying 697 00:38:55,516 --> 00:39:00,636 Speaker 6: to understand what it is that they're trying to accomplish, 698 00:39:00,756 --> 00:39:03,836 Speaker 6: and is it for their employees, is it business to 699 00:39:03,916 --> 00:39:06,876 Speaker 6: business to consumers, or is it their end consumer that 700 00:39:06,876 --> 00:39:10,156 Speaker 6: they're trying to solve the problem for, and then designing the 701 00:39:10,196 --> 00:39:14,836 Speaker 6: solution to meet that need. Going back to your 702 00:39:14,876 --> 00:39:18,756 Speaker 6: AI study example just a few minutes ago, this 703 00:39:19,076 --> 00:39:21,516 Speaker 6: is what I love about what we're here today to 704 00:39:21,556 --> 00:39:28,236 Speaker 6: celebrate: unconventional thinking, which inherently is what is to 705 00:39:28,276 --> 00:39:30,476 Speaker 6: the left of me here today, with, you know, doctor 706 00:39:30,516 --> 00:39:34,876 Speaker 6: Azizi and Ryan: individuals who looked at the industries 707 00:39:34,916 --> 00:39:37,516 Speaker 6: in which they were working and thought, there is a 708 00:39:37,556 --> 00:39:41,116 Speaker 6: better way. I don't care how our industry has done 709 00:39:41,156 --> 00:39:45,076 Speaker 6: it before, and can I build something that drives that outcome? 710 00:39:45,316 --> 00:39:46,836 Speaker 3: I mean, with doctor Azizi.
711 00:39:47,756 --> 00:39:52,596 Speaker 6: Clinical studies invariably have been at some central location, 712 00:39:53,276 --> 00:39:58,276 Speaker 6: and what that means is that marginalized groups, underserved groups, 713 00:39:58,556 --> 00:40:02,996 Speaker 6: were being underserved. And so the problem statement was, hey, 714 00:40:03,036 --> 00:40:08,836 Speaker 6: can we bring together low-cost medical devices, stitch them 715 00:40:08,836 --> 00:40:12,196 Speaker 6: together with a connectivity solution which then in real 716 00:40:12,276 --> 00:40:16,436 Speaker 6: time will send that information back, one, so that we 717 00:40:16,476 --> 00:40:18,596 Speaker 6: can learn more on how to better serve these groups, 718 00:40:18,636 --> 00:40:21,676 Speaker 6: but in the case of the cardiac patient that you 719 00:40:21,676 --> 00:40:23,076 Speaker 6: were talking about a little bit ago, 720 00:40:23,516 --> 00:40:24,596 Speaker 4: also save lives. 721 00:40:24,956 --> 00:40:28,836 Speaker 6: So that's the heart of it for me: I love, love, 722 00:40:28,876 --> 00:40:35,916 Speaker 6: love visiting with businesses that are thinking unconventionally, innovatively, and 723 00:40:35,956 --> 00:40:39,356 Speaker 6: then, how can we build something with them to drive 724 00:40:40,076 --> 00:40:42,596 Speaker 6: the outcome, which may be for the business, or in this 725 00:40:42,676 --> 00:40:46,116 Speaker 6: case the end person that's part of the clinical trial. 726 00:40:48,116 --> 00:40:50,556 Speaker 1: Two last questions. We're sadly running out of time, but 727 00:40:50,596 --> 00:40:53,596 Speaker 1: two last questions to both of you. I'm curious about 728 00:40:54,236 --> 00:40:59,476 Speaker 1: how you measure success.
So you, Ryan, you've given this 729 00:41:00,636 --> 00:41:04,956 Speaker 1: marvelous tool to people in very high-stress situations, and 730 00:41:05,396 --> 00:41:11,076 Speaker 1: intuitively we would say you've made the 731 00:41:11,196 --> 00:41:15,996 Speaker 1: job of fighting the fire easier. But 732 00:41:16,036 --> 00:41:18,236 Speaker 1: how do you know, how do you know that's true? 733 00:41:18,276 --> 00:41:20,916 Speaker 1: And how do you know how much you've improved? 734 00:41:20,956 --> 00:41:25,516 Speaker 1: I mean, do you actively go out and collect data 735 00:41:25,596 --> 00:41:28,716 Speaker 1: or feedback or something from the field to understand 736 00:41:28,956 --> 00:41:31,196 Speaker 1: the magnitude of the impact you're having? 737 00:41:31,316 --> 00:41:35,036 Speaker 2: It's a great question, and I get really fired up, 738 00:41:35,156 --> 00:41:39,796 Speaker 2: because competitors or people in the space throw vanity metrics 739 00:41:39,836 --> 00:41:42,316 Speaker 2: around and they try to tell first responders this is 740 00:41:42,356 --> 00:41:44,356 Speaker 2: how much time and how many lives they're going to save, 741 00:41:44,836 --> 00:41:49,396 Speaker 2: and that's a ridiculous concept. It's all relative, right? So, 742 00:41:49,476 --> 00:41:51,476 Speaker 2: to your question, you know, for example, I was at 743 00:41:51,476 --> 00:41:54,756 Speaker 2: a major event. It was actually a marathon, so there 744 00:41:54,756 --> 00:41:57,876 Speaker 2: are a lot of medical issues, people that go into 745 00:41:57,916 --> 00:42:01,396 Speaker 2: cardiac arrest and overexhaustion, and there were code blues, 746 00:42:01,676 --> 00:42:04,396 Speaker 2: which means this person is critical: if we don't get 747 00:42:04,436 --> 00:42:08,996 Speaker 2: them to the hospital immediately, they will likely die.
And immediately 748 00:42:09,036 --> 00:42:11,876 Speaker 2: they go to the tool. And to your question you 749 00:42:11,916 --> 00:42:14,916 Speaker 2: brought up earlier about screens and distraction: we are infinitely 750 00:42:14,996 --> 00:42:17,796 Speaker 2: obsessed with that. The reason why I think automation and 751 00:42:17,836 --> 00:42:19,556 Speaker 2: AI is interesting is because it can be in the 752 00:42:19,556 --> 00:42:21,316 Speaker 2: background and there when you need it. 753 00:42:21,476 --> 00:42:22,436 Speaker 4: That's how we view it. 754 00:42:22,516 --> 00:42:25,356 Speaker 1: So in this instance, is the tool putting the 755 00:42:25,596 --> 00:42:26,556 Speaker 1: code blues at the top? 756 00:42:26,676 --> 00:42:28,996 Speaker 2: Well, so they're putting it up, a code blue gets called in. 757 00:42:29,396 --> 00:42:32,156 Speaker 2: They immediately look at their people on the map, and 758 00:42:32,636 --> 00:42:35,396 Speaker 2: typically they would have emergency resources that are assigned to 759 00:42:35,436 --> 00:42:38,636 Speaker 2: specific areas for an event, and you would just say, okay, 760 00:42:38,756 --> 00:42:40,556 Speaker 2: send, you know, ISP two. 761 00:42:40,756 --> 00:42:41,636 Speaker 4: That's where they're going to go. 762 00:42:41,836 --> 00:42:44,556 Speaker 2: But instead they're all the way three blocks down, and 763 00:42:44,596 --> 00:42:46,316 Speaker 2: now that you've made that assignment, it's going to take 764 00:42:46,316 --> 00:42:48,396 Speaker 2: them three blocks to get to the patient. By the 765 00:42:48,436 --> 00:42:50,436 Speaker 2: time you get there, oxygen has been denied to the 766 00:42:50,436 --> 00:42:54,116 Speaker 2: brain for too long and we've lost the patient, right? 767 00:42:54,156 --> 00:42:56,396 Speaker 2: So instead they say, no, no, no, no, ISP three, you 768 00:42:56,556 --> 00:42:59,156 Speaker 2: turn around.
I literally watched them coach them 769 00:42:59,196 --> 00:43:01,236 Speaker 2: back, and the incident commander looked at me and goes, 770 00:43:01,236 --> 00:43:03,916 Speaker 2: your tool has been instrumental today. So those are the 771 00:43:03,956 --> 00:43:05,276 Speaker 2: moments where we saved a life. 772 00:43:05,396 --> 00:43:09,476 Speaker 1: Precision, the precision with which you can allocate 773 00:43:09,556 --> 00:43:12,156 Speaker 1: resources to the problem is greater. 774 00:43:11,916 --> 00:43:15,356 Speaker 2: Right. Yeah. So in those moments, those are the 775 00:43:15,396 --> 00:43:18,756 Speaker 2: things that sort of matter, right? And to your 776 00:43:18,796 --> 00:43:21,596 Speaker 2: question, though, on how do we, you know, sort of 777 00:43:21,836 --> 00:43:24,196 Speaker 2: bring it back to people to show them the impact 778 00:43:24,196 --> 00:43:28,076 Speaker 2: it's driving: again, I think usage creates value. The more 779 00:43:28,156 --> 00:43:30,756 Speaker 2: you use it, the more it's valuable to you. Why? 780 00:43:31,036 --> 00:43:35,116 Speaker 2: Because we actually document all data for all events, forever. 781 00:43:35,516 --> 00:43:37,476 Speaker 2: And then what you can do is you can scrub 782 00:43:37,516 --> 00:43:39,836 Speaker 2: through it and go back in time, from years ago, 783 00:43:40,156 --> 00:43:42,516 Speaker 2: and say, what happened at exactly the three-minute mark 784 00:43:42,596 --> 00:43:45,436 Speaker 2: on this particular event? You can pause it, almost like 785 00:43:45,476 --> 00:43:48,196 Speaker 2: the Matrix, and spin it around and look at it, 786 00:43:48,236 --> 00:43:51,236 Speaker 2: look at all the information that was presented, and that 787 00:43:51,316 --> 00:43:55,396 Speaker 2: becomes mission critical for evolving your best practices, you know, 788 00:43:55,796 --> 00:43:56,276 Speaker 2: things of that nature.
789 00:43:56,796 --> 00:43:57,876 Speaker 1: It becomes a learning tool, 790 00:43:57,956 --> 00:43:58,116 Speaker 3: then. 791 00:43:58,316 --> 00:44:01,396 Speaker 1: Yes. In addition to its real-time importance, 792 00:44:01,676 --> 00:44:04,276 Speaker 1: it has a kind of retrospective importance: you can 793 00:44:04,356 --> 00:44:05,836 Speaker 1: leverage that data to kind of figure out how to 794 00:44:05,876 --> 00:44:06,516 Speaker 1: do a better job. 795 00:44:06,636 --> 00:44:08,516 Speaker 2: And the big piece that I think will segue into 796 00:44:08,676 --> 00:44:11,956 Speaker 2: Azizi's is that our greatest goal here is a 797 00:44:11,996 --> 00:44:16,156 Speaker 2: safe first responder. A safe first responder is a safe society, or safe communities. 798 00:44:16,396 --> 00:44:18,156 Speaker 2: If we keep them safe, the rest of us are 799 00:44:18,196 --> 00:44:21,516 Speaker 2: in a much better position. The sad part is, the 800 00:44:21,556 --> 00:44:24,636 Speaker 2: average life expectancy of a firefighter is sixty-one years of age, 801 00:44:24,956 --> 00:44:26,436 Speaker 4: cardiac arrest being a big driver. 802 00:44:26,556 --> 00:44:29,596 Speaker 2: Cancer is really crawling up there, though, and we have 803 00:44:29,636 --> 00:44:32,756 Speaker 2: all the other terminal diseases that come later in life. Right? 804 00:44:32,996 --> 00:44:36,156 Speaker 2: So our goal is, over time throughout your career, because 805 00:44:36,156 --> 00:44:38,556 Speaker 2: we capture all of this data, and because we can 806 00:44:38,556 --> 00:44:41,396 Speaker 2: cross-reference with medical professionals exactly as Azizi is 807 00:44:41,436 --> 00:44:44,716 Speaker 2: talking about: hey, you spent one thousand hours in that 808 00:44:44,756 --> 00:44:48,476 Speaker 2: facility that has now been discovered to contain carcinogens.
Now 809 00:44:48,516 --> 00:44:51,076 Speaker 2: the medical practitioner can do things from a preventative care 810 00:44:51,076 --> 00:44:53,596 Speaker 2: standpoint, so that we can get ahead of that and 811 00:44:53,636 --> 00:44:56,876 Speaker 2: make sure that firefighters live a long and healthy life. So 812 00:44:56,916 --> 00:44:58,356 Speaker 2: to me, that is the ultimate goal. 813 00:44:59,196 --> 00:45:02,076 Speaker 1: Azizi, I'm almost more interested in you responding to 814 00:45:03,356 --> 00:45:07,396 Speaker 1: what Mo was saying about decentralization and why that's... I 815 00:45:07,436 --> 00:45:12,556 Speaker 1: think that's actually a lovely place to end this conversation, 816 00:45:13,036 --> 00:45:17,036 Speaker 1: because it does strike me, as I've listened to both 817 00:45:17,036 --> 00:45:21,316 Speaker 1: of you, that there is something, there is a real 818 00:45:21,436 --> 00:45:26,036 Speaker 1: revolution here in the way data is being collected and 819 00:45:26,116 --> 00:45:29,436 Speaker 1: used and how we're learning from it. But the decentralization 820 00:45:29,556 --> 00:45:37,436 Speaker 1: piece has a kind of social and almost political importance, right? 821 00:45:37,556 --> 00:45:40,636 Speaker 1: It's like it's something higher. So talk a little bit. 822 00:45:40,796 --> 00:45:42,556 Speaker 1: This is what you've managed to do: you've now 823 00:45:42,636 --> 00:45:46,796 Speaker 1: decentralized the collection of medical information from people and the 824 00:45:46,836 --> 00:45:50,836 Speaker 1: conduct of studies. What does that mean for fairness and society, 825 00:45:50,956 --> 00:45:54,436 Speaker 1: for the quality of the data we're collecting, for the 826 00:45:54,476 --> 00:45:57,556 Speaker 1: way people perceive the medical care system? It's a big deal. 827 00:45:57,996 --> 00:45:58,596 Speaker 3: It's huge. 828 00:45:58,716 --> 00:46:03,076 Speaker 5: That's a great question. Thanks for asking.
We believe that 829 00:46:03,196 --> 00:46:06,796 Speaker 5: most of healthcare occurs outside of the brick and mortar 830 00:46:06,836 --> 00:46:10,716 Speaker 5: of healthcare, and what would often happen is that we 831 00:46:10,756 --> 00:46:14,116 Speaker 5: would get these findings that are artifacts. So, for example, 832 00:46:14,156 --> 00:46:16,556 Speaker 5: if you go to, you know, your provider and your 833 00:46:16,556 --> 00:46:21,796 Speaker 5: blood pressure is high, are you considered hypertensive, or is 834 00:46:21,836 --> 00:46:25,836 Speaker 5: it just artifactual, right, because of the fact that, you know, people 835 00:46:25,636 --> 00:46:26,916 Speaker 3: are stressed and the like. 836 00:46:27,356 --> 00:46:30,036 Speaker 5: What we're thinking, what we know, what we're doing is 837 00:46:30,076 --> 00:46:33,556 Speaker 5: that we are actually connecting the dots in between visits, 838 00:46:33,756 --> 00:46:37,236 Speaker 5: what we call real-world data. We want to study 839 00:46:37,276 --> 00:46:40,796 Speaker 5: the human being in the wild, not in some kind 840 00:46:40,796 --> 00:46:44,756 Speaker 5: of artificial setting, and that allows us to be more fair, 841 00:46:44,996 --> 00:46:48,476 Speaker 5: but it also allows us to be far-reaching as well. 842 00:46:48,516 --> 00:46:50,836 Speaker 5: Why? One of the things that I hadn't shared, and 843 00:46:50,836 --> 00:46:53,036 Speaker 5: I'll share this now, is that at the end of 844 00:46:53,076 --> 00:46:57,236 Speaker 5: this we're going to be creating digital twins of each person. 845 00:46:57,276 --> 00:47:00,636 Speaker 5: What does that mean? We can know exactly what someone's 846 00:47:01,076 --> 00:47:05,356 Speaker 5: biological algorithm is, based on sensing data as well as 847 00:47:05,356 --> 00:47:06,796 Speaker 5: blood work that we're collecting. 848 00:47:07,036 --> 00:47:07,836 Speaker 3: What does this mean?
849 00:47:08,076 --> 00:47:11,716 Speaker 5: It means we can anticipate what comes next, or even 850 00:47:11,756 --> 00:47:16,276 Speaker 5: before it happens. But from a fairness standpoint, this allows 851 00:47:16,356 --> 00:47:20,876 Speaker 5: us to really get into all crevices, all the areas, 852 00:47:20,956 --> 00:47:25,596 Speaker 5: all underserved communities that were left by the wayside. So 853 00:47:25,716 --> 00:47:29,156 Speaker 5: for us, metrics of success: I've never led a study 854 00:47:29,196 --> 00:47:32,236 Speaker 5: where the recruitment has been so great. And this is 855 00:47:32,276 --> 00:47:34,876 Speaker 5: why, you know, one of the biggest 856 00:47:34,916 --> 00:47:37,596 Speaker 5: journals, Science, learned about what we were doing and wanted 857 00:47:37,636 --> 00:47:41,276 Speaker 5: us to document that, because typically when people innovate, they 858 00:47:41,396 --> 00:47:45,396 Speaker 5: innovate for the haves and have-mores. We fundamentally believe 859 00:47:45,436 --> 00:47:48,156 Speaker 5: that if we innovate for the have-nots, it 860 00:47:48,196 --> 00:47:50,676 Speaker 5: would allow us to scale much better and it would 861 00:47:50,716 --> 00:47:54,596 Speaker 5: have far more reach and more applicability, from an 862 00:47:54,636 --> 00:47:57,876 Speaker 5: ethics and an equity standpoint. That's why we dubbed it 863 00:47:57,876 --> 00:48:01,676 Speaker 5: what we call health techquity. Right? We've been talking about 864 00:48:01,676 --> 00:48:04,356 Speaker 5: this with the American Heart Association.
It's a real deal, 865 00:48:04,396 --> 00:48:09,436 Speaker 5: though, that we believe that at the intersection, at the 866 00:48:09,476 --> 00:48:14,476 Speaker 5: nexus of equity and technology, we could exacerbate health 867 00:48:14,516 --> 00:48:18,396 Speaker 5: care issues, or we could cure them, we could mend them, 868 00:48:18,436 --> 00:48:21,076 Speaker 5: and we're saying that we want to be the ones. 869 00:48:21,156 --> 00:48:24,076 Speaker 5: And so for us, here's what we're doing already. We're 870 00:48:24,076 --> 00:48:28,836 Speaker 5: screening people for Alzheimer's disease much earlier, using augmented reality, 871 00:48:28,996 --> 00:48:31,156 Speaker 5: where we can determine if someone is going to get 872 00:48:31,436 --> 00:48:34,836 Speaker 5: Alzheimer's disease six to ten years before age of onset. 873 00:48:35,196 --> 00:48:39,756 Speaker 5: We're providing virtual reality solutions to Black 874 00:48:39,756 --> 00:48:42,516 Speaker 5: and brown moms, who notoriously have been known to 875 00:48:42,516 --> 00:48:46,316 Speaker 5: have a huge epidemic in maternal mental health and maternal health; 876 00:48:46,596 --> 00:48:50,036 Speaker 5: we're providing that to a slew of folks. We're reaching 877 00:48:50,036 --> 00:48:52,396 Speaker 5: out and providing it to over three thousand kids in 878 00:48:52,396 --> 00:48:55,476 Speaker 5: the state of Florida, because sixty-eight percent of families 879 00:48:55,516 --> 00:49:00,036 Speaker 5: don't live near a licensed mental health practitioner. And we're 880 00:49:00,076 --> 00:49:03,596 Speaker 5: also building the next generation of technologies for healthcare 881 00:49:03,716 --> 00:49:07,876 Speaker 5: providers, so that we now can have a provider 882 00:49:07,956 --> 00:49:11,596 Speaker 5: who can listen, because typically when you go to your provider, 883 00:49:11,636 --> 00:49:13,916 Speaker 5: what are they doing?
They're writing notes and there's no 884 00:49:13,996 --> 00:49:17,556 Speaker 5: eye contact. Now we can use AI and ambient technology 885 00:49:17,836 --> 00:49:21,356 Speaker 5: to capture all of those data so that your provider 886 00:49:21,436 --> 00:49:25,796 Speaker 5: can be with you in a more human way, 887 00:49:26,076 --> 00:49:26,636 Speaker 5: and that's 888 00:49:26,476 --> 00:49:29,276 Speaker 3: what it will allow us to do. So that's how we 889 00:49:29,796 --> 00:49:32,756 Speaker 5: measure metrics of success, and I think that's where the 890 00:49:32,796 --> 00:49:37,996 Speaker 5: ethics lies as well: restoring the humanity in medicine. Oftentimes 891 00:49:37,996 --> 00:49:41,516 Speaker 5: people think that when you use technology, it actually 892 00:49:41,596 --> 00:49:44,916 Speaker 5: effaces the human. What we're trying to do, what 893 00:49:44,956 --> 00:49:49,236 Speaker 5: we believe, is that technology can allow us to make healthcare 894 00:49:49,516 --> 00:49:53,956 Speaker 5: more human again: restoring the soul and reclaiming the soul 895 00:49:53,956 --> 00:49:55,316 Speaker 5: of healthcare through technology. 896 00:49:55,476 --> 00:49:56,396 Speaker 3: That's our thesis. 897 00:49:56,676 --> 00:50:01,596 Speaker 1: Yeah, that's really beautiful. I will say just one last note. 898 00:50:02,796 --> 00:50:05,076 Speaker 1: The whole time you guys were talking, I was having 899 00:50:05,556 --> 00:50:10,716 Speaker 1: these kinds of absurd fantasies about how I, as 900 00:50:10,756 --> 00:50:14,036 Speaker 1: a parent of two girls, how I could use 901 00:50:14,076 --> 00:50:20,476 Speaker 1: both of your technologies to, uh, helicopter parent. I'd give 902 00:50:20,556 --> 00:50:23,036 Speaker 1: them a wearable, it would monitor everything.
I'd be listening 903 00:50:23,036 --> 00:50:25,876 Speaker 1: to all their conversations, and I'd just walk around, 904 00:50:26,036 --> 00:50:30,636 Speaker 1: Ryan, with one of your tablets, 905 00:50:30,916 --> 00:50:32,996 Speaker 1: and it would just, like, highlight if there's ever 906 00:50:33,036 --> 00:50:37,476 Speaker 1: any kind of problem. But this has been absolutely fascinating. 907 00:50:38,396 --> 00:50:39,796 Speaker 1: I feel we could have gone on and 908 00:50:39,876 --> 00:50:42,796 Speaker 1: on and on for another hour. But I think what 909 00:50:42,836 --> 00:50:45,396 Speaker 1: you've done is just given us a little glimpse into 910 00:50:45,956 --> 00:50:53,516 Speaker 1: how human ingenuity is using technology in utterly unexpected ways. 911 00:50:54,516 --> 00:50:57,556 Speaker 1: And I think it's a beautiful 912 00:50:57,556 --> 00:50:59,156 Speaker 1: story that needs to be told, and I'm glad that we're 913 00:50:59,156 --> 00:50:59,516 Speaker 1: telling it. 914 00:51:00,156 --> 00:51:02,716 Speaker 3: Thank you, thank you, thank you, thank you. 915 00:51:09,356 --> 00:51:12,196 Speaker 1: Thanks for listening to this special episode, brought to you 916 00:51:12,236 --> 00:51:15,916 Speaker 1: by T Mobile for Business. Special thanks to our guests: 917 00:51:16,276 --> 00:51:20,636 Speaker 1: Mo Katibeh, T Mobile for Business's Chief Marketing Officer; Ryan Litt, 918 00:51:21,356 --> 00:51:24,956 Speaker 1: Chief Operating Officer and co-founder of three AM Innovations; 919 00:51:25,276 --> 00:51:29,316 Speaker 1: and doctor Azizi Satias, Chair of the Department of Informatics 920 00:51:29,356 --> 00:51:32,476 Speaker 1: and Health Data Science at the University of Miami Miller School 921 00:51:32,476 --> 00:51:36,116 Speaker 1: of Medicine. And special thanks to the entire production crew 922 00:51:36,356 --> 00:51:40,796 Speaker 1: at iHeartMedia.
This episode was produced by Nina Bird Lawrence 923 00:51:40,796 --> 00:51:44,556 Speaker 1: with Lucy Sullivan and Ben Naddaff-Hafrey. Editing by Karen 924 00:51:44,596 --> 00:51:49,236 Speaker 1: Shakerdge, mastering by Sarah Bruguiere. Special thanks to Lou Carloso 925 00:51:49,476 --> 00:51:53,956 Speaker 1: for on-site recording. Our executive producer is Jacob Smith. 926 00:51:55,036 --> 00:51:55,876 Speaker 1: I'm Malcolm Gladwell.