1 00:00:01,200 --> 00:00:03,720 Speaker 1: Hey, everybody, it's me, Josh, and for this week's 2 00:00:03,840 --> 00:00:08,039 Speaker 1: SYSK Selects, I've chosen Will Computers Replace Doctors. 3 00:00:08,680 --> 00:00:11,799 Speaker 1: It's an episode so dated I still wore a Fitbit 4 00:00:11,960 --> 00:00:15,320 Speaker 1: when we recorded it. No, but seriously, it is a 5 00:00:15,400 --> 00:00:19,160 Speaker 1: really interesting episode. And even though we recorded it years ago, 6 00:00:19,640 --> 00:00:22,439 Speaker 1: the stuff that we're talking about still hasn't quite come 7 00:00:22,480 --> 00:00:26,120 Speaker 1: to fruition. So sit back and enjoy this peek into 8 00:00:26,120 --> 00:00:33,400 Speaker 1: the future. Welcome to Stuff You Should Know, a production 9 00:00:33,440 --> 00:00:41,640 Speaker 1: of iHeartRadio's How Stuff Works. Hey, and welcome 10 00:00:41,680 --> 00:00:44,879 Speaker 1: to the podcast. I'm Josh Clark. There's Charles W. "Chuck" Bryant. 11 00:00:44,880 --> 00:00:48,720 Speaker 1: Our guest producer Noel is here. Yeah, Jerry needs a 12 00:00:48,720 --> 00:00:53,320 Speaker 1: buffer day from her Christmas break. I can't say that again. No, 13 00:00:53,880 --> 00:00:56,360 Speaker 1: she's at home on her buffer day, in the freezing 14 00:00:56,400 --> 00:01:00,360 Speaker 1: cold. Because we record these somewhat in advance, we are 15 00:01:00,360 --> 00:01:04,640 Speaker 1: in the midst of the polar vortex, and, um, yeah, 16 00:01:04,680 --> 00:01:06,440 Speaker 1: everyone's just talking about how cold it is. We're back. 17 00:01:06,480 --> 00:01:09,560 Speaker 1: It is our first recording after the holidays. It's literally 18 00:01:09,600 --> 00:01:12,319 Speaker 1: freezing cold out. So welcome back, buddy. Thanks. Welcome back 19 00:01:12,319 --> 00:01:13,840 Speaker 1: to you too. Even though this will be, what, like 20 00:01:13,920 --> 00:01:17,200 Speaker 1: late January, it will be a balmy sixteen.
I think 21 00:01:17,200 --> 00:01:19,320 Speaker 1: the end of, the end of January. When is, when 22 00:01:19,319 --> 00:01:21,560 Speaker 1: does this one come out? All right. Well, happy New Year, 23 00:01:21,560 --> 00:01:24,080 Speaker 1: and happy New Year to you too, and happy holidays to you. 24 00:01:24,200 --> 00:01:28,400 Speaker 1: Thanks. Um, Chuck. Yes, sir. You're feeling good? You're loose, 25 00:01:28,400 --> 00:01:31,280 Speaker 1: you're ready? I'm loose. So you see this, you've seen 26 00:01:31,319 --> 00:01:33,679 Speaker 1: this before. Yeah, your Fitbit. My Fitbit. Is that 27 00:01:33,720 --> 00:01:37,920 Speaker 1: buzz marketing? Not really, it's just a really good example. Um, 28 00:01:38,280 --> 00:01:42,160 Speaker 1: I feel like, Fitbit, I'm not, like, necessarily loyal 29 00:01:42,200 --> 00:01:44,280 Speaker 1: to it or anything like that. They don't pay me 30 00:01:44,319 --> 00:01:48,040 Speaker 1: money to mention it on the podcast. Sometimes I'll just be like, 31 00:01:48,080 --> 00:01:50,760 Speaker 1: stop staring at me, Fitbit. Yeah. Um, but no, 32 00:01:50,880 --> 00:01:53,520 Speaker 1: I like it. I'm happy with it. Um, I 33 00:01:53,640 --> 00:01:57,960 Speaker 1: pointed it out, though, because it's part of this, to me, 34 00:01:58,120 --> 00:02:00,440 Speaker 1: and I don't think it's just confirmation bias. It seems 35 00:02:00,480 --> 00:02:05,440 Speaker 1: like there really is a growing desire among just average, 36 00:02:05,520 --> 00:02:10,919 Speaker 1: ordinary people to be able to track their health, 37 00:02:11,120 --> 00:02:16,040 Speaker 1: their well-being, their activity, um, and to do it easily. Yeah, 38 00:02:16,120 --> 00:02:18,839 Speaker 1: we have tools now that make that kind of thing 39 00:02:19,240 --> 00:02:21,600 Speaker 1: super convenient. Yeah. And Fitbit's not the only one. 40 00:02:21,639 --> 00:02:24,400 Speaker 1: There's, like, the Nike FuelBand.
Jawbone is another really 41 00:02:24,400 --> 00:02:28,399 Speaker 1: good one. There's others, like, um, that track, uh, your 42 00:02:28,520 --> 00:02:32,080 Speaker 1: galvanic response, so they're able to put that together with 43 00:02:32,520 --> 00:02:35,520 Speaker 1: respiration and heartbeat and come up with a pretty good 44 00:02:35,520 --> 00:02:38,160 Speaker 1: assessment of how many calories you're burning at any given time, 45 00:02:38,480 --> 00:02:40,120 Speaker 1: which is, like, kind of a holy grail with this 46 00:02:40,240 --> 00:02:43,160 Speaker 1: kind of thing right now. Um, there's others that track 47 00:02:43,200 --> 00:02:46,920 Speaker 1: your sleep. There's apps out there that let you, um, 48 00:02:47,600 --> 00:02:51,880 Speaker 1: track your mood. Um, there's sites like Quantified Self, which 49 00:02:51,880 --> 00:02:55,760 Speaker 1: are basically, like, people trying to push wearable technology like 50 00:02:55,840 --> 00:03:00,000 Speaker 1: this further into the future. There's entire websites, like Share 51 00:03:00,120 --> 00:03:03,800 Speaker 1: care dot com, that are dedicated to health information and 52 00:03:03,880 --> 00:03:08,560 Speaker 1: health, um, uh, support. Yeah, self-advocacy. Yeah. And 53 00:03:08,600 --> 00:03:11,440 Speaker 1: there's this, it seems to me, this desire to kind 54 00:03:11,480 --> 00:03:14,720 Speaker 1: of say, hey, this is my health, this is my body, 55 00:03:14,919 --> 00:03:17,360 Speaker 1: I want to know more about it, you know? Totally. 56 00:03:17,440 --> 00:03:20,639 Speaker 1: Like, I don't want to necessarily cut out doctors, but 57 00:03:21,160 --> 00:03:24,440 Speaker 1: I want to decide if I should go to 58 00:03:25,120 --> 00:03:27,320 Speaker 1: the doctor, if it's time or not, and I want 59 00:03:27,360 --> 00:03:30,640 Speaker 1: to use data to do that. Yeah.
I imagine I 60 00:03:30,760 --> 00:03:32,880 Speaker 1: frustrate a lot of doctors, because I'm one of those 61 00:03:33,639 --> 00:03:36,600 Speaker 1: obnoxious people that goes in and is like, well, here's 62 00:03:36,600 --> 00:03:39,880 Speaker 1: what I think I have, based on my research. There's 63 00:03:39,960 --> 00:03:42,280 Speaker 1: nothing wrong with that. Yeah, that's being an 64 00:03:42,320 --> 00:03:44,920 Speaker 1: informed patient. That's true. Exactly what you're supposed to do. 65 00:03:44,920 --> 00:03:47,160 Speaker 1: And if you're getting on your doctor's nerves, then go 66 00:03:47,240 --> 00:03:51,400 Speaker 1: see another doctor. Uh, yeah, I agree. I'm actually in search 67 00:03:51,440 --> 00:03:53,800 Speaker 1: of a new GP right now. Because, among 68 00:03:53,920 --> 00:03:57,960 Speaker 1: other things, you got on his nerves? Other reasons too. Really? 69 00:03:58,000 --> 00:04:01,480 Speaker 1: Cold, cold hands. Nah, like poor bedside manner, never seeing 70 00:04:01,520 --> 00:04:06,880 Speaker 1: the doctor. Like, here's my intern from Emory. Yeah, which, great, 71 00:04:06,920 --> 00:04:09,160 Speaker 1: you know, I love them getting experience, but I would 72 00:04:09,200 --> 00:04:11,080 Speaker 1: like them both to be in there, not just, like, 73 00:04:11,600 --> 00:04:14,000 Speaker 1: smell you later, and the doctor leaves. Well, that's another 74 00:04:14,040 --> 00:04:18,040 Speaker 1: thing too. It's kind of like, um... Doctors. Okay, let's, 75 00:04:18,080 --> 00:04:20,000 Speaker 1: let's just lay it on the table here. What 76 00:04:20,000 --> 00:04:22,839 Speaker 1: you've just mentioned and what I was talking about, 77 00:04:22,960 --> 00:04:26,800 Speaker 1: if you put it all together: the medical field, physicians 78 00:04:27,160 --> 00:04:31,160 Speaker 1: in particular, are currently in the beginning of 79 00:04:31,160 --> 00:04:36,000 Speaker 1: what's possibly a real pickle of a state for them.
80 00:04:36,040 --> 00:04:38,640 Speaker 1: I think a transition period, yes, but they may be 81 00:04:38,720 --> 00:04:41,680 Speaker 1: transitioned right out of existence, in large part. Yes, they 82 00:04:41,800 --> 00:04:44,280 Speaker 1: may, for sure, depending on who you talk to. There's 83 00:04:44,320 --> 00:04:47,200 Speaker 1: like this whole question now, like, what is the future 84 00:04:47,640 --> 00:04:51,359 Speaker 1: of medicine? And more specifically, in the case of this 85 00:04:51,400 --> 00:04:56,039 Speaker 1: episode that we're talking about, do human physicians factor largely 86 00:04:56,200 --> 00:05:02,040 Speaker 1: into that future? And the answer is no. Yeah. Depending 87 00:05:02,080 --> 00:05:04,520 Speaker 1: on who you ask, like I said. Um, there's 88 00:05:04,560 --> 00:05:09,000 Speaker 1: this one guy, Dr. Kent Bottles, who, um, he feels 89 00:05:09,640 --> 00:05:13,360 Speaker 1: that GPs might go the way of the dodo and 90 00:05:13,480 --> 00:05:19,200 Speaker 1: be replaced by diagnostic computers, maybe with avatars. Then there's 91 00:05:19,279 --> 00:05:22,600 Speaker 1: other people, like Farhad Manjoo, who is a technology writer; his 92 00:05:22,640 --> 00:05:25,359 Speaker 1: wife is a pathologist. He thinks, no, no, no, the 93 00:05:25,400 --> 00:05:27,239 Speaker 1: GPs are the ones that are going to be in business; 94 00:05:27,680 --> 00:05:30,120 Speaker 1: the specialists are gonna be out of business, because computers 95 00:05:30,120 --> 00:05:33,200 Speaker 1: are really good at specializing on one single thing, 96 00:05:33,360 --> 00:05:38,000 Speaker 1: maybe not so good at the general practitioner thing. So there's 97 00:05:38,040 --> 00:05:40,839 Speaker 1: lots of opinions out there on how much they'll be 98 00:05:40,880 --> 00:05:43,160 Speaker 1: replaced and who might be replaced.
Right. Then Ezra 99 00:05:43,200 --> 00:05:47,840 Speaker 1: Klein wrote a column where he basically said, like, 100 00:05:48,040 --> 00:05:50,920 Speaker 1: we will still need humans, but we mainly need 101 00:05:51,000 --> 00:05:55,400 Speaker 1: humans to communicate with the other humans and facilitate the 102 00:05:55,400 --> 00:05:57,560 Speaker 1: interaction between the robots and the humans, and we 103 00:05:57,560 --> 00:06:01,000 Speaker 1: already have this: they're called nurses, or nurse practitioners. 104 00:06:01,400 --> 00:06:03,880 Speaker 1: Ezra Klein is kind of the one that thought 105 00:06:03,960 --> 00:06:07,360 Speaker 1: that a computer avatar might have a better bedside manner 106 00:06:07,360 --> 00:06:10,039 Speaker 1: than a doctor. Well, let's give that one example. 107 00:06:10,279 --> 00:06:12,800 Speaker 1: There's an example I kept finding while we 108 00:06:12,800 --> 00:06:14,719 Speaker 1: were doing research for this, and it's actually in the 109 00:06:14,800 --> 00:06:18,839 Speaker 1: article on HowStuffWorks. Um, it's, uh... There was 110 00:06:18,880 --> 00:06:22,440 Speaker 1: a kiosk, a medical kiosk, during a panel called Man- 111 00:06:22,520 --> 00:06:26,279 Speaker 1: Made Minds, colon, Living with Thinking Machines. When there's a 112 00:06:26,279 --> 00:06:29,479 Speaker 1: colon in there, you know it's serious stuff. Um, and 113 00:06:29,520 --> 00:06:31,360 Speaker 1: it was at the World Science Festival in two thousand 114 00:06:31,360 --> 00:06:36,599 Speaker 1: and eleven. And basically, this computerized avatar, um, interacted 115 00:06:36,640 --> 00:06:41,720 Speaker 1: with a woman whose baby had diarrhea, and the woman said, hey, Avatar, 116 00:06:42,360 --> 00:06:44,479 Speaker 1: my baby has diarrhea. What are you going to do 117 00:06:44,560 --> 00:06:47,680 Speaker 1: about it?
And the avatar said, well, tell me all 118 00:06:47,720 --> 00:06:50,520 Speaker 1: the symptoms and all this stuff, and the avatar decided that 119 00:06:50,800 --> 00:06:54,799 Speaker 1: the baby's diarrhea, while present, wasn't severe enough to warrant 120 00:06:55,120 --> 00:06:58,120 Speaker 1: immediate medical attention, so it went ahead and made an 121 00:06:58,120 --> 00:07:02,000 Speaker 1: appointment with a human doctor for later on that week. 122 00:07:02,560 --> 00:07:06,800 Speaker 1: And the mother said that she preferred the treatment by 123 00:07:06,800 --> 00:07:10,200 Speaker 1: the avatar to the real-life nurses at the hospitals 124 00:07:10,400 --> 00:07:13,720 Speaker 1: where she lived in New York. Yeah. Uh, so it 125 00:07:13,840 --> 00:07:18,080 Speaker 1: is possible to create computers with better bedside manner than, say, 126 00:07:18,120 --> 00:07:21,160 Speaker 1: your GP. Well, at the very least, it 127 00:07:21,160 --> 00:07:23,240 Speaker 1: will be consistent. And that's one of the things that... 128 00:07:24,120 --> 00:07:27,160 Speaker 1: I'm not poo-pooing doctors or nurses; there are many, many, many, 129 00:07:27,200 --> 00:07:29,560 Speaker 1: many great ones, but I've also had some pretty bad 130 00:07:29,560 --> 00:07:33,080 Speaker 1: experiences in emergency rooms and with doctors and nurses. With 131 00:07:33,120 --> 00:07:36,200 Speaker 1: a computer, at least it's consistent, you know? They're 132 00:07:36,280 --> 00:07:40,920 Speaker 1: programmed to display empathy no matter what. You know, they're 133 00:07:40,960 --> 00:07:43,000 Speaker 1: not too busy, and they're not, you know, having a 134 00:07:43,040 --> 00:07:46,560 Speaker 1: bad day, so they don't, you know, they don't have 135 00:07:46,600 --> 00:07:50,520 Speaker 1: any prejudices against you personally or anything like that. They're a computer. 136 00:07:50,680 --> 00:07:55,920 Speaker 1: They don't hate diarrhea.
But humans, humans respond to even 137 00:07:55,960 --> 00:08:01,000 Speaker 1: programmed empathy, even synthetic empathy from a computer. I 138 00:08:01,040 --> 00:08:05,440 Speaker 1: could see that a little bit. Like, I've dove into 139 00:08:05,480 --> 00:08:10,000 Speaker 1: the gaming world enough to know that, you know, the 140 00:08:10,040 --> 00:08:14,240 Speaker 1: realism of an avatar can be convincing, and 141 00:08:14,280 --> 00:08:15,840 Speaker 1: it's not like you think, oh, it's a real person, 142 00:08:15,880 --> 00:08:17,720 Speaker 1: but it helps to put a human face on it, 143 00:08:17,760 --> 00:08:22,040 Speaker 1: you know? Exactly. Literally. Um, I saw a reference 144 00:08:22,080 --> 00:08:25,400 Speaker 1: to a study that found, um, people who are being 145 00:08:25,440 --> 00:08:30,720 Speaker 1: treated for anxiety disorders tended to share more about their 146 00:08:30,760 --> 00:08:35,880 Speaker 1: experiences and themselves with an avatar than with a human psychiatrist. 147 00:08:36,000 --> 00:08:39,760 Speaker 1: Oh, that's interesting. Because they're, like, not embarrassed, like telling 148 00:08:39,800 --> 00:08:41,920 Speaker 1: a real person. Yeah, that makes sense. I might open 149 00:08:42,000 --> 00:08:44,240 Speaker 1: up more to a computer. Right. So we've 150 00:08:44,240 --> 00:08:47,120 Speaker 1: got that part, like the bedside manner. It is possible 151 00:08:47,360 --> 00:08:50,400 Speaker 1: that we can create machines now, and are creating machines 152 00:08:50,440 --> 00:08:54,400 Speaker 1: now, that have at least equal, if not better, bedside 153 00:08:54,440 --> 00:08:58,400 Speaker 1: manner than some physicians. Yeah. Okay, so bedside manner, one 154 00:08:58,440 --> 00:09:00,360 Speaker 1: of the big things that doctors bring to the table: 155 00:09:00,640 --> 00:09:04,480 Speaker 1: check, computers have that.
Yeah, it's different now than 156 00:09:04,520 --> 00:09:06,920 Speaker 1: it was in the old days. I feel like just 157 00:09:07,000 --> 00:09:10,559 Speaker 1: the whole quality of personal care has gone down. It's 158 00:09:10,600 --> 00:09:13,080 Speaker 1: not necessarily the doctors' fault; there's a lot of places 159 00:09:13,120 --> 00:09:15,160 Speaker 1: to place the blame. But it's not like when you 160 00:09:15,200 --> 00:09:16,600 Speaker 1: were a kid, and you feel like you had your 161 00:09:16,880 --> 00:09:20,320 Speaker 1: family doctor, who knew you, maybe even delivered you at birth. 162 00:09:21,400 --> 00:09:27,000 Speaker 1: I birthed you, my son. Exactly. Um, they were just invested. Like, you 163 00:09:27,000 --> 00:09:28,679 Speaker 1: gotta stick with the same doctor if you want that 164 00:09:28,760 --> 00:09:31,040 Speaker 1: kind of care, I think. Right. And there's another benefit, 165 00:09:31,080 --> 00:09:35,240 Speaker 1: besides bedside manner, um, that comes with that, that kind 166 00:09:35,280 --> 00:09:38,160 Speaker 1: of care. That kind of personal care is an awareness 167 00:09:38,320 --> 00:09:42,079 Speaker 1: of your medical history. Yeah, not just that, but, oh, well, 168 00:09:42,160 --> 00:09:46,400 Speaker 1: your dad died of a congenital heart disorder, something like that, so 169 00:09:46,760 --> 00:09:48,840 Speaker 1: you may be at higher risk of it too. Just 170 00:09:48,920 --> 00:09:53,040 Speaker 1: that kind of awareness has been typically lost. Even 171 00:09:53,080 --> 00:09:54,960 Speaker 1: though we have medical histories and they're in our charts, 172 00:09:54,960 --> 00:09:58,080 Speaker 1: they're in our files, um, an intimate knowledge of 173 00:09:58,840 --> 00:10:04,200 Speaker 1: a patient's, um, medical history is pretty much lost in 174 00:10:04,280 --> 00:10:08,640 Speaker 1: today's modern practice of medicine. Yeah. Um.
That's another thing 175 00:10:08,720 --> 00:10:14,840 Speaker 1: that computers could conceivably top doctors on, um, which 176 00:10:14,880 --> 00:10:19,400 Speaker 1: basically falls under the umbrella of diagnosis, or diagnostics. Yeah. 177 00:10:19,400 --> 00:10:21,920 Speaker 1: I mean, there's two sides to this: there's diagnosis 178 00:10:21,960 --> 00:10:26,360 Speaker 1: and treatment. And some, uh, programs... A little bit of 179 00:10:26,400 --> 00:10:30,360 Speaker 1: the history: this, um, goes back to the nineteen seventies. 180 00:10:30,559 --> 00:10:36,080 Speaker 1: At the University of Pittsburgh, they developed software to diagnose problems. Um, 181 00:10:36,240 --> 00:10:39,080 Speaker 1: Mass General, since the eighties, has been working on their 182 00:10:39,160 --> 00:10:45,160 Speaker 1: DXplain, which provides ranked lists of diagnoses. Whereas 183 00:10:45,240 --> 00:10:49,320 Speaker 1: the, what's the computer? IBM Watson. Watson, who 184 00:10:49,360 --> 00:10:52,400 Speaker 1: won at Jeopardy. Yeah, that's based more, it looks 185 00:10:52,440 --> 00:10:56,720 Speaker 1: like, on treatment options than diagnosis at this point. So 186 00:10:56,760 --> 00:10:59,160 Speaker 1: they're using these... Well, yeah, but they said it's not... 187 00:10:59,280 --> 00:11:01,079 Speaker 1: They haven't... I don't think they want to leave it 188 00:11:01,120 --> 00:11:03,960 Speaker 1: alone with diagnosis yet. No. And to do its thing. 189 00:11:04,160 --> 00:11:08,640 Speaker 1: There's already something out there for diagnosis that's meant to 190 00:11:08,920 --> 00:11:14,400 Speaker 1: support physicians. From what I understand, with Watson, if there 191 00:11:14,440 --> 00:11:19,000 Speaker 1: is a doctor of the future, it's Watson. Um, he 192 00:11:19,080 --> 00:11:22,520 Speaker 1: has a lot of advantages over not just, um, human doctors, 193 00:11:22,600 --> 00:11:28,920 Speaker 1: but other artificial intelligence healthcare machines.
I guess you could 194 00:11:29,280 --> 00:11:33,280 Speaker 1: clumsily call it that. He has a knack for natural language. 195 00:11:33,840 --> 00:11:39,560 Speaker 1: So let's say there's, like, a structured, formulaic 196 00:11:39,960 --> 00:11:47,280 Speaker 1: type of language that the medical field is supposed to use. Right? Yes. Okay. Um, 197 00:11:47,320 --> 00:11:51,520 Speaker 1: health records don't always necessarily contain that language. They might 198 00:11:51,520 --> 00:11:55,800 Speaker 1: contain natural language, which is really confusing for computers to 199 00:11:55,880 --> 00:11:59,960 Speaker 1: take in and absorb. Yeah. You know, humans can 200 00:12:00,120 --> 00:12:03,320 Speaker 1: pick up on meanings of things that robots and 201 00:12:03,640 --> 00:12:08,520 Speaker 1: software cannot, like inferences. And we might be using sarcasm, 202 00:12:08,600 --> 00:12:10,960 Speaker 1: although there's probably not going to be any sarcasm in 203 00:12:11,000 --> 00:12:13,880 Speaker 1: your medical records. Yeah, but, like, figurative language and stuff 204 00:12:13,880 --> 00:12:16,360 Speaker 1: like that. But for computers, language is a big part 205 00:12:16,400 --> 00:12:20,040 Speaker 1: of the problem. Or, more to the point, with the diagnosis: 206 00:12:20,120 --> 00:12:22,640 Speaker 1: a patient says he feels like he has a hive of 207 00:12:22,720 --> 00:12:27,120 Speaker 1: bees in his stomach. Like, that might mean something to 208 00:12:27,120 --> 00:12:30,000 Speaker 1: you or me, but to a computer, it's like he swallowed 209 00:12:30,000 --> 00:12:32,400 Speaker 1: a bunch of bees or something.
Right. Watson has the 210 00:12:32,400 --> 00:12:36,280 Speaker 1: advantage of saying, okay, well, there's a sensation of bees 211 00:12:36,559 --> 00:12:38,600 Speaker 1: in the stomach; there's not actually bees in the stomach. 212 00:12:38,640 --> 00:12:42,760 Speaker 1: So let's figure this out. Then Watson, or anything that 213 00:12:42,760 --> 00:12:47,840 Speaker 1: he eventually becomes, um, will be able to 214 00:12:47,880 --> 00:12:52,200 Speaker 1: go through medical records, current medical research, um, the patient's 215 00:12:52,200 --> 00:12:57,480 Speaker 1: medical history, uh, diagnostic tests that were done, blood work, um, 216 00:12:57,679 --> 00:13:01,160 Speaker 1: instrument tests, and put it all together there, and then 217 00:13:01,240 --> 00:13:06,319 Speaker 1: spit out a list of diagnoses with different confidence levels. 218 00:13:06,320 --> 00:13:07,720 Speaker 1: So the one at the top is the one that 219 00:13:07,800 --> 00:13:12,800 Speaker 1: Watson says he is ninety-eight point seven percent 220 00:13:12,840 --> 00:13:17,240 Speaker 1: sure is what's wrong with this patient. And, um, as 221 00:13:17,280 --> 00:13:21,120 Speaker 1: a diagnostician, that's pretty impressive. And that's using all the 222 00:13:21,200 --> 00:13:26,720 Speaker 1: data that's also available to human physicians, but 223 00:13:26,800 --> 00:13:30,600 Speaker 1: they simply don't have the time to take it all 224 00:13:30,679 --> 00:13:33,880 Speaker 1: in. Yeah. I think some research said that eighty percent 225 00:13:34,200 --> 00:13:37,560 Speaker 1: of doctors spend less than five hours a week reading 226 00:13:37,640 --> 00:13:42,640 Speaker 1: medical journals. A month. A month, yeah. And these 227 00:13:42,679 --> 00:13:46,280 Speaker 1: things can read thousands in seconds.
So it's sort 228 00:13:46,320 --> 00:13:50,120 Speaker 1: of a matter of efficiency, really. And, like, if 229 00:13:50,160 --> 00:13:53,080 Speaker 1: doctors don't have time to read all this stuff... I know, 230 00:13:53,160 --> 00:13:56,640 Speaker 1: we looked into this one sort of savant 231 00:13:56,800 --> 00:14:02,240 Speaker 1: diagnoser. Is that a word? I don't know. Diagnostician? Diagnostician. Uh, 232 00:14:02,360 --> 00:14:06,599 Speaker 1: Dr., uh, Dhaliwal, in San Francisco. He's sort of 233 00:14:06,679 --> 00:14:10,320 Speaker 1: legendary for diagnosing things, to the point where he does 234 00:14:10,320 --> 00:14:12,520 Speaker 1: it on stage, almost like a parlor trick. I 235 00:14:12,520 --> 00:14:14,160 Speaker 1: would love to see it. I would too. They give 236 00:14:14,240 --> 00:14:18,559 Speaker 1: him forty-five minutes and a bunch of symptoms, 237 00:14:18,559 --> 00:14:22,200 Speaker 1: basically, like, really confusing ones, because they're trying to stump him, 238 00:14:22,240 --> 00:14:25,880 Speaker 1: and generally he comes out on top. But he even 239 00:14:26,000 --> 00:14:29,400 Speaker 1: uses a program, a diagnostic program called Isabel. Right, that's 240 00:14:29,400 --> 00:14:32,360 Speaker 1: the one I said earlier, that's already here. Yeah. So 241 00:14:32,600 --> 00:14:35,200 Speaker 1: doctors are using these to help themselves out. But he 242 00:14:35,240 --> 00:14:39,400 Speaker 1: says that he's never had Isabel offer a diagnosis that he 243 00:14:39,520 --> 00:14:43,120 Speaker 1: has missed. But he's, like, the dude. Yeah. And he 244 00:14:43,160 --> 00:14:46,360 Speaker 1: also admits that, he's like, I'm a freak of nature, right? 245 00:14:46,400 --> 00:14:49,120 Speaker 1: Go ahead, quiz me. Exactly. Yeah. He also reads, like, 246 00:14:49,400 --> 00:14:52,600 Speaker 1: case histories, like, for fun, that kind of stuff. He's not, 247 00:14:53,200 --> 00:14:57,520 Speaker 1: he's not a normal physician.
He's a complete and total outlier. Um, 248 00:14:58,320 --> 00:15:01,480 Speaker 1: if he were, if every physician were like this guy, 249 00:15:01,960 --> 00:15:05,440 Speaker 1: then there probably wouldn't be this conversation going on 250 00:15:05,600 --> 00:15:09,320 Speaker 1: right now. But most physicians aren't. And it's not just 251 00:15:09,440 --> 00:15:12,160 Speaker 1: with current medical research that they're just not aware of, 252 00:15:12,160 --> 00:15:13,960 Speaker 1: because they haven't had time to pick up The Lancet 253 00:15:13,960 --> 00:15:16,800 Speaker 1: the last few months; it's also their training, 254 00:15:16,840 --> 00:15:19,080 Speaker 1: too. Like, if a doctor's been in practice for twenty years, 255 00:15:19,360 --> 00:15:23,640 Speaker 1: the human brain tends to create habits, 256 00:15:23,680 --> 00:15:26,440 Speaker 1: because it likes to expend as little energy as possible. 257 00:15:26,720 --> 00:15:29,320 Speaker 1: It's trying to be as efficient as possible. And 258 00:15:29,360 --> 00:15:32,680 Speaker 1: I think the same thing happens with medical practice. You're trained, 259 00:15:32,760 --> 00:15:35,120 Speaker 1: you understand, you come out of medical school with a 260 00:15:35,160 --> 00:15:37,560 Speaker 1: lot of book learning, and then you put it to 261 00:15:37,600 --> 00:15:40,000 Speaker 1: practice, and you kind of find your niche, and along 262 00:15:40,000 --> 00:15:41,640 Speaker 1: the way you forget a lot of the stuff that 263 00:15:41,720 --> 00:15:43,840 Speaker 1: you haven't done in twenty years or haven't learned about 264 00:15:43,880 --> 00:15:46,280 Speaker 1: in twenty years. So it's not just current stuff, it's 265 00:15:46,320 --> 00:15:49,200 Speaker 1: old stuff too.
And if you feed the Physicians' Desk 266 00:15:49,240 --> 00:15:53,440 Speaker 1: Reference into Watson, or one of his compatriots, like, 267 00:15:53,840 --> 00:15:56,640 Speaker 1: all of that knowledge can be quickly indexed and researched 268 00:15:56,720 --> 00:16:00,800 Speaker 1: to try to spit out a more accurate diagnosis. Yeah, 269 00:16:00,880 --> 00:16:03,480 Speaker 1: I think that's a great idea. It's like partnering 270 00:16:03,560 --> 00:16:08,200 Speaker 1: up with computers. It isn't necessarily replacing. But what they're 271 00:16:08,240 --> 00:16:11,080 Speaker 1: doing with Watson is very much moving towards replacing 272 00:16:11,120 --> 00:16:14,080 Speaker 1: doctors in that sense. Well, here's a scary stat, um: 273 00:16:14,120 --> 00:16:16,840 Speaker 1: one in five diagnoses in the United States is incorrect 274 00:16:16,920 --> 00:16:21,120 Speaker 1: or incomplete. One in five. And a lot of times 275 00:16:21,120 --> 00:16:23,400 Speaker 1: it's not that the doctor is a jerk or not 276 00:16:23,480 --> 00:16:26,440 Speaker 1: any good, but, like you said, they just maybe haven't 277 00:16:26,720 --> 00:16:30,080 Speaker 1: seen these cases that were written about in some obscure 278 00:16:30,120 --> 00:16:33,600 Speaker 1: medical journal that the computer has scanned and indexed, you know. 279 00:16:33,920 --> 00:16:39,000 Speaker 1: And Dhaliwal, Dr. Dhaliwal himself, that freak diagnostician. Dollywood? Yeah, 280 00:16:39,080 --> 00:16:41,440 Speaker 1: pretty close. Which is a wonderful place, by the way. 281 00:16:41,480 --> 00:16:45,440 Speaker 1: I know you love Dollywood. Um, Dr. Dhaliwal, uh, 282 00:16:45,960 --> 00:16:49,120 Speaker 1: himself says, a lot, even with me, a lot of 283 00:16:49,160 --> 00:16:53,640 Speaker 1: it is intuition, and intuition can be wrong. That's a 284 00:16:53,720 --> 00:16:58,400 Speaker 1: criticism, though, of computers as doctors: they lack intuition.
Like, 285 00:16:59,000 --> 00:17:03,120 Speaker 1: there's kind of an even larger, even larger than this computers- 286 00:17:03,160 --> 00:17:08,280 Speaker 1: replacing-doctors conversation going on. It's kind of a conversation, 287 00:17:08,400 --> 00:17:12,800 Speaker 1: or a debate, over whether intuition or data, yeah, 288 00:17:12,920 --> 00:17:15,920 Speaker 1: trumps the other. Which one is the right way 289 00:17:15,960 --> 00:17:19,440 Speaker 1: to go? Yeah. This one stat too: it says, according 290 00:17:19,440 --> 00:17:21,040 Speaker 1: to an expert, I'm not sure what that means, it 291 00:17:21,040 --> 00:17:24,320 Speaker 1: sounds hinky, but they said only a fraction of the knowledge 292 00:17:24,359 --> 00:17:30,040 Speaker 1: physicians use to diagnose is evidence-based, so that means 293 00:17:30,119 --> 00:17:33,919 Speaker 1: the rest is intuition, which also jibes and dovetails with that 294 00:17:33,960 --> 00:17:37,320 Speaker 1: one in five being wrong, I mean, or one in 295 00:17:37,400 --> 00:17:40,040 Speaker 1: five being right. I like the idea of intuition, to 296 00:17:40,080 --> 00:17:42,720 Speaker 1: a certain degree, for sure, but there's also got to 297 00:17:42,760 --> 00:17:47,040 Speaker 1: be, like, data backing it up. Sure. You know? So, 298 00:17:47,600 --> 00:17:49,560 Speaker 1: in your perfect world, and it sounds like we still 299 00:17:49,600 --> 00:17:52,320 Speaker 1: have physicians, but they go back and double-check themselves 300 00:17:52,960 --> 00:17:56,639 Speaker 1: using a program. Yeah. But I could also be down 301 00:17:56,720 --> 00:17:59,800 Speaker 1: with, um, simple... What is it? What do they call 302 00:17:59,840 --> 00:18:08,200 Speaker 1: it here? Um, something-based diseases... rules-based chronic diseases. Yeah, 303 00:18:08,240 --> 00:18:11,320 Speaker 1: like minor things that are pretty easy to diagnose. They're 304 00:18:11,359 --> 00:18:14,720 Speaker 1: not even necessarily minor.
We just understand them so fully 305 00:18:14,760 --> 00:18:17,480 Speaker 1: that we say type two diabetes is going to behave 306 00:18:17,520 --> 00:18:19,840 Speaker 1: and present itself like this. Yeah. And I wouldn't mind 307 00:18:19,840 --> 00:18:22,520 Speaker 1: going... Like, it seems like once a year I get, 308 00:18:22,560 --> 00:18:24,560 Speaker 1: like, an upper respiratory infection. It's been three or four 309 00:18:24,640 --> 00:18:27,199 Speaker 1: years in a row, and I know what the treatment is, 310 00:18:27,240 --> 00:18:29,399 Speaker 1: I know how it feels. It would be great to 311 00:18:29,400 --> 00:18:33,159 Speaker 1: go in to a machine and have it take some stats, 312 00:18:33,200 --> 00:18:36,280 Speaker 1: and blow into it and have it hear my wheezing, and give 313 00:18:36,320 --> 00:18:41,040 Speaker 1: me a steroid shot and a Z-Pak and 314 00:18:41,160 --> 00:18:43,600 Speaker 1: a breathing treatment and send me on my way. That 315 00:18:43,600 --> 00:18:45,280 Speaker 1: is always what clears it up. Would you care if 316 00:18:45,320 --> 00:18:48,000 Speaker 1: it was a robot that gave you that shot? Not 317 00:18:48,080 --> 00:18:52,959 Speaker 1: at all. Um, but I definitely would want more personal 318 00:18:53,000 --> 00:18:54,879 Speaker 1: care if it was something serious. What if it was a 319 00:18:54,960 --> 00:18:59,080 Speaker 1: robot with a nice avatar? A sexy avatar, maybe. Or just 320 00:18:59,160 --> 00:19:01,680 Speaker 1: a friendly one. Yeah, one that was a little, a little... 321 00:19:01,680 --> 00:19:04,000 Speaker 1: It would touch your forearm here and there. Yeah, 322 00:19:04,040 --> 00:19:07,280 Speaker 1: well, that might be a little creepy.
Yeah, if like 323 00:19:07,520 --> 00:19:09,240 Speaker 1: it was an old-timey doctor who like gave you 324 00:19:09,280 --> 00:19:12,199 Speaker 1: some ipecac if you had diarrhea, just send you on 325 00:19:12,200 --> 00:19:14,560 Speaker 1: your way, drink a Coke. But it wouldn't send you 326 00:19:14,560 --> 00:19:16,159 Speaker 1: on your way. It would give you ipecac and then it 327 00:19:16,200 --> 00:19:19,920 Speaker 1: wouldn't let go of your forearm. Yeah, so strong. Well, 328 00:19:20,440 --> 00:19:22,720 Speaker 1: surgical robots. That's a, that's a thing. I mean, we're 329 00:19:22,760 --> 00:19:27,439 Speaker 1: kidding around, but they've been performing, they've been performing robotic 330 00:19:27,480 --> 00:19:32,600 Speaker 1: surgery since the early eighties, um, doctor-assisted until two 331 00:19:32,600 --> 00:19:35,000 Speaker 1: thousand ten, when, in Montreal, they performed the 332 00:19:35,040 --> 00:19:40,640 Speaker 1: first fully robotic surgery, when they removed a prostate with 333 00:19:40,760 --> 00:19:47,400 Speaker 1: a fully robotic, uh, surgeon and a fully robotic anesthesiologist, Dr. 334 00:19:47,520 --> 00:19:50,400 Speaker 1: McSleepy. Dr. McSleepy. Yeah, and the, the, that's 335 00:19:50,440 --> 00:19:52,880 Speaker 1: the real name. The robot surgeon was da Vinci, which 336 00:19:52,880 --> 00:19:57,240 Speaker 1: is, like, basically the gold standard for robotic surgical, or, 337 00:19:57,240 --> 00:20:02,919 Speaker 1: surgical robots.
Yeah, they're up to, like, thirty thousand robotic 338 00:20:03,000 --> 00:20:05,919 Speaker 1: surgeries performed in the US. So it's, it's big. It 339 00:20:06,119 --> 00:20:09,440 Speaker 1: is. And, um, but the da Vinci is a doctor 340 00:20:10,000 --> 00:20:12,960 Speaker 1: basically sitting in a little, uh, it looks like an 341 00:20:13,119 --> 00:20:18,879 Speaker 1: arcade game, and using, um, robotic arms to mimic his 342 00:20:19,040 --> 00:20:23,080 Speaker 1: or her movements on a more microscopic level. Right, so the 343 00:20:23,240 --> 00:20:26,480 Speaker 1: robot has more precise movements and can make smaller movements, 344 00:20:27,200 --> 00:20:30,600 Speaker 1: um, than the doctor. It's tele... it's, and what's the 345 00:20:30,640 --> 00:20:35,800 Speaker 1: opposite of telescoping, like going downward in scale? Whatever that is. 346 00:20:35,880 --> 00:20:38,600 Speaker 1: It's taking the movements of the doctor and reducing them 347 00:20:38,600 --> 00:20:43,720 Speaker 1: in scale. Let's call it reverse telescoping. Reverse telescoping those movements, um, 348 00:20:43,720 --> 00:20:46,560 Speaker 1: which is a pretty awesome achievement in and of itself. 349 00:20:46,800 --> 00:20:51,359 Speaker 1: The doctor's being fed 3D, um, graphics of what 350 00:20:51,560 --> 00:20:54,719 Speaker 1: the robot is seeing, uh, and just kind of working 351 00:20:54,720 --> 00:20:59,679 Speaker 1: from there. Uh. What we're moving towards apparently is fully 352 00:21:01,240 --> 00:21:04,720 Speaker 1: autonomous robotic surgeries. I was talking to Joe McCormick from 353 00:21:04,760 --> 00:21:08,280 Speaker 1: Forward Thinking and he was saying that, um, there was, 354 00:21:09,000 --> 00:21:13,040 Speaker 1: there's something called the Raven four, I believe. Uh.
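The "reverse telescoping" the hosts are reaching for, taking the surgeon's hand movements and replaying them at a reduced scale, can be sketched numerically. The 5:1 scale factor and the three-sample smoothing window (a stand-in for tremor filtering) are illustrative assumptions, not the da Vinci's actual parameters:

```python
def scale_motion(deltas, scale=0.2, window=3):
    """Smooth per-tick hand displacements, then shrink them for the tool."""
    smoothed = []
    for i in range(len(deltas)):
        # Average each displacement over a short trailing window to damp tremor.
        chunk = deltas[max(0, i - window + 1): i + 1]
        smoothed.append(sum(chunk) / len(chunk))
    # Reduce in scale: a 10 mm hand movement becomes a 2 mm tool movement.
    return [d * scale for d in smoothed]

hand = [10.0, 10.0, 10.0]      # surgeon moves 10 mm each tick
print(scale_motion(hand))      # → [2.0, 2.0, 2.0]
```

Smoothing plus downscaling is the whole trick: the robot's movements end up both steadier and smaller than the hand motions that drive them.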
And 355 00:21:13,080 --> 00:21:15,520 Speaker 1: basically you just say, this is going to be a 356 00:21:15,560 --> 00:21:20,280 Speaker 1: gallbladder surgery on a six-foot-six male, age, 357 00:21:20,680 --> 00:21:24,160 Speaker 1: you know, whatever. And here's his, here's the CAT scan 358 00:21:24,320 --> 00:21:28,639 Speaker 1: of his abdomen. Um, so: go remove gallbladder. And 359 00:21:28,680 --> 00:21:31,080 Speaker 1: you press enter and the thing goes in there and 360 00:21:31,119 --> 00:21:34,520 Speaker 1: like removes the guy's gallbladder and sews him up. Yeah, 361 00:21:34,760 --> 00:21:40,520 Speaker 1: that's fully robotic, like fully autonomous robotic surgery. You press a button, then 362 00:21:40,520 --> 00:21:42,800 Speaker 1: it does it. You're not actually controlling a machine that 363 00:21:42,880 --> 00:21:46,240 Speaker 1: does it? Exactly. The machine's doing it at your behest, 364 00:21:46,320 --> 00:21:49,200 Speaker 1: but you're not controlling it. Yeah. Um, and we're right 365 00:21:49,240 --> 00:21:51,400 Speaker 1: on the cusp of that, and apparently it's already happening. 366 00:21:52,240 --> 00:21:55,760 Speaker 1: Uh, yeah, but there are some issues. Um, I looked 367 00:21:55,760 --> 00:21:59,280 Speaker 1: into it and found that a lot of injuries 368 00:21:59,280 --> 00:22:04,080 Speaker 1: in robotic surgery are, um, not being reported. It's, it's substandard.
369 00:22:04,720 --> 00:22:08,520 Speaker 1: And uh, this woman, Sheina Wilson, had robotic surgery for 370 00:22:08,600 --> 00:22:13,760 Speaker 1: a hysterectomy in two thousand thirteen, and apparently with this, uh, Intuitive 371 00:22:13,760 --> 00:22:17,320 Speaker 1: Surgical system, there had been a bunch of injuries 372 00:22:18,000 --> 00:22:20,360 Speaker 1: that she didn't know about, and she had her rectum 373 00:22:20,359 --> 00:22:24,479 Speaker 1: burned badly, and said, if I would have known that 374 00:22:24,520 --> 00:22:28,199 Speaker 1: this system had these issues, I would not have elected to 375 00:22:28,240 --> 00:22:31,200 Speaker 1: take part in it. So there's a lot of underreporting. Um, 376 00:22:31,240 --> 00:22:34,720 Speaker 1: the FDA, um, they have no authority to 377 00:22:35,080 --> 00:22:38,760 Speaker 1: force a doctor to do this, and apparently there's every 378 00:22:38,800 --> 00:22:42,359 Speaker 1: reason at every link in the chain not to report 379 00:22:42,440 --> 00:22:46,000 Speaker 1: these things, you know, and the FDA not 380 00:22:46,280 --> 00:22:51,119 Speaker 1: enforcing this kind of thing, not enforcing reporting, is ridiculous. Yeah. 381 00:22:51,200 --> 00:22:54,600 Speaker 1: You know, the thing is that things like that happen, 382 00:22:54,800 --> 00:23:00,440 Speaker 1: and there's underreporting, um, with human surgeons as well. Yeah, sure, 383 00:23:00,440 --> 00:23:04,520 Speaker 1: not just robotic. It's like overall, apparently, surgical injury and 384 00:23:04,600 --> 00:23:08,119 Speaker 1: accident reporting is not compulsory. Yeah, and here's, here's a 385 00:23:08,160 --> 00:23:11,800 Speaker 1: few points though.
Counterpoints, I guess. One, it's not 386 00:23:11,880 --> 00:23:15,040 Speaker 1: always the robotic component of the surgery that was the 387 00:23:15,080 --> 00:23:18,800 Speaker 1: cause. Two, a lot of times they say they don't 388 00:23:18,840 --> 00:23:22,199 Speaker 1: know about this until like a lawsuit is filed. So 389 00:23:22,240 --> 00:23:24,440 Speaker 1: it could be weeks or months later where the physician 390 00:23:24,480 --> 00:23:27,679 Speaker 1: doesn't know about it, or the FDA might not get 391 00:23:27,840 --> 00:23:30,120 Speaker 1: a report on it, and like six months later you file 392 00:23:30,119 --> 00:23:32,919 Speaker 1: a lawsuit and that's how it comes to light. Um, 393 00:23:32,960 --> 00:23:35,879 Speaker 1: but the FDA is definitely concerned and is supposedly working 394 00:23:35,880 --> 00:23:41,159 Speaker 1: to improve this. That's very concerned. They're very concerned. Uh. 395 00:23:41,240 --> 00:23:43,920 Speaker 1: And another problem too, in that same article: a lot 396 00:23:43,960 --> 00:23:48,119 Speaker 1: of these robotic surgical systems, you still have to have 397 00:23:48,480 --> 00:23:51,840 Speaker 1: the correct amount of training. And, uh, the feeling of 398 00:23:51,960 --> 00:23:55,240 Speaker 1: some experts is that, um, or at least this one guy, 399 00:23:55,440 --> 00:23:58,840 Speaker 1: Enrico Benedetti, he's the head of surgery at the University 400 00:23:58,840 --> 00:24:01,520 Speaker 1: of Illinois Chicago, uh, says a lot of it just 401 00:24:01,520 --> 00:24:03,520 Speaker 1: comes back to training. Some of these doctors aren't 402 00:24:03,520 --> 00:24:07,480 Speaker 1: getting adequately trained on these machines enough to perform the surgery. Yeah, 403 00:24:07,880 --> 00:24:11,640 Speaker 1: like, what happens when I do this? Oh, that happens. 404 00:24:12,480 --> 00:24:14,960 Speaker 1: That's not good.
I've got another alarming stat for you 405 00:24:15,000 --> 00:24:18,880 Speaker 1: too. Hold on. Hold on, hold on. Before that, let's 406 00:24:18,880 --> 00:24:37,000 Speaker 1: do a message break real quick. Okay, tell me your 407 00:24:37,040 --> 00:24:40,359 Speaker 1: alarming stat. All right. Johns Hopkins did a study that found 408 00:24:40,560 --> 00:24:44,240 Speaker 1: as many as forty thousand patients die in intensive care 409 00:24:44,400 --> 00:24:49,480 Speaker 1: each year in the US due to misdiagnosis. Man. And, um, 410 00:24:49,520 --> 00:24:53,640 Speaker 1: another study found that system-related factors, like, uh, lack 411 00:24:53,680 --> 00:24:57,399 Speaker 1: of teamwork and communication, or just poor processes, were involved in 412 00:24:57,480 --> 00:25:04,840 Speaker 1: six of diagnostic errors, and cognitive factors, with premature 413 00:25:04,880 --> 00:25:08,040 Speaker 1: closure being the most common, which is basically just sticking 414 00:25:08,040 --> 00:25:10,840 Speaker 1: to that initial diagnosis and not being open-minded to 415 00:25:10,880 --> 00:25:14,120 Speaker 1: other, like, second opinions. Yeah. So there's this thing called 416 00:25:14,160 --> 00:25:17,879 Speaker 1: anchoring bias that, um, was in that New York Times article. 417 00:25:18,400 --> 00:25:21,600 Speaker 1: Dr. Dhaliwal, the guy who created this program that's 418 00:25:21,720 --> 00:25:26,520 Speaker 1: now around to support diagnostics, where a physician will say, 419 00:25:26,520 --> 00:25:28,280 Speaker 1: I think it's this, but let me put in the 420 00:25:28,320 --> 00:25:31,080 Speaker 1: symptoms and ask Isabel, um, which is the name of 421 00:25:31,119 --> 00:25:33,880 Speaker 1: the program, and it's named after the daughter of the guy who created 422 00:25:33,920 --> 00:25:38,199 Speaker 1: the program.
Yeah, when she was three, they took her 423 00:25:38,200 --> 00:25:40,199 Speaker 1: to the hospital and the doctors said, well, she has 424 00:25:40,280 --> 00:25:43,320 Speaker 1: chicken pox. And she did indeed have chicken pox, but 425 00:25:43,520 --> 00:25:46,200 Speaker 1: that's all they looked at. They completely missed a pretty 426 00:25:46,280 --> 00:25:50,040 Speaker 1: nasty case of necrotizing fasciitis, which we've talked about before, 427 00:25:50,520 --> 00:25:54,760 Speaker 1: flesh-eating bacteria, and, um, she almost died from it. 428 00:25:54,960 --> 00:25:57,439 Speaker 1: She was, she was disfigured from it as a result, 429 00:25:57,520 --> 00:26:00,520 Speaker 1: so her father, who was a money manager, said, 430 00:26:00,600 --> 00:26:03,640 Speaker 1: I'm going to take whatever computer programming skills I have and 431 00:26:03,680 --> 00:26:08,160 Speaker 1: put it towards this program, Isabel, which is meant to say, yes, 432 00:26:08,200 --> 00:26:10,880 Speaker 1: you're right with this diagnosis, I agree with you, or, 433 00:26:11,000 --> 00:26:14,399 Speaker 1: have you considered these other diagnoses? And he said, like, 434 00:26:14,520 --> 00:26:17,800 Speaker 1: had Isabel been around and had his daughter's doctors consulted it, 435 00:26:18,040 --> 00:26:21,399 Speaker 1: they would not have missed the necrotizing fasciitis. Well, it 436 00:26:21,440 --> 00:26:24,000 Speaker 1: makes sense, um, as an assist, you know. Um, there's 437 00:26:24,000 --> 00:26:27,240 Speaker 1: this company called Lifecom that said in clinical trials 438 00:26:27,320 --> 00:26:32,880 Speaker 1: that if you use a medical diagnostic program as an assist, uh, 439 00:26:32,920 --> 00:26:37,680 Speaker 1: those engines were accurate without using exams or imaging or 440 00:26:37,760 --> 00:26:43,880 Speaker 1: labs, even, really just symptoms. Yeah, that's crazy. That's really, 441 00:26:44,119 --> 00:26:48,600 Speaker 1: really, really good.
Yeah, like that's a, that's an A, 442 00:26:48,800 --> 00:26:51,520 Speaker 1: that's a low A, but it's still an A. But as 443 00:26:51,560 --> 00:26:53,560 Speaker 1: an assistant, I think it's, you know, it's kind of 444 00:26:53,600 --> 00:26:57,159 Speaker 1: a no-brainer, don't you think? Oh yeah, I think so. 445 00:26:57,280 --> 00:27:00,119 Speaker 1: I don't know why. All I can think of 446 00:27:00,280 --> 00:27:05,160 Speaker 1: is possibly worrying about feeding the beast that will take 447 00:27:05,240 --> 00:27:08,360 Speaker 1: your job, or just having too much of a case 448 00:27:08,440 --> 00:27:11,520 Speaker 1: load to take the time to double-check your work 449 00:27:11,680 --> 00:27:15,080 Speaker 1: on a computer, would be the only reasons why doctors 450 00:27:15,080 --> 00:27:19,440 Speaker 1: aren't using that. Well, the smartphone is becoming a potential, 451 00:27:19,800 --> 00:27:23,800 Speaker 1: uh, self-diagnoser. There's all these cool things on 452 00:27:23,840 --> 00:27:26,560 Speaker 1: the horizon that you can use your, your phone for. 453 00:27:27,240 --> 00:27:29,840 Speaker 1: There's one called AliveCor, with which you can take 454 00:27:29,880 --> 00:27:33,919 Speaker 1: your own ECG test, and potentially, for the cost of 455 00:27:33,960 --> 00:27:36,080 Speaker 1: getting one ECG in a hospital, you could 456 00:27:36,240 --> 00:27:40,520 Speaker 1: send a year's worth of daily ECGs you took yourself 457 00:27:40,560 --> 00:27:44,080 Speaker 1: to your doctor, and then you carry all that info 458 00:27:44,280 --> 00:27:46,199 Speaker 1: and all of your other medical info from all of 459 00:27:46,200 --> 00:27:48,720 Speaker 1: your apps that will eventually be integrated into one or 460 00:27:48,720 --> 00:27:51,240 Speaker 1: two apps that will probably become preloaded on your iPhone 461 00:27:51,240 --> 00:27:53,919 Speaker 1: in the next couple of years.
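An Isabel-style "second opinion" assist like the one described above can be sketched as a ranking problem: rather than confirming a single anchored diagnosis, score every candidate against the reported symptoms and always return a differential. The knowledge base and scoring below are invented for illustration; real engines are far more sophisticated:

```python
# Hypothetical symptom knowledge base (illustrative only).
KNOWLEDGE = {
    "chicken pox": {"fever", "itchy rash", "blisters"},
    "necrotizing fasciitis": {"fever", "severe pain", "spreading redness", "blisters"},
    "common cold": {"congestion", "cough", "sore throat"},
}

def differential(symptoms, top_n=3):
    """Rank diagnoses by Jaccard overlap between reported and known symptoms."""
    reported = set(symptoms)
    scored = [
        (len(reported & known) / len(reported | known), name)
        for name, known in KNOWLEDGE.items()
    ]
    scored.sort(reverse=True)
    return [name for score, name in scored[:top_n] if score > 0]

# Severe pain plus blisters keeps necrotizing fasciitis on the list even
# when chicken pox is the anchored first impression.
print(differential(["fever", "blisters", "severe pain"]))
# → ['necrotizing fasciitis', 'chicken pox']
```

The point is the counter to premature closure: the output is always a ranked list, so the "have you considered these other diagnoses?" step happens every time.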
And you've got your 462 00:27:53,960 --> 00:27:56,680 Speaker 1: medical history right there. Yeah, and, you know, most 463 00:27:56,720 --> 00:27:59,639 Speaker 1: of these require like a little clip-on, like, um, 464 00:27:59,680 --> 00:28:02,600 Speaker 1: something called CellScope, that's like, you clip it onto 465 00:28:02,600 --> 00:28:06,639 Speaker 1: your little camera lens, essentially, and it's like, what are 466 00:28:06,640 --> 00:28:08,919 Speaker 1: the little magnifiers with the lights that doctors use to 467 00:28:08,920 --> 00:28:11,720 Speaker 1: look in your ears and eyes? Uh, yeah, it looks 468 00:28:11,760 --> 00:28:14,439 Speaker 1: like one of those clipped onto your, your iPhone, and 469 00:28:14,600 --> 00:28:17,840 Speaker 1: it produces, uh, you can do imaging for skin moles 470 00:28:17,840 --> 00:28:21,879 Speaker 1: and rashes and ear infections. They have one called Netra 471 00:28:22,480 --> 00:28:26,400 Speaker 1: that you could potentially test your own eyes with, uh, get 472 00:28:26,400 --> 00:28:30,960 Speaker 1: your own, like, glasses prescription done, and then you send 473 00:28:31,320 --> 00:28:35,840 Speaker 1: the information to some website, they say. And then 474 00:28:35,880 --> 00:28:41,360 Speaker 1: this one called Adamant that smells your breath, that smells 475 00:28:41,400 --> 00:28:44,440 Speaker 1: gases in your breath, and it could detect, like, lung 476 00:28:44,480 --> 00:28:49,760 Speaker 1: cancer, even. Yeah, apparently you have real metabolic changes to 477 00:28:49,840 --> 00:28:52,360 Speaker 1: the smell of your breath, yeah, when you have different 478 00:28:52,400 --> 00:28:55,040 Speaker 1: types of cancer, not just lung. Um, like bees can 479 00:28:55,080 --> 00:28:59,440 Speaker 1: detect breast cancer. Um.
If you breathe into like this 480 00:29:00,000 --> 00:29:02,760 Speaker 1: special glass sphere with bees around it, they can be 481 00:29:02,800 --> 00:29:05,440 Speaker 1: trained to detect lung cancer and they come back with 482 00:29:05,520 --> 00:29:08,680 Speaker 1: the correct results a lot of the time. So a 483 00:29:08,680 --> 00:29:10,200 Speaker 1: lot of these are on the horizon. They're not, like, 484 00:29:10,240 --> 00:29:13,560 Speaker 1: in heavy rotation yet. No, but, but it's pretty neat. 485 00:29:13,600 --> 00:29:17,200 Speaker 1: All of them reveal this idea that no one cares 486 00:29:17,320 --> 00:29:22,440 Speaker 1: about your particular health and well-being more than you. 487 00:29:22,760 --> 00:29:24,920 Speaker 1: Unless you're one of those dudes who doesn't really care. 488 00:29:25,520 --> 00:29:28,920 Speaker 1: Then your, your wife does, or your mom, you know. 489 00:29:29,200 --> 00:29:31,320 Speaker 1: And Umi probably cares more about me than me, right. 490 00:29:31,440 --> 00:29:36,040 Speaker 1: But the point is, the doctor, the insurance company, 491 00:29:36,080 --> 00:29:40,080 Speaker 1: the, the hospital, while they're all in the field because 492 00:29:40,120 --> 00:29:42,840 Speaker 1: they do care about your health, of course, they can't 493 00:29:42,880 --> 00:29:44,920 Speaker 1: possibly care about it more than you or your loved 494 00:29:44,920 --> 00:29:48,360 Speaker 1: one does. So the idea of giving you the ability 495 00:29:48,560 --> 00:29:51,800 Speaker 1: to keep all of that information yourself and easily hand it 496 00:29:51,840 --> 00:29:54,880 Speaker 1: over to them, or, potentially down the road, a computer 497 00:29:55,000 --> 00:29:58,040 Speaker 1: version of them. I can't think of any, any better 498 00:29:58,080 --> 00:30:01,360 Speaker 1: revolution in medicine right now than that. I think 499 00:30:01,360 --> 00:30:03,440 Speaker 1: it's pretty exciting. Yeah.
I think we're going to live 500 00:30:03,440 --> 00:30:06,560 Speaker 1: into the triple digits, buddy. Yeah. And I think there 501 00:30:06,560 --> 00:30:08,360 Speaker 1: will always be a need for doctors and nurses. I 502 00:30:08,400 --> 00:30:12,000 Speaker 1: don't think anyone will be wholly replaced, but a little 503 00:30:12,160 --> 00:30:16,880 Speaker 1: robot assist? Yeah. Yeah. Let me make one more point. 504 00:30:16,880 --> 00:30:20,080 Speaker 1: All right. So you've heard of genomics? Yes. There's 505 00:30:20,120 --> 00:30:23,880 Speaker 1: also this thing called proteomics, which is basically the protein 506 00:30:24,000 --> 00:30:27,800 Speaker 1: version of your genome, and it's all 507 00:30:27,840 --> 00:30:29,880 Speaker 1: of the proteins in your body that you have, that 508 00:30:29,920 --> 00:30:32,400 Speaker 1: you're manufacturing, that you're losing, and all the changes and 509 00:30:32,480 --> 00:30:35,480 Speaker 1: fluctuations in them. And the idea is that you can 510 00:30:35,560 --> 00:30:40,320 Speaker 1: get a full workup of your proteome and your genome, 511 00:30:41,240 --> 00:30:44,160 Speaker 1: and eventually you can add that to your medical history 512 00:30:44,160 --> 00:30:46,400 Speaker 1: as well, what your EKG reading has been 513 00:30:46,480 --> 00:30:49,800 Speaker 1: over the past year, um, any weight you may have 514 00:30:49,840 --> 00:30:53,160 Speaker 1: gained or lost or anything like that, what your breath 515 00:30:53,160 --> 00:30:57,640 Speaker 1: smells like, metabolically speaking, and not only have your current 516 00:30:57,680 --> 00:31:03,000 Speaker 1: state of health, but a personalized version of that, personalized 517 00:31:03,040 --> 00:31:06,600 Speaker 1: down to the genes and proteins in your body, so 518 00:31:07,480 --> 00:31:12,120 Speaker 1: a treatment could be specifically tailored to you.
Wow, that's 519 00:31:12,120 --> 00:31:14,760 Speaker 1: gonna be really tough for a human physician to do 520 00:31:14,800 --> 00:31:17,240 Speaker 1: that on their own. To top that. Yeah, the, the 521 00:31:17,280 --> 00:31:21,520 Speaker 1: amount of data available already is overwhelming human doctors. When 522 00:31:21,560 --> 00:31:23,280 Speaker 1: you add this other kind of stuff on it, it's 523 00:31:23,280 --> 00:31:26,720 Speaker 1: just pulling away from them more and more. Yeah, and 524 00:31:26,720 --> 00:31:29,840 Speaker 1: medical record-keeping is, uh, I know there's been issues 525 00:31:29,880 --> 00:31:33,400 Speaker 1: with that and digitizing that and keeping up with medical records, 526 00:31:33,400 --> 00:31:35,600 Speaker 1: and if you could be your own advocate and keep up 527 00:31:35,600 --> 00:31:37,640 Speaker 1: with your own medical records, it might be kind of nice. 528 00:31:38,800 --> 00:31:41,880 Speaker 1: So I, I feel like we answered the question, which 529 00:31:42,040 --> 00:31:46,160 Speaker 1: is yes, no more doctors. I don't know. I think 530 00:31:46,200 --> 00:31:48,840 Speaker 1: in, in the future, we will always have humans to 531 00:31:49,160 --> 00:31:53,239 Speaker 1: interact between us, I think, because we're always gonna want 532 00:31:53,280 --> 00:31:54,880 Speaker 1: somebody to yell at, or be like, what is this 533 00:31:54,960 --> 00:31:57,880 Speaker 1: robot doing? Or, can you help me, this robot's giving 534 00:31:57,920 --> 00:31:59,640 Speaker 1: me some ipecac and won't let go of my arm, 535 00:31:59,800 --> 00:32:03,960 Speaker 1: or burned my rectum. Yes, we're always going to need humans. 536 00:32:04,000 --> 00:32:06,680 Speaker 1: It's just, I don't know whether we'll need physicians, and 537 00:32:06,720 --> 00:32:09,200 Speaker 1: if we do, will they be super specialized, like just 538 00:32:09,280 --> 00:32:14,200 Speaker 1: the Supreme Court of Physicians? Who knows. It's pretty exciting.
539 00:32:14,240 --> 00:32:16,360 Speaker 1: But we will see this change one way or another 540 00:32:16,640 --> 00:32:22,240 Speaker 1: in the next fifteen years. That's my prediction too. It's happening. Okay, 541 00:32:22,600 --> 00:32:26,480 Speaker 1: good. And Chuck, that's the year. Yeah. Really. If you 542 00:32:26,560 --> 00:32:31,440 Speaker 1: wanna learn more about computers possibly replacing doctors, you can 543 00:32:31,440 --> 00:32:33,800 Speaker 1: type those words into the search bar at howstuffworks 544 00:32:33,840 --> 00:32:36,880 Speaker 1: dot com. And since I said search bar, that means 545 00:32:36,920 --> 00:32:55,240 Speaker 1: it's time for a message break. Okay. So, so what 546 00:32:55,320 --> 00:33:00,840 Speaker 1: do we have? Listener mail time. Yeah, I have one 547 00:33:01,080 --> 00:33:05,800 Speaker 1: called, I'm gonna call it Fight Club. Okay. Hey guys, 548 00:33:05,800 --> 00:33:09,600 Speaker 1: just finished the podcast on deep refrigerating. I think I'll 549 00:33:09,680 --> 00:33:13,600 Speaker 1: keep my Energy Star-certified fridge. Thanks very much. But 550 00:33:13,680 --> 00:33:16,360 Speaker 1: Josh did mention something about eating weeds and asked a 551 00:33:16,400 --> 00:33:19,120 Speaker 1: somewhat rhetorical question: what are weeds anyway? Just plants we 552 00:33:19,120 --> 00:33:21,800 Speaker 1: say are bad? It reminded me of how some of today's common, 553 00:33:22,040 --> 00:33:25,520 Speaker 1: uh, how some of today's common noxious weeds 554 00:33:25,560 --> 00:33:28,840 Speaker 1: got their reputation. Not so long ago, lawns were 555 00:33:28,840 --> 00:33:32,280 Speaker 1: perfect blends of Bermuda, rye, and Kentucky bluegrass. They 556 00:33:32,320 --> 00:33:35,400 Speaker 1: also included many types of clover, dandelion, and other quote weeds.
557 00:33:35,720 --> 00:33:38,760 Speaker 1: In fact, many seed mixtures specifically included white clover because 558 00:33:38,760 --> 00:33:41,080 Speaker 1: it makes an excellent cover in soils where more common 559 00:33:41,080 --> 00:33:44,920 Speaker 1: grasses won't grow. In steps the Scott Fertilizer Company. In post-World 560 00:33:44,960 --> 00:33:47,719 Speaker 1: War Two America, housing tracts were popping up all 561 00:33:47,720 --> 00:33:50,720 Speaker 1: over the US in new suburbia, and Scott was encouraging 562 00:33:50,720 --> 00:33:52,720 Speaker 1: returning GIs to take pride in their new lawns and 563 00:33:52,760 --> 00:33:55,680 Speaker 1: to buy their products to do so. And wore extremely 564 00:33:55,720 --> 00:33:59,440 Speaker 1: high-waisted pants, that's right. They produced fertilizers, weed killers, 565 00:33:59,440 --> 00:34:01,520 Speaker 1: and other lawn care products, some of which had a 566 00:34:01,520 --> 00:34:04,080 Speaker 1: curious side effect: killing many leafy greens that, up 567 00:34:04,080 --> 00:34:06,960 Speaker 1: to that point, were not considered weeds, 568 00:34:07,000 --> 00:34:10,920 Speaker 1: including white clover. Instead of reformulating, what they did was 569 00:34:11,000 --> 00:34:13,960 Speaker 1: what any red-blooded American corporation would do. They redefined 570 00:34:14,040 --> 00:34:16,839 Speaker 1: what was a weed. White clover made that list, as 571 00:34:16,840 --> 00:34:20,120 Speaker 1: did dandelions, when in fact both are still in use 572 00:34:20,160 --> 00:34:22,200 Speaker 1: today in cooking and medicines. Would you call that a 573 00:34:22,280 --> 00:34:26,919 Speaker 1: noxious weed? No. So thanks for that, guys, and thanks 574 00:34:26,920 --> 00:34:29,040 Speaker 1: for all the knowledge I've learned, and have a great day. 575 00:34:29,719 --> 00:34:33,560 Speaker 1: And that is from Robert Paulson. Oh yeah, Robert Paulson.
576 00:34:33,640 --> 00:34:36,239 Speaker 1: He's a, he's a sharp dude. That's why I called 577 00:34:36,239 --> 00:34:39,200 Speaker 1: it Fight Club, remember that? Oh yeah, I think I 578 00:34:39,280 --> 00:34:41,240 Speaker 1: made a joke to him about that once on Twitter 579 00:34:41,280 --> 00:34:43,440 Speaker 1: and he never responded. Yeah, he, he writes in a 580 00:34:43,480 --> 00:34:45,200 Speaker 1: lot now. Every time I see his name, I 581 00:34:45,239 --> 00:34:48,520 Speaker 1: think, his name was Robert Paulson. Yeah, thanks a lot, 582 00:34:48,600 --> 00:34:51,719 Speaker 1: Robert Paulson. We appreciate you. If you're ever shot in 583 00:34:51,760 --> 00:34:53,839 Speaker 1: the head in the commission of a robbery, we will 584 00:34:53,880 --> 00:34:58,080 Speaker 1: dispose of your body. Um. If you want to get 585 00:34:58,120 --> 00:34:59,520 Speaker 1: in touch with me and Chuck, and you have a 586 00:34:59,640 --> 00:35:01,759 Speaker 1: name that you would like us to poke fun at, 587 00:35:02,200 --> 00:35:05,840 Speaker 1: bring it on. You can tweet to us at SYSK 588 00:35:05,960 --> 00:35:08,920 Speaker 1: Podcast. You can post your name on 589 00:35:08,960 --> 00:35:12,600 Speaker 1: Facebook dot com slash Stuff You Should Know. You can 590 00:35:12,600 --> 00:35:14,880 Speaker 1: send us an email to stuffpodcast at how stuff 591 00:35:14,880 --> 00:35:18,320 Speaker 1: works dot com. And then, of course, go visit our website. 592 00:35:18,360 --> 00:35:20,960 Speaker 1: Make it your homepage. It's the coolest place on the web. 593 00:35:21,239 --> 00:35:26,919 Speaker 1: It's Stuff You Should Know dot com. Stuff You Should 594 00:35:26,960 --> 00:35:29,440 Speaker 1: Know is a production of iHeartRadio's How Stuff Works.
595 00:35:29,480 --> 00:35:31,720 Speaker 1: For more podcasts from iHeartRadio, visit the iHeart 596 00:35:31,800 --> 00:35:34,319 Speaker 1: Radio app, Apple Podcasts, or wherever you listen to your 597 00:35:34,320 --> 00:35:35,000 Speaker 1: favorite shows.