Welcome to Stuff You Should Know from HowStuffWorks dot com.

Hey, and welcome to the podcast. I'm Josh Clark. There's Charles W. "Chuck" Bryant. Our guest producer Noel is here. Yeah, Jerry needs a buffer day from her Christmas break. I can't say that again. No, she's at home on her buffer day, in the freezing cold. Because we record these somewhat in advance, we are in the midst of the polar vortex, and everyone's just talking about how cold it is. We're back; it is our first recording after the holidays, and it's literally freezing cold out. So welcome back, buddy. Thanks. Welcome back to you too. Even though this will come out in, what, late January, when it will be a balmy sixteen in Atlanta. I think the end of January is when this one comes out. All right, well, happy New Year, and happy New Year to you too, and happy holidays to you. Thanks.

Chuck. Yes, sir? You're feeling good? You're loose, you're ready? I'm loose. So you see this. You've seen this before. Yeah, your Fitbit. My Fitbit. Is that buzz marketing? Not really, it's just a really good example. With Fitbit, I'm not necessarily loyal to it or anything like that; they don't pay me money to mention it on the podcast. Sometimes it's like, stop staring at me, Fitbit. But no, I like it, I'm happy with it. I pointed it out, though, because it's part of this to me, and I don't think it's confirmation bias: it seems like there really is a growing desire among just average, ordinary people to be able to track their health, their well-being, their activity, and to do it easily. Yeah, we have tools now, like that thing, that make it super convenient. Yeah, and Fitbit's not the only one. There's the Nike FuelBand, and Jawbone is another really good one.
There are others that track your galvanic skin response, so they're able to put that together with respiration and heartbeat and come up with a pretty good assessment of how many calories you're burning at any given time, which is kind of a holy grail with this sort of thing right now. There are others that check your sleep. There are apps out there that let you check your mood. There are sites like Quantified Self, which are basically people trying to push wearable technology like this further into the future. And there are entire websites, like Sharecare dot com, that are dedicated to health information and health self-advocacy.

And there's this desire, it seems to me, to kind of say: hey, this is my health, this is my body, and I want to know more about it. I don't necessarily want to cut out doctors, but I want to decide whether I should go to the doctor, whether it's time or not, and I want to use data to do that. Yeah. I imagine I frustrate a lot of doctors, because I'm one of those obnoxious people who goes in and says, well, here's what I think I have, based on my research. There's nothing wrong with that; that makes you an informed patient. That's exactly what you're supposed to do. And if you're getting on your doctor's nerves, then go see another doctor. Yeah, I agree. I'm actually in search of a new GP right now. Because you got on his nerves? For other reasons too: really cold hands, poor bedside manner, never seeing the actual doctor. It's like, here's my intern from Emory. Which, great, you know, I love them getting experience, but I would like them both to be in there, not just a "smell you later" as the doctor leaves.
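An aside on those multi-sensor trackers: here is a minimal sketch in Python of the sensor-fusion idea described above, combining heart rate, respiration, and galvanic skin response into a single calorie estimate. The function name, coefficients, and formula are illustrative assumptions, not any vendor's published model.

```python
# Toy sketch of multi-sensor calorie estimation, the idea behind trackers
# that fuse galvanic skin response, respiration, and heart rate.
# All coefficients below are illustrative placeholders, not a real model.

def estimate_calories_per_minute(heart_rate_bpm: float,
                                 respiration_rate_bpm: float,
                                 gsr_microsiemens: float,
                                 weight_kg: float) -> float:
    """Rough energy-expenditure guess from wearable sensor readings."""
    # Heart rate dominates; the other signals nudge the estimate up or
    # down to help distinguish exercise from, say, stress or heat.
    base = 0.045 * heart_rate_bpm          # placeholder weighting
    breath = 0.02 * respiration_rate_bpm   # placeholder weighting
    sweat = 0.01 * gsr_microsiemens        # placeholder weighting
    return (base + breath + sweat) * (weight_kg / 70.0)

print(round(estimate_calories_per_minute(120, 25, 8.0, 80), 2))
```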
Well, that's another thing too. Okay, let's just lay it on the table here. What you've just mentioned and what I was talking about: if you put it all together, the medical field, physicians in particular, are currently at the beginning of what's possibly a real pickle of a state for them. A transition period, yes, but they may be transitioned right out of existence in large part. Some may, for sure, depending on who you talk to. There's this whole question now: what is the future of medicine, and more specifically, in the case of this episode, do human physicians factor largely into that future? And the answer is, I don't know; it depends on who you ask.

Like I said, there's this one guy, Dr. Kent Bottles, who feels that GPs might go the way of the dodo and be replaced by diagnostic computers, maybe with avatars. Then there are other people like Farhad Manjoo (he's a technology writer; his wife is a pathologist) who think no, the GPs are the ones who are going to stay in business, and the specialists are going to be out of business, because computers are really good at specializing in one single thing and maybe not so good at the general practitioner thing. So there are lots of opinions out there on how much doctors will be replaced and who might be replaced. Then Ezra Klein wrote a column where he basically said, no, we will still need humans, but we mainly need humans to communicate with the other humans and to facilitate the interaction between the robots and the humans. And we already have those; they're called nurses, or nurse practitioners. But Ezra Klein is also the one who thought that a computer avatar might have a better bedside manner than a doctor.

Well, let's give that one example. There's an example I kept finding while we were doing research for this, actually in the article on HowStuffWorks. There was a medical kiosk demonstrated during a panel called Man-Made Minds: Living with Thinking Machines.
When there's a colon in there, you know it's serious stuff. It was at the World Science Festival in two thousand eleven, and basically this computerized avatar interacted with a woman whose baby had diarrhea. The woman said, hey, avatar, my baby has diarrhea, what are you going to do about it? And the avatar said, well, tell me all the symptoms, and all this stuff, and the avatar decided that the baby's diarrhea, while present, wasn't severe enough to warrant immediate medical attention, so it went ahead and made an appointment with a human doctor for later that week. And the mother said that she preferred the treatment by the avatar to the real-life nurses at the hospitals where she lived in New York. Yeah. So it is possible to create computers with better bedside manner than, say, your GP. Well, at the very least it will be consistent. And that's one of the things. I'm not poo-pooing doctors or nurses; there are many, many, many great ones, but I've also had some pretty bad experiences in emergency rooms and with doctors and nurses. With a computer, at least it's consistent. You know, they're programmed to display empathy no matter what. They're not too busy, and they're not having a bad day, so they don't have any prejudices against you personally or anything like that. They're a computer. They don't hate diarrhea. But humans respond to even programmed empathy, even synthetic empathy from a computer. I could see that a little bit. I've dived into the gaming world enough to know that the realism of an avatar can be convincing. It's not that you think, oh, it's a real person, but it helps to put a human face on it, you know. Exactly. Literally.
I saw a reference to a study that found people who were being treated for anxiety disorders tended to share more about their experiences and themselves with an avatar than with a human psychiatrist. Oh, that's interesting, because they're not embarrassed the way they would be telling a real person. Yeah, that makes sense. I might open up more to a computer. Right. So we've got that part, the bedside manner: it is possible that we can create machines now, and we are creating machines now, that have at least equal if not better bedside manner than some physicians. Yeah. Okay, so bedside manner, one of the big things that doctors bring to the table: check, computers have that. Yeah. It's different now than it was in the old days. I feel like the whole quality of personal care has gone down. It's not necessarily the doctors' fault; there are a lot of places to lay the blame. But it's not like when you were a kid, when you felt like you had your family doctor who knew you, maybe even delivered you. You're my son! Exactly. You've got to stick with the same doctor if you want that kind of care, I think. Right. And another benefit besides bedside manner that comes with that kind of personal care is an awareness of your medical history. Not just that, but, oh well, your dad died of a congenital heart disorder, so you may be at higher risk of it too. Just that kind of awareness has typically been lost, even though we have medical histories and they're in our charts, in our files. An intimate knowledge of a patient's medical history is pretty much lost in today's modern practice of medicine. That's another thing that computers could conceivably top doctors on, and it basically falls under the umbrella of diagnosis, or diagnostics. Yeah. I mean, there are two sides to this: there's diagnosis and there's treatment.
And some programs are out there already. A little bit of the history: this goes back to the nineteen-seventies at the University of Pittsburgh, where they developed software to diagnose problems. Mass General, since the eighties, has been working on its DXplain, which provides ranked lists of diagnoses. Whereas, what's the computer that won at Jeopardy? Watson. Watson looks like it's based more on treatment options than on diagnosis at this point. Well, yeah, but I don't think they want to leave it alone to do its thing with diagnosis yet. No. And there's already something out there for diagnosis that's meant to support physicians. From what I understand with Watson, if there is a doctor of the future, it's Watson. Yeah. He has a lot of advantages over not just human doctors but other artificial-intelligence healthcare machines, as I guess you could clumsily call them. He has a knack for natural language. So let's say there's a structured, formulaic type of language that the medical field is supposed to use. Right, yes. Okay. Health records don't always necessarily contain that language. They might contain natural language, which is really confusing for computers to take in and absorb. Yeah. Humans can pick up on meanings of things that robots and software cannot, like inferences. And we might be using sarcasm, although there's probably not going to be any sarcasm in your medical records. Yeah, but figurative language and stuff like that. Language is a big part of the problem. Or, more to the point with diagnosis: a patient says he feels like he has a hive of bees in his stomach. That might mean something to you or me, but a computer is liable to go looking for the bees, or something. Right. Watson has the advantage of saying, okay, there's a sensation of bees in the stomach; there are not actually bees in the stomach. So let's figure this out.
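To make the natural-language problem concrete, here is a toy sketch of mapping free-text complaints onto structured terms a diagnostic engine could use. The phrase table and the mapped terms are invented for illustration; real systems rely on NLP pipelines and medical ontologies, not a hand-written dictionary.

```python
# Minimal sketch of normalizing figurative patient language ("bees in my
# stomach") into structured terms. All mappings here are invented.

PHRASE_TO_TERM = {
    "bees in my stomach": "abdominal paresthesia",
    "butterflies in my stomach": "nausea",
    "chest on fire": "burning chest pain",
}

def normalize_complaint(free_text: str) -> str:
    """Return a structured term for a known figurative complaint."""
    text = free_text.lower()
    for phrase, term in PHRASE_TO_TERM.items():
        if phrase in text:
            return term
    return "unmapped: " + free_text  # fall through for a human to review

print(normalize_complaint("Feels like there's a hive of bees in my stomach"))
```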
Then Watson, or whatever he eventually becomes, will be able to go through medical records, current medical research, the patient's medical history, the diagnostic tests that were done, blood work, instrument tests, put it all together, and then spit out a list of diagnoses with different confidence levels. So the one at the top is the one that Watson says he is ninety-eight-point-something percent sure is what's wrong with this patient. And as a diagnostician, that's pretty impressive. And that's using all the data that's also available to human physicians; they simply don't have the time to take it all in. Yeah. I think some research said that eighty percent of doctors spend less than five hours a week reading medical journals. A month. A month, yeah. And these things can read thousands in seconds. So it's sort of a matter of efficiency, really. Doctors don't have time to read all this stuff.

You know, we looked into this one sort of savant diagnoser, is that a word? Diagnostician. Dr. Dhaliwal in San Francisco. He's sort of legendary for diagnosing things, to the point where he does it on stage, almost like a parlor trick. They give him forty-five minutes and a bunch of symptoms, really confusing ones, because they're trying to stump him, and generally he comes out on top. But even he uses a program, a diagnostic program called Isabel. Right, that's the one I mentioned that's already here. So doctors are using these to help themselves out. But he says that he's never had Isabel offer a diagnosis that he has missed. Right. But he's like the dude that, yeah, he also admits it: I'm a freak of nature, go ahead, quiz me. Exactly. He also reads case histories for fun, that kind of stuff. He's not a normal physician. He's a complete and total outlier.
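Here is a toy sketch of the "ranked list of diagnoses with confidence levels" idea attributed to Watson and DXplain above. The disease-and-symptom table is invented, and real engines weigh evidence from records, labs, and the literature rather than simple symptom overlap.

```python
# Toy diagnostic ranker: score candidates by symptom overlap, then
# normalize the scores into rough "confidence" percentages.

DISEASES = {
    "influenza": {"fever", "cough", "body aches", "fatigue"},
    "strep throat": {"fever", "sore throat", "swollen lymph nodes"},
    "common cold": {"cough", "sore throat", "runny nose"},
}

def rank_diagnoses(symptoms):
    """Return (diagnosis, confidence) pairs, highest confidence first."""
    scores = {name: len(symptoms & sx) / len(sx)
              for name, sx in DISEASES.items()}
    total = sum(scores.values()) or 1.0
    ranked = [(name, 100 * s / total) for name, s in scores.items()]
    return sorted(ranked, key=lambda pair: pair[1], reverse=True)

for name, confidence in rank_diagnoses({"fever", "cough", "sore throat"}):
    print(f"{name}: {confidence:.1f}%")
```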
If every physician were like this guy, then there probably wouldn't be this conversation going on right now. But most physicians aren't. And it's not just current medical research they're unaware of because they haven't had time to pick up The Lancet in the last few months; it's also their training. If a doctor has been in practice for twenty years, well, the human brain tends to create habits, because it likes to expend as little energy as possible. It's trying to be as efficient as possible. And I think the same thing happens with medical practice. You're trained, you come out of medical school with a lot of book learning, then you put it into practice, and you kind of find your niche, and along the way you forget a lot of the stuff that you haven't done in twenty years or haven't learned about in twenty years. So it's not just current stuff, it's old stuff too. And if you feed the Physicians' Desk Reference into Watson or one of his compatriots, all of that knowledge can be quickly indexed and searched to try to spit out a more accurate diagnosis. Yeah, I think that's a great idea, partnering up with computers. It isn't necessarily replacing. But what they're doing with Watson is very much moving toward replacing doctors in that sense. Well, here's a scary stat: one in five diagnoses in the United States is incorrect or incomplete. One in five. And a lot of the time it's not that the doctor is a jerk or no good, but, like you said, they just maybe haven't seen these cases that were written about in some obscure medical journal that the computer has scanned and indexed. Yeah. And Dr. Dhaliwal, that freak of diagnostication... Dollywood? Yeah, pretty close, which is a wonderful place, by the way. Dr. Dhaliwal himself says, a lot of it, even with me, is intuition, and intuition can be wrong.
That's a criticism, though, of computers as doctors: they lack intuition. There's kind of an even larger conversation going on than this computers-replacing-doctors one. It's a debate over whether intuition or data trumps the other, over which one is the right way to go. Yeah. There's this one stat, too. It says "according to an expert," and I'm not sure what that means, that sounds hinky, but they said only a fraction of the knowledge physicians use to diagnose is evidence-based. So the rest is intuition, which also jibes and dovetails with that one-in-five number being wrong. I like the idea of intuition to a certain degree, for sure, but there's also got to be data backing it up. Sure. So in your perfect world, then, it sounds like we still have physicians, but they go back and double-check themselves using a program. Yeah. But I could also be down with simple, what do they call it in here, rules-based chronic diseases. Yeah, things that are pretty easy to diagnose. They're not even necessarily minor; we just understand them so fully that we can say type 2 diabetes is going to behave and present itself like this. Yeah. I wouldn't mind going, because it seems like once a year I get an upper respiratory infection. It's been three or four years in a row, and I know what the treatment is, I know how it feels. It would be great to go to a machine, have it take some stats, blow into it so it can hear my wheezing, and have it give me a steroid shot and a Z-Pak and a breathing treatment and send me on my way, since that's always what clears it up. Would you care if it was a robot that gave you that shot? Not at all. But I definitely would want more personal care if it was something serious. What if it was a robot with a nice avatar? A sexy avatar, or just a friendly one?
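For the rules-based idea mentioned above, a well-understood condition really can be screened with a few published cutoffs. Here is a minimal sketch using the standard American Diabetes Association fasting-glucose thresholds; the wording of the messages is ours, and this is an illustration, not medical advice.

```python
# Rules-based screening for a well-understood chronic disease, using the
# widely published ADA fasting plasma glucose cutoffs (mg/dL).

def classify_fasting_glucose(mg_dl: float) -> str:
    """Apply standard fasting plasma glucose cutoffs."""
    if mg_dl >= 126:
        return "diabetes range; confirm with a repeat test or A1c"
    if mg_dl >= 100:
        return "prediabetes range; recommend follow-up"
    return "normal range"

for reading in (92, 110, 140):
    print(reading, "->", classify_fasting_glucose(reading))
```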
Yeah, one that would touch your forearm here and there. Well, that might be a little creepy. Yeah, like an old-timey doctor who gave you some ipecac if you had diarrhea and just sent you on your way: drink a Coke. But it wouldn't just send you on your way. It would give you ipecac, and then it wouldn't let go of your forearm. Yeah. Well, surgical robots, that's a thing. I mean, we're getting there. They've been performing robotic surgery since the early eighties, doctor-assisted, until two thousand ten, when, in Montreal, they performed the first fully robotic surgery: they removed a prostate with a fully robotic surgeon and a fully robotic anesthesiologist, McSleepy. Dr. McSleepy. Yeah, and that's the real name. The robot surgeon was da Vinci, which is basically the gold standard for surgical robots. Yeah. In two thousand thirteen there were fifty thousand robotic surgeries performed in the US, so it's big. But the da Vinci is a doctor basically sitting in a little console, it looks like an arcade game, and using robotic arms to mimic his or her movements on a more microscopic level. Right. So the robot has more precise movements and can make smaller movements than the doctor. It's, what's the opposite of telescoping? Going downward in scale, whatever that is. It's taking the movements of the doctor and reducing them in scale. Let's call it reverse telescoping. Reverse telescoping those movements, which is a pretty awesome achievement in and of itself. The doctor is being fed 3-D graphics of what the robot is seeing and is just kind of working from there.
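A quick sketch of the motion scaling ("reverse telescoping") just described: the surgeon's hand displacement is divided down before it reaches the instrument. The 5:1 factor is an assumption for illustration; real systems also filter out hand tremor.

```python
# Tiny sketch of surgical motion scaling: a hand movement in millimeters
# is mapped to a proportionally smaller instrument movement.

def scale_motion(hand_delta_mm, scale=5.0):
    """Reduce a surgeon's (x, y, z) hand movement by a fixed factor."""
    return tuple(axis / scale for axis in hand_delta_mm)

print(scale_motion((10.0, -4.0, 2.0)))  # -> (2.0, -0.8, 0.4)
```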
What we're moving toward, apparently, is fully autonomous robotic surgery. I was talking to Joe McCormick from Forward Thinking, and he was saying there's something called the Raven IV, I believe. And basically you just say, this is going to be a gallbladder surgery on a six-foot-six male, age whatever, and here's the CAT scan of his abdomen. So: go remove gallbladder, and you press enter, and the thing goes in there and removes the guy's gallbladder and sews him up. Yeah, that's fully robotic, fully autonomous robotic surgery. You press the button and it does it; you're not actually controlling a machine that does it. Exactly. The machine is doing it at your behest, but you're not controlling it. Yeah. And we're right on the cusp of that; apparently it's already happening. Yeah, but there are some issues. I looked into it and found that a lot of injury reporting in robotic surgery just isn't happening; it's substandard. This woman, Sheena Wilson, had robotic surgery for a hysterectomy in two thousand thirteen, and apparently this Intuitive Surgical system had had a bunch of injuries associated with it that she didn't know about. She had her rectum burned badly, and she said that if she had known the system had these issues, she would not have elected to take part in it. So there's a lot of underreporting. The FDA has no authority to force a doctor to report this, and apparently there's every reason at every link in the chain not to report these things. And the FDA not enforcing this kind of reporting is ridiculous. Yeah. You know, the thing is, things like that happen, and there's underreporting with human surgeons as well, not just robotic ones. Overall, apparently, surgical injury and accident reporting is not compulsory. Yeah. And here are a few points, though.
Counterpoints, I guess. One: it's not always the robotic component of the surgery that was the cause. Two: a lot of the time they say they don't know about it until a lawsuit is filed, so it could be weeks or months later. The physician doesn't know about it, the FDA might not get a report on it, and then six months later you file a lawsuit and that's how it comes to light. But the FDA is definitely concerned and is supposedly working to improve this. They're very concerned. Very concerned. And another problem, too, from that same article: with a lot of these robotic surgical systems, you still have to have the correct amount of training. And the feeling of some experts, or at least this one guy, Enrico Benedetti, the head of surgery at the University of Illinois at Chicago, is that a lot of it just comes back to training. Some of these doctors aren't getting adequately trained on these machines before performing the surgery. Like, what happens when I do this? Oh, that happens. That's not good. I've got another alarming stat for you. Hold on, hold on. Before that, let's do a message break.

Okay, tell me your alarming stat. All right: a Johns Hopkins study found that as many as forty thousand patients die in intensive care each year in the US due to misdiagnosis. Man. And another study found that system-related factors, like lack of teamwork and communication or just poor processes, were involved in a large share of diagnostic errors, and cognitive factors in many as well, with premature closure the most common. That's basically just sticking to the initial diagnosis and not staying open-minded about other possibilities, like second opinions. Yeah. So there's this thing called anchoring bias; that was in that New York Times article about Dr. Dhaliwal.
There's a program around now to support diagnostics, where a physician will say, I think it's this, but let me put in the symptoms and ask Isabel, which is the name of the program. And it's named after the daughter of the guy who created it. Oh, man. Yeah. When she was three, he took her to the hospital, and the doctors said, well, she has chicken pox. And she did indeed have chicken pox, but that's all they looked at. They completely missed a pretty nasty case of necrotizing fasciitis, which we've talked about before, flesh-eating bacteria, and she almost died from it. She was disfigured as a result. So her father, who is a money manager, said, I'm going to take whatever computer programming skills I have and put them toward this program, Isabel, which is meant to say: yes, you're right about this diagnosis, I agree with you, or, have you considered these other diagnoses? And he said that had Isabel been around and his daughter's doctors consulted it, they would not have missed the necrotizing fasciitis. Well, it makes sense as an assist, you know. There's this company called Lifecom that said in clinical trials that, used as an assist, those diagnostic engines were accurate without using exams or imaging or labs, really just symptoms. Yeah, that's crazy. That's really, really good. Yeah, that's an A. That's a low A, but it's still an A. As an assist, I think it's kind of a no-brainer, don't you think? Oh yeah, I think so. I don't know why more doctors aren't using it; all I can think of is possibly worrying about feeding the beast that will take your job, or just having too much of a caseload to take the time to double-check your work on a computer. Those would be the only reasons. Well, the smartphone is becoming a potential self-diagnoser.
There's all 439 00:26:42,359 --> 00:26:45,119 Speaker 1: these cool things on the horizon that you can use 440 00:26:45,160 --> 00:26:48,320 Speaker 1: your your phone for. There's one called a live Core, 441 00:26:49,000 --> 00:26:52,639 Speaker 1: which you can take your own ECG test and potentially, 442 00:26:53,119 --> 00:26:54,879 Speaker 1: for the cost of getting one e c G in 443 00:26:54,920 --> 00:26:59,200 Speaker 1: the hospital, you could send a year's worth of daily 444 00:26:59,200 --> 00:27:02,679 Speaker 1: ECGs you took yourself to your doctor, and then you 445 00:27:02,760 --> 00:27:05,280 Speaker 1: carry all that info and all of your other medical 446 00:27:05,320 --> 00:27:07,359 Speaker 1: info from all of your apps that will eventually be 447 00:27:07,400 --> 00:27:09,760 Speaker 1: integrated into one or two apps that will probably become 448 00:27:09,800 --> 00:27:12,160 Speaker 1: preloaded on your iPhone in the next couple of years, 449 00:27:13,000 --> 00:27:16,159 Speaker 1: and you've got your medical history right there. Yeah, I mean, 450 00:27:16,119 --> 00:27:18,359 Speaker 1: you know. Most of these require like a little clip 451 00:27:18,400 --> 00:27:21,760 Speaker 1: on like um, something called cell scope that's like you 452 00:27:21,800 --> 00:27:25,600 Speaker 1: clip it onto your little camera lens essentially, and it's 453 00:27:25,640 --> 00:27:27,960 Speaker 1: like what are the little magnifiers with the lights that 454 00:27:28,040 --> 00:27:31,160 Speaker 1: doctors used to look in your ears and eyes? Uh yeah, 455 00:27:31,240 --> 00:27:33,200 Speaker 1: it looks like one of those clipped onto your your 456 00:27:33,200 --> 00:27:37,080 Speaker 1: iPhone and it produces, uh, you can do imaging for 457 00:27:37,080 --> 00:27:40,360 Speaker 1: skin moles and rashes and ear infections. They have one 458 00:27:40,400 --> 00:27:45,639 Speaker 1: called NTRA that you could potentially give your own eyes uh, 459 00:27:46,080 --> 00:27:49,200 Speaker 1: get your own like glasses prescription done, and then you 460 00:27:50,560 --> 00:27:55,520 Speaker 1: order the information to some website and they say, and 461 00:27:55,520 --> 00:28:00,679 Speaker 1: then this one called adamant uh that smells your breath, 462 00:28:00,680 --> 00:28:03,720 Speaker 1: that smells gases in your breath and it could detect 463 00:28:03,840 --> 00:28:08,399 Speaker 1: like lung cancer even yeah, apparently you have real metabolic 464 00:28:08,520 --> 00:28:11,800 Speaker 1: changes to the smell of your breath when you have 465 00:28:11,880 --> 00:28:14,679 Speaker 1: different types of cancer, not just long um. Like bees 466 00:28:14,720 --> 00:28:19,040 Speaker 1: can detect breast cancer. Um. If you breathe into like 467 00:28:19,119 --> 00:28:22,320 Speaker 1: this uh special glass fear with bees around it, they 468 00:28:22,320 --> 00:28:24,880 Speaker 1: can be trained to detect lung cancer and they come 469 00:28:24,880 --> 00:28:27,359 Speaker 1: back with the correct results a lot of the time. 470 00:28:28,320 --> 00:28:29,800 Speaker 1: So a lot of these are on the horizon. They're 471 00:28:29,800 --> 00:28:33,399 Speaker 1: not like in heavy rotation yet, no, but it's pretty neat. 
472 00:28:33,440 --> 00:28:37,120 Speaker 1: All of them reveal this idea that no one cares 473 00:28:37,160 --> 00:28:42,280 Speaker 1: about your particular health and well being more than you 474 00:28:42,600 --> 00:28:44,760 Speaker 1: unless you're one of those dudes who doesn't really care. 475 00:28:45,320 --> 00:28:49,560 Speaker 1: Then your your wife does, you know, and we probably 476 00:28:49,560 --> 00:28:52,280 Speaker 1: cares more about me than me, right, But there's there. 477 00:28:52,880 --> 00:28:57,560 Speaker 1: The point is the doctor, the insurance company, the the hospital, 478 00:28:57,960 --> 00:29:00,600 Speaker 1: while they're all in the field because they do care 479 00:29:00,640 --> 00:29:03,480 Speaker 1: about your health, of course, they can't possibly care about 480 00:29:03,520 --> 00:29:05,880 Speaker 1: it more than you or your loved one does. So 481 00:29:05,960 --> 00:29:09,160 Speaker 1: the idea of giving you the ability to keep all 482 00:29:09,200 --> 00:29:12,040 Speaker 1: of that information yourself and easily hand it over to 483 00:29:12,120 --> 00:29:15,440 Speaker 1: them or potentially down the road, a computer version of them. 484 00:29:16,240 --> 00:29:18,880 Speaker 1: I can't think of any any better revolution in medicine 485 00:29:18,960 --> 00:29:22,280 Speaker 1: right now than that, agreed. I think it's pretty exciting. Yeah. 486 00:29:22,360 --> 00:29:25,360 Speaker 1: I think we're going to live into the triple digits, buddy. Yeah. 487 00:29:25,400 --> 00:29:27,240 Speaker 1: And I think there will always be a need for 488 00:29:27,320 --> 00:29:29,600 Speaker 1: doctors and nurses. I don't think anyone will be wholly 489 00:29:29,640 --> 00:29:35,920 Speaker 1: replaced but a little robot assist. Yeah. Yeah. Let me 490 00:29:35,960 --> 00:29:37,960 Speaker 1: make one more point, all right, there's so you've heard 491 00:29:37,960 --> 00:29:42,520 Speaker 1: of genomics. Yes, there's also this thing called proteonomics, which 492 00:29:42,560 --> 00:29:46,120 Speaker 1: is basically your protein version of your your genome, your genome, 493 00:29:46,720 --> 00:29:49,000 Speaker 1: and it's all of the proteins in your body that 494 00:29:49,080 --> 00:29:51,680 Speaker 1: you have, that your manufacturing, that you're losing, and all 495 00:29:51,720 --> 00:29:54,640 Speaker 1: the changes and fluctuations in them. And the idea is 496 00:29:54,720 --> 00:29:57,480 Speaker 1: that you can get a full work up of your 497 00:29:57,840 --> 00:30:02,720 Speaker 1: proteinome and your gino them and eventually you can add 498 00:30:02,760 --> 00:30:05,160 Speaker 1: that to your medical history as well. What your e 499 00:30:05,280 --> 00:30:07,920 Speaker 1: k G reading has been over the past year, um, 500 00:30:08,360 --> 00:30:10,880 Speaker 1: any way you may have gained or lost or anything 501 00:30:10,960 --> 00:30:15,480 Speaker 1: like that, what your breath smells like metabolically speaking, and 502 00:30:15,880 --> 00:30:21,440 Speaker 1: not only have your current state of health, but personalized 503 00:30:21,480 --> 00:30:24,000 Speaker 1: your version of that, personalized down to your genes and 504 00:30:24,120 --> 00:30:29,160 Speaker 1: proteins in your body, so a treatment could be specifically 505 00:30:29,200 --> 00:30:32,880 Speaker 1: tailored to you. Wow, that's gonna be really tough for 506 00:30:32,920 --> 00:30:35,400 Speaker 1: a human physician to do that on their own. 
On top of that, the amount of data already available is overwhelming human doctors, and when you add this other kind of stuff onto it, it just pulls further and further away from them. Yeah. And medical record keeping... I know there have been issues with digitizing records and keeping up with them, and if you could be your own advocate and keep up with your own medical records, it might be kind of nice. So, do I feel like we have answered the question? Will computers replace doctors? I don't know. I think in the future we will always have humans to interact between us, because we're always going to want somebody to yell at, or to say, what is this robot doing? Or, can you help me, this robot's giving me some ipecac and won't let go of my arm. Or it burned my rectum. Yes. We're always going to need humans. It's just, I don't know whether we'll need physicians, and if we do, will they be super specialized, like just the Supreme Court of Physicians? Who knows. It's pretty exciting, but we will see this change one way or another in the next fifteen years. That's my prediction. It's happening. Okay, good. And Chuck? Yeah? Well, if you want to learn more about computers possibly replacing doctors, you can type those words into the search bar at HowStuffWorks dot com. And since I said search bar, that means it's time for a message break.

Okay. So what do we have, listener mail time? Yeah, I have one I'm going to call Fight Club. Okay. Hey, guys, just finished the podcast on refrigeration. I think I'll keep my Energy Star certified fridge, thanks very much. But Josh did mention something about eating weeds and asked a somewhat rhetorical question: what are weeds, anyway? Just plants we say are bad? It reminded me of how some of today's common noxious weeds got their reputation. Not so long ago, lawns were perfect blends of Bermuda, rye, and Kentucky bluegrass.
They also included many types of clover, dandelion, and other quote-unquote weeds. In fact, many seed mixtures specifically included white clover, because it makes an excellent cover in soils where more common grasses won't grow. In steps the Scotts fertilizer company. In post-World War II America, housing tracts were popping up all over the US in the new suburbia, and Scotts was urging returning GIs to take pride in their new lawns and to buy its products to do so. And to wear extremely high-waisted pants, right? They produced fertilizers, weed killers, and other lawn care products, some of which had a curious side effect: killing many leafy greens that, up to that point, were not considered weeds, including white clover. Instead of reformulating, they did what any red-blooded American corporation would do: they redefined what counted as a weed. White clover made that list, as did dandelions, when in fact both are still used today in cooking and medicine. Would you call that a noxious weed? No. So thanks for that, guys, and thanks for all the knowledge. Have a great one. And that is from Robert Paulson.

Oh yeah, Robert Paulson. He's a sharp dude. That's why I called it Fight Club. Remember that? Oh yeah. I think I made a joke to him about that once on Twitter and he never responded. Yeah, he writes in a lot now. Every time I see his name, I think: his name is Robert Paulson. Yeah. Thanks a lot, Robert Paulson. We appreciate you. If you're ever shot in the head in the commission of a robbery, we will dispose of your body. If you want to get in touch with me and Chuck, and you have a name you would like us to poke fun at, bring it on. You can tweet to us at SYSKPodcast. You can post your name on Facebook dot com slash Stuff You Should Know. You can send us an email at StuffPodcast at Discovery dot com.
You can check us out on YouTube: search Josh and Chuck, it will bring up our YouTube channel, and you will kick your heels with glee. And then, of course, go visit our website. Make it your homepage. It's the coolest place on the web. It's StuffYouShouldKnow dot com.

For more on this and thousands of other topics, visit HowStuffWorks dot com.

JackThreads has quickly become the online shopping destination for guys. Here's why: everything on the site is up to eighty percent off. As a listener of Stuff You Should Know, you can skip the membership waitlist and get instant access at signup dot JackThreads dot com slash knowstuff.