1 00:00:00,600 --> 00:00:02,759 Speaker 1: All right, Josh. So the first part of our winter 2 00:00:02,840 --> 00:00:07,400 Speaker 1: tour is over. A lot of fun. But we are 3 00:00:07,440 --> 00:00:11,520 Speaker 1: going back out this weekend, next to Atlanta. Well, we're 4 00:00:11,520 --> 00:00:14,120 Speaker 1: not going anywhere, well, we're going down the road ten 5 00:00:14,160 --> 00:00:18,079 Speaker 1: minutes from my house. Sure, Atlanta, Birmingham. We would still 6 00:00:18,079 --> 00:00:20,200 Speaker 1: love to see you, and you can still get great seats. Yeah, 7 00:00:20,239 --> 00:00:22,440 Speaker 1: and this is a brand new show. Unless you were 8 00:00:22,480 --> 00:00:25,680 Speaker 1: in San Francisco, San Diego, Austin, or Dallas, you ain't 9 00:00:25,680 --> 00:00:28,880 Speaker 1: seen the show. And it is bringing down the house 10 00:00:29,000 --> 00:00:32,320 Speaker 1: all over the country, eventually probably all over the world. Yes, 11 00:00:32,360 --> 00:00:34,239 Speaker 1: and you can get tickets. You can just go to 12 00:00:34,360 --> 00:00:36,680 Speaker 1: SYSK Live dot com. It's our Square 13 00:00:36,680 --> 00:00:39,320 Speaker 1: space powered site, and they're powering our tour and 14 00:00:39,320 --> 00:00:43,360 Speaker 1: they're, they're powering me on a daily basis. So 15 00:00:43,440 --> 00:00:47,920 Speaker 1: we will see you guys very soon. Welcome to Stuff 16 00:00:47,920 --> 00:00:56,680 Speaker 1: You Should Know from HowStuffWorks dot com. Hey, 17 00:00:56,720 --> 00:00:59,880 Speaker 1: and welcome to the podcast. I'm Joshua M. Clark. There's 18 00:01:00,080 --> 00:01:07,000 Speaker 1: Charles Wayne "Chuckers" "Chuck Tran" Bryant. Chuck Tran? I 19 00:01:07,040 --> 00:01:09,800 Speaker 1: remember that, and then that one never even made sense. No, 20 00:01:10,240 --> 00:01:13,440 Speaker 1: who said that? Me? Yeah, I don't get it. It 21 00:01:13,480 --> 00:01:16,520 Speaker 1: doesn't mean anything.
That's why it never made sense. And 22 00:01:16,520 --> 00:01:22,880 Speaker 1: then there's Jerry "Chairs," "Jerome Rolling," "Jair Tran," and Josh Tran. Yeah, 23 00:01:23,080 --> 00:01:26,759 Speaker 1: the Trans. I'm excited to record this and then leave, 24 00:01:26,959 --> 00:01:30,440 Speaker 1: because I just quickly, on my phone, saw that Billy 25 00:01:30,520 --> 00:01:33,080 Speaker 1: Joel did a doo-wop performance in the commercial break of a 26 00:01:33,680 --> 00:01:36,880 Speaker 1: talk show, and the video was up. Oh yeah, so 27 00:01:36,920 --> 00:01:41,240 Speaker 1: I got things to do. Okay, well, let's go. Personalized medicine, Chuck. Huh? 28 00:01:41,560 --> 00:01:45,720 Speaker 1: So let's take it back. Let's take it way back. Okay, 29 00:01:45,800 --> 00:01:47,920 Speaker 1: let's talk about medicine in general. Right, are we way 30 00:01:47,920 --> 00:01:50,040 Speaker 1: back machining it or no? No, no, no. All right. 31 00:01:50,440 --> 00:01:57,160 Speaker 1: So there's this idea that, um, to best understand how 32 00:01:57,200 --> 00:02:01,360 Speaker 1: to treat a person, you should understand the person. Hippocrates said it's 33 00:02:01,400 --> 00:02:04,320 Speaker 1: far more important to know what person the disease has 34 00:02:04,600 --> 00:02:07,639 Speaker 1: than what disease the person has. Boy, that is smart 35 00:02:07,720 --> 00:02:10,119 Speaker 1: for back then. It is, you know, and I think 36 00:02:10,160 --> 00:02:13,920 Speaker 1: that this was the original idea behind medicine, that we, 37 00:02:13,919 --> 00:02:16,760 Speaker 1: we can understand a disease, but when you apply it 38 00:02:16,800 --> 00:02:19,000 Speaker 1: to a person, it's going to be different than when 39 00:02:19,040 --> 00:02:23,120 Speaker 1: you apply it to another person. And that is the 40 00:02:23,160 --> 00:02:28,840 Speaker 1: heart of personalized medicine, is that understanding. Unfortunately, for many 41 00:02:29,160 --> 00:02:32,919 Speaker 1: hundreds of years.
Well, actually, for a shorter time than that. 42 00:02:33,520 --> 00:02:36,799 Speaker 1: But in Western medicine, the idea has been that if 43 00:02:36,840 --> 00:02:40,000 Speaker 1: it works for most people, it will probably work for you, 44 00:02:40,400 --> 00:02:43,200 Speaker 1: or, that's good enough for us. Yeah, it's called a 45 00:02:43,240 --> 00:02:46,959 Speaker 1: trial and error approach, and that should scare you to death. 46 00:02:47,200 --> 00:02:52,519 Speaker 1: Well, I get it, because until we, until the Human 47 00:02:52,560 --> 00:02:55,720 Speaker 1: Genome Project, we didn't have a lot of choices as 48 00:02:55,760 --> 00:02:59,640 Speaker 1: a society other than to do our best for the majority, 49 00:03:00,760 --> 00:03:04,119 Speaker 1: you know. Well, yeah, like, that changed everything. It did, 50 00:03:04,160 --> 00:03:06,240 Speaker 1: but even before that, it was, it was like, that 51 00:03:06,320 --> 00:03:08,880 Speaker 1: was what, two thousand, two thousand one, something like that, 52 00:03:08,919 --> 00:03:11,600 Speaker 1: the Human Genome Project? Yeah. I mean, before that, there 53 00:03:11,639 --> 00:03:14,160 Speaker 1: were some precursors to personalized medicine, like, let's look at 54 00:03:14,160 --> 00:03:17,480 Speaker 1: family histories and stuff like that. Yeah, but even, like, 55 00:03:17,520 --> 00:03:20,680 Speaker 1: that's not that old. It wasn't until World War 56 00:03:20,720 --> 00:03:25,120 Speaker 1: Two that people started noticing, huh, you know, different people 57 00:03:25,240 --> 00:03:28,840 Speaker 1: have different reactions to different kinds of medicine.
There's actually 58 00:03:28,880 --> 00:03:32,080 Speaker 1: an antimalarial drug that was given to troops in 59 00:03:32,160 --> 00:03:35,760 Speaker 1: World War Two, American troops, and, um, if you're an 60 00:03:35,760 --> 00:03:39,160 Speaker 1: African American, there's a high likelihood that you might develop 61 00:03:39,200 --> 00:03:42,720 Speaker 1: anemia after you were given this antimalarial drug. But 62 00:03:42,840 --> 00:03:46,200 Speaker 1: that wasn't, that didn't show up among, um, white troops, and 63 00:03:46,320 --> 00:03:49,720 Speaker 1: doctors thought, what's behind this? And they went and looked 64 00:03:49,720 --> 00:03:53,640 Speaker 1: and saw that, genetically speaking, African Americans were less likely 65 00:03:53,920 --> 00:03:57,480 Speaker 1: to have a gene active that produces a protective enzyme 66 00:03:57,800 --> 00:04:00,640 Speaker 1: that keeps you from developing anemia when you're given this 67 00:04:00,720 --> 00:04:03,760 Speaker 1: particular antimalarial drug. And that, in the middle of 68 00:04:03,760 --> 00:04:06,520 Speaker 1: the twentieth century, was the first time we really 69 00:04:06,600 --> 00:04:11,600 Speaker 1: started, in the Western medicine tradition, thinking that, no, people 70 00:04:11,680 --> 00:04:15,000 Speaker 1: have different reactions to different types of treatments and can 71 00:04:15,080 --> 00:04:18,080 Speaker 1: have different experiences with different types of disease. Did they 72 00:04:18,080 --> 00:04:21,160 Speaker 1: do something about it in that case? I don't know. 73 00:04:21,760 --> 00:04:23,680 Speaker 1: I was curious. It depends on the time period in 74 00:04:23,680 --> 00:04:27,080 Speaker 1: this country. Shamefully, they might have said, like, yeah, but 75 00:04:27,120 --> 00:04:32,680 Speaker 1: who cares. Yeah. At the same time, the Tuskegee, um, yeah, exactly. Yeah, 76 00:04:33,320 --> 00:04:36,520 Speaker 1: the Tuskegee experiments were going on.
We're also infecting people 77 00:04:36,560 --> 00:04:42,920 Speaker 1: in Guatemala with syphilis. Crazy, crazy stuff. Um, so you 78 00:04:42,960 --> 00:04:47,960 Speaker 1: mentioned Hippocrates, um, more than two thousand years ago. He 79 00:04:48,000 --> 00:04:53,800 Speaker 1: was pretty advanced for thinking that Xerxes needs a bleeding, but, 80 00:04:54,240 --> 00:04:58,599 Speaker 1: uh, uh, Zeus does not. Zeus never needs a bleeding. 81 00:04:58,600 --> 00:05:02,000 Speaker 1: He'd just throw a lightning bolt at the problem. Exactly. 82 00:05:02,880 --> 00:05:04,440 Speaker 1: But he was way ahead of his time to be 83 00:05:04,480 --> 00:05:07,599 Speaker 1: thinking that way back then. Um, some other pioneers since then, 84 00:05:08,160 --> 00:05:12,680 Speaker 1: I think we talked about these two, Reuben Ottenberg and 85 00:05:12,800 --> 00:05:17,359 Speaker 1: Ludvig Hektoen. Hecktorn? I don't know, that was not 86 00:05:17,520 --> 00:05:22,760 Speaker 1: good. Hektoen. Uh, in nineteen oh seven, and I think 87 00:05:22,760 --> 00:05:24,880 Speaker 1: in our blood episode we might have talked about this. 88 00:05:25,200 --> 00:05:27,120 Speaker 1: That was such a good episode. It was a really 89 00:05:27,120 --> 00:05:29,200 Speaker 1: good one, I think. Um, they were the first ones 90 00:05:29,279 --> 00:05:31,719 Speaker 1: to say, you know what, people have different blood types, 91 00:05:31,760 --> 00:05:34,520 Speaker 1: that's how it works, so that's why people keep dying. Well, 92 00:05:34,560 --> 00:05:37,000 Speaker 1: now we're putting this blood into someone that doesn't have 93 00:05:37,040 --> 00:05:39,479 Speaker 1: the same blood. So that was Landsteiner who came 94 00:05:39,520 --> 00:05:41,240 Speaker 1: up with the idea. Right, it wasn't blood types. 95 00:05:41,360 --> 00:05:45,840 Speaker 1: These two were the ones who first started to match people, like, well, 96 00:05:45,920 --> 00:05:48,719 Speaker 1: let's match these people.
That's the, that's, yeah, that's a 97 00:05:48,760 --> 00:05:53,520 Speaker 1: pretty good first example of personalizing medicine on the most 98 00:05:53,560 --> 00:05:58,800 Speaker 1: basic level, like, let's not kill people with blood. Right. Uh, 99 00:05:58,839 --> 00:06:01,480 Speaker 1: and then, like I said, um, family histories and such. 100 00:06:01,600 --> 00:06:04,080 Speaker 1: They finally started saying, hey, you know what, maybe we'll 101 00:06:04,080 --> 00:06:06,680 Speaker 1: look at your father and your mother and your grandparents, 102 00:06:07,520 --> 00:06:09,719 Speaker 1: because if they have this disease, you might have it 103 00:06:09,720 --> 00:06:13,920 Speaker 1: as well. But everything changed when the Human Genome Project 104 00:06:13,960 --> 00:06:17,400 Speaker 1: came along, and, uh, all of a sudden we found 105 00:06:17,400 --> 00:06:20,440 Speaker 1: out we could learn a lot more about our predisposition 106 00:06:20,839 --> 00:06:25,039 Speaker 1: for certain diseases. Yeah, because if you think about it, 107 00:06:25,160 --> 00:06:29,880 Speaker 1: um, our reactions to different diseases, and also the same 108 00:06:29,880 --> 00:06:35,080 Speaker 1: medicines that treat different diseases, uh, can be traced down 109 00:06:35,080 --> 00:06:38,279 Speaker 1: to the, to the genetic level, to the molecular level, 110 00:06:38,880 --> 00:06:41,320 Speaker 1: to whether a gene is turned on and expressing a 111 00:06:41,320 --> 00:06:45,600 Speaker 1: certain kind of protein or enzyme, um, or whether our 112 00:06:45,680 --> 00:06:49,200 Speaker 1: genes are going to allow for a tumor that expresses 113 00:06:49,240 --> 00:06:51,960 Speaker 1: a certain kind of protein that can be tracked.
If 114 00:06:52,000 --> 00:06:56,479 Speaker 1: you conceivably can look at a person's genome sequence, the 115 00:06:56,480 --> 00:07:00,719 Speaker 1: whole thing, analyze it, and then look at what genes 116 00:07:00,760 --> 00:07:03,760 Speaker 1: are turned on or off, what proteins are being expressed, 117 00:07:03,760 --> 00:07:06,480 Speaker 1: that kind of thing, then, if you also know 118 00:07:06,839 --> 00:07:10,320 Speaker 1: that a certain kind of drug attacks a certain kind 119 00:07:10,360 --> 00:07:13,600 Speaker 1: of tumor that's associated with that type of genome or 120 00:07:13,640 --> 00:07:18,520 Speaker 1: genetic sequence, then you can put patient and drug together 121 00:07:18,960 --> 00:07:22,760 Speaker 1: in its ideal form. Dude, we should stop and just 122 00:07:22,840 --> 00:07:26,800 Speaker 1: walk away. That's a mic drop statement. I don't think 123 00:07:26,800 --> 00:07:29,880 Speaker 1: we need anything else. Okay, I'm gonna go watch Billy 124 00:07:29,960 --> 00:07:33,560 Speaker 1: Joel do doo-wop. All right. So, if you think, when you 125 00:07:33,600 --> 00:07:36,280 Speaker 1: go to the doctor, you're getting personalized medicine, in 126 00:07:36,280 --> 00:07:38,080 Speaker 1: a sense, you sort of are. But what we're talking 127 00:07:38,120 --> 00:07:40,880 Speaker 1: about is what Josh has said, which is your own 128 00:07:40,920 --> 00:07:46,080 Speaker 1: individual biology being the most overriding factor in how you 129 00:07:46,080 --> 00:07:50,280 Speaker 1: are treated. Your biology, not just, you know, you're a 130 00:07:50,360 --> 00:07:53,440 Speaker 1: human being. Yeah, this works on human beings and horses. 131 00:07:54,200 --> 00:07:56,400 Speaker 1: And, your mom had cancer, your grandma had cancer, so 132 00:07:56,440 --> 00:07:59,720 Speaker 1: you might have cancer.
No, we're talking about looking inside 133 00:07:59,800 --> 00:08:02,320 Speaker 1: of you to find out what your likelihood to get 134 00:08:02,320 --> 00:08:04,280 Speaker 1: these things is, and, like you said, matching you with 135 00:08:04,360 --> 00:08:07,440 Speaker 1: the best treatment plan. Right. One of those, um, one 136 00:08:07,440 --> 00:08:09,800 Speaker 1: of those courses of study. There's a lot of different 137 00:08:09,840 --> 00:08:13,520 Speaker 1: things that really kind of fall under personalized medicine, um, 138 00:08:13,560 --> 00:08:17,400 Speaker 1: but one of those subfields is called pharmacogenetics. Right. 139 00:08:18,200 --> 00:08:21,120 Speaker 1: And that is, again, if you can take a person's 140 00:08:21,200 --> 00:08:25,320 Speaker 1: genome and then, uh, analyze it, you can say, well, 141 00:08:25,400 --> 00:08:28,160 Speaker 1: I see the sequence right here would react very well 142 00:08:28,200 --> 00:08:32,160 Speaker 1: to this particular drug. That's pharmacogenetics, matching the drug to 143 00:08:32,240 --> 00:08:35,840 Speaker 1: the person. Right, yeah, which is the opposite of, hey, 144 00:08:35,840 --> 00:08:37,560 Speaker 1: it works for eight out of ten people, and if 145 00:08:37,600 --> 00:08:40,920 Speaker 1: you're just one of those, then T.S. T.S. 146 00:08:41,120 --> 00:08:44,400 Speaker 1: And that, seriously, that is the basis of Western medicine 147 00:08:44,440 --> 00:08:46,480 Speaker 1: as it stands right now. It's, it's called, it's a 148 00:08:46,520 --> 00:08:51,080 Speaker 1: trial and error approach. And they don't usually stop at T.S. No, 149 00:08:51,840 --> 00:08:54,880 Speaker 1: they just say, like, oh, you survived that round of drugs, 150 00:08:54,880 --> 00:08:58,400 Speaker 1: but it didn't work. Let's try something else. Maybe, maybe this 151 00:08:58,480 --> 00:09:01,520 Speaker 1: other one, that doesn't work for this but tends to work 152 00:09:01,559 --> 00:09:04,560 Speaker 1: for that, might work for you.
And then it just 153 00:09:04,600 --> 00:09:06,640 Speaker 1: goes on and on and on until they finally hit 154 00:09:06,720 --> 00:09:10,479 Speaker 1: upon that drug, hopefully, that does work. I say hopefully 155 00:09:10,520 --> 00:09:13,760 Speaker 1: because within that trial and error period, a lot of 156 00:09:13,800 --> 00:09:18,120 Speaker 1: people die. Sometimes that first time, that first trial, results 157 00:09:18,120 --> 00:09:20,520 Speaker 1: in a fatal error, and those are called ADEs, 158 00:09:20,840 --> 00:09:25,760 Speaker 1: or adverse drug, um, events. There's seven hundred and 159 00:09:25,800 --> 00:09:29,920 Speaker 1: seventy thousand people in the US alone that either die 160 00:09:30,080 --> 00:09:33,480 Speaker 1: or are injured by an ADE every year. 161 00:09:33,840 --> 00:09:38,679 Speaker 1: In the US alone, almost a million people, seven hundred 162 00:09:38,679 --> 00:09:41,040 Speaker 1: and seventy thousand people, every year. You give that person 163 00:09:41,080 --> 00:09:44,520 Speaker 1: a drug and they might die. And one of 164 00:09:44,520 --> 00:09:49,280 Speaker 1: the goals of, um, of pharmacogenetics is to avoid 165 00:09:49,400 --> 00:09:52,000 Speaker 1: ADEs, so that you can say, before you 166 00:09:52,040 --> 00:09:55,760 Speaker 1: give anybody a drug, like, this won't kill you. Yes, exactly, 167 00:09:56,280 --> 00:09:58,800 Speaker 1: this won't kill you. We know that because we scanned 168 00:09:58,800 --> 00:10:03,400 Speaker 1: your genome. We're not guessing here. We know you genetically 169 00:10:03,640 --> 00:10:06,160 Speaker 1: will not die from this drug. Yeah, I think we 170 00:10:06,200 --> 00:10:09,160 Speaker 1: should caveat here, when we say things like guessing and, 171 00:10:09,200 --> 00:10:11,160 Speaker 1: like, I don't want to paint the medical industry as, 172 00:10:11,840 --> 00:10:16,319 Speaker 1: you know, just throwing darts with a blindfold on.
They've done, 173 00:10:16,360 --> 00:10:18,160 Speaker 1: they did the best job they could, I think, to 174 00:10:18,720 --> 00:10:23,040 Speaker 1: treat massive amounts of people in the most efficient way possible. 175 00:10:23,760 --> 00:10:27,360 Speaker 1: But things are getting better now because of the human 176 00:10:27,400 --> 00:10:30,360 Speaker 1: biome, or the human genome, and what we've learned about it. 177 00:10:30,600 --> 00:10:32,680 Speaker 1: Like, when I look at the future of medicine, it 178 00:10:32,800 --> 00:10:37,000 Speaker 1: is, like, it's super rosy. Yeah, I agree. You know, 179 00:10:37,080 --> 00:10:38,839 Speaker 1: like, a hundred years from now, it's, it's gonna be 180 00:10:38,880 --> 00:10:42,040 Speaker 1: amazing what we're gonna be doing. Maybe, like, thirty. Like, 181 00:10:42,080 --> 00:10:45,560 Speaker 1: we're right there on the cusp right now, where we 182 00:10:45,600 --> 00:10:48,360 Speaker 1: went through a fairly dark age as far as medicine goes, 183 00:10:48,400 --> 00:10:51,120 Speaker 1: where we were taking shots in the dark, figuring things 184 00:10:51,120 --> 00:10:53,720 Speaker 1: out as we went along, and now we are right 185 00:10:53,760 --> 00:10:55,720 Speaker 1: there at the age where we're about to just take 186 00:10:55,720 --> 00:10:59,160 Speaker 1: off like a rocket and really understand health and wellness 187 00:10:59,440 --> 00:11:02,080 Speaker 1: and treatment of disease. All right, well, I feel like 188 00:11:02,080 --> 00:11:04,360 Speaker 1: we're on the cusp of the message break as well. 189 00:11:04,520 --> 00:11:28,760 Speaker 1: I think you're right. So, Chuck, I was talking about 190 00:11:28,760 --> 00:11:34,000 Speaker 1: pharmacogenetics, right? There's actually some examples of pharmacogenetics already 191 00:11:34,160 --> 00:11:37,360 Speaker 1: taking place. This isn't necessarily in the future, like, this 192 00:11:37,480 --> 00:11:43,640 Speaker 1: is already starting.
Yes, I think it started in the nineties, right? Yeah, 193 00:11:43,720 --> 00:11:45,280 Speaker 1: and, and we'll get to this later. One of the 194 00:11:45,280 --> 00:11:48,880 Speaker 1: big reasons that things are cooking now, cooking with gas, 195 00:11:49,000 --> 00:11:53,600 Speaker 1: as my dad used to say, is because of the massive 196 00:11:53,920 --> 00:12:00,320 Speaker 1: drop in cost for mapping your genome. Yeah, like, massive. 197 00:12:00,360 --> 00:12:04,760 Speaker 1: In fact, I'll go ahead and tease you here. And, uh, 198 00:12:05,080 --> 00:12:07,160 Speaker 1: the first time it was done was to James Watson in 199 00:12:07,200 --> 00:12:10,520 Speaker 1: two thousand seven. That was two thousand seven, not even the 200 00:12:10,600 --> 00:12:13,360 Speaker 1: human genome, that was two thousand one. Two thousand seven 201 00:12:13,440 --> 00:12:16,560 Speaker 1: was the first time they mapped a person in full. 202 00:12:17,120 --> 00:12:21,439 Speaker 1: Cost a million dollars. Now you can get it done, 203 00:12:22,360 --> 00:12:24,600 Speaker 1: a good, a good one. Not a full, you know, 204 00:12:24,679 --> 00:12:28,320 Speaker 1: you can't map out the entire genome for this amount 205 00:12:28,320 --> 00:12:33,240 Speaker 1: of money. You can, you can, you can sequence it. 206 00:12:33,480 --> 00:12:36,240 Speaker 1: You can sequence it for, that's the caveat, less than 207 00:12:36,280 --> 00:12:41,400 Speaker 1: two thousand dollars, and pretty soon it's going to be about fifty. 208 00:12:41,559 --> 00:12:44,400 Speaker 1: And then, from what I saw in that, I think 209 00:12:44,400 --> 00:12:47,200 Speaker 1: that was, like, a Business Insider article, there was a 210 00:12:47,240 --> 00:12:51,560 Speaker 1: dude who gave this, this really interesting lecture. Um.
He 211 00:12:51,960 --> 00:12:58,400 Speaker 1: very strongly asserted that they were pretty confident that, thanks 212 00:12:58,400 --> 00:13:03,640 Speaker 1: to economies of scale, uh, genome sequencing will cost about 213 00:13:03,640 --> 00:13:07,320 Speaker 1: a penny. Yeah, it won't, won't cost a penny, like, 214 00:13:07,440 --> 00:13:09,880 Speaker 1: you won't pay a penny, I guarantee you that. No, no, 215 00:13:10,080 --> 00:13:12,880 Speaker 1: but it'll be, but it might be, like, fifty bucks, 216 00:13:13,760 --> 00:13:17,200 Speaker 1: and someone will be making a profit. No, the, I think what 217 00:13:17,240 --> 00:13:18,920 Speaker 1: he was saying was, if you take all of the 218 00:13:18,960 --> 00:13:22,200 Speaker 1: genomes that are sequenced in a year, ultimately that's what 219 00:13:22,280 --> 00:13:24,360 Speaker 1: it will have cost, it's about a penny each. Right, 220 00:13:24,880 --> 00:13:28,400 Speaker 1: but it's gonna pop up in different ways 221 00:13:28,400 --> 00:13:31,000 Speaker 1: than what you have now. Like, this is a pretty 222 00:13:31,000 --> 00:13:34,440 Speaker 1: common thought, that you will pee into your toilet, and 223 00:13:34,440 --> 00:13:37,480 Speaker 1: your toilet will have a genome sequencer attached to it, 224 00:13:37,760 --> 00:13:40,560 Speaker 1: and when you pee, your urine will be analyzed for 225 00:13:40,640 --> 00:13:43,520 Speaker 1: any changes from that morning or the night before or 226 00:13:43,559 --> 00:13:47,199 Speaker 1: anything like that, so that your baseline health is monitored 227 00:13:47,600 --> 00:13:51,080 Speaker 1: on, like, a several-times-a-day basis. Right, 228 00:13:51,120 --> 00:13:53,000 Speaker 1: if my toilet starts telling me to cut down on 229 00:13:53,040 --> 00:13:56,160 Speaker 1: my drinking, then I'm gonna start peeing outside.
I imagine 230 00:13:56,200 --> 00:13:58,079 Speaker 1: that you can probably set it to kind of take 231 00:13:58,120 --> 00:14:00,400 Speaker 1: it easy on this area, you know, that kind of thing. 232 00:14:00,440 --> 00:14:02,640 Speaker 1: And when I say start peeing outside, I mean full time. 233 00:14:02,640 --> 00:14:06,200 Speaker 1: I pee outside almost every night, off of my deck. 234 00:14:07,320 --> 00:14:11,200 Speaker 1: Sometimes you even stand up. Yeah. That's Raymond McCauley, by 235 00:14:11,240 --> 00:14:15,280 Speaker 1: the way. He's the biotechnology and bioinformatics 236 00:14:15,559 --> 00:14:24,320 Speaker 1: chair at Singularity University. What's their mascot, the, uh, Fighting 237 00:14:24,360 --> 00:14:27,960 Speaker 1: Kurzweils? Wild. So he's a smart guy, and he's the 238 00:14:27,960 --> 00:14:30,600 Speaker 1: one that is saying that this is just getting cheaper 239 00:14:30,600 --> 00:14:33,160 Speaker 1: and cheaper. And when you look at the graph, in 240 00:14:33,200 --> 00:14:35,960 Speaker 1: two thousand seven it took a nosedive in price. 241 00:14:36,280 --> 00:14:38,680 Speaker 1: Yeah, it did. He compared it to Moore's law, where, 242 00:14:38,840 --> 00:14:41,840 Speaker 1: um, Moore's law is, like, the amount of computing power 243 00:14:41,960 --> 00:14:46,000 Speaker 1: doubles every eighteen months, or something like that, twenty-four months, 244 00:14:46,040 --> 00:14:50,160 Speaker 1: I can't remember. Um, and it was pointed out that 245 00:14:50,280 --> 00:14:53,680 Speaker 1: genome sequencing was actually moving at a rate of five 246 00:14:53,720 --> 00:14:57,320 Speaker 1: to ten times the rate of Moore's law. That's awesome. 247 00:14:57,560 --> 00:15:00,600 Speaker 1: That is awesome as far as genome sequencing is concerned. The 248 00:15:00,600 --> 00:15:05,000 Speaker 1: problem is computing power is still following Moore's law. And here's 249 00:15:05,040 --> 00:15:07,680 Speaker 1: the big problem.
This is why we're not all getting 250 00:15:07,680 --> 00:15:11,520 Speaker 1: our genome sequenced right now. Because while it might be very 251 00:15:11,600 --> 00:15:16,840 Speaker 1: cheap to sequence a human genome, it's still very expensive because 252 00:15:16,840 --> 00:15:21,440 Speaker 1: it requires a lot of computing power to analyze that genome. Yeah, 253 00:15:21,440 --> 00:15:24,800 Speaker 1: that's the main stumbling block, is you can't sequence your genome, 254 00:15:25,160 --> 00:15:27,400 Speaker 1: stick it in a machine, and have it say, you'll 255 00:15:27,440 --> 00:15:32,600 Speaker 1: get cancer. Yet. That's the future, but not too far off. No, 256 00:15:32,680 --> 00:15:36,120 Speaker 1: that's like Gattaca. Yeah, but the, I mean, this guy 257 00:15:36,360 --> 00:15:39,080 Speaker 1: McCauley was saying probably in about ten years they will 258 00:15:39,120 --> 00:15:41,080 Speaker 1: have machines like that. Yeah, which is what we need. 259 00:15:41,120 --> 00:15:43,160 Speaker 1: That's the main stumbling block right now, is there's so 260 00:15:43,240 --> 00:15:46,480 Speaker 1: much data that computers can't even keep up. So right 261 00:15:46,480 --> 00:15:50,800 Speaker 1: now you could conceivably get a decent genome sequenced and 262 00:15:50,920 --> 00:15:56,000 Speaker 1: analyzed for, like, fifteen grand, which is not, I mean, 263 00:15:56,040 --> 00:15:58,440 Speaker 1: that's not out of the realm of, it's not in the reach 264 00:15:58,480 --> 00:16:03,400 Speaker 1: of everybody. You have to be people that, um... The, 265 00:16:03,400 --> 00:16:08,480 Speaker 1: the, the big change will come when all of us 266 00:16:09,320 --> 00:16:14,080 Speaker 1: get our genome sequenced basically for free.
And the holy 267 00:16:14,120 --> 00:16:16,440 Speaker 1: grail in the not-too-distant future is to not 268 00:16:16,480 --> 00:16:21,600 Speaker 1: only have a genome sequencer and analyzer in your toilet, 269 00:16:22,000 --> 00:16:24,760 Speaker 1: but also you'll be wearing, like, a wearable, or have 270 00:16:24,840 --> 00:16:28,680 Speaker 1: an implantable something. Yeah, or maybe something that's under 271 00:16:28,720 --> 00:16:34,040 Speaker 1: the skin that is like a Fitbit, but that's analyzing everything, um, 272 00:16:34,080 --> 00:16:37,400 Speaker 1: including your hormone levels, things like that. So you're not 273 00:16:37,440 --> 00:16:40,400 Speaker 1: only analyzing your pee, you're also analyzing your body on 274 00:16:40,480 --> 00:16:42,680 Speaker 1: a moment-to-moment basis. And all this stuff is 275 00:16:42,760 --> 00:16:46,400 Speaker 1: run through an app you have on your phone that 276 00:16:46,720 --> 00:16:50,200 Speaker 1: is tied in to your health records and other kinds 277 00:16:50,200 --> 00:16:55,200 Speaker 1: of medical data, um, that you control and you share 278 00:16:55,240 --> 00:16:58,680 Speaker 1: with your healthcare provider, rather than the opposite. That's another 279 00:16:58,720 --> 00:17:01,120 Speaker 1: big change coming that we talked about in the Will 280 00:17:01,160 --> 00:17:05,680 Speaker 1: Computers Replace My Doctor episode, that that medical information about 281 00:17:05,680 --> 00:17:09,280 Speaker 1: the person is going to be wrested away from healthcare, 282 00:17:09,720 --> 00:17:12,600 Speaker 1: and healthcare providers and insurance companies, and placed in the 283 00:17:12,600 --> 00:17:14,760 Speaker 1: hands of the individual. And that's going to be a 284 00:17:14,840 --> 00:17:18,000 Speaker 1: huge change that will probably come from this personalized medicine. 285 00:17:18,040 --> 00:17:21,480 Speaker 1: Exactly, one of the positive changes. All right.
So there 286 00:17:21,480 --> 00:17:25,120 Speaker 1: have been some early stories that have given us all 287 00:17:25,119 --> 00:17:27,760 Speaker 1: hope for the future when it comes to looking at 288 00:17:27,760 --> 00:17:32,399 Speaker 1: these biomarkers, um, for potential of disease. And one of them, 289 00:17:32,760 --> 00:17:35,720 Speaker 1: there was a drug called, uh, K A L Y 290 00:17:35,840 --> 00:17:39,479 Speaker 1: D E C O, Kalydeco. Kalydeco? I think so. Uh, 291 00:17:39,520 --> 00:17:41,960 Speaker 1: in two thousand twelve, to treat a rare form of 292 00:17:42,000 --> 00:17:46,360 Speaker 1: cystic fibrosis, um, which is a deadly lung condition. And 293 00:17:46,520 --> 00:17:49,000 Speaker 1: the FDA here in the US approved this drug, 294 00:17:49,720 --> 00:17:53,679 Speaker 1: um, basically because they found out certain people have genetic markers, 295 00:17:53,680 --> 00:17:57,320 Speaker 1: these biomarkers, that meant they wouldn't respond to other drugs treating, 296 00:17:57,880 --> 00:18:00,639 Speaker 1: uh, cystic fibrosis. So they said, this is a new 297 00:18:00,720 --> 00:18:04,440 Speaker 1: drug that will work for you. Success story, boom, and 298 00:18:04,520 --> 00:18:07,600 Speaker 1: this, like, this is the future of personalized medicine all 299 00:18:07,600 --> 00:18:10,879 Speaker 1: over the place. Right. It covers about four percent of 300 00:18:10,880 --> 00:18:15,000 Speaker 1: cystic fibrosis patients. So in the US, it's the people that 301 00:18:15,080 --> 00:18:18,320 Speaker 1: the drug was targeted for. Right, because, you would think, 302 00:18:19,080 --> 00:18:20,800 Speaker 1: I'm just cynical, but you would think that's so few 303 00:18:20,800 --> 00:18:23,080 Speaker 1: people that somebody would be like, ah, why bother? I'll bet 304 00:18:23,160 --> 00:18:25,840 Speaker 1: it costs a bunch of money for the drug. But yes, 305 00:18:25,880 --> 00:18:28,720 Speaker 1: you're right, um.
And then secondly, it also kind of 306 00:18:28,760 --> 00:18:33,200 Speaker 1: shows how personalized medicine shifts our understanding of disease, too. Right, 307 00:18:33,480 --> 00:18:36,480 Speaker 1: the reason these people with cystic fibrosis didn't respond to 308 00:18:36,520 --> 00:18:40,679 Speaker 1: regular medicine is because their cystic fibrosis developed because 309 00:18:41,119 --> 00:18:45,359 Speaker 1: the genes that regulated salt and water movement across 310 00:18:45,359 --> 00:18:48,960 Speaker 1: the surface of their lungs were mutated and not functioning properly. 311 00:18:49,280 --> 00:18:52,960 Speaker 1: So this specific drug that targets these four percent of 312 00:18:53,000 --> 00:18:56,920 Speaker 1: cystic fibrosis patients goes in and messes with that gene. Well, 313 00:18:56,960 --> 00:18:59,960 Speaker 1: if you go to the other ninety-six percent of cystic fibrosis patients, 314 00:19:00,480 --> 00:19:04,359 Speaker 1: their salt and water, um, movement is just fine. That's 315 00:19:04,359 --> 00:19:07,480 Speaker 1: not why they have cystic fibrosis. So it changes your 316 00:19:07,560 --> 00:19:11,920 Speaker 1: understanding of cystic fibrosis. It's not just, you have cystic fibrosis. 317 00:19:12,200 --> 00:19:14,080 Speaker 1: It's, this is why you have it, this is how your 318 00:19:14,119 --> 00:19:17,760 Speaker 1: body is showing that you have cystic fibrosis. You have 319 00:19:17,840 --> 00:19:20,440 Speaker 1: cystic fibrosis, and you can have all these, you can 320 00:19:20,480 --> 00:19:24,359 Speaker 1: have it under these different mechanisms. That's what personalized medicine 321 00:19:24,440 --> 00:19:27,560 Speaker 1: is changing too. It's changing our understanding of disease itself.
322 00:19:28,200 --> 00:19:32,720 Speaker 1: Same with cancer, right? Certain tumors express certain proteins, and 323 00:19:32,840 --> 00:19:36,439 Speaker 1: although yes, you have an out-of-control growth that 324 00:19:36,560 --> 00:19:39,000 Speaker 1: makes it a cancer, it really doesn't bear that much of 325 00:19:39,000 --> 00:19:42,639 Speaker 1: a resemblance to this other kind of cancer. And the 326 00:19:42,720 --> 00:19:46,199 Speaker 1: more we dig into how people respond differently to cancer 327 00:19:46,200 --> 00:19:49,000 Speaker 1: treatments and how they can host different kinds of tumors, 328 00:19:49,200 --> 00:19:51,720 Speaker 1: the more it changes our understanding of cancer. And a lot of 329 00:19:51,760 --> 00:19:54,000 Speaker 1: people are like, cancer is too big of an umbrella. 330 00:19:54,200 --> 00:19:57,560 Speaker 1: These are really almost different diseases. Yeah. And I think 331 00:19:57,800 --> 00:20:01,720 Speaker 1: the Macaulay guys said the hope one day is to 332 00:20:01,760 --> 00:20:04,840 Speaker 1: stop cancer before it even starts, at such a small 333 00:20:05,440 --> 00:20:10,040 Speaker 1: molecular level, with these advanced, uh, I guess, like, 334 00:20:10,040 --> 00:20:12,760 Speaker 1: blood tests. Yeah, basically the blood test will be so 335 00:20:12,840 --> 00:20:16,080 Speaker 1: advanced that, let's say, you know you're going to develop 336 00:20:16,160 --> 00:20:19,440 Speaker 1: cancer in five years. Like, we can tell that already, 337 00:20:19,480 --> 00:20:22,920 Speaker 1: so let's just stop it now before there's a, yeah, 338 00:20:23,040 --> 00:20:26,960 Speaker 1: or before it gets big enough that it's a problem. Yeah, exactly. Uh, 339 00:20:27,000 --> 00:20:29,760 Speaker 1: if you have type one diabetes, I think it is, 340 00:20:30,400 --> 00:20:35,320 Speaker 1: um, good news. There is a new system.
It's 341 00:20:35,600 --> 00:20:40,639 Speaker 1: basically an artificial pancreas device, and they are wearable, and 342 00:20:41,000 --> 00:20:44,119 Speaker 1: it was developed by UVA and Harvard. Go 343 00:20:44,320 --> 00:20:53,160 Speaker 1: Cavaliers and Crimson. The Crimson Smarties? That's Harvard, right? They're 344 00:20:53,160 --> 00:20:55,600 Speaker 1: not the Crimson Tide too, are they? No, not the Tide. 345 00:20:55,640 --> 00:20:59,960 Speaker 1: They're just the Crimson. I think you guys left 346 00:21:00,080 --> 00:21:02,720 Speaker 1: part off there, Harvard. Well, they do have a mascot, 347 00:21:02,760 --> 00:21:05,040 Speaker 1: I think, like John Harvard, but it's not like it's 348 00:21:05,040 --> 00:21:07,760 Speaker 1: just a square of crimson. I don't know. I think 349 00:21:07,760 --> 00:21:10,280 Speaker 1: so. Maybe they're above it. They don't need a Crimson 350 00:21:10,320 --> 00:21:16,280 Speaker 1: Knights? Crimson Knights? No, is that Rutgers? That's Scarlet Knights. Anyway, 351 00:21:16,400 --> 00:21:19,919 Speaker 1: UVA and Harvard developed this thing together. Uh, and 352 00:21:20,000 --> 00:21:23,320 Speaker 1: it starts clinical trials in like the next month or two. 353 00:21:24,160 --> 00:21:26,720 Speaker 1: Uh, and for six months, two hundred forty people are gonna 354 00:21:26,760 --> 00:21:32,359 Speaker 1: wear this thing, this artificial pancreas, to tell you, uh, 355 00:21:32,880 --> 00:21:36,560 Speaker 1: exactly when you need the optimal level of insulin in 356 00:21:36,600 --> 00:21:39,960 Speaker 1: your body at all times. Well, and it introduces that optimal level. 357 00:21:40,520 --> 00:21:44,480 Speaker 1: Oh, does it? Uh huh. How so? So it's like, 358 00:21:44,720 --> 00:21:47,959 Speaker 1: it's monitoring your blood glucose level. Yeah. And you, you know, 359 00:21:48,040 --> 00:21:50,880 Speaker 1: if you have diabetes, you have to inject insulin.
Yeah, 360 00:21:51,040 --> 00:21:54,720 Speaker 1: this thing, say, is connected to a port in your chest. Oh, 361 00:21:54,840 --> 00:21:56,840 Speaker 1: I don't think this one particularly is. This is just 362 00:21:56,880 --> 00:22:00,639 Speaker 1: a wearable monitor. But I think eventually they're gonna have 363 00:22:00,680 --> 00:22:03,359 Speaker 1: what you're talking about. I guess I'm just getting ahead 364 00:22:03,400 --> 00:22:09,040 Speaker 1: of the, ahead of myself. That's one that actually regulates, not 365 00:22:09,200 --> 00:22:11,879 Speaker 1: monitors, in the future, I think, is what you're talking about, 366 00:22:12,000 --> 00:22:17,040 Speaker 1: or injects like an optimal dose, regulating your glucose so 367 00:22:17,080 --> 00:22:19,080 Speaker 1: you don't have to do it. I think this is 368 00:22:19,119 --> 00:22:21,879 Speaker 1: just a wearable monitor, so you could just like press it 369 00:22:21,960 --> 00:22:24,159 Speaker 1: and say, okay, how much insulin do 370 00:22:24,200 --> 00:22:25,719 Speaker 1: I need right now? And it tells you the exact, 371 00:22:25,760 --> 00:22:29,280 Speaker 1: like, milligrams, so you still have to, like a dope, 372 00:22:29,640 --> 00:22:33,520 Speaker 1: go and inject it yourself. Right, I think so. I 373 00:22:33,560 --> 00:22:35,400 Speaker 1: don't see how it could be wearable on your arm 374 00:22:35,440 --> 00:22:40,600 Speaker 1: and also be attached to your body, like the insides 375 00:22:40,640 --> 00:22:44,359 Speaker 1: of your body, through like an IV. Yeah, I 376 00:22:44,359 --> 00:22:47,680 Speaker 1: don't think that's what this is. Ah, sounds like there's 377 00:22:47,680 --> 00:22:51,439 Speaker 1: two different things, but it's still monitoring exactly what your 378 00:22:51,480 --> 00:22:54,680 Speaker 1: blood glucose level is. Absolutely. And it's your blood 379 00:22:54,800 --> 00:22:59,760 Speaker 1: glucose level, ergo, it's personalized medicine. That's right.
If you 380 00:22:59,760 --> 00:23:04,399 Speaker 1: have tinnitus, like our buddy Aaron Cooper. Aaron Cooper, he 381 00:23:04,400 --> 00:23:06,000 Speaker 1: probably didn't hear that. All he heard was a ringing. 382 00:23:06,040 --> 00:23:12,560 Speaker 1: He just hears it. UM, they're working on customizable devices that 383 00:23:12,640 --> 00:23:15,440 Speaker 1: adjust the audio signal in a way that's unique to your own ear. 384 00:23:15,480 --> 00:23:18,040 Speaker 1: In other words, not just, hey, put this hearing aid in 385 00:23:18,080 --> 00:23:20,840 Speaker 1: there that may or may not work for you, right? 386 00:23:21,040 --> 00:23:23,720 Speaker 1: From what I understand, it actually, so, UM, you know 387 00:23:23,840 --> 00:23:27,960 Speaker 1: noise canceling headphones? Well, it kind of works like those. 388 00:23:28,359 --> 00:23:31,280 Speaker 1: I guess it figures out what pitch your 389 00:23:31,359 --> 00:23:33,359 Speaker 1: tinnitus is at, and it just gets rid of it. 390 00:23:34,320 --> 00:23:36,840 Speaker 1: I think that's neat. I do too, UM. And then, Chuck, 391 00:23:36,880 --> 00:23:41,439 Speaker 1: there's another early example of a big win. UM, 392 00:23:41,480 --> 00:23:46,960 Speaker 1: there's something called Herceptin, and the FDA said, yes, go 393 00:23:47,000 --> 00:23:51,760 Speaker 1: ahead with this. UM, they figured out that this particular 394 00:23:51,880 --> 00:23:56,440 Speaker 1: drug worked for a specific group of people UM whose 395 00:23:56,480 --> 00:24:00,200 Speaker 1: tumors expressed a specific protein, and it was a breast 396 00:24:00,280 --> 00:24:06,040 Speaker 1: cancer UM tumor targeting drug. But like, again, it wasn't like, oh, 397 00:24:06,080 --> 00:24:10,080 Speaker 1: you have breast cancer, here, try, um, Herceptin, it'll 398 00:24:10,119 --> 00:24:14,200 Speaker 1: work for you.
It's, we 399 00:24:14,520 --> 00:24:17,800 Speaker 1: believe that you have this kind of tumor because it's expressing this kind 400 00:24:17,800 --> 00:24:20,399 Speaker 1: of protein, so Herceptin is going to treat this. 401 00:24:21,040 --> 00:24:24,640 Speaker 1: Hooray for Herceptin. Yeah. Well, let's take another break 402 00:24:24,680 --> 00:24:26,879 Speaker 1: and we'll get back and finish up with some of 403 00:24:26,920 --> 00:24:51,480 Speaker 1: the obstacles in the future. All right. So this all 404 00:24:51,520 --> 00:24:55,160 Speaker 1: sounds rosy, but there are some obstacles. We already talked about 405 00:24:55,160 --> 00:24:58,800 Speaker 1: one; the previous biggest one was cost. This article itself 406 00:24:58,880 --> 00:25:04,280 Speaker 1: is, um, way out of date, because it said seventeen 407 00:25:04,280 --> 00:25:06,959 Speaker 1: thousand dollars a person, and now it's already like two 408 00:25:07,040 --> 00:25:09,760 Speaker 1: hundred bucks. I think that might be, though, with the, 409 00:25:10,560 --> 00:25:16,240 Speaker 1: with analysis. Oh really? Yeah, I think that's what they're saying. Okay, 410 00:25:16,320 --> 00:25:18,440 Speaker 1: oh yeah, follow up on the data. Yeah, all right, it's 411 00:25:18,440 --> 00:25:21,640 Speaker 1: now down to fifteen grand. So it's down by two 412 00:25:21,680 --> 00:25:25,159 Speaker 1: thousand dollars. So it was written a week ago. All right, 413 00:25:26,800 --> 00:25:29,400 Speaker 1: but the cost of the genome was a previous hurdle. 414 00:25:29,480 --> 00:25:32,479 Speaker 1: Now that's coming down. Another hurdle that we mentioned 415 00:25:32,560 --> 00:25:36,080 Speaker 1: was just processing the data. And then another hurdle is 416 00:25:36,200 --> 00:25:41,720 Speaker 1: just overstating the impact of the findings.
Um, 417 00:25:41,760 --> 00:25:45,560 Speaker 1: just because, and it's a slippery slope, just because you 418 00:25:45,600 --> 00:25:49,040 Speaker 1: are susceptible to something doesn't mean you're gonna get it. No, 419 00:25:49,280 --> 00:25:52,960 Speaker 1: and that's actually, there's something called the Jolie effect, the 420 00:25:53,080 --> 00:25:57,200 Speaker 1: Angelina Jolie effect. Oh boy, I've got eight thousand jokes. 421 00:25:57,400 --> 00:25:59,600 Speaker 1: Have you heard about that? Now, so do you remember 422 00:25:59,640 --> 00:26:03,720 Speaker 1: when she did genetic testing and found that, um, there 423 00:26:04,040 --> 00:26:07,520 Speaker 1: was a likelihood that she would develop breast cancer? 424 00:26:10,359 --> 00:26:13,320 Speaker 1: I think perhaps like her mother may have had breast cancer, 425 00:26:13,359 --> 00:26:15,359 Speaker 1: I'm not sure, but she was convinced that there was 426 00:26:15,400 --> 00:26:17,720 Speaker 1: a good chance she was going to get breast cancer. So 427 00:26:17,840 --> 00:26:21,320 Speaker 1: she went ahead and had a double mastectomy without breast cancer, 428 00:26:21,359 --> 00:26:26,160 Speaker 1: no tumors, no nothing. She just preventatively had mastectomies. Angelina 429 00:26:26,240 --> 00:26:31,280 Speaker 1: Jolie did, yes, and it created what's called this Angelina 430 00:26:31,480 --> 00:26:34,680 Speaker 1: Jolie effect. And Christina Applegate did something like that too. Well, 431 00:26:34,760 --> 00:26:38,439 Speaker 1: she had breast cancer.
Angelina Jolie didn't have breast cancer, 432 00:26:39,000 --> 00:26:41,960 Speaker 1: believed that she would conceivably get breast cancer, so she 433 00:26:42,040 --> 00:26:45,840 Speaker 1: just had her breasts removed. Right. Um, and 434 00:26:45,880 --> 00:26:48,399 Speaker 1: it created what's called this Angelina Jolie effect, which is 435 00:26:48,440 --> 00:26:52,400 Speaker 1: this idea that, UM, the more we know about our bodies, 436 00:26:52,920 --> 00:26:57,760 Speaker 1: the more, UM, focused we become on all the things that could 437 00:26:57,760 --> 00:27:02,080 Speaker 1: conceivably go wrong, hypothetically could go wrong, that we 438 00:27:02,119 --> 00:27:07,919 Speaker 1: may take radical steps, like, like prophylactic surgery, basically, you know, 439 00:27:08,000 --> 00:27:10,320 Speaker 1: to prevent something that may or may not even happen. 440 00:27:10,880 --> 00:27:14,679 Speaker 1: And this is a big concern among bioethicists about this 441 00:27:14,760 --> 00:27:17,879 Speaker 1: kind of understanding that will come from personalized medicine: 442 00:27:18,280 --> 00:27:21,600 Speaker 1: are we all gonna become obsessed with our health? Well, 443 00:27:21,640 --> 00:27:24,359 Speaker 1: I think for people that already are, this will just be 444 00:27:24,400 --> 00:27:27,160 Speaker 1: the next step of that. Yeah. But I could see 445 00:27:27,160 --> 00:27:29,240 Speaker 1: it bringing more people into the fold. I'm 446 00:27:29,240 --> 00:27:30,720 Speaker 1: sure there's a lot of people who don't think about 447 00:27:30,720 --> 00:27:33,560 Speaker 1: their health just because they don't have that kind of awareness. 448 00:27:33,560 --> 00:27:35,679 Speaker 1: But if it was in their face, like, hey, buddy, 449 00:27:35,920 --> 00:27:38,800 Speaker 1: here's your genome, look at this crazy stuff that could 450 00:27:38,800 --> 00:27:41,560 Speaker 1: happen to you.
You may start thinking about it 451 00:27:41,600 --> 00:27:44,560 Speaker 1: even if you weren't predisposed to it before. But you 452 00:27:44,560 --> 00:27:46,639 Speaker 1: would have to go get that done to begin with. 453 00:27:46,880 --> 00:27:50,240 Speaker 1: Well, that's another question too. So right now, if getting 454 00:27:50,280 --> 00:27:56,280 Speaker 1: your genome done costs seventeen grand, right, um, should that 455 00:27:56,320 --> 00:28:00,639 Speaker 1: be just the province of the rich, or is it a 456 00:28:00,760 --> 00:28:05,119 Speaker 1: human right to know what your genome says? If anybody 457 00:28:05,160 --> 00:28:08,600 Speaker 1: can know what their genome says, should everybody? I predict 458 00:28:08,640 --> 00:28:12,000 Speaker 1: that the answer will ultimately be yes to that, that 459 00:28:12,080 --> 00:28:15,679 Speaker 1: it's a right, and the government will probably fund a 460 00:28:15,760 --> 00:28:20,760 Speaker 1: program for every American to get their genome sequenced. 461 00:28:21,800 --> 00:28:25,200 Speaker 1: Another big problem is the FDA is just overtaxed, 462 00:28:25,440 --> 00:28:29,440 Speaker 1: you know. It's a rapidly moving field and they 463 00:28:29,520 --> 00:28:32,639 Speaker 1: just can't keep up at this point, which, you know, 464 00:28:32,640 --> 00:28:33,840 Speaker 1: because there are a lot of new things that come 465 00:28:33,840 --> 00:28:36,919 Speaker 1: along with new drugs, new devices, that the FDA has 466 00:28:36,960 --> 00:28:40,280 Speaker 1: to test.
Well, not just that, the understanding of it 467 00:28:40,320 --> 00:28:43,000 Speaker 1: as well. Like, they used to have this open database 468 00:28:43,080 --> 00:28:46,000 Speaker 1: from the Human Genome Project where all of these 469 00:28:46,040 --> 00:28:50,360 Speaker 1: anonymous subjects' genes, or genomes, were just sitting out there 470 00:28:50,360 --> 00:28:53,880 Speaker 1: for anybody to go and data mine, right? And then 471 00:28:53,960 --> 00:28:57,120 Speaker 1: somebody proved that you can actually, you can 472 00:28:57,400 --> 00:29:02,520 Speaker 1: de-anonymize these people, because again, this is their genome, and 473 00:29:02,640 --> 00:29:06,240 Speaker 1: figure out whose genome you're looking at specifically. And the 474 00:29:06,320 --> 00:29:08,560 Speaker 1: FDA had to shut it down, but they shut it 475 00:29:08,560 --> 00:29:11,920 Speaker 1: down after somebody proved that this could already be done. 476 00:29:11,960 --> 00:29:14,560 Speaker 1: So they're having to react rather than being able 477 00:29:14,640 --> 00:29:16,600 Speaker 1: to keep up with the changes in the field. And 478 00:29:16,640 --> 00:29:19,040 Speaker 1: that's one of the other huge slippery slopes in the 479 00:29:19,080 --> 00:29:23,840 Speaker 1: future is, um, well, a couple of things: how insurance 480 00:29:23,880 --> 00:29:27,840 Speaker 1: companies deal with this. Um, can they deny someone 481 00:29:28,440 --> 00:29:33,480 Speaker 1: based on a biomarker? Um, right now there's legislation that 482 00:29:33,600 --> 00:29:37,080 Speaker 1: has been signed into law that says no, you cannot. 483 00:29:37,200 --> 00:29:42,520 Speaker 1: It's called biological discrimination, which is profoundly insightful, or foresightful, 484 00:29:43,000 --> 00:29:46,160 Speaker 1: for the government. Sure, I'm really surprised by that one. Uh.
485 00:29:46,200 --> 00:29:48,480 Speaker 1: And you know what, Canada is the only G seven 486 00:29:48,520 --> 00:29:55,520 Speaker 1: country that doesn't have this protection against biological discrimination, and it's 487 00:29:55,560 --> 00:29:57,000 Speaker 1: a big deal. There's a lot of people that are 488 00:29:57,000 --> 00:30:00,200 Speaker 1: going like, why are we the only one? Where, Canada? Uh, 489 00:30:01,800 --> 00:30:05,240 Speaker 1: I predict Trudeau will change that. Well, there's a big push 490 00:30:05,280 --> 00:30:08,880 Speaker 1: for it, UM. And it's funny, when they voted in 491 00:30:08,920 --> 00:30:16,440 Speaker 1: the, what was the act called, uh, the Genetic Information 492 00:30:16,520 --> 00:30:20,800 Speaker 1: Nondiscrimination Act of two thousand eight, right, um, it passed by 493 00:30:20,880 --> 00:30:25,480 Speaker 1: a vote of ninety-five to nothing in the Senate and four 494 00:30:25,600 --> 00:30:28,360 Speaker 1: hundred fourteen to one in the House. Who was the one? 495 00:30:28,480 --> 00:30:32,120 Speaker 1: It was Ron Paul, of all people. Huh. I'd be 496 00:30:32,200 --> 00:30:34,600 Speaker 1: interested to know what his thinking was. I've got 497 00:30:34,640 --> 00:30:36,760 Speaker 1: it, because I was, I thought the same thing. Here's 498 00:30:36,760 --> 00:30:39,200 Speaker 1: his thinking, because it doesn't make sense, because 499 00:30:39,200 --> 00:30:41,320 Speaker 1: he's pretty obsessed with the government staying out of your business.
500 00:30:42,200 --> 00:30:45,320 Speaker 1: He said uniform federal mandates are a clumsy and ineffective 501 00:30:45,320 --> 00:30:48,520 Speaker 1: way to deal with problems such as employers. And one 502 00:30:48,520 --> 00:30:50,800 Speaker 1: of the rubs is, either you'll be denied insurance, or 503 00:30:50,840 --> 00:30:53,320 Speaker 1: maybe you won't get hired for a job or promoted, 504 00:30:53,400 --> 00:30:55,240 Speaker 1: if they know that you might, you know, kick the 505 00:30:55,280 --> 00:30:57,800 Speaker 1: bucket soon. That guy can't push a broom. He's got 506 00:30:57,840 --> 00:31:01,320 Speaker 1: a defect on his G four eight gene. But it 507 00:31:01,360 --> 00:31:03,160 Speaker 1: says right here in his experience he can push 508 00:31:03,160 --> 00:31:07,479 Speaker 1: a broom. Genetics. He said, uniform federal mandates are a clumsy 509 00:31:07,480 --> 00:31:10,200 Speaker 1: and ineffective way to deal with problems such as employers 510 00:31:10,680 --> 00:31:13,320 Speaker 1: making hiring decisions on the basis of the potential employee's 511 00:31:13,440 --> 00:31:17,920 Speaker 1: genetic profile. Imposing federal mandates on private businesses merely raises 512 00:31:17,920 --> 00:31:20,720 Speaker 1: the cost of doing business and thus reduces the overall 513 00:31:20,800 --> 00:31:25,440 Speaker 1: employment opportunities for all citizens. Huh. Yeah, I see what 514 00:31:25,440 --> 00:31:27,840 Speaker 1: he's saying, but I don't know. I'm kind of surprised. 515 00:31:27,880 --> 00:31:30,760 Speaker 1: It seems like something you'd want to protect, um, but 516 00:31:30,880 --> 00:31:34,440 Speaker 1: it passed by the widest of margins regardless. Yeah, that 517 00:31:34,520 --> 00:31:37,960 Speaker 1: might be a record.
No, I'm sure there's been unanimous 518 00:31:38,200 --> 00:31:40,480 Speaker 1: ones. I would like to know what those were, 519 00:31:41,000 --> 00:31:44,360 Speaker 1: you know, like honoring Girl Scouts on Patriot Day or something. No, 520 00:31:44,400 --> 00:31:47,760 Speaker 1: there was one person who's like, no. No, that was Bernie Sanders. 521 00:31:47,840 --> 00:31:52,200 Speaker 1: I choked on a, on a Tagalong once, never buying 522 00:31:52,240 --> 00:31:55,080 Speaker 1: them again. Um, there's another obstacle, Chuck, and it is 523 00:31:55,200 --> 00:32:01,320 Speaker 1: gathering the information. Like, yeah, to get this understanding of, 524 00:32:01,400 --> 00:32:04,480 Speaker 1: you know, what kind of genes lead to certain kinds 525 00:32:04,480 --> 00:32:06,960 Speaker 1: of diseases, so that we can treat people on an 526 00:32:06,960 --> 00:32:10,520 Speaker 1: individual basis when we stumble across that same genome in 527 00:32:10,600 --> 00:32:13,200 Speaker 1: a person later, you have to have 528 00:32:13,240 --> 00:32:15,640 Speaker 1: a big database of genes. So where do you 529 00:32:15,640 --> 00:32:18,800 Speaker 1: get it? Twenty three and Me, that's apparently where you 530 00:32:18,840 --> 00:32:21,440 Speaker 1: go get it. It sounds like Forever Twenty One, like 531 00:32:21,440 --> 00:32:25,760 Speaker 1: a mall store. And Me. Uh, yeah.
They are a 532 00:32:25,760 --> 00:32:28,840 Speaker 1: company now, and the leading company, I think, for the 533 00:32:28,920 --> 00:32:33,800 Speaker 1: personal genome test market. And how they're making their money 534 00:32:33,800 --> 00:32:36,200 Speaker 1: now is not by selling these test kits, which is 535 00:32:36,280 --> 00:32:40,080 Speaker 1: ninety-nine bucks, which supposedly they were selling at a loss, right, 536 00:32:40,160 --> 00:32:42,560 Speaker 1: so they could eventually have this database that they could 537 00:32:42,600 --> 00:32:47,440 Speaker 1: then sell to whoever, not whoever, but namely, like, pharma 538 00:32:47,440 --> 00:32:51,840 Speaker 1: companies and people doing research. So twenty three 539 00:32:51,840 --> 00:32:54,040 Speaker 1: and Me amassed a database of, I think, about eight 540 00:32:54,080 --> 00:32:58,080 Speaker 1: hundred thousand people. Six hundred thousand people who took the 541 00:32:58,120 --> 00:33:00,240 Speaker 1: twenty three and Me test and paid ninety-nine bucks 542 00:33:00,280 --> 00:33:04,520 Speaker 1: for it agreed to donate their DNA, their, their 543 00:33:04,560 --> 00:33:08,040 Speaker 1: genome, to research. Right. So twenty three and Me said, 544 00:33:08,040 --> 00:33:11,040 Speaker 1: thanks a lot, guys. Now we have six hundred thousand 545 00:33:11,040 --> 00:33:15,200 Speaker 1: individuals' genomes just sitting there waiting to be analyzed. And 546 00:33:15,400 --> 00:33:17,680 Speaker 1: very recently they closed a deal with a company called 547 00:33:17,760 --> 00:33:21,560 Speaker 1: Genentech. Genentech paid twenty three and Me sixty 548 00:33:21,680 --> 00:33:27,480 Speaker 1: million dollars just to analyze three thousand Parkinson's patients' genomes. 549 00:33:27,520 --> 00:33:29,920 Speaker 1: That's why they were selling the kits at a loss. Yes, 550 00:33:30,040 --> 00:33:33,400 Speaker 1: because they knew the big payoff was in something else entirely.
Yeah, 551 00:33:33,560 --> 00:33:36,280 Speaker 1: and, um, they're, from what I read in the, 552 00:33:36,600 --> 00:33:42,000 Speaker 1: UM, MIT Technology Review article, um, the 553 00:33:42,080 --> 00:33:43,920 Speaker 1: twenty three and Me, you shouldn't paint them, and I 554 00:33:43,960 --> 00:33:46,360 Speaker 1: don't mean to paint them as nefarious or anything like that. 555 00:33:46,600 --> 00:33:51,040 Speaker 1: But there's a guy named, um, Charles Seife who writes 556 00:33:51,080 --> 00:33:53,640 Speaker 1: for Scientific American. In two thousand thirteen, he called the 557 00:33:53,680 --> 00:33:58,680 Speaker 1: idea of a private company amassing a private database of 558 00:33:59,000 --> 00:34:03,040 Speaker 1: human genomes, yeah, terrifying. Yeah, I mean, it definitely is 559 00:34:03,080 --> 00:34:06,680 Speaker 1: like the stuff of science fiction movies. I couldn't decide 560 00:34:06,680 --> 00:34:09,520 Speaker 1: whether it was bad or not. I think 561 00:34:09,760 --> 00:34:12,360 Speaker 1: what people are most concerned about is, like, well, what 562 00:34:12,440 --> 00:34:14,359 Speaker 1: happens in the future? Or what if it becomes just 563 00:34:14,400 --> 00:34:16,960 Speaker 1: like Facebook, where they have the rights to sell your 564 00:34:16,960 --> 00:34:19,520 Speaker 1: personal information to whoever wants it? That's exactly what it is. 565 00:34:19,560 --> 00:34:23,120 Speaker 1: So Facebook data mines your behavior, and you get to 566 00:34:23,160 --> 00:34:26,239 Speaker 1: use their application for free. Twenty three and Me analyzed 567 00:34:26,320 --> 00:34:29,080 Speaker 1: your DNA and sent you some stuff back 568 00:34:29,280 --> 00:34:35,239 Speaker 1: for ninety-nine bucks, and they're data mining your genes. It's the 569 00:34:35,360 --> 00:34:38,520 Speaker 1: same thing as Facebook.
It's just, instead of behavior, they're 570 00:34:38,520 --> 00:34:41,520 Speaker 1: analyzing genes. They're data mining, or amassing a database of 571 00:34:41,600 --> 00:34:43,759 Speaker 1: it, for sale. But right now they're saying, yeah, 572 00:34:43,800 --> 00:34:45,640 Speaker 1: we're selling it to researchers who are out to make 573 00:34:45,680 --> 00:34:48,160 Speaker 1: medicines to make people better. Yeah, and you can't 574 00:34:48,200 --> 00:34:50,600 Speaker 1: really argue with that. It's just the potential for it. 575 00:34:50,640 --> 00:34:52,799 Speaker 1: You can understand how somebody could, 576 00:34:52,800 --> 00:34:55,239 Speaker 1: could be made, very uncomfortable by that. Yeah. The 577 00:34:55,280 --> 00:34:59,439 Speaker 1: evil overlord son of the current head of twenty three 578 00:34:59,440 --> 00:35:01,320 Speaker 1: and Me is the one who will do it. Well, the founder. 579 00:35:02,280 --> 00:35:04,839 Speaker 1: The founder used to be married to Sergey Brin of, 580 00:35:04,960 --> 00:35:08,520 Speaker 1: um, of Google. Yeah. I think they since split up, 581 00:35:08,880 --> 00:35:11,960 Speaker 1: but she still is the founder, and I believe the 582 00:35:11,960 --> 00:35:15,600 Speaker 1: person who's running twenty three and Me. Hopefully she subscribes 583 00:35:15,640 --> 00:35:18,600 Speaker 1: to that Don't Be Evil thing too. Seriously, if you 584 00:35:18,640 --> 00:35:21,240 Speaker 1: want to know more about personalized medicine, we should probably 585 00:35:21,239 --> 00:35:25,160 Speaker 1: revisit this every six months, I think, Chuck. Um, you 586 00:35:25,200 --> 00:35:27,359 Speaker 1: can type those words into the search bar at how 587 00:35:27,400 --> 00:35:30,240 Speaker 1: stuff works dot com. You should also check out these, 588 00:35:30,400 --> 00:35:34,239 Speaker 1: um, awesome episodes: Your Limbs Torn Off?
Now What? 589 00:35:34,640 --> 00:35:40,680 Speaker 1: Can Your Grandfather's Diet Shorten Your Own Life? Um, and yeah, Blood, 590 00:35:40,840 --> 00:35:43,520 Speaker 1: that was a good one. And then, um, Will Computers 591 00:35:43,520 --> 00:35:46,800 Speaker 1: Replace My Doctor? If this episode floated your boat, you 592 00:35:46,840 --> 00:35:49,080 Speaker 1: will love those too. And I said float your boat, 593 00:35:49,120 --> 00:35:53,759 Speaker 1: which means it's time for listener mail. That means it's 594 00:35:53,760 --> 00:35:57,120 Speaker 1: almost time for Billy Joel doo wop. I'm gonna call 595 00:35:57,160 --> 00:36:01,080 Speaker 1: this Satanic Panic Movies. Hey, guys, my wife Jody and 596 00:36:01,080 --> 00:36:03,360 Speaker 1: I just listened to the episode on Satanic Panic, and 597 00:36:03,360 --> 00:36:06,239 Speaker 1: we loved it and reminisced about our childhoods. We were 598 00:36:06,239 --> 00:36:08,680 Speaker 1: both children of the eighties, and, uh, she remembers all 599 00:36:08,680 --> 00:36:11,640 Speaker 1: the daytime talk shows about Satanic Panic. We both had 600 00:36:11,640 --> 00:36:13,920 Speaker 1: no idea it was taken so seriously by so many people. 601 00:36:14,360 --> 00:36:16,320 Speaker 1: For me, I always assumed that stuff was just legend, 602 00:36:16,400 --> 00:36:19,520 Speaker 1: although there was a Devil's Drive street in my own 603 00:36:19,560 --> 00:36:21,480 Speaker 1: town growing up that kept all us ten-year-olds 604 00:36:21,480 --> 00:36:24,719 Speaker 1: spooked into our teenage years. Uh, and it was a 605 00:36:24,800 --> 00:36:26,719 Speaker 1: rite of passage when you finally got your license to 606 00:36:26,800 --> 00:36:30,520 Speaker 1: drive down that street.
Mostly, I remember Satanism through movies 607 00:36:30,520 --> 00:36:34,920 Speaker 1: and pop culture, though. Given your penchant for cinema, or 608 00:36:34,960 --> 00:36:38,160 Speaker 1: cinema tangents, we were both expecting to hear more on 609 00:36:38,200 --> 00:36:42,239 Speaker 1: that topic in this episode. Agreed. Here's my top ten 610 00:36:42,280 --> 00:36:46,440 Speaker 1: list of mainstream eighties Satanic Panic movies. Number ten, Dragnet. 611 00:36:47,760 --> 00:36:50,839 Speaker 1: Number nine, The Golden Child. He said this one does 612 00:36:50,880 --> 00:36:53,080 Speaker 1: not hold up well. I'm sorry to hear that it 613 00:36:53,160 --> 00:36:57,000 Speaker 1: didn't hold up well. Number eight, Children of the Corn. Uh, seven, 614 00:36:57,000 --> 00:37:01,200 Speaker 1: The Witches of Eastwick. Six, every popular horror movie in 615 00:37:01,200 --> 00:37:03,520 Speaker 1: the eighties, right? Friday the Thirteenth, Nightmare on Elm Street, Halloween. I 616 00:37:03,680 --> 00:37:06,200 Speaker 1: take issue with that, man. Nightmare on Elm Street 617 00:37:06,880 --> 00:37:08,880 Speaker 1: maybe, but Friday the Thirteenth is certainly not. Those guys are 618 00:37:08,920 --> 00:37:13,000 Speaker 1: just creepy killer guys, slashers. Come on. Number five, 619 00:37:13,080 --> 00:37:17,920 Speaker 1: The Burbs. Yeah. Number four, The Evil Dead series. No? 620 00:37:18,840 --> 00:37:22,320 Speaker 1: Number three, Indiana Jones and the Temple of Doom. Ritual sacrifice, 621 00:37:22,480 --> 00:37:25,520 Speaker 1: I'll give him that. Yeah, but that's not Satanist. I think 622 00:37:25,560 --> 00:37:30,520 Speaker 1: it's just broader. Number two, Poltergeist. No, no, not even close. 623 00:37:31,520 --> 00:37:33,640 Speaker 1: Number one. I don't think he asked which ones you're 624 00:37:33,640 --> 00:37:37,880 Speaker 1: gonna say don't belong. Number one, Young Sherlock Holmes.
625 00:37:37,960 --> 00:37:40,239 Speaker 1: I love that movie, but I don't remember much about it. 626 00:37:40,360 --> 00:37:42,840 Speaker 1: Oh yeah, there was a whole It was very It 627 00:37:42,960 --> 00:37:46,360 Speaker 1: was more like Indiana Jones and the templeo Toom was 628 00:37:46,440 --> 00:37:51,200 Speaker 1: like a ancient egypt worshiping Victorian cult. That was cool. 629 00:37:51,280 --> 00:37:53,960 Speaker 1: I saw it like in the last year or so. Really, 630 00:37:54,160 --> 00:37:56,279 Speaker 1: I remember enjoying it when I was red. Where did 631 00:37:56,320 --> 00:37:59,880 Speaker 1: that guy go? No idea, I was wondering that myself. Uh. 632 00:38:00,000 --> 00:38:03,160 Speaker 1: Thanks for an amazingly delightful and consistently entertaining podcast. Guys. 633 00:38:03,200 --> 00:38:05,719 Speaker 1: We came out to your Boston show and absolutely loved it. 634 00:38:06,400 --> 00:38:09,960 Speaker 1: Happy New Year. That is from Brian Gladstein of Framing 635 00:38:10,120 --> 00:38:12,960 Speaker 1: m Massachusetts. Thanks Brian, thank you for half of that 636 00:38:13,400 --> 00:38:15,920 Speaker 1: list you send as well. We appreciate it. If you 637 00:38:16,120 --> 00:38:18,120 Speaker 1: want to get in touch with us, send us a 638 00:38:18,200 --> 00:38:20,799 Speaker 1: list that we may or may not trash. You can 639 00:38:20,960 --> 00:38:23,360 Speaker 1: tweet to us at s Y s K podcast. You 640 00:38:23,440 --> 00:38:25,560 Speaker 1: can join us on Facebook dot com slash stuff you 641 00:38:25,560 --> 00:38:27,800 Speaker 1: Should Know. You can send us an email to Stuff 642 00:38:27,840 --> 00:38:30,160 Speaker 1: Podcast to how stuff Works dot com and has always 643 00:38:30,239 --> 00:38:32,040 Speaker 1: joined us at our home on the web, Stuff you 644 00:38:32,080 --> 00:38:39,560 Speaker 1: Should Know dot com. 
For more on this and thousands 645 00:38:39,600 --> 00:38:41,960 Speaker 1: of other topics, visit how stuff works dot com