Speaker 1: Welcome to Stuff You Should Know from HowStuffWorks.com. Hey, and welcome to the podcast. I'm Josh Clark. There's Charles W. "Chuck" Bryant. This is Stuff You Should Know, the podcast. This will be done all in singsong. You don't want to hear me sing songs, because I've heard he sings songs. Your heart will melt, glasses will break, my heart will go on, birds will... sure. And, uh, grown men will kiss each other on the mouth. So inspiring. That'd be pretty cool if you could do that just by singing. Yeah, like, you, kiss him now; glass, you, break. What you're talking about is Chevy Chase in Modern Problems. I never saw that one. He didn't have to sing, but he gained telekinetic abilities. You could make things happen just by thinking them. I shouldn't even say I never saw that one; that's not that much of a surprise. Everyone knows how my dad raised me. But I've never even heard of that one. Eighties movie, Modern Problems. It was very dumb, but it was one of those early HBO movies for me.
Speaker 1: So I just sat around and watched it, like, constantly, and it had a couple of dirty jokes. Oh, I got you. Yeah. Um, yeah, how did we get on that? It was me. So, Chuck. Yes, Josh. We are now, um, friends, I guess you could say. After four years, we finally crossed that cusp. No, I'm not talking about us. We're not friends. Still... you and I are friends. Yes, we are. We're friends with Science Channel, and, um, as such, we are pretty excited that they have something going on pretty soon. Yeah, and this relates to our podcast topic, which is the reason, the very reason, we chose this podcast topic. Um, Science Channel is, um, bringing Fringe, the cult classic television show Fringe, um, to its airwaves starting November twentieth. They're gonna show all five seasons. And, uh, we were even lucky enough to meet the guy, the guy, at Comic-Con. Yeah, not Joshua Jackson. Dr. Fringe. Yeah, that's not his name. Mr. Noble. And he was very nice. And, uh, because he's not just on Fringe, he's on Dark Matters too. He's the host of Dark Matters on Science Channel. But I personally watched Fringe.
Speaker 1: I watched all of the first season. Me and Emily did. Emily and I, excuse me. And, um, I really enjoyed it, and it was just one of those things where I didn't watch season two and beyond, because, you know, life got in the way or something. But it wasn't because I didn't like it. It was really good. It sort of had an X-Files vibe. Oh yeah, and the twists that they managed to work in there. Yeah, but the science was more predominant. So I liked it. And I'm totally gonna watch the seasons. We'll probably start with season one again. Well, then you should tune in in November. And I'm gonna watch seasons two through five now on Science Channel. I'm pretty stoked about that. Right. Um, and, uh, well, yeah, in honor of Fringe, we chose kind of a fringe science topic. You did. That's right. I think it's a good one, Chuck. Yeah. Um, Designing Women... I mean, designing children. Where did that show... Oh yeah, man, what was the one that followed it, with Burt Reynolds? Evening Shade. Was that tied... was that a spinoff? No, but I think they were packaged together.
Speaker 1: It's like Tuesday Night Redneck Hour. Sugarbakers. Yeah, right. Yeah, I watched Designing Women. I didn't watch Evening Shade, though, which surprises me, because I love Burt Reynolds. Yeah, I didn't see it either. But yeah, Designing Women, it was a good show. Um, but we're talking about designing children, and not just designing children: designer children. That's right. Uh, the idea that one day, in the very, very, very near future, um, we'll be able to make kids ready to order, made to order. Yeah, in certain ways. Like, I think right now we have the ability to select eye and hair color, but they're just not doing it yet, because they tried that in Los Angeles and people were like, whoa, whoa, whoa. Do you remember that? Yeah. We don't like this; we shouldn't be doing this. And I think that's really significant, that that happened. Like, the first real commercial attempt at basically just saying, hey, you want a blond kid, we can give you a blond kid, um, received public outcry, international outcry, so much of it that the people were like, okay, sorry we opened our mouths; just forget we said anything.
Speaker 1: Here's your brunette, you know. Yeah, exactly. Roll the dice, jerks. We don't care. Um, but I find that significant, you know, because I wonder, um, you know, how it's gonna go when it does become really commercially viable to really make your kid a different person than they would have been naturally. Really, like, how people will accept that. Yeah. I mean, this is the stuff of science fiction that is really happening now. Um, the movie Gattaca, which is referenced in this article. Really good movie. It is really good. Yeah, man, it's good. It's like a thinking man's science... uh, I don't know if you'd call it a thriller. Maybe a thriller. Intrigue, at the very least. But yeah, it's good. And basically the synopsis there, without spoiling anything, is that in the not-too-distant future, we are able, in Gattaca, to build designer children that will grow into designer adults that are, like, disease-free and highly athletic and very intelligent. And then the rest of the schlubs of the world are, you know... sorry, T.S. for them, basically. Go on with your potato chips. So see, Gattaca's a good one. Like it?
Speaker 1: Yeah, every time you say, like, "this isn't gonna spoil things..." I try really hard not to spoil. I was watching an episode of Breaking Bad the other day, Umi and I were, and, um, I was like, how do I know what's about to happen? And then I was like... Chuck. Yeah. The math episode. Oh, I thought you meant the one in Breaking Bad that was about meth. I was like, they're all about meth, right? Right. I didn't spoil that. Yeah, I got you. Yeah. All right. So sorry, that was not a spoiler for Gattaca, though. Um, so, Chuck. Yes. Let's talk about all this. You were saying, in two thousand nine, people came out against that fertility clinic in Los Angeles. That's a good example of a commercial business saying, hey, we can do this now. A good example of a government saying, hey, you can do this now and we need to do something about it, was the UK proposing a bill, and we couldn't find out whether it passed or not. So frustrating how hard it is to find out things like this sometimes.
Speaker 1: Um, if you write an article about something that is big enough for somebody else to use in an article, you better follow up. Yeah, that's your duty, journalists. So what we know is it was protested, at least. So I'm not sure if it went through. Well, and the reason it was protested... it was protested, um, largely by the deaf and hard-of-hearing community over there, because this bill would have, or did, prohibit, um, selecting kids for disease or disability. Right. So it allowed you to select against that. So if you have a kid that has a disability, you can be like, I don't want that kid. But it prohibited selecting for them. And the deaf community said, hey, um, if hearing parents can select hearing kids, deaf parents should be able to select deaf kids. So if you're gonna call deafness a disability, you need to change this bill. Which is a pretty cool thing to protest, if you ask me. I don't know how I felt about it. I thought, at first, wow, this is gonna be a good one, and then I was utterly confused.
Speaker 1: I was like, why would you want your child to be deaf and be at a disadvantage straight out of the gate in life? But then I thought, well, is it a disadvantage? Exactly. That is a great, great question. So I don't know. That's where I ended up. Did you know that ... percent (that's the highest, most recent figure I've seen) of all Down syndrome fetuses are aborted? I believe that. Right. And that's the same question. It's like, some people are like, well, why would you want your kid to be disadvantaged? And these are, or are not, selected through IVF; they're aborted, um. And so some people would say, why would you want your kid to have... you know, your kid's going to have a disadvantage. And other people say, like, have you ever met a person with Down syndrome? Like, they're pretty awesome people, you know. And I think that that is... that's just one argument throughout this idea of designing children. Savior siblings were also included in the bill in England: let parents select embryos that would make suitable savior siblings. Very controversial. I read a couple of articles on this.
Speaker 1: Savior siblings are basically kids that you conceive initially with the purpose of being able to act as donors for their older brother or sister. Like, the kid that you love is born with, like, bad kidneys. Have another kid that's gonna be a suitable tissue donor, because you know ahead of time, before the kid's even born, that it will be. Yeah, so that they can give them one of their kidneys. Yeah, there was an article that I read where these parents had had a quote-unquote savior child, and, you know, they said, what we ended up using was a teaspoon of umbilical blood that would have been thrown in the trash, and that's what saved our other kid's life. And this is not the designer child; it's not some freak of science. This is the reason we have this child, but it doesn't make it any less valid. So I think ultimately it's how you treat the child after they're born. Well, you treat them as, like, your regular child. Or you put them in a closet and wait for the kidney. Sure, you'd hope.
Speaker 1: But at the same time, I mean, like, you can go down the road and say, well, having a savior sibling is also having a kid to strip for parts. Yeah. You know, there's another interpretation of the whole thing. So, I mean, like, if you are going to have your kid like that, is it valid for society to be like, well, you can't do that? How are you going to treat your kid afterwards? Like, is that one of the worries, how they would treat the kid? I've never heard that as an argument. Yeah, because it's not like you can have a kid, harvest all of their organs, and kill them. Of course not. So I think it's, like, the effect, the impact, it's going to have on that child and their own identity as, like, a human being, a unique, individual human being, rather than a walking organ bank for their brother. I would think I would appreciate that: growing up knowing that I was born with a higher purpose of potentially saving my older brother if he ever needed it.
Speaker 1: You know, sure, depending on how you're raised. You know, are you like Danny DeVito in Twins, right? Or are you, like, the savior sibling? That's a great way to put it: like, you're the savior of this other sibling. It just all depends, to me, on how the parents raise those children in that abnormal dynamic that's fostered through our technology. Yeah. I can't imagine, though, that a parent who would care enough about their one child to have another to save them would mistreat or shun the other child in any way. That just didn't make sense to me. I just opened my hands in a gesture of "I don't know," everybody. All right, um, so let's talk about this, Chuck. Let's talk genetics for a little bit. I had to go back and do some, um, Genetics 101 priming, um, and as I did, I realized that I wasn't going back and remembering it; I was teaching myself for the first time, in a lot of ways. I've never really gotten genetics.
Speaker 1: Even though it's so straightforward and cut-and-dried, there's always, like... even if you read this designer children article, like, these are two of our best writers, and, like, it just doesn't come across quite right. Maybe it's just me. So, like, hit me with numbers. So, back in two thousand three, um, the Human Genome Project announced that it had fulfilled its destiny and successfully mapped the human genome. And the human genome is the sum total of the information contained in human DNA. That's right. Right. Um, you're gonna say the word... what DNA stands for? Oh, deoxyribonucleic acid. I would say "dee-oxy." Yeah, I've always heard "dye-oxy," and then I was looking at that E. So either way. Well done, though. DNA. And DNA is simply a couple of strands of sugar and phosphate that form a helix, a double helix, and they're joined by what look like rungs on a ladder. Yeah, okay. Um, and these rungs are made up of nucleotides, one coming off of each of these strands, the little twisty ladder that we all love. Now, um...
Speaker 1: So the rungs of the ladder are made of these nucleotides, and when they come together, one on each side, they form this full rung, and those are called base pairs. And there are four types of nucleotides, right? That's right. There's, uh, adenine, cytosine, thymine, and guanine, and you put them together, and what you come up with, ultimately, is a four-letter language for the blueprint of an organism. Pretty cool. Not just making an organism, but maintaining it as well. And if you look along the strand of DNA, you're going to find little segments where this combination, if read by a ribosome, uh, can be used to explain how a cell can make a certain kind of protein, usually about three proteins on average. And proteins are what are used as the building blocks of cellular life and its functions. Like, everything from our behavior to, like, the structure of your eye is based on proteins. Right. And your genes, these little segments that are encoded along the DNA, um, that express these proteins, are blueprints for how to express the proteins. That's how they're made.
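The pairing rule described here, adenine with thymine and cytosine with guanine, one nucleotide from each strand forming a rung, can be sketched in a few lines of Python. The strand string is made up purely for illustration:

```python
# Each rung of the DNA ladder is a base pair: A pairs with T, C pairs with G.
PAIRING = {"A": "T", "T": "A", "C": "G", "G": "C"}

def complementary_strand(strand: str) -> str:
    """Return the strand that pairs with `strand`, rung by rung."""
    return "".join(PAIRING[base] for base in strand.upper())

print(complementary_strand("ATTACG"))  # -> TAATGC
```

Pairing the result a second time gives back the original strand, which is essentially how the double helix lets a cell copy its own blueprint.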
Speaker 1: That's what they do. That's right. Right, that's right. We have, between... as a human. Everybody's so happy. You just do that? Human. That just messed it up. Hmm, how about that? Had you been designed properly, it wouldn't have happened. So, what I thought was interesting: out of the three billion base pairs, it's about half and half, useful DNA and the rest junk DNA. Yeah, junk DNA. And they don't think that it's, like, junk DNA as in it's totally useless. They think that possibly we haven't found the use yet. Or they think that possibly one of the uses is that it tells, um, like, stop. Like, here's where this gene stops. Or, um, this is how much of this protein you should express, in this adjacent gene that this junk DNA is next to. Or it just provides, like, structure, like, actual structure, to the double helix. It's also possible that this is just, um, DNA left over that was deposited by viruses eons ago that don't express themselves any longer, because that's what viruses do: they insert their own DNA and RNA.
Speaker 1: That's right. They, um... So you've got this DNA; it's making up chromosomes. You've got twenty-three in your body. And as complex and as massive as this sounds, Chuck... twenty-three pairs. Pairs, thank you. Um, as massive and complex as this whole thing sounds, every cell, except for a mature red blood cell, has a full human genome, and many chromosomes, in it. Every cell. And that's just in the nucleus. That's crazy. It is crazy. So you've got all this. We've got a pretty good handle on this, the human genome. We've mapped it. Now we go back and figure out where the genes are. And they used to think that it was, like, ninety-six percent of DNA was junk, and then they found out that, like, if you look at the human genome, some areas are gene-rich, there's a lot of genes; other areas are gene deserts, where there's very few. Right. We have to go back and look at this map and basically crack this code of this four-letter language and figure out what genes are, what they do, and then ultimately how to manipulate them.
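The gene-rich region versus gene desert idea can be pictured with a toy sketch: given gene start positions along a chromosome, count how many genes land in each fixed-size window. The positions and window size below are invented for illustration, not real genome data:

```python
from collections import Counter

WINDOW = 1_000_000  # one-megabase windows; an arbitrary choice for this sketch

def gene_density(gene_starts, window=WINDOW):
    """Map window index -> number of genes starting in that window."""
    return dict(Counter(pos // window for pos in gene_starts))

# Hypothetical positions: a small cluster early on, then a long empty stretch.
starts = [120_000, 180_000, 450_000, 900_000, 7_500_000]
print(gene_density(starts))  # window 0 is gene-rich; windows 1-6 are a "desert"
```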
Speaker 1: And once we do that, we effectively have taken humanity out of evolution. That's right. Scary, isn't it? I think so. Our friend David Pearce would beg to differ. Who? The guy from the happiness audiobook. We should ratchet up human happiness because we can. Boy, that's an interesting argument, that's all I'm gonna say. So, um, we're already kind of at a very primitive form of this, aren't we? Uh, well, yeah, like I said, we feasibly could choose eye color and hair color if we wanted to. Um, and then one thing we can definitely do is... well, I guess we should explain IVF for those of you who don't know. Um, in ... we first performed in vitro fertilization, which basically means, when you're a couple and you're having trouble having a kid, there are a bunch of different steps you can take, a bunch of different routes you can take, and one of them is IVF, which means you take the sperm from the man, the egg from the lady, and you get them together outside of the human body to form a zygote, and then you put it back in the woman, uh, and then she takes it from there.
Speaker 1: And it can be expensive. It can be very hard on the woman, um, on her body, and emotionally. Emotionally, I think it's probably hard on the couple, but dudes aren't pumped full of hormones, you know. Um, so that is what IVF is, and that is one way that you can have a baby if you're having trouble having babies. Uh, with IVF came something called preimplantation genetic diagnosis, PGD, which basically means, hey, we can look at your stuff here, and if you are predisposed in your family to certain things, like hemophilia A, Down syndrome, Tay-Sachs disease, we can stop this process now and try again. Right, we can screen for it. Yeah. Fine. And some of it's intuitive. Like, with hemophilia A, uh, if you and your husband both have that, that usually tends to strike boys more than girls, so they'd probably not use embryos that were male, likely male; they would use female embryos instead. Which brings up the sticky point of choosing your gender.
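The select-against logic described for PGD can be sketched as a simple filter. Everything here is hypothetical for illustration: the embryo records, the `F8_mut` variant label, and the function name are made up, not a real clinical procedure or API:

```python
# Keep only embryos whose screening found no flagged variant and, optionally,
# that are of a preferred sex (e.g., female for hemophilia A, which tends to
# strike boys more than girls).
def select_embryos(embryos, flagged_variant, required_sex=None):
    keep = []
    for e in embryos:
        if flagged_variant in e["variants"]:
            continue  # select against the disease variant
        if required_sex is not None and e["sex"] != required_sex:
            continue
        keep.append(e["id"])
    return keep

embryos = [
    {"id": 1, "sex": "M", "variants": {"F8_mut"}},  # carries the flagged variant
    {"id": 2, "sex": "F", "variants": set()},
    {"id": 3, "sex": "M", "variants": set()},
]
print(select_embryos(embryos, "F8_mut", required_sex="F"))  # -> [2]
```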
Yeah, um, 330 00:19:31,400 --> 00:19:34,879 Speaker 1: for some others, you can find evidence of 331 00:19:34,880 --> 00:19:39,080 Speaker 1: that disease, that, say, um, I guess malfunctioning gene that 332 00:19:39,240 --> 00:19:43,320 Speaker 1: creates that disease, because that's what a disease is, um, and 333 00:19:43,720 --> 00:19:46,600 Speaker 1: not use those embryos either. So we are 334 00:19:46,680 --> 00:19:50,080 Speaker 1: kind of at this primitive state. But it's selective, it is, 335 00:19:50,240 --> 00:19:54,520 Speaker 1: and these are tough decisions that couples face in life. Uh, 336 00:19:54,600 --> 00:19:56,919 Speaker 1: a lot of thought should go into this if you're 337 00:19:56,960 --> 00:20:01,920 Speaker 1: out there going through this process. It's, um, it ain't easy. 338 00:20:02,040 --> 00:20:03,800 Speaker 1: And don't let anyone else tell you what you should 339 00:20:03,840 --> 00:20:05,720 Speaker 1: or shouldn't do. You know, it's a personal thing. Oh 340 00:20:05,800 --> 00:20:09,080 Speaker 1: yeah, sure. Um, so that's what's going on 341 00:20:09,080 --> 00:20:13,800 Speaker 1: on the IVF tip. So yeah, the point is, 342 00:20:14,040 --> 00:20:19,040 Speaker 1: from that came, um, PGD, preimplantation genetic diagnosis, 343 00:20:19,080 --> 00:20:22,600 Speaker 1: which is, um, kind of right now the most widely 344 00:20:22,640 --> 00:20:27,600 Speaker 1: available type of genetic engineering for couples looking to have 345 00:20:27,640 --> 00:20:29,840 Speaker 1: a baby, right, right. And like we said, the sticky 346 00:20:29,880 --> 00:20:32,119 Speaker 1: point of potentially being able to choose your gender if 347 00:20:32,119 --> 00:20:34,040 Speaker 1: you really want a boy: you've got three girls, and 348 00:20:34,200 --> 00:20:36,840 Speaker 1: man, I really wanted a boy. Um.
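To make the screening idea concrete: PGD, as described here, amounts to checking each embryo for flagged variants and setting aside the ones that carry them. This is only a toy sketch; the embryo records, the marker names, and the idea of a simple lookup are all invented for illustration, since real PGD is a lab procedure:

```python
# Minimal sketch of the PGD screening idea: keep only embryos that carry
# none of the flagged disease variants. All records below are made up.

embryos = [
    {"id": 1, "sex": "F", "variants": set()},
    {"id": 2, "sex": "M", "variants": {"F8:hemophilia_a"}},
    {"id": 3, "sex": "F", "variants": {"HEXA:tay_sachs"}},
    {"id": 4, "sex": "M", "variants": set()},
]

flagged = {"F8:hemophilia_a", "HEXA:tay_sachs"}

def usable(embryo):
    """True if the embryo carries none of the flagged variants."""
    return not (embryo["variants"] & flagged)

candidates = [e["id"] for e in embryos if usable(e)]
print(candidates)  # -> [1, 4]
```

The "sticky point" in the discussion is visible even in the toy version: the same record that holds the variants also holds the sex, so the exact same filter could select for a boy or a girl.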
And then in 349 00:20:36,920 --> 00:20:39,760 Speaker 1: countries like China, where they definitely want boys, it's like 350 00:20:40,119 --> 00:20:43,120 Speaker 1: this could be the future that might upset the balance 351 00:20:43,240 --> 00:20:47,360 Speaker 1: of nature and how many boys and girls are 352 00:20:47,359 --> 00:20:49,959 Speaker 1: born, and what does that mean for the future? So 353 00:20:50,119 --> 00:20:54,159 Speaker 1: I heard, um, to have a soft landing from 354 00:20:54,160 --> 00:20:56,720 Speaker 1: their one-child policy, which they're now starting to, like, 355 00:20:56,920 --> 00:21:01,800 Speaker 1: relax, they should have stopped it about twenty years ago. Yeah. Yeah. 356 00:21:02,200 --> 00:21:04,199 Speaker 1: Does that mean they're in trouble? Does that mean we 357 00:21:04,200 --> 00:21:07,000 Speaker 1: need to follow up, like we recommended writers do on 358 00:21:07,000 --> 00:21:11,840 Speaker 1: our podcast? Sure, I guess you just did. Oh, okay, 359 00:21:11,880 --> 00:21:16,680 Speaker 1: there, it was heard. Um. So, Chuck, we've got this, we've 360 00:21:16,680 --> 00:21:19,440 Speaker 1: got this genetic screening. That's one way to do it. 361 00:21:19,680 --> 00:21:23,440 Speaker 1: There's also another way, um, that is a little further 362 00:21:23,520 --> 00:21:27,119 Speaker 1: out as far as humans go, um, and that is 363 00:21:27,760 --> 00:21:30,800 Speaker 1: transgenic therapy, which is where you take the gene of 364 00:21:30,840 --> 00:21:36,000 Speaker 1: something else that has a desirable trait and insert it into 365 00:21:36,040 --> 00:21:40,840 Speaker 1: the human. Right. So that is actually adding a gene, 366 00:21:41,040 --> 00:21:43,439 Speaker 1: right, where what we've been talking about to this point 367 00:21:43,600 --> 00:21:48,400 Speaker 1: is unnatural selection. But it's been selection.
It's like, this 368 00:21:48,440 --> 00:21:52,360 Speaker 1: is, this appeared naturally, but we're gonna take 369 00:21:52,400 --> 00:21:55,840 Speaker 1: away all of the other... We're gonna reduce the chances 370 00:21:55,880 --> 00:21:58,399 Speaker 1: that it won't happen, or we're gonna increase the chances 371 00:21:58,440 --> 00:22:02,400 Speaker 1: that it will happen. This is straight-up copying and 372 00:22:02,480 --> 00:22:07,399 Speaker 1: pasting, or cutting and pasting, genes to create something desirable 373 00:22:07,480 --> 00:22:10,760 Speaker 1: or new. That's right. And they already do this in animals, um. 374 00:22:11,560 --> 00:22:14,120 Speaker 1: So, you know, if you can do it in animals, 375 00:22:14,160 --> 00:22:15,600 Speaker 1: it's not gonna be long before you can do it 376 00:22:15,640 --> 00:22:22,000 Speaker 1: with humans. And, uh, long term, maybe that means we 377 00:22:22,040 --> 00:22:26,240 Speaker 1: can eliminate certain diseases by correcting this stuff along the way, 378 00:22:26,760 --> 00:22:30,560 Speaker 1: right, like before it happens. So that could be good. 379 00:22:31,000 --> 00:22:34,640 Speaker 1: So when you take a gene from one animal 380 00:22:35,160 --> 00:22:38,720 Speaker 1: and, um, plant it into another, that, um, that becomes 381 00:22:38,760 --> 00:22:42,760 Speaker 1: a transgenic animal, or a chimera, which is named after 382 00:22:42,840 --> 00:22:49,360 Speaker 1: the goat-serpent-lion, fire-breathing animal of legend from Greece. Um, 383 00:22:49,480 --> 00:22:51,600 Speaker 1: they call it a chimera, which is kind of hurtful, 384 00:22:51,600 --> 00:22:53,320 Speaker 1: I think, especially if you're a human and know what 385 00:22:53,400 --> 00:22:56,520 Speaker 1: a chimera is, and you're a chimera. I'm sure it 386 00:22:56,600 --> 00:22:59,760 Speaker 1: would probably hurt your feelings. But, um, thus far there aren't 387 00:22:59,760 --> 00:23:02,760 Speaker 1: any human chimeras as far as I know.
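The "copying and pasting genes" phrasing here is surprisingly literal. As a purely illustrative toy, not how any real lab tool works, you can model genomes as dicts (with invented gene names and sequences) and paste one gene from donor to host:

```python
# Toy "copy and paste" of a gene between genomes. Genomes are just dicts of
# gene name -> sequence here; every name and sequence below is invented.

spider = {"silk_protein": "ATGGCTAGC"}  # hypothetical donor gene
goat = {"casein": "ATGAAACCC"}          # hypothetical host genome

def insert_transgene(host, donor, gene):
    """Return a copy of the host genome with one donor gene pasted in."""
    modified = dict(host)
    modified[gene] = donor[gene]
    return modified

transgenic_goat = insert_transgene(goat, spider, "silk_protein")
print(sorted(transgenic_goat))  # -> ['casein', 'silk_protein']
```

The original host dict is left untouched, which mirrors the point made here: the transgenic animal is a new organism carrying a gene its species never had.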
It's mostly... 388 00:23:02,960 --> 00:23:05,919 Speaker 1: the big one that we've actually talked about unknowingly before 389 00:23:06,480 --> 00:23:10,120 Speaker 1: is BioSteel, the goat with the spider gene. Remember 390 00:23:10,160 --> 00:23:11,840 Speaker 1: we were trying to figure out how they got spider 391 00:23:11,880 --> 00:23:16,280 Speaker 1: silk from a goat? And it turns out, it 392 00:23:16,280 --> 00:23:19,399 Speaker 1: turns out that spiders and goats share enough traits to 393 00:23:19,440 --> 00:23:23,560 Speaker 1: where this ultra-strong spider silk can be produced in 394 00:23:23,680 --> 00:23:26,960 Speaker 1: the goat's milk. They have similar proteins. And they said... 395 00:23:27,000 --> 00:23:30,200 Speaker 1: I don't know how they came across that. I'm sure 396 00:23:30,240 --> 00:23:31,920 Speaker 1: they had some hint. I don't know why they would 397 00:23:31,920 --> 00:23:33,640 Speaker 1: start with a goat's milk. Yeah, I don't know either, 398 00:23:33,960 --> 00:23:35,520 Speaker 1: but the protein... you know, what would happen if we 399 00:23:35,600 --> 00:23:38,320 Speaker 1: put the spider silk in this goat's milk? What rhymes 400 00:23:38,320 --> 00:23:42,640 Speaker 1: with spider silk? Goat's milk. Let's try to start there. Um, yeah, 401 00:23:42,640 --> 00:23:45,200 Speaker 1: it worked. But they figured out that, like, 402 00:23:45,240 --> 00:23:48,240 Speaker 1: the protein in spider silk is similar to a protein 403 00:23:48,280 --> 00:23:51,920 Speaker 1: in goat's milk. Identical. And, well, once you inject the 404 00:23:51,960 --> 00:23:55,879 Speaker 1: goat's genes with that spider gene, it fits like a glove.
It 405 00:23:55,960 --> 00:23:59,520 Speaker 1: starts producing a ton of that protein, um, in its milk, 406 00:23:59,560 --> 00:24:03,600 Speaker 1: and you harvest that protein and then start weaving spider 407 00:24:03,640 --> 00:24:06,000 Speaker 1: silk and make the stuff called BioSteel, which is 408 00:24:06,160 --> 00:24:09,119 Speaker 1: really, really good body armor. Yeah, and that's where we 409 00:24:09,119 --> 00:24:12,080 Speaker 1: talked about it, right? Yeah, the Body Armor podcast that you 410 00:24:12,119 --> 00:24:14,840 Speaker 1: can find on our RSS feed. Yeah, which also happened 411 00:24:14,840 --> 00:24:19,200 Speaker 1: to be our first ever, um, listener request. Oh, really? Yeah, 412 00:24:19,800 --> 00:24:24,280 Speaker 1: someone requested that, and we acquiesced, and we started getting 413 00:24:24,320 --> 00:24:29,440 Speaker 1: all these emails. Um. So the point of that is, 414 00:24:29,440 --> 00:24:33,040 Speaker 1: they're doing this in animals. There are scenarios where 415 00:24:33,359 --> 00:24:36,840 Speaker 1: we could potentially do this with humans, but, um, in 416 00:24:37,080 --> 00:24:41,280 Speaker 1: another follow-up article we read, it turns out that enhancing 417 00:24:41,280 --> 00:24:46,760 Speaker 1: ourselves genetically could eventually lead to, um, unknown consequences down 418 00:24:46,760 --> 00:24:51,120 Speaker 1: the road. Uh, specifically, in this case, we have learned 419 00:24:51,160 --> 00:24:54,240 Speaker 1: that our human brain is evolving, it's getting larger, it's 420 00:24:54,280 --> 00:24:59,120 Speaker 1: gaining more cognitive abilities as we evolve, and if you 421 00:24:59,480 --> 00:25:05,640 Speaker 1: start to tinker with natural selection via genetic modification, these 422 00:25:05,640 --> 00:25:07,560 Speaker 1: things might not show up right away. They might show 423 00:25:07,640 --> 00:25:11,000 Speaker 1: up generations later.
So you might be doing something you 424 00:25:11,000 --> 00:25:14,200 Speaker 1: think will help when, in fact, years from now, 425 00:25:14,280 --> 00:25:18,080 Speaker 1: it might keep your brain from growing like everyone else's. 426 00:25:18,640 --> 00:25:20,640 Speaker 1: And this is just one example of something that could 427 00:25:20,680 --> 00:25:25,480 Speaker 1: go wrong. Organisms evolve, right, through mutations. Well, we lack 428 00:25:25,560 --> 00:25:28,240 Speaker 1: the foresight to know what mutation will be beneficial and 429 00:25:28,320 --> 00:25:30,840 Speaker 1: what will be harmful years down the line. So even 430 00:25:30,880 --> 00:25:34,080 Speaker 1: something that may be harmful immediately, or somewhat harmful, could 431 00:25:34,080 --> 00:25:38,880 Speaker 1: be extremely beneficial decades, hundreds, thousands, millions of years from now. 432 00:25:39,600 --> 00:25:43,040 Speaker 1: We would never know. Then it's too late. Once you've done it, 433 00:25:43,160 --> 00:25:46,280 Speaker 1: yeah, you're done. No, I kind of had the impression that, like, 434 00:25:46,359 --> 00:25:52,560 Speaker 1: once you start tampering, you could conceivably, you know, keep improving, 435 00:25:52,720 --> 00:25:54,960 Speaker 1: but it would have to be constant. Well, and what 436 00:25:55,119 --> 00:25:56,800 Speaker 1: this article points out, which is a good point, is 437 00:25:56,880 --> 00:25:59,760 Speaker 1: natural selection is at its best when you've got a 438 00:25:59,800 --> 00:26:02,160 Speaker 1: large gene pool. And if you're narrowing that gene pool 439 00:26:02,600 --> 00:26:05,479 Speaker 1: for a reason you think is great, you're still narrowing 440 00:26:05,480 --> 00:26:09,840 Speaker 1: the gene pool. And I think proponents of genetic engineering 441 00:26:09,880 --> 00:26:12,440 Speaker 1: would say, well, that's fine, we're narrowing the gene pool. 442 00:26:12,440 --> 00:26:15,720 Speaker 1: Who cares?
We're taking full control of evolution, so evolution 443 00:26:15,800 --> 00:26:20,080 Speaker 1: can kiss off. But this raises all sorts of questions, 444 00:26:20,119 --> 00:26:22,240 Speaker 1: some of which we've already touched upon, but like, 445 00:26:22,520 --> 00:26:27,120 Speaker 1: who decides what's ideal? Who decides what traits are good 446 00:26:27,160 --> 00:26:31,520 Speaker 1: and what are bad? What happens when this becomes, you know, 447 00:26:31,600 --> 00:26:35,280 Speaker 1: commercially viable but still extremely expensive, and then just the 448 00:26:35,320 --> 00:26:38,639 Speaker 1: wealthy have designer children? Well, I mean, what kind of 449 00:26:38,680 --> 00:26:41,000 Speaker 1: designer children do we make? I read this 450 00:26:41,040 --> 00:26:43,520 Speaker 1: one ethicist who said that we have a moral obligation 451 00:26:44,240 --> 00:26:48,640 Speaker 1: to genetically engineer and modify our kids so that they're 452 00:26:48,680 --> 00:26:52,200 Speaker 1: not a harm to themselves or other people, which makes 453 00:26:52,240 --> 00:26:54,280 Speaker 1: a lot of sense. Like, I can see how that 454 00:26:54,400 --> 00:26:56,879 Speaker 1: is a moral obligation. Like, if you have the technology 455 00:26:56,960 --> 00:27:00,719 Speaker 1: to improve people and improve society like that, you have 456 00:27:00,960 --> 00:27:03,640 Speaker 1: to do it, you know. But then, of course, there's 457 00:27:03,680 --> 00:27:05,480 Speaker 1: like the other side. It's like, God, I don't know, 458 00:27:06,960 --> 00:27:09,920 Speaker 1: we don't really know what we're doing here. Playing God?
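The gene-pool worry from the discussion can be sketched numerically. In this toy Monte Carlo (every number invented for illustration), selecting hard for one "ideal" trait value strips out variation, and when the environment later favors a different value, the narrowed population has almost nothing left to adapt with:

```python
import random

# Toy illustration: a narrowed gene pool has less raw material for future
# adaptation. Trait values and thresholds are all made up.

random.seed(42)
population = [random.gauss(0.0, 1.0) for _ in range(10_000)]  # trait values

# "Engineered" population: keep only individuals very close to today's ideal (0).
engineered = [t for t in population if abs(t) < 0.1]

def survivors(pop, new_optimum, tolerance=0.5):
    # After an environmental shift, only traits near the new optimum survive.
    return [t for t in pop if abs(t - new_optimum) < tolerance]

frac_wild = len(survivors(population, 2.0)) / len(population)
frac_engineered = len(survivors(engineered, 2.0)) / max(len(engineered), 1)

print(frac_wild, frac_engineered)  # the narrowed pool fares far worse
```

A few percent of the diverse population happens to sit near the new optimum, while the engineered pool, clustered tightly around the old ideal, contributes essentially nothing, which is the "we lack the foresight" point in miniature.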
Yeah, 459 00:27:09,960 --> 00:27:11,760 Speaker 1: because, I mean, what happens if you make a kid 460 00:27:12,240 --> 00:27:16,680 Speaker 1: that's awful, like they're just totally 461 00:27:16,720 --> 00:27:19,120 Speaker 1: messed up, but they wouldn't have been if you hadn't 462 00:27:19,160 --> 00:27:22,040 Speaker 1: tampered with them? Who's responsible for that? And in what 463 00:27:22,080 --> 00:27:24,880 Speaker 1: ways are you responsible for it? Well, and more... well, 464 00:27:24,920 --> 00:27:27,200 Speaker 1: not more importantly, but additionally, where is the line drawn? 465 00:27:27,720 --> 00:27:30,120 Speaker 1: You know, is it okay to say, like, I'd kind 466 00:27:30,119 --> 00:27:34,040 Speaker 1: of like my baby to have blue eyes? Right, no big deal, right? Yeah? 467 00:27:34,520 --> 00:27:36,960 Speaker 1: Is that the line? Or is the line, like, um, 468 00:27:36,960 --> 00:27:41,000 Speaker 1: maybe it'd be cool if they were athletic, right, and 469 00:27:41,200 --> 00:27:45,480 Speaker 1: super smart and had blue eyes and blond hair, and 470 00:27:45,520 --> 00:27:49,400 Speaker 1: then boom, Boys from Brazil? But again, what's... what 471 00:27:49,440 --> 00:27:52,080 Speaker 1: are the problems with those things? Well, yeah, you 472 00:27:52,160 --> 00:27:54,639 Speaker 1: know it's gonna be athletic and smart, right? Or happy 473 00:27:54,720 --> 00:27:57,200 Speaker 1: is another one that David pointed out that I find 474 00:27:57,280 --> 00:28:00,240 Speaker 1: tough to disagree with. Like, if you have 475 00:28:00,320 --> 00:28:03,919 Speaker 1: the technology to make your kids happier, like, ratchet up 476 00:28:03,920 --> 00:28:06,880 Speaker 1: their baseline happiness is how he put it, why wouldn't 477 00:28:06,880 --> 00:28:09,200 Speaker 1: you do that? If you can make society a better 478 00:28:09,240 --> 00:28:12,240 Speaker 1: place because everybody's happier, why wouldn't you do it?
Well, 479 00:28:13,240 --> 00:28:15,439 Speaker 1: this is just me saying, 480 00:28:16,000 --> 00:28:19,680 Speaker 1: you know, there's always far-reaching consequences. There's always ripples 481 00:28:19,760 --> 00:28:24,720 Speaker 1: with every stone you throw in a lake, you know. Like, what 482 00:28:24,720 --> 00:28:28,680 Speaker 1: else is going to happen if everybody's happy? Are 483 00:28:28,720 --> 00:28:31,760 Speaker 1: there downfalls or setbacks? And then what's going on? 484 00:28:32,000 --> 00:28:35,760 Speaker 1: I know, it is a tricky, tricky subject. Anytime we 485 00:28:35,800 --> 00:28:38,520 Speaker 1: bring up genes, it becomes a tricky subject. It does, 486 00:28:38,960 --> 00:28:41,720 Speaker 1: which is why they're fascinating. That's right, and why people 487 00:28:41,760 --> 00:28:45,640 Speaker 1: really get up on their soapbox when they... you know, 488 00:28:46,080 --> 00:28:49,760 Speaker 1: this means a lot to a lot of people. Religious circles, 489 00:28:50,120 --> 00:28:52,959 Speaker 1: scientific circles. A lot of folks are weighing in from 490 00:28:53,000 --> 00:28:58,040 Speaker 1: different, you know, circles. But you know what that means. 491 00:28:58,720 --> 00:29:03,280 Speaker 1: That was just stupid. Okay. If you want to 492 00:29:03,360 --> 00:29:06,560 Speaker 1: learn more about genetics, HowStuffWorks is loaded with 493 00:29:07,040 --> 00:29:10,640 Speaker 1: articles on it. Yeah. Um, you can 494 00:29:10,640 --> 00:29:14,800 Speaker 1: type in genetics, genes, designer babies, whatever you want, in 495 00:29:14,840 --> 00:29:16,720 Speaker 1: the handy search bar, and it's gonna bring up some 496 00:29:16,760 --> 00:29:19,440 Speaker 1: pretty cool articles. I recommend you waste an hour 497 00:29:19,520 --> 00:29:23,120 Speaker 1: or two reading them. Um, and I said handy search bar.
498 00:29:23,160 --> 00:29:25,120 Speaker 1: I think so. That means it's time for listener mail. 499 00:29:27,160 --> 00:29:30,840 Speaker 1: That's right, Josh. I am going to call this a plug 500 00:29:30,920 --> 00:29:33,960 Speaker 1: for our friends at QSAC. Remember meeting, 501 00:29:34,080 --> 00:29:37,760 Speaker 1: uh, Sandra at trivia night in New York? She gave 502 00:29:37,800 --> 00:29:41,760 Speaker 1: us the hats, the baseball caps. Yeah. What is 503 00:29:41,880 --> 00:29:44,600 Speaker 1: QSAC? It's Quality Services for 504 00:29:44,640 --> 00:29:49,040 Speaker 1: the Autism Community. So this is from, uh, Sandra. She's super sweet, 505 00:29:49,080 --> 00:29:52,200 Speaker 1: very nice, and she says this: guys, thanks so much 506 00:29:52,240 --> 00:29:55,520 Speaker 1: for humoring my overenthusiasm for my cause, uh, and 507 00:29:55,600 --> 00:29:57,959 Speaker 1: my overenthusiasm for meeting you guys at trivia night, 508 00:29:58,040 --> 00:29:59,959 Speaker 1: who I look up to. It was an amazing night 509 00:30:00,000 --> 00:30:02,120 Speaker 1: of randomness. All the other people on our second-place 510 00:30:02,160 --> 00:30:05,960 Speaker 1: team met in line, total strangers. After my initial 511 00:30:06,280 --> 00:30:09,480 Speaker 1: starstruckness died down, which never happens to me because 512 00:30:09,520 --> 00:30:11,760 Speaker 1: I work with famous people all the time and couldn't 513 00:30:11,760 --> 00:30:15,000 Speaker 1: care less. I just felt, like, uh, it just felt 514 00:30:15,000 --> 00:30:16,120 Speaker 1: like a night where I was hanging out with a 515 00:30:16,160 --> 00:30:18,840 Speaker 1: few buddies I've known for a while. Um. She's still 516 00:30:18,840 --> 00:30:21,800 Speaker 1: talking about us. We're very approachable. It was just silly. 517 00:30:22,320 --> 00:30:24,160 Speaker 1: Um, she said, I felt that way about my whole 518 00:30:24,160 --> 00:30:27,360 Speaker 1: table of strangers, actually.
Uh, so anyway, she told us 519 00:30:27,400 --> 00:30:31,719 Speaker 1: that night about QSAC, um, the organization she's with, 520 00:30:31,800 --> 00:30:34,280 Speaker 1: and she says, I'm very passionate about QSAC 521 00:30:34,360 --> 00:30:36,680 Speaker 1: because they changed my life. Literally. I did a 522 00:30:36,680 --> 00:30:39,640 Speaker 1: five K to support them because it was local and 523 00:30:39,720 --> 00:30:41,520 Speaker 1: my nephew was autistic and I wanted to see if 524 00:30:41,520 --> 00:30:44,400 Speaker 1: I could actually walk that far. She had broken both 525 00:30:44,400 --> 00:30:46,880 Speaker 1: her ankles the previous year. Okay, I thought that was 526 00:30:46,880 --> 00:30:52,200 Speaker 1: funny too. Um, man, I wonder if she was... Was 527 00:30:52,240 --> 00:30:56,800 Speaker 1: it called cobbled? Cobbled? Was that in Misery? Oh yeah, 528 00:30:56,800 --> 00:31:01,720 Speaker 1: there's a name for it. She hobbled. Hobbled? 529 00:31:02,080 --> 00:31:07,080 Speaker 1: Cobbled would be if you... I think it might even 530 00:31:07,120 --> 00:31:11,440 Speaker 1: be called hobbling. Yeah, you're hobbled, but that's like a 531 00:31:11,440 --> 00:31:14,760 Speaker 1: state. No, but the process by which Kathy Bates, 532 00:31:14,800 --> 00:31:16,760 Speaker 1: like, broke the ankles... I don't want to talk about it. 533 00:31:16,880 --> 00:31:20,640 Speaker 1: It was called something like... it was hobbling somebody. 534 00:31:20,720 --> 00:31:24,720 Speaker 1: Maybe I'll look that up. That was so nasty. Anyway, 535 00:31:24,760 --> 00:31:27,920 Speaker 1: I don't think Sandra was cobbled or hobbled. Uh, so 536 00:31:27,960 --> 00:31:29,600 Speaker 1: at the time, I had no clue who they were.
537 00:31:30,080 --> 00:31:33,560 Speaker 1: At QSAC, I made friends with everyone there and many people who 538 00:31:33,640 --> 00:31:36,000 Speaker 1: worked there, and then three months later I actually started 539 00:31:36,000 --> 00:31:38,520 Speaker 1: work there as an employee. I was in corporate television 540 00:31:38,520 --> 00:31:40,840 Speaker 1: as a video editor for ten years and it beat 541 00:31:40,920 --> 00:31:44,000 Speaker 1: down my soul. I was always volunteering and donating what 542 00:31:44,040 --> 00:31:46,920 Speaker 1: I could, and I felt like maybe nonprofit is what 543 00:31:47,000 --> 00:31:49,080 Speaker 1: I should be doing. The opportunity arose and I took 544 00:31:49,120 --> 00:31:51,800 Speaker 1: it, and it has been life-changing, dudes. Since you 545 00:31:51,800 --> 00:31:54,440 Speaker 1: guys are so excellent at being philanthropic, I decided to see 546 00:31:54,560 --> 00:31:56,840 Speaker 1: if you would be interested in knowing more. So, Josh 547 00:31:56,880 --> 00:31:59,960 Speaker 1: mentioned you guys were considering doing a podcast on autism. 548 00:32:00,040 --> 00:32:02,720 Speaker 1: I'm sure we'll get around to that at some point, right? Um, 549 00:32:02,720 --> 00:32:04,160 Speaker 1: if you do, I have plenty of people that could 550 00:32:04,160 --> 00:32:07,120 Speaker 1: answer a lot of questions, and I'd gladly pass along addresses 551 00:32:07,120 --> 00:32:10,160 Speaker 1: and phone numbers. QSAC has been around for over 552 00:32:10,240 --> 00:32:13,520 Speaker 1: thirty years, truly amazing, helping the New York City 553 00:32:13,520 --> 00:32:16,720 Speaker 1: and Long Island areas. So, do you guys want to 554 00:32:16,760 --> 00:32:21,440 Speaker 1: help support this great, uh, cause for autism? You can 555 00:32:21,440 --> 00:32:24,440 Speaker 1: go to q s a c dot com, or she 556 00:32:24,480 --> 00:32:26,719 Speaker 1: has a bowling page, and I think you do...
Like, 557 00:32:27,000 --> 00:32:31,440 Speaker 1: she's got, like, a fundraiser going through bowling at www 558 00:32:31,560 --> 00:32:38,000 Speaker 1: dot first giving dot com slash fundraiser slash Sandra sroka 559 00:32:38,560 --> 00:32:41,800 Speaker 1: slash bowl, and that is Sandra, S-O-R-O- 560 00:32:42,000 --> 00:32:47,000 Speaker 1: K-A. Um, Sandra. And yeah, she was, she 561 00:32:47,080 --> 00:32:52,680 Speaker 1: was super nice, and she's working for autism now. Very cool, man. Um, 562 00:32:52,760 --> 00:32:54,760 Speaker 1: can I give one more shout-out? We heard from 563 00:32:54,760 --> 00:32:58,920 Speaker 1: another listener, of course. Um, a listener named Emily Eisenman 564 00:32:59,560 --> 00:33:01,720 Speaker 1: is running for LifeStraw. Did you see this? 565 00:33:01,960 --> 00:33:03,080 Speaker 1: I did, and, uh, I was gonna read that later. 566 00:33:03,160 --> 00:33:06,200 Speaker 1: Let's go ahead and do it now. Okay, yeah, all right. So, um, 567 00:33:06,200 --> 00:33:09,640 Speaker 1: Emily is running for LifeStraw. She heard our podcast 568 00:33:09,720 --> 00:33:12,400 Speaker 1: from two thousand ten on LifeStraw, and she decided to 569 00:33:12,480 --> 00:33:16,320 Speaker 1: raise a thousand dollars to buy LifeStraws by running a 570 00:33:16,360 --> 00:33:20,680 Speaker 1: thousand miles. Yeah. Um, and she is going to cross 571 00:33:20,680 --> 00:33:23,800 Speaker 1: the thousand-mile mark for the year this week. She 572 00:33:23,840 --> 00:33:26,320 Speaker 1: may have already done it. Um. And she's proving to 573 00:33:26,360 --> 00:33:29,240 Speaker 1: be a better runner than a fundraiser, 574 00:33:29,320 --> 00:33:33,040 Speaker 1: she says. So if everybody who listens to Stuff You 575 00:33:33,040 --> 00:33:36,440 Speaker 1: Should Know would go help her fundraise, it would 576 00:33:36,440 --> 00:33:41,000 Speaker 1: be fantastic.
You can go to www dot fundly, f 577 00:33:41,200 --> 00:33:44,680 Speaker 1: u n d l y, dot com slash run for 578 00:33:44,920 --> 00:33:47,840 Speaker 1: LifeStraw, um, and you guys can go check that 579 00:33:47,880 --> 00:33:50,400 Speaker 1: out and help Emily raise some money for LifeStraw. And 580 00:33:50,440 --> 00:33:53,560 Speaker 1: if you are unfamiliar with LifeStraw, go listen to our 581 00:33:53,600 --> 00:33:57,520 Speaker 1: podcast on that subject, which you probably are going 582 00:33:57,560 --> 00:34:00,200 Speaker 1: to have to find on our RSS feed as well. Yeah. 583 00:34:00,320 --> 00:34:02,640 Speaker 1: It's, um, just, you know what you do? You 584 00:34:02,760 --> 00:34:05,440 Speaker 1: Google, or you get your favorite search bar, and you 585 00:34:05,520 --> 00:34:08,279 Speaker 1: type in stuff you should know RSS feed, and it's 586 00:34:08,320 --> 00:34:10,440 Speaker 1: like boom, right there, all of our shows ever. Yep, 587 00:34:10,520 --> 00:34:14,200 Speaker 1: Stuff You Should Know RSS, every single one. It's good stuff. Um, 588 00:34:14,239 --> 00:34:16,960 Speaker 1: all right, well, I guess that's it, right? That is it, sir. 589 00:34:17,040 --> 00:34:19,920 Speaker 1: All right. Uh, if you want to contact us, you 590 00:34:19,960 --> 00:34:22,560 Speaker 1: can tweet to us at S Y S K Podcast, 591 00:34:22,840 --> 00:34:25,319 Speaker 1: join us on Facebook dot com slash stuff you should know, 592 00:34:25,680 --> 00:34:27,640 Speaker 1: and you can send us a good old-fashioned email 593 00:34:27,719 --> 00:34:36,960 Speaker 1: to Stuff Podcast at Discovery dot com. For more on 594 00:34:36,960 --> 00:34:39,440 Speaker 1: this and thousands of other topics, visit how Stuff 595 00:34:39,440 --> 00:34:47,640 Speaker 1: Works dot com