1 00:00:09,920 --> 00:00:11,120 Speaker 1: Have you seen Gattaca? 2 00:00:11,680 --> 00:00:14,280 Speaker 2: I love that movie and I actually rewatched it when 3 00:00:14,320 --> 00:00:16,480 Speaker 2: I was reporting this story. 4 00:00:16,720 --> 00:00:19,360 Speaker 1: Amanda Garrett is the West Coast editor of Fortune, and 5 00:00:19,400 --> 00:00:22,320 Speaker 1: she recently wrote an article about how Silicon Valley is 6 00:00:22,360 --> 00:00:26,599 Speaker 1: investing in fertility technology to create the quote perfect baby. 7 00:00:27,080 --> 00:00:29,400 Speaker 1: This is a concept that's been around in science fiction 8 00:00:29,520 --> 00:00:32,000 Speaker 1: for decades, and a good example of that is in 9 00:00:32,040 --> 00:00:35,280 Speaker 1: a movie from almost thirty years ago called Gattaca. 10 00:00:35,760 --> 00:00:39,199 Speaker 2: I mean, in so many conversations with people who are 11 00:00:39,320 --> 00:00:42,680 Speaker 2: outside this industry, a lot of them, you know, would 12 00:00:42,680 --> 00:00:46,280 Speaker 2: sort of refer to Gattaca like a futuristic kind 13 00:00:46,280 --> 00:00:49,720 Speaker 2: of, We're going to change your genes and make you superhuman, 14 00:00:50,040 --> 00:00:53,200 Speaker 2: amazing looking, just like Jude Law or Uma Thurman. 15 00:00:54,720 --> 00:00:58,000 Speaker 1: If you haven't seen Gattaca, I don't know, pause the 16 00:00:58,040 --> 00:01:01,160 Speaker 1: podcast and go watch it. But also, it came 17 00:01:01,200 --> 00:01:04,480 Speaker 1: out in nineteen ninety seven. And I'll tell you honestly, 18 00:01:04,600 --> 00:01:06,320 Speaker 1: I kind of feel like we may be at a 19 00:01:06,360 --> 00:01:11,039 Speaker 1: stage where Gattaca is required reading or required watching just 20 00:01:11,040 --> 00:01:14,120 Speaker 1: for understanding what's happening right now if you need to 21 00:01:14,200 --> 00:01:17,119 Speaker 1: catch up. Without giving you too many spoilers here,
Gattaca 22 00:01:17,160 --> 00:01:19,360 Speaker 1: is a science fiction film that's set in a dystopian 23 00:01:19,400 --> 00:01:22,319 Speaker 1: future in which there are two tiers of people. At 24 00:01:22,319 --> 00:01:25,319 Speaker 1: the top, you have valid people. These are people who've 25 00:01:25,319 --> 00:01:29,000 Speaker 1: been genetically selected to be the healthiest and most attractive possible, 26 00:01:29,440 --> 00:01:32,760 Speaker 1: and at the bottom, people who were conceived naturally. And 27 00:01:32,800 --> 00:01:37,440 Speaker 1: these people are called, literally, invalids. So valids get access 28 00:01:37,440 --> 00:01:41,119 Speaker 1: to certain high class jobs and invalids don't. The story 29 00:01:41,160 --> 00:01:44,440 Speaker 1: follows Vincent Freeman, who is an invalid played by Ethan Hawke, 30 00:01:44,680 --> 00:01:47,480 Speaker 1: who kind of borrows the identity of a valid man 31 00:01:47,520 --> 00:01:50,160 Speaker 1: played by Jude Law to try to achieve his dream 32 00:01:50,240 --> 00:01:51,240 Speaker 1: of going to space. 33 00:01:52,160 --> 00:01:55,120 Speaker 3: And that's the way it was. Each day I would 34 00:01:55,120 --> 00:01:58,360 Speaker 3: dispose of as much loose skin, fingernails and hair as 35 00:01:58,400 --> 00:02:02,520 Speaker 3: possible to limit how much of my invalid self I 36 00:02:02,520 --> 00:02:04,360 Speaker 3: would leave in the valid world. 37 00:02:06,640 --> 00:02:10,040 Speaker 2: He's exfoliating constantly, in a way that, like, he's a 38 00:02:10,080 --> 00:02:12,280 Speaker 2: newborn baby every day, scraping off his cells so that 39 00:02:12,400 --> 00:02:15,160 Speaker 2: he doesn't leave a cell behind. Because there are all 40 00:02:15,200 --> 00:02:17,800 Speaker 2: these sort of genetic identifiers everywhere you go.
You need 41 00:02:17,840 --> 00:02:20,800 Speaker 2: to get into work? Well, they just, like, bioscan you to 42 00:02:20,919 --> 00:02:23,560 Speaker 2: like, get in. And then, you know, even urine. He's got, 43 00:02:23,600 --> 00:02:25,639 Speaker 2: like, Jude Law's urine that he's leaving behind because he 44 00:02:25,680 --> 00:02:27,079 Speaker 2: can't even leave behind his own urine. 45 00:02:27,160 --> 00:02:30,000 Speaker 3: At the same time, Eugene prepared samples of his own 46 00:02:30,080 --> 00:02:33,880 Speaker 3: superior body matter so that I might pass for him, 47 00:02:33,880 --> 00:02:38,799 Speaker 3: customized urine pouches for the frequent substance tests, fingertip blood 48 00:02:38,840 --> 00:02:43,440 Speaker 3: sachets for security checks, and vials filled with other traces. 49 00:02:44,720 --> 00:02:47,440 Speaker 2: The idea is that he is trying, as a normal 50 00:02:47,480 --> 00:02:50,320 Speaker 2: person, to be just as good as, you know, the 51 00:02:50,320 --> 00:02:53,680 Speaker 2: people who are these sort of superhumans who started 52 00:02:53,680 --> 00:02:55,160 Speaker 2: out as designer babies. 53 00:02:55,160 --> 00:02:57,640 Speaker 1: And according to Amanda's reporting, there is a 54 00:02:57,800 --> 00:03:00,600 Speaker 1: lot of money going toward making that sci-fi future 55 00:03:00,639 --> 00:03:04,040 Speaker 1: of designer babies get here faster than you might have thought. 56 00:03:04,520 --> 00:03:09,120 Speaker 2: People inside this industry, this fertility tech kind of startup space, 57 00:03:09,360 --> 00:03:12,840 Speaker 2: they're very optimistic and very excited about what's possible. 58 00:03:17,000 --> 00:03:24,519 Speaker 1: From Kaleidoscope and iHeart podcasts, this is Kill Switch. I'm 59 00:03:24,520 --> 00:03:27,120 Speaker 1: Dexter Thomas. 60 00:03:32,000 --> 00:04:09,320 Speaker 4: I'm all right, I'm sorry, good bye. 61 00:04:13,520 --> 00:04:18,000 Speaker 1: Where are we right now?
With all of this technology? 62 00:04:18,000 --> 00:04:21,440 Speaker 2: This is sort of a burgeoning twenty five billion dollar industry, 63 00:04:21,480 --> 00:04:26,159 Speaker 2: this fertility tech startup industry. Twenty twenty four was sort 64 00:04:26,160 --> 00:04:28,320 Speaker 2: of one of the biggest years for investment. There was 65 00:04:28,320 --> 00:04:30,520 Speaker 2: like a two billion dollar jump over the year before. 66 00:04:31,040 --> 00:04:33,400 Speaker 2: Twenty twenty five, I think, you know, we don't have 67 00:04:33,520 --> 00:04:36,480 Speaker 2: numbers yet, but I think we'll see that definitely be exceeded, 68 00:04:36,560 --> 00:04:40,040 Speaker 2: just based on some of the major tech money that 69 00:04:40,200 --> 00:04:42,240 Speaker 2: is backing a lot of these startups. 70 00:04:42,880 --> 00:04:46,560 Speaker 1: Silicon Valley is heavily investing in the fertility tech industry. 71 00:04:46,760 --> 00:04:50,200 Speaker 1: There's big names like Reddit co founder Alexis Ohanian, 72 00:04:50,240 --> 00:04:53,120 Speaker 1: OpenAI CEO Sam Altman, and twenty three and Me co 73 00:04:53,200 --> 00:04:56,360 Speaker 1: founder and former CEO Anne Wojcicki. If you live 74 00:04:56,400 --> 00:04:57,760 Speaker 1: in New York, you might have seen the ads on 75 00:04:57,800 --> 00:05:00,680 Speaker 1: the subway that read Have Your Best Baby, telling you 76 00:05:00,720 --> 00:05:04,680 Speaker 1: to go to www dot Pickyourbaby dot com to, I 77 00:05:04,720 --> 00:05:08,240 Speaker 1: guess, have your best baby. Those ads are from Nucleus Genomics, 78 00:05:08,440 --> 00:05:11,800 Speaker 1: and there's other companies like Orchid Health and Herasight, and 79 00:05:11,880 --> 00:05:15,400 Speaker 1: these companies are offering what's called polygenic screening.
80 00:05:16,520 --> 00:05:20,000 Speaker 2: Traditional IVF testing looks at chromosomes and single gene diseases 81 00:05:20,040 --> 00:05:24,599 Speaker 2: like cystic fibrosis, sickle cell anemia, Tay-Sachs, and this new 82 00:05:24,680 --> 00:05:27,520 Speaker 2: breed of companies are sort of claiming that they can 83 00:05:27,560 --> 00:05:31,800 Speaker 2: predict more complex traits through the polygenic screening. So it'll 84 00:05:31,800 --> 00:05:34,840 Speaker 2: take some cells and then they use AI to amplify it, 85 00:05:34,960 --> 00:05:36,920 Speaker 2: and then you have a genome that you can study, 86 00:05:36,960 --> 00:05:39,600 Speaker 2: and so they'll look at that genome and they can 87 00:05:39,720 --> 00:05:43,960 Speaker 2: use it to predict complex traits like childhood cancers, autism 88 00:05:44,040 --> 00:05:51,080 Speaker 2: spectrum disorder, schizophrenia, height, nose shape, hair texture. 89 00:05:51,080 --> 00:05:53,360 Speaker 1: Hold on, nose shape? 90 00:05:53,480 --> 00:05:56,560 Speaker 2: Nose shape, face shape, and then on the margins you have 91 00:05:56,760 --> 00:06:01,720 Speaker 2: things that people are interested in that the science 92 00:06:01,760 --> 00:06:06,960 Speaker 2: doesn't necessarily back up, like IQ, musical ability. 93 00:06:07,320 --> 00:06:10,920 Speaker 1: Genetic testing of embryos itself isn't a new thing. Screening 94 00:06:10,960 --> 00:06:13,640 Speaker 1: has been part of the IVF process since the nineties, 95 00:06:14,000 --> 00:06:17,000 Speaker 1: but those tests could only show you so much. You 96 00:06:17,000 --> 00:06:19,640 Speaker 1: could figure out the sex of the embryo or traits 97 00:06:19,760 --> 00:06:23,560 Speaker 1: for certain genetic disorders. But now the introduction of AI 98 00:06:24,000 --> 00:06:27,080 Speaker 1: is really pushing forward what's possible with polygenic screening. 99 00:06:27,480 --> 00:06:31,200 Speaker 2: So the human genome consists of three billion base pairs.
You 100 00:06:31,279 --> 00:06:33,359 Speaker 2: get one set from each parent, so there's a total 101 00:06:33,360 --> 00:06:38,159 Speaker 2: of six billion base pairs in each cell. So that's 102 00:06:38,200 --> 00:06:41,520 Speaker 2: a lot of data, right? So when you get 103 00:06:42,279 --> 00:06:45,360 Speaker 2: your five or six cells from your embryo, 104 00:06:45,520 --> 00:06:48,280 Speaker 2: you're using AI to amplify that and see what it 105 00:06:48,279 --> 00:06:51,520 Speaker 2: would look like as a human, and then you're checking 106 00:06:51,600 --> 00:06:57,039 Speaker 2: against data to see what the potential person might be, ultimately. 107 00:06:57,520 --> 00:07:00,040 Speaker 2: And the backers in this space, a lot of 108 00:07:00,800 --> 00:07:02,520 Speaker 2: it's tech money, it's a lot of 109 00:07:02,560 --> 00:07:05,720 Speaker 2: people who are interested in AI. AI is great at 110 00:07:05,720 --> 00:07:09,320 Speaker 2: detecting patterns, and this sort of fits the bill. 111 00:07:09,760 --> 00:07:12,480 Speaker 1: The founders of these polygenic screening companies are betting on 112 00:07:12,520 --> 00:07:15,280 Speaker 1: their technology to be able to accurately screen for things 113 00:07:15,360 --> 00:07:19,320 Speaker 1: like height, or complicated health disorders, or even personality. 114 00:07:19,800 --> 00:07:23,320 Speaker 2: The founders themselves, they're banking embryos and they're planning to 115 00:07:23,320 --> 00:07:25,320 Speaker 2: screen for these traits in their future kids. Like, the 116 00:07:25,360 --> 00:07:28,160 Speaker 2: founder of Herasight is named Michael Christiansen.
He's six six, 117 00:07:28,720 --> 00:07:31,120 Speaker 2: you know, and he's planning to screen his own embryos 118 00:07:31,120 --> 00:07:33,360 Speaker 2: when he's ready for kids, to make them a little 119 00:07:33,360 --> 00:07:35,560 Speaker 2: bit shorter, because he's like, six six is kind of annoying, 120 00:07:35,600 --> 00:07:37,840 Speaker 2: it's hard to travel. That's a little bit too tall. 121 00:07:38,400 --> 00:07:40,840 Speaker 1: That is fascinating. That was the first thing that hit 122 00:07:40,920 --> 00:07:43,000 Speaker 1: me, the opening line of your article, is that 123 00:07:43,280 --> 00:07:44,720 Speaker 1: he's six six. He wishes that he was shorter. He 124 00:07:44,760 --> 00:07:47,640 Speaker 1: wants his kid to be shorter because it's inconvenient. 125 00:07:48,080 --> 00:07:52,360 Speaker 2: Yeah, another one of the geneticists there, Tobias Wolfram. 126 00:07:52,480 --> 00:07:55,480 Speaker 2: He hit the genetic lottery in terms of his health. 127 00:07:56,040 --> 00:07:59,120 Speaker 2: But you know, he's like, I have depression in my history, 128 00:07:59,200 --> 00:08:01,200 Speaker 2: and it's something that I don't want to pass on. 129 00:08:01,640 --> 00:08:02,960 Speaker 2: He and his partner don't want to pass it on 130 00:08:03,080 --> 00:08:05,160 Speaker 2: in their future children, so it's something that they're planning 131 00:08:05,240 --> 00:08:08,760 Speaker 2: to screen for. So these are people who are really 132 00:08:08,840 --> 00:08:12,240 Speaker 2: highly information-seeking. They want to be as in control as possible, 133 00:08:13,000 --> 00:08:19,520 Speaker 2: and so they're just extremely focused on optimizing the 134 00:08:19,680 --> 00:08:24,440 Speaker 2: entire birth experience the same way they would optimize everything 135 00:08:24,480 --> 00:08:25,160 Speaker 2: about their lives.
136 00:08:25,200 --> 00:08:27,440 Speaker 1: And so just to clarify here, so basically, you can 137 00:08:27,800 --> 00:08:33,200 Speaker 1: essentially choose the embryo that has the traits that you want, 138 00:08:33,320 --> 00:08:35,400 Speaker 1: and if it has traits that you're worried about or 139 00:08:35,400 --> 00:08:40,120 Speaker 1: do not want, you can choose not to use those for IVF. 140 00:08:40,200 --> 00:08:42,960 Speaker 2: Exactly. Like, if you're screening your embryos and you see that 141 00:08:43,000 --> 00:08:45,520 Speaker 2: your potential child could have a high likelihood of having 142 00:08:45,520 --> 00:08:49,840 Speaker 2: a childhood cancer, the idea is that you wouldn't want 143 00:08:49,880 --> 00:08:51,800 Speaker 2: to choose that embryo. You would want to choose a 144 00:08:51,800 --> 00:08:52,560 Speaker 2: different embryo. 145 00:08:53,280 --> 00:08:55,240 Speaker 1: Okay, I know this is a lot already, but let's 146 00:08:55,280 --> 00:08:58,240 Speaker 1: pause for a second. So right now we're talking about 147 00:08:58,240 --> 00:09:01,880 Speaker 1: the ability to screen for cancer. So just imagine you're 148 00:09:02,040 --> 00:09:04,640 Speaker 1: already going through IVF and you have the option, the 149 00:09:04,840 --> 00:09:08,760 Speaker 1: opportunity to choose an embryo to avoid a high risk 150 00:09:08,840 --> 00:09:12,880 Speaker 1: of a potentially fatal disease. Would you take that opportunity? 151 00:09:13,640 --> 00:09:16,240 Speaker 1: So hold on to that answer in your mind, whatever 152 00:09:16,280 --> 00:09:18,640 Speaker 1: it is, right now, and let's go a little further. 153 00:09:19,920 --> 00:09:22,520 Speaker 2: I spoke with a couple from the Bay Area, Roshan 154 00:09:22,640 --> 00:09:25,920 Speaker 2: George and Julie King. Roshan has a consulting 155 00:09:25,960 --> 00:09:30,880 Speaker 2: firm called Technology Brother, like tech bro. They're highly information-seeking people.
156 00:09:30,880 --> 00:09:32,960 Speaker 2: They just wanted to know everything they could possibly know 157 00:09:33,000 --> 00:09:36,800 Speaker 2: about their embryos, and so they discovered, going through IVF testing, 158 00:09:37,320 --> 00:09:39,960 Speaker 2: they shared a genetic mutation, which is basically like a 159 00:09:39,960 --> 00:09:42,120 Speaker 2: typo in your DNA, where there's a G there should 160 00:09:42,120 --> 00:09:44,479 Speaker 2: be an A. You know, on its own, there's no manifestation, 161 00:09:44,679 --> 00:09:47,839 Speaker 2: but as a couple together, the likelihood of them having a 162 00:09:47,960 --> 00:09:51,719 Speaker 2: child that becomes profoundly deaf is a real thing. So 163 00:09:51,760 --> 00:09:55,560 Speaker 2: they wound up screening their embryos using the polygenic screening 164 00:09:55,600 --> 00:09:57,680 Speaker 2: at Orchid Health, and then it came down to 165 00:09:57,760 --> 00:10:01,240 Speaker 2: like ninety seconds in the hospital. The audio tech comes in, 166 00:10:01,360 --> 00:10:05,120 Speaker 2: sticks things in the baby's ears, and then they're like, 167 00:10:05,240 --> 00:10:10,360 Speaker 2: hearing's normal. What's interesting too about Roshan George, he talks 168 00:10:10,400 --> 00:10:13,080 Speaker 2: a lot about his experience doing this, and he talked 169 00:10:13,120 --> 00:10:16,959 Speaker 2: with a father whose son has the condition that they 170 00:10:16,960 --> 00:10:19,640 Speaker 2: were worried about their daughter having. And this other father 171 00:10:19,679 --> 00:10:22,199 Speaker 2: said he wished he would have known about the genetic 172 00:10:22,520 --> 00:10:27,240 Speaker 2: screening technology before his son was born. But that said, 173 00:10:27,320 --> 00:10:33,240 Speaker 2: he also couldn't imagine substituting out his actual son with 174 00:10:33,400 --> 00:10:37,760 Speaker 2: a son who could hear.
And so for Roshan, his 175 00:10:37,920 --> 00:10:41,559 Speaker 2: reflection about this was that no one's ever thinking you're 176 00:10:41,640 --> 00:10:46,000 Speaker 2: substituting out the actual child who's there. But we're all 177 00:10:46,080 --> 00:10:49,280 Speaker 2: trying to ensure that you have a child who is 178 00:10:49,320 --> 00:10:52,440 Speaker 2: as healthy as possible, and you want your child to 179 00:10:52,480 --> 00:10:56,000 Speaker 2: have the full range of human experiences without being limited. 180 00:10:56,760 --> 00:10:58,680 Speaker 2: One of the sources I spoke with for the story 181 00:10:58,760 --> 00:11:02,240 Speaker 2: is a Stanford expert named Barry Behr. He's an 182 00:11:02,240 --> 00:11:05,040 Speaker 2: advisor to Orchid Health, and he's made huge strides 183 00:11:05,080 --> 00:11:09,840 Speaker 2: in terms of IVF advancements. And his feeling was, there's 184 00:11:09,880 --> 00:11:12,839 Speaker 2: nothing that you wouldn't do for your kid. Like, don't 185 00:11:12,880 --> 00:11:15,880 Speaker 2: tell me that you wouldn't want to do this because 186 00:11:15,920 --> 00:11:21,200 Speaker 2: you wouldn't want to meddle with how things are supposed 187 00:11:21,200 --> 00:11:24,480 Speaker 2: to be. You know, his feeling was, if the science 188 00:11:24,679 --> 00:11:27,520 Speaker 2: can show you how to have a better outcome for 189 00:11:27,600 --> 00:11:31,240 Speaker 2: your child, most people are going to choose, in their 190 00:11:31,280 --> 00:11:34,720 Speaker 2: heart of hearts, to go that route. And so people 191 00:11:34,800 --> 00:11:38,560 Speaker 2: think about it really in those terms, despite sort 192 00:11:38,600 --> 00:11:42,400 Speaker 2: of the cautionary, Gattaca-like slippery slope that exists when 193 00:11:42,440 --> 00:11:46,280 Speaker 2: you get into some of those issues like IQ, or 194 00:11:46,960 --> 00:11:50,040 Speaker 2: how athletic is your child going to be?
Et cetera, 195 00:11:50,080 --> 00:11:50,559 Speaker 2: et cetera. 196 00:11:52,200 --> 00:11:55,240 Speaker 1: But okay, we've got to back up. This is all assuming 197 00:11:55,240 --> 00:11:59,760 Speaker 1: that this stuff actually works. So really, how accurate is 198 00:12:00,040 --> 00:12:05,640 Speaker 1: polygenic screening? Where's the line between real science and science fiction? Also, 199 00:12:06,080 --> 00:12:09,040 Speaker 1: how much does it cost? We'll get into both of 200 00:12:09,080 --> 00:12:20,720 Speaker 1: those questions after the break. How common is embryo screening? 201 00:12:20,800 --> 00:12:23,800 Speaker 1: Is this just a Silicon Valley thing? Especially this more 202 00:12:23,840 --> 00:12:26,040 Speaker 1: advanced stuff that we're talking about? 203 00:12:26,040 --> 00:12:28,520 Speaker 2: The more advanced stuff you're paying for out of your pocket, 204 00:12:28,520 --> 00:12:32,679 Speaker 2: and the price ranges for this: Roshan George and 205 00:12:32,760 --> 00:12:36,200 Speaker 2: Julie King, they spent about twelve thousand five hundred dollars. That's 206 00:12:36,240 --> 00:12:38,079 Speaker 2: on the low end, but it goes all the way 207 00:12:38,160 --> 00:12:42,480 Speaker 2: up to fifty thousand dollars, fifty k, right? So this 208 00:12:42,559 --> 00:12:45,840 Speaker 2: is obviously going to be more affluent, wealthy parents who 209 00:12:45,880 --> 00:12:50,080 Speaker 2: are looking into this. It's not covered by insurance, so 210 00:12:50,200 --> 00:12:53,080 Speaker 2: this is sort of a small pocket. But it does 211 00:12:53,120 --> 00:12:55,280 Speaker 2: continue to grow and grow and grow. But, you know, 212 00:12:55,840 --> 00:12:58,560 Speaker 2: there are not a thousand babies out there 213 00:12:58,559 --> 00:13:02,280 Speaker 2: in Silicon Valley that have gone through polygenic screening at 214 00:13:02,679 --> 00:13:06,079 Speaker 2: these startups. We're probably in the hundreds.
215 00:13:06,720 --> 00:13:09,840 Speaker 1: So okay. These polygenic screening companies are promising that you 216 00:13:09,880 --> 00:13:13,199 Speaker 1: can screen for a lot of different things. Nucleus Genomics, 217 00:13:13,240 --> 00:13:16,319 Speaker 1: that's the company with those Have Your Best Baby ads. They 218 00:13:16,320 --> 00:13:20,120 Speaker 1: have a long list of over two thousand diseases, traits, 219 00:13:20,160 --> 00:13:23,280 Speaker 1: and other factors that they can test for. But how 220 00:13:23,400 --> 00:13:26,160 Speaker 1: reliable and how accurate is this stuff? 221 00:13:27,320 --> 00:13:30,760 Speaker 2: The experts who I spoke with basically said there is 222 00:13:30,880 --> 00:13:37,959 Speaker 2: a lot of science backing up certain traits, like physical traits, sex. 223 00:13:38,080 --> 00:13:39,640 Speaker 2: Do you want a boy? Do you want a girl? 224 00:13:40,360 --> 00:13:44,040 Speaker 2: The single gene disorders that we talked about, Tay-Sachs, 225 00:13:44,160 --> 00:13:47,920 Speaker 2: sickle cell, trisomy twenty one, which is the chromosomal disorder 226 00:13:47,960 --> 00:13:51,360 Speaker 2: that leads to Down syndrome. All of these things will 227 00:13:51,400 --> 00:13:55,000 Speaker 2: be there in your genome. But it's these things on 228 00:13:55,040 --> 00:13:58,520 Speaker 2: the margin that people tend to be most excited and 229 00:13:58,640 --> 00:14:00,880 Speaker 2: interested in and just intrigued by, which would 230 00:14:00,920 --> 00:14:04,360 Speaker 2: be like height, IQ, musical ability. Will you be a 231 00:14:04,360 --> 00:14:07,120 Speaker 2: good pitcher? Are you going to be Clayton Kershaw? You know, 232 00:14:07,640 --> 00:14:11,319 Speaker 2: the critics in this space really are very skeptical about 233 00:14:11,440 --> 00:14:17,760 Speaker 2: how effective the polygenic screening is for these other things.
234 00:14:18,080 --> 00:14:20,400 Speaker 2: One of the experts I spoke with was from Illumina, 235 00:14:20,800 --> 00:14:24,520 Speaker 2: and they do a lot with testing themselves and looking 236 00:14:24,520 --> 00:14:26,480 Speaker 2: at genomes. He was saying that one of the reasons 237 00:14:26,520 --> 00:14:29,000 Speaker 2: that we're not as close as we could be is 238 00:14:29,040 --> 00:14:30,840 Speaker 2: that there just aren't enough people who have had their 239 00:14:30,840 --> 00:14:34,720 Speaker 2: genome mapped. We're about at one million. We need to 240 00:14:34,760 --> 00:14:37,280 Speaker 2: be at a billion, and then we need to go 241 00:14:37,400 --> 00:14:40,640 Speaker 2: beyond that in order to have a really good set 242 00:14:40,760 --> 00:14:44,360 Speaker 2: of data in order to be able to make these 243 00:14:44,400 --> 00:14:48,360 Speaker 2: outcomes more meaningful and more accurate. And I did speak 244 00:14:48,400 --> 00:14:51,120 Speaker 2: with this great sort of pioneer in this field. His 245 00:14:51,200 --> 00:14:54,000 Speaker 2: name is Hank Greely, and he's at Stanford, and he 246 00:14:54,080 --> 00:14:55,920 Speaker 2: knows a lot of the players in the space because 247 00:14:55,960 --> 00:15:00,120 Speaker 2: they were his students. So he was saying, this stuff 248 00:15:00,240 --> 00:15:05,600 Speaker 2: is awesome clickbait, dystopian fiction. But he was like, there's 249 00:15:05,640 --> 00:15:10,240 Speaker 2: so much hype. Like, separating the actual science, what's possible 250 00:15:10,320 --> 00:15:14,320 Speaker 2: now, versus the hype: we're decades away from that. But 251 00:15:14,400 --> 00:15:17,640 Speaker 2: this is also, like, the tech money, this being sort 252 00:15:17,640 --> 00:15:21,880 Speaker 2: of a Bay Area kind of wealthy family conversation starter. 253 00:15:22,960 --> 00:15:25,400 Speaker 2: Hype is part of it.
Hype is sort of part 254 00:15:25,440 --> 00:15:29,120 Speaker 2: of what makes Silicon Valley go round, right? But hype 255 00:15:29,120 --> 00:15:32,600 Speaker 2: with babies is also a little bit, like, this is different, right? 256 00:15:32,640 --> 00:15:35,680 Speaker 2: We're talking about human beings, and we don't want to 257 00:15:35,680 --> 00:15:38,400 Speaker 2: think about them as products or projects. 258 00:15:39,080 --> 00:15:42,400 Speaker 1: Regardless of what the science says, these companies are now 259 00:15:42,400 --> 00:15:45,440 Speaker 1: promising these results for their clients, even if they do 260 00:15:45,520 --> 00:15:47,960 Speaker 1: say that they have some internal guardrails for what they 261 00:15:48,080 --> 00:15:49,440 Speaker 1: will and will not do. 262 00:15:50,120 --> 00:15:54,120 Speaker 2: So Herasight, for instance, talks about, you know, IQ, height, BMI, 263 00:15:54,160 --> 00:15:58,400 Speaker 2: et cetera. They feel confident that they can screen for 264 00:15:58,480 --> 00:16:02,720 Speaker 2: this potential. Nothing is guaranteed. You know, there's no such 265 00:16:02,760 --> 00:16:04,920 Speaker 2: thing as a designer baby is what they told me, 266 00:16:05,640 --> 00:16:08,000 Speaker 2: and that they don't make designer babies. There 267 00:16:08,040 --> 00:16:10,240 Speaker 2: are certain things they will not do. For instance, they 268 00:16:10,280 --> 00:16:13,960 Speaker 2: won't allow you to choose a child that's a psychopath. 269 00:16:14,360 --> 00:16:17,880 Speaker 2: They won't allow you to choose your child's skin color, 270 00:16:17,960 --> 00:16:20,880 Speaker 2: for instance.
There are just certain things that they wouldn't 271 00:16:20,920 --> 00:16:24,440 Speaker 2: want to do. But they are really confident that we're 272 00:16:24,560 --> 00:16:27,320 Speaker 2: very close to being able to do those things, and 273 00:16:27,520 --> 00:16:31,160 Speaker 2: they're really optimistic about their abilities to help you screen 274 00:16:31,200 --> 00:16:33,120 Speaker 2: for the potential there. 275 00:16:33,640 --> 00:16:35,960 Speaker 1: So there are critics who would say that these new 276 00:16:36,000 --> 00:16:38,640 Speaker 1: tech fertility companies are getting into what you could call 277 00:16:38,800 --> 00:16:41,840 Speaker 1: techno eugenics. And you can also imagine that there might 278 00:16:41,880 --> 00:16:44,440 Speaker 1: be a slippery slope that would lead to something like 279 00:16:44,520 --> 00:16:49,000 Speaker 1: the valids and invalids of Gattaca. And yet there are 280 00:16:49,000 --> 00:16:52,400 Speaker 1: companies and investors who are taking this stuff even further, 281 00:16:52,920 --> 00:16:55,200 Speaker 1: to gene editing in embryos. 282 00:16:56,440 --> 00:17:01,840 Speaker 2: So gene editing involves editing the embryo before it's implanted. 283 00:17:02,440 --> 00:17:07,760 Speaker 2: It is extremely controversial, and I want to say controversial 284 00:17:07,760 --> 00:17:08,560 Speaker 2: in all caps. 285 00:17:08,960 --> 00:17:12,560 Speaker 1: Editing embryonic genes has been a taboo since precise gene 286 00:17:12,680 --> 00:17:15,919 Speaker 1: editing became possible with something called CRISPR.
This has been 287 00:17:16,000 --> 00:17:19,400 Speaker 1: used on adults and even children to treat genetic diseases, 288 00:17:19,440 --> 00:17:22,439 Speaker 1: including a baby who was successfully treated with CRISPR therapy 289 00:17:22,640 --> 00:17:26,760 Speaker 1: just last May. But editing the genes of an embryo 290 00:17:27,359 --> 00:17:31,040 Speaker 1: is way more controversial, and it's outright banned in most countries. 291 00:17:31,720 --> 00:17:34,760 Speaker 1: But it has been done. In twenty eighteen, a Chinese 292 00:17:34,800 --> 00:17:38,000 Speaker 1: researcher named He Jiankui reported that he had genetically altered 293 00:17:38,040 --> 00:17:40,719 Speaker 1: twin girls to be resistant to HIV. 294 00:17:41,240 --> 00:17:45,760 Speaker 5: Two beautiful little Chinese girls named Lulu and Nana 295 00:17:46,000 --> 00:17:49,680 Speaker 5: came crying into the world as healthy as any other 296 00:17:49,880 --> 00:17:51,719 Speaker 5: babies a few weeks ago. 297 00:17:52,000 --> 00:17:54,600 Speaker 1: This is He Jiankui in his video announcing the birth of 298 00:17:54,640 --> 00:17:55,840 Speaker 1: those twin girls. 299 00:17:56,240 --> 00:18:00,880 Speaker 5: When Mark saw his daughters for the first time, he said that 300 00:18:01,600 --> 00:18:06,280 Speaker 5: he never thought he could be a father. Now he 301 00:18:06,320 --> 00:18:10,159 Speaker 5: has found a reason to live, a reason to work, 302 00:18:10,560 --> 00:18:17,280 Speaker 5: a purpose. You see, Mark has HIV. Gene surgery is and 303 00:18:17,440 --> 00:18:21,560 Speaker 5: should remain a technology for healing. Enhancing IQ or 304 00:18:21,640 --> 00:18:25,439 Speaker 5: selecting hair or eye color should be banned.
305 00:18:26,800 --> 00:18:31,200 Speaker 5: I understand my work will be controversial, but I believe 306 00:18:31,760 --> 00:18:36,240 Speaker 5: families need this technology, and I'm willing to take the 307 00:18:36,400 --> 00:18:40,560 Speaker 5: criticism for them. 308 00:18:40,760 --> 00:18:44,359 Speaker 1: He Jiankui wasn't just criticized. He was sentenced to three years 309 00:18:44,400 --> 00:18:48,199 Speaker 1: in prison for illegally practicing embryonic gene editing, which you 310 00:18:48,240 --> 00:18:52,200 Speaker 1: would think would scare people off from gene editing technology. 311 00:18:52,920 --> 00:18:56,359 Speaker 1: But not everyone. Some people are investing in it. The 312 00:18:56,400 --> 00:18:59,280 Speaker 1: Wall Street Journal reported late last year about investments from 313 00:18:59,320 --> 00:19:03,160 Speaker 1: Sam Altman's husband and also from Coinbase CEO Brian Armstrong 314 00:19:03,520 --> 00:19:06,919 Speaker 1: into a company called Preventive that wants to experiment with 315 00:19:07,080 --> 00:19:09,320 Speaker 1: gene editing. They're just going to do it in places 316 00:19:09,359 --> 00:19:12,880 Speaker 1: outside the US, like the United Arab Emirates, where this stuff 317 00:19:13,000 --> 00:19:13,560 Speaker 1: is allowed. 318 00:19:15,119 --> 00:19:18,159 Speaker 2: I did speak with a woman named Cathy Tie, who's 319 00:19:18,200 --> 00:19:22,720 Speaker 2: also former Stanford, a former Thiel fellow. She started a firm 320 00:19:22,760 --> 00:19:28,280 Speaker 2: called Manhattan Genomics, and she is planning to test on 321 00:19:29,119 --> 00:19:32,600 Speaker 2: non-human primates this year.
She said there's a lot 322 00:19:32,640 --> 00:19:35,680 Speaker 2: of support in the background for her, because people do 323 00:19:35,720 --> 00:19:39,080 Speaker 2: feel like, why not explore this area if we can? 324 00:19:39,600 --> 00:19:42,920 Speaker 2: Why leave this untouched if it's something that we could 325 00:19:42,960 --> 00:19:46,560 Speaker 2: potentially get right? One of the points that she made 326 00:19:46,560 --> 00:19:49,639 Speaker 2: that I thought was really compelling was that older women 327 00:19:50,640 --> 00:19:53,800 Speaker 2: struggle to get a lot of eggs in order to 328 00:19:53,840 --> 00:19:57,280 Speaker 2: make embryos, and then sometimes they don't make enough eggs, 329 00:19:57,280 --> 00:19:58,879 Speaker 2: and so you have to go back and stimulate your 330 00:19:58,880 --> 00:20:00,959 Speaker 2: follicles to get more eggs. So that's like an 331 00:20:01,080 --> 00:20:04,920 Speaker 2: arduous process. You're injecting yourself with hormones every day. It's painful. 332 00:20:05,040 --> 00:20:08,439 Speaker 2: You can't turn and bend if you are injecting yourself 333 00:20:08,480 --> 00:20:10,919 Speaker 2: with hormones. These are things that women know, but 334 00:20:11,000 --> 00:20:14,520 Speaker 2: people don't necessarily talk about them, because women are just like, 335 00:20:14,520 --> 00:20:16,399 Speaker 2: I'm going about my daily life even though I am 336 00:20:16,400 --> 00:20:18,640 Speaker 2: not able to bend over. I'm just going to get through 337 00:20:18,640 --> 00:20:20,919 Speaker 2: this because I want to have a baby. So, you know, 338 00:20:21,000 --> 00:20:23,840 Speaker 2: her point is sort of, women are having children later 339 00:20:23,880 --> 00:20:26,520 Speaker 2: in life.
They just are like, that's what all of 340 00:20:26,560 --> 00:20:30,320 Speaker 2: the data shows us, and it's more difficult, and gene 341 00:20:30,440 --> 00:20:36,040 Speaker 2: editing potentially makes that easier, particularly for this group of 342 00:20:36,080 --> 00:20:39,720 Speaker 2: women who are having children later in life, and that 343 00:20:40,200 --> 00:20:43,960 Speaker 2: it gives you more of a shot at having your 344 00:20:44,000 --> 00:20:45,680 Speaker 2: own biological child. 345 00:20:46,600 --> 00:20:50,399 Speaker 1: Gene editing right now is not legal. You can't do 346 00:20:50,480 --> 00:20:54,119 Speaker 1: that now, not in the US. But you've got companies 347 00:20:54,240 --> 00:20:57,120 Speaker 1: who are based in the US who are putting money 348 00:20:57,119 --> 00:21:00,919 Speaker 1: into this. Are they proposing to change the laws to 349 00:21:00,920 --> 00:21:02,000 Speaker 1: get around this somehow? 350 00:21:02,280 --> 00:21:06,240 Speaker 2: I think they're hoping for, you know, with this administration, 351 00:21:06,400 --> 00:21:09,800 Speaker 2: there being sort of less restriction around innovation and 352 00:21:09,880 --> 00:21:13,359 Speaker 2: medical testing and those kinds of things. So I think 353 00:21:13,400 --> 00:21:14,800 Speaker 2: that that's the hope. 354 00:21:14,680 --> 00:21:18,439 Speaker 1: Really, that the laws will change to allow for whatever 355 00:21:18,480 --> 00:21:22,160 Speaker 1: it is they want to do technologically with designer babies. 356 00:21:22,200 --> 00:21:25,280 Speaker 2: Right, and that there'll be more research, more advancements, that 357 00:21:25,280 --> 00:21:28,720 Speaker 2: there'll be more in the space. There's a regulatory vacuum 358 00:21:28,800 --> 00:21:31,359 Speaker 2: right now. 
One of the lawyers who I spoke with said, 359 00:21:31,400 --> 00:21:34,119 Speaker 2: the law always struggles to keep up with technology, so 360 00:21:34,160 --> 00:21:37,320 Speaker 2: there's a lot of advancement and experimentation that happens when 361 00:21:37,320 --> 00:21:40,920 Speaker 2: there's a regulatory vacuum, when things aren't like banned outright. 362 00:21:41,520 --> 00:21:44,560 Speaker 2: So I think that with the gene editing piece, there's 363 00:21:44,600 --> 00:21:49,160 Speaker 2: a lot of optimism and hope there. But because it's 364 00:21:49,200 --> 00:21:52,680 Speaker 2: so controversial, people don't really want to touch it. It's 365 00:21:52,680 --> 00:21:56,160 Speaker 2: like a third rail almost in this space, that it's 366 00:21:56,240 --> 00:21:58,920 Speaker 2: just, people are less conversant in it. They're more afraid 367 00:21:59,440 --> 00:22:01,840 Speaker 2: of what will happen, and it does sound scary. We 368 00:22:01,880 --> 00:22:04,720 Speaker 2: don't know what it means, you know, generations to come, 369 00:22:04,760 --> 00:22:07,160 Speaker 2: if you edit your genes for your baby. 370 00:22:08,200 --> 00:22:10,960 Speaker 1: So if scientists don't think we're there yet to predict 371 00:22:10,960 --> 00:22:14,600 Speaker 1: complex traits from genetic testing, then editing genes to get 372 00:22:14,640 --> 00:22:18,800 Speaker 1: those traits seems even further away. But some very rich 373 00:22:18,880 --> 00:22:22,520 Speaker 1: people are betting that those scientists and those lawyers are wrong, 374 00:22:23,000 --> 00:22:25,400 Speaker 1: and if those bets pay off, they have a very 375 00:22:25,440 --> 00:22:29,080 Speaker 1: specific vision of what the future could look like. That's 376 00:22:29,240 --> 00:22:45,119 Speaker 1: after the break. 
A few years back, a South African 377 00:22:45,200 --> 00:22:48,120 Speaker 1: lawyer wrote a brief article in an academic journal where 378 00:22:48,160 --> 00:22:51,200 Speaker 1: she was partially talking about the legal and moral implications 379 00:22:51,200 --> 00:22:54,800 Speaker 1: of gene editing, but she also makes a comparison between the 380 00:22:54,800 --> 00:22:58,560 Speaker 1: film Gattaca and the real life era of South African 381 00:22:58,560 --> 00:23:02,399 Speaker 1: apartheid that she experienced. And it makes sense because 382 00:23:02,480 --> 00:23:06,200 Speaker 1: the plot of Gattaca centers on a kind of genetic apartheid, 383 00:23:06,640 --> 00:23:10,119 Speaker 1: the divide between the valids and the invalids, and we, 384 00:23:10,320 --> 00:23:13,639 Speaker 1: the viewers, are supposed to identify with the invalid character 385 00:23:13,680 --> 00:23:17,119 Speaker 1: who's played by Ethan Hawke, because in this society, we 386 00:23:17,200 --> 00:23:20,720 Speaker 1: would also be invalid. In the film, this is a 387 00:23:20,760 --> 00:23:25,280 Speaker 1: future that we do not want. And yet one of 388 00:23:25,280 --> 00:23:29,040 Speaker 1: the investors in this industry, Coinbase CEO Brian Armstrong, tweeted 389 00:23:29,040 --> 00:23:31,919 Speaker 1: the other day, quote, the IVF clinic of the future 390 00:23:32,040 --> 00:23:36,359 Speaker 1: will combine a handful of technologies, parentheses, the Gattaca stack. 391 00:23:36,840 --> 00:23:39,880 Speaker 1: He goes on to describe this Gattaca stack as including 392 00:23:39,920 --> 00:23:43,200 Speaker 1: technologies that would allow for embryo editing, including for what 393 00:23:43,280 --> 00:23:47,600 Speaker 1: he calls enhancement, and also allow for creating artificial wombs. 394 00:23:47,960 --> 00:23:52,120 Speaker 1: He also says that quote it would start to accelerate evolution. 395 00:23:54,040 --> 00:23:55,720 Speaker 1: It seems like a meme at this point. 
But it's 396 00:23:55,760 --> 00:23:59,680 Speaker 1: like some people watch these techno dystopian movies and say, 397 00:24:00,359 --> 00:24:02,480 Speaker 1: that would be dope. We should do that, you know, 398 00:24:02,520 --> 00:24:05,560 Speaker 1: it'll be really cool. We should have Skynet, yeah, 399 00:24:05,680 --> 00:24:09,240 Speaker 1: and have it go live. Yeah, you're supposed to watch 400 00:24:09,640 --> 00:24:13,040 Speaker 1: Terminator and say, that would suck. I don't want that. 401 00:24:13,119 --> 00:24:15,840 Speaker 1: You're supposed to watch Gattaca and say, that would suck. 402 00:24:15,960 --> 00:24:18,920 Speaker 1: I don't want that. And somebody's watching it, somebody with 403 00:24:19,080 --> 00:24:20,440 Speaker 1: a lot of money's watching it and saying, you know, it 404 00:24:20,440 --> 00:24:23,159 Speaker 1: would be really cool if we did that thing that 405 00:24:23,160 --> 00:24:25,840 Speaker 1: that guy told me not to do. That we all 406 00:24:25,880 --> 00:24:28,040 Speaker 1: as a society agreed that that'd be a bad idea. 407 00:24:28,160 --> 00:24:29,160 Speaker 1: Let's do that thing. 408 00:24:30,040 --> 00:24:32,760 Speaker 2: Yeah. These are all supposed to be cautionary tales. 409 00:24:32,960 --> 00:24:34,679 Speaker 1: Yes, thirty years ago, these. 410 00:24:34,520 --> 00:24:40,480 Speaker 2: Were cautionary tales. Today they're aspirational tales, like, let's do that. 411 00:24:41,080 --> 00:24:43,160 Speaker 2: But I mean, if you think about it, there is 412 00:24:43,200 --> 00:24:46,720 Speaker 2: also, like, if I think of, like, biohackers. So people 413 00:24:46,720 --> 00:24:52,199 Speaker 2: who are doing peptides, GLP-1s, NAD, talking about how 414 00:24:52,200 --> 00:24:56,120 Speaker 2: they're gonna live for a very long time. 
The Gattaca 415 00:24:56,240 --> 00:24:59,119 Speaker 2: stack feels like it ticks a lot of those same boxes, 416 00:24:59,320 --> 00:25:02,879 Speaker 2: like, optimize your kid because you want your kid to 417 00:25:02,880 --> 00:25:05,159 Speaker 2: be as healthy as possible, the same way you're becoming 418 00:25:05,200 --> 00:25:10,480 Speaker 2: as healthy as possible by hacking your biological decline. So 419 00:25:11,040 --> 00:25:13,639 Speaker 2: it feels very, and I hate to use this phrase, 420 00:25:13,680 --> 00:25:16,600 Speaker 2: but on brand. But yeah, it is interesting how we've 421 00:25:16,640 --> 00:25:19,040 Speaker 2: totally gone to the upside down. Whereas we used to 422 00:25:19,080 --> 00:25:21,119 Speaker 2: be like, well, this would be terrible. We would all 423 00:25:21,160 --> 00:25:24,640 Speaker 2: be invalid and not given chances to fulfill our hopes 424 00:25:24,680 --> 00:25:28,280 Speaker 2: and desires and dreams in this world. And then now 425 00:25:28,359 --> 00:25:34,240 Speaker 2: we're like, let's do it. Well, let's turbocharge our embryos 426 00:25:34,320 --> 00:25:38,359 Speaker 2: and pump it up and have the just the dopest, 427 00:25:38,520 --> 00:25:42,560 Speaker 2: best baby we can possibly have. Right, But when I 428 00:25:42,640 --> 00:25:46,680 Speaker 2: rewatched Gattaca, I still was rooting for our invalid hero, 429 00:25:47,280 --> 00:25:50,560 Speaker 2: our invalid avenging angel Ethan Hawke. Like, I was like, 430 00:25:50,640 --> 00:25:53,359 Speaker 2: he's the hero. And when he and his, 431 00:25:53,640 --> 00:25:56,480 Speaker 2: like, superior, in finger quotes, brother go and they do 432 00:25:56,560 --> 00:25:59,560 Speaker 2: that swimming contest, the very dangerous one that they do, where 433 00:25:59,560 --> 00:26:02,159 Speaker 2: they go out in the ocean, in the storm or whatever. 
434 00:26:02,680 --> 00:26:04,840 Speaker 2: The brother who's supposed to be superior is struggling, like, 435 00:26:04,840 --> 00:26:07,160 Speaker 2: how did you beat me? And he says, I never 436 00:26:07,280 --> 00:26:11,960 Speaker 2: saved anything for the swim back. Legend, right. Like, I'm 437 00:26:12,040 --> 00:26:14,879 Speaker 2: rooting for him even now, but I 438 00:26:14,920 --> 00:26:16,920 Speaker 2: don't know, you know, I wasn't forged in the fires 439 00:26:16,960 --> 00:26:19,560 Speaker 2: of crypto and all of these other things that I 440 00:26:19,560 --> 00:26:24,120 Speaker 2: feel like, you know, so you think differently. I think differently. Yeah, 441 00:26:24,160 --> 00:26:27,200 Speaker 2: I'm rooting for Ethan Hawke. Some people are clearly rooting for 442 00:26:27,560 --> 00:26:29,200 Speaker 2: his superior brother. 443 00:26:30,040 --> 00:26:33,280 Speaker 1: You know, you don't really have to watch Gattaca to 444 00:26:33,400 --> 00:26:37,120 Speaker 1: imagine a two-tier society. You don't have to watch 445 00:26:37,160 --> 00:26:41,199 Speaker 1: Gattaca to imagine second class citizens. Functionally speaking, this is 446 00:26:41,280 --> 00:26:43,040 Speaker 1: kind of what we're talking about here. I mean, you 447 00:26:43,080 --> 00:26:45,040 Speaker 1: were saying, what was it, twelve thousand dollars? 448 00:26:44,960 --> 00:26:46,760 Speaker 2: Well, that's the low end, that would be the low end. 449 00:26:46,840 --> 00:26:49,119 Speaker 1: Yeah, on the low end. Yeah, twelve thousand dollars on the 450 00:26:49,200 --> 00:26:51,600 Speaker 1: low end, fifty k on the high end. I could 451 00:26:51,600 --> 00:26:54,960 Speaker 1: imagine it going upwards of that. 
A lot of people 452 00:26:54,960 --> 00:27:00,720 Speaker 1: can barely afford to have a child, period, and so 453 00:27:02,119 --> 00:27:04,439 Speaker 1: I mean, are we now just in a situation where 454 00:27:05,000 --> 00:27:10,360 Speaker 1: rich people with access to this kind of capital can 455 00:27:10,440 --> 00:27:13,800 Speaker 1: have kids that will further have more advantages in their life, 456 00:27:14,160 --> 00:27:15,600 Speaker 1: and everybody else gets left behind? 457 00:27:16,440 --> 00:27:21,040 Speaker 2: Huge issue. Yes, yes, exactly right. I mean IVF itself 458 00:27:21,119 --> 00:27:24,639 Speaker 2: is very, very expensive. So the barrier to entry to 459 00:27:24,720 --> 00:27:28,000 Speaker 2: even having a child if you deal with fertility issues 460 00:27:28,160 --> 00:27:31,840 Speaker 2: is already there. We're already there. This could definitely make 461 00:27:31,880 --> 00:27:32,600 Speaker 2: it a lot worse. 462 00:27:33,840 --> 00:27:37,359 Speaker 1: We hear the word designer baby thrown around a lot. 463 00:27:37,480 --> 00:27:39,720 Speaker 1: It seems like we're getting a little bit closer to 464 00:27:39,720 --> 00:27:43,200 Speaker 1: that as a reality. What does that word mean for you? 465 00:27:43,800 --> 00:27:48,280 Speaker 2: What designer baby means to me is a sort of fictitious 466 00:27:48,800 --> 00:27:54,800 Speaker 2: concept that will, I don't think, ever be a reality, 467 00:27:55,200 --> 00:27:59,720 Speaker 2: because I just don't think that, at that base level. 468 00:27:59,760 --> 00:28:01,679 Speaker 2: I think that what makes a human being has 469 00:28:01,840 --> 00:28:04,240 Speaker 2: so much more to do with how you grow up 470 00:28:04,600 --> 00:28:07,640 Speaker 2: and your experiences growing up. I just think it's 471 00:28:07,640 --> 00:28:11,040 Speaker 2: a fictitious concept. 
I think people are interested in trying 472 00:28:11,080 --> 00:28:13,440 Speaker 2: to make that concept as close to reality as possible. 473 00:28:13,840 --> 00:28:16,240 Speaker 2: Design your baby, but I don't think it gets you 474 00:28:17,200 --> 00:28:20,520 Speaker 2: to that outcome because there's a lot more that happens after 475 00:28:20,560 --> 00:28:22,840 Speaker 2: the embryo comes out of the lady. Like, you know, 476 00:28:22,960 --> 00:28:25,240 Speaker 2: one day it's in the womb, one day it's out. 477 00:28:25,240 --> 00:28:27,159 Speaker 2: And once it's out, it's like you still have to 478 00:28:27,160 --> 00:28:29,640 Speaker 2: get that baby to sleep and eat and put them 479 00:28:29,640 --> 00:28:31,600 Speaker 2: in preschool and do those things. You know, so it's 480 00:28:31,640 --> 00:28:35,520 Speaker 2: like it's an uphill slog regardless. And Hank Greely was saying, 481 00:28:35,680 --> 00:28:38,600 Speaker 2: you know, don't forget there are other things that we 482 00:28:38,760 --> 00:28:43,760 Speaker 2: know influence childhood outcomes, like reading to your kids, taking 483 00:28:43,760 --> 00:28:47,479 Speaker 2: them to the park, good nutrition, getting them vaccinated. Like, 484 00:28:47,800 --> 00:28:51,960 Speaker 2: those are studied and have a much more meaningful and 485 00:28:52,120 --> 00:28:56,480 Speaker 2: larger impact on how high you'll be able to vertically 486 00:28:56,600 --> 00:29:02,720 Speaker 2: jump than the genetic material that you're starting with. 487 00:29:03,400 --> 00:29:06,160 Speaker 1: Yeah, I mean to take a couple steps back and 488 00:29:06,240 --> 00:29:10,720 Speaker 1: think about the genetically superior child who may be born, 489 00:29:11,320 --> 00:29:14,560 Speaker 1: not just of, you know, embryo selection, but we're talking 490 00:29:14,560 --> 00:29:18,440 Speaker 1: about gene editing. Right, some kid has gotten their 491 00:29:18,520 --> 00:29:22,960 Speaker 1: genes edited. 
I mean to the nines. They're tall, they're handsome, 492 00:29:23,240 --> 00:29:27,240 Speaker 1: they're muscular, they're whatever, and then they're not so good 493 00:29:27,280 --> 00:29:31,160 Speaker 1: at the piano, not as good as mom and dad hoped. 494 00:29:32,280 --> 00:29:35,880 Speaker 1: Do mom and dad go complain about a defective product 495 00:29:35,920 --> 00:29:39,040 Speaker 1: and want a little bit of a refund? Like, how 496 00:29:39,040 --> 00:29:40,120 Speaker 1: does that even work? 497 00:29:40,640 --> 00:29:43,560 Speaker 2: I mean, I imagine that it turns into just torture 498 00:29:43,640 --> 00:29:46,600 Speaker 2: for the kid, like we spent all this money to 499 00:29:46,640 --> 00:29:51,280 Speaker 2: make you perfect, and what's wrong? You know? And I 500 00:29:51,400 --> 00:29:55,400 Speaker 2: did talk with ethicists who say that there's so much 501 00:29:55,480 --> 00:29:58,640 Speaker 2: danger in thinking of your child as being the best 502 00:29:58,920 --> 00:30:02,360 Speaker 2: from the get go, telling them that they're superior from birth, 503 00:30:03,000 --> 00:30:06,040 Speaker 2: that they're going to be able to do all these things. 504 00:30:06,400 --> 00:30:09,120 Speaker 2: And then what if they don't? What if they're a 505 00:30:09,160 --> 00:30:12,880 Speaker 2: little chubby or not really good at the violin, or 506 00:30:13,000 --> 00:30:16,480 Speaker 2: they don't really feel like playing music? Do you love them less? 507 00:30:17,000 --> 00:30:19,240 Speaker 2: Luckily this doesn't apply to me because I have children 508 00:30:19,240 --> 00:30:22,920 Speaker 2: and definitely not having any more. But you know, it's like, 509 00:30:22,960 --> 00:30:25,040 Speaker 2: would I do this? But then I think about what 510 00:30:25,080 --> 00:30:27,800 Speaker 2: I would want. Like, you want your kid to feel joy, 511 00:30:28,000 --> 00:30:31,320 Speaker 2: You want your kids to be kind and curious? Right? 
512 00:30:31,880 --> 00:30:35,640 Speaker 1: Yeah. I mean it doesn't sound like anybody is really 513 00:30:36,000 --> 00:30:39,520 Speaker 1: just aching to find the gene for kindness. I would 514 00:30:39,520 --> 00:30:41,880 Speaker 1: love that. Listen, if you wrote an article where you 515 00:30:41,960 --> 00:30:45,440 Speaker 1: told me, Dexter, listen, these people out here, they're trying 516 00:30:45,440 --> 00:30:49,360 Speaker 1: to make sure that they have the kindest, most empathetic kids, 517 00:30:49,400 --> 00:30:52,720 Speaker 1: I say, you know what, let me step back. Maybe 518 00:30:52,720 --> 00:30:55,160 Speaker 1: I'm being a hater here. But when I hear, you 519 00:30:55,200 --> 00:30:57,320 Speaker 1: know, you want your kid's hair color to be such 520 00:30:57,320 --> 00:30:58,880 Speaker 1: and such, you want them to be such and such a height, 521 00:30:59,080 --> 00:31:02,600 Speaker 1: they're gonna be good at such and such sport, I'm feeling 522 00:31:02,680 --> 00:31:06,280 Speaker 1: like I'm back in Gattaca again. You know what, matter 523 00:31:06,320 --> 00:31:09,400 Speaker 1: of fact, let's go back to Gattaca again. So early 524 00:31:09,480 --> 00:31:11,640 Speaker 1: in the movie, when the main character is born, the 525 00:31:11,720 --> 00:31:14,520 Speaker 1: doctors scan him and tell the parents that, hey, your 526 00:31:14,600 --> 00:31:17,120 Speaker 1: kid has a potential heart condition, and also he's going 527 00:31:17,200 --> 00:31:19,480 Speaker 1: to be a little nearsighted, he'll probably need glasses, 528 00:31:19,920 --> 00:31:22,720 Speaker 1: and so this child is now an invalid. And then we 529 00:31:22,760 --> 00:31:25,880 Speaker 1: see this montage of the kid being discriminated against, like 530 00:31:25,960 --> 00:31:28,680 Speaker 1: the preschool won't even let him in the door. 
So 531 00:31:28,760 --> 00:31:31,440 Speaker 1: the parents decide to have another kid, and they're going 532 00:31:31,520 --> 00:31:34,640 Speaker 1: to make sure that this kid is going to be valid. 533 00:31:35,200 --> 00:31:37,040 Speaker 1: So they meet with the doctor who talks them through 534 00:31:37,040 --> 00:31:39,040 Speaker 1: what kind of child they're going to get. 535 00:31:39,320 --> 00:31:43,560 Speaker 6: You have specified hazel eyes, dark hair, and fair skin. 536 00:31:45,400 --> 00:31:48,000 Speaker 6: I have taken the liberty of eradicating any potentially prejudicial 537 00:31:48,000 --> 00:31:53,040 Speaker 6: conditions: premature baldness, myopia, alcoholism and addictive susceptibility, 538 00:31:53,480 --> 00:31:55,560 Speaker 6: propensity for violence, obesity, et cetera. 539 00:31:55,640 --> 00:32:00,640 Speaker 7: We didn't want, I mean, diseases, yes, but we were just wondering 540 00:32:00,840 --> 00:32:03,280 Speaker 7: if it's good to just leave a few things to 541 00:32:03,520 --> 00:32:06,480 Speaker 7: chance. You want to give your child the best possible 542 00:32:06,560 --> 00:32:10,880 Speaker 7: start. Believe me, we have enough imperfection built in already. Your 543 00:32:10,960 --> 00:32:14,600 Speaker 7: child doesn't need any additional burdens. And keep in mind, this 544 00:32:14,720 --> 00:32:17,800 Speaker 7: child is still you, simply the best 545 00:32:17,520 --> 00:32:23,880 Speaker 1: of you. Still you, simply the best of you. And 546 00:32:23,960 --> 00:32:27,000 Speaker 1: of course, as the viewer, you're supposed to think this 547 00:32:27,720 --> 00:32:31,200 Speaker 1: is terrible, and yet here we are. 548 00:32:33,120 --> 00:32:37,560 Speaker 2: You know, Barry Behr, who made a lot of advancements 549 00:32:37,560 --> 00:32:40,719 Speaker 2: in IVF technology. 
He was like, it used to be 550 00:32:41,000 --> 00:32:44,520 Speaker 2: horrible to show your collarbones and your knees, and now 551 00:32:44,600 --> 00:32:47,640 Speaker 2: people are wearing thongs at the beach, you know. And 552 00:32:47,680 --> 00:32:50,040 Speaker 2: he was like, that line moves with time, and we 553 00:32:50,120 --> 00:32:51,440 Speaker 2: eventually get more 554 00:32:51,280 --> 00:32:56,160 Speaker 1: comfortable with it. Or may we all continue to be invalid? 555 00:32:56,640 --> 00:32:59,880 Speaker 2: Invalid heroes unite. We are legion at this point. 556 00:33:05,520 --> 00:33:08,160 Speaker 1: Thank you so much for listening to kill Switch. If 557 00:33:08,160 --> 00:33:10,320 Speaker 1: you want to talk, you can email us at kill 558 00:33:10,360 --> 00:33:14,440 Speaker 1: Switch at Kaleidoscope dot NYC or on Instagram. We're at 559 00:33:14,600 --> 00:33:16,920 Speaker 1: kill Switch Pod. And if you dug the show and 560 00:33:17,040 --> 00:33:20,120 Speaker 1: want to do a favor for some fellow random, invalid stranger, 561 00:33:20,480 --> 00:33:22,480 Speaker 1: get that phone back out of your pocket and leave us 562 00:33:22,480 --> 00:33:25,400 Speaker 1: a review. Seriously, it helps other people find the show, 563 00:33:25,520 --> 00:33:27,360 Speaker 1: and you can think of that as your good deed 564 00:33:27,360 --> 00:33:29,600 Speaker 1: for the day. Also, did you know that kill Switch 565 00:33:29,720 --> 00:33:32,720 Speaker 1: is on YouTube? Watching the show is a whole different experience, 566 00:33:32,840 --> 00:33:35,080 Speaker 1: So if you're into that, the link for that and 567 00:33:35,160 --> 00:33:38,080 Speaker 1: everything else is in the show notes. Kill Switch is 568 00:33:38,120 --> 00:33:41,360 Speaker 1: hosted by me, Dexter Thomas. It's produced by Shena Ozaki, 569 00:33:41,600 --> 00:33:44,960 Speaker 1: Darluk Potts, and Julia Nutter. 
Our theme song is by 570 00:33:45,000 --> 00:33:47,680 Speaker 1: me and Kyle Murdoch, and Kyle also mixed the show. 571 00:33:48,120 --> 00:33:52,200 Speaker 1: From Kaleidoscope, our executive producers are Oz Woloshyn, Mangesh Hattikudur, 572 00:33:52,560 --> 00:33:56,280 Speaker 1: and Kate Osborne. From iHeart, our executive producers are Katrina 573 00:33:56,360 --> 00:33:57,600 Speaker 1: Norvell and Nikki 574 00:33:57,760 --> 00:33:57,960 Speaker 4: Ettore. 575 00:33:58,560 --> 00:34:10,560 Speaker 1: Catch you on the next one. Bye.