Speaker 1: Pushkin. Americans do this really weird thing. Expectant parents hold gender reveal parties. Now to the mystery explosion that rocked a southern New Hampshire town. They often feature explosions with smoke that's either pink or blue. An alarming number of these stunts have gone awry. Turns out that blast came from an over-the-top gender reveal party, a couple apparently using explosives to announce the sex of their baby. People have been killed, houses have burned to the ground, even forests. We begin tonight with new video released from the US Forest Service showing the moment a gender reveal started the forty-seven-thousand-acre Sawmill Fire.

Elon Musk and his former girlfriend, the musician Grimes, didn't hold a gender reveal party. In a way, they did the opposite. Last year on Twitter, they announced the birth of their baby, who was named X. Grimes said that the baby would be raised without a gender. Like most things involving Elon Musk, this move looks like it has its origins in science fiction. Once upon a time, a baby named X was born.
The Story of Baby X was published in nineteen seventy two in Ms., a feminist magazine, during its very first year, at the height of the women's liberation movement. This baby was named X so that nobody could tell whether it was a boy or a girl. Its parents could tell, of course, but they couldn't tell anybody else. They couldn't even tell Baby X, at least not until much, much later. You see, it was all part of a very important secret scientific experiment known officially as Project Baby X.

Baby X began as a feminist thought experiment. How did it come to be the name of Elon Musk's youngest child? In a broader sense, what's the place of ideas about families in Silicon Valley futurism? And are there other ideas about families that maybe ought to have a place in any vision of the future?

Welcome to The Evening Rocket, a special report. I'm Jill Lepore. I'm a historian, a professor at Harvard, and for a long time I've been studying the relationship between technological and political change. This series, I'm exploring a new kind of capitalism.
Call it Muskism: extravagant, extreme capitalism, extraterrestrial capitalism, where stock prices for projects from Tesla and SpaceX to cryptocurrencies and neural implants can be driven by fantasies that come from science fiction. I'm fascinated by science fiction, even by comic books. I once wrote a whole book about the political history of Wonder Woman. The science fiction men like Elon Musk and Jeff Bezos adore generally concerns gleaming futures in which fantastically powerful and often immensely rich men colonize other planets. This episode, which is called Baby X, I want to take a look at the science fiction that's usually left out of that vision: Afrofuturism, feminist science fiction, postcolonial science fiction, including the story of Baby X.

The smartest scientists had set up this experiment at a cost of exactly twenty-three billion dollars and seventy-two cents. This might seem like a lot for one baby, even if it was an important, secret, scientific experimental baby.

This sort of science fiction generally involves both ideas about gender and sexuality and actual people who are not men, and children.
Babies, even, I think, can help explain the domestic politics of extreme capitalism. So blast off back to the beginning of this century. No blue smoke, no pink smoke.

Elon Musk met his first wife, the Canadian writer Justine Wilson, in college. They married in the year two thousand and had a baby who died tragically, and then triplets and twins. After the marriage ended, Wilson wrote an essay called I Was a Starter Wife for Marie Claire, a women's magazine, about how, weeks after Musk filed for divorce, he texted her, she wrote, to say he was engaged to a gorgeous British actress in her early twenties. Musk and that actress married, divorced, remarried, and then divorced again in twenty sixteen.

In twenty eighteen, Musk met Claire Boucher, an innovative Canadian-born musician known as Grimes. She had studied neuroscience at McGill. Like Musk, Grimes is an avid science fiction fan. Her first album was a tribute to Dune. The New Yorker once called her a mad pop scientist.
She's also a feminist, and she's offered a fierce indictment of the music industry, where, she said, women feel pressure to act like strippers and it's okay to make rape threats. Being Musk's girlfriend and doing things like defending him against charges that he prevented Tesla workers from unionizing annoyed a lot of her fans. She's been attacked with a particular venom reserved for female artists and writers. Grimes has got a sophisticated interest in gender and voice.

Hey everybody, this is Grimes, and I'm very excited to be here kicking off my brand new six-month residency for BBC Radio One.

After she got pregnant, she hosted a radio show, and the theme is Sci-Fi Baby, or weird science fiction and electronic music for babies.

Now, this song might be a little hard for some babies, but some babies might really like it. This is definitely a bit of a hard song though, so I guess, you know, see how your baby feels, and if they don't like techno, then don't play them this song. Cool, sweet.
A few months later, announcing their baby's birth, Musk said it was a boy, but Grimes declined to mention its gender and tweeted, I don't want to gender them in case that's not how they feel in their life. This is called gender-neutral parenting. It's had some recent uptake among people, including celebrities, who support the cause of trans rights and who believe children should get the opportunity to decide their own gender identity. In twenty eleven, when Grimes was at McGill, there was a lot of coverage of a family in Canada.

Well, it's a couple in Toronto that is creating quite a stir right now because they're raising their baby what they're calling gender free.

This all seems very twenty-first century, born out of the heated contemporary culture war over trans rights, but it's also very nineteen seventies and second-wave feminist.

"It's an X" was absolutely all they would tell anyone, and that made the friends and relatives very angry.
The Story of Baby X, from nineteen seventy two, was written by Lois Gould, a novelist and mother of two boys who was also an editor of Ladies' Home Journal and a columnist for The New York Times, where she wrote the Hers column. At the time Gould was writing, a lot of feminists had been arguing that kids should be able to wear whatever clothes they want and play with whatever toys they want, not just pants and trucks for boys and dresses and dolls for girls.

So they bought plenty of sturdy blue pajamas in the boys' department and cheerful flowered underwear in the girls' department, and they bought all kinds of toys. The head scientists of Project Baby X checked all their purchases and told them to keep up the good work.

In nineteen seventy five, Baby X, the feminist fable, led to an actual scientific experiment whose results were published in a journal article that was also called Baby X. Although the story was science fiction fantasy, the question of how adults would actually respond to a child appeared to merit investigation.
Forty-two volunteers, mostly graduate students at the City University of New York, were put in a lab with a baby under different conditions. Those in the male and female conditions were told that there was a three-month-old baby boy or baby girl to play with, while those in the neutral condition were told that there was a three-month-old baby, with no mention of its sex or name. Unsurprisingly, the volunteers interacted differently with the baby depending on whether they'd been told it was a boy or a girl, or just a baby.

But this sort of experiment has other origins too, especially in the work of one of the most influential science fiction writers of the last century, Ursula K. Le Guin, who on BBC Radio Four introduced herself this way:

I am a man. Now you may think I've made some kind of silly mistake about gender, or maybe that I'm trying to fool you, because my first name ends in A, and I own three bras, and I've been pregnant five times. When I was born, there actually were only men. People were men. They all had one pronoun.
His pronoun. I'm the generic he, as in, if anybody needs an abortion, he will have to go to another state.

Le Guin was born in California in nineteen twenty nine. In the nineteen fifties, she was studying for a PhD in Paris when she fell in love and got married. By nineteen sixty four, she had three children. Her breakout book, The Left Hand of Darkness, was published in nineteen sixty nine. It's about a planet whose inhabitants have no fixed gender.

We're neither man nor woman, except once every moon, when we're in kemmer, and then we're either.

Le Guin once wrote an essay, a riff on an essay by Virginia Woolf about how the subject of all novels is human nature, the ordinary, humble, flawed person. Woolf called her Mrs. Brown. Le Guin thought science fiction had lost track of Mrs. Brown and seemed to be trapped for good inside our great gleaming spaceships hurtling out across the galaxy, ships capable of containing heroic captains in black and silver uniforms, ships capable of blasting other inimical ships into smithereens with their apocalyptic, holocaustic ray guns, and of bringing loads of colonists from Earth to unknown worlds.
Ships capable of anything, absolutely anything, except one thing. They cannot contain Mrs. Brown. And that's my worry too, the worry that, notwithstanding a baby named X, the future envisioned by Muskism, the future being built in Silicon Valley, doesn't contain Mrs. Brown either.

In two thousand and eight, Vandana Singh published a short story called The Woman Who Thought She Was a Planet. It begins this way. Ramnath Mishra's life changed forever one morning when, during his perusal of the newspaper on the verandah, a ritual that he had observed for the last forty years, his wife set down her cup of tea with a crash and announced, I know at last what I am. I am a planet.

Vandana Singh is both a science fiction writer and a professor of theoretical physics. Her most recent book is called Ambiguity Machines. She grew up in India listening to her grandmother tell stories and reading Isaac Asimov.

I remember when I was a kid reading the Foundation series and being so thrilled with them, and then rereading them as an adult and being utterly horrified that I had been thrilled with them.
What bothered me about it was this entrenched notion that technology will fix everything. The other thing I notice is the complete lack of any kind of environmental awareness, which of course goes along with techno-fetishism. So we have an entire planet, Trantor, which is an entire city, and that's just so dumb, because, like, how can you have oxygen and climate and so on and so forth if you have a planet that is completely urbanized? I mean, that makes no sense. But the other aspect of it that troubles me is, of course, there are no intelligent women out there in that series. I think there's one example of an intelligent woman, who turns out to be a robot.

Singh's stories, like The Woman Who Thought She Was a Planet, are all about Mrs. Brown. So I'm struck by the domesticity in your stories: the homes, the furnishings, the family relationships, aunts and nieces and cousins and wives, and writing desks and bedspreads. Yeah.
Yeah, I think that the domesticity aspect is important to me, because one of the things I've learned from science, from physics in particular, is that there's nothing that's really ordinary. The most mundane things around us are actually pathways to thinking about the larger cosmos. Even our sensation of weight: that's the pull of gravity, and then if you go deeper into that, that's the force that is responsible for the large-scale structure of matter. And then that leads me to black holes. So if I'm pondering moving a heavy soup pot from the stove to the counter, I'm thinking gravity, and I'm suddenly thinking about black holes.

Singh's greatest influence was Le Guin. It just knocked her out.

I realized that my earlier disenchantment with science fiction had been in part because it was so white and male and Western and capitalistic and colonialist, and therefore it had left out and erased entire societies, cultures, entire genders, and other ways of being and thinking and relating to the cosmos. So it was as though Ursula Le Guin was telling me that, hey, science fiction is your country too.
Le Guin made a lasting contribution to the field itself, for many, many people, not just me, because, among other things, she got us away from this boys-with-toys adolescent obsession purely with technology that science fiction had in its so-called golden age.

Starting in the nineteen seventies, Le Guin upended science fiction. But the science fiction that Elon Musk and Jeff Bezos cite, the science fiction they read as boys, drops off, just ends, right before science fiction was reinvented by women and writers of color: Octavia Butler, Margaret Atwood, Ted Chiang. To me as a historian, Musk and Bezos's vision of the future isn't futuristic at all. It's antique. It's ancient. I asked Singh how she understands their attachment to stories written in the nineteen fifties and even earlier. She said she'd come around to thinking that Silicon Valley techno-billionaires suffer from paradigm blindness.

Because we live in such unequal societies, and because white, male, super-rich people have a disproportionate amount of power, they tend to keep this paradigm alive, because it suits them.
Paradigm blindness is a deficit of imagination, a culture's inability to imagine that other people really just don't subscribe to its view of the world. Singh thinks stories can cure that blindness.

Stories are one way, not the only way, of course, but one way of changing the underlying narrative of the paradigm in which we are immersed.

All of us suffer from blindness of one sort or another. What's different about Silicon Valley billionaires who are trapped in a cultural paradigm, though, is that they have enough money and enough power to build that paradigm, and then the rest of us are trapped in the world they're building, as if we're subjects of their experiments.

That's Grimes singing about artificial intelligence. Grimes and Musk are both storytellers. Their baby X was born in May twenty twenty. Days after Grimes gave birth, Musk appeared on The Joe Rogan Experience, where the two men talked about how much they love babies, and then the conversation took an interesting turn.

Babies are awesome. They are pretty awesome. They're awesome. Yeah. I think of them like these little love packages. Yeah, little love bugs.
I mean, also, I've spent a lot of time on AI and neural nets, and so you can sort of see the brain develop. You know, a neural net is trying to simulate what a brain does, basically, and you can sort of see the learning very quickly. It's just, wow. You're talking about the neural net. You're not talking about an actual baby. I don't know about an actual baby, but both of them. Yeah.

I find this completely fascinating, the relationship between the way a baby learns and the way a computer learns. This idea, as it happens, also goes back to the book that can fairly be considered the founding work of science fiction, published more than two centuries ago: Mary Shelley's Frankenstein, which I think of as a kind of Baby X story too, about the creation of artificial life and an artificial intelligence. Frankenstein is the story of a terrible father, a scientist who, as an experiment, makes a child and then abandons him.
Mary Shelley was the daughter of Mary Wollstonecraft, a founder of modern feminism, and Shelley was a founder of the feminist critique of scientific arrogance.

Fundamental to what we're doing is a research project called Baby X.

Today, Baby X is being used as the name of an experiment in artificial intelligence run by a company called Soul Machines, based in San Francisco but with an R&D arm in New Zealand. They say they're trying to build digital people, starting with a baby.

This is Baby X. So she's basically an autonomously animated virtual infant, all of her behaviors generated on the fly by neural networks running live, and so she's seeing me and listening to me, and starting to get upset as I'm not paying attention to her, so I need to calm her down.

This Baby X, an AI experiment, is a baby girl, which is not surprising, because AI is incredibly gendered. An AI doesn't need a gender. She could have been a gray box.
In the twenty fourteen film Ex Machina, written and directed by Alex Garland, a Silicon Valley entrepreneur invents an AI. A visitor asks the AI's creator why he's made her female.

Sexuality is fun, man. If you're going to exist, why not enjoy it? In between her legs is an opening with a concentration of sensors. You engage them in the right way, creates a pleasure response, and she'd enjoy it.

These are modern Frankenstein monsters: AI as a superintelligent baby, AI as a sex toy. Ex Machina is an update of older stories, all haunted by the fear of rebellion, Frankenstein, say, or Isaac Asimov's story Robot Dreams, in which a robot dreams of liberation. Within the world of Muskism, for all the fascination with artificial intelligence, there's a profound terror of it. Here's Musk on the subject at a conference at MIT.

I mean, with artificial intelligence, we are summoning the demon. And I take it there will be no HAL 9000 going up to Mars? HAL 9000 would be easy. It's way more complex than that. I mean, it would put HAL 9000 to shame.
It would be like a puppy dog, for sure.

There's a lot to worry about with artificial intelligence beyond a rogue operating system like two thousand and one's HAL 9000. There's lots that's already happening: facial recognition, predictive policing, AI-driven mortgage evaluations, and criminal court sentencing guidelines. And there's plenty to worry about with things that haven't happened yet but look likely as the pace of machine learning increases. Still, I also think there's something deeper and broader going on here culturally in the terror of AI. I think a lot of that fear of an emerging superintelligence is, at heart, the fear of people on top being toppled by people on the bottom, a terror, that is, of historically powerless people gaining power.

Ursula K. Le Guin: the K is for Kroeber. She was the daughter of Alfred Kroeber, a professor of anthropology at Berkeley. It was a university professor's family in a university town in the nineteen thirties and forties, when there were a lot of refugees from Europe.
324 00:21:19,876 --> 00:21:23,836 Speaker 1: So I probably knew more foreigners and more Indians than 325 00:21:23,916 --> 00:21:28,076 Speaker 1: most middle class white American children do, and more people 326 00:21:28,116 --> 00:21:31,356 Speaker 1: who came and visited from unusual places, the South Seas 327 00:21:31,436 --> 00:21:34,036 Speaker 1: or up in the Arctic and so on, because they'd 328 00:21:34,076 --> 00:21:38,036 Speaker 1: been doing field work there. Both of her parents studied 329 00:21:38,116 --> 00:21:41,436 Speaker 1: Native American languages and culture. Her mother wrote a book 330 00:21:41,476 --> 00:21:44,396 Speaker 1: called Ishi in Two Worlds, the story of a man 331 00:21:44,476 --> 00:21:47,596 Speaker 1: her father called Ishi, a man they believed to be 332 00:21:47,636 --> 00:21:52,676 Speaker 1: the last of the Yahi people. In nineteen eleven, Ishi 333 00:21:52,756 --> 00:21:55,556 Speaker 1: emerged out of the woods. A local sheriff took him 334 00:21:55,556 --> 00:21:58,356 Speaker 1: to jail, and Alfred Kroeber took him from there to 335 00:21:58,436 --> 00:22:01,956 Speaker 1: the UC Berkeley Anthropology Museum, where Ishi worked as a 336 00:22:02,036 --> 00:22:05,956 Speaker 1: janitor and also performed as a kind of museum exhibit. 337 00:22:06,596 --> 00:22:18,196 Speaker 1: Kroeber recorded his voice on wax cylinders. Growing up under 338 00:22:18,236 --> 00:22:21,716 Speaker 1: the shadow of all this powerfully influenced Le Guin. If the 339 00:22:21,756 --> 00:22:24,316 Speaker 1: so called Golden Age of science fiction is told from 340 00:22:24,316 --> 00:22:27,836 Speaker 1: the vantage of the colonizers, Le Guin, in novels like The 341 00:22:27,916 --> 00:22:31,596 Speaker 1: Dispossessed, tried to turn it into the story of the colonized.
342 00:22:32,236 --> 00:22:34,636 Speaker 1: You might say, then, that people who worry about AI 343 00:22:34,836 --> 00:22:38,796 Speaker 1: as an existential risk are trapped in the paradigm of colonialism. 344 00:22:39,436 --> 00:22:45,516 Speaker 1: Is there an escape? Aloha mai kakou, 'o Noelani Arista. I'm Doctor Noelani Arista, 345 00:22:45,716 --> 00:22:49,756 Speaker 1: Chair of Indigenous Studies at McGill University. Arista is part 346 00:22:49,756 --> 00:22:54,916 Speaker 1: of a collaborative project called Indigenous AI. Indigenous peoples have 347 00:22:55,076 --> 00:22:59,756 Speaker 1: been on the other side of colonialisms and imperialisms and 348 00:23:00,036 --> 00:23:05,676 Speaker 1: processes that have worked to dehumanize our people for so 349 00:23:05,876 --> 00:23:11,636 Speaker 1: long that we are concerned about how people are approaching 350 00:23:11,756 --> 00:23:19,796 Speaker 1: AI without these sensibilities of humanizing or imagining relationality. One 351 00:23:19,836 --> 00:23:22,996 Speaker 1: of Arista's arguments is that if you create AI blind 352 00:23:23,356 --> 00:23:26,596 Speaker 1: to the cultural paradigm of its origins, you get 353 00:23:26,636 --> 00:23:30,396 Speaker 1: AI as slaves, which turns us, the people using 354 00:23:30,396 --> 00:23:35,196 Speaker 1: that stuff, into enslavers. So when I'm talking to Alexa, 355 00:23:35,276 --> 00:23:39,396 Speaker 1: I could start to just normalize barking orders at an 356 00:23:39,436 --> 00:23:44,836 Speaker 1: inanimate object: Hey Alexa, do X. And when I find 357 00:23:44,876 --> 00:23:48,876 Speaker 1: myself doing that, I find that it's training my behavior. 358 00:23:49,596 --> 00:23:52,596 Speaker 1: Maybe the person I'm becoming when I'm barking orders at an 359 00:23:52,716 --> 00:23:56,436 Speaker 1: inanimate thing is not making me into the best human being.
360 00:23:57,396 --> 00:24:01,756 Speaker 1: For many technologists, stories like Frankenstein serve as parables about AI, 361 00:24:02,396 --> 00:24:06,836 Speaker 1: but for Arista, those are parables about fears of native uprisings. 362 00:24:07,356 --> 00:24:11,236 Speaker 1: And after all, Mary Shelley was herself an anti imperialist. She, 363 00:24:11,436 --> 00:24:15,236 Speaker 1: for instance, boycotted sugar in protest of British slave plantations 364 00:24:15,236 --> 00:24:19,076 Speaker 1: in the Caribbean, and literary scholars often read Frankenstein as 365 00:24:19,116 --> 00:24:22,476 Speaker 1: an indictment of the British Empire's relationship to people it 366 00:24:22,596 --> 00:24:26,916 Speaker 1: decides are monsters out of fear of them. These natives 367 00:24:26,996 --> 00:24:29,116 Speaker 1: are gonna be smarter than us, they're going to know 368 00:24:29,236 --> 00:24:32,396 Speaker 1: more than us. The many different Native people working on 369 00:24:32,436 --> 00:24:37,156 Speaker 1: the Indigenous AI project offer an alternative, an indigenous paradigm 370 00:24:37,396 --> 00:24:40,876 Speaker 1: for thinking about the relationship between humans and non humans. 371 00:24:41,676 --> 00:24:47,476 Speaker 1: It's a paradigm about relationships in which AI are kin relations, 372 00:24:47,516 --> 00:24:50,796 Speaker 1: the way that within many indigenous cultures all things are 373 00:24:50,876 --> 00:24:55,796 Speaker 1: kin: rocks, the sky, trees, family, not things to be 374 00:24:55,836 --> 00:25:00,316 Speaker 1: turned into commodities, their wealth or labor extracted. What would 375 00:25:00,356 --> 00:25:03,756 Speaker 1: it mean to reject the domestic politics of Muskism and 376 00:25:03,876 --> 00:25:07,596 Speaker 1: borrow from this worldview? What if, instead of Frankenstein, 377 00:25:08,076 --> 00:25:12,156 Speaker 1: futurists adopted a different origin story?
I use the story 378 00:25:12,196 --> 00:25:16,516 Speaker 1: of Haloa, the child of Ho'ohokukalani and Wakea, 379 00:25:16,636 --> 00:25:20,356 Speaker 1: the Sky Father. They have a child. The first child 380 00:25:20,436 --> 00:25:23,236 Speaker 1: is born stillborn. It's planted into the earth, and from 381 00:25:23,276 --> 00:25:26,516 Speaker 1: that child is born the taro, the kalo plant that 382 00:25:26,556 --> 00:25:29,556 Speaker 1: we subsist on as a people. Right. The second child 383 00:25:29,636 --> 00:25:33,556 Speaker 1: born of that union is named Haloa after his brother. 384 00:25:34,236 --> 00:25:39,876 Speaker 1: Haloa in Hawaiian means long breath, and the 'oha, or 385 00:25:39,996 --> 00:25:42,796 Speaker 1: corm, that grows off of the root of the plant 386 00:25:43,356 --> 00:25:46,916 Speaker 1: becomes the word for 'ohana, or family. So the 387 00:25:46,996 --> 00:25:51,156 Speaker 1: story itself is that the second child, the human, cares 388 00:25:51,156 --> 00:25:54,676 Speaker 1: for the first, the brother who's the plant, and ensures 389 00:25:54,716 --> 00:25:58,516 Speaker 1: the life of generations to come. The Haloa, the long breath, 390 00:25:58,996 --> 00:26:04,316 Speaker 1: the life of the people. That story, about reciprocal mutual 391 00:26:04,356 --> 00:26:07,996 Speaker 1: respect and relationship and care, is at the center of 392 00:26:08,036 --> 00:26:12,476 Speaker 1: a lot of the protocols that we approach AI with. 393 00:26:13,716 --> 00:26:22,756 Speaker 1: Do you appreciate? What do you appreciate? Power? Do you 394 00:26:22,836 --> 00:26:27,876 Speaker 1: appreciate? To me, this is the truly revolutionary idea, not 395 00:26:27,996 --> 00:26:35,396 Speaker 1: appreciating power or predicting a robot uprising.
The truly revolutionary, 396 00:26:35,516 --> 00:26:39,996 Speaker 1: disruptively innovative idea is to greet the whole world, even 397 00:26:40,036 --> 00:26:44,156 Speaker 1: your AI driven machines, as members of your family, your kin, 398 00:26:44,556 --> 00:26:50,356 Speaker 1: your child, not X, the unknown, but the known, the beloved. 399 00:26:54,996 --> 00:26:58,276 Speaker 1: Next time, in our final installment, The Evening Rocket blasts 400 00:26:58,316 --> 00:27:01,116 Speaker 1: to the past for the last time with a look 401 00:27:01,116 --> 00:27:17,476 Speaker 1: at what Muskism is doing to money. The Evening 402 00:27:17,556 --> 00:27:21,196 Speaker 1: Rocket was written and read by me, Jill Lepore. For the BBC, 403 00:27:21,516 --> 00:27:24,516 Speaker 1: The Evening Rocket was produced by Viv Jones. Oliver Riskin-Kutz 404 00:27:24,556 --> 00:27:27,596 Speaker 1: was the researcher. The editor was Hugh Levinson. The 405 00:27:27,636 --> 00:27:31,396 Speaker 1: commissioning editor was Dan Clark. Iona Hammond was production coordinator. 406 00:27:31,836 --> 00:27:34,476 Speaker 1: Mixing by Graham Puddifoot, and original music by 407 00:27:34,516 --> 00:27:37,556 Speaker 1: Corntuth. For Pushkin, it was produced by Sophie Crane 408 00:27:37,596 --> 00:27:40,436 Speaker 1: McKibbin and Jake Gorski, who also did the mix and 409 00:27:40,516 --> 00:27:44,756 Speaker 1: sound design. Production support from Ben Naddaff-Hafrey. Our executive producer 410 00:27:44,796 --> 00:27:49,316 Speaker 1: is Mia Lobel. Our operations team includes Danielle Lakhan, Maya Koenig, 411 00:27:49,356 --> 00:27:53,036 Speaker 1: and Carly Migliori. Thanks also to Jon Schnaars, Jacob Weisberg, 412 00:27:53,276 --> 00:27:56,796 Speaker 1: Maggie Taylor, Heather Fain, Nicole Morano, and Eric Sandler.