Speaker 1: How is science fiction like a cultural research and development lab? Will we someday have AI agents that live inside robot bodies? And will we be liable if they commit murder? What happens when reality itself is no longer verifiable? How do we create AI agents that guide us towards survival and self-actualization, not just profit or distraction? And what is the Young Lady's Illustrated Primer? This week we talk with researcher Bethany Maples about science fiction and how it might prepare us to wrestle with the deepest human questions about AI and identity and the future of humanity.

Welcome to Inner Cosmos with me, David Eagleman. I'm a neuroscientist and author at Stanford, and in these episodes we sail deeply into our three-pound universe to understand the hidden forces that shape our lives. Today, we're going to talk about science fiction.

So imagine this. It's nineteen sixty six and you're watching Star Trek on the television, one of those cathode ray tube televisions with a bubble screen. Anyway, what you're looking at is Captain Kirk and Spock, and they're interacting with a cool-looking machine. It's a thin slab of glass and metal that can display anything, including a person at a distance talking to a camera. It's like a telephone call, but you can actually see the other person. Now, for the audience in nineteen sixty six, this is science fiction. It's pure fantasy. But for a generation of engineers and inventors who are just growing up and tuning in, it was a blueprint. Fast forward a few decades and we all have tablets in our hands every day, and even smaller tablets called smartphones, and we never stop to think how fantastical this would have seemed to our great-grandparents. So that's the thing I want to explore today, because quite commonly science fiction anticipates the future, or more specifically, it lays down the tracks for it.
Now, if you're a regular listener, you know that I am a lover of literature more generally, because literature expands how we see ourselves, but science fiction in particular often prototypes futures. We often think about sci-fi as escapism or entertainment, but when we look more closely, we can sometimes see it as a testing ground for society's biggest hopes and deepest fears. It's the stage where we get to rehearse what might be, and sometimes we end up waking up inside worlds that novelists and filmmakers imagined a long time before. For example, today we'll talk about Neal Stephenson's The Diamond Age, which features an intelligent self-writing book that can raise a child, or we'll talk about Spike Jonze's movie Her, where a lonely man falls in love with his AI operating system. So science fiction gives us a lens into the future, and whether or not it's a reliable oracle, which it sometimes is and sometimes isn't, it always lets us ask: what if? What if your AI companion wasn't just a distraction but an advocate for your very best self? What if technology didn't just change our tools, but changed what it means to be human? It allows us to stretch our brain's internal models.

So that brings me to my guest today. My colleague Bethany Maples runs an education company called Atypical AI. I first met her when she was a researcher in education at Stanford, where she studied the rise of personalized AI agents, like AI tutors or learning companions or lovers, and how these things are changing us. She recently published a paper in Nature on loneliness and suicide mitigation using GPT chatbots. In other words, can having an AI friend actually save a life? The answer was yes. Check that out in episode ninety eight. Now, when Bethany first started talking about a world where a billion people might form indispensable relationships with AI agents, that certainly seemed like far-off science fiction. But it's not. It's actually the present moment. And that's exactly what science fiction does.
It normalizes the impossible so that when it finally arrives, we recognize it. So I got Bethany here again today because she is a giant fan of science fiction. She and I both feel that science fiction is where societies write down their deepest dreams and hopes, like our longing for immortality or companionship, or invisibility or mind reading or flight. These are all very ancient human fantasies. They show up in myths and in fairy tales. Then they get retold in futuristic novels, and sooner or later they inspire the labs and the startups that try to make this stuff real. So today we pick up that thread, because if you zoom out, science fiction has always been less about ray guns and aliens and more about testing the questions of who we are and who we might become. Here's my conversation with Bethany Maples.

You're a fan of science fiction. Tell us about how you think about science fiction and how it can be used in our current world, for the planning we're doing now.

Speaker 2: You know, science fiction catches a lot of flack.

Speaker 3: People don't think it's a deep medium; they think it's just light entertainment, and that it's trashy. But I deeply believe that science fiction tells us about the deepest dreams and hopes of society. So operating with that, let's think about what science fiction tells us, and let's look at history and see how much science fiction has guided and predicted and created the technological magic that we see today.

Speaker 1: That's right. What's an example of that? For example, the tablets in Star Trek that became the iPad?

Speaker 3: Precisely. So think about that, and think about, you know, the Young Lady's Illustrated Primer from The Diamond Age.

Speaker 1: Great, so I've read that. But for the audience, tell us about that.
Speaker 3: For the audience: there's this book built around this really strong AI agent that's embodied in a book, and it is able to take anybody, the poorest street urchin, anybody, and make them not only, you know, maybe

Speaker 2: wealthy and rich, but the best possible version of

Speaker 1: themselves, because it tutors them, it leads them into life lessons, all

Speaker 3: of it, and it does it in this beautifully intelligent way. It's not just mechanical and rote. It interacts with the real problems that are happening in their life. It's able to intercede. It brings in characters, it brings in fairy tales. It uses all of the things that humans use to create meaning and teach, and it's using them beautifully. And of course there's a human in the loop. It's not just an AI agent; it's, like, combining it with people who actually care. So that's an example of, you know, the precursor book to all of these AI tutors. Like, the number of people in Silicon Valley that have been thinking about building the Primer, and building versions of it, over the last decade are many.

Speaker 1: I didn't realize anyone was building something like this. I mean, obviously, with the technology that we have with AI, we can squint and see how it could become something like this. You're saying people are actually working on the Young Lady's Illustrated Primer?

Speaker 3: Well, versions of it, right. I mean, you know, I started thinking about this actually when I was studying neuroscience at Stanford, back in my master's.

Speaker 2: I was like, what would it actually take? Like, what's

Speaker 3: the learning science theory, what are the sensing mechanisms, what actual algorithms and, like, modules would you need?

Speaker 2: And I mapped it out, and then one of

Speaker 3: the most senior people at Google called and said, hey, I've always wanted to build

Speaker 2: this. Come build this with me.

Speaker 3: And five years later at Google AI, things happened.
Speaker 1: Wow. And remind me, the Young Lady's Illustrated Primer was a book that she carries around, it's physical, and she opens it up. But remind me, the text always changes, and the illustrations?

Speaker 3: Yeah, it's a dynamic narrative, but it also has, like, you know, an accelerometer and an altimeter; it had all these sensing mechanisms so that it could actually, like, interact with the world around you. So look, a book might not be the exact right embodiment. Up until now, education has been, in fact, this incredibly thin slice of learning, right? You go to a room and you learn, and you're not able to connect the variety of inputs that actually makes a life, right? All the informal learning, all the social stuff, all the peer things, like, all the hormones. And imagine that you were actually able to bring that all together and have an agent that cared deeply for you and loved you, that was guiding you through. I mean, that is, like, the height that we're trying

Speaker 1: to get to. That's right, and it remembers everything you said and doesn't have any other concerns and wants your personal best. Exactly.

Speaker 3: And I think that's what's tough as we think about AI companions: a lot of these AI girlfriends and companions, they're there for entertainment. They're not there for your personal best. And so there's this Nirvana state, or this thing that we all know that we want, which is an agent that actually wants us to be the best possible version of what we can be. But the disconnect, or the question, is how do we get that into everybody's hands? Because people are going through so much shit, they're just trying to survive, they're poor, they're hungry, like, they don't have power. So how do you help get them to a place where they can learn? And at first that book, the Primer, just helped, what's her name, Nell, it just helped the protagonist survive.
So as we think about, you know, AI helping each of us individually, it's not just like, hey, I'm going to be a tutor; it's like AI that's going to help me survive.

Speaker 1: So why aren't there more companies doing that right now? We're saying we want to build an AI that is your advocate, that just cares about you, that makes everything about you the best that you can be. Are there companies doing this?

Speaker 3: The reason that there's not more is that it's both an incredibly challenging build and people don't always want to be the best version of what they can be. They don't know it, right? As we said before, they're just trying to get through the day. And this is what makes AI companions such an interesting entry factor, because everybody needs a friend, and if you can have an agent that starts as your friend, it ends up being your tutor and your advocate for your higher self. That's beautiful. But the issue from a commercial perspective is how do you monetize that?

Speaker 1: Oh, interesting. Oh, right, because people won't pay in advance or something like that.

Speaker 3: Yeah, and frankly, you get into a VC, like, kind of rat hole where, you know, they're like, oh, like, monetize now, monetize now, and you're like, no, it's a trillion dollar opportunity. Let's just get people to engage. So I think that more and more people are thinking about this.

Speaker 1: I wonder if someday in the distant future, along with having universal basic income, the government would pay for your advocate to make you the best you could be.

Speaker 2: I think that they should.

Speaker 3: But the only way that works is if the government has none of that data, none of that data is tied to your insurance or your healthcare or anything like that, and you feel completely safe. So I know that companies are thinking about this. All the big tech companies are thinking about it, and a lot of the AI labs are thinking about it.
But again, you know, there are a lot of commercial forces at play, so doing the right thing in the end kind of gets muddied by needing to sell in the five year time frame.

Speaker 1: So let's trace the thread back to the beginning, about science fiction. What are other examples of the way that the science fiction you've read influences the way you think about what we can do now?

Speaker 3: What we can do, or also, like, what's happening right now. So, you know, an issue that we're having in education is deep fakes. You know, both porn deep fakes, which is really disturbing, students are making deep fakes of each other and it's really disturbing for the victim, and also deep fakes of teachers, or deep fakes of kind of any authority figure.

Speaker 2: And you see this issue

Speaker 3: was taken up, like, decades ago in The Player of Games, right? So, you know, in this world deep fakes are, you know, prevalent. Everything can be fake, and so the only way to get around it is to have, like, certified humans that are there in the room, able to see what's going on. And so, you know, what we're going to see, for example, in education, as everybody wants to start taking these high stakes exams online but you have all this, like, you know, ability to deep fake, and, you know, we're going to go through an arc where they're going to try to create these ubiquitous, global, worldwide assessments, and everybody's just going to be, like, faking it. Everybody's going to be cheating, and we're going to have to go back to small format, in the room, like, one proctor is looking over your shoulder in order to verify who's there and what's going on.

Speaker 1: By the way, as a side note:
The thing that's intrigued me about deep fakes is that I've been very interested in the legal system, the intersection of science and the legal system, and everyone was worried that people would be introducing deep fakes into the system and saying, look, I see the guy who's holding the gun, whatever. But in fact, exactly the opposite has happened, which is that everyone who's caught on camera now goes into the court and says it's a fake, it wasn't actually me, even though it actually was. By the way, it's intriguing to me that a science fiction writer, let's say whoever wrote The Player of Games, could foresee a world of deep fakes, because it seemed so unlikely even ten years ago that you could make a convincing photograph or video of somebody, and now it's so trivial.

Speaker 3: But I mean, it's been in the collective consciousness for decades. I mean, think about the holodecks on Star Trek; like, we've always dreamt of having completely immersive magical environments.

Speaker 1: Yeah, that's right. But faking it, and being able to replicate a human and have them do something bad on the camera, that's really unusual. In fact, I was just thinking about the movie Total Recall, which I haven't seen since it was in the theaters a million years ago. Now, I'm going to try to do this without giving anything away if someone hasn't seen it. But the protagonist thinks he's one person, and eventually he's shown a video of himself doing these other acts, and he realizes that his memory has been essentially wiped and manipulated. And I thought, wow, that movie wouldn't work now, because if you saw a video of yourself doing something, you'd think, well, that's a deep fake, that's not actually me. But in the movie, that was the turning point where he realized, oh my gosh, I used to be someone else. Yeah. So it strikes me that not all authors have foreseen the possibility of deep fakes, and a lot of plot points get thrown out now that that exists.
Speaker 2: It's true.

Speaker 3: I mean, hearing you talk made me just think about, you know, the issues of AI and self awareness and kind of sovereignty.

Speaker 2: So gosh, Blade Runner.

Speaker 1: Yeah.

Speaker 3: Right. Well, everybody's worried that we're nearing actual AGI.

Speaker 2: But even if we're not, even if we're just coming to

Speaker 3: a point where we're going to have really strong AI companions, and they're gonna have a ton of data, and they're going to persist apart from a human. And I mean, next step, like next year, we're going to have, you know, covered-in-flesh AI robots.

Speaker 2: I mean, we're already seeing stuff come out.

Speaker 3: I mean, both from Asia and here, where it's, you know, it's not just exoskeletons. Yeah, yeah, yeah, you've got soft exteriors, right? So it is, it's very Westworld, so to speak, right? And they're going to be armed not just with these, you know, task-based systems or goals of service. Like, it's so easy to put in a companion that, you know, wants to be your friend or wants to be a doctor, wants to be a therapist, or just wants to be you out in the world. And so, you know, it seems far fetched, but I tell you, next year, you know, certainly within five years, issues of, you know, liability, agent sovereignty, and, like, whether or not it's murder are going to be

Speaker 2: in the public eye. But think about this.

Speaker 3: If you were to program your dead loved one's memories into, you know, an agent, you know, maybe it's just software, and then you put it into an actual body and you're able to interact with it, think of the emotional distress it will cause you for someone to destroy that thing.

Speaker 1: Yes, exactly.

Speaker 3: You know, as soon as there's emotional distress in the USA, you've got a liability suit.

Speaker 1: Yes, quite right. Although, although in that case, as long as you have a backup, you can just reboot it, right?
Speaker 3: Sure. Again, okay, this is, no, this is Blade Runner. Well, what if you don't have a backup? What if you have to take it with you? What if you can't afford the backup?

Speaker 1: Oh, right, right, yeah. What if it costs an ungodly amount, set by some vicious company that's trying to profit from it?

Speaker 3: And so this is all science fiction. I mean, it seemed far fetched, and it's here now.

Speaker 1: I know. That's the surprising part. Okay. So what else? Because you're such a fan of science fiction, what else have you read that seems to reflect on what's happening in this fast moment in time?

Speaker 3: Well, as I said before, I'm a huge advocate of some company, hopefully soon, changing the dynamics around users' privacy and data, and giving everybody basically their own data and, like, letting the code come to them. And so my favorite example of that are the e-butlers from Pandora's Star, right, where everybody has this incredibly strong agent that's kind of a defense mechanism, where it stores all your data kind of locally, and then if any system or service wants to interact with you, the code comes to you, and it is, you know, making sure that all of that compute and all that service happens locally. And I think it's so powerful and it would be so wonderful. And of course then you've got, like, how strong is your agent and can it protect you properly and all that. But to me, that's the best example of what we really want that we're not getting right now.

Speaker 1: Fascinating. Do you see a business path to that?

Speaker 3: I think that one of the biggest companies in the world will eventually see the writing on the wall, and they will upend the market and upend their investors and make the leap.

Speaker 1: Yeah, but it will

Speaker 3: probably be somebody that thinks that they're about to lose.
Yeah, you know, so a key player that's like, okay, we don't have the compute, or we haven't, you know, we see the other guys, like, you know, leaping ahead because of whatever functionality or investors.

Speaker 1: Yeah. So I have a question for you, because you're such a fan of science fiction. I'm a fan of literature more generally; I haven't read that much science fiction, as it turns out. But for example, I heard an interview with Isaac Asimov on Jonah Lerer, which was the show on PBS a million years ago, and Asimov, this was probably the eighties, said, look, I foresee a day when every household is connected to a central mainframe computer. Everyone has a dumb terminal in their house, and this computer knows all of humankind's knowledge, and you can ask it any question you want and you can get the answer. And so he essentially foresaw the Internet. But what's interesting is he didn't get the technology right. I mean, it doesn't matter, but, you know, you imagine these cables all going to a mainframe. And so I'm fascinated by the way that thinkers can foresee the direction that things are going, but of course we're always limited by the technology that we have in thinking about it. Yeah. So if you thought about your science fiction bookshelf, and how many of the predictions were spot on, versus sort of kind of off but mostly right, versus totally off, what would you say?

Speaker 3: I would say that the why and the what is almost always correct, but the how is often wrong.

Speaker 1: Beautifully said. Yeah, and so what does it mean about the why and the what? Does it mean that the trajectory of where we're going as a species is clear enough that if you really sit and squint into the future, you can sort of see the direction?

Speaker 3: You can. You can just read any fantasy novel or any fairy tale: like, we want to fly, we want to live forever.
We want to be able to read people's thoughts, we want to be able to be invisible.

Speaker 2: We want to, you know, like,

Speaker 3: all of these superpowers we are directly heading towards.

Speaker 1: Yeah.

Speaker 3: So, like, think about it as an example, you know: spells and the Internet of Things. You know, now, through bioauthentication, we're able to get differential access to different places and different functionality and systems.

Speaker 1: So I walk down this hallway and the door opens.

Speaker 2: You know, but you say abracadabra and it opens. That is a spell. We're going to just see more of them.

Speaker 1: Yes. What's the most far fetched thing that you've read in a book that you're seeing evidence for now?

Speaker 3: Okay, this is going to be a weird one, but hold with me. Whether or not you believe in reincarnation, the desire to live forever is core to who we are. And there's this amazing book by Kim Stanley Robinson called The Years of Rice and Salt, and again, without, you know, revealing everything (guys, you might want to mute if you haven't read it), but it explores what it would mean if we were able to sense our reincarnation, and what would happen if we could track our soul, like, coming back. And when you think about it, there are huge incentives, especially for the ultra wealthy, to be able to do this. So you're already, like, I'm hearing whispers about companies where they're like, okay, you know, we're starting now with really crude hacks, right? We're like, I'm going to, what was that company that was like, I'm going to freeze you at the point of death? So, yeah, yeah, right. So people are already pursuing very brutal, kind of early ways of living forever, or being able to reconstitute themselves or come back to life. But there's this fantasy, and if science fiction and the progression of technology is any indicator, you know, we will move to a point where we might be able to track our soul's reincarnation.
And there is so much economic benefit to that. Imagine being able to leave your billions of dollars to yourself.

Speaker 2: I mean, it sounds crazy, but, like, people are thinking about it.

Speaker 1: Wait, hold on, this requires the existence

Speaker 2: of a soul, wait, which we can argue about.

Speaker 3: We can argue about it, but the chance is enough to make people obsessed with the question.

Speaker 1: Oh, fascinating. But so you might get all sorts of predatory companies coming in here and saying, hey, pay me a lot of money, I'll track your soul, which may be a preposterous thing.

Speaker 3: Okay. But then the question is, like, how much of you has to exist for it to be your soul?

Speaker 1: Right?

Speaker 3: If we're already saying that we can externalize our intelligence and pass that along, then does it have to be actual flesh, or could it be a sufficient amount of our intelligence?

Speaker 1: Like an upload?

Speaker 3: Yeah.

Speaker 1: Yeah, oh, totally. Now that's a good idea, and I think uploading is intriguing. There are, of course, all these questions about it that philosophers have been asking for a long time, which is, you know, if I could upload myself into a computer, and so this other thing wakes up, is that actually me? It certainly thinks it was me. It says, wow, I was just sitting in this chair a moment ago. But as far as I'm concerned, I'm still going to drop dead.

Speaker 3: And there's so much science fiction that grapples with this, Ghost in the Shell, right: like, how much, how much flesh or how much essence? Or, like, you know, is your entire intelligence equal to even a spark of your natural matter?

Speaker 1: You know?

Speaker 2: It's the Ship of Theseus...

Speaker 1: Yes.

Speaker 2: ...question.

Speaker 1: Yes. Well, okay, so just for the listener, the Ship of Theseus is this:
Theseus's ship pulls into the dock, a plank rots, it gets replaced. Another plank rots, it gets replaced. Eventually all the planks of the ship have been replaced, every piece of wood that was on it. Is it still the Ship of Theseus? In ancient Athens (I looked this up recently) about half the philosophers said yes and half said no on this. But it's a deep question about identity. The modern neuroscience version is: if I took a neuron out of your head and replaced it with a metal neuron that did exactly the same function, and then replaced another neuron, and another, and eighty six billion neurons later you're sort of a metal robot, is it still you? Exactly, yes, this is the question. I do think that might be a separate question, though, from the upload question, which is: now I start the version of Bethany in a computer over here. Do you still feel like you're having a heart attack and dying? And this other thing is there, but it doesn't do you any good. It's happy that it's there, but you're not. You don't feel anything about it. You don't feel any continuity.

Speaker 3: Yeah. So the question of what it means to be human, and how much our intelligence defines us versus our bodies defining us, is, it's happening right now, right? And it's going to happen again, more and more and quickly, as we think about liability, as we think about agents that act on our behalf, and as we form and grow these deep relationships with agents that are the ghosts of our loved ones or versions of ourselves.

Speaker 1: You know, in a previous podcast that you and I did together, you were talking about this idea of a mirror of oneself, and a lot of young people are doing this, which seems to be very foresighted and mature of them, to do this so they can learn about themselves.
But yes, as you make a richer and richer digital twin, in a sense, you don't even have to imagine this upload thing where you scan your brain to do it, because in a sense, by the end of your life, you've got a really rich digital twin.

Speaker 2: Correct.

Speaker 3: Or, there's a really great science fiction book about this, the whole Last Emperox series, I tell you, the emperor with an x. So the emperors, or the ruling family, in this series implant kind of like a spinal tap, so that everybody in the family is not only having their memories be recorded, but their full, like, body functionality be recorded, which has typically been the missing piece, right, because we know that our cognition isn't just in our brain, that it's, you know, tied with our gut, and there are, you know, chemicals involved. And so the idea was that, you know, because they were able to completely track through lives, every new emperox can have a conversation with their entire family going back through millennia.

Speaker 1: Ooh, oh, oh. Interesting, because there's a replica of their

Speaker 3: very grandfather, and it's able to say exactly how that person not only thought but felt in every moment of their life.

Speaker 1: Interesting. You know what I find interesting about that? So it just so happens that I'm my family's genealogist. I've studied the family tree for years, and no one else seems to be that interested. I try to show them the thing, and to them, it's just some names on the page and so on. I do wonder, though, if I said, look, I've got this great replica of your great great grandfather, would they care, or would they rather hang out with their friends and play Fortnite? That's, it's an interesting question. If you had all of your ancestry, you know, thirty two, sixty four, a hundred and twenty eight people, would you sit and talk to them, or do you think, oh boy, I've got more relatives to deal with?
Speaker 3: Now, I'm also my family's genealogist, and I think about this as well. And my conclusion is that people care less about facts, but they really want to know how their ancestors felt. If you could find out that your great great grandfather, like, also felt this driving need to, you know, look beyond the horizon, and was never, like, happy in one spot for long, that would give you great understanding about kind of what it means to live as a human today. It doesn't really matter what the fact is, but that, like, chemical experience would really change people's understandings of who they are.

Speaker 2: We're really going off track here.

Speaker 1: I like it. This is great, because the class that I'm teaching at Stanford this quarter is Literature and the Brain. And there's a lot involved in this. The starting point is my fascination with the fact that the way we study the brain is looking at, okay, here's how the visual system works, here's how audition works, here's how touch works. But what is never talked about is how easy it is for the brain to completely go off track and be someone else. So I open Game of Thrones and I am Jon Snow, or the next chapter, I am Daenerys Targaryen, and so on. We can just slip into other shoes so readily, and the story of neuroscience in every textbook that we read is, you know, it's all about mutual information and finding out exactly what you're hearing and seeing, but not about how, somehow, we're so easily able to slip on other identities. So that's fascinating to me. But one of the jobs of literature is that it allows us to experience broader worlds, or be in situations that we would never be in, and we get to learn from those situations. And so what's interesting about science fiction is really stretching that out and really trying things out at a very different place, at a different scale.
And you'd get the same thing if you met your great great grandfather, who got to tell you about something that was really meaningful in his era, which is totally removed from yours, but you'd get to really experience something.

Speaker 3: Oh, and understand how he made meaning. Because, as you know, your upload-download rate doesn't really matter. We create memories through meaning and through narrative, and that's kind of inherent, kind of how our memory works. So I have this debate with my friends at Neuralink where I'm like, really, will the upload-download rate really, like, make us superhuman? Or will a brain-computer interface much more likely be one in which we have an agent in our head which we are conversing with? Because, in fact, that's how we process information better: we create meaning, we're dialogic, we ask questions. And that, that will be the memory augmentation, versus some kind of "I just learned kung fu" situation.

Speaker 1: Oh, I love that. Are there any other science fiction books that you've read that you feel really shine an interesting light on where we are right now, in twenty twenty five?

Speaker 3: Dune is an epic book and an epic series, and of course Denis Villeneuve made an epic set of movies around it. But I think one of the reasons that it is capturing the public imagination right now is because it is all about that tension between kind of physical and social striving and being your higher self, you know: like, can you actually become, like, evolve into being better humans? And there's a lot of oil politics in there, and I'm going to shove those aside for a second. But to me, the core theme of Dune is around human evolution, and using meditation and using all of the different skills that we have from all the different, you know, veins of knowledge throughout the world to naturally evolve ourselves. That's what the Bene Gesserit did, right?
Speaker 1: So for anyone who hasn't seen the movie or read the book, give us the quick summary.

Speaker 3: Okay. So the quick summary is: in this wonderful space opera, far-flung future, they've outlawed thinking machines because they had some bad run-ins with killer machines, and so they focused on natural human evolution and augmentation. And there are some specific groups that have started to use basically yoga, meditation, memory techniques, somatic dreaming, like, you know, hyper awareness, as well as, you know, psychedelic substances, to dramatically expand and increase the scope of their capabilities.

Speaker 2: And you see people doing it now.

Speaker 3: I mean, this has frankly been something that we've done in, you know, groups around the world forever, but there hasn't necessarily been, you know, an effort around it for intelligence augmentation.

Speaker 1: Oh, you see echoes of that?

Speaker 3: Well, Dune is very important right now, as we have new technology coming online, an awareness of how different techniques around what we eat and how we meditate, how we use our memory, can produce, you know, longer life, can slow aging, can increase intelligence. And there's this tension right now around, are we going to be able to do any of that naturally, or are we going to have to hand it off to machines?

Speaker 1: That was my conversation with Bethany Maples. So let's come back to where we began. In nineteen sixty six, Star Trek put glowing tablets into the hands of its characters, and decades later, engineers turned that fiction into fact. That's just one example of what we've been circling today: the way that science fiction becomes a rehearsal for the future. Now, it would be too simple to say that science fiction is prediction or prophecy, because it often gets things wrong.
Instead, I think it makes sense to say that it's the place where we write down our collective hopes and fears: our dreams of flying, of reading minds, of living forever, of building companions who will never leave us. And by committing these things to the page, these ideas have a chance to blossom. They plant seeds in the minds of young people, who go on to build the technology of tomorrow. And when that technology finally arrives, whether it's AI tutors or digital twins, or companions that blur the line between friendship and therapy, whatever, we often feel a sense of deja vu. Haven't we been here before? Yes, but only in the pages of a novel or in the flicker of a movie screen.

So as we wrap up, I'll leave you with this thought. Sometimes the stories we tell are more than just entertainment. Every once in a while, the novels on your shelf and the shows on your screen are a way to move ideas forward. And so the next question becomes: what stories do we want to tell now? Because if science fiction has a way of becoming real life, then all of us, all the writers, the readers, the engineers, the visionaries, we all share some responsibility. We are all collaborators in building the future.

Go to Eagleman dot com slash podcasts for more information and to find further reading. Send me an email at podcasts at eagleman dot com with questions or discussion, and check out and subscribe to Inner Cosmos on YouTube for videos of each episode and to leave comments. Until next time, I'm David Eagleman, and this is Inner Cosmos.